This is an extension of http://rrr00bb.blogspot.com/2011/08/midi-instruments-on-ios.html, of which this could be a more focused restatement.
In order to facilitate the correct interpretation of MIDI instruments that are based on iPad and iPhone instruments, we need to be specific about some of the areas that MIDI leaves up to the implementer (to get wrong in most cases!). This is being written for iOS developers who are writing MIDI instruments.
Wizdom Music currently makes, or is closely involved with, these instruments:
- MorphWiz
- MorphWizMidi
- SampleWiz
- Geo Synth
- Haken Continuum (not owned by Wizdom, just very involved)
These new instruments need to be able to do smooth bends from the lowest renderable note to the highest at the fullest pitch resolution. A simple use case is to just hand your iPad to a child. Putting one finger on the glass and dragging it around for the duration of an entire song, or doing so with two fingers bending in different directions: this is an obvious gesture, and so it needs to be made to work within what MIDI standardizes.
They are also a special case in that we are not doing bending with any real pitch wheels, and these virtual pitch wheels have no real top and bottom position. So we include a definition of a Non-Registered Parameter Number (NRPN) that functions as a note tie. It allows bends to be as wide as you like, independent of bend width. It's best NOT to set the bend width away from the small default, because the bend message only has 14 bits of resolution, and widening the range spreads that resolution thinner.
The NRPN is a standard feature of MIDI, but it's up to us to standardize any definitions that we create.
Frequency
MIDI specifies notes because it was originally tightly tied to discrete keys going up and down. The frequency of the note was implicit. But when pitch wheels become involved you can make the desired frequency explicit. Assume for a moment that we set the pitch wheel to mean plus or minus only one semitone:
frequency = baseFrequencyLowC * 2^((note + bend)/12)
Note is an integer value, and bend is a number between -1 and 1. This gives the exact intended frequency. The synth should not take any liberties; it should do exactly what is stated here.
Pitch Wheels
Pitch wheels are assumed to be centered at zero when the synth starts. Once messages come in, the last value that the pitch wheel was set to for a channel stays where it is until a new bend message arrives. This is important because if the pitch wheel is not centered and a note comes on, the initial pitch on attack includes the bend in its frequency, with no portamento; likewise, when the note is released, it releases at the exact frequency that includes that channel's bend value.
Assume that the pitch wheel is set to the 2-semitone default here. On a MIDI reset we behave as if we had received this:
bend ch1 0%
bend ch2 0%
....
bend ch16 0%
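The rule above can be sketched as per-channel state that persists between messages (the names here are hypothetical; channels are 0-indexed in code where the text counts 1-16):

```c
/* Hypothetical per-channel bend state: the last bend received on a channel
   persists and applies to any note that attacks or releases on it. */
#define NUM_CHANNELS 16

static double channelBend[NUM_CHANNELS];   /* normalized -1..+1 */

static void midiReset(void)
{
    /* Behave as if "bend chN 0%" arrived on every channel. */
    for (int i = 0; i < NUM_CHANNELS; i++)
        channelBend[i] = 0.0;
}

static void onBend(int channel, double bend)
{
    channelBend[channel] = bend;           /* persists until the next bend */
}

/* The sounding pitch at note-on (and note-off) includes the stored bend,
   expressed here as a fractional note number. */
static double soundingNote(int channel, int note, double bendRangeSemitones)
{
    return note + channelBend[channel] * bendRangeSemitones;
}
```

With a 2-semitone bend range, a 100% bend on a channel makes note 33 sound as 35, which is exactly the example that follows.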
So if we know this...
off ch1 33
...
bend ch1 100%
on ch1 33
off ch1 33
Because the bend width is 2 semitones, this note strikes exactly the pitch of note 35 (B) on attack and on release. There should be no audible difference from:
off ch1 33
...
on ch1 35
off ch1 35
If this isn't done right, then fretlessness doesn't work right. Fretlessness is the foundation of correct, *great* pitch handling. It's also the only way to get microtonality working right, because real-world microtonality requires actual fretlessness (not MIDI tuning tables). Choral singers do NOT actually sing in 12ET, and violinists don't play in it; in the absence of other instruments providing constant reference tones, they tend toward Just and Pythagorean intervals (which arise from the laws of physics, not man-made music theory). This is even more true of ethnic instruments like the sitar. It's also true that every instrument has characteristic intonation differences that help to identify it.
So, these pitch issues are not something to be taken lightly as 'it never happens in the real world'. This is part of the reason why electronic music is still having trouble getting past the 'uncanny valley' into being convincing. The ability to start solving this problem is one of the more exciting things about using an iPad (or any continuous surface) as an instrument.
MIDI Channels
Omni is provided as a way to have a synth render MIDI that was originally sent out across multiple channels. But what typically gets done is that the synth just substitutes its own channel number for the original before interpretation, as an easy hack that mostly works. Take this input:
on ch1 33
bend ch1 +25%
on ch2 33
bend ch2 -25%
off ch1 33
The expected result is two instances of the same note, one bent up a quartertone and the other bent down a quartertone. At the end, the note bent down a quartertone is still playing. But if omni just treats everything as if it were channel 1, the sequence becomes:
on ch1 33
bend ch1 +25%
on ch1 33
bend ch1 -25%
off ch1 33
This is wrong, given what the original channels were, and it's what most synths will do. First, we end up with silence at the end of the sequence, a dropped note (almost as bad as a stuck note), because we told the only note 33 on ch1 to be silent. Second, instead of chording two instances of the same note at detuned values (i.e., chorusing), the first note 33 simply got bent to the value of the second instance.
So specifically, you must pass this test:
on ch1 33
bend ch1 +25%
on ch2 33
bend ch2 -25%
off ch1 33
The note should still be playing, detuned a quartertone here. This means that there are two separate instances of A, simultaneously played, chording, possibly out of phase, etc. This is no stranger than playing A and A# together. It should work, and this happens all the time on string instruments.
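One way a receiving synth might pass this test is to key each voice by the (channel, note) pair rather than by note number alone, so that note 33 on ch1 and note 33 on ch2 are independent instances. A minimal sketch (all names here are hypothetical, not from the post's code):

```c
#include <stdbool.h>

/* Hypothetical voice table: a voice is identified by (channel, note),
   so the same note number on two channels is two independent voices. */
#define MAX_VOICES 64

typedef struct {
    bool active;
    int channel;
    int note;
} Voice;

static Voice voices[MAX_VOICES];

static void voiceOn(int channel, int note)
{
    for (int i = 0; i < MAX_VOICES; i++) {
        if (!voices[i].active) {
            voices[i] = (Voice){ true, channel, note };
            return;
        }
    }
}

static void voiceOff(int channel, int note)
{
    /* Only the instance on this exact channel is released. */
    for (int i = 0; i < MAX_VOICES; i++) {
        if (voices[i].active &&
            voices[i].channel == channel && voices[i].note == note) {
            voices[i].active = false;
            return;
        }
    }
}

static int activeVoiceCount(void)
{
    int n = 0;
    for (int i = 0; i < MAX_VOICES; i++)
        if (voices[i].active) n++;
    return n;
}
```

Running the test sequence through this table leaves exactly one instance of note 33 still sounding after the off on ch1, which is the required behavior.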
New Note - New Channel
You don't know if a note will be bent in the future, so it must have its own pitch wheel. If you run out of channels and have to share a channel with another note, then you will get artifacts if one of the notes on that channel needs to be bent.
As a compromise, you don't have to monopolize all 16 channels for this. You just need a reasonable number, which if you are emulating a string instrument, is generally going to be the number of strings, or maybe just the size of the largest playable chord.
Adding more channels does help if you can afford the performance cost, because a note being turned off is not the end of its life; its release is still sounding after the note off.
Single Patch "Omni"
Just because there is only one patch at a time does not mean that the MIDI data should be misinterpreted as in the situation above. The channels specified in the original MIDI output exist to keep the various expression parameters separate. Keeping the pitch wheel separate is so important that this intent should never be violated for pitch, even if it has to be violated for other parameters.
As a hack to make the separate pitch bends work right, a typical approach is to set up a multi-timbral synth to simply duplicate the same exact patch and settings across all of the channels (or just a subset of them). But the fact that your synth can only load one patch should not change the meaning of the bend messages.
Overlaps
One of the issues created by MIDI channeling is overlaps. You must not get silence when note on/off pairs for the same note overlap. This happens when you try to play more notes than there are channels available to play them on:
on ch1 33
on ch1 33
off ch1 33
This should play two notes and then silence.
on ch1 33
on ch2 33
off ch1 33
This should play two notes, where you can hear two of the exact same note playing, then one of them cuts off. You should not hear silence yet. This happens because on a string instrument, you will trill on the exact same note to play very fast, like guitar picking:
on ch1 33
on ch2 33
off ch1 33
off ch2 33
Note: The MIDI specification states that it is up to the implementer what happens when note off and note on are not exactly balanced. But as with internet protocols "Be conservative in what you send and liberal in what you understand".
http://www.gweep.net/~prefect/eng/reference/protocol/midispec.html
"If a device receives a Note On for a note (number) that is already playing (i.e., hasn't been turned off yet), it is up to the device whether to layer another 'voice' playing the same pitch, or cut off the voice playing the preceding note of that same pitch in order to 'retrigger' that note."
This is a disaster in the making. It is up to the implementer, but you will get a stuck note if the client and server choose different ways. Don't take that statement to mean that you should create a synth that will leave a stuck note when the notes don't balance.
Most synths take the (on, note, vol) tuple to be setting the state for a note, not pushing an instance of the note onto a stack, to be popped by a note off. If you are liberal in what you send, then you will only add to the number of synths that get stuck notes. If you need to play a voice twice, you should be using the channels. This is consistent with the practice of new note new channel that we are using here.
Think about the fact that (on, note, 0) is an alias for note off. (on, note, vol) leaves vol as a variable, so the on/off balance can be silently undone when vol happens to be zero, even though the intent wasn't to turn the note off.
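That aliasing rule can be made explicit at the point where messages are parsed, before any voice state is touched. A minimal classifier sketch (the names are hypothetical; running status and other message types are ignored here):

```c
/* A Note On status (0x9n) with velocity 0 is, by MIDI convention,
   an alias for Note Off (0x8n). Normalize it before interpretation. */
typedef enum { EV_NOTE_ON, EV_NOTE_OFF, EV_OTHER } EventKind;

static EventKind classify(unsigned char status,
                          unsigned char note,
                          unsigned char velocity)
{
    unsigned char kind = status & 0xF0;   /* strip the channel nibble */
    (void)note;                           /* not needed to classify */
    if (kind == 0x80) return EV_NOTE_OFF;
    if (kind == 0x90) return velocity == 0 ? EV_NOTE_OFF : EV_NOTE_ON;
    return EV_OTHER;
}
```

Normalizing at the parse step means the voice-tracking logic only ever sees true ons and offs, so a zero-velocity on can never silently unbalance the note state.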
Release Time
When a note is turned off, that's not the end of its life. That's the beginning of the release phase only. The frequency that it releases on will change if you mess with the pitch wheel for that channel before the sound is gone. It will be an annoying artifact for instruments like hammered dulcimers with special tunings (Indian music, etc).
Because every new note down picks a new channel of the 16 available, we need to carefully cycle through the available channels evenly. This will maximize the amount of time that a channel has been dead before we steal that channel to play a new note.
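The even cycling described above can be as simple as a round-robin allocator over the channel pool (a sketch with hypothetical names; the post's own code presumably does something similar with its finger-to-channel tables):

```c
/* Hypothetical round-robin allocator over a pool of channels.
   Cycling evenly maximizes the time since a channel last sounded, so a
   stolen channel is as likely as possible to be past its release phase. */
#define POOL_SIZE 16

static int nextChannelIndex = 0;

static int allocateChannel(void)
{
    int channel = nextChannelIndex;                     /* 0..POOL_SIZE-1 */
    nextChannelIndex = (nextChannelIndex + 1) % POOL_SIZE;
    return channel;
}
```

A least-recently-used policy keyed on actual note-off timestamps would be strictly better, but plain round-robin already guarantees each channel gets the maximum possible rest between steals when notes arrive at a steady rate.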
Note Tie Non Registered Parameter Number (restated from previous post)
It is possible to bend MIDI notes to any width you want at the fullest possible resolution; the problem is that there is no de facto or de jure standard for how this is done. Imagine a piano player trying to simulate a bend on our channel-cycling instrument:
bend ch1 0%
bend ch2 0%
...
bend ch16 0%
on ch1 33
...
off ch1 33
on ch2 34
...
off ch2 34
on ch3 35
...
off ch3 35
on ch4 36
...
So he's playing chromatics to simulate the bend because that's the best he can do. But if we are on a synth that inserts bend messages, the synth can at least bend from one chromatic to the next like this:
bend ch1 0%
bend ch2 0%
...
bend ch16 0%
on ch1 33
bend ch1 20%
bend ch1 40%
bend ch1 60%
bend ch1 80%
bend ch1 100%
off ch1 33
on ch2 34
bend ch2 20%
bend ch2 40%
bend ch2 60%
bend ch2 80%
bend ch2 100%
off ch2 34
on ch3 35
bend ch3 20%
bend ch3 40%
bend ch3 60%
bend ch3 80%
bend ch3 100%
off ch3 35
on ch4 36
bend ch4 20%
bend ch4 40%
bend ch4 60%
bend ch4 80%
bend ch4 100%
So, this would be a smooth bend, except that we hear the note retrigger every time we reach the next chromatic. So let's say that we have a special message that announces that a note tie is coming, and that the tie completes when the next note on appears:
bend ch1 0%
bend ch2 0%
...
bend ch16 0%
on ch1 33
bend ch1 20%
bend ch1 40%
bend ch1 60%
bend ch1 80%
bend ch1 100%
tie ch1 33
off ch1 33
on ch2 34
bend ch2 20%
bend ch2 40%
bend ch2 60%
bend ch2 80%
bend ch2 100%
tie ch2 34
off ch2 34
bend ch3 0% #note that between the tie and the next note on, we expect bends and a note off to happen
on ch3 35
bend ch3 20%
bend ch3 40%
bend ch3 60%
bend ch3 80%
bend ch3 100%
tie ch3 35
off ch3 35
on ch4 36
bend ch4 20%
bend ch4 40%
bend ch4 60%
bend ch4 80%
bend ch4 100%
We can continue this from the lowest note on the keyboard to the highest for a super-wide bend. It is at the full pitch resolution as well because we aren't playing tricks with the MIDI bend width. It is also the case that if we broadcast this both to a piano that can't bend, and the synth that understands, we get a similar result. It degrades gracefully on the piano, and sounds perfect on the synth that understands. We can use this to track up to 16 fingers at arbitrary pitches (in MIDI range of course!) bending in whatever wild directions they need.
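On the receiving side, one plausible interpretation (a sketch, not the actual implementation in any of the synths named above) is: when the tie NRPN arrives, suppress the release of the named note and let the next note on rename the sounding voice instead of re-attacking it. This uses the note-tie NRPN number 1223 (called TRANSITION in the sending code below):

```c
#include <stdbool.h>

#define TRANSITION 1223   /* the note-tie NRPN number used in this post */

/* Hypothetical receiver state for one channel's pending tie. */
static bool tiePending = false;
static int tiedNote = -1;
static bool retriggered; /* true if the last note on attacked a new voice */

static void onNRPN(int number, int value)
{
    if (number == TRANSITION) {
        tiePending = true;   /* the next off/on pair is a rename, not */
        tiedNote = value;    /* a release and re-attack               */
    }
}

static void onNoteOff(int note)
{
    if (tiePending && note == tiedNote)
        return;              /* suppress the release: the voice lives on */
    /* ... normal release phase would start here ... */
}

static void onNoteOn(int note)
{
    if (tiePending) {
        tiePending = false;
        tiedNote = note;     /* the voice keeps sounding under its new name */
        retriggered = false;
        return;
    }
    retriggered = true;      /* ... normal attack would happen here ... */
}
```

A synth that ignores the NRPN simply sees an ordinary off/on pair and retriggers, which is exactly the graceful degradation described above: a re-attack, but never a wrong pitch.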
The NRPN looks like this in our code:
#define TRANSITION 1223
static inline void sendNRPN(int ochannel, int msg, int val)
{
    //B0 63 6D
    //B0 62 30
    //B0 06 100
    int lsb = msg & 0x7f;
    int msb = (msg >> 7) & 0x7f;
    //midiPlatform_sendMidiPacket7(0xB0+ochannel, 0x63, msb, 0x62, lsb, 6, val);
    midiPlatform_sendMidiPacket3(0xB0 + ochannel, 0x63, msb);  // NRPN number MSB (CC 99)
    midiPlatform_sendMidiPacket3(0xB0 + ochannel, 0x62, lsb);  // NRPN number LSB (CC 98)
    midiPlatform_sendMidiPacket3(0xB0 + ochannel, 6, val);     // data entry (CC 6)
}
static inline void retriggerNewMidiNote(int finger, float midiFloat, int vol, int expr)
{
    int channel = midiFingerUsesChannel[finger];
    if (channel >= 0)
    {
        // Announce the tie before stopping the old note and starting the new one.
        int ochannel = midiChannelOChannelSent[channel];
        sendNRPN(ochannel, TRANSITION, midiChannelNote[channel]);
    }
    stopMidiNote(finger);
    startNewMidiNote(finger, midiFloat, vol, expr);
}
Let us know if there is something unreasonable about that message. I haven't used NRPNs before, and since we write both ends of it, they could both be 'wrong' and work just fine between our synths.
Bend Width RPN
Just in case it isn't completely obvious what happens when we change the bend width slider, here's the code that tells the synth our assumed bend width when we move that slider:
static void bendRangeSliderMoved() {
    // Check the current bend range against the previous. If it has changed, send the appropriate RPN.
    static int previousBendRange = -1;
    int bendRange = bendRangeSlider.scaledValue;
    if (bendRange != previousBendRange)
    {
        NSLog(@"bend range has changed from %d to %d", previousBendRange, bendRange);
        previousBendRange = bendRange;
        // Activate the pitch-bend sensitivity (bend range) RPN.
        SoundEngine_sendMIDIPacket3(0xB0, 101, 0);
        SoundEngine_sendMIDIPacket3(0xB0, 100, 0);
        // Send the new bend range value.
        //BUG: earlier versions sent 2*bendRange, which was wrong
        SoundEngine_sendMIDIPacket3(0xB0, 6, bendRange);
        SoundEngine_sendMIDIPacket3(0xB0, 38, 0);
        // Deactivate the pitch-bend sensitivity RPN.
        SoundEngine_sendMIDIPacket3(0xB0, 101, 127);
        SoundEngine_sendMIDIPacket3(0xB0, 100, 127);
        // Flush the MIDI queue.
        midiPlatform_flush();
    }
    // Update the bend range with the sound engine.
    SoundEngine_midiBendsInSemitones(bendRange);
    // Flag that the texture needs to be updated.
    bendRangeSlider.needsNewTexture = YES;
}
Although we've already discussed your views on it, 99% of all that code could be eliminated by using OSC =).
On the topic of support (or lack thereof) for OSC: while many of the commercial devices and programs are locked into MIDI (and therefore MIDI shouldn't be completely disregarded), a great deal of people working with non-12TET tunings will probably be using something that does use OSC (Max/MSP, Pd, SuperCollider). Also, many iOS apps could very easily implement OSC in addition to (or instead of) MIDI, since they aren't developed by huge monolithic companies that refuse to support anything that is "non-standard".
Just saying =). Awesome work on all of this, however, and I can't wait to try Geo Synth once it's out!
Oh yeah, I am well aware that all of what is described is a workaround for a broken system. I have fought against sound engine issues for so long that I caved in and did MIDI, then background MIDI came along and it was finally relevant, especially since we can get iOS apps to do poly bends correctly (omni is usually completely wrong on most synths). Now that I got this, I feel like MIDI got made irrelevant by sample loading. :-| It never ends.
If you want a standard, it would be much better to use either MIDI guitar mode or the MIDI Tuning Standard's single-note tuning messages, which allow total pitch control per note.
I do not support the creation of new standards and formats that bring nothing new to the table; it's just creating a giant mess, and I am now actively advocating against this stuff.
Xj scott: I agree with this basic sentiment; otherwise, MIDI should simply be tossed as too complex. What this recommendation really is, is multi-timbral MIDI with many channels set to the same patch as the 90% solution, and setting the pitch bend width higher - 12 semitones - to get almost all the way there. This works well against Korg Karma, for instance. I am working on AlephOne, which is the successor to Geo, but as pure MIDI. It does full polyphonic fretlessness, and can simultaneously do chording, legato, and microtonality. I may release the library as open source at some point. Try Geo against ThumbJam in fretless mode with string poly turned on to see what I mean, as it does almost all of this right, except the legato part of string polyphony.
I have gone over the full official MIDI spec in great detail, and have run these ideas past other devs and the guy who maintains the spec.
Tuning tables don't work. They assume that you are retuning a set of discrete keys. Legato is not a mode that you turn on and off. Pitch, bending, and note attack are decided on a per-note basis at the controller. Even notes themselves should be decided at the controller. AlephOne's main fretting mode has 15 notes per octave (chromatic, plus a few quartertones). The fretting happens at the controller, not the synth, because pitch tracks the exact finger location and drifts to a fret; completely the opposite of portamento, which takes pre-quantized notes as input and arrives at the target later.
The note tie NRPN has no approximation in straight MIDI. You can't play the lowest note on the instrument and bend it to the highest while simultaneously doing the same for a different pitch. MIDI pitch bend only has 12 bits of resolution, and will only accept 24 semitones. This is a basic test that you must pass to start making a correctly behaving fretless instrument.
It is a mess - to the point that the client-side messaging should be standardized in a library of code that takes actual pitch values as input to hide MIDI's mess. To say that you should just use MIDI the way it is currently done is to advocate for never fixing the problems. If we don't do it this way, then it is better to take advantage of iOS's dominance to skip out on MIDI altogether. What is so maddening is that this is all about the most basic problem of correct pitches, something that OSC gets right trivially.
Err. 14 bits... But you get the point. :-) Almost nothing recognizes tuning tables if you try them, and most don't even support wider MIDI bends. So this gives working microtonality on a wide range of instruments.
Single note tuning messages do work and are not the same as tables. You haven't thought about the problem. I'm the guy that originally invented the basis of the bend method you are using. The problem I have is adding this NRPN as a new part of the MIDI spec, which no instruments support, when the correct answer is single-note tuning, which is part of the spec and was well thought out to support exactly this scenario: total pitch control, including hand-controlled polyphonic portamento or glides per voice.
With real-time single note messages, note pitch is entirely within the single note message. The note number is irrelevant and is only used to identify the note. It's quite similar to the CoreAudio microtonal implementation, which has voices separate from pitch. The voice identifier is now the note number. The note number is decorrelated from pitch except through its last received associated single note message. There is no tuning table. The 128 note numbers are voices, not keyboard locations. Single note messages control continuous pitch per voice. Similar flexibility as OSC, in fact, at least as far as pitch goes.
Guitar mode is also great, but yes, there's a tradeoff between resolution and range. Most instruments support a +/-24 semitone range, which encompasses 4 octaves. That's good enough for most things. For total pitch control within the constraints of MIDI, single-note retuning messages are the answer rather than modifying the MIDI spec. If advocating for something better, advocate for instruments to support the MTS spec, since it is already the solution to this problem.
The problem with anything that's not made of on/off/bend is graceful degradation.
I'm planning on releasing the core library as open source if I can get the blessing, so that people can see how well this actually works in practice.
I have thought about this for a very long time. I use (note+bend) for all this because of the simple fact that almost no devices recognize anything else. The fact that nothing recognizes the note-tie NRPN yet just means that it degrades *gracefully*; IMHO, not doing this is what is wrong with current approaches. You will get a note-retrigger on an old synth, but you won't get a wrong pitch.
Think about what happens if you broadcast the MIDI to multiple devices. One is a piano that won't recognize the bends at all, one is a fairly standard single-channel synth, and the other is our mythical synth that recognizes this scheme (ThumbJam, SampleWiz, Arctic at present).
The most important thing is to never play the wrong pitch, even if you don't recognize the messages. This is why I simply re-trigger the note when bend-width is exceeded. No matter what you set bend width to, it can always be exceeded. This is a touch screen, where it's a perfectly reasonable thing to drop a finger on the glass and play a whole tune without ever picking the finger up - across many octaves.
Since most devices recognize nothing more than note on/off and bend, this has to be the basis upon which anything else is built. Missing support for our NRPN is not a catastrophic thing because the pitches come out right. If the patch has little or no attack, then you can't even tell that it isn't working exactly right.
There is also the issue of what is relevant. Since we now control both the synth and the controller on iOS, the hardware is largely irrelevant. It's so hard to create an experience that 'just works' to the iPhone/iPad standard that enhancing MIDI so that there is no parameter futzing, and simply ignoring the limitations of current MIDI hardware devices, might be the way to go. This is what NRPNs are designed for anyhow, no?
Remember that with both synth and controller living in iOS, if you are going to do something that doesn't degrade gracefully then you should simply ignore MIDI and just write a protocol that works 100% (to avoid all the stupid setup headaches like setting bend width and having channel cycling at all). iOS may be the thing that finally jettisons MIDI from its status as the only answer. :-)
btw, I think the idea of single note tuning messages is a good one.
From what I read about single-note tuning, correct me if I am wrong about whether it degrades gracefully (i.e., will my quartertone scale still come out right on a single-channel synth?).
In any case, the note tie is needed for more than this reason. It's a fundamental unit that completes on/off/bend, because moving the note off to a new name and keeping its state without re-attacking it is how legato works as well.
Give me a link to read on single-note retuning.