Don't talk to an external device over a protocol just to use its patch. That path has inherent latency, risk of stuck notes, lost connections, and setup complexity. Move the patch into the instrument if possible.
What people really want when they ask for MIDI or OSC is the ability to control arbitrary patches. What they are not asking for is the latency and hassle of talking over a network, a rat's nest of devices connected to each other, or the configuration needed so that the components find each other. Pd is interesting because, if it becomes a standard, you can move the patches into an on-board brain rather than talking over a protocol. Once a patch is installed on a device, you can unplug it and chuck it off into a corner until you need it, rather than having to keep devices strung together correctly.
In any case, Pythagoras is the name of this app for now. It's a basic bit of code that just exercises fast ten-finger touch tracking, gets libpd fired up, and drives a polyphonic synth. I am a complete Pd newbie, so I may be writing the world's most inefficient synth, but in any case the sound is smooth with no skips.
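For concreteness, here is a minimal sketch of what "getting libpd fired up" involves, using libpd's plain C API. The patch name, path, and one-shot render loop are stand-ins for illustration, not the actual Pythagoras code; in a real iOS app the processing call would live inside the audio unit's render callback.

```c
/* Minimal libpd bring-up sketch.  "synth.pd" and the paths are placeholders. */
#include <stdio.h>
#include "z_libpd.h"

#define SAMPLE_RATE  44100
#define OUT_CHANNELS 2

int main(void) {
    if (libpd_init() != 0) {
        fprintf(stderr, "libpd_init failed\n");
        return 1;
    }
    libpd_init_audio(0, OUT_CHANNELS, SAMPLE_RATE);

    /* load the synth patch (name and directory are hypothetical) */
    void *patch = libpd_openfile("synth.pd", ".");
    if (!patch) {
        fprintf(stderr, "could not open patch\n");
        return 1;
    }

    /* turn DSP on: the equivalent of sending [; pd dsp 1( */
    libpd_start_message(1);
    libpd_add_float(1.0f);
    libpd_finish_message("pd", "dsp");

    /* render one 256-frame buffer: Pd computes 64-sample blocks,
       so 256 frames is 4 ticks */
    float in[64] = {0};                 /* no input channels; never read */
    float out[256 * OUT_CHANNELS];
    int ticks = 256 / libpd_blocksize();
    libpd_process_float(ticks, in, out);

    libpd_closefile(patch);
    return 0;
}
```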
Mugician was painful to write, in large part because I was learning signal processing while making the efficiency tradeoffs between reverb, harmonics, and the rest of the system. I managed to get latency down to around 30ms. Many things come into play to make that happen. The most important of all is the audio buffer size. Since a buffer must finish playing before the next one can start, a change can only be heard at the next buffer boundary, so buffers must be as short as performance allows; I use 256-sample buffers in my app so that responses are heard right away. There is a tradeoff, however: as you shrink the buffer toward size 1, CPU usage gets out of control and you end up delivering buffers late. When you are late, there is a gap of silence, which shows up as clicks in the sound: impulses. There is a lesser-known factor to worry about as well. The touch screen of an iPad physically takes something like 12ms to detect touches, and then the touches have to be discovered by the OS and handed to the application for processing. In addition, when the app gets these touches it tells the sound engine to change a parameter like volume or frequency as fast as possible, and depending on the implementation, more than just the buffer length may determine how long that takes; think of thread scheduling, etc.
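To put rough numbers on that budget, here is the arithmetic, assuming the usual 44.1 kHz sample rate (the exact rate isn't stated above; the 12ms touch figure is the one quoted):

```c
/* Rough latency budget for 256-sample buffers at an assumed 44.1 kHz. */
#include <stdio.h>

int main(void) {
    const double sample_rate   = 44100.0;
    const double buffer_frames = 256.0;
    const double touch_scan_ms = 12.0;   /* hardware touch detection time */

    /* one buffer's worth of audio */
    double buffer_ms = 1000.0 * buffer_frames / sample_rate;   /* ~5.8 ms */

    /* a touch that just misses a boundary waits out the current buffer and
       is heard when the next one plays: roughly two buffers in the worst case */
    double audio_path_ms = 2.0 * buffer_ms;                    /* ~11.6 ms */

    printf("per-buffer time: %.1f ms\n", buffer_ms);
    printf("touch scan + audio path: about %.1f ms, before OS delivery "
           "and thread scheduling are added on top\n",
           touch_scan_ms + audio_path_ms);
    return 0;
}
```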
I have since figured out what I was doing wrong with latency when using libpd, and Pythagoras's original goals are being pursued now.
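The post doesn't say what the mistake actually was, but the knob that most directly sets libpd's share of the latency is how many 64-sample Pd ticks get rendered per audio callback, alongside how quickly touches become control messages. Here is a hedged sketch of that interplay; the callback signature and the "freq"/"vol" receive names are made up for illustration, not taken from the real patch:

```c
/* Hypothetical render callback and touch handler for a libpd-driven synth. */
#include "z_libpd.h"

/* frames per callback is decided by the OS from the buffer duration you
   request; 256 frames matches the buffer size discussed above */
void render_callback(float *out, int frames) {
    /* 256 frames / 64-frame Pd block = 4 ticks; rendering more ticks than the
       callback needs just queues up audio, i.e. adds latency */
    int ticks = frames / libpd_blocksize();
    float silence_in[64] = {0};          /* no audio input in this sketch */
    libpd_process_float(ticks, silence_in, out);
}

/* touch handling sends control changes straight into the patch; "freq" and
   "vol" are placeholder receive names */
void touch_moved(float frequency_hz, float volume) {
    libpd_float("freq", frequency_hz);
    libpd_float("vol", volume);
}
```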