SuperCollider is a framework for making OSC-based synthesizers. It is low-level enough that you can write a synth definition dynamically and send it to the device to be played. I downloaded the SuperCollider distribution for OS X and launched the SuperCollider binary. Here's a bit from the example, typed into the language environment, which acts as a client to the synth server:
//boot the synth server that we will send messages to
s.boot;

//Define a simple synthesizer called "sine"
//that just puts out an 800hz sine wave on the left channel
//... write this definition to disk for later use
(
SynthDef("sine", { arg freq = 800;
    var osc;
    osc = SinOsc.ar(freq, 0, 0.1);
    Out.ar(0, osc);
}).writeDefFile;
)

//send it to the synth server
//highlight all code typed so far and press shift-return (the real 'enter' key)
s.sendSynthDef("sine");

//tell the synth to play the sound!!! - but set the freq parameter to 440
s.sendMsg("/s_new", "sine", x = s.nextNodeID, 1, 1, "freq", 440);

//to shut the synth up, press command and '.' at the same time.
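Under the hood, `s.sendMsg` is just encoding an OSC packet and sending it over UDP to the server. As a rough sketch of what goes over the wire (this follows the OSC 1.0 encoding rules, not the SuperCollider source; the node ID 1000 is an arbitrary stand-in for `s.nextNodeID`), the `/s_new` message above could be built by hand in Python:

```python
import struct

def osc_string(s):
    """Null-terminate and pad to a 4-byte boundary, per the OSC spec."""
    b = s.encode("ascii") + b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def osc_int(i):
    """32-bit big-endian integer argument."""
    return struct.pack(">i", i)

def s_new(defname, node_id, add_action, target, *controls):
    """Build an /s_new message: address, type tags, then arguments."""
    args = [defname, node_id, add_action, target, *controls]
    typetags = "," + "".join("s" if isinstance(a, str) else "i" for a in args)
    msg = osc_string("/s_new") + osc_string(typetags)
    for a in args:
        msg += osc_string(a) if isinstance(a, str) else osc_int(a)
    return msg

packet = s_new("sine", 1000, 1, 1, "freq", 440)
# send with something like:
#   socket.socket(socket.AF_INET, socket.SOCK_DGRAM) \
#         .sendto(packet, ("127.0.0.1", 57110))   # scsynth's default port
```

The point is that any OSC-capable client, in any language, can drive the server this way; sclang is just one such client.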
Wow! SuperCollider is 10 years old and is now an open-source project that has been ported to multiple platforms (I believe ARM is one of them). Given its maturity, it probably performs well, aside perhaps from not taking advantage of special FFT acceleration; iOS 4.2 supports the Accelerate framework. If you could embed the SuperCollider server into an iPad instrument program, you could make a straightforward synthesizer that is easily re-programmable. Given that OS X supports this sort of use case with its multi-tasking, it might even be possible to just get somebody to ship a SuperCollider server and let the iPad instruments use it as yet another OSC destination.
Is this crazy? Is anybody doing it? This would be the way forward if there are no major performance issues. Under those circumstances, app developers would mostly just write OSC apps and tell you to point them at something external like Ableton if you have it, or at the embedded SC server if you need to run stand-alone.
Oh, and... I was looking for SuperCollider books online. One is scheduled to come out in March 2011. Books, especially first books, on a computing topic are a good sign of what is coming next.
prediction: you're working at Apple before the end of the year :)
I think that honor will go to Oren Shomron. A dude in my office is such an Apple fanatic that you would think he's trying to boost the value of his Apple shares. ;-)
There is a lot of red tape in the way of creating the sort of facility you describe.
One major problem with using SuperCollider in an iPad app is that SuperCollider is GPL-licensed, and as such it's generally considered incompatible with the Apple App Store terms of service. To publish such an app through the App Store, you would have to get the copyright holders for SuperCollider to agree to relicense the code under a different license that is compatible with Apple's terms.
The other problem getting in the way of creating an iOS synthesis engine is that iOS doesn't support background services, nor general interprocess communication beyond URL handlers, the pasteboard, and the document transfer protocols.
Exactly. So I went with libpd, because it is BSD-licensed (libpd embeds Pure Data, the open-source sibling of Max/MSP). My new app, Pythagoras, compiles libpd in alongside Mugician's sound engine. The whole libpd voice engine is now a directory of files rooted in a *.pd file. This is really awesome! ... except the latency is just too high. Maybe somebody will have some wisdom that gets me past this problem.
The theoretical buffer-size-imposed latency is really low, and the engine never skips. This is great. But changes to values that get read into my synth (via libpd_float, or something like that) appear to get enqueued, giving an overall response time of over 150ms. I haven't figured out why yet.
Mugician is about 30ms overall (with 256-frame sample buffers at 44.1kHz), pretty reasonable for an iPad instrument. Inside of Pythagoras, that same sound engine is still below 60ms.
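For reference, the buffer-size component of that latency is easy to compute. A back-of-the-envelope sketch, assuming each buffer holds 256 sample frames:

```python
sample_rate = 44100.0   # Hz
buffer_frames = 256     # assumption: sample frames per buffer

# one buffer must be filled before it can be heard
per_buffer_ms = buffer_frames / sample_rate * 1000.0
print(f"{per_buffer_ms:.1f} ms per buffer")

# the rest of a ~30 ms overall figure comes from touch input,
# additional queued buffers, and output hardware -- not buffer size alone
```

At 5.8 ms per buffer, a 150 ms response means messages are effectively waiting behind roughly two dozen buffers' worth of audio, which is why it feels like enqueueing rather than raw DSP cost.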
Does James McCartney still work at Apple? It is a little silly that his thing doesn't run on their thing...
Occupy SuperCollider
I have no idea. I'm just an iOS dev from outside Apple. :-) I abandoned SuperCollider and Pd in favor of going back to my own custom engine. Now that background MIDI exists, I think the only reasonable course of action is to move forward with a new controller that completely throws out the custom sound engine and enhances MIDI to the point that it actually *works* correctly. (I.e., backwards compatible for an OK rendition on common synths, but with NRPNs added so that it's absolutely perfect on iOS.) This is what the note-tie NRPNs described in my MIDI recommendations are about.
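For concreteness, an NRPN is just a sequence of four standard control-change messages: CC 99/98 select a 14-bit parameter number, then CC 6/38 carry a 14-bit value. Synths that don't understand the parameter simply ignore it, which is what makes the backwards-compatible scheme work. A minimal sketch; the NOTE_TIE_PARAM number below is hypothetical, not the actual assignment from the MIDI recommendations mentioned above:

```python
def nrpn(channel, param, value):
    """Encode a 14-bit NRPN as four (status, controller, data) CC messages.
    channel is 0-15; param and value are each 0-16383."""
    status = 0xB0 | (channel & 0x0F)   # control-change status byte
    return [
        (status, 99, (param >> 7) & 0x7F),  # NRPN MSB (CC 99)
        (status, 98, param & 0x7F),         # NRPN LSB (CC 98)
        (status, 6,  (value >> 7) & 0x7F),  # Data Entry MSB (CC 6)
        (status, 38, value & 0x7F),         # Data Entry LSB (CC 38)
    ]

NOTE_TIE_PARAM = 0x0001  # hypothetical parameter number, for illustration only
msgs = nrpn(0, NOTE_TIE_PARAM, 1)
```

A receiver that knows the parameter can treat the following note-on as a continuation of the previous note; everything else just plays it as a new note.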
SuperCollider's GPL licensing is a problem. Pd can be made to work from a licensing standpoint, but I had a lot of trouble getting things to work with libpd once I got the basic skeleton going. No matter, though; it's better that controllers stick to being controllers, and that we fix the interop issues between controllers, synths, and DAWs.