Tuesday, April 26, 2011

iPhone programming - How to play a tone at a given frequency

Many novice developers expect there to be some simple way to play a tone of a chosen frequency, perhaps through an API along the lines of playTone(someFrequency, someDuration). However, nothing like this is available in the stock iOS SDK frameworks.

It's easy to generate sine waves if one understands sampled sound. The most common sample format is 44100 samples per second, with each sample a signed 16-bit integer between -32768 and 32767. If one could push samples to the iOS audio system, it would look like this:

int len = secondsDuration * sampleRate;   // total number of samples to generate
for (int i = 0; i < len; i++) {
  short sample = volume * sinf(2.0*M_PI * i * myFrequency/sampleRate);
  sendToAudio(sample);   // hypothetical push-style call; no such API exists
}

However, iOS and Cocoa Touch use the event-driven design pattern. An app can't push samples to the audio system. Instead, the audio system asks you for some number of samples when it is good and ready. Your app just sets up the audio and waits to get called. Here is an example RemoteIO audio buffer callback:

float f  = myDesiredFrequency;
float v  = myVolume;  // in the range 0.0 to 32767.0
float sr = sampleRate;
float a, da;
int   b, n;
short int *p;

a = ...                    // phase angle, carried over from the previous callback (see note below)
da = 2.0 * M_PI * f / sr;  // delta phase per sample

for (b = 0; b < ioData->mNumberBuffers; b++) {
  // number of 16 bit packets in each buffer
  n = ioData->mBuffers[b].mDataByteSize / 2; 
  // pointer to sample buffer
  p = (short int *)ioData->mBuffers[b].mData; 
  
  for (int i = 0; i < n; i++) {
    p[i] = v * sinf(a);   // single precision
    a = a + da;    // update phase
    if (a > 2.0 * M_PI) { 
      a -= 2.0 * M_PI;   // and range reduce
    }
  }
}

Note that the above code snippet doesn't show where the variable a, the phase, is initialized. The phase angle a should be saved between callbacks so that there is no discontinuity in the sine wave's phase from one audio buffer to the next. I'll leave the details as an exercise for the student, but one possible shape of the solution is sketched below.
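One way to do it (a rough sketch, not the actual code from this post; the ToneState struct and ToneRenderCallback names are made up for illustration) is to keep the phase in a state struct passed as the callback's inRefCon, restore it at the top of the callback, and store it back at the end:

typedef struct {
  float phase;   // current phase angle, carried across callbacks
} ToneState;

static OSStatus ToneRenderCallback(void                       *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp       *inTimeStamp,
                                   UInt32                      inBusNumber,
                                   UInt32                      inNumberFrames,
                                   AudioBufferList            *ioData)
{
  ToneState *state = (ToneState *)inRefCon;
  float a = state->phase;   // restore the phase saved by the previous callback

  // ... fill ioData with samples exactly as in the loops above,
  //     advancing and range reducing a for every sample written ...

  state->phase = a;         // save the phase for the next callback
  return noErr;
}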

I used to question including a call to the transcendental function sin() inside an audio inner loop as something far too slow for an audio callback. But a little benchmarking on an actual iPhone showed that calling the single precision sinf() for every audio sample actually uses less than 1% of the CPU time, and really isn't that much slower than something like an interpolated wave table lookup (sketched below for comparison).
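For reference, an interpolated wave table lookup replaces the sinf() call with a table read plus a linear interpolation. Here's a minimal sketch (the table size and function names are just illustrative choices, not taken from any benchmark in this post):

#define TABLE_SIZE 1024
static float sineTable[TABLE_SIZE + 1];   // one extra entry simplifies interpolation

static void initSineTable(void) {
  for (int i = 0; i <= TABLE_SIZE; i++) {
    sineTable[i] = sinf(2.0 * M_PI * i / TABLE_SIZE);
  }
}

// approximate sinf(a) for a phase a in the range 0 to 2*pi (exclusive)
static inline float tableSin(float a) {
  float pos  = a * (TABLE_SIZE / (2.0 * M_PI));   // map the phase to a table position
  int   idx  = (int)pos;
  float frac = pos - idx;
  return sineTable[idx] + frac * (sineTable[idx + 1] - sineTable[idx]);
}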

If you don't know how to set up the RemoteIO Audio Unit to call your app, here's a blog article on Using the RemoteIO Audio Unit (by Michael Tyson at Tasty Pixel), and here is Apple's Documentation on the RemoteIO Audio Unit. Don't forget to set up a suitable audio session for your app as well, using the iOS Audio Session APIs.
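Something along these lines (error checking omitted; the toneUnit variable, the ToneRenderCallback name, and the mono 16-bit format are assumptions that should match your own callback):

#include <AudioUnit/AudioUnit.h>
#include <AudioToolbox/AudioToolbox.h>

static AudioComponentInstance toneUnit;

static void setupAudio(void) {
  // Activate a simple playback audio session
  AudioSessionInitialize(NULL, NULL, NULL, NULL);
  UInt32 category = kAudioSessionCategory_MediaPlayback;
  AudioSessionSetProperty(kAudioSessionProperty_AudioCategory,
                          sizeof(category), &category);
  AudioSessionSetActive(true);

  // Find and create the RemoteIO audio unit
  AudioComponentDescription desc = { 0 };
  desc.componentType         = kAudioUnitType_Output;
  desc.componentSubType      = kAudioUnitSubType_RemoteIO;
  desc.componentManufacturer = kAudioUnitManufacturer_Apple;
  AudioComponent comp = AudioComponentFindNext(NULL, &desc);
  AudioComponentInstanceNew(comp, &toneUnit);

  // Ask the unit to call ToneRenderCallback whenever it needs output samples
  AURenderCallbackStruct cb;
  cb.inputProc       = ToneRenderCallback;   // the render callback sketched above
  cb.inputProcRefCon = NULL;                 // or a pointer to your state struct
  AudioUnitSetProperty(toneUnit, kAudioUnitProperty_SetRenderCallback,
                       kAudioUnitScope_Input, 0, &cb, sizeof(cb));

  // 16 bit signed integer mono PCM at 44100 samples per second,
  // matching the short int samples written by the callback above
  AudioStreamBasicDescription fmt = { 0 };
  fmt.mSampleRate       = 44100.0;
  fmt.mFormatID         = kAudioFormatLinearPCM;
  fmt.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
  fmt.mBitsPerChannel   = 16;
  fmt.mChannelsPerFrame = 1;
  fmt.mFramesPerPacket  = 1;
  fmt.mBytesPerFrame    = 2;
  fmt.mBytesPerPacket   = 2;
  AudioUnitSetProperty(toneUnit, kAudioUnitProperty_StreamFormat,
                       kAudioUnitScope_Input, 0, &fmt, sizeof(fmt));

  AudioUnitInitialize(toneUnit);
  AudioOutputUnitStart(toneUnit);
}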
