Monday, July 2, 2012

How to set up OpenAL and play a sound file on iOS

OpenAL is a powerful library that provides audio playback, 3D sound and other cool stuff. This post will help you get up and running quickly with OpenAL on iOS, but it only scratches the surface of what you can do. For a more detailed look at this topic, check out the excellent book Beginning iPhone Games Development.

A complete Xcode project based on this post can be found on github.

OK, here we go.

First, you'll need a Core Audio Format (.caf) sound file that is little-endian, 16-bit, and has a sampling rate of 44,100 Hz. OS X comes with a utility called afconvert that can be used to convert audio files into the proper format:

/usr/bin/afconvert -f caff -d LEI16@44100 Sosumi.aiff Sosumi.caf

Once you've got your .caf file, go ahead and add it to your Xcode project. Then modify your build configuration to link against the OpenAL.framework and AudioToolbox.framework frameworks.

Now you're ready to import OpenAL headers:

#import <OpenAL/al.h>
#import <OpenAL/alc.h>
#include <AudioToolbox/AudioToolbox.h>

To set up OpenAL, you will need a device, a context, a source and a buffer.

The device represents a physical sound device, such as a sound card. Open a device with alcOpenDevice, passing NULL to indicate you wish to use the default device:

ALCdevice* openALDevice = alcOpenDevice(NULL);

You can use alGetError at any time to check whether the most recent OpenAL call failed (reading the error also clears it):

ALenum error = alGetError();

if (AL_NO_ERROR != error) {
    NSLog(@"Error %d when attempting to open device", error);
}

The context keeps track of the current OpenAL state. Use alcCreateContext to create a context and associate it with your device:

ALCcontext* openALContext = alcCreateContext(openALDevice, NULL);

Then make the context current:

alcMakeContextCurrent(openALContext);

A source in OpenAL emits sound. Use alGenSources to generate one or more sources; it fills in the identifiers (a single ALuint, or an array of them) that you'll use to refer to each source from then on:

ALuint outputSource;
alGenSources(1, &outputSource);

You can set various source parameters using alSourcef. For example, you can set the pitch and gain:

alSourcef(outputSource, AL_PITCH, 1.0f);
alSourcef(outputSource, AL_GAIN, 1.0f);

Buffers hold audio data. Use alGenBuffers to generate one or more buffers:

ALuint outputBuffer;
alGenBuffers(1, &outputBuffer);

Now we have a buffer we can put audio data into, a source that can emit that data, a device we can use to output the sound, and a context to keep track of state. The next step is to get audio data into the buffer. First we'll get a reference to the audio file:

NSString* filePath = [[NSBundle mainBundle] pathForResource:@"Sosumi" ofType:@"caf"];
NSURL* fileUrl = [NSURL fileURLWithPath:filePath];

Now we need to open the file and get its AudioFileID, which is an opaque identifier that Audio File Services uses:

AudioFileID afid;
OSStatus openResult = AudioFileOpenURL((__bridge CFURLRef)fileUrl, kAudioFileReadPermission, 0, &afid);
    
if (0 != openResult) {
    NSLog(@"An error occurred when attempting to open the audio file %@: %d", filePath, (int)openResult);
    return;
}

A couple of things to note about this last bit of code. First, the __bridge cast: this is only necessary if you are compiling with ARC (introduced alongside iOS 5). Second, the literal value 0: this is the file type hint, and passing 0 means we're not providing one. We don't need a hint here because the .caf extension is enough for Audio File Services to identify the format; if you did need one, you could pass a constant such as kAudioFileCAFType.

Now we have to determine the size of the audio data. To do this, we will use AudioFileGetProperty. This function takes the AudioFileID we got from AudioFileOpenURL, a constant identifying the property we're interested in (see the complete list), a reference to a variable holding the size of your output buffer, and a pointer to the buffer itself. You pass the size by reference because AudioFileGetProperty updates it to the number of bytes it actually wrote.

UInt64 fileSizeInBytes = 0;
UInt32 propSize = sizeof(fileSizeInBytes);

OSStatus getSizeResult = AudioFileGetProperty(afid, kAudioFilePropertyAudioDataByteCount, &propSize, &fileSizeInBytes);
    
if (0 != getSizeResult) {
    NSLog(@"An error occurred when attempting to determine the size of audio file %@: %d", filePath, (int)getSizeResult);
}
    
UInt32 bytesRead = (UInt32)fileSizeInBytes;

Note that the value of kAudioFilePropertyAudioDataByteCount is an unsigned 64-bit integer, but I've downcast it to an unsigned 32-bit integer because AudioFileReadBytes, which we're about to use, takes a UInt32. Hopefully your audio files aren't large enough for this to matter. ;-)

OK, now we're ready to read data from the file and put it into the output buffer. The first thing we have to do is allocate some memory to hold the file contents:

void* audioData = malloc(bytesRead);

Then we read the file. We pass the AudioFileID, false to indicate that we don't want to cache the data, 0 to indicate that we want to read the file from the beginning, a reference to bytesRead, and the pointer to the memory location where the file data should be placed. After the data is read, bytesRead will contain the actual number of bytes read.

OSStatus readBytesResult = AudioFileReadBytes(afid, false, 0, &bytesRead, audioData);
    
if (0 != readBytesResult) {
    NSLog(@"An error occurred when attempting to read data from audio file %@: %d", filePath, (int)readBytesResult);
}

Now we can close the file:

AudioFileClose(afid);

And we can copy the data into our OpenAL buffer. Note that AL_FORMAT_STEREO16 matches our 16-bit stereo file; for a mono file you'd use AL_FORMAT_MONO16. The last argument is the sample rate, which must match the file (44,100 Hz here):

alBufferData(outputBuffer, AL_FORMAT_STEREO16, audioData, bytesRead, 44100);

Now that we've copied the data we can clean it up:

if (audioData) {
    free(audioData);
    audioData = NULL;
}

Then we can attach the buffer to the source:

alSourcei(outputSource, AL_BUFFER, outputBuffer);

At long last, the source can emit the sound data contained in the buffer!

alSourcePlay(outputSource);

When you're ready to clean up, you should delete your sources and buffers, destroy the context, and close the device:

alDeleteSources(1, &outputSource);
alDeleteBuffers(1, &outputBuffer);
alcDestroyContext(openALContext);
alcCloseDevice(openALDevice);

I had trouble getting the sound to play when I tried to initialize OpenAL and play the sound inside my viewDidLoad method. So I created a button and used its action to play the sound. Then everything worked fine.
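In case it helps, here's a minimal sketch of that approach (the playSound: action name and _outputSource ivar are my own illustration, not from a specific project):

```
// In your view controller, wired to the button's Touch Up Inside
// event in Interface Builder. Assumes _outputSource is the ALuint
// generated earlier, stored in an instance variable after the setup
// above has run.
- (IBAction)playSound:(id)sender
{
    alSourcePlay(_outputSource);
}
```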

If you have any questions or feedback, please feel free to comment. Thanks!
