A complete Xcode project based on this post can be found on GitHub.
OK, here we go.
First, you'll need a Core Audio Format (.caf) sound file that is little-endian, 16-bit, and has a sampling rate of 44,100 Hz. OS X comes with a utility called afconvert that can be used to convert audio files into the proper format:
/usr/bin/afconvert -f caff -d LEI16@44100 Sosumi.aiff Sosumi.caf
Once you've got your .caf file, go ahead and add it to your Xcode project. Then modify your build configuration to link the OpenAL.framework and AudioToolbox.framework libraries.
Now you're ready to import OpenAL headers:
#import <OpenAL/al.h>
#import <OpenAL/alc.h>
#import <AudioToolbox/AudioToolbox.h>
The device represents a physical sound device, such as a sound card. Open a device with alcOpenDevice, passing NULL to indicate you wish to use the default device:
ALCdevice* openALDevice = alcOpenDevice(NULL);
You can use alGetError at any time to see if there was a problem with the last OpenAL call you made:
ALenum error = alGetError();
if (AL_NO_ERROR != error) {
    NSLog(@"Error %d when attempting to open device", error);
}
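alGetError only returns a numeric code, which isn't very readable in a log. As a debugging convenience you might map codes to their symbolic names. This helper is not part of OpenAL; the constant values below simply mirror the ones defined in al.h so the sketch stands alone:

```c
#include <string.h>

/* Values mirror the AL_* error constants from <OpenAL/al.h>,
   redefined here only so this sketch compiles on its own. */
#define AL_NO_ERROR          0
#define AL_INVALID_NAME      0xA001
#define AL_INVALID_ENUM      0xA002
#define AL_INVALID_VALUE     0xA003
#define AL_INVALID_OPERATION 0xA004
#define AL_OUT_OF_MEMORY     0xA005

/* Map an OpenAL error code to its symbolic name for logging. */
const char *alErrorName(int error) {
    switch (error) {
        case AL_NO_ERROR:          return "AL_NO_ERROR";
        case AL_INVALID_NAME:      return "AL_INVALID_NAME";
        case AL_INVALID_ENUM:      return "AL_INVALID_ENUM";
        case AL_INVALID_VALUE:     return "AL_INVALID_VALUE";
        case AL_INVALID_OPERATION: return "AL_INVALID_OPERATION";
        case AL_OUT_OF_MEMORY:     return "AL_OUT_OF_MEMORY";
        default:                   return "unknown AL error";
    }
}
```

With this in place the log line above could read NSLog(@"Error %s when attempting to open device", alErrorName(error)); instead of printing a bare integer.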
Next, create a context for the device. The context is where OpenAL does its work:
ALCcontext* openALContext = alcCreateContext(openALDevice, NULL);
Then make the context current:
alcMakeContextCurrent(openALContext);
A source in OpenAL emits sound. Use alGenSources to generate one or more sources; it allocates them and fills in an identifier for each (a single ALuint here, or an array if you request more than one):
ALuint outputSource;
alGenSources(1, &outputSource);
Set the source's pitch and gain (volume); 1.0f is the default for both:
alSourcef(outputSource, AL_PITCH, 1.0f);
alSourcef(outputSource, AL_GAIN, 1.0f);
A buffer holds the audio data that a source plays. Generate one the same way:
ALuint outputBuffer;
alGenBuffers(1, &outputBuffer);
Next, get the sound file's URL from the application bundle and open it with Audio File Services:
NSString* filePath = [[NSBundle mainBundle] pathForResource:@"Sosumi" ofType:@"caf"];
NSURL* fileUrl = [NSURL fileURLWithPath:filePath];
AudioFileID afid;
OSStatus openResult = AudioFileOpenURL((__bridge CFURLRef)fileUrl, kAudioFileReadPermission, 0, &afid);
if (0 != openResult) {
    NSLog(@"An error occurred when attempting to open the audio file %@: %ld", filePath, (long)openResult);
    return;
}
Now we have to determine the size of the audio data. To do this, we use AudioFileGetProperty. This function takes the AudioFileID we got from AudioFileOpenURL, a constant identifying the property we're interested in (see the complete list in the Audio File Services reference), a reference to a variable holding the size of the property value, and a reference to a variable that receives the value itself. The size is passed by reference because AudioFileGetProperty sets it to the number of bytes it actually wrote.
UInt64 fileSizeInBytes = 0;
UInt32 propSize = sizeof(fileSizeInBytes);
OSStatus getSizeResult = AudioFileGetProperty(afid, kAudioFilePropertyAudioDataByteCount, &propSize, &fileSizeInBytes);
if (0 != getSizeResult) {
    NSLog(@"An error occurred when attempting to determine the size of audio file %@: %ld", filePath, (long)getSizeResult);
}
UInt32 bytesRead = (UInt32)fileSizeInBytes;
OK, now we're ready to read data from the file and put it into the output buffer. The first thing we have to do is allocate some memory to hold the file contents:
void* audioData = malloc(bytesRead);
Then we read the file. We pass the AudioFileID, false to indicate that we don't want to cache the data, 0 to indicate that we want to read the file from the beginning, a reference to bytesRead, and the pointer to the memory location where the file data should be placed. After the data is read, bytesRead will contain the actual number of bytes read.
OSStatus readBytesResult = AudioFileReadBytes(afid, false, 0, &bytesRead, audioData);
if (0 != readBytesResult) {
    NSLog(@"An error occurred when attempting to read data from audio file %@: %ld", filePath, (long)readBytesResult);
}
AudioFileClose(afid);
And we can copy the data into our OpenAL buffer:
alBufferData(outputBuffer, AL_FORMAT_STEREO16, audioData, bytesRead, 44100);
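One caveat: AL_FORMAT_STEREO16 is correct for our stereo, 16-bit file, but a mono file copied into a stereo buffer will play back wrong. If your files vary, you can query kAudioFilePropertyDataFormat for an AudioStreamBasicDescription and pick the format from its mChannelsPerFrame and mBitsPerChannel fields. A minimal sketch of that mapping, with the AL constants redefined here only so it compiles on its own:

```c
#include <stdint.h>

/* Values mirror the AL_FORMAT_* constants from <OpenAL/al.h>. */
#define AL_FORMAT_MONO8    0x1100
#define AL_FORMAT_MONO16   0x1101
#define AL_FORMAT_STEREO8  0x1102
#define AL_FORMAT_STEREO16 0x1103

/* Return the OpenAL buffer format for a given channel count and bit
   depth, or -1 if the combination has no matching AL format. */
int alFormatFor(uint32_t channelsPerFrame, uint32_t bitsPerChannel) {
    if (channelsPerFrame == 1) {
        if (bitsPerChannel == 8)  return AL_FORMAT_MONO8;
        if (bitsPerChannel == 16) return AL_FORMAT_MONO16;
    }
    if (channelsPerFrame == 2) {
        if (bitsPerChannel == 8)  return AL_FORMAT_STEREO8;
        if (bitsPerChannel == 16) return AL_FORMAT_STEREO16;
    }
    return -1;
}
```

In the real project you'd fill the two arguments from the AudioStreamBasicDescription returned by AudioFileGetProperty and pass the result to alBufferData in place of the hard-coded constant.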
Now that we've copied the data we can clean it up:
if (audioData) {
    free(audioData);
    audioData = NULL;
}
Then attach the buffer to the source:
alSourcei(outputSource, AL_BUFFER, outputBuffer);
At long last, the source can emit the sound data contained in the buffer!
alSourcePlay(outputSource);
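Note that alSourcePlay returns immediately; it does not block until the sound finishes, so if you run the cleanup code below right away you'll hear nothing. You can poll alGetSourcei with AL_SOURCE_STATE until it reports AL_STOPPED, or simply compute the clip's duration from the byte count; for uncompressed PCM like our 16-bit, 44,100 Hz file that's simple arithmetic. A sketch of the duration calculation (this helper is mine, not part of the post's project):

```c
#include <stdint.h>

/* Duration in seconds of an uncompressed PCM clip: total bytes divided
   by bytes consumed per second (channels * bytes-per-sample * rate). */
double pcmDurationSeconds(uint32_t byteCount, uint32_t channels,
                          uint32_t bitsPerChannel, uint32_t sampleRate) {
    uint32_t bytesPerFrame = channels * (bitsPerChannel / 8);
    return (double)byteCount / (double)(bytesPerFrame * sampleRate);
}
```

For the Sosumi example you'd call pcmDurationSeconds(bytesRead, 2, 16, 44100) and delay cleanup at least that long.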
When you're ready to clean up you should delete your source and buffers, destroy the context and close the device:
alDeleteSources(1, &outputSource);
alDeleteBuffers(1, &outputBuffer);
alcDestroyContext(openALContext);
alcCloseDevice(openALDevice);
If you have any questions or feedback, please feel free to comment. Thanks!
Good idea :-)
https://github.com/boringuser/devnulldb/tree/master/OpenAL-1
Thanks for this, pure gold =)
I just tried it out and can confirm it is working on iOS 6 and Xcode 4.5.1.
You're welcome :-)
Thank you so much!! This is EXACTLY what I was looking for!! :D
Glad to hear it :-)
Thanks for your blog. I am trying to slow down the audio, i.e. stretch its length. I hope it can be done by lowering AL_PITCH (e.g. 0.5f) in "alSourcef(outputSource, AL_PITCH, 1.0f);". After slowing down the audio, I have to combine it with video and store them together. Can I pass the OpenAL audio buffer (or something else) to AVFoundation to combine the audio and video and save the result? Or is it possible to save the audio using OpenAL directly?
I'm afraid I don't know the answer to your question... You might check out Audio File Services to see what options you have for saving audio (http://developer.apple.com/library/ios/#documentation/MusicAudio/Reference/AudioFileConvertRef/Reference/reference.html).
DeleteGreat post! I started using it last year with IOS6. Now on xcode5 / IOS 7 I'm getting an error -39 at:
OSStatus readBytesResult = AudioFileReadBytes(afid, false, 0, &bytesRead, audioData);
if (0 != readBytesResult) {
NSLog(@"An error occurred when attempting to read data from audio file %@: %ld", filePath, readBytesResult);
}
Any thoughts on what may have changed or what I should investigate next?
Same error here. Have you found a solution?
DeleteAwesome little tutorial, great introduction to OpenAL. Thanks so much!
ReplyDeleteHi,
I have a problem I thought I'd ask about here.
I implemented this code in two of my apps.
Problem is if one app is running in the background (i.e. I launched it, then pressed the HOME button to launch the other one) - when I launch the other one 'alcOpenDevice' fails. It returns:
2013-12-04 11:39:23.866 Safari Puzzle[4132:4803] 11:39:23.866 ERROR: 185: Error creating aggregate audio device: 'nope'
2013-12-04 11:39:23.867 Safari Puzzle[4132:4803] 11:39:23.867 WARNING: 219: The input device is 0x39; 'AppleHDAEngineInput:1B,0,1,0:1'
2013-12-04 11:39:23.867 Safari Puzzle[4132:4803] 11:39:23.867 WARNING: 223: The output device is 0x2f; 'AppleHDAEngineOutput:1B,0,1,1:0'
2013-12-04 11:39:23.868 Safari Puzzle[4132:4803] 11:39:23.868 ERROR: 398: error 'nope'
2013-12-04 11:39:23.869 Safari Puzzle[4132:4503] 11:39:23.869 ERROR: 398: error -66680
2013-12-04 11:39:23.869 Safari Puzzle[4132:70b] 11:39:23.869 ERROR: >aurioc> 783: failed: -10851 (enable 2, outf< 2 ch, 44100 Hz, Int16, inter> inf< 2 ch, 0 Hz, Int16, inter>)
Any help, anyone ?
TIA
Shimon
I was developing an app that emits Morse code beeping sound based on given code words. Originally implemented this functionality with AVAudioPlayer, but it was unable to rapidly replay the sound file for continuous beeping. This article solved my issue! Thanks!
ReplyDeleteThank you for the short and sweet tutorial, it helped a ton!