Monday, July 30, 2012

How to set up OpenGL on iOS using GLKit

In this post we'll set up OpenGL ES on iOS and clear the screen, just like we did in a previous post, but this time we'll use GLKit to do it. I recommend checking out the previous post before you dig into this one, because it explains some of the OpenGL concepts that are glossed over here.

A finished Xcode project with the code from this post can be found on github.

GLKit is a framework built on top of OpenGL ES. Using it can save time because it reduces the amount of boilerplate code you have to write.

For this example, create a new Empty Application iOS project in Xcode. I'm using automatic reference counting -- you may have to change the code if you want to use manual memory management. I'm also using Xcode 4.4.

Add QuartzCore, OpenGLES and GLKit frameworks to your project (project settings, Build Phases, Link Binary With Libraries):
Now add a new Storyboard file to your project and adjust your project settings to make it your main storyboard. Storyboards allow you to create multiple scenes and specify how they are related. For this example we'll just use a single scene:
With the Storyboard open, check out the Object Library (in the Utilities drawer). You should see a GLKit View Controller. Drag this on to your Storyboard:
If you have the Navigator drawer open you should see the GLKit View Controller. Make sure it's selected:
Back in the Utilities drawer, under the Identity inspector, you should see a Custom Class section where GLKViewController is specified. This means that GLKViewController is the backing controller class for the view:
We'll need to change this to our own custom subclass. Add a new Objective-C class to your project named MyViewController. Make it a subclass of GLKViewController. Make sure to import GLKit.h in your header file:

#import <GLKit/GLKit.h>

@interface MyViewController : GLKViewController

@end


Now you can go back to your Storyboard and set your custom class:
While you're here you can click on the Connections inspector. Notice how your view controller's outlets have been configured to reference a GLKit View:
Now click on the GLKit View in your Scene:
Check out its Identity inspector. Notice how the custom class is GLKView:
When we added the GLKit View Controller to the scene it was automatically set up to manage a GLKView instance. This GLKView instance, in turn, will manage your framebuffer for you.

OK, let's get back to the code. Open up MyViewController.m. Let's add a property for an EAGLContext:

#import "MyViewController.h"

@interface MyViewController ()

@property (strong) EAGLContext *glContext;

@end


@implementation MyViewController

@synthesize glContext = _glContext;


Now let's add an empty viewDidLoad method to the MyViewController class:

- (void)viewDidLoad
{
    [super viewDidLoad];
}

Now let's fill it out. First, let's create our EAGLContext:

self.glContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

Here we've specified the use of OpenGL ES version 2. Let's check to make sure the context got created properly:

if (!self.glContext) {
    NSLog(@"Unable to create OpenGL context");
}

Next, make the context current:

[EAGLContext setCurrentContext:self.glContext];

Before leaving the viewDidLoad method we'll tell the GLKView instance (being managed by our view controller) about the context:

GLKView *view = (GLKView *)self.view;
view.context = self.glContext;

Let's finish off the MyViewController class with a simple mechanism for clearing the screen:

- (void)glkView:(GLKView *)view drawInRect:(CGRect)rect
{
    glClearColor(150.0/255.0, 200.0/255.0, 255.0/255.0, 1.0);
    glClear(GL_COLOR_BUFFER_BIT);
}

The method we're implementing here, glkView:drawInRect:, is part of the GLKViewDelegate protocol, which is used by GLKViewController. We can implement this method to draw whatever we want using OpenGL commands.

The last thing we need to do before running this application is to go into AppDelegate.m and modify our application:didFinishLaunchingWithOptions: method to simply return YES so our Storyboard gets used:

- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    return YES;
}

Launch the app, and you should see a lovely periwinkle screen!
Thanks for reading. Feel free to post comments, questions or suggestions.

Sunday, July 15, 2012

Using Blender on a MacBook

Blender is an amazing 3D modeling, animation, video editing, compositing and game-making tool that's designed to be used when sitting at a desk with a full keyboard (including numpad) and a three-button mouse. If you want to use Blender on a MacBook then you'll have to do a bit of extra work to set it up.

There are two main issues to solve when using Blender on a MacBook: newer MacBooks have no numpad, and OS X provides no built-in way of making a middle mouse click with the trackpad.

Two free tools will solve these problems.

First, the keyboard.

Most information I've seen on the web about using Blender on a laptop advises the use of the "Emulate Numpad" setting in Blender's user preferences. This setting causes Blender to act as though the regular number keys are in fact numpad keys. This means you can use the number keys to switch between various 3D views, but unfortunately you lose the ability to use the number keys for their original purpose, which is selecting layers.

I recommend you leave "Emulate Numpad" off. Instead, use KeyRemap4MacBook. This is a nifty preference pane that will let you use the fn key plus number keys to simulate numpad input. This means you can use the numbers to switch between layers in Blender, or use fn+numbers to switch between views. To enable this functionality, install KeyRemap4MacBook, go into System Preferences, open the KeyRemap4MacBook preference pane, and under the "Change Key" tab locate the "Change Num Key (1...0)" item. Click the little triangle to open the item, then check the "Fn+Number to KeyPad" preference. Now, whenever you press fn+<some number> it will be as though you used the numpad to make the key press.

Next, the mouse.

To get right mouse input just make sure you have "Secondary click" enabled in the Trackpad system preference pane. This lets you use two fingers to get RMB clicks.

The middle mouse button is a bit trickier. For this we'll need another piece of software, the very awesome BetterTouchTool.

After installing and running BTT you will see a little icon at the top of your screen that looks like a finger on a trackpad. Go into its preferences. Click "Basic Settings", and enable "Launch BetterTouchTool on startup" (if you want).

In Blender, middle mouse is used to move about the 3D view. We want to be able to hold MMB and move the mouse. It's pretty straightforward to enable MMB clicks in BTT, but being able to drag with MMB held is a little trickier.

To enable MMB drag in BTT, go into its preferences and click the "Advanced" button. You should now see a little magic wand icon at the top of the window labeled "Action Settings". Click this. Go to the "Stuff" tab, and select "Use special middleclick mode".

Special middleclick mode won't work if you don't have a middleclick gesture defined, so click on "Gestures", select "Global" in the menu on the left, and then click "Add new gesture". Set the Touchpad Gesture to "Three Finger Click" (not Three Finger Tap, that won't work) and set the Predefined Action to "Middleclick".

Now if you go into Blender, push the trackpad down with three fingers and keep it down while lifting two of them, you should be able to move your remaining finger around to navigate in the 3D view.

All in all, a little bit of effort and now you can use Blender full-on with just the MacBook keyboard and trackpad, no external devices required. Enjoy!

Thursday, July 12, 2012

How to set up OpenGL on iOS

OpenGL ES is a scaled-down version of the OpenGL API for 2D and 3D graphics programming on mobile devices. iOS supports versions 1.1 and 2.0 of the API. Version 1.1 is simpler; version 2.0 is more powerful and flexible. For this particular example I'll be using version 2.0 to create a bare-bones OpenGL app that does nothing but clear the screen with a particular color. There's a lot to it, and OpenGL does have a bit of a learning curve, but I think in the long run it's a rewarding thing to learn.

A complete Xcode project for this post can be found on github.

OK, let's go.

On iOS all OpenGL content is rendered to a special Core Animation layer called CAEAGLLayer. Our basic application will create a UIView subclass called GLView which will wrap a CAEAGLLayer. We do this by overriding UIView's layerClass method to specify that our view is backed by a CAEAGLLayer:

+ (Class)layerClass
{
    return [CAEAGLLayer class];
}

The CAEAGLLayer instance is managed for us by our superclass. We can retrieve it via the layer property:

CAEAGLLayer *glLayer;
glLayer = (CAEAGLLayer *)self.layer;

Once we have a reference to our CAEAGLLayer we can configure it. By default the layer is transparent. We have to change that. If the layer is not opaque performance will suffer:

glLayer.opaque = YES;

Now we need a context. In OpenGL the context is used to store current state. The class we use for this on iOS is EAGLContext. When we initialize the context we tell it which API version we wish to use:

EAGLContext *glContext;
glContext = [[EAGLContext alloc] initWithAPI:kEAGLRenderingAPIOpenGLES2];

We then make our context the current context so OpenGL will use it:

[EAGLContext setCurrentContext:glContext];

Next we need to ask OpenGL to create a renderbuffer for us. A renderbuffer is a chunk of memory where the rendered image for the current frame will be stored. To create one, we use the glGenRenderbuffers command:

GLuint renderbuffer;
glGenRenderbuffers(1, &renderbuffer);

Notice we passed in the address of a GLuint variable. This variable holds an identifier that we can use to refer to this particular renderbuffer.

Once we have a renderbuffer, we bind it to the GL_RENDERBUFFER target. All this means is that when we execute commands that involve the bound renderbuffer in some way, this particular renderbuffer will be used:

glBindRenderbuffer(GL_RENDERBUFFER, renderbuffer);

Now we need to allocate storage for the renderbuffer:

[glContext renderbufferStorage:GL_RENDERBUFFER fromDrawable:glLayer];

Notice how we didn't explicitly specify which renderbuffer to allocate storage for. Instead, we specified the GL_RENDERBUFFER target. This is a good example of how the OpenGL API is "stateful". OpenGL creates and manages a bunch of internal objects that we don't directly control. Instead, we use OpenGL commands to build up the current state, and then use other OpenGL commands to manipulate the current state. If we want to manipulate some other state, say if we wanted to work in another context or use a different renderbuffer, we would have to tell OpenGL to use this other state before executing commands that would manipulate it.

This is an important concept in OpenGL. When working with the API, we have to make sure that we're using the correct state. If I was managing multiple renderbuffers I would have to make sure I told OpenGL which one was bound to the GL_RENDERBUFFER target before executing commands that manipulate the currently bound renderbuffer. This simple example only has a single renderbuffer and a single context, but this is a fundamental aspect of OpenGL and important to keep in mind.

Now that we have our renderbuffer, we need a framebuffer. The framebuffer is a container object that holds a set of attachments (such as our renderbuffer) used when rendering the current frame:

GLuint framebuffer;
glGenFramebuffers(1, &framebuffer);

Now we bind the framebuffer to the GL_FRAMEBUFFER target so that framebuffer-related commands act upon it:

glBindFramebuffer(GL_FRAMEBUFFER, framebuffer);

Then we attach the renderbuffer to the framebuffer:

glFramebufferRenderbuffer(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0, GL_RENDERBUFFER, renderbuffer);

Notice we specified the GL_COLOR_ATTACHMENT0 slot. This is the framebuffer's attachment point for a renderbuffer. Sometimes renderbuffers are called "color buffers" or "color renderbuffers" because they're basically just a color image.

Now OpenGL is initialized and ready to use. Let's fill our renderbuffer with a solid color and push it to the screen.

First, we set the clear color:

glClearColor(150.0/255.0, 200.0/255.0, 255.0/255.0, 1.0);

When setting a color we specify values for four channels: red, green, blue and alpha (transparency). Each can have a value between 0 and 1. Normally we're using a color mode that has 8 bits per channel, so there are 256 distinct values any channel can have. Specifying a value between 0 and 255 is easy, and a lot of tools for working with color support this. To change these "human readable" values into the 0 to 1 range OpenGL expects, we simply divide by 255.

Now that we've specified the clear color we can fill the currently bound renderbuffer with it:

glClear(GL_COLOR_BUFFER_BIT);
Finally, we present the contents of the renderbuffer to the screen:

[glContext presentRenderbuffer:GL_RENDERBUFFER];

Take a look at the Xcode project on github to see the full GLView class. Take a look at the AppDelegate code to see how we attach GLView as a subview of the window. Run it, and you should see a blue screen. :-)

That's it for now. Comments and questions welcome. Thanks!

Monday, July 2, 2012

How to set up OpenAL and play a sound file on iOS

OpenAL is a powerful library that provides audio playback, 3D sound and other cool stuff. This post will help you get up and running quickly with OpenAL on iOS, but it only scratches the surface of what you can do. For a more detailed look at this topic, check out the excellent book Beginning iPhone Games Development.

A complete Xcode project based on this post can be found on github.

OK, here we go.

First, you'll need a Core Audio Format (.caf) sound file that is little-endian, 16-bit, and has a sampling rate of 44,100 Hz. OS X comes with a utility called afconvert that can be used to convert audio files into the proper format:

/usr/bin/afconvert -f caff -d LEI16@44100 Sosumi.aiff Sosumi.caf

Once you've got your .caf file, go ahead and add it to your Xcode project. Then modify your build configuration to link the OpenAL.framework and AudioToolbox.framework libraries.

Now you're ready to import OpenAL headers:

#import <OpenAL/al.h>
#import <OpenAL/alc.h>
#include <AudioToolbox/AudioToolbox.h>

To set up OpenAL, you will need a device, a context, a source and a buffer.

The device represents a physical sound device, such as a sound card. Create a device with alcOpenDevice, passing NULL to indicate you wish to use the default device:

ALCdevice* openALDevice = alcOpenDevice(NULL);

You can use alGetError at any time to see if there is a problem with the last OpenAL call you made:

ALenum error = alGetError();

if (AL_NO_ERROR != error) {
    NSLog(@"Error %d when attempting to open device", error);
}

The context keeps track of the current OpenAL state. Use alcCreateContext to create a context and associate it with your device:

ALCcontext* openALContext = alcCreateContext(openALDevice, NULL);

Then make the context current:

alcMakeContextCurrent(openALContext);
A source in OpenAL emits sound. Use alGenSources to generate one or more sources, noting their identifiers (a single ALuint here, or an array if you generate several):

ALuint outputSource;
alGenSources(1, &outputSource);

You can set various source parameters using alSourcef. For example, you can set the pitch and gain:

alSourcef(outputSource, AL_PITCH, 1.0f);
alSourcef(outputSource, AL_GAIN, 1.0f);

Buffers hold audio data. Use alGenBuffers to generate one or more buffers:

ALuint outputBuffer;
alGenBuffers(1, &outputBuffer);

Now we have a buffer we can put audio data into, a source that can emit that data, a device we can use to output the sound, and a context to keep track of state. The next step is to get audio data into the buffer. First we'll get a reference to the audio file:

NSString* filePath = [[NSBundle mainBundle] pathForResource:@"Sosumi" ofType:@"caf"];
NSURL* fileUrl = [NSURL fileURLWithPath:filePath];

Now we need to open the file and get its AudioFileID, which is an opaque identifier that Audio File Services uses:

AudioFileID afid;
OSStatus openResult = AudioFileOpenURL((__bridge CFURLRef)fileUrl, kAudioFileReadPermission, 0, &afid);
if (0 != openResult) {
    NSLog(@"An error occurred when attempting to open the audio file %@: %ld", filePath, (long)openResult);
}

A couple things to note about this last bit of code: First is the use of __bridge: This is only necessary if you are using ARC in iOS 5. Second is the literal value 0: This indicates that we're not providing a file type hint. We don't need to provide a hint because the extension will suffice.

Now we have to determine the size of the audio file's data. To do this, we will use AudioFileGetProperty. This function takes the AudioFileID we got from AudioFileOpenURL, a constant indicating the property we're interested in (see the complete list), a reference to a variable holding the size of our result variable, and a reference to the result variable itself. The size is passed by reference because AudioFileGetProperty overwrites it with the number of bytes it actually wrote.

UInt64 fileSizeInBytes = 0;
UInt32 propSize = sizeof(fileSizeInBytes);

OSStatus getSizeResult = AudioFileGetProperty(afid, kAudioFilePropertyAudioDataByteCount, &propSize, &fileSizeInBytes);
if (0 != getSizeResult) {
    NSLog(@"An error occurred when attempting to determine the size of audio file %@: %ld", filePath, (long)getSizeResult);
}

UInt32 bytesRead = (UInt32)fileSizeInBytes;

Note that the kAudioFilePropertyAudioDataByteCount value is an unsigned 64-bit integer, but I've downcast it to an unsigned 32-bit integer because AudioFileReadBytes, coming up next, works with a UInt32. Hopefully your audio files aren't long enough for this to matter. ;-)

OK, now we're ready to read data from the file and put it into the output buffer. The first thing we have to do is allocate some memory to hold the file contents:

void* audioData = malloc(bytesRead);

Then we read the file. We pass the AudioFileID, false to indicate that we don't want to cache the data, 0 to indicate that we want to read the file from the beginning, a reference to bytesRead, and the pointer to the memory location where the file data should be placed. After the data is read, bytesRead will contain the actual number of bytes read.

OSStatus readBytesResult = AudioFileReadBytes(afid, false, 0, &bytesRead, audioData);
if (0 != readBytesResult) {
    NSLog(@"An error occurred when attempting to read data from audio file %@: %ld", filePath, (long)readBytesResult);
}

Now we can close the file:

AudioFileClose(afid);
And we can copy the data into our OpenAL buffer:

alBufferData(outputBuffer, AL_FORMAT_STEREO16, audioData, bytesRead, 44100);

Now that we've copied the data we can clean it up:

if (audioData) {
    free(audioData);
    audioData = NULL;
}

Then we can attach the buffer to the source:

alSourcei(outputSource, AL_BUFFER, outputBuffer);

At long last, the source can emit the sound data contained in the buffer!

alSourcePlay(outputSource);
When you're ready to clean up you should delete your source and buffers, destroy the context and close the device:

alDeleteSources(1, &outputSource);
alDeleteBuffers(1, &outputBuffer);
alcDestroyContext(openALContext);
alcCloseDevice(openALDevice);

I had trouble getting the sound to play when I tried to initialize OpenAL and play the sound inside my viewDidLoad method. So I created a button and used its action to play the sound. Then everything worked fine.

If you have any questions or feedback, please feel free to comment. Thanks!