I am a big fan of applications using non-standard ways of interacting with their environment, and an easy way to achieve interaction is by measuring the decibel level coming from an iOS device's microphone. You can do this in two ways: the first is to use RemoteIO and filter the individual samples yourself; the second, lazy way is to let the iOS APIs do it for you. We will be doing it the lazy way, because in this case all I want is the decibels. Had I wanted the decibels along with a measure of fundamental frequency or a Fourier transform, I would work with the raw samples, but if an API can easily provide what I need there is no sense using other methods.
The first thing you have to do is add the AVFoundation framework to your project. Once you have added it, import its header in whatever class you want to measure the decibels from:
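The import is a single line at the top of your implementation (or header) file:

```objc
#import <AVFoundation/AVFoundation.h>
```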
Now declare an AVAudioRecorder instance variable in the class:
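A minimal declaration might look like the following; `MyViewController` is a placeholder for whichever class you are adding this to, and the ivar name `recorder` matches the snippets below:

```objc
@interface MyViewController : UIViewController {
    AVAudioRecorder *recorder; // holds the recorder for the lifetime of the class
}
@end
```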
Once that is done, instantiate the AVAudioRecorder like so:
NSDictionary *recorderSettings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithInt:kAudioFormatAppleIMA4], AVFormatIDKey,
    [NSNumber numberWithInt:44100], AVSampleRateKey,
    [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
    [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
    [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
    nil];

NSError *error = nil;
// Note: the recorder needs a file URL, so use fileURLWithPath: rather than URLWithString:
recorder = [[AVAudioRecorder alloc] initWithURL:[NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"tmp.caf"]]
                                       settings:recorderSettings
                                          error:&error];
You’ll see that we are actually creating an audio CAF file in the temporary directory. This isn’t ideal, but AVAudioRecorder requires an output file, and the temporary directory is erased once the app quits. We are also specifying a sample rate of 44.1 kHz (which can represent sound frequencies up to 22 kHz, according to the Nyquist theorem) and 1 channel (we do not need stereo to measure noise).
Next we have to enable metering:
recorder.meteringEnabled = YES;
And finally tell the recorder to start recording:
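Starting the recorder is two calls. Note that on modern iOS versions you also need an active audio session configured for recording (and microphone permission) before any input is captured; the two AVAudioSession lines below cover the session part:

```objc
// Required on current iOS: put the audio session into record mode.
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryRecord error:nil];
[[AVAudioSession sharedInstance] setActive:YES error:nil];

[recorder prepareToRecord]; // creates the file and primes the hardware
[recorder record];          // begins capturing audio
```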
Now our app is capable of reading the decibel level with the following functions:
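The two metering methods AVAudioRecorder provides are:

```objc
// Average power, in decibels full scale (dBFS), for the given channel.
- (float)averagePowerForChannel:(NSUInteger)channelNumber;
// Peak power, in dBFS, for the given channel.
- (float)peakPowerForChannel:(NSUInteger)channelNumber;
```

Both report values as of the last call to updateMeters, which is why the next step matters.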
Before calling these functions, it is necessary to call updateMeters:
[recorder updateMeters];
float decibels = [recorder averagePowerForChannel:0];
The decibel reading is now in the decibels variable. The first thing you will notice when looking at these values is that they are all negative, and the louder a sound is, the closer to 0 it gets. Decibels are a relative measure, and must be expressed against some reference level to mean anything. Here the reference is full scale: 0 represents the loudest signal the hardware can capture (in rare cases a reading can briefly exceed 0 and go positive), while -160 represents near silence.
It’s also worth noting that decibels are not linear, they are logarithmic. This means that -80 does not represent half way between silence and full scale in loudness: it is 10^(80/20) = 10,000 times the amplitude of the -160 near-silence reading. That may sound like a lot, but remember that near silence is itself a very small sound pressure (the threshold of hearing is around 20 µPa), so even 10,000 times that number is still relatively quiet. This is why it is important to make the distinction that sound isn’t measured linearly (and visual meters should be calibrated to reflect this).
Another interesting thing to remember about decibels is that human ears and iOS microphones do not treat all frequencies equally (whereas a decibel reading does). A good example is high frequencies such as 18 kHz and above: some people cannot hear them at all, while others can only just hear them, even when they register strongly on the meter. Each human ear is unique, so we cannot map the ear’s frequency response absolutely, but we can measure an iOS microphone’s frequency response. That, however, gets into subject matter better suited to Fourier transforms, which is outside the scope of this article; it is simply something to keep in mind.