Good day, Habr community!
I would like to share a solution to the problem of playing audio in iOS applications. We ran into it while developing our latest application: we wanted to start and stop music and sound effects in different places, often in different classes of the application.

Usually, a “blank” with the necessary functionality is copied and adapted for the specific use case. We had done this more than once and decided it was time for a more elegant solution. That solution turned out to be a “singleton”, which would not only be accessible from different places in the application, but would also save system resources when the same audio is used several times.
Implementation
In iOS, sounds can be played in several ways. The system divides sounds into “system sounds” (short sounds played to inform the user about an action, for example to accompany a button tap or to confirm that an email has been sent) and “music” (continuous audio such as songs, melodies, and so on). In our case it was a melody written for the application by the wonderful composer Bakhtiyar Amanzhol.
Playing short sounds is handled by “System Sound Services”. Playing longer audio is handled by a whole family of tools working at different levels of abstraction; we decided to use AVAudioPlayer.
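For reference, this is roughly what the two approaches look like when used directly, without any wrapper. This is only a sketch: the file names “ding.caf” and “theme.mp3”, the method name playDemoSounds and the musicPlayer property are hypothetical, and error handling is omitted.

#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>

- (void)playDemoSounds
{
    // Short “system” sound via System Sound Services
    NSURL *dingURL = [[NSBundle mainBundle] URLForResource:@"ding" withExtension:@"caf"];
    SystemSoundID dingId;
    AudioServicesCreateSystemSoundID((__bridge CFURLRef)dingURL, &dingId);
    AudioServicesPlaySystemSound(dingId);

    // Longer “music” track via AVAudioPlayer; a strong reference (here an assumed
    // musicPlayer property) is needed so the player is not deallocated mid-playback
    NSURL *musicURL = [[NSBundle mainBundle] URLForResource:@"theme" withExtension:@"mp3"];
    self.musicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:musicURL error:NULL];
    [self.musicPlayer prepareToPlay];
    [self.musicPlayer play];
}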
For ease of use, we decided to expose the functionality through “class methods” rather than “instance methods”. As a result, a sound can be played with the following code:
[MCSoundBoard playSoundForKey:@"ding"];
To implement such a call and, at the same time, cache the audio fragments, we used the “singleton” pattern. Objective-C singletons can be implemented in many ways; while researching the problem, we came across a very neat approach described here. This is what the necessary code looks like:
+ (MCSoundBoard *)sharedInstance
{
    __strong static id _sharedObject = nil;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        _sharedObject = [[self alloc] init];
    });
    return _sharedObject;
}
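The shared instance also needs somewhere to cache the loaded audio. A minimal sketch of what its initializer might look like, assuming two mutable dictionaries named _sounds and _audio (the names match those used in the methods below):

- (id)init
{
    self = [super init];
    if (self) {
        // Caches keyed by the caller-supplied keys
        _sounds = [NSMutableDictionary dictionary]; // SystemSoundID values wrapped in NSNumber
        _audio = [NSMutableDictionary dictionary];  // AVAudioPlayer instances
    }
    return self;
}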
Then we defined “public” methods (note that these are class methods):
+ (void)addSoundAtPath:(NSString *)filePath forKey:(id)key;
+ (void)playSoundForKey:(id)key;
+ (void)addAudioAtPath:(NSString *)filePath forKey:(id)key;
+ (void)playAudioForKey:(id)key;
+ (void)stopAudioForKey:(id)key;
+ (void)pauseAudioForKey:(id)key;
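Typical usage of these methods might look as follows; the file names and keys here are only illustrative and assume the files are bundled with the application.

// Register the audio once, e.g. in application:didFinishLaunchingWithOptions:
NSString *dingPath = [[NSBundle mainBundle] pathForResource:@"ding" ofType:@"caf"];
[MCSoundBoard addSoundAtPath:dingPath forKey:@"ding"];

NSString *themePath = [[NSBundle mainBundle] pathForResource:@"theme" ofType:@"mp3"];
[MCSoundBoard addAudioAtPath:themePath forKey:@"theme"];

// ...and play it from anywhere in the application
[MCSoundBoard playSoundForKey:@"ding"];
[MCSoundBoard playAudioForKey:@"theme"];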
These public class methods simply forward to the singleton's instance methods:
- (void)addSoundAtPath:(NSString *)filePath forKey:(id)key
{
    NSURL *fileURL = [NSURL fileURLWithPath:filePath];
    SystemSoundID soundId;
    AudioServicesCreateSystemSoundID((__bridge CFURLRef)fileURL, &soundId);
    [_sounds setObject:[NSNumber numberWithInt:soundId] forKey:key];
}

+ (void)addSoundAtPath:(NSString *)filePath forKey:(id)key
{
    [[self sharedInstance] addSoundAtPath:filePath forKey:key];
}
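Only the “add” side is shown above; the corresponding play method is presumably just as short. A sketch of what it might look like, given that _sounds stores SystemSoundID values wrapped in NSNumber:

- (void)playSoundForKey:(id)key
{
    // Look up the cached sound ID and hand it to System Sound Services
    SystemSoundID soundId = [[_sounds objectForKey:key] unsignedIntValue];
    AudioServicesPlaySystemSound(soundId);
}

+ (void)playSoundForKey:(id)key
{
    [[self sharedInstance] playSoundForKey:key];
}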
Fade-out
With that, the “musical” functionality of the application was ready. All that remained was to get rid of the overly abrupt starts and stops of the music. The solution was to raise or lower the volume gradually. AVFoundation did not offer such an option out of the box; fortunately, it is not hard to implement with an NSTimer:
- (void)fadeOutAndStop:(NSTimer *)timer
{
    AVAudioPlayer *player = timer.userInfo;
    float volume = player.volume;
    volume = volume - 1.0 / MCSOUNDBOARD_AUDIO_FADE_STEPS;
    volume = volume < 0.0 ? 0.0 : volume;
    player.volume = volume;

    if (volume == 0.0) {
        [timer invalidate];
        [player pause];
    }
}

- (void)stopAudioForKey:(id)key fadeOutInterval:(NSTimeInterval)fadeOutInterval
{
    AVAudioPlayer *player = [_audio objectForKey:key];
    // The original listing is cut off here; presumably a repeating timer is scheduled
    // that calls fadeOutAndStop: until the volume reaches zero, along these lines:
    [NSTimer scheduledTimerWithTimeInterval:fadeOutInterval / MCSOUNDBOARD_AUDIO_FADE_STEPS
                                     target:self
                                   selector:@selector(fadeOutAndStop:)
                                   userInfo:player
                                    repeats:YES];
}
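From the caller's side, assuming a matching class method is exposed (following the forwarding convention shown above; it is not in the public list quoted earlier), fading the music out might look like:

[MCSoundBoard stopAudioForKey:@"theme" fadeOutInterval:1.0];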
That's all. I hope this will be useful to anyone facing a similar task.
Additionally:
MCSoundBoard page on GitHub