Chapter 9

Multimedia Recipes

In the words of Aldous Huxley, “After silence, that which comes nearest to expressing the inexpressible is music.” We live in a world where we are surrounded by sound and music. From the most subtle background tune in an advertisement, to the immense blast of the electric guitar at a rock concert, sound has a tremendous impact and plays an integral part in our lives. It is our responsibility as developers to translate this force into our applications and bring the most complete and ideal experience to users.

Throughout this chapter, a variety of recipes make use of accessing the music library. Therefore, in order to fully test these recipes, you should ensure that there are at least a few songs in your device’s music library.

Recipe 9-1: Playing Audio

If you ask almost any random person what they think of when they hear the words “iPhone” and “audio,” they will probably be thinking along the lines of their iPod and the thousands of songs they have downloaded. What most users tend to overlook, despite its immense importance, is the concept of background audio and sound effects. These sound clips and tunes may go completely unnoticed by the user in normal use, but in terms of app functionality and design they can tremendously improve the quality of an app. It may be the little “shutter click” when you take a picture, or some background music that gets stuck in your head after you play a game for too long; regardless of whether the user notices it, sound can make a world of difference. The iOS AV Foundation framework provides a simple way to access, play, and manipulate sound files using the AVAudioPlayer. Here you create a sample project that allows you to play an audio file, as well as allows the user to manipulate the clip’s playback.

Setting Up the Application

Start by creating a new single-view application project. In this recipe you utilize two frameworks that aren’t linked by default so you need to add them to your project. These are AVFoundation.framework, which includes the AVAudioPlayer class, and AudioToolbox.framework, which you’ll use to vibrate the device.

Next, to import the APIs for these frameworks, switch over to your view controller’s header file and add the following statements:

//
//  ViewController.h
//  Recipe 9.1: Playing Audio
//

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>


@interface ViewController : UIViewController

@end

Now build your view in the view controller’s XIB file. Add three sliders for rate, pan, and volume. Also add three buttons: one for triggering the device vibration, one for starting the player, and one for pausing it. Finally, you monitor the audio player’s channel levels via labels at the top of the view. Set up your view so it looks like Figure 9-1. (Note that the “0.0” labels are separate from the “Average” and “Peak” labels.)

9781430245995_Fig09-01.jpg

Figure 9-1.  A user interface to control an AVAudioPlayer

You need your sliders’ values to match the possible values of the properties they control. Using the Attributes inspector, adjust the minimum and maximum values of your “Rate” slider to 0.5 and 2.0 (corresponding to half speed and double speed), and set its current value to 1 (normal speed). The values for the “Pan” slider should be -1 and 1 (corresponding to left pan and right pan), with 0 as the current value. The “Volume” slider’s default values should already be fine, as the volume property goes from 0 to 1. Just set its current value to 1 (maximum volume).

As you have done in the past recipes, you create outlets for the controls that are referenced from your code, and actions for the events that you need to respond to. Create the following outlets:

  • rateSlider, panSlider, and volumeSlider for the sliders
  • averageLabel and peakLabel for your two level-monitoring labels (the ones with the text “0.0” in Figure 9-1)

And the following actions:

  • updateRate, updatePan, and updateVolume for the Value Changed events of the respective sliders
  • playVibrateSound, startPlayer, and pausePlayer for the Touch Up Inside events of the buttons

Also add a property to your header file to keep track of your AVAudioPlayer:

@property (strong, nonatomic) AVAudioPlayer *player;

The last step in your header file is to make your view controller conform to the AVAudioPlayerDelegate protocol. Your ViewController.h file should now resemble this code:

//
//  ViewController.h
//  Recipe 9.1: Playing Audio
//

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

@interface ViewController : UIViewController<AVAudioPlayerDelegate>

@property (weak, nonatomic) IBOutlet UILabel *averageLabel;
@property (weak, nonatomic) IBOutlet UILabel *peakLabel;
@property (weak, nonatomic) IBOutlet UISlider *rateSlider;
@property (weak, nonatomic) IBOutlet UISlider *panSlider;
@property (weak, nonatomic) IBOutlet UISlider *volumeSlider;
@property (strong, nonatomic) AVAudioPlayer *player;

- (IBAction)updateRate:(id)sender;
- (IBAction)updatePan:(id)sender;
- (IBAction)updateVolume:(id)sender;
- (IBAction)playVibrateSound:(id)sender;
- (IBAction)startPlayer:(id)sender;
- (IBAction)pausePlayer:(id)sender;

@end

Before you proceed, you need to select and import the sound file that your application will be playing. The file we use is called midnight-ride.mp3, and the code reflects this file name; you will need to change the file name and file type in the code to match the file you choose. You should consult Apple’s documentation¹ on which file types are appropriate, but it is fairly safe to assume that most commonly used file types, such as .wav or .mp3, will work.

Tip  We downloaded our sound file from Sound Jay, which offers sound and music files free of charge. Be sure to read the terms of use (http://www.soundjay.com/tos.html) for how you may use Sound Jay’s files in your projects.

Add the sound file to your project by dragging and dropping it into the Supporting Files folder. For more detailed information on adding resource files, refer to Chapter 1, Recipe 1-8.

Setting Up the Audio Player

Now, switch over to ViewController.m and locate the viewDidLoad method. Add the following code to set up the AVAudioPlayer:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    NSString *fileName = @"midnight-ride"; // Change this to your own file
    NSString *fileType = @"mp3";
    NSString *soundFilePath =
        [[NSBundle mainBundle] pathForResource:fileName ofType:fileType];
    NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
    NSError *error;
    self.player =
        [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:&error];
    if (error)
    {
        NSLog(@"Error creating the audio player: %@", error);
    }
    self.player.enableRate = YES; //Allows us to change the playback rate.
    self.player.meteringEnabled = YES; //Allows us to monitor levels
    self.player.delegate = self;
    self.volumeSlider.value = self.player.volume;
    self.rateSlider.value = self.player.rate;
    self.panSlider.value = self.player.pan;
    [self.player prepareToPlay]; //Preload audio to decrease lag
    [NSTimer scheduledTimerWithTimeInterval:0.1
        target:self selector:@selector(updateLabels) userInfo:nil repeats:YES];

}

As you can see, you retrieved the URL for your sound file and initialized your AVAudioPlayer with it. You set the enableRate property so that you can change the playback rate, and the meteringEnabled property so that you can monitor the player’s levels. You also called the optional prepareToPlay method to preload the sound file, which reduces the lag when playback starts. Finally, you created a timer that invokes your updateLabels method ten times a second, so your labels update at a nearly constant rate.

Let’s put in a simple implementation of the updateLabels method.

-(void)updateLabels
{
    [self.player updateMeters];
    self.averageLabel.text =
        [NSString stringWithFormat:@"%f", [self.player averagePowerForChannel:0]];
    self.peakLabel.text =
        [NSString stringWithFormat:@"%f", [self.player peakPowerForChannel:0]];

}

You need to call updateMeters anytime that you use the averagePowerForChannel or peakPowerForChannel methods to get up-to-date values. Both methods take an NSUInteger argument that specifies the channel for which to retrieve information. By giving it the value of 0, you specify the left channel for a stereo track, or the single channel for a mono track. Given that you are dealing with only a basic use of the functionality, channel 0 is a good default.
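The values returned by these two methods are measured in decibels, ranging from -160 (minimum power) up to 0 (full scale). If you wanted to drive a linear meter rather than display raw decibel values, a common conversion, shown here purely as a sketch and not part of this recipe’s code, looks like this:

// Sketch: convert a decibel reading into a rough linear 0..1 level.
// powf is declared in math.h; include it if your imports don't already pull it in.
float decibels = [self.player averagePowerForChannel:0];
float linearLevel = powf(10.0f, decibels / 20.0f); // 0 dB -> 1.0, -160 dB -> nearly 0.0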

Next, implement your action methods for your sliders. These actions are called every time the respective slider’s value is changed.

- (IBAction)updateRate:(id)sender
{
    self.player.rate = self.rateSlider.value;
}

- (IBAction)updatePan:(id)sender
{
    self.player.pan = self.panSlider.value;
}

- (IBAction)updateVolume:(id)sender
{
    self.player.volume = self.volumeSlider.value;
}

Now, implement your button action methods, which are also quite simple.

- (IBAction)playVibrateSound:(id)sender
{
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
}

- (IBAction)startPlayer:(id)sender
{
    [self.player play];
}

- (IBAction)pausePlayer:(id)sender
{
    [self.player pause];
}

Note  Although most of the AV Foundation functionalities that you are currently working with will work on the simulator (using your computer’s microphone and speakers), the foregoing vibrate sound will not. You need a physical device to test this functionality.

Handling Errors and Interruptions

At this point, your app can successfully play and pause your music, and you can adjust your playback rate, pan, and volume, and monitor your output levels. However, it lacks some basic error handling and interruption handling.

To catch any errors in playing files, you can implement the following method from the AVAudioPlayerDelegate protocol:

-(void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error
{
    NSLog(@"Error playing file: %@", [error localizedDescription]);
}

Whenever you are dealing with an app that involves sound or music, there is always a concern that your app may be interrupted by a phone call or text message, and you should always include functionality to deal with these situations. This can be done through a couple of AVAudioPlayer delegate methods. The audioPlayerBeginInterruption: method is called when an audio player is interrupted while playing. In most cases, you don’t have to provide an implementation for that method because the system automatically pauses your player. However, if you want your player to resume playing after such an interruption, you need to implement audioPlayerEndInterruption:withOptions:. In this recipe you want the audio player to resume, so add the following code to your view controller:

- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withOptions:(NSUInteger)flags
{
    if (flags & AVAudioSessionInterruptionOptionShouldResume)
    {
        [player play];
    }
}
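Should you also want to react visually the moment an interruption begins, you could optionally implement the corresponding begin-interruption delegate method as well. This is not required for the recipe, because the system pauses the player for you; a minimal sketch might simply log the event:

-(void)audioPlayerBeginInterruption:(AVAudioPlayer *)player
{
    // The system has already paused the player at this point;
    // log the interruption (or update the UI) here if you wish.
    NSLog(@"Playback was interrupted");
}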

You can now see how flexible the AVAudioPlayer is, despite how simple it is to use. By using multiple instances of AVAudioPlayer, you can implement complex audio designs that play multiple sounds at the same time. You could, for example, have a background music track running in one AVAudioPlayer and one or two others handling event-based sound effects, as sketched in the example that follows. The power, simplicity, and flexibility of the AVAudioPlayer class are what make it so popular among iOS developers.
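As a rough illustration of that idea, the following sketch creates one looping background-music player and a separate player for a sound effect. The file names background-loop.mp3 and tap-effect.wav are hypothetical; they are not part of this recipe’s project.

// Sketch only: two independent AVAudioPlayer instances.
NSURL *musicURL = [[NSBundle mainBundle] URLForResource:@"background-loop"
                                          withExtension:@"mp3"];
NSURL *effectURL = [[NSBundle mainBundle] URLForResource:@"tap-effect"
                                           withExtension:@"wav"];

AVAudioPlayer *musicPlayer =
    [[AVAudioPlayer alloc] initWithContentsOfURL:musicURL error:nil];
musicPlayer.numberOfLoops = -1; // -1 loops the track indefinitely
[musicPlayer play];

AVAudioPlayer *effectPlayer =
    [[AVAudioPlayer alloc] initWithContentsOfURL:effectURL error:nil];
[effectPlayer prepareToPlay];
// Later, in response to some game event:
[effectPlayer play];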

Recipe 9-2: Recording Audio

Now that you have dealt with the key concept of playing audio, you can familiarize yourself with the reverse: recording audio. This process is very similar in both structure and implementation to playing audio. You use the AVAudioRecorder class to do your recording in conjunction with an AVAudioPlayer to handle the playback of your recording. We also make this project slightly more complicated by setting up two multi-functional buttons; one for starting and stopping a recording, and one for playing and pausing a recording.

Start by creating a new single view application project. You need to link and import the AVFoundation framework into your project again, as you did in the previous recipe. Unlike the previous recipe, however, you do not need the Audio Toolbox framework.

Now, set up the user interface so that it looks like Figure 9-2.

9781430245995_Fig09-02.jpg

Figure 9-2.  User interface for recording and playing audio

Create the following outlets:

  • averageLabel and peakLabel for the level-monitoring labels
  • recordButton and playButton for the buttons

Create the following actions:

  • toggleRecording for the Touch Up Inside event of the Record button
  • togglePlaying for the Touch Up Inside event of the Play button

Before you proceed to your implementation file, you’ll make some additional changes to the ViewController.h file. The first is to prepare the view controller for being a delegate for both the audio player and audio recorder by conforming to the AVAudioPlayerDelegate and the AVAudioRecorderDelegate protocols.

@interface ViewController : UIViewController<AVAudioPlayerDelegate,
                                             AVAudioRecorderDelegate>

Next, add an instance variable to flag when a new recording is available:

@interface ViewController : UIViewController<AVAudioPlayerDelegate,
                                             AVAudioRecorderDelegate>
{
    @private
    BOOL _newRecordingAvailable;
}


// . . .

@end

Finally, add three properties for holding the instance of the audio player, the audio recorder, and the file path to the recorded file. With these and the previous changes, your ViewController.h file should resemble the following code:

//
//  ViewController.h
//  Recipe 9.2: Recording Audio
//

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController<AVAudioPlayerDelegate, AVAudioRecorderDelegate>
{
    BOOL _newRecordingAvailable;
}


@property (weak, nonatomic) IBOutlet UILabel *averageLabel;
@property (weak, nonatomic) IBOutlet UILabel *peakLabel;
@property (weak, nonatomic) IBOutlet UIButton *recordButton;
@property (weak, nonatomic) IBOutlet UIButton *playButton;
@property (strong, nonatomic) AVAudioPlayer *player;
@property (strong, nonatomic) AVAudioRecorder *recorder;
@property (strong, nonatomic) NSString *recordedFilePath;


- (IBAction)toggleRecording:(id)sender;
- (IBAction)togglePlaying:(id)sender;

@end

Setting Up an Audio Recorder

Now, it’s time to implement the viewDidLoad method in the ViewController.m file. First, you’ll define a file path for the recording:

self.recordedFilePath = [[NSString alloc] initWithFormat:@"%@%@",
    NSTemporaryDirectory(), @"recording.wav"];

Next, you’ll initialize the audio recorder with the file path converted to a URL.

NSURL *url = [[NSURL alloc] initFileURLWithPath:self.recordedFilePath];
NSError *error;
self.recorder = [[AVAudioRecorder alloc] initWithURL:url settings:nil error:&error];
if (error)
{
    NSLog(@"Error initializing recorder: %@", error);
}
self.recorder.meteringEnabled = YES;
self.recorder.delegate = self;
[self.recorder prepareToRecord];

The call to prepareToRecord assures that when the user taps the Record button later, the recording will start immediately.
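Passing nil for the settings parameter means the recorder uses its default format. If you want explicit control over the recording, you could instead pass a settings dictionary. The following sketch uses standard AVFoundation settings keys; the particular values are illustrative and not a requirement of this recipe:

// Sketch: custom recording settings (sample rate, channel count, and encoder quality).
NSDictionary *settings = [NSDictionary dictionaryWithObjectsAndKeys:
    [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
    [NSNumber numberWithInt:1], AVNumberOfChannelsKey,
    [NSNumber numberWithInt:AVAudioQualityHigh], AVEncoderAudioQualityKey,
    nil];
self.recorder = [[AVAudioRecorder alloc] initWithURL:url settings:settings error:&error];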

Finally, as in Recipe 9-1, start a timer that triggers updating of the level-monitoring labels. The viewDidLoad method should now look like the following code:

- (void)viewDidLoad
{
    [super viewDidLoad];
 // Do any additional setup after loading the view, typically from a nib.
    self.recordedFilePath = [[NSString alloc] initWithFormat:@"%@%@",
        NSTemporaryDirectory(), @"recording.wav"];
    NSURL *url = [[NSURL alloc] initFileURLWithPath:self.recordedFilePath];
    NSError *error;
    self.recorder = [[AVAudioRecorder alloc] initWithURL:url settings:nil error:&error];
    if (error)
    {
        NSLog(@"Error initializing recorder: %@", error);
    }
    self.recorder.meteringEnabled = YES;
    self.recorder.delegate = self;
    [self.recorder prepareToRecord];
    [NSTimer scheduledTimerWithTimeInterval:0.01 target:self
        selector:@selector(updateLabels) userInfo:nil repeats:YES];

}

You may be wondering why you aren’t also initializing the audio player in the viewDidLoad method. The reason is that the player’s initializer requires a URL that points to an audio file, but at the time of the view loading, there are no audio files recorded. Therefore, as you’ll see later, you create the player when the user taps the Play button.

Your updateLabels method resembles the one in Recipe 9-1 with the exception that now it’s the audio recorder that’s being monitored and not the audio player.

-(void)updateLabels
{
    [self.recorder updateMeters];
    self.averageLabel.text =
        [NSString stringWithFormat:@"%f", [self.recorder averagePowerForChannel:0]];
    self.peakLabel.text =
        [NSString stringWithFormat:@"%f", [self.recorder peakPowerForChannel:0]];
}

Now, let’s turn to the action methods. Start with the simplest one, toggleRecording:. It has just two cases. If the recorder is currently active, it should stop the recording and reset the title of the Record button; if not, it should start recording and change the title of the Record button to “Stop”, as shown here:

- (IBAction)toggleRecording:(id)sender
{
    if ([self.recorder isRecording])
    {
        [self.recorder stop];
        [self.recordButton setTitle:@"Record" forState:UIControlStateNormal];
    }
    else
    {
        [self.recorder record];
        [self.recordButton setTitle:@"Stop" forState:UIControlStateNormal];
    }

}

Also, implement the following method of the AVAudioRecorderDelegate protocol. It will be called when a recording has finished, with a flag that indicates whether the recording was completed successfully. Here’s the implementation:

- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder successfully:(BOOL)flag
{
    _newRecordingAvailable = flag;
    [self.recordButton setTitle:@"Record" forState:UIControlStateNormal];
}

As you can see, if the recording was successful you indicate that a new recording is available by setting an instance variable flag. You also reset the title of the button to read “Record” again.

Now to the slightly more complicated Play button. When the user taps it there are four possible states, though only the first three require any action:

  1. The audio player is active, in which case you pause it and reset the button’s title to “Play.”
  2. A new recording is available, which forces you to re-create the audio player with the new file. Then start the player and set the button’s title to “Pause.”
  3. A player has been created but is currently not active, which means it has been paused and should be restarted. Start the player and set the button title to “Pause.”
  4. No player has been created yet. This means that there is no valid recording available. Just ignore this case.

Here are the preceding points translated into code:

- (IBAction)togglePlaying:(id)sender
{
    if (self.player.playing)
    {
        [self.player pause];
        [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
    }
    else if (_newRecordingAvailable)
    {
        NSURL *url = [[NSURL alloc] initFileURLWithPath:self.recordedFilePath];
        NSError *error;
        self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
        if (!error)
        {
            self.player.delegate = self;
            [self.player play];
        }
        else
        {
            NSLog(@"Error initializing player: %@", error);
        }
        [self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
        _newRecordingAvailable = NO;
    }
    else if (self.player)
    {
        [self.player play];
        [self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
    }

}

When the player has finished playing, the button’s title should be reset. The following delegate method takes care of that:

-(void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
    [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}

Handling Interruptions

At this point, your application successfully records and plays a sound. As with the previous recipe, you should implement the delegate methods to handle interruptions such as phone calls or text messages. Here are the methods for handling interruptions for both the player and the recorder:

- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withOptions:(NSUInteger)flags
{
    if (flags & AVAudioSessionInterruptionOptionShouldResume)
    {
        [player play];
    }
}

- (void)audioRecorderEndInterruption:(AVAudioRecorder *)recorder withOptions:(NSUInteger)flags
{
    if (flags & AVAudioSessionInterruptionOptionShouldResume)
    {
        [recorder record];
    }
}

Now you have a fully functional app to record and play sounds from your device. As you can see, the AVAudioRecorder and AVAudioPlayer work well together to provide a complete yet simple audio interface for the user.

Recipe 9-3: Accessing the Music Library

So far you have been able to deal with playing and manipulating sound files that you have included in your project, but there is an easy way to access a significantly larger supply of sound files: by accessing the user’s music library.

Here you make another new project. This time you need to link it with the Media Player framework, and as usual add an import statement for it to your view controller.

Setting Up a Basic Music Player

You set up your view to work as a basic music player, so it looks like Figure 9-3.

9781430245995_Fig09-03.jpg

Figure 9-3.  User interface for queuing music from the music library

Create the following outlets for the controls which are referenced from your code:

  • infoLabel
  • volumeSlider
  • playButton

And the following actions:

  • addItems, for the Touch Up Inside event of the Add Music to Queue button
  • prevTapped, playTapped, and nextTapped, respectively, for the three buttons at the bottom of the view
  • updateVolume, for the Value Changed event of the volume slider

You also define two properties in your header file: one of type MPMusicPlayerController called player, which you use to play music, and one of type MPMediaItemCollection called myCollection, which helps you keep track of the tracks you have chosen to play. Finally, you make your view controller the delegate of a class called MPMediaPickerController, which allows your user to select music to play. Overall, your header file should now look like this:

//
//  ViewController.h
//  Recipe 9.3: Accessing Music Library
//

#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>

@interface ViewController : UIViewController<MPMediaPickerControllerDelegate>

@property (weak, nonatomic) IBOutlet UILabel *infoLabel;
@property (weak, nonatomic) IBOutlet UISlider *volumeSlider;
@property (weak, nonatomic) IBOutlet UIButton *playButton;
@property (strong, nonatomic) MPMediaItemCollection *myCollection;
@property (strong, nonatomic) MPMusicPlayerController *player;


- (IBAction)addItems:(id)sender;
- (IBAction)prevTapped:(id)sender;
- (IBAction)playTapped:(id)sender;
- (IBAction)nextTapped:(id)sender;
- (IBAction)updateVolume:(id)sender;

@end

Now, you can set up your viewDidLoad method.

- (void)viewDidLoad
{
    [super viewDidLoad];
 // Do any additional setup after loading the view, typically from a nib.
    self.infoLabel.text = @". . .";
    self.player = [MPMusicPlayerController applicationMusicPlayer];
    [self setNotifications];
    [self.player beginGeneratingPlaybackNotifications];
    [self.player setShuffleMode:MPMusicShuffleModeOff];
    self.player.repeatMode = MPMusicRepeatModeNone;
    self.volumeSlider.value = self.player.volume;

}

Note  The MPMusicPlayerController class has two important class methods that allow you to access an instance of the class. The one you used previously, applicationMusicPlayer, returns an application-specific music player. This option can be useful for keeping your music separate from the device’s music player, but has the downside of being unable to play once the app enters the background. Alternatively, you can use the iPodMusicPlayer, which allows for continuous play even in the background. The main thing to keep in mind in that case, however, is that the player may already have a nowPlayingItem queued up from the iPod app, which your code should be prepared to handle.
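For example, if you had chosen the iPodMusicPlayer instead, a small sketch like the following could check at startup whether the iPod app already has something queued. This is not part of this recipe’s code:

// Sketch: checking the shared iPod player's current item.
MPMusicPlayerController *ipodPlayer = [MPMusicPlayerController iPodMusicPlayer];
MPMediaItem *currentItem = ipodPlayer.nowPlayingItem;
if (currentItem)
{
    NSLog(@"The iPod is already queued with: %@",
        [currentItem valueForProperty:MPMediaItemPropertyTitle]);
}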

Handling Notifications

Whenever you use an instance of MPMusicPlayerController, it is recommended that you register for notifications about changes to the playback state and to the currently playing song. We have extracted this code into a helper method named setNotifications. Here is its implementation:

-(void)setNotifications
{
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    [notificationCenter
     addObserver: self
     selector:    @selector(handleNowPlayingItemChanged:)
     name:        MPMusicPlayerControllerNowPlayingItemDidChangeNotification
     object:      self.player];
    [notificationCenter
     addObserver: self
     selector:    @selector(handlePlaybackStateChanged:)
     name:        MPMusicPlayerControllerPlaybackStateDidChangeNotification
     object:      self.player];
    [notificationCenter
     addObserver: self
     selector:    @selector(handleVolumeChangedFromHardware:)
     name:        @"AVSystemController_SystemVolumeDidChangeNotification"
     object:      nil];
}

In addition to the two music player notifications, you have included a third notification registration to make sure you know whenever the user adjusts the device volume using the device’s side buttons. This way, your application can update its slider control to reflect those changes.

Now implement the method that responds to the volume changed notification. It simply updates the slider (animated) with the new level:

-(void)handleVolumeChangedFromHardware:(id)sender
{
    [self.volumeSlider setValue:self.player.volume animated:YES];
}

Next, the method to handle the playback state change notification simply updates the title of the play button to reflect the new state, like so:

- (void) handlePlaybackStateChanged: (id) notification
{
    MPMusicPlaybackState playbackState = [self.player playbackState];
    if (playbackState == MPMusicPlaybackStateStopped)
    {
        [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
    }
    else if (playbackState == MPMusicPlaybackStatePaused)
    {
        [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
    }
    else if (playbackState == MPMusicPlaybackStatePlaying)
    {
        [self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
    }
}

Finally, whenever the currently playing song is changed, the info label should be updated. The following method handles that:

- (void) handleNowPlayingItemChanged: (id) notification
{
    MPMediaItem *currentItemPlaying = [self.player nowPlayingItem];
    if (currentItemPlaying)
    {
        NSString *info = [NSString stringWithFormat:@"%@ - %@",
            [currentItemPlaying valueForProperty:MPMediaItemPropertyTitle],
            [currentItemPlaying valueForProperty:MPMediaItemPropertyArtist]];
        self.infoLabel.text = info;
    }
    else
    {
        self.infoLabel.text = @". . .";
    }
}

Picking Media to Play

To add music to your list of tunes, use the MPMediaPickerController class. It provides, as shown in Figure 9-4, a standardized way to make a music selection. Add the following code to the addItems action method to set up and display a media picker:

- (IBAction)addItems:(id)sender
{
    MPMediaPickerController *picker =
        [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
    picker.delegate = self;
    picker.allowsPickingMultipleItems = YES;
    picker.prompt = NSLocalizedString (@"Add songs to play",
        "Prompt in media item picker");
    [self presentViewController:picker animated:YES completion:NULL];

}

9781430245995_Fig09-04.jpg

Figure 9-4.  The user interface of the MPMediaPickerController to select songs

The media picker communicates with your view controller through the MPMediaPickerControllerDelegate protocol that you added to the header earlier. You implement the following two delegate methods to handle both cancellation and successful selection of media:

-(void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker
{
    [self dismissViewControllerAnimated:YES completion:NULL];
}

-(void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
    [self updateQueueWithMediaItemCollection:mediaItemCollection];
    [self dismissViewControllerAnimated:YES completion:NULL];
}

An MPMediaItemCollection is the group of media items that were selected by the user. You use it to update the media player’s queue in the updateQueueWithMediaItemCollection: method, like so:

-(void)updateQueueWithMediaItemCollection:(MPMediaItemCollection *)collection
{
    if (collection)
    {
        if (self.myCollection == nil)
        {
            self.myCollection = collection;
            [self.player setQueueWithItemCollection: self.myCollection];
            [self.player play];
        }
        else
        {
            BOOL wasPlaying = NO;
            if (self.player.playbackState == MPMusicPlaybackStatePlaying)
            {
                wasPlaying = YES;
            }
            MPMediaItem *nowPlayingItem        = self.player.nowPlayingItem;
            NSTimeInterval currentPlaybackTime = self.player.currentPlaybackTime;
            NSMutableArray *combinedMediaItems =
                [[self.myCollection items] mutableCopy];
            NSArray *newMediaItems = [collection items];
            [combinedMediaItems addObjectsFromArray: newMediaItems];
            self.myCollection =
                [MPMediaItemCollection collectionWithItems:combinedMediaItems];
            [self.player setQueueWithItemCollection:self.myCollection];
            self.player.nowPlayingItem      = nowPlayingItem;
            self.player.currentPlaybackTime = currentPlaybackTime;
            if (wasPlaying)
            {
                [self.player play];
            }
        }
    }
}

This method may seem complex, but it is actually a fairly linear progression. First, after checking to make sure that the collection of newly selected items is not nil, you check to see whether there is a previous queue set up. If not, you simply set your player’s queue to this collection. On the other hand, if a collection does exist, you combine the two, set your player’s queue as the result, and then restore your playback to where it previously was.

The remaining action methods’ implementations are pretty straightforward. Here’s the one that responds to the user tapping the Prev button:

- (IBAction)prevTapped:(id)sender
{
    if ([self.player currentPlaybackTime] > 5.0)
    {
        [self.player skipToBeginning];
    }
    else
    {
        [self.player skipToPreviousItem];
    }
}

As you can see, in this example we’ve given the Prev button two functionalities; if the media player is at the beginning of the current song, tapping the button will skip to the previous song; however, if the playback is more than five seconds into the current song, tapping the button will skip to the beginning of the current song.

The next button is even simpler. It just skips to the next song:

- (IBAction)nextTapped:(id)sender
{
    [self.player skipToNextItem];
}

The Play button toggles between Play and Pause and updates the button title accordingly, like so:

- (IBAction)playTapped:(id)sender
{
    if ((self.myCollection != nil) &&
        (self.player.playbackState != MPMusicPlaybackStatePlaying))
    {
        [self.player play];
        [self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
    }
    else if (self.player.playbackState == MPMusicPlaybackStatePlaying)
    {
        [self.player pause];
        [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
    }
}

And finally, the action method for when the user drags the volume slider. It simply updates the player volume with the slider’s value:

- (IBAction)updateVolume:(id)sender
{
    self.player.volume = self.volumeSlider.value;
}

Your application is now ready to build and run. One thing to note when you run this application is that until you start playing music, you cannot adjust the music player’s volume using the external volume buttons; they still control the ringer volume rather than the playback volume. After you select a song to play, you get full control over the playback volume through these buttons.

Next, you’ll add the ability to search the music library for media to add to the playback queue.

Querying Media

The media player comes with a powerful querying capability with which you can search the music library. To give you an idea of its possibilities, we’re going to add a new feature to the application. The feature allows the user to query the music library for items containing certain text and have them added to the media player’s queue.

First, you add a UIButton as well as a UITextField to your XIB, so that your view now looks like Figure 9-5.

9781430245995_Fig09-05.jpg

Figure 9-5.  User interface with a feature for querying music by artist

Create an outlet with the name artistTextField for referencing the text field, and an action, named queueMusicByArtist, for the button.

The first thing you do with your new UITextField is to set its delegate to your view controller by adding the following lines to the viewDidLoad method:

- (void)viewDidLoad
{
    // . . .

    self.artistTextField.delegate = self;
    self.artistTextField.enablesReturnKeyAutomatically = YES;

}

Make sure to adjust your header file to declare that your view controller conforms to the UITextFieldDelegate protocol.

@interface ViewController : UIViewController<MPMediaPickerControllerDelegate,
                                             UITextFieldDelegate>

// . . .

@end

Next, implement the following delegate method to have your text field dismiss the keyboard and automatically perform the query when the user taps the return key:

-(BOOL)textFieldShouldReturn:(UITextField *)textField
{
    [textField resignFirstResponder];
    [self queueMusicByArtist:self];
    return NO;
}

Finally, the queueMusicByArtist: method.

- (IBAction)queueMusicByArtist:(id)sender
{
    NSString *artist = self.artistTextField.text;
    if (artist != nil && ![artist isEqualToString:@""])
    {
        MPMediaPropertyPredicate *artistPredicate =
            [MPMediaPropertyPredicate
                predicateWithValue:artist
                forProperty:MPMediaItemPropertyArtist
                comparisonType:MPMediaPredicateComparisonContains];
        MPMediaQuery *query = [[MPMediaQuery alloc] init];
        [query addFilterPredicate:artistPredicate];
        NSArray *result = [query items];
        if ([result count] > 0)
        {
            [self updateQueueWithMediaItemCollection:
                [MPMediaItemCollection collectionWithItems:result]];
        }
        else
            self.infoLabel.text = @"Artist Not Found.";
    }
}

You can now run and test the new feature. Enter a search string in the artist text field, press Return (or tap the Queue Music By button) and the media player should start playing all songs from artists whose names contain the provided text.

As you can see, querying the media library is a fairly simple process, which at its bare minimum requires only an instance of the MPMediaQuery class. You can then add MPMediaPropertyPredicates to a query to make it more specific.

Using MPMediaPropertyPredicates requires a decent knowledge of the different MPMediaItemProperties, so that you can know exactly what kind of information you can acquire. Not all MPMediaItemProperties are filterable, and the filterable properties are also different if you are dealing specifically with a podcast. You should refer to the Apple documentation on MPMediaItem for a full list of properties, but following is a list of the most commonly used ones:

  • MPMediaItemPropertyMediaType
  • MPMediaItemPropertyTitle
  • MPMediaItemPropertyAlbumTitle
  • MPMediaItemPropertyArtist
  • MPMediaItemPropertyArtwork

Tip  Whenever you use MPMediaItemPropertyArtwork, the value you get back is an MPMediaItemArtwork object; you can use its imageWithSize: method to create a UIImage from the artwork.
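As a quick sketch of that tip, assuming mediaItem is an MPMediaItem you have already obtained (for instance, from nowPlayingItem or from a query result):

// Sketch: turning a media item's artwork into a UIImage.
MPMediaItemArtwork *artwork =
    [mediaItem valueForProperty:MPMediaItemPropertyArtwork];
UIImage *artworkImage = [artwork imageWithSize:CGSizeMake(100.0, 100.0)];
if (artworkImage == nil)
{
    // Some items have no artwork; fall back to a placeholder image here.
}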

We have barely scratched the surface of media item queries, but here are a few points to keep in mind when dealing with them:

  1. Whenever multiple filter predicates specifying different properties are added to a query, the predicates are evaluated using the AND operator, meaning that if you specify an artist name and an album name, you will receive only songs by that artist AND from that specific album.
  2. Do not add two filter predicates of the same property to a query because the resulting behavior is not defined. If you wish to query a database for multiple specific values of the same property, such as filtering for all songs by two different artists, the better method is to simply create two queries, and then combine their results afterward (see the sketch following this list).
  3. The comparisonType property of an MPMediaPropertyPredicate helps specify how exact you want your predicate to be. A value of MPMediaPredicateComparisonEqualTo returns only items with the string exactly equal to the given one, while a value of MPMediaPredicateComparisonContains, as shown earlier, returns items that contain the given string, which is a less specific search.
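Here is a sketch of the approach mentioned in point 2. The artist names are hypothetical; the point is that each query carries only one artist predicate and the results are merged afterward:

// Sketch: query two artists separately and merge the results.
MPMediaQuery *firstQuery = [[MPMediaQuery alloc] init];
[firstQuery addFilterPredicate:
    [MPMediaPropertyPredicate predicateWithValue:@"First Artist"
                                     forProperty:MPMediaItemPropertyArtist
                                  comparisonType:MPMediaPredicateComparisonContains]];

MPMediaQuery *secondQuery = [[MPMediaQuery alloc] init];
[secondQuery addFilterPredicate:
    [MPMediaPropertyPredicate predicateWithValue:@"Second Artist"
                                     forProperty:MPMediaItemPropertyArtist
                                  comparisonType:MPMediaPredicateComparisonContains]];

NSMutableArray *combinedItems = [[firstQuery items] mutableCopy];
[combinedItems addObjectsFromArray:[secondQuery items]];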

MPMediaQuery instances can also be given a “grouping property”, so that they automatically group their results. You could, for example, filter a query by a specific artist, but group according to the album name:

[query setGroupingType: MPMediaGroupingAlbum];

In this way, you can retrieve all the songs by a specific artist but iterate through them as if they were in albums, as demonstrated by the following code:

NSArray *albums = [query collections];
for (MPMediaItemCollection *album in albums)
{
    MPMediaItem *representativeItem = [album representativeItem];
    NSString *albumName =
        [representativeItem valueForProperty: MPMediaItemPropertyAlbumTitle];
    NSLog (@"%@", albumName);
}

You can also set a grouping type by using MPMediaQuery class methods, such as albumsQuery, which creates your query instance with a pre-set grouping property.
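For instance, the following line creates a query whose grouping type is already set to albums (and which matches music items), so you don’t have to set the grouping type yourself:

MPMediaQuery *albumsQuery = [MPMediaQuery albumsQuery]; // grouping type preset to MPMediaGroupingAlbum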

Even though we haven’t dug deep into the media player framework, you can probably see the power of it; accessing the user’s own library opens up a whole new level of audio customization for your applications, possibilities like selecting music to wake up to, or allowing the user to specify her own background music for your game. You’re probably coming up with several other usages yourself right now. Why not go ahead and implement them?

Recipe 9-4: Playing Background Audio

In this recipe you’ll build a basic music player app that can keep on playing even in background mode. Additionally, you’ll use the MPNowPlayingInfoCenter to allow your app to be controlled from the multitasking bar and to display information about the current tune on the lock screen.

Start by creating a new single view application project. You’ll need the following frameworks so make sure you link their binaries to your project.

  • AVFoundation.framework, to play your audio files.
  • MediaPlayer.framework, to access your library of media files.
  • CoreMedia.framework; you won’t use any classes from this framework directly, but you will need some of its CMTime functions to help deal with your audio player.

Also, add the following import statements to your view controller’s header file. You do not need an import statement for the Core Media framework in this project, because AVFoundation already pulls in the CMTime definitions you’ll use:

#import <MediaPlayer/MediaPlayer.h>
#import <AVFoundation/AVFoundation.h>

Setting Up the User Interface

It’s usually a good idea to start with the design of the user interface. You’ll build a simple media player with the following features:

  • Adding items from the music library to the playlist
  • Starting and pausing playback
  • Navigating backward and forward in the playlist
  • Clearing the playlist
  • Displaying information about the current song and album

Create the user interface as shown in Figure 9-6, using a label, an image view, and buttons.

9781430245995_Fig09-06.jpg

Figure 9-6.  A user interface for a simple media player with background playback

Add outlets for the play button, info label, and image view; name them playButton, infoLabel, and artworkImageView, respectively. Also, create actions for the buttons, using the names queueFromLibrary, goToPrevTrack, togglePlay, goToNextTrack, and clearPlaylist.

Declaring Background Mode Playback

Now set up your app to continue playing music after it has entered the background of the device. The first thing you need to do is declare a property of type AVAudioSession, called session.

@property (nonatomic, strong) AVAudioSession *session;

Next, add the following code to your viewDidLoad method:

- (void)viewDidLoad
{
    [super viewDidLoad];
    self.session = [AVAudioSession sharedInstance];
    NSError *error;
    [self.session setCategory:AVAudioSessionCategoryPlayback error:&error];
    if (error)
    {
        NSLog(@"Error setting audio session category: %@", error);
    }
    [self.session setActive:YES error:&error];
    if (error)
    {
        NSLog(@"Error activating audio session: %@", error);
    }

}

By specifying that your session’s category is of type AVAudioSessionCategoryPlayback, you are telling your device that your application’s main focus is playing music, and should therefore be allowed to continue playing audio while the application is in the background.

Now that you have configured your AVAudioSession, you need to edit your application’s Info.plist file to specify that your application must be allowed to play audio while in the background. You do that by adding audio as a required background mode in the property list, as shown in Figure 9-7. (In the raw Info.plist source, this corresponds to adding the string audio to an array under the UIBackgroundModes key.)

9781430245995_Fig09-07.jpg

Figure 9-7.  Setting audio as a required background mode

To allow the user to control your media player remotely, either from the buttons on her earphones or from the multitasking bar, you need to respond to remote control events. For these events to be delivered, your view controller must be the first responder, so start by overriding the canBecomeFirstResponder method:

-(BOOL)canBecomeFirstResponder
{
    return YES;
}

Now, implement the viewDidAppear: and viewWillDisappear: methods to set the first responder status and to register for (and unregister from) the remote control events, like so:

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (void)viewWillDisappear:(BOOL)animated
{
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
    [self resignFirstResponder];
    [super viewWillDisappear:animated];
}

Now, to receive and respond to remote control events, implement the following method:

- (void)remoteControlReceivedWithEvent: (UIEvent *) receivedEvent
{
    if (receivedEvent.type == UIEventTypeRemoteControl)
    {
        switch (receivedEvent.subtype)
        {
            case UIEventSubtypeRemoteControlTogglePlayPause:
                [self togglePlay:self];
                break;
            case UIEventSubtypeRemoteControlPreviousTrack:
                [self goToPrevTrack:self];
                break;
            case UIEventSubtypeRemoteControlNextTrack:
                [self goToNextTrack:self];
                break;
            default:
                break;
        }
    }
}

As you can see, you simply redirect the events by invoking the respective action methods, which you’ll implement shortly.

Implementing the Player

You use an AVPlayer to do the playback. It differs from the MPMusicPlayerController you saw in the previous recipe in that it can continue playing in background mode. However, it doesn’t work directly with items from your music library, which requires a little more coding than MPMusicPlayerController does.

Add the following properties to your view controller:

@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic, strong) NSMutableArray *playlist;
@property (nonatomic) NSInteger currentIndex;

The playlist property holds an array of items from the music library, and the currentIndex holds the index of the current track within the playlist. Go back to the viewDidLoad method and add the following code to initialize the player and the playlist:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // . . .

    self.playlist = [[NSMutableArray alloc] init];
    self.player = [[AVPlayer alloc] init];

}

Now let’s start by implementing the Library button. It should present a media picker controller and append the selected items to the playlist. Here’s the implementation:

- (IBAction)queueFromLibrary:(id)sender
{
    MPMediaPickerController *picker =
        [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
    picker.delegate = self;
    picker.allowsPickingMultipleItems = YES;
    picker.prompt = @"Choose Some Music!";
    [self presentViewController:picker animated:YES completion:NULL];

}

You also need to add the MPMediaPickerControllerDelegate protocol to your view controller:

@interface ViewController : UIViewController<MPMediaPickerControllerDelegate>

Now implement the delegate method that receives the selected items. It should append them to the playlist and dismiss the media picker. Also, if these are the first items added, playback should start:

-(void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
    BOOL shallStartPlayer = self.playlist.count == 0;
    [self.playlist addObjectsFromArray:mediaItemCollection.items];
    if (shallStartPlayer)
        [self startPlaybackWithItem:[self.playlist objectAtIndex:0]];

    [self dismissViewControllerAnimated:YES completion:NULL];
}

This leads us to the startPlaybackWithItem: method. It replaces the currently played item (if any), resets the current playback position (in case the item has been played before), and starts the playback.

-(void)startPlaybackWithItem:(MPMediaItem *)mpItem
{
    [self.player replaceCurrentItemWithPlayerItem:[self avItemFromMPItem:mpItem]];
    [self.player seekToTime:kCMTimeZero];
    [self startPlayback];
}

Because AVPlayer works with AVPlayerItem objects rather than MPMediaItems, you need to create one from the selected media item. This is the job of the avItemFromMPItem: method:

-(AVPlayerItem *)avItemFromMPItem:(MPMediaItem *)mpItem
{
    NSURL *url = [mpItem valueForProperty:MPMediaItemPropertyAssetURL];

    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];

    [[NSNotificationCenter defaultCenter]
     addObserver:self
     selector:@selector(playerItemDidReachEnd:)
     name:AVPlayerItemDidPlayToEndTimeNotification
     object:item];

    return item;
}

What’s interesting here is that you not only create the AVPlayerItem, but also register a method to receive a notification when the song has reached its end. This is so that you can go on to the next tune in the playlist:

- (void)playerItemDidReachEnd:(NSNotification *)notification
{
    [self goToNextTrack:self];
}

The next stop is the startPlayback method. It starts the player, changes the title of the play button to “Pause,” and calls updateNowPlaying.

-(void)startPlayback
{
    [self.player play];
    [self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
    [self updateNowPlaying];
}

Finally, the last method to implement in this chain of calls is updateNowPlaying. The method, aside from updating your user interface to display the current song information, also uses the MPNowPlayingInfoCenter. This class allows the developer to place information on the device’s lock screen (see Figure 9-8 for an example), or on other devices when the application is displaying info through AirPlay. You can pass information to it by setting the nowPlayingInfo property of the defaultCenter to a dictionary of values and properties that you created.

-(void)updateNowPlaying
{
    if (self.player.currentItem != nil)
    {
        MPMediaItem *currentMPItem = [self.playlist objectAtIndex:self.currentIndex];
        self.infoLabel.text =
            [NSString stringWithFormat:@"%@ - %@",
                [currentMPItem valueForProperty:MPMediaItemPropertyTitle],
                [currentMPItem valueForProperty:MPMediaItemPropertyArtist]];
        UIImage *artwork =
            [[currentMPItem valueForProperty:MPMediaItemPropertyArtwork]
                imageWithSize:self.artworkImageView.frame.size];
        self.artworkImageView.image = artwork;
        NSString *title = [currentMPItem valueForProperty:MPMediaItemPropertyTitle];
        NSString *artist =
            [currentMPItem valueForProperty:MPMediaItemPropertyArtist];
        NSString *album =
            [currentMPItem valueForProperty:MPMediaItemPropertyAlbumTitle];
        NSDictionary *mediaInfo =
           [NSDictionary dictionaryWithObjectsAndKeys:
               artist, MPMediaItemPropertyArtist,
               title, MPMediaItemPropertyTitle,
               album, MPMediaItemPropertyAlbumTitle,
               [currentMPItem valueForProperty:MPMediaItemPropertyArtwork],
                   MPMediaItemPropertyArtwork,
               nil];
        [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = mediaInfo;
    }
    else
    {
        self.infoLabel.text = @". . .";
        [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
        self.artworkImageView.image = nil;
    }
}

9781430245995_Fig09-08.jpg

Figure 9-8.  Information on the lock screen about the current track

The next action method is togglePlay:, which toggles between play and pause modes. An edge case here is that if the player does not yet have an item to play, you need to start it with the first item in your playlist.

- (IBAction)togglePlay:(id)sender
{
    if (self.playlist.count > 0)
    {
        if (self.player.currentItem == nil)
        {
            [self startPlaybackWithItem:[self.playlist objectAtIndex:0]];
        }
        else
        {
            // Player has an item, pause or resume playing it
            BOOL isPlaying = self.player.currentItem && self.player.rate != 0;
            if (isPlaying)
            {
                [self pausePlayback];
            }
            else
            {
                [self startPlayback];
            }
        }
    }
}

From the preceding code, there’s only the pausePlayback method that you haven’t yet implemented. That’s easily fixed. All it needs to do is to pause the player and update the play button title:

-(void)pausePlayback
{
    [self.player pause];
    [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}

Next up are the goToPrevTrack: and goToNextTrack: action methods. They are pretty straightforward. What’s worth noting, though, is that if playback is more than five seconds into the song, the back button rewinds the current song instead of skipping to the previous item in the playlist:

- (IBAction)goToPrevTrack:(id)sender
{
    if (self.playlist.count == 0)
        return;
    if (CMTimeCompare(self.player.currentTime, CMTimeMake(5.0, 1)) > 0)
    {
        [self.player seekToTime:kCMTimeZero];
    }
    else
    {
        if (self.currentIndex == 0)
        {
            self.currentIndex = self.playlist.count - 1;
        }
        else
        {
            self.currentIndex -= 1;
        }
        MPMediaItem *previousItem = [self.playlist objectAtIndex:self.currentIndex];
        [self startPlaybackWithItem:previousItem];
    }
}

- (IBAction)goToNextTrack:(id)sender
{
    if (self.playlist.count == 0)
        return;
    if (self.currentIndex == self.playlist.count - 1)
    {
        self.currentIndex = 0;
    }
    else
    {
        self.currentIndex += 1;
    }
    MPMediaItem *nextItem = [self.playlist objectAtIndex:self.currentIndex];
    [self startPlaybackWithItem: nextItem];
}

The CMTimeMake() function that you just used takes two arguments. The first represents the number of time units you want, and the second is the timescale, that is, the number of units per second: a timescale of 1 means each unit is one second, 2 means each unit is half a second, and so on. A call of CMTimeMake(100, 10) would therefore make 100 units of 1/10 second each, resulting in 10 seconds.
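If the unit/timescale pair feels unintuitive, Core Media also provides CMTimeMakeWithSeconds, which takes the number of seconds directly. As a sketch, the five-second check in goToPrevTrack: could equivalently be written like this:

// Sketch: expressing five seconds directly with CMTimeMakeWithSeconds.
CMTime fiveSeconds = CMTimeMakeWithSeconds(5.0, 1);
if (CMTimeCompare(self.player.currentTime, fiveSeconds) > 0)
{
    [self.player seekToTime:kCMTimeZero];
}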

There’s only one feature remaining unimplemented: clearing the playlist. Here is its implementation:

- (IBAction)clearPlaylist:(id)sender
{
    [self.player replaceCurrentItemWithPlayerItem:nil];
    [self.playlist removeAllObjects];
    [self updateNowPlaying];
    [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}

Finally, your app is now ready to build and run. When you test the app, it should continue to play music even after the application has entered the background. Figure 9-9 shows the multitasking bar with which you can control your media player even when another app is active.

9781430245995_Fig09-09.jpg

Figure 9-9.  The “remote” controls in the multitasking bar that can control an app playing audio in the background

Tip  This recipe used AVPlayer to do the playback. It can only handle one item at a time, which is why we had to implement an external playlist. However, there is an alternative player that you can use to play queued items. The AVQueuePlayer is suitable for applications that need to play a sequence of items, but don’t need complex navigation within the playlist.
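As a brief sketch of that alternative, assume itemURLs is an array of asset URLs you have already collected (for example, via MPMediaItemPropertyAssetURL, as earlier in this recipe):

// Sketch: playing a sequence of items with AVQueuePlayer.
NSMutableArray *queueItems = [NSMutableArray array];
for (NSURL *url in itemURLs)
{
    [queueItems addObject:[AVPlayerItem playerItemWithURL:url]];
}
AVQueuePlayer *queuePlayer = [AVQueuePlayer queuePlayerWithItems:queueItems];
[queuePlayer play];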

Summary

The complete multimedia experience is one that goes beyond a simple matter of listening to music. Sound, as a product, is about the tiny details that make things just a little bit better. From recording music, to filtering media items, to creating volume ramps, every little detail that you, as a developer, take care to include will eventually result in more powerful and enjoyable tools. In iOS development, Apple has provided us with an incredibly powerful set of multimedia-based functionalities. We should not let it go to waste.

1 https://developer.apple.com/library/ios/documentation/AudioVideo/Conceptual/MultimediaPG/MultimediaProgrammingGuide.pdf
