Chapter 10

Multimedia Recipes

In the words of Aldous Huxley, “After silence, that which comes nearest to expressing the inexpressible is music.” We live in a world where we are surrounded by sound and music. From the most subtle background tune in an advertisement to the immense blast of an electric guitar at a rock concert, sound has a tremendous impact on and plays an integral part in our lives. It is our responsibility as developers to translate this force into our applications and bring the most complete and ideal experience to users.

Throughout this chapter, a variety of recipes make use of accessing the music library. Therefore, to fully test these recipes you should ensure there are at least a few songs in your device’s music library.

Recipe 10-1: Playing Audio

If you ask most people what they think of when they hear the words “iPhone” and “audio,” they will probably think along the lines of their iPod and the thousands of songs they have downloaded. What most users tend to overlook, despite its immense importance, is the concept of background audio and sound effects. These sound clips and tunes might go completely unnoticed by the user in normal use, but in terms of app functionality and design they can tremendously improve the quality of an app. It might be the little “shutter click” when you take a picture or some background music that gets stuck in your head after you play a game for too long; regardless of whether the user notices it, sound can make a world of difference. The iOS AV Foundation framework provides a simple way to access, play, and manipulate sound files using AVAudioPlayer. In this recipe, you will create a sample project that allows you to play an audio file and, in addition, allows the user to manipulate the clip’s playback.

Setting Up the Application

Start by creating a new single view application project. In this recipe, you will utilize two frameworks that aren’t linked by default, so you need to add them to your project. These are AVFoundation.framework, which includes the AVAudioPlayer class, and AudioToolbox.framework, which you’ll use to vibrate the device.

Next, to import the APIs for these frameworks, switch to your view controller’s header file and add the following statements:

//
//  ViewController.h
//  Recipe 10-1 Playing Audio
//

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

@interface ViewController : UIViewController

@end

Now build your view in the Main.storyboard file using the following components to resemble Figure 10-1:

  • 3 sliders: Rate, Pan, and Volume
  • 5 title labels: Average, Peak, Rate, Pan, and Volume
  • 2 value labels: Both with default value “0.0”
  • 3 buttons: Vibrate, Play, and Pause


Figure 10-1. A user interface to control an AVAudioPlayer

You need your sliders’ values to match the possible values of the properties they control. Using the attribute inspector, adjust the minimum and maximum values of your Rate slider to 0.5 and 2.0 (corresponding to half speed and double speed). Also, set its current value to 1. The values for the Pan slider should be -1 and 1 (corresponding to left pan and right pan) with 0 as the current value. The Volume slider’s default values should already be fine, as the volume property goes from 0 to 1. Just set its current value to 1 (maximum volume).
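If you prefer, the same configuration can be done in code instead of the attribute inspector. This is a minimal sketch, assuming the rateSlider, panSlider, and volumeSlider outlets you create in the next step:

```objectivec
// Programmatic equivalent of the attribute inspector settings.
// Assumes the slider outlets created later in this recipe.
self.rateSlider.minimumValue = 0.5f;    // half speed
self.rateSlider.maximumValue = 2.0f;    // double speed
self.rateSlider.value = 1.0f;           // normal speed

self.panSlider.minimumValue = -1.0f;    // full left
self.panSlider.maximumValue = 1.0f;     // full right
self.panSlider.value = 0.0f;            // centered

self.volumeSlider.minimumValue = 0.0f;  // silent
self.volumeSlider.maximumValue = 1.0f;  // maximum volume
self.volumeSlider.value = 1.0f;
```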

As you have done in preceding recipes, create outlets for the controls that are referenced from your code and actions for the events you need to respond to. Create the following outlets:

  • rateSlider, panSlider, and volumeSlider for the sliders
  • averageLabel and peakLabel for your two level-monitoring labels (the ones with the texts “0.0” in Figure 10-1)

Create the following actions:

  • updateRate, updatePan, and updateVolume for the Value Changed events of the respective sliders
  • playVibrateSound, startPlayer, and pausePlayer for the Touch Up Inside events of the buttons

Add a property to your header file to keep track of your AVAudioPlayer:

@property (strong, nonatomic) AVAudioPlayer *player;

The last step in your header file is to make your view controller conform to the AVAudioPlayerDelegate protocol. Your ViewController.h file should now resemble Listing 10-1.

Listing 10-1.  The initial ViewController.h setup

//
//  ViewController.h
//  Recipe 10-1 Playing Audio
//

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>

@interface ViewController : UIViewController <AVAudioPlayerDelegate>

@property (weak, nonatomic) IBOutlet UISlider *rateSlider;
@property (weak, nonatomic) IBOutlet UISlider *panSlider;
@property (weak, nonatomic) IBOutlet UISlider *volumeSlider;

@property (weak, nonatomic) IBOutlet UILabel *averageLabel;
@property (weak, nonatomic) IBOutlet UILabel *peakLabel;

@property (strong, nonatomic) AVAudioPlayer *player;

- (IBAction)updateRate:(id)sender;
- (IBAction)updatePan:(id)sender;
- (IBAction)updateVolume:(id)sender;
- (IBAction)playVibrateSound:(id)sender;
- (IBAction)startPlayer:(id)sender;
- (IBAction)pausePlayer:(id)sender;

@end

Before you proceed, you need to select and import the sound file that your application will be playing. The file we use is called midnight-ride.mp3, and the code reflects this file name. You need to change any filename or file type according to the file you choose. You should consult Apple’s documentation on which file types are appropriate at https://developer.apple.com/library/ios/DOCUMENTATION/AudioVideo/Conceptual/MultimediaPG/UsingAudio/UsingAudio.html#//apple_ref/doc/uid/TP40009767-CH2. However, it is fairly safe to assume that most commonly used file types, such as .wav or .mp3, will work.

Tip   We downloaded our sound file from Sound Jay, which offers sound and music files free of charge. Be sure to read the terms of use (http://www.soundjay.com/tos.html) for how you may use Sound Jay’s files in your projects.

Add the sound file to your project by dragging and dropping it into the Supported Files folder. For more information about adding resource files, see Chapter 1, Recipe 1-8.

Setting Up the Audio Player

Switch to ViewController.m and locate the viewDidLoad method. Add the code in Listing 10-2 to set up AVAudioPlayer.

Listing 10-2.  Updating the viewDidLoad method to set up AVAudioPlayer

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.
    NSString *fileName = @"midnight-ride"; // Change this to your own file
    NSString *fileType = @"mp3";
    NSString *soundFilePath =
        [[NSBundle mainBundle] pathForResource:fileName ofType:fileType];
    NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
    
    NSError *error;
    self.player =
        [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:&error];
    if (error)
    {
        NSLog(@"Error creating the audio player: %@", error);
    }
    self.player.enableRate = YES; //Allows us to change the playback rate.
    self.player.meteringEnabled = YES; //Allows us to monitor levels
    self.player.delegate = self;
    self.volumeSlider.value = self.player.volume;
    self.rateSlider.value = self.player.rate;
    self.panSlider.value = self.player.pan;

    [self.player prepareToPlay]; //Preload audio to decrease lag

    [NSTimer scheduledTimerWithTimeInterval:0.1
        target:self selector:@selector(updateLabels) userInfo:nil repeats:YES];
}

From Listing 10-2, you can see that you set the URL for your sound file and initialized your AVAudioPlayer with it, set the enableRate property to allow you to change the playback rate, and set the meteringEnabled property to allow you to monitor the player’s levels. You also call the optional prepareToPlay method, which preloads the sound file and reduces the lag when playback first starts. Finally, you created a timer that invokes your updateLabels method ten times per second, so your labels update at a nearly constant rate.

Now add a simple implementation of the updateLabels method, as shown in Listing 10-3.

Listing 10-3.  Implementing the updateLabels method

-(void)updateLabels
{
    [self.player updateMeters];
    self.averageLabel.text =
        [NSString stringWithFormat:@"%f", [self.player averagePowerForChannel:0]];
    self.peakLabel.text =
        [NSString stringWithFormat:@"%f", [self.player peakPowerForChannel:0]];
}

The updateMeters method needs to be called any time you use the averagePowerForChannel or peakPowerForChannel methods to get up-to-date values. Both methods take an NSUInteger, which is an unsigned integer argument that specifies the channel for which to retrieve information. By giving it the value of 0, you specify the left channel for a stereo track or the single channel for a mono track. Given that you are dealing with only a basic use of the functionality, channel 0 is a good default.
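Note that both methods report power in decibels, ranging from roughly -160 dB (near silence) up to 0 dB (full scale). If you wanted to drive a 0-to-1 UI meter instead of showing raw decibel values, you could convert the reading to a linear amplitude. The helper below is a hypothetical addition of ours, not part of this recipe’s project:

```objectivec
// Hypothetical helper: maps a decibel power reading to a 0.0-1.0
// value suitable for, e.g., a UIProgressView.
- (float)meterLevelForPower:(float)power
{
    if (power < -60.0f) return 0.0f;     // treat anything quieter as silence
    if (power >= 0.0f) return 1.0f;      // clamp at full scale
    return powf(10.0f, 0.05f * power);   // dB to linear amplitude: 10^(dB/20)
}
```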

Next, implement your action methods for your sliders, as shown in Listing 10-4. These actions are called every time the respective slider’s value is changed.

Listing 10-4.  Implementation of the updateRate, updatePan, and updateVolume methods

- (IBAction)updateRate:(id)sender
{
    self.player.rate = self.rateSlider.value;
}

- (IBAction)updatePan:(id)sender
{
    self.player.pan = self.panSlider.value;
}

- (IBAction)updateVolume:(id)sender
{
    self.player.volume = self.volumeSlider.value;
}

Next, implement your button action methods, which are also quite simple. This implementation is shown in Listing 10-5.

Listing 10-5.  Implementation of the playVibrateSound, startPlayer, and pausePlayer action methods

- (IBAction)playVibrateSound:(id)sender
{
    AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
}

- (IBAction)startPlayer:(id)sender
{
    [self.player play];
}

- (IBAction)pausePlayer:(id)sender
{
    [self.player pause];
}

Note   Although most of the AV Foundation functionalities you are currently working with will work on the simulator (using your computer’s microphone and speakers), the vibrate sound will not. You need a physical device to test this functionality.

Handling Errors and Interruptions

At this point, your app can successfully play and pause your music, and you can adjust your playback rate, pan, and volume as well as monitor your output levels. However, it lacks some basic error handling and interruption handling.

To catch any errors in playing files, you can implement the method in Listing 10-6 from the AVAudioPlayerDelegate protocol.

Listing 10-6.  Implementation of the audioPlayerDecodeErrorDidOccur:error: delegate method

-(void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error
{
    NSLog(@"Error playing file: %@", [error localizedDescription]);
}

Whenever you are dealing with an app that has sound or music involved, there is always a concern that your app might be interrupted by a phone call or text message, so you should always include functionality to deal with these concerns. This can be done through a couple of AVAudioPlayer delegate methods. The audioPlayerBeginInterruption: method is called when an audio player has been interrupted while playing. For most cases, you don’t have to provide an implementation for that method because your player is automatically paused by the system. However, if you want your player to resume playing after such an interruption, you need to implement the audioPlayerEndInterruption: method. In this recipe you want the audio player to resume, so add the code in Listing 10-7 to your view controller.

Listing 10-7.  Implementing the audioPlayerEndInterruption:withOptions: method

- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withOptions:(NSUInteger)flags
{
    if (flags == AVAudioSessionInterruptionOptionShouldResume)
    {
        [player play];
    }
}

You can now see the flexibility with which you can use the AVAudioPlayer, despite its simplistic use. By using multiple instances of AVAudioPlayer, you can implement complex audio designs using multiple sounds at the same time. One could possibly have a background music track running in one AVAudioPlayer and have one or two others handling event-based sound effects. The power, simplicity, and flexibility of the AVAudioPlayer class are what make it so popular among iOS developers.
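As a rough sketch of that multi-player idea (the backgroundPlayer and effectPlayer properties and the file names are our own assumptions, not part of this recipe’s project):

```objectivec
// One player loops a background track while a second fires a short
// sound effect on top of it. Both property names are hypothetical.
NSURL *musicURL = [[NSBundle mainBundle] URLForResource:@"background"
                                          withExtension:@"mp3"];
self.backgroundPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:musicURL
                                                               error:nil];
self.backgroundPlayer.numberOfLoops = -1; // loop indefinitely
[self.backgroundPlayer play];

NSURL *effectURL = [[NSBundle mainBundle] URLForResource:@"tap"
                                           withExtension:@"wav"];
self.effectPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:effectURL
                                                           error:nil];
[self.effectPlayer play]; // plays concurrently with the background track
```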

Recipe 10-2: Recording Audio

Now that you have dealt with the key concept of playing audio, you can familiarize yourself with the reverse: recording audio. This process is very similar in both structure and implementation to playing audio. You use the AVAudioRecorder class to do your recording in conjunction with an AVAudioPlayer to handle the playback of your recording. We also make this project slightly more complicated by setting up two multifunctional buttons; one for starting and stopping a recording, and one for playing and pausing a recording.

Start by creating a new single view application project. You need to link and import the AVFoundation framework into your project again, as you did in the preceding recipe. Unlike the preceding recipe, however, you do not need the Audio Toolbox framework.

Now set up the user interface so that it looks like Figure 10-2 with the following items:

  • 2 title labels: Average and Peak
  • 2 value labels: Both with a “0.0” value
  • 2 buttons: Record and Play


Figure 10-2. User interface for recording and playing audio

Create the following outlets:

  • averageLabel and peakLabel for the level-monitoring labels
  • recordButton and playButton for the buttons

Create the following actions:

  • toggleRecording for the Touch Up Inside event of the “Record” button
  • togglePlaying for the Touch Up Inside event of the “Play” button

Before you proceed to your implementation file, you’ll make some additional changes to the ViewController.h file, as shown in Listing 10-8. The first is to prepare the view controller for being a delegate for both the audio player and audio recorder by conforming to the AVAudioPlayerDelegate and the AVAudioRecorderDelegate protocols.

Listing 10-8.  Declaring delegates in the ViewController.h file

@interface ViewController : UIViewController <AVAudioPlayerDelegate,
                                             AVAudioRecorderDelegate>

Next, add an instance variable to flag when a new recording is available, as shown in Listing 10-9.

Listing 10-9.  Adding an instance variable to the ViewController.h file

@interface ViewController : UIViewController<AVAudioPlayerDelegate,
                                             AVAudioRecorderDelegate>
{
    @private
    BOOL _newRecordingAvailable;
}

// ...

@end

Finally, add four properties for holding the instance of the audio player, the audio recorder, the audio session, and the file path to the recorded file. With these and the preceding changes, your ViewController.h file should resemble Listing 10-10, with changes in bold.

Listing 10-10.  The complete ViewController.h file

//
//  ViewController.h
//  Recipe 10-2 Recording Audio
//

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController <AVAudioPlayerDelegate, AVAudioRecorderDelegate>
{
    BOOL _newRecordingAvailable;
}

@property (weak, nonatomic) IBOutlet UILabel *averageLabel;
@property (weak, nonatomic) IBOutlet UILabel *peakLabel;
@property (weak, nonatomic) IBOutlet UIButton *recordButton;
@property (weak, nonatomic) IBOutlet UIButton *playButton;
@property (strong, nonatomic) AVAudioPlayer *player;
@property (strong, nonatomic) AVAudioRecorder *recorder;
@property (strong, nonatomic) AVAudioSession *session;
@property (strong, nonatomic) NSString *recordedFilePath;

- (IBAction)toggleRecording:(id)sender;
- (IBAction)togglePlaying:(id)sender;

@end

Setting Up an Audio Recorder

Now it’s time to implement the viewDidLoad method in the ViewController.m file. First, you will set up the audio session and an error variable. The initialization of the AVAudioSession is different in that you obtain it through the sharedInstance class method. The system maintains a singleton, a single audio session instance that is shared among all apps that use audio. By activating it, you gain access to audio, but because the session is shared, you might get interruptions from the Music app or the phone. Add the code shown in Listing 10-11 to your viewDidLoad method.

Listing 10-11.  Setting an AVAudioSession and an error variable

self.session = [AVAudioSession sharedInstance];
[self.session setActive:YES error:nil];

NSError *error;

[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];

Next, you’ll define a file path for the recording, as shown in Listing 10-12.

Listing 10-12.  Defining a recording file path

self.recordedFilePath = [[NSString alloc] initWithFormat:@"%@%@",
    NSTemporaryDirectory(), @"recording.wav"];

Next, you’ll initialize the audio recorder with the file path converted to a URL. Listing 10-13 shows this initialization.

Listing 10-13.  Initializing the audio recorder

NSURL *url = [[NSURL alloc] initFileURLWithPath:self.recordedFilePath];
NSError *error;
self.recorder = [[AVAudioRecorder alloc] initWithURL:url settings:nil error:&error];
if (error)
{
    NSLog(@"Error initializing recorder: %@", error);
}
self.recorder.meteringEnabled = YES;
self.recorder.delegate = self;
[self.recorder prepareToRecord];

The call to prepareToRecord in Listing 10-13 ensures that when the user taps the “Record” button later, the recording starts immediately (assuming the user granted permission to use the microphone).

Finally, as in Recipe 10-1, start a timer that triggers updating of the level-monitoring labels. The viewDidLoad method should now look like Listing 10-14.

Listing 10-14.  The complete viewDidLoad method

- (void)viewDidLoad
{
    [super viewDidLoad];
    
    self.session = [AVAudioSession sharedInstance];
    [self.session setActive:YES error:nil];
    
    NSError *error;
    
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayAndRecord error:&error];
    

    self.recordedFilePath = [[NSString alloc] initWithFormat:@"%@%@",
                             NSTemporaryDirectory(), @"recording.wav"];
    NSURL *url = [[NSURL alloc] initFileURLWithPath:self.recordedFilePath];

    self.recorder = [[AVAudioRecorder alloc] initWithURL:url settings:nil error:&error];
    if (error)
    {
        NSLog(@"Error initializing recorder: %@", error);
    }
    self.recorder.meteringEnabled = YES;
    self.recorder.delegate = self;
    [self.recorder prepareToRecord];
    
    [NSTimer scheduledTimerWithTimeInterval:0.1 target:self
        selector:@selector(updateLabels) userInfo:nil repeats:YES];
}

You might be wondering why you aren’t also initializing the audio player in the viewDidLoad method. The reason is that the player’s initializer requires a URL that points to an audio file, but at the time of the view loading there are no audio files recorded. Therefore, as you’ll see later, you create the player when the user taps the “Play” button.

Add the updateLabels method as shown in Listing 10-15, which resembles the one in Recipe 10-1 with the exception that now it’s the audio recorder that’s being monitored and not the audio player.

Listing 10-15.  Implementation of the updateLabels method

-(void)updateLabels
{
    [self.recorder updateMeters];
    self.averageLabel.text =
        [NSString stringWithFormat:@"%f", [self.recorder averagePowerForChannel:0]];
    self.peakLabel.text =
        [NSString stringWithFormat:@"%f", [self.recorder peakPowerForChannel:0]];
}

Now let’s turn to the action methods. Start with toggleRecording:. It has only two cases. If the recorder is currently active, it should stop the recording and reset the title of the “Record” button; if not, it should start recording and change the title of the “Record” button to “Stop.” Because the user can deny access to the microphone, we also add a completion block that tests for this case. If permission to record is granted, recording starts. Otherwise, an alert pops up to notify the user that she needs to enable the microphone in the privacy settings. Listing 10-16 shows the completed method.

Listing 10-16.  Implementation of the toggleRecording: method

- (IBAction)toggleRecording:(id)sender
{
    if ([self.recorder isRecording])
    {
        [self.recorder stop];
        [self.recordButton setTitle:@"Record" forState:UIControlStateNormal];
    }
    else
    {
        [self.session requestRecordPermission:^(BOOL granted) {
            if (granted)
            {
                [self.recorder record];
                [self.recordButton setTitle:@"Stop" forState:UIControlStateNormal];
            }
            else
            {
                UIAlertView *alert = [[UIAlertView alloc]
                        initWithTitle:@"Recording Permission Denied"
                              message:@"Verify microphone access is turned on in Settings->Privacy->Microphone"
                             delegate:nil
                    cancelButtonTitle:@"OK"
                    otherButtonTitles:nil];
                [alert show];
            }
        }];
    }
}

Next, implement the audioRecorderDidFinishRecording:successfully: method of the AVAudioRecorderDelegate protocol, as shown in Listing 10-17. It is called when a recording has finished, with a flag that indicates whether the recording was completed successfully.

Listing 10-17.  Implementing the audioRecorderDidFinishRecording:successfully: delegate method

- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *)recorder successfully:(BOOL)flag
{
    _newRecordingAvailable = flag;
    [self.recordButton setTitle:@"Record" forState:UIControlStateNormal];
}

As you can see, if the recording is successful you indicate that a new recording is available by setting an instance variable flag. You also reset the title of the button to read “Record” again.

Now to the slightly more complicated “Play” button. When the user taps it, there are four possible states (we’ll need to consider only three):

  1. The audio player is active, in which case you pause it and reset the button’s title to “Play.”
  2. A new recording is available, which forces you to recreate the audio player with the new file. Then start the player and set the button’s title to “Pause.”
  3. A player has been created but is currently not active, which means it has been paused and should be restarted. Start the player and set the button title to “Pause.”
  4. No player has been created yet. This means that there is no valid recording available. Simply ignore this case.

Listing 10-18 shows the preceding points translated into code.

Listing 10-18.  Implementation of the togglePlaying: method

- (IBAction)togglePlaying:(id)sender
{
    if (self.player.playing)
    {
        [self.player pause];
        [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
    }
    else if (_newRecordingAvailable)
    {
        NSURL *url = [[NSURL alloc] initFileURLWithPath:self.recordedFilePath];
        NSError *error;
        self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];
        if (!error)
        {
            self.player.delegate = self;
            [self.player play];
        }
        else
        {
            NSLog(@"Error initializing player: %@", error);
        }
        [self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
        _newRecordingAvailable = NO;
    }
    else if (self.player)
    {
        [self.player play];
        [self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
    }
}

When the player has finished playing, the button’s title should be reset. The delegate method in Listing 10-19 takes care of that.

Listing 10-19.  Implementation of the audioPlayerDidFinishPlaying:successfully: delegate method

-(void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
    [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}

Handling Interruptions

At this point, your application successfully records and plays a sound. As with the previous recipe, you should implement the delegate methods to handle interruptions such as phone calls or text messages. Listing 10-20 shows the methods for handling interruptions for both the audio player and the recorder.

Listing 10-20.  Implementation of the two delegate methods to handle interruptions

- (void)audioPlayerEndInterruption:(AVAudioPlayer *)player withOptions:(NSUInteger)flags
{
    if (flags == AVAudioSessionInterruptionOptionShouldResume)
    {
        [player play];
    }
}

- (void)audioRecorderEndInterruption:(AVAudioRecorder *)recorder withOptions:(NSUInteger)flags
{
    if (flags == AVAudioSessionInterruptionOptionShouldResume)
    {
        [recorder record];
    }
}

Now you have a fully functional app to record and play sounds from your device. As you can see, the AVAudioRecorder and AVAudioPlayer work well together to provide a complete yet simple audio interface for the user.

Recipe 10-3: Accessing the Music Library

So far you have been able to deal with playing and manipulating sound files that you have included in your project. However, an easy way to access a significantly larger supply of sound files is by accessing the user’s music library.

In this recipe, you will make another new single view application. This time you need to link it with the Media Player framework, which allows you to play music, movies, podcasts, and audio books. The other benefit is that it gives you access to the iPod library. As usual, add an import statement for the framework to your view controller.

Setting Up a Basic Music Player

Set up your view to work as a basic music player. Add the following components so it looks like Figure 10-3:

  • 4 buttons: Add Music to Queue, Prev, Play, Next
  • 3 labels: Now Playing:, Info, Volume
  • 1 view: Make it 20 points x 276 points


Figure 10-3. User interface for queuing music from the music library

Create the following outlets for the controls that are referenced from your code:

  • infoLabel
  • volumeView; change the property type to MPVolumeView instead of UIView
  • playButton

Create the following actions:

  • addItems, for the Touch Up Inside event of the “Add Music to Queue” button
  • prevTapped, playTapped, and nextTapped, respectively, for the three buttons at the bottom

Define two properties in your header file: one of type MPMusicPlayerController called “player,” which you use to play music, and one of type MPMediaItemCollection called “myCollection,” which helps you keep track of your chosen tracks to play. Finally, make your view controller the delegate of MPMediaPickerController, the class that allows your user to select music to play. Overall, your header file should now look like Listing 10-21.

Listing 10-21.  The finished ViewController.h file

//
//  ViewController.h
//  Recipe 10-3 Accessing Music Library
//

#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>

@interface ViewController : UIViewController<MPMediaPickerControllerDelegate>

@property (weak, nonatomic) IBOutlet UILabel *infoLabel;
@property (weak, nonatomic) IBOutlet MPVolumeView *volumeView;
@property (weak, nonatomic) IBOutlet UIButton *playButton;
@property (strong, nonatomic) MPMediaItemCollection *myCollection;
@property (strong, nonatomic) MPMusicPlayerController *player;

- (IBAction)addItems:(id)sender;
- (IBAction)prevTapped:(id)sender;
- (IBAction)playTapped:(id)sender;
- (IBAction)nextTapped:(id)sender;

@end

Now you can set up your viewDidLoad method in the implementation file. As Listing 10-22 shows, you create the player, call the setNotifications method, and set up the volumeView. MPVolumeView is a class that presents a volume slider together with a route button for playing music over Apple TV or other AirPlay devices. The nice thing about this view is that it handles volume and AirPlay logic without any further configuration.

Listing 10-22.  The viewDidLoad implementation

- (void)viewDidLoad
{
    [super viewDidLoad];
        // Do any additional setup after loading the view, typically from a nib.
    self.infoLabel.text = @"...";
        
    self.player = [MPMusicPlayerController applicationMusicPlayer];
    
    [self setNotifications];
    
    [self.player beginGeneratingPlaybackNotifications];
    
    [self.player setShuffleMode:MPMusicShuffleModeOff];
    self.player.repeatMode = MPMusicRepeatModeNone;
    
    self.volumeView.backgroundColor = [UIColor clearColor];
    MPVolumeView *myVolumeView =
    [[MPVolumeView alloc] initWithFrame: self.volumeView.bounds];
    [self.volumeView addSubview: myVolumeView];

}

Note   The MPMusicPlayerController class has two important class methods that allow you to access an instance of the class. The one you used previously, applicationMusicPlayer, returns an application-specific music player. This option can be useful for keeping your music separate from the device’s music player, but it has the downside of stopping playback once the app enters the background. Alternatively, you can use iPodMusicPlayer, which continues playing even when your app is in the background. The main thing to keep in mind in that case, however, is that your player might already have a nowPlayingItem from the device’s music player, which you should be prepared to handle.
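If you do opt for iPodMusicPlayer, one way to handle a pre-existing nowPlayingItem is sketched below; this is our own assumption about how you might surface it in this recipe’s info label, not part of the recipe itself:

```objectivec
// Sketch: with iPodMusicPlayer, the device's music app may already
// have a track loaded. Reflect it in the UI before the user queues
// anything of their own.
self.player = [MPMusicPlayerController iPodMusicPlayer];
MPMediaItem *existingItem = self.player.nowPlayingItem;
if (existingItem)
{
    self.infoLabel.text =
        [existingItem valueForProperty:MPMediaItemPropertyTitle];
}
```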

Handling Notifications

Whenever you use an instance of MPMusicPlayerController, it is recommended you register for notifications for whenever the playback state changes or whenever the currently playing song changes. We have extracted this code into a helper method named “setNotifications.” Listing 10-23 shows the implementation for the setNotifications method.

Listing 10-23.  Implementation of the setNotifications method

-(void)setNotifications
{
    NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
    
    [notificationCenter
     addObserver: self
     selector:    @selector(handleNowPlayingItemChanged:)
     name:        MPMusicPlayerControllerNowPlayingItemDidChangeNotification
     object:      self.player];
    
    [notificationCenter
     addObserver: self
     selector:    @selector(handlePlaybackStateChanged:)
     name:        MPMusicPlayerControllerPlaybackStateDidChangeNotification
     object:      self.player];
    
}

Next, the method to handle the playback state-change notification simply updates the title of the “Play” button to reflect the new state, as shown in Listing 10-24.

Listing 10-24.  Implementation of the handlePlaybackStateChanged: method

- (void) handlePlaybackStateChanged: (id) notification
{
    MPMusicPlaybackState playbackState = [self.player playbackState];
    
    if (playbackState == MPMusicPlaybackStateStopped)
    {
        [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
    }
    else if (playbackState == MPMusicPlaybackStatePaused)
    {
        [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
    }
    else if (playbackState == MPMusicPlaybackStatePlaying)
    {
        [self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
    }
}

Finally, whenever the currently playing song is changed, the info label should be updated. The code in Listing 10-25 handles this case.

Listing 10-25.  Implementation of the handleNowPlayingItemChanged: method

- (void) handleNowPlayingItemChanged: (id) notification
{
    MPMediaItem *currentItemPlaying = [self.player nowPlayingItem];
    if (currentItemPlaying)
    {
        NSString *info = [NSString stringWithFormat:@"%@ - %@",
            [currentItemPlaying valueForProperty:MPMediaItemPropertyTitle],
            [currentItemPlaying valueForProperty:MPMediaItemPropertyArtist]];
        self.infoLabel.text = info;
    }
    else
    {
        self.infoLabel.text = @"...";
    }
}

Picking Media to Play

To add music to your list of tunes, use the MPMediaPickerController class. This class provides, as shown in Figure 10-4, a standardized way to make a music selection. Add the code in Listing 10-26 to the addItems action method to set up and display a media picker.


Figure 10-4. The user interface of the MPMediaPickerController to select songs

Listing 10-26.  Completing the implementation of the addItems: action method

- (IBAction)addItems:(id)sender
{
    MPMediaPickerController *picker =
        [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
    picker.delegate = self;
    picker.allowsPickingMultipleItems = YES;
    picker.prompt = NSLocalizedString (@"Add songs to play",
        "Prompt in media item picker");
    [self presentViewController:picker animated:YES completion:NULL];
}

The media picker communicates with your view controller through the MPMediaPickerControllerDelegate protocol you added to the header earlier. Implement the following two delegate methods to handle both cancellation and successful selection of media, as shown in Listing 10-27.

Listing 10-27.  Implementing methods to handle cancellation and successful selection of media

-(void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker
{
    [self dismissViewControllerAnimated:YES completion:NULL];
}

-(void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
    [self updateQueueWithMediaItemCollection:mediaItemCollection];
    [self dismissViewControllerAnimated:YES completion:NULL];
}

An MPMediaItemCollection is the group of media items that were selected by the user. Use it to update the media player’s queue in the updateQueueWithMediaItemCollection: method, as shown in Listing 10-28.

Listing 10-28.  Implementing the updateQueueWithMediaItemCollection: method

-(void)updateQueueWithMediaItemCollection:(MPMediaItemCollection *)collection
{
    if (collection)
    {
        if (self.myCollection == nil)
        {
            self.myCollection = collection;
            [self.player setQueueWithItemCollection: self.myCollection];
            [self.player play];
        }
        else
        {
            BOOL wasPlaying = NO;
            if (self.player.playbackState == MPMusicPlaybackStatePlaying)
            {
                wasPlaying = YES;
            }
            
            MPMediaItem *nowPlayingItem        = self.player.nowPlayingItem;
            NSTimeInterval currentPlaybackTime = self.player.currentPlaybackTime;
            
            NSMutableArray *combinedMediaItems =
                [[self.myCollection items] mutableCopy];
            NSArray *newMediaItems = [collection items];
            [combinedMediaItems addObjectsFromArray: newMediaItems];
            
            self.myCollection =
                [MPMediaItemCollection collectionWithItems:combinedMediaItems];
            
            [self.player setQueueWithItemCollection:self.myCollection];
            
            self.player.nowPlayingItem      = nowPlayingItem;
            self.player.currentPlaybackTime = currentPlaybackTime;
            
            if (wasPlaying)
            {
                [self.player play];
            }
        }
    }
}

Listing 10-28 might seem complex, but it is actually a fairly linear progression. First, after checking to make sure the collection of newly selected items is not nil, check to see whether there is a previous queue set up. If not, simply set your player’s queue to this collection. On the other hand, if a collection does exist, then combine the two, set your player’s queue as the result, and restore your playback to where it previously was.

The remaining action methods’ implementations are pretty straightforward. Listing 10-29 shows the one that responds to a user tapping the “Prev” button.

Listing 10-29.  Implementing the prevTapped: action method

- (IBAction)prevTapped:(id)sender
{
    if ([self.player currentPlaybackTime] > 5.0)
    {
        [self.player skipToBeginning];
    }
    else
    {
        [self.player skipToPreviousItem];
    }
}

As you can see, in Listing 10-29 we've given the "Prev" button two functions: if playback is within the first five seconds of the current song, tapping the button skips to the previous song; if playback is more than five seconds in, tapping the button rewinds to the beginning of the current song.

The next button is even simpler. It simply skips to the next song, as shown in Listing 10-30.

Listing 10-30.  Implementation of the nextTapped: action method

- (IBAction)nextTapped:(id)sender
{
    [self.player skipToNextItem];
}

The “Play” button toggles between “Play” and “Pause” and updates the button title accordingly, as shown in Listing 10-31.

Listing 10-31.  Implementation of the playTapped: action method

- (IBAction)playTapped:(id)sender
{
    if ((self.myCollection != nil) &&
        (self.player.playbackState != MPMusicPlaybackStatePlaying))
    {
        [self.player play];
        [self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
    }
    else if (self.player.playbackState == MPMusicPlaybackStatePlaying)
    {
        [self.player pause];
        [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
    }
}

Your application is now ready to build and run. One thing to note when you run this application: until music is actually playing, whether from this app or from the built-in music player, the external volume buttons do not adjust the playback volume; they control the ringer volume instead. Once a song is playing, those buttons give you full control over the playback volume.

You’ll now go on by adding the possibility to search the music library for media to add to the playback queue.

Querying Media

The media player comes with a powerful querying capability with which you can search the music library. To give you an idea of its possibilities, we're going to add an MPMediaQuery feature to the application. This feature allows the user to query the music library for songs by artists whose names contain a certain text and have them added to the media player's queue.

First, add a UIButton as well as a UITextField to your view so that your view now looks like Figure 10-5.


Figure 10-5. User interface with a feature for querying music by artist

Create an outlet with the name “artistTextField” for referencing the text field and an action named “queueMusicByArtist” for the button.

The first thing you do with your new UITextField is to set its delegate to your view controller by adding the lines in Listing 10-32 to the viewDidLoad method.

Listing 10-32.  Adding the artistTextField delegate to the viewDidLoad method

- (void)viewDidLoad
{
    // ...

    self.artistTextField.delegate = self;
    self.artistTextField.enablesReturnKeyAutomatically = YES;
}

Make sure to adjust your header file to declare that your view controller conforms to the UITextFieldDelegate protocol, as shown in Listing 10-33.

Listing 10-33.  Declaring the UITextFieldDelegate protocol

@interface ViewController : UIViewController<MPMediaPickerControllerDelegate ,
                                             UITextFieldDelegate>

// ...

@end

Next, implement the delegate method to have your text field dismiss the keyboard and automatically perform the query when the user taps the return key, as shown in Listing 10-34.

Listing 10-34.  Implementation of the textFieldShouldReturn: delegate method

-(BOOL)textFieldShouldReturn:(UITextField *)textField
{
    [textField resignFirstResponder];
    [self queueMusicByArtist:self];
    return NO;
}

Finally, implement the queueMusicByArtist: method, as shown in Listing 10-35. This method takes the value from the text field and checks that it isn't blank. If it isn't, the method creates a predicate, a logical condition against which media items are tested, runs the query, and populates the queue with the resulting list of items.

Listing 10-35.  Implementation of the queueMusicByArtist: method

- (IBAction)queueMusicByArtist:(id)sender
{
    NSString *artist = self.artistTextField.text;
    if (artist != nil && ![artist isEqual: @""])
    {
        MPMediaPropertyPredicate *artistPredicate =
            [MPMediaPropertyPredicate
                predicateWithValue:artist
                forProperty:MPMediaItemPropertyArtist
                comparisonType:MPMediaPredicateComparisonContains];
        MPMediaQuery *query = [[MPMediaQuery alloc] init];
        [query addFilterPredicate:artistPredicate];
        
        NSArray *result = [query items];
        if ([result count] > 0)
        {
            [self updateQueueWithMediaItemCollection:
                [MPMediaItemCollection collectionWithItems:result]];
        }
        else
        {
            self.infoLabel.text = @"Artist Not Found.";
        }
    }
}

You can now run and test the new feature. Enter a search string in the artist text field and press “Return” (or tap the “Queue Music By” button); the media player should start playing all songs from artists whose names contain the provided text.

As you can see, querying the media library is a fairly simple process, which at its bare minimum requires only an instance of the MPMediaQuery class. You can then add MPMediaPropertyPredicate objects to a query to make it more specific. An MPMediaPropertyPredicate is essentially a configurable condition that filters the media library and returns only the media items that satisfy it.

Using MPMediaPropertyPredicate requires a decent knowledge of the different MPMediaItem properties so that you know exactly what kind of information you can acquire. Not all properties are filterable, and the set of filterable properties differs when you are dealing specifically with podcasts. Refer to the Apple documentation for MPMediaItem for a full list of properties; the following are the most commonly used ones:

  • MPMediaItemPropertyMediaType: The media type, such as music, podcast, or audiobook
  • MPMediaItemPropertyTitle: The media item title, such as a song title
  • MPMediaItemPropertyAlbumTitle: The album title
  • MPMediaItemPropertyArtist: The artist
  • MPMediaItemPropertyArtwork: The album artwork image
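If you're unsure whether a given property can be used in a filter predicate, you can check at runtime with the canFilterByProperty: class method on MPMediaItem, as in this brief sketch:

```objectivec
// Artist is a filterable property...
if ([MPMediaItem canFilterByProperty:MPMediaItemPropertyArtist])
{
    NSLog(@"You can filter by artist");
}
// ...while artwork, for example, is not.
if (![MPMediaItem canFilterByProperty:MPMediaItemPropertyArtwork])
{
    NSLog(@"You cannot filter by artwork");
}
```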

Tip   Whenever you use MPMediaItemPropertyArtwork, the value you get back is an MPMediaItemArtwork object; you can use its imageWithSize: method to create a UIImage from the artwork.
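For example (mediaItem is assumed to be an MPMediaItem obtained from a query, and the size is arbitrary):

```objectivec
MPMediaItemArtwork *artwork =
    [mediaItem valueForProperty:MPMediaItemPropertyArtwork];
UIImage *artworkImage = [artwork imageWithSize:CGSizeMake(100, 100)];
```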

We have barely scratched the surface of media-item queries, but here are a few points to keep in mind when dealing with them:

  • Whenever multiple filter predicates specifying different properties are added to a query, the predicates are evaluated using the AND operator, meaning that if you specify an artist name and an album name, you will receive only songs by that artist AND from that specific album.
  • Do not add two filter predicates of the same property to a query because the resulting behavior is not defined. If you wish to query a database for multiple specific values of the same property, such as filtering for all songs by two different artists, a better method is simply to create two queries and then combine their results afterward.
  • The comparisonType property of an MPMediaPropertyPredicate helps specify how exact you want your predicate to be. A value of MPMediaPredicateComparisonEqualTo returns only items with the string exactly equal to the given one, while a value of MPMediaPredicateComparisonContains, as shown earlier, returns items that contain the given string, which is a less specific search.
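The two-query approach described in the second point might be sketched like this (the artist names are placeholders):

```objectivec
// Combine songs by two artists by running one query per artist
// and merging the resulting item arrays afterward.
NSMutableArray *combinedItems = [NSMutableArray array];
for (NSString *artist in @[@"First Artist", @"Second Artist"])
{
    MPMediaPropertyPredicate *predicate =
        [MPMediaPropertyPredicate predicateWithValue:artist
                                         forProperty:MPMediaItemPropertyArtist
                                      comparisonType:MPMediaPredicateComparisonEqualTo];
    MPMediaQuery *query = [[MPMediaQuery alloc] init];
    [query addFilterPredicate:predicate];
    [combinedItems addObjectsFromArray:[query items]];
}
```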

MPMediaQuery instances can also be given a “grouping property” so they automatically group their results. You could, for example, filter a query by a specific artist but group according to the album name:

[query setGroupingType: MPMediaGroupingAlbum];

In this way, you can retrieve all the songs by a specific artist but iterate through them as if they were in albums, as demonstrated by Listing 10-36.

Listing 10-36.  An example of iterating through a query's results grouped by album

NSArray *albums = [query collections];
for (MPMediaItemCollection *album in albums)
{
    MPMediaItem *representativeItem = [album representativeItem];
    NSString *albumName =
        [representativeItem valueForProperty: MPMediaItemPropertyAlbumTitle];
    NSLog (@"%@", albumName);
}

You can also set a grouping type by using MPMediaQuery class methods, such as albumsQuery, which creates your query instance with a pre-set grouping property.
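For example, using the albumsQuery convenience method is equivalent to setting the grouping type yourself:

```objectivec
// These two queries group their collections the same way:
MPMediaQuery *convenienceQuery = [MPMediaQuery albumsQuery];

MPMediaQuery *manualQuery = [[MPMediaQuery alloc] init];
[manualQuery setGroupingType:MPMediaGroupingAlbum];
```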

Even though we haven’t dug deep into the Media Player framework, you can probably see the power of it. Accessing the user’s own library opens up an entirely new level of audio customization for your applications, possibilities such as selecting music to wake up to or allowing the user to specify her own background music for your game. You’re probably coming up with several other uses yourself right now. Why not go ahead and implement them?

Recipe 10-4: Playing Background Audio

In this recipe, you’ll build a basic music player app that can keep playing even in background mode. Additionally, you’ll use MPNowPlayingInfoCenter to allow your app to be controlled from the multitasking bar and to display information about the current tune on the lock screen.

Start by creating a new single view application project. You’ll need the following frameworks, so make sure you link their binaries to your project:

  • AVFoundation.framework: To play your audio files.
  • MediaPlayer.framework: To access your library of media files.
  • CoreMedia.framework: You won’t use any classes from this framework, but you will need some of the CMTime functions to help deal with your audio player.

Also, add the following import statements to your view controller’s header file. You do not need one for the Core Media framework in this project:

#import <MediaPlayer/MediaPlayer.h>
#import <AVFoundation/AVFoundation.h>

Setting Up the User Interface

It’s usually a good idea to start with the design of the user interface. You’ll build a simple media player with the following features:

  • Add items from the music library to the playlist
  • Start and pause playback
  • Navigate backward and forward in the playlist
  • Clear the playlist
  • Provide information about the current song and album

Create the user interface, as shown in Figure 10-6, with an info label, an image view, and five buttons: Library, <<, Play, >>, and Clear.


Figure 10-6. A user interface for a simple media player with background playback

Create the following outlets:

  • playButton
  • infoLabel
  • artworkImageView

Create the following actions:

  • queueFromLibrary
  • goToPrevTrack
  • togglePlay
  • goToNextTrack
  • clearPlaylist

Declaring Background Mode Playback

Now set up your app to continue playing music after the app has entered a background state of operation. The first thing you need to do is declare a property of type AVAudioSession, called session.

@property (nonatomic, strong) AVAudioSession *session;

Next, add the code in Listing 10-37 to your viewDidLoad method.

Listing 10-37.  Filling out the viewDidLoad method

- (void)viewDidLoad
{
    [super viewDidLoad];
    self.session = [AVAudioSession sharedInstance];
    // Check the methods' return values rather than the error pointer;
    // the error object is only guaranteed to be set on failure.
    NSError *error = nil;
    if (![self.session setCategory:AVAudioSessionCategoryPlayback error:&error])
    {
        NSLog(@"Error setting audio session category: %@", error);
    }
    if (![self.session setActive:YES error:&error])
    {
        NSLog(@"Error activating audio session: %@", error);
    }
}

By specifying that your session category is of type AVAudioSessionCategoryPlayback, you are telling your device that your application’s main focus is playing music and should therefore be allowed to continue playing audio while the application is in the background.

Now that you have configured your AVAudioSession, you need to edit your application's .plist file to declare that your application plays audio while in the background. You do that by adding audio as a required background mode in the property list (.plist) file, as shown in Figure 10-7. For more information on this procedure, refer to Chapter 1, Recipe 1-7.


Figure 10-7. Setting audio as a required background mode
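If you prefer to edit the raw XML of the .plist file, the entry looks like this:

```xml
<key>UIBackgroundModes</key>
<array>
    <string>audio</string>
</array>
```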

To allow the user to control your media player remotely, either from the buttons of her earphones or from the multitasking bar, you need to respond to remote control events. For these events to work, your view controller needs to be the first responder, so enable this by overriding the canBecomeFirstResponder method, as shown in Listing 10-38.

Listing 10-38.  Implementation of the canBecomeFirstResponder override method

-(BOOL)canBecomeFirstResponder
{
    return YES;
}

Now, implement the viewDidAppear: and viewWillDisappear: methods to set the first responder status and register for the remote control events, as shown in Listing 10-39.

Listing 10-39.  Implementation of the viewDidAppear: and viewWillDisappear: methods

- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
    [self becomeFirstResponder];
}

- (void)viewWillDisappear:(BOOL)animated
{
    [[UIApplication sharedApplication] endReceivingRemoteControlEvents];
    [self resignFirstResponder];
    [super viewWillDisappear:animated];
}

To receive and respond to remote control events, implement the method shown in Listing 10-40.

Listing 10-40.  Implementation of the remoteControlReceivedWithEvent: method

- (void)remoteControlReceivedWithEvent: (UIEvent *) receivedEvent
{
    if (receivedEvent.type == UIEventTypeRemoteControl)
    {
        switch (receivedEvent.subtype)
        {
            case UIEventSubtypeRemoteControlTogglePlayPause:
                [self togglePlay:self];
                break;
                
            case UIEventSubtypeRemoteControlPreviousTrack:
                [self goToPrevTrack:self];
                break;
                
            case UIEventSubtypeRemoteControlNextTrack:
                [self goToNextTrack:self];
                break;
                
            default:
                break;
        }
    }
}

As you can see, you only redirect the events by invoking the respective action method, which you’ll implement shortly.

Implementing the Player

You use an AVPlayer to do the playback. It differs from the MPMusicPlayerController you saw in the preceding recipe in that it can continue playing in background mode. However, it doesn't work directly with items from your music library, so it requires a little more coding than MPMusicPlayerController does.

Add the following properties to your view controller:

@property (nonatomic, strong) AVPlayer *player;
@property (nonatomic, strong) NSMutableArray *playlist;
@property (nonatomic) NSInteger currentIndex;

The playlist property holds an array of items from the music library, and the currentIndex holds the index of the current track within the playlist. Return to the viewDidLoad method and add the code in Listing 10-41 to initialize the player and the playlist.

Listing 10-41.  Updating the viewDidLoad method to initialize the playlist and player

- (void)viewDidLoad
{
    [super viewDidLoad];
    
    // ...

    self.playlist = [[NSMutableArray alloc] init];
    self.player = [[AVPlayer alloc] init];
}

Now let’s start by implementing the “Library” button. It should present a media picker controller, as shown previously in Figure 10-4, and append the selected items to the playlist. Listing 10-42 shows this implementation.

Listing 10-42.  Implementation of the queueFromLibrary: method

- (IBAction)queueFromLibrary:(id)sender
{
    MPMediaPickerController *picker =
        [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
    picker.delegate = self;
    picker.allowsPickingMultipleItems = YES;
    picker.prompt = @"Choose Some Music!";
    [self presentViewController:picker animated:YES completion:NULL];
}

You also need to add the MPMediaPickerControllerDelegate protocol to your view controller. Modify the line in Listing 10-5, as shown in Listing 10-43.

Listing 10-43.  Adding the MPMediaPickerControllerDelegate protocol

@interface ViewController : UIViewController<MPMediaPickerControllerDelegate>

Now implement the delegate method that receives the selected items. It should append them to the playlist and dismiss the media picker. Also, if these are the first items added, playback should be started (Listing 10-44).

Listing 10-44.  Implementation of the mediaPicker:didPickMediaItems: delegate method

-(void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
    BOOL shallStartPlayer = self.playlist.count == 0;
    
    [self.playlist addObjectsFromArray:mediaItemCollection.items];
    
    if (shallStartPlayer)
        [self startPlaybackWithItem:[self.playlist objectAtIndex:0]];

    [self dismissViewControllerAnimated:YES completion:NULL];
}

This leads us to the startPlaybackWithItem: method. This method replaces the currently played item (if any), resets the current playback position (in case the item has been played before), and starts the playback, as shown in Listing 10-45.

Listing 10-45. Implementation of the startPlaybackWithItem: method

-(void)startPlaybackWithItem:(MPMediaItem *)mpItem
{
    [self.player replaceCurrentItemWithPlayerItem:[self avItemFromMPItem:mpItem]];
    [self.player seekToTime:kCMTimeZero];
    [self startPlayback];
}

Because AVPlayer works with AVPlayerItem objects rather than MPMediaItems, you need to create one from the selected media item. This is the job of the avItemFromMPItem: method shown in Listing 10-46.

Listing 10-46.  Implementation of the avItemFromMPItem: method

-(AVPlayerItem *)avItemFromMPItem:(MPMediaItem *)mpItem
{
    NSURL *url = [mpItem valueForProperty:MPMediaItemPropertyAssetURL];

    AVPlayerItem *item = [AVPlayerItem playerItemWithURL:url];

    [[NSNotificationCenter defaultCenter]
     addObserver:self
     selector:@selector(playerItemDidReachEnd:)
     name:AVPlayerItemDidPlayToEndTimeNotification
     object:item];

    return item;
}

What’s interesting in Listing 10-46 is that you not only create the AVPlayerItem, but you also register a method to receive a notification when the song has reached its end. This is so you can continue to the next tune in the playlist using the playerItemDidReachEnd: method shown in Listing 10-47.

Listing 10-47.  Implementation of the playerItemDidReachEnd: method

- (void)playerItemDidReachEnd:(NSNotification *)notification
{
    [self goToNextTrack:self];
}

The next stop is the startPlayback method. It starts the player, changes the title of the “Play” button to “Pause,” and calls updateNowPlaying. This implementation is shown in Listing 10-48.

Listing 10-48.  Implementation of the startPlayback method

-(void)startPlayback
{
    [self.player play];
    [self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
    [self updateNowPlaying];
}

Finally, the last method to implement in this chain of calls is updateNowPlaying, as shown in Listing 10-49. Aside from updating your user interface with the current song information, the method also uses the MPNowPlayingInfoCenter class. This class lets you place information on the device's lock screen (see Figure 10-8 for an example) or on other devices when the application streams over AirPlay. You pass it information by setting the nowPlayingInfo property of the default center to a dictionary of media item properties and values.

Listing 10-49.  Implementation of the updateNowPlaying method

-(void)updateNowPlaying
{
    if (self.player.currentItem != nil)
    {
        MPMediaItem *currentMPItem = [self.playlist objectAtIndex:self.currentIndex];
        
        self.infoLabel.text =
            [NSString stringWithFormat:@"%@ - %@",
                [currentMPItem valueForProperty:MPMediaItemPropertyTitle],
                [currentMPItem valueForProperty:MPMediaItemPropertyArtist]];
        
        UIImage *artwork =
            [[currentMPItem valueForProperty:MPMediaItemPropertyArtwork]
                imageWithSize:self.artworkImageView.frame.size];
        self.artworkImageView.image = artwork;
        
        NSString *title = [currentMPItem valueForProperty:MPMediaItemPropertyTitle];
        NSString *artist =
            [currentMPItem valueForProperty:MPMediaItemPropertyArtist];
        NSString *album =
            [currentMPItem valueForProperty:MPMediaItemPropertyAlbumTitle];
            
        NSDictionary *mediaInfo =
           [NSDictionary dictionaryWithObjectsAndKeys:
               artist, MPMediaItemPropertyArtist,
               title, MPMediaItemPropertyTitle,
               album, MPMediaItemPropertyAlbumTitle,
               [currentMPItem valueForProperty:MPMediaItemPropertyArtwork],
                   MPMediaItemPropertyArtwork,
               nil];
        [MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo = mediaInfo;
    }
    else
    {
        self.infoLabel.text = @"...";
        [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
        self.artworkImageView.image = nil;
    }
}


Figure 10-8. Information on the lock screen about the current track

The next action method is togglePlay, which should toggle between play and pause modes. An edge case here is that if the player has not yet been initialized, you need to initialize it with the first item in your playlist. Listing 10-50 shows this implementation.

Listing 10-50.  Implementation of the togglePlay: method

- (IBAction)togglePlay:(id)sender
{
    if (self.playlist.count > 0)
    {
        if (self.player.currentItem == nil)
        {
            [self startPlaybackWithItem:[self.playlist objectAtIndex:0]];
        }
        else
        {
            // Player has an item, pause or resume playing it
            BOOL isPlaying = self.player.currentItem && self.player.rate != 0;
            if (isPlaying)
            {
                [self pausePlayback];
            }
            else
            {
                [self startPlayback];
            }
        }
    }
}

From the code in Listing 10-50, you can see a call to the pausePlayback method that you haven’t yet implemented. That’s easily fixed. All it needs to do is pause the player and update the “Play” button title, as shown in Listing 10-51.

Listing 10-51.  Implementation of the pausePlayback method

-(void)pausePlayback
{
    [self.player pause];
    [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}

Next, add the goToPrevTrack: and goToNextTrack: action methods shown in Listing 10-52. They are pretty straightforward. In goToPrevTrack: you re-create the behavior from the previous recipe: if playback is more than five seconds into the song, the button rewinds the current song rather than skipping to the previous item in the playlist. This is the behavior users expect from most media players.

Listing 10-52.  Implementation of the goToPrevTrack: and goToNextTrack: methods

- (IBAction)goToPrevTrack:(id)sender
{
    if (self.playlist.count == 0)
        return;
    
    if (CMTimeCompare(self.player.currentTime, CMTimeMake(5, 1)) > 0)
    {
        [self.player seekToTime:kCMTimeZero];
    }
    else
    {
        if (self.currentIndex == 0)
        {
            self.currentIndex = self.playlist.count - 1;
        }
        else
        {
            self.currentIndex -= 1;
        }
        MPMediaItem *previousItem = [self.playlist objectAtIndex:self.currentIndex];
        [self startPlaybackWithItem:previousItem];
    }
}

- (IBAction)goToNextTrack:(id)sender
{
    if (self.playlist.count == 0)
        return;
    
    if (self.currentIndex == self.playlist.count - 1)
    {
        self.currentIndex = 0;
    }
    else
    {
        self.currentIndex += 1;
    }
    MPMediaItem *nextItem = [self.playlist objectAtIndex:self.currentIndex];
    [self startPlaybackWithItem: nextItem];
}

The CMTimeMake() function you used in Listing 10-52 is a very flexible function that takes two inputs. The first is the number of time units you want, and the second is the timescale, that is, the number of units per second: a timescale of 1 means each unit is one second, a timescale of 2 means each unit is half a second, and so on. A call of CMTimeMake(100, 10) therefore makes 100 units of 1/10 second each, resulting in 10 seconds.
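The comparison from Listing 10-52 can be read as follows (the variable names are just for illustration):

```objectivec
CMTime threshold  = CMTimeMake(5, 1);     // 5 units of 1 s each = 5 seconds
CMTime tenSeconds = CMTimeMake(100, 10);  // 100 units of 1/10 s each = 10 seconds

// CMTimeCompare returns a negative, zero, or positive value,
// much like strcmp, so this branch is taken:
if (CMTimeCompare(threshold, tenSeconds) < 0)
{
    NSLog(@"5 seconds comes before 10 seconds");
}
```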

There’s only one feature remaining unimplemented: clearing the playlist. Listing 10-53 shows this implementation.

Listing 10-53.  Implementation of the clearPlaylist: method

- (IBAction)clearPlaylist:(id)sender
{
    [self.player replaceCurrentItemWithPlayerItem:nil];
    [self.playlist removeAllObjects];
    [self updateNowPlaying];
    [self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}

Finally, your app is now ready to build and run. When you test the app, it should continue to play music even after the application has entered the background. Figure 10-9 shows the multitasking bar with which you can control your media player even when another app is active.


Figure 10-9. The “remote” controls in the multitasking bar that can control an app playing audio in the background

Tip   This recipe used AVPlayer to do the playback. It can handle only one item at a time, which is why we had to implement an external playlist. However, there is an alternative player that you can use to play queued items. The AVQueuePlayer is suitable for applications that need to play a sequence of items, but don’t need complex navigation within the playlist.
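A minimal sketch of that alternative, reusing the asset URL lookup from Listing 10-46 and assuming the same picked mediaItemCollection, might look like this:

```objectivec
NSMutableArray *avItems = [NSMutableArray array];
for (MPMediaItem *mpItem in mediaItemCollection.items)
{
    // DRM-protected items return nil for the asset URL, so skip those.
    NSURL *assetURL = [mpItem valueForProperty:MPMediaItemPropertyAssetURL];
    if (assetURL)
    {
        [avItems addObject:[AVPlayerItem playerItemWithURL:assetURL]];
    }
}
// AVQueuePlayer plays the items in sequence automatically.
AVQueuePlayer *queuePlayer = [AVQueuePlayer queuePlayerWithItems:avItems];
[queuePlayer play];
```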

Summary

The complete multimedia experience is one that goes beyond a simple matter of listening to music. Sound, as a product, is about the tiny details that make things just a little bit better. From recording music to filtering media items to creating volume ramps, every little detail that you, as a developer, take care to include will eventually result in more powerful and enjoyable tools. In iOS development, Apple has provided us with an incredibly powerful set of multimedia-based functionalities. We should not let it go to waste.
