In the words of Aldous Huxley, “After silence, that which comes nearest to expressing the inexpressible is music.” We live in a world where we are surrounded by sound and music. From the most subtle background tune in an advertisement, to the immense blast of an electric guitar at a rock concert, sound plays an integral part in our world in all its varieties, and to ignore this fact in our development process would be detrimental to both the user experience and the developer community. Music and audio have a tremendous impact on our lives, and as developers, it is our responsibility to translate this force into our applications in order to bring the most complete experience to users, even if they never consciously notice it.
Throughout this chapter, a variety of recipes will make use of accessing the iPod library on the device on which the app is run. In order to fully test these, you should ensure that there are at least a few songs in your device’s music library.
If you ask almost any random person what they think of when they hear the words “iPhone” and “audio,” they will probably be thinking along the lines of their iPod and the thousands of songs they have downloaded. What most users tend to overlook, despite its immense importance, is the concept of background audio and sound effects. These sound clips and tunes may go completely unnoticed by the user in normal use, but in terms of app functionality and design they can tremendously improve the quality of an app. It may be the little “shutter click” when you take a picture, or the background music of a game that gets stuck in your head after you play it for too long, but regardless of whether the user notices, it can make a world of difference. The iOS AV Foundation framework provides an incredibly simple way to access, play, and manipulate sound files through the AVAudioPlayer class. Here you will create a sample project that plays an audio file and allows the user to manipulate the clip's playback.
First, you will create a new project, naming it “Chapter7Recipe1”. Here, I have used the class prefix “Player”. As with most of the previous recipes, you can use the Single View Application template. Ensure the Storyboard check box is unchecked, while the Use Automatic Reference Counting box is checked, so that your configuration resembles Figure 7–1.
The first thing you need to do is link your project with a few frameworks that you will use to play your sound. Select your project in the navigation pane, and then select your app's target under Targets. Select the Build Phases tab, and expand the section titled “Link Binary With Libraries”, as shown in Figure 7–2.
Next, click the + button, and add the following two frameworks:
AVFoundation.framework
AudioToolbox.framework
The AVFoundation framework contains the AVAudioPlayer class, which you will be using to play audio. Next, switch over to your view controller's header file, and import the frameworks' header files by adding the following import statements:
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
Now you will build your view in the view controller's XIB file. The view will include sliders for your audio player's pan, volume, and rate, as well as buttons to play and pause the player, and a third button to play the device's “vibrate” system sound. You will also monitor your audio player's channel levels via labels at the top of the view. Set up your view so it looks like Figure 7–3.
You need your sliders' values to match the possible values of the properties they control. Using the Attributes inspector, set the minimum and maximum values of your “rate” slider to 0.5 and 2.0 (corresponding to half speed and double speed), respectively, and those of the “pan” slider to -1 and 1 (corresponding to full left pan and full right pan). The “volume” slider's default values are already correct, as the volume property ranges from 0 to 1.
As you have done in past recipes, connect some of your view objects to your header file by holding ⌃ (Ctrl) and dragging from each element over to the view controller's header file. Name the UISliders sliderRate, sliderPan, and sliderVolume as appropriate. Your UIButtons will be vibrateButton, playButton, and pauseButton. Your two level-monitoring UILabels at the top (which I have given default values of “0.0”) will be averageLabel and peakLabel. You will define three methods (of type IBAction, not void) in your header file, one for each button, and connect your buttons to them by holding ⌃ (Ctrl) and dragging from each button to its action definition. You will also define three more methods to connect your UISliders to. Your header file should now look like so:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AudioToolbox/AudioToolbox.h>
@interface PlayerViewController : UIViewController
@property (strong, nonatomic) IBOutlet UIButton *vibrateButton;
@property (strong, nonatomic) IBOutlet UIButton *playButton;
@property (strong, nonatomic) IBOutlet UIButton *pauseButton;
@property (strong, nonatomic) IBOutlet UISlider *sliderVolume;
@property (strong, nonatomic) IBOutlet UISlider *sliderPan;
@property (strong, nonatomic) IBOutlet UISlider *sliderRate;
@property (strong, nonatomic) IBOutlet UILabel *averageLabel;
@property (strong, nonatomic) IBOutlet UILabel *peakLabel;
-(IBAction)vibratePressed:(id)sender;
-(IBAction)playPressed:(id)sender;
-(IBAction)pausePressed:(id)sender;
-(IBAction)volumeSliderChanged:(UISlider *)sender;
-(IBAction)panSliderChanged:(UISlider *)sender;
-(IBAction)rateSliderChanged:(UISlider *)sender;
@end
You will also add a property to your header file to keep track of your AVAudioPlayer, written like so:
@property (nonatomic, strong) AVAudioPlayer *player;
Synthesize this new property in your implementation file, and add the following lines to your -viewDidUnload method to make sure your application is as efficient as possible in its memory use.
self.player.delegate = nil;
[self setPlayer:nil];
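Taken together, the housekeeping in the implementation file might look like the following minimal sketch (the backing ivar name _player is just the common convention):

```objectivec
@synthesize player = _player;

- (void)viewDidUnload
{
    // Break the delegate link and release the player when the view unloads.
    self.player.delegate = nil;
    [self setPlayer:nil];
    [super viewDidUnload];
}
```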
The last step in your header file is to make your view controller conform to the AVAudioPlayerDelegate protocol.
@interface PlayerViewController : UIViewController <AVAudioPlayerDelegate>
Before you proceed, you need to select and import the sound file that your application will be playing. The file I use is called systemCheck.mp3, and the following code will reflect this file name. You will need to change the file name and type according to the file that you choose. You should consult Apple's documentation on which file types are appropriate, but it is fairly safe to assume that most commonly used file types such as .wav or .mp3 will work.
In order to catch any errors in playing files, you can implement one of the methods in the AVAudioPlayerDelegate protocol:
-(void)audioPlayerDecodeErrorDidOccur:(AVAudioPlayer *)player error:(NSError *)error
{
NSLog(@"Error playing file: %@", [error localizedDescription]);
}
Find your sound clip in the Finder, and then click and drag the file into your project in Xcode. My preference is to put such files in the Supporting Files group, but this is entirely optional. A dialog will appear prompting you to choose options for adding the file. The only change you may need to make is to make sure that the box next to “Copy items into destination group's folder (if needed)” is checked, as in Figure 7–4.
Now that you have your sound file, you can implement your -viewDidLoad method to set up your AVAudioPlayer.
- (void)viewDidLoad
{
[super viewDidLoad];
NSString *fileName = @"systemCheck";
NSString *fileType = @"mp3";
NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:fileName ofType:fileType];
NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
NSError *error;
self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:soundFileURL error:&error];
self.player.enableRate = YES; //Allows us to change the playback rate.
self.player.meteringEnabled = YES; //Allows us to monitor levels
self.player.delegate = self;
self.sliderVolume.value = self.player.volume;
self.sliderRate.value = self.player.rate;
self.sliderPan.value = self.player.pan;
[self.player prepareToPlay]; //Preload audio to decrease lag
NSTimer *timer = [NSTimer scheduledTimerWithTimeInterval:0.1 target:self
selector:@selector(updateLabels) userInfo:nil repeats:YES];
[timer fire];
}
As you can see, you've gotten the URL for your sound file, and then created your AVAudioPlayer with it. You set the enableRate property to allow you to change the playback rate, and set the meteringEnabled property to allow you to monitor the player's levels. You called the optional -prepareToPlay on your player in order to preload the sound file, reducing the delay when playback starts. Finally, you created a timer that performs your -updateLabels method ten times a second, so your labels update at a nearly constant rate.
Let's put in a simple implementation of the -updateLabels method.
-(void)updateLabels
{
[self.player updateMeters];
self.averageLabel.text = [NSString stringWithFormat:@"%f", [self.player
averagePowerForChannel:0]];
self.peakLabel.text = [NSString stringWithFormat:@"%f", [self.player
peakPowerForChannel:0]];
}
You need to call -updateMeters any time you use the -averagePowerForChannel: or -peakPowerForChannel: methods in order to get the most up-to-date information, as these values do not refresh automatically. Both methods take an NSUInteger argument that specifies the channel to retrieve information for. By passing 0, you specify the left channel of a stereo track, or the single channel of a mono track. Given that you are dealing with only a basic use of the functionality, channel 0 is a good default.
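If you wanted to meter a stereo file more completely, you could average the readings across every channel instead; a minimal sketch, using AVAudioPlayer's numberOfChannels property:

```objectivec
// Average the power readings across all of the player's channels.
[self.player updateMeters];
float total = 0.0f;
for (NSUInteger channel = 0; channel < self.player.numberOfChannels; channel++)
{
    total += [self.player averagePowerForChannel:channel];
}
float averageAcrossChannels = total / self.player.numberOfChannels;
```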
Next, you will implement the action methods for your UISliders, which are performed every time a slider's value changes.
-(void)volumeSliderChanged:(UISlider *)sender
{
self.player.volume = sender.value;
}
-(void)panSliderChanged:(UISlider *)sender
{
self.player.pan = sender.value;
}
-(void)rateSliderChanged:(UISlider *)sender
{
self.player.rate = sender.value;
}
Now, you will implement your button action methods, which are also quite simple.
-(void)vibratePressed:(id)sender
{
AudioServicesPlaySystemSound(kSystemSoundID_Vibrate);
}
-(void)playPressed:(id)sender
{
[self.player play];
}
-(void)pausePressed:(id)sender
{
[self.player pause];
}
AudioServicesPlaySystemSound() is a function defined in the Audio Toolbox framework that plays predefined system sounds, such as the one just shown, which causes the phone to vibrate. You can also register your own system sounds from certain file types.
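For instance, you could register a short clip of your own as a system sound; this is a sketch assuming a hypothetical file named tap.caf has been added to the project:

```objectivec
// Register a custom file as a system sound (tap.caf is a hypothetical file name).
NSURL *tapURL = [[NSBundle mainBundle] URLForResource:@"tap" withExtension:@"caf"];
SystemSoundID tapSound;
AudioServicesCreateSystemSoundID((__bridge CFURLRef)tapURL, &tapSound);
AudioServicesPlaySystemSound(tapSound);
```

When the sound is no longer needed, you can release it by passing the SystemSoundID to AudioServicesDisposeSystemSoundID().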
NOTE: While most of the AV Foundation functionalities that you are currently working with will work on the simulator using your computer's microphone and speakers, the foregoing vibrate sound will not. You will need a physical device to test this functionality.
At this point, your app should be able to successfully play and pause your music, and you can adjust your playback rate, pan, and volume, and monitor your output levels.
Whenever you are dealing with an app that involves sound or music, there is always a concern that your app may be interrupted by a phone call or text, and you should always include functionality to deal with these interruptions. This is done through the AVAudioPlayer delegate methods. Since you have already set your view controller as your AVAudioPlayer's delegate, you simply need to implement these methods to pause and resume your sound clip when an interruption begins or ends, respectively.
-(void)audioPlayerBeginInterruption:(AVAudioPlayer *)player
{
[self.player pause];
}
-(void)audioPlayerEndInterruption:(AVAudioPlayer *)player
{
[self.player play];
}
This is an incredibly simple implementation of these delegate methods. For your app, you may choose to add further functionality, such as saving data or performing any other tasks that need to be done before the application is interrupted.
You should now be able to see the flexibility of the AVAudioPlayer, despite its simplicity. By using multiple instances of AVAudioPlayer, you can implement complex audio designs with multiple sounds playing at the same time. You could, for example, have a background music track running in one AVAudioPlayer, and one or two others handling event-based sound effects. The power, simplicity, and flexibility of the AVAudioPlayer class are what make it so popular among iOS developers.
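As a sketch of that idea (the file names and player properties here are hypothetical), one player could loop background music while a second fires short effects:

```objectivec
// Background music in one player, looping indefinitely.
NSURL *musicURL = [[NSBundle mainBundle] URLForResource:@"background" withExtension:@"mp3"];
self.musicPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:musicURL error:nil];
self.musicPlayer.numberOfLoops = -1; // -1 loops forever
[self.musicPlayer play];

// A second player for event-based sound effects.
NSURL *effectURL = [[NSBundle mainBundle] URLForResource:@"effect" withExtension:@"wav"];
self.effectPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:effectURL error:nil];
[self.effectPlayer prepareToPlay]; // preload so the effect fires without lag
```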
Now that you have dealt with the key concept of playing audio, you can deal with the reverse: recording audio. This process is very similar in both structure and implementation to playing audio. You will use the AVAudioRecorder class to do your recording, in conjunction with another AVAudioPlayer to handle the playback of your recording.
Make a new project, titling it “Chapter7Recipe2”, with a class prefix “Recording”, using the Single View Application template.
You will need to import the AV Foundation framework into your project again. See the previous recipe on how to do this. Unlike the previous recipe, you will not need the Audio Toolbox framework, as you will not include the device vibrate function in this project. Make sure to add the following import statement to your view controller's header file.
#import <AVFoundation/AVFoundation.h>
Next, you will set up your view in your controller's XIB file, so that it looks like Figure 7–5.
Next, you will connect your two buttons, as well as your two monitoring labels, over to your header file. Switch over to the Assistant editor. By holding ⌃ (Ctrl) and dragging from your view elements to your header file, connect your buttons and labels. For this project, I have named the buttons recordButton and playButton, and the labels averageLevel and peakLevel.
You will also define two action methods for your buttons to perform, named -recordPressed: and -playPressed:, and connect your buttons to them. To do this, hold ⌃ (Ctrl) again, and drag from each button in the XIB file to its respective action in your header file.
Before you proceed to your implementation file, add an instance variable named url, of type NSURL, to your header file to keep track of your recording's saved location. Finally, you need to add two more properties: one for your AVAudioPlayer, named player, and one for your AVAudioRecorder, named audioRecorder. Make sure to properly handle these by synthesizing them in your implementation file, and setting them (as well as their delegates) to nil in your -viewDidUnload method. At this point, your header file should look like so:
#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
@interface RecordingViewController : UIViewController {
NSURL *url;
}
@property (strong, nonatomic) IBOutlet UIButton *recordButton;
@property (strong, nonatomic) IBOutlet UIButton *playButton;
@property (strong, nonatomic) IBOutlet UILabel *averageLevel;
@property (strong, nonatomic) IBOutlet UILabel *peakLevel;
@property (strong, nonatomic) AVAudioRecorder *audioRecorder;
@property (strong, nonatomic) AVAudioPlayer *player;
-(IBAction)recordPressed:(id)sender;
-(IBAction)playPressed:(id)sender;
@end
Now, you will write your -viewDidLoad method, which is quite similar to that of the previous recipe.
-(void)viewDidLoad
{
[super viewDidLoad];
url = [self tempFileURL];
self.audioRecorder = [[AVAudioRecorder alloc] initWithURL:url settings:nil error:nil];
self.audioRecorder.meteringEnabled = YES;
NSTimer *timer = [NSTimer scheduledTimerWithTimeInterval:0.01 target:self
selector:@selector(updateLabels) userInfo:nil repeats:YES];
[timer fire];
[self.audioRecorder prepareToRecord];
}
Just as in the previous recipe, you send your AVAudioRecorder the -prepareToRecord message in order to reduce the delay when recording starts. You have again set up a timer to repeatedly update your level-monitoring labels.
The foregoing implementation uses the method -tempFileURL to retrieve your URL, which is implemented as follows. In order to avoid a compiler warning, make sure to define this method before the -viewDidLoad method, or simply place its declaration, -(NSURL *)tempFileURL;, in the header file.
- (NSURL *) tempFileURL
{
NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"recording.wav"];
NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
NSFileManager *manager = [[NSFileManager alloc] init];
if ([manager fileExistsAtPath:outputPath])
{
[manager removeItemAtPath:outputPath error:nil];
}
return outputURL;
}
Your -updateLabels method is again implemented like so:
-(void)updateLabels
{
[self.audioRecorder updateMeters];
self.averageLevel.text = [NSString stringWithFormat:@"%f", [self.audioRecorder
averagePowerForChannel:0]];
self.peakLevel.text = [NSString stringWithFormat:@"%f", [self.audioRecorder
peakPowerForChannel:0]];
}
Finally, you just need to implement your buttons' actions.
-(void)recordPressed:(id)sender
{
if ([self.audioRecorder isRecording])
{
[self.audioRecorder stop];
[self.recordButton setTitle:@"Record" forState:UIControlStateNormal];
}
else
{
[self.audioRecorder record];
[self.recordButton setTitle:@"Stop" forState:UIControlStateNormal];
}
}
-(void)playPressed:(id)sender
{
NSFileManager *manager = [[NSFileManager alloc] init];
NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@", NSTemporaryDirectory(), @"recording.wav"];
if (![self.player isPlaying])
{
if ([manager fileExistsAtPath:outputPath])
{
self.player = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
[self.player play];
[self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
}
}
else
{
[self.player pause];
[self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}
}
CAUTION: In the foregoing implementation, it is incredibly important to include the check confirming that a file exists at the given path, in case the user presses the “play” button before any sound has been recorded. Initializing an AVAudioPlayer with a URL that points to no file will cause your application to throw an exception.
At this point, your application will successfully record and play a sound. Before you are finished, you need to implement an AVAudioPlayerDelegate method in order to handle the end of your playback. First, you need to make sure the view controller conforms to the AVAudioPlayerDelegate protocol, so that the top of your view controller's header file now looks like so:
@interface RecordingViewController : UIViewController <AVAudioPlayerDelegate>
Now you need to make your view controller your player's delegate in your -playPressed: method, after the player has been created, with the following line.
self.player.delegate = self;
Finally, you can implement your delegate method.
-(void)audioPlayerDidFinishPlaying:(AVAudioPlayer *)player successfully:(BOOL)flag
{
[self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}
As with the previous recipe, you can implement delegate methods for both your AVAudioRecorder and your AVAudioPlayer to handle interruptions such as phone calls or text messages by pausing and restarting your recording or playback.
Other useful AVAudioRecorder methods that you have not implemented here include -pause, which pauses recording but allows -record to be called again to continue recording to the same file, and -recordForDuration:, which lets you specify a limit on recording time.
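A quick sketch of how those two methods might be used:

```objectivec
[self.audioRecorder pause];   // suspend recording; the file stays open
[self.audioRecorder record];  // resume recording to the same file

// Or cap the recording at 30 seconds; it stops automatically when time is up.
[self.audioRecorder recordForDuration:30.0];
```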
As you can see, the AVAudioRecorder and AVAudioPlayer can work incredibly well in conjunction to provide a complete yet simple audio interface for the user.
So far you have been able to deal with playing and manipulating sound files that you have included in your project, but there is an easy way to access a significantly larger supply of sound files: by accessing the user's music library.
Here you will make another new project, this time called “MusicPick”. First, you need to link your project with the Media Player framework, and as usual add an import statement for it to your view controller.
You will set up your view to work as a basic music player, so it looks like Figure 7–6.
Make sure to connect all four buttons, naming them playButton, prevButton, nextButton, and queueButton. Your slider will be sliderVolume, and your “info” UILabel will be infoLabel.
You will also define five actions for your elements: -playPressed:, -prevPressed:, -nextPressed:, -queuePressed:, and -volumeChanged:. Make sure to connect each element to its respective action.
You will also define two properties in your header file: one of type MPMusicPlayerController called player, which you will use to play music, and one of type MPMediaItemCollection called myCollection, which will help you keep track of your chosen tracks. Finally, you will make your view controller the delegate for a class called MPMediaPickerController, which will allow your user to select music to play, by conforming to the MPMediaPickerControllerDelegate protocol. Overall, your header file should now look like so:
#import <UIKit/UIKit.h>
#import <MediaPlayer/MediaPlayer.h>
@interface MainViewController : UIViewController <MPMediaPickerControllerDelegate>
@property (strong, nonatomic) IBOutlet UIButton *queueButton;
@property (strong, nonatomic) IBOutlet UIButton *prevButton;
@property (strong, nonatomic) IBOutlet UIButton *playButton;
@property (strong, nonatomic) IBOutlet UIButton *nextButton;
@property (strong, nonatomic) IBOutlet UISlider *sliderVolume;
@property (strong, nonatomic) IBOutlet UILabel *infoLabel;
@property (strong, nonatomic) MPMediaItemCollection *myCollection;
@property (strong, nonatomic) MPMusicPlayerController *player;
-(IBAction)queuePressed:(id)sender;
-(IBAction)prevPressed:(id)sender;
-(IBAction)playPressed:(id)sender;
-(IBAction)nextPressed:(id)sender;
-(IBAction)volumeChanged:(id)sender;
@end
Make sure to synthesize both player and myCollection, and, as usual, properly handle them in -viewDidUnload as well.
Now, you can set up your -viewDidLoad method.
-(void)viewDidLoad
{
[super viewDidLoad];
self.infoLabel.text = @"...";
self.player = [MPMusicPlayerController applicationMusicPlayer];
[self setNotifications];
[self.player beginGeneratingPlaybackNotifications];
[self.player setShuffleMode:MPMusicShuffleModeOff];
self.player.repeatMode = MPMusicRepeatModeNone;
self.sliderVolume.value = self.player.volume;
}
The MPMusicPlayerController class has two important class methods that give you an instance of the class. The one used previously, +applicationMusicPlayer, returns an application-specific music player. This option can be useful for keeping your music separate from the device's actual iPod, but has the downside of being unable to keep playing once the app enters the background. Alternatively, you can use +iPodMusicPlayer, which allows playback to continue in the background. The main thing to keep in mind in that case, however, is that your player may already have a nowPlayingItem from the actual iPod, which you should be prepared to handle.
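If you chose +iPodMusicPlayer instead, a minimal sketch of handling a pre-existing item in -viewDidLoad might look like this:

```objectivec
self.player = [MPMusicPlayerController iPodMusicPlayer];
MPMediaItem *current = self.player.nowPlayingItem;
if (current != nil)
{
    // The iPod already has a track loaded; reflect it in your interface.
    self.infoLabel.text = [current valueForProperty:MPMediaItemPropertyTitle];
}
```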
Whenever you use an instance of MPMusicPlayerController, it is recommended to register for notifications for whenever the playback state changes, or whenever the currently playing song changes. You will do this in your -setNotifications method, like so:
-(void)setNotifications
{
NSNotificationCenter *notificationCenter = [NSNotificationCenter defaultCenter];
[notificationCenter
addObserver: self
selector: @selector (handle_NowPlayingItemChanged:)
name: MPMusicPlayerControllerNowPlayingItemDidChangeNotification
object: self.player];
[notificationCenter
addObserver: self
selector: @selector (handle_PlaybackStateChanged:)
name: MPMusicPlayerControllerPlaybackStateDidChangeNotification
object: self.player];
[notificationCenter addObserver:self
selector:@selector(volumeChangedHardware:)
name:@"AVSystemController_SystemVolumeDidChangeNotification"
object:nil];
}
You have also included a third notification registration in order to make sure you know any time the user adjusts the device volume using the device's side buttons. This way, your application can be used just like a regular music player.
Each of these notifications triggers a selector; these selectors are defined as follows.
-(void)volumeChangedHardware:(id)sender
{
[self.sliderVolume setValue:self.player.volume animated:YES];
}
- (void) handle_PlaybackStateChanged: (id) notification
{
MPMusicPlaybackState playbackState = [self.player playbackState];
if (playbackState == MPMusicPlaybackStateStopped)
{
[self.playButton setTitle:@"Play" forState:UIControlStateNormal];
self.infoLabel.text = @"...";
[self.player stop];
}
else if (playbackState == MPMusicPlaybackStatePaused)
{
[self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}
else if (playbackState == MPMusicPlaybackStatePlaying)
{
[self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
}
}
- (void) handle_NowPlayingItemChanged: (id) notification
{
MPMediaItem *currentItemPlaying = [self.player nowPlayingItem];
if (currentItemPlaying)
{
NSString *info = [NSString stringWithFormat:@"%@ - %@", [currentItemPlaying valueForProperty:MPMediaItemPropertyTitle], [currentItemPlaying valueForProperty:MPMediaItemPropertyArtist]];
self.infoLabel.text = info;
}
else
{
self.infoLabel.text = @"...";
}
if (self.player.playbackState == MPMusicPlaybackStatePlaying)
{
[self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
}
}
As you can see, when the device's volume is changed, you will simply animate your slider to adjust to the new value. When the playback state of the device is changed, you are simply adjusting your view based on the new state. Whenever the currently playing song is changed, you are updating your user interface to display basic information about whichever media item is now playing.
Next, you will define two of your MPMediaPickerController's delegate methods to handle both cancellation and successful selection of media.
-(void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker
{
[self dismissModalViewControllerAnimated:YES];
}
-(void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
[self updateQueueWithMediaItemCollection:mediaItemCollection];
[self dismissModalViewControllerAnimated:YES];
}
An MPMediaItemCollection is a group of selected media items, which you can then queue up in your MPMusicPlayerController and iterate through. You will update your player's queue by defining the -updateQueueWithMediaItemCollection: method.
-(void)updateQueueWithMediaItemCollection:(MPMediaItemCollection *)collection
{
if (collection)
{
if (self.myCollection == nil)
{
self.myCollection = collection;
[self.player setQueueWithItemCollection: self.myCollection];
[self.player play];
}
else
{
BOOL wasPlaying = NO;
if (self.player.playbackState == MPMusicPlaybackStatePlaying) {
wasPlaying = YES;
}
MPMediaItem *nowPlayingItem = self.player.nowPlayingItem;
NSTimeInterval currentPlaybackTime = self.player.currentPlaybackTime;
NSMutableArray *combinedMediaItems =
[[self.myCollection items] mutableCopy];
NSArray *newMediaItems = [collection items];
[combinedMediaItems addObjectsFromArray: newMediaItems];
[self setMyCollection:
[MPMediaItemCollection collectionWithItems:
(NSArray *) combinedMediaItems]];
[self.player setQueueWithItemCollection:self.myCollection];
self.player.nowPlayingItem = nowPlayingItem;
self.player.currentPlaybackTime = currentPlaybackTime;
if (wasPlaying)
{
[self.player play];
}
}
}
}
This method may seem complex, but it is actually a fairly linear progression. First, after checking that the collection is not nil, you check whether a previous queue has been set up. If not, you simply set your player's queue to this collection. If so, you combine the two collections, set your player's queue to the result, and then restore playback to where it previously was.
TIP: If the queue is updated while an MPMusicPlayerController is playing, you will probably notice a small break in playback as the queue is updated. You can work around this by using a BOOL flag to defer queue updates until between songs.
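One possible sketch of that workaround, assuming two hypothetical properties, a BOOL named queueUpdatePending and an MPMediaItemCollection named pendingCollection:

```objectivec
// In the picker delegate: defer the queue update if something is playing.
if (self.player.playbackState == MPMusicPlaybackStatePlaying)
{
    self.pendingCollection = mediaItemCollection;
    self.queueUpdatePending = YES;
}
else
{
    [self updateQueueWithMediaItemCollection:mediaItemCollection];
}

// In handle_NowPlayingItemChanged:, apply the deferred update between songs.
if (self.queueUpdatePending)
{
    self.queueUpdatePending = NO;
    [self updateQueueWithMediaItemCollection:self.pendingCollection];
    self.pendingCollection = nil;
}
```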
The methods performed by your buttons and slider can all be defined fairly simply.
-(void)queuePressed:(id)sender
{
MPMediaPickerController *picker = [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
picker.delegate = self;
picker.allowsPickingMultipleItems = YES;
picker.prompt =
NSLocalizedString (@"Add songs to play",
"Prompt in media item picker");
[self presentModalViewController:picker animated:YES];
}
-(void)prevPressed:(id)sender
{
if ([self.player currentPlaybackTime] > 5.0)
{
[self.player skipToBeginning];
}
else
{
[self.player skipToPreviousItem];
}
}
-(void)volumeChanged:(id)sender
{
if (self.player.volume != self.sliderVolume.value)
{
self.player.volume = self.sliderVolume.value;
}
}
-(void)playPressed:(id)sender
{
if ((self.myCollection != nil) && (self.player.playbackState != MPMusicPlaybackStatePlaying))
{
[self.player play];
[self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
}
else if (self.player.playbackState == MPMusicPlaybackStatePlaying)
{
[self.player pause];
[self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}
}
-(void)nextPressed:(id)sender
{
[self.player skipToNextItem];
}
As you can see, you have given users a five-second window in which to use the Previous button to skip to the previous song, before they must first skip back to the beginning of the current song.
Your very last step in this project is to fully implement your -viewDidUnload method to correctly tear down your player. Since you registered as an observer for three different kinds of notifications at the beginning of your application, you need to remove your view controller as an observer of each. You will also be sure to -stop your MPMusicPlayerController and set its queue to nil. In its entirety, the method looks like so:
- (void)viewDidUnload
{
[self.player stop];
[self.player setQueueWithItemCollection:nil];
[[NSNotificationCenter defaultCenter]
removeObserver: self
name: MPMusicPlayerControllerNowPlayingItemDidChangeNotification
object: self.player];
[[NSNotificationCenter defaultCenter]
removeObserver: self
name: MPMusicPlayerControllerPlaybackStateDidChangeNotification
object: self.player];
[[NSNotificationCenter defaultCenter] removeObserver:self name:@"AVSystemController_SystemVolumeDidChangeNotification" object:nil];
[self.player endGeneratingPlaybackNotifications];
self.myCollection = nil;
self.player = nil;
[self setQueueButton:nil];
[self setPrevButton:nil];
[self setPlayButton:nil];
[self setNextButton:nil];
[self setSliderVolume:nil];
[self setInfoLabel:nil];
[super viewDidUnload];
}
One thing to note when you run this application: until you start playing music, you will not be able to adjust your MPMusicPlayerController's volume by using the external volume buttons, as these will still control the ringer volume, as opposed to the playback volume. Once you select a song to play, these buttons gain full control over the playback volume.
You can probably see the sheer power that this framework provides the developer. By being able to access the user's own library, you open up a whole new level of audio customization for your application, with possibilities ranging from selecting music to wake up to, to specifying your own background music for a game. By giving a user the power of choice and personalization, your application's marketability and functionality increase tenfold.
Now that you have seen a very simple way to allow the user to select music, you can implement an even more powerful way to query, filter, and sort music from the user's media library.
Here you will simply be taking the previous recipe and adding functionality to it, specifically by allowing the user to type the name of an artist and then querying the library for music by that artist.
First, you will add a UIButton as well as a UITextField to the XIB left over from the previous recipe, so that your view now looks like Figure 7–7. Make sure to correctly connect the button to an action called -queryPressed: and the text field to a property called textFieldArtist.
The first thing you will do with your new UITextField is set its delegate to your view controller by adding the following lines to your -viewDidLoad.
self.textFieldArtist.delegate = self;
self.textFieldArtist.enablesReturnKeyAutomatically = YES;
Make sure to adjust your header file to declare that your view controller conforms to the UITextFieldDelegate protocol.
Next, implement the following UITextFieldDelegate method so that your text field correctly dismisses the keyboard when the return key is pressed.
-(BOOL)textFieldShouldReturn:(UITextField *)textField
{
[textField resignFirstResponder];
return NO;
}
You may also choose to call -queryPressed: (which you will implement momentarily) from the -textFieldShouldReturn: method, so that pressing the return key automatically performs the query. I opted against doing this in case the user wants to wait for the current song to finish before performing the query.
Next, implement the -queryPressed: method.
-(void)queryPressed:(id)sender
{
NSString *artist = self.textFieldArtist.text;
if (artist != nil && ![artist isEqualToString:@""])
{
MPMediaPropertyPredicate *artistPredicate = [MPMediaPropertyPredicate predicateWithValue:artist forProperty:MPMediaItemPropertyArtist comparisonType:MPMediaPredicateComparisonContains];
MPMediaQuery *query = [[MPMediaQuery alloc] init];
[query addFilterPredicate:artistPredicate];
NSArray *result = [query items];
if ([result count] > 0)
{
[self updateQueueWithMediaItemCollection:[MPMediaItemCollection collectionWithItems:result]];
}
else
self.infoLabel.text = @"Artist Not Found.";
}
}
As you can see, querying the media library is a fairly simple process, which at its bare minimum requires only an instance of the MPMediaQuery class. You can then add MPMediaPropertyPredicate objects to a query to make it more specific.
Using MPMediaPropertyPredicate effectively requires a decent knowledge of the different MPMediaItem properties, so that you know exactly what kind of information you can acquire. Not all MPMediaItem properties are filterable, and the set of filterable properties also differs if you are dealing specifically with a podcast. You should refer to the Apple documentation on MPMediaItem for a full list of properties, but the following are the most commonly used ones:
MPMediaItemPropertyMediaType
MPMediaItemPropertyTitle
MPMediaItemPropertyAlbumTitle
MPMediaItemPropertyArtist
MPMediaItemPropertyArtwork
TIP: Whenever you are using MPMediaItemPropertyArtwork, you can use the -imageWithSize: method defined in the MPMediaItemArtwork class to create a UIImage from the artwork.
3. Whenever multiple filter predicates specifying different properties are added to a query, the predicates are evaluated using the AND operator, meaning that if you specify an artist name and an album name, you will receive only songs by that artist AND from that specific album.
4. Do not add two filter predicates of the same property to a query, as the resulting behavior is not defined. If you wish to query a database for multiple specific values of the same property, such as filtering for all songs by two different artists, the better method is to simply create two queries, and then combine their results afterward.
5. The comparisonType property of an MPMediaPropertyPredicate specifies how exact you want your predicate to be. A value of MPMediaPredicateComparisonEqualTo returns only items whose value is exactly equal to the given string, while a value of MPMediaPredicateComparisonContains, as shown earlier, returns items that merely contain the given string, which usually makes for a broader search.
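The two-query approach mentioned in item 4 above can be sketched as follows. This is a minimal sketch, and "Artist A" and "Artist B" are placeholder values, not names taken from this recipe.

```objc
// Hedged sketch: query each artist separately, then merge the results,
// rather than adding two MPMediaItemPropertyArtist predicates to one query.
MPMediaQuery *queryA = [[MPMediaQuery alloc] init];
[queryA addFilterPredicate:[MPMediaPropertyPredicate predicateWithValue:@"Artist A"
                                                            forProperty:MPMediaItemPropertyArtist
                                                         comparisonType:MPMediaPredicateComparisonEqualTo]];
MPMediaQuery *queryB = [[MPMediaQuery alloc] init];
[queryB addFilterPredicate:[MPMediaPropertyPredicate predicateWithValue:@"Artist B"
                                                            forProperty:MPMediaItemPropertyArtist
                                                         comparisonType:MPMediaPredicateComparisonEqualTo]];
// Combine both result sets into a single array of MPMediaItems.
NSMutableArray *combined = [NSMutableArray arrayWithArray:[queryA items]];
[combined addObjectsFromArray:[queryB items]];
```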
As an added piece of functionality, an MPMediaQuery can also be given a “grouping property,” so that it automatically groups its results. You could, for example, filter a query by a specific artist but group according to the album name. In this way, you can retrieve all the songs by a specific artist yet iterate through them album by album, as demonstrated by the following code, which could be added to your -queryPressed: method after the query object is created.
[query setGroupingType: MPMediaGroupingAlbum];
NSArray *albums = [query collections];
for (MPMediaItemCollection *album in albums)
{
MPMediaItem *representativeItem = [album representativeItem];
NSString *albumName = [representativeItem valueForProperty: MPMediaItemPropertyAlbumTitle];
NSLog (@"%@", albumName);
}
You can also set a grouping type by using one of MPMediaQuery's class methods, such as +albumsQuery, which creates your query instance with a preset grouping property.
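As a brief sketch, the grouping example above could be shortened with this convenience constructor; the result is equivalent to creating a query and setting MPMediaGroupingAlbum by hand:

```objc
// +albumsQuery returns an MPMediaQuery whose grouping type is already MPMediaGroupingAlbum.
MPMediaQuery *query = [MPMediaQuery albumsQuery];
for (MPMediaItemCollection *album in [query collections])
{
    // Each collection represents one album; any item can represent it.
    NSString *albumName = [[album representativeItem] valueForProperty:MPMediaItemPropertyAlbumTitle];
    NSLog(@"%@", albumName);
}
```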
So far you have seen multiple techniques for dealing with audio in iOS, from playing a single sound, to accessing the library, to utilizing the iPod player. Each technique tends to have its pros, cons, and specific uses depending on the goal of your application.
However, when all these functionalities are used in conjunction, the possibilities for your app become nearly limitless. Here, you will combine the use of AV Foundation, MPMediaQuery, MPMediaPickerController, and MPNowPlayingInfoCenter (a new iOS 5.0 feature!) to create your own version, albeit a less pretty one, of the iPod user interface, such that your music can continue to play even while your application is in the background.
Make a new project using the Single View Application template, naming it “Chapter7Recipe3”, with class prefix “Main”.
Next, you will make sure all the frameworks you need are imported. Go ahead and link the following frameworks to your project, exactly as you have in all the previous recipes in this chapter.
AVFoundation.framework: You will use this to play your audio files.
MediaPlayer.framework: This will allow you to access your library and media files.
CoreMedia.framework: You won't use any classes from this framework directly, but you will need some of the CMTime functions to help deal with your audio player.
Add the following import statements to your view controller's header file. You do not need one for the Core Media framework in this project.
#import <MediaPlayer/MediaPlayer.h>
#import <AVFoundation/AVFoundation.h>
Your view controller is also going to end up as the delegate for several objects, so add protocol statements to your header file for the following protocols:
UITextFieldDelegate
AVAudioSessionDelegate
MPMediaPickerControllerDelegate
Next you will set up your app to continue playing music after it has entered the background. The first thing you need to do is declare a property of type AVAudioSession, called session. Make sure to @synthesize it, and then set it to nil in your -viewDidUnload as usual.
@property (nonatomic, strong) AVAudioSession *session;
Next, add the following code to your -viewDidLoad method:
self.session = [AVAudioSession sharedInstance];
self.session.delegate = self;
[self.session setCategory:AVAudioSessionCategoryPlayback error:nil];
[self.session setActive:YES error:nil];
By specifying that your session's category is AVAudioSessionCategoryPlayback, you are telling your device that your application's main focus is playing audio, and that it should therefore be allowed to continue playing audio while the application is in the background.
You also need to make sure your session is deactivated when you are done with it, so add the following line to your -viewDidUnload method.
[self.session setActive:NO error:nil];
Now that you have configured your AVAudioSession, you need to edit your application's .plist file to specify that your application must be allowed to play audio while in the background. You can usually find this file in the Supporting Files group of your project. If not, you can find the file in your project's folder, as in Figure 7–8.
By default, this file should open in Xcode, resembling Figure 7–9.
Under the Editor menu, select Add Item. A new item should appear, looking like Figure 7–10.
For the new item, open the drop-down menu by clicking the pair of arrows, and then select “Required background modes”.
Drop down the values list for this new item by clicking the arrow on the left. You will edit the value for “Item 0”. Drop down the menu on the right side, and select “App plays audio”, as in Figure 7–11. Alternatively, you could type audio into the value field, and once you press the return key, Xcode will change the value correctly.
Your application should now be ready to play audio while in the background state!
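If you prefer to edit the raw property-list source instead (right-click the .plist file and open it as source code), the entry you just created should look roughly like the following fragment. The “Required background modes” row is stored under the UIBackgroundModes key.

```xml
<key>UIBackgroundModes</key>
<array>
	<string>audio</string>
</array>
```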
Next, you will build your user interface in your XIB file. Set up your view so that it looks like Figure 7–12.
For the purposes of this project, the following variable names will be assigned to the view elements shown, so connect each element to your header file using the name given here:
infoLabel: The UILabel at the top of the view, which will display the current song's information
textFieldSong: The UITextField with the placeholder text “Song”
queryButton: The upper UIButton with the title “Query”
artworkImageView: The UIImageView, which will display album artwork for songs
libraryButton: The large “Library” UIButton
playButton: The middle “Play” UIButton
nextButton: The right-side “Next” UIButton
prevButton: The left-side “Prev” UIButton
Define actions for the five buttons that you will use and connect each button to its respective method. Your method declarations should look like so:
-(IBAction)queryPressed:(id)sender;
-(IBAction)nextPressed:(id)sender;
-(IBAction)prevPressed:(id)sender;
-(IBAction)playPressed:(id)sender;
-(IBAction)libraryPressed:(id)sender;
Next, you will declare the additional properties that you will need to run your application. For each of the following properties, make sure you synthesize it and properly set it to nil in -viewDidUnload. First, you will use an instance of the AVQueuePlayer class to play and queue up your sound files, so declare one as a property:
@property (nonatomic, strong) AVQueuePlayer *player;
In order to manage your player, you will need two parallel NSMutableArray copies of its playlist. One will hold all the queued items as their original MPMediaItem instances (through which you can access media information), while the other will hold them converted to AVPlayerItem instances (which you can play through your AVQueuePlayer). You will also have a property of type NSUInteger to help keep track of the currently playing item. Declare these properties like so, making sure to correctly synthesize and nil each one:
@property (nonatomic, strong) NSMutableArray *playlist;
@property (nonatomic, strong) NSMutableArray *myCollection;
@property (nonatomic) NSUInteger currentIndex;
Now, the first thing to do in your implementation file is finish writing your -viewDidLoad method, so that it looks like so:
- (void)viewDidLoad
{
[super viewDidLoad];
self.session = [AVAudioSession sharedInstance];
self.session.delegate = self;
[self.session setCategory:AVAudioSessionCategoryPlayback error:nil];
[self.session setActive:YES error:nil];
self.textFieldSong.delegate = self;
[self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}
The only new code here confirms that the title for your playButton is set correctly and sets your view controller as the delegate for your UITextField. Implement the UITextFieldDelegate method to handle dismissing the keyboard like so:
-(BOOL)textFieldShouldReturn:(UITextField *)textField
{
[textField resignFirstResponder];
return NO;
}
You will also need to override the getter for your playlist property in order to ensure that it is correctly initialized.
-(NSMutableArray *)playlist
{
if (!playlist)
{
playlist = [[NSMutableArray alloc] initWithCapacity:5];
}
return playlist;
}
You will also override the getter for currentIndex, so that it always returns the index of the currently playing item.
-(NSUInteger)currentIndex
{
currentIndex = [self.playlist indexOfObject:self.player.currentItem];
return currentIndex;
}
Next you will implement a method that will update your user interface every time the song changes.
-(void)updateNowPlaying
{
if (self.player.currentItem != nil)
{
MPMediaItem *nowPlaying = [self.myCollection objectAtIndex:self.currentIndex];
self.infoLabel.text = [NSString stringWithFormat:@"%@ - %@", [nowPlaying valueForProperty:MPMediaItemPropertyTitle], [nowPlaying valueForProperty:MPMediaItemPropertyArtist]];
UIImage *artwork = [[nowPlaying valueForProperty:MPMediaItemPropertyArtwork] imageWithSize:self.artworkImageView.frame.size];
if (artwork)
{
self.artworkImageView.image = artwork;
}
else
{
self.artworkImageView.image = nil;
}
if ([MPNowPlayingInfoCenter class])
{
NSString *title = [nowPlaying valueForProperty:MPMediaItemPropertyTitle];
NSString *artist = [nowPlaying valueForProperty:MPMediaItemPropertyArtist];
NSDictionary *currentlyPlayingTrackInfo = [NSDictionary
dictionaryWithObjects:[NSArray arrayWithObjects:title, artist, nil] forKeys:[NSArray
arrayWithObjects:MPMediaItemPropertyTitle, MPMediaItemPropertyArtist, nil]];
[MPNowPlayingInfoCenter defaultCenter].nowPlayingInfo =
currentlyPlayingTrackInfo;
}
}
else
{
self.infoLabel.text = @"...";
[self.playButton setTitle:@"Play" forState:UIControlStateNormal];
self.artworkImageView.image = nil;
}
}
Aside from updating your user interface to display the current song's information, the method just shown also takes advantage of a new feature in iOS 5 called the MPNowPlayingInfoCenter! This class allows the developer to place information on the device's lock screen, or on other devices when the application is displaying info through AirPlay. You pass information to it by setting the nowPlayingInfo property of the +defaultCenter instance to a dictionary of the values and properties that you created.
To make use of the foregoing MPNowPlayingInfoCenter implementation, you must make a few specific additions to your code; none of this functionality will work if your application cannot handle remote-control events. Start by implementing the -remoteControlReceivedWithEvent: method, which handles remote events.
- (void) remoteControlReceivedWithEvent: (UIEvent *) receivedEvent {
if (receivedEvent.type == UIEventTypeRemoteControl) {
switch (receivedEvent.subtype) {
case UIEventSubtypeRemoteControlTogglePlayPause:
[self playPressed:nil];
break;
case UIEventSubtypeRemoteControlPreviousTrack:
[self prevPressed:nil];
break;
case UIEventSubtypeRemoteControlNextTrack:
[self nextPressed:nil];
break;
default:
break;
}
}
}
This method will be called when the user taps the Play/Pause, Previous, or Next button on the lock screen or in the multitasking bar. All you have to do is have each case call your methods to play, advance, or rewind your player, which you will implement later.
Next, in order to receive these remote-control events, you must modify your -viewDidAppear: and -viewWillDisappear: methods.
- (void)viewDidAppear:(BOOL)animated
{
[super viewDidAppear:animated];
[[UIApplication sharedApplication] beginReceivingRemoteControlEvents];
[self becomeFirstResponder];
}
- (void)viewWillDisappear:(BOOL)animated
{
[[UIApplication sharedApplication] endReceivingRemoteControlEvents];
[self resignFirstResponder];
[super viewWillDisappear:animated];
}
Finally, in order for these previous two methods to work correctly and allow your view controller to become the first responder, you must override the -canBecomeFirstResponder method like so:
-(BOOL)canBecomeFirstResponder
{
return YES;
}
If these extra steps are not followed, your application will not be able to make use of any “Now Playing” information in the lock screen or multitasking bar.
To help keep your two NSMutableArrays in sync, you will define a method that takes an NSArray of MPMediaItems and returns an NSArray of the same items, converted to AVPlayerItem instances.
-(NSArray *)AVPlayerItemsFromArray:(NSArray *)items
{
NSMutableArray *array = [[NSMutableArray alloc] initWithCapacity:[items count]];
NSURL *url;
for (MPMediaItem *current in items)
{
url = [current valueForProperty:MPMediaItemPropertyAssetURL];
AVPlayerItem *playerItem = [AVPlayerItem playerItemWithURL:url];
[[NSNotificationCenter defaultCenter]
addObserver:self
selector:@selector(playerItemDidReachEnd:)
name:AVPlayerItemDidPlayToEndTimeNotification
object:playerItem];
if (playerItem != nil)
[array addObject:playerItem];
}
return array;
}
In order to later be able to tell when a song has finished playing and act accordingly, you need to add your view controller as an observer, as shown earlier. It is fairly convenient to include the addObserver: call in this method, since you can so easily access all the AVPlayerItems that you will be using.
While you are dealing with your NSNotificationCenter, make sure that all the observers you just added will be properly removed. Add the following lines to your -viewDidUnload.
for (AVPlayerItem *playerItem in self.playlist)
{
[[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemDidPlayToEndTimeNotification object:playerItem];
}
The implementation of the -playerItemDidReachEnd: method, which you've specified to run any time a song ends, is quite simple, as shown here.
- (void)playerItemDidReachEnd:(NSNotification *)notification
{
[self performSelector:@selector(updateNowPlaying) withObject:nil afterDelay:0.5];
}
I have included the half-second delay here to ensure that the -updateNowPlaying call does not execute until the AVQueuePlayer has completely moved on to playing the next item.
Next you'll implement your two methods that will be used whenever you need to update or add to your playlist.
-(void)updatePlaylistWithArray:(NSArray *)collection
{
if (([self.playlist count] == 0) || (self.player.currentItem == nil))
{
[self.myCollection removeAllObjects];
self.playlist = [NSMutableArray arrayWithArray:collection];
self.player = [[AVQueuePlayer alloc] initWithItems:self.playlist];
[self.player play];
}
else
{
AVPlayerItem *currentItem = [self.playlist lastObject];
for (AVPlayerItem *item in collection)
{
if ([self.player canInsertItem:item afterItem:currentItem])
{
[self.player insertItem:item afterItem:currentItem];
currentItem = item;
}
}
[self.playlist addObjectsFromArray:collection];
}
}
-(void)updateMyCollectionWithArray:(NSArray *)mediaItems
{
if ([self.myCollection count] == 0)
{
self.myCollection = [NSMutableArray arrayWithArray:mediaItems];
}
else
{
[self.myCollection addObjectsFromArray:mediaItems];
}
}
Your -playPressed: and -nextPressed: methods are quite simple to implement, since AVQueuePlayer inherits from the AVPlayer class and provides a convenient method for advancing to the next item in the queue.
-(void)playPressed:(id)sender
{
if (self.playlist.count > 0)
{
if ([[self.playButton titleForState:UIControlStateNormal] isEqualToString:@"Play"])
{
[self.player play];
[self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
}
else
{
[self.player pause];
//Scrub back half a second to give the user a little lead-in when they resume playing
[self.player seekToTime:CMTimeSubtract(self.player.currentTime, CMTimeMakeWithSeconds(0.5, 600)) completionHandler:^(BOOL finished)
{
[self.playButton setTitle:@"Play" forState:UIControlStateNormal];
}];
}
[self updateNowPlaying];
}
}
-(void)nextPressed:(id)sender
{
[self.player advanceToNextItem];
if (self.player.currentItem == nil)
{
[self.playlist removeAllObjects];
[self.myCollection removeAllObjects];
}
[self updateNowPlaying];
}
Unfortunately, the AVQueuePlayer does not include any easy way to move backward in a queue, so your implementation of the -prevPressed: method is slightly more complex, involving a bit of swapping around of your AVPlayerItems.
-(void)prevPressed:(id)sender
{
if (CMTimeCompare(self.player.currentTime, CMTimeMake(5, 1)) > 0)
{
[self.player seekToTime:kCMTimeZero];
}
else
{
[self.player pause];
AVPlayerItem *current = self.player.currentItem;
if (current != [self.playlist objectAtIndex:0])
{
AVPlayerItem *previous = [self.playlist objectAtIndex:[self.playlist indexOfObject:current]-1];
if ([self.player canInsertItem:previous afterItem:current])
{
[current seekToTime:kCMTimeZero];
[previous seekToTime:kCMTimeZero];
[self.player insertItem:previous afterItem:current];
[self.player advanceToNextItem];
[self.player removeItem:current];
[self.player insertItem:current afterItem:previous];
}
else
{
NSLog(@"Error: Could not insert");
}
}
else
{
[self.player seekToTime:kCMTimeZero];
}
[self.player play];
}
[self updateNowPlaying];
}
The CMTimeMake() function that you just used is a very flexible function that takes two inputs. The first is the number of time units you want, and the second is the timescale, that is, the number of units per second: a timescale of 1 means each unit is one second, 2 means each unit is half a second, and so on. A call of CMTimeMake(100, 10) makes 100 units of 1/10 second each, resulting in 10 seconds.
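A few illustrative values follow; this is just a sketch, and the variable names are arbitrary:

```objc
// CMTimeMake(value, timescale): value units of (1/timescale) seconds each.
CMTime tenSeconds  = CMTimeMake(100, 10);             // 100 × (1/10 s) = 10 s
CMTime fiveSeconds = CMTimeMake(5, 1);                // 5 × (1 s) = 5 s
// CMTimeMakeWithSeconds converts a floating-point duration; 600 is a common
// media timescale that can represent fractional seconds exactly.
CMTime halfSecond  = CMTimeMakeWithSeconds(0.5, 600); // 300 units of (1/600 s)
Float64 seconds = CMTimeGetSeconds(tenSeconds);       // 10.0
```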
The function of your Query button will be to use an MPMediaQuery to retrieve any songs in the library whose title contains the word or phrase in your UITextField. This implementation is fairly straightforward, once you remember to update both of your NSMutableArrays afterward.
-(void)queryPressed:(id)sender
{
MPMediaQuery *query = [[MPMediaQuery alloc] init];
NSString *title = self.textFieldSong.text;
MPMediaPropertyPredicate *songPredicate = [MPMediaPropertyPredicate predicateWithValue:title forProperty:MPMediaItemPropertyTitle comparisonType:MPMediaPredicateComparisonContains];
[query addFilterPredicate:songPredicate];
[self updatePlaylistWithArray:[self AVPlayerItemsFromArray:[query items]]];
[self updateMyCollectionWithArray:[query items]];
[self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
[self updateNowPlaying];
if ([self.textFieldSong isFirstResponder])
[self.textFieldSong resignFirstResponder];
}
Finally, you can set up your -libraryPressed: method, which brings up an MPMediaPickerController, allowing the user to select multiple songs for queuing.
-(void)libraryPressed:(id)sender
{
MPMediaPickerController *picker = [[MPMediaPickerController alloc] initWithMediaTypes:MPMediaTypeMusic];
picker.delegate = self;
picker.allowsPickingMultipleItems = YES;
picker.prompt = @"Choose Some Music!";
[self presentModalViewController:picker animated:YES];
}
You will need to implement two delegate methods to correctly handle the MPMediaPickerController. First, here is the delegate method for a cancellation:
-(void)mediaPickerDidCancel:(MPMediaPickerController *)mediaPicker
{
[self dismissModalViewControllerAnimated:YES];
}
Second, here is the delegate method for a successful media choice, which looks very similar to the -queryPressed: method.
-(void)mediaPicker:(MPMediaPickerController *)mediaPicker didPickMediaItems:(MPMediaItemCollection *)mediaItemCollection
{
[self updatePlaylistWithArray:[self AVPlayerItemsFromArray:[mediaItemCollection items]]];
[self updateMyCollectionWithArray:[mediaItemCollection items]];
[self.playButton setTitle:@"Pause" forState:UIControlStateNormal];
[self updateNowPlaying];
[self dismissModalViewControllerAnimated:YES];
}
While testing this application, you may notice a slight issue: there is no way to clear the queue and reset the app short of playing or skipping through to the end of the queue. This is not a problem if your queue is only a few songs long, but for longer queues you will probably want to incorporate a method that clears the queue by resetting your NSMutableArrays and removing all the items from your player's queue. For testing purposes, however, simply closing the application (double-tap the device's home button, long-press the application's icon, and then close it) will do.
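Such a reset might look like the following sketch. Note that -clearQueue is a hypothetical helper for this view controller, not a method wired up anywhere in the recipe:

```objc
// Hypothetical helper: empties the AVQueuePlayer and both backing arrays.
-(void)clearQueue
{
    [self.player pause];
    [self.player removeAllItems];         // AVQueuePlayer's bulk-removal method
    [self.playlist removeAllObjects];
    [self.myCollection removeAllObjects];
    [self updateNowPlaying];              // resets the label, artwork, and Play button
}
```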
Your music queuing player should now be fully functional! When you test the app, it should continue to play music even after the application has entered the background of your device! Figure 7–13 demonstrates your application playing a song, along with its Now Playing functionality and ability to receive remote control events.
The complete multimedia experience goes beyond the simple matter of whether the user can listen to music. Sound, as a product, is much more about the tiny details that make it just a little bit better, whether it's a quick fade-out when pausing or the ability to find a song more quickly. From recording music to filtering media items to creating volume ramps, every extra detail that you, as a developer, take care to include in your applications will result in a significantly more powerful tool that your audience can enjoy. Apple has provided an incredibly powerful set of multimedia functionalities in iOS, and, as a quick search of the iTunes App Store will show, the development community has gone above and beyond the call of duty in utilizing them to provide our audience, the users, with a multimedia environment beyond imagination.