Chapter 8

Camera Recipes

A great number of mobile applications interact with the device's camera, including apps that take pictures, record videos, and provide overlays (e.g., augmented-reality applications). iOS gives developers a great deal of control over how they interact with a device's hardware. In this chapter, you will go over multiple ways to access and use these capabilities, from simple, predefined interfaces to highly flexible, custom implementations.

Note  The iOS simulator does not support camera hardware. To test most recipes in this chapter, you must run them on a physical device.

Recipe 8-1: Taking Pictures

iOS has an incredibly handy and simple interface to your device’s camera. With this interface you can allow users to take pictures and record video from inside an app. Here, you learn the basics of starting the camera interface to capture a still image.

You create a simple project that lets you pull up the camera, take a picture, and then display the most recently taken image on the screen. Start by creating a new single-view application project. You don't need to import any extra frameworks to access the camera through the predefined interface you use here.

Setting Up the User Interface

Start by setting up a simple user interface containing an image view and a button. Switch over to your view controller's .xib file. Then drag an image view from the object library and make it fill the entire view. Next, drag a UIButton into your view. The button will access the camera, so set its title to "Take Picture." Your view should now resemble the one in Figure 8-1.


Figure 8-1.  A simple user interface for taking pictures

Create outlets for the image view and the button, and name them imageView and cameraButton, respectively. Also, create an action named takePicture for the button.

Your ViewController.h file should now resemble the following code:

//
//  ViewController.h
//  Recipe 8.1: Taking Pictures
//

#import <UIKit/UIKit.h>

@interface ViewController : UIViewController

@property (weak, nonatomic) IBOutlet UIImageView *imageView;
@property (weak, nonatomic) IBOutlet UIButton *cameraButton;

- (IBAction)takePicture:(id)sender;


@end

Accessing the Camera

Use an instance of the UIImagePickerController class to access the camera. Whenever you deal with camera hardware on iOS, it is essential to check for hardware availability before using it. This is done through the isSourceTypeAvailable: class method of UIImagePickerController, which takes one of the following predefined constants as an argument:

  • UIImagePickerControllerSourceTypeCamera
  • UIImagePickerControllerSourceTypePhotoLibrary
  • UIImagePickerControllerSourceTypeSavedPhotosAlbum

For this recipe you'll use the first choice, UIImagePickerControllerSourceTypeCamera. UIImagePickerControllerSourceTypePhotoLibrary is used to access all the stored photos on the device, while UIImagePickerControllerSourceTypeSavedPhotosAlbum is used to access only the Camera Roll album.

Now, switch to the ViewController.m file and locate the stubbed out takePicture: action method. There, you’ll begin by checking whether the Camera source type is available, and if not, display a UIAlertView saying so.

- (IBAction)takePicture:(id)sender
{
        // Make sure camera is available
    if ([UIImagePickerController
         isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] == NO)
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                        message:@"Camera Unavailable"
                                                       delegate:self
                                              cancelButtonTitle:@"Cancel"
                                              otherButtonTitles:nil];
        [alert show];
        return;
    }

}

The iOS Simulator does not have camera functionality. Therefore you’ll only see the error message, demonstrated in Figure 8-2, when you run your app there. To fully test this application you need to run it on a physical device.


Figure 8-2.  The simulator does not have camera support, so you need to test your app on a real device

Now, before you expand the takePicture: method to handle the case in which the camera is in fact available, you need to make a couple of changes in ViewController.h. The first is to add a property to hold the image picker instance through which you'll access the camera. The second is to prepare the view controller for receiving events from the image picker; such a delegate needs to conform to both the UIImagePickerControllerDelegate and UINavigationControllerDelegate protocols. Following is the header file with those changes:

//
//  ViewController.h
//  Recipe 8.1: Taking Pictures
//

#import <UIKit/UIKit.h>

@interface ViewController : UIViewController<UIImagePickerControllerDelegate,
                                             UINavigationControllerDelegate>

@property (weak, nonatomic) IBOutlet UIImageView *imageView;
@property (weak, nonatomic) IBOutlet UIButton *cameraButton;
@property (strong, nonatomic) UIImagePickerController *imagePicker;

- (IBAction)takePicture:(id)sender;

@end

Now you can add the following code to your takePicture: action method in ViewController.m. It creates and initializes the image picker instance—if it hasn’t already done so—and presents it to handle the camera device.

- (IBAction)takePicture:(id)sender
{
        // Make sure camera is available
    if ([UIImagePickerController
         isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] == NO)
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                        message:@"Camera Unavailable"
                                                       delegate:self
                                              cancelButtonTitle:@"Cancel"
                                              otherButtonTitles:nil];
        [alert show];
        return;
    }
    if (self.imagePicker == nil)
    {
        self.imagePicker = [[UIImagePickerController alloc] init];
        self.imagePicker.delegate = self;
        self.imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;

    }
    [self presentViewController:self.imagePicker animated:YES completion:NULL];

}

If you run your app now (on a real device) and tap the button, you should be presented with a simple camera interface that allows you to take a picture and use it in your app (or retake it if you're not satisfied). Figure 8-3 shows the user interface of UIImagePickerController.


Figure 8-3.  The user interface of a UIImagePickerController

Retrieving a Picture

Now that your view controller can present a UIImagePickerController, you need to handle what happens when the user has taken a picture and selected it for use. You do this with the delegate method imagePickerController:didFinishPickingMediaWithInfo:. There, you retrieve the picture, update the image view, and finally dismiss the image picker, as follows:

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerOriginalImage];
    self.imageView.image = image;
    self.imageView.contentMode = UIViewContentModeScaleAspectFill;
    [self dismissViewControllerAnimated:YES completion:NULL];
}

Note  By setting the content mode property of the image view to UIViewContentModeScaleAspectFill, you ensure that the picture fills the entire view while maintaining its aspect ratio. This usually results in the picture being cropped rather than stretched. Alternatively, you could use UIViewContentModeScaleAspectFit, which displays the whole picture with its aspect ratio retained but does not necessarily fill the entire view.

You should also implement another UIImagePickerController delegate method to handle the user canceling the image selection. The only thing you need to do there is dismiss the image picker view.

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker
{
    [self dismissViewControllerAnimated:YES completion:NULL];
}

Your app can now access the camera, take a picture, and set it as the background of the app, as Figure 8-4 illustrates.


Figure 8-4.  Your app with a photo set as the background

Note  The UIImagePickerController class does not support landscape orientation for taking pictures. Although you can take pictures that way, the view does not adjust according to the landscape orientation, which results in a rather weird user experience.

Implementing Basic Editing

As an optional setting, you can make your camera interface editable, enabling the user to crop and frame the picture he or she has taken. To do this, you simply set the UIImagePickerController's allowsEditing property to YES.

- (IBAction)takePicture:(id)sender
{
    // ...

    if (self.imagePicker == nil)
    {
        self.imagePicker = [[UIImagePickerController alloc] init];
        self.imagePicker.delegate = self;
        self.imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        self.imagePicker.allowsEditing = YES;

    }
    [self presentViewController:self.imagePicker animated:YES completion:NULL];
}

Then, to acquire the edited image, you also need to make the following change in the imagePickerController:didFinishPickingMediaWithInfo: method:

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = [info objectForKey:UIImagePickerControllerEditedImage];
    self.imageView.image = image;
    self.imageView.contentMode = UIViewContentModeScaleAspectFill;
    [self dismissViewControllerAnimated:YES completion:NULL];
}

Saving Pictures to the Photos Album

You might want to save the pictures you take to the device's saved photos album. This is easily done with the UIImageWriteToSavedPhotosAlbum() function. Add the following line to your imagePickerController:didFinishPickingMediaWithInfo: method:

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    UIImage *image = (UIImage *)[info objectForKey:UIImagePickerControllerEditedImage];
    UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
    self.imageView.image = image;
    self.imageView.contentMode = UIViewContentModeScaleAspectFill;
    [self dismissViewControllerAnimated:YES completion:NULL];
}
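The three nil arguments after the image let you pass a completion target, a completion selector, and a context pointer. If you want to know whether the save actually succeeded, you could supply a callback; the following is a sketch (the selector's signature is dictated by UIKit, but the error logging is our own choice):

```objc
// Call the save function with a completion target and selector:
// UIImageWriteToSavedPhotosAlbum(image, self,
//     @selector(image:didFinishSavingWithError:contextInfo:), NULL);

// UIKit requires exactly this selector signature for the callback:
- (void)image:(UIImage *)image
    didFinishSavingWithError:(NSError *)error
    contextInfo:(void *)contextInfo
{
    if (error != nil)
    {
        NSLog(@"Could not save picture: %@", error);
    }
}
```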

However, iOS 6 applies privacy restrictions to the saved photos album; an app that wants to access it now needs explicit authorization from the user. Therefore, you should also provide an explanation of why your app requests access to the saved photos library. You do this in the application's Info.plist file (found in the Supporting Files folder in the Project Navigator) with the key NSPhotoLibraryUsageDescription (displayed in the property list editor as "Privacy—Photo Library Usage Description").

You can enter any text you want for the usage description; we chose "Testing the camera." The important thing to know is that this text is displayed when the user is prompted to authorize the app's access to the photos album, as in Figure 8-5.


Figure 8-5.  Saving the picture to the photos library requires authorization from the user
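Viewed as source (right-click Info.plist in Xcode and choose Open As > Source Code), the entry is a simple key/string pair; the description string here is just our example text:

```xml
<key>NSPhotoLibraryUsageDescription</key>
<string>Testing the camera</string>
```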

Recipe 8-2: Recording Video

The UIImagePickerController is actually far more flexible than you've seen so far; until now you've used it exclusively for still images. Here you'll go through how to set it up to handle both still images and video.

For this recipe, you build off the code from Recipe 8-1, as it already includes the setup you need. You add to its functionality by implementing the option to record and save videos.

Start by setting the image picker’s allowed media types to all available types for the camera. This can be done using the availableMediaTypesForSourceType: class method of the UIImagePickerController, as follows:

- (IBAction)takePicture:(id)sender
{
        // Make sure camera is available
    if ([UIImagePickerController
         isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] == NO)
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                        message:@"Camera Unavailable"
                                                       delegate:self
                                              cancelButtonTitle:@"Cancel"
                                              otherButtonTitles:nil];
        [alert show];
        return;
    }
    if (self.imagePicker == nil)
    {
        self.imagePicker = [[UIImagePickerController alloc] init];
        self.imagePicker.delegate = self;
        self.imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        self.imagePicker.mediaTypes = [UIImagePickerController
            availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
        self.imagePicker.allowsEditing = YES;
    }
    [self presentViewController:self.imagePicker animated:YES completion:NULL];
}

Next, you need to handle the case in which the user records and chooses to use a video. To do this, first link the Mobile Core Services framework to your project; then import its API in your view controller's header file:

//
//  ViewController.h
//  Recipe 8.2: Recording Videos
//

#import <UIKit/UIKit.h>
#import <MobileCoreServices/MobileCoreServices.h>

@interface ViewController : UIViewController<UIImagePickerControllerDelegate, UINavigationControllerDelegate>

@property (weak, nonatomic) IBOutlet UIImageView *imageView;
@property (weak, nonatomic) IBOutlet UIButton *cameraButton;
@property (strong, nonatomic) UIImagePickerController *imagePicker;

- (IBAction)takePicture:(id)sender;

@end

Now add the following code to your UIImagePickerController’s delegate method:

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey: UIImagePickerControllerMediaType];
    if (CFStringCompare((__bridge CFStringRef) mediaType, kUTTypeMovie, 0) ==
        kCFCompareEqualTo)
    {
        // Movie Captured
        NSString *moviePath =
            [[info objectForKey: UIImagePickerControllerMediaURL] path];
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum (moviePath))
        {
            UISaveVideoAtPathToSavedPhotosAlbum (moviePath, nil, nil, nil);
        }
    }
    else
    {
        // Picture Taken

        UIImage *image =
            (UIImage *)[info objectForKey:UIImagePickerControllerEditedImage];
        UIImageWriteToSavedPhotosAlbum(image, nil, nil, nil);
        self.imageView.image = image;
        self.imageView.contentMode = UIViewContentModeScaleAspectFill;
    }
    [self dismissViewControllerAnimated:YES completion:NULL];
}

Essentially, what you are doing here is checking the media type of the captured file. The main issue is comparing mediaType, an NSString, with kUTTypeMovie, which is of type CFStringRef. You accomplish this by casting the NSString to a CFStringRef. With Automatic Reference Counting (ARC), introduced in iOS 5, this process became slightly more complicated, because ARC manages Objective-C object types such as NSString but not Core Foundation types like CFStringRef. By placing __bridge before the CFStringRef cast, as shown earlier, you create a bridged cast that tells ARC not to transfer ownership of the object.
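An equivalent check goes in the other direction: bridge the constant to an NSString and use isEqualToString:. A minimal sketch of that alternative:

```objc
// Bridge the CFStringRef constant to NSString instead of the reverse
if ([mediaType isEqualToString:(__bridge NSString *)kUTTypeMovie])
{
    // Movie captured
}
```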

If all has gone well, your app should now be able to record video by selecting the video mode in the image picker view, as shown in Figure 8-6. The video is then saved (if allowed by the user) to the device's photos library.


Figure 8-6.  The image picker view with a switch control between photo and video modes

Recipe 8-3: Editing Videos

Although your UIImagePickerController offers a convenient way to record and save video files, it does nothing to allow you to edit them. Fortunately, iOS has another built-in controller called UIVideoEditorController, which you can use to edit your recorded videos.

You can build this fairly simple recipe off your second project, in which you added video functionality to your UIImagePickerController.

Start by adding a second button with the title “Edit Video” to your view controller’s interface file. Arrange the two buttons as in Figure 8-7.


Figure 8-7.  New user interface with button for editing the video

Next, create an action named editVideo for when the user taps the Edit Video button.

You’ll also need a property to store the path to the video that the user records. Define it in the view controller’s header file:

//
//  ViewController.h
//  Recipe 8.3: Editing Videos
//

#import <UIKit/UIKit.h>
#import <MobileCoreServices/MobileCoreServices.h>

@interface ViewController : UIViewController<UIImagePickerControllerDelegate, UINavigationControllerDelegate>

@property (weak, nonatomic) IBOutlet UIImageView *imageView;
@property (weak, nonatomic) IBOutlet UIButton *cameraButton;
@property (strong, nonatomic) UIImagePickerController *imagePicker;
@property (strong, nonatomic) NSString *pathToRecordedVideo;

- (IBAction)takePicture:(id)sender;
- (IBAction)editVideo:(id)sender;

@end

Now, in the imagePickerController:didFinishPickingMediaWithInfo: method, make sure the pathToRecordedVideo property gets updated with the path to the newly recorded video.

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey: UIImagePickerControllerMediaType];
    if (CFStringCompare((__bridge CFStringRef) mediaType, kUTTypeMovie, 0) ==
        kCFCompareEqualTo)
    {
        NSString *moviePath =
            [[info objectForKey: UIImagePickerControllerMediaURL] path];
        self.pathToRecordedVideo = moviePath;
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum (moviePath))
        {
            UISaveVideoAtPathToSavedPhotosAlbum (moviePath, nil, nil, nil);
        }
    }
    else
    {
        //...
    }
}

With the pathToRecordedVideo property in place, you can turn the focus to your editVideo action. It opens the last recorded video for editing in a video editor controller, or displays an error if no video was recorded.

- (IBAction)editVideo:(id)sender
{
    if (self.pathToRecordedVideo)
    {
        UIVideoEditorController *editor = [[UIVideoEditorController alloc] init];
        editor.videoPath = self.pathToRecordedVideo;
        editor.delegate = self;
        [self presentViewController:editor animated:YES completion:NULL];
    }
    else
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
            message:@"No Video Recorded Yet"
            delegate:self
            cancelButtonTitle:@"Cancel"
            otherButtonTitles:nil];
        [alert show];
    }

}
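Not every video file can be trimmed on every device. UIVideoEditorController offers the class method canEditVideoAtPath: for exactly this, so as an optional refinement (not required for this recipe) you could tighten the guard:

```objc
// Only present the editor if the recorded video is actually editable
if (self.pathToRecordedVideo &&
    [UIVideoEditorController canEditVideoAtPath:self.pathToRecordedVideo])
{
    UIVideoEditorController *editor = [[UIVideoEditorController alloc] init];
    editor.videoPath = self.pathToRecordedVideo;
    editor.delegate = self;
    [self presentViewController:editor animated:YES completion:NULL];
}
```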

Because the video editor’s receiving delegate is your view controller, you need to make sure it conforms to the UIVideoEditorControllerDelegate protocol.

// ...

@interface ViewController : UIViewController<UIImagePickerControllerDelegate,
                                             UINavigationControllerDelegate,
                                             UIVideoEditorControllerDelegate>

// ...

@end

Finally, you need to implement a few delegate methods for your UIVideoEditorController. First, a delegate method to handle a successful editing/trimming of the video follows:

-(void)videoEditorController:(UIVideoEditorController *)editor didSaveEditedVideoToPath:(NSString *)editedVideoPath
{
    self.pathToRecordedVideo = editedVideoPath;
    if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum (editedVideoPath))
    {
        UISaveVideoAtPathToSavedPhotosAlbum (editedVideoPath, nil, nil, nil);
    }
    [self dismissViewControllerAnimated:YES completion:NULL];
}

As you can see, your application sets the newly edited video as your next video to be edited, so that you can create increasingly trimmed clips. It also saves each edited version to your photos album if possible.

You need one more delegate method to handle the cancellation of your UIVideoEditorController:

-(void)videoEditorControllerDidCancel:(UIVideoEditorController *)editor
{
    [self dismissViewControllerAnimated:YES completion:NULL];
}

Upon testing on a physical device, your application should now successfully allow you to edit your videos. Figure 8-8 shows a view of your application giving you the option to edit a recorded video.


Figure 8-8.  Editing (trimming) a video using UIVideoEditorController

Recipe 8-4: Using Custom Camera Overlays

A variety of applications implement the camera interface with a custom overlay (for example, to display constellations over the sky, or simply to provide their own camera controls). In this recipe you'll continue building on the project from the previous recipes and implement a basic custom camera screen overlay. Specifically, you'll replace the default button controls with your own versions of them. Although simple, the example should give you an idea of how to create your own, more useful overlay functionality.

You build your custom overlay view directly in code, in a method that you'll name customViewForImagePicker:. It creates an overlay view and populates it with three buttons: one for taking the picture, one for turning the flash on and off, and one to toggle between the front and rear cameras. Note that the cornerRadius property used below belongs to the button's layer (a CALayer), so you need to add #import <QuartzCore/QuartzCore.h> at the top of the file. Here's the code, which you add to ViewController.m in the code from Recipe 8-3:

- (UIView *)customViewForImagePicker:(UIImagePickerController *)imagePicker
{
    UIView *view = [[UIView alloc] initWithFrame:CGRectMake(0, 0, 320, 480)];
    view.backgroundColor = [UIColor clearColor];

    UIButton *flashButton =
        [[UIButton alloc] initWithFrame:CGRectMake(10, 10, 120, 44)];
    flashButton.backgroundColor = [UIColor colorWithRed:.5 green:.5 blue:.5 alpha:.5];
    [flashButton setTitle:@"Flash Auto" forState:UIControlStateNormal];
    [flashButton setTitleColor:[UIColor whiteColor] forState:UIControlStateNormal];
    flashButton.layer.cornerRadius = 10.0;

    UIButton *changeCameraButton =
        [[UIButton alloc] initWithFrame:CGRectMake(190, 10, 120, 44)];
    changeCameraButton.backgroundColor =
        [UIColor colorWithRed:.5 green:.5 blue:.5 alpha:.5];
    [changeCameraButton setTitle:@"Rear Camera" forState:UIControlStateNormal];
    [changeCameraButton setTitleColor:[UIColor whiteColor]
        forState:UIControlStateNormal];
    changeCameraButton.layer.cornerRadius = 10.0;

    UIButton *takePictureButton =
        [[UIButton alloc] initWithFrame:CGRectMake(100, 432, 120, 44)];
    takePictureButton.backgroundColor =
        [UIColor colorWithRed:.5 green:.5 blue:.5 alpha:.5];
    [takePictureButton setTitle:@"Click!" forState:UIControlStateNormal];
    [takePictureButton setTitleColor:[UIColor whiteColor]
        forState:UIControlStateNormal];
    takePictureButton.layer.cornerRadius = 10.0;

    [flashButton addTarget:self action:@selector(toggleFlash:)
        forControlEvents:UIControlEventTouchUpInside];
    [changeCameraButton addTarget:self action:@selector(toggleCamera:)
        forControlEvents:UIControlEventTouchUpInside];
    [takePictureButton addTarget:imagePicker action:@selector(takePicture)
        forControlEvents:UIControlEventTouchUpInside];

    [view addSubview:flashButton];
    [view addSubview:changeCameraButton];
    [view addSubview:takePictureButton];
    return view;
}

Here, you have defined your overlay view and the three buttons it contains. Each button gets a title reflecting its starting value or purpose, a target action, and a cornerRadius so that it has rounded corners, and is then added to the view. One of the most important details is that the buttons are semitransparent: because they are placed over the camera's display, you don't want them to cover up any of the picture, so they have to be at least partially see-through.

As you may have noticed, the action for the takePictureButton is connected directly to the takePicture method on the image picker. The other two buttons, on the other hand, are connected to methods (toggleFlash: and toggleCamera:, respectively) on your view controller. At this point, those two methods don't exist, so you need to implement them next.

-(void)toggleFlash:(UIButton *)sender
{
    if (self.imagePicker.cameraFlashMode == UIImagePickerControllerCameraFlashModeOff)
    {
        self.imagePicker.cameraFlashMode = UIImagePickerControllerCameraFlashModeOn;
        [sender setTitle:@"Flash On" forState:UIControlStateNormal];
    }
    else
    {
        self.imagePicker.cameraFlashMode = UIImagePickerControllerCameraFlashModeOff;
        [sender setTitle:@"Flash Off" forState:UIControlStateNormal];
    }
}

-(void)toggleCamera:(UIButton *)sender
{
    if (self.imagePicker.cameraDevice == UIImagePickerControllerCameraDeviceRear)
    {
        self.imagePicker.cameraDevice = UIImagePickerControllerCameraDeviceFront;
        [sender setTitle:@"Front Camera" forState:UIControlStateNormal];
    }
    else
    {
        self.imagePicker.cameraDevice = UIImagePickerControllerCameraDeviceRear;
        [sender setTitle:@"Rear Camera" forState:UIControlStateNormal];
    }
}

Next, you hide the default camera buttons and provide the image picker with your custom overlay view. Add the following two lines of code to your takePicture: method. You can also comment out the setting of the allowsEditing property because the new way of taking pictures doesn’t support that.

- (IBAction)takePicture:(id)sender
{
        // Make sure camera is available
    if ([UIImagePickerController
         isSourceTypeAvailable:UIImagePickerControllerSourceTypeCamera] == NO)
    {
        UIAlertView *alert = [[UIAlertView alloc] initWithTitle:@"Error"
                                                        message:@"Camera Unavailable"
                                                       delegate:self
                                              cancelButtonTitle:@"Cancel"
                                              otherButtonTitles:nil];
        [alert show];
        return;
    }
    if (self.imagePicker == nil)
    {
        self.imagePicker = [[UIImagePickerController alloc] init];
        self.imagePicker.delegate = self;
        self.imagePicker.sourceType = UIImagePickerControllerSourceTypeCamera;
        self.imagePicker.mediaTypes = [UIImagePickerController
            availableMediaTypesForSourceType:UIImagePickerControllerSourceTypeCamera];
        // self.imagePicker.allowsEditing = YES;
        self.imagePicker.showsCameraControls = NO;
        self.imagePicker.cameraOverlayView =
            [self customViewForImagePicker:self.imagePicker];

    }
    [self presentViewController:self.imagePicker animated:YES completion:NULL];
}

Finally, you need to make a small change to the imagePickerController:didFinishPickingMediaWithInfo: method. As mentioned earlier, the takePicture method of the image picker doesn’t support editing. This means that you have to retrieve your picture from the info dictionary using the UIImagePickerControllerOriginalImage key instead of UIImagePickerControllerEditedImage, as shown in the code that follows:

-(void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info
{
    NSString *mediaType = [info objectForKey: UIImagePickerControllerMediaType];
    if (CFStringCompare((__bridge CFStringRef) mediaType, kUTTypeMovie, 0) ==
        kCFCompareEqualTo)
    {
        NSString *moviePath =
            [[info objectForKey: UIImagePickerControllerMediaURL] path];
        self.pathToRecordedVideo = moviePath;
        if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum (moviePath))
        {
            UISaveVideoAtPathToSavedPhotosAlbum (moviePath, nil, nil, nil);
        }
    }
    else
    {
        UIImage *image =
            (UIImage *)[info objectForKey:UIImagePickerControllerOriginalImage];
        UIImageWriteToSavedPhotosAlbum(image, nil, nil , nil);
        self.imageView.image = image;
        self.imageView.contentMode = UIViewContentModeScaleAspectFill;
    }
    [self dismissViewControllerAnimated:YES completion:NULL];
}

If you run your app now, your camera should, as shown in Figure 8-9, display your three buttons in an overlay.


Figure 8-9.  An image picker controller with a custom overlay view replacing the standard buttons

From here you can create your own custom overlays and easily change their functions to fit nearly any situation. The following recipes leave the image picker controller and instead look into the AVFoundation framework for capturing pictures and videos.

Recipe 8-5: Displaying Camera Preview with AVCaptureSession

While the UIImagePickerController and UIVideoEditorController interfaces are incredibly useful, they certainly aren't as customizable as they could be. With the AVFoundation framework, however, you can create your camera interfaces from scratch, making them just the way you want.

In this recipe and the ones that follow, you use the AVCaptureSession API to essentially create your own version of the camera. You'll do this in steps, starting with displaying a camera preview.

Begin by creating a new single-view project. You’ll be using the same project for the rest of this chapter so name it accordingly (e.g., MyCamera). Also make sure to add the AVFoundation framework to your project or you’ll run into linker errors later.

Now, add a property to your view controller to hold your AVCaptureSession instance, and one to hold the video input instance, by making the following changes to your ViewController.h file:

//
//  ViewController.h
//  Recipe 8.5: Displaying Camera Preview With AVCaptureSession
//

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>

@interface ViewController : UIViewController

@property (strong, nonatomic) AVCaptureSession *captureSession;
@property (strong, nonatomic) AVCaptureDeviceInput *videoInput;


@end

Next, switch over to the ViewController.m file and locate the viewDidLoad method. There you set up the capture session to receive input from the camera. We’ll show you step by step now and later present you with the complete viewDidLoad implementation.

First, you create your AVCaptureSession. Optionally, you may also want to change the resolution preset, which is set to AVCaptureSessionPresetHigh by default.

self.captureSession = [[AVCaptureSession alloc] init];
//Optional: self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;
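Not every device supports every preset, so if you do change it, it's safer to ask the session first; a small guard along these lines:

```objc
// Only lower the resolution if this device supports the preset
if ([self.captureSession canSetSessionPreset:AVCaptureSessionPresetMedium])
{
    self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;
}
```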

Next, specify your input device, which is the rear camera (assuming one is accessible). You do this through the AVCaptureDevice class method +defaultDeviceWithMediaType:, which takes a variety of arguments depending on the type of media desired, the most prominent of which are AVMediaTypeVideo and AVMediaTypeAudio.

AVCaptureDevice *device = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

Next, you need to set up the instance of AVCaptureDeviceInput to specify your chosen device as an input for your capture session. Also include a check to make sure the input has been correctly created before adding it to your session.

NSError *error = nil;
self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
if (self.videoInput)
{
    [self.captureSession addInput:self.videoInput];
}
else
{
    NSLog(@"Input Error: %@", error);
}

The last part of your viewDidLoad is the creation of a preview layer, with which you can see what your camera is viewing. Add the preview layer as a sublayer of your main view’s layer, but with a slightly reduced height so that it won’t block a button you’ll set up in the next recipe.

AVCaptureVideoPreviewLayer *previewLayer =
    [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
UIView *aView = self.view;
previewLayer.frame =
    CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height-70);
[aView.layer addSublayer:previewLayer];

Here’s the complete viewDidLoad method:

- (void)viewDidLoad
{
    [super viewDidLoad];
    // Do any additional setup after loading the view, typically from a nib.

    self.captureSession = [[AVCaptureSession alloc] init];
    //Optional: self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureDevice *device =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];

    NSError *error = nil;
    self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (self.videoInput)
    {
        [self.captureSession addInput:self.videoInput];
    }
    else
    {
        NSLog(@"Input Error: %@", error);
    }
    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    UIView *aView = self.view;
    previewLayer.frame =
        CGRectMake(0, 0, self.view.frame.size.width,
            self.view.frame.size.height-70);
    [aView.layer addSublayer:previewLayer];

}

Note  Just like any other CALayer, an AVCaptureVideoPreviewLayer can be repositioned, rotated, and resized. With it, you are no longer bound to using the entire screen to record video as you are with the UIImagePicker, meaning you could have your preview layer in one part of the screen and other information for the user in another. As with almost every part of iOS development, the possibilities of use are limited only by the developer’s imagination.
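
To illustrate (this snippet is a hypothetical variation, not part of this recipe’s code), you could confine the preview to a small rectangle and let the layer scale the video to fill it; the frame values are arbitrary:

```objective-c
// Hypothetical: a small 160x120 preview in the top-left corner
// instead of a near-fullscreen one.
AVCaptureVideoPreviewLayer *smallPreview =
    [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
smallPreview.frame = CGRectMake(10, 10, 160, 120);
// Fill the layer with video, cropping the edges if the aspect ratios differ.
smallPreview.videoGravity = AVLayerVideoGravityResizeAspectFill;
[self.view.layer addSublayer:smallPreview];
```

The videoGravity property controls how the video maps onto the layer’s bounds; the other options are AVLayerVideoGravityResizeAspect (letterboxed) and AVLayerVideoGravityResize (stretched).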

Now, the only thing that remains is to start and stop your capture session. In this app you’ll display the camera preview upon application launch, so a good place to put the start code is in the viewWillAppear: method:

- (void)viewWillAppear:(BOOL)animated
{
    [super viewWillAppear:animated];
    [self.captureSession startRunning];
}

And here’s the corresponding code to stop the capture session:

- (void)viewWillDisappear:(BOOL)animated
{
    [super viewWillDisappear:animated];
    [self.captureSession stopRunning];
}

If you build and run your application now, it should display a live camera preview as in Figure 8-10.

9781430245995_Fig08-10.jpg

Figure 8-10.  Displaying a camera preview with AVCaptureSession

Recipe 8-6: Capturing Still Images with AVCaptureSession

In the previous recipe you learned how to set up an AVCaptureSession with input from the camera. You also saw how to connect an AVCaptureVideoPreviewLayer to display a live camera preview in your app. Now you will expand the project by connecting an AVCaptureStillImageOutput object to take still images and save them to the saved photos library on the device.

Before digging into the code, you need to make a couple of changes to the project. The first is to link AssetsLibrary.framework to your project; you use functionality from that framework to write the photos to the photos library.

The second thing you need to do, because you access the device’s shared photos library, is provide a usage description in the application’s Info.plist file. Go ahead and add the NSPhotoLibraryUsageDescription key (displayed as “Privacy—Photo Library Usage Description” in the property editor) with a brief text explaining why your app requires the access (e.g., “Testing AVCaptureSession”).
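
If you prefer editing the raw XML of Info.plist (right-click the file and choose to open it as source code), the entry looks roughly like this; the description string is just the example text from above:

```xml
<key>NSPhotoLibraryUsageDescription</key>
<string>Testing AVCaptureSession</string>
```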

Adding a Capture Button

You need a way to trigger a still image capture. Start by adding a button with the title “Capture” to your view controller’s .xib file, as in Figure 8-11. Also make sure to create an action named capture for the button.

9781430245995_Fig08-11.jpg

Figure 8-11.  A user interface with a button to capture a video frame

Now, switch to your ViewController.h file and import the AssetsLibrary API. Also, add a property to hold your still image output instance, as follows:

//
//  ViewController.h
//  Recipe 8.6: Taking Still Images With AVCaptureSession
//

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

@interface ViewController : UIViewController

@property (strong, nonatomic) AVCaptureSession *captureSession;
@property (strong, nonatomic) AVCaptureDeviceInput *videoInput;
@property (strong, nonatomic) AVCaptureStillImageOutput *stillImageOutput;

- (IBAction)capture:(id)sender;

@end

In ViewController.m, add the following code to the viewDidLoad method. The new code allocates and initializes your still image output object and connects it to the capture session.

- (void)viewDidLoad
{
    [super viewDidLoad];
        // Do any additional setup after loading the view, typically from a nib.
    self.captureSession = [[AVCaptureSession alloc] init];
    //Optional: self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;
    AVCaptureDevice *device =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    NSError *error = nil;
    self.videoInput = [AVCaptureDeviceInput deviceInputWithDevice:device error:&error];
    if (self.videoInput)
    {
        [self.captureSession addInput:self.videoInput];
    }
    else
    {
        NSLog(@"Input Error: %@", error);
    }
    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *stillImageOutputSettings =
        [[NSDictionary alloc] initWithObjectsAndKeys:
            AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [self.stillImageOutput setOutputSettings:stillImageOutputSettings];
    [self.captureSession addOutput:self.stillImageOutput];


    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    UIView *aView = self.view;
    previewLayer.frame =
        CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height-70);
    [aView.layer addSublayer:previewLayer];
}

Note  Besides AVCaptureStillImageOutput, you can use a number of other output types—for example, AVCaptureMovieFileOutput, which you use in the next recipe; AVCaptureVideoDataOutput, with which you can access the raw video output frame by frame; AVCaptureAudioFileOutput, for saving audio files; and AVCaptureAudioDataOutput, for processing audio data.
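
As an example of what such a setup could look like, here is a sketch (not part of this recipe’s code) of connecting an AVCaptureVideoDataOutput; your view controller would additionally need to adopt the AVCaptureVideoDataOutputSampleBufferDelegate protocol and implement the delegate method indicated in the comment:

```objective-c
// Sketch: receive raw video frames instead of compressed still images.
AVCaptureVideoDataOutput *frameOutput = [[AVCaptureVideoDataOutput alloc] init];
// Drop frames that arrive while the delegate is still busy,
// rather than queuing them up.
frameOutput.alwaysDiscardsLateVideoFrames = YES;
// Frames are delivered on a serial background queue.
dispatch_queue_t frameQueue = dispatch_queue_create("frameQueue", NULL);
[frameOutput setSampleBufferDelegate:self queue:frameQueue];
if ([self.captureSession canAddOutput:frameOutput])
{
    [self.captureSession addOutput:frameOutput];
}

// The delegate method, called once per captured frame:
// - (void)captureOutput:(AVCaptureOutput *)captureOutput
//     didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
//        fromConnection:(AVCaptureConnection *)connection;
```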

Now it’s time to implement the action method. All it does is trigger the capture of a still image. We chose to extract the capturing code into a helper method to make the changes coming in the next recipe easier.

- (IBAction)capture:(id)sender
{
    [self captureStillImage];
}

The implementation of captureStillImage can seem daunting at first, so we’ll take it in steps and then show you the complete method.

First you acquire the capture connection and make sure it uses the portrait orientation to capture the image.

- (void) captureStillImage
{
    AVCaptureConnection *stillImageConnection =
        [self.stillImageOutput.connections objectAtIndex:0];
    if ([stillImageConnection isVideoOrientationSupported])
        [stillImageConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    // ...
}

Then you call the captureStillImageAsynchronouslyFromConnection:completionHandler: method and provide a code block that is invoked when the still image capture has completed.

[self.stillImageOutput
    captureStillImageAsynchronouslyFromConnection:stillImageConnection
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
    {
        // ...
    }
];

When the capture has completed, first check whether it was successful; if not, log the error.

[self.stillImageOutput
    captureStillImageAsynchronouslyFromConnection:stillImageConnection
    completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
    {
        if (imageDataSampleBuffer != NULL)
        {
            // ...
        }
        else
        {
            NSLog(@"Error capturing still image: %@", error);
        }

    }
 ];

If the capturing was successful, extract the image from the buffer.

if (imageDataSampleBuffer != NULL)
{
    NSData *imageData = [AVCaptureStillImageOutput
        jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
    UIImage *image = [[UIImage alloc] initWithData:imageData];

    // ...
}

Then save the image to the photos library. This is also an asynchronous task, so provide a block that runs when it completes. Whether the task completes successfully or fails (e.g., if the user didn’t allow access to the photos library), display an alert to notify the user.

ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
[library writeImageToSavedPhotosAlbum:[image CGImage]
    orientation:(ALAssetOrientation)[image imageOrientation]
    completionBlock:^(NSURL *assetURL, NSError *error)
    {
        UIAlertView *alert;
        if (!error)
        {
            alert = [[UIAlertView alloc] initWithTitle:@"Photo Saved"
                message:@"The photo was successfully saved to your photos library"
                delegate:nil
                cancelButtonTitle:@"OK"
                otherButtonTitles:nil, nil];
        }
        else
        {
            alert = [[UIAlertView alloc] initWithTitle:@"Error Saving Photo"
                message:@"The photo was not saved to your photos library"
                delegate:nil
                cancelButtonTitle:@"OK"
                otherButtonTitles:nil, nil];
        }
        [alert show];
    }
];

Here's the captureStillImage method in its entirety:

- (void) captureStillImage
{
    AVCaptureConnection *stillImageConnection =
        [self.stillImageOutput.connections objectAtIndex:0];
    if ([stillImageConnection isVideoOrientationSupported])
        [stillImageConnection setVideoOrientation:AVCaptureVideoOrientationPortrait];
    [self.stillImageOutput
        captureStillImageAsynchronouslyFromConnection:stillImageConnection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error)
        {
            if (imageDataSampleBuffer != NULL)
            {
                NSData *imageData = [AVCaptureStillImageOutput
                    jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
                UIImage *image = [[UIImage alloc] initWithData:imageData];
                [library writeImageToSavedPhotosAlbum:[image CGImage]
                    orientation:(ALAssetOrientation)[image imageOrientation]
                    completionBlock:^(NSURL *assetURL, NSError *error)
                    {
                        UIAlertView *alert;
                        if (!error)
                        {
                            alert = [[UIAlertView alloc] initWithTitle:@"Photo Saved"
                                message:@"The photo was successfully saved to your photos library"
                                delegate:nil
                                cancelButtonTitle:@"OK"
                                otherButtonTitles:nil, nil];
                        }
                        else
                        {
                            alert = [[UIAlertView alloc] initWithTitle:@"Error Saving Photo"
                                message:@"The photo was not saved to your photos library"
                                delegate:nil
                                cancelButtonTitle:@"OK"
                                otherButtonTitles:nil, nil];
                        }
                        [alert show];
                    }
                ];
            }
            else
            {
                NSLog(@"Error capturing still image: %@", error);
            }
        }
    ];
}

That completes Recipe 8-6. You can now run your app and tap the Capture button to take a picture that’s saved to your photos library, as in Figure 8-12.

9781430245995_Fig08-12.jpg

Figure 8-12.  A still image captured and saved to the photos library

While you haven’t included any fancy animations to make it look like a camera, this is quite useful as far as a basic camera goes. Recipe 8-7 takes it to the next level and shows you how to record a video using AVCaptureSession.

Recipe 8-7: Capturing Video with AVCaptureSession

Now that you have covered some of the basics of using AVFoundation, you will use it to implement a slightly more complicated project. This time, you’ll extend your app to include a mode control that allows the user to switch between taking pictures and recording videos. You’ll build on the same project that you have been working on since Recipe 8-5.

Adding a Video Recording Mode

You’re going to add a new component to your user interface that lets the user switch between still image and video recording modes. A simple segmented control works for the purpose of this recipe, so go ahead and add one from the object library. Give it two options, “Take Photo” and “Record Video,” and place it so that your view resembles the one in Figure 8-13.

9781430245995_Fig08-13.jpg

Figure 8-13.  A simple user interface that allows the user to switch modes between photo and video capturing

You’re going to access both the segmented control and the button from your code, so add outlets for them. Use the names modeControl and captureButton, respectively. You also need to respond when the segmented control’s value changes, so create an action for that event and name it updateMode.

Now switch over to your ViewController.h file. You’re going to add a couple of properties for the video recording part of your capture session: one for the audio input and one for the movie file output. Also, to prepare the view controller for acting as the output delegate for movie file recording, add the AVCaptureFileOutputRecordingDelegate protocol to the header. Here is the code with the changes:

//
//  ViewController.h
//  Recipe 8.7: Recording Video With AVCaptureSession
//

#import <UIKit/UIKit.h>
#import <AVFoundation/AVFoundation.h>
#import <AssetsLibrary/AssetsLibrary.h>

@interface ViewController : UIViewController<AVCaptureFileOutputRecordingDelegate>

@property (strong, nonatomic) AVCaptureSession *captureSession;
@property (strong, nonatomic) AVCaptureDeviceInput *videoInput;
@property (strong, nonatomic) AVCaptureDeviceInput *audioInput;
@property (strong, nonatomic) AVCaptureStillImageOutput *stillImageOutput;
@property (strong, nonatomic) AVCaptureMovieFileOutput *movieOutput;

@property (weak, nonatomic) IBOutlet UIButton *captureButton;
@property (weak, nonatomic) IBOutlet UISegmentedControl *modeControl;

- (IBAction)capture:(id)sender;
- (IBAction)updateMode:(id)sender;

@end

Now that your header file is all set up, switch over to your implementation file. To start, you’re going to make several changes to the viewDidLoad method. The first is to set up an audio input object to capture sound from the device’s microphone while recording video.

- (void)viewDidLoad
{
    [super viewDidLoad];
    self.captureSession = [[AVCaptureSession alloc] init];
    //Optional: self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;
    AVCaptureDevice *videoDevice =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *audioDevice =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];

    self.videoInput =
        [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
    self.audioInput =
        [[AVCaptureDeviceInput alloc] initWithDevice:audioDevice error:nil];


    // ...
}

Next, you’re going to set up an output object that records the data from the input objects and produces a movie file.

- (void)viewDidLoad
{
    // ...

    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *stillImageOutputSettings = [[NSDictionary alloc]
        initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [self.stillImageOutput setOutputSettings:stillImageOutputSettings];
    self.movieOutput = [[AVCaptureMovieFileOutput alloc] init];

    // ...
}

Finally, you’re going to set up the capture session in the picture-taking mode and adjust the size of the preview layer so that it won’t cover the new segment control.

- (void)viewDidLoad
{
    // ...

    // Setup capture session for taking pictures
    [self.captureSession addInput:self.videoInput];
    [self.captureSession addOutput:self.stillImageOutput];

    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    UIView *aView = self.view;
    previewLayer.frame =
        CGRectMake(0, 70, self.view.frame.size.width, self.view.frame.size.height-140);
    [aView.layer addSublayer:previewLayer];
}

With all these changes, your viewDidLoad method should resemble the code that follows:

- (void)viewDidLoad
{
    [super viewDidLoad];
    self.captureSession = [[AVCaptureSession alloc] init];
    //Optional: self.captureSession.sessionPreset = AVCaptureSessionPresetMedium;
    AVCaptureDevice *videoDevice =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDevice *audioDevice =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
    self.videoInput =
        [AVCaptureDeviceInput deviceInputWithDevice:videoDevice error:nil];
    self.audioInput =
        [[AVCaptureDeviceInput alloc] initWithDevice:audioDevice error:nil];
    self.stillImageOutput = [[AVCaptureStillImageOutput alloc] init];
    NSDictionary *stillImageOutputSettings = [[NSDictionary alloc]
        initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil];
    [self.stillImageOutput setOutputSettings:stillImageOutputSettings];
    self.movieOutput = [[AVCaptureMovieFileOutput alloc] init];
    // Setup capture session for taking pictures
    [self.captureSession addInput:self.videoInput];
    [self.captureSession addOutput:self.stillImageOutput];
    AVCaptureVideoPreviewLayer *previewLayer =
        [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
    UIView *aView = self.view;
    previewLayer.frame =
        CGRectMake(0, 70, self.view.frame.size.width, self.view.frame.size.height-140);
    [aView.layer addSublayer:previewLayer];
}

Note that you’re not adding the audioInput and movieOutput objects to the capture session yet. Later you’ll add and remove input and output objects depending on which mode the user selects, but for now the app is assumed to be in “Take Photo” mode. Therefore, only the input and output objects associated with that particular mode are added in the viewDidLoad method. (For the same reason, it’s also important that the segmented control has the correct selected index value set.)

Now, update the capture action method. It should check which mode the application is in and, if in “Take Photo” mode, do what it did before, namely, capture a still image.

- (IBAction)capture:(id)sender
{
    if (self.modeControl.selectedSegmentIndex == 0)
    {

        // Picture Mode
        [self captureStillImage];
    }
    else
    {
        // Video Mode
    }

}

If in video recording mode, however, it should toggle between starting and stopping a recording, depending on whether a movie is currently being recorded, as shown in the following code.

- (IBAction)capture:(id)sender
{
    if (self.modeControl.selectedSegmentIndex == 0)
    {
        // Picture Mode
        [self captureStillImage];
    }
    else
    {
        // Video Mode
        if (self.movieOutput.isRecording == YES)
        {
            [self.captureButton setTitle:@"Capture" forState:UIControlStateNormal];
            [self.movieOutput stopRecording];
        }
        else
        {
            [self.captureButton setTitle:@"Stop" forState:UIControlStateNormal];
            [self.movieOutput startRecordingToOutputFileURL:[self tempFileURL]
                recordingDelegate:self];
        }

    }
}

You have probably noticed the call to the tempFileURL method in the preceding code. This method returns a URL at which the recorded video is temporarily saved on your device. If there is already a file at that location, the method deletes it. (This way, you never use more than one video’s worth of disk space.) Here’s its implementation:

- (NSURL *) tempFileURL
{
    NSString *outputPath = [[NSString alloc] initWithFormat:@"%@%@",
        NSTemporaryDirectory(), @"output.mov"];
    NSURL *outputURL = [[NSURL alloc] initFileURLWithPath:outputPath];
    NSFileManager *manager = [[NSFileManager alloc] init];
    if ([manager fileExistsAtPath:outputPath])
    {
        [manager removeItemAtPath:outputPath error:nil];
    }
    return outputURL;
}

The next step is to implement the AVCaptureFileOutputRecordingDelegate method that is invoked when the capture session has finished recording a movie. The method starts by checking whether there were any errors in recording the video to a file, and then saves the video file to the photos library. The process of writing a video to the photos library is nearly the same as with photos, so you’ll probably recognize a lot of this code from the previous recipe:

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    BOOL recordedSuccessfully = YES;
    if ([error code] != noErr)
    {
        // A problem occurred: Find out if the recording was successful.
        id value = [[error userInfo]
            objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        if (value)
            recordedSuccessfully = [value boolValue];
        // Logging the problem anyway:
        NSLog(@"A problem occurred while recording: %@", error);
    }
    if (recordedSuccessfully)
    {
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
            completionBlock:^(NSURL *assetURL, NSError *error)
            {
                UIAlertView *alert;
                if (!error)
                {
                    alert = [[UIAlertView alloc] initWithTitle:@"Video Saved"
                       message:@"The movie was successfully saved to your photos library"
                       delegate:nil
                       cancelButtonTitle:@"OK"
                       otherButtonTitles:nil, nil];
                }
                else
                {
                    alert = [[UIAlertView alloc] initWithTitle:@"Error Saving Video"
                       message:@"The movie was not saved to your photos library"
                       delegate:nil
                       cancelButtonTitle:@"OK"
                       otherButtonTitles:nil, nil];
                }
                [alert show];
            }
        ];
    }
}

Finally, implement the action method that’s invoked when the user switches between the two modes:

- (IBAction)updateMode:(id)sender
{
    [self.captureSession stopRunning];
    if (self.modeControl.selectedSegmentIndex == 0)
    {
        // Still Image Mode
        if (self.movieOutput.isRecording == YES)
        {
            [self.movieOutput stopRecording];
        }
        [self.captureSession removeInput:self.audioInput];
        [self.captureSession removeOutput:self.movieOutput];
        [self.captureSession addOutput:self.stillImageOutput];
    }
    else
    {
        // Video Mode
        [self.captureSession removeOutput:self.stillImageOutput];
        [self.captureSession addInput:self.audioInput];
        [self.captureSession addOutput:self.movieOutput];
        // Set orientation of capture connections to portrait
        NSArray *array = [[self.captureSession.outputs objectAtIndex:0] connections];
        for (AVCaptureConnection *connection in array)
        {
            connection.videoOrientation = AVCaptureVideoOrientationPortrait;
        }
    }
    [self.captureButton setTitle:@"Capture" forState:UIControlStateNormal];
    [self.captureSession startRunning];
}

This method updates the capture session with the input and output objects associated with the selected mode. It also sets the video orientation for the video mode’s output connections. (This is already taken care of for the still image mode; see Recipe 8-6 for details.) Finally, it resets the title of the capture button.

Now you are ready to build and run your app. You should be able to switch between taking photos and recording videos, with the results stored in the photos library of your device. Figure 8-14 shows the app in action.

9781430245995_Fig08-14.jpg

Figure 8-14.  An app that can take photos and record videos

Recipe 8-8: Capturing Video Frames

For many applications that use videos, a thumbnail image is a useful way to represent a given video. In this recipe you expand the previous project to generate and display a thumbnail image when a video has been recorded.

Start by adding the CoreMedia framework to your project. You use its CMTime type to specify the frame from which to generate the thumbnail.

Next, add an image view to the lower-left corner of your main view’s user interface so that it resembles Figure 8-15.

9781430245995_Fig08-15.jpg

Figure 8-15.  The user interface with a thumbnail image view in the lower-left corner

Also, add an outlet for the image view so that you can reference it later from your code. Name it thumbnailImageView.

Now, let’s get to the core of this recipe: the method that extracts an image from the halfway point of the movie and updates the thumbnail image view. Add the following method to your view controller:

-(void)createThumbnailForVideoURL:(NSURL *)videoURL
{
    AVURLAsset *myAsset = [[AVURLAsset alloc] initWithURL:videoURL
        options:[NSDictionary dictionaryWithObject:[NSNumber numberWithBool:YES]
            forKey:AVURLAssetPreferPreciseDurationAndTimingKey]];
    AVAssetImageGenerator *imageGenerator =
        [AVAssetImageGenerator assetImageGeneratorWithAsset:myAsset];
    //Make sure images are correctly rotated.
    imageGenerator.appliesPreferredTrackTransform = YES;
    Float64 durationSeconds = CMTimeGetSeconds([myAsset duration]);
    CMTime half = CMTimeMakeWithSeconds(durationSeconds/2.0, 600);
    NSArray *times = [NSArray arrayWithObjects: [NSValue valueWithCMTime:half], nil];
    [imageGenerator generateCGImagesAsynchronouslyForTimes:times
        completionHandler:^(CMTime requestedTime, CGImageRef image, CMTime actualTime,
                            AVAssetImageGeneratorResult result, NSError *error)
        {
            if (result == AVAssetImageGeneratorSucceeded)
            {
                self.thumbnailImageView.image = [UIImage imageWithCGImage:image];
            }
            else if (result == AVAssetImageGeneratorFailed)
            {
                NSLog(@"Failed with error: %@", [error localizedDescription]);
            }
        }
    ];
}

Now, all that’s left is to call the method when a video has been recorded. Add the following call to your captureOutput:didFinishRecordingToOutputFileAtURL:fromConnections:error: method:

- (void)captureOutput:(AVCaptureFileOutput *)captureOutput
didFinishRecordingToOutputFileAtURL:(NSURL *)outputFileURL
      fromConnections:(NSArray *)connections
                error:(NSError *)error
{
    BOOL recordedSuccessfully = YES;
    if ([error code] != noErr)
    {
        // A problem occurred: Find out if the recording was successful.
        id value =
            [[error userInfo] objectForKey:AVErrorRecordingSuccessfullyFinishedKey];
        if (value)
            recordedSuccessfully = [value boolValue];
        // Logging the problem anyway:
        NSLog(@"A problem occurred while recording: %@", error);
    }
    if (recordedSuccessfully)
    {
        [self createThumbnailForVideoURL:outputFileURL];
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        [library writeVideoAtPathToSavedPhotosAlbum:outputFileURL
            completionBlock:^(NSURL *assetURL, NSError *error)
            {
                UIAlertView *alert;
                if (!error)
                {
                    alert = [[UIAlertView alloc] initWithTitle:@"Video Saved"
                        message:@"The movie was successfully saved to your photos library"
                        delegate:nil
                        cancelButtonTitle:@"OK"
                        otherButtonTitles:nil, nil];
                }
                else
                {
                    alert = [[UIAlertView alloc] initWithTitle:@"Error Saving Video"
                        message:@"The movie was not saved to your photos library"
                        delegate:nil
                        cancelButtonTitle:@"OK"
                        otherButtonTitles:nil, nil];
                }
                [alert show];
            }
         ];
    }
}

Now build and run your application and switch to the Record Video mode when it has launched. Record a video by tapping the Capture button twice (once to start and again to stop). A few seconds later, a thumbnail from halfway into your movie is displayed in the lower-left corner, as Figure 8-16 shows.

9781430245995_Fig08-16.jpg

Figure 8-16.  Your app displaying a movie thumbnail in the lower-left corner

Note  As you may have noticed, the method used in Recipe 8-8 is rather slow; it usually takes a couple of seconds to extract the image. There are other approaches you may want to try—for example, adding an AVCaptureStillImageOutput to your capture session (it’s possible to have several output objects connected) and taking a snapshot when the recording starts, using what you learned in Recipe 8-6. Another option is the AVCaptureVideoDataOutput mentioned earlier; with it you can grab any frame you like during recording and extract an image.

Summary

As a developer, you have a great deal of choice when it comes to your device’s camera. The predefined interfaces such as UIImagePickerController and UIVideoEditorController are incredibly useful and well designed, but Apple’s AV Foundation framework allows for far more possibilities: you can work with video, audio, and still images alike. Even a quick glance at the full documentation reveals countless other capabilities not discussed here, from device features (such as the video camera’s LED “torch”) to implementing your own “touch-to-focus” functionality. We live in a world where images, audio, and video fly around the globe in a matter of seconds, and as developers we must be able to design and create innovative solutions that fit in with our media-based community.
