Howdy!
Today I am going to write about how to implement a custom camera using AVFoundation.
Here I have created a storyboard consisting of two view controllers. One is the Main View Controller, embedded in a Navigation Controller; the other is the Camera View Controller, which contains a UIView (in gray) for the camera preview, a button that will capture the image/video, an activity indicator to show processing, and a bar button for switching cameras.
In my Assets catalog I have the button images used later in the code ("capture" and "video_record").
Now let's open the Main View Controller, create IBActions for both of its buttons, and set the identifier of the segue to the Camera View Controller to ‘capture’ (you can use any identifier you like). Also create a variable of type String called keyForCamera.
Now open your Camera View Controller and create its IBActions and IBOutlets, two functions for initiating the camera for image and video capture respectively, and a variable named keyFromMenu of type String.
Create @IBOutlets for the UIView and the activity indicator, and @IBActions for the capture button and the switch-camera bar button.
Now come back to your Main View Controller, paste the following code into your @IBActions, and override prepareForSegue.
```swift
@IBAction func capturePicture(sender: UIButton) {
    self.keyForCamera = "Capture Picture"
    self.performSegueWithIdentifier("capture", sender: nil)
}

@IBAction func captureVideo(sender: UIButton) {
    self.keyForCamera = "Capture Video"
    self.performSegueWithIdentifier("capture", sender: nil)
}

override func prepareForSegue(segue: UIStoryboardSegue, sender: AnyObject?) {
    if segue.identifier == "capture" {
        let destination = segue.destinationViewController as! CameraViewController
        destination.keyFromMenu = self.keyForCamera
    }
}
```
Just to check that our logic works, build and run: each button should pass its own title ("Capture Picture" or "Capture Video") to the Camera View Controller.
Now come back to your Camera View Controller class and declare the following variables in it.
```swift
// Camera session
var session: AVCaptureSession?
// Still image output for capturing pictures
var stillImageOutput: AVCaptureStillImageOutput?
// Movie file output for capturing video
var videoOutput: AVCaptureMovieFileOutput?
// Shows the camera preview
var videoPreviewLayer: AVCaptureVideoPreviewLayer?
// The camera hardware currently in use
var captureDevice: AVCaptureDevice! = nil
// true = back camera, false = front camera
var camera: Bool = true
```
Your initiatePictureCamera function will now look like this.
```swift
func initiatePictureCamera() {
    print("Picture Camera is Running")
    session = AVCaptureSession()
    session!.sessionPreset = AVCaptureSessionPresetPhoto

    if camera == false {
        // Find the front camera
        let videoDevices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo)
        for device in videoDevices {
            let device = device as! AVCaptureDevice
            if device.position == AVCaptureDevicePosition.Front {
                captureDevice = device
                break
            }
        }
    } else {
        // The default video device is the back camera
        captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    }

    do {
        let input = try AVCaptureDeviceInput(device: captureDevice)
        if session!.canAddInput(input) {
            session!.addInput(input)
            stillImageOutput = AVCaptureStillImageOutput()
            stillImageOutput!.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
            if session!.canAddOutput(stillImageOutput) {
                session!.addOutput(stillImageOutput)
                // Attach the preview layer to the gray UIView
                videoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
                videoPreviewLayer!.videoGravity = AVLayerVideoGravityResizeAspectFill
                videoPreviewLayer!.connection?.videoOrientation = AVCaptureVideoOrientation.Portrait
                videoPreviewLayer!.frame = cameraOverlayView.bounds
                cameraOverlayView.layer.addSublayer(videoPreviewLayer!)
                session!.startRunning()
            }
        }
    } catch let error as NSError {
        print(error.localizedDescription)
    }
}
```
And your initiateVideoCamera will look something like this.
```swift
func initiateVideoCamera() {
    print("Video Camera is Running")
    session = AVCaptureSession()
    session!.sessionPreset = AVCaptureSessionPresetHigh

    if camera == false {
        // Find the front camera
        let videoDevices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo)
        for device in videoDevices {
            let device = device as! AVCaptureDevice
            if device.position == AVCaptureDevicePosition.Front {
                captureDevice = device
                break
            }
        }
    } else {
        // The default video device is the back camera
        captureDevice = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
    }

    do {
        let input = try AVCaptureDeviceInput(device: captureDevice)
        if session!.canAddInput(input) {
            session!.addInput(input)
            videoOutput = AVCaptureMovieFileOutput()
            if session!.canAddOutput(videoOutput) {
                session!.addOutput(videoOutput)
                // Attach the preview layer to the gray UIView
                videoPreviewLayer = AVCaptureVideoPreviewLayer(session: session)
                videoPreviewLayer!.videoGravity = AVLayerVideoGravityResizeAspectFill
                videoPreviewLayer!.connection?.videoOrientation = AVCaptureVideoOrientation.Portrait
                videoPreviewLayer!.frame = cameraOverlayView.bounds
                cameraOverlayView.layer.addSublayer(videoPreviewLayer!)
                session!.startRunning()
            }
        }
    } catch let error as NSError {
        print(error.localizedDescription)
    }
}

//MARK: - Change Camera to Front or Back
@IBAction func switchCamera(sender: UIBarButtonItem) {
    camera = !camera
    session!.stopRunning()
    videoPreviewLayer!.removeFromSuperlayer()
    // Restart whichever mode is active, not just the picture camera
    if keyFromMenu == "Capture Video" {
        initiateVideoCamera()
    } else {
        initiatePictureCamera()
    }
}
```
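The post never shows where the two initiator functions are called. One way to wire them up, a sketch assuming you start the session in viewDidLoad and that hiding the activity indicator there matches your storyboard setup, is to branch on keyFromMenu:

```swift
// A sketch (not from the original post): pick the camera mode
// based on the key passed in from the Main View Controller.
override func viewDidLoad() {
    super.viewDidLoad()
    // Assumption: the indicator should be hidden until a video is being saved
    activityIndicator.hidden = true
    if keyFromMenu == "Capture Video" {
        initiateVideoCamera()
    } else {
        initiatePictureCamera()
    }
}
```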
In your Capture button action, add the following.
```swift
//MARK: - Capture Video or Picture
@IBAction func capture(sender: AnyObject) {
    if let title = self.keyFromMenu {
        if title == "Capture Picture" {
            if let videoConnection = stillImageOutput!.connectionWithMediaType(AVMediaTypeVideo) {
                stillImageOutput!.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (sampleBuffer, error) -> Void in
                    if sampleBuffer != nil {
                        let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(sampleBuffer)
                        let dataProvider = CGDataProviderCreateWithCFData(imageData)
                        let cgImageRef = CGImageCreateWithJPEGDataProvider(dataProvider, nil, true, CGColorRenderingIntent.RenderingIntentDefault)
                        let image = UIImage(CGImage: cgImageRef!, scale: 1.0, orientation: UIImageOrientation.Right)
                        // Save the still image to the Photos library
                        UIImageWriteToSavedPhotosAlbum(image, self, #selector(CameraViewController.image(_:didFinishSavingWithError:contextInfo:)), nil)
                    }
                })
            }
        } else {
            // Otherwise the title is "Capture Video"
            let fileName = "video.mp4"
            let documentsURL = NSFileManager.defaultManager().URLsForDirectory(.DocumentDirectory, inDomains: .UserDomainMask)[0]
            let filePath = documentsURL.URLByAppendingPathComponent(fileName)
            if self.videoOutput!.recording {
                // Stop recording and show the indicator while the video is saved
                self.videoOutput!.stopRecording()
                self.captureOutlet.setImage(UIImage(named: "capture"), forState: .Normal)
                self.activityIndicator.hidden = false
                self.activityIndicator.startAnimating()
            } else {
                // Change the capture button image and start recording
                self.captureOutlet.setImage(UIImage(named: "video_record"), forState: .Normal)
                self.videoOutput!.startRecordingToOutputFileURL(filePath, recordingDelegate: self)
            }
        }
    }
}
```
This will show an alert when the image has been saved to Photos.
```swift
//MARK: - Shows alert when image is saved
func image(image: UIImage, didFinishSavingWithError error: NSError?, contextInfo: UnsafePointer<Void>) {
    guard error == nil else {
        // Error saving image
        print(error)
        return
    }
    // Image saved successfully
    showAlert("Saved", message: "Image Saved to Photos")
}
```
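showAlert is not a UIKit method and its body is not shown in the post; a minimal sketch of such a helper, assuming it just presents a one-button UIAlertController, could be:

```swift
// A sketch (not from the original post) of the showAlert helper used above.
func showAlert(title: String, message: String) {
    let alert = UIAlertController(title: title, message: message, preferredStyle: .Alert)
    alert.addAction(UIAlertAction(title: "OK", style: .Default, handler: nil))
    self.presentViewController(alert, animated: true, completion: nil)
}
```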
Conform your Camera View Controller class to the AVCaptureFileOutputRecordingDelegate protocol. It provides the URL of the recorded video, which can then be saved to Photos. Paste the following code into your class.
```swift
//MARK: - Get completed video path
func captureOutput(captureOutput: AVCaptureFileOutput!, didFinishRecordingToOutputFileAtURL outputFileURL: NSURL!, fromConnections connections: [AnyObject]!, error: NSError!) {
    // Saves the video in Photos
    doVideoProcessing(outputFileURL)
}
```
Finally, for saving the video to Photos:
```swift
func doVideoProcessing(outputPath: NSURL) {
    PHPhotoLibrary.sharedPhotoLibrary().performChanges({
        PHAssetChangeRequest.creationRequestForAssetFromVideoAtFileURL(outputPath)
    }) { (success, error) in
        if error == nil {
            print("Success: \(success)")
            // UI updates must happen on the main queue
            dispatch_async(dispatch_get_main_queue(), {
                self.activityIndicator!.stopAnimating()
                self.showAlert("Saved", message: "Video saved to Photos")
            })
        } else {
            dispatch_async(dispatch_get_main_queue(), {
                self.activityIndicator!.stopAnimating()
                self.showAlert("Error", message: error!.localizedDescription)
            })
        }
    }
}
```
Don’t forget to import these frameworks in your Camera View Controller.
```swift
import AVFoundation
import Photos
```
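One more thing: since iOS 10, apps must declare usage descriptions in Info.plist, or they crash the first time they access the camera, microphone, or photo library. The key names below are the standard ones; the description strings are only examples.

```xml
<key>NSCameraUsageDescription</key>
<string>Used to capture photos and videos.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Used to record sound with your videos.</string>
<key>NSPhotoLibraryUsageDescription</key>
<string>Used to save captured photos and videos.</string>
```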
At last, run and test it.
Check out this demo.
Here is the full Source Code to this project.
Good day!