OpenCV in iOS – The Camera

Hello everyone, this is my second blog post in the ‘OpenCV in iOS’ series. Before starting this tutorial, it is recommended that you complete the ‘OpenCV in iOS – An Introduction‘ tutorial. In this blog post, I will explain how to use the camera inside your iOS app. To set up the application in Xcode, please complete the steps up to step 6 of the ‘OpenCV in iOS – An Introduction‘ tutorial before you proceed with the steps below!

  1. In this app, we need to include some additional frameworks in our project. They are listed as follows –
    • Accelerate.framework
    • AssetsLibrary.framework
    • AVFoundation.framework
    • CoreGraphics.framework
    • CoreImage.framework
    • CoreMedia.framework
    • opencv2.framework
    • QuartzCore.framework
    • UIKit.framework
  2. We already know how to add ‘opencv2.framework‘ from the previous blog post. I will go through the process of adding one of the above mentioned frameworks (e.g. AVFoundation.framework); you can add the rest in the same way. To add ‘AVFoundation.framework‘, go to ‘Linked Frameworks and Libraries‘ and click on the ‘+’ sign. Choose ‘AVFoundation.framework‘ and click on ‘Add‘.


  3. Your project navigator area should now show all of the frameworks listed above.


  4. It’s time to get our hands dirty! 🙂 Open ‘ViewController.h‘ and add the following lines of code.

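    A minimal sketch of what the header needs (assuming the standard CvVideoCamera API that ships with opencv2.framework; the import path differs slightly between OpenCV versions):

    // ViewController.h
    #import <UIKit/UIKit.h>

    // CvVideoCamera and the CvVideoCameraDelegate protocol live in cap_ios.h.
    // On OpenCV 3.x the path is opencv2/videoio/cap_ios.h;
    // on OpenCV 2.4 it is opencv2/highgui/cap_ios.h.
    #import <opencv2/videoio/cap_ios.h>

    // Conforming to CvVideoCameraDelegate lets this controller receive every
    // captured frame through the processImage: callback (added in step 9).
    @interface ViewController : UIViewController <CvVideoCameraDelegate>

    @end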

  5. Now go to the ‘ViewController.mm‘ file and add a few lines so that C++ code can be used alongside the Objective-C code.

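    Roughly, the top of the file should look like this (the .mm extension is what makes Xcode compile it as Objective-C++):

    // ViewController.mm
    // The .mm extension tells Xcode to compile this file as Objective-C++,
    // so OpenCV's C++ headers can be included alongside Objective-C code.

    // OpenCV headers are usually included before any Apple headers.
    #ifdef __cplusplus
    #import <opencv2/opencv.hpp>
    #endif

    #import "ViewController.h"

    using namespace cv;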

  6. Let us declare some variables for accessing the camera and for showing its live output.

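    Something along these lines will do (the names videoCamera and liveView are simply the ones I use in this sketch):

    // Private members of the view controller, declared in ViewController.mm.
    @interface ViewController ()
    {
        CvVideoCamera *videoCamera;   // grabs frames from the device camera
        UIImageView   *liveView;      // full-screen view that shows the live output
    }
    @end

    @implementation ViewController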

  7. Now set up the live view so that it fills the whole app screen.

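    A sketch of viewDidLoad, where the image view is created with the full screen bounds and added to the view hierarchy:

    - (void)viewDidLoad
    {
        [super viewDidLoad];

        // Give the live view the same bounds as the root view so it fills
        // the whole screen, then add it to the view hierarchy.
        liveView = [[UIImageView alloc] initWithFrame:self.view.bounds];
        liveView.contentMode = UIViewContentModeScaleAspectFill;
        [self.view addSubview:liveView];

        // The camera itself is configured and started in the next step.
    }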

  8. Initialise the camera parameters and start capturing!

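    These lines go inside viewDidLoad, right after the live view is added (a sketch using CvVideoCamera’s standard properties; adjust the preset, position and FPS as you like):

        // Attach the camera to the live view and make this controller its
        // delegate, so processImage: gets called for every frame.
        videoCamera = [[CvVideoCamera alloc] initWithParentView:liveView];
        videoCamera.delegate = self;

        // Basic capture parameters.
        videoCamera.defaultAVCaptureDevicePosition   = AVCaptureDevicePositionBack;
        videoCamera.defaultAVCaptureSessionPreset    = AVCaptureSessionPreset1280x720;
        videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
        videoCamera.defaultFPS = 30;

        // Start capturing!
        [videoCamera start];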

  9. But wait! We still have to do one more step before actually testing our app. If you look at the line “@implementation ViewController”, you will find a warning: “Method ‘processImage:’ in protocol ‘CvVideoCameraDelegate’ not implemented”. To know more about CvVideoCameraDelegate, refer to this link. Coming back to our tutorial, we have to add the following lines of code to get rid of that warning.

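    Implementing the delegate method silences the warning; for now it can simply leave each frame untouched:

    // CvVideoCameraDelegate callback: called for every captured frame.
    // 'image' is the frame as a cv::Mat (BGRA); whatever is left in it
    // is what gets drawn in the live view.
    - (void)processImage:(cv::Mat &)image
    {
        // No processing yet - the frame is displayed as it is.
    }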

  10. And now we are ready to run the app! For this application, we have to run and test on an actual iPhone/iPad, since we need to access the device’s camera. Once it launches, we can see the live view of our camera! 🙂


  11. Let’s give some basic instaTouch! to our app 😉. Add a few lines of code to the ‘processImage‘ method and run the application on your device.

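    For example, a simple grayscale filter looks like this (any other OpenCV operation can be applied to the frame in the same way):

    // processImage: from step 9, now with a basic live effect.
    - (void)processImage:(cv::Mat &)image
    {
        // The camera delivers the frame in BGRA; convert it to a single-channel
        // grayscale image and back, so the live view still receives 4 channels.
        cv::Mat gray;
        cv::cvtColor(image, gray, cv::COLOR_BGRA2GRAY);
        cv::cvtColor(gray, image, cv::COLOR_GRAY2BGRA);
    }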

With this we are coming to the end of this tutorial! 🙂 We have learnt how to access the camera inside the app and apply some live operations on the video. Though this is a basic tutorial, it will act as a precursor to many Augmented Reality type applications! 😀 We will try to get into the next level of Computer Vision app development in our next tutorial! Till then, stay tuned… 🙂 Feel free to comment your suggestions/doubts related to this tutorial.

The SOURCE CODE for this tutorial is available at the following GITHUB LINK.

 
