Computer Vision

How to develop an OpenCV C++ algorithm in Xcode

Everything you need to get up and running to develop and debug an OpenCV C++ algorithm

Anurag Ajwani

--

Looking to solve Computer Vision problems using OpenCV? Maybe you are developing an OpenCV algorithm to be used on multiple platforms like iOS and Android.

A great way to develop cross-platform OpenCV algorithms is by writing them in C++ and using Xcode. Both platforms support running and binding to C++ code. However, developing and debugging an OpenCV algorithm inside a mobile app can be slow and cumbersome. By testing your OpenCV C++ algorithm on your Mac you can save time and effort when developing the algorithm.

The aim of this post is to help you get up and running developing and debugging an OpenCV C++ algorithm. This post will show you how to set up your C++ OpenCV algorithm project on macOS using Apple’s Xcode IDE.

I assume you are familiar with the basics of C++ and OpenCV. I also assume you have Xcode installed and have a webcam.

For this post I have used Xcode 12.2.

Getting Started

In this section we’ll set up a new C++ command line tool project in Xcode. Then we’ll add OpenCV to the project. Finally we’ll look at three ways of capturing images using OpenCV: reading an image from disk, reading frames from a video and capturing frames from the webcam. This is the point at which you’ll have a chance to process the image. For this post we’ll simply display the image back to the user.

The steps we’ll take:

  1. Create new Xcode project
  2. Installing OpenCV
  3. Linking OpenCV
  4. Reading images from disk
  5. Reading video from disk
  6. Streaming the camera feed
  7. Running the tool from the command line

We’ll be making use of the command line through Terminal. During this post make sure to have the Terminal app open.

1. Create new Xcode project

Let’s start by creating a new Xcode project. From menu select File > New > Project…

When prompted “Choose a template for your new project:” search and select Command Line Tool under macOS tab. Then click Next.

Then when prompted “Choose options for your project:” name the product MyOpenCVAlgorithm. For the “Language:” select C++. Then click Next.

Xcode will create a very simple command line based app project that just contains a single file named main.cpp. Let’s run the tool. From menu select Product > Run.

Command line tool run output
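For reference, the generated main.cpp should look roughly like the following (the exact template contents can vary slightly between Xcode versions):

#include <iostream>

int main(int argc, const char * argv[]) {
    // insert code here...
    std::cout << "Hello, World!\n";
    return 0;
}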

2. Installing OpenCV

To install OpenCV we’ll be making use of command line tool Homebrew. Open Terminal app and run the following command:

/bin/bash -c "$(curl -fsSL https://raw.githubusercontent.com/Homebrew/install/HEAD/install.sh)"

Once Homebrew is installed run the following command to install OpenCV:

brew install opencv

The installation may take a while. Homebrew will download the OpenCV code and other code it depends on. Then Homebrew will build OpenCV for you.

Once the installation is finished you should find the OpenCV installation under /usr/local/Cellar/opencv/<VERSION>. At the time of writing the latest version of OpenCV on Homebrew is 4.5.0.

OpenCV installation path

OpenCV is now installed on our machine. However it is not yet linked to our Xcode project.

3. Linking OpenCV

In this step we will link OpenCV to our MyOpenCVAlgorithm command line tool target. Here is what we need to do to link OpenCV:

  1. Link the OpenCV library modules (Other Linker Flags)
  2. Tell Xcode where the library modules live or rather where to look for them (Library Search Paths)
  3. Tell Xcode where to find the public interfaces for the functionality contained within the library (Header Search Paths)

To find out the information we need to supply to Xcode’s build settings, we’ll make use of pkg-config.

First let’s install pkg-config. Run the following command:

brew install pkg-config

Once the installation is complete run the following command:

export PKG_CONFIG_PATH=/usr/local/lib/pkgconfig

OpenCV installation will already have placed a file in /usr/local/lib/pkgconfig.

OpenCV package config file

Now we can use pkg-config to query the configuration settings for OpenCV. Run the following command:

pkg-config --libs-only-l opencv4 | pbcopy

The above command outputs the OpenCV libraries list and copies it to our clipboard, ready to be pasted.
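The copied value should look roughly like the following; the exact list of modules and their order depend on your OpenCV build, and the real list is considerably longer:

-lopencv_core -lopencv_imgproc -lopencv_imgcodecs -lopencv_videoio -lopencv_highgui ...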

Next let’s link OpenCV to our project. Open or go back to the project in Xcode. Open the project navigator (from menu select View > Navigators > Project) then select the MyOpenCVAlgorithm project configuration file (the one with the blue icon).

Open project navigator on the left hand side pane
Select the project configuration file from the project navigator pane

Next select MyOpenCVAlgorithm under “Targets”. Then select the Build Settings tab.

Search for “Other Linker Flags” and paste the value from the clipboard (⌘V) into the Other Linker Flags setting.

Linking OpenCV library

That’s the OpenCV library linked. However for now we have only told Xcode the names of the modules we want to link to but not where to find them. Let’s fix that.

First let’s find out where the OpenCV modules live. Run the following command:

pkg-config --libs-only-L opencv4 | cut -c 3- | pbcopy

Again we have asked pkg-config for the answer and copied the value to the clipboard. Search for “Library Search Paths” and paste (⌘V) the value from the clipboard into that setting.
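The pasted value should be a single directory path pointing at the installation from step 2, something like the following (the exact version folder depends on your installation):

/usr/local/Cellar/opencv/<VERSION>/lib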

Settings Library search paths in Xcode

So far we have linked the OpenCV code to our command line app. However to consume the code we need to tell Xcode where OpenCV’s public interface is, or in C++ terms, where the header files are located.

Again let’s ask pkg-config for the location. Run the following command:

pkg-config --cflags opencv4 | cut -c 3- | pbcopy

Next let’s tell Xcode where the header files are. In build settings search for “Header Search Paths” and paste in the value from the clipboard.
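Again the value should point at the installation from step 2, typically a single include path such as the following (the version folder depends on your installation):

/usr/local/Cellar/opencv/<VERSION>/include/opencv4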

Setting Header search paths in Xcode

Let’s run the app. From menu select Product > Run. You should encounter an error such as the following:

dyld: Library not loaded:
/usr/local/opt/gcc/lib/gcc/10/libgfortran.5.dylib
Referenced from: /usr/local/opt/openblas/lib/libopenblas.0.dylib
Reason: no suitable image found. Did find:
/usr/local/opt/gcc/lib/gcc/10/libgfortran.5.dylib: code signature in
...

This happens because by default apps built through Xcode have codesigning policies enabled: all of the app’s code, including its dependencies, must be codesigned by the same developer. Let’s disable that requirement for its dependencies.

Select the MyOpenCVAlgorithm project configuration file in the project navigator. Then from the main pane select MyOpenCVAlgorithm under “Targets” and select the “Signing & Capabilities” tab. Scroll to the “Hardened Runtime” section and remove the whole section.

Note: Hardened Runtime is intended to protect the runtime integrity of your app. If you intend to distribute your application I would recommend that you do not delete this section and instead invest time and effort in securing your app. For this post the intention is only to run the tool locally while developing your OpenCV C++ algorithm.

That’s it! OpenCV is ready to be consumed.

4. Reading images from disk

In this step we’ll use OpenCV to read an image from disk and display it to the screen.

Open main.cpp and add the following line after #include <iostream>:

#include <opencv2/opencv.hpp>

Now OpenCV functionality is available to main.cpp. Next let’s add a function to read an image from disk and display it on screen using OpenCV. Add the following function above the main function:

int showImageFromDisk(std::string imagePath) {
    cv::Mat image = cv::imread(imagePath);
    // process image
    cv::imshow("Image from disk", image);
    cv::waitKey();
    return 0;
}

In the function above we read the image from disk using OpenCV’s imread function and store it in an OpenCV image object (cv::Mat image). You can perform any image processing on this object.
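As an example of what could go where the // process image comment sits, here is a minimal sketch that converts the loaded image to grayscale before displaying it (the grayscale step is purely illustrative and not part of the original algorithm):

cv::Mat image = cv::imread(imagePath);
// Illustrative processing step: convert the image to grayscale.
cv::Mat gray;
cv::cvtColor(image, gray, cv::COLOR_BGR2GRAY);
// Display the processed image instead of the original.
cv::imshow("Image from disk", gray);
cv::waitKey();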

Next change the main function to the following:

int main(int argc, const char * argv[]) {
    if (argc == 3) {
        std::string readType(argv[1]);
        std::string filePath(argv[2]);
        if (readType == "--image") {
            return showImageFromDisk(filePath);
        }
    }
    return 0;
}

Here we’re just passing the supplied image path to the showImageFromDisk function.

One last step: let’s provide an image to our command line tool. From menu select Product > Scheme > Edit Scheme…

Then select Run > Arguments and add the following argument:

--image <PATH_TO_IMAGE>

Run the app by selecting from menu Product > Run.

Note: I added the OpenCV logo image to the project and set the image path using the Xcode project path variable SRCROOT.
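For example, if the image added to the project were named opencv_logo.png (a placeholder name, use whatever image you added) and placed at the root of the project folder, the run argument could look like this:

--image $(SRCROOT)/opencv_logo.png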

5. Reading video from disk

In this step we’ll learn how to stream the frames from a video to the user using OpenCV.

First let’s add a function to handle playing the video. Add the following function below the showImageFromDisk function:

int showVideoFromDisk(std::string videoPath) {
    cv::VideoCapture videoCapture(videoPath);
    if (!videoCapture.isOpened()) {
        std::cout << "Error opening video stream or file" << std::endl;
        return -1;
    }
    while (true) {
        cv::Mat frame;
        videoCapture >> frame;
        if (frame.empty()) break;
        // process frame here
        cv::imshow("Video frame", frame);
        if (cv::waitKey(10) == 27) break;
    }
    videoCapture.release();
    cv::waitKey();
    return 0;
}

Above we are making use of OpenCV’s VideoCapture class to load and stream the video file frames. Again you can perform any image processing on the frame once you grab it and before displaying it.
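For example, the // process frame here placeholder could hold a per-frame step such as edge detection. A minimal sketch of the loop body (the Canny edge detection is purely illustrative):

cv::Mat frame;
videoCapture >> frame;
if (frame.empty()) break;
// Illustrative processing step: detect edges in the frame.
cv::Mat gray, edges;
cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
cv::Canny(gray, edges, 100, 200);
// Display the edge map instead of the raw frame.
cv::imshow("Video frame", edges);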

Next let’s call that function from the main function. Change your main function to the following:

int main(int argc, const char * argv[]) {
    if (argc == 3) {
        std::string readType(argv[1]);
        std::string filePath(argv[2]);
        if (readType == "--image") {
            return showImageFromDisk(filePath);
        } else if (readType == "--video") {
            return showVideoFromDisk(filePath);
        }
    }
    return 0;
}

Above we are now calling showVideoFromDisk when the first run argument is set to --video, supplying the second run argument as the video file path.

Next let’s add the video run argument to the run scheme. Grab any video path you have on disk. Once again let’s open the run scheme by selecting from menu Product > Scheme > Edit Scheme…

Then select Run > Arguments. Add another argument:

--video <PATH_TO_VIDEO_FILE>

Make sure to uncheck the --image run argument from the previous step.

That’s it! Run the app by selecting from menu Product > Run.

Video playing

If you’d like to use the same video as I did you can find it at videvo.net (Royalty Free).

6. Streaming the camera feed

In this step we’ll use OpenCV functionality to stream the camera feed back to the user.

Open main.cpp and add the following function below showVideoFromDisk:

int streamWebcamFeed() {
    cv::VideoCapture videoCapture(0);
    if (!videoCapture.isOpened()) {
        std::cout << "Unable to connect to webcam" << std::endl;
        return -1;
    }
    while (true) {
        cv::Mat frame;
        videoCapture >> frame;
        if (frame.empty()) break;
        cv::imshow("Camera feed", frame);
        if (cv::waitKey(10) == 27) break;
    }
    videoCapture.release();
    return 0;
}

The code above is very similar to the showVideoFromDisk function with one exception: instead of supplying the video file path we supply the integer zero, which tells OpenCV to try to connect to the default camera.
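VideoCapture also lets you pick a different camera by passing another device index, and request capture properties such as the frame size. A hedged sketch (the 1280x720 resolution is just an example; the camera may ignore values it does not support):

// Index 0 is the default camera; use 1, 2, ... for additional cameras.
cv::VideoCapture videoCapture(0);
// Optionally request a capture resolution.
videoCapture.set(cv::CAP_PROP_FRAME_WIDTH, 1280);
videoCapture.set(cv::CAP_PROP_FRAME_HEIGHT, 720);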

Next let’s show the camera feed when no arguments are supplied. Change the main function to the following:

int main(int argc, const char * argv[]) {
    if (argc == 3) {
        std::string readType(argv[1]);
        std::string filePath(argv[2]);
        if (readType == "--image") {
            return showImageFromDisk(filePath);
        } else if (readType == "--video") {
            return showVideoFromDisk(filePath);
        } else {
            return streamWebcamFeed();
        }
    } else {
        return streamWebcamFeed();
    }
    return 0;
}

Above we run the camera feed when no arguments, or an unrecognised set of arguments, are supplied.

Next open the Run scheme by navigating to Product > Scheme > Edit Scheme…, then Run > Arguments, and uncheck both run arguments.

Run the app by selecting from menu Product > Run. You’ll get an error like the following:

This app has crashed because it attempted to access privacy-sensitive data without a usage description.  The app's Info.plist must contain an NSCameraUsageDescription key with a string value explaining to the user how the app uses this data.

In order to access the camera device on macOS you must include an Info.plist file declaring that your app requires camera access, along with a description that is displayed to the user when requesting permission to use the camera.

Let’s create an Info.plist. From menu select File > New > File…

Create a new file

When prompted “Choose a template for your new file:” search and select “Property List”. Then click Next.

Name the file Info. Then click Create.

Then add a new entry to the Info.plist file. Add NSCameraUsageDescription as the key and “Required to run the OpenCV algorithm” as the value.

Note: after entering NSCameraUsageDescription as the key, the displayed text will automatically change to Privacy - Camera Usage Description.

The value of NSCameraUsageDescription will be displayed to the user upon requesting camera permission.
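If you prefer to edit the file as source code (right-click Info.plist and select Open As > Source Code), the entry should look roughly like this:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
    <key>NSCameraUsageDescription</key>
    <string>Required to run the OpenCV algorithm</string>
</dict>
</plist>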

Now we have to tell Xcode that we have an Info.plist file and its location. Open the project configuration. Then select MyOpenCVAlgorithm under “Targets”. Then select “Build Settings” tab. Search for “Info.plist”. We’ll need to change two settings here.

First change the value for “Info.plist File” to MyOpenCVAlgorithm/Info.plist.

Next change “Create Info.plist Section in Binary” to Yes.

One last thing is to copy the Info.plist to the path where the compiled command line tool will live. Open the Build Phases tab and add a new Run Script phase.

Name the run script Copy Info.plist. Then change the contents to:

cp ${PROJECT_DIR}/${INFOPLIST_FILE} ${CONFIGURATION_BUILD_DIR}/

Run the app again (Product > Run). On the first run you’ll be asked to grant camera permission. Click allow.

However notice that the app stops executing. Run the app once more and you’ll see the camera feed displayed. This is expected behaviour: the OpenCV VideoCapture class does not manage permissions, so if the camera permission is denied or has not yet been granted OpenCV will return an error. Once the first run has completed and camera permission has been allowed you shouldn’t encounter this problem anymore.

Run the app again (Product > Run).

OpenCV displaying camera feed

If by mistake you deny the camera permission to the app you can change it in System Preferences > Security & Privacy > Privacy > Camera by checking the box for your app.

That’s it! You are now ready to develop your OpenCV C++ algorithm in Xcode.

7. Running the tool from the command line

So far we have been running the C++ algorithm through Xcode. However we can also run the tool from the command line. This step will teach you how.

First change to the root directory of your project in Terminal, i.e.:

cd ~/PATH/TO/MyOpenCVAlgorithm

Next let’s build the command line tool. Run the following command:

xcodebuild -scheme MyOpenCVAlgorithm -derivedDataPath .derived_data

The built executable will be placed under .derived_data/Build/Products/Debug.

Executable location built through Terminal

Next let’s run the executable by running the following command:

./.derived_data/Build/Products/Debug/MyOpenCVAlgorithm

If you are running the tool through Terminal for the first time you might encounter a prompt requesting camera permission for Terminal. Grant access and then run the command again.

Terminal requesting camera permission
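You can also pass the same run arguments we used earlier to exercise the other modes from Terminal, for example:

./.derived_data/Build/Products/Debug/MyOpenCVAlgorithm --image <PATH_TO_IMAGE>
./.derived_data/Build/Products/Debug/MyOpenCVAlgorithm --video <PATH_TO_VIDEO_FILE>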

That’s it!

Summary

In this post we have learnt:

  • How to create a C++ command line tool project in Xcode
  • How to install OpenCV on macOS
  • How to display images from disk to the user using OpenCV
  • How to stream frames from a video on disk using OpenCV
  • How to stream the camera feed using OpenCV
  • How to build and run the algorithm from Xcode
  • How to build and run the algorithm from Terminal

Final Notes

You can find the full source code in my GitHub repositories.

OpenCV is a great library if you are solving computer vision problems for Android and iOS. It’s cross-platform and allows your team to build a single solution with similar performance across platforms, rather than building separate platform-specific solutions.

However C++ and OpenCV require a lot of learning for the average mobile developer. Make sure you have carefully considered other options before using OpenCV.

For more on iOS and software development follow me on Twitter or Medium!
