I've integrated OpenCV in a Swift iOS project using a bridging header (to connect Swift to Objective-C) and an Objective-C wrapper (to connect Objective-C to C++). With this method I can pass single images from the Swift code, analyse them in the C++ files, and get them back.
I've seen that OpenCV provides a CvVideoCamera object that can be integrated with an Objective-C UIViewController.
But since my UIViewControllers are written in Swift, I wondered: is this possible as well?
In order to use C++ inside Objective-C (OpenCV is written in C++, and C++ cannot interface directly with Swift), you need to change the wrapper's file extension from OpenCVWrapper.m to OpenCVWrapper.mm.
OpenCV is a framework written in C++. Apple's documentation tells us that you cannot import C++ code directly into Swift. Instead, create an Objective-C or C wrapper for the C++ code.
OpenCV is an open-source library that contains functions aimed at real-time computer vision. In this post I will show you how to use OpenCV in an iOS app: we will create an app that detects the road lane in which the user is driving.
This is an update to my initial answer after I had a chance to play with this myself.
Yes, it is possible to use CvVideoCamera with a view controller written in Swift. If you just want to use it to display video from the camera in your app, it's really easy: all you need is
#import <opencv2/highgui/cap_ios.h>
via the bridging header. Then, in your view controller:
class ViewController: UIViewController, CvVideoCameraDelegate {
    ...
    var myCamera : CvVideoCamera!

    override func viewDidLoad() {
        ...
        myCamera = CvVideoCamera(parentView: imageView)
        myCamera.delegate = self
        myCamera.start()   // the stream does not display until the camera is started
        ...
    }
}
The ViewController cannot actually conform to the CvVideoCameraDelegate protocol, but CvVideoCamera won't work without a delegate, so we work around this problem by declaring ViewController to adopt the protocol without implementing any of its methods. This will trigger a compiler warning, but the video stream from the camera will be displayed in the image view.
Of course, you might want to implement the CvVideoCameraDelegate's (only) processImage() method to process video frames before displaying them. The reason you cannot implement it in Swift is that it uses a C++ type, Mat.
So, you will need to write an Objective-C++ class whose instance can be set as the camera's delegate. The processImage() method in that Objective-C++ class will be called by CvVideoCamera and will in turn call code in your Swift class. Here are some sample code snippets.
In OpenCVWrapper.h:
// Need this ifdef, so the C++ header won't confuse Swift
#ifdef __cplusplus
#import <opencv2/opencv.hpp>
#endif
// This is a forward declaration; we cannot include *-Swift.h in a header.
@class ViewController;
@interface CvVideoCameraWrapper : NSObject
...
-(id)initWithController:(ViewController*)c andImageView:(UIImageView*)iv;
...
@end
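One consequence of the forward declaration: the wrapper's .mm file must import the Xcode-generated Swift interface header so it can actually see the Swift ViewController class. A minimal sketch; the header name is derived from your product module name, so "MyProject-Swift.h" here is an assumption:
// At the top of OpenCVWrapper.mm; "MyProject" stands for your product module name.
#import "MyProject-Swift.h"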
In the wrapper implementation, OpenCVWrapper.mm (it's an Objective-C++ class, hence the .mm extension):
#import <opencv2/highgui/cap_ios.h>
using namespace cv;
// Class extension to adopt the delegate protocol
@interface CvVideoCameraWrapper () <CvVideoCameraDelegate>
{
}
@end
@implementation CvVideoCameraWrapper
{
    ViewController * viewController;
    UIImageView * imageView;
    CvVideoCamera * videoCamera;
}
-(id)initWithController:(ViewController*)c andImageView:(UIImageView*)iv
{
    self = [super init];   // initialize the NSObject superclass first
    if (self) {
        viewController = c;
        imageView = iv;
        videoCamera = [[CvVideoCamera alloc] initWithParentView:imageView];
        // ... set up the camera (see the sketch after this class)
        ...
        videoCamera.delegate = self;
    }
    return self;
}
// This #ifdef ... #endif is not needed except in special situations
#ifdef __cplusplus
- (void)processImage:(Mat&)image
{
    // Do some OpenCV stuff with the image
    ...
}
#endif
...
@end
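For completeness, here is a sketch of what the camera setup placeholder above might contain, plus a way to start the stream. The defaultAVCapture* properties, defaultFPS, and -start are declared in cap_ios.h; the particular values and the -startCamera method name are assumptions for illustration:
// Inside -initWithController:andImageView:, where the setup placeholder is:
videoCamera.defaultAVCaptureDevicePosition = AVCaptureDevicePositionBack;
videoCamera.defaultAVCaptureSessionPreset = AVCaptureSessionPreset640x480;
videoCamera.defaultAVCaptureVideoOrientation = AVCaptureVideoOrientationPortrait;
videoCamera.defaultFPS = 30;

// A hypothetical wrapper method (declare it in OpenCVWrapper.h) that the
// Swift side can call, e.g. from viewDidAppear; nothing is displayed
// until the camera is started.
-(void)startCamera
{
    [videoCamera start];
}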
Then you put #import "OpenCVWrapper.h" in the bridging header, and the Swift view controller might look like this:
class ViewController: UIViewController {
    ...
    var videoCameraWrapper : CvVideoCameraWrapper!

    override func viewDidLoad() {
        ...
        self.videoCameraWrapper = CvVideoCameraWrapper(controller: self, andImageView: imageView)
        ...
    }
}
See https://developer.apple.com/library/ios/documentation/Swift/Conceptual/BuildingCocoaApps/MixandMatch.html about forward declarations and Swift/C++/Objective-C interop. There is plenty of info on the web about #ifdef __cplusplus and extern "C" (if you need it).
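For illustration, the usual extern "C" pattern looks like this; the header and function names are hypothetical. It makes a declaration usable from C, Objective-C, and C++ translation units alike, with C linkage:
// Shared.h (hypothetical): includable from C, Objective-C, and C++ files
#ifdef __cplusplus
extern "C" {
#endif

void sharedHelper(void);   // always compiled and linked as a plain C symbol

#ifdef __cplusplus
}
#endif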
In the processImage() delegate method you will likely need to interact with some OpenCV API, for which you will also have to write wrappers. You can find some info on that elsewhere, for example here: Using OpenCV in Swift iOS
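To make that concrete, here is a minimal sketch of a processImage: body. CvVideoCamera hands you each frame as a cv::Mat reference and displays whatever the method leaves in it; the grayscale round-trip is just an arbitrary example of calling the OpenCV API:
- (void)processImage:(cv::Mat&)image
{
    // Convert the frame to grayscale and back so the displayed image
    // keeps its original channel count.
    cv::Mat gray;
    cv::cvtColor(image, gray, cv::COLOR_BGR2GRAY);
    cv::cvtColor(gray, image, cv::COLOR_GRAY2BGR);
}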
Update 09/03/2019
At the community's request (see comments), the sample code has been placed on GitHub at https://github.com/aperedera/opencv-swift-examples.
Also, the current version (as of this writing) of the OpenCV iOS framework no longer allows Swift code to use the header (now it's in videoio/cap_ios.h) that declares the CvVideoCameraDelegate protocol, so you cannot just include it in the bridging header and declare the view controller to conform to the protocol to simply display camera video in your app.