I am using AVFoundation in Swift to take pictures, but I can't convert some lines of this func from Objective-C to Swift. My func code is:
- (void)capImage { // method to capture image from AVCaptureSession video feed
    AVCaptureConnection *videoConnection = nil;
    for (AVCaptureConnection *connection in stillImageOutput.connections) {
        for (AVCaptureInputPort *port in [connection inputPorts]) {
            if ([[port mediaType] isEqual:AVMediaTypeVideo]) {
                videoConnection = connection;
                break;
            }
        }
        if (videoConnection) {
            break;
        }
    }

    NSLog(@"about to request a capture from: %@", stillImageOutput);
    [stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
        if (imageSampleBuffer != NULL) {
            NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageSampleBuffer];
            [self processImage:[UIImage imageWithData:imageData]];
        }
    }];
}
This line gives me the error "AnyObject[] does not conform to protocol Sequence":
for (AVCaptureInputPort *port in [connection inputPorts]) {
In Swift:
for port:AnyObject in connection.inputPorts {
And I don't know how to convert this line:
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler: ^(CMSampleBufferRef imageSampleBuffer, NSError *error) {
Can you help me convert it to Swift? Thanks!
Create a Swift class for your corresponding Objective-C .m and .h files by choosing File > New > File > (iOS, watchOS, tvOS, or macOS) > Source > Swift File. You can use the same name as your Objective-C class or a different one.
Xcode automatically generates the headers that bridge Objective-C and Swift, but it makes assumptions about naming conventions that can create mismatches. The generated header can't be edited manually, and it sometimes takes several clean/build cycles to update.
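Once the Objective-C header is listed in the bridging header, the class can be called from Swift like any other type. A tiny sketch, assuming the Objective-C class above is named CameraController (a hypothetical name):

    // The bridging header would contain: #import "CameraController.h"
    let camera = CameraController()
    camera.capImage() // calls the Objective-C method shown above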
for (AVCaptureInputPort *port in [connection inputPorts]) {
Arrays of AnyObject should be cast to arrays of your actual type before iterating, like this:
for port in connection.inputPorts as! [AVCaptureInputPort] { ... }
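Putting that cast back into the original loop, a sketch of the connection-finding code in Swift (stillImageOutput is assumed to be the same AVCaptureStillImageOutput property as in the question):

    // Find the connection whose input port carries video
    var videoConnection: AVCaptureConnection?
    for connection in stillImageOutput.connections as! [AVCaptureConnection] {
        for port in connection.inputPorts as! [AVCaptureInputPort] {
            if port.mediaType == AVMediaTypeVideo {
                videoConnection = connection
                break
            }
        }
        if videoConnection != nil { break }
    }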
As for converting Objective-C blocks to Swift closures, you just have to get the syntax right.
stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
    (imageSampleBuffer, error) in // this line names the closure's parameters
    //...
}
Note that this also uses Trailing Closure Syntax. Do read up on the docs more!
EDIT: In terms of initializers, they now look like this:
let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
self.processImage(UIImage(data: imageData))
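Putting the pieces together, the whole method could look roughly like this in Swift. This is a sketch against the same Swift 1.x-era, pre-iOS-10 AVCaptureStillImageOutput API used in the question; stillImageOutput and processImage are assumed to exist on your class:

    func capImage() {
        // Find the connection that carries video, as in the loop above
        var videoConnection: AVCaptureConnection?
        for connection in stillImageOutput.connections as! [AVCaptureConnection] {
            for port in connection.inputPorts as! [AVCaptureInputPort] {
                if port.mediaType == AVMediaTypeVideo {
                    videoConnection = connection
                    break
                }
            }
            if videoConnection != nil { break }
        }

        println("about to request a capture from: \(stillImageOutput)")
        stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection) {
            (imageSampleBuffer, error) in
            if imageSampleBuffer != nil {
                // Convert the sample buffer to JPEG data, then to a UIImage
                let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageSampleBuffer)
                if let image = UIImage(data: imageData) {
                    self.processImage(image)
                }
            }
        }
    }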
Try this
if let videoConnection = self.stillImageOutput.connectionWithMediaType(AVMediaTypeVideo) {
    self.stillImageOutput.captureStillImageAsynchronouslyFromConnection(videoConnection, completionHandler: { (buffer: CMSampleBuffer!, error: NSError!) -> Void in
        // Only proceed if the sample buffer carries EXIF metadata, i.e. an actual photo
        if let exifAttachments = CMGetAttachment(buffer, kCGImagePropertyExifDictionary, nil) {
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(buffer)
            self.previewImage.image = UIImage(data: imageData)
            // Save the captured photo to the photo library
            UIImageWriteToSavedPhotosAlbum(self.previewImage.image, nil, nil, nil)
        }
    })
}
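Note that this assumes stillImageOutput is already configured and attached to a running AVCaptureSession, and that previewImage is an image view outlet. A minimal setup sketch, using the same pre-iOS-10 API (property and method names here are illustrative, not from the question):

    // Properties on the view controller
    let captureSession = AVCaptureSession()
    let stillImageOutput = AVCaptureStillImageOutput()

    func setupCamera() {
        captureSession.sessionPreset = AVCaptureSessionPresetPhoto

        // Attach the default camera as the video input
        let device = AVCaptureDevice.defaultDeviceWithMediaType(AVMediaTypeVideo)
        var error: NSError?
        if let input = AVCaptureDeviceInput(device: device, error: &error) {
            if captureSession.canAddInput(input) {
                captureSession.addInput(input)
            }
        }

        // Ask the still image output for JPEG data, then attach it to the session
        stillImageOutput.outputSettings = [AVVideoCodecKey: AVVideoCodecJPEG]
        if captureSession.canAddOutput(stillImageOutput) {
            captureSession.addOutput(stillImageOutput)
        }

        captureSession.startRunning()
    }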