
iOS 13 Image Capture API for accessing external camera's filesystem?

On Apple's iOS 13 feature list page, they have the following blurb:

Image Capture API

The Image Capture API allows developers to leverage the Camera Connection Kit to import photos directly into their apps.

I've been looking, but I can't seem to find any actual documentation about this change or where it exists in the API. I also remember hearing it mentioned for a second or two in the keynote/State of the Union at WWDC 19, but again, no details in any session I've found so far.

It seems like you would be able to plug a camera or its SD card into the USB-C/Lightning port on the iOS device and access it from within a 3rd party app. I know you can import to the system photo library, but that has been around for years. I also know about the ExternalAccessory framework for MFi hardware, but I don't see any significant changes to it, and it doesn't seem to expose the described functionality.

I do see that UIDocumentPicker can be shown and allows the user to select a location that may be on a connected USB device. While that could work, it's not camera specific and would be quite error prone if the user doesn't select a valid camera location.

Does anybody know where I can find more info about this change or how you can programmatically access the camera's filesystem? The camera will have the standard DCIM folder structure, so many Mac apps recognize it as a camera filesystem.

jamone asked Jul 09 '19


1 Answer

You're looking for the ImageCaptureCore framework. This is the same framework that exists on macOS for importing from SD cards and cameras. It is now available in iOS 13.2.

Update:

The ImageCaptureCore API is now working as of iOS 13.2.

However, be warned that as of iOS/iPadOS 13.1 Beta 3 (17A5837a) I have not been able to get it working yet (reported to Apple as FB6799036). It is now listed with an asterisk on the iPadOS Features page, indicating that it will be "Coming later this year".

I'm able to start an ICDeviceBrowser, but I see permissions errors when a device is connected and don't get any delegate messages. So there may be some permission or entitlement that is needed before it starts working.

Unfortunately there is no documentation or sample code (even for macOS) on Apple's developer site. But the framework does exist in the iOS 13 SDK and you can look at the header files there.

We use this framework in our macOS app, and figuring things out from just the headers isn't too bad. You'd start by creating an ICDeviceBrowser (ICDeviceBrowser.h), setting its delegate, and then starting the browser:

#import <ImageCaptureCore/ImageCaptureCore.h>

@interface CameraManager : NSObject <ICDeviceBrowserDelegate>
{
    ICDeviceBrowser* _deviceBrowser;
}
@end

@implementation CameraManager
- (instancetype)init
{
    self = [super init];
    if (self)
    {
        _deviceBrowser = [[ICDeviceBrowser alloc] init];
        _deviceBrowser.delegate = self;
        [_deviceBrowser start];
    }
    return self;
}
...
@end

You should then start receiving delegate messages when a camera device is connected:

- (void)deviceBrowser:(ICDeviceBrowser*)browser didAddDevice:(ICDevice*)addedDevice moreComing:(BOOL)moreComing;
- (void)deviceBrowser:(ICDeviceBrowser*)browser didRemoveDevice:(ICDevice*)removedDevice moreGoing:(BOOL)moreGoing;

When you get a didAddDevice: message you'll then want to use the ICDevice (ICDevice.h) and ICCameraDevice (ICCameraDevice.h) APIs to set a delegate and start a session. Once the session has started you'll start receiving delegate messages:

- (void)deviceBrowser:(ICDeviceBrowser*)browser didAddDevice:(ICDevice*)addedDevice moreComing:(BOOL)moreComing
{
    // Check the device type mask so we only handle cameras (not scanners, etc.)
    if ((addedDevice.type & ICDeviceTypeMaskCamera) == ICDeviceTypeCamera)
    {
        ICCameraDevice* camera = (ICCameraDevice *) addedDevice;
        camera.delegate = self;
        [camera requestOpenSession];
        //  probably want to save 'camera' to a member variable
    }
}
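
Note that to receive these callbacks your class also needs to adopt ICCameraDeviceDelegate (which inherits from ICDeviceDelegate). As a minimal sketch, based on the declarations in ICDevice.h, the session-open callback could look like:

// Called in response to requestOpenSession (declared in ICDeviceDelegate).
- (void)device:(ICDevice *)device didOpenSessionWithError:(NSError *)error
{
    if (error != nil)
    {
        NSLog(@"Could not open a session on %@: %@", device.name, error);
        return;
    }
    // The session is open; enumeration callbacks such as
    // cameraDevice:didAddItems: should follow.
}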

You can use the delegate method:

- (void)cameraDevice:(nonnull ICCameraDevice *)camera
         didAddItems:(nonnull NSArray<ICCameraItem *> *)items;

To get a list of items as they are enumerated by the API or wait for:

- (void)deviceDidBecomeReadyWithCompleteContentCatalog:(ICDevice*)device;

And then use the .contents property on the ICCameraDevice to get all of the contents.
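
As a rough sketch of that step (assuming your class is the camera's delegate, and matching the (ICDevice*) parameter used in the header declaration above), walking the completed catalog might look like:

// Called once the camera has finished enumerating its contents.
- (void)deviceDidBecomeReadyWithCompleteContentCatalog:(ICDevice *)device
{
    ICCameraDevice* camera = (ICCameraDevice *) device;

    // .contents holds the top-level items; folders are ICCameraFolder
    // instances with their own nested contents, files are ICCameraFile.
    for (ICCameraItem* item in camera.contents)
    {
        if ([item isKindOfClass:[ICCameraFile class]])
            NSLog(@"Found file: %@", item.name);
        else
            NSLog(@"Found folder: %@", item.name);
    }
}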

From there you can use the ICCameraDevice to request thumbnails and metadata, and to download specific files. I'll leave that as an exercise for the reader.
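
For what it's worth, here is a hedged sketch of the download step, using the option keys and selector-based callback from the macOS ImageCaptureCore headers (verify they behave the same in the iOS 13 SDK; the downloadFile:fromCamera: wrapper is just an illustrative helper, and self is assumed to adopt ICCameraDeviceDownloadDelegate):

// Ask the camera to download one file into our Documents directory.
- (void)downloadFile:(ICCameraFile *)file fromCamera:(ICCameraDevice *)camera
{
    NSURL* documentsURL = [[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                                 inDomains:NSUserDomainMask].firstObject;
    NSDictionary* options = @{ ICDownloadsDirectoryURL : documentsURL,
                               ICOverwrite : @YES };

    [camera requestDownloadFile:file
                        options:options
               downloadDelegate:self
            didDownloadSelector:@selector(didDownloadFile:error:options:contextInfo:)
                    contextInfo:NULL];
}

// Invoked by ImageCaptureCore when the download finishes (or fails).
- (void)didDownloadFile:(ICCameraFile *)file
                  error:(NSError *)error
                options:(NSDictionary *)options
            contextInfo:(void *)contextInfo
{
    if (error != nil)
    {
        NSLog(@"Download of %@ failed: %@", file.name, error);
        return;
    }
    NSLog(@"Downloaded %@", file.name);
}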

As I mentioned above, this doesn't seem to be working in iOS/iPadOS 13.1 Beta 3. Hopefully this will all start working soon, as I'd really like to start testing it myself.

This is now working in iOS 13.2.

Cutterpillow answered Oct 03 '22