I can't find any API to capture live photos. Did I miss something?
Apple released docs:
Live Photos
Live Photos is a new feature of iOS 9 that allows users to capture and relive their favorite moments with richer context than traditional photos. When the user presses the shutter button, the Camera app captures much more content along with the regular photo, including audio and additional frames before and after the photo. When browsing through these photos, users can interact with them and play back all the captured content, making the photos come to life.
iOS 9.1 introduces APIs that allow apps to incorporate playback of Live Photos, as well as export the data for sharing. There is new support in the Photos framework to fetch a PHLivePhoto object from the PHImageManager object, which is used to represent all the data that comprises a Live Photo. You can use a PHLivePhotoView object (defined in the PhotosUI framework) to display the contents of a Live Photo. The PHLivePhotoView view takes care of displaying the image, handling all user interaction, and applying the visual treatments to play back the content.
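For reference, here is a minimal sketch of that fetch-and-display flow (untested; the asset query, view setup, and authorization handling are illustrative, and photo library access is assumed to be granted):

```swift
import UIKit
import Photos
import PhotosUI

class LivePhotoViewController: UIViewController {
    let livePhotoView = PHLivePhotoView()

    override func viewDidLoad() {
        super.viewDidLoad()
        livePhotoView.frame = view.bounds
        view.addSubview(livePhotoView)

        // Illustrative query: the most recent Live Photo in the library.
        // (Assumes PHPhotoLibrary authorization has already been granted.)
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: false)]
        options.predicate = NSPredicate(format: "(mediaSubtypes & %d) != 0",
                                        PHAssetMediaSubtype.photoLive.rawValue)
        guard let asset = PHAsset.fetchAssets(with: .image, options: options).firstObject else {
            return
        }

        // Ask PHImageManager for the PHLivePhoto. The handler may run more
        // than once (first with a degraded version, then the final one).
        PHImageManager.default().requestLivePhoto(
            for: asset,
            targetSize: view.bounds.size,
            contentMode: .aspectFit,
            options: nil
        ) { [weak self] livePhoto, _ in
            self?.livePhotoView.livePhoto = livePhoto
            // PHLivePhotoView already handles the press-to-play gesture;
            // startPlayback(with:) triggers playback programmatically.
            self?.livePhotoView.startPlayback(with: .full)
        }
    }
}
```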
You can also use PHAssetResource to access the data of a PHLivePhoto object for sharing purposes. You can request a PHLivePhoto object for an asset in the user’s photo library by using PHImageManager or UIImagePickerController. If you have a sharing extension, you can also get PHLivePhoto objects by using NSItemProvider. On the receiving side of a share, you can recreate a PHLivePhoto object from the set of files originally exported by the sender.
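And on the sharing side, a rough sketch of pulling the underlying files out of a PHLivePhoto via PHAssetResource (the destination directory, file naming, and error handling here are just illustrative):

```swift
import Photos

/// Writes each resource of a Live Photo (the still image plus the paired
/// video) into `directory` so the files can be handed to a share target.
func exportResources(of livePhoto: PHLivePhoto, to directory: URL) {
    for resource in PHAssetResource.assetResources(for: livePhoto) {
        let destination = directory.appendingPathComponent(resource.originalFilename)
        PHAssetResourceManager.default().writeData(
            for: resource,
            toFile: destination,
            options: nil
        ) { error in
            if let error = error {
                print("Export failed for \(resource.originalFilename): \(error)")
            }
        }
    }
}
```

The receiver can then rebuild a PHLivePhoto from that same set of files, as the docs describe.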
During the keynote, they mentioned that Facebook will support Live Photos, so I suspect there must be a way to capture them.
UIImagePickerController looks like it will allow the capture of live photos.
Working with Live Photos
Live Photos is a Camera app feature on supported devices, enabling a picture to be not just a single moment in time but to include motion and sound from the moments just before and after its capture. A PHLivePhoto object represents a Live Photo, and the PHLivePhotoView class provides a system-standard, interactive user interface for displaying a Live Photo and playing back its content.

Live Photos are still photos. When you use an image picker controller to capture or choose still images (by including only the kUTTypeImage type in the mediaTypes array), assets that were captured as Live Photos still appear in the picker. However, when the user chooses an asset, your delegate object receives only a UIImage object containing a still-image representation of the Live Photo.

To obtain the full motion and sound content when the user captures or chooses a Live Photo with the image picker, you must include both the kUTTypeImage and kUTTypeLivePhoto types in the mediaTypes array. For more information, see UIImagePickerControllerLivePhoto in UIImagePickerControllerDelegate Protocol Reference.
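Putting that together, a minimal sketch of a picker that receives the full Live Photo (Swift 3-era key names matching the docs above; the view controller wiring is illustrative):

```swift
import UIKit
import Photos
import MobileCoreServices

class PickerViewController: UIViewController,
    UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func presentLivePhotoPicker() {
        let picker = UIImagePickerController()
        picker.sourceType = .photoLibrary
        // Both types are required: with kUTTypeImage alone you only get
        // the still-image representation of a Live Photo.
        picker.mediaTypes = [kUTTypeImage as String, kUTTypeLivePhoto as String]
        picker.delegate = self
        present(picker, animated: true)
    }

    func imagePickerController(_ picker: UIImagePickerController,
                               didFinishPickingMediaWithInfo info: [String : Any]) {
        if let livePhoto = info[UIImagePickerControllerLivePhoto] as? PHLivePhoto {
            // The full motion-and-sound content is available here.
            print("Picked a Live Photo: \(livePhoto)")
        } else if let image = info[UIImagePickerControllerOriginalImage] as? UIImage {
            print("Picked a still image: \(image)")
        }
        picker.dismiss(animated: true)
    }
}
```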
https://developer.apple.com/library/prerelease/ios/documentation/UIKit/Reference/UIImagePickerController_Class/index.html#//apple_ref/occ/cl/UIImagePickerController