Apple Live Photo file format

Apple will introduce Live Photos with iOS 9 and the iPhone 6s. Where is the file format documented?

asked Sep 10 '15 by Clay Bridges


2 Answers

A live photo has two resources. They are tied together with an asset identifier (a UUID as a string).

  1. A JPEG; this must have a metadata entry for kCGImagePropertyMakerAppleDictionary with [17 : assetIdentifier] (17 is the Apple MakerNote Asset Identifier key).
  2. A QuickTime MOV encoded with H.264 at the appropriate frame rate (12–15 fps) and size (1080p). This MOV must have:
    • A top-level QuickTime metadata entry for ["com.apple.quicktime.content.identifier" : assetIdentifier]. If using AVAsset, you can get this from asset.metadataForFormat(AVMetadataFormatQuickTimeMetadata).
    • A timed metadata track with ["com.apple.quicktime.still-image-time" : 0xFF]. The actual still-image time matches the presentation timestamp of this metadata item; the payload appears to be just a single 0xFF byte (i.e. -1) and can be ignored. If using an AVAssetReader, you can use CMSampleBufferGetOutputPresentationTimeStamp to get this time.

The assetIdentifier is what ties the two items together and the timed metadata track is what tells the system where the still image sits in the movie timeline.
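As a sketch of reading those two identifiers back out (using ImageIO for the JPEG's MakerNote and AVFoundation for the MOV; `photoURL` and `movieURL` are assumed inputs, and the synchronous metadata accessor is used for brevity):

```swift
import AVFoundation
import ImageIO

// 1. Read the asset identifier out of the JPEG's Apple MakerNote (key "17").
func photoAssetIdentifier(at photoURL: URL) -> String? {
    guard let source = CGImageSourceCreateWithURL(photoURL as CFURL, nil),
          let props = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let makerApple = props[kCGImagePropertyMakerAppleDictionary] as? [String: Any]
    else { return nil }
    return makerApple["17"] as? String
}

// 2. Read the matching content identifier from the MOV's top-level QuickTime metadata.
func movieAssetIdentifier(at movieURL: URL) -> String? {
    let asset = AVURLAsset(url: movieURL)
    let items = asset.metadata(forFormat: .quickTimeMetadata)
    return items.first { $0.identifier == .quickTimeMetadataContentIdentifier }?.stringValue
}

// The pair belongs to one Live Photo when the two identifiers are equal.
```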

answered Oct 04 '22 by russbishop

Here's the text from Apple's documentation:

Live Photos

Live Photos is a new feature of iOS 9 that allows users to capture and relive their favorite moments with richer context than traditional photos. When the user presses the shutter button, the Camera app captures much more content along with the regular photo, including audio and additional frames before and after the photo. When browsing through these photos, users can interact with them and play back all the captured content, making the photos come to life.

iOS 9.1 introduces APIs that allow apps to incorporate playback of Live Photos, as well as export the data for sharing. There is new support in the Photos framework to fetch a PHLivePhoto object from the PHImageManager object, which is used to represent all the data that comprises a Live Photo. You can use a PHLivePhotoView object (defined in the PhotosUI framework) to display the contents of a Live Photo. The PHLivePhotoView view takes care of displaying the image, handling all user interaction, and applying the visual treatments to play back the content.
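A minimal sketch of that flow, assuming `asset` is a PHAsset already fetched from the library and `livePhotoView` is a PHLivePhotoView in your view hierarchy:

```swift
import Photos
import PhotosUI

let options = PHLivePhotoRequestOptions()
options.deliveryMode = .highQualityFormat

// Ask the image manager for the PHLivePhoto backing the asset...
PHImageManager.default().requestLivePhoto(
    for: asset,
    targetSize: livePhotoView.bounds.size,
    contentMode: .aspectFill,
    options: options
) { livePhoto, _ in
    // ...and hand it to the view, which handles playback and interaction.
    livePhotoView.livePhoto = livePhoto
}
```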

You can also use PHAssetResource to access the data of a PHLivePhoto object for sharing purposes. You can request a PHLivePhoto object for an asset in the user’s photo library by using PHImageManager or UIImagePickerController. If you have a sharing extension, you can also get PHLivePhoto objects by using NSItemProvider. On the receiving side of a share, you can recreate a PHLivePhoto object from the set of files originally exported by the sender.
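As a sketch of the export side, writing each resource of a Live Photo out to files that can then be uploaded together (`asset` is an assumed PHAsset; the temporary-directory destinations are illustrative):

```swift
import Photos

// A Live Photo asset typically yields a .photo resource and a .pairedVideo resource.
let resources = PHAssetResource.assetResources(for: asset)
for resource in resources {
    let destination = FileManager.default.temporaryDirectory
        .appendingPathComponent(resource.originalFilename)
    PHAssetResourceManager.default().writeData(
        for: resource,
        toFile: destination,
        options: nil
    ) { error in
        if let error = error {
            print("Failed to export \(resource.originalFilename): \(error)")
        }
    }
}
```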

Guidelines for Displaying Live Photos

It’s important to remember that a Live Photo is still a photo. If you have to display a Live Photo in an environment that doesn’t support PHLivePhotoView, it’s recommended that you present it as a regular photo.

Don’t display the extra frames and audio of a Live Photo separately. It's important that the content of the Live Photo be presented in a consistent way that uses the same visual treatment and interaction model in all apps.

It’s recommended that you identify a photo as a Live Photo by placing the badge provided by the PHLivePhotoView class method livePhotoBadgeImageWithOptions:PHLivePhotoBadgeOptionsOverContent in the top-left corner of the photo.
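As a sketch, the badge can be fetched from that PHLivePhotoView class method and overlaid on your own image view (the placement code is illustrative; `photoView` is an assumed existing view):

```swift
import PhotosUI
import UIKit

// Fetch the standard Live Photo badge, drawn for display over photo content.
let badge = PHLivePhotoView.livePhotoBadgeImage(options: .overContent)

// Hypothetical placement: pin the badge to the top-left corner of the photo.
let badgeView = UIImageView(image: badge)
badgeView.frame.origin = CGPoint(x: 8, y: 8)
photoView.addSubview(badgeView)
```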

Note that there is no support for providing the visual effect that users experience as they swipe through photos in the Photos app.

Guidelines for Sharing Live Photos

The data of a Live Photo is exported as a set of files in a PHAssetResource object. The set of files must be preserved as a unit when you upload them to a server. When you rebuild a PHLivePhoto with these files on the receiver side, the files are validated; loading fails if the files don’t come from the same asset.
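On the receiving side, a sketch of rebuilding the Live Photo from an exported pair (`photoURL` and `movieURL` are assumed to point at files from the same asset; `.zero` is commonly passed as the target size to request the default size):

```swift
import Photos

PHLivePhoto.request(
    withResourceFileURLs: [photoURL, movieURL],
    placeholderImage: nil,
    targetSize: .zero,
    contentMode: .aspectFit
) { livePhoto, info in
    if let livePhoto = livePhoto {
        // Validation succeeded: the files came from the same asset,
        // so the rebuilt object can be displayed, e.g. in a PHLivePhotoView.
        print("Rebuilt Live Photo: \(livePhoto)")
    } else {
        // Loading fails if the files don't belong together.
        print("Could not rebuild Live Photo: \(info)")
    }
}
```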

If your app lets users apply effects or adjustments to a photo before sharing it, be sure to apply the same adjustments to all frames of the Live Photo. Alternatively, if you don’t support adjusting the entire contents of a Live Photo, share it as a regular photo and show an appropriate indication to the user.

If your app has UI for picking photos to share, you should let users play back the entire contents so they know exactly what they are sharing. When selecting photos to share in your app, users should also be able to turn a Live Photo off, so they can post it as a traditional photo.

answered Oct 04 '22 by Sam0711er