A Live Photo has two resources, tied together by an asset identifier (a UUID as a string).
- A JPEG; this must have a metadata entry for `kCGImagePropertyMakerAppleDictionary` with `[17 : assetIdentifier]` (17 is the Apple Maker Note Asset Identifier key). A sketch of writing this with Image I/O follows the list.
- A QuickTime MOV encoded with H.264 at the appropriate frame rate (12-15 fps) and size (1080p). This MOV must have:
  - A top-level QuickTime metadata entry for `["com.apple.quicktime.content.identifier" : assetIdentifier]`. If using `AVAsset`, you can get this from `asset.metadataForFormat(AVMetadataFormatQuickTimeMetadata)`.
  - A timed metadata track with `["com.apple.quicktime.still-image-time" : 0xFF]`. The actual still-image time matches up to the presentation timestamp of this metadata item; the payload seems to just be a single `0xFF` byte (i.e. -1) and can be ignored. If using an `AVAssetReader`, you can use `CMSampleBufferGetOutputPresentationTimeStamp` to get this time (see the writer and reader sketches below).
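For the JPEG half, one approach is to re-save the image with Image I/O, copying the existing properties across and adding the maker note entry. A minimal sketch, assuming the identifier and file URLs come from the caller (the function name is mine, not an Apple API):

```swift
import Foundation
import ImageIO
import UniformTypeIdentifiers

/// Re-encodes a JPEG with the Live Photo asset identifier in its Apple maker note.
/// Hypothetical helper: paths, naming, and error handling are illustrative.
func addAssetIdentifier(to sourceURL: URL, output outputURL: URL, assetIdentifier: String) -> Bool {
    guard let source = CGImageSourceCreateWithURL(sourceURL as CFURL, nil),
          let destination = CGImageDestinationCreateWithURL(outputURL as CFURL,
                                                            UTType.jpeg.identifier as CFString,
                                                            1, nil) else { return false }

    // Start from the existing image properties so EXIF/TIFF data survives the copy.
    var properties = (CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [String: Any]) ?? [:]

    // "17" is the Apple Maker Note Asset Identifier key described above.
    var makerApple = (properties[kCGImagePropertyMakerAppleDictionary as String] as? [String: Any]) ?? [:]
    makerApple["17"] = assetIdentifier
    properties[kCGImagePropertyMakerAppleDictionary as String] = makerApple

    // Copy the image across with the augmented metadata attached.
    CGImageDestinationAddImageFromSource(destination, source, 0, properties as CFDictionary)
    return CGImageDestinationFinalize(destination)
}
```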
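On the video side, both metadata pieces can be created with AVFoundation and attached to the AVAssetWriter used to re-encode or re-mux the MOV: the content identifier goes into the writer's top-level metadata, and the still-image-time item is written through a dedicated metadata-track input. A rough sketch, assuming an AVAssetWriter-based export happens elsewhere; the helper names are mine:

```swift
import AVFoundation
import CoreMedia

/// Top-level item carrying the shared asset identifier; assign the result to
/// the writer's `metadata` array before writing starts.
func contentIdentifierItem(for assetIdentifier: String) -> AVMetadataItem {
    let item = AVMutableMetadataItem()
    item.keySpace = .quickTimeMetadata
    item.key = "com.apple.quicktime.content.identifier" as NSString
    item.value = assetIdentifier as NSString
    item.dataType = "com.apple.metadata.datatype.UTF-8"
    return item
}

/// Item for the timed metadata track that marks the still-image frame.
/// As noted above, the payload is a single 0xFF byte (-1) and is otherwise ignored.
func stillImageTimeItem() -> AVMetadataItem {
    let item = AVMutableMetadataItem()
    item.keySpace = .quickTimeMetadata
    item.key = "com.apple.quicktime.still-image-time" as NSString
    item.value = NSNumber(value: -1)
    item.dataType = "com.apple.metadata.datatype.int8"
    return item
}

/// Builds the metadata-track input and its adaptor for the still-image-time entry.
func makeStillImageTimeAdaptor() -> AVAssetWriterInputMetadataAdaptor {
    // Describe the track contents so the writer knows what kind of boxed
    // metadata samples to expect.
    let spec: [String: Any] = [
        kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier as String:
            "mdta/com.apple.quicktime.still-image-time",
        kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType as String:
            "com.apple.metadata.datatype.int8",
    ]
    var description: CMMetadataFormatDescription?
    let status = CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
        allocator: kCFAllocatorDefault,
        metadataType: kCMMetadataFormatType_Boxed,
        metadataSpecifications: [spec] as CFArray,
        formatDescriptionOut: &description)
    precondition(status == 0, "could not create the metadata format description")

    let input = AVAssetWriterInput(mediaType: .metadata,
                                   outputSettings: nil,
                                   sourceFormatHint: description)
    return AVAssetWriterInputMetadataAdaptor(assetWriterInput: input)
}
```

The adaptor's `assetWriterInput` gets added to the writer alongside the video (and audio) inputs; once the session has started, append an `AVTimedMetadataGroup` containing `stillImageTimeItem()` whose `timeRange` starts at whichever frame should act as the still image.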
The `assetIdentifier` is what ties the two items together, and the timed metadata track is what tells the system where the still image sits in the movie timeline.
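To verify the output (or to inspect a Live Photo exported from the Photos app), both identifiers can be read back. A rough sketch using the synchronous AVAsset accessors for brevity; on current SDKs you may prefer the async `load(...)` variants:

```swift
import AVFoundation
import CoreMedia

/// Prints the content identifier and still-image time of a paired MOV.
/// Illustrative only; error handling is minimal.
func inspectPairedVideo(at url: URL) throws {
    let asset = AVURLAsset(url: url)

    // Top-level QuickTime metadata: the content identifier shared with the JPEG.
    let contentIdentifier = asset.metadata(forFormat: .quickTimeMetadata)
        .first { ($0.key as? String) == "com.apple.quicktime.content.identifier" }?
        .stringValue
    print("content identifier:", contentIdentifier ?? "none")

    // Find the metadata track whose boxed identifier is the still-image-time key.
    let stillImageTimeTrack = asset.tracks(withMediaType: .metadata).first { track in
        let descriptions = track.formatDescriptions as! [CMFormatDescription]
        return descriptions.contains { description in
            let identifiers = (CMMetadataFormatDescriptionGetIdentifiers(description) as? [String]) ?? []
            return identifiers.contains("mdta/com.apple.quicktime.still-image-time")
        }
    }
    guard let track = stillImageTimeTrack else { return }

    // The sample's output presentation timestamp is where the still image
    // sits in the movie timeline.
    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(track: track, outputSettings: nil)
    reader.add(output)
    guard reader.startReading() else { return }
    while let sample = output.copyNextSampleBuffer() {
        let stillImageTime = CMSampleBufferGetOutputPresentationTimeStamp(sample)
        print("still image time:", CMTimeGetSeconds(stillImageTime), "seconds")
    }
}
```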