ios - Invalid image metadata when trying to display a Live Photo with PHLivePhotoView (Objective-C)
<p>I am trying to display a Live Photo on an iOS device in Objective-C by loading a <code>jpg</code> image together with a <code>mov</code> file, and wrote the following snippet in <code>viewDidLoad</code> to do so:<br><pre><code>- (void)viewDidLoad {
    [super viewDidLoad];
    PHLivePhotoView *photoView = [[PHLivePhotoView alloc] initWithFrame:self.view.bounds];
    NSURL *imageUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"jpg"];
    NSURL *videoUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"mov"];
    [PHLivePhoto requestLivePhotoWithResourceFileURLs:@[imageUrl, videoUrl] placeholderImage:nil targetSize:self.view.bounds.size contentMode:PHImageContentModeAspectFit resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info){
        NSLog(@"we are in handler");
        photoView.livePhoto = livePhoto;
        photoView.contentMode = UIViewContentModeScaleAspectFit;
        photoView.tag = 87;
        [self.view addSubview:photoView];
        [self.view sendSubviewToBack:photoView];
    }];
}
</code></pre><br>I have dragged the files <code>livePhoto.jpg</code> and <code>livePhoto.mov</code> into the Xcode project.<br><br>But when I build and run this, Xcode logs the following errors:<br><pre><code>2017-11-28 17:46:08.568455+0800 Live Photos we are in handler
2017-11-28 17:46:08.580439+0800 Live Photos we are in handler
2017-11-28 17:46:08.597147+0800 Live Photos Error: Invalid image metadata
2017-11-28 17:46:08.607881+0800 Live Photos Error: Invalid video metadata
2017-11-28 17:46:08.608329+0800 Live Photos we are in handler
</code></pre><br>Any ideas? Thanks.<br><br>One more thing to ask:<br><br>Why is <code>resultHandler</code> called twice, judging by the printed output?</p>
<br><hr><h1><strong>Best Answer</strong></h1><br>
<p><strong>TL;DR</strong><br><br>Here is the code for storing a Live Photo and uploading it to a server:<br><br>1. Capturing the Live Photo<br><pre><code>- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings error:(NSError *)error {
    if (error) {
        NSLog(@"Error capturing photo: %@", error);
        return;
    }
    NSData *imageData = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer previewPhotoSampleBuffer:previewPhotoSampleBuffer];
    CIImage *image = [CIImage imageWithData:imageData];
    self.expectedAsset.imageMetadata = image.properties; // 1. This is the metadata (which will be lost in step 2.)
    self.expectedAsset.image = [UIImage imageWithData:imageData]; // 2. Creating image, but UIImage is not designed to contain the required metadata
}
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingLivePhotoToMovieFileAtURL:(NSURL *)outputFileURL duration:(CMTime)duration photoDisplayTime:(CMTime)photoDisplayTime resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings error:(nullable NSError *)error {
    if (error) {
        NSLog(@"Error processing Live Photo movie: %@", error);
    } else {
        self.expectedAsset.livePhotoURL = outputFileURL; // 3. Store the URL to the actual video file
    }
}
</code></pre><code>expectedAsset</code> is just an object holding all the required information; you could use an NSDictionary instead. And since the snippet above uses API deprecated as of iOS 11, here is the &gt;= iOS 11 variant (wrapped in pragmas to silence the unguarded-availability warning on lower deployment targets)...<br><pre><code>#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wunguarded-availability"
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
    if (error) {
        NSLog(@"Error capturing photo: %@", error);
    } else {
        [self.expectedAsset setImageMetadata:[photo metadata]];
        [self.expectedAsset setImage:[UIImage imageWithData:[photo fileDataRepresentation]]];
    }
}
#pragma clang diagnostic pop
</code></pre><br>2. Generating the NSData<br><pre><code>- (NSData*)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1); // This is the UIImage (without metadata) from step 2 above
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef) jpgData, NULL);
    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);
    NSMutableDictionary *maker = [NSMutableDictionary dictionary];
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary] forKey:(NSString *)kCGImagePropertyMakerAppleDictionary]; // imageMetadata is the dictionary from step 1 above
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef) maker);
    CGImageDestinationFinalize(destination);
    CFRelease(destination);
    CFRelease(source);
    return dest_data;
}
- (void)dataRepresentation:(DataRepresentationLoaded)callback {
    callback(@{@"image": self.imageData, @"video": [NSData dataWithContentsOfURL:self.livePhotoURL]}); // livePhotoURL is the URL from step 3 above
}
</code></pre><br><strong>Detailed answer</strong><br><br>This is caused by incorrect metadata in the video/image files.<br>When creating the Live Photo, PHLivePhoto searches for key 17 in the <code>kCGImagePropertyMakerAppleDictionary</code> (which is the asset identifier) and matches it against the <code>com.apple.quicktime.content.identifier</code> of the mov file. The mov file also needs an entry marking the time of the still image within the video (<code>com.apple.quicktime.still-image-time</code>).<br><br>Make sure your files haven't been edited (or exported) somewhere along the way. Even the UIImageJPEGRepresentation function strips this data from the image.<br><br>Here is the snippet I use to convert a UIImage to NSData:<br><pre><code>- (NSData*)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1);
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef) jpgData, NULL);
    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);
    NSMutableDictionary *maker = [NSMutableDictionary dictionary];
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary] forKey:(NSString *)kCGImagePropertyMakerAppleDictionary];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef) maker);
    CGImageDestinationFinalize(destination);
    CFRelease(destination);
    CFRelease(source);
    return dest_data;
}
</code></pre><br>The handler is called twice: first to tell you about the corrupted data, and then to notify you about the cancellation of the process (those are two different keys in the info dictionary).<br><br>Edit:<br><br>Here is your mov data:<br><pre><code>$ ffmpeg -i cf70b7de66bd89654967aeef1d557816.mov
  Metadata:
    major_brand     : qt
    minor_version   : 0
    compatible_brands: qt
    creation_time   : 2018-01-27T11:07:38.000000Z
    com.apple.quicktime.content.identifier: cf70b7de66bd89654967aeef1d557816
  Duration: 00:00:15.05, start: 0.000000, bitrate: 1189 kb/s
    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p(progressive), 540x960, 1051 kb/s, 29.84 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)
    Metadata:
      creation_time   : 2018-01-27T11:07:38.000000Z
      handler_name    : Core Media Data Handler
      encoder         : "avc1"
    Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
    Metadata:
      creation_time   : 2018-01-27T11:07:38.000000Z
      handler_name    : Core Media Data Handler
</code></pre>The <code>com.apple.quicktime.still-image-time</code> key is missing here.<br><br>This is what the metadata should look like:<br><pre><code>  Metadata:
    major_brand     : qt
    minor_version   : 0
    compatible_brands: qt
    creation_time   : 2017-12-15T12:41:00.000000Z
    com.apple.quicktime.content.identifier: 89CB44DA-D129-43F3-A0BC-2C980767B810
    com.apple.quicktime.location.ISO6709: +51.5117+007.4668+086.000/
    com.apple.quicktime.make: Apple
    com.apple.quicktime.model: iPhone X
    com.apple.quicktime.software: 11.1.2
    com.apple.quicktime.creationdate: 2017-12-15T13:41:00+0100
  Duration: 00:00:01.63, start: 0.000000, bitrate: 8902 kb/s
    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, smpte170m/smpte432/bt709), 1440x1080, 8135 kb/s, 26.94 fps, 30 tbr, 600 tbn, 1200 tbc (default)
    Metadata:
      rotate          : 90
      creation_time   : 2017-12-15T12:41:00.000000Z
      handler_name    : Core Media Data Handler
      encoder         : H.264
    Side data:
      displaymatrix: rotation of -90.00 degrees
    Stream #0:1(und): Audio: pcm_s16le (lpcm / 0x6D63706C), 44100 Hz, mono, s16, 705 kb/s (default)
    Metadata:
      creation_time   : 2017-12-15T12:41:00.000000Z
      handler_name    : Core Media Data Handler
    Stream #0:2(und): Data: none (mebx / 0x7862656D), 12 kb/s (default)
    Metadata:
      creation_time   : 2017-12-15T12:41:00.000000Z
      handler_name    : Core Media Data Handler
    Stream #0:3(und): Data: none (mebx / 0x7862656D), 43 kb/s (default)
    Metadata:
      creation_time   : 2017-12-15T12:41:00.000000Z
      handler_name    : Core Media Data Handler
</code></pre>Just FYI, here is your JPEG data:<br><pre><code>$ magick identify -format "%[EXIF:*]" cf70b7de66bd89654967aeef1d557816.jpg
exif:ColorSpace=1
exif:ExifImageLength=960
exif:ExifImageWidth=540
exif:ExifOffset=26
exif:MakerNote=65, 112, 112, 108, 101, 32, 105, 79, 83, 0, 0, 1, 77, 77, 0, 1, 0, 17, 0, 2, 0, 0, 0, 33, 0, 0, 0, 32, 0, 0, 0, 0, 99, 102, 55, 48, 98, 55, 100, 101, 54, 54, 98, 100, 56, 57, 54, 53, 52, 57, 54, 55, 97, 101, 101, 102, 49, 100, 53, 53, 55, 56, 49, 54, 0, 0
</code></pre></p>
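<p>To see what that MakerNote byte dump actually contains: it is a small TIFF-style IFD whose tag 17 carries the asset identifier that PHLivePhoto matches against the mov's <code>com.apple.quicktime.content.identifier</code>. The following is a minimal sketch (hand-rolled for exactly the bytes printed above, not a general EXIF parser) that decodes it:</p>

```python
# Decode the Apple MakerNote bytes from the EXIF dump above.
# Hand-rolled for this specific dump: "Apple iOS" header, big-endian
# ("MM") TIFF-style IFD with a single entry, then the value bytes.
makernote = bytes([
    65, 112, 112, 108, 101, 32, 105, 79, 83, 0, 0, 1, 77, 77, 0, 1,
    0, 17, 0, 2, 0, 0, 0, 33, 0, 0, 0, 32, 0, 0, 0, 0,
    99, 102, 55, 48, 98, 55, 100, 101, 54, 54, 98, 100, 56, 57, 54, 53,
    52, 57, 54, 55, 97, 101, 101, 102, 49, 100, 53, 53, 55, 56, 49, 54,
    0, 0,
])

header = makernote[:9].decode("ascii")           # maker header string
tag = int.from_bytes(makernote[16:18], "big")    # tag of the single IFD entry
count = int.from_bytes(makernote[20:24], "big")  # value length incl. trailing NUL
value = makernote[32:32 + count].rstrip(b"\x00").decode("ascii")

print(header)  # Apple iOS
print(tag)     # 17 -- the asset-identifier key PHLivePhoto looks for
print(value)   # cf70b7de66bd89654967aeef1d557816
```

So the jpg in this case does carry the pairing identifier, and it matches the mov's <code>com.apple.quicktime.content.identifier</code>; the failure comes from the mov side, which is missing <code>com.apple.quicktime.still-image-time</code>.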
<p style="font-size: 20px;">Regarding "ios - Invalid image metadata when trying to display a Live Photo with PHLivePhotoView (Objective-C)", we found a similar question on Stack Overflow:
<a href="https://stackoverflow.com/questions/47528440/" rel="noreferrer noopener nofollow" style="color: red;">
https://stackoverflow.com/questions/47528440/
</a>
</p>