I'm trying to display a Live Photo on an iOS device using Objective-C by loading a jpg image together with a mov file, and I put the following snippet in viewDidLoad to do it:
- (void)viewDidLoad {
    [super viewDidLoad];

    PHLivePhotoView *photoView = [[PHLivePhotoView alloc] initWithFrame:self.view.bounds];

    NSURL *imageUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"jpg"];
    NSURL *videoUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"mov"];

    [PHLivePhoto requestLivePhotoWithResourceFileURLs:@[videoUrl, imageUrl]
                                     placeholderImage:[UIImage imageNamed:@"livePhoto.jpg"]
                                           targetSize:self.view.bounds.size
                                          contentMode:PHImageContentModeAspectFit
                                        resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
        NSLog(@"we are in handler");
        photoView.livePhoto = livePhoto;
        photoView.contentMode = UIViewContentModeScaleAspectFit;
        photoView.tag = 87;
        [self.view addSubview:photoView];
        [self.view sendSubviewToBack:photoView];
    }];
}
I have dragged the files livePhoto.jpg and livePhoto.mov into the Xcode project.
But when I build and run, Xcode logs the following errors:
2017-11-28 17:46:08.568455+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.580439+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.597147+0800 Live Photos[3669:1276806] Error: Invalid image metadata
2017-11-28 17:46:08.607881+0800 Live Photos[3669:1276806] Error: Invalid video metadata
2017-11-28 17:46:08.608329+0800 Live Photos[3669:1276778] we are in handler
Any ideas? Thanks.
And one more thing to ask:
Why is the resultHandler called multiple times, judging by the printed output?
Best Answer
TL;DR
Here is the code to store a Live Photo and upload it to a server:
1. Capture the Live Photo
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer
previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer
resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings
error:(NSError *)error {
    if (error) {
        [self raiseError:error];
        return;
    }
    NSData *imageData = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer
                                                                    previewPhotoSampleBuffer:previewPhotoSampleBuffer];
    CIImage *image = [CIImage imageWithData:imageData];
    [self.expectedAsset addInput:image.properties]; // 1. This is the metadata (which will be lost in step 2.)
    [self.expectedAsset addInput:[UIImage imageWithCIImage:image]]; // 2. Creating the image, but UIImage is not designed to carry the required metadata
}
- (void)captureOutput:(AVCapturePhotoOutput *)output
didFinishProcessingLivePhotoToMovieFileAtURL:(NSURL *)outputFileURL
duration:(CMTime)duration
photoDisplayTime:(CMTime)photoDisplayTime
resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings
error:(nullable NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:outputFileURL]; // 3. Store the URL of the actual video file
    }
}
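For context, these two callbacks only fire if Live Photo capture was enabled for the request. A minimal setup sketch, assuming output is an AVCapturePhotoOutput already attached to a running AVCaptureSession (the temporary file name is my own choice, not from the original answer):
// Enable Live Photo capture if the device supports it.
output.livePhotoCaptureEnabled = output.livePhotoCaptureSupported;

AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettings];
// The movie complement of the Live Photo is written to this URL and later
// handed to ...didFinishProcessingLivePhotoToMovieFileAtURL: above.
settings.livePhotoMovieFileURL =
    [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"live.mov"]];
[output capturePhotoWithSettings:settings delegate:self];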
expectedAsset is just an object holding all the required information; you could use an NSDictionary instead. The snippet above uses the API that was deprecated in iOS 11, so here is the >= iOS 11 variant (with the availability warning silenced):
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wunguarded-availability"
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:[photo metadata]];
        [self.expectedAsset addInput:[UIImage imageWithData:[photo fileDataRepresentation]]];
    }
}
#pragma clang diagnostic pop
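As a purely hypothetical illustration (the original answer never shows this class), an expectedAsset could be as simple as a dictionary-backed container:
#import <UIKit/UIKit.h>

@interface ExpectedAsset : NSObject
@property (nonatomic, strong) NSMutableDictionary *inputs;
- (void)addInput:(id)input;
@end

@implementation ExpectedAsset
// Files each piece of the Live Photo under a key derived from its type.
- (void)addInput:(id)input {
    if (!self.inputs) { self.inputs = [NSMutableDictionary new]; }
    if ([input isKindOfClass:[NSDictionary class]]) {
        self.inputs[@"metadata"] = input;   // step 1: the image properties
    } else if ([input isKindOfClass:[UIImage class]]) {
        self.inputs[@"image"] = input;      // step 2: the still image
    } else if ([input isKindOfClass:[NSURL class]]) {
        self.inputs[@"videoURL"] = input;   // step 3: the movie file URL
    }
}
@end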
2. Generating the NSData
- (NSData *)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1); // This is the UIImage (without metadata) from step 2 above
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpgData, NULL);
    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);
    NSMutableDictionary *maker = [NSMutableDictionary new];
    // imageMetadata is the dictionary from step 1 above
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary]
              forKey:(NSString *)kCGImagePropertyMakerAppleDictionary];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)maker);
    CGImageDestinationFinalize(destination);
    CFRelease(destination); // image sources/destinations are CF objects and must be released manually
    CFRelease(source);
    return dest_data;
}
- (void)dataRepresentation:(DataRepresentationLoaded)callback {
    // livePhotoURL is the URL from step 3 above
    callback(@{@"image": self.imageData, @"video": [NSData dataWithContentsOfURL:self.livePhotoURL]});
}
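Hypothetical usage, assuming DataRepresentationLoaded is a block taking the files dictionary and uploadData:named: stands in for your own networking code (neither name is from the original answer):
// Assumption: DataRepresentationLoaded is a block type like this one.
typedef void (^DataRepresentationLoaded)(NSDictionary<NSString *, NSData *> *files);

// Fetch both representations and hand them to an uploader.
[self.expectedAsset dataRepresentation:^(NSDictionary<NSString *, NSData *> *files) {
    [files enumerateKeysAndObjectsUsingBlock:^(NSString *name, NSData *data, BOOL *stop) {
        [self uploadData:data named:name]; // e.g. wraps an NSURLSession upload task
    }];
}];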
Detailed Answer
This is caused by broken metadata in your image/video files. When assembling a Live Photo, PHLivePhoto looks up key 17 in the JPEG's kCGImagePropertyMakerAppleDictionary (this is the asset identifier) and matches it against the mov file's com.apple.quicktime.content.identifier. The mov file additionally needs an entry marking the point in time at which the still image was captured (com.apple.quicktime.still-image-time).
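You can verify both halves of that pairing yourself. A small check, assuming the imageUrl and videoUrl from the question (ImageIO and AVFoundation required):
// Read key "17" of the Apple MakerNote from the JPEG...
CGImageSourceRef src = CGImageSourceCreateWithURL((__bridge CFURLRef)imageUrl, NULL);
NSDictionary *props = CFBridgingRelease(CGImageSourceCopyPropertiesAtIndex(src, 0, NULL));
NSDictionary *makerApple = props[(NSString *)kCGImagePropertyMakerAppleDictionary];
NSLog(@"JPEG asset identifier: %@", makerApple[@"17"]);
CFRelease(src);

// ...and compare it with the mov's content identifier.
AVAsset *asset = [AVAsset assetWithURL:videoUrl];
for (AVMetadataItem *item in asset.metadata) {
    if ([item.identifier isEqualToString:AVMetadataIdentifierQuickTimeMetadataContentIdentifier]) {
        NSLog(@"mov content identifier: %@", item.value);
    }
}
Both values must match for PHLivePhoto to accept the pair.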
Make sure your files haven't been edited (or exported) somewhere along the way. Even the UIImageJPEGRepresentation function strips this data from an image.
The snippet I use to convert a UIImage to NSData without losing the MakerApple dictionary is the imageData method shown in step 2 of the TL;DR above.
As for why the handler runs more than once: it is called once to tell you about the broken data, and again to notify you that the request was cancelled (these are two different keys in the info dictionary).
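You can tell the invocations apart by inspecting the info dictionary; the Photos framework exposes the relevant keys as constants:
[PHLivePhoto requestLivePhotoWithResourceFileURLs:@[videoUrl, imageUrl]
                                 placeholderImage:nil
                                       targetSize:self.view.bounds.size
                                      contentMode:PHImageContentModeAspectFit
                                    resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
    NSError *error = info[PHLivePhotoInfoErrorKey];                  // the "Invalid ... metadata" error
    BOOL cancelled = [info[PHLivePhotoInfoCancelledKey] boolValue];  // the cancellation notification
    BOOL degraded  = [info[PHLivePhotoInfoIsDegradedKey] boolValue]; // interim, low-quality result
    NSLog(@"error=%@ cancelled=%d degraded=%d", error, cancelled, degraded);
}];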
Edit:
Here is the metadata of your mov file:
$ ffmpeg -i cf70b7de66bd89654967aeef1d557816.mov
  Metadata:
    major_brand     : qt
    minor_version   : 0
    compatible_brands: qt
    creation_time   : 2018-01-27T11:07:38.000000Z
    com.apple.quicktime.content.identifier: cf70b7de66bd89654967aeef1d557816
  Duration: 00:00:15.05, start: 0.000000, bitrate: 1189 kb/s
    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuv420p (progressive), 540x960, 1051 kb/s, 29.84 fps, 29.97 tbr, 30k tbn, 59.94 tbc (default)
    Metadata:
      creation_time   : 2018-01-27T11:07:38.000000Z
      handler_name    : Core Media Data Handler
      encoder         : 'avc1'
    Stream #0:1(und): Audio: aac (LC) (mp4a / 0x6134706D), 44100 Hz, stereo, fltp, 128 kb/s (default)
    Metadata:
      creation_time   : 2018-01-27T11:07:38.000000Z
      handler_name    : Core Media Data Handler
The com.apple.quicktime.still-image-time key is missing here.
Here is what the metadata should look like:
  Metadata:
    major_brand     : qt
    minor_version   : 0
    compatible_brands: qt
    creation_time   : 2017-12-15T12:41:00.000000Z
    com.apple.quicktime.content.identifier: 89CB44DA-D129-43F3-A0BC-2C980767B810
    com.apple.quicktime.location.ISO6709: +51.5117+007.4668+086.000/
    com.apple.quicktime.make: Apple
    com.apple.quicktime.model: iPhone X
    com.apple.quicktime.software: 11.1.2
    com.apple.quicktime.creationdate: 2017-12-15T13:41:00+0100
  Duration: 00:00:01.63, start: 0.000000, bitrate: 8902 kb/s
    Stream #0:0(und): Video: h264 (High) (avc1 / 0x31637661), yuvj420p(pc, smpte170m/smpte432/bt709), 1440x1080, 8135 kb/s, 26.94 fps, 30 tbr, 600 tbn, 1200 tbc (default)
    Metadata:
      rotate          : 90
      creation_time   : 2017-12-15T12:41:00.000000Z
      handler_name    : Core Media Data Handler
      encoder         : H.264
    Side data:
      displaymatrix: rotation of -90.00 degrees
    Stream #0:1(und): Audio: pcm_s16le (lpcm / 0x6D63706C), 44100 Hz, mono, s16, 705 kb/s (default)
    Metadata:
      creation_time   : 2017-12-15T12:41:00.000000Z
      handler_name    : Core Media Data Handler
    Stream #0:2(und): Data: none (mebx / 0x7862656D), 12 kb/s (default)
    Metadata:
      creation_time   : 2017-12-15T12:41:00.000000Z
      handler_name    : Core Media Data Handler
    Stream #0:3(und): Data: none (mebx / 0x7862656D), 43 kb/s (default)
    Metadata:
      creation_time   : 2017-12-15T12:41:00.000000Z
      handler_name    : Core Media Data Handler
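If you have to repair a mov yourself, the usual approach is to re-write it with AVAssetWriter, attaching the content identifier as asset-level metadata and the still-image time as a timed metadata track. This is not from the original answer; a rough sketch of the two pieces, assuming writer is an AVAssetWriter you have already configured (the sample copying and writer lifecycle are omitted):
// a) Asset-level item: com.apple.quicktime.content.identifier. Its value must
//    equal the identifier stored under key "17" of the JPEG's MakerApple dictionary.
AVMutableMetadataItem *contentIdentifier = [AVMutableMetadataItem metadataItem];
contentIdentifier.key = AVMetadataQuickTimeMetadataKeyContentIdentifier;
contentIdentifier.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
contentIdentifier.value = @"cf70b7de66bd89654967aeef1d557816";
contentIdentifier.dataType = (NSString *)kCMMetadataBaseDataType_UTF8;
writer.metadata = @[contentIdentifier];

// b) Timed item: com.apple.quicktime.still-image-time, delivered through a
//    dedicated metadata input and its adaptor.
NSArray *specs = @[@{
    (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier:
        @"mdta/com.apple.quicktime.still-image-time",
    (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType:
        @"com.apple.metadata.datatype.int8"
}];
CMFormatDescriptionRef desc = NULL;
CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
    kCFAllocatorDefault, kCMMetadataFormatType_Boxed, (__bridge CFArrayRef)specs, &desc);
AVAssetWriterInput *metadataInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeMetadata
                                       outputSettings:nil
                                     sourceFormatHint:desc];
AVAssetWriterInputMetadataAdaptor *adaptor =
    [AVAssetWriterInputMetadataAdaptor assetWriterInputMetadataAdaptorWithAssetWriterInput:metadataInput];
[writer addInput:metadataInput];

// After startWriting/startSessionAtSourceTime:, mark the still frame's time.
AVMutableMetadataItem *stillImageTime = [AVMutableMetadataItem metadataItem];
stillImageTime.key = @"com.apple.quicktime.still-image-time";
stillImageTime.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
stillImageTime.value = @0;
stillImageTime.dataType = @"com.apple.metadata.datatype.int8";
[adaptor appendTimedMetadataGroup:
    [[AVTimedMetadataGroup alloc] initWithItems:@[stillImageTime]
                                      timeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMake(1, 100))]];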
For reference, here is your JPEG's data:
$ magick identify -format "%[EXIF:*]" cf70b7de66bd89654967aeef1d557816.jpg
exif:ColorSpace=1
exif:ExifImageLength=960
exif:ExifImageWidth=540
exif:ExifOffset=26
exif:MakerNote=65, 112, 112, 108, 101, 32, 105, 79, 83, 0, 0, 1, 77, 77, 0, 1, 0, 17, 0, 2, 0, 0, 0, 33, 0, 0, 0, 32, 0, 0, 0, 0, 99, 102, 55, 48, 98, 55, 100, 101, 54, 54, 98, 100, 56, 57, 54, 53, 52, 57, 54, 55, 97, 101, 101, 102, 49, 100, 53, 53, 55, 56, 49, 54, 0, 0
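Decoded as ASCII, that MakerNote begins with "Apple iOS" and ends with the identifier bytes; a quick decode of the tail:
// The trailing MakerNote values are ASCII codes for the asset identifier.
unsigned char tail[] = {99,102,55,48,98,55,100,101,54,54,98,100,56,57,54,53,
                        52,57,54,55,97,101,101,102,49,100,53,53,55,56,49,54};
NSString *identifier = [[NSString alloc] initWithBytes:tail
                                                length:sizeof(tail)
                                              encoding:NSASCIIStringEncoding];
NSLog(@"%@", identifier); // cf70b7de66bd89654967aeef1d557816
So the JPEG already carries the identifier that matches the mov's com.apple.quicktime.content.identifier; it is the mov's missing still-image-time entry that breaks the pair.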
Regarding "ios - Invalid image metadata when trying to display a Live Photo with PHLivePhotoView (Objective-C)", we found a similar question on Stack Overflow:
https://stackoverflow.com/questions/47528440/