I am trying to display a Live Photo on an iOS device using Objective-C by loading a jpg image together with a mov file, and I put the following snippet in viewDidLoad to do so:
- (void)viewDidLoad {
    [super viewDidLoad];
    PHLivePhotoView *photoView = [[PHLivePhotoView alloc] initWithFrame:self.view.bounds];
    NSURL *imageUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"jpg"];
    NSURL *videoUrl = [[NSBundle mainBundle] URLForResource:@"livePhoto" withExtension:@"mov"];
    [PHLivePhoto requestLivePhotoWithResourceFileURLs:@[videoUrl, imageUrl]
                                     placeholderImage:[UIImage imageNamed:@"livePhoto.jpg"]
                                           targetSize:self.view.bounds.size
                                          contentMode:PHImageContentModeAspectFit
                                        resultHandler:^(PHLivePhoto *livePhoto, NSDictionary *info) {
        NSLog(@"we are in handler");
        photoView.livePhoto = livePhoto;
        photoView.contentMode = UIViewContentModeScaleAspectFit;
        photoView.tag = 87;
        [self.view addSubview:photoView];
        [self.view sendSubviewToBack:photoView];
    }];
}
I dragged livePhoto.jpg and livePhoto.mov into the Xcode project, but the console prints:
2017-11-28 17:46:08.568455+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.580439+0800 Live Photos[3669:1276778] we are in handler
2017-11-28 17:46:08.597147+0800 Live Photos[3669:1276806] Error: Invalid image metadata
2017-11-28 17:46:08.607881+0800 Live Photos[3669:1276806] Error: Invalid video metadata
2017-11-28 17:46:08.608329+0800 Live Photos[3669:1276778] we are in handler
Why does the resultHandler get called three times, and what is causing the "Invalid image metadata" and "Invalid video metadata" errors?
TL;DR
Here is the code for storing a Live Photo and uploading it to a server:
1. Capturing the Live Photo
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhotoSampleBuffer:(CMSampleBufferRef)photoSampleBuffer previewPhotoSampleBuffer:(CMSampleBufferRef)previewPhotoSampleBuffer resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings bracketSettings:(AVCaptureBracketedStillImageSettings *)bracketSettings error:(NSError *)error {
    if (error) {
        [self raiseError:error];
        return;
    }
    NSData *imageData = [AVCapturePhotoOutput JPEGPhotoDataRepresentationForJPEGSampleBuffer:photoSampleBuffer previewPhotoSampleBuffer:previewPhotoSampleBuffer];
    CIImage *image = [CIImage imageWithData:imageData];
    [self.expectedAsset addInput:image.properties]; // 1. This is the metadata (which will be lost in step 2.)
    [self.expectedAsset addInput:[UIImage imageWithCIImage:image]]; // 2. Creating the image, but UIImage is not designed to carry the required metadata
}
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingLivePhotoToMovieFileAtURL:(NSURL *)outputFileURL duration:(CMTime)duration photoDisplayTime:(CMTime)photoDisplayTime resolvedSettings:(AVCaptureResolvedPhotoSettings *)resolvedSettings error:(nullable NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:outputFileURL]; // 3. Store the URL of the actual video file
    }
}
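For context, these delegate callbacks only fire if Live Photo capture was enabled on the photo output beforehand. Below is a minimal sketch of such a setup; it assumes an already-configured and running AVCaptureSession in self.session and is not part of the original answer:

// Sketch: enabling Live Photo capture on an assumed, already-running session.
AVCapturePhotoOutput *photoOutput = [AVCapturePhotoOutput new];
if ([self.session canAddOutput:photoOutput]) {
    [self.session addOutput:photoOutput];
}
// Live Photo capture must be opted into before requesting a capture.
photoOutput.livePhotoCaptureEnabled = photoOutput.livePhotoCaptureSupported;

AVCapturePhotoSettings *settings = [AVCapturePhotoSettings photoSettings];
// The movie half of the Live Photo is written to this temporary URL, which is
// later handed to the didFinishProcessingLivePhotoToMovieFileAtURL: callback.
NSString *moviePath = [NSTemporaryDirectory() stringByAppendingPathComponent:@"livePhoto.mov"];
settings.livePhotoMovieFileURL = [NSURL fileURLWithPath:moviePath];

[photoOutput capturePhotoWithSettings:settings delegate:self];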
expectedAsset is just an object that holds all the required information; you could use an NSDictionary instead (a minimal sketch of such a class follows the next snippet). And since the snippet above uses the capture API that was deprecated in iOS 11, here is the iOS 11+ variant (the pragma silences the unguarded-availability warning for lower deployment targets):

#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wunguarded-availability"
- (void)captureOutput:(AVCapturePhotoOutput *)output didFinishProcessingPhoto:(AVCapturePhoto *)photo error:(NSError *)error {
    if (error) {
        [self raiseError:error];
    } else {
        [self.expectedAsset addInput:[photo metadata]]; // the metadata dictionary (step 1)
        [self.expectedAsset addInput:[UIImage imageWithData:[photo fileDataRepresentation]]]; // still loses the metadata once wrapped in UIImage (step 2)
    }
}
#pragma clang diagnostic pop
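As a point of reference, here is a minimal sketch of what an expectedAsset container could look like. The class name, properties, and the addInput: type dispatch are hypothetical; the answer only describes it as "an object holding all required information":

#import <UIKit/UIKit.h>

// Hypothetical container for the three pieces collected by the delegate callbacks.
@interface ExpectedAsset : NSObject
@property (nonatomic, strong) NSDictionary *imageMetadata; // step 1
@property (nonatomic, strong) UIImage *image;              // step 2
@property (nonatomic, strong) NSURL *livePhotoURL;         // step 3
@end

@implementation ExpectedAsset
- (void)addInput:(id)input {
    // Dispatch on type, mirroring how the callbacks above feed it.
    if ([input isKindOfClass:[NSDictionary class]]) {
        self.imageMetadata = (NSDictionary *)input;
    } else if ([input isKindOfClass:[UIImage class]]) {
        self.image = (UIImage *)input;
    } else if ([input isKindOfClass:[NSURL class]]) {
        self.livePhotoURL = (NSURL *)input;
    }
}
@end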
2. Converting the image to NSData for upload

- (NSData *)imageData {
    NSData *jpgData = UIImageJPEGRepresentation(self.image, 1); // This is the UIImage (without metadata) from step 2 above
    CGImageSourceRef source = CGImageSourceCreateWithData((__bridge CFDataRef)jpgData, NULL);
    NSMutableData *dest_data = [NSMutableData data];
    CFStringRef uti = CGImageSourceGetType(source);
    NSMutableDictionary *maker = [NSMutableDictionary new];
    // imageMetadata is the dictionary from step 1 above
    [maker setObject:[self.imageMetadata objectForKey:(NSString *)kCGImagePropertyMakerAppleDictionary] forKey:(NSString *)kCGImagePropertyMakerAppleDictionary];
    CGImageDestinationRef destination = CGImageDestinationCreateWithData((__bridge CFMutableDataRef)dest_data, uti, 1, NULL);
    CGImageDestinationAddImageFromSource(destination, source, 0, (__bridge CFDictionaryRef)maker);
    CGImageDestinationFinalize(destination);
    CFRelease(destination); // release the Core Graphics objects to avoid leaking them
    CFRelease(source);
    return dest_data;
}
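To sanity-check that the MakerApple metadata actually survived the round trip, one can read the properties back with CGImageSource. This verification snippet is an assumed addition, not part of the answer:

// Read the properties back from the freshly produced JPEG data.
NSData *data = [self imageData];
CGImageSourceRef src = CGImageSourceCreateWithData((__bridge CFDataRef)data, NULL);
NSDictionary *props = (__bridge_transfer NSDictionary *)
    CGImageSourceCopyPropertiesAtIndex(src, 0, NULL);
NSLog(@"MakerApple: %@", props[(NSString *)kCGImagePropertyMakerAppleDictionary]);
CFRelease(src);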
- (void)dataRepresentation:(DataRepresentationLoaded)callback {
    callback(@{@"image": self.imageData,
               @"video": [NSData dataWithContentsOfURL:self.livePhotoURL]}); // livePhotoURL is the URL from step 3 above
}
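For illustration, the callback's dictionary could then be posted with NSURLSession. The endpoint and the single-file upload here are placeholders, not from the answer; a real server would likely expect both files in one multipart/form-data request:

[self.expectedAsset dataRepresentation:^(NSDictionary *payload) {
    // Placeholder endpoint, assumed for this sketch.
    NSMutableURLRequest *request = [NSMutableURLRequest requestWithURL:
        [NSURL URLWithString:@"https://example.com/upload/live-photo"]];
    request.HTTPMethod = @"POST";
    [request setValue:@"application/octet-stream" forHTTPHeaderField:@"Content-Type"];
    NSURLSessionUploadTask *task = [[NSURLSession sharedSession]
        uploadTaskWithRequest:request
                     fromData:payload[@"image"] // the mov goes in a second request, or use multipart
            completionHandler:^(NSData *data, NSURLResponse *response, NSError *error) {
        if (error) NSLog(@"upload failed: %@", error);
    }];
    [task resume];
}];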
This works because PHLivePhoto looks up key 17 in kCGImagePropertyMakerAppleDictionary (this is the asset identifier) and matches it against the mov file's com.apple.quicktime.content.identifier. The mov file also needs an entry marking the point in time where the still image was captured (com.apple.quicktime.still-image-time); a hedged sketch of writing both entries follows below.
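To make that pairing concrete: the jpg side carries the identifier under key "17" of the MakerApple dictionary, and the mov side needs a com.apple.quicktime.content.identifier item plus a timed com.apple.quicktime.still-image-time entry. The sketch below mirrors a widely used community approach rather than code from this answer; it assumes assetIdentifier is a shared UUID string and assetWriter is an AVAssetWriter rewriting the mov (the sample read/write loop is elided):

// JPEG side: key 17 of the MakerApple dictionary carries the asset identifier.
NSMutableDictionary *makerApple = [NSMutableDictionary new];
makerApple[@"17"] = assetIdentifier;

// MOV side, part 1: a top-level content identifier item on the asset writer.
AVMutableMetadataItem *contentId = [AVMutableMetadataItem metadataItem];
contentId.key = @"com.apple.quicktime.content.identifier";
contentId.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
contentId.value = assetIdentifier;
contentId.dataType = (NSString *)kCMMetadataBaseDataType_UTF8;
assetWriter.metadata = @[contentId];

// MOV side, part 2: a timed metadata track marking the still-image time.
NSDictionary *spec = @{
    (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier:
        @"mdta/com.apple.quicktime.still-image-time",
    (NSString *)kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType:
        (NSString *)kCMMetadataBaseDataType_SInt8
};
CMFormatDescriptionRef desc = NULL;
CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
    kCFAllocatorDefault, kCMMetadataFormatType_Boxed,
    (__bridge CFArrayRef)@[spec], &desc);
AVAssetWriterInput *metadataInput =
    [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeMetadata
                                       outputSettings:nil
                                     sourceFormatHint:desc];
AVAssetWriterInputMetadataAdaptor *adaptor =
    [AVAssetWriterInputMetadataAdaptor assetWriterInputMetadataAdaptorWithAssetWriterInput:metadataInput];
[assetWriter addInput:metadataInput];

// After the writer session has started, append one item at the still-image time.
AVMutableMetadataItem *stillTime = [AVMutableMetadataItem metadataItem];
stillTime.key = @"com.apple.quicktime.still-image-time";
stillTime.keySpace = AVMetadataKeySpaceQuickTimeMetadata;
stillTime.value = @0;
stillTime.dataType = (NSString *)kCMMetadataBaseDataType_SInt8;
[adaptor appendTimedMetadataGroup:
    [[AVTimedMetadataGroup alloc] initWithItems:@[stillTime]
                                      timeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMake(1, 600))]];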
A similar question about "ios - Invalid image metadata when trying to display a Live Photo with PHLivePhotoView Objective-C" can be found on Stack Overflow: https://stackoverflow.com/questions/47528440/