I'm having trouble with AVSampleBufferDisplayLayer on iOS. I want to use this layer to display a CVPixelBuffer, but I can't get it to work on an actual iOS device. In my sample app, I try to display a single-color pixel buffer with the following code:
@implementation ViewController {
    AVSampleBufferDisplayLayer *videoLayer;
}

- (void)viewDidLoad {
    [super viewDidLoad];
    videoLayer = [[AVSampleBufferDisplayLayer alloc] init];
    videoLayer.frame = CGRectMake(50, 50, 300, 300);
    videoLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
    [self.view.layer addSublayer:videoLayer];
}
- (void)viewDidAppear:(BOOL)animated {
    [super viewDidAppear:animated];
    [self startVideo];
}

- (void)startVideo {
    [self drawPixelBuffer];
    [NSTimer scheduledTimerWithTimeInterval:0.1 target:self selector:@selector(drawPixelBuffer) userInfo:nil repeats:YES];
}
- (void)drawPixelBuffer {
    int imageSize = 100;
    static const uint8_t pixel[] = {0x00, 0xAA, 0xFF, 0xFF};
    NSMutableData *frame = [NSMutableData data];
    for (int i = 0; i < imageSize * imageSize; i++) {
        [frame appendBytes:pixel length:4];
    }
    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreateWithBytes(NULL, imageSize, imageSize, kCVPixelFormatType_32BGRA, [frame bytes], imageSize * 4, NULL, NULL, NULL, &pixelBuffer);
    CMSampleBufferRef sampleBuffer = [self sampleBufferFromPixelBuffer:pixelBuffer];
    CVPixelBufferRelease(pixelBuffer);
    if (sampleBuffer) {
        [videoLayer enqueueSampleBuffer:sampleBuffer];
        CFRelease(sampleBuffer);
    }
}
- (CMSampleBufferRef)sampleBufferFromPixelBuffer:(CVPixelBufferRef)pixelBuffer {
    CMSampleBufferRef sampleBuffer = NULL;
    OSStatus err = noErr;
    CMVideoFormatDescriptionRef formatDesc = NULL;
    err = CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &formatDesc);
    if (err != noErr) {
        return NULL;
    }
    CMSampleTimingInfo sampleTimingInfo = kCMTimingInfoInvalid;
    err = CMSampleBufferCreateReadyWithImageBuffer(kCFAllocatorDefault, pixelBuffer, formatDesc, &sampleTimingInfo, &sampleBuffer);
    CFRelease(formatDesc);
    if (err != noErr) {
        return NULL;
    }
    CFArrayRef attachments = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer, YES);
    CFMutableDictionaryRef dict = (CFMutableDictionaryRef)CFArrayGetValueAtIndex(attachments, 0);
    CFDictionarySetValue(dict, kCMSampleAttachmentKey_DisplayImmediately, kCFBooleanTrue);
    return sampleBuffer;
}
@end
This runs without any problem in the iOS Simulator, but on a real device it does not work (nothing is rendered). The layer's error property is always nil, and its status is always AVQueuedSampleBufferRenderingStatusRendering.
Thanks for your help.
Best Answer
The graphics implementation in the Simulator is much more forgiving, and you can often get away with things that won't work on a device. There are two common causes:
The pixel buffer should be backed by an IOSurface
You are mapping that buffer directly via CVPixelBufferCreateWithBytes. Try again with CVPixelBufferCreate, setting the kCVPixelBufferIOSurfacePropertiesKey attribute to an empty dictionary.
CVPixelBufferCreate(
    NULL,
    imageSize,
    imageSize,
    kCVPixelFormatType_32BGRA,
    (__bridge CFDictionaryRef)@{
        (id)kCVPixelBufferIOSurfacePropertiesKey: @{}
    },
    &pixelBuffer);

CVPixelBufferLockBaseAddress(pixelBuffer, 0);
void *bytes = CVPixelBufferGetBaseAddress(pixelBuffer);
// Write image data directly to that address
CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
In fact, whatever produces these pixels should ideally write straight into the CVPixelBufferRef.
Some formats are not supported on the device
That seems highly unlikely for kCVPixelFormatType_32BGRA, but I have seen simulator-only support, for example with kCVPixelFormatType_422YpCbCr8. In those cases you must either convert to a compatible format first, or implement a custom renderer (OpenGL, Metal, etc.).
Regarding "ios - AVSampleBufferDisplayLayer not rendering on device", we found a similar question on Stack Overflow:
https://stackoverflow.com/questions/35365008/