It is possible to get CMSampleBufferRefs from multiple video devices on Mac OS X, but you have to set up the AVCaptureConnection objects manually. For example, assuming you have these objects:
AVCaptureSession *session;
AVCaptureInput *videoInput1;
AVCaptureInput *videoInput2;
AVCaptureVideoDataOutput *videoOutput1;
AVCaptureVideoDataOutput *videoOutput2;
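For completeness, here is one way those objects might be created (a sketch, assuming at least two cameras are attached and using the classic AVCaptureDevice discovery API; error handling is minimal):

```objectivec
// Sketch: create the session, two device inputs, and two data outputs.
// Assumes the machine has at least two video capture devices attached.
NSArray *devices = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
NSError *error = nil;

AVCaptureSession *session = [[AVCaptureSession alloc] init];

AVCaptureInput *videoInput1 =
    [AVCaptureDeviceInput deviceInputWithDevice:devices[0] error:&error];
AVCaptureInput *videoInput2 =
    [AVCaptureDeviceInput deviceInputWithDevice:devices[1] error:&error];

[session addInput:videoInput1];
[session addInput:videoInput2];

AVCaptureVideoDataOutput *videoOutput1 = [[AVCaptureVideoDataOutput alloc] init];
AVCaptureVideoDataOutput *videoOutput2 = [[AVCaptureVideoDataOutput alloc] init];
```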
Do NOT add the outputs like this:
[session addOutput:videoOutput1];
[session addOutput:videoOutput2];
Instead, add them and tell the session not to make any connections:
[session addOutputWithNoConnections:videoOutput1];
[session addOutputWithNoConnections:videoOutput2];
Then, for each input/output pair, make the connection from the input's video port to the output manually:
for (AVCaptureInputPort *port in [videoInput1 ports]) {
    if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
        AVCaptureConnection *cxn =
            [AVCaptureConnection connectionWithInputPorts:@[port]
                                                   output:videoOutput1];
        if ([session canAddConnection:cxn]) {
            [session addConnection:cxn];
        }
        break;
    }
}
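The same loop has to run again for the videoInput2/videoOutput2 pair. One way to avoid duplicating it is a small helper function (ConnectVideoInputToOutput is a hypothetical name, not an AVFoundation API):

```objectivec
// Helper sketch: connect the first video port of an input to an output.
static void ConnectVideoInputToOutput(AVCaptureSession *session,
                                      AVCaptureInput *input,
                                      AVCaptureVideoDataOutput *output)
{
    for (AVCaptureInputPort *port in [input ports]) {
        if ([[port mediaType] isEqualToString:AVMediaTypeVideo]) {
            AVCaptureConnection *cxn =
                [AVCaptureConnection connectionWithInputPorts:@[port]
                                                       output:output];
            if ([session canAddConnection:cxn]) {
                [session addConnection:cxn];
            }
            break;
        }
    }
}

// Usage:
ConnectVideoInputToOutput(session, videoInput1, videoOutput1);
ConnectVideoInputToOutput(session, videoInput2, videoOutput2);
```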
Finally, make sure to set sample buffer delegates for both outputs:
[videoOutput1 setSampleBufferDelegate:self queue:someDispatchQueue];
[videoOutput2 setSampleBufferDelegate:self queue:someDispatchQueue];
Now you should be able to process frames from both devices:
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    if (captureOutput == videoOutput1) {
        // Handle frames from the first device.
    } else if (captureOutput == videoOutput2) {
        // Handle frames from the second device.
    }
}
See also the AVVideoWall sample project for an example of combining live previews from multiple video devices.