Welcome to OStack Knowledge Sharing Community for programmer and developer-Open, Learning and Share

iPhone - Performance issues when using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput

I'm having lag issues when recording audio+video using AVCaptureVideoDataOutput and AVCaptureAudioDataOutput. Sometimes the video freezes for a few milliseconds, and sometimes the audio is out of sync with the video.

I inserted some logs and observed that I first get a lot of video buffers in the captureOutput callback, and only after some time do the audio buffers arrive (sometimes I don't receive the audio buffers at all, and the resulting video has no sound). If I comment out the code that handles the video buffers, I receive the audio buffers without problems.
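(For reference, this is roughly the kind of diagnostic logging I used; a sketch, not the actual logging code, which just tags each buffer with its output and presentation time:)

```objc
// Diagnostic sketch placed at the top of the delegate callback:
// log which output delivered the buffer and its presentation time,
// to see how far the audio buffers lag behind the video buffers.
CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
NSLog(@"%@ buffer at %.3f s",
      (captureOutput == self._videoOutput) ? @"video" : @"audio",
      CMTimeGetSeconds(pts));
```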

This is the code I'm using:

-(void)initMovieOutput:(AVCaptureSession *)captureSessionLocal
{   
    AVCaptureVideoDataOutput *dataOutput = [[AVCaptureVideoDataOutput alloc] init];
    self._videoOutput = dataOutput;
    [dataOutput release];

    self._videoOutput.alwaysDiscardsLateVideoFrames = NO;
    self._videoOutput.videoSettings = [NSDictionary dictionaryWithObject: [NSNumber numberWithInt:kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange]
                                                             forKey:(id)kCVPixelBufferPixelFormatTypeKey
                                  ];     
    AVCaptureAudioDataOutput *audioOutput =  [[AVCaptureAudioDataOutput alloc] init];
    self._audioOutput = audioOutput;
    [audioOutput release];

    [captureSessionLocal addOutput:self._videoOutput];
    [captureSessionLocal addOutput:self._audioOutput];


    // Setup the queue
    dispatch_queue_t queue = dispatch_queue_create("MyQueue", NULL);
    [self._videoOutput setSampleBufferDelegate:self queue:queue];
    [self._audioOutput setSampleBufferDelegate:self queue:queue];
    dispatch_release(queue);
}

Here I set up the writer:

-(BOOL) setupWriter:(NSURL *)videoURL session:(AVCaptureSession *)captureSessionLocal
{
    NSError *error = nil;
    self._videoWriter = [[AVAssetWriter alloc] initWithURL:videoURL fileType:AVFileTypeQuickTimeMovie
                                                         error:&error];
    NSParameterAssert(self._videoWriter);


    // Add video input  
    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:640], AVVideoWidthKey,
                                   [NSNumber numberWithInt:480], AVVideoHeightKey,
                                   nil];

    self._videoWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                         outputSettings:videoSettings];


    NSParameterAssert(self._videoWriterInput);
    self._videoWriterInput.expectsMediaDataInRealTime = YES;
    self._videoWriterInput.transform = [self returnOrientation];

    // Add the audio input
    AudioChannelLayout acl;
    bzero( &acl, sizeof(acl));
    acl.mChannelLayoutTag = kAudioChannelLayoutTag_Mono;


    NSDictionary* audioOutputSettings = nil;          
    // Both types of audio settings I tried caused the output video file to be corrupted.

        // should work on any device, but requires more space
        audioOutputSettings = [ NSDictionary dictionaryWithObjectsAndKeys:                       
                               [ NSNumber numberWithInt: kAudioFormatAppleLossless ], AVFormatIDKey,
                               [ NSNumber numberWithInt: 16 ], AVEncoderBitDepthHintKey,
                               [ NSNumber numberWithFloat: 44100.0 ], AVSampleRateKey,
                               [ NSNumber numberWithInt: 1 ], AVNumberOfChannelsKey,                                      
                               [ NSData dataWithBytes: &acl length: sizeof( acl ) ], AVChannelLayoutKey,
                               nil ];

    self._audioWriterInput = [AVAssetWriterInput 
                                        assetWriterInputWithMediaType: AVMediaTypeAudio 
                                        outputSettings: audioOutputSettings ];

    self._audioWriterInput.expectsMediaDataInRealTime = YES;    

    // add input
    [self._videoWriter addInput:_videoWriterInput];
    [self._videoWriter addInput:_audioWriterInput];

    return YES;
}

And here is the callback:

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{

    if( !CMSampleBufferDataIsReady(sampleBuffer) )
    {
        NSLog( @"sample buffer is not ready. Skipping sample" );
        return;
    }
    if( _videoWriter.status !=  AVAssetWriterStatusCompleted )
    {
        if( _videoWriter.status != AVAssetWriterStatusWriting  )
        {               
            CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer); 
            [_videoWriter startWriting];
            [_videoWriter startSessionAtSourceTime:lastSampleTime];
        }

        if( captureOutput == _videoOutput )
        {
            if( [self._videoWriterInput isReadyForMoreMediaData] )
            {

            [self newVideoSample:sampleBuffer];

            }
        }
        else if( captureOutput == _audioOutput )
        {
            if( [self._audioWriterInput isReadyForMoreMediaData] )
            {

                 [self newAudioSample:sampleBuffer];


            }
        }
    }

}

-(void) newAudioSample:(CMSampleBufferRef)sampleBuffer
{

        if( _videoWriter.status > AVAssetWriterStatusWriting )
        {

            [self NSLogPrint:[NSString stringWithFormat:@"Audio:Warning: writer status is %d", _videoWriter.status]];
            if( _videoWriter.status == AVAssetWriterStatusFailed )
                [self NSLogPrint:[NSString stringWithFormat:@"Audio:Error: %@", _videoWriter.error]];
            return;
        }

        if( ![_audioWriterInput appendSampleBuffer:sampleBuffer] )
            [self NSLogPrint:[NSString stringWithFormat:@"Unable to write to audio input"]];

}

-(void) newVideoSample:(CMSampleBufferRef)sampleBuffer
{
    if( _videoWriter.status > AVAssetWriterStatusWriting )
    {
        [self NSLogPrint:[NSString stringWithFormat:@"Video:Warning: writer status is %d", _videoWriter.status]];
        if( _videoWriter.status == AVAssetWriterStatusFailed )
            [self NSLogPrint:[NSString stringWithFormat:@"Video:Error: %@", _videoWriter.error]];
        return;
    }


    if( ![_videoWriterInput appendSampleBuffer:sampleBuffer] )
        [self NSLogPrint:[NSString stringWithFormat:@"Unable to write to video input"]];
}

Is there something wrong in my code? Why does the video lag? (I'm testing on an iPhone 4, iOS 4.2.1.)


1 Answer


It looks like you are using a single serial queue for both outputs, so the audio callbacks are queued up behind the video callbacks and get starved whenever video processing takes too long. Give each output its own serial queue. (Note that AVFoundation requires the sample buffer delegate queue to be serial, so a concurrent queue is not an option here.)
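A minimal sketch of that change in your initMovieOutput: method (the queue labels are illustrative; only the queue setup differs from your code):

```objc
// One serial queue per output, so slow video handling cannot
// starve the audio callbacks. The delegate queue must be serial,
// so these stay plain serial queues.
dispatch_queue_t videoQueue = dispatch_queue_create("videoQueue", NULL);
dispatch_queue_t audioQueue = dispatch_queue_create("audioQueue", NULL);

[self._videoOutput setSampleBufferDelegate:self queue:videoQueue];
[self._audioOutput setSampleBufferDelegate:self queue:audioQueue];

dispatch_release(videoQueue);
dispatch_release(audioQueue);
```

Since the two callbacks now run concurrently with each other, any state they share (the asset writer, in your code) needs its own synchronization, e.g. a lock or a shared serial writer queue around the appendSampleBuffer: calls. Setting alwaysDiscardsLateVideoFrames to YES on the video output may also help, since it drops late video frames instead of queuing them up behind your processing.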

