objective-c – AVCaptureVideoDataOutput and AVCaptureAudioDataOutput
I am trying to get a CMSampleBufferRef from both AVCaptureVideoDataOutput and AVCaptureAudioDataOutput.
AVCamRecorder.h

```objective-c
#import <AVFoundation/AVFoundation.h>

@interface AVCamRecorder : NSObject {
}
@property (nonatomic, retain) AVCaptureSession *session;
@property (nonatomic, retain) AVCaptureVideoDataOutput *videoDataOutput;
@property (nonatomic, retain) AVCaptureAudioDataOutput *audioDataOutput;
@end
```

AVCamRecorder.m

```objective-c
#import "AVCamRecorder.h"
#import <AVFoundation/AVFoundation.h>

@interface AVCamRecorder (VideoDataOutputDelegate) <AVCaptureVideoDataOutputSampleBufferDelegate>
@end

@interface AVCamRecorder (AudioDataOutputDelegate) <AVCaptureAudioDataOutputSampleBufferDelegate>
@end

@implementation AVCamRecorder

- (id)initWithSession:(AVCaptureSession *)aSession
{
    self = [super init];
    if (self != nil) {
        // Audio data output
        AVCaptureAudioDataOutput *aAudioDataOutput = [[AVCaptureAudioDataOutput alloc] init];
        // Video data output
        AVCaptureVideoDataOutput *aMovieDataOutput = [[AVCaptureVideoDataOutput alloc] init];

        if ([aSession canAddOutput:aAudioDataOutput]) {
            [aSession addOutput:aAudioDataOutput];
        }
        if ([aSession canAddOutput:aMovieDataOutput]) {
            [aSession addOutput:aMovieDataOutput];
        }

        [aAudioDataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
        [aMovieDataOutput setSampleBufferDelegate:self queue:dispatch_get_main_queue()];

        [self setAudioDataOutput:aAudioDataOutput];
        [self setVideoDataOutput:aMovieDataOutput];
        [self setSession:aSession];
    }
    return self;
}

@end

@implementation AVCamRecorder (VideoDataOutputDelegate)

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"VideoDataOutputDelegate = %@", captureOutput);
}

@end

@implementation AVCamRecorder (AudioDataOutputDelegate)

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    NSLog(@"AudioDataOutputDelegate = %@", captureOutput);
}

@end
```

Strangely, I am receiving video data in "@implementation AVCamRecorder (AudioDataOutputDelegate)":
```
AudioDataOutputDelegate = <AVCaptureVideoDataOutput: 0x208a7df0>
```

I switched the order of "@implementation AVCamRecorder (VideoDataOutputDelegate)" and "@implementation AVCamRecorder (AudioDataOutputDelegate)", and then I got

```
VideoDataOutputDelegate = <AVCaptureVideoDataOutput: 0x208a7df0>
```

It seems I cannot define two "captureOutput:didOutputSampleBuffer:fromConnection:" methods; all of the data ends up in one or the other. Or have I set up "@implementation AVCamRecorder (VideoDataOutputDelegate)" and "@implementation AVCamRecorder (AudioDataOutputDelegate)" incorrectly? I probably don't need separate callbacks, but I would like to understand what is going wrong. Thanks in advance for your help.

Solution
You have defined two categories on the same class,
```objective-c
AVCamRecorder (VideoDataOutputDelegate)
AVCamRecorder (AudioDataOutputDelegate)
```

that declare the same method:

```objective-c
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection;
```

This leads to undefined behavior: only one of the two implementations is used at runtime. See "Avoid Category Method Name Clashes" in Apple's "Programming with Objective-C" guide.
So your setup cannot work as intended. You could instead:

- define two separate classes, one acting as the audio delegate and one as the video delegate,
- define a single class category that acts as both the audio and video delegate (and check inside the callback which output invoked it), or
- simply use AVCamRecorder itself as both the audio and video delegate.
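The last option can be sketched as follows. This is a minimal sketch, not the original poster's code: the class adopts both delegate protocols directly, and because both protocols declare the same callback selector, a single implementation receives both streams and distinguishes them by comparing the `captureOutput` argument against the stored outputs.

```objective-c
#import <AVFoundation/AVFoundation.h>

// One class adopts both sample-buffer delegate protocols. The video and
// audio protocols declare the same selector, so a single method body
// handles both streams; compare captureOutput to tell them apart.
@interface AVCamRecorder : NSObject
    <AVCaptureVideoDataOutputSampleBufferDelegate,
     AVCaptureAudioDataOutputSampleBufferDelegate>
@property (nonatomic, retain) AVCaptureVideoDataOutput *videoDataOutput;
@property (nonatomic, retain) AVCaptureAudioDataOutput *audioDataOutput;
@end

@implementation AVCamRecorder

- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Branch on which output delivered this buffer.
    if (captureOutput == self.videoDataOutput) {
        NSLog(@"video sample buffer: %@", captureOutput);
    } else if (captureOutput == self.audioDataOutput) {
        NSLog(@"audio sample buffer: %@", captureOutput);
    }
}

@end
```

Both outputs would then register the same object with `setSampleBufferDelegate:queue:`, exactly as in the original `initWithSession:`, and the pointer comparison replaces the two clashing category implementations.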