
objective-c – Encoding an H.264 compression session with CGDisplayStream

Published: 2020-12-16 06:53:14 | Source: compiled from the web
I'm trying to create an H.264 compression session with the data from my screen. I've created a CGDisplayStreamRef instance like so:

displayStream = CGDisplayStreamCreateWithDispatchQueue(0, 100, 100, k32BGRAPixelFormat, nil, self.screenCaptureQueue, ^(CGDisplayStreamFrameStatus status, uint64_t displayTime, IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
    // Call encoding session here
});

Here is how I currently have my encode function set up:

- (void)encode:(CMSampleBufferRef)sampleBuffer {
    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CMTime presentationTimeStamp = CMTimeMake(frameID++, 1000);
    VTEncodeInfoFlags flags;
    OSStatus statusCode = VTCompressionSessionEncodeFrame(EncodingSession, imageBuffer, presentationTimeStamp, kCMTimeInvalid, NULL, NULL, &flags);
    if (statusCode != noErr) {
        NSLog(@"H264: VTCompressionSessionEncodeFrame failed with %d", (int)statusCode);

        VTCompressionSessionInvalidate(EncodingSession);
        CFRelease(EncodingSession);
        EncodingSession = NULL;
        return;
    }
    NSLog(@"H264: VTCompressionSessionEncodeFrame Success");
}
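(For context: the `EncodingSession` used above is a `VTCompressionSessionRef`, whose creation the question does not show. A minimal setup sketch is given below; the `didCompressH264` callback name, the 100×100 dimensions, and the property choices are assumptions for illustration, not the asker's actual code.)

```objc
#import <VideoToolbox/VideoToolbox.h>

// Hypothetical output callback: VideoToolbox delivers each compressed frame here.
static void didCompressH264(void *outputCallbackRefCon, void *sourceFrameRefCon,
                            OSStatus status, VTEncodeInfoFlags infoFlags,
                            CMSampleBufferRef sampleBuffer) {
    if (status == noErr && sampleBuffer != NULL) {
        // Extract the NAL units / write the compressed data out here.
    }
}

// ...somewhere during setup:
VTCompressionSessionRef EncodingSession = NULL;
OSStatus status = VTCompressionSessionCreate(NULL,
                                             100, 100,                // frame width/height
                                             kCMVideoCodecType_H264,
                                             NULL, NULL, NULL,        // default encoder/allocators
                                             didCompressH264,
                                             NULL,                    // callback refcon
                                             &EncodingSession);
if (status == noErr) {
    VTSessionSetProperty(EncodingSession, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
    VTCompressionSessionPrepareToEncodeFrames(EncodingSession);
}
```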

I'm trying to understand how I can convert the data from my screen into a CMSampleBufferRef so that I can call my encode function properly. So far I haven't been able to determine whether this is possible, or whether it is the right approach for what I'm trying to do. Does anyone have any suggestions?

Edit: I've converted my IOSurface into a CMBlockBuffer, but haven't yet figured out how to convert that into a CMSampleBufferRef:

void *mem = IOSurfaceGetBaseAddress(frameSurface);
size_t bytesPerRow = IOSurfaceGetBytesPerRow(frameSurface);
size_t height = IOSurfaceGetHeight(frameSurface);
size_t totalBytes = bytesPerRow * height;

CMBlockBufferRef blockBuffer;

CMBlockBufferCreateWithMemoryBlock(kCFAllocatorNull, mem, totalBytes, kCFAllocatorNull, NULL, 0, totalBytes, 0, &blockBuffer);

Edit 2:

Getting a bit further:

CMSampleBufferRef sampleBuffer;

OSStatus sampleStatus = CMSampleBufferCreate(
                             NULL, blockBuffer, TRUE, NULL, NULL, NULL,
                             1, 0, NULL, 0, NULL, &sampleBuffer);

[self encode:sampleBuffer];

Solution

I may be a bit late, but it might still help others:

CGDisplayStreamCreateWithDispatchQueue(CGMainDisplayID(), width, height, k32BGRAPixelFormat, nil, queue, ^(CGDisplayStreamFrameStatus status, uint64_t displayTime, IOSurfaceRef frameSurface, CGDisplayStreamUpdateRef updateRef) {
    // The created pixel buffer retains the surface object.
    CVPixelBufferRef pixelBuffer;
    CVPixelBufferCreateWithIOSurface(NULL, frameSurface, NULL, &pixelBuffer);

    // Create the video-type-specific description for the pixel buffer.
    CMVideoFormatDescriptionRef videoFormatDescription;
    CMVideoFormatDescriptionCreateForImageBuffer(NULL, pixelBuffer, &videoFormatDescription);

    // All the necessary parts for creating a `CMSampleBuffer` are ready.
    CMSampleBufferRef sampleBuffer;
    CMSampleTimingInfo timingInfo = {kCMTimeInvalid, kCMTimeInvalid, kCMTimeInvalid}; // fill in real timing as needed
    CMSampleBufferCreateReadyWithImageBuffer(NULL, pixelBuffer, videoFormatDescription, &timingInfo, &sampleBuffer);

    // Do the stuff

    // Release the resources to let the frame surface be reused in the queue.
    // `kCGDisplayStreamQueueDepth` is responsible for the size of the queue.
    CFRelease(sampleBuffer);
    CFRelease(videoFormatDescription);
    CFRelease(pixelBuffer);
});
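Note that if the goal is only to feed `VTCompressionSessionEncodeFrame`, the `CMSampleBuffer` detour isn't strictly necessary: that function takes a `CVImageBufferRef` directly, so the pixel buffer wrapping the `IOSurface` can be encoded as-is. A sketch of that shortcut inside the display-stream handler, assuming the `EncodingSession` and `frameID` from the question:

```objc
// Wrap the captured surface in a pixel buffer (retains the surface).
CVPixelBufferRef pixelBuffer;
CVPixelBufferCreateWithIOSurface(NULL, frameSurface, NULL, &pixelBuffer);

// Hand the pixel buffer straight to the compression session.
CMTime pts = CMTimeMake(frameID++, 1000);
VTEncodeInfoFlags flags;
VTCompressionSessionEncodeFrame(EncodingSession, pixelBuffer,
                                pts, kCMTimeInvalid,
                                NULL, NULL, &flags);

// Release so the surface can be reused by the stream's queue.
CVPixelBufferRelease(pixelBuffer);
```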

(Editor: Li Datong)

