Why is the cropped region different from the selected region in iOS?
Here is a link on GitHub, https://github.com/spennyf/cropVid/tree/master, so you can try it yourself and see what I am talking about; it only takes a minute to test. Thanks!
I am recording a video with a square overlay that shows which part of the video will be cropped, like this:

Right now I am filming a piece of paper, and the square contains 4 lines, with about half a line of difference at the top and the bottom. I then crop the video with the code I will post below, but when I play the result I see this (ignore the background and the green circle):

You can see there are more than four lines, so even though I set it to crop a certain section, it includes more, and that is with the same rectangle that is displayed in the camera and the same rectangle used for the crop. So my question is: why is the cropped size not the same?

Here is how I crop and display:

//this is the square on the camera
UIView *view = [[UIView alloc] initWithFrame:CGRectMake(0, 0, self.view.frame.size.width, self.view.frame.size.height - 80)];
UIImageView *image = [[UIImageView alloc] init];
image.layer.borderColor = [[UIColor whiteColor] CGColor];
image.frame = CGRectMake(self.view.frame.size.width/2 - 58, 100, 116, 116);
CALayer *imageLayer = image.layer;
[imageLayer setBorderWidth:1];
[view addSubview:image];
[picker setCameraOverlayView:view];

//this is the crop rect
CGRect rect = CGRectMake(self.view.frame.size.width/2 - 58, 100, 116, 116);
[self applyCropToVideoWithAsset:assest AtRect:rect OnTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeMakeWithSeconds(assest.duration.value, 1)) ExportToUrl:exportUrl ExistingExportSession:exporter WithCompletion:^(BOOL success, NSError *error, NSURL *videoUrl) {
    //here is the player
    AVPlayer *player = [AVPlayer playerWithURL:videoUrl];
    AVPlayerLayer *layer = [AVPlayerLayer playerLayerWithPlayer:player];
    layer.frame = CGRectMake(self.view.frame.size.width/2 - 58, 100, 116, 116);
}];

Here is the code that determines the video orientation:

- (UIImageOrientation)getVideoOrientationFromAsset:(AVAsset *)asset
{
    AVAssetTrack *videoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    CGSize size = [videoTrack naturalSize];
    CGAffineTransform txf = [videoTrack preferredTransform];

    if (size.width == txf.tx && size.height == txf.ty)
        return UIImageOrientationLeft;  //return UIInterfaceOrientationLandscapeLeft;
    else if (txf.tx == 0 && txf.ty == 0)
        return UIImageOrientationRight; //return UIInterfaceOrientationLandscapeRight;
    else if (txf.tx == 0 && txf.ty == size.width)
        return UIImageOrientationDown;  //return UIInterfaceOrientationPortraitUpsideDown;
    else
        return UIImageOrientationUp;    //return UIInterfaceOrientationPortrait;
}

And here is the rest of the cropping code:

- (AVAssetExportSession*)applyCropToVideoWithAsset:(AVAsset*)asset AtRect:(CGRect)cropRect OnTimeRange:(CMTimeRange)cropTimeRange ExportToUrl:(NSURL*)outputUrl ExistingExportSession:(AVAssetExportSession*)exporter WithCompletion:(void(^)(BOOL success, NSError *error, NSURL *videoUrl))completion
{
    // NSLog(@"CALLED");
    //create an AVAssetTrack with our asset
    AVAssetTrack *clipVideoTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    //create a video composition and preset some settings
    AVMutableVideoComposition *videoComposition = [AVMutableVideoComposition videoComposition];
    videoComposition.frameDuration = CMTimeMake(1, 30);

    CGFloat cropOffX = cropRect.origin.x;
    CGFloat cropOffY = cropRect.origin.y;
    CGFloat cropWidth = cropRect.size.width;
    CGFloat cropHeight = cropRect.size.height;
    // NSLog(@"width: %f - height: %f - x: %f - y: %f", cropWidth, cropHeight, cropOffX, cropOffY);

    videoComposition.renderSize = CGSizeMake(cropWidth, cropHeight);

    //create a video instruction
    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = cropTimeRange;

    AVMutableVideoCompositionLayerInstruction *transformer = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:clipVideoTrack];

    UIImageOrientation videoOrientation = [self getVideoOrientationFromAsset:asset];

    CGAffineTransform t1 = CGAffineTransformIdentity;
    CGAffineTransform t2 = CGAffineTransformIdentity;

    switch (videoOrientation) {
        case UIImageOrientationUp:
            t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.height - cropOffX, 0 - cropOffY);
            t2 = CGAffineTransformRotate(t1, M_PI_2);
            break;
        case UIImageOrientationDown:
            t1 = CGAffineTransformMakeTranslation(0 - cropOffX, clipVideoTrack.naturalSize.width - cropOffY); // not fixed, width is the real height when upside down
            t2 = CGAffineTransformRotate(t1, -M_PI_2);
            break;
        case UIImageOrientationRight:
            t1 = CGAffineTransformMakeTranslation(0 - cropOffX, 0 - cropOffY);
            t2 = CGAffineTransformRotate(t1, 0);
            break;
        case UIImageOrientationLeft:
            t1 = CGAffineTransformMakeTranslation(clipVideoTrack.naturalSize.width - cropOffX, clipVideoTrack.naturalSize.height - cropOffY);
            t2 = CGAffineTransformRotate(t1, M_PI);
            break;
        default:
            NSLog(@"no supported orientation has been found in this video");
            break;
    }

    CGAffineTransform finalTransform = t2;
    [transformer setTransform:finalTransform atTime:kCMTimeZero];

    //add the transformer layer instructions, then add to video composition
    instruction.layerInstructions = [NSArray arrayWithObject:transformer];
    videoComposition.instructions = [NSArray arrayWithObject:instruction];

    //remove any previous video at that path
    [[NSFileManager defaultManager] removeItemAtURL:outputUrl error:nil];

    if (!exporter) {
        exporter = [[AVAssetExportSession alloc] initWithAsset:asset presetName:AVAssetExportPresetHighestQuality];
    }

    //assign all instructions for the video processing (in this case the transformation for cropping the video)
    exporter.videoComposition = videoComposition;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;

    if (outputUrl) {
        exporter.outputURL = outputUrl;
        [exporter exportAsynchronouslyWithCompletionHandler:^{
            switch ([exporter status]) {
                case AVAssetExportSessionStatusFailed:
                    NSLog(@"crop Export failed: %@", [[exporter error] localizedDescription]);
                    if (completion) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            completion(NO, [exporter error], nil);
                        });
                        return;
                    }
                    break;
                case AVAssetExportSessionStatusCancelled:
                    NSLog(@"crop Export canceled");
                    if (completion) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            completion(NO, nil, nil);
                        });
                        return;
                    }
                    break;
                default:
                    break;
            }
            if (completion) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    completion(YES, nil, outputUrl);
                });
            }
        }];
    }

    return exporter;
}

So my question is: why is the video area not the same as the crop/camera area, when I am using a square with exactly the same coordinates and size?
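One thing I am unsure about (this is my own guess, not something shown above): the 116x116 rect is measured in view points on the camera overlay, while applyCropToVideoWithAsset applies cropRect in the video track's pixel coordinates, so maybe it needs to be scaled first. A minimal sketch of the scaling I have in mind, where previewSize is a hypothetical stand-in for the on-screen size of the camera preview (it assumes the preview fills the view and ignores the orientation handling that applyCropToVideoWithAsset already does):

// Hypothetical conversion of the on-screen overlay rect (UIKit points)
// into the video track's pixel coordinate space.
AVAssetTrack *videoTrack = [[assest tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
CGSize videoSize = videoTrack.naturalSize;     // pixel size of the recorded video
CGSize previewSize = self.view.bounds.size;    // assumption: the camera preview fills this view

// Scale factors between the preview (points) and the video (pixels).
CGFloat scaleX = videoSize.width / previewSize.width;
CGFloat scaleY = videoSize.height / previewSize.height;

// The same 116x116 square that is drawn on screen, expressed in video pixels.
CGRect overlayRect = CGRectMake(self.view.frame.size.width/2 - 58, 100, 116, 116);
CGRect videoCropRect = CGRectMake(overlayRect.origin.x * scaleX,
                                  overlayRect.origin.y * scaleY,
                                  overlayRect.size.width * scaleX,
                                  overlayRect.size.height * scaleY);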
Solution

Maybe Check This Previous Question.

It looks like it may be similar to what you are running into. The user on that question suggested cropping this way:

CGImageRef imageRef = CGImageCreateWithImageInRect([originalImage CGImage], cropRect);
UIImage *croppedImage = [UIImage imageWithCGImage:imageRef];
CGImageRelease(imageRef);

I hope this helps, or at least gives you a start in the right direction.
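A slightly fuller sketch of that suggestion, in case it is useful. The helper name croppedImageWithImage:inRect: is hypothetical, and the point it illustrates (which may or may not be the cause here) is that CGImageCreateWithImageInRect works in pixels, so a rect measured in points has to be multiplied by the image's scale first:

// Hypothetical helper based on the suggestion above: crop a UIImage to a rect
// given in points, converting it to pixels for CGImageCreateWithImageInRect.
- (UIImage *)croppedImageWithImage:(UIImage *)originalImage inRect:(CGRect)cropRect
{
    CGFloat scale = originalImage.scale;   // points-to-pixels factor of the image
    CGRect pixelRect = CGRectMake(cropRect.origin.x * scale,
                                  cropRect.origin.y * scale,
                                  cropRect.size.width * scale,
                                  cropRect.size.height * scale);

    CGImageRef imageRef = CGImageCreateWithImageInRect([originalImage CGImage], pixelRect);
    UIImage *croppedImage = [UIImage imageWithCGImage:imageRef
                                                scale:originalImage.scale
                                          orientation:originalImage.imageOrientation];
    CGImageRelease(imageRef);
    return croppedImage;
}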