2

I have a layerRect that I use to display my camera's image, like this:

// Size the preview layer to videoStreamView's bounds and centre it in the view
CGRect layerRect = [[videoStreamView layer] bounds];
[[[self captureManager] previewLayer] setBounds:layerRect];
[[[self captureManager] previewLayer] setPosition:CGPointMake(CGRectGetMidX(layerRect),
                                                              CGRectGetMidY(layerRect))];
[[videoStreamView layer] addSublayer:[[self captureManager] previewLayer]];

videoStreamView is the view I display the video in, and it is 150x150. But when I use setSampleBufferDelegate on an AVCaptureVideoDataOutput, the video frames I receive are the full camera image (1280x720). How can I change that? Thanks.
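For context, a minimal sketch of the data-output setup being described, assuming the same captureManager/session accessors as the question's code; the method names, queue label, and logging are illustrative only (assumes #import &lt;AVFoundation/AVFoundation.h&gt; and a class adopting AVCaptureVideoDataOutputSampleBufferDelegate):

- (void)setUpVideoDataOutput {
    AVCaptureVideoDataOutput *videoOutput = [[AVCaptureVideoDataOutput alloc] init];
    videoOutput.videoSettings = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [videoOutput setSampleBufferDelegate:self
                                   queue:dispatch_queue_create("video.frames", DISPATCH_QUEUE_SERIAL)];
    if ([[[self captureManager] session] canAddOutput:videoOutput]) {
        [[[self captureManager] session] addOutput:videoOutput];
    }
}

// The delegate receives frames at the session's resolution (1280x720 here),
// independent of the 150x150 preview view.
- (void)captureOutput:(AVCaptureOutput *)output
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection {
    CVImageBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    NSLog(@"frame size: %zu x %zu",
          CVPixelBufferGetWidth(pixelBuffer),
          CVPixelBufferGetHeight(pixelBuffer));
}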


4 Answers

1

I believe this is controlled by AVCaptureSession's sessionPreset property. I am still trying to work out what each preset value means in terms of image size and so on.
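As a rough sketch of that suggestion (the session accessor is borrowed from the question's captureManager; which presets are supported depends on the device):

AVCaptureSession *session = [[self captureManager] session];
[session beginConfiguration];
// A smaller preset (e.g. 640x480) makes the data output deliver smaller buffers
// than the default 1280x720; check support before applying.
if ([session canSetSessionPreset:AVCaptureSessionPreset640x480]) {
    session.sessionPreset = AVCaptureSessionPreset640x480;
}
[session commitConfiguration];

Note there is no 150x150 preset, so the frames would still need to be cropped or scaled after capture to match the view exactly.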

answered 2014-10-30T15:49:14.063
0

Maybe this will solve it?

// Fill the view's bounds with the preview, cropping to preserve aspect ratio
CGRect bounds = view.layer.bounds;
previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
previewLayer.bounds = bounds;
previewLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds));
answered 2013-08-16T11:55:43.783
-1
@property (nonatomic, retain) AVCaptureVideoPreviewLayer *prevLayer;

Then:

self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
self.prevLayer.frame = yourRect;
[self.view.layer addSublayer:self.prevLayer];
answered 2013-08-16T11:39:32.067
-1

Try this code:

AVCaptureVideoPreviewLayer *newCaptureVideoPreviewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:[self.captureManager session]];
// bounds is the rect of the view that should host the preview
newCaptureVideoPreviewLayer.frame = bounds; // or CGRectMake(bounds.origin.x, bounds.origin.y, bounds.size.height, bounds.size.width);
answered 2013-08-16T11:34:00.820