
I've been struggling with this for a few days.

I want to draw a rectangle on a CALayer (an AVCaptureVideoPreviewLayer), which shows the camera's video feed on an iPhone 4.

Here is part of my setup:

    //(in function for initialization)

    - (void)initDevices {
        AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];

        AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
        captureOutput.alwaysDiscardsLateVideoFrames = YES;
        captureOutput.minFrameDuration = CMTimeMake(1, 30);
        dispatch_queue_t queue = dispatch_queue_create("cameraQueue", NULL);
        [captureOutput setSampleBufferDelegate:self queue:queue];
        dispatch_release(queue);

        NSString *key = (NSString *)kCVPixelBufferPixelFormatTypeKey;
        NSNumber *value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA];
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObject:value forKey:key];
        [captureOutput setVideoSettings:videoSettings];

        self.captureSession = [[AVCaptureSession alloc] init];
        [self.captureSession addInput:captureInput];
        [self.captureSession addOutput:captureOutput];
        [self.captureSession setSessionPreset:AVCaptureSessionPresetHigh];

        self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession:self.captureSession];
        self.prevLayer.frame = CGRectMake(0, 0, 400, 400);
        self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
        self.prevLayer.delegate = self;
        [self.view.layer addSublayer:self.prevLayer];
    }

    - (void)captureOutput:(AVCaptureOutput *)captureOutput 
      didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
      fromConnection:(AVCaptureConnection *)connection { 

       [self performSelectorOnMainThread:@selector(drawme) withObject:nil waitUntilDone:YES];
    }

    - (void)drawme {
        [self.prevLayer setNeedsDisplay];
    }

    //delegate function that draws to a CALayer
    - (void)drawLayer:(CALayer*)layer inContext:(CGContextRef)ctx {
     NSLog(@"hello layer!");
     CGContextSetRGBFillColor (ctx, 1, 0, 0, 1);
            CGContextFillRect (ctx, CGRectMake (0, 0, 200, 100 ));
    }

Is this even possible? With my current code I do see the "hello layer" print, but the filled rectangle never appears over the camera feed.

Any help would be great. :)


2 Answers


Alternatively, you can just insert an image: keep using AVCaptureVideoPreviewLayer for the video capture, create another CALayer(), and insert your "custom" layer above the video layer with layer.insertSublayer(..., above: ...). By "custom" I simply mean another CALayer, for example:

layer.contents = spinner.cgImage

Here is a more detailed explanation for Swift.
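The idea above can be sketched in Swift roughly as follows. This is only an illustration: `previewLayer` is assumed to be an already-configured AVCaptureVideoPreviewLayer attached to the view's layer, and `spinner` is whatever UIImage you want to overlay.

```swift
import AVFoundation
import UIKit

/// Adds an image overlay on top of a running camera preview.
/// Assumes `previewLayer` is already a sublayer of `view.layer`.
func addOverlay(to view: UIView,
                above previewLayer: AVCaptureVideoPreviewLayer,
                spinner: UIImage) {
    let overlay = CALayer()
    overlay.frame = CGRect(x: 20, y: 20, width: 100, height: 100)
    // Core Animation renders the image directly; no drawing delegate needed.
    overlay.contents = spinner.cgImage
    // Insert above the video layer so the overlay stays visible over the feed.
    view.layer.insertSublayer(overlay, above: previewLayer)
}
```

Because the overlay is a separate layer, the preview layer keeps rendering video untouched underneath it.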

answered 2017-08-05T03:09:49.430

I think you should add another layer on top of the AVCaptureVideoPreviewLayer; I've modified your example code accordingly. Give it a try.

    //(in function for initialization)

    -(void)initDevices {
   AVCaptureDeviceInput *captureInput = [AVCaptureDeviceInput deviceInputWithDevice:[AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo] error:nil];

       AVCaptureVideoDataOutput *captureOutput = [[AVCaptureVideoDataOutput alloc] init];
       captureOutput.alwaysDiscardsLateVideoFrames = YES; 
       captureOutput.minFrameDuration = CMTimeMake(1, 30);
       dispatch_queue_t queue;
       queue = dispatch_queue_create("cameraQueue", NULL);
       [captureOutput setSampleBufferDelegate:self queue:queue];
       dispatch_release(queue);

       NSString* key = (NSString*)kCVPixelBufferPixelFormatTypeKey; 
       NSNumber* value = [NSNumber numberWithUnsignedInt:kCVPixelFormatType_32BGRA]; 
       NSDictionary* videoSettings = [NSDictionary dictionaryWithObject:value forKey:key]; 
       [captureOutput setVideoSettings:videoSettings]; 
       self.captureSession = [[AVCaptureSession alloc] init];
       [self.captureSession addInput:captureInput];
       [self.captureSession addOutput:captureOutput];
       [self.captureSession setSessionPreset:AVCaptureSessionPresetHigh];   
       self.prevLayer = [AVCaptureVideoPreviewLayer layerWithSession: self.captureSession];
       self.prevLayer.frame = CGRectMake(0, 0, 400, 400); 
       self.prevLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
       [self.view.layer addSublayer:self.prevLayer];

       self.drawLayer = [CAShapeLayer layer];
       CGRect parentBox = [self.prevLayer frame];
       [self.drawLayer setFrame:parentBox];
       [self.drawLayer setDelegate:self];
       [self.drawLayer setNeedsDisplay];
       [self.prevLayer addSublayer:self.drawLayer];
}

- (void)captureOutput:(AVCaptureOutput *)captureOutput 
    didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer 
    fromConnection:(AVCaptureConnection *)connection { 

    [self performSelectorOnMainThread:@selector(drawme) withObject:nil waitUntilDone:YES];
}

- (void)drawme {
    [self.drawLayer setNeedsDisplay];
}

//delegate function that draws to a CALayer
- (void)drawLayer:(CALayer*)layer inContext:(CGContextRef)ctx {
    NSLog(@"hello layer!");
    CGContextSetRGBFillColor (ctx, 1, 0, 0, 1);
    CGContextFillRect (ctx, CGRectMake (0, 0, 200, 100 ));
}
answered 2015-08-20T10:45:53.260