
I'm setting up a custom camera with AVCapturePhotoOutput, configured so that each capture delivers a preview buffer (thumbnail) in addition to the main JPEG buffer.
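
For context, the session setup isn't shown in the post; a minimal sketch of what it assumes looks roughly like this (the class name CameraController and configureSession are illustrative, while session, photoOutput and sessionQueue match the capture code below; input wiring and error handling are omitted):

import AVFoundation

class CameraController {
    let session = AVCaptureSession()
    let photoOutput = AVCapturePhotoOutput()
    let sessionQueue = DispatchQueue(label: "session queue")

    func configureSession() {
        session.beginConfiguration()
        session.sessionPreset = AVCaptureSessionPresetPhoto
        // ... add the AVCaptureDeviceInput for the chosen camera here ...
        if session.canAddOutput(photoOutput) {
            session.addOutput(photoOutput)
            // Must be enabled on the output for isHighResolutionPhotoEnabled
            // in AVCapturePhotoSettings to have any effect.
            photoOutput.isHighResolutionCaptureEnabled = true
        }
        session.commitConfiguration()
    }
}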

The problem is that I receive the preview buffer only once (on the first capture); from the second capture onward it is nil, while the main photoSampleBuffer always arrives correctly.

This is how I set up the capture:

func capturePhoto() {

    guard let videoPreviewLayerOrientation = deviceOrientation.videoOrientation else { return }

    sessionQueue.async {
        if let photoOutputConnection = self.photoOutput.connection(withMediaType: AVMediaTypeVideo) {
            photoOutputConnection.videoOrientation = videoPreviewLayerOrientation
        }

        // each photo captured requires a brand new setting object and capture delegate
        let photoSettings = AVCapturePhotoSettings()

        // Capture a JPEG photo with high resolution enabled (flash mode is still a TODO below).
        photoSettings.isHighResolutionPhotoEnabled = true

        //configure to receive a preview image (thumbnail)
        if let previewPixelType = photoSettings.availablePreviewPhotoPixelFormatTypes.first {
            let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String : previewPixelType,
                                 kCVPixelBufferWidthKey as String : NSNumber(value: 160),
                                 kCVPixelBufferHeightKey as String : NSNumber(value: 160)]
            photoSettings.previewPhotoFormat = previewFormat
        }

        // TODO: photoSettings.flashMode = .auto 

        // Use a separate object for the photo capture delegate to isolate each capture life cycle.
        let photoCaptureDelegate = PhotoCaptureDelegate(with: photoSettings, willCapturePhotoAnimation: { [unowned self] in
            // show shutter animation
            self.shutterAnimation()
            }, completed: { [unowned self] (photoCaptureDelegate, photoData, previewThumbnail) in

                self.captureCompleted(photoCaptureDelegate: photoCaptureDelegate, data: photoData, thumbnail: previewThumbnail)
            }
        )
        // The photo output keeps only a weak reference to its capture delegate, so we store the
        // delegate in a dictionary to maintain a strong reference until the capture is completed.
        self.inProgressPhotoCaptureDelegates[photoCaptureDelegate.requestedPhotoSettings.uniqueID] = photoCaptureDelegate
        self.photoOutput.capturePhoto(with: photoSettings, delegate: photoCaptureDelegate)
    }
}
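
The PhotoCaptureDelegate wrapper itself isn't shown in the post; a sketch along the lines of Apple's AVCam sample, which this code appears to be based on, would look roughly like this (the initializer parameters and the thumbnail type are assumptions):

import AVFoundation
import CoreImage

class PhotoCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {

    let requestedPhotoSettings: AVCapturePhotoSettings
    private let willCapturePhotoAnimation: () -> Void
    private let completed: (PhotoCaptureDelegate, Data?, CIImage?) -> Void

    var photoData: Data?
    var photoThumbnail: CIImage?   // the accepted answer below switches this to UIImage

    init(with requestedPhotoSettings: AVCapturePhotoSettings,
         willCapturePhotoAnimation: @escaping () -> Void,
         completed: @escaping (PhotoCaptureDelegate, Data?, CIImage?) -> Void) {
        self.requestedPhotoSettings = requestedPhotoSettings
        self.willCapturePhotoAnimation = willCapturePhotoAnimation
        self.completed = completed
    }

    func capture(_ captureOutput: AVCapturePhotoOutput, willCapturePhotoForResolvedSettings resolvedSettings: AVCaptureResolvedPhotoSettings) {
        // Trigger the shutter animation on capture start.
        willCapturePhotoAnimation()
    }

    // capture(_:didFinishProcessingPhotoSampleBuffer:...) is shown below.

    func capture(_ captureOutput: AVCapturePhotoOutput, didFinishCaptureForResolvedSettings resolvedSettings: AVCaptureResolvedPhotoSettings, error: Error?) {
        // End of this capture's life cycle: hand back the results so the owner can
        // also drop the strong reference kept in inProgressPhotoCaptureDelegates.
        completed(self, photoData, photoThumbnail)
    }
}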

In my PhotoCaptureDelegate (which implements AVCapturePhotoCaptureDelegate):

func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

    if let photoBuffer = photoSampleBuffer {
        // Keep the full-resolution JPEG data.
        photoData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: photoBuffer, previewPhotoSampleBuffer: nil)
    }

    if let previewBuffer = previewPhotoSampleBuffer {
        // Wrap the preview pixel buffer in a CIImage to use as the thumbnail.
        if let pixelBuffer = CMSampleBufferGetImageBuffer(previewBuffer) {
            photoThumbnail = CIImage(cvPixelBuffer: pixelBuffer)
        }
    }
}

What happens is that on the first capture I receive both photoSampleBuffer and previewPhotoSampleBuffer. From the second capture onward I only receive photoSampleBuffer and previewPhotoSampleBuffer is nil, even though when I check resolvedSettings.previewDimensions I get CMVideoDimensions(width: 160, height: 120).

If I switch cameras (front to back) by reconfiguring the capture session, the first capture after the switch is fine, and then again there is no preview buffer. The error parameter in the delegate callback is always nil.

Tested on an iPhone 6 running iOS 10.3.1.


1 Answer


I found a solution, although I don't fully understand how the original code caused the problem.

I changed the conversion of the preview sample buffer in the photo capture delegate so that it goes through a CIContext and produces a UIImage. Previously I just created a CIImage and sent it to the UI (on a different thread); it seems the CIImage holds a reference to the original buffer, and the UI work done on it later somehow interferes with the next capture (again, I don't understand why).

The new code creates a fresh bitmap copy of the image (via a CGImage) and sends that to the UI, so no further processing touches the original buffer.

func capture(_ captureOutput: AVCapturePhotoOutput, didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?, previewPhotoSampleBuffer: CMSampleBuffer?, resolvedSettings: AVCaptureResolvedPhotoSettings, bracketSettings: AVCaptureBracketedStillImageSettings?, error: Error?) {

    if let photoBuffer = photoSampleBuffer {
        photoData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: photoBuffer, previewPhotoSampleBuffer: nil)
    }

    let previewWidth = Int(resolvedSettings.previewDimensions.width)
    let previewHeight = Int(resolvedSettings.previewDimensions.height)

    if let previewBuffer = previewPhotoSampleBuffer {
        if let imageBuffer = CMSampleBufferGetImageBuffer(previewBuffer) {
            let ciImagePreview = CIImage(cvImageBuffer: imageBuffer)
            // Render into a standalone CGImage so the UI never touches the
            // original pixel buffer owned by the capture pipeline.
            let context = CIContext()
            if let cgImagePreview = context.createCGImage(ciImagePreview, from: CGRect(x: 0, y: 0, width: previewWidth, height: previewHeight)) {
                photoThumbnail = UIImage(cgImage: cgImagePreview)
            }
        }
    }
}
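
As a side note on the fix: creating a CIContext is relatively expensive, so the rendering step could be factored into a helper that reuses a single context across captures. A hypothetical sketch of the same idea (makeThumbnail and thumbnailContext are illustrative names, not part of the original answer):

import AVFoundation
import CoreImage
import UIKit

// Reused across captures; creating a CIContext per photo is comparatively costly.
private let thumbnailContext = CIContext()

// Renders the preview pixel buffer into an independent bitmap so nothing
// downstream holds on to the capture pipeline's buffer.
func makeThumbnail(from sampleBuffer: CMSampleBuffer, resolvedSettings: AVCaptureResolvedPhotoSettings) -> UIImage? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return nil }
    let ciImage = CIImage(cvImageBuffer: imageBuffer)
    let rect = CGRect(x: 0, y: 0,
                      width: Int(resolvedSettings.previewDimensions.width),
                      height: Int(resolvedSettings.previewDimensions.height))
    guard let cgImage = thumbnailContext.createCGImage(ciImage, from: rect) else { return nil }
    return UIImage(cgImage: cgImage)   // independent copy of the pixel data
}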