I'm working on a Swift-based macOS application in which I need to capture video input, but not display it on screen... Rather than displaying the video, I want to send the buffered data elsewhere for processing, and eventually display it on an object in a SceneKit scene.
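For context, the eventual display step I have in mind is something along these lines (a sketch only; `planeNode` and the `CVPixelBuffer` → `CIImage` → `NSImage` conversion route are placeholders, not code I have working yet):

```swift
import AppKit
import SceneKit
import CoreImage
import CoreVideo

// Sketch of the eventual goal: paint a captured frame onto a SceneKit node's material.
// `planeNode` is a placeholder for whatever geometry will show the video; the
// pixel-buffer-to-NSImage route shown here is one option, not a settled design.
func display(_ pixelBuffer: CVPixelBuffer, on planeNode: SCNNode) {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let rep = NSCIImageRep(ciImage: ciImage)
    let image = NSImage(size: rep.size)
    image.addRepresentation(rep)
    // SCNMaterialProperty.contents accepts an NSImage (among other types).
    planeNode.geometry?.firstMaterial?.diffuse.contents = image
}
```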
I have a `CameraInput` class with a `prepareCamera` method:
```swift
fileprivate func prepareCamera() {
    self.videoSession = AVCaptureSession()
    self.videoSession.sessionPreset = AVCaptureSession.Preset.photo

    if let devices = AVCaptureDevice.devices() as? [AVCaptureDevice] {
        for device in devices {
            if device.hasMediaType(AVMediaType.video) {
                cameraDevice = device

                if cameraDevice != nil {
                    do {
                        let input = try AVCaptureDeviceInput(device: cameraDevice)
                        if videoSession.canAddInput(input) {
                            videoSession.addInput(input)
                        }
                    } catch {
                        print(error.localizedDescription)
                    }
                }
            }
        }

        let videoOutput = AVCaptureVideoDataOutput()
        videoOutput.setSampleBufferDelegate(self as AVCaptureVideoDataOutputSampleBufferDelegate, queue: DispatchQueue(label: "sample buffer delegate", attributes: []))

        if videoSession.canAddOutput(videoOutput) {
            videoSession.addOutput(videoOutput)
        }
    }
}
```
And a `startSession` method that starts the `AVCaptureSession`:
```swift
fileprivate func startSession() {
    if let videoSession = videoSession {
        if !videoSession.isRunning {
            self.videoInputRunning = true
            videoSession.startRunning()
        }
    }
}
```
I have also implemented `AVCaptureVideoDataOutputSampleBufferDelegate`, where I intend to capture the `CMSampleBuffer` for later use:
```swift
extension CameraInput: AVCaptureVideoDataOutputSampleBufferDelegate {
    internal func captureOutput(_ captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection!) {
        print(Date())
    }
}
```
However, the delegate is never called. Is this a case where I have to display the video output in order for the delegate to be called?