Right now I am running AVDepthPhotoFilter to render depth data from the iPhone 7 Plus stereo camera.
I would like to access the per-pixel depth data, but I don't know how to do that. Please advise.
How to get the DepthData and analyze the CVPixelBuffer data
You need to make sure your AVCapturePhotoSettings() has isDepthDataDeliveryEnabled = true.
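As a minimal setup sketch (the captureSession and photoOutput names here are my own assumptions, not from the original post): depth delivery must first be enabled on the AVCapturePhotoOutput itself before it can be requested per capture in the settings.

let photoOutput = AVCapturePhotoOutput()
if captureSession.canAddOutput(photoOutput) {
    captureSession.addOutput(photoOutput)
}
// Enable depth delivery on the output first (only true on supported devices
// such as the iPhone 7 Plus dual camera)
photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported
// Then request depth data for this particular capture
let photoSettings = AVCapturePhotoSettings()
photoSettings.isDepthDataDeliveryEnabled = true
photoOutput.capturePhoto(with: photoSettings, delegate: self)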
You must implement the delegate method func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?):
// Requires import AVFoundation and a class conforming to AVCapturePhotoCaptureDelegate
func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {
    //## Convert Disparity to Depth ##
    guard let depthData = photo.depthData?.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32) else { return }
    let depthDataMap = depthData.depthDataMap //AVDepthData -> CVPixelBuffer

    //## Data Analysis ##
    // Useful data
    let width = CVPixelBufferGetWidth(depthDataMap) //768 on an iPhone 7+
    let height = CVPixelBufferGetHeight(depthDataMap) //576 on an iPhone 7+
    CVPixelBufferLockBaseAddress(depthDataMap, CVPixelBufferLockFlags(rawValue: 0))

    // Convert the base address to a safe pointer of the appropriate type
    let floatBuffer = unsafeBitCast(CVPixelBufferGetBaseAddress(depthDataMap), to: UnsafeMutablePointer<Float32>.self)

    // Read the data (returns a value of type Float32)
    // Valid indices: 0 ..< width * height; the pixel at (x, y) lives at index y * width + x
    let distanceAtXYPoint = floatBuffer[y * width + x]

    CVPixelBufferUnlockBaseAddress(depthDataMap, CVPixelBufferLockFlags(rawValue: 0))
}
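If you need to read every pixel rather than a single sample, a scanning helper along these lines should work (this function is my own addition, not part of the original answer). Note that CVPixelBuffer rows can be padded, so it is safer to step through CVPixelBufferGetBytesPerRow than to assume a dense width * height layout.

func nearestDepth(in depthDataMap: CVPixelBuffer) -> Float32? {
    CVPixelBufferLockBaseAddress(depthDataMap, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(depthDataMap, .readOnly) }
    guard let baseAddress = CVPixelBufferGetBaseAddress(depthDataMap) else { return nil }
    let width = CVPixelBufferGetWidth(depthDataMap)
    let height = CVPixelBufferGetHeight(depthDataMap)
    let bytesPerRow = CVPixelBufferGetBytesPerRow(depthDataMap)
    var minDepth: Float32?
    for y in 0..<height {
        // Step by bytesPerRow because rows may contain padding
        let row = baseAddress.advanced(by: y * bytesPerRow).assumingMemoryBound(to: Float32.self)
        for x in 0..<width {
            let depth = row[x] // distance in meters for a kCVPixelFormatType_DepthFloat32 map
            if depth.isFinite, depth < (minDepth ?? .greatestFiniteMagnitude) {
                minDepth = depth
            }
        }
    }
    return minDepth
}

Calling nearestDepth(in: depthDataMap) inside the delegate method above returns the distance of the closest point in view, or nil if the buffer could not be read.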
If you want to know more about analyzing a CVPixelBuffer, here is a useful post -> details