
Right now I'm running AVDepthPhotoFilter and rendering depth data from the iPhone 7 Plus dual camera.

So I want to access the per-pixel depth data, but I don't know how to do that. Please advise.


1 Answer


How to get the DepthData and analyze the CVPixelBuffer data

  1. You need to make sure your AVCapturePhotoSettings() has isDepthDataDeliveryEnabled = true (a minimal configuration sketch follows after the code block below)

  2. You have to implement the delegate method func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?)

    import AVFoundation

    func photoOutput(_ output: AVCapturePhotoOutput, didFinishProcessingPhoto photo: AVCapturePhoto, error: Error?) {

        //## Convert Disparity to Depth ##

        guard let depthData = photo.depthData?.converting(toDepthDataType: kCVPixelFormatType_DepthFloat32) else { return }
        let depthDataMap = depthData.depthDataMap //AVDepthData -> CVPixelBuffer

        //## Data Analysis ##

        // Useful data
        let width = CVPixelBufferGetWidth(depthDataMap) //768 on an iPhone 7+
        let height = CVPixelBufferGetHeight(depthDataMap) //576 on an iPhone 7+
        CVPixelBufferLockBaseAddress(depthDataMap, CVPixelBufferLockFlags(rawValue: 0))

        // Convert the base address to a pointer of the appropriate type
        let floatBuffer = unsafeBitCast(CVPixelBufferGetBaseAddress(depthDataMap), to: UnsafeMutablePointer<Float32>.self)

        // Read the data (returns a value of type Float32, in meters)
        // Accessible values: width * height = 768 * 576; the buffer is row-major,
        // so pixel (x, y) lives at index y * width + x
        let x = width / 2, y = height / 2 // e.g. sample the center pixel
        let distanceAtXYPoint = floatBuffer[y * width + x]
        print("Depth at (\(x), \(y)): \(distanceAtXYPoint) m")

        CVPixelBufferUnlockBaseAddress(depthDataMap, CVPixelBufferLockFlags(rawValue: 0))
    }


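For step 1, here is a minimal capture-side sketch. The names are illustrative: it assumes `photoOutput` is an AVCapturePhotoOutput already added to a configured AVCaptureSession whose input is the dual camera, and `delegate` is whatever object implements the method from step 2.

    import AVFoundation

    // Request a photo with depth data delivery enabled (step 1)
    func captureDepthPhoto(with photoOutput: AVCapturePhotoOutput,
                           delegate: AVCapturePhotoCaptureDelegate) {
        // Depth delivery must be enabled on the output before it can be requested per photo
        guard photoOutput.isDepthDataDeliverySupported else { return }
        photoOutput.isDepthDataDeliveryEnabled = true

        let settings = AVCapturePhotoSettings()
        settings.isDepthDataDeliveryEnabled = true // step 1

        // The delegate's photoOutput(_:didFinishProcessingPhoto:error:) will receive photo.depthData
        photoOutput.capturePhoto(with: settings, delegate: delegate)
    }
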
If you want to learn more about CVPixelBuffer analysis, here is a useful post -> details
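
As one example of that kind of analysis (my own sketch, not taken from the linked post), the function below walks the whole Float32 depth map and returns the nearest and farthest valid depths, honoring CVPixelBufferGetBytesPerRow instead of assuming the rows are tightly packed:

    import AVFoundation

    // Scan a kCVPixelFormatType_DepthFloat32 depth map and report the min/max depth in meters
    func depthRange(of depthDataMap: CVPixelBuffer) -> (min: Float32, max: Float32)? {
        CVPixelBufferLockBaseAddress(depthDataMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthDataMap, .readOnly) }

        guard let baseAddress = CVPixelBufferGetBaseAddress(depthDataMap) else { return nil }
        let width = CVPixelBufferGetWidth(depthDataMap)
        let height = CVPixelBufferGetHeight(depthDataMap)
        let bytesPerRow = CVPixelBufferGetBytesPerRow(depthDataMap) // may be larger than width * 4

        var minDepth = Float32.greatestFiniteMagnitude
        var maxDepth: Float32 = 0

        for y in 0..<height {
            // Each row starts at baseAddress + y * bytesPerRow, not at y * width * 4
            let rowPointer = (baseAddress + y * bytesPerRow).assumingMemoryBound(to: Float32.self)
            for x in 0..<width {
                let depth = rowPointer[x]
                if depth.isFinite && depth > 0 { // skip NaN / invalid pixels
                    minDepth = min(minDepth, depth)
                    maxDepth = max(maxDepth, depth)
                }
            }
        }
        return maxDepth > 0 ? (minDepth, maxDepth) : nil
    }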

Answered 2017-10-20T11:47:46.577