
I'm using ARKit 2 on iOS 12 (16A5288q), building with Xcode 10 beta 6 and running on an iPhone X, and lookAtPoint is always zero.

I'm accessing the face data like this (in Swift):

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }

    faceAnchorsProcessedCount += 1
    let rightEyeTransform: simd_float4x4 = faceAnchor.rightEyeTransform
    let leftEyeTransform:  simd_float4x4 = faceAnchor.leftEyeTransform
    let lookAtPoint:       simd_float3   = faceAnchor.lookAtPoint
}

I'm getting data like this:

rightEyeTransform  simd_float4x4
[ [ 9.999874e-01, 0.000000e+00,  5.010252e-03, -3.208227e-02],
  [ 2.375229e-04, 9.988756e-01, -4.740678e-02,  2.703529e-02],
  [-5.004618e-03, 4.740737e-02,  9.988630e-01,  2.525132e-02],
  [ 0.000000e+00, 0.000000e+00,  0.000000e+00,  1.000000e+00] ]

leftEyeTransform   simd_float4x4
[ [ 9.978353e-01, 0.000000e+00, -6.576237e-02,  3.208223e-02],
  [-3.110934e-03, 9.988804e-01, -4.720329e-02,  2.703534e-02],
  [ 6.568874e-02, 4.730569e-02,  9.967182e-01,  2.525137e-02],
  [ 0.000000e+00, 0.000000e+00,  0.000000e+00,  1.000000e+00] ]

lookAtPoint        simd_float3    (0.000000e+00, 0.000000e+00, 0.000000e+00)

What am I doing wrong? Or is this a known bug?

Update, October 4, 2018: I ran a simple test of lookAtPoint today. I moved my face close to the handset, then farther away, then close again, repeatedly. The minimum z value of lookAtPoint was 38.59 inches and the maximum was 39.17 inches (converted from meters).

The actual distances, measured with a tape measure, were roughly 4.5 inches and 33 inches.

Apple's statement that lookAtPoint "[...] estimat[es] a point, relative to the face, at which the user's eyes are focused" does not appear to hold.
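For reference, the meters-to-inches conversion used in the test above can be sketched like this. The helper is mine, not part of the original code; the sample value is the 0.9802 m that corresponds to the reported ~38.59 in minimum:

// Minimal helper: converts ARKit's meter-based distances to inches.
// Hypothetical, shown only to document the conversion used in the test.
func metersToInches(_ meters: Double) -> Double {
    return meters * 39.3701  // 1 m = 39.3701 in
}

let minZInches = metersToInches(0.9802)  // ≈ 38.59 in
print(minZInches)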


2 Answers


iOS 12 was released today, along with Xcode 10 (replacing the beta versions). I tested lookAtPoint access with these new releases and am now getting populated vectors.

Swift code:

import os.log  // needed for os_log

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let faceAnchor = anchor as? ARFaceAnchor else { return }

    let lookAtPoint: simd_float3 = faceAnchor.lookAtPoint
    os_log("lookAtPoint: %.12f,%.12f,%.12f", type: .debug, lookAtPoint.x, lookAtPoint.y, lookAtPoint.z)
}

Log output:

2018-09-17 16:17:12.097369-0700 EyeSync[512:41060] lookAtPoint: 0.049317009747,-0.004630976822,0.981833696365
2018-09-17 16:17:12.113925-0700 EyeSync[512:41060] lookAtPoint: 0.050239805132,-0.006484962534,0.981752157211
2018-09-17 16:17:12.130867-0700 EyeSync[512:41060] lookAtPoint: 0.051697697490,-0.011350239627,0.981206715107
2018-09-17 16:17:12.147272-0700 EyeSync[512:41060] lookAtPoint: 0.052744854242,-0.012763299979,0.981896817684
2018-09-17 16:17:12.163683-0700 EyeSync[512:41060] lookAtPoint: 0.054889015853,-0.015469233505,0.982917487621
2018-09-17 16:17:12.180636-0700 EyeSync[512:41060] lookAtPoint: 0.056391790509,-0.017265520990,0.983718335629
2018-09-17 16:17:12.197387-0700 EyeSync[512:41060] lookAtPoint: 0.059109147638,-0.018527992070,0.983208477497
2018-09-17 16:17:12.214021-0700 EyeSync[512:41060] lookAtPoint: 0.061453290284,-0.019032688811,0.981536626816
2018-09-17 16:17:12.230689-0700 EyeSync[512:41060] lookAtPoint: 0.063107110560,-0.019657038152,0.978309571743
Answered 2018-09-17T23:27:20.877

Yes, I tried it a month ago and I can say you're absolutely right: at the moment the lookAtPoint instance property doesn't work at all. Right now it always returns float3(0.0, 0.0, 0.0).

My guess is that Apple hasn't implemented it yet (it's still in beta). Eye-gaze detection is an ARKit feature we'll apparently see in the final, stable release of iOS 12.

I don't have a Mac at hand to check this right now, so try the following open instance properties of the open class:

open class ARFaceAnchor: ARTrackable {
    open var leftEyeTransform: simd_float4x4 { get } 
    open var rightEyeTransform: simd_float4x4 { get } 
    open var lookAtPoint: simd_float3 { get }
}
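Once lookAtPoint starts returning non-zero values, it can be mapped from face coordinate space into world space by multiplying through the anchor's transform. A rough sketch (the helper name worldLookAtPoint is mine, not part of ARKit):

import ARKit
import simd

// Hypothetical helper: lifts lookAtPoint (expressed in the face's
// coordinate space) into world coordinates via the anchor transform.
func worldLookAtPoint(for faceAnchor: ARFaceAnchor) -> simd_float3 {
    let local = simd_float4(faceAnchor.lookAtPoint, 1)  // homogeneous point
    let world = faceAnchor.transform * local            // face space -> world space
    return simd_float3(world.x, world.y, world.z)
}

This is the standard homogeneous-coordinate transform; it assumes the anchor's transform is current for the frame in which lookAtPoint was sampled.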

Hope this helps!

Answered 2018-08-24T01:17:33.487