
I'm trying to pick objects in a Bullet physics world, but I only ever seem to pick the floor/ground plane! I'm using the Vuforia SDK and have modified the ImageTargets demo code. I project my touch-screen point into the 3D world with the following code:

    void projectTouchPointsForBullet(QCAR::Vec2F point, QCAR::Vec3F &lineStart,
                                     QCAR::Vec3F &lineEnd, QCAR::Matrix44F &modelViewMatrix)
    {
        // touch point in normalised device coordinates, starting on the near plane (z = -1)
        QCAR::Vec4F normalisedVector((2 * point.data[0] / screenWidth - 1),
                                     (2 * (screenHeight - point.data[1]) / screenHeight - 1),
                                     -1,
                                     1);

        QCAR::Matrix44F modelViewProjection;
        SampleUtils::multiplyMatrix(&projectionMatrix.data[0], &modelViewMatrix.data[0], &modelViewProjection.data[0]);
        QCAR::Matrix44F inversedMatrix = SampleMath::Matrix44FInverse(modelViewProjection);

        // un-project onto the near plane
        QCAR::Vec4F near_point = SampleMath::Vec4FTransform(normalisedVector, inversedMatrix);
        near_point.data[3] = 1.0 / near_point.data[3];
        near_point = QCAR::Vec4F(near_point.data[0] * near_point.data[3],
                                 near_point.data[1] * near_point.data[3],
                                 near_point.data[2] * near_point.data[3], 1);

        normalisedVector.data[2] = 1.0; // z coordinate now 1: un-project onto the far plane
        QCAR::Vec4F far_point = SampleMath::Vec4FTransform(normalisedVector, inversedMatrix);
        far_point.data[3] = 1.0 / far_point.data[3];
        far_point = QCAR::Vec4F(far_point.data[0] * far_point.data[3],
                                far_point.data[1] * far_point.data[3],
                                far_point.data[2] * far_point.data[3], 1);

        lineStart = QCAR::Vec3F(near_point.data[0], near_point.data[1], near_point.data[2]);
        lineEnd = QCAR::Vec3F(far_point.data[0], far_point.data[1], far_point.data[2]);
    }

When I run a ray test in my physics world, I only ever seem to hit the ground plane! Here is the code for the ray-test call:

    QCAR::Vec3F lineStart, lineEnd;
    projectTouchPointsForBullet(QCAR::Vec2F(touch1.tapX, touch1.tapY), lineStart, lineEnd, modelViewMatrix);
    btVector3 btRayFrom = btVector3(lineEnd.data[0], lineEnd.data[1], lineEnd.data[2]);
    btVector3 btRayTo = btVector3(lineStart.data[0], lineStart.data[1], lineStart.data[2]);

    btCollisionWorld::ClosestRayResultCallback rayCallback(btRayFrom, btRayTo);
    dynamicsWorld->rayTest(btRayFrom, btRayTo, rayCallback);
    if (rayCallback.hasHit())
    {
        // my bodies have char* messages attached to them to determine what has been touched
        char* pPhysicsData = reinterpret_cast<char*>(rayCallback.m_collisionObject->getUserPointer());
        btRigidBody* pBody = btRigidBody::upcast(rayCallback.m_collisionObject);
        if (pBody && pPhysicsData)
        {
            LOG("handleTouches:: notifyOnTouchEvent from physics world!!!");
            notifyOnTouchEvent(env, obj, 0, 0, pPhysicsData);
        }
    }
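
For reference, the char* message read back through getUserPointer() in the callback above is the value set with btCollisionObject::setUserPointer() when each body is created. That part isn't shown in the post; the snippet below is only a minimal sketch of how it might look (createTaggedBox, the half extents and the mass value are made-up placeholders):

    #include <btBulletDynamicsCommon.h>

    // Hypothetical helper: build a dynamic box and tag it with a C string so
    // that rayCallback.m_collisionObject->getUserPointer() can identify it later.
    btRigidBody* createTaggedBox(btDiscreteDynamicsWorld* world,
                                 const btVector3& halfExtents,
                                 const btTransform& startTransform,
                                 char* tag)
    {
        btCollisionShape* shape = new btBoxShape(halfExtents);

        btScalar mass = 1.0f;              // non-zero mass -> dynamic body
        btVector3 inertia(0, 0, 0);
        shape->calculateLocalInertia(mass, inertia);

        btDefaultMotionState* motion = new btDefaultMotionState(startTransform);
        btRigidBody::btRigidBodyConstructionInfo info(mass, motion, shape, inertia);

        btRigidBody* body = new btRigidBody(info);
        body->setUserPointer(tag);         // read back in the ray callback
        world->addRigidBody(body);
        return body;
    }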

I know I'm mostly looking from the top down, so I'm bound to hit the ground, and at least that tells me my touches are being projected into the world correctly. But I have objects lying on the ground and I can't seem to touch them! Any pointers would be greatly appreciated :)
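
As an aside (not something from the original post): if the ground plane keeps swallowing the ray hits, Bullet's collision filter groups can keep it out of the ray test while still letting objects rest on it, since ClosestRayResultCallback exposes the filter fields. A rough sketch, assuming hypothetical groundBody/boxBody pointers and made-up group bits:

    // Illustrative filter bits; any unused bits would do.
    #define GROUP_GROUND   (1 << 6)
    #define GROUP_PICKABLE (1 << 7)

    // When adding the bodies, give the ground its own group:
    dynamicsWorld->addRigidBody(groundBody, GROUP_GROUND,   btBroadphaseProxy::AllFilter);
    dynamicsWorld->addRigidBody(boxBody,    GROUP_PICKABLE, btBroadphaseProxy::AllFilter);

    // When ray testing, only accept hits from the pickable group:
    btCollisionWorld::ClosestRayResultCallback rayCallback(btRayFrom, btRayTo);
    rayCallback.m_collisionFilterMask = GROUP_PICKABLE;   // the ground plane is skipped
    dynamicsWorld->rayTest(btRayFrom, btRayTo, rayCallback);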


1 Answer


I found out why I couldn't touch the objects: I scale the objects up when drawing them, so I have to scale the model-view matrix by the same value before projecting the touch point into the 3D world (edit: I also swapped the btRayFrom and btRayTo input coordinates, which is now fixed):

    // top of code
    const float kObjectScale = 100.0f;
    ....
    ...
    // inside touch handler method
    SampleUtils::scalePoseMatrix(kObjectScale, kObjectScale, kObjectScale, &modelViewMatrix.data[0]);
    projectTouchPointsForBullet(QCAR::Vec2F(touch1.tapX, touch1.tapY), lineStart, lineEnd, modelViewMatrix);
    btVector3 btRayFrom = btVector3(lineStart.data[0], lineStart.data[1], lineStart.data[2]);
    btVector3 btRayTo = btVector3(lineEnd.data[0], lineEnd.data[1], lineEnd.data[2]);

My touches are now projected correctly :)

answered 2012-08-23T15:58:02.943