
I am trying to implement ray picking in OpenGL ES 2.0 to determine whether an object has been clicked. So far I am just trying to check whether a specific triangle was pressed. I am using this site as a starting point: http://android-raypick.blogspot.ca/2012/04/first-i-want-to-state-this-is-my-first.html

This is what I have so far:

public void onClick(float x, float y)
{
    float[] temp = new float[4];
    float[] temp2 = new float[4];
    System.out.println("X coordinate: " + x);
    System.out.println("Y coordinate: " + y);

    // Flip y: touch coordinates have their origin at the top left,
    // while GL window coordinates have it at the bottom left.
    y = (float) viewport[3] - y;

    // Unproject the touch point at winZ = 1.0f.
    int res = GLU.gluUnProject(x, y, 1.0f,
            mMVPMatrix, 0,
            mProjectionMatrix, 0,
            viewport, 0,
            temp, 0);

    Matrix.multiplyMV(temp2, 0, mMVPMatrix, 0, temp, 0);
    float[] nearCoOrds = new float[3];

    if (res == GLES20.GL_TRUE)
    {
        // Perspective divide by w to get a 3D point.
        nearCoOrds[0] = temp2[0] / temp2[3];
        nearCoOrds[1] = temp2[1] / temp2[3];
        nearCoOrds[2] = temp2[2] / temp2[3];
        System.out.println("Near0: " + nearCoOrds[0]);
        System.out.println("Near1: " + nearCoOrds[1]);
        System.out.println("Near2: " + nearCoOrds[2]);
    }

    // Unproject the same touch point at winZ = 0.
    res = GLU.gluUnProject(x, y, 0,
            mMVPMatrix, 0,
            mProjectionMatrix, 0,
            viewport, 0,
            temp, 0);

    Matrix.multiplyMV(temp2, 0, mMVPMatrix, 0, temp, 0);
    float[] farCoOrds = new float[3];

    if (res == GLES20.GL_TRUE)
    {
        farCoOrds[0] = temp2[0] / temp2[3];
        farCoOrds[1] = temp2[1] / temp2[3];
        farCoOrds[2] = temp2[2] / temp2[3];
        System.out.println("Far0: " + farCoOrds[0]);
        System.out.println("Far1: " + farCoOrds[1]);
        System.out.println("Far2: " + farCoOrds[2]);
    }

    // Direction of the ray from one point to the other.
    float[] coords = new float[3];
    coords[0] = farCoOrds[0] - nearCoOrds[0];
    coords[1] = farCoOrds[1] - nearCoOrds[1];
    coords[2] = farCoOrds[2] - nearCoOrds[2];

    System.out.println("REAL COORDS 0: " + coords[0]);
    System.out.println("REAL COORDS 1: " + coords[1]);
    System.out.println("REAL COORDS 2: " + coords[2]);
}

The floats x and y are the x and y coordinates of the finger press on the screen. The onClick function is called from MainActivity.
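Roughly how that is wired up (a simplified sketch; mRenderer stands for whatever object holds onClick):

    // Simplified sketch: MainActivity forwards the touch position.
    @Override
    public boolean onTouchEvent(MotionEvent event) {
        if (event.getAction() == MotionEvent.ACTION_DOWN) {
            mRenderer.onClick(event.getX(), event.getY());
        }
        return true;
    }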

 GLU.gluUnProject(x, y, 1.0f, 
            mMVPMatrix, 0,
            mProjectionMatrix, 0,
            viewport, 0, 
            temp, 0);

mMVPMatrix is the model-view matrix, mProjectionMatrix is the projection matrix, and viewport holds the values {0, 0, screenWidth, screenHeight}.
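For reference, the viewport array is just the GL viewport cached in onSurfaceChanged, roughly like this (a simplified sketch):

    // Simplified sketch: cache the viewport for gluUnProject.
    private final int[] viewport = new int[4];

    @Override
    public void onSurfaceChanged(GL10 unused, int width, int height) {
        GLES20.glViewport(0, 0, width, height);
        viewport[0] = 0;        // x
        viewport[1] = 0;        // y
        viewport[2] = width;    // screenWidth
        viewport[3] = height;   // screenHeight
    }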

An example of the output I get (touching the middle of the screen) is:

REAL COORDS 0: -0.21542415
REAL COORDS 1: 0.31117013
REAL COORDS 2: 9.000003

My question is whether I am on the right track here. Do I have the right idea, or have I misunderstood something? Or is there another way to implement touch detection on a triangle?

Thanks for any help or guidance!


2 Answers


There is an excellent Android OpenGL framework called Rajawali. It supports object picking, and the sample code looks straightforward; you should give it a try.
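For a rough idea of what that looks like, here is a sketch based on Rajawali's color-picking example (ObjectColorPicker). Exact class and method names vary between Rajawali versions, so treat this as an outline rather than exact code:

    // Sketch of Rajawali object picking; API details vary by version.
    private ObjectColorPicker mPicker;

    // In the renderer's scene setup:
    mPicker = new ObjectColorPicker(this);
    mPicker.setOnObjectPickedListener(this);
    mPicker.registerObject(mTriangle);   // make the object pickable
    addChild(mTriangle);

    // Forward the touch coordinates, e.g. from onTouchEvent:
    mPicker.getObjectAt(event.getX(), event.getY());

    // The listener callback tells you which object was hit:
    public void onObjectPicked(BaseObject3D object) {
        // object == mTriangle -> the triangle was touched
    }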

answered 2013-01-12T19:33:27.873

I believe you may have misunderstood a bit (or maybe I have^^). The near and far coordinates are used to construct the ray segment you test against your polys/hitboxes. That segment lives in what's called world space, so to test it against your vertices/models you need to convert them from model space (what they are in when loaded) to world space by multiplying them by their mModelView matrix (which holds all of their transforms, rotations, etc.). The article you linked seems to take it through that point^^ Make sense, or have I missed something?

bonus link for you: http://www.siggraph.org/education/materials/HyperGraph/raytrace/rtinter0.htm
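For illustration, here is a minimal sketch of that test: it assumes the triangle vertices v0, v1, v2 have already been transformed into world space (e.g. with Matrix.multiplyMV and the model matrix), and that near/far are the unprojected points from the question. The helper names are mine, and this is the standard Möller-Trumbore intersection rather than code from the linked article:

    // Sketch: does the segment near->far hit the triangle v0/v1/v2?
    // All points are float[3] in world space (Moller-Trumbore test).
    public static boolean rayHitsTriangle(float[] near, float[] far,
                                          float[] v0, float[] v1, float[] v2) {
        float[] dir = sub(far, near);            // segment direction, unnormalized
        float[] e1 = sub(v1, v0);                // triangle edges
        float[] e2 = sub(v2, v0);
        float[] h = cross(dir, e2);
        float a = dot(e1, h);
        if (Math.abs(a) < 1e-6f) return false;   // segment parallel to triangle
        float f = 1.0f / a;
        float[] s = sub(near, v0);
        float u = f * dot(s, h);
        if (u < 0.0f || u > 1.0f) return false;  // misses past edge v0->v1
        float[] q = cross(s, e1);
        float v = f * dot(dir, q);
        if (v < 0.0f || u + v > 1.0f) return false;
        float t = f * dot(e2, q);                // hit position along the segment
        return t >= 0.0f && t <= 1.0f;
    }

    private static float[] sub(float[] a, float[] b) {
        return new float[] { a[0] - b[0], a[1] - b[1], a[2] - b[2] };
    }

    private static float[] cross(float[] a, float[] b) {
        return new float[] { a[1] * b[2] - a[2] * b[1],
                             a[2] * b[0] - a[0] * b[2],
                             a[0] * b[1] - a[1] * b[0] };
    }

    private static float dot(float[] a, float[] b) {
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
    }

Because dir is far - near rather than a normalized direction, the t check conveniently limits hits to points lying between the two unprojected planes.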

answered 2013-01-15T10:47:13.750