
I have a baffling problem with an iPhone 4 OpenGL ES app that I have been trying to tackle on and off for a couple of months now and have hit a dead end despite some really useful and tantalising tips and suggestions on this site.

I am writing a 3D game that simply draws blocks and allows the user to move them around into various arrangements. The bulk of the app is written in C++.

My problem is that I am trying to use gluUnProject, for which I found source code here:

http://webcvs.freedesktop.org/mesa/Mesa/src/glu/mesa/project.c?view=markup

I use it to determine the 3D point (and hence the block) selected by the user so that the block can be moved and rotated. I converted the code to single-precision floating point rather than double precision.

Please note that I have compared this source to other versions on the net and it appears to be consistent.

I use the following code to get the ray vector:

Ray RenderingEngine::GetRayVector( vec2 winPos ) const
{
    // Get the last matrices used
    glGetFloatv( GL_MODELVIEW_MATRIX, __modelview ); 
    glGetFloatv( GL_PROJECTION_MATRIX, __projection );
    glGetIntegerv( GL_VIEWPORT, __viewport );

    // Flip the y coordinate
    winPos.y = (float)__viewport[3] - winPos.y;

    // Create vectors to be set
    vec3 nearPoint;
    vec3 farPoint;
    Ray rayVector;

    //Retrieving position projected on near plane
    gluUnProject( winPos.x, winPos.y , 0, 
                 __modelview, __projection, __viewport, 
                  &nearPoint.x, &nearPoint.y, &nearPoint.z);

    //Retrieving position projected on far plane
    gluUnProject( winPos.x, winPos.y,  1, 
                 __modelview, __projection, __viewport, 
                  &farPoint.x, &farPoint.y, &farPoint.z);

    rayVector.nearPoint = nearPoint;
    rayVector.farPoint = farPoint;

    //Return the ray vector
    return rayVector;
}

The vector code for tracing out the returned ray from the near plane to the far plane is straightforward. I find that blocks near the bottom of the screen are correctly identified, but as one moves up the screen there is an increasing discrepancy between the reported y-values and the expected y-values for the points selected.
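To give an idea of how the ray is used, below is a simplified sketch of the kind of ray/box test involved. The BoundingBox struct and the helper are purely illustrative (they are not my exact picking code); the vec3 and Ray types are the ones used above.

#include <cmath>   // fabsf

struct BoundingBox { vec3 min; vec3 max; };

// Illustrative slab test: clip the near->far segment against each axis-aligned
// slab of the box; if a non-empty [tMin, tMax] interval survives, the segment
// passes through the box.
bool RayIntersectsBox( const Ray& ray, const BoundingBox& box )
{
    float origin[3] = { ray.nearPoint.x, ray.nearPoint.y, ray.nearPoint.z };
    float dir[3]    = { ray.farPoint.x - ray.nearPoint.x,
                        ray.farPoint.y - ray.nearPoint.y,
                        ray.farPoint.z - ray.nearPoint.z };
    float lo[3]     = { box.min.x, box.min.y, box.min.z };
    float hi[3]     = { box.max.x, box.max.y, box.max.z };

    float tMin = 0.0f;   // restrict the test to the near..far segment
    float tMax = 1.0f;

    for (int i = 0; i < 3; ++i) {
        if (fabsf(dir[i]) < 1e-6f) {
            // Segment is parallel to this slab; reject if outside it
            if (origin[i] < lo[i] || origin[i] > hi[i]) return false;
        } else {
            float t1 = (lo[i] - origin[i]) / dir[i];
            float t2 = (hi[i] - origin[i]) / dir[i];
            if (t1 > t2) { float tmp = t1; t1 = t2; t2 = tmp; }
            if (t1 > tMin) tMin = t1;
            if (t2 < tMax) tMax = t2;
            if (tMin > tMax) return false;
        }
    }
    return true;
}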

I have also tried using gluProject to manually check which screen coordinates are generated for my world coordinates, as follows:

vec3 RenderingEngine::GetScreenCoordinates( vec3 objectPos ) const
{

    // Get the last matrices used
    glGetFloatv( GL_MODELVIEW_MATRIX, __modelview ); 
    glGetFloatv( GL_PROJECTION_MATRIX, __projection );
    glGetIntegerv( GL_VIEWPORT, __viewport );

    vec3 winPos;

    gluProject(objectPos.x, objectPos.y, objectPos.z , 
                   __modelview, __projection, __viewport, 
                   &winPos.x, &winPos.y, &winPos.z);

    // Swap the y value
    winPos.y = (float)__viewport[3] - winPos.y;

    return winPos;
}  

Again, the results are consistent with the ray tracing approach: the y coordinate returned by gluProject becomes increasingly wrong the higher up the screen the user clicks.

For example, when the clicked position directly reported by the touchesBegan event is (246,190) the calculated position is (246, 215), a y discrepancy of 25.

When the clicked position directly reported by the touchesBegan event is (246,398) the calculated position is (246, 405), a y discrepancy of 7.

The x coordinate seems to be spot on.

I notice that layer.bounds.size.height is reported as 436 while the viewport height is set to 480 (the full screen height). The layer bounds width is reported as 320, which is also the width of the viewport.

The value of 436 seems to be fixed no matter what viewport size I use or whether I display the status bar at the top of the window.

I have tried setting bounds.size.height to 480 before the following call:

[my_context
        renderbufferStorage:GL_RENDERBUFFER
        fromDrawable: eaglLayer];

But this seems to be ignored, and the height is later reported as 436 by the call:

glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES,
                                GL_RENDERBUFFER_HEIGHT_OES, &height);

I have seen some discussion of the difference between points and pixels and the possible need for scaling, but I have struggled to make use of that information: the hints were that the difference is due to the retina display resolution of the iPhone 4 and that different scaling would be required for the simulator and the actual device. However, as far as I can tell, the simulator and the device behave consistently.
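For what it is worth, my understanding is that if scaling were the issue, the conversion from a touch location to OpenGL window coordinates would need to look something like the sketch below. The scale and backingHeight parameters are assumptions on my part (the view's contentScaleFactor and the height reported by glGetRenderbufferParameterivOES respectively); they are not values I currently use, and if they were applied here the y flip against __viewport[3] inside GetRayVector would have to be dropped.

// Sketch only: convert a touch location (UIKit points, origin top-left)
// into OpenGL window coordinates (pixels, origin bottom-left).
// 'scale' would be the view's contentScaleFactor (2 on a retina device),
// 'backingHeight' the renderbuffer height in pixels.
vec2 TouchToWindowCoords( vec2 touchPos, float scale, float backingHeight )
{
    vec2 winPos;
    winPos.x = touchPos.x * scale;
    winPos.y = backingHeight - touchPos.y * scale;
    return winPos;
}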

30-Aug-2011: As I am not getting any feedback on this one, is there more information I can supply to make the question more tractable?

31-Aug-2011: OpenGL setup and display code is as follows:

- (id) initWithCoder:(NSCoder*)coder
{    
    if ((self = [super initWithCoder:coder]))
    {

        // Create OpenGL friendly layer to draw in
        CAEAGLLayer* eaglLayer = (CAEAGLLayer*) self.layer;
        eaglLayer.opaque = YES;

        // eaglLayer.bounds.size.width and eaglLayer.bounds.size.height are 
        // always 320 and 436 at this point

        EAGLRenderingAPI api = kEAGLRenderingAPIOpenGLES1;
        m_context = [[EAGLContext alloc] initWithAPI:api];

        // check have a context
        if (!m_context || ![EAGLContext setCurrentContext:m_context]) {
            [self release];
            return nil;
        }

        glGenRenderbuffersOES(1, &m_colorRenderbuffer);
        glBindRenderbufferOES(GL_RENDERBUFFER_OES, m_colorRenderbuffer);

        [m_context
            renderbufferStorage:GL_RENDERBUFFER
            fromDrawable: eaglLayer];

        UIScreen *scr = [UIScreen mainScreen];
        CGRect rect = scr.applicationFrame;
        int width = CGRectGetWidth(rect);    // Always 320
        int height = CGRectGetHeight(rect);  // Always 480 (status bar not displayed)

        // Initialise the main code
        m_applicationEngine->Initialise(width, height);

            // This is the key c++ code invoked in Initialise call shown here indented

            // Setup viewport
            LowerLeft = ivec2(0,0);
            ViewportSize = ivec2(width,height);

            // Code to create vertex and index buffers not shown here
            // …

            // Extract width and height from the color buffer.
            int width, height;
            glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES,
                                            GL_RENDERBUFFER_WIDTH_OES, &width);
            glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES,
                                            GL_RENDERBUFFER_HEIGHT_OES, &height);

            // Create a depth buffer that has the same size as the color buffer.
            glGenRenderbuffersOES(1, &m_depthRenderbuffer);
            glBindRenderbufferOES(GL_RENDERBUFFER_OES, m_depthRenderbuffer);
            glRenderbufferStorageOES(GL_RENDERBUFFER_OES, GL_DEPTH_COMPONENT16_OES,
                                     width, height);

            // Create the framebuffer object.
            GLuint framebuffer;
            glGenFramebuffersOES(1, &framebuffer);
            glBindFramebufferOES(GL_FRAMEBUFFER_OES, framebuffer);
            glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_COLOR_ATTACHMENT0_OES,
                                         GL_RENDERBUFFER_OES, m_colorRenderbuffer);
            glFramebufferRenderbufferOES(GL_FRAMEBUFFER_OES, GL_DEPTH_ATTACHMENT_OES,
                                     GL_RENDERBUFFER_OES, m_depthRenderbuffer);
            glBindRenderbufferOES(GL_RENDERBUFFER_OES, m_colorRenderbuffer);


            // Set up various GL states.
            glEnableClientState(GL_VERTEX_ARRAY);
            glEnableClientState(GL_NORMAL_ARRAY);
            glEnable(GL_LIGHTING);
            glEnable(GL_LIGHT0);
            glEnable(GL_DEPTH_TEST);

        // ...Back in initWithCoder

        // Do those things which need to happen when the main code is reset
        m_applicationEngine->Reset();

            // This is the key c++ code invoked in Reset call shown here indented

            // Set initial camera position where
            // eye=(0.7,8,-8), m_target=(0,4,0), CAMERA_UP=(0,-1,0)
            m_main_camera = mat4::LookAt(eye, m_target, CAMERA_UP); 


        // ...Back in initWithCoder
        [self drawView: nil];
        m_timestamp = CACurrentMediaTime();

        // Create timer object that allows application to synchronise its 
        // drawing to the refresh rate of the display.
        CADisplayLink* displayLink;
        displayLink = [CADisplayLink displayLinkWithTarget:self
                                 selector:@selector(drawView:)];

        [displayLink addToRunLoop:[NSRunLoop currentRunLoop]
                 forMode:NSDefaultRunLoopMode];
    }
    return self;
}



- (void) drawView: (CADisplayLink*) displayLink
{

    if (displayLink != nil) {

        // Invoke main rendering code
        m_applicationEngine->Render();

            // This is the key c++ code invoked in Render call shown here indented

            // Do the background
            glClearColor(1.0f, 1.0f, 1.0f, 1);
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

            // A set of objects are provided to this method
            // for each one (called visual below) do the following:

            // Set the viewport transform.
            glViewport(LowerLeft.x, LowerLeft.y, ViewportSize.x, ViewportSize.y);

            // Set the model view and projection transforms
            // Frustum(T left, T right, T bottom, T top, T near, T far)
            float h = 4.0f * size.y / size.x;
            mat4 modelview = visual->Rotation * visual->Translation * m_main_camera;
            mat4 projection = mat4::Frustum(-1.5, 1.5, h/2, -h/2, 4, 14);

            // Load the model view matrix and initialise
            glMatrixMode(GL_MODELVIEW);
            glLoadIdentity();
            glLoadMatrixf(modelview.Pointer());

            glMatrixMode(GL_PROJECTION);
            glLoadIdentity();
            glLoadMatrixf(projection.Pointer());

            // Draw the surface - code not shown
            // …    

        // ...Back in drawView
        [m_context presentRenderbuffer:GL_RENDERBUFFER];
    }
}

2 Answers


When the view holding the renderer is resized, it is notified in this way:

- (void) layoutSubviews
{
  [renderer resizeFromLayer:(CAEAGLLayer*)self.layer];
  [self drawView:nil];
}

- (BOOL) resizeFromLayer:(CAEAGLLayer *)layer
{   
    // Allocate color buffer backing based on the current layer size
  glBindRenderbufferOES(GL_RENDERBUFFER_OES, colorRenderBuffer);
  [context renderbufferStorage:GL_RENDERBUFFER_OES fromDrawable:layer];
  glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_WIDTH_OES, &backingWidth);
  glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES, GL_RENDERBUFFER_HEIGHT_OES, &backingHeight);

  if (glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES) != GL_FRAMEBUFFER_COMPLETE_OES)
  {
    NSLog(@"Failed to make complete framebuffer object %x", glCheckFramebufferStatusOES(GL_FRAMEBUFFER_OES));
    return NO;
  }

  [self recreatePerspectiveProjectionMatrix];
  return YES;
}

Note that because the viewport size has changed, the perspective matrix should be recreated accordingly; this will also affect the unproject results.
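For example, recreating the projection could look something like the sketch below. It is only a sketch: it mirrors the mat4::Frustum call from your Render code, with the aspect ratio taken from the backing width and height read in resizeFromLayer.

// Sketch: rebuild the viewport and projection from the new backing size,
// using the same Frustum parameters as the question's Render code.
void RecreateProjection( int backingWidth, int backingHeight )
{
    glViewport( 0, 0, backingWidth, backingHeight );

    float h = 4.0f * (float)backingHeight / (float)backingWidth;
    mat4 projection = mat4::Frustum( -1.5f, 1.5f, h / 2, -h / 2, 4, 14 );

    glMatrixMode( GL_PROJECTION );
    glLoadMatrixf( projection.Pointer() );
    glMatrixMode( GL_MODELVIEW );
}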

Related to the scale issue:

Inside the view initialisation, get the scale factor:

    CGFloat scale = 1;
    if ([self respondsToSelector:@selector(setContentScaleFactor:)])
    {
        self.contentScaleFactor = [[UIScreen mainScreen] scale];
        scale = self.contentScaleFactor;
    }

The view size is actually the same on standard and retina displays, 320 points wide, but on a retina display the render layer size is doubled to 640 pixels. The scale factor should be taken into account when converting between OpenGL renderer space and view space.

Added: try changing the order in which the width and height parameters are obtained and set in the initialisation code.

Instead of this:

    int width = CGRectGetWidth(rect);    // Always 320
    int height = CGRectGetHeight(rect);  // Always 480 (status bar not displayed)

    // Initialise the main code
    m_applicationEngine->Initialise(width, height);

        // This is the key c++ code invoked in Initialise call shown here indented

        // Setup viewport
        LowerLeft = ivec2(0,0);
        ViewportSize = ivec2(width,height);

        // Code to create vertex and index buffers not shown here
        // …

        // Extract width and height from the color buffer.
        int width, height;
        glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES,
                                        GL_RENDERBUFFER_WIDTH_OES, &width);
        glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES,
                                        GL_RENDERBUFFER_HEIGHT_OES, &height);

Try this order (do not use the dimensions from the view):

        // Code to create vertex and index buffers not shown here
        // …

        // Extract width and height from the color buffer.
        int width, height;
        glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES,
                                        GL_RENDERBUFFER_WIDTH_OES, &width);
        glGetRenderbufferParameterivOES(GL_RENDERBUFFER_OES,
                                        GL_RENDERBUFFER_HEIGHT_OES, &height);

    // Initialise the main code
    m_applicationEngine->Initialise(width, height);

        // This is the key c++ code invoked in Initialise call shown here indented

        // Setup viewport
        LowerLeft = ivec2(0,0);
        ViewportSize = ivec2(width,height);

Also make sure you have set the UIViewController property for full-screen layout:

self.wantsFullScreenLayout = YES;

After that, on the iPhone 4 the width and height should be exactly 640x960 and contentScaleFactor should be 2.

Also note that layoutSubviews is a standard UIView method, and it is the only place where I get the screen size and adjust the projection/frustum matrix.

Answered 2011-08-30T14:28:53.787

Hmm... I feel a bit silly now...

The problem is that the view I was using was actually 436 pixels high, something I set long ago when trying to leave room for a common navigation bar on the main window, which I no longer use.

Setting it back to 480 solved the problem.
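As a rough sanity check (assuming the 436-pixel-high layer was being stretched over the 480-point screen), this seems consistent with the discrepancies I reported above: a touch at y = 190 would be expected to come back as roughly 480 - (436/480) * (480 - 190) ≈ 217 against the 215 I saw, and a touch at y = 398 as roughly 480 - (436/480) * (480 - 398) ≈ 406 against the observed 405.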

Apologies to those who looked at this, and especially to those who responded.

After months of frustration, I am now going to go and put myself out of my misery!

Answered 2011-09-02T10:52:59.957