Capture EAGLView / render buffer to UIImage data

  • #30880

    JMan
    Member

    Hello,

    I’m working on an augmented reality app that shows video input on a UIView, with a transparent OpenGL ES 2.0 EAGLView on top of it (opaque = NO, and glClear’ed with alpha 0.0). That part works great. I can capture a still of the video input to UIImage data and save it to the iPhone’s photo album, but that obviously doesn’t automagically include the EAGLView’s layer content. So I basically need to capture the OpenGL-rendered frame to UIImage data as well, so I can merge it with the camera still before saving to the photo album. After messing around with this for a day and trying several approaches, I figured I’d ask here, because I think my problem is not being able to connect the technique to the EAGLView in the imgtec SDK.

    Here’s a Q&A article from Apple that explains exactly how to do it:
    http://developer.apple.com/library/ios/#qa/qa1704/_index.html

    I changed the following lines (I use ES 2.0):

    Code:
    // Get the size of the backing CAEAGLLayer (backingWidth and backingHeight are GLint ivars)
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_WIDTH, &backingWidth);
    glGetRenderbufferParameteriv(GL_RENDERBUFFER, GL_RENDERBUFFER_HEIGHT, &backingHeight);

    and removed the first line (the glBindRenderbuffer call), because I’m calling the capture right after the same line in EndRender (‘before’ the presentRenderbuffer, as stressed in the Apple doc), like so:

    Code:
        if(_msaaMaxSamples && _enableMSAA){
            // Resolve the multisampled framebuffer into the regular one
            glDisable(GL_SCISSOR_TEST);
            glBindFramebuffer(GL_READ_FRAMEBUFFER_APPLE, _msaaFrameBuffer);
            glBindFramebuffer(GL_DRAW_FRAMEBUFFER_APPLE, _framebuffer);
            glResolveMultisampleFramebufferAPPLE();
        }

        if(_enableFramebufferDiscard){
            // Hint that the read framebuffer's attachments can be discarded
            GLenum attachments[] = { GL_COLOR_ATTACHMENT0, GL_DEPTH_ATTACHMENT, GL_STENCIL_ATTACHMENT };
            glDiscardFramebufferEXT(GL_READ_FRAMEBUFFER_APPLE, 3, attachments);
        }

        glBindRenderbuffer(GL_RENDERBUFFER, _renderbuffer);

        if (_takeSS) { // _takeSS is a BOOL set when a screenshot has been requested
            [self snapUIImage];
            _takeSS = NO;
        }

        if(![_context presentRenderbuffer:GL_RENDERBUFFER])
            printf("Failed to swap renderbuffer in %s\n", __FUNCTION__);
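
    For reference, here’s roughly what my snapUIImage looks like; it’s essentially the Q&A code with the ES 2.0 changes, reading from whatever framebuffer is currently bound and using the backingWidth/backingHeight ivars from above:

    Code:
    - (void)snapUIImage
    {
        NSInteger dataLength = backingWidth * backingHeight * 4;
        GLubyte *data = (GLubyte *)malloc(dataLength);

        // Read the raw RGBA pixels from the currently bound framebuffer
        glPixelStorei(GL_PACK_ALIGNMENT, 4);
        glReadPixels(0, 0, backingWidth, backingHeight, GL_RGBA, GL_UNSIGNED_BYTE, data);

        // Wrap the pixel data in a CGImage
        CGDataProviderRef ref = CGDataProviderCreateWithData(NULL, data, dataLength, NULL);
        CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();
        CGImageRef iref = CGImageCreate(backingWidth, backingHeight, 8, 32, backingWidth * 4,
                                        colorspace,
                                        kCGBitmapByteOrder32Big | kCGImageAlphaPremultipliedLast,
                                        ref, NULL, true, kCGRenderingIntentDefault);

        // GL renders upside down relative to UIKit, so flip it by drawing
        // the CGImage into a UIKit image context
        UIGraphicsBeginImageContext(CGSizeMake(backingWidth, backingHeight));
        CGContextRef cgcontext = UIGraphicsGetCurrentContext();
        CGContextSetBlendMode(cgcontext, kCGBlendModeCopy);
        CGContextDrawImage(cgcontext, CGRectMake(0.0, 0.0, backingWidth, backingHeight), iref);
        UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();

        UIImageWriteToSavedPhotosAlbum(image, nil, NULL, NULL);

        // Clean up
        CGImageRelease(iref);
        CGColorSpaceRelease(colorspace);
        CGDataProviderRelease(ref);
        free(data);
    }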

    I’ve tried several variations (similar to the code Apple posted) and different buffer bindings and call sites, but I end up with white images in my photo album. Can someone point me in the right direction? I noticed the capture functions in the PVRShell, but I don’t want to save to a BMP file; I need the alpha channel, and the result in UIImage format, to merge with the camera capture.
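
    For context, the merge step I have in mind is just drawing both stills into one image context, roughly like this (cameraImage and glImage stand in for the two captured UIImages):

    Code:
    // Draw the camera still first, then the GL capture on top of it
    UIGraphicsBeginImageContextWithOptions(cameraImage.size, YES, cameraImage.scale);
    CGRect rect = CGRectMake(0, 0, cameraImage.size.width, cameraImage.size.height);
    [cameraImage drawInRect:rect];
    [glImage drawInRect:rect]; // relies on the GL capture keeping its alpha channel
    UIImage *merged = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(merged, nil, NULL, NULL);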

    TIA!

    #35860

    JMan
    Member

    Solved. I just needed to bind the framebuffer explicitly. I wonder if that’s because of the glDiscardFramebufferEXT call.
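
    For anyone else hitting this, the fix was roughly the following, inside snapUIImage right before the glReadPixels (_framebuffer is the resolved framebuffer from EndRender above):

    Code:
    // Rebind the framebuffer explicitly before reading; after the
    // glDiscardFramebufferEXT hint the current read binding can't be relied on
    glBindFramebuffer(GL_FRAMEBUFFER, _framebuffer);
    glReadPixels(0, 0, backingWidth, backingHeight, GL_RGBA, GL_UNSIGNED_BYTE, data);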
