GL ES 2.0 calls through Win32 DLL not working

This topic contains 3 replies, has 2 voices, and was last updated by Tobias Hector 6 years, 4 months ago.

    #30545

    WOsborn
    Member

    Any reason why a dll calling gl es 2.0 functions would not work?
     
    OGLES2_WINDOWS_X86EMULATION_2.08.28.0634
     
    I made a sample app that creates a window, creates an EGL surface using that window, and draws a quad onscreen.
    I hooked that same code into an OpenGL ES 2.0 renderer DLL, and some gl functions do not get called.
     
    I know this because I used PVRTrace to record the 2 scenarios.
    If you look at the second frame of both traces, you will see that the DLL version is missing:
    glViewport()
    glClearColor()
    glDrawArrays()
     
     
    This is basically what I am doing in both:
     
    // Render a textured quad using client-side vertex arrays
    egl.switch_to(false);

    glViewport(0, 0, size[0], size[1]);
    glClearColor(0.25f, 0, 0, 1);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);

    glBindTexture(GL_TEXTURE_2D, def_tex);

    // Quad corner positions
    static const float X1 = -0.5f;
    static const float Y1 = -0.5f;
    static const float X2 =  0.5f;
    static const float Y2 =  0.5f;

    static GLfloat vertexData[] = {
        X2, Y2,
        X1, Y2,
        X1, Y1,
        X1, Y1,
        X2, Y1,
        X2, Y2,
    };

    // Quad texture coordinates
    static const float uvX1 = 0;
    static const float uvY1 = 0;
    static const float uvX2 = 1;
    static const float uvY2 = 1;

    static GLfloat uvData[] = {
        uvX2, uvY2,
        uvX1, uvY2,
        uvX1, uvY1,
        uvX1, uvY1,
        uvX2, uvY1,
        uvX2, uvY2,
    };

    // Make sure no VBO/IBO is bound; attributes come from client memory
    glBindBuffer(GL_ARRAY_BUFFER, 0);
    glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, 0);

    glUseProgram(ogles2_shaderProgram);

    glEnableVertexAttribArray(ogles2_shader_vLoc);
    glVertexAttribPointer(ogles2_shader_vLoc, 2, GL_FLOAT, GL_FALSE, 0, vertexData);

    glEnableVertexAttribArray(ogles2_shader_uv0Loc);
    glVertexAttribPointer(ogles2_shader_uv0Loc, 2, GL_FLOAT, GL_FALSE, 0, uvData);

    glDrawArrays(GL_TRIANGLES, 0, 6);

    glUseProgram(0);
    glDisableVertexAttribArray(ogles2_shader_vLoc);
    glDisableVertexAttribArray(ogles2_shader_uv0Loc);

    egl.switch_back();
    egl.swap_buffers();
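
    (For context: the egl object above is just a thin wrapper; its calls presumably map onto the standard EGL window-surface setup and present sequence, roughly like the sketch below, where hWnd is the Win32 window handle created by the app and error checking is omitted.)

    #include <EGL/egl.h>

    EGLDisplay display = eglGetDisplay(EGL_DEFAULT_DISPLAY);
    eglInitialize(display, 0, 0);

    const EGLint configAttribs[] = {
        EGL_RED_SIZE, 8, EGL_GREEN_SIZE, 8, EGL_BLUE_SIZE, 8,
        EGL_DEPTH_SIZE, 16,
        EGL_RENDERABLE_TYPE, EGL_OPENGL_ES2_BIT,
        EGL_NONE
    };
    EGLConfig config;
    EGLint numConfigs;
    eglChooseConfig(display, configAttribs, &config, 1, &numConfigs);

    EGLSurface surface = eglCreateWindowSurface(display, config,
                                                (EGLNativeWindowType)hWnd, 0);

    const EGLint contextAttribs[] = { EGL_CONTEXT_CLIENT_VERSION, 2, EGL_NONE };
    EGLContext context = eglCreateContext(display, config, EGL_NO_CONTEXT, contextAttribs);

    // switch_to() presumably makes the context current; swap_buffers() presents
    eglMakeCurrent(display, surface, surface, context);
    // ... gl calls as above ...
    eglSwapBuffers(display, surface);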

     
     

    WOsborn, 2011-06-06 03:22:55

    #34934

    Hi WOsborn,

    Could you post copies of the pvrtrace.cfg files that were used? This is an odd bug which I haven’t seen before; if I can see the .cfg files I may be able to help you.

    Thanks,

    Tobias

    #34935

    WOsborn
    Member
    Here you go… sorry for the delay, my workstation got zapped.
     

    Pretty standard cfg.

     
     
    #34936

    Hi WOsborn,

    OK, I was stumped for a while, but I think I can see what's going on now. In the trace_dll.pvrt case, is it that you've got a dll with some gl calls in it (which are the ones being dropped) and an app that uses this dll while also making its own gl calls (which are still being captured)?

    If that's the case, then what's happening is that your .dll is not linking properly to the trace libraries, and is instead sending its calls directly to the vframe libraries. If all your .dlls (your rendering dll, the trace libraries and your application .exe) are in the same folder, this shouldn't be a problem on Windows. If you've already got it set up like this, then let me know and I'll try to point you to a way to get it working!
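
    One quick way to confirm where your dll's gl calls are actually going is to ask Windows which module one of its GL entry points resolved from. A sketch (the function name is just an example):

    #include <windows.h>
    #include <GLES2/gl2.h>
    #include <cstdio>

    // Prints the full path of the module that this dll's glDrawArrays import
    // resolved to. If it is the vframe/emulation library rather than the
    // PVRTrace recording library, the dll's calls are bypassing the recorder.
    void ReportGLModule()
    {
        HMODULE hMod = 0;
        if (GetModuleHandleExA(GET_MODULE_HANDLE_EX_FLAG_FROM_ADDRESS |
                               GET_MODULE_HANDLE_EX_FLAG_UNCHANGED_REFCOUNT,
                               (LPCSTR)&glDrawArrays, &hMod))
        {
            char path[MAX_PATH] = { 0 };
            GetModuleFileNameA(hMod, path, MAX_PATH);
            printf("glDrawArrays resolved from: %s\n", path);
        }
    }

    If the path doesn't point at the recording libGLESv2.dll sitting next to your .exe, that would confirm the dll is bypassing the trace layer.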

    Please let me know if this is the case and if it works for you!

    Thanks,

    Tobias

    Tobias, 2011-06-16 10:57:45
