create ghost texture – out of memory

This topic contains 4 replies, has 3 voices, and was last updated by Joe Davis 4 years, 5 months ago.

    #31338

    jmoguill
    Member

    Hi,
    I’m implementing 2D graphics acceleration using OpenGL ES 2, on SGX 543. To implement surface blitting, I’m creating a temporary texture, blitting it to another texture using a framebuffer object, and then destroying the temporary texture. In the log I see “CreatedGhostTexture” each time I delete the temporary texture. Eventually I run out of memory. I tried using glFinish and glFlush but they don’t have any effect. Is there a way to avoid creating a ghost texture? How can I wait for the render to finish before deleting the texture? Are ghost textures deleted? My code is in Java, on Android ICS.

    Thanks,
    Jeff
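    For reference, the pattern Jeff describes looks roughly like this (class and method names are illustrative, not from the thread; assumes a current EGL context, with shader setup and error checking omitted):

```java
import android.opengl.GLES20;
import java.nio.Buffer;

public class TempTextureBlit {
    public static void blitIntoTarget(int targetTex, Buffer pixels,
                                      int width, int height, int fbo) {
        // 1. Create and fill the temporary source texture.
        int[] tmp = new int[1];
        GLES20.glGenTextures(1, tmp, 0);
        GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, tmp[0]);
        GLES20.glTexImage2D(GLES20.GL_TEXTURE_2D, 0, GLES20.GL_RGBA,
                width, height, 0, GLES20.GL_RGBA,
                GLES20.GL_UNSIGNED_BYTE, pixels);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MIN_FILTER, GLES20.GL_NEAREST);
        GLES20.glTexParameteri(GLES20.GL_TEXTURE_2D,
                GLES20.GL_TEXTURE_MAG_FILTER, GLES20.GL_NEAREST);

        // 2. Attach the destination texture to the FBO and draw a
        //    textured quad sampling the temporary texture
        //    (glUseProgram / glDrawArrays omitted).
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo);
        GLES20.glFramebufferTexture2D(GLES20.GL_FRAMEBUFFER,
                GLES20.GL_COLOR_ATTACHMENT0, GLES20.GL_TEXTURE_2D,
                targetTex, 0);

        // 3. Delete the temporary texture. Because the driver defers
        //    rendering, this is the point where "CreatedGhostTexture"
        //    appears in the log: the driver must keep a ghost copy of
        //    the texture alive until the queued render that samples
        //    it has actually executed on the GPU.
        GLES20.glDeleteTextures(1, tmp, 0);
        GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0);
    }
}
```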

    #37646

    Tobias

    Hi Jeff,

    Could you give a snippet of code showing which commands you're submitting to the GPU? Alternatively, if you could send us a PVRTrace file, we can take a look at that. Knowing the exact commands would help us pinpoint the issue. Also, could you let us know which device you're using?

    Thanks,
    Tobias

    #37647

    Joe Davis
    Member

    Hi Jeff,

    Depending on driver configuration, glFinish & glFlush may not behave as you expect. These calls may be ignored so the driver can submit work in the most optimal way possible. There are ways to force the render to complete, but it’s best to only use these techniques during testing/debugging as they incur a significant performance hit and can generally be avoided by altering the design of a rendering engine.
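    To illustrate the debug-only technique Joe mentions: on GLES 2 without fence-sync extensions, a common trick is to read back a pixel from the render target, since glReadPixels cannot return until the GPU has produced the result. This is a sketch, and as Joe says it stalls the pipeline, so it should never ship in production:

```java
import android.opengl.GLES20;
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

// Debug-only: with the FBO still bound, reading back one pixel forces
// the driver to kick and wait for the pending render. After this call
// returns, the temporary texture can be deleted without ghosting.
ByteBuffer px = ByteBuffer.allocateDirect(4).order(ByteOrder.nativeOrder());
GLES20.glReadPixels(0, 0, 1, 1, GLES20.GL_RGBA,
        GLES20.GL_UNSIGNED_BYTE, px);
```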

    As our graphics architecture defers rendering, our driver has to keep texture and buffer data in memory for frames that have been submitted by your application, but not yet been processed by the GPU. This blog post explains why our driver does this and how you can optimize your render to avoid texture ghosting.

    For the use case you’ve outlined, it may be more efficient to compose the images on the CPU before submitting the texture to GL.

    Thanks,
    Joe
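    As a sketch of the CPU-composition approach suggested above: blend the source pixels over the destination in a plain int[] ARGB buffer, then upload the finished buffer once with glTexImage2D (or GLUtils.texImage2D). The helper below is illustrative, not from the thread, and assumes an opaque destination surface:

```java
public class CpuCompose {
    /**
     * Source-over blend of src onto dst, per pixel.
     * Format: non-premultiplied ARGB ints; dst is assumed opaque,
     * so the output alpha is always 0xFF.
     */
    public static void blendOver(int[] dst, int[] src) {
        for (int i = 0; i < dst.length; i++) {
            int sa = (src[i] >>> 24) & 0xFF;
            if (sa == 0) continue;                 // fully transparent source
            if (sa == 255) {                       // fully opaque source
                dst[i] = 0xFF000000 | (src[i] & 0xFFFFFF);
                continue;
            }
            int r = mix(src[i] >>> 16, dst[i] >>> 16, sa);
            int g = mix(src[i] >>> 8,  dst[i] >>> 8,  sa);
            int b = mix(src[i],        dst[i],        sa);
            dst[i] = 0xFF000000 | (r << 16) | (g << 8) | b;
        }
    }

    // Linear interpolation of one 8-bit channel by source alpha.
    private static int mix(int s, int d, int sa) {
        s &= 0xFF;
        d &= 0xFF;
        return (s * sa + d * (255 - sa)) / 255;
    }
}
```

    Composing this way means only one long-lived texture is ever uploaded, so no temporary texture is created or deleted per blit and the driver never needs to ghost anything.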

    #37648

    jmoguill
    Member

    Thanks for the responses. I found a fix. I created a powervr.ini file in the /data folder, with:
    [default]
    FlushBehaviour=2

    #37649

    Joe Davis
    Member

    Hi,

    I would advise against the use of powervr.ini in production builds. Configuring the driver in this way can cause stability issues as the modification completely changes the dynamics of how the driver operates. Also, using the [default] flag means that modifications you’ve made will be applied to all processes on the system – not just your own.

    As mentioned before, I expect CPU image composition to be the most efficient solution. It's much safer to modify the behaviour of your application than to risk breaking the driver.

    Thanks,
    Joe
