OpenGL ES 3.0 integer precision

This topic contains 1 reply, has 2 voices, and was last updated by  Joe Davis 2 years, 7 months ago.

  • #31985

    RonanBel
    Member

    Hi, I’m working on a device with a PowerVR Rogue Hood GPU (driver 1.3@2876724).
    I’m surprised to see that a 2014 device reports only 24-bit highp integer precision (via glGetShaderPrecisionFormat); I thought it would be 32 bits nowadays.
    (This may be the reason for some bugs I see on this device in code that runs fine on all other devices.)

    I understand that on this kind of device, only 24 bits of an int or uint are valid.
    Consider a transform-feedback-only vertex shader (rasterizer discarded) that outputs uvec4 values into a buffer of four 32-bit unsigned integers.
    Will the shader compiler be smart enough to merge 16/24-bit intermediate results into valid 32-bit outputs,
    or should I assume that at most 24 of the 32 bits of each output will be valid?
    (And in that case, will the top 8 bits be cleared, set, or undefined?)

    #39499

    Joe Davis
    Member

    Hi,

    We’ve looked at the driver source and confirmed this is a bug. The 1.3 driver you’re using incorrectly reports 24 bits instead of 32.

    As of changelist 3206984, all glGetShaderPrecisionFormat() values were corrected.
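For reference, a quick way to check what a given driver build reports is to query the precision format directly. This snippet assumes a current OpenGL ES 3.0 context is already bound (it cannot run headless); per the ES 3.0 spec, a conformant 32-bit highp int implementation should report range {31, 30} with precision 0 (precision is always 0 for integer formats):

```c
#include <stdio.h>
#include <GLES3/gl3.h>  /* requires a current OpenGL ES 3.0 context */

/* Print the highp int precision reported for the vertex stage.
 * range[] holds log2 of the min/max representable magnitudes. */
void print_highp_int_precision(void)
{
    GLint range[2] = {0, 0};
    GLint precision = 0;
    glGetShaderPrecisionFormat(GL_VERTEX_SHADER, GL_HIGH_INT,
                               range, &precision);
    printf("highp int: range = {%d, %d}, precision = %d\n",
           (int)range[0], (int)range[1], (int)precision);
}
```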

    > (This may be the reason for some bugs I see on this device in code that runs fine on all other devices.)

    If you can create a new discussion chain for the bugs you’ve encountered, we can help you investigate. If you need to share any builds or logs with us confidentially, you can attach them to tickets in our support system: https://pvrsupport.imgtec.com/new-ticket

    Joe
