Very few colours with alpha channel textures

This topic contains 5 replies, has 2 voices, and was last updated by Xmas 9 years ago.

Viewing 6 posts - 1 through 6 (of 6 total)
  • #29579

    ThM
    Member

    Hi, it’s me again. :)

     

    I have a general problem when using textures with an alpha channel (GL_RGBA).

     

    When using textures without alpha (GL_RGB), they seem to be stored internally (or at least displayed) with a colour depth of 16 bits (RGB565 or RGB555). This is just a guess based on the resulting image (I can see some banding, but not much).

     

    Now, when I add an alpha channel to the source image and load it into OpenGL ES (with GL_RGBA), the colour depth becomes extremely bad. My suspicion is that the alpha texture is also stored internally (or at least displayed) with 16 bits. That is the same budget as with GL_RGB, but now the alpha channel has to fit in as well. So RGBA4444 may be used, which leaves 12 bits for colour, i.e. only 4096 distinct colours to display…

     

    In PC emulation everything works fine; I get a seamless colour transition. The problem only occurs on the embedded device.

    Also, if one texture with an alpha channel and one without are used in the same scene, only the texture with the alpha channel shows the reduced colour count.

    Furthermore, it does not depend on any OpenGL settings or state (GL_BLEND can be disabled too; it makes no difference).

    It can be reproduced with the “08_AlphaBlend” training course project from the OGLES-1.1_WINDOWS_PCEMULATION_2.02.22.0756 SDK. Just disable blending (otherwise it is not easily noticeable) and you will see the red dots rendered with very few colours.

    It’s even more obvious if you replace the background texture with an image that has smooth colour gradients (and an alpha channel that is fully opaque).

     

    Is this a known problem? Does it depend on the driver or the hardware? Is there a solution?

     

    Best regards,

    Thomas

    ThM, 2008-09-01 13:13:05

    #32257

    Xmas
    Member

    Hi Thomas,

    Which device are you working on? What is the colour depth of your EGL window surface?

    #32258

    ThM
    Member

    Hi,

     

    My device is a Freescale i.MX31 board with a PowerVR MBX chip.

     

    I didn’t modify the training course project, but the maximum possible colour depth on the device seems to be RGB565, i.e. 16 bits.

     

    If I try it like this:

    conflist[i++] = EGL_RED_SIZE;
    conflist[i++] = 5;
    conflist[i++] = EGL_GREEN_SIZE;
    conflist[i++] = 6;
    conflist[i++] = EGL_BLUE_SIZE;
    conflist[i++] = 5;
    conflist[i++] = EGL_NONE; /* the attribute list must be EGL_NONE-terminated */

     

    and if I then query the values like this:

    glGetIntegerv(GL_RED_BITS, &value);   /* returns 5 */
    glGetIntegerv(GL_GREEN_BITS, &value); /* returns 6 */
    glGetIntegerv(GL_BLUE_BITS, &value);  /* returns 5 */
    glGetIntegerv(GL_ALPHA_BITS, &value); /* returns 0 */

     

    I get 5-6-5 and an alpha size of 0 (but alpha is presumably not needed for the display buffer anyway). Nevertheless, the result looks like 4444; it definitely does not look like 16 bits used for colour alone.

    It looks exactly the same as when I save the image with your PVRTexTool in the ARGB4444 format.

     

    And if I just swap the image for one without an alpha channel (no other change to the code at all), the result looks fine. glGetIntegerv still reports a colour depth of 5-6-5 and 0 bits of alpha.

    ThM, 2008-09-01 12:24:45

    #32259

    Xmas
    Member

    Could you check if the device supports the GL_IMG_texture_format_BGRA8888 extension? This extension defines a new token, GL_BGRA (0x80E1), for texture data in BGRA order. You can use this to get 32-bit textures.

    Code:
    #ifndef GL_BGRA
    #define GL_BGRA 0x80E1
    #endif

    // convert data to BGRA here
    glTexImage2D(GL_TEXTURE_2D, 0, GL_BGRA, width, height, 0, GL_BGRA, GL_UNSIGNED_BYTE, data);
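    The runtime check for the extension can be sketched like this. `has_extension` is a hypothetical helper name; the GL_EXTENSIONS query itself is standard OpenGL ES:

```c
#include <string.h>

/* Return non-zero if `name` appears as a complete token in the
 * space-separated string returned by glGetString(GL_EXTENSIONS).
 * A plain strstr() is not sufficient, because one extension name can
 * be a prefix of another. */
static int has_extension(const char *extensions, const char *name)
{
    size_t len = strlen(name);
    const char *p = extensions;

    while (p && (p = strstr(p, name)) != NULL) {
        int starts_ok = (p == extensions) || (p[-1] == ' ');
        int ends_ok   = (p[len] == ' ') || (p[len] == '\0');
        if (starts_ok && ends_ok)
            return 1;
        p += len;
    }
    return 0;
}

/* Usage on the device:
 *   const char *ext = (const char *)glGetString(GL_EXTENSIONS);
 *   if (ext && has_extension(ext, "GL_IMG_texture_format_BGRA8888")) {
 *       // safe to upload GL_BGRA / GL_UNSIGNED_BYTE textures
 *   }
 */
```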

    Alternatively, have you tried using PVRTC compressed textures?

    #32260

    ThM
    Member

    Thank you very much, Xmas! BGRA works on my device with 32 bits. :)

    One question is left in my brain: Is there no other way to use 32 bit textures on this device?

    I find it strange that I pass 32 bits to OpenGL ES but the textures are only displayed with 16 bits, even though the device is able to process 32 bits and display them via this extension.

    Shouldn’t GL_RGBA therefore support 32 bits, too?

     

    I found exactly the same question here, with the same good answer: http://www.khronos.org/message_boards/viewtopic.php?f=4&p=3435

    ThM, 2008-09-05 09:55:33

    #32261

    Xmas
    Member
    ThM wrote:
    One question is left in my brain: Is there no other way to use 32 bit textures on this device?

    No. When the framebuffer config is 16-bit, you need to use the BGRA extension to get 32-bit textures on this device.
