
Integer texture support in Intel HD 4600

MJone15
Beginner

I am having a problem allocating 64x64 OpenGL textures of the format ALPHA16I_EXT (internal) / ALPHA_INTEGER_EXT.

GPU caps lists the GL_EXT_texture_integer extension as being available. The driver is 10.18.10.3345 (10-31-2013).

The machine is a Dell with a hybrid NVidia/Intel configuration.

My question is more a driver question than a dev question: should the HD 4600 support this texture format? Is there limited support?
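For context, the allocation in question looks roughly like this (the texture object `tex` and data pointer `pixels` are illustrative, not from the original post):

```c
/* Sketch of the problematic allocation: a 64x64 signed 16-bit
   alpha-integer texture via GL_EXT_texture_integer.
   `pixels` would point to 64*64 GLshort values. */
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D,
             0,                    /* mip level */
             GL_ALPHA16I_EXT,      /* internal format */
             64, 64,               /* width, height */
             0,                    /* border */
             GL_ALPHA_INTEGER_EXT, /* pixel data format */
             GL_SHORT,             /* pixel data type */
             pixels);
```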

Thanks.

4 Replies
ROBERT_U_Intel
Employee

Hi Michael

I will check with our OGL developers and get back to you.

Thanks

Robert

ROBERT_U_Intel
Employee

Hi Michael

The latest version of the OpenGL spec recommends against using the alpha formats when only one component is needed.

Here's the list of formats, with the deprecated ones highlighted in red (they are no longer part of the core version of the spec).

So my recommendation would be to use R16I/RED_INTEGER instead.

Thanks

Robert
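In code, that suggestion looks roughly like this (the texture object `tex` and data pointer `pixels` are illustrative). If existing shaders read the value from the alpha channel, a texture swizzle (core since OpenGL 3.3, or ARB_texture_swizzle) can map red into alpha so they keep working:

```c
/* Allocate as R16I/RED_INTEGER instead of the deprecated alpha format. */
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16I, 64, 64, 0,
             GL_RED_INTEGER, GL_SHORT, pixels);

/* Optional: route the red channel into alpha for legacy .a reads. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_SWIZZLE_A, GL_RED);
```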

MJone15
Beginner

Thanks. Offhand, I think that should work for us.

I will give it a shot and flag as answered if it works out.

MJone15
Beginner

That worked. Thanks. The following specifics do the trick:

Data format: GL_SHORT

Texture format: GL_RED_INTEGER[_EXT]

Internal format: GL_R16I

I also did the same thing with a 32-bit luminance-alpha texture, using an RGBA16UI texture instead.
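Put together, the working combination looks roughly like this (texture name and data pointer are illustrative). Two caveats worth noting: integer textures must use GL_NEAREST filtering (linear filtering makes them incomplete), and in GLSL they are sampled through an integer sampler, with the value in the red channel:

```c
/* Working combination: 64x64 signed 16-bit single-channel integer texture. */
glBindTexture(GL_TEXTURE_2D, tex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_R16I, 64, 64, 0,
             GL_RED_INTEGER, GL_SHORT, pixels);

/* Integer textures do not support linear filtering. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);

/* GLSL side (sketch): use an integer sampler and read .r, not .a:
     uniform isampler2D tex;
     int v = texture(tex, uv).r;
*/
```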
