Graphics
Intel® graphics drivers and software, compatibility, troubleshooting, performance, and optimization

Loading texture to integer internal format results in all zeroes

JT9
Beginner

Hi all,

I'm using OpenGL to convert video frames from 10-bit YUV420p to 8-bit RGB. YUV frame data is loaded as a texture with:

glTexImage2D(GL_TEXTURE_2D, 0, GL_R16UI, m_frameWidth, m_frameHeight + m_frameHeight / 2, 0, GL_RED_INTEGER, GL_UNSIGNED_SHORT, videoFrame.data());

In the fragment shader it's accessed with:

#version 130

// irrelevant variable definitions here

uniform usampler2D frameTex;

void main()
{
    // Each component value occupies the 10 least significant bits, so to
    // normalize it, divide by the maximum value representable in 10 bits
    // (2^10 - 1 = 1023).
    float Y = float(texture(frameTex, vec2(gl_TexCoord[0].s, gl_TexCoord[0].t * YHeight)).r) / 1023.0;
    float U = float(texture(frameTex, vec2(gl_TexCoord[0].s / 2, UOffset + gl_TexCoord[0].t * UHeight)).r) / 1023.0;
    float V = float(texture(frameTex, vec2(gl_TexCoord[0].s / 2, VOffset + gl_TexCoord[0].t * VHeight)).r) / 1023.0;
    gl_FragColor = vec4(HDTV * vec3(Y, U, V), 1.0);
}

Now, all the texels I get with texture() have value (0, 0, 0, 1).

The very same code works when I switch the application to the discrete nVidia card.

What could be the problem here?
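One general OpenGL rule that produces exactly this symptom (whether it is the cause here, I cannot confirm): a texture with an integer internal format such as GL_R16UI is *texture incomplete* unless its filters are NEAREST-based, and the default minification filter (GL_NEAREST_MIPMAP_LINEAR) also expects a full mipmap chain; sampling an incomplete texture returns (0, 0, 0, 1). The state configuration, if that is the issue, is just two standard calls after binding the texture object:

```c
/* Configuration fragment (not a standalone program; requires a GL
 * context and the texture bound to GL_TEXTURE_2D). Integer-format
 * textures must use NEAREST filtering to be complete; with an
 * incomplete texture, texture() in the shader yields (0, 0, 0, 1). */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
```

Drivers differ in how strictly they enforce completeness, which could explain code that works on one vendor's GPU and not another's.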

My system configuration:

Windows 8 64-bit, i7-3740QM with HD Graphics 4000, driver version 9.17.10.2843 (the newest available for my Lenovo laptop), and a discrete nVidia Quadro K1000M.

2 Replies
IUman
Honored Contributor II

Hello jtoma,

Thank you for joining the Intel communities.

You can get better support at the following link:

http://software.intel.com/

Regards!

Ivan

JT9
Beginner

Hello Ivan,

I shall try my luck there, then.

Thanks and regards!
