
    HD 4000 OpenGL 4.0 driver bug?

    metaleap

      As previously reported on StackOverflow...

       

      Now, this is extremely odd behavior.

       

      TL;DR -- in a render-to-texture setup, upon resizing the window, only the very next glClear(GL_COLOR_BUFFER_BIT) call with framebuffer 0 (the window's client area) bound reports GL_OUT_OF_MEMORY, and only on one of the two GPUs; rendering nonetheless still proceeds correctly.

       

      Now, all the gritty details:

       

      So this is on a Vaio Z with two GPUs (switchable via a physical toggle button on the machine):

       

      1. OpenGL 4.2.0 @ NVIDIA Corporation GeForce GT 640M LE/PCIe/SSE2 (GLSL: 4.20 NVIDIA via Cg compiler)

       

      2. OpenGL 4.0.0 - Build 9.17.10.2867 @ Intel Intel(R) HD Graphics 4000 (GLSL: 4.00 - Build 9.17.10.2867)

       

      My program is in Go 1.0.3 64-bit under Win 7 64-bit using GLFW 64-bit.

       

      I have a fairly simple and straightforward render-to-texture "mini pipeline". First, normal 3D geometry is rendered with the simplest of shaders (no lighting, nothing, just textured triangle meshes -- a number of cubes and planes) to a framebuffer that has a depth/stencil renderbuffer as its depth/stencil attachment and a texture2D as its color attachment. For the texture, all filtering is disabled, as are mip-maps.
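      For reference, the off-screen target is created roughly like the sketch below -- written with go-gl-style binding names (github.com/go-gl/gl) purely for illustration, since my actual code goes through a thin wrapper; the sizes, formats and identifiers are placeholders, and a current GL context is assumed:

      // setupRTT creates the off-screen render target: a texture2D color
      // attachment (no filtering, no mip-maps) plus a combined depth/stencil
      // renderbuffer, both attached to a single framebuffer object.
      func setupRTT(width, height int32) (fbo, colorTex, depthStencilRbo uint32) {
          // color attachment: plain texture, nearest filtering, no mip-maps
          gl.GenTextures(1, &colorTex)
          gl.BindTexture(gl.TEXTURE_2D, colorTex)
          gl.TexParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.NEAREST)
          gl.TexParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.NEAREST)
          gl.TexImage2D(gl.TEXTURE_2D, 0, gl.RGBA8, width, height, 0, gl.RGBA, gl.UNSIGNED_BYTE, nil)

          // depth/stencil attachment: a renderbuffer
          gl.GenRenderbuffers(1, &depthStencilRbo)
          gl.BindRenderbuffer(gl.RENDERBUFFER, depthStencilRbo)
          gl.RenderbufferStorage(gl.RENDERBUFFER, gl.DEPTH24_STENCIL8, width, height)

          // the framebuffer itself, with both attachments
          gl.GenFramebuffers(1, &fbo)
          gl.BindFramebuffer(gl.FRAMEBUFFER, fbo)
          gl.FramebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0, gl.TEXTURE_2D, colorTex, 0)
          gl.FramebufferRenderbuffer(gl.FRAMEBUFFER, gl.DEPTH_STENCIL_ATTACHMENT, gl.RENDERBUFFER, depthStencilRbo)
          gl.BindFramebuffer(gl.FRAMEBUFFER, 0)
          return
      }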

       

      Then I render a full-screen quad (actually a single "oversized" full-screen tri) that just samples from said framebuffer texture (the color attachment) with texelFetch(tex, ivec2(gl_FragCoord.xy), 0), so no wrapping is involved.
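      The fragment shader for that final pass is essentially just the following (sketched from memory; the output name is a placeholder):

      // Final-pass fragment shader: fetch the exact texel the off-screen pass
      // wrote for this pixel -- no filtering, no wrapping, no sampling math.
      const screenFragShader = `#version 400 core
      uniform sampler2D tex; // the RTT color attachment
      out vec4 oColor;
      void main() {
          oColor = texelFetch(tex, ivec2(gl_FragCoord.xy), 0);
      }`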

       

      Both GPUs render this just fine, both when I force a core profile and when I don't. No GL errors are ever reported, and everything renders as expected -- except when I resize the window while using the Intel HD 4000 GPU's GL 4.0 renderer (in both core and compatibility profiles). Only in that case does a single resize record a GL_OUT_OF_MEMORY error directly after the very next glClear(GL_COLOR_BUFFER_BIT) call on framebuffer 0 (the screen) -- but only once after the resize, not in every subsequent loop iteration.
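      To be precise about where the error shows up, the on-screen part of my loop checks glGetError around each call, roughly like this (again only a sketch with the same go-gl-style names; checkGLError is just a little logging helper using the standard log package):

      // checkGLError drains gl.GetError and logs anything found under a label.
      func checkGLError(label string) {
          for e := gl.GetError(); e != gl.NO_ERROR; e = gl.GetError() {
              log.Printf("GL error 0x%X at %s", e, label)
          }
      }

      // finalScreenPass is the on-screen part of the loop: clear framebuffer 0,
      // then draw the single full-screen tri with the texelFetch shader.
      func finalScreenPass(screenProg, screenVao uint32) {
          gl.BindFramebuffer(gl.FRAMEBUFFER, 0)
          checkGLError("bind framebuffer 0")  // never reports anything
          gl.Clear(gl.COLOR_BUFFER_BIT)
          checkGLError("clear framebuffer 0") // GL_OUT_OF_MEMORY here, exactly once, right after a resize
          gl.UseProgram(screenProg)
          gl.BindVertexArray(screenVao)
          gl.DrawArrays(gl.TRIANGLES, 0, 3)
          checkGLError("draw full-screen tri") // nothing here either, and the frame still looks correct
      }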

       

      Interestingly, I don't even do any allocations on resize! I have temporarily disabled ALL logic occurring on window resize -- that is, right now I simply *ignore* the window-resize event entirely, so the RTT framebuffer and its depth and color attachments are not resized or recreated at all. That means the next glViewport still uses the same dimensions as when the window and GL context were first created -- but anyhow, the error occurs on glClear() (not before, only after, only once -- I've double- and triple-checked).

       

      Would this be a driver bug, or is there anything I could be doing wrong here?

        • 1. Re: HD 4000 OpenGL 4.0 driver bug?
          allan_intel

          Hi metaleap,

           

          This issue has been escalated to our research department; I suggest you keep visiting the community website for any updates on this inquiry.

           

          Thanks

          Allan

          • 2. Re: HD 4000 OpenGL 4.0 driver bug?
            metaleap

             Found a workaround that resolves this, which I think will be very helpful to the Intel OpenGL driver team! Will they see this, or could you forward it to them?

             

             Basically, the issue (which doesn't occur on GeForce or Quadro) started when I switched to a render-to-texture model, because at that point I also changed the depth and stencil bits requested at GL context creation from 24/8 to 0/0. Since the final screen pass is just a simple full-screen triangle sampling from the off-screen framebuffer texture, those depth/stencil (and alpha) bits aren't really needed in the main GL context's default/window framebuffer, so I set them to 0 (and of course that final RTT-sampling screen pass also disables depth/stencil testing). Ever since that change to 0 bits:

             

             • the framebuffer-affecting operations in the final pass (one glClear and one glDrawArrays(triangles) call) resulted in GL_OUT_OF_MEMORY -- initially only on window resize, but recently (I'm thinking since my latest driver update, to 9.17.10.2932 dated 12-12-2012) in every single loop iteration
            • framerate dropped from some 400+ FPS to 80-90 FPS

             

             Today I experimentally set the depth-buffer bits at GL context creation from that 0 to just 8 (keeping stencil at 0 for now), and now the error is completely gone and the framerate is back at its previous level.
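             In GLFW terms, the whole change boils down to something like the sketch below (written with GLFW-3-style window hints from the go-gl glfw binding just to illustrate -- my actual setup code goes through a wrapper, so take the exact names as placeholders):

             // windowHints sketches the relevant context-creation hints for the
             // default (window) framebuffer. Requesting 0 depth bits is what
             // triggers the GL_OUT_OF_MEMORY errors and the 80-90 FPS slow path
             // on the HD 4000 driver; requesting even a minimal depth buffer avoids both.
             func windowHints() {
                 glfw.WindowHint(glfw.AlphaBits, 0)
                 glfw.WindowHint(glfw.StencilBits, 0)

                 // glfw.WindowHint(glfw.DepthBits, 0) // what I had: errors + 80-90 FPS on the HD 4000
                 glfw.WindowHint(glfw.DepthBits, 8) // workaround: error gone, framerate back to previous levels
             }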

             

             Now, this is not strictly a violation of the GL spec, but I think it's still highly undesirable for a graphics driver to behave like this -- if anything, performance without a depth buffer should be higher, not lower... so the above might be a good pointer for Intel's GL team.

            • 3. Re: HD 4000 OpenGL 4.0 driver bug?
              metaleap

               I just tried to post a new, completely different HD 4000 issue here in this forum and, although I was logged in, got an "access restricted" error message.


               What's next? Is Intel still interested in a bug-free, working, to-spec GL implementation?

               

               If so, please let me know where I can post another issue I found (this time with GLSL 330 core under GL 3.3 core) that occurs only on an Intel HD GPU but not on other vendors' GPUs... thanks!

              • 4. Re: HD 4000 OpenGL 4.0 driver bug?
                nic

                Hi metaleap,

                 

                Can you provide a code sample that reproduces the issue you're referring to?

                 

                Thank you,

                -Nic