Created vertex and pixel shaders are leaking memory:
The calls to:
hr = g_pd3dDevice->CreatePixelShader( pPSBlob->GetBufferPointer(), pPSBlob->GetBufferSize(), nullptr, &g_pPixelShader );
hr = g_pd3dDevice->CreateVertexShader( pVSBlob->GetBufferPointer(), pVSBlob->GetBufferSize(), nullptr, &g_pVertexShader );
are leaking memory in the following scenario (see attached test application and screenshots):
- Create DirectX device
- Create Pixel and Vertex Shader
- Render 0.5s
- Release Pixel and Vertex Shader
- Release DirectX device
- Wait 0.5s
Each vertex shader leaks multiple 4 KB and 560 B blocks.
Each pixel shader leaks multiple 151 B blocks.
Application: 64bit, using DirectX11
OS: Windows 10 Pro (1703)
CPU: Intel Core i7-6700 3.40GHz
Adapter: Intel HD Graphics 530
Driver: 220.127.116.1171 (08/13/2017) 154605.4771
The same scenario on the same computer with an Nvidia K620 (HD 530 turned off in the BIOS) doesn't leak.
Installing an older driver (18.104.22.16852, 12/15/2015) also leaks, but differently:
Each vertex shader leaks multiple 31 B, 72 B, and 36 B blocks and one 2 KB block.
The pixel shader doesn't seem to leak.
The total amount leaked is much smaller than with the newest driver.
Our professional video application creates and destroys DirectX devices very dynamically at runtime. The current behaviour causes the application to consume a large amount of memory in a short time,
so we need a solution for our customers who use this Intel HD graphics. Currently the only option for our customers, and it is not always a feasible one, is to use an AMD or Nvidia graphics adapter.
Is it possible to get this fixed?
Ok, thank you.