Integrated Graphics Processor-Intel® HD Graphics support:
1 x DisplayPort, supporting a maximum resolution of 4096x2304@60 Hz
* Support for DisplayPort 1.2, HDCP 2.2, and HDR.
1 x HDMI port, supporting a maximum resolution of 4096x2160@30 Hz
* Support for HDMI 1.4 and HDCP 2.2.
Maximum shared memory of 1 GB
Read pages 36ff
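The spec list above already hints at why the two ports top out differently: the HDMI 1.4 link simply doesn't have the bandwidth for 4K@60Hz, while DisplayPort 1.2 does. A rough sanity-check sketch (nominal spec data rates; blanking approximated with a ~10% overhead, so this is back-of-envelope math, not exact EDID timing):

```python
# Rough link-bandwidth check: why the HDMI 1.4 port tops out at
# 4096x2160@30Hz while the DisplayPort 1.2 output can drive 4K@60Hz.

def required_gbps(width, height, hz, bits_per_channel=8, overhead=1.10):
    """Approximate uncompressed RGB/4:4:4 bandwidth in Gbit/s."""
    pixels_per_second = width * height * hz * overhead  # incl. ~10% blanking
    return pixels_per_second * 3 * bits_per_channel / 1e9

# Nominal max video data rates in Gbit/s (after coding overhead)
LINKS = {
    "HDMI 1.4": 8.16,
    "HDMI 2.0": 14.4,
    "DisplayPort 1.2 (HBR2 x4)": 17.28,
}

for mode in [(4096, 2160, 30), (4096, 2160, 60)]:
    need = required_gbps(*mode)
    print(f"{mode[0]}x{mode[1]}@{mode[2]}Hz needs ~{need:.1f} Gbit/s")
    for name, cap in LINKS.items():
        print(f"  {name}: {'OK' if need <= cap else 'exceeds link'}")
```

At 8-bit RGB, 4K@30 comes in around 7 Gbit/s (fits HDMI 1.4), while 4K@60 needs roughly 14 Gbit/s, which only HDMI 2.0 or DP 1.2 can carry.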
Thank you Stefan3D. I purchased a DisplayPort to HDMI converter today. I still have the same issue where 60Hz should now be supported (I can do 4096x2160 @ 30Hz, but not 60Hz). I know my adapter works fine, as I tested it with my Vega 64 and it runs at 60Hz.
I am on windows 10 with current driver version of 22.214.171.12401.
Doesn't look like your mobo supports HDMI 2.0a as described in the following White Paper by Intel.
If the mobo does not have the appropriate LSPCon chip, the maximum display rate will be limited to HDMI 1.4 rates and HDR will not be enabled. There appears to be only one manufacturer of the LSPCon chip, which is MegaChips. Good luck trying to determine the specifications listed on their website.
What needs to be done, and what I have been told by Intel, is that the LSPCon converter will be built into future Intel products. The question is when Intel will deliver a graphics processor like the UHD 630 that supports HDMI 2.0a natively and supports 4K@60Hz.
The next problem I see is native support for dynamic HDR (Dolby Vision and now HDR10+). It appears HDR10+ will require an HDMI 2.1 interface, while Dolby Vision, since its dynamic metadata is embedded in the video signal, works over HDMI interfaces as far back as version 1.4b.
I asked Intel that question in my thread "HDR10+ requires HDMI 2.1" at SilentK. Intel just responded that HDMI.org has recognized and accepted the HDR10+ technology. I expect it will be some time before Intel incorporates both the interface and the processing requirements into their product line. That also means that existing mobos that use the MegaChips LSPCon chip could become outdated.
I did not get an answer to the Dolby Vision question I asked about the UHD 630 and how it handles Dolby Vision signaling.
I believe HDR is only now evolving into a viable technology. If only Dolby Labs didn't require a licensing fee for Dolby Vision and for audio formats such as Dolby Atmos and DTS:X, we wouldn't be having this discussion.
I don't really know your system configuration. Can you post it?
Your LG OLED55C7P supports Dolby Vision. I have read that Dolby Vision signaling has to be implemented as a direct connection from the source to the sink; there cannot be any adapters, splitters, etc. in the source-sink chain. I believe CyberLink discussed this issue when I was researching their PowerDVD 17 Ultra product, which is bundled with the Pioneer BDR-211UBK internal Ultra HD burner. CyberLink provides a 4K compatibility advisor; it may be useful to run the advisor and analyze the results. I have an open support question to CyberLink about how, and if, they handle Dolby Vision. I will post their response once I receive it.
I have also tried running it through my receiver (Denon AVR-X1300W), which gives me the same input/output, so that takes the direct TV connection out of play. The TV also supports HDR10, so that would rule out Dolby Vision as the cause. Again, I can take the exact same adapter and cable, plug them into my Vega 64, and it runs at 4K@60Hz. As soon as I plug into the internal graphics, I can only get 4K@30Hz. Obviously I can't test HDR with the Vega card, as the proprietary BS is there and it won't allow it. The compatibility advisor says everything is good except the recommended HDR display (again, I can't do 60Hz, so this is expected at least to this point).
16 GB RAM
Z370 Aorus Gaming 7
Again, I would re-read the Intel HDR White Paper. It doesn't appear that your mobo has the LSPCon chip required to convert the DP port to HDMI 2.0a, which would enable 4K@60Hz HDR. That may be why your Vega 64 works but the mobo's HDMI does not. There is a section in the Intel WP that lists the mobos that support LSPCon. The WP also addresses primary and secondary video processing issues if you are using an external GPU.
It doesn't appear to be a cable issue since you can get HDR from your Vega 64, but it wouldn't hurt to check your cables to make sure they are rated and tested to HDMI.org specifications.
I have a question: how do you know the video signal is HDR and not SDR that is upconverted to HDR at the display?
You may also be running into a codec compatibility issue. You may want to read
I have an active converter, so the chip that makes the conversion happen is inside the converter itself. This may still have issues for HDR content, but at this point I'd like to at least get it up and running off the iGPU at 60Hz and test from there.
For HDR content, my TV displays an indicator in the top right corner when it receives either HDR10 or Dolby Vision content/connections. I don't see that when I hook up my Vega card, and I assume with the restrictions on content it won't work.
I can now get my TV to do 4K@60Hz on the iGPU, but not HDR. What inside the Intel control panel enables 4:4:4 color/HDR?
I am now using this adapter and cable, both of which should support it:
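For context on the 4:4:4/HDR question above: HDR10 implies 10 bits per channel, and at 4K@60Hz a full 4:4:4 10-bit signal exceeds what an HDMI 2.0 link can carry, which is why drivers typically fall back to 4:2:2 or 4:2:0 chroma subsampling for HDR. A rough sketch of the math (nominal spec data rate, ~10% blanking overhead assumed):

```python
# Why HDR at 4K60 usually forces chroma subsampling on HDMI 2.0:
# compare approximate bandwidth for 4:4:4 vs 4:2:2 vs 4:2:0 at 10-bit.

HDMI20_GBPS = 14.4  # nominal max HDMI 2.0 data rate, Gbit/s

def gbps(width, height, hz, bpc, chroma="4:4:4", overhead=1.10):
    """Approximate video bandwidth in Gbit/s for a given chroma format."""
    # Average samples per pixel: 4:4:4 carries 3, 4:2:2 averages 2,
    # and 4:2:0 averages 1.5.
    samples = {"4:4:4": 3.0, "4:2:2": 2.0, "4:2:0": 1.5}[chroma]
    return width * height * hz * overhead * samples * bpc / 1e9

for chroma in ("4:4:4", "4:2:2", "4:2:0"):
    need = gbps(3840, 2160, 60, 10, chroma)
    verdict = "fits" if need <= HDMI20_GBPS else "exceeds HDMI 2.0"
    print(f"4K60 10-bit {chroma}: ~{need:.1f} Gbit/s ({verdict})")
```

By this estimate, 4K60 10-bit 4:4:4 needs roughly 16 Gbit/s (too much for HDMI 2.0), while 4:2:2 and 4:2:0 fit, so an HDR signal out of a 2.0-class link is generally subsampled rather than full 4:4:4.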
Have you upgraded Windows 10 to version 1709? If so, in Settings under the Display tab, you should see an "SGX" on/off feature if your video processing chain supports HDR. Try enabling SGX and see if your display shows HDR. If there is no SGX switch, then something in your video processing chain (source-to-sink) is not HDR compliant.
I am on 1709 with the latest drivers. I'm thinking it's the adapter or motherboard output that's causing the issue. I'm able to activate HDR using a straight HDMI cable to my Vega 64. As soon as I plug in the adapter and try over the same cable, but via DisplayPort on either the iGPU or the Vega, I can no longer activate HDR (even though I can do 4K@60Hz on both). It's pretty disappointing overall, since I need the iGPU's HDR for my Ultra HD disc drive. From the reading I have done, I'm guessing the DisplayPort 1.2 on my motherboard will not correctly convert the signal to HDR, since it appears DisplayPort 1.3 is needed for that to work.
If anyone else has any thoughts before I return all the adapters/Ultra HD Drive please let me know.