Hello Team Intel!
If you see the corresponding Club 3D thread above, the Club 3D Technical Support Team is more than happy to work with you! Happy to provide a sample for testing, logs, anything you need.
I would also love it if Intel could collaborate with Club3D on this. I got one of their DP to HDMI 2.0 adapters, and even after a firmware update I wasn't able to achieve anything greater than 30 Hz at resolutions above 1920x1080. The firmware update didn't make a difference, and I was already on the latest driver for my HD4600 (the driver hadn't been updated since November '15). I tested another PC with an HD4600 with the same results, though I was able to get an HD5500 to work.
I don't know whether it's an issue with the adapter or the Intel driver, but I've got to assume it's at least a little of both. I reached out to Club3D's support, and though they were very friendly, they weren't able to offer any advice besides checking my cables and changing color depth. By the way, the 32/16/8 bit setting in the Intel Control Panel corresponds to the 12/10/8 bpc that everyone else references, though in my case all it did was make my colors look pretty wacky.
DisplayPort has a feature called Multi-Stream Transport (MST) that can enable 4K at 60 Hz. HDMI 2.0 does not support it, which is why you are not getting the full refresh rate from that interface.
I regret to inform you that this looks like a limitation of the adapter/interface, which would explain the behavior of the system.
Right now, there is no information about Intel working with the company mentioned previously.
Some additional information that could help: HDMI - Wikipedia, the free encyclopedia
Also, please check whether your TV's configuration has a mode-switch setting available; this controls how the receiver accepts the incoming signal.
Please do check that the port used on your TV is indeed HDMI 2.0.
May I also have the make and model of your TV?
Yep, I replied with the hopes of getting someone from Intel to comment, but we see how well that went.
MST in DisplayPort is not required for a 4K display. Early 4K monitors did use it to work around interfaces that couldn't drive a 4K @ 60 Hz image over a single stream, and while MST was a clever way to make that work, it presented a number of usability issues. Modern displays do not require MST, and it stands to reason that the current crop of DisplayPort 1.2 to HDMI 2.0 adapters doesn't use it either. It isn't mentioned in the material for the chipset, nor as a requirement in the adapter documentation, so it seems safe to presume the adapter uses SST (single-stream transport) instead.
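As a sanity check on that SST presumption, here's a back-of-envelope calculation in Python. The link-rate and timing figures are the standard published numbers for DP 1.2 HBR2 and CVT-RB, not anything measured from this particular adapter:

```python
# Back-of-envelope check: can DisplayPort 1.2 carry 3840x2160 @ 60 Hz
# over a single stream (SST)? Figures are from the public DP 1.2 and
# CVT-RB specs; treat this as a sanity check, not a spec reference.

LANES = 4
HBR2_GBPS_PER_LANE = 5.4          # raw line rate per lane
EFFICIENCY = 8 / 10               # 8b/10b encoding overhead

link_gbps = LANES * HBR2_GBPS_PER_LANE * EFFICIENCY  # usable capacity

# CVT-RB timing for 3840x2160 @ 60 Hz: 4000 x 2222 total, ~533 MHz dot clock
pixel_clock_mhz = 533.25
bits_per_pixel = 24               # 8 bpc RGB

stream_gbps = pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

print(f"link capacity: {link_gbps:.2f} Gbps")   # ~17.28 Gbps
print(f"4K60 stream:   {stream_gbps:.2f} Gbps") # ~12.80 Gbps
print("fits in SST" if stream_gbps < link_gbps else "needs MST")
```

So a single DP 1.2 stream has comfortable headroom for 4K60 at 8 bpc, which is consistent with these adapters not needing MST.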
Anyway, I did some digging around and found this thread:
Robert_U mentions that getting the full 60 Hz requires a dot clock of 536 MHz, which in turn requires a 540 MHz CDCLK (core display clock), and that value is limited on some processors. This is why the U-series processors only support 30 Hz, and why a system where the OEM lowered the clock to save power would likewise be unable to drive the full 60 Hz.
However, with that in mind, my own board is an Intel DH87RL with a 4570S, so if anyone would know whether it's being limited, it would indeed be Intel. I'd be curious how to expose this value, since then I could at least rule it out as an issue. According to CPU-Z the GPU clock is running at 350 MHz, and while I don't mind trying to overclock that a smidge to test, I don't believe it's the same value to target.
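The CDCLK constraint Robert_U describes boils down to a simple comparison: the mode's dot clock has to fit under the core display clock. A small sketch; the CDCLK values here are hypothetical examples for illustration, not a table of actual Intel SKU limits:

```python
# Rough illustration of the constraint: a mode is only drivable if its
# pixel (dot) clock stays at or below the CDCLK. The CDCLK values in the
# loop are hypothetical examples, not real per-SKU figures.

def dot_clock_fits(pixel_clock_mhz: float, cdclk_mhz: float) -> bool:
    """True if the display engine clock can keep up with the mode."""
    return pixel_clock_mhz <= cdclk_mhz

modes = {
    "3840x2160 @ 60 Hz (CVT-RB, ~533 MHz)": 533.25,
    "3840x2160 @ 30 Hz (~297 MHz)": 297.0,
}

for cdclk in (540.0, 450.0):  # e.g. full-speed desktop vs. power-limited part
    for name, clk in modes.items():
        verdict = "OK" if dot_clock_fits(clk, cdclk) else "dot clock too high"
        print(f"CDCLK {cdclk:.0f} MHz | {name}: {verdict}")
```

On the lower hypothetical CDCLK, only the 30 Hz mode fits, which matches the behavior reported for the power-limited parts.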
Either way, here is the information from the card, which suggests that 3840x2160 @ 60 Hz should be exposed for selection, but it isn't available in the Intel control panel:
Intel(R) HD Graphics 4600
Report Date: Friday, January 15, 2016
Report Time [hh:mm:ss]: 6:37:37 PM
Driver Version: 10.18.14.4332
Operating System: Windows* 7 Professional (6.1.7601)
Default Language: English (United States)
Physical Memory: 7113 MB
Vendor ID: 8086
Device ID: 0412
Device Revision: 06
Video BIOS: 2179.0
Current Resolution: 3840 x 2160
Processor: Intel(R) Core(TM) i5-4570S CPU @ 2.90GHz
Processor Speed: 2893 MHz
Processor Graphics in Use: Intel(R) HD Graphics 4600
Shader Version: 5.0
OpenCL* Version: 1.2
* Microsoft DirectX* *
Runtime Version: 11.0
Hardware-Supported Version: 11.0
* Devices connected to the Graphics Accelerator *
Active Displays: 1
* Digital Television *
Display Type: Digital
DDC2 Protocol: Supported
Connector Type: DisplayPort
Device Type: Digital Television
Maximum Image Size
Horizontal Size: 73.62 inches
Vertical Size: 41.34 inches
640 x 480 (60p Hz)
640 x 480 (75p Hz)
720 x 400 (70p Hz)
720 x 480 (60p Hz)
720 x 576 (50p Hz)
720 x 576 (50i Hz)
800 x 600 (60p Hz)
800 x 600 (72p Hz)
800 x 600 (75p Hz)
1024 x 768 (60p Hz)
1024 x 768 (70p Hz)
1024 x 768 (75p Hz)
1280 x 720 (50p Hz)
1280 x 720 (60p Hz)
1920 x 1080 (24p Hz)
1920 x 1080 (60i Hz)
1920 x 1080 (50i Hz)
1920 x 1080 (50p Hz)
1920 x 1080 (60p Hz)
3840 x 2160 (25p Hz)
3840 x 2160 (60p Hz)
3840 x 2160 (30p Hz)
3840 x 2160 (24p Hz)
00 FF FF FF FF FF FF 00 4D 10 12 11 01 01 01 01
0F 19 01 03 80 BB 69 78 0A 64 FD AE 4F 41 B2 26
0D 47 4A A5 CE 00 01 01 01 01 01 01 01 01 01 01
01 01 01 01 01 01 08 E8 00 30 F2 70 5A 80 B0 58
8A 00 94 24 53 00 00 1E 00 00 00 FD 00 32 4D 1F
46 3C 00 0A 20 20 20 20 20 20 00 00 00 FF 00 34
30 31 42 30 37 31 31 31 31 32 0D 0A 00 00 00 FC
00 4C 43 2D 36 35 55 42 33 30 55 0A 20 20 01 93
02 03 33 71 4F 5F 03 04 05 10 20 15 12 13 14 1F
5D 5E 01 60 29 09 07 05 15 57 50 00 07 00 83 01
00 00 6D 03 0C 00 40 00 28 76 20 00 60 01 02 03
E2 0E 61 08 E8 00 30 F2 70 5A 80 B0 58 8A 00 94
24 53 00 00 1E 02 3A 80 18 71 38 2D 40 58 2C 45
00 94 24 53 00 00 1E 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 00
00 00 00 00 00 00 00 00 00 00 00 00 00 00 00 3B
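For anyone curious, the hex dump above is a standard EDID, and a few fields can be decoded from it directly. A minimal sketch in Python; the string below is just the first 128-byte block copied from the dump:

```python
# Hedged sketch: decode a few fields from the first 128-byte EDID block
# pasted above (fixed header, manufacturer ID, checksum).

edid_hex = """
00 FF FF FF FF FF FF 00 4D 10 12 11 01 01 01 01
0F 19 01 03 80 BB 69 78 0A 64 FD AE 4F 41 B2 26
0D 47 4A A5 CE 00 01 01 01 01 01 01 01 01 01 01
01 01 01 01 01 01 08 E8 00 30 F2 70 5A 80 B0 58
8A 00 94 24 53 00 00 1E 00 00 00 FD 00 32 4D 1F
46 3C 00 0A 20 20 20 20 20 20 00 00 00 FF 00 34
30 31 42 30 37 31 31 31 31 32 0D 0A 00 00 00 FC
00 4C 43 2D 36 35 55 42 33 30 55 0A 20 20 01 93
"""
edid = bytes(int(b, 16) for b in edid_hex.split())

# Bytes 0-7 must be the fixed EDID header pattern.
assert edid[:8] == b"\x00\xff\xff\xff\xff\xff\xff\x00", "not an EDID block"

# Bytes 8-9: manufacturer ID, three 5-bit letters packed big-endian (A = 1).
mid = (edid[8] << 8) | edid[9]
maker = "".join(chr(ord("A") - 1 + ((mid >> s) & 0x1F)) for s in (10, 5, 0))

# The last byte makes the 128-byte block sum to 0 mod 256.
checksum_ok = sum(edid) % 256 == 0

print(f"manufacturer: {maker}")        # -> SHP
print(f"checksum OK:  {checksum_ok}")  # -> True
```

The block passes its checksum, the manufacturer ID decodes to SHP (Sharp), and the ASCII in the display-name descriptor (tag 0xFC, last two rows of the block) reads LC-65UB30U.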
For my specific scenario, I'm using a JS8500FXZA. I'm setting UHD Color mode as well as setting the input to PC. For the HDMI cable, I've tested both of these high-speed cables:
I don't believe MST would be a factor for either the DP output on the Broadwell NUC or the Club 3D adapter.
I've asked the folks over at Club 3D to join this thread so we can narrow down the exact requirements across the recent iterations of Intel Graphics (e.g. the integrated graphics from Haswell through Skylake), see which iterations should be supported, and determine what exact settings are required to accomplish it. This would let end users determine whether they have an Intel Graphics part that qualifies. Thanks.
I've got to say, I'm really disappointed by the lack of participation by Intel in this thread. I'm not very surprised that Club3D hasn't joined in, but I figured someone from the Intel community would try to chime in more than just once.
Thank you for the information provided about this scenario.
I will be sending this for review and will provide you with the outcome.
Thank you for the update Esteban, I sincerely look forward to hearing what resolution there may be.
On a somewhat related note, are there plans to update the current HD4600 driver for Windows 7? There appear to be beta drivers in place for this device, but only supporting Windows 8 and 10.
No problem, I will keep you posted on this inquiry about the adapter.
I cannot assure you that Intel will release newer drivers for Windows 7*, given that two newer versions of the OS have been out for a while now.
I was able to get the CAC-1170 to put out 3840x2160 @ 60 Hz from Intel HD Graphics 4600 using the instructions in this video: https://youtu.be/1fhmwuLQCxw. Using the latest Intel graphics driver (18.104.22.16880; it doesn't need to be the beta), create a custom resolution for 3840x2160 @ 60 Hz and use the CVT-RB timing standard instead of GTF. Then go back to General Settings and 60p should appear as an option under Refresh Rate. Color stays at 32 bit. I tested both regular use and a bunch of videos, and everything is working fine so far (Big Buck Bunny 3840x2160 60fps was buttery smooth: Big Buck Bunny 3D - Download). HD Graphics 4600 only has H.264 and HEVC 8-bit hardware acceleration (no VP9 or HEVC 10-bit), so 2160p60 videos on YouTube and 2160p 60fps 10-bit x265 videos were stuttering pretty badly for me (with the CPU at 100%).
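To illustrate why switching to CVT-RB makes the difference: reduced blanking keeps the 3840x2160 @ 60 Hz pixel clock at 533.25 MHz, just under the ~536 MHz ceiling discussed earlier in the thread, whereas GTF's conventional blanking pushes it above 700 MHz. A simplified CVT 1.1 reduced-blanking calculation (16:9 sync width assumed; a sketch, not a full implementation of the VESA spec):

```python
# Simplified CVT 1.1 reduced-blanking (RB) pixel-clock calculation.
# Assumes a 16:9 mode (5-line vsync); not a complete VESA CVT implementation.
import math

def cvt_rb_pixel_clock_mhz(h_active: int, v_active: int, refresh_hz: float) -> float:
    RB_H_BLANK = 160            # fixed horizontal blanking, pixels
    RB_MIN_V_BLANK_US = 460.0   # minimum vertical blanking interval
    CLOCK_STEP_MHZ = 0.25       # clock rounds down to this granularity

    # Estimate the line period, then find how many lines the vblank needs.
    h_period_est_us = (1_000_000 / refresh_hz - RB_MIN_V_BLANK_US) / v_active
    vbi_lines = math.floor(RB_MIN_V_BLANK_US / h_period_est_us) + 1
    v_blank = max(vbi_lines, 3 + 5 + 6)  # front porch + vsync + min back porch

    total_h = h_active + RB_H_BLANK      # 4000 for 3840 active
    total_v = v_active + v_blank         # 2222 for 2160 active @ 60 Hz
    clock_hz = total_h * total_v * refresh_hz
    return math.floor(clock_hz / (CLOCK_STEP_MHZ * 1e6)) * CLOCK_STEP_MHZ

clock = cvt_rb_pixel_clock_mhz(3840, 2160, 60)
print(f"CVT-RB 3840x2160 @ 60 Hz pixel clock: {clock} MHz")  # 533.25
print("under the 536 MHz ceiling" if clock <= 536 else "over the ceiling")
```

So the custom-resolution workaround isn't magic: it just swaps in a timing whose dot clock the HD 4600 display engine can actually generate.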
On a side note, why is the default for Intel Graphics to use the old GTF timing standard instead of the newer CVT/CVT-RB standard? Couldn't the software at least pick the timing standard based on the EDID from the display rather than requiring a custom resolution to use the newer standard? EstebanC_Intel, maybe you could communicate that to your developers and let them know it would save many users from the hassle of hunting down this work-around?