Graphics
Intel® graphics drivers and software, compatibility, troubleshooting, performance, and optimization

Does Intel HD Graphics 530 support 10-bit color, or will I need to retain Quadro GPU?

ACole3
Beginner

I'm adding a 4K display with a 10-bit panel and found that Nvidia only supports 10-bit color in its Quadro line. Will I need to retain the Quadro GPU after I upgrade my system to an i7-6700K, or does the HD 530 support 10-bit color depth?

Is Quadro still a performance leap up from integrated graphics, even in the latest HD 530 incarnation?

I've seen some discussion of Quadro problems here. Is there any inherent problem with an i7-6700K driving a 4K panel via DisplayPort 1.2 from a Quadro K620 in a PCIe 2.0 x16 slot? My usual apps are Photoshop CS6 and Lightroom. I know Adobe is still having trouble making those apps run well even on the latest processors (confirmed in the Adobe forums), but I'm looking for anything I should be aware of before jumping to the i7-6700K.
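
For my own peace of mind, here's a rough back-of-the-envelope bandwidth check (assuming a CVT-R2 reduced-blanking pixel clock of about 533 MHz for 3840x2160 at 60 Hz; exact timings vary by display):

#include <cstdio>

int main() {
    // 3840x2160 @ 60 Hz with CVT-R2 reduced blanking is commonly quoted at a
    // ~533 MHz pixel clock (approximate; the exact timing depends on the display).
    const double pixel_clock_hz = 533.25e6;
    const double bits_per_pixel = 30.0;              // 10 bpc x 3 channels (RGB)
    const double required_gbps  = pixel_clock_hz * bits_per_pixel / 1e9;   // ~16.0

    // DisplayPort 1.2 HBR2: 4 lanes x 5.4 Gbit/s with 8b/10b coding -> 80% payload.
    const double dp12_payload_gbps = 4 * 5.4 * 0.8;  // ~17.28 Gbit/s

    // PCIe 2.0 x16: 5 GT/s per lane with 8b/10b -> ~500 MB/s per lane per direction.
    const double pcie2_x16_gbs = 16 * 0.5;           // ~8 GB/s per direction

    std::printf("4K60 at 10 bpc needs ~%.1f Gbit/s; DP 1.2 payload is ~%.2f Gbit/s\n",
                required_gbps, dp12_payload_gbps);
    std::printf("PCIe 2.0 x16: ~%.0f GB/s per direction\n", pcie2_x16_gbs);
    return 0;
}

So on paper the K620's DisplayPort 1.2 output has headroom for 4K60 at 10 bpc, and PCIe 2.0 x16 shouldn't be the bottleneck for Photoshop/Lightroom; any problems would more likely be driver or application related.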

Thanks for your insight.

Allan_J_Intel1
Employee

I understand that when you install a discrete video card (Nvidia), the onboard graphics solution of the processor gets electronically disabled.

Allan.

AP16
Valued Contributor III

HD Graphics can potentially output 10 bpc (Deep Color support is at least stated for HDMI), but because the display controllers are in the mainboard chipset, actual support depends on the motherboard vendor (and most of them don't pay attention to it).
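
One way to check what a given driver and board expose is to ask Direct3D whether it can scan out a 10-bit surface. This is only a sketch, and even a positive result says nothing about the cable, monitor, or application path:

#include <windows.h>
#include <d3d11.h>
#include <cstdio>
#pragma comment(lib, "d3d11.lib")

int main() {
    // Create a hardware D3D11 device on the default adapter.
    ID3D11Device* dev = nullptr;
    D3D_FEATURE_LEVEL fl;
    if (FAILED(D3D11CreateDevice(nullptr, D3D_DRIVER_TYPE_HARDWARE, nullptr, 0,
                                 nullptr, 0, D3D11_SDK_VERSION, &dev, &fl, nullptr))) {
        std::printf("D3D11 device creation failed\n");
        return 1;
    }

    // Ask whether the driver supports displaying a 10-bit-per-channel surface.
    UINT support = 0;
    dev->CheckFormatSupport(DXGI_FORMAT_R10G10B10A2_UNORM, &support);
    std::printf("R10G10B10A2 scan-out support: %s\n",
                (support & D3D11_FORMAT_SUPPORT_DISPLAY) ? "yes" : "no");

    dev->Release();
    return 0;
}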

"Is Quadro still a performance leap up from integrated graphics, even in the latest HD 530 incarnation?"

The K620's GPU is capable of about 768 GFLOPS (single precision), while HD Graphics 530 manages only about 441. With the fastest DDR4 memory available, the i7-6700K can actually offer more video memory bandwidth than the K620 (34 vs. 29 GB/s), but that bandwidth is shared with the CPU rather than being dedicated VRAM like the Quadro's. You could also wait for the Skylake-based Xeons, which will have the more potent Iris Pro Graphics with twice as many EUs (execution units, roughly the GPU's cores).
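
For what it's worth, here is the rough arithmetic behind those figures, assuming the published unit counts and clocks (treat them as approximations):

#include <cstdio>

int main() {
    // HD Graphics 530: 24 EUs, each with 2 SIMD-4 FP32 FMA pipes, ~1150 MHz max.
    // An FMA counts as two floating-point operations per cycle.
    const double hd530_gflops = 24 * 2 * 4 * 2 * 1.15;   // ~441.6 GFLOPS

    // Quadro K620: 384 CUDA cores with FMA at roughly 1.0 GHz.
    const double k620_gflops = 384 * 2 * 1.0;            // ~768 GFLOPS

    // Dual-channel DDR4-2133 (shared with the CPU) vs. the K620's
    // 128-bit DDR3 at 1800 MT/s (dedicated VRAM).
    const double ddr4_gbs = 2 * 8 * 2.133;               // ~34.1 GB/s
    const double k620_gbs = (128 / 8) * 1.8;             // ~28.8 GB/s

    std::printf("HD 530 peak: ~%.0f GFLOPS, K620 peak: ~%.0f GFLOPS\n",
                hd530_gflops, k620_gflops);
    std::printf("System DDR4: ~%.1f GB/s shared, K620 VRAM: ~%.1f GB/s dedicated\n",
                ddr4_gbs, k620_gbs);
    return 0;
}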

ACole3
Beginner

Thanks for the information, guys. I know the Xeons are going to be really pricey, and I'm after a Lightroom / Photoshop / Sony Vegas machine to replace my Core 2 Quad Q9450 with 8 GB on a P35 board, which was great in its day but has been overrun by 35 MB camera files and video, especially in the excruciatingly sluggish Lightroom, which I read on the Adobe forums still isn't close to 'snappy' even with the most robust hardware. I have a Quadro K620 on the way, so I won't worry about better on-die graphics.

You're probably saying, "Video - WAIT...", but I don't do a lot of it, and it isn't nearly as objectionable on the Quad as Lightroom is. I HAVE to have more RAM, though, and will start at 32 GB (2x16, of course, to save those slots). I do quite a few theatre lobby prints in Photoshop at 6,000x9,000 px, 16-bit color, 40+ layers, and a file size of about 3 GB. Ever brought four cores and 8 GB to their knees?

By the time my little part of the photo industry overruns the 6700K/64 GB/M.2 SSDs/4K/DisplayPort/USB 3.x and so on, those new quantum compute jobs will be out, cost the same $1,500-2,000 that every new box has since my AST 386SX-20 (not sure what my Apple II cost with the fancy disk drive and 12" monochrome monitor, but probably somewhere in the vicinity), and have the high-performance version of the holographic display for a few bucks more, with prices falling.
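
Just to put numbers on why 8 GB hurts - a rough estimate that ignores Photoshop's layer compression and scratch-file behavior, so the ~3 GB file size and the in-RAM working set are not the same thing:

#include <cstdio>

int main() {
    // One 6,000 x 9,000 px layer at 16-bit RGB, uncompressed in memory.
    const double pixels       = 6000.0 * 9000.0;
    const double bytes_per_px = 3 * 2;                     // 3 channels x 2 bytes
    const double layer_mib    = pixels * bytes_per_px / (1024.0 * 1024.0);
    const int    layers       = 40;

    std::printf("~%.0f MiB per layer, ~%.1f GiB for %d layers uncompressed\n",
                layer_mib, layer_mib * layers / 1024.0, layers);
    return 0;
}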

PugetSystems.com did a study and documented that Lightroom likes fast cores with greatly diminishing returns after six, has a hard time leveraging GPU threads, and likes fast storage and RAM. I'll ask about cores since I'm in the right place: is a multi-core-aware program able to use the 6700K's eight logical cores (four physical cores with Hyper-Threading) in the same way it would use eight single-threaded cores on another chip?
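
Here is a tiny sketch of what software actually sees (nothing 6700K-specific is assumed beyond four cores with Hyper-Threading):

#include <cstdio>
#include <thread>

int main() {
    // hardware_concurrency() reports logical processors. On an i7-6700K that is
    // 8 (4 physical cores x 2 Hyper-Threads), not 8 independent cores.
    const unsigned logical = std::thread::hardware_concurrency();
    std::printf("Logical processors visible to software: %u\n", logical);

    // A multi-core-aware app can schedule work on all 8 logical cores, but two
    // threads sharing a physical core also share its execution units and caches,
    // so the typical gain from Hyper-Threading is closer to 20-30% than to 2x.
    return 0;
}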

Thanks again!
