- It may have a smaller amplifier than the old solution had, but this shouldn't cause poorer quality sound.
- No, absolutely identical in that regard.
- It shouldn't (but with Microsoft crap, you never know).
- No, they should be all the same. It is using the internal (processor-based) audio in all cases. I would note, however, that there is a separate sound chip on every NUC and its output can be directed to speakers or headphones.
- It is sending digital audio over the HDMI connection.
Hope this helps,
Scott, thanks for your reply.
Can you explain what the computer does with audio when outputting over HDMI? My thinking is that the computer just reads the data (the 1's and 0's from the mp3 file), sends the raw data out over HDMI as-is without any processing, and then the receiver turns the 1's and 0's into an analog signal to drive the speakers. Is this right?
Only when I plug in headphones does the computer use its own sound chip, essentially acting as the receiver to convert the data into an analog signal to drive the headphones, right?
Yes, that's essentially correct.
Did you install the latest graphics driver for your system? You can download it here:
By installing the Intel Graphics Driver, the Intel Display/HDMI Audio Driver will be automatically installed.
Make sure you have the latest BIOS installed.
Do you have a high-quality HDMI cable?
It would be a good idea to try a different HDMI cable, ideally a high-quality one, to see if you can get better audio.
To verify you have one of these, look along the cable itself for the printed text that says “High Speed HDMI Cable.” If it is not there, you may not be using the kind of cable your device requires.
Ivan, I am a little confused by your reply in conjunction with Scott's reply.
Scott stated the computer does not do any audio processing, just sends the raw digital audio data as is over HDMI.
As with any digital signal, it is just 1's and 0's -- on or off, works or not, there is no "quality" involved.
However, your answer implies a different BIOS or driver or cable will somehow change the quality of the digital data. I am not sure how to understand that. Can you explain?
Scott is right, and you understood him correctly.
What I mean is in regards to bandwidth, there are two types of HDMI cable: Standard HDMI Cable and High Speed HDMI Cable. Standard cables only have enough bandwidth to push through 720p and 1080i video signals. Note that's 1080i, not 1080p. In order to get access to 1080p and 4K content, you'll need a High Speed HDMI cable. Manufacturers are required to specify whether their cables are Standard or High Speed, the difference being the amount of bandwidth each cable can handle. Any High Speed HDMI cable can transmit 4K content from your source to your TV.
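To put rough numbers on the bandwidth point above, here is a quick Python sketch. The cable throughput limits are approximate figures (Category 1 "Standard" cables are certified at roughly 2.25 Gbit/s, Category 2 "High Speed" at roughly 10.2 Gbit/s), and the pixel rates ignore blanking intervals and encoding overhead, so treat the output as illustrative only.

```python
# Rough uncompressed video data rates vs. approximate HDMI cable limits.
# Limits and rates are simplified assumptions for illustration.

def video_rate_gbps(width, height, fps, bits_per_pixel=24):
    """Raw pixel data rate in Gbit/s (ignores blanking and 8b/10b overhead)."""
    return width * height * fps * bits_per_pixel / 1e9

STANDARD_GBPS = 2.25    # "Standard" (Category 1) cable, approx.
HIGH_SPEED_GBPS = 10.2  # "High Speed" (Category 2) cable, approx.

modes = {
    "720p60":  (1280, 720, 60),
    "1080i60": (1920, 540, 60),   # interlaced: half the lines per field
    "1080p60": (1920, 1080, 60),
    "4K30":    (3840, 2160, 30),
}

for name, (w, h, fps) in modes.items():
    rate = video_rate_gbps(w, h, fps)
    cable = "Standard" if rate <= STANDARD_GBPS else "High Speed"
    print(f"{name}: {rate:.2f} Gbit/s -> {cable} cable")
```

Even with these rough numbers, 720p and 1080i land under the Standard-cable limit while 1080p and 4K do not, which matches the cable categories described above.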
You can see some information here:
It should go without saying, but I will say it anyway: the same applies to audio; bandwidth *is* a factor...
I'm running Ubuntu 16.04 with a 4.5-series kernel on a NUC6i5 and have the same problem with HDMI audio. In my case I compared audio from the headphone jack with audio from my HDMI-connected monitor's headphone jack, and the monitor's audio is noticeably worse. Unfortunately I don't have a receiver with HDMI and a quality DAC to determine whether the problem is caused by the NUC or the monitor. Maybe someone else would like to check?
So far, my understanding is that since computers do not process audio in any way when sending it over HDMI, every computer should sound the same when connected to the same receiver using the same cable. Any difference in sound quality is probably imaginary.
Please let me know if that's correct.
Since you mentioned music, there is more involved. If it were a movie, the 5.1 (or similar) encoding would pass right through unchanged (it should, especially if encoded in Dolby/DTS), but with music there will be some driver involvement, if only to encode to the expected format (LPCM, AC-3, etc.) and, for LPCM at least, to set the sample rate, bit depth, and maybe more.
So, no, for music it won't be straight through. But the expectation is that any difference would not be audible. As for the difference between the two machines, it's likely that something has been inserted between the music file (.flac, .mp3, etc.) and the cable: for example, a mis-applied sample-rate converter, a sound effect (an enhancement from the driver or the music player), or maybe just different levels (as set in the device's properties, or even the music player) - maybe a lower level would sound better?
I would try a different player.
If you are considering cables, I think essentially all HDMI cables made in the past 10 years have been high-speed (this is printed on the cable itself), and in that case the bits flowing through the cable need nothing more: the bits at one end of the cable will be identical to the bits at the other end. The cable is not part of the equation.
P.S. Music files (.flac, .mp3, etc.) are always decoded by the music player, or by something the player invokes, but this should be a constant in the equation if you are using the same player and every setting in the player is the same. This decoded data (almost always LPCM) is then sent to the audio driver, which directs it out to HDMI. The HDMI path may, I believe, carry this exact same decoded LPCM, or it may AGAIN re-encode the data into AC-3 or some other digital format. For example, Dolby Digital Live (AC-3), or its DTS counterpart, will re-encode LPCM.
HDMI has enough bandwidth to send multi-channel LPCM, so you can and should avoid re-encoding if possible. That is especially true if the original source was itself lossy-encoded, like mp3 files, since double-lossy-encoded audio sounds even worse than the mp3 alone. Bluetooth audio has that problem (it re-encodes before transmitting to the Bluetooth headphones). LPCM lossy-encoded to mp3... mp3 decoded back to LPCM (which can be played)... re-encoded (yet another loss of audio information) to some other format to send down HDMI, etc. It's that last re-encode you want to avoid, and you can: just send native LPCM down HDMI.
Audio tracks in movies generally are already encoded in a format that can natively go down to the HDMI device. So for that (movie), at least, audio should always sound identical, since the audio signal is completely untouched by the computer.
They probably have written books on this, so if it sounds like gibberish, find a few books.
P.P.S. When you rip a CD, the data you get off the CD is LPCM at a 44100 Hz sample rate, 16 bits (2 bytes) per sample per channel, two channels. 44100 Hz x 16 bits/channel x 2 channels = 1,411,200 bits per second (about 1.4 Mbps). That is the CD audio data rate (bandwidth) - not much at all by HDMI standards.
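The arithmetic above is easy to check, and to extend to multi-channel formats, with a few lines of Python (the 5.1 and 7.1 configurations below are just common examples, not formats from this thread):

```python
# LPCM data rate = sample rate x bits per sample x channel count.
def lpcm_bps(sample_rate_hz, bits_per_sample, channels):
    """Raw LPCM data rate in bits per second."""
    return sample_rate_hz * bits_per_sample * channels

print(lpcm_bps(44100, 16, 2))    # 1411200  (~1.4 Mbit/s, CD audio)
print(lpcm_bps(48000, 24, 6))    # 6912000  (5.1 @ 48 kHz / 24-bit)
print(lpcm_bps(192000, 24, 8))   # 36864000 (7.1 @ 192 kHz / 24-bit)
```

Even the heaviest of these is only tens of megabits per second, a tiny fraction of HDMI's multi-gigabit video bandwidth, which is why multi-channel LPCM fits comfortably without re-encoding.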
I agreed with Cornel right up until his last statement (Well, last before he edited it). This is patently not the case (it might be true in a perfect world but we don't live in one). There are still plenty of non-high-speed cables around. Worse, there are lots of cables (high-speed or otherwise) that are of such poor quality that transmission of high-quality video and audio is next to impossible (as they say, you get what you pay for). You want to find a cable that says it supports HDMI 1.4a or better.
There are a few settings you can look at to see if you can improve things at the driver level. Right-click on the Volume icon in the system tray and select Playback Devices. Highlight the entry for the Intel SST Audio Device (if it isn't already), click on Properties and then select the Advanced tab. In the Default Format box, click on the down-arrow and select the (typically last) entry with the highest available bit rate and quality. Click on the Test button to verify that the output actually works at this setting. If it doesn't, select a lesser setting. Once you have the highest available bit rate and quality setting that will work, press Apply and then OK. Test with your application of choice and see if this improves things...
Hope this helps,
Digital audio: if even one bit is off, the sound isn't sort of worse, it's obviously worse. It doesn't matter whether it's LPCM or AC-3; even a single flipped bit is noticeable. Well, assuming the errors aren't somehow confined to the LSB (least significant bit) of the LPCM data coming down the cable. Read on first.
16-bit audio: the MSB (most significant bit) - the sign bit - of a 16-bit sample has a weight of 32768 (out of 65536 possible values: 0 to 65535). If it gets flipped (1 instead of 0, or 0 instead of 1), the noise would be heard as a very loud tick. And that's just one wrong bit out of over 1.4 million bits coming down the wire every second (assuming 2-channel, 44100 Hz, 16-bit audio). Again, not even close to merely sounding sort of worse. So it's not the cable. Period.
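A minimal sketch of that point: flipping the MSB of a 16-bit sample produces a full-scale jump, while flipping the LSB changes the value by just 1. The silent sample used here is an arbitrary illustration.

```python
# A 16-bit signed PCM sample spans -32768..32767.
def to_signed16(v):
    """Interpret a 16-bit unsigned bit pattern as a signed PCM sample."""
    return v - 0x10000 if v >= 0x8000 else v

sample = 0x0000                      # digital silence
print(to_signed16(sample ^ 0x8000))  # -32768: MSB flipped -> full-scale click
print(to_signed16(sample ^ 0x0001))  # 1: LSB flipped -> roughly 96 dB below full scale
```

A single MSB error per second would be heard as a loud tick; an LSB error is buried near the noise floor, which is why corruption from a bad link is "obviously worse," not subtly worse.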
One other possibility. Bias. Your folks like the Lenovo, for whatever reason, and don't like the NUC. If they "think" the Lenovo sounds better, it will, to them.
So with DVD, the audio is sent without any processing, but with mp3, the audio is processed by the player...
I know mp3 is compressed, so does "decoding" mean "taking a guess" at what the initial music sounded like, which means different players will guess differently, thus sound different? Is there any website I can read up on to understand how all this works?
BTW, my parents changed the HDMI audio setting from 16-bit to 24-bit, and they feel the NUC sounds closer to the Lenovo now. I'm not sure if that really helped or if it's just in their minds. Any thoughts?
Great to know that your parents are now having a better experience with the Intel® NUC. I do not have a website where you can learn how all this works; maybe someone else would like to share some links with us.