I've found that you can use just about any DC power supply for any DC-powered equipment, provided that the tip is right, the input AC voltage is correct for your region, and the power brick delivers the correct DC voltage at no less than the rated amperage (19.5V/11.8A).
The BIOS-based 230-watt setting is exactly what you turn off if you find a power supply that has the right voltage but is rated for more than 11.8A, since the deliverable wattage is then greater than 230W.
You can run these NUCs on a 19.5V/20A power brick (390W) and turn off the wattage BIOS setting.
Using an underrated power brick is never recommended.
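For reference, the wattage figures above are just volts times amps. A quick sanity check (plain Python, using the numbers from the posts above):

```python
# Power (watts) = voltage (volts) x current (amps)
def watts(volts: float, amps: float) -> float:
    return volts * amps

print(watts(19.5, 11.8))  # 230.1 W -- the stock 230W brick
print(watts(19.5, 20.0))  # 390.0 W -- the larger brick mentioned above
```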
Well, I ended up purchasing a couple of 330W Dell "5X3NX" power supplies (directly from Dell Canada, at great expense, to avoid the potentially counterfeit junk on Amazon).
I figured I needed two, since if I want to disable the BIOS power-sensing limit, everywhere I plug the unit in had better be able to supply more juice, so I shouldn't use the stock power supply anymore. It seems kind of cheap to artificially limit current draw rather than shipping it with a more capable brick.
I visited the Dell link you provided, and I can't see where they describe the DC output voltage or output amperage.
The overall wattage is greater than 230W, but if the power supply isn't rated for 19.5VDC output, you will, at a minimum, have problems with your NUC working properly.
There are three basic scenarios when purchasing a DC power brick that is not an exact match for the OEM version:
- Higher voltage adapter than the device rating: will cause damage to your device.
- Lower current adapter than the device rating: will cause harm to your power cord or adapter.
- Lower voltage adapter than the device rating: might not cause damage, but the device will not work properly.
If the adapter has the correct voltage, but the current is greater than what the device input requires, then you shouldn’t see any problems. For example, if your device needs a 19V / 5A DC input, but you use a 19V / 8A DC adapter, your device will still get the 19V voltage it requires, but it will only draw 5A of current. As far as current goes, the device calls the shots, and the adapter will have to do less work.
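To make those rules concrete, here is a minimal sketch of an adapter-compatibility check in Python. The function name and message strings are my own; the logic is just the three scenarios listed above plus the "more current is fine" case:

```python
def check_adapter(device_volts: float, device_amps: float,
                  adapter_volts: float, adapter_amps: float) -> str:
    """Apply the three mismatch scenarios described above."""
    if adapter_volts > device_volts:
        return "DANGER: higher voltage than the device rating -- will damage the device"
    if adapter_volts < device_volts:
        return "PROBLEM: lower voltage -- the device likely won't work properly"
    if adapter_amps < device_amps:
        return "DANGER: lower current rating -- may harm the cord or adapter"
    return "OK: correct voltage; the device will only draw the current it needs"

# The example from the paragraph above: a 19V/5A device on a 19V/8A adapter.
print(check_adapter(19.0, 5.0, 19.0, 8.0))  # OK
```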
Checked BIOS this morning to see if Power Sense was set.
This is the BIOS setting that needs to be off if you're using a more capable power supply than the 230W brick shipped with the unit.
I assumed it was anything providing 19.5VDC and *more than* 230W.
[edited] Reading the exact section of the technical specification actually shows that my assumption was wrong.
When enabled, the power sense will monitor the input power from the power supply and will assert PROCHOT# (Processor Hot) to the CPU if the power [draw] is high enough that it risks causing the power adaptor to shut down.
the "[draw]" in the quoted section above, is for clarity.
My system shipped with Power Sense off, and I have the standard brick, which makes sense given the tech spec.
My conclusion is exactly the opposite. No, it doesn't make sense. If you have the standard power brick, you want this circuit enabled to protect against overdraw causing an abrupt system shutdown and, potentially, a permanent failure of the power brick. Only if you have a more-capable power brick should you consider not enabling this circuit.
Isn't it funny how contrary thoughts will always pop into your head literally a split second after you press the "Add Reply" button? Curses, Red Baron!
Anyway, my contrary thought is this: what constitutes a fully-loaded system? It has to be one that is consuming the maximum power possible across all connectors. Now, I do not have any information on the power budget of the overall design, but I do know that you are not going to have devices connected to every USB port that are drawing the maximum current (you have at least one, and possibly two, in use for keyboard and mouse). Bottom line: I would think the draw is going to be well below the full 230W, so the protection circuit is likely not necessary -- but what advantage does disabling it have? Having it enabled protects you against an extreme case and, in normal times, is completely benign, because no, disabling it won't result in a potentially faster system: the maximum power budget of the processor and graphics solutions is [well, supposed to be] fixed.
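As a back-of-envelope illustration (every figure here is a hypothetical placeholder, not from any spec sheet), even a generous worst-case peripheral draw leaves plenty of margin:

```python
# All figures below are illustrative guesses, not measured or from a datasheet.
USB3_PORT_W = 4.5         # ~5V x 0.9A per USB 3.x port
NUM_USB_PORTS = 6         # hypothetical port count
CPU_GPU_BUDGET_W = 150.0  # hypothetical combined package power limit

worst_case = CPU_GPU_BUDGET_W + NUM_USB_PORTS * USB3_PORT_W
print(f"Worst-case draw: {worst_case} W out of a 230 W brick")  # 177.0 W
```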
Yeah, I really wonder how much the CPU/GPU can potentially draw during turbo/burst-style operation when running below TDP. That is, does this option have any practical impact if you aren't also charging something via a USB port, etc.?
If I feel ambitious, maybe I will try doing some benchmarks. My hunch is you're right, and the option will have no measurable impact on performance.
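For anyone who beats me to it, a minimal sketch of the comparison I have in mind (pure Python, single-threaded, so it only loads one core; a real test would want to stress all cores and the GPU too):

```python
import hashlib
import time

def cpu_burn(seconds: float = 30.0) -> int:
    """Hash in a tight loop to keep one core busy; returns iterations completed."""
    data = b"x" * 4096
    count = 0
    deadline = time.monotonic() + seconds
    while time.monotonic() < deadline:
        data = hashlib.sha256(data).digest() * 128  # keep the buffer at 4096 bytes
        count += 1
    return count

# Run once with Power Sense enabled and once with it disabled (reboot into the
# BIOS between runs), then compare iteration counts. My hunch: they'll match.
print(cpu_burn())
```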