What do the phrases “My data center power bill is too high” and “We have an energy crisis in our data center” have in common?  The answer is that each gets the meaning of the words POWER and ENERGY wrong. The two words are misused so often that it can be hard to tell what people are actually talking about. I thought I’d write down some thoughts about what they mean, or should mean, to the CIO.


Why should the CIO care? Because POWER and ENERGY both drive data center Total Cost of Ownership (TCO), along parallel but different tracks. Using the terms precisely helps separate the two cost drivers: you’d do one thing to fix a power problem in a data center, and quite another to reduce energy costs.


Why does the CIO care about POWER?  Because POWER drives Capital Cost. Power is (roughly) the rate at which you push electrons through wires and into transistors, ultimately turning them into heat that must be dissipated. More electrons per second means more POWER. The greater the POWER, the bigger the substation, the more capacity your UPS system needs, the more wiring infrastructure you need and, likely, the more servers you will be installing. The greater the POWER, the more air handling equipment you will need. Think of the cost of POWER as the cost of capacity; this typically runs $5 to $15 per Watt for a modern data center. The catch? Once you buy the capacity, you have it (and pay for it) whether you use it or not. On the other hand, if you run out of capacity you’ll need to buy more, which is expensive and possibly disruptive to the business.
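To make the capacity math concrete, here’s a quick back-of-the-envelope sketch. The 1 MW facility is hypothetical; the $5 to $15 per Watt range is just the figure quoted above:

```python
# Back-of-the-envelope capital cost of POWER capacity, using the
# $5-$15/Watt range above. The 1 MW facility size is illustrative.

def capacity_capital_cost(capacity_watts: float, cost_per_watt: float) -> float:
    """Capital cost of provisioning data center power capacity."""
    return capacity_watts * cost_per_watt

capacity_w = 1_000_000  # a hypothetical 1 MW data center
for cost_per_watt in (5.0, 10.0, 15.0):
    cost = capacity_capital_cost(capacity_w, cost_per_watt)
    print(f"${cost_per_watt:.0f}/W -> ${cost:,.0f} of capital")

# $5/W -> $5,000,000 of capital
# $10/W -> $10,000,000 of capital
# $15/W -> $15,000,000 of capital
```

And remember: that capital is committed whether the capacity is used or not.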


Why does the CIO care about ENERGY? Because the ENERGY bill is likely the biggest operational cost line item for your data center. People often cite a rule of thumb like “a Watt’s a buck.” Taken literally it’s a nonsense statement, but what they likely mean is that a Watt consumed continuously for a year (8,760 hours, which equates to 8.76 kWh of ENERGY) times a nominal ENERGY cost of $0.11/kWh is roughly $0.96 - about a buck. Unlike POWER, you pay only for the ENERGY you actually use. So, for instance, by refreshing your data center with more energy-efficient modern servers, you should see a big decrease in your ENERGY bill even if your power infrastructure remains unchanged.
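The arithmetic behind “a Watt’s a buck” is easy to check. A minimal sketch, assuming the nominal $0.11/kWh rate above (your utility’s rate will differ):

```python
# Checking the "a Watt's a buck" rule of thumb.
# Assumes the nominal $0.11/kWh rate from the text; actual rates vary.

HOURS_PER_YEAR = 24 * 365          # 8,760 hours
ENERGY_PRICE = 0.11                # $/kWh, nominal

watts = 1.0
kwh_per_year = watts * HOURS_PER_YEAR / 1000   # 8.76 kWh of ENERGY
annual_cost = kwh_per_year * ENERGY_PRICE      # ~$0.96 - about a buck

print(f"{watts:.0f} W for a year = {kwh_per_year:.2f} kWh = ${annual_cost:.2f}")
# 1 W for a year = 8.76 kWh = $0.96
```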


This kind of thinking helps decode terms like Power Efficiency and Energy Efficiency. I’ve never liked Power Efficiency when it’s used as a substitute for energy efficiency, but in the context above it could mean something useful: how much of the (paid-for) power infrastructure is actually being used - a metric for stranded capacity. Energy Efficiency, by contrast, has a well-defined meaning: the amount of work your data center produces per unit of energy consumed (and paid for).
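To make the contrast explicit, here’s one way the two metrics might be computed. This is a sketch of my own; the function names and sample figures are illustrative assumptions, not standard industry definitions:

```python
# Two different "efficiency" metrics for a data center, per the
# distinction above. Names and sample values are illustrative only.

def power_efficiency(actual_load_w: float, provisioned_capacity_w: float) -> float:
    """Fraction of the (paid-for) power infrastructure actually in use.
    The remainder is stranded capacity."""
    return actual_load_w / provisioned_capacity_w

def energy_efficiency(useful_work: float, energy_kwh: float) -> float:
    """Work produced per unit of energy consumed. Units of 'work' depend
    on what the data center does: transactions, queries, renders, etc."""
    return useful_work / energy_kwh

# A hypothetical facility: 1 MW provisioned, 600 kW average draw.
print(f"Power efficiency:  {power_efficiency(600_000, 1_000_000):.0%} of capacity in use")

# Say 50 million transactions were served on 5,256 kWh of energy.
print(f"Energy efficiency: {energy_efficiency(50e6, 5_256):,.0f} transactions/kWh")
```

Note the asymmetry: the first metric improves by growing the IT load into capacity you have already paid for; the second improves by getting more work out of every kWh you buy.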


So there you have it. ENERGY and POWER. Although often used interchangeably in the vernacular, they have very different meanings. Some might say the difference is splitting hairs, but the reality is that they contribute in very different ways to the cost structure of the data center. And I would argue that being precise about what we mean, when the cost structure of a multi-million-dollar data center is at stake, is far from an academic exercise.


So that’s why the difference between ENERGY and POWER matters to the CIO.


Does that make sense? Did I get it right?