the Power Consumption of your servers (watts) or the Power Efficiency of your servers (performance / watt)
... or maybe you prefer the Performance per Watt per SqFt argument
I have spent much of the last several years discussing this topic with IT professionals around the world, and opinions vary widely.
I believe that Performance per Watt is a better measure of overall value for the data center and server room.
The power consumed by a server is an important measure, but power only comparisons can be misleading.
Example: If server 'A' consumes 50W less power than server 'B', then it can save IT roughly $70 per year per server in power and cooling costs (assumes $0.08/kWh power costs and cooling costs equal to power costs). Scale that savings per server across a data center with thousands of servers and it can be a pretty impressive number.
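The arithmetic behind that savings figure can be sketched in a few lines. The rate and the cooling assumption below are the ones from the example; the 5,000-server fleet size is a hypothetical illustration.

```python
# Back-of-the-envelope annual savings from a per-server wattage reduction.
# Assumptions (from the example): $0.08/kWh power cost, and cooling
# cost equal to power cost. The 5,000-server fleet is hypothetical.

HOURS_PER_YEAR = 24 * 365  # 8,760 hours


def annual_savings(watts_saved, dollars_per_kwh=0.08, cooling_factor=1.0):
    """Annual power + cooling savings, in dollars, for a wattage delta."""
    kwh_saved = watts_saved * HOURS_PER_YEAR / 1000.0  # 50 W -> 438 kWh/yr
    power_cost = kwh_saved * dollars_per_kwh
    return power_cost * (1 + cooling_factor)  # add cooling at 1:1


per_server = annual_savings(50)  # ~ $70 per server per year
fleet = per_server * 5000        # scaled across a 5,000-server fleet
print(f"${per_server:.2f} per server, ${fleet:,.0f} across 5,000 servers")
```

At $0.08/kWh, a 50W delta works out to about $70 per server per year once cooling is counted, and about $350,000 per year across 5,000 servers.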
However, if the server with 50W lower power also delivers lower application performance, are the power savings worth it? The answer, of course, depends, but in my experience it is generally a resounding no.
Example: What if server B outperforms server A (the 50W lower-power server) by 33%? This means that you need to deploy more 'A' servers to get the same performance as 'B' servers. In fact, with a 33% performance advantage, you need only 3 'B' servers for every 4 'A' servers. The higher performance per Watt delivered by server B reduces acquisition costs, reduces power consumption (fewer servers), minimizes space, and eases manageability. This example is shown graphically above.
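The fleet-level effect of that 3-for-4 trade can be sketched as below. The per-server wattages are hypothetical illustration values chosen so that server A draws 50W less than server B, as in the example.

```python
# Fleet-level power comparison for the 3-for-4 example. Server B delivers
# 33% higher per-server performance; server A draws 50 W less per server.
# The wattages are hypothetical values for illustration.

A_WATTS = 350  # lower-power server
B_WATTS = 400  # draws 50 W more, but delivers 1.33x the performance

# From the example: 3 'B' servers deliver the same throughput as 4 'A' servers.
a_count, b_count = 4, 3

a_fleet_watts = a_count * A_WATTS  # 4 x 350 W = 1,400 W
b_fleet_watts = b_count * B_WATTS  # 3 x 400 W = 1,200 W

print(f"A fleet: {a_count} servers, {a_fleet_watts} W total")
print(f"B fleet: {b_count} servers, {b_fleet_watts} W total")
```

Even though each 'B' server draws 50W more, the smaller 'B' fleet consumes less total power for the same delivered performance, which is the performance-per-watt argument in miniature.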
What do you think? What power and performance metrics do you look at before purchasing servers?