
I recall a conversation not too long ago with my program manager, who had embarked on his maiden Lean Six Sigma certification and was evangelizing the elimination of waste from a given process cycle.

 

This prompted me to think about organizational IT functions and what they currently entail. What in the process isn't useful? What can be reduced without compromising deliverables?

 

Today's cloud environment is hosted in mega data centers, and many companies host their private clouds in their enterprise data centers. Power efficiency is now a core concern for every leading company. Are our data centers designed correctly?

 

Walk into any data center and you will find it running at 21C to 22C. Data centers are over-cooled, designed to run at maximum cooling capacity. ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) publishes allowable ranges that extend into the high thirties Celsius for some equipment classes. Since 2008, ASHRAE's operating guidelines have advised that data centers can run at temperatures up to 27C (81F). However, few data center operators adhere to these guidelines, as evidenced by Service Level Agreements (SLAs) that state operating temperatures of 22C. Clearly, a paradigm shift must precede any technical corrections.

 

For now, let's get the basics right, starting with 27C operations. What does running at 27C actually mean? It refers to the supply air temperature at the server intake in the cold aisle. So, is the solution as simple as raising the operating temperature? Is that all that is necessary?

 

It is a misconception that simply allowing the room to heat up will save on cooling costs. Proper equipment cooling requires an engineered solution that adheres to Delta T specifications in terms of both cooling tonnage and airflow. Data centers must be designed with adequate cooling capacity for high-density racks. Airflow must be segregated properly, with hot air in the hot aisle and cold air in the cold aisle, to form a closed-loop cooling path. It is essential to ensure sufficient airflow velocity to cool the equipment racks. The Computer Room Air Conditioning (CRAC) units must match the equipment heat load and maintain the specified Delta T. Exceeding the Delta T specs has operational consequences, so understanding those specs and how your data center performs against them is crucial.
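To make the Delta T relationship concrete, here is a minimal back-of-the-envelope sketch in Python of the airflow a rack needs for a given heat load. The 10 kW rack and the 12C Delta T below are illustrative assumptions, not vendor specifications.

# Minimal sketch: required cooling airflow for a rack, from first principles.
# Assumes standard air density and specific heat near sea level; the rack
# power and Delta T figures below are illustrative, not from any vendor spec.

AIR_DENSITY = 1.2       # kg/m^3, approximate for air at ~20-27C
AIR_CP = 1005.0         # J/(kg*K), specific heat of air at constant pressure
CFM_PER_M3S = 2118.88   # 1 m^3/s expressed in cubic feet per minute

def required_airflow(rack_power_w: float, delta_t_c: float) -> float:
    """Volumetric airflow (m^3/s) needed to remove rack_power_w watts
    while holding the inlet-to-outlet temperature rise to delta_t_c."""
    return rack_power_w / (AIR_DENSITY * AIR_CP * delta_t_c)

if __name__ == "__main__":
    power = 10_000.0   # a hypothetical 10 kW high-density rack
    delta_t = 12.0     # example Delta T spec across the rack, in C
    q = required_airflow(power, delta_t)
    print(f"{q:.2f} m^3/s (~{q * CFM_PER_M3S:.0f} CFM)")
    # A tighter Delta T spec demands proportionally more airflow:
    print(f"{required_airflow(power, 8.0) * CFM_PER_M3S:.0f} CFM at Delta T = 8C")

The takeaway: halve your Delta T budget and you must move nearly twice the air, which is why airflow segregation and CRAC sizing come before any setpoint change.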

 

What is the next step?

Once IT equipment is cooled effectively and you have ensured your data center is designed in accordance with basic cooling principles, the next step is to raise the temperature from 21C to 27C. This should be done progressively, raising the temperature one degree at a time, by adjusting the sensor settings or the setpoints of the Computer Room Air Conditioning (CRAC) units. Measure the CRAC supply and return temperatures, the equipment inlet and outlet temperatures, the temperatures under the raised floor and in the ceiling plenum, and the air pressure. Use a Computational Fluid Dynamics (CFD) tool to perform an analysis prior to re-calibrating the room. Your building management system should record the key vital signs, including the CRAC and cooling-loop supply and return temperatures, before and after each change. All of this should be well documented and in line with your cooling equipment specifications.
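As a rough illustration of that one-degree-at-a-time ramp, here is a Python sketch. The functions read_inlet_temps() and set_crac_setpoint() are hypothetical stand-ins for whatever your building management system actually exposes, and the 24-hour soak period per step is an assumption, not a standard.

# Minimal sketch of a one-degree-at-a-time setpoint ramp with a safety check.
# read_inlet_temps() and set_crac_setpoint() are hypothetical stand-ins for
# your building management system (BMS) and CRAC control interfaces.

import time

TARGET_C = 27.0
STEP_C = 1.0
SOAK_HOURS = 24          # assumption: let the room stabilize a full day per step
MAX_INLET_C = 27.0       # stay within the ASHRAE recommended envelope

def read_inlet_temps() -> list[float]:
    """Hypothetical: poll the server-inlet sensors in the cold aisle."""
    raise NotImplementedError("wire this to your BMS or sensor network")

def set_crac_setpoint(temp_c: float) -> None:
    """Hypothetical: push a new supply-air setpoint to the CRAC units."""
    raise NotImplementedError("wire this to your CRAC controls")

def ramp(current_c: float) -> None:
    while current_c < TARGET_C:
        current_c = min(current_c + STEP_C, TARGET_C)
        set_crac_setpoint(current_c)
        time.sleep(SOAK_HOURS * 3600)           # let the room settle
        hottest = max(read_inlet_temps())       # worst-case cold-aisle inlet
        if hottest > MAX_INLET_C:
            set_crac_setpoint(current_c - STEP_C)  # back off and investigate
            raise RuntimeError(f"inlet hit {hottest:.1f}C; halting ramp")

The point of the soak-and-check loop is the same as the documentation step above: every degree is a measured, reversible change, never a blind jump to 27C.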

 

What are the rewards?

To demonstrate the rewards, consider a 2MW IT load data center, a model that closely parallels typical customer setups. At 21C operations, total annual power consumption is assessed at 19.6 GWh, with a Power Usage Effectiveness (PUE) of 1.60 and infrastructure overhead energy of 7.39 GWh. When the operating temperature is raised to 27C, overhead energy is reduced by 23%. The advantages are greater when an economizer is set up: overhead infrastructure energy drops by 37% and PUE reaches 1.38. Assuming a power cost of $0.08 per kWh, this translates to a conservative $212,000 in annual savings, which is no small change, plus a carbon footprint reduction of 1,280 tons. All this can be accomplished with zero impact on applications.
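For those who want to check the arithmetic, the short Python sketch below reproduces these figures from the stated total energy, PUE, and overhead. The IT load energy is derived rather than quoted, and the dollar figure it produces is slightly above the conservative $212,000 cited above.

# Reproducing the arithmetic behind the 2MW example above.
# Only the total energy, PUE, overhead, and power cost come from the text;
# the IT energy and savings are derived from them.

total_gwh = 19.6          # annual energy at 21C operations
pue_21c = 1.60
overhead_gwh = 7.39       # infrastructure overhead at 21C
power_cost = 0.08         # $ per kWh

it_gwh = total_gwh / pue_21c                       # ~12.25 GWh of pure IT load
saved_gwh = overhead_gwh * 0.37                    # 37% overhead cut with economizer
new_pue = (it_gwh + overhead_gwh * 0.63) / it_gwh  # ~1.38

print(f"IT load energy:  {it_gwh:.2f} GWh")
print(f"Overhead saved:  {saved_gwh:.2f} GWh")
print(f"New PUE:         {new_pue:.2f}")
# Prints roughly $218,000; the article quotes a conservative $212,000.
print(f"Annual savings:  ${saved_gwh * 1e6 * power_cost:,.0f}")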

 

This also adheres to the 2008 operating guidelines, which set the recommended temperature range at 18C to 27C. Can the temperature be set higher? Absolutely! That is why ASHRAE defined Classes A1 through A4 in its 2011 thermal guidelines white paper, which move the data center toward 40C high-ambient operations. That will be a topic for a future discussion.

 

For many enterprises, there is renewed interest in setting up a solid and efficient cloud infrastructure that runs like a utility: always available. This means setting up the private cloud, sizing it properly, and hosting applications in the appropriate data center.

 

For cloud computing hosting providers and builders, there is a need to deliver high uptime at a reasonable cost. Operate your data center right, and the savings will flow to your bottom line. A 1C increase in operating temperature can translate to roughly 4% savings in chiller power. Run the data center efficiently, and your business will enjoy a competitive edge. Getting the infrastructure right is critical to ensuring customer confidence in the cloud, and this has a multiplier effect, as customers often sign up with providers based on recommendations. Many customers are keen to sign up with hosting providers who lead in sustainability stewardship. Being green is not a choice; it is a mandate, and it makes sound business sense.
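As a quick illustration of that rule of thumb, the sketch below applies 4% per degree across the 21C-to-27C move discussed here. Whether the savings add linearly or compound degree over degree varies by chiller plant, so both readings are shown as assumptions.

# Quick sketch of the "1C raise ~= 4% chiller power" rule of thumb.
# Real plants differ; linear and compound interpretations are both shown.

SAVINGS_PER_DEGREE = 0.04

def chiller_savings(degrees_raised: float, compound: bool = True) -> float:
    """Fractional chiller power saved for a given setpoint raise in C."""
    if compound:
        return 1 - (1 - SAVINGS_PER_DEGREE) ** degrees_raised
    return SAVINGS_PER_DEGREE * degrees_raised

for d in (1, 3, 6):  # 6 degrees is the 21C -> 27C move discussed in this post
    print(f"+{d}C: {chiller_savings(d):.1%} (compound), "
          f"{chiller_savings(d, compound=False):.0%} (linear)")

Either way you read it, the six-degree move recovers on the order of a fifth of the chiller power, which is consistent with the overhead reductions described above.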
