IT Peer Network

Three posts authored by JohnMusilli

Do not wait for an alarm or failure: give your data center a "health check" using a simple handheld infrared (IR) gun. This tool can provide early warning of electrical breaker overload, CRAC unit calibration issues, server supply-air stratification, and the source of CRAC short-cycling. See the image below and use the number references as a legend. The tool costs between $100 and $500; the higher-priced guns are recommended for their additional features.

 

1. Check the temperature range of breakers

Check the panel cover for ambient temperature, then the temperature range of each breaker. Look for outliers, hot and cold. A hot breaker could indicate a loose wire or an overloaded circuit.

2. Check under the floor for poor airflow

Floor tile temperature is a quick check for restricted airflow or for tiles beyond the effective reach of the CRAC.

3. Check the actual temperature of delivered (supply) air

The concrete in front of the CRAC should read around 55 degrees Fahrenheit.

4. Server intake temperature low on the rack frame

Compare the rack frame temperature at the first server position with the temperature at the top of the rack; the difference shows air temperature stratification or rack heating from conductive heat loads. A range of 6 degrees is good. If it is more than 10 degrees, look for hot air mixing in from above or behind the servers. A maximum intake air temperature greater than 90 degrees is a great risk to the server platform. (A scripted version of these checks appears after item 6 below.)

5. Server intake (supply) temperature high on the rack frame

A rise of 6 to 10 degrees over the bottom-of-rack reading spans the range from good to poor. (See the note in item 4 above.)

6. Return air temperature off the sheet-metal frame

The temperature at the center of the CRAC filter bank is a good indication of the actual mixed ambient air returning to the CRAC. Compare this reading with the CRAC's thermal readout for an indication of short-cycling or a bad CRAC temperature sensor.
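The thresholds in checks 3 through 6 are easy to script once you write the IR readings down. Here is a minimal sketch in Python; the function names, reading names, and tolerance values are illustrative assumptions, not part of any vendor tool.

```python
# Sketch: evaluate handheld IR gun readings against the rule-of-thumb
# thresholds above. Names and tolerances are illustrative assumptions.

def check_rack(bottom_intake_f: float, top_intake_f: float) -> str:
    """Flag rack air-temperature stratification (checks 4 and 5)."""
    spread = top_intake_f - bottom_intake_f
    if top_intake_f > 90:
        return "CRITICAL: intake above 90 F risks the server platform"
    if spread > 10:
        return "POOR: look for hot air mixing from above or behind servers"
    if spread > 6:
        return "MARGINAL: watch the stratification trend"
    return "GOOD: spread of 6 F or less"

def check_crac(supply_air_f: float, crac_readout_f: float,
               filter_bank_f: float) -> list[str]:
    """Flag CRAC calibration and short-cycling hints (checks 3 and 6)."""
    notes = []
    if abs(supply_air_f - 55) > 5:   # concrete in front of CRAC ~55 F
        notes.append("supply air far from ~55 F: check CRAC calibration")
    if abs(filter_bank_f - crac_readout_f) > 5:   # 5 F tolerance assumed
        notes.append("return air disagrees with CRAC readout: "
                     "possible short-cycling or bad sensor")
    return notes

print(check_rack(bottom_intake_f=68, top_intake_f=80))       # POOR
print(check_crac(supply_air_f=62, crac_readout_f=70, filter_bank_f=78))
```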


See previous blogs:

Data Center Toolbox for Power and Cooling

Data Center Toolbox "Watts per Square Foot of What?"

See published articles:

http://searchdatacenter.techtarget.com/originalContent/0,289142,sid80_gci1275008,00.html

http://www.cio.com.au/index.php/id;537667845;fp;4;fpid;51245

http://www.computerworld.com/action/article.do?command=viewArticleBasic&articleId=9028098&pageNumber=1


Please comment on and rate this blog.

 

 

New topics coming soon:

"Generic Data Center Racking, Cost and Space Benefits"

"Data Center Layer One and Structured Cabling Designs, Without Costly Patch Panel Installations"

"Server Power Cord Management"

"Humidity Management: To Humidify or Not to Humidify"

 

Disclaimer

 

The opinions, suggestions, management practices, room capacities, equipment placement, infrastructure capacity, and power and cooling ratios are strictly the opinions and observations of the author and presenter.

The statements, conclusions, opinions, and practices shown or discussed do not in any way represent the endorsement or approval for use by Intel Corporation.

Use of any design practices or equipment discussed or identified in this presentation is at the risk of the user and should be reviewed by your own engineering staff or consultants prior to use.

 

 

Things You Need to Operate a Successful Data Center Infrastructure.

This is number 2 in a series of Toolbox topics.

 

If you have spent more than three months in data center operations, someone has asked, "What is your Watts per Square Foot (W/sq. ft.) data center design?"

 

Odds are your room design is somewhere between 40 and 100 watts per sq. ft. This value most likely describes the room envelope, the wall-to-wall area including staging, telecom, tape storage, PDUs (Power Distribution Units), and CRAC units (Computer Room Air Conditioners); see the diagram below. Although this is the correct answer from the architect's perspective and for the electrical and mechanical capacity construction designs, it causes great confusion in the industry.

What we really want to describe and reference is the area or space the work is being performed in; in other words, where the POWER (heat) is delivered and COOLING (heat removal) is required. To better understand this concept and use this knowledge to communicate with others, please review the drawing below, which shows the possible interpretations of a watts-per-square-foot data center design. Note, as you go through the exercise, that I started with a 50 W/sq. ft. room and, by re-evaluating my environment, created a room design at 130 W/sq. ft. without spending a dime! The point is: do not be confused by the facts. You may have a 50 W/sq. ft. room, but you can produce 130 W/sq. ft. of capacity. (The sketch after the list below works through this arithmetic.)



 

Data Center Math

Watts Per Square Foot Of What?


  • Room Envelope = gross raised-floor sq. ft. This is the wall-to-wall space of the entire room, including ramps, tape storage, PDUs, CRACs, and the staging area.

  • Production Area = servers plus support equipment (traditional layout). This area is represented in blue and is the actual recommended access space (48 in. front, 36 in. rear) PLUS the direct support equipment (CRACs) that needs to be near the heat loads.

  • Equipment Footprint or Work Cell = racks + required access space (~16 sq. ft. per rack). This is the recommended access space (48 in. front, 36 in. rear) plus the average rack size (24 x 40 in.).

  • Server Rack Load = the actual electrical load of the installed server base in kW (kilowatts).
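To make the 50-versus-130 point concrete, here is a minimal sketch in Python. The total load and all of the square footages are hypothetical numbers chosen only to show how the same load produces very different W/sq. ft. figures depending on which area you divide by.

```python
# Sketch: one electrical load divided by the different area definitions
# above. Every number here is hypothetical, chosen for illustration.

total_load_w = 500_000  # 500 kW of installed server load (assumed)

areas_sqft = {
    "room envelope (wall to wall)": 10_000,       # incl. ramps, PDUs, CRACs
    "production area (servers + support)": 5_500,
    "equipment footprint / work cell": 240 * 16,  # 240 racks x ~16 sq. ft.
}

for name, sqft in areas_sqft.items():
    print(f"{name}: {total_load_w / sqft:.0f} W/sq. ft.")

# room envelope (wall to wall): 50 W/sq. ft.
# production area (servers + support): 91 W/sq. ft.
# equipment footprint / work cell: 130 W/sq. ft.
```

Same room, same servers, same utility bill; only the denominator changed.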

 

Please see my earlier blog, Data Center Toolbox for Power and Cooling. Please comment on and rate this blog. New topics coming soon:

  • "Use of a Hand Held IR (Infra Red) Gun for a Data Center Health Check"

  • "Generic Data Center Racking, Cost and Space Benefits"

  • "Data Center Layer One and Structured Cabling Designs, Without Costly Patch Panel Installations"

  • "Server Power Cord Management"

 

Disclaimer

  • The opinions, suggestions, management practices, room capacities, equipment placement, infrastructure capacity, and power and cooling ratios are strictly the opinions and observations of the author and presenter.

  • The statements, conclusions, opinions, and practices shown or discussed do not in any way represent the endorsement or approval for use by Intel Corporation.

  • Use of any design practices or equipment discussed or identified in this presentation is at the risk of the user and should be reviewed by your own engineering staff or consultants prior to use.

 

Things you need to operate a successful Data Center infrastructure.

 

This is the first in a series of Toolbox topics. Others include:

"Watts per Sq. Ft. of What?"

"Use of a Hand Held IR (Infra Red) Gun for a Data Center Health Check"

"Generic Data Center Racking, Cost and Space Benefits"

"Data Center Layer One and Structured Cabling Designs, Without Costly Patch Panel Installations"

 

As a data center operations manager, you are responsible for the stability of the physical infrastructure of your environment. Often this requires support from maintenance and/or engineering staff to provide you with capacity and room-loading calculations. To do your job efficiently and not be reliant on others, you need a few tools to help you help yourself. The first in the series covers:

 

Data Center Math

Power and Thermal Measurement

 

Watts (W) = Volts (V) x Amps (A)

 

kW (kilowatts) = (Volts x Amps) / 1,000 (this is electrical heat)

 

British Thermal Unit (BTU): a measure of heat

 

One watt of power requires 3.412 BTU/hr to cool

 

12,000 BTU/hr = one ton of cooling

 

Example:

120 Volts x 160 Amps = 19,200 Watts = 19.2 kW

19,200 W x 3.412 BTU/hr per watt = 65,510 BTU/hr ≈ 5.5 tons of cooling required

One ton of cooling = 12,000 BTU/hr
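The same arithmetic wraps neatly into a couple of helper functions. A minimal sketch in Python using the standard conversions above; the function names are my own.

```python
# Sketch: power-to-cooling conversions from the formulas above.
# Function names are illustrative; the constants are standard.

BTU_HR_PER_WATT = 3.412   # 1 W of electrical load = 3.412 BTU/hr of heat
BTU_HR_PER_TON = 12_000   # 1 ton of cooling = 12,000 BTU/hr

def watts(volts: float, amps: float) -> float:
    """W = V x A."""
    return volts * amps

def tons_of_cooling(load_w: float) -> float:
    """Tons of cooling required for an electrical load in watts."""
    return load_w * BTU_HR_PER_WATT / BTU_HR_PER_TON

load = watts(120, 160)    # 19,200 W = 19.2 kW
print(f"{load:,.0f} W -> {tons_of_cooling(load):.1f} tons of cooling")
# 19,200 W -> 5.5 tons of cooling
```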

 

 

Power Basics:

Reduce all loads to watts as the common measurement, including cooling. If you use watts as the common unit, you do not need amps or voltage when determining capacities.
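For example, a CRAC's rated tonnage converts straight to watts of heat removal (12,000 BTU/hr per ton divided by 3.412 BTU/hr per watt, about 3,517 W per ton), so supply and demand can be compared in one unit. A minimal sketch, with a hypothetical CRAC count and IT load:

```python
# Sketch: compare IT load and cooling capacity in one unit (watts).
# The CRAC count, tonnage, and IT load below are hypothetical.

WATTS_PER_TON = 12_000 / 3.412   # ~3,517 W of heat removal per ton

cooling_capacity_w = 4 * 20 * WATTS_PER_TON   # four 20-ton CRACs (assumed)
it_load_w = 250_000                           # assumed installed load

print(f"Cooling capacity: {cooling_capacity_w:,.0f} W")
print(f"IT load:          {it_load_w:,.0f} W")
print("Headroom OK" if cooling_capacity_w > it_load_w else "Over capacity")
# Cooling capacity: 281,360 W / IT load: 250,000 W -> Headroom OK
```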

 

 

 

Power Rough Rules of Thumb

• One average rack of 2U to 8U servers (40U total) uses ~5,000 W

• One disk-type storage bay (24 inches) is ~5,000 W

• One network equipment rack (~30 to 40U of switches) requires 5,000 W to 6,000 W

• The average server landing power requirement with redundant network and redundant disk storage is 400 W per server

• The average server landing power requirement with a single network switch and single storage connectivity is 300 W per server

• The average 1U-server rack with 40 servers per rack ranges between 7,500 W and 9,000 W depending on utilization

• One blade center is 3,600 W to 4,000 W
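These rules of thumb work well as a quick lookup table for a first-pass power budget. A minimal sketch in Python; the table keys and the sample floor plan are my own illustrative choices.

```python
# Sketch: rough power budget from the rules of thumb above.
# Table keys and sample counts are illustrative assumptions.

WATTS_PER_UNIT = {
    "2U-8U server rack (40U)": 5_000,
    "disk storage bay (24 in.)": 5_000,
    "network rack (30-40U switches)": 6_000,  # high end of 5,000-6,000 W
    "1U server rack (40 servers)": 9_000,     # high end, fully utilized
    "blade center": 4_000,                    # high end of 3,600-4,000 W
}

floor_plan = {  # hypothetical room contents
    "2U-8U server rack (40U)": 20,
    "disk storage bay (24 in.)": 4,
    "network rack (30-40U switches)": 2,
    "blade center": 6,
}

total_w = sum(WATTS_PER_UNIT[kind] * count for kind, count in floor_plan.items())
print(f"Estimated load: {total_w:,} W ({total_w / 1000:.0f} kW)")
# Estimated load: 156,000 W (156 kW)
```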

 

Cooling Rough Rules of Thumb

• One blade center @ 3,600 W requires 1 ton of cooling

• One rack of 2U through 8U servers (40U total) requires 1.5 (one and one half) tons of cooling

• Industry-standard rack doors can restrict up to 40% of the airflow

• Using relative humidity set points of 50%, plus or minus 20%, will reduce alarms and operating cost

• Available supply air temperature at the server intake can be as high as 80 degrees Fahrenheit without issues
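Combining the two sets of rules gives a first-pass cooling budget. A minimal sketch, reusing the hypothetical 156 kW load estimated above:

```python
# Sketch: first-pass cooling estimate for the hypothetical 156 kW load
# above, using 3.412 BTU/hr per watt and 12,000 BTU/hr per ton.

total_w = 156_000
tons = total_w * 3.412 / 12_000
print(f"{total_w:,} W -> {tons:.0f} tons of cooling")  # ~44 tons
```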

 

 

If this information is useful, please comment.

 

 

Disclaimer

The opinions, suggestions, management practices, room capacities, equipment placement, infrastructure capacity, and power and cooling ratios are strictly the opinions and observations of the author and presenter.

The statements, conclusions, opinions, and practices shown or discussed do not in any way represent the endorsement or approval for use by Intel Corporation.

Use of any design practices or equipment discussed or identified in this presentation is at the risk of the user and should be reviewed by your own engineering staff or consultants prior to use.

 

 
