Data Center Cooling

Which cooling solution is best for your current and
future capacity needs? 42U can help you find out!

Data Center Curtains

An economical airflow management solution for
potential energy savings & reduced cooling costs

In Row Precision Cooling

Flexible & scalable cooling solutions optimized for
your Data Center

Hot / Cold Aisle Containment

Time tested strategies to maximize cooling efficiency

High Density Liquid Cooling

Closed-loop cooling from 10 kW to as high as 60 kW within a 12-inch-wide footprint

In-Rack Cooling

Cooling at the rack for increased densities

42U has provided vendor-independent Data Center Solutions for over 19 years.

Cooling Solutions for Server Racks, Computer Rooms, Server Rooms & Data Centers

Improve uptime & lower costs with efficient cooling solutions for server racks, server rooms & data centers

Cooling infrastructure is a significant part of a data center. The complex network of chillers, compressors, and air handlers creates the optimal computing environment, ensuring the longevity of the servers and the vitality of the organizations they support.

Yet the current data center cooling ecosystem has come at a price. The EPA's oft-cited 2007 report predicted that data center energy consumption, if left unchecked, would reach 100 billion kWh by 2011, with a corresponding energy bill of $7.4 billion. This conclusion, however, isn't strictly based on Moore's Law or the need for greater bandwidth. The estimate envisions tomorrow's processing power being addressed with yesterday's cooling strategies. The shortcomings of these designs, coupled with demand for more processing power, would require ten new power plants to provide the juice for it all, according to that report.

According to a more recent study commissioned by The New York Times from Jonathan Koomey, Ph.D. (Stanford), entitled Growth in Data Center Electricity Use 2005 to 2010, "the rapid rates of growth in data center electricity use that prevailed from 2000 to 2005 slowed significantly from 2005 to 2010, yielding total electricity use by data centers in 2010 of about 1.3% of all electricity use for the world, and 2% of all electricity use for the US." Assuming the baseline figures are correct, Koomey states that instead of doubling as predicted by the EPA study, energy consumption by data centers increased by 56% worldwide and only 36% in the US.

According to Koomey, the reduced growth rates over earlier estimates were "driven mainly by a lower server installed base than was earlier predicted rather than the efficiency improvements anticipated in the report to Congress." In the NY Times article, Koomey goes on to say, "Mostly because of the recession, but also because of a few changes in the way these facilities are designed and operated, data center electricity consumption is clearly much lower than what was expected..."

However, this reduction in growth is likely temporary, as our appetite for internet access, streaming, and cloud-based services continues to increase. Data centers will continue to consume growing amounts of electricity, more data centers will come online, and data center managers will increasingly look to newer technologies to reduce their ever-growing electricity bills. Additionally, when you consider that the estimated energy consumption of the US in 2010 was around 3,889 billion kWh, 2% still represents close to 78 billion kWh. Clearly, the trend is increased data consumption, and with it increased energy consumption.

In light of these trends and despite the lower growth rates, many industry insiders are continuing to turn a critical eye toward cooling, recognizing both the inefficiencies of current approaches and the improvements possible through new technologies. The information contained herein is designed to assist the data center professional who, while keeping uptime and redundancy inviolate, must also balance growing demand for computing power with pressure to reduce energy consumption.

Issue: Understanding the Efficiency Metrics
Best Practice: Adoption and use of PUE/DCiE

In furtherance of its mission, The Green Grid is focused on the following: defining meaningful, user-centric models and metrics; developing standards, measurement methods, processes and new technologies to improve data center performance against the defined metrics.
- The Green Grid

Measurements like watts per square foot, kilowatts per rack, and cubic feet per minute (CFM) are ingrained in data center dialogue. Until recently, no standard measurement existed for data center efficiency. Enter the Green Grid, a consortium promoting responsible energy use within critical facilities. The group has successfully introduced two new terms to the data center lexicon: Power Usage Effectiveness (PUE) and Data Center Infrastructure Efficiency (DCiE).

Power Usage Effectiveness (PUE)

PUE is derived by dividing the total incoming power by the IT equipment load. The total incoming power includes, in addition to the IT load, the data center's electrical and mechanical support systems such as chillers, air conditioners, fans, and power delivery equipment. Lower results are better, as they indicate more incoming power is consumed by IT equipment instead of the intermediary, support equipment.

While it's not the only consideration, cooling can be a major player in PUE measurement. Consider the following diagram, where the combination of the chiller, humidifier, and CRAC consume 45% of the total energy coming into the facility.

Where does the energy go? Where does the money go? (Source: The Green Grid)

The Uptime Institute approximates an industry average PUE of 2.5. Though there are no tiers or rankings associated with the values, PUE allows facilities to benchmark, measure, and improve their efficiency over time. Companies with large-scale data center operations, like Google and Microsoft, have published their PUE. In 2008, Google had an average PUE of 1.21 across their six company data centers. Microsoft's new Chicago facility, packed with data center containers, calculated an average annual PUE of 1.22.

The widespread adoption of PUE, left in the hands of marketing departments, leaves the door open for manipulation. Though the equation seems simple, there are many variables to consider, and users should always consider the context of these broadcasted measurements. At its core, however, the measurement encourages benchmarking and improvement at the site level: the actions individual professionals can take to improve the efficiency of their facilities.

Data Center Infrastructure Efficiency (DCiE)
DCiE is simply the inverse of PUE: (Total IT Power / Total Facility Power) x 100%. DCiE presents a quick snapshot of the amount of energy consumed by the IT equipment. To illustrate the relationship between PUE and DCiE: "A DCiE value of 33% (equivalent to a PUE of 3.0) suggests that the IT equipment consumes 33% of the power in the data center."
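
For readers who want the arithmetic spelled out, here is a minimal Python sketch of both metrics. The power figures are hypothetical, chosen to reproduce the PUE 3.0 / DCiE 33% example quoted above.

    # Minimal sketch of the PUE/DCiE arithmetic described above.
    # The power figures below are hypothetical, for illustration only.

    def pue(total_facility_kw, it_load_kw):
        """Power Usage Effectiveness: total incoming power / IT equipment load."""
        return total_facility_kw / it_load_kw

    def dcie(total_facility_kw, it_load_kw):
        """Data Center Infrastructure Efficiency: the inverse of PUE, as a percentage."""
        return it_load_kw / total_facility_kw * 100

    # A facility drawing 1,500 kW at the meter with a 500 kW IT load:
    print(f"PUE:  {pue(1500.0, 500.0):.2f}")    # 3.00
    print(f"DCiE: {dcie(1500.0, 500.0):.0f}%")  # 33% -- matches the quoted example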

ASHRAE temperature and humidity recommendations:

The American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE) is an international technical society and a leading authority on recommendations for data center temperature and humidity ranges. ASHRAE TC 9.9 recently released its "2008 ASHRAE Environmental Guidelines for Datacom Equipment," which expanded the recommended environmental envelope as follows:

             2004 Version                  2008 Version
Temperature  20°C (68°F) to 25°C (77°F)   18°C (64.4°F) to 27°C (80.6°F)
Humidity     40% RH to 55% RH             5.5°C DP (41.9°F) to 60% RH & 15°C DP (59°F)

** Conditions reflect air entering the server & IT equipment **
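
As a quick illustration, the sketch below checks measured inlet conditions against the 2008 envelope in the table above. It is a simplification: the dew point is taken as a measured input rather than derived psychrometrically, and the example readings are hypothetical.

    # Check server inlet conditions against the 2008 ASHRAE recommended envelope.
    def within_ashrae_2008(inlet_c, rh_percent, dew_point_c):
        temp_ok = 18.0 <= inlet_c <= 27.0                        # dry bulb, deg C
        humidity_ok = (5.5 <= dew_point_c <= 15.0) and rh_percent <= 60.0
        return temp_ok and humidity_ok

    print(within_ashrae_2008(inlet_c=24.0, rh_percent=45.0, dew_point_c=11.0))  # True
    print(within_ashrae_2008(inlet_c=29.0, rh_percent=45.0, dew_point_c=11.0))  # False: too warm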

kW/Ton

kW/Ton measures HVAC efficiency by comparing a component's energy consumption in kW with the amount of cooling it provides in tons (one ton = 12,000 BTU/hr). Like PUE, the lower the value, the more efficient the device. A kW/Ton rating is possible for all of the major components of a data center cooling system, from the compressors to the server exhaust fans.

As data centers implement best practices and state-of-the-art technologies, they can expect the kW/Ton ratings to improve throughout the energy scheme.
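
The calculation itself is a one-liner; the sketch below uses hypothetical chiller figures.

    # kW/Ton as defined above: electrical input (kW) per ton of cooling delivered.
    def kw_per_ton(input_kw, cooling_btu_per_hr):
        tons = cooling_btu_per_hr / 12_000.0   # one ton of cooling = 12,000 BTU/hr
        return input_kw / tons

    # A chiller drawing 120 kW while providing 2,400,000 BTU/hr (200 tons):
    print(f"{kw_per_ton(120.0, 2_400_000.0):.2f} kW/Ton")  # 0.60 -- lower is better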

Issue: Understanding the Cooling Components
Best Practice: Next generation of cooling solutions

"Even in many organizations where IT and facilities staffs are cooperative, budgetary and measurement objectives are often separate and sometimes at cross purposes. Facilities typically pays for data center electricity, and IT often does not know how much electricity is being used in the data center or what it costs. In particular, cooling costs are almost always invisible to IT"
- Gartner

A data center professional is not necessarily an HVAC engineer. But he must be mechanically savvy and comprehend the entire energy scheme within the cooling infrastructure. Rising energy costs and a volatile economy will make the corporate hierarchy call for reductions in operational costs. And, based on the PUE discussion, we've seen that cooling infrastructure can have a major impact on OpEx.

A brief example of the traditional cooling system

Computer Room Air Conditioner (CRAC)

A self-contained unit that cools air directly with an internal refrigerant (direct expansion) circuit and fans it into the room or the raised-floor plenum.

Computer Room Air Handler (CRAH)

Cools air with chilled-water coils supplied by a central chiller plant; fans then deliver the conditioned air to the room.

Humidifier

Usually installed within the CRAC/CRAH, it replaces water loss before the air exits the A/C units. Also available in standalone units.

Chiller

Produces the chilled water supplied to the CRAHs, rejecting the collected heat outdoors, typically through a cooling tower or dry cooler.

The principles of data center cooling (air delivery, movement, and heat rejection) are not complex, but these systems are. There are a number of smaller components like compressors, fans, and pumps, which shape the system's operation and effectiveness.

Even within the "traditional" generation, nothing is one-size-fits-all; cooling solutions often depend on factors like room layout, installation densities, and geographic location. 42U provides the data center manager with thorough product overviews, data on ROI, installation and commissioning, and objective product recommendations based strictly on user application, environment, and goals.

Airflow Management

Issue: Understanding Airflow Management
Best Practice: Measurement, CFD Analysis, Containment

"In most cases, a fully developed air management strategy can produce significant and measurable economic benefits and, therefore, should be the starting point when implementing a data center energy savings program"
- The Green Grid

The cooling components are charged with creating and moving air on the data center floor. From there, the room itself must maintain two separate climates: the cool air required by the servers and the hot air they exhaust. Without boundaries, the air paths mix, with both economic and ecological consequences.

In the early 2000s, Robert Sullivan, an Uptime Institute scientist, advanced the concept of hot aisle/cold aisle, attempting to achieve air separation within the server room. The design, which aligns data center cabinets into alternating rows, endures in critical facilities throughout the world and is widely regarded as the first step in improving airflow management.

Hot Aisle / Cold Aisle Server Rack Configuration

The arrangement, however, lacks precise air delivery and removal, leaving users with a new set of challenges.

Users can address these inefficiencies through measurement, modeling, and analysis. These tools, including Computational Fluid Dynamics (CFD) analysis and air velocity and pressure gauging, provide a snapshot of your environment, pinpointing problem areas. Furthermore, real-time measurement provides an immediate analysis of your data center environment. Often, significant improvements are achieved with quick, inexpensive remedies like blanking panels, brush strips, and modular containment systems.
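
Before modeling, a back-of-the-envelope check helps: the standard sensible-heat relationship for air at sea level (q = 1.08 x CFM x delta-T) estimates how much airflow a given heat load demands. The sketch below is illustrative only; the 5 kW rack and 25 °F temperature rise are assumed figures.

    # Estimate the airflow a heat load requires, using the sensible-heat
    # equation for air at sea level: q [BTU/hr] = 1.08 * CFM * delta_T [deg F].
    def required_cfm(heat_watts, delta_t_f):
        btu_per_hr = heat_watts * 3.412        # watts -> BTU/hr
        return btu_per_hr / (1.08 * delta_t_f)

    # A 5 kW rack with a 25 deg F intake-to-exhaust temperature rise:
    print(f"{required_cfm(5_000, 25):.0f} CFM")  # ~632 CFM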

Cold Aisle Containment

Cold Aisle Containment attempts to maximize the hot aisle/cold aisle arrangement by encasing the cold aisle with barriers made of metal, plastic, or fiberglass. This approach eliminates the above challenges, ensuring the cold air stays at the server intake while the air handlers receive the warmer exhaust air, improving their efficiency.

Hot Aisle Containment

In Hot Aisle Containment, the hot aisle is enclosed instead, using the same barriers as its cold aisle counterpart. The design captures exhaust air via In-Row air conditioners, conditions it, and returns it to the cold aisle. AC efficiency is further improved, as neither the hot exhaust air nor the cold inlet air has far to travel.

High Density Cooling

Issue: Designing, Implementing, Managing High Density
Best Practice: Next Generation Cooling Technology

"The trend towards higher density cabinets and racks will continue unabated through 2012, increasing both the density of compute resources on the data center floor, and the density of both power and cooling required to support them."
- Gartner

The industry is exploring progressive cooling solutions because the current generation, discussed earlier, has proven insufficient and inflexible as computing requirements increase. (Chillers, for instance, are estimated to consume 33% of a facility's total power in current layouts.) In its report, the EPA christened some of the latest options as either "best practice" or "state-of-the-art" and approximated gains of 70-80% in infrastructure efficiency through their use.

Best Practice                  State of the Art
Free Cooling                   Direct Liquid Cooling
Air Side Economizers           Close-Coupled Cooling
Water Side Economizers
 - Evaporative Cooling
 - Dry Cooling

Free Cooling

Free cooling brings Mother Nature into the data center. When the ambient temperature and humidity are favorable, an economizer system circumvents some of the cooling infrastructure and uses the outside air as a cooling mechanism. The economizers come in two forms.

An air side economizer uses the outside climate to cool the data center. This outside air is distributed to the cabinets via the existing air delivery system, except no mechanical activity is needed for heat rejection.
Read more about air side economizers: http://www.42u.com/cooling/air-side-economizers.htm

A water side economizer uses the outside air in conjunction with a chiller system. Instead of compressors, the outside air cools the water, which is then pumped to the data center CRAHs. Water side economizers are marketed as either evaporative coolers or dry coolers.
Read more about water side economizers: http://www.42u.com/cooling/water-side-economizers.htm

Economizer use and ROI depend heavily on climate, meaning the data center manager must thoroughly review readings like wet bulb temperature, dry bulb temperature, and relative humidity for the location. For those with favorable environments (low night-time or seasonal temperatures), there are compelling arguments for their use, especially their impact at the chiller level.
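
A climate screening can start as simply as counting qualifying hours. The sketch below is a rough illustration only: the temperature thresholds and the hourly-readings format are assumptions, and a real analysis would use bin data or TMY weather files for the site.

    # Count the hours in a dataset when outside air could displace mechanical
    # cooling, given simple dry bulb / wet bulb thresholds (assumed values).
    def free_cooling_hours(hourly_readings, max_dry_bulb_f=60.0, max_wet_bulb_f=50.0):
        """hourly_readings: iterable of (dry_bulb_f, wet_bulb_f) tuples."""
        return sum(1 for dry, wet in hourly_readings
                   if dry <= max_dry_bulb_f and wet <= max_wet_bulb_f)

    # Three hypothetical hours: two qualify, one is too warm.
    readings = [(55.0, 48.0), (58.0, 49.5), (72.0, 61.0)]
    print(free_cooling_hours(readings))  # 2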

Rittal Liquid Cooling Package

Close-Coupled Liquid Cooling

Close-coupled liquid cooling expands further on air management and containment. The use of water and the proximity of the heat transfer increases efficiencies and enables some close-coupled cooling designs to operate with elevated chilled water temperatures. Higher inlet water temperatures can reduce the energy needed for mechanical cooling and also maximize the amount of free cooling hours per year.

Elevated Chilled Water Temperatures

For facilities with a chiller infrastructure, the EPA recommends raising the water temperatures. Different sources estimate a traditional supply temperature between 42 and 45 °F. A higher supply temperature reduces the chiller's energy consumption, yielding higher efficiency. Higher set points can also pave the way for water side economizer use.
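
To put a rough number on this, a common rule of thumb holds that each 1 °F increase in chilled water supply temperature trims chiller energy use by roughly 1-2%. The sketch below treats that rule, and all of the figures, as assumptions rather than measured data.

    # Rough chiller savings from a higher chilled-water set point, using an
    # assumed ~1.5% energy reduction per deg F (a rule of thumb, not a spec).
    def estimated_chiller_savings_kw(chiller_kw, setpoint_raise_f, pct_per_deg_f=0.015):
        return chiller_kw * setpoint_raise_f * pct_per_deg_f

    # Raising a 300 kW chiller's supply water from 44 deg F to 50 deg F:
    print(f"~{estimated_chiller_savings_kw(300.0, 6.0):.0f} kW saved")  # ~27 kW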

In a survey conducted by the Uptime Institute, 39% of enterprise data center managers expected their data centers to run out of cooling capacity in the next 12-24 months, and 21% claimed they would run out of cooling capacity in 12-60 months. In many facilities, the power required to cool IT equipment rivals or even exceeds the power required to run that equipment.

Overall power in the data center is fast reaching capacity as well, and an obvious response is to implement cooling best practices wherever possible and to use in-row cooling to address hot spots. In the same Uptime survey, 42% of these data center managers expected to run out of power capacity within 12-24 months, and another 23% claimed that they would run out of power capacity in 24-60 months. Greater attention to energy efficiency and consumption is critical.

To optimize the cooling in your data center, a good first step is an in-depth analysis of your current environment to gain a holistic understanding of your data center's environment, increase awareness of your critical risk factors, benchmark performance metrics, and generate a punch list of opportunities for cooling improvement.

Heat is just one of many factors that affect IT performance. Beat the heat with scalable climate control solutions. From passive air to active liquid cooling, the best and most flexible cooling concepts are designed according to your requirements. Below are details on some of the latest energy-efficient data center cooling solutions available.

Server Room Cooling Solutions:

Up to 60 kW cooling output, with three cooling modules possible per equipment rack. Read more:

Effective Cooling Strategies For Today's Datacenters ( http://www.42u.com/cooling-datacenter.htm ), Liquid Cooling for Data Centers ( http://www.42u.com/liquid-cooling-article.htm ), and Rack Cooling ( http://www.42u.com/rack-cooling.htm ).

Computer Room Cooling Services:

Specialized engineering and educational services that enhance data center performance without additional capital investments. These services will help you boost data center reliability, optimize your current cooling infrastructure, enable precision cooling to eliminate hot spots, dramatically reduce bypass airflow, assist with data center ASHRAE compliance, and understand the dynamics of your data center, including cooling requirements and deficiencies.

Rittal TS8 Racks with Rittal Liquid Cooling Package

Rittal Liquid Cooling Package

Developed to remove high levels of waste heat from server enclosures, this high density cooling solution utilizes air/water heat exchangers, providing uniform, effective, and affordable cooling for servers and similar IT equipment. The special horizontal airflow of the Rittal LCP adapts this widespread cooling principle to deliver cooled air uniformly across the complete height of the enclosure.

The liquid cooling unit is a modular, upgradeable, and temperature-neutral cooling concept.

Blade Cooling Circuit: Distributor Kit

Rack Cooling Module

Rittal's liquid cooling for server racks serves as the basis for this solution. The large number of drip-free taps to individual computers is replaced by horizontal liquid sub-distributors, which house the drip-free connectors for the power blades.

Liebert XD Water-Based Cooling System

The Liebert server rack cooling modules work as a system with the Liebert XDP-W Coolant Pumping Unit to supply cooling water to high heat density racks. The Liebert XDK and Liebert XDR modules offer an alternative for critical applications where the hot aisle/cold aisle approach to cooling is not practical.

Liebert XDP-W Pumping Unit

The water-based Liebert XDP-W pumping unit houses the isolating heat exchanger between the Liebert XDR-W or Liebert XDK-W and the building chilled water. It circulates the chilled water to the cooling modules at a temperature always above the actual dew point to prevent condensation.

Rack Cooling Distributor Accessories & Fan Systems

Enclosure Blowers

Enclosure Blower for Server Cabinets

Assist airflow in your cabinets with an enclosure blower and be confident that your equipment will run cooler. The Enclosure Blower features a durable, lightweight polymer housing and fits in the bottom of server cabinets, where it draws cool air from beneath the floor and boosts delivery of the coldest available air directly to server fronts. This blower is a suggested solution for solid or partially vented front doors with rails set back a minimum of four inches, establishing a column of air that rises to the top of the cabinet.

Dual Power Transfer Switch

The dual power transfer switch is designed to provide continuous 115 V AC power for an enclosure blower and other single-corded fans and motor-driven devices commonly deployed in data centers, automatically switching a 5 A load to the second input circuit during a power failure on the first.

Split Rear Door Fan

Rack Cooling Fans

With today's high heat loads inside server enclosures, it is critical to use thermal management techniques with optimized enclosure ventilation. With vented front and rear doors and a vented roof, server cabinets provide such ventilation. A split rear door fan takes advantage of the front to rear airflow of most servers and allows the user to position the fan assembly in the most critical location within the cabinet. Two high-pressure fans, 105 CFM each, feature a shallow depth chassis design and provide the ability to exhaust hot air out the rear of the enclosure at a dramatically higher rate than traditional roof mount fan trays. The result is cooler operating equipment, even with very high enclosure heat loads.

Fan Expansion Kit

A modular kit designed for retrofitting various rack fan units or for supplementing the fan mounting plate and fan roof.

Air Baffle System

An Air Baffle System has an air inlet nozzle in the base frame. Cold air is drawn in from below and is routed into the double-walled door. The cold air is then distributed inside the rack with special covers.

DC Fan Mounting Plate

Exceptionally low-noise thanks to FCS speed control, and supplied fully fitted. Suitable for rack enclosures with a raised roof (> 20 mm) or vented roof plates, it is installed from above, with a cable entry prepared in the rear section of the plate.

Server Fan Assembly

A server fan assembly takes advantage of the front-to-rear airflow of most servers and allows the user to position the fan assembly in the most critical location within the cabinet. The result is cooler operating equipment, even with very high enclosure heat loads.

Room Air Conditioners

Liebert Air Conditioning Solutions

Liebert precision cooling products provide the precise, year-round cooling required by sensitive electronics while protecting them from the environmental hazards of dust, temperature and humidity. Liebert solutions include standalone, ceiling and wall-mount systems for spot, room and telecom shelter applications, and liquid chilling systems for medical imaging and industrial equipment.

Mini Air Conditioners

Portable Cooling

Especially developed for small enclosures, these units can also be used in large enclosures with low heat dissipation requirements. Rittal air conditioners are available in compact roof-mount and powerful wall-mount versions.

Medium Range Air Conditioners

These medium-capacity air conditioners are used worldwide and stand out for their high dependability, flexible installation options, ease of operation, and high safety standards.

Full Sized Air Conditioners

Multi-functional wall-mount air conditioners meet demanding cooling requirements through their user-friendliness, microcontroller technology, and ease of installation.

Mini Chillers

Compact size does not mean small performance. These mini chillers are ideal when installation is required in the tightest areas.

Freestanding Chillers

With a wide cooling range to satisfy demands for a high cooling output with optimized space requirements, these Freestanding Chillers feature removable panels to ensure easier access for servicing.

Chiller Configuration

Chiller systems come wired and plumbed, ready for connection, with multilingual documentation including a functional diagram and wiring plans.

Cooling Accessories

Raised Floor Thermal Management

Raised Floor Grommet

A cost-effective airflow and thermal management strategy to improve airflow in your data center or server room is to make sure that all cable openings in your raised floor tiles are sealed. Based on measurements at multiple data centers, 50-80% of valuable conditioned air fails to reach the air intake of IT equipment due to unsealed cable openings. This lost air, known as bypass airflow, contributes to equipment hot spots, cooling unit inefficiencies, and increased infrastructure costs. Sealing cable openings is an essential component of IT equipment reliability and availability, and Raised Floor Grommets are a cost-effective remedy.
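
The arithmetic behind that figure is worth seeing once: if a fraction of conditioned air bypasses the equipment, the cooling units must supply correspondingly more. The sketch below uses a hypothetical 60% bypass fraction, within the 50-80% range cited above.

    # How much air must be supplied when a fraction bypasses the IT intakes?
    def required_supply_cfm(it_demand_cfm, bypass_fraction):
        return it_demand_cfm / (1.0 - bypass_fraction)

    # Servers needing 10,000 CFM at their intakes, with 60% bypass airflow:
    print(f"{required_supply_cfm(10_000, 0.60):,.0f} CFM supplied")  # 25,000 CFM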

Optimize Airflow in your Rack Enclosures with Blanking Panels

Optimize airflow within your server racks with quick, finger-install blanking panels that seal effectively, and monitor the temperature within your racks with optional temperature strips.

Interface Board

The interface board is an extension for server rack cooling units with a comfort controller, making it possible to monitor a master/slave combination of up to 10 cooling units.

Digital Enclosure Internal Temperature Display and Thermostat

For installation on the enclosure door or wall, and in a cooling unit or heat exchanger.

Speed Control

Temperature-dependent speed control for rack fan-and-filter units and air/air heat exchangers with a rated operating voltage of 230 V AC, for noise reduction and energy savings in part-load operation.

Conclusion

"Greening" the data center often starts with the cooling infrastructure. This page addresses the basics, hoping to facilitate deeper discussions on current vs. future cooling technologies. From chillers to CRACs to economizers, there is no shortage of vendors who, armed with case studies, tables, and whitepapers, are championing the efficiency of their products. Our goal is to help the data center professional wade through mountains of vendor data to find the most germane, economical, and efficient products for his application

Bibliography

Cappuccio, D. (2008). Creating Energy-Efficient, Low-Cost, High Performance Data Centers. Gartner Data Center Conference, (p. 4). Las Vegas.

EPA. (2007, August 2). EPA Report to Congress on Server and Data Center Energy Efficiency. Retrieved January 5, 2009, from Energy Star:
http://www.energystar.gov/ia/partners/prod_development/downloads/EPA_Report_Exec_Summary_Final.pdf

Koomey, J. (2011, August 1). Growth in Data Center Electricity Use 2005 to 2010. Oakland, CA: Analytics Press.
http://www.analyticspress.com/datacenters.html

Markoff, John. (2011, July 31). Data Centers’ Power Use Less Than Was Expected. Retrieved July 5, 2012, from The New York Times:
http://www.nytimes.com/2011/08/01/technology/data-centers-using-less-power-than-forecast-report-says.html

Sullivan, R. (2002). Alternating Cold and Hot Aisles Provides More Reliable Cooling for Server Farms. Retrieved December 15, 2008, from Open Xtra:
http://www.openxtra.co.uk/article/AlternColdnew.pdf

The Green Grid. (2008, October 21). Seven Strategies To Improve Data Center Cooling Efficiency. Retrieved December 18, 2008, from The Green Grid:
http://www.thegreengrid.org/en/Global/Content/white-papers/Seven-Strategies-to-Cooling

The Green Grid. (2009). The Green Grid: Home. Retrieved January 5, 2009, from:
http://www.thegreengrid.org/home

About 42U

42U focuses on providing facilities and data center managers with objective, trusted, vendor-independent information on data center power and cooling best practices, along with KVM-over-IP and remote power (lights-out) management, to help them reduce their organizations' power consumption. Our Data Center Management website has helped millions of IT professionals globally who search for best practices and technologies to make their computing environments more efficient.

Creative Commons License
This work is licensed under a Creative Commons Attribution-No Derivative Works 3.0 Unported License.
