Data Center Cooling Optimizations Bring Big Wins

At XMission, we have just completed some significant energy efficiency improvements in our data center. To educate the public and our customers, as well as to promote sustainable energy practices, I will be writing a series of blog posts about industry trends and the steps we have taken at XMission to dramatically improve our energy efficiency. These best-practice upgrades not only make XMission more environmentally friendly but also help us offset incessant rate hikes from our electrical utility provider. I should note that Rocky Mountain Power’s federally mandated incentive program is providing us with a substantial rebate for these energy-saving upgrades.

Since much of the information available about HVAC (Heating, Ventilation, and Air Conditioning) is written by engineers with PhDs and is full of complex formulas, I intend to reach a different audience and will try to translate these highly technical ideas into general terms. For this first installment, I will provide a brief historical overview of data center energy efficiency. Subsequent posts will focus on different aspects of the overall strategy.

Historical Background
Up until about a decade ago, data center cooling strategies were simple: make computer rooms really cold. Like a meat locker. This approach was a holdover from the telco clean rooms of yesteryear, where assorted, expensive equipment was loosely housed in rows with a multitude of large CRAC (Computer Room Air Conditioning) units circling the perimeter and continually running their compressors, which are the largest energy draw in a traditional air conditioning system.

While all IT equipment had fans, the air flowed in various directions: networking equipment blew air sideways, or even in the opposite direction of server fans. Such an approach mixed the heat into the surrounding cooler air rather than directly removing it, and created only a slight temperature difference between the air leaving the equipment and the air eventually re-entering the CRAC units. This practice was horribly inefficient, but in the past data centers were relatively small, uncommon, and an insignificant percentage of the total cost of doing business.
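
To put rough numbers on why that small temperature difference matters, here is a brief, illustrative Python sketch using the standard sensible-heat approximation for air (heat removed, in BTU per hour, is roughly 1.08 times the airflow in CFM times the temperature difference in degrees F). The 50 kW heat load and both delta-T values are hypothetical, chosen only to show the trade-off.

    # Illustrative only: sensible-heat approximation for air,
    # Q (BTU/hr) ~= 1.08 * CFM * delta-T (F).
    # The room load and delta-T values below are hypothetical.

    HEAT_LOAD_BTU_HR = 170_000  # roughly a 50 kW room of IT equipment

    def airflow_needed_cfm(delta_t_f: float) -> float:
        """Airflow (CFM) required to carry HEAT_LOAD_BTU_HR at a given delta-T."""
        return HEAT_LOAD_BTU_HR / (1.08 * delta_t_f)

    # Mixed "meat locker" room: hot exhaust dilutes into cold room air before
    # returning to the CRAC units, so the return air is only slightly warm.
    mixed = airflow_needed_cfm(10.0)      # 10 F between return and supply air
    # Directed airflow: hot exhaust travels straight back to the cooling coils.
    contained = airflow_needed_cfm(20.0)  # 20 F between return and supply air

    print(f"10 F delta-T needs about {mixed:,.0f} CFM")
    print(f"20 F delta-T needs about {contained:,.0f} CFM (half the airflow for the same heat)")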

Unfortunately, this archaic cooling strategy was deployed at the onset of the tech boom in the late 90s, with its exponential data center growth in both size and number. Things were moving too fast to design facilities any differently from how they had been built in the preceding decades. Over the next 10 years, both server rack density and data center growth increased dramatically, leaving IT companies struggling not only to cool the equipment but also to pay the exorbitant power bills. Adding ever more CRAC units to server rooms was expensive and often ineffective. Making the room colder didn’t effectively pull the heat away from high-density server racks either. Often, the servers would recirculate the hot exhaust from the top and sides back around to their own intakes.

Road to Efficiencies
By 2004, IT companies realized that they needed to re-engineer server room cooling to more effectively cool their gear and improve energy efficiency. There was also growing concern about worldwide data center power consumption and the environmental burden of generating that much electricity (coal, natural gas, nuclear, etc.). Change was long overdue and pressing. Among other things, ASHRAE (the American Society of Heating, Refrigerating and Air-Conditioning Engineers) responded with the first edition of its breakthrough Thermal Guidelines for Data Processing Environments, and the EPA worked with IT equipment vendors on Energy Star guidelines for data centers.

Data centers in the US currently consume about 2% of the total electricity generated nationally, and that share continues to increase, although not as quickly as it did between 2005 and 2010. Virtually every query and website you access from your computers and smartphones depends on equipment housed in data centers scattered across the globe. With electricity being one of the largest operating expenses for these facilities, much work has been done to analyze and improve server and cooling efficiencies.

To best understand cooling optimization in data centers, you need to keep two key factors in mind:

  1. Computers don’t need to be cool inside; they just need to be able to effectively reject the heat they generate. Internal operating temperatures of 100 – 140 F (38 – 60 C) are absolutely fine for safe, long-term operation. Most computing equipment is designed to operate at intake temperatures of up to 95 F (35 C), although ASHRAE recommends that intake temperatures not exceed 80 F (27 C). (See the short sketch following this list.)
  2. Air conditioning doesn’t actually need to create cold air; it only needs to remove enough heat from the facility to maintain safe operating temperatures inside the IT equipment. This concept goes against how we traditionally think of air conditioning and can be difficult to grasp. Among other things, it means that the hot aisle can be quite hot.
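
To make the first point concrete, here is a small, illustrative Python sketch that classifies intake-temperature readings against the two figures quoted above, the 80 F recommended ceiling and the 95 F design limit; the rack names and readings are hypothetical.

    # Illustrative sketch: classify server intake temperatures against the
    # figures quoted above (ASHRAE-recommended ceiling of 80 F / ~27 C and a
    # typical design limit of 95 F / 35 C). Rack names and readings are made up.

    RECOMMENDED_MAX_F = 80.0  # ASHRAE-recommended intake ceiling
    DESIGN_LIMIT_F = 95.0     # typical equipment design limit

    def f_to_c(temp_f: float) -> float:
        """Convert Fahrenheit to Celsius."""
        return (temp_f - 32.0) * 5.0 / 9.0

    def classify_intake(temp_f: float) -> str:
        """Label an intake reading relative to the two thresholds above."""
        if temp_f <= RECOMMENDED_MAX_F:
            return "within the recommended range"
        if temp_f <= DESIGN_LIMIT_F:
            return "above recommended, but within the design limit"
        return "over the design limit; check airflow"

    # Hypothetical intake readings, in Fahrenheit, for three racks.
    readings = {"rack-a1": 74.0, "rack-b3": 86.5, "rack-c7": 97.2}
    for rack, temp_f in readings.items():
        print(f"{rack}: {temp_f:.1f} F ({f_to_c(temp_f):.1f} C) is {classify_intake(temp_f)}")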

Cooling Strategies and Benefits
Chief among these changes are more energy-efficient computer hardware, hot/cold aisle containment, a return hot-air plenum, VFD (Variable-Frequency Drive) fans, adiabatic humidification, and water-side/air-side economizing. I will be writing blog posts in the coming weeks focusing on each of these strategies.

The two key gains from these strategies are significant energy efficiency improvements and more effective cooling. Saving energy is a huge win both for a company’s bottom line and for our environment. Without the efficiency gains of recent years, global data center power consumption could be as much as twice its current level of roughly 1% of worldwide electricity. Likewise, supercomputers and blade servers couldn’t be packed densely into racks without this reinvented approach, which focuses on air flow and heat removal.

My next blog post will focus on optimizing air flow using aisle containment.

Grant Sperry works at XMission overseeing operations and colocation. Established back in 1993, XMission was an early Internet pioneer and continues to provide amazing products and personalized service. If you like what we’re doing, contact us to see how we can help your company thrive.
