IT Leaders Pursue Data Center Innovation to Beat the Heat

The perennial data center quest to beat the heat has sparked a wave of innovation in enterprise computing.

Densely packed computing facilities produce a lot of heat. Getting rid of it is a must for boosting the reliability of computing and communications gear. The trick is keeping things cool without running up utility bills and expanding the carbon footprint.

To that end, IT managers have an expanding list of options and measures to consider. Data centers may combine straightforward approaches (such as organizing centers into cold and hot aisles) with more elaborate components (such as cooling towers). Even water-cooled computers, once a staple of the mainframe world, appear to be making a comeback. Immersion cooling, in which servers are bathed in a nonconductive cooling fluid, has made an appearance in a few data centers.

"People are tackling this problem in every possible direction you can imagine," says Chris Sedore, CIO and associate vice chancellor for academic operations at Syracuse University.

Ingenuity in cooling is born of necessity, but the array of choices stems in part from greater tolerances for heat and humidity in data centers. In previous decades, data centers had tightly controlled climates, with temperature set points of 68 to 70 degrees Fahrenheit and narrow humidity ranges, Sedore says. But the American Society of Heating, Refrigerating and Air-Conditioning Engineers (ASHRAE), which sets de facto standards for data center climate, has pushed the high end of its recommended temperature range to 80.6° F and increased the peak humidity threshold as well.

Sedore says ASHRAE's opening of the temperature and humidity window has influenced manufacturers to expand the climate range in which their machines can be expected to operate. The shift has also enabled data centers to use more cooling methods.

Some organizations have adopted evaporative cooling, for example. It would have been difficult for a data center to hit 68° F with just evaporative cooling, Sedore says, but the greater climate tolerance makes such techniques possible. "There is much more opportunity to take an approach like that."

Evaporative Cooling Helps Data Centers Save Energy

REI, a $2 billion retail co-op with headquarters near Seattle, Wash., employed evaporative cooling (among other approaches) in a data center retrofit it completed in late 2013. The project supported the company's objectives of conserving energy and reducing greenhouse gas emissions.

From a business perspective, the project saved a sufficient amount of energy to power six stores for a year. REI opens five to eight stores each year, so the project contributes to the company's expansion. "We save enough energy to keep on growing the company for another year," says Kirk Myers, corporate social responsibility manager at REI.

REI's retrofit led to a 93 percent reduction in the energy needed to cool the data center. The company also makes better use of the energy it does consume to run its machines. Power usage effectiveness (PUE), the ratio of total facility energy to the energy consumed by the IT equipment itself, is the metric organizations use to gauge data center energy efficiency, with the goal of moving the dial as close to 1 as possible. Myers says REI's data center had a PUE rating of 2.4 prior to the project; it now aims for a value well below 1.4, he adds.
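
As a rough illustration of what those ratios mean, here is a minimal sketch; the kilowatt-hour figure is hypothetical, not REI's actual load, and is there only to show how overhead shrinks as PUE approaches 1:

```python
def pue(total_facility_kwh: float, it_kwh: float) -> float:
    """Power usage effectiveness: total facility energy divided by IT equipment energy."""
    return total_facility_kwh / it_kwh

it_kwh = 1_000_000  # hypothetical annual IT equipment load, in kWh

# A PUE of 2.4 means 1.4 kWh of overhead (cooling, power conversion losses)
# for every kWh the IT gear uses; at 1.4 the overhead falls to 0.4 kWh.
for ratio in (2.4, 1.4):
    facility_kwh = it_kwh * ratio
    overhead_kwh = facility_kwh - it_kwh
    print(f"PUE {pue(facility_kwh, it_kwh):.1f}: {overhead_kwh:,.0f} kWh of overhead")
```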

REI's energy savings stem from a number of sources -- among them an evaporative cooling tower installed on the roof of the building housing the data center. Before the retrofit, the data center relied on mechanical cooling, specifically indoor, closed-loop dry coolers, notes Michael Stachowiak, senior energy engineer at CLEAResult Consulting, a company that provides energy efficiency programs and services and worked with REI on its energy-saving data center initiative.

The evaporative cooling unit, however, is open to the outdoor air, so it provides the efficiency of "free cooling." Here's how it works:

A pump moves water to the top of the cooling tower.

The water passes through a filter and, as it's exposed to the air, is cooled through evaporation.

A small amount of water evaporates during this process; the rest of the water runs down the tower and into a coil inside the building.

Air is pushed through the coil and cools the data center.

The water is then pumped back to the tower.
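
A back-of-the-envelope sketch of that loop, using two standard relationships: an open tower can cool the water to roughly the outdoor wet-bulb temperature plus an "approach," and the amount of water lost to evaporation follows from the heat rejected and water's latent heat of vaporization. All of the numbers below are assumptions for illustration, not REI's design values:

```python
# Back-of-the-envelope model of an open evaporative cooling loop.
# All inputs here are illustrative assumptions, not REI's design values.

CP_WATER = 4.186  # kJ/(kg*K), specific heat of liquid water
H_FG = 2400.0     # kJ/kg, approximate latent heat of vaporization

def tower_supply_temp_c(wet_bulb_c: float, approach_c: float = 4.0) -> float:
    """An open tower can cool the water to roughly wet-bulb plus an 'approach'."""
    return wet_bulb_c + approach_c

def evaporation_rate_kg_s(flow_kg_s: float, return_c: float, supply_c: float) -> float:
    """Water evaporated in rejecting the loop's heat load to the outdoor air."""
    heat_rejected_kw = flow_kg_s * CP_WATER * (return_c - supply_c)  # kJ/s == kW
    return heat_rejected_kw / H_FG

flow = 20.0                                    # kg/s of loop water, assumed
supply = tower_supply_temp_c(wet_bulb_c=12.0)  # a mild, dry Seattle-area day
evap = evaporation_rate_kg_s(flow, return_c=24.0, supply_c=supply)
print(f"supply water ~{supply:.0f} C, evaporation ~{evap:.2f} kg/s "
      f"({evap / flow:.1%} of loop flow)")
```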

The cooling tower, Myers says, takes advantage of the Seattle region's temperate climate. Temperatures are relatively mild year round, but it's the typically low humidity that makes evaporative cooling a good fit. (Water-side free cooling approaches such as evaporative cooling are most effective in low-humidity areas. Less water evaporates in high humidity conditions.)

Airborne contaminants are one downside of evaporative cooling systems exposed to the elements. Stachowiak says the REI cooling tower's filter works around this issue, screening out cottonwood seeds, bird feathers and other matter. The filter is self-cleaning and flushes itself once a day.

Myers says evaporative cooling handles 99 percent of the data center's cooling needs, with mechanical cooling automatically taking over for the evaporative system during particularly hot and humid conditions. "We need an extra kick in the summer for a few hours," he says.
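
A hedged sketch of how that changeover decision might look as control logic; the setpoint and approach values are assumptions, not REI's actual parameters:

```python
# Hypothetical changeover logic between evaporative ("free") cooling and
# mechanical cooling. The setpoint and approach values are assumptions.

SUPPLY_SETPOINT_C = 24.0   # target supply temperature for the data center
TOWER_APPROACH_C = 4.0     # how close the tower gets to outdoor wet-bulb

def cooling_mode(wet_bulb_c: float) -> str:
    """Run on the evaporative loop whenever it can still hit the setpoint."""
    achievable_c = wet_bulb_c + TOWER_APPROACH_C
    return "evaporative" if achievable_c <= SUPPLY_SETPOINT_C else "mechanical"

for wb in (10.0, 18.0, 22.0):   # mild day, warm day, hot and humid afternoon
    print(f"wet-bulb {wb:.0f} C -> {cooling_mode(wb)}")
```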

Chilled Water Offers Opportunity for 'Free Cooling'

DuPont Fabros Technology, a Washington, D.C.-based real estate investment trust that develops and operates wholesale data centers, is incorporating mechanical and evaporative cooling technologies in a new data center under construction in Northern Virginia.

The Ashburn Corporate Campus ACC7 data center will use an evaporative chilled water plant with plate-and-frame heat exchangers for water-side economization and centrifugal chillers to assist during the summer months. Bob Rosenberger, vice president of data center mechanical operations at DuPont Fabros, calls the chillers "among the most efficient ... on the planet."

The plant will produce 70° F chilled water to cool the data center. The temperature of the chilled supply water marks a sharp departure from conventional chillers, which historically cool water to 45° F, Rosenberger notes. Once the water absorbs the heat from the equipment in the data center, it will reach a temperature of 95° F in the return flow. Normally, return water temperatures have been around 55 to 65 degrees Fahrenheit.

The higher design temperatures of the supply and return water, made possible by ASHRAE's revised temperature guidance for data centers, will let ACC7 use "free cooling" 70 percent of the year. That's because wet-bulb temperatures in Virginia are low enough for 70 percent of the hours in a year to let the cooling towers and the heat exchangers produce 70° F chilled water.
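
A rough way to arrive at a figure like that is to count the hours in a typical year when the outdoor wet-bulb temperature is low enough, after the tower and heat-exchanger approach temperatures, to deliver 70° F water. The synthetic weather series and approach values below are placeholders, not DuPont Fabros' engineering data:

```python
# Estimating the free-cooling fraction of a year from hourly wet-bulb data.
# The synthetic weather series and approach values below are placeholders,
# not DuPont Fabros' engineering data.
import random

CHILLED_WATER_SETPOINT_F = 70.0
TOWER_APPROACH_F = 7.0           # tower leaving-water temperature vs. wet-bulb
HEAT_EXCHANGER_APPROACH_F = 2.0  # plate-and-frame exchanger temperature penalty

def free_cooling_fraction(hourly_wet_bulb_f: list[float]) -> float:
    """Fraction of hours when wet-bulb is low enough to make 70 F water."""
    limit = CHILLED_WATER_SETPOINT_F - TOWER_APPROACH_F - HEAT_EXCHANGER_APPROACH_F
    free_hours = sum(1 for wb in hourly_wet_bulb_f if wb <= limit)
    return free_hours / len(hourly_wet_bulb_f)

random.seed(0)
toy_year = [random.gauss(55.0, 12.0) for _ in range(8760)]  # rough mid-Atlantic spread
print(f"free cooling available ~{free_cooling_fraction(toy_year):.0%} of the year")
```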

"It's a radically different design, and it's all about efficiency," Rosenberger say.

Syracuse's Green Data Center, meanwhile, pursues a different take on chillers, using waste heat from its onsite power generation capability: natural gas-fired microturbines. The 585° F turbine exhaust is routed to absorption chillers, where it re-concentrates a lithium bromide solution and releases water vapor, which is re-condensed in a cooling tower, according to Syracuse.

A stream of cooled water -- which is kept above the dew point, typically 60° F -- is directed to the data center, where it's piped to heat exchangers located in the server racks. The Coolcentric rear-door heat exchanger model Syracuse uses has a cooling capacity of 18 to 24 kilowatts, while the prototype sidecar model is rated at 30 kW. The Green Data Center uses 56 rear-door and 5 sidecar heat exchangers, Sedore notes.
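
Taken together, those rack-level exchangers add up to a substantial heat-removal capacity. A quick tally, using the midpoint of the rear-door rating Sedore cites:

```python
# Quick tally of rack-level cooling capacity from the figures Sedore cites.
rear_door_units, rear_door_kw = 56, (18 + 24) / 2   # midpoint of the 18-24 kW rating
sidecar_units, sidecar_kw = 5, 30

total_kw = rear_door_units * rear_door_kw + sidecar_units * sidecar_kw
print(f"approximate rack-level cooling capacity: {total_kw:,.0f} kW")
# 56 * 21 kW + 5 * 30 kW = 1,176 + 150 = 1,326 kW, on the order of 1.3 megawatts
```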

Sedore says the cooling process means the data center doesn't rely strictly on moving air around with computer room air handlers. He notes that air handlers are still used for humidity control and comfort control for people working in the data center.

Syracuse partnered on its Green Data Center with IBM and the New York State Energy Research and Development Authority.

Efficient Data Center Cooling Means Multiple Methodologies

Companies working toward more efficient cooling employ multiple methods. At REI, for example, the evaporative cooling system is the most visible physical attribute of the company's cooling approach, but it works in combination with more prosaic efficiency measures.

For one, REI replaced belt-driven fans with more efficient electronically commutated (EC) plug fans to circulate the air in its data center, Stachowiak says. Belt-driven fans are typical of the air conditioning systems used in data centers. At REI, the EC plug fans were placed below the raised floor to move cool air to the data center equipment, he says.

REI found another opportunity to improve cooling under the raised floor. Excess cabling had accumulated in the underfloor space, restricting airflow, but tidying the coils of cable addressed that issue, Myers says.

Data centers such as REI's also maximize cooling efficiency by matching the airflow characteristics of perforated floor tiles to the server racks they cool. The tiles, placed in the cold aisles of a data center, allow air circulating below the raised floor to reach computing equipment. A high-flow tile would be used for a rack with 30 pieces of gear, while a tile with a lower flow rating would be used for a rack with just one device operating. "Only put enough there to meet the load," Stachowiak says. "Nothing else."
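
A minimal sketch of the sizing arithmetic behind that matching: the airflow a rack needs scales with its heat load, so the tile's rated flow should roughly track it. The 20° F air temperature rise and the tile ratings below are illustrative assumptions, not REI's figures:

```python
# Sizing sketch: match perforated-tile airflow to a rack's heat load.
# The 20 F air temperature rise and the tile ratings are illustrative assumptions.

DELTA_T_F = 20.0                      # assumed air temperature rise across the rack
TILE_RATINGS_CFM = [500, 900, 1600]   # hypothetical low/medium/high-flow tiles

def required_airflow_cfm(rack_load_watts: float) -> float:
    """CFM ~= 3.16 * watts / delta-T(F), from Q(BTU/hr) = 1.08 * CFM * delta-T."""
    return 3.16 * rack_load_watts / DELTA_T_F

def pick_tile(rack_load_watts: float) -> int:
    """Smallest tile rating that meets the rack's airflow need."""
    need = required_airflow_cfm(rack_load_watts)
    return next((t for t in TILE_RATINGS_CFM if t >= need), TILE_RATINGS_CFM[-1])

for load_w in (500, 4000, 9000):      # a single device, a partial rack, a dense rack
    print(f"{load_w} W -> ~{required_airflow_cfm(load_w):.0f} CFM, "
          f"use the {pick_tile(load_w)} CFM tile")
```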

The REI retrofit project also considered its power distribution units (PDUs) and uninterruptible power supply (UPS) units. CLEAResult found that the company's PDUs were redundant since the UPSs were equipped with distribution panels. The PDUs were removed. The two remaining UPS units were given a firmware and software upgrade to boost efficiency.

Myers says the multifaceted approach to the data center refit saves REI 2.2 million kilowatt-hours annually. The cooling work, which includes the evaporative cooling tower and other measures such as a hot-aisle/cold-aisle containment curtain, conserves 1.8 million kilowatt-hours, while the PDU/UPS project saves 400,000 kilowatt-hours.

Partnerships Provide Data Center Efficiency Assist

Some data centers are getting an efficiency assist from outside parties. At DuPont Fabros' ACC7 data center, clients will play a role in the company's improved cooling system. The data center operator will mandate that ACC7 customers use chimney racks or row-based hot-aisle containment, Rosenberger notes. (Chimney rack systems vent hot air into the plenum above the ceiling. With row containment, server racks are oriented so they direct exhaust into a hot aisle and then into that plenum.)

The combination of cooling methods will lead to a low PUE value, according to Rosenberger. He says engineering data suggests an annualized PUE for the facility of 1.13 or lower.

DuPont Fabros, meanwhile, is taking steps to make its cooling greener. Evaporative cooling plants at four data centers on its Northern Virginia campus have been converted to use reclaimed, as opposed to potable, water. The ACC7 facility, scheduled to open in June, will use reclaimed water from the outset, Rosenberger says.

The use of reclaimed water, in combination with improved PUE, will reduce costs as well as conserve resources. The reclaimed water conversion will cut the cost of water consumption by 50 to 65 percent, savings that are passed through to customers, according to DuPont Fabros. "That's an option if reclaimed water exists," Rosenberger says. "It can save the customer money and is also environmentally friendly."

The first step toward better cooling could be a call to the local power utility.

REI worked with Puget Sound Energy and its Data Center Energy Efficiency Program. CLEAResult administers the utility's program, which provides a free data center audit to zero in on efficiency projects. Puget Sound Energy also offers incentives that defray the cost of a data center retrofit; Stachowiak says the utility covered 70 percent of REI's project.

Myers says the retrofit's payback period was less than 12 months. Working with CLEAResult helped REI identify opportunities for boosting efficiency, he notes. Taking a holistic approach to the cooling improvements was key. "What we did wasn't rocket science; it wasn't space age technology. It just became a different way of seeing the opportunities and framing up the solution."
