8 Ways to Optimize Data Center Equipment and Operations

July 12, 2018

Optimizing data centers doesn’t always require new equipment and a huge upfront investment.

Just implementing better management strategies can conserve energy and money.

Try these strategies to optimize data center equipment and operations without the cost of a complete remodel.

1) Containment

Containment simply means separating the hot and cold air streams. Keeping them from mixing means you don’t have to continually recondition the same air, and hot exhaust doesn’t recirculate into the top third of the cabinet. That recirculation is why the top third tends to have the highest number of failed servers.

Ideally, cold air and hot air are restricted to their own aisles by placing cabinets front-to-front and back-to-back rather than front-to-back, but if your data center currently has a front-to-back layout, you can still implement containment with a chimney solution.

Chimneys can be deployed in almost any data center. The solid metal chimneys mount on top of cabinets and funnel hot air toward the ceiling so that it can go through the computer room air conditioner (CRAC) units to be cooled, rather than cascading down and affecting equipment. Make sure that the opening on top of the cabinet is large enough to accommodate the chimney and size the equipment correctly to efficiently direct the airflow.
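Sizing that airflow can be sanity-checked with the standard sensible-heat relation. A rough sketch in Python (the 1.08 constant assumes sea-level air, and the cabinet load and temperature rise here are purely illustrative):

```python
def required_cfm(it_load_kw: float, delta_t_f: float) -> float:
    """Airflow (CFM) needed to carry away it_load_kw of heat at a given
    air temperature rise, via BTU/hr = 1.08 * CFM * delta_T (deg F)."""
    btu_per_hr = it_load_kw * 1000 * 3.412  # 1 W = 3.412 BTU/hr
    return btu_per_hr / (1.08 * delta_t_f)

# Illustrative: a 10 kW cabinet with a 20 F rise across the servers
print(round(required_cfm(10, 20)))  # -> 1580 CFM
```

An undersized chimney opening forces that same volume through less area at higher velocity, so it pays to run the numbers before cutting openings.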

If you’re about to do a refresh on your data center anyway, think about changing the layout to maximize containment opportunities and potentially turn off some of your CRAC units.

Other equipment options for containment include baffles that block the cascade effect, overhead ducted systems and drop-down curtains, which block recirculation but fall away during fires or other emergencies so they don’t obstruct the sprinklers.

2) Free Cooling

Any amount of cooling you can get at no charge will conserve energy and save you money, especially in high-energy applications like data centers. Whether you opt for a chilled water system, a glycol loop or some other setup, anything you can do to rely less on the mechanical system will help.

Depending on your existing cooling equipment, you may be able to configure the system so that water or refrigerant cooled by outdoor air bypasses the compressor and flows to the same coil the chiller serves, which means the chiller won’t have to work as hard.

If the temperature outside is low enough, you may not need the chiller to supplement with mechanical cooling at all.

3) Directed Air

Cooling a data center efficiently is all about targeting the hot spots. Don’t bring in more cubic feet per minute than you have to, and don’t flood the data center with air. You shouldn’t feel cold air when you walk in; if you do, that cold air isn’t reaching the front of the cabinets where it belongs.

Make sure you’re targeting cooling where it needs to go by balancing airflows, putting variable frequency drives on fans and doing a computational fluid dynamics (CFD) analysis to confirm the air is reaching the right components.

If you can reduce your airflow 10% simply by being more selective about where you direct the air, you can drop your supply air temperature slightly. A 2-degree change in temperature lets you reduce fan speeds by about 10%, and because fan power varies with the cube of speed, that 10% speed reduction cuts fan horsepower by roughly 27%, saving you money.
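The fan savings follow from the fan affinity laws, under which fan power varies roughly with the cube of speed. A quick illustrative sketch:

```python
def fan_power_fraction(speed_fraction: float) -> float:
    """Fan affinity law: power varies with the cube of speed."""
    return speed_fraction ** 3

# Running fans at 90% speed (a 10% reduction):
savings = 1 - fan_power_fraction(0.90)
print(f"{savings:.0%} of fan power saved")  # roughly 27%
```

The cube relationship is why even small speed reductions are worth chasing: the savings compound far faster than the airflow shrinks.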

Be careful when you’re using perforated floor tiles to direct air. Don’t put dampers on all of them; dampers restrict airflow and reduce underfloor pressure even when they’re 100% open. Instead, strategically locate floor dampers only in the most important spots.

4) Increasing Supply Air Temperatures

ASHRAE has broadened the allowable zone for temperature and relative humidity in data centers. However, the warranties on your servers may not allow you to increase the supply air temperature, or you may have service level agreements with tenants that require a specific temperature and humidity level in the room.

Check the manufacturers’ documentation and any tenant agreements you have to see what you can do with server inlet conditions.

Raising the supply air temperature also increases free cooling hours. As long as the outside conditions are below the room temperature, you can use some amount of free cooling.
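The effect of a warmer setpoint on free cooling hours can be estimated from hourly outdoor temperature data. A minimal sketch, assuming a simple fixed heat-exchanger approach temperature and made-up hourly readings:

```python
def free_cooling_hours(outdoor_temps_f, supply_setpoint_f, approach_f=5.0):
    """Count hours when outdoor air can produce the supply setpoint
    without mechanical cooling, given an assumed heat-exchanger
    approach temperature (deg F)."""
    threshold = supply_setpoint_f - approach_f
    return sum(1 for t in outdoor_temps_f if t <= threshold)

# Hypothetical hourly readings (deg F):
hourly = [55, 58, 62, 64, 59, 70, 48, 61]
print(free_cooling_hours(hourly, 65))  # -> 4 hours qualify at a 65 F setpoint
print(free_cooling_hours(hourly, 70))  # -> 7 hours qualify at a 70 F setpoint
```

In this made-up sample, raising the setpoint 5 degrees nearly doubles the qualifying hours; run the same comparison against a full year of local weather data for a real estimate.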

5) Lowering Humidity Setpoints

Lowering your humidity setpoints uses less water and energy. I’ve worked on projects where implementing a chimney solution meant the client could turn off the humidifiers completely in a 5 MW data center with a power usage effectiveness (PUE) of 1.2. That facility also deployed a water-cooled chiller system.
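For reference, power usage effectiveness is simply total facility power divided by IT power. Assuming the 5 MW figure refers to the IT load, a PUE of 1.2 implies roughly 1 MW of cooling, lighting and other overhead:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT power."""
    return total_facility_kw / it_load_kw

# 5 MW of IT load plus 1 MW of overhead (illustrative figures):
print(pue(6000, 5000))  # -> 1.2
```

Every humidifier, fan or chiller you can switch off shrinks the numerator, pushing PUE closer to the ideal of 1.0.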

6) Optimized Airflows

Blanking panels are key to making sure airflow goes where it’s needed. They block off open rack space that air could otherwise pass through, so cold air has to go through the servers to cool them instead of bypassing them on its way back to the CRAC units.

Improved cable management can also enhance airflow by eliminating the need for articulating or folding cable arms. These devices seem like a great idea, but the bundles of cables tied to them block air from getting out of the cabinet. A deeper, wider cabinet also gives you plenty of room for cable management and allows more airflow.

7) Verifying the IT Load

Meet with your IT group and work together to develop a plan for better data center management. Facilities and IT should be on the same page as far as removing old servers, deploying new ones and improving layouts. Remove or replace antiquated servers during refresh projects and turn off legacy or orphaned servers.

How many data centers have a handful of servers still running in a corner because the person who managed them left years ago and now everyone is afraid to turn them off? We have clients with a policy that if a server hasn’t been managed in five years according to the server management log, they unplug it. If no one complains, the server is removed.
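That kind of policy is easy to automate against a server management log. A minimal sketch, assuming a hypothetical log of (name, last-managed date) pairs; the server names and dates are invented:

```python
from datetime import datetime, timedelta

def stale_servers(server_log, now, years=5):
    """Return servers whose last-managed date is more than
    `years` years before `now`, per the unplug-it policy."""
    cutoff = now - timedelta(days=365 * years)
    return [name for name, last_managed in server_log if last_managed < cutoff]

log = [
    ("db-legacy-01", datetime(2012, 3, 1)),   # untouched for 6+ years
    ("web-prod-02", datetime(2018, 1, 15)),   # recently managed
]
print(stale_servers(log, now=datetime(2018, 7, 12)))  # -> ['db-legacy-01']
```

The hard part isn’t the script; it’s keeping the log accurate, which is exactly why facilities and IT need to maintain it together.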

Also, work with the IT department or anyone else involved in managing the data center to keep the lights off during the day, too. Servers don’t care if it’s dark. Put in occupancy sensors so people who need to work in the data center still have light, and make sure there’s emergency lighting at the end of the rows.

8) Nothing Saves More Money than Off

Anything you can turn off will save you money, from redundant CRAC units to compressors. Implementing these low-hanging-fruit efficiency measures conserves energy and can even let you take some CRAC units offline.

It’s tempting to leave them on for redundancy in case there’s a failure, but the fans in idle CRAC units heat the supply air and add load to the room on top of the energy they consume. If you don’t need them for airflow, turn them off and let them come back on only in a failure mode.

About the Author

Janelle Penny | Editor-in-Chief at BUILDINGS

Janelle Penny has been with BUILDINGS since 2010. She is a two-time FOLIO: Eddie award winner who aims to deliver practical, actionable content for building owners and facilities professionals.
