Most recent articles
5 Colocation Considerations
How safe is your data facility?
Beyond people with bad intentions, computer rooms and data centers are under threat from moisture, seismic activity, power outages and other emergencies that can spell disaster for an organization.
Organizations are increasingly moving their IT assets outside their own facilities – and they absolutely need to know that a provider’s IT and facility infrastructure can survive a challenge. The Uptime Institute’s annual survey revealed that 20% of respondents currently use a colocation or multi-tenant data center provider, and roughly half expect that most of their IT workloads will move off-premises soon. Of those, 23% forecast the shift to happen within the next year and 70% predict a move by 2020, foreshadowing a significant increase in reliance on colocation facilities.
BUILDINGS recently toured one such facility – the Marion, IA branch of Involta, a national IT intelligence and infrastructure provider that operates multi-tenant data centers. Chief Security Officer Jeff Thorsteinson and Data Center Manager Chris Rodeffer led BUILDINGS on a tour of the data center and explained the top five focuses every FM should examine when evaluating a data center.
1) Building Construction and Layout
Examine the physical facility, both inside and outside. Don’t hesitate to pry deeper into how data is protected from the elements, from construction materials to emergency systems. The larger the potential customer, the more likely they are to ask about these considerations, Thorsteinson says.
“Typically bigger customers are more prescriptive,” notes Thorsteinson. “Right now I’m working with a large insurance company that has requested our design characteristics. They can say ‘We like to see this flow rate’ or ‘We’d like to verify what your very early smoke detection apparatus is’ and I’ll answer those questions. In the past we’ve answered questions on how our diesel generators separate fuel supply, what the weldments under our roof are, and what the active snow load and dead load are.”
Involta’s Marion location can withstand winds of 165 mph, Thorsteinson notes. However, a building that strong also has to incorporate features that account for seismic activity. How does your facility balance strength and flexibility?
“In our Boise data center, we have what’s called a floating roof that allows the walls to sway,” notes Thorsteinson. “In Iowa, we have hollow-core concrete planks that reduce the weight but still have the strength of concrete, followed by a 3-inch pour of concrete over that. Underneath is a rubber membrane and polyisocyanurate insulation. The stronger the building, the more you have to worry about with the roof because it can’t be as rigid as the walls. Our walls use stronger connections to footings than a typical building would, which allows the structure to carry all that weight downward into the ground.”
Inside the building, focus on three key areas: cooling, redundancy of systems and physical security.
Cooling: A key concern. Any data center should have a carefully designed environment to maximize the effectiveness of CRAC units. Innovative compressors, VFDs and microchannel condensers let users cool hot servers more efficiently. Involta also uses a tube-shaped fabric diffuser that creates a “bathtub” effect to surround equipment on the floor with cool air rather than just aiming a direct flow of cooling at the hottest spots.
Redundancy: Know the steps your colocation center will take in case of an outage, from uninterruptible power supply systems to extra equipment. Involta, for example, follows the n+1 redundancy rule when designing data centers – the company maintains at least one backup for components like diesel generators in case one fails.
“We have two sources for everything down to the customer plug-in for the server,” Thorsteinson notes. “We also ask our customers to use dual power supplies. On the cooling side, we have 10 units available to serve the load but only seven are operating all the time. We also have redundancy with respect to connectivity so that if somebody were to chop a wire, we wouldn’t be dead in the water – you want to have two feeds instead of one. We have folks that can lose $7,500 a minute when they’re offline.”
Security features: How are each organization’s racks protected? Locks on cabinets and private cages keep unauthorized visitors from accessing equipment, notes Rodeffer.
“We started out with lockable racks and then found out that customers wanted to have private cages so that only they could access that cage,” Thorsteinson says. “Now we even have private suites where a whole section of the building is dedicated to one customer who wants that security and privacy but doesn’t want to build its own data center.
“One big difference in our newer data centers is that all of the electrical gear, such as the CRAC units, are in the center spine of the building so customers never see it,” adds Thorsteinson. “That way we don’t have to have service personnel anywhere near your protected data – they’d only need to get into the spine instead. It’s a little thing, but little things add up and mean a lot.”
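The n+1 rule described under Redundancy above can be captured in a few lines: total capacity must still cover the load after any single unit fails. A minimal sketch, using the cooling figures from the article (10 installed units, 7 needed to carry the load); the function itself is illustrative, not Involta’s actual tooling:

```python
# Sketch of the n+1 sizing check: after losing any one unit,
# the remaining units must still cover the required load.
def meets_n_plus_1(installed: int, required: int) -> bool:
    """True if the load is still covered after any single unit fails."""
    return installed - 1 >= required

print(meets_n_plus_1(installed=10, required=7))  # 10 units, 7 needed: passes with spares
print(meets_n_plus_1(installed=8, required=8))   # no headroom: fails
```

The same check applies to generators, UPS modules or network feeds: a design passes only when losing any one component leaves the load fully served.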
2) Policies and Protocol
How is the data center managed? Make sure your colocation center has sensible access policies. Involta provides 24/7 access, but only people who have been pre-approved are allowed to enter the data center floor unescorted – everyone else must be accompanied by a staff member. In addition, entrances are monitored so that no one can take in food, drink, or combustibles such as the cardboard boxes used to carry equipment.
“The number one cause for data center outages is this little thing called a human. The fewer times that people touch things, the better off we are,” Thorsteinson notes. “Contrast that with a corporate data center – accommodations get made because maybe it’s easier to take a shortcut through the data center to get somewhere and people allow that, or maybe executives like to show off the data center so someone who doesn’t work in it has access. Sometimes people are stationed in the data center so they have to have a microwave and refrigerator – people come with their own space and have to have their coffee or soda or Danish. All of those things represent challenges.”
Company culture is an important complement to official policies, Thorsteinson notes: “We try and do the basic building blocks of cybersecurity, which means a physical perimeter, a logical perimeter and personnel to say ‘Hey, this doesn’t seem right.’ That’s a big deal. If you look at the big breaches, the ‘This doesn’t look right’ barrier was surmounted. We’re always working on culture.”
3) Power Supply and Preparation
On top of investing in extra generators, some data centers participate in demand response, which benefits customers, data center operators and other area utility customers alike during hours of peak energy usage.
“We take advantage of an interruptible rate from the utility. When our 900 kW drops off the grid, the utility doesn’t have to buy that for customers and they no longer have to pay the higher price associated with having less electricity available,” Thorsteinson explains.
4) Building Intelligence
How clear of a picture do you have about the temperature and humidity on the floor? Operators should have adequate monitoring in place to keep equipment from getting too hot and moisture from wreaking havoc on sensitive electronic components.
“We have humidity and temperature sensors positioned throughout the facility and attached to the front of cabinets so we get a reading near where the airflow hits a customer’s equipment,” notes Rodeffer. “We also have internal tools that show us temperatures – they have alerting built in so if the temperature hits a certain threshold, we’d receive a notification. For visual representation of the facility as a whole, that tool generates a color-coded thermograph so I can pick out hot and cold spots. If I see a specific spot that might be getting too warm, I can take control of that by adjusting CRAC temperatures.”
Customers can see the temperature and power monitoring associated with their rack, but not the CRAC unit operations, adds Thorsteinson.
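The threshold-based alerting Rodeffer describes can be sketched simply: each sensor reading is compared against a limit and flagged for follow-up. The cabinet names and the 80°F threshold below are hypothetical, not Involta’s actual configuration.

```python
# Illustrative threshold check over cabinet-front temperature readings.
# Sensor names, readings and the threshold are assumed for illustration.
ALERT_THRESHOLD_F = 80.0

readings = {
    "cabinet-A1-front": 72.4,
    "cabinet-B3-front": 81.2,   # hypothetical hot spot
    "cabinet-C7-front": 74.9,
}

# Collect every sensor at or above the threshold.
hot_spots = {name: temp for name, temp in readings.items()
             if temp >= ALERT_THRESHOLD_F}

for name, temp in hot_spots.items():
    print(f"ALERT: {name} at {temp}F - review nearby CRAC setpoints")
```

In a production tool the same comparison would feed a notification system and the color-coded thermograph the article mentions, rather than a simple print.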
5) Energy Efficiency Opportunities
When customers utilize a colocation center, it can be easy for them to overlook advances in data center technology because they are no longer directly paying for all the energy their servers and storage use. However, you may be able to offer them some savings from efficiency initiatives in a few key areas.
Avoiding the last mile charge: Part of the cost of telecommunications for individual consumers and businesses is the result of the “last mile charge,” a fee intended to cover the final leg of telecom infrastructure needed to carry data from the main network to the customer’s home or business. Large data centers like Involta can take advantage of aggregation rates from telecom utilities, effectively becoming part of the area’s communications backbone. At the Marion facility, this reduces the data center’s telecom rates by an average of 40%.
Agreement structure: When contracts require customers to pay for their share of cooling in addition to their cabinets and the electricity their equipment consumes, customers have a stake in your energy efficiency initiatives. At Involta, this takes the form of blanking panels to seal up unused space in cabinets, as well as variable speed fans and associated sensors for CRAC units.
A reduced electric footprint: “Customers can benefit both themselves and us by reducing electricity usage,” notes Thorsteinson. “In our Duluth facility, one disk storage device that took up an entire rack and used 6 kW continuously was replaced by a flash storage device that took up a third of the rack and could do more but used less than half of the electricity.”
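The arithmetic behind that flash-storage example: a 6 kW array replaced by a device drawing less than half the power. Assuming “less than half” means roughly 2.5 kW and a $0.10/kWh rate (both assumptions, not figures from the article), the annual savings look like this:

```python
# Estimate annual savings from the storage swap described above.
# old_kw (6 kW) is from the article; new_kw and rate are assumed.
old_kw, new_kw = 6.0, 2.5    # continuous draw before and after the swap
rate = 0.10                  # $/kWh, hypothetical electricity price
hours_per_year = 8760        # 24 x 365

annual_kwh_saved = (old_kw - new_kw) * hours_per_year
annual_dollars_saved = annual_kwh_saved * rate
print(f"{annual_kwh_saved:,.0f} kWh/yr saved, about ${annual_dollars_saved:,.0f}")
```

Even under these rough assumptions, one rack swap saves on the order of 30,000 kWh a year, before counting the reduced cooling load that comes with it.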
Janelle Penny is senior editor of BUILDINGS.