How safe is your data facility?
Beyond people with bad intentions, computer rooms and data centers face threats from moisture, seismic activity, power outages and other emergencies that can spell disaster for an organization.
Organizations are increasingly moving their IT assets outside their own facilities – and they need to know that a provider’s IT and facility infrastructure can survive a challenge. The Uptime Institute’s annual survey revealed that 20% of respondents currently use a colocation or multi-tenant data center provider, and roughly half expect that most of their IT workloads will move off-premises soon. Of those, 70% predict a move by 2020 and another 23% forecast the shift within the next year, foreshadowing a significant increase in reliance on colocation facilities.
BUILDINGS recently toured one such facility – the Marion, IA branch of Involta, a national IT intelligence and infrastructure provider that operates multi-tenant data centers. Chief Security Officer Jeff Thorsteinson and Data Center Manager Chris Rodeffer led BUILDINGS on a tour of the data center and explained the top five areas every FM should examine when evaluating a data center.
1) Building Construction and Layout
Examine the physical facility, both inside and outside. Don’t hesitate to pry deeper into how data is protected from the elements, from construction materials to emergency systems. The larger the potential customer, the more likely they are to ask about these considerations, Thorsteinson says.
“Typically bigger customers are more prescriptive,” notes Thorsteinson. “Right now I’m working with a large insurance company that has requested our design characteristics. They can say ‘We like to see this flow rate’ or ‘We’d like to verify what your very early smoke detection apparatus is’ and I’ll answer those questions. In the past we’ve answered questions on how our diesel generators’ fuel supplies are separated, what the weldments under our roof are, and what the active snow load and dead load are.”
Involta’s Marion location can withstand 165-mph winds, Thorsteinson notes. However, a building that strong also has to incorporate features that account for seismic activity. How does your facility balance strength and flexibility?
“In our Boise data center, we have what’s called a floating roof that allows the walls to sway,” notes Thorsteinson. “In Iowa, we have hollow-core concrete planks that reduce the weight but still have the strength of concrete, followed by a 3-inch pour of concrete over that. Underneath is a rubber membrane and polyisocyanurate insulation. The stronger the building, the more you have to worry about with the roof because it can’t be as rigid as the walls. Our walls use stronger connections to footings than a typical building would, which allows the structure to carry all that weight downward into the ground.”
Inside the building, focus on three key areas: cooling, redundancy of systems and physical security.
Cooling: A key concern. Any data center should have a carefully designed environment to maximize the effectiveness of CRAC units. Innovative compressors, VFDs and microchannel condensers let users cool hot servers more efficiently. Involta also uses a tube-shaped fabric diffuser that creates a “bathtub” effect to surround equipment on the floor with cool air rather than just aiming a direct flow of cooling at the hottest spots.
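To put rough numbers behind the cooling discussion, the standard sensible-heat relation can estimate how much airflow a given server load demands. This is a generic engineering sketch, not Involta’s design tool, and the rack load and temperature split below are illustrative assumptions:

```python
# Generic sensible-heat sizing: Q (kW) = mass flow (kg/s) x cp (kJ/kg.K) x delta-T (K).
# Constants are typical room-condition values for air, not facility-specific data.
AIR_DENSITY = 1.2   # kg/m^3
AIR_CP = 1.005      # kJ/(kg*K)

def airflow_for_load(load_kw: float, delta_t_k: float) -> float:
    """Volumetric airflow (m^3/s) needed to absorb load_kw with a delta_t_k temperature rise."""
    mass_flow = load_kw / (AIR_CP * delta_t_k)  # kg/s of air required
    return mass_flow / AIR_DENSITY              # convert to m^3/s

# A hypothetical 10 kW rack with a 12 K supply/return split needs about 0.69 m^3/s.
print(round(airflow_for_load(10, 12), 2))
```

The takeaway for an FM touring a facility: a wider supply/return split or more efficient air delivery (like the diffuser “bathtub” approach above) reduces the fan capacity needed per kilowatt of IT load.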
Redundancy: Know the steps your colocation center will take in case of an outage, from uninterruptible power supply systems to extra equipment. Involta, for example, follows the N+1 redundancy rule when designing data centers – the company maintains at least one backup for components like diesel generators in case one fails.
“We have two sources for everything down to the customer plug-in for the server,” Thorsteinson notes. “We also ask our customers to use dual power supplies. On the cooling side, we have 10 units available to serve the load but only seven are operating all the time. We also have redundancy with respect to connectivity so that if somebody were to chop a wire, we wouldn’t be dead in the water – you want to have two feeds instead of one. We have folks who can lose $7,500 a minute when they’re offline.”
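The redundancy math above is simple but worth making explicit. A hedged sketch, using the article’s own figures (10 cooling units installed, seven needed; $7,500 per minute of downtime) with hypothetical function names:

```python
# Illustrative sketch of the redundancy concepts quoted above; these helpers
# are not Involta's tooling, just the arithmetic the "N+1" label implies.
def redundancy_level(units_installed: int, units_required: int) -> str:
    """Express spare capacity as 'N+x': x units beyond what the load needs."""
    return f"N+{units_installed - units_required}"

def outage_cost(dollars_per_minute: float, outage_minutes: float) -> float:
    """Back-of-envelope revenue lost during an outage."""
    return dollars_per_minute * outage_minutes

print(redundancy_level(10, 7))  # the cooling plant described above is N+3
print(outage_cost(7500, 10))    # ten minutes offline at $7,500/min: 75000.0
```

Ten cooling units against a seven-unit load is actually N+3 – beyond the N+1 minimum – which is the kind of margin worth asking any prospective colocation provider to quantify.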
Security features: How are each organization’s racks protected? Locks on cabinets and private cages keep unauthorized visitors from accessing equipment, notes Rodeffer.
“We started out with lockable racks and then found out that customers wanted to have private cages so that only they could access that cage,” Thorsteinson says. “Now we even have private suites where a whole section of the building is dedicated to one customer who wants that security and privacy but doesn’t want to build its own data center.
“One big difference in our newer data centers is that all of the electrical gear, such as the CRAC units, is in the center spine of the building so customers never see it,” adds Thorsteinson. “That way we don’t have to have service personnel anywhere near your protected data – they’d only need to get into the spine instead. It’s a little thing, but little things add up and mean a lot.”