The white cages, like rooms within rooms, were scattered throughout the server room, fitted into odd shapes, none a perfect rectangle. The operations manager likened it to the game Tetris: trying to fit the cages optimally into the server rooms. But the data center is likely charging quite enough to numb the pain of 3D Tetris, giving customers who require security that much more peace of mind.
However, one of my fellow editors pointed out that he could simply suction up a floor tile, descend to the concrete slab three feet below, crawl over pipes and wires, push up a floor tile a couple of tiles over, and reemerge on the inside of the white metal cage. The operations manager retorted that he could also climb over the white metal wall into the cage, though because the links were smaller than those of actual chain-link fencing, it would be difficult for anyone heavier than a child to try.
But, he added, security would be watching and would stop anyone before they got very far. He pointed out that the layers of security couldn’t necessarily prevent a serious criminal from breaching them but were enough to deter the casual opportunist from trying anything. If a person even gets this far into this part of the data center, it’s likely they belong there, and the cage exists just to prevent any crazy, sudden impulses. Like the redundant backup systems that cool and power the place, the security measures seemed to me to work together to maintain a certain redundancy of their own.
Basic black cabinets for server racks; below are the vented floor tiles.
I saw exactly four other people during the tour, besides our group, and they were each working alone in the server room, each with a badge. One had no coat and the others had coats slung over arms or on a stand near where they worked, which led me to believe I’d seen one data center employee and three employees of other organizations, sent to work on their servers. It struck me how well suited a data center job might be to someone who doesn’t want much human contact.
We stopped briefly in the new area under construction. The operations manager reminded us about the workers we’d seen outside installing additional generators. It appears that in the years since this data center was built in 2001, little has changed in the design of the building or in the redundancy of the utilities that power and cool the servers. Efficiency is the main improvement.
The more interesting technologies we saw at the green data center at HP’s research facility (ice-making to cover cooling during peak-energy-use times; conceptual models for using methane gas from cows to power cooling) have obviously not hit the established, cost-conscious mainstream data center market. As energy costs rise (and, for some, the cost of water as well), I think "green" will come to mean not the color of extras such as environmental stewardship but the green of money and the necessity of making profits in a smart way: two different ends but the same means to those ends.
As the operations manager said, “We’d like to slap some solar panels on the building, but it’s not cost effective.” If we revisited him in five years, I’d bet he’d have some kind of renewable energy source going, if not for redundancy, then for cost effectiveness.
So there you have it: Our field trip to the data center. It was interesting to get a behind-the-scenes look (actually, for those of us interested in hosted SharePoint, hosted Exchange, and virtualization, a “behind the behind-the-scenes” look).
Read the accounts of my colleagues, who actually took notes. And photos. When they get them on the website.
Update: Thanks to Jeff James for the loan of three photos. Jeff runs the Business Technology pages of Windows IT Pro.
Also, check out Jason Bovberg's blog post about our visit to the HP research data center for a forward-looking view of the data centers of the future.