If you read this site, you probably know that cooling costs are a major concern for data center operators, no matter the size of the facility, mainly because somewhere around half the power used by many data centers goes straight to keeping the equipment from overheating. No surprise, then, that many of the vendors on the expo floor at Data Center World in San Antonio earlier this month were pitching cooling solutions -- or at least ways to keep cooling expenses as low as possible.
One of the companies we looked at was demonstrating a direct-to-chip cooling solution that can eliminate the need for on-premises air conditioning -- at least for the electronics if not for the human workers. Another was explaining software that uses computational fluid dynamics to, among other things, help data center engineers predict with precision how much additional cooling will be necessary after a major change, like adding another row or two of server racks within a facility.
Chilldyne: This four-year-old startup from Carlsbad, California, claims that its liquid cooling solution makes it possible to remove all air conditioning from a data center facility.
"You don't need any air conditioning," Steve Harrington, the company's CTO, told Data Center Knowledge, "other than maybe a little bit of airflow to cool off other equipment."
Harrington evidently understands that data center operators might be a little reluctant to consider pumping liquid around motherboards and other electronic equipment that can easily be damaged beyond repair in the event of a leak. At the company's booth he was demonstrating the safety of his direct-to-chip liquid cooling system by handing visitors a pair of diagonal cutters and inviting them to snip a plastic line carrying coolant over a motherboard and through the CPU's heat sink. We gave it a try, and all of the liquid was immediately sucked out of the tube, without a drop spilling onto the electronics. Cool.
"It's just like if you're sucking soda out of a straw and somebody cuts the straw," he explained. "What's going to happen is that some goes in your mouth and some falls back in the soda, but none of it spills on the floor."
Data centers wanting to take advantage of the patented system would need to install a floor-model Chilldyne Cooling Distribution Unit (CDU), a negative-pressure system that can remove up to 300kW of server heat in high-density applications, and servers would need to be converted to accept the liquid coolant. According to Chilldyne, the unit can run on cooling tower water at a temperature of 59-86°F (15-30°C).
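To get a feel for what moving 300kW of heat with liquid implies, a simple heat-balance calculation (Q = ṁ · c_p · ΔT) gives the coolant flow rate required for a given temperature rise. This is an illustrative physics sketch, not Chilldyne's published spec; the 10 K coolant rise is an assumed value.

```python
# Back-of-the-envelope coolant flow needed to carry away a given heat load.
# Illustrative only -- the assumed 10 K temperature rise is not a Chilldyne figure.

WATER_SPECIFIC_HEAT = 4186.0  # J/(kg*K), approximate for liquid water


def required_flow_kg_per_s(heat_load_w: float, delta_t_k: float) -> float:
    """Mass flow rate (kg/s) from Q = m_dot * c_p * delta_T."""
    return heat_load_w / (WATER_SPECIFIC_HEAT * delta_t_k)


if __name__ == "__main__":
    flow = required_flow_kg_per_s(300_000, 10)  # 300 kW load, 10 K coolant rise
    print(f"{flow:.1f} kg/s (~{flow * 60:.0f} L/min of water)")
```

At that assumed ΔT, the unit would be circulating on the order of a few hundred liters of water per minute -- modest plumbing by industrial standards, which is part of the appeal of direct-to-chip designs.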
The company claims that it has yet to find a server it couldn't convert to liquid cooling, and Harrington indicated the conversion process is quick and easy. Backup air cooling capability can be retained by using a hybrid air-liquid heat sink.
The company generally lets value-added resellers (VARs) handle sales and installations and currently boasts two relatively large installations in South Korea, both installed by VAR partners.
"One is a thousand AMD GPUs," Harrington said. "That's 350kW. We have another installation that's 2,500 Fury AMD GPUs, and that's 660kW."
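Those figures imply quite different per-GPU power draws, which is worth a quick sanity check -- keeping in mind these are rough installation-level numbers that may include more than the GPUs themselves:

```python
# Per-GPU power implied by the installation figures Harrington cites.
# Rough arithmetic only; total kW likely covers supporting hardware too.
installs = {
    "1,000 AMD GPUs": (350_000, 1_000),   # watts, GPU count
    "2,500 AMD Fury GPUs": (660_000, 2_500),
}
for name, (watts, gpus) in installs.items():
    print(f"{name}: {watts / gpus:.0f} W per GPU")  # 350 W and 264 W
```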
There's more information on the company's website.
Future Facilities: A major change within a data center facility, such as adding new racks loaded with equipment, often doesn't work out in reality the way it did on paper, even when every i seems to have been dotted and every t crossed. Unexpected cooling issues arise, or breakers start tripping in unanticipated ways. Future Facilities, a software firm focused on data center design, wants to help operators take the guesswork out of such changes with 6SigmaAccess, the latest in its 6Sigma line of "virtual facility" products.
A San Jose-based company that's been around since 2004, Future Facilities produces software that harnesses the principles of computational fluid dynamics (CFD) to build a detailed model of a working data center -- the "virtual facility" -- even before all components are in place.
The system maps all aspects of the data center, including space, the power network, and cooling. According to Joe Dorsey, a Future Facilities sales manager, because the system is based on CFD, it's more accurate than most DCIM applications.
"Although DCIM will give you this information, without having actual simulated results you have to use some kind of a rule of thumb, whereas we're actually giving results from this mathematical model."
By using the system, Dorsey said, customers can get a precise estimate of cooling costs before making a planned change.
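The "rule of thumb" Dorsey contrasts with simulation looks something like the sketch below: a sensible-heat airflow estimate that treats the room as one well-mixed volume. The constant 3.16 approximates 3.412/1.08 (watts to BTU/hr over the standard-air factor), and the 20°F air ΔT is an assumed value. A CFD model replaces this single aggregate number with a spatially resolved temperature and flow field, which is why it can catch hot spots the rule of thumb misses.

```python
# Classic rule-of-thumb airflow estimate (the kind CFD simulation supersedes):
# CFM ~= 3.16 * watts / delta_T_F, from the sensible-heat equation
# CFM = BTU/hr / (1.08 * delta_T_F). Assumes one well-mixed air volume and
# ignores recirculation, bypass, and hot spots.


def airflow_cfm(heat_load_w: float, delta_t_f: float = 20.0) -> float:
    """Approximate airflow (CFM) to remove a heat load at a given air dT (F)."""
    return 3.16 * heat_load_w / delta_t_f

# Adding two hypothetical 10 kW racks:
print(f"{airflow_cfm(20_000):.0f} CFM")  # -> 3160 CFM, a single aggregate figure
```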
The software is browser-based, but customers run it on their own equipment and on their own network.
"You don't have to go across the internet, so for government customers it stays on their network, it doesn't go outside," he said. "We can set it up for a cloud version, but right now, most of our customers are doing it all in-house."
6SigmaAccess can be used as a stand-alone product, but Dorsey said it can be more fully harnessed in combination with the company's flagship product, 6SigmaRoom, which builds the actual "virtual facility." Access is priced per instance, while Room is priced by the physical size of the facility where it's deployed.