GRC (formerly Green Revolution Cooling) has revealed details of its modular, immersion-cooled data centers deployed at Hill Air Force Base near Salt Lake City and Tinker Air Force Base in Oklahoma City.

The two ICEtank systems have been "cumulatively" tested for three years (GRC declined to detail when they were installed) with 100 percent uptime. Following the successful trial, GRC is now working with the USAF to develop next-generation production units.

GRC USAF deployment – GRC

The military loves its oil

GRC developed two versions of its system for the two sites - C3 One (Hill AFB) and C3 Two (Tinker AFB). In a case study, the company explained that C3 One uses a cooling tower as its final heat rejection method, while C3 Two relies on a dry cooler. Each container houses two to four purpose-built racks filled with the company's proprietary 'ElectroSafe' coolant.

"In the container at Hill (C3 One), water and drain connections were made through the floor for secure water use and freeze prevention," the study stated. "The C3 Two container, with a dry cooler, has fewer connection requirements by not requiring a drain or water supply."

Mike Neri, director of communications and information at Hill, said: “We are working with GRC on these data center deployments. They worked closely with us to design a system to meet our unique needs for mobility and rapid deployment, and for data centers that can withstand harsh environments.”

In an interview with DCD at our San Francisco event last year, the company's CEO, Peter Poulin, talked about the importance of mobile data centers for specific use cases: "What you're seeing with the Air Force is a need for the [data center] to be mobile. If I'm in a location where I have to deal with a hurricane coming in, you need to be able to move.

"Imagine an air-cooled, containerized data center in a sandy part of the world and a helicopter landing next to it - not good. The environmental resilience we talk about is not just that it's ruggedized, but what it does is give you a tremendous amount of location flexibility."

Further details about the deployment can be found in a USAF Request for Information for modernizing and consolidating the data centers at Hill Air Force Base. A document published on 8 January 2019 notes that GRC "is currently on contract through a Small Business Innovative Research (SBIR) project initiative to develop the capability to deploy a cost-effective and expedited data center build-out."

The SBIR website provides further insight into GRC's government grants - with the company receiving $887,458 from the DoD across two awards, and $652,695 from the National Science Foundation across another two.

The first NSF grant, for 'Phase I: Mineral oil cooling for energy and cost efficient data centers,' was awarded back in 2010. The most recent DoD award came in 2016, with $737,458 for 'Improved Energy Conservation for Data Centers.'

GRC believes that these deployments confirm its technology is ready for the mass market - one of the aims of last year's rebranding was to drop 'revolution,' as the company wants enterprises to see immersion cooling as a standard option. Poulin told DCD last year: "I can't disclose the customer, but one of the hyperscale cloud guys just bought a pilot [system] from us to test, they're anticipating [using] it as part of their AI cloud, because they know that density is going to get higher.

"As more of those bigger brand names start deploying [immersion cooling] at scale, that's when we'll see it take off."

But working with hyperscale companies carries a risk, since these businesses have the resources to fund their own R&D and build their own cooling solutions.

"They do," Poulin agreed. "And that's why we spend quite a bit of time making sure we lock down those patents and make sure we're documenting them effectively. No one never knows how strong your patents are until they get challenged.

"We are seeing those players come to us. Some of them are starting to buy pilots from us and some of them, I suspect, may be trying to roll their own. In fact, I know one of them in particular is trying to do that because they've been blatant in telling us - but it turns out that the crime is not quite as easy as they thought it was."

The idea of immersing IT systems in a dielectric fluid for cooling purposes "is a remarkably simple solution," Poulin admitted. "But while it's simple in concept, it's not necessarily simple to do it the right way."

The biggest obstacle to adoption, he said, was that facility operators still don't understand how immersion cooling works - prior to the interview, at a panel on liquid cooling, an attendee refused to believe electronics could be submerged in liquid without causing a short-circuit.

Another reason for confusion, Poulin said, was the intense competition with established vendors: "There are a number of very, very large companies that are highly invested in the status quo that have much larger marketing budgets, and much bigger coverage model, that are very good at spreading FUD [fear, uncertainty and doubt].

"There's a video of one of our racks with the coolant on fire. And we finally tracked down what happened - this competitor had literally poured gasoline on top of our fluid and then lit it." Poulin declined to reveal the company responsible, and DCD was unable to verify the video.

He added: "We we're not trying to convince the market that this is the be all end all, there are absolutely still going to be situations where air cooling is the best answer, where liquid-to-the-chip is the best solution, where rear door heat exchangers are the best, and there are going to be situations where we're the best."

King of the hill

Hill Air Force Base has its own traditional data center deployment - however, it is unclear how many servers actually remain on-site, with the base having consolidated its data centers after launching 'Project BonFire' in 2007.

An August 2015 presentation by Hill AFB's chief architect, Douglas Babb, noted that the site had "consolidated over 1,000 servers and 1PB of data," over the course of the project.

The presentation also quoted Mike Jolley, chief of the Operational Policy Branch and program manager for the command's computer center, saying: "Through Project BonFire, Hill Air Force Base updates data servers and storage systems to assure that 170 apps for jet and missile maintenance are available 24x7... The command at the northern Utah base repairs and maintains F-16 and A-10 jet aircraft and intercontinental ballistic missiles. The mission: Keep these aircraft and missiles ever-ready for war.

"Although the apps worked fine, in recent years, server sluggishness and downtime had become a problem."

A separate 2015 summary of a work document references secure vaults containing Autonomic Logistics Information System (ALIS) server rooms for the F-35 Weapon Systems Evaluation Program (WSEP). A more recent presentation on Hill AFB, given by vendor Elastic in October 2018, included a map showing two active-active data centers at the base.

This year's Hill Enterprise Data Center Request For Information (RFI) notes that "the HEDC is comprised of world-class data centers managed by the [75th Air Base Wing] hosting over 1,200 physical and virtual servers."

Tinker, Tailor, Pilot, Spy

As for Tinker Air Force Base, details on its non-GRC data center are limited, but that may be because its footprint is much smaller - the same Elastic presentation noted that there was a single 'Compute POD' at the site.

A presumably outdated 2012 article published by Tinker AFB details a data center hardware refresh that saw the base shift from four cabinets to just one.

A 2018 job listing for a systems engineer suggests that facilities management services at Tinker are provided by Chickasaw Nation Industries, a federally chartered corporation wholly owned by the Chickasaw Nation.

Also of note in the USAF computing infrastructure is its research laboratory in Rome, New York, which was once home to a supercomputer made of PlayStation 3s and now hosts IBM's TrueNorth neuromorphic chips.