Data Center Alley, a few square miles of Loudoun County in Northern Virginia, is the greatest concentration of data center space on the planet. And it's where a new provider is upgrading one of the world’s oldest Internet data centers.

Locals claim that 70 percent of the world’s Internet traffic passes along the fiber cables that run beneath Loudoun County Parkway and Waxpool Road. There are some 10 million sq ft of data center space here, consuming something like 1GW of electrical power.

It’s growing fast, and with a stream of trucks hauling steel and prefabricated concrete to giant construction sites for Digital Realty and CloudHQ, it’s easy to think that everything in Data Center Alley is new. But it’s been an infrastructure hub for more than 20 years, and alongside the new builds, some historic locations are being rebuilt too.

A unicorn in Ashburn

Today, the Equinix campus in Ashburn is the leading Internet exchange. Equinix DC1 was the company’s first data center, opened in the late 1990s, and DC2 is reputedly the most connected building on earth. But before Equinix arrived, Virginia had MAE-East, one of the earliest Internet exchanges. And the first data center in Ashburn is believed to be the former Dulles Technology Center, a bunker-like building at Waxpool Road and Pacific Boulevard, created by AOL in 1997 to serve its growing army of dial-up Internet users.

Now more or less forgotten, AOL was the leading Internet provider of its day. New fiber arrived to serve it, and the Alley snowballed from there, as other providers and data centers arrived to feed on that fiber. AOL is no longer a power in the land (it’s a Verizon brand), but its data center is still there, under a new owner, on prime fiber-connected land close to Equinix DC2.

Stack Infrastructure sells wholesale data center capacity to hyperscale providers. It launched in early 2019, taking over the facilities previously owned by Infomart, a provider that picked up the 10MW AOL site in 2014.

Under Infomart Data Centers, 6MW of capacity had been re-opened in the Dulles facility by 2017. Now the work continues under the Stack Infrastructure brand, created by Infomart’s new investor, IPI.

The result, according to Stack sales vice president Dan Ephraim, is a wholesale colo site offering up to 18MW, playing to the building’s strengths but still meeting current demands.

“You are in an absolute unicorn,” Ephraim told DCD during our visit. “This is totally unique.”

The building is designed to withstand a jet crashing into its roof, Ephraim told me: “Being the first data center here, these people were hypersensitive to the risk of being on a glide path [i.e. close to Dulles International Airport],” he said. “It’s borderline foolish. But the design was: if a Cessna or a Hawker private jet hits one of our data halls, that hall will collapse, but it won’t affect the other five data halls.”

The building’s internal and external walls are built from rebar-reinforced concrete, and the double-tee roof has six more inches of concrete above it. As well as crashing airplanes, it can withstand winds of up to 200mph.

With a facility this old, the best approach was to almost completely replace the mechanical and electrical assets, Ephraim explained: “We ripped out everything AOL had from a mechanical perspective.” Chillers, pumps and cooling towers were replaced, along with uninterruptible power supplies (UPS), power distribution systems, mechanical cooling, fire prevention, and security systems.

This gives major benefits: the facility is a new, flexible data center, inside a super-robust shell. Standing next to the new cooling towers in the yard outside the building, Ephraim said: “Everyone else in the market, for their utility yard, has a dog-pound fence. We’ve got a 30ft concrete wall. They just overbuilt.”

Inside, there are five data halls, up to 15,000 sq ft each. It could be our imagination, but as we talked, the echo seemed stronger and louder than in other facilities. Could that be the rebar talking?

The data halls support power densities of up to 600W per sq ft, and the building can deliver a PUE (power usage effectiveness) of 1.2 to 1.25. That’s impressive for today’s shared colo space, and even more so in a building that was created with no consideration for efficiency. When AOL built it in 1997, data centers focused on delivery, not efficiency, and the term PUE wasn’t coined until 2006.
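PUE is simply total facility power divided by the power delivered to IT equipment, so 1.2 to 1.25 means every two megawatts of servers carry only 400 to 500kW of cooling and electrical overhead. A minimal sketch of the arithmetic, with the 2MW hall load as our illustrative assumption rather than a figure from Stack:

```python
# Minimal PUE arithmetic. The 2MW hall load and the overhead figures are
# illustrative assumptions, not measurements from Stack's facility.

def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power usage effectiveness: total facility power / IT power (1.0 is ideal)."""
    return total_facility_kw / it_load_kw

it_load_kw = 2_000  # a hypothetical hall running 2MW of IT load
for overhead_kw in (400, 500):  # assumed cooling and electrical losses
    total_kw = it_load_kw + overhead_kw
    print(f"{overhead_kw}kW overhead on {it_load_kw:,}kW of IT load "
          f"-> PUE {pue(total_kw, it_load_kw):.2f}")
```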

There’s a NOC control room built with a view of the power systems: “They designed it so you could not sit in this room without seeing at least two generators,” Ephraim said. Each hall is set up for 2MW but, thanks to AOL’s overbuilding, can be upgraded quickly to 3MW: “As long as it takes us to order the equipment, that is how long it takes to give the client another MW of compute.”
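A quick sketch of what that headroom adds up to, combining the figures quoted in this article; the article doesn’t spell out how the five halls relate to the 18MW site total, so treat this as our arithmetic, not Stack’s:

```python
# Capacity arithmetic from the figures quoted above; our combination,
# not a breakdown provided by Stack.
HALLS = 5
MW_PER_HALL_NOW = 2.0
MW_PER_HALL_UPGRADED = 3.0

print(f"Commissioned per the hall figures: {HALLS * MW_PER_HALL_NOW:.0f}MW")
print(f"After in-hall upgrades: {HALLS * MW_PER_HALL_UPGRADED:.0f}MW")
# 10MW and 15MW respectively, against the up-to-18MW quoted for the site.
```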

Stack's Ashburn facility – Stack Infrastructure

Federal-ready

Rebuilding in an older space has created opportunities for creative thinking. AOL built a row of UPS rooms between the mechanical and electrical section and the data halls, but in twenty years, battery technology has moved on. Infomart replaced lead-acid batteries with more compact and intelligent lithium-ion cells, leaving it with whole battery rooms to repurpose.

“We recycled 1,800 tons of lead-acid batteries from this building, and replaced them with lithium batteries,” Ephraim said. “I have regained 12,000 sq ft of rental space.”

It’s specialist space, though. The old battery rooms are smaller than today’s typical wholesale colocation space, but they are super-resilient: “This is the most hardened and fortified space in the building. The outside wall is concrete, the inside walls are concrete. It’s effectively a bunker, inside of a bunker, inside of a bunker.”

Who would want such a space? Ephraim doesn’t spell it out, but he says the phrase “hardened and fortified” is code in the US colo market for “federal-ready.” Governments might use this space, as well as big cloud providers, for activities that require a heightened level of privacy and security.

Despite the emphasis on reliability, the site has not been given an Uptime Institute Tier certificate. In part, that’s because the building predates the Tier rating system, and in part because this is a sophisticated market that will look beyond the certificate to the components of the building.

The site has N+1 power and UPS infrastructure, and meets many Tier IV requirements, even though it doesn’t have two utility feeds (just one from Dominion). If the power goes, it does have 80,000 gallons of diesel fuel, half of it below ground. “That’s enough for five to seven days,” Ephraim boasted. “Everyone else in the market guarantees one day. If we lose power at Dominion, we have three generator rooms, and they can pick up a full load.”
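The five-to-seven-day claim is easy to sanity-check. Assuming a typical burn rate for a loaded diesel genset of around 0.07 gallons per kWh (our assumption; Stack quoted only the tank size), the runtime falls out of simple division:

```python
# Back-of-envelope generator runtime. Only the 80,000-gallon figure comes
# from Stack; the burn rate and load scenarios are our assumptions.
FUEL_GALLONS = 80_000
GAL_PER_KWH = 0.07  # rough consumption for a loaded diesel genset

def runtime_days(load_mw: float) -> float:
    gallons_per_day = load_mw * 1_000 * 24 * GAL_PER_KWH
    return FUEL_GALLONS / gallons_per_day

for load_mw in (6, 9, 18):  # hypothetical facility loads
    print(f"{load_mw}MW load: about {runtime_days(load_mw):.1f} days of fuel")
```

Under those assumptions, five to seven days corresponds to a load in the region of the 6MW already commissioned; at the full 18MW, the same tank would last closer to three days.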

It also has 650,000 gallons of water storage, fed from two wells. “It’s a closed loop,” Ephraim said. “There’s little evaporation. We’re effectively 3N on water.”

One thing the building doesn’t feature is a glossy exterior. Inside, there are all the things a customer would expect, such as office space, storage and lounge areas. But outside, this is the least imposing data center DCD has visited. Compared to nearby facilities from RagingWire, Digital Realty and Equinix, it’s nearly invisible.

“What I like about our building is we are selling on anonymity,” Ephraim said. “It’s industrial; it’s not sexy. 25,000 cars drive through this intersection a day, but not one percent of them know what this building is.”

It’s not just security by obscurity: AOL made sure the exterior can subtly deflect physical attacks as well as crashing airplanes. Bollards with 8ft roots stop any vehicle from ramming the gates directly, and a high berm diverts any attacking vehicle into vehicle-arresting cables, while also blocking the view for casual sightseers.

If this building is a unicorn, as Ephraim says, it’s a unicorn that’s had an extreme makeover: armor-plated, and wearing a cloak of invisibility.

This feature appeared in the February issue of DCD Magazine.