After decades of warnings, climate change is here, exacting a costly toll on people and infrastructure. Once a threat we were told would fall on our children and grandchildren, it has become, after years of inaction, a problem of our own lifetimes.

Already the horrors that were foretold are beginning to unfold across the planet, be it through droughts in New Zealand, floods in the UK, hurricanes in the US, or the fires that decimated Australia.

“Climate change is the most significant crisis facing mankind right now,” Professor Paul Barford, of the University of Wisconsin, told DCD. And, as part of that, the fabric of the Internet also faces the consequences of a warming planet.

With this in mind, a few years ago Barford and a team of researchers set out to answer a relatively straightforward question: “What is the risk to telecommunications infrastructure of the sea level rise that’s projected over the next hundred years?”


In a 2018 paper, Lights Out: Climate Change Risk to Internet Infrastructure, Barford et al. took US sea level models from the National Oceanic and Atmospheric Administration and overlaid their own curated map of data center and point of presence (PoP) locations, known as The Internet Atlas.

The study found that by 2030 alone, “about 771 PoPs, 235 data centers, 53 landing stations, 42 IXPs will be affected by a one-foot rise in sea level.”
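The overlay itself is a standard geospatial intersection: project both datasets into the same coordinate system, then keep the assets that fall inside the inundation extent. A minimal sketch of that kind of analysis in Python with geopandas - the file names and the asset_type column are hypothetical stand-ins, not the paper’s actual inputs:

```python
# Sketch of an inundation overlay, in the spirit of Lights Out.
# Input files and columns are hypothetical placeholders.
import geopandas as gpd

# Projected inundation extent for a one-foot sea level rise (polygons)
inundation = gpd.read_file("slr_1ft_inundation.shp")

# Infrastructure point locations (PoPs, data centers, landing stations...)
assets = gpd.read_file("internet_assets.geojson")
assets = assets.to_crs(inundation.crs)  # align coordinate systems

# Keep only the assets that fall inside the inundation polygons
at_risk = gpd.sjoin(assets, inundation, predicate="within")
print(at_risk.groupby("asset_type").size())
```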

Over the past three months, DCD contacted more than a dozen of the companies shown to be impacted by the end of the decade. The vast majority declined to comment, pulled out of interviews at the last minute, or assured us that they had plans in place which they would not detail.


One major multinational telecommunications company said that none of its infrastructure was at risk of sea level rise. When shown data to the contrary, representatives stopped communicating with DCD.

“I’ve gotten a broad range of responses from people in the industry,” Barford said. “Ranging from ‘wow, this is something that we really need to take seriously,’ to flat out denial: ‘we don’t believe it.’” Even technical people, with the ability to understand the issues, often refuse to accept there’s a problem, he said.

“My role as a scientist is not to try to convince people of these things, but to report the information and then assume that people will take the logical next steps after that. Which is not always what happens.”

Some companies are beginning to take the issue seriously, however, as the changing climate exacerbates events that cause costly damage to their infrastructure. “It’s definitely been a journey for us,” said Shannon Carroll, director of global environmental sustainability at AT&T. “Talking about climate change, and looking at it in a formal way, is something we started doing back in 2015.”

For years, the telco used its own weather operations center to study historical data when making resiliency and planning decisions. “We know now that past weather is no longer the predictor of future weather,” Carroll said.

With a vast, continent-spanning infrastructure, the company is vulnerable to the whims of the climate - from 2016 to 2018, natural disasters cost the company more than $874m. “These severe weather events are obviously connected to climate change, so it’s definitely in our best interest, as well as our customers’ best interest, for us to be prepared for the future impact of climate change.”

To understand that impact, AT&T couldn’t just rely on sea level data. “That’s not going to get you to where you need to be when you’re looking at coastal and inland flooding. It’s just one of the components.”


So the company turned to Argonne National Laboratory to create a detailed picture of extreme weather in the future, developing the most localized climate model currently available for the continental United States.

Modeling risk

“Generally speaking, a climate model operates by taking the entire globe and dividing it up into a grid,” Thomas Wall, a senior infrastructure and preparedness analyst at Argonne, explained.

Within each grid cell, there’s a series of equations that represent all of the physical processes in the atmosphere, and in the interaction between the land and the ocean, and so on.

“For every step forward in time through the model we calculate all of the variables and all the model outputs.”

The problem is, “if you have a global climate model, you’re trying to do everything for the entire world, you can’t have grid cells that are extremely small because you run out of computing power - there’s a lot of grids. Each grid cell is maybe 100 kilometers on each side, which is great if you’re trying to look at global trends, but it’s difficult to say, ‘here’s where the impacts will occur to my piece of infrastructure,’ because a whole city is now just part of a grid cell.

“So what we’ve done at Argonne is to take a regional climate model where, because we’re looking at just North America, our grid cells are 12 kilometers,” tracking the extremes of flood levels and wind speeds.
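The computing trade-off Wall describes is easy to put in numbers. A back-of-envelope sketch (the areas and the scaling argument are illustrative, not Argonne’s figures):

```python
# Why finer grids explode the compute cost: halving the cell size
# quadruples the cell count, and the time step must usually shrink
# too for numerical stability, so cost grows roughly with the cube
# of the linear resolution increase. Rough areas, for illustration.
EARTH_SURFACE_KM2 = 510e6
NORTH_AMERICA_KM2 = 24.7e6

def cells(area_km2, cell_km):
    return area_km2 / cell_km ** 2

print(f"Global, 100 km cells:       {cells(EARTH_SURFACE_KM2, 100):,.0f}")
print(f"North America, 12 km cells: {cells(NORTH_AMERICA_KM2, 12):,.0f}")
# Global, 100 km cells:       51,000
# North America, 12 km cells: 171,528
```

A regional model at 12 kilometers thus tracks more than three times as many cells as a global model at 100 kilometers, while covering less than five percent of the planet’s surface.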

Then, the researchers did additional hydrological modeling in the Southeast to create maps at a 200 meter scale. “It’s the kind of thing where companies could say ‘let’s look at acquiring real estate in another location, so we can relocate that facility there.’”

Even with the models focusing on just part of the planet, they were still limited by the abilities of their 11.69 petaflops Theta supercomputer. “We can’t run the entire century ahead of us, because this would require a massive amount of computing,” Wall said. “What we did was take time slices. One of them is around mid-century, and one of them is around the end of century. We tried to capture the nearer term trends that I think people who are building infrastructure today would be concerned about, but also provide some insight to understand where we are headed along different trajectories.”

Here the researchers had to contend with another problem - the uncertainties of the future. We don’t know whether society will significantly reduce emissions, or if we will continue to fail to meet targets. We don’t know who the next US president will be, whether the Amazon rainforest will be further plundered, or if China will curb its coal consumption.

“There are four scenarios that are currently outlined by the Intergovernmental Panel on Climate Change, we run two of them,” Wall said. One is Representative Concentration Pathway (RCP) 4.5, a standardized greenhouse gas trajectory named for the additional radiative forcing, in watts per square meter, that it produces by 2100 - the closer of the two to the Paris Agreement.

The other is RCP 8.5, grimly nicknamed the ‘business as usual’ case: the path we follow if we make no effort to curb emissions at all - an outcome we really, really want to avoid.

Argonne also had to try to capture the differing views of the scientific community - with various models suggesting different outcomes, even with the same emissions levels.

“Some models are very sensitive, like the Hadley Center model, which predicts a 5-7°C temperature increase when you double the carbon dioxide in the atmosphere,” said Rao Kotamarthi, chief scientist and department head of the Atmospheric Science and Climate research group at Argonne. “Then there are models which are around 4-4.5°C per doubling. So we combined three models to account for model uncertainty.”
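That ‘per doubling’ framing reflects a standard first-order result: warming scales roughly with the logarithm of CO2 concentration, so each doubling adds a fixed number of degrees. A quick worked example - the log2 relation is the textbook simplification, not Argonne’s actual model:

```python
# Equilibrium warming under the standard logarithmic approximation:
# a fixed temperature increase per doubling of CO2 concentration.
import math

def warming_c(c_ppm, c0_ppm=280.0, sensitivity_per_doubling=4.5):
    """Approximate equilibrium warming relative to a baseline of c0_ppm."""
    return sensitivity_per_doubling * math.log2(c_ppm / c0_ppm)

# Pre-industrial 280 ppm -> 560 ppm is exactly one doubling:
print(warming_c(560))                                  # 4.5 deg C
print(warming_c(560, sensitivity_per_doubling=6.0))    # Hadley-like: 6.0
```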

Again, with more computing power and time, the team could have included more models, Kotamarthi said. “We can dream about that in the future when we get an exascale supercomputer.”

With the three models running on the two IPCC scenarios, Argonne provided data that tried to explain both the forecast and uncertainty in the prediction for each grid cell. “We tried to simplify this as much as possible,” Kotamarthi said, with a focus on ensuring that business decision-makers could understand the tool without requiring a climate science background.

For its part, AT&T looked at four states - Florida, Georgia, South Carolina, and North Carolina. “You have to start small and then expand later if you can,” Carroll said.

“What we are able to do is look at the extreme outcomes of the severe weather events in those areas.” For example, by mid-century, a 50-year flood event will produce floodwaters up to 10 feet deep across inland and coastal areas of southeastern Georgia.

“We can focus on our physical assets, and how long we anticipate those physical assets being operational.”
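Pairing return periods with asset lifetimes is exactly where the arithmetic bites: a ‘50-year’ event has only a 2 percent chance in any given year, but the odds compound over a facility’s life. A quick illustrative calculation (not AT&T’s or Argonne’s model):

```python
# Probability of seeing at least one N-year event over a given
# horizon, assuming independent years - simple illustrative math.
def prob_at_least_one(return_period_years, horizon_years):
    return 1 - (1 - 1 / return_period_years) ** horizon_years

print(f"{prob_at_least_one(50, 30):.0%}")  # ~45% over a 30-year asset life
```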

It’s also about the future siting of assets. “Where are you going to put the next cell tower, where are you going to put the next central office? We’re learning how to better use the tool every day, folks are starting to use it more and more.”

AT&T said that it would make the tool and data public, and that it was looking at expanding the model to include a wider area and potentially track other climatic threats including wildfires.

Rival telecoms company CenturyLink currently uses public climate change data to assess its risk. “We’ve got proposals to perform [similar] climate scenario analyses, but we haven’t finalized that plan yet,” Michael Beekman, CenturyLink’s director of global environment, health, and safety, said.

Both companies said that they wouldn’t stop providing services to areas that their models showed would be at risk. But they wouldn’t site critical interconnection points in those areas, either. “Put it this way - we would not put a major data center sitting on the shore of Florida in a hurricane area,” Kathryn Condello, senior director of national security and emergency preparedness at CenturyLink, said.

Protecting the nation

As part of her remit, Condello works with the Department of Homeland Security to ensure that her company’s infrastructure is ready for disaster, natural or otherwise. “The DHS does certain regional risk assessments, such as: ‘What would be the impact of a much, much higher magnitude hurricane coming into the Washington, D.C. area?’ Normally we get the fringes of hurricanes, but based on modeling that DHS will get from [a national lab] they’ll see that, while the possibility of that is small, the consequences might be high.”

CenturyLink, along with utilities and other relevant bodies, engages in exercises with the DHS to plan out how it would deal with such scenarios. “Based on that, we then look at where our infrastructure is. Do we think it’s protected? Do we think we have enough in place for that?”

Condello said that over the last decade she has been involved with countless regional risk assessment programs for the DHS. “All of them in one form or fashion, even if it was more cyber-related, have had a component that was associated with climate change,” she added. “It’s us trying to deal with contingencies that maybe we didn’t think about, but that the US government might have concerns about.”

The US government, like all governments, is endlessly concerned, continually searching for signs of weakness in its nation’s structure. “After 9/11, we stood up the National Critical Infrastructure Prioritization Program,” Bob Kolasky, head of the Cybersecurity and Infrastructure Security Agency’s (CISA) National Risk Management Center (NRMC) at the DHS, told DCD.

NCIPP is a statutorily mandated list of “key physical assets that should be protected in the face of a potential attack,” Kolasky said. “And we have thousands of infrastructure [assets] on that, where, if they get flooded or a cyber-attack causes the system not to work, they end up being really important, whether it’s a terrorist attack or not.”

This data, along with information from state and local governments, is used to build an understanding of weak points across the country. “I think that the most immediate use case and the easiest to imagine is a hurricane headed to the Florida Panhandle,” Kolasky said.

“You have some range of uncertainty of what the storm surge could look like, but we can pretty quickly knit together where the things that we’re going to end up caring about are, whether it’s the hospital, the wastewater treatment plant, or the substation.”

Securing the digital heart

While resources like hospitals and power plants are among those at the top of the list of important assets to protect during a disaster, data centers have increasingly become integral to the functioning of society.

With the expected rise of smart cities, smart grids, AI-based healthcare, and the like, the need for data centers to maintain operations during a disaster is something that is likely to only grow in importance. When they go down, it’s hard to imagine what they might bring down with them. “That’s a hypothesis that would cause us to be looking more closely at those things,” Kolasky said. “We don’t fully know what’s going to be the answer, right?”

His team has to deal with long time scales, where it can be hard to predict what will happen. “In general, we’re trying to anticipate out 10, 15, 20 years. We just launched a program called the Secure Tomorrow series that looks out about 20 years, which is appropriate for some infrastructure build-up.”

Amid this work, Kolasky’s team has to deal with another reality: That of the government of the day. A public denier of climate change, President Trump has defunded programs to combat the problem, and sidelined various agency efforts to prepare for the worst.


In the DHS, the federal government’s second-largest department, FEMA (the Federal Emergency Management Agency) removed all mention of climate change from its 2018 strategic plan, while Homeland Security Secretary Kirstjen Nielsen (who left last year) questioned whether humans were responsible for climate change.

Kolasky would not be drawn on the matter of whether the politicized nature of climate change impacted his work. “You characterize it as politicized. I characterize it as: We plan for risks. We don’t endorse the risks we’re planning for.”

In 2015, well before Trump’s election, a congressional subcommittee held a hearing to “Examine DHS’s Misplaced Focus on Climate Change.” In an opening statement, subcommittee chairman Scott Perry (R-PA) listed various threats, from ISIS to hackers, and added: “I am outraged that the DHS continues to make climate change a top priority.”

Kolasky was called up to testify, and defended the decision to invest in climate change resilience: “Climate change threatens our nation’s security... The analysis of infrastructure exposure to extreme weather events we have conducted shows that rising sea levels, more severe storms, extreme and prolonged drought conditions, and severe flooding combine to threaten the infrastructure that provides essential services to the American public.” He told DCD his views had not changed.

The work on ensuring resilience continues, but faces another roadblock: “Almost everything we do is voluntary,” Kolasky said. “There are a lot of different requirements and regulations that infrastructure owners have to follow, but our relationship with them is voluntary.

“One of the things we try to do as an agency is use existing requirements as a way to incentivize security best practices within that, but you’re not going to hear me doing a lot of calling out companies for not doing the right thing. We have levers of influence, but they tend to be behind closed doors.”

A company like CenturyLink, Kolasky said, “would let us know if they were building significant infrastructure. They’d certainly ask for a consultation and partnership at the community level, and we have security advisors who can help do it.”

Equally, “the big builders of data centers [want to ensure that] state and local governments know that their infrastructure is really vital to the functioning of something that’s important to the government.”

Searching for weak points

While data centers are individually designed to last a few decades at best, the benefits of close proximity to other facilities, interconnection points, and favorable tax environments will ensure that data center hubs last much longer. “If there’s too much concentration of risk at a geographic site, obviously that can be problematic,” Kolasky said.

“And certainly our guidance encourages diversity of location so that one single incident can’t bring down a big portion of it.”

The Internet was originally designed to be incredibly robust, with its predecessor, the ARPANET, built to withstand a nuclear war. “It is an amazingly resilient infrastructure,” Professor Barford, who is the founder and director of the Wisconsin Advanced Internet Laboratory, said. “However, there are certain locations and certain aspects of the Internet that are much more strategically important than others.”

Barford hopes to analyze where the Internet is most vulnerable, from interconnection points to submarine cables. “But I would say that in the research community, there isn’t a clear understanding of exactly where the most important aspects of the Internet actually are. I don’t think that there’s been any systematic assessment of that. And until we actually do that work, then it’s very hard to say: ‘Well, here’s where we need to focus the most on because if this goes down, all hell is gonna break loose.’”

There are two facts that one should bear in mind. First, that some 70 percent of the Internet’s traffic flows through Northern Virginia. Second, that Virginia is sinking.

“It’s complicated, but it has to do with the fact that the glaciers used to come down almost to Virginia, and they actually pushed up the land ahead of them and caused it to bulge up,” Virginia’s state climatologist Professor Jerry Stenger said. “And now it’s sinking back down.”

At the same time, “you’re going to have more melting of land ice that’s going to run off and raise sea levels because there’s significant increases in the melting rates of the Arctic ice shelves and at the margins of the Greenland ice sheet.

“So the question would be, what about data centers that are located right near the coast?”

Virginia Beach, which features both data centers and cable landing stations, is home to the East Coast’s fastest-rising sea levels. Stenger told DCD that he did not want to provide real estate advice, but added: “I don’t know that I’m going to rush to buy a beachfront property at Virginia Beach. There’s going to be a lot of areas along the coast that are going to have more and more problems.”

The impacts will go beyond just the immediate sea level rise. “You raise the water level a little bit and bring a big storm in,” Stenger said, “and now you’re inundating more land every time there’s a storm surge of the same magnitude. You don’t necessarily need to wait until the water is lapping at your door, because the same type of storm comes through and now it’s pushing the water even further inland.”

Beyond the storm

Droughts will also likely become more prevalent, putting pressure on data centers that use water cooling. Dr. Arman Shehabi, the Lawrence Berkeley National Laboratory researcher best known for tracking data center energy use, is in the early stages of trying to understand how the industry could be impacted by more droughts.

“We’ve been looking at how water scarcity will change, mainly in the US, but it is a global issue of how that could affect the water demand that data centers need,” Shehabi told DCD. “If you have a data center that’s using water cooled chillers, and they’ve been sited based on what the demand is for today, that could be changing as that utility comes under more stress in the future.”

If the utility has to make tough decisions to ration water, “who would be the first they would drop? Data centers would be pretty high up there, as they’re using utility water for cooling for the most part.” While a lot of the huge water-consuming industries use on-site water, most data centers use “utility water that’s been treated and is needed for people, for houses. I would think that cooling data centers would fall at a lower priority.”

Data center goliath Digital Realty is aware of the growing risk. “We recently did a global review of water scarcity and stress risk across the portfolio and did some rankings of our portfolio that way,” Aaron Binkley, the company’s senior director of sustainability programs, said.

“A number of our facilities that use water for cooling are using non-potable reclaimed water, so we’re not heavily dependent on potable water supplies that a community would otherwise need for household purposes.” Some facilities do not use any water cooling, and the company is also looking into ways to reuse more water for longer. “That’s a significant long term effort that we’ve put in place.”

To deal with the risk of disasters to its more than 210 data centers around the world, the company recently opened a global operations command center in New Jersey. “Around times of a hurricane or a blizzard, they’re providing real-time updates and dialog out to the site teams with weather updates and other notifications so that they’re prepared and can respond accordingly.”

But the most vital way to prepare, Binkley said, is to hold activities and drills. “We do pull-the-plug tests - if somebody just walked in and hits the power off button, what happens? Is everyone prepared to respond quickly to that, and does the control system, and every piece of equipment work the way it’s supposed to? Those are real-life ways to not only test the facility itself but also to test the operators and make sure that our teams know what to do.”

The tests and procedures have to extend beyond the facility itself, to the upstream risks - like what to do if the power goes out. “We have these very robust fuel resupply agreements with suppliers so that in the event that there is an extended outage, we don’t run out of diesel fuel for the backup generators.

“We’ve planned road routes and refueling locations on-site so that they can get in with a truck in an area that’s not going to be flooded or not going to be obstructed, that they can park the appropriate distance from the fuel tank that they need to refill and do that in a time-efficient manner so that they don’t miss the window to get to the next property of ours and get that one refueled.”

Even then, there are limits - if all the roads shut and trucks can’t get through, what can one do? “This goes beyond the scope of just Digital Realty but, if our building floods that’s our problem, if lower Manhattan floods, that’s the City of New York’s problem, so to speak. We can’t build a sea wall for the City of New York.”

Building a wall

The US Army Corps of Engineers is currently studying five potential sea wall proposals. The largest would cost $119bn and take 25 years (if everything went smoothly), and is not designed to account for some predicted sea level increases. The city has several smaller projects underway.

“None of those have been fully constructed or built yet,” Ke Wei, assistant deputy director for the New York City Mayor’s Office of Sustainability and the Office of Resiliency, told DCD.

“We essentially tell telecoms and data center [operators] that because your infrastructure is so critical to the provision of basic services, public safety and health, you don’t necessarily want to count on broader sea walls to protect your facility. We would expect them to continue to think about how they can harden their specific facilities, and that they shouldn’t depend solely on the larger coastal resiliency projects that are being built.”

Wei collaborates with infrastructure operators across energy, wastewater, transportation, and telecoms to build resiliency plans. “To be completely honest with you, I think telecoms has been a challenging sector for us to work with on a voluntary basis. First, because newer cellular and Internet technologies have transitioned it to a more competitive market landscape versus the energy sector, which is more regulated as a monopoly.


“Second is just the different levers of control. With respect to the energy sector, there’s more local and state authority relative to what we’re seeing on the telecoms side.”

The city shares regulatory authority over telecoms with federal and state bodies, Priya Shrinivasan, special counsel and director of policy standards for the New York City Mayor’s Office of the CTO, said. “So the telcos are predominantly regulated at the federal level, and some at the state, and some at the city level. So it gets very complicated.”

While most of the work they do with telecom companies is voluntary, “we have climate resiliency design guidelines, and we put those into new city capital projects,” Shrinivasan said.

“We also share with major telecom stakeholders climate projections that are projected to impact New York City,” Wei added. “We have specific climate projections that are developed for the 2020s, 2050s, and 2080s.”

New York is no stranger to the dangers of an angry climate. In 2012, Hurricane Sandy tore through the city, flooding streets, subway tunnels, and offices. There were widespread power outages, billions in damages, and at least 53 deaths.

Data centers were mostly fortunate throughout the disaster, with only one major outage: Datagram went down when key equipment in its basement was flooded. A Zayo site came close to failure when it had to power down its cooling systems amid generator issues, while Peer 1 Hosting employees kept their facility fueled by forming a bucket chain to relay five-gallon buckets of diesel up 17 flights of stairs.

“Because of what happened with Hurricane Sandy there were a lot of investments made to harden the facilities from flooding across the city and across the infrastructure space,” Wei said.

“And so I think that at least people are aware of the risks because of that experience,” Wei said, but warned that eight years’ work on resiliency has not been tested in action: “We obviously haven’t experienced a comparable flooding event since [Hurricane Sandy].”

The crisis is here

“You need a big crisis - otherwise, people don’t move,” Paul Budde told DCD.

Budde should know: For years the telecoms analyst has been pushing for the Australian government to undertake a national resiliency plan for his sector. His efforts were rebuffed and ignored. Then Australia caught fire.

“We’ve now got a meeting with a government minister, and a lot of the things that I mentioned in my discussion paper have been addressed, and are going to be looked at - so that’s a positive.”

Budde experienced first hand what happens when little thought is given to cell tower battery power, fuel lines, or who is supposed to fill the generators, as communities went dark amid widespread wildfires. It’s an area his proposal seeks to address: “Access to electricity, and everything around it, that is seen as an easy win, because it’s not going to cost lots of money.

“Then the next thing is, what’s going to happen over the next 5, 10, 20 years? I don’t want to think about it, to be honest, but at the same time, it’s reality. There will be more fires, and that means you have to start looking at where you are placing your mobile towers.

“They are typically on top of hills, which are the most vulnerable parts because the fire creeps up the hill and is at its hottest at the top. If you’ve got your mobile tower there, there’s no hope in the world.”

This will require an honest discussion about redundancy and resilience, Budde believes. “It’s not just about communication for people living in the bush and schools and hospitals and things like that. It’s how are we as a country going to cope?”

Unfortunately, Budde has little hope of the current government finding a solution. “The politicians in Australia are ultra-conservative and are not really interested in the climate change issue because they believe that it’s far more important to keep coal going for jobs and income. We still have a long way to go on the political side to really get a visionary plan and a long term strategy. These sorts of disasters will increasingly happen, and to be honest, if it’s man-made or not, who bloody cares?

“We’ve got these problems. Come up with plans. Sticking your head in the sand is not the solution.”

Wildfires are not a problem localized to Australia, as any California resident can attest. The flames pose a huge risk to human life and infrastructure. But for data centers, “the biggest issue is the smoke,” Pete Marin, CEO of wholesale and enterprise data center company T5, said. “If you use some type of indirect or direct evaporative [cooling] and you’re taking smoke in through your cooling system, that’s very problematic.

“So on the West Coast of [the US], where there have been fires, you just have to monitor that. And that comes down to proper protocols and the operations of the data center. And that’s how you manage that, you don’t manage it through the cooling system. You manage it through the way you operate, and the process and procedures that you train and train and train for.”

But, again, no matter how much operators train, they are helpless against a disaster’s impact on upstream elements. “For the majority of our data centers we have a municipal utility that provides this very reliable low cost, clean power, but some of the transmission lines that feed power to that municipality are PG&E lines,” Digital Realty’s Binkley said.

“And so even though they’re not PG&E, and our bill doesn’t come from PG&E, we have an indirect exposure to the reliability of PG&E’s system,” he said, referring to the utility’s decision to turn off power amid wildfires late last year.

“And there’s no way for anyone in that market to avoid that.”

PG&E’s self-imposed outage was a mixture of climate change-exacerbated events and poor planning by the utility. But the grid is certainly going to struggle with the changing climate.

Do you trust the grid?

High temperatures and heatwaves limit the transfer capability of transmission lines, increase transmission losses, and cause lines to sag. High winds and storms will knock out transmission lines. Cold waves, snow, and ice can bridge insulators and cause flashover faults. Lightning strikes can cause short-circuit faults, or damage equipment with voltage surges. Floods could take out substations.

Utilities will have to prepare for all this, while simultaneously trying to rapidly shift to intermittent renewable power sources, and handle new use cases such as electric cars. Rising temperatures and heat waves are also expected to lead to a huge increase in air conditioning load, risking blackouts and brownouts when people need power the most.

“From a city perspective, more people die every year due to heat-related illnesses than the number of people that were killed during Sandy,” Ke Wei said. “So that’s something that we think about a lot.”

Cities are designed to last countless generations, and require thinking that is equally long-term. Is it fair to expect the same from the data center industry? “We’re thinking out maybe 10 to 20 years and I think for that timing we don’t have any concerns about the usability of our facilities. But 40 years? A lot can happen between now and then,” Ed Henigin, CTO of Texan colo Data Foundry, said.

“I mean, geez, what if everybody moves to the cloud and colocation data centers are pointless and it all doesn’t matter? What if Amazon just ends up buying up all the available real estate because they’re just this massive?”

Henigin is confident his facilities are ready to handle the increased storms and climatic events expected to batter the southern states - ironically, because the area has always been at risk. “The Dallas area in northern Texas has a far higher risk of tornadoes. So, as these things get magnified by climate change, we’re already in a position where we have basically over-designed in order to mitigate those risks, whether or not we had the foresight to know that this was really going to be a long-term climate change issue.”

This means that data centers are designed to ride out a 185mph wind: “We all say if the sky turns green, go into the data center.”

We’re in this together

It’s this shared fate, and disagreement over where the responsibility for action lies, that scares climate scientists. “Whenever I’m talking to people, I always bring this up,” Argonne’s Kotamarthi said.

“Let’s say there is a certain amount of change in a place which has very little capacity to adapt. The consequences will be much more disastrous than [in] a place which has much higher resources to adapt.”

The US, should it collectively agree to believe the preponderance of scientific evidence, is far better placed to adapt than much of the planet. “Large portions of this world are not ready to handle that - they’re not even thinking about it,” Kotamarthi said.

“Researchers have put temperature and humidity data together to come up with an index of livability, and it shows that for a lot of the summer people cannot even go outside and work in large parts of [the] Middle East, North Africa, and India.

“It is shocking when you realize how few people are actually worried about [it]. It could affect millions of people, maybe hundreds of millions. And it seems like nobody is really worried about that.”
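Indices like the one Kotamarthi describes typically combine temperature and humidity into a single heat-stress measure, akin to a wet-bulb temperature. As a sketch of the idea, here is Stull’s (2011) empirical wet-bulb approximation - the formula is published, but the framing here is illustrative, not the researchers’ actual index:

```python
# Stull's (2011) empirical wet-bulb approximation (T in deg C, RH in %).
# Sustained wet-bulb temperatures near 35 C are generally considered
# beyond the limit of human heat tolerance.
import math

def wet_bulb_c(temp_c, rh_pct):
    return (temp_c * math.atan(0.151977 * math.sqrt(rh_pct + 8.313659))
            + math.atan(temp_c + rh_pct)
            - math.atan(rh_pct - 1.676331)
            + 0.00391838 * rh_pct ** 1.5 * math.atan(0.023101 * rh_pct)
            - 4.686035)

print(round(wet_bulb_c(40, 50), 1))  # a 40 C day at 50% humidity -> ~30.9
```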

Even in the US, “people in Florida are not really thinking hard about what it all means. I can immediately see a map of the US from a later part of the century, and I know which year it is from just by looking at how much of Florida is underwater. I don’t think people are really thinking about it. That’s the really shocking thing for me.”

Kotamarthi has been studying the climate since the ‘90s and said that the decades-old models of his youth have held up surprisingly well. “The models now are fairly robust. Actually, they may be underestimating some of these changes.

“I’m worried about what is going on. But I’m also hoping people will be proactive as they start getting impacted more and more.”

When the wind blows

But building for hurricanes is something operators need to understand will cost more, and add to the construction time. “The key force to deal with when it comes to high winds is uplift,” Henigin said. “In order to build a highly wind-resistant structure, it starts with the foundations.”

Giant concrete piers are sunk up to 25 feet deep and anchored to the roof structure. “So when there’s a high uplift pressure on the facility, it would have to pull 25 feet of ground beneath the building.”

Henigin believes that there are some companies that don’t care about such levels of windproofing. “We’re really talking about the hyperscalers and people who share their mindset, who have such large capital budgets and such rapid lifecycle turnover of equipment, and distributed footprints, that specific individual facility resiliency is less important to them than removing those features to save the money that they can then spend it on a wider footprint or software stacks that can dynamically respond to outages.

“That part of the market will certainly achieve significantly lower construction costs by taking everything out, but that’s a holistic thing.”

Still, even with the extra layers of defense, Henigin admits that no structure can ever claim total resilience. “There’s force majeure events that are really beyond the scope. We do not promise and no business promises to be [indestructible].”

“There are inherent risks to physicality. If you take the map of Houston, it’s very flat, very close to sea level. Nevertheless, the city of Houston is one of the largest economies in the world. And we are a portion of that, and if that larger economy takes a hit, then we will take the hit along with it. There’s just a massive shared fate element to these systemic things that can happen.”