If you dig into the history of cloud computing, you can trace its origins back to the 1990s, though the concept was still theoretical then and remained so until well past the turn of the century.

It wasn’t until 2006, when former Google CEO Eric Schmidt introduced the term, that anyone started to take notice. But once they did, there was no turning back.

The 2010s were inarguably cloud's decade, as the fledgling market exploded into a $227.8 billion global phenomenon. Cloud providers like Amazon, Microsoft and Google shot to the top of any list of the world’s most valuable businesses and brands, and their gravity drove numerous changes across the data center industry.

Most notably, the cloud presented an alternative to the enterprise data center that had been the industry archetype for some 20 years. Early, cautious forays into the cloud eventually gave way to a significant migration, and more than a few obituaries were written for those enterprise facilities.

But something interesting happened on the way to the funeral. As organizations adapted to the cloud, so too did their data centers. They became smaller, more efficient, and even more mission-critical because they housed the data that organizations didn’t entrust to the cloud. Now, enterprise facilities are becoming the hub of a new, hybrid network of architectures that incorporate public and private cloud resources and distributed edge computing.

Data centers have adapted to the cloud – Thinkstock

The fast and the flexible

A new report, The Modern Data Center: How IT is Adapting to New Technologies and Hyperconnectivity, touches on several topics related to this trend.

The report is the result of a survey of 150 data center executives and engineers from various industries around the world.

Perhaps the most alarming data point from the report concerned a widespread belief that organizations are not fully prepared for today’s evolving data ecosystem.

Just 29 percent of data center decision-makers say their current facilities are meeting their needs, and just 6 percent say their data centers are updated ahead of those needs. Digging deeper, just 11 percent of executives say their data centers are updated proactively, which is troubling enough, but engineers have an even dimmer view: just 1 percent of engineers say the same.

These concerns are compounded by workloads that are expanding and becoming more distributed. These realities place a premium on flexibility and speed: whatever the obstacles, users demand seamless transitions and accessibility.

Other notable results from the survey:

Security (43 percent), backup and emergency preparedness (33 percent), the ability to implement new technologies (28 percent) and bandwidth (27 percent) were the features most commonly cited as likely to deliver a competitive advantage.

Businesses have yet to house all their data on cloud servers – Google

45 percent of respondents identified security as the area where their data centers most need an upgrade.

Respondents are bullish on self-configuring and self-healing data centers: 24 percent said more than half of their data centers would be self-configuring by 2025, and 32 percent said more than half would be self-healing.

74 percent of C-suite executives believe data center staffing will be reduced or handled by external cloud or edge service providers.

The big takeaway

A new equilibrium is emerging between public and private clouds, edge deployments, and reconfigured enterprise facilities.

The changes, driven by a seemingly limitless appetite for data, are happening so fast that many organizations are struggling to keep pace and determine where to place their bets. This is creating widespread uncertainty and unease among decision-makers, and putting a premium on partners who can deliver the expertise and planning to help navigate this new world.