The need for data is accelerating. According to industry analyst firm IDC, 64.2 zettabytes of data were created, captured, copied and consumed worldwide in 2020 – due in large part to the number of people working, learning and entertaining themselves from home during the pandemic.

Meanwhile, DataReportal estimates that more than five billion people around the world use the internet – equivalent to 63 percent of the world’s population. And the number continues to grow. In the 12 months to July 2021, almost 180 million new users joined the online population.

The growing internet of things (IoT) market is only adding to this explosion in data. Analyst firm IoT Analytics predicts there will be 14.4 billion active connected devices by the end of 2022, almost doubling to 27 billion by 2025.

It’s perhaps unsurprising, then, that some have predicted that the amount of electricity required to manage this data may grow four-fold by 2030 – a worrying trajectory at a time of mounting environmental pressure.

One thing that’s becoming increasingly clear is that, as the volume of data grows, managing the unprecedented scale and speed of that growth presents data center owners and operators with a range of challenges.

Addressing challenges

A conflict exists between performance and sustainability, for instance. While it’s essential to drive a data center’s performance to meet the ever-growing demand for data, doing so often comes at the cost of a larger carbon footprint.

The complexity involved in running more efficient data centers can be challenging, too. According to McKinsey & Company, 87 percent of companies worldwide believe they have or expect to have a skills gap within their organizations.

An increasingly competitive climate means it can be difficult to hire and retain qualified people with the skills and expertise needed. As a result, data center operators can find themselves without the people they need to do everything that needs to be done.

Adapting to the increased demands of managing such a large volume and variety of data will require a shift away from traditional standard operating procedures toward more agile micro decisions.

Put simply, making decisions every minute or hour will no longer be enough. The need for efficiency means decisions must now be made in near real-time.

For the same reason, low-latency Edge processing is needed to handle the proliferation of IoT devices and to support the capabilities of 5G-enabled applications. Data center operators will therefore need to consider their transition to the Edge while simultaneously running their core business.

Taking the right steps

It’s clear, then, that data centers as we know them are changing. But there are important steps that all data center operators must take now to position themselves for today’s data-rich environment.

As the volume of data grows, managing the unprecedented scale and speed of that growth presents data center owners and operators with a range of challenges – Eaton

They should already be assessing the extent to which their processing is digitally enabled, for example, and identifying any gaps in their solutions that must be filled for effective operations and a more efficient use of resources.

Certain functional requirements must be maintained to ensure operations meet today’s standards. For example, accurate and up-to-date asset management and capacity planning are both key to optimum efficiency, as is effective and, ideally, proactive 24/7 monitoring.

Remote monitoring, in particular, allows data center managers to view data in real time and receive alerts when the unexpected happens, enabling them to quickly address issues to minimize – and potentially eliminate – downtime.
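To make this concrete, here is a minimal sketch of the kind of threshold-based alerting a remote monitoring pipeline performs. The sensor names, limits, and Reading type are illustrative assumptions, not a description of any particular monitoring product.

```python
# A minimal sketch of threshold-based remote monitoring, assuming telemetry
# arrives as simple (sensor, value) readings. All names and limits are
# illustrative.
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str   # e.g. "rack12/inlet_temp_c"
    value: float

# Hypothetical alert thresholds per sensor (units implied by the sensor name).
THRESHOLDS = {
    "rack12/inlet_temp_c": 27.0,   # ASHRAE-recommended inlet ceiling
    "ups1/load_pct": 90.0,         # headroom before the UPS is saturated
}

def check(reading: Reading) -> str | None:
    """Return an alert message if the reading breaches its threshold."""
    limit = THRESHOLDS.get(reading.sensor)
    if limit is not None and reading.value > limit:
        return f"ALERT: {reading.sensor} = {reading.value} exceeds {limit}"
    return None

if __name__ == "__main__":
    for r in [Reading("rack12/inlet_temp_c", 28.4), Reading("ups1/load_pct", 71.0)]:
        msg = check(r)
        if msg:
            print(msg)  # in practice this would page an operator, not print
```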

Consideration must also be given to optimizing existing assets and processes to maximize their efficiency and effectiveness. Workflow management and visualization tools are important for better capacity planning, for instance. And failure simulation, which many data center owners haven’t considered, will ensure an organization’s readiness in the event of an outage.
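Failure simulation can be as simple as a Monte Carlo estimate of how likely a redundant design is to drop its load. The sketch below assumes an N+1 UPS arrangement with independent failures and an illustrative 2 percent failure probability per unit; the numbers are placeholders, not measured reliability data.

```python
# A toy failure simulation: Monte Carlo over independent UPS failures
# in an N+1 design. Unit counts and failure probability are illustrative.
import random

def outage_probability(units: int, needed: int, p_fail: float,
                       trials: int = 100_000) -> float:
    """Estimate the chance that fewer than `needed` of `units` UPSs survive."""
    outages = 0
    for _ in range(trials):
        survivors = sum(random.random() > p_fail for _ in range(units))
        if survivors < needed:
            outages += 1
    return outages / trials

# Four units, three needed to carry the load (N+1), 2% chance each fails.
print(f"Estimated outage probability: {outage_probability(4, 3, 0.02):.4%}")
```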

But perhaps even more valuable, in terms of adopting a more efficient proactive approach, are features such as power anomaly analytics and predictive asset health, both of which can help catch an issue before it has an opportunity to take hold, minimizing its impact – and often eliminating it altogether.
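By way of illustration, power anomaly detection can be approximated with a rolling statistical baseline. The sketch below flags samples more than three standard deviations from a recent window; the window size and cut-off are conventional choices for this kind of screen, not the method any specific product uses.

```python
# A minimal power-anomaly sketch using a rolling z-score.
from collections import deque
from statistics import mean, stdev

def detect_anomalies(samples, window=30, sigma=3.0):
    """Yield (index, value) for samples that deviate from the recent baseline."""
    history = deque(maxlen=window)
    for i, x in enumerate(samples):
        if len(history) == window:
            mu, sd = mean(history), stdev(history)
            if sd > 0 and abs(x - mu) > sigma * sd:
                yield i, x
        history.append(x)

# Steady ~50 kW draw with one injected spike at index 40.
load_kw = [50 + (i % 3) * 0.2 for i in range(60)]
load_kw[40] = 65.0
print(list(detect_anomalies(load_kw)))  # -> [(40, 65.0)]
```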

Ultimately, though, the key to unlocking new value from a data center lies in multi-system integration. There’s currently a wall between the data people need to make the right decision and the data that’s available. It’s why we talk about a single pane of glass: intelligent decision-making requires the right data to be made available to the right people.

But this data will often be held in various discrete silos such as information technology service management (ITSM), building management (BMS), data center infrastructure management (DCIM) and enterprise resource planning (ERP) systems. It’s essential, therefore, to integrate all these systems to fully optimize them.
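As a simplified illustration of that single pane of glass, the sketch below joins records from hypothetical ITSM, BMS, and DCIM exports on a shared asset ID. Real systems would expose REST APIs rather than flat dictionaries, but the principle of one merged view per asset is the same.

```python
# A sketch of "single pane of glass" integration: joining records from
# hypothetical DCIM, ITSM, and BMS exports on a shared asset ID.
dcim = {"ups-01": {"rack": "A12", "load_pct": 71}}
itsm = {"ups-01": {"open_tickets": 1}}
bms  = {"ups-01": {"room_temp_c": 24.1}}

def unified_view(asset_id: str) -> dict:
    """Merge per-system records for one asset into a single dictionary."""
    merged = {"asset_id": asset_id}
    for source in (dcim, itsm, bms):
        merged.update(source.get(asset_id, {}))
    return merged

print(unified_view("ups-01"))
# {'asset_id': 'ups-01', 'rack': 'A12', 'load_pct': 71,
#  'open_tickets': 1, 'room_temp_c': 24.1}
```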

Looking to the future

Finally, of course, data center operators should always keep one eye on the future, thinking about forward-looking strategies and investments that will enable them to remain agile in the years ahead. The goal is to continue testing, learning, and adopting the building blocks that will help solve tomorrow’s challenges.

As I mentioned earlier, balancing performance against sustainability is key. Eaton’s grid-interactive EnergyAware uninterruptible power supplies (UPSs) intelligently leverage a data center’s connected energy storage to manage power and the flow of energy.

With Eaton EnergyAware UPSs, data center operators can support sustainable energy solutions, optimize the cost of powering buildings and create additional revenue streams from power protection assets – all while helping energy providers balance power generation and consumption – Eaton

Not only can these UPSs help optimize energy usage and reduce the cost of energy through demand-response activities, but they can also support the grid to allow a higher penetration of renewables.
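The decision logic behind demand response can be sketched in a few lines. The price thresholds, reserve level, and energy figures below are illustrative assumptions, not Eaton’s control logic; a real grid-interactive UPS responds to utility signals and enforces strict battery-reserve rules so backup duty is never compromised.

```python
# A simplified demand-response sketch: discharge stored energy when the grid
# price crosses a threshold, recharge when energy is cheap. All numbers
# are illustrative.
RESERVE_KWH = 40.0   # energy that must stay reserved for backup duty

def dispatch(price_per_mwh: float, stored_kwh: float) -> str:
    if price_per_mwh > 120 and stored_kwh > RESERVE_KWH:
        return "discharge"   # support the grid / cut the energy bill
    if price_per_mwh < 40 and stored_kwh < 100.0:
        return "charge"      # top up while energy is cheap
    return "hold"

for price, soc in [(150, 80.0), (150, 40.0), (30, 60.0)]:
    print(price, soc, "->", dispatch(price, soc))
```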

Shifting to a smart Energy-as-a-Service (EaaS) model and offering a more flexible, responsive, customer-centric energy system will also improve sustainability, along with helping customers better manage costs.

Greater insights will be crucial, too, to enable the agility and flexibility that’s needed. Investing in data science capabilities such as advanced artificial intelligence and machine learning will help organizations glean the most valuable insights from their data, allowing them to make the right decisions to accelerate their business.

But making the decisions that will drive valuable action requires the ability to run different scenarios. That’s where the concept of digital twins will become increasingly important. Indeed, running and testing scenarios in this way is crucial to making micro decisions.
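At its simplest, a digital twin is a model you can interrogate with what-if scenarios before committing to a change. The toy model below uses nothing more than an assumed PUE to compare facility power across scenarios; a production twin would be far richer, but the decision pattern is the same.

```python
# A toy "digital twin": a crude power model of a data hall, run under
# what-if scenarios. The fixed-PUE model is a deliberate oversimplification.
def facility_power_kw(it_load_kw: float, pue: float) -> float:
    """Total facility draw implied by the IT load and an assumed PUE."""
    return it_load_kw * pue

scenarios = {
    "today":            {"it_load_kw": 800, "pue": 1.6},
    "add 100kW of IT":  {"it_load_kw": 900, "pue": 1.6},
    "improved cooling": {"it_load_kw": 800, "pue": 1.3},
}

for name, s in scenarios.items():
    print(f"{name:18} -> {facility_power_kw(**s):,.0f} kW total")
```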

Data centers must evolve to keep pace with the demands of an ever-growing volume of data – not to mention environmental concerns around their use of power.

Embracing that data and enhancing assets appropriately is, admittedly, no easy task. But, by assessing, maintaining, optimizing and integrating those assets – remaining mindful of what might lie ahead – data center owners and operators can ensure they’re moving in the right direction for greater efficiency, effectiveness and sustainability.