Spending on the Internet of Things (IoT) will reach $1.3 trillion by 2020, and 43 percent of IoT data will be processed at the edge, according to IDC projections. The number of “connected things” will reach 30 billion in 2020 and jump to 80 billion in the following five years, the research firm predicts. This market demand is driving the need for new solutions like edge computing.
Historically, the focal point of IT has swung between centralized and decentralized models. It started with centralized mainframe computing, moved to decentralized client-server networks, and then swung back to a centralized model in the cloud. Now the pendulum is swinging back to a decentralized model in the form of ‘local edge computing.’
As IT infrastructures are now required to support IoT applications and devices – whose requirements differ from those of traditional IT workloads such as email and productivity applications – the decentralized model will become critical.
This doesn’t mean the cloud is going away. It will continue to handle massive amounts of data from traditional IT applications, and much of our data science work will still be done there, but it will be reserved for data that doesn’t require immediate attention or that has needs different from those of traditional applications. Anything that requires real-time decisions, however – such as intelligent medical devices that monitor patients and feed information back to healthcare personnel – is likely to be handled at the edge.
Edge computing defined
But what exactly is the edge, and why should data be processed there? Simply put, within a hybrid IT deployment, the local edge creates a high-performance bridge between your off-premises public and/or private cloud, your on-premises and co-located corporate data centers, and your local on-premises IT deployments. The cloud centralizes data processing and storage, but as companies embark on IoT implementations, many have come to realize that this centralization has its limitations and that there are certain applications it simply cannot support appropriately.
IoT aims to link together any device or person that can be connected through a network, to generate a continuous flow of information that helps organizations reduce their operational costs, increase revenue or improve customer experiences.
Edge computing is needed because many IoT applications require some combination of extremely low latency, high bandwidth or strict data-handling requirements. Take, for instance, the retail industry:
To improve foot traffic at brick-and-mortar locations, retailers are digitizing the customer experience through digital signage, digital wallets, augmented reality and smart fitting rooms, while optimizing their cost structure by digitizing their supply chains. If the data used by these systems has to travel hundreds or thousands of miles to a centralized cloud data center, the resulting latency will dramatically degrade the in-store experience the retailer is attempting to deliver.
This is why edge computing has gained momentum. It creates a network of localized sites that process data as close to the person or thing as possible, thereby greatly reducing or eliminating latency, bandwidth and data handling issues.
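The latency argument can be made concrete with a back-of-envelope calculation. The sketch below (an illustration, not from the original article) computes only the round-trip propagation delay imposed by the speed of light in optical fiber, roughly 200,000 km/s; real networks add routing, queuing and processing delays on top of this physical floor, and the distances used are hypothetical.

```python
# Back-of-envelope latency floor: light in optical fiber travels at
# roughly 200,000 km/s (about two-thirds of its speed in a vacuum).
FIBER_SPEED_KM_PER_S = 200_000

def propagation_delay_ms(distance_km: float, round_trip: bool = True) -> float:
    """Return the minimum fiber propagation delay in milliseconds."""
    one_way_ms = distance_km / FIBER_SPEED_KM_PER_S * 1000
    return one_way_ms * 2 if round_trip else one_way_ms

if __name__ == "__main__":
    # Hypothetical distances: a distant cloud region vs. a local edge site.
    for label, km in [("cloud region, 2,000 km away", 2_000),
                      ("local edge site, 20 km away", 20)]:
        print(f"{label}: >= {propagation_delay_ms(km):.2f} ms round trip")
```

Even before any processing happens, a round trip to a cloud region 2,000 km away costs at least 20 ms, versus 0.2 ms to an edge site 20 km away, a hundredfold difference that real-world routing only widens.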
So, as we look ahead into the brave new world of IoT and edge computing, we will see a combination of cloud, regional data centers and localized edge computing that ideally will work in harmony to produce better business outcomes and bring Certainty in a Connected World. What those outcomes will be is up to each individual company, but optimizing operational processes, increasing revenue and improving customer experience are usually high on the list. Any company looking to leverage IoT technology needs to understand edge computing and determine where this decentralized approach will bring new efficiencies to its operations.
To learn more about edge computing and its applications for companies, read this free white paper, “The Drivers and Benefits of Edge Computing.”