Edge emerged as a potent buzzword in the data center field around eight years ago.
The term came into use around 2014, when Qualcomm engineering vice president Karim Arabi defined it as “all computing outside the cloud happening at the edge of the network, and more specifically in applications where real-time processing of data is required.”
He was delivering a keynote at the IEEE/ACM Design Automation Conference (DAC), and repeated the ideas in a wider talk at MIT the following year.
Buy our Edge gear
Applications were emerging, we were told, that needed fast responses - low latency. Much of the processing power that had migrated into centralized cloud facilities during the first part of this century should move back out, to sit close to the users and devices producing and consuming the data, so that responses could be delivered on millisecond timescales.
What were those applications? They varied, but an explosion in the Internet of Things (IoT) was predicted, with sensors everywhere that needed to connect directly to control software. Vast fleets of autonomous vehicles were due on our roads any day, and it seemed obvious that they could only avoid collisions with very fast responses to image recognition queries, delivered from networked Edge resources.
During the years of Edge hype, other applications came and went. Virtual reality gaming was going to be huge, but users would get seasick unless Edge processing could give their headsets instantaneous data that tracked their exact position.
After this, we heard that, as we all moved into the metaverse, we would need servers within a few meters to keep our avatars updated and live.
And how would the Edge be delivered? Multiple vendors had different answers, which boiled down, for each one, to: "You need our kit."
Mobile telcos got behind Edge, making it almost synonymous with their latest network iteration, 5G. The telco Edge could serve all these applications from equipment at base stations.
There was a surge of packaged micro data centers, from companies including Schneider and Vertiv, and newcomers like Zella and DataQube. These could hold enough equipment to serve applications locally and communicate back to central locations.
Telecoms operators and Edge providers planned to put a shipping container full of kit on every street corner, or at least by every cell tower.
Fast response - slow growth
Years on, things don’t look quite as planned. For some, the Edge began to look like a Yukon Gold Rush. Difficulties emerged, including the fact that Edge resources are more expensive than hyperscale resources: they lack the economies of scale of large facilities, and they can’t be moved to where power is cheap.
Telco towers were supposed to be the prime location for these containerized Edge boxes. That is happening, but it won’t reach the levels projected, because most telco towers don’t actually have enough spare electrical power to run much extra equipment.
Also, some versions of Edge amounted, to put it simply, to installing IT-grade boxes in telco spaces. Telcos work with cabinets and hardened, “NEBS-rated” equipment, while IT equipment is normally designed for an office-like environment. The ever-practical environmental-control experts at ASHRAE pointed out that the electronics in an Edge box could suffer an ingress of dirt and moisture every time the door was opened.
Alongside all these difficulties, the most-hyped Edge applications have been stubbornly slow to arrive. IoT devices turn out to need only small amounts of data. Autonomous vehicles, if they ever arrive, will need response times so fast that they will carry their own IT rather than rely on smart Edge lamp-posts. And to most people, the metaverse is still a repugnant fantasy.
But other Edge players, such as Involta, are still there. There’s a common factor among most of the Edge survivors: they don’t use vanilla containerized data center boxes, most of which were looking past their best before the Edge hype started.
Edge is not really about specific locations, but about providing good-enough latency. When the industry started to consider the practicalities, it turned out - surprise, surprise! - that the speed of light is actually pretty fast. You can get good response times from relatively distant locations - so Edge resources can be relatively centralized, in modest-sized data centers.
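The back-of-envelope arithmetic here is easy to check. A minimal sketch, assuming the textbook figure that light in optical fiber travels at roughly two-thirds of its vacuum speed (about 200 km per millisecond), and counting only propagation delay - routing, queuing, and processing are ignored:

```python
# Rough round-trip propagation delay over fiber.
# Assumption: signals in fiber travel at ~2/3 c, i.e. ~200 km per ms.
C_FIBER_KM_PER_MS = 200.0

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round trip; ignores routing, queuing, processing."""
    return 2 * distance_km / C_FIBER_KM_PER_MS

for km in (10, 100, 500, 1000):
    print(f"{km:>5} km -> {round_trip_ms(km):.1f} ms round trip")
```

Even a data center 500 km away adds only about 5 ms of round-trip propagation delay - comfortably inside most "millisecond-scale" latency budgets, which is why a modest regional facility can serve as "the Edge" for a whole metro area or more.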
The Edge shipping containers mostly didn’t show up. The big companies still offering this kind of business model are making less extravagant claims, and have shifted the containers to less prominent pages on their websites.
The real Edge: CDNs... and AI?
Alongside this, it turns out there is actually a real Edge in existence. We have some real applications, and the infrastructure to deliver them has been getting smarter and more capable.
I am talking about content delivery networks (CDNs), which started speeding up Internet access before the year 2000 by caching content locally. Cloudflare, Akamai, and others have expanded on this model to host applications locally, creating much of what the Edge was supposed to deliver - without the hype.
Netflix and other streaming services do the same, and they do it with small packages of standard IT hardware installed in colocation data centers.
That’s the story so far. But now there’s another possible Edge application being talked of: artificial intelligence (AI). Once an AI model has been trained on demanding hardware in a remote data center, far from users, it can be deployed close to those users, where its wisdom can be applied quickly.
CDN players like Cloudflare are preparing to seize the opportunity.
After all the hype and hoo-ha, we might just have a real Edge application - and the players who deliver it could be ones who were around long before the Edge story first took off.