The term “CPU," short for Central Processing Unit, has been around since the 1950s. At the heart of every computer or smart device sits the CPU, tirelessly executing traditional applications.

Since the 1990s we have also had the GPU, or Graphics Processing Unit – originally created for the specialized task of accelerating on-screen computer graphics for engineering workstations, video processing, and gaming, to keep up with the extremely demanding expectations of the human visual system. We owe our survival to our ability to be immediately alert to the slightest movement or visual anomaly – and that has always presented a severe challenge for cinema, television and computer screen design. Whereas the CPU is designed to process a linear stream of data, the GPU is designed to process a multidimensional experience: three dimensions of space plus dimensions of color and time. In theory the CPU could handle this, but in practice it would be far too slow to keep up.
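To make that contrast concrete, here is a toy sketch in Python (a minimal illustration only, not real GPU code; NumPy and the frame dimensions are assumptions for the example). It applies the same brightness adjustment to an HD frame two ways: pixel by pixel, the way a single CPU core would iterate, and as one data-parallel operation over the whole frame – the shape of work a GPU spreads across thousands of parallel threads.

    import numpy as np

    # One HD frame: height x width x RGB, values in [0, 1]
    frame = np.random.rand(1080, 1920, 3)

    # Serial style: visit every pixel in turn, as a single CPU core would
    brightened_serial = np.empty_like(frame)
    for y in range(frame.shape[0]):
        for x in range(frame.shape[1]):
            brightened_serial[y, x] = np.clip(frame[y, x] * 1.2, 0.0, 1.0)

    # Data-parallel style: the same adjustment expressed as one operation
    # over the whole frame, with no dependency between pixels
    brightened_parallel = np.clip(frame * 1.2, 0.0, 1.0)

    assert np.allclose(brightened_serial, brightened_parallel)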

The fascinating thing about the GPU is that this multidimensional processing skill turns out to be much closer to the requirements of machine learning and big data processing. Something originally created to make computer games more vivid has become a vital part of today’s data centers driving digital transformation.

And now we are hearing a lot more about DPUs – “Data Processing Units.” But don’t CPUs and GPUs already process data? The difference is a matter of “it ain’t what you do, it’s the way that you do it.” The DPU plays a key role in managing the way data moves through the data center. And yes, it does contain its own CPU.


Acceleration through intelligent networking

Consider a high-level international diplomatic conference. In the interests of total accountability and accuracy, one might insist that every speaker's words be transcribed in their native language and passed to every other participant, who then has the job of translating those exact words into their own language. But it does not work like that – instead we use interpreters to tell the listeners what was said, in their own language. Translators have time to study dictionaries; interpreters do their job on the fly.

The role of the experienced and intelligent interpreter is not simply to provide an accurate translation; the job requires other skills. First there must be an understanding of national protocols. The Australian speaker might use the word “mate” as a friendly gesture, when the interpreter knows that there will be less diplomatic friction if “mate” is replaced by “your Esteemed Highness.” Then there is the understanding that phrases familiar in one culture – say the English “it isn't cricket” – might need explanation for others. Then there is a need to accelerate communication by editing out “ums and ers,” dramatic pauses and rambling personal anecdotes. There can even be some subtle encryption: “our European competitors” might be encoded as “our European allies.” Finally, in the interests of peace and global security, some utterances might simply be best forgotten.

Instead of lumbering every delegate with the task of analyzing pages of transcript, a team of skillful interpreters can make sure that the essential messages are conveyed in the most economical form to those who need to know – while keeping a check on malicious messages. The delegates, spared a load of translation and protocol headaches, can concentrate better on high-level diplomacy.

In a traditional data center the network is no more than a web of cables linked by switches to deliver data to and from the CPUs, where all the processing takes place. In a modern data center the network is connected through DPUs that process the data on the fly to reduce the load on the CPUs and free them up for their intended application processing. They play a role not unlike a team of very professional interpreters. So what is in the DPU?

The DPU

The DPU is a new class of programmable processor, a system on a chip (SoC) that combines three elements:

  • An industry standard, high-performance, software programmable, multi-core CPU.
  • A high-performance network interface capable of parsing, processing, and efficiently transferring data at network speed.
  • A rich set of flexible and programmable acceleration engines designed to offload networking tasks and optimize application performance for AI and machine learning, security, telecommunications, storage and more.

That third component is especially significant, because it can be compared with the skills of the perfect interpreter described above: a set of functions that together optimize efficiency and make sure the right data goes, in the right format, to the right place along the fastest and most reliable route. It can also support encryption, as well as identifying anomalous traffic or malware and even initiating an appropriate response. Some “DPU” vendors may rely on proprietary processors, and others expect the CPU to do all the work, but unless all three of these elements are present, the device should not be called a DPU.
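To make the idea of offload engines a little more tangible, here is a minimal sketch in Python, assuming a Linux host with the standard ethtool utility installed and a network interface named “eth0” (both assumptions for the example). It simply lists the networking tasks an ordinary adapter already offloads from the CPU; a SmartNIC or DPU extends that list much further, into encryption, storage and virtualization offloads.

    import subprocess

    def list_enabled_offloads(interface="eth0"):
        """Return the offload features ethtool reports as enabled for the interface."""
        output = subprocess.run(
            ["ethtool", "--show-offload", interface],
            capture_output=True, text=True, check=True,
        ).stdout
        enabled = []
        for line in output.splitlines():
            feature, sep, state = line.partition(":")
            if sep and state.strip().startswith("on"):
                enabled.append(feature.strip())
        return enabled

    if __name__ == "__main__":
        # Typical entries: rx-checksumming, tcp-segmentation-offload,
        # generic-receive-offload -- each one is work the CPU no longer does
        for feature in list_enabled_offloads():
            print(feature)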

The DPU is usually built into a network interface controller (NIC) to create what is called a SmartNIC. SmartNICs are key components of next-generation data centers, the sort needed to power the data revolution that has been called “The Fourth Revolution.”

The Fourth Revolution

Some historians have identified three key revolutions in the evolution of humanity:

  • The Cognitive Revolution, around 70,000 BCE, marked the birth of language and the ability to communicate.
  • The Agricultural Revolution, around 10,000 BCE, marked our ability to domesticate farm animals and cultivate crops to support the rise of cities.
  • The Scientific Revolution, during the early modern period, was when developments in mathematics, physics, astronomy and the other sciences laid the foundation for our modern way of life.

The fourth revolution is artificial intelligence (AI), which can perform certain tasks at superhuman levels of performance. Advanced machine learning can now organize, process, and extract subtle but valuable information from mountains of data gathered from millions of users on the Internet and the growing Internet of Things. This marks a seminal step towards developing true artificial intelligence and the extraordinary possibilities it enables.

Even with the benefit of GPUs to crunch the data, progress has been held back by the extreme burden placed on the CPUs to access and share information between computers and to keep the GPUs fed with data. A key development has been to make the network even faster – but this has overwhelmed the CPU with data management tasks. Now, with DPUs, we make the network smarter, like that highly professional team of interpreters, so the network becomes an active agent in the overall data processing. DPUs bring the processing closer to the data itself, and they make the network act like a co-processor, offloading labor from the central compute engine.

Digital Transformation is the business application of the Fourth Revolution. That is why we will soon be hearing a lot more about DPUs.