What Is a DPU (Data Processing Unit)? How Does It Work?

 


One of the simplest ways to think about a DPU, says Jim McGregor, principal analyst at TIRIAS Research, is as “an inline or offload workload accelerator.”

It’s “designed to handle the overhead tasks typically required of the CPU, such as security, network processing or compression,” he says, adding that network processing generally refers simply to data transfer.

While a DPU could be a fully programmable unit, such as a field-programmable gate array, most are essentially a “system on a chip” (SoC), with core blocks optimized for specific functions, according to McGregor.

Importantly, he says, “they’re each unique and not interchangeable. A DPU from one vendor may have different functions and features than a DPU from another vendor.”

DPUs are primarily used in data centers (and in data center servers specifically), says Brandon Hoff, an analyst who leads networking and communications infrastructure within IDC’s Enabling Technologies team. DPUs are most broadly deployed by hyperscale cloud providers such as Amazon Web Services, Microsoft Azure and Google Cloud.

Hoff notes that DPUs are important for enterprises that want to build and use “AI factories,” standardized environments used to develop, deploy and manage AI applications at scale.

“AI factories need a scheduled fabric for the GPUs to talk to each other, and DPUs can do that,” he says. “That’s how a lot of the hyperscalers actually started doing DPUs, to build out scheduled fabrics.”

RELATED: What’s the difference between CPUs vs. GPUs?

What Are Some DPU Capabilities?

Security, networking, storage and AI analytics are the most common functions being addressed by DPUs, according to McGregor.

A key function of DPUs is to network GPUs together and allow them to transfer data more efficiently. There are only so many transistors that can fit on a piece of silicon in a GPU, Hoff notes.

“And so you connect these together in a scale-up network and a scale-out network, and DPUs are used in the scale-out network,” he says. GPUs will process information, and then stop and share that information, Hoff notes, and the time spent on that networking and data transfer can represent up to 50 percent of the total compute time, meaning they’re working only half as efficiently as they could.

DPUs offload that work from GPUs, allowing them to be more efficient. “So that’s why networking is important, and having a scheduled fabric where you have guaranteed delivery and guaranteed throughput is important,” Hoff says. “And DPUs help do that with standard, off-the-shelf switching.”
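Hoff’s 50 percent figure can be made concrete with a toy model (a hypothetical sketch, not from the article): treat wall-clock time as compute time plus whatever communication time the GPU cannot hide, and let a DPU-style offload hide some fraction of that communication.

```python
# Hypothetical illustration only; the function name and numbers are
# assumptions for the sketch, not anything vendors publish.

def effective_utilization(compute_time: float, comm_time: float,
                          offload_overlap: float = 0.0) -> float:
    """Fraction of wall-clock time the GPU spends computing.

    offload_overlap: fraction of communication hidden behind compute
    (e.g., by a DPU handling transfers asynchronously), from 0.0 to 1.0.
    """
    visible_comm = comm_time * (1.0 - offload_overlap)
    return compute_time / (compute_time + visible_comm)

# Hoff's worst case: networking takes as long as compute itself.
print(effective_utilization(1.0, 1.0))                      # 0.5
# If an offload engine hides 80% of that transfer time:
print(effective_utilization(1.0, 1.0, offload_overlap=0.8))
```

In this model, the unhidden half of the time yields the “working only half as efficiently” figure; hiding communication behind compute is exactly the kind of gain offload hardware targets.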

Pairing a DPU with standard switching via Ethernet allows enterprises to create a “high-performance fabric that’s nearly lossless,” Hoff says, meaning a network that can transfer data with almost no packet loss.
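Why “nearly lossless” matters can be shown with a simple retransmission model (an illustrative assumption, not a claim from the article): if every lost packet must be resent, and each attempt fails independently with probability p, the expected number of transmissions per packet is 1/(1 − p).

```python
# Geometric retransmission model; purely illustrative numbers.

def expected_sends(loss_rate: float) -> float:
    """Expected transmissions per delivered packet when each lost
    packet is retransmitted and each attempt fails with probability
    loss_rate (geometric distribution)."""
    return 1.0 / (1.0 - loss_rate)

print(round(expected_sends(0.001), 4))  # near-lossless: ~1.001
print(round(expected_sends(0.20), 4))   # lossy link: 1.25
```

At near-zero loss the retransmission tax is negligible, which is why a nearly lossless Ethernet fabric can sustain the guaranteed throughput Hoff describes.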

LEARN MORE: How can you use serverless computing to build applications?

DPUs vs. GPUs vs. CPUs: What’s the Difference?

DPUs are “completely different from CPUs and GPUs because they’re what are known as system-on-a-chip, application-specific integrated circuits, designed for a specific set of functions,” McGregor says.

CPUs are designed around a specific instruction set to be fully programmable for a wide variety of functions, McGregor says. They’re the backbone of modern computing and handle a broad range of computing tasks for operating systems and applications.

“GPUs were designed for graphics and multimedia processing but increasingly have AI-specific cores because their massive parallelization makes them good for AI processing,” McGregor says.

GPUs efficiently perform the kinds of calculations needed for inference in generative AI large language models, Hoff notes, saving organizations time and energy costs.

“The DPU is on the infrastructure side, where I can then accelerate the network” for scheduled fabrics, security, encryption and storage connectivity, Hoff says.

DIVE DEEPER: How automation can help IT leaders alleviate DevOps challenges.

What Are the Advantages of DPUs?

DPUs offload networking and storage functions from CPUs, allowing them to run more applications. “They’ll run the applications more seamlessly and without having to worry that something’s going to get tripped up in sending the data,” Hoff says.

The primary benefit of DPUs is for use in infrastructure; specifically, in data center servers. DPUs may also simplify data center architectures, Hoff says.

“For hybrid cloud, it makes it so that my application is on a server at bare-metal or near bare-metal speed,” he says. “So I don’t have to worry about that hypervisor anymore; I’ve offloaded a lot of hypervisor services.”

“Now, I can start really improving my hybrid cloud deployments, either on Amazon or within my data center or wherever I want to put those workloads,” Hoff adds.
