Data Processing Units (yes, that is what DPU stands for…not GPU) have become a hot topic within the past few years, right around when Nvidia started shipping its ConnectX SmartNICs. The new Nvidia BlueField lineup is not only able to offload networking functions, it can take over a lot of other tasks as well! Need 200, 400 or 800 Gb/s speeds? Pick the BlueField-2, -3 or -4 model accordingly.
What do I mean by that, and why should you even care? Well, everyone wants to squeeze that last bit of performance out of their hardware to improve the user experience, right? What if you could use a DPU to run its own OS and offload security functions such as traffic/packet inspection, micro-segmentation and DDoS protection? Or how about software-defined storage tasks such as encryption, erasure coding, deduplication, data-integrity checks and compression/decompression, all at lower latency? Basically, it reserves the host's compute resources (CPU and memory) exclusively for the workloads (virtual machines, containers, applications) running on it.
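As a toy illustration of what "the DPU shows up as its own device" means in practice, here is a minimal sketch that filters `lspci`-style output for BlueField entries. This is not vendor tooling (no DOCA or SDK involved), and the sample device lines are invented for the example:

```python
# Toy sketch: spot Nvidia BlueField DPU functions in lspci-style output.
# The sample text below is made up for illustration, not captured from real hardware.

def find_dpus(lspci_output: str) -> list[str]:
    """Return the lspci lines that look like BlueField DPU functions."""
    return [
        line for line in lspci_output.splitlines()
        if "BlueField" in line
    ]

sample = """\
00:1f.2 SATA controller: Intel Corporation Cannon Lake PCH SATA AHCI Controller
3b:00.0 Ethernet controller: Mellanox Technologies MT42822 BlueField-2 integrated ConnectX-6 Dx
3b:00.1 Ethernet controller: Mellanox Technologies MT42822 BlueField-2 integrated ConnectX-6 Dx"""

for line in find_dpus(sample):
    print(line)
```

On a real host you would feed it the output of `lspci` itself; the point is simply that the DPU and its Arm subsystem appear as PCIe devices alongside everything else.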
In the enterprise storage market, some companies are already taking advantage of these features. Pure Storage has been leveraging this same concept since the FlashBlade chassis came out in 2016 (each blade is a controller that handles its own processes, so performance grows linearly), and there are newcomers out there like Fungible, founded specifically with this technology in mind.
The sky is the limit, and even though I do not have an immediate DPU use case today, I am looking forward to hearing plenty of success stories from others once they start taking advantage of this new and cool technology!
Nvidia BlueField-3 datasheet specs here