Data Processing Units Offload Security Tasks as Global AI Infrastructure Race Intensifies

NVIDIA's BlueField-3 chips now run FortiGate firewall software directly on silicon, enforcing zero-trust policies without taxing GPU compute resources. The architecture shift—moving security, networking, and storage to dedicated processors—mirrors data center design patterns emerging across hyperscale operators in North America, Europe, and Asia as enterprises convert legacy infrastructure to handle AI workloads.

Salvado

March 17, 2026

Image generated by AI for illustrative purposes. Not actual footage or photography from the reported events.

NVIDIA's BlueField-3 data processing units now execute FortiGate firewall software directly on the chip, enforcing zero-trust network policies at line rate while GPUs remain dedicated to AI computation.[1] The integration eliminates a bottleneck that has constrained global data center operators: traditional security architectures tax the same processors needed for AI training and inference.

Kevin Deierling, NVIDIA's senior vice president of networking, said AI factories require a new class of secure, accelerated infrastructure.[2] The design separates infrastructure services from computation by handling networking, storage, and security on dedicated silicon rather than sharing general-purpose CPUs—a shift already underway at hyperscale operators from North America to Asia.

Data center operators worldwide are converting existing facilities to support AI workloads. Milton Ault III projected 2026 as pivotal for hyperscale operations, citing organic expansion in AI infrastructure and the return of Ballista, a restructured data center asset.[3] After years of capital investment in infrastructure and software platforms, operators are now positioned to generate revenue at scale as enterprises across regions deploy AI systems.

The hardware approach addresses constraints affecting global operators: AI workloads demand massive parallel processing, but legacy security and networking architectures consume compute resources needed for model execution. Moving these functions to DPUs lets operators run security policies at full network speed while GPUs focus entirely on AI tasks.
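The division of labor described above can be illustrated with a toy sketch. The allow-list below is a purely hypothetical model of the zero-trust principle (default deny, explicit permits) that the article says the DPU enforces in hardware; the names, rules, and API are invented for illustration and are not NVIDIA BlueField, DOCA, or FortiGate interfaces.

```python
from dataclasses import dataclass

# Illustrative only: a zero-trust policy check of the kind a DPU would
# enforce at line rate, kept off the host CPUs and GPUs. All identities
# and rules here are hypothetical examples, not real product APIs.

@dataclass(frozen=True)
class Flow:
    src: str    # source workload identity
    dst: str    # destination workload identity
    port: int   # destination port

# Zero-trust default: deny everything not explicitly allowed.
ALLOWED_FLOWS = {
    ("training-job", "storage-node", 2049),    # e.g. NFS checkpoint traffic
    ("inference-svc", "model-registry", 443),  # e.g. TLS model-weight pulls
}

def permit(flow: Flow) -> bool:
    """Return True only for flows on the explicit allow-list."""
    return (flow.src, flow.dst, flow.port) in ALLOWED_FLOWS

print(permit(Flow("training-job", "storage-node", 2049)))  # True
print(permit(Flow("training-job", "inference-svc", 22)))   # False: default deny
```

In the architecture the article describes, this lookup happens on the DPU's dedicated silicon for every packet, so the GPUs never spend cycles on policy decisions.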

Enterprise software vendors building AI features depend on this infrastructure shift. Adobe and UiPath have integrated AI agents into their platforms, while products like AgentMail run autonomous systems requiring consistent, low-latency compute. These applications assume infrastructure can handle both heavy AI workloads and strict security requirements simultaneously—a challenge facing operators in every major market.

The architectural evolution mirrors earlier data center transitions, when storage and networking moved from software on servers to dedicated controllers. DPUs represent the next iteration: chips purpose-built to handle everything except application workloads, creating isolated security boundaries without performance overhead.


Sources:
[1] Yahoo Finance, "Crypto Currents: SEC, CFTC sign MOU for Joint Harmonization Initiative" (March 14, 2026)
[2] Yahoo Finance, "Fortinet Delivers Isolated Infrastructure Acceleration for the AI Factory with NVIDIA" (December 16, 2025)
[3] Milton Ault III, via Yahoo Finance
[4] Kevin Deierling, via Yahoo Finance


Tracking how AI changes money.