Update: With the 2.1 release of F5 BIG-IP Next Cloud-Native Network Functions (CNFs) in August 2025, F5 CNFs are now officially supported on NVIDIA BlueField-3 DPUs. This milestone expands our collaboration with NVIDIA and makes the performance, efficiency, and security benefits of DPU-accelerated CNFs available more broadly to service providers.
Over the past several years, service providers have been building out their infrastructures to prepare for the world of 5G. As 5G converges with the emerging AI era, these investments position telcos to drive growth by delivering accelerated AI services at the edge.
To help service providers achieve higher ROI on their infrastructures, F5 and NVIDIA have teamed up to accelerate F5 BIG-IP Next Cloud-Native Network Functions (CNFs) with NVIDIA BlueField-3 DPUs.
BIG-IP Next CNFs combined with NVIDIA BlueField-3 DPUs enable customers to leverage F5’s proven network infrastructure capabilities, including edge firewall, DNS, policy enforcement, and distributed denial-of-service (DDoS) protection, refactored into lightweight cloud-native functions, while offloading and accelerating critical processing on the DPU to improve network performance and security. As a result, operators optimize their computing resources while reducing power consumption per Gbps for edge AI services.
Our solution enables service providers and telecoms to manage demanding 5G and AI workloads—like AI-RAN—without overburdening CPU resources. By combining F5’s cloud-native traffic management and comprehensive security services with NVIDIA’s hardware acceleration capabilities, we’re helping our customers to reduce their total cost of ownership, accelerate the rollout of new services, and future-proof their infrastructures.
Leveraging underutilized RAN capacity for AI applications
At last year’s Mobile World Congress, NVIDIA, along with other telecom leaders, established the AI-RAN Alliance with the goal of weaving AI into the radio access network (RAN) through shared AI and RAN infrastructure, AI-based signal processing, and AI-on-RAN applications. NVIDIA AI Aerial provides a portfolio of training, simulation, and deployment platforms for AI-RAN, paving the way for AI-native wireless networks, including future 6G.
One of the opportunities with AI-RAN is to run both AI and RAN workloads on a unified, accelerated compute infrastructure, in contrast to the bespoke 5G RAN systems that sit underutilized today. While doing so could open new revenue streams, it’s difficult to achieve without compromising latency and security. BIG-IP Next CNFs, offloaded and accelerated on NVIDIA BlueField-3 DPUs (a key component of the NVIDIA AI-RAN reference architecture), simplify this effort by delivering the necessary traffic management and security with minimal latency for both 5G RAN and AI workloads over the same unified infrastructure.
Increasing resource efficiency at the edge
As more telcos move to a distributed user plane function (UPF) architecture to process data closer to the edge, they struggle to minimize space and power consumption while implementing security and traffic management services on the N6-LAN at these distributed UPF locations.
Our solution benefits telcos by increasing the resource efficiency of these edge and far-edge deployments. In the past, telcos used dedicated appliances or virtual machines for network functions such as firewall, DDoS protection, DNS, and traffic management, consuming the limited space and power typically available at edge locations.
Deploying BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs enables service providers to significantly enhance operational efficiency while reducing cost. A notable example is a mobile provider that transitioned from virtual network functions (VNFs) to cloud-native network functions (CNFs), achieving a 31% reduction in vCPU usage. This shift translates into substantial cost savings for large mobile networks and unlocks additional capacity for high-value, revenue-generating telco services. By further offloading CNFs to BlueField, service providers can maximize vCPU availability, leading to even greater resource optimization and cost efficiency.
Addressing edge-related security gaps
As telcos increasingly deploy enterprise and AI applications at the edge to improve the customer experience and process data closer to the source, zero trust security becomes essential. However, a fragmented approach to network security can lead to blind spots and coverage gaps, increasing the risk of exploitable vulnerabilities.
To ensure robust protection, telcos must adopt a comprehensive, end-to-end security strategy that eliminates these risks. By deploying BIG-IP Next CNFs on NVIDIA BlueField-3 DPUs, service providers gain access to carrier-grade network functions, including F5’s firewall, DDoS protection, carrier-grade NAT (CGNAT), and policy enforcement, in a single, cloud-native solution that leverages the zero trust capabilities of BlueField. This integrated approach ensures robust, zero trust security for edge AI and 5G environments, enhancing both network protection and operational efficiency.
Unlocking the benefits of the AI era
This solution builds on our collaboration with NVIDIA over the past several months. In October, our two companies introduced BIG-IP Next for Kubernetes deployed on NVIDIA BlueField-3 DPUs, designed to help enterprises and service providers manage the complexity of their large-scale AI factories and infrastructure.
This is just the beginning. Be sure to check back as we continue to develop the innovative solutions our customers need to thrive in the AI era.
F5 is a silver sponsor of NVIDIA GTC, March 17-21, 2025, in San Jose, California. Learn more about F5 at NVIDIA GTC on our event landing page, and be sure to read our press release.