F5 to supercharge AI application delivery for service providers and enterprises with NVIDIA BlueField-3 DPUs

F5 BIG-IP Next for Kubernetes, F5’s new intelligent proxy, combined with NVIDIA BlueField-3 DPUs, transforms application delivery for AI workloads.

F5 has announced the availability of BIG-IP Next for Kubernetes, an innovative AI application delivery and security solution that equips service providers and large enterprises with a centralized point of control to accelerate, secure, and optimize data traffic into and out of large-scale AI infrastructures.

The solution leverages the power of high-performance NVIDIA BlueField-3 DPUs to improve data center traffic efficiency, which is critical for large-scale AI deployments. With an integrated view of networking, traffic management, and security, customers will be able to maximize the utilization of data center resources while achieving optimal performance of AI applications.

Not only does this improve infrastructure efficiency, but it also promotes faster and more responsive AI inference, ultimately delivering an improved AI-driven customer experience.

F5 BIG-IP Next for Kubernetes is a purpose-built solution for Kubernetes environments, proven in the cloud and 5G infrastructures of large telcos. With BIG-IP Next for Kubernetes, this technology is now adapted to key AI use cases such as inference, retrieval-augmented generation (RAG), and ongoing data management and storage. Integration with NVIDIA BlueField-3 DPUs minimizes hardware footprint, enables granular multi-tenancy, and optimizes power consumption while providing high-performance networking, traffic management, and security.

The combination of F5 and NVIDIA technologies enables mobile and fixed-line telecommunications service providers to ease the transition to cloud-native (Kubernetes) infrastructure, addressing the growing requirement for vendors to deliver their functions as cloud-native network functions (CNFs). F5 BIG-IP Next for Kubernetes offloads data-intensive tasks to BlueField-3 DPUs, freeing up CPU resources for revenue-generating applications. The solution is particularly beneficial at the network edge, for virtualized RAN (vRAN) or distributed access architecture (DAA) for MSOs, and in the 5G core network, with future potential for 6G.

Designed specifically for high-demand service providers and large-scale infrastructures, F5 BIG-IP Next for Kubernetes:

  • Simplifies cloud-scale AI service delivery: BIG-IP Next for Kubernetes integrates seamlessly with customers’ front-end networks, significantly reducing latency while delivering high-performance load balancing to handle the immense data demands of AI models with billions of parameters and trillions of operations.
  • Enhances control of AI deployments: The solution provides a centralized integration point into modern AI networks, with rich observability and refined insights. BIG-IP Next for Kubernetes supports multiple L7 protocols in addition to HTTP, giving enhanced control over inputs and outputs at very high performance.
  • Secures the new AI landscape: Customers can fully automate the discovery and security of AI training and inference endpoints (a generic discovery sketch follows this list). BIG-IP Next for Kubernetes also isolates AI applications from targeted threats, enforcing data integrity and sovereignty while providing the encryption capabilities critical to modern AI environments.
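
The automated discovery of inference endpoints mentioned above can be pictured with a minimal, generic sketch. The Python snippet below uses the official Kubernetes client library to enumerate Services carrying an inference label; the label key and value (ai.example.com/workload=inference) and the surrounding logic are illustrative assumptions, not F5's actual discovery mechanism.

# Illustrative sketch only: enumerate AI inference endpoints in a Kubernetes
# cluster by label. The label below is an assumption, not part of
# F5 BIG-IP Next for Kubernetes.
from kubernetes import client, config

def discover_inference_services(label_selector="ai.example.com/workload=inference"):
    """Return (namespace, name, ports) for every Service matching the label."""
    config.load_kube_config()  # use config.load_incluster_config() when running in a pod
    core = client.CoreV1Api()
    services = core.list_service_for_all_namespaces(label_selector=label_selector)
    endpoints = []
    for svc in services.items:
        ports = [p.port for p in (svc.spec.ports or [])]
        endpoints.append((svc.metadata.namespace, svc.metadata.name, ports))
    return endpoints

if __name__ == "__main__":
    for namespace, name, ports in discover_inference_services():
        print(f"{namespace}/{name} -> ports {ports}")

In a real delivery and security layer, an inventory like this would feed traffic-management and protection policies rather than a simple printout.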

Ash Bhalgat, Senior Director of AI Networking and Security Partnerships at NVIDIA, said: “Service providers and enterprises require accelerated computing to deliver high-performance AI applications securely and efficiently at cloud scale. NVIDIA is working with F5 to accelerate the delivery of AI applications, helping ensure maximum efficiency and seamless user experiences powered by BlueField-3 DPUs.”
