F5 collaborates with Intel to boost AI service security & delivery

Tue, 3rd Sep 2024

F5 has announced a collaboration with Intel to enhance the security and delivery of AI services.

The new joint solution merges the security and traffic management capabilities of F5 NGINX Plus with the Intel Distribution of OpenVINO toolkit and Intel Infrastructure Processing Units (IPUs) to provide improved protection, scalability, and performance for advanced AI inference.

As more organisations adopt AI for intelligent applications and workflows, the need for efficient and secure AI inference is becoming increasingly important. Kunal Anand, Chief Technology Officer at F5, stated, “Teaming up with Intel empowers us to push the boundaries of AI deployment. This collaboration highlights our commitment to driving innovation and delivers a secure, reliable, and scalable AI inference solution that will enable enterprises to securely deliver AI services at speed. Our combined solution ensures that organisations can harness the power of AI with superior performance and security.”

The solution integrates the OpenVINO toolkit, which optimises and accelerates AI model inference, with F5 NGINX Plus, which provides robust traffic management and security. F5 NGINX Plus serves as a reverse proxy, handling traffic, ensuring high availability, and conducting active health checks. Additionally, it facilitates SSL termination and mTLS encryption, thereby securing communications between applications and AI models while maintaining performance.
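
The announcement does not include configuration details, but the general pattern can be sketched briefly: a client presents its certificate to an mTLS-terminating reverse proxy (here, NGINX Plus) and sends an inference request that the proxy forwards to the model server. The hostname, certificate file names, and endpoint path below are illustrative assumptions, not details supplied by F5 or Intel.

```python
# Minimal sketch of a client calling an AI model behind an mTLS-terminating
# reverse proxy. Hostnames, cert/key paths, and the endpoint path are
# placeholders (assumptions), not values from the F5/Intel announcement.
import json

import requests  # third-party: pip install requests

# Assumed proxy endpoint; NGINX Plus terminates TLS/mTLS here and proxies
# the request on to the model-serving backend.
PROXY_URL = "https://ai-gateway.example.com/v1/models/example:predict"

payload = {"instances": [[0.0] * 10]}  # placeholder input

response = requests.post(
    PROXY_URL,
    json=payload,
    cert=("client.crt", "client.key"),  # client certificate and key for mTLS (assumed filenames)
    verify="ca.pem",                    # CA bundle trusted to verify the proxy's server certificate
    timeout=10,
)
response.raise_for_status()
print(json.dumps(response.json(), indent=2))
```

Terminating TLS and mTLS at the proxy keeps certificate handling out of the model-serving tier, which reflects the separation of duties described above: NGINX Plus handles traffic and security, while the model server focuses on inference.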

The OpenVINO toolkit simplifies the optimisation of models from nearly any framework, enabling a write-once, deploy-anywhere approach. This is crucial for developers aiming to create scalable and efficient AI solutions with minimal code modifications. F5 NGINX Plus complements this by securing and load-balancing access to the deployed models, helping them run reliably and safely in production.
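
The write-once, deploy-anywhere flow can be illustrated with a minimal OpenVINO Python sketch: read a model exported from another framework, compile it for whichever device is available, and run inference. The model file name, input shape, and device choice below are placeholders rather than details from the announcement.

```python
# Minimal sketch of the write-once, deploy-anywhere flow using the
# OpenVINO Python API (openvino >= 2023). File names and shapes are
# placeholders, not specifics from the F5/Intel announcement.
import numpy as np
import openvino as ov

core = ov.Core()

# Read a model exported from another framework (e.g. an ONNX file);
# OpenVINO also reads its own IR (.xml/.bin) format.
model = core.read_model("model.onnx")

# "AUTO" lets the runtime pick the best available device; switching to
# "CPU", "GPU", etc. requires no changes to the surrounding code.
compiled = core.compile_model(model, "AUTO")

# Run a single inference with placeholder input data.
dummy_input = np.random.rand(1, 3, 224, 224).astype(np.float32)
result = compiled(dummy_input)[compiled.output(0)]
print(result.shape)
```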

Intel IPUs further bolster performance by offloading infrastructure services from the host CPU, freeing up resources for AI model servers. These IPUs efficiently manage infrastructure tasks, thereby enhancing the scalability and performance of both NGINX Plus and OpenVINO Model Servers (OVMS).

This integrated solution is especially advantageous for edge applications such as video analytics and the Internet of Things (IoT), where low latency and high performance are critical. Running NGINX Plus on the Intel IPU ensures rapid and reliable responses, making the solution ideal for content delivery networks and distributed microservices deployments.

Pere Monclus, Chief Technology Officer of Intel's Network and Edge Group, commented, “Leveraging the cutting-edge infrastructure acceleration of Intel IPUs and the OpenVINO toolkit alongside F5 NGINX Plus can help enable enterprises to realise innovative AI inference solutions with improved simplicity, security, and performance at scale for multiple vertical markets and workloads.”

The solution, now available, marks a significant step towards empowering enterprises to deploy AI services with enhanced security and performance.
