ChannelLife Australia - Industry insider news for technology resellers

AMD, Nutanix ally on open AI stack in USD $250m deal

Thu, 26th Feb 2026

AMD and Nutanix have entered a multi-year partnership to develop an open, full-stack AI infrastructure platform for agentic AI applications. The deal also includes an equity investment and joint funding for engineering and commercial work.

AMD plans to purchase USD $150 million of Nutanix shares at USD $36.26 per share. It will also commit up to USD $100 million for joint engineering initiatives and go-to-market collaboration tied to integrated products.

The first jointly developed platform is expected to reach the market in late 2026. The roadmap targets enterprise and service provider deployments across data centre, hybrid, and edge environments.

Funding and equity

The planned stock purchase creates a direct financial link between the chipmaker and the infrastructure software provider. The investment is expected to close in the second quarter of 2026, subject to regulatory approvals and customary closing conditions.

The additional USD $100 million funding commitment is separate from the share purchase. It is intended to support joint engineering and go-to-market work, with an emphasis on driving adoption of the agentic AI platform on AMD accelerated compute infrastructure.

Platform roadmap

The partnership focuses on optimising the Nutanix Cloud Platform and Nutanix Kubernetes Platform for AMD EPYC CPUs and AMD Instinct GPUs. It also includes plans to integrate AMD ROCm and AMD Enterprise AI software into Nutanix AI full-stack offerings.

The design brief calls for inference acceleration using AMD Instinct GPUs and EPYC CPUs, along with high-core-density compute and orchestration based on AMD EPYC processors. Nutanix Enterprise AI is intended to provide unified lifecycle management.

The approach is positioned as a way to reduce reliance on vertically integrated AI stacks, while giving enterprises the option to run open-source and commercial AI models on a single platform.

OEM support

A broad set of OEM server providers is expected to support the joint solution. Partners, product SKUs, and initial reference architectures were not disclosed. The OEM component was framed as part of an open AI ecosystem combining silicon, runtime software, and orchestration.

The announcement comes as enterprises assess where to run inference workloads and how to manage them alongside existing applications. The companies argued that inference is taking a larger share of AI workloads as organisations move from experimentation to production.

Competitive context

AMD has expanded its AI portfolio across CPUs, GPUs, networking, and software. It has promoted ROCm as an open GPU compute stack for developers and customers seeking alternatives in AI infrastructure.

Nutanix sells software for running applications and managing data across hybrid environments. Its portfolio spans virtualised infrastructure and Kubernetes management, which many enterprises use to support a mix of traditional and cloud-native workloads.

The partnership moves Nutanix closer to the underlying compute layer as demand grows for tightly integrated AI systems. It also gives AMD another path to enterprise infrastructure buyers looking for standardised operations across on-premises and cloud environments.

Dan McNamara, Senior Vice President and General Manager of Compute and Enterprise AI at AMD, said customers want flexibility in model choice.

"Enterprise customers need the freedom to run the models and workloads that matter most to their business, without compromise," McNamara said. "Through our partnership with Nutanix we're building a scalable, full-stack AI platform rooted in openness, designed to give enterprises and service providers the flexibility to innovate, extend and grow AI deployments across enterprises."

Tarkan Maner, President and Chief Commercial Officer at Nutanix, said the two companies share a view on the infrastructure requirements for AI in production.

"Our partnership with AMD reflects a shared vision for scalable, production-ready AI infrastructure," Maner said. "Together, we are delivering full-stack, integrated platforms optimised for inference and agentic applications across hybrid environments for enterprises and service providers."