ChannelLife Australia - Industry insider news for technology resellers

New Relic adds Model Context Protocol support to AI monitoring

Yesterday

New Relic has announced support for the Model Context Protocol (MCP) within its AI Monitoring solution, now fully integrated with its Application Performance Monitoring (APM) suite.

The updates provide enterprises with extended observability into agentic AI systems using MCP servers, allowing teams to monitor AI application performance as part of their overall technology stack. Integration of MCP support within the AI Monitoring platform is designed to help developers and MCP service providers gain clarity on the internal workings of AI agents and the tools they employ, with a focus on improving reliability and efficiency.

Expanded observability

As enterprises adopt agentic AI and MCP emerges as a common protocol, customers have called for greater transparency into how AI agents interact with the many tools and services they depend on.

MCP servers can present visibility challenges, as their processes are often opaque, complicating efforts to assess which resources are used by agents, as well as tracking performance and errors.

According to New Relic, these challenges have previously required manual intervention and custom instrumentation to extract actionable insights. The company's new MCP integration aims to address these issues by surfacing detailed analytics and reducing the operational burden on engineering and operations teams.

New Relic Chief Technology Officer Siva Padisetty commented on the significance of the update. He said,

"Since it was released last year, MCP has quickly become the standard protocol for agentic AI. Once again meeting our customers where and how they work, our new MCP integration is a game-changer for anyone building or operating AI systems that rely on this protocol. We've moved beyond siloed LLM monitoring to demystify MCP, connecting insights from AI interactions directly with the performance of the entire application stack for a holistic view. All this is offered as an integral part of our industry leading APM technology."

Monitoring capabilities

The integration provides a range of monitoring features intended to simplify and expand MCP observability. This includes automatic tracing of MCP requests to reveal specific usage patterns, such as which tools were invoked, the sequence of calls and the duration of each process. Proactive optimisation tools allow teams to evaluate and refine agent behaviour by analysing tool choices, measuring latency, identifying errors, and tracking performance across prompts and requests.
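Conceptually, the usage patterns described above — which tool was invoked, in what order, and for how long — are the kind of telemetry a tracing layer records per call. The sketch below is an illustrative stand-in using only the Python standard library, not New Relic's actual instrumentation; `traced_tool`, `trace_log`, and the sample tools are hypothetical names:

```python
import time
from functools import wraps

# Hypothetical trace store: one entry per tool invocation, capturing
# the tool name, its position in the call sequence, and its duration.
trace_log = []

def traced_tool(func):
    """Wrap a tool function so each call is recorded in trace_log."""
    @wraps(func)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return func(*args, **kwargs)
        finally:
            trace_log.append({
                "tool": func.__name__,
                "sequence": len(trace_log) + 1,
                "duration_ms": (time.perf_counter() - start) * 1000,
            })
    return wrapper

# Sample "MCP tools" an agent might invoke in sequence.
@traced_tool
def search_docs(query):
    return f"results for {query}"

@traced_tool
def summarise(text):
    return text[:20]

summarise(search_docs("MCP observability"))
for entry in trace_log:
    print(entry["sequence"], entry["tool"])
```

Real MCP tracing works at the protocol layer rather than via decorators, but the recorded dimensions (tool identity, call order, latency, errors) are the same ones the article describes.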

The context provided by the monitoring system extends beyond just the MCP layer, enabling users to correlate the protocol's performance with other application components, such as databases, microservices, and message queues, through a unified dashboard.

New Relic's report, AI Unwrapped: 2025 AI Impact Report, notes that usage of its AI Monitoring product has been increasing by 30% quarter-over-quarter over the past year, reflecting continued enterprise adoption as more organisations migrate critical AI workloads into production environments. These customers are, according to the report, prioritising solutions that address concerns over model reliability, output accuracy, regulatory compliance, and operational cost.

Cost model and availability

The company's AI Monitoring, including the new MCP support, continues to use a usage-based pricing model, charging customers by consumption rather than by number of users. New Relic says this approach is intended to align cost with tangible business outcomes.

The current release adds MCP support in version 10.13.0 of the Python agent, with further language support in development and expected in upcoming releases.
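For reference, attaching the New Relic Python agent to an application follows the standard install flow shown below; the version pin, licence-key placeholder, and `app.py` are illustrative, and MCP tracing availability in this path is an assumption based on the version noted above:

```shell
# Install the agent (MCP support is noted as arriving in 10.13.0)
pip install 'newrelic>=10.13.0'

# Generate a config file with your licence key (placeholder shown)
newrelic-admin generate-config YOUR_LICENSE_KEY newrelic.ini

# Run the application under the agent wrapper
NEW_RELIC_CONFIG_FILE=newrelic.ini newrelic-admin run-program python app.py
```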

The MCP integration is aimed primarily at enterprise users looking to reduce manual troubleshooting and improve reliability across AI-powered applications, and is positioned as part of New Relic's broader intelligent observability platform.
