Confluent unveils platform to bridge AI context gap with real-time data
Confluent has announced Confluent Intelligence, a platform designed to address the lack of context in artificial intelligence systems by delivering real-time data streaming and integration capabilities.
The company has positioned Confluent Intelligence as a fully managed stack built on Confluent Cloud, aiming to provide organisations with the infrastructure required to launch and scale event-driven AI systems. The platform streams and processes both historical and real-time data and feeds this information directly into AI applications to enable more dependable, secure and scalable AI workloads.
Context challenges for AI
According to MIT's 'The State of AI in Business 2025' report, while enterprises are collectively investing between US$30 billion and US$40 billion in generative AI initiatives, 95% of these projects do not deliver a return on investment. The core issue identified is the absence of adequate context; large language models (LLMs) frequently lack access to the relevant events, relationships or meanings necessary for effective reasoning. This deficiency results in the need for multiple disconnected solutions, making AI workflows fragmented and challenging to operate at scale.
Jay Kreps, Co-founder and Chief Executive Officer at Confluent, commented,
"We started Confluent to take on one of the hardest problems in data: helping information move freely across a business so companies can act in real time. That same foundation uniquely positions Confluent to close the AI context gap. Off-the-shelf models are powerful, but without the continuous flow of data, they can't deliver decisions that are timely and uniquely valuable to a business. That's where data streaming becomes essential."
Capabilities of Confluent Intelligence
Confluent Intelligence is designed as a managed service for real-time, context-aware AI systems, leveraging open source technologies such as Apache Kafka and Apache Flink.
The platform provides a foundation for organisations to develop and deploy AI agents and applications, incorporating built-in governance, low-latency performance and full data replayability. These attributes are intended to help organisations accelerate their deployments from proof of concept to full-scale production.
Among its features, Confluent Intelligence includes the Real-Time Context Engine, which streams structured, reliable context to any AI agent or application, integrating with Kafka- and Flink-based systems or connecting externally via the Model Context Protocol (MCP).
This enables development teams to access real-time contextual data via a single service, without requiring direct interaction with Kafka or backend infrastructure.
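The core idea of such a context service — agents query current, structured context through one interface instead of reading Kafka topics themselves — can be illustrated with a toy facade. The class and method names below are invented for illustration and are not Confluent's API; in a real deployment the state would be continuously materialised from streams rather than updated in memory.

```python
class ContextEngine:
    """Toy stand-in for a real-time context service: it materialises a
    stream of events into the latest state per key, which agents query."""

    def __init__(self):
        self._state = {}

    def ingest(self, event):
        # In a real deployment this state would be derived from Kafka/Flink,
        # not pushed in by application code.
        self._state[event["customer_id"]] = event

    def get_context(self, customer_id):
        """What an agent would call instead of reading topics directly."""
        return self._state.get(customer_id)


engine = ContextEngine()
engine.ingest({"customer_id": "c42", "last_order": "2024-05-01", "status": "active"})
engine.ingest({"customer_id": "c42", "last_order": "2024-06-10", "status": "vip"})
print(engine.get_context("c42")["status"])  # latest state wins: "vip"
```

The point of the facade is the one in the article: the consuming team sees a single lookup service, not the backend streaming infrastructure.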
The platform also introduces Streaming Agents, allowing for the creation, deployment and orchestration of event-driven agents directly on Flink. This unifies the processes of data handling and AI reasoning, enabling agents to operate based on live data with minimal manual intervention. Streaming Agents is currently in open preview, while the Real-Time Context Engine is available in early access.
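The event-driven agent pattern described above — an agent consuming live events and acting on them with minimal manual intervention — can be sketched in plain Python. The list of events and the `price_alert_agent` function below are illustrative stand-ins for a Kafka topic and a Flink-hosted Streaming Agent; none of these names come from Confluent's product.

```python
def price_alert_agent(event, threshold=100.0):
    """Toy 'reasoning' step: flag events whose price crosses a threshold."""
    if event["price"] > threshold:
        return {"action": "alert", "sku": event["sku"], "price": event["price"]}
    return None


def run_agent(events_in):
    """Invoke the agent on each live event; collect the actions it emits."""
    actions = []
    for event in events_in:
        action = price_alert_agent(event)
        if action is not None:
            actions.append(action)
    return actions


# Simulated event stream (stand-in for a Kafka topic read by Flink).
stream = [{"sku": "A1", "price": 90.0}, {"sku": "B2", "price": 140.0}]
actions = run_agent(stream)
print(actions)  # only the B2 event crosses the threshold
```

In the hosted version the article describes, the loop, state and delivery guarantees would be handled by Flink rather than application code.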
Additionally, Confluent Intelligence offers built-in machine learning functions within Flink SQL, such as anomaly detection, forecasting, model inference and real-time visualisation. These capabilities are intended to simplify complex data science tasks and support faster decision-making across organisations. The machine learning features are generally available on Confluent Cloud.
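Confluent ships these capabilities as built-in functions in Flink SQL; as a conceptual illustration only, the sketch below shows one common streaming anomaly-detection technique (a rolling z-score over a sliding window) in plain Python. The function name, window size and threshold are all assumptions of this sketch, not Confluent's implementation.

```python
from collections import deque
from statistics import mean, stdev


def detect_anomalies(values, window=5, z_threshold=3.0):
    """Flag points that deviate sharply from a sliding window of recent history."""
    history = deque(maxlen=window)
    anomalies = []
    for i, v in enumerate(values):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            # A point is anomalous if it sits more than z_threshold
            # standard deviations from the recent mean.
            if sigma > 0 and abs(v - mu) / sigma > z_threshold:
                anomalies.append((i, v))
        history.append(v)
    return anomalies


# Steady sensor readings with one spike injected.
readings = [10.0, 10.2, 9.9, 10.1, 10.0, 10.1, 50.0, 10.0]
print(detect_anomalies(readings))  # [(6, 50.0)]
```

Expressing this kind of logic as a built-in SQL function, as the platform does, removes the need for teams to hand-roll windowing code like the above.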
User perspectives
Atilio Ranzuglia, Head of Data and AI at Palmerston North City Council, offered a perspective on the need for high-quality data streams in AI systems, stating,
"Good AI needs good data. Confluent is our trusted source of truth, streaming high-quality data to our data lakes and AI platforms to train models in real time. It provides context and orchestration for our agents to automate workflows, accelerating our smart city transformation."
Nithin Prasad, Senior Engineering Manager at GEP, commented on the application of real-time streaming in supply chain solutions:
"AI-powered procurement and supply chain use cases are at the core of what GEP does. Confluent helps make them possible by providing a data streaming platform that fuels our models with real-time streaming data and eliminates fear of data loss."
Collaboration with Anthropic
Confluent also announced a strengthened collaboration with Anthropic, making Anthropic's Claude the default LLM for Streaming Agents. Claude is now natively integrated into Confluent's data streaming platform. According to the announcement, this integration enables enterprises to build adaptive, context-rich AI systems that combine real-time data with the reasoning capabilities of Anthropic's models to support functions such as advanced anomaly detection and personalised experiences.