Docker, in partnership with Neo4j, LangChain, and Ollama, has launched a new resource for developers seeking to build generative artificial intelligence (AI) capabilities. Titled GenAI Stack, the offering equips developers to begin creating generative AI applications in minutes. This integrated, secure, ready-to-code stack removes the need to search for, combine, and configure the relevant technologies from disparate sources.
The GenAI Stack bundles large language models (LLMs) from Ollama, vector and graph database capabilities from Neo4j, and the LangChain orchestration framework. Available now in Docker Desktop's Learning Centre and in the official repository, this pre-configured stack is designed to address common GenAI scenarios using trusted open-source content from Docker Hub.
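Because the stack is distributed as a pre-configured Docker project, the whole environment can come up together rather than being assembled piece by piece. The sketch below shows what a Compose file wiring these components together might look like; the service names, images, ports, and credentials are illustrative assumptions, not the stack's actual configuration:

```yaml
# Illustrative sketch only -- service names, images, ports, and
# credentials are assumptions, not the GenAI Stack's real config.
services:
  llm:
    image: ollama/ollama:latest      # serves local LLMs
    ports:
      - "11434:11434"
  database:
    image: neo4j:5                   # graph database with native vector search
    ports:
      - "7474:7474"                  # browser UI
      - "7687:7687"                  # Bolt protocol
    environment:
      - NEO4J_AUTH=neo4j/password    # placeholder credentials
  app:
    build: .                         # LangChain-based application code
    depends_on:
      - llm
      - database
```

With a file along these lines, a single `docker compose up` starts the model server, the database, and the application together.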
"Today's announcement eliminates this dilemma by enabling developers to get started quickly and safely using the Docker tools, content, and services they already know and love, together with partner technologies on the cutting edge of GenAI app development," said Docker CEO Scott Johnston. Additional announcements extend the platform's capabilities, content, and partnerships for the swift and secure utilisation of AI/ML in developer applications.
James Governor, Principal Analyst and Co-Founder of RedMonk, observed, "Everything changed this year, as AI went from being a field for specialists to something that many of us use every day. The GenAI Stack that Docker, Neo4j, LangChain, and Ollama are collaborating to offer provides the kind of consistent unified experience that makes developers productive with new tools and methods."
Neo4j CEO and co-founder Emil Eifrem noted that the driving force behind the collaboration was a shared goal of making it easy for developers to build GenAI-backed applications. Harrison Chase, CEO and co-founder of LangChain, echoed Eifrem's sentiment, calling the resource a significant step towards reducing the development work needed to realise the potential of GenAI. Ollama's creator, Jeffrey Morgan, said he looked forward to seeing Docker's developer community build advanced AI-focused applications with the stack.
The GenAI Stack ships with LLMs, knowledge graphs, and vector and knowledge-graph integrations that make it straightforward to generate responses in diverse formats. From summarising data and demonstrating knowledge retrieval over knowledge graphs to effortless data loading and vector index population, the tool is a significant stride towards enhancing developer capabilities.
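The retrieval pattern behind "vector index population" can be sketched in plain Python. The toy snippet below indexes a few documents under hand-made vectors and retrieves the closest one by cosine similarity; in the real stack these steps are handled by an embedding model, Neo4j's vector index, and LangChain's abstractions, and the three-dimensional "embeddings" here are made-up stand-ins for real model output:

```python
import math

# Toy "embedding" vectors standing in for output of a real embedding model.
DOCS = {
    "Neo4j is a graph database with native vector search.":  [0.9, 0.1, 0.2],
    "Ollama runs large language models locally.":            [0.1, 0.9, 0.3],
    "LangChain helps build context-aware LLM applications.": [0.2, 0.3, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def retrieve(query_vec, index, k=1):
    """Return the k documents whose vectors are closest to the query."""
    ranked = sorted(index, key=lambda doc: cosine(query_vec, index[doc]),
                    reverse=True)
    return ranked[:k]

# A query vector close to the "graph database" document.
hits = retrieve([0.8, 0.2, 0.1], DOCS)
print(hits[0])
```

The stack automates exactly this loop at scale: load data, embed it, populate a vector index, then retrieve the most relevant context for each question.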
The developments were unveiled at the DockerCon 2023 conference and are part of Docker's ongoing commitment to helping developers build, share, and run applications. A leader in graph database and analytics, Neo4j delivers powerful native graph storage with built-in vector search capability. LangChain, meanwhile, is an open-source framework that helps developers build LLM-powered, context-aware reasoning applications.
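The "context-aware" idea at the heart of LangChain, combining retrieved context with the user's question before calling the model, can be illustrated without the library itself. The template and function names below are illustrative inventions, not LangChain's actual API:

```python
# Illustrative prompt assembly; names are hypothetical, not LangChain's API.
PROMPT_TEMPLATE = (
    "Answer the question using only the context below.\n"
    "Context: {context}\n"
    "Question: {question}\n"
)

def build_prompt(context_docs, question):
    """Assemble a retrieval-augmented prompt from retrieved documents."""
    context = " ".join(context_docs)
    return PROMPT_TEMPLATE.format(context=context, question=question)

prompt = build_prompt(
    ["Neo4j is a graph database with native vector search."],
    "What kind of database is Neo4j?",
)
print(prompt)
```

A framework like LangChain wraps this assembly step, the retrieval step, and the model call into a single reusable chain.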