
MongoDB boosts AI app reliability with new models & partners
MongoDB has announced a series of product enhancements and AI partner ecosystem expansions aimed at enabling customers to build reliable AI applications at scale, following its acquisition of Voyage AI earlier this year.
The updates allow customers to pair Voyage AI's latest embedding and reranking models with MongoDB's database infrastructure. The models introduce context awareness and, the company says, set new accuracy benchmarks at favourable price-performance ratios.
Andrew Davidson, Senior Vice President of Products at MongoDB, said,
"Databases are more central than ever to the technology stack in the age of AI. Modern AI applications require a database that combines advanced capabilities - like integrated vector search and best-in-class AI models - to unlock meaningful insights from all forms of data (structured, unstructured), all while streamlining the stack. These systems also demand scalability, security, and flexibility to support production applications as they evolve and as usage grows. By consolidating the AI data stack and by building a cutting-edge AI ecosystem, we're giving developers the tools they need to build and deploy trustworthy, innovative AI solutions faster than ever before."
According to the company, approximately 8,000 startups - including Laurel and Mercor - have chosen MongoDB as the foundation for their AI projects in the past 18 months. Additionally, more than 200,000 new developers register for MongoDB Atlas each month, highlighting significant adoption across the developer community.
Product highlights
The newly released Voyage AI models include voyage-context-3, which produces context-aware embeddings for improved data retrieval, and the general-purpose voyage-3.5 and voyage-3.5-lite, which focus on higher retrieval quality and better price-performance. The rerank-2.5 and rerank-2.5-lite models offer instruction-following reranking to improve the accuracy of results across benchmarks.
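The embed-then-rerank pattern these models serve can be illustrated with a self-contained toy sketch. The hard-coded vectors below merely stand in for real model output (in practice an embedding model such as voyage-3.5 would produce them, and a reranker such as rerank-2.5 would refine the ordering); cosine similarity drives the first-pass retrieval:

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product normalised by vector magnitudes.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings"; a real model returns ~1,000-dimensional vectors.
query_vec = [0.9, 0.1, 0.0]
docs = {
    "doc_a": [0.8, 0.2, 0.1],
    "doc_b": [0.1, 0.9, 0.3],
    "doc_c": [0.7, 0.0, 0.2],
}

# First-pass retrieval: rank documents by similarity to the query embedding.
ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
print(ranked)
```

In a production pipeline the top-k results from this stage would then be passed, together with the query text, to a reranking model for a more accurate final ordering.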
Fred Roma, Senior Vice President of Engineering at MongoDB, commented,
"Many organisations struggle to scale AI because the models themselves aren't up to the task. They lack the accuracy needed to delight customers, are often complex to fine-tune and integrate, and become too expensive at scale. The quality of your embedding and reranking models is often the difference between a promising prototype and an AI application that delivers meaningful results in production. That's why we've focused on building models that perform better, cost less, and are easier to use - so developers can bring their AI applications into the real world and scale adoption."
MongoDB has also introduced the Model Context Protocol (MCP) Server, now in public preview. The server standardises the connection between MongoDB deployments and widely used development tools, including GitHub Copilot, Anthropic's Claude, Cursor, and Windsurf, letting developers manage database operations in natural language to accelerate workflows and shorten deployment timelines.
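As a rough illustration of how an MCP server is wired into an MCP-capable client, a JSON configuration along these lines is typical; the package name, environment variable, and placeholder connection string here are assumptions for illustration, not confirmed details of MongoDB's release:

```json
{
  "mcpServers": {
    "mongodb": {
      "command": "npx",
      "args": ["-y", "mongodb-mcp-server"],
      "env": {
        "MDB_MCP_CONNECTION_STRING": "mongodb+srv://<user>:<password>@<cluster-host>/"
      }
    }
  }
}
```

Once registered, the client exposes the server's tools to the model, so a prompt like "list the collections in the sales database" can be translated into the corresponding database operations.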
AI partner ecosystem
As part of the expanded ecosystem, Galileo, an AI reliability and observability platform, and Temporal, an open-source Durable Execution platform, have joined MongoDB's partner network.
Vikram Chatterji, CEO and co-founder at Galileo, stated,
"As organisations bring AI applications and agents into production, accuracy and reliability are of paramount importance. By formally joining MongoDB's AI ecosystem, MongoDB and Galileo will now be able to better enable customers to deploy trustworthy AI applications that transform their businesses with less friction."
Maxim Fateev, CTO at Temporal, said,
"Building production-ready agentic AI means enabling systems to survive real-world reliability and scale challenges, consistently and without fail. Through our partnership with MongoDB, Temporal empowers developers to orchestrate durable, horizontally scalable AI systems with confidence, ensuring engineering teams build applications their customers can count on."
MongoDB's partnership with LangChain focuses on streamlining AI workflows, introducing features such as GraphRAG for greater transparency in data retrieval and natural language querying that lets agentic applications interact with data directly. These capabilities are designed to help developers build advanced retrieval-augmented generation (RAG) systems and autonomous agents that work with MongoDB data.
Harrison Chase, CEO and Co-founder at LangChain, said,
"As AI agents take on increasingly complex tasks, access to diverse, relevant data becomes essential. Our integrations with MongoDB, including capabilities like GraphRAG and natural language querying, equip developers with the tools they need to build and deploy complex, future-proofed agentic AI applications grounded in relevant, trustworthy data."
Industry analysts have noted the increasing importance of integrated data solutions in AI development. Jason Andersen, Vice President and Principal Analyst at Moor Insights & Strategy, commented,
"As more enterprises deploy and scale AI applications and agents, the demand for accurate outputs and reduced latency keeps increasing. By thoughtfully unifying the AI data stack with integrated advanced vector search and embedding capabilities in their core database platform, MongoDB is taking on these challenges while also reducing complexity for developers."
These new models and expanded partnerships are positioned to address the issues of complexity, accuracy and scalability that many organisations face when implementing AI solutions.