Multi-year Strategic Collaboration Agreement includes integration with Amazon Bedrock for enterprise generative AI outcomes that are more accurate, transparent and explainable.
Neo4j, one of the world’s leading graph database and analytics companies, announced a multi-year Strategic Collaboration Agreement (SCA) with Amazon Web Services (AWS). The agreement enables enterprises to achieve better generative Artificial Intelligence (AI) outcomes through a unique combination of knowledge graphs and native vector search, which reduces generative AI hallucinations and makes results more accurate, transparent and explainable.
This addresses a common problem for developers: giving large language models (LLMs) long-term memory that is grounded in their specific enterprise data and domains.
Neo4j also announced the general availability of Neo4j Aura Professional, the company’s fully managed graph database offering, in AWS Marketplace, enabling a frictionless, fast-start experience for developers building generative AI applications. AWS Marketplace is a digital catalogue with thousands of software listings from independent software vendors that make it easy to find, test, buy and deploy software that runs on AWS.
Neo4j is a leading graph database with native vector search that captures both explicit and implicit relationships and patterns. Neo4j is also used to create knowledge graphs, enabling AI systems to reason, infer and retrieve relevant information effectively. These capabilities enable Neo4j to act as an enterprise database that grounds LLMs and serves as their long-term memory, yielding more accurate, explainable and transparent outcomes for LLMs and other generative AI systems.
With today’s announcement, Neo4j is releasing a new native integration with Amazon Bedrock, a fully managed service that makes foundation models from leading AI companies accessible via an API to build and scale generative AI applications. The integration enables the following benefits:
- Reduced hallucinations: Neo4j, LangChain and Amazon Bedrock can now work together using Retrieval Augmented Generation (RAG) to create virtual assistants grounded in enterprise knowledge. This helps customers by reducing hallucinations and providing more accurate, transparent and explainable results (see the first sketch after this list).
- Personalised experiences: Integrating Neo4j’s context-rich knowledge graphs with Amazon Bedrock gives applications access to a rich ecosystem of foundation models that deliver highly personalised text generation and summarisation for end users.
- Complete answers during real-time search: Developers can leverage Amazon Bedrock to generate vector embeddings from unstructured data (text, images and video) and enrich knowledge graphs using Neo4j’s new vector search and store capability. For example, users can search a retail catalogue for products explicitly by ID or category, or implicitly by product descriptions or images (see the second sketch after this list).
- Kickstart knowledge graph creation: Developers can leverage new generative AI capabilities in Amazon Bedrock to transform unstructured data into structured form and load it into a knowledge graph. Once in the graph, users can extract insights and make real-time decisions based on this knowledge (see the third sketch after this list).
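
The first sketch below is a minimal, hedged illustration of the RAG pattern described above, wiring a Bedrock-hosted model and embeddings into a Neo4j vector index through LangChain. The connection URL, credentials, index name and model IDs are placeholders for illustration, not details from the announcement, and import paths vary by LangChain version.

```python
# Minimal RAG sketch: LangChain + Amazon Bedrock + a Neo4j vector index.
# All connection details, model IDs and the index name below are placeholders.
from langchain_aws import BedrockEmbeddings, ChatBedrock
from langchain_community.vectorstores import Neo4jVector
from langchain.chains import RetrievalQA

# Embedding model and chat model served through Amazon Bedrock
embeddings = BedrockEmbeddings(model_id="amazon.titan-embed-text-v1")
llm = ChatBedrock(model_id="anthropic.claude-v2")

# Connect to an existing vector index in Neo4j (assumed to be named "documents")
vector_store = Neo4jVector.from_existing_index(
    embedding=embeddings,
    url="neo4j+s://<your-instance>.databases.neo4j.io",
    username="neo4j",
    password="<password>",
    index_name="documents",
)

# Ground the LLM's answers in documents retrieved from the knowledge graph
qa_chain = RetrievalQA.from_chain_type(
    llm=llm,
    retriever=vector_store.as_retriever(),
)
print(qa_chain.invoke({"query": "Which products mention waterproofing?"}))
```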
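The second sketch shows one way to enrich a graph with Bedrock-generated embeddings and query it with Neo4j's vector index, roughly as the real-time search bullet describes. The `Product` label, `product_embeddings` index name and example texts are illustrative assumptions.

```python
# Sketch: generate embeddings with Amazon Bedrock (Titan) and use Neo4j's
# vector index for similarity search. Names and credentials are placeholders.
import json
import boto3
from neo4j import GraphDatabase

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
driver = GraphDatabase.driver(
    "neo4j+s://<your-instance>.databases.neo4j.io",
    auth=("neo4j", "<password>"),
)

def embed(text: str) -> list[float]:
    """Generate a vector embedding for a piece of text via Amazon Titan."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v1",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]

with driver.session() as session:
    # Store an embedding on a product node; the vector index picks it up.
    session.run(
        "MATCH (p:Product {id: $id}) SET p.embedding = $vector",
        id="P-1001",
        vector=embed("Lightweight waterproof hiking jacket"),
    )
    # Implicit, semantic search over the catalogue via the vector index.
    results = session.run(
        "CALL db.index.vector.queryNodes('product_embeddings', 5, $vector) "
        "YIELD node, score RETURN node.id AS id, score",
        vector=embed("rain gear for trekking"),
    )
    for record in results:
        print(record["id"], record["score"])
```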
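The third sketch illustrates the knowledge graph kickstart idea: a Bedrock foundation model extracts entity-relationship triples from free text, which are then merged into Neo4j. The prompt, model ID, node labels and JSON-only response format are assumptions made for this example; production pipelines would need more robust parsing.

```python
# Sketch: turn unstructured text into graph structure with a Bedrock model,
# then load the extracted triples into Neo4j. All names are illustrative.
import json
import boto3
from neo4j import GraphDatabase

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")
driver = GraphDatabase.driver(
    "neo4j+s://<your-instance>.databases.neo4j.io",
    auth=("neo4j", "<password>"),
)

PROMPT = (
    "Extract (subject, relationship, object) triples from the text below. "
    "Respond with a JSON array of [subject, relationship, object] lists only.\n\n"
)

def extract_triples(text: str) -> list[list[str]]:
    """Ask a Bedrock-hosted Claude model to pull structured triples out of free text."""
    response = bedrock.invoke_model(
        modelId="anthropic.claude-v2",
        body=json.dumps({
            "prompt": f"\n\nHuman: {PROMPT}{text}\n\nAssistant:",
            "max_tokens_to_sample": 1024,
        }),
    )
    completion = json.loads(response["body"].read())["completion"]
    # Simplification: assumes the model returned valid JSON and nothing else.
    return json.loads(completion)

def load_into_graph(triples: list[list[str]]) -> None:
    """Merge each triple into the knowledge graph as two nodes and an edge."""
    with driver.session() as session:
        for subject, relation, obj in triples:
            # Relationship types cannot be parameterised in Cypher, so the
            # extracted relation is stored as a property on a generic edge.
            session.run(
                "MERGE (s:Entity {name: $s}) "
                "MERGE (o:Entity {name: $o}) "
                "MERGE (s)-[:RELATED_TO {type: $rel}]->(o)",
                s=subject, o=obj, rel=relation,
            )

load_into_graph(extract_triples("Neo4j announced a collaboration with AWS."))
```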