Today, we’re announcing the integration of Amazon Neptune with Zep, an open-source memory server for LLM applications. Zep enables developers to persist, retrieve, and enrich user interaction history, providing long-term memory and context for AI agents. With this launch, customers can use Neptune Database or Neptune Analytics as the underlying graph store and Amazon OpenSearch Service as the text-search store for Zep’s memory system, enabling graph-powered memory retrieval and reasoning.
This integration makes it easier to build LLM agents with long-term memory, context, and reasoning. Zep users can now store and query memory graphs at scale, unlocking multi-hop reasoning and hybrid retrieval across graph, vector, and keyword modalities. By combining Zep’s memory orchestration with Neptune’s graph-native knowledge representation, developers can build more personalized, context-aware, and intelligent LLM applications.
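To make the multi-hop idea concrete, here is a minimal, self-contained sketch in plain Python. It does not use the Zep or Neptune APIs; the graph, relation names, and the `multi_hop` helper are illustrative stand-ins showing why traversing a memory graph surfaces facts that a single-hop lookup would miss.

```python
from collections import defaultdict, deque

# Toy in-memory "memory graph" of (subject, relation, object) facts.
# This is an illustration only, not the Zep/Neptune data model.
edges = [
    ("alice", "owns", "rex"),
    ("rex", "is_a", "dog"),
    ("dog", "is_a", "animal"),
    ("alice", "lives_in", "seattle"),
]

graph = defaultdict(list)
for src, rel, dst in edges:
    graph[src].append((rel, dst))

def multi_hop(start, max_hops=3):
    """Collect every fact reachable from `start` within `max_hops` edges (BFS)."""
    facts, frontier, seen = [], deque([(start, 0)]), {start}
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue
        for rel, dst in graph[node]:
            facts.append((node, rel, dst))
            if dst not in seen:
                seen.add(dst)
                frontier.append((dst, depth + 1))
    return facts

# A one-hop lookup about "alice" returns only her direct facts; two hops
# also recover the chain alice -> rex -> dog, which an agent needs to
# answer "does Alice have a pet that is a dog?".
print(multi_hop("alice", max_hops=2))
```

In a graph-backed store such as Neptune, the same traversal is expressed declaratively (for example, as a variable-length path query) rather than hand-written BFS, which is what lets Zep scale this pattern across large memory graphs.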
Zep helps applications remember user interactions, extract structured knowledge, and reason across memory, making it easier to build LLM agents that improve over time. To learn more about the Neptune–Zep integration, see the sample notebook.
Source: Amazon Web Services