Today, we’re announcing the integration of Amazon Neptune with Zep, an open-source memory server for LLM applications. Zep enables developers to persist, retrieve, and enrich user interaction history, providing long-term memory and context for AI agents. With this launch, customers can use Neptune Database or Neptune Analytics as the underlying graph store and Amazon OpenSearch Service as the text-search store for Zep’s memory system, enabling graph-powered memory retrieval and reasoning.
This integration makes it easier to build LLM agents with long-term memory, context, and reasoning. Zep users can now store and query memory graphs at scale, unlocking multi-hop reasoning and hybrid retrieval across graph, vector, and keyword modalities. By combining Zep’s memory orchestration with Neptune’s graph-native knowledge representation, developers can build more personalized, context-aware, and intelligent LLM applications.
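In this setup, extracted facts and entities live in a Neptune knowledge graph while message text lives in an OpenSearch index, so hybrid retrieval can combine graph traversal with keyword (and vector) search. The Python sketch below illustrates that pattern by querying the two stores directly; it is not Zep’s own retrieval code, and the graph identifier, OpenSearch domain, index name (`zep-messages`), and node/edge labels are illustrative assumptions.

```python
import json

import boto3
from opensearchpy import OpenSearch

# Hypothetical resource identifiers -- substitute your own graph and domain.
GRAPH_ID = "g-abc1234567"                                        # Neptune Analytics graph
OS_HOST = "search-zep-memory-xxxx.us-east-1.es.amazonaws.com"    # OpenSearch domain endpoint

neptune = boto3.client("neptune-graph", region_name="us-east-1")
# Authentication options omitted for brevity; configure SigV4 or basic auth as appropriate.
opensearch = OpenSearch(hosts=[{"host": OS_HOST, "port": 443}], use_ssl=True)


def graph_facts(user_id: str) -> list[dict]:
    """Multi-hop openCypher query: facts connected to entities a user has mentioned.

    Assumes illustrative User/Entity labels and a MENTIONED edge in the memory graph.
    """
    resp = neptune.execute_query(
        graphIdentifier=GRAPH_ID,
        language="OPEN_CYPHER",
        queryString=(
            "MATCH (u:User {id: $uid})-[:MENTIONED]->(e:Entity)-[r]->(related) "
            "RETURN e.name AS entity, type(r) AS relation, related.name AS target "
            "LIMIT 25"
        ),
        parameters={"uid": user_id},
    )
    # The response payload is a streaming blob containing the openCypher result JSON.
    return json.loads(resp["payload"].read())["results"]


def keyword_hits(query: str) -> list[dict]:
    """Keyword search over stored conversation messages in OpenSearch."""
    body = {"query": {"match": {"content": query}}, "size": 10}
    return opensearch.search(index="zep-messages", body=body)["hits"]["hits"]


if __name__ == "__main__":
    # Combine both modalities into one context bundle for an agent prompt.
    context = {"graph": graph_facts("user-123"), "text": keyword_hits("travel plans")}
    print(json.dumps(context, indent=2, default=str))
```

In practice, Zep orchestrates this retrieval for you; the sketch only shows why a graph store plus a text-search store together support multi-hop reasoning and keyword recall over the same memory.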
Zep helps applications remember user interactions, extract structured knowledge, and reason across memory, making it easier to build LLM agents that improve over time. To learn more about the Neptune–Zep integration, see the sample notebook.
Source: Amazon Web Services