Technology Radar
We’re moving Graphiti to Trial: this open-source temporal knowledge graph engine from Zep has demonstrated production viability for addressing the LLM memory problem. While flat vector stores in RAG pipelines fail to track how facts change over time, Graphiti ingests data as discrete episodes and maintains bi-temporal validity windows on graph edges, so outdated facts are invalidated rather than overwritten. Unlike batch-oriented GraphRAG, it updates the graph incrementally and delivers sub-second results through hybrid retrieval that combines semantic search, BM25 and graph traversal, without query-time LLM calls. Two factors drove this move: peer-reviewed benchmarks reporting 18.5% accuracy improvements and 90% latency reductions, and the release of a first-class MCP server that lets Model Context Protocol–compliant agents attach persistent temporal memory with minimal integration effort. Strong community adoption further signals production readiness. We’re using Graphiti to build context-aware agents with stateful, temporally aware knowledge graphs and recommend evaluating it for agentic applications. Neo4j is the primary backend, with FalkorDB as a lighter alternative. Teams should also account for per-write LLM extraction costs and pin dependencies given Graphiti’s pre-1.0 release status.
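To make the bi-temporal idea concrete, here is a minimal pure-Python sketch (this is not Graphiti’s actual API; all names are illustrative): each fact edge carries a validity window, and ingesting a contradicting fact closes the old edge’s window rather than overwriting or deleting it, so history stays queryable.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

@dataclass
class FactEdge:
    subject: str
    predicate: str
    obj: str
    valid_from: datetime
    valid_to: Optional[datetime] = None  # None means "still valid"

class TemporalGraph:
    """Toy illustration of bi-temporal edge invalidation."""

    def __init__(self):
        self.edges: list[FactEdge] = []

    def ingest(self, subject: str, predicate: str, obj: str, at: datetime):
        # Close the validity window of any currently-valid edge for the
        # same subject/predicate instead of overwriting it.
        for e in self.edges:
            if e.subject == subject and e.predicate == predicate and e.valid_to is None:
                e.valid_to = at
        self.edges.append(FactEdge(subject, predicate, obj, valid_from=at))

    def current(self, subject: str, predicate: str) -> Optional[str]:
        for e in self.edges:
            if e.subject == subject and e.predicate == predicate and e.valid_to is None:
                return e.obj
        return None

    def as_of(self, subject: str, predicate: str, at: datetime) -> Optional[str]:
        # Point-in-time query: which fact was valid at `at`?
        for e in self.edges:
            if (e.subject == subject and e.predicate == predicate
                    and e.valid_from <= at and (e.valid_to is None or at < e.valid_to)):
                return e.obj
        return None

g = TemporalGraph()
g.ingest("alice", "works_at", "AcmeCorp", at=datetime(2023, 1, 1, tzinfo=timezone.utc))
g.ingest("alice", "works_at", "Initech", at=datetime(2024, 1, 1, tzinfo=timezone.utc))
# current(...) -> "Initech"; as_of(..., mid-2023) -> "AcmeCorp"
```

The superseded fact remains in the graph with a closed window, which is what lets point-in-time queries answer “what did we believe then?” without re-ingesting history.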
Graphiti builds dynamic, temporally aware knowledge graphs that capture evolving facts and relationships. Our teams use GraphRAG to uncover data relationships, which enhances retrieval and response accuracy. As data sets constantly evolve, Graphiti maintains temporal metadata on graph edges to record relationship lifecycles. It ingests both structured and unstructured data as discrete episodes and supports queries that fuse time-based, full-text, semantic and graph algorithms. For LLM-based applications, whether RAG or agentic, Graphiti enables long-term recall and state-based reasoning.
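One common way to fuse ranked results from independent retrievers (full-text/BM25, semantic, graph traversal) without query-time LLM calls is reciprocal rank fusion. The sketch below is illustrative only; the retriever outputs are hard-coded placeholders, and this is not necessarily the fusion scheme Graphiti uses internally.

```python
from collections import defaultdict

def rrf(ranked_lists, k: int = 60):
    """Reciprocal rank fusion: each list contributes 1/(k + rank) per item;
    k damps the dominance of any single retriever's top hits."""
    scores = defaultdict(float)
    for results in ranked_lists:
        for rank, item_id in enumerate(results, start=1):
            scores[item_id] += 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Placeholder ranked outputs from three hypothetical retrievers:
semantic = ["e3", "e1", "e7"]   # embedding similarity
bm25 = ["e1", "e3", "e9"]       # full-text match
graph = ["e1", "e5", "e3"]      # graph-traversal proximity
fused = rrf([semantic, bm25, graph])
# "e1" ranks first: it appears near the top of all three lists.
```

Because fusion works on ranks rather than raw scores, the retrievers’ incomparable scoring scales never need to be normalized against each other.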