Elo Memory – Bio-Inspired Episodic Memory for AI Agents (Free, Open Source)

hackernews · 📦 Open source
#ai agents #claude #em-llm #iclr 2025 #review #episodic memory #open source
Source: hackernews · Summarized and analyzed by Genesis Park

Summary

Elo Memory is an open-source project built on EM-LLM, research submitted to ICLR 2025. It addresses the problem of AI agents losing information between conversations by providing automatic event detection based on Bayesian surprise and human-like memory consolidation. The system offers fast retrieval of about 5ms at zero running cost, integrates easily with MCP-compatible agents such as Claude or OpenAI, and implements a smart memory layer that automatically filters out repetitive information while letting irrelevant old memories decay naturally.

Full Text

Bio-inspired episodic memory system implementing EM-LLM (ICLR 2025). The missing memory layer for AI agents: automatic event detection, surprise-based encoding, and human-like memory consolidation.

Works with: OpenClaw | Claude Code | OpenCode | Codex | Claude | any MCP-compatible agent.

AI agents forget everything between conversations. Elo Memory fixes that.

- **Fast**: Retrieves relevant memories in ~5ms. The agent queries by similarity and gets the top 5 matches; it never reads all memories.
- **Smart storage**: A Bayesian surprise engine decides in under 1ms what is worth remembering. Repetitive content is skipped automatically.
- **Human-like recall**: Two-stage retrieval finds episodes by similarity first, then expands them by temporal context, the way you remember "that whole day" rather than a single fact.
- **Self-maintaining**: Background consolidation extracts patterns, and old, irrelevant memories decay naturally. No manual cleanup.
- **Works everywhere**: Python library, MCP server, or REST API. Drop it into any agent framework.

How a request flows through the system:

```
User message → Query memory (5ms) → 5 relevant episodes → Added to prompt → Better response
                     ↓
       Surprise check (1ms) → Novel? Store it. Boring? Skip it.
                     ↓
       Consolidation (background) → Extract patterns, forget noise
```

Install and basic usage:

```shell
pip install elo-memory
```

```python
from elo_memory import EpisodicMemoryStore, BayesianSurpriseEngine

# Initialize memory
memory = EpisodicMemoryStore(embedding_dim=768)
surprise = BayesianSurpriseEngine(input_dim=768)

# Store an observation
# (`encoder` stands for any sentence-embedding model producing 768-d vectors)
embedding = encoder.encode("User loves Italian food")
surprise_info = surprise.compute_surprise(embedding)
if surprise_info['is_novel']:
    memory.store_episode(
        content={"text": "User loves Italian food"},
        embedding=embedding,
        surprise=surprise_info['surprise']
    )

# Retrieve relevant memories (`query_embedding` is a vector built the same way)
results = memory.retrieve(query_embedding, k=5)
```

Comparison with alternatives:

|  | Elo Memory | Mem0/Zep | Plain RAG |
|---|---|---|---|
| Stores | Experiences with surprise | Everything | Documents |
| Retrieval | Similarity + temporal | Similarity only | Similarity only |
| Filtering | Automatic (surprise) | Manual | None |
| Forgetting | Natural decay | Manual cleanup | None |
| Speed | ~5ms query | ~50ms | ~100ms |
| Cost | Free | $70+/month | API costs |

Components:

| Component | Description | Status |
|---|---|---|
| Bayesian Surprise Detection | KL divergence-based novelty detection | ✅ |
| Event Segmentation | HMM + prediction error boundaries | ✅ |
| Episodic Storage | ChromaDB with temporal-spatial indexing | ✅ |
| Two-Stage Retrieval | Similarity + temporal expansion | ✅ |
| Memory Consolidation | Sleep-like replay + schema extraction | ✅ |
| Forgetting & Decay | Power-law activation decay | ✅ |
| Interference Resolution | Pattern separation/completion | ✅ |
| Online Learning | Experience replay + adaptive thresholds | ✅ |

Documentation:

- Usage Guide: real-world integration patterns
- MCP Setup: Claude Code integration
- Performance Comparison: benchmarks vs. competitors
- Competitive Analysis: honest assessment
- Real-World Project: build an AI assistant
- Strategic Plan: roadmap & vision

Installing from source (with the optional REST API extra):

```shell
git clone https://github.com/server-elo/elo-memory.git
cd elo-memory
pip install -e ".[dev]"
pip install "elo-memory[api]"
```
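To make the "KL divergence-based novelty detection" idea above concrete, here is a minimal, hypothetical sketch, not Elo Memory's actual internals: keep a diagonal-Gaussian belief over embeddings and score each new vector by how much it shifts that belief (the Bayesian surprise of Itti & Baldi).

```python
import numpy as np

class SurpriseSketch:
    """Toy KL-divergence surprise detector over a diagonal-Gaussian belief.

    Illustrative only; class name, update rule, and threshold are invented
    for this example and are not Elo Memory's implementation.
    """

    def __init__(self, dim, threshold=0.5):
        self.mu = np.zeros(dim)   # belief mean
        self.var = np.ones(dim)   # belief variance (diagonal)
        self.n = 1.0              # pseudo-count of observations
        self.threshold = threshold

    def compute_surprise(self, x):
        x = np.asarray(x, dtype=float)
        # Posterior belief after a running-average Bayesian update
        n1 = self.n + 1.0
        mu1 = self.mu + (x - self.mu) / n1
        var1 = self.var + ((x - self.mu) * (x - mu1) - self.var) / n1
        var1 = np.maximum(var1, 1e-6)
        # KL(posterior || prior) for diagonal Gaussians, summed over dims:
        # how much this observation changed what the system believes
        kl = 0.5 * np.sum(
            np.log(self.var / var1)
            + (var1 + (mu1 - self.mu) ** 2) / self.var
            - 1.0
        )
        self.mu, self.var, self.n = mu1, var1, n1
        return {"surprise": float(kl), "is_novel": float(kl) > self.threshold}
```

Repeated observations of the same embedding shift the belief less and less, so their surprise decays toward zero; that is the behaviour that lets repetitive content be skipped automatically.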
Run the REST API server and the test suite:

```shell
elo-memory server --port 8000
pytest tests/ -v --cov=elo_memory
```

We welcome contributions! See CONTRIBUTING.md for guidelines. Quick start:

```shell
git clone https://github.com/server-elo/elo-memory.git
cd elo-memory
pip install -e ".[dev]"
pytest
```

MIT License: see LICENSE for details.

References:

- EM-LLM (ICLR 2025): research foundation
- Itti & Baldi (2009): Bayesian surprise
- Squire & Alvarez (1995): systems consolidation
- Kirkpatrick et al. (2017): catastrophic forgetting

Links:

- GitHub: https://github.com/server-elo/elo-memory
- PyPI: https://pypi.org/project/elo-memory/
- Documentation: https://github.com/server-elo/elo-memory#readme
- Issues: https://github.com/server-elo/elo-memory/issues

Status: Production ready ✅

Made with ❤️ by the Elo Memory community.
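The two-stage retrieval described above (similarity first, then temporal expansion) can be sketched in a few lines. This is a toy illustration under stated assumptions: the function name and episode layout are invented for the example, `episodes` is a chronologically ordered list, and it is not Elo Memory's API.

```python
import numpy as np

def two_stage_retrieve(episodes, query_vec, k=5, window=1):
    """Stage 1: cosine-similarity top-k. Stage 2: temporal expansion.

    `episodes` is a chronologically ordered list of dicts, each with an
    'embedding' key. Hypothetical sketch, not the library's real code.
    """
    embs = np.array([e["embedding"] for e in episodes], dtype=float)
    q = np.asarray(query_vec, dtype=float)
    # Stage 1: rank episodes by cosine similarity to the query
    sims = embs @ q / (np.linalg.norm(embs, axis=1) * np.linalg.norm(q) + 1e-9)
    top = np.argsort(-sims)[:k]
    # Stage 2: expand each hit to its temporal neighbours, so the agent
    # recalls "that whole day", not just the single matching fact
    selected = set()
    for i in top:
        for j in range(max(0, i - window), min(len(episodes), i + window + 1)):
            selected.add(int(j))
    return [episodes[j] for j in sorted(selected)]
```

With `k=1` and `window=1`, a query that matches one episode also pulls in the episodes immediately before and after it, which is the "similarity + temporal" behaviour the comparison table credits Elo Memory with over similarity-only systems.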

This analysis was produced by the Genesis Park editorial team with the help of AI. The original article is available via the source link.
