
The star of "Resident Evil" built an AI memory system with Claude, and it achieved a perfect score on the LongMemEval benchmark.

According to 1M AI News, Hollywood actress Milla Jovovich (known for "The Fifth Element" and the "Resident Evil" series) has collaborated with Bitcoin entrepreneur Ben Sigman, founder of the decentralized lending platform Libre, to develop MemPalace, an open-source AI memory system. The project was released on GitHub under the MIT license and earned 5,500 stars within three days. Sigman said the two spent several months building it with Anthropic's Claude, and Claude Opus 4.6 is listed as a co-author in the Git commit history.


MemPalace's core selling point is its benchmark scores. On LongMemEval, the industry-standard benchmark for memory retrieval, pure on-device retrieval (no external API calls) reached 96.6% Recall@5; with the optional Haiku-model reranking enabled, it answered all 500 questions correctly, a perfect score. The team claims this is the highest result ever recorded on the benchmark, across both free and paid products. On two other benchmarks, it scored 92.9% on ConvoMem, more than double the score of the AI memory product Mem0, and achieved perfect marks in every multi-hop reasoning category on LoCoMo. The benchmark code is published in the repository for reproducibility.
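For readers unfamiliar with the metric: Recall@5 counts a query as a hit if at least one relevant memory appears among the top five retrieved results. Here is a minimal, self-contained sketch of how such a score is computed; the data is made up, and MemPalace's actual evaluation harness (in its repository) will differ.

```python
# Illustrative Recall@k scorer. Query IDs and memory IDs below are
# invented toy data, not taken from LongMemEval or MemPalace.

def recall_at_k(retrieved, relevant, k=5):
    """Fraction of queries where at least one relevant memory
    appears in the top-k retrieved results."""
    hits = 0
    for query_id, ranked in retrieved.items():
        if any(mem in relevant[query_id] for mem in ranked[:k]):
            hits += 1
    return hits / len(retrieved)

# Toy run: queries q1 and q3 have a relevant memory in the top 5, q2 does not.
retrieved = {
    "q1": ["m3", "m7", "m1", "m9", "m2"],
    "q2": ["m4", "m5", "m6", "m8", "m0"],
    "q3": ["m2", "m1", "m3", "m5", "m4"],
}
relevant = {"q1": {"m1"}, "q2": {"m9"}, "q3": {"m3"}}
print(round(recall_at_k(retrieved, relevant), 3))  # 0.667
```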


Unlike the usual vector-database approach, MemPalace organizes information after the ancient Greek orators' "method of loci" (the memory palace). The system mines user conversation records into a four-layer hierarchy: Wing (grouped by person or project) → Room (a specific topic) → Closet (compressed summary) → Drawer (verbatim conversation records). Related rooms within a wing are linked horizontally through Halls, and separate wings are cross-referenced through Tunnels. The project's own testing found that this structure alone improves retrieval accuracy by 34%.
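The four-layer hierarchy described above can be sketched as nested data types. All class and field names here are illustrative guesses based on the article's description, not MemPalace's actual API.

```python
# Hypothetical sketch of the Wing -> Room -> Closet -> Drawer hierarchy.
# Names and fields are invented for illustration.

from dataclasses import dataclass, field

@dataclass
class Drawer:                 # verbatim conversation records
    transcript: list

@dataclass
class Closet:                 # compressed summary over its drawers
    summary: str
    drawers: list = field(default_factory=list)

@dataclass
class Room:                   # one specific topic
    topic: str
    closets: list = field(default_factory=list)
    halls: list = field(default_factory=list)    # links to related rooms in the same wing

@dataclass
class Wing:                   # grouped by person or project
    name: str
    rooms: list = field(default_factory=list)
    tunnels: list = field(default_factory=list)  # cross-references to other wings

# Building a tiny palace:
drawer = Drawer(transcript=["user: the deadline moved to Friday"])
closet = Closet(summary="Project deadline is Friday.", drawers=[drawer])
room = Room(topic="scheduling", closets=[closet])
wing = Wing(name="work-project", rooms=[room])
```

The design intuition is that retrieval descends from coarse (which wing, which room) to fine (which drawer), so most queries never touch the verbatim transcripts at all.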


The project also introduces AAAK, a lossless compression dialect designed specifically for AI agents, which condenses user contexts of several thousand tokens down to around 120 tokens, a compression ratio of roughly 30x. AAAK is plain structured text: it requires no special decoder or fine-tuning, and any large language model that can read text can understand it directly. The system also runs contradiction detection before output, catching inconsistencies in names, pronouns, ages, and the like.
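The AAAK dialect itself is not documented in this article, so the following is only a generic illustration of the idea: collapsing a verbose user context into terse structured text that an LLM can parse without any decoder. The field names and delimiter syntax below are invented for this example and are not AAAK.

```python
# Invented example of structured-text context compression (not the
# real AAAK syntax). Word/field counts are a rough size proxy only;
# real tokenizers will count differently.

verbose_context = """
The user's name is Alex. Alex lives in Berlin and works as a
backend engineer. Alex prefers concise answers and is currently
building a Rust service. Alex's cat is named Miso.
"""

compact_context = (
    "USR|name:Alex|loc:Berlin|role:backend-eng|"
    "pref:concise|proj:rust-service|pet:cat=Miso"
)

print(len(verbose_context.split()), "words ->",
      len(compact_context.split("|")), "fields")
```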


The entire system runs locally, with no cloud services, no API keys, and no fees. It integrates with tools such as Claude, ChatGPT, and Cursor via the MCP protocol (exposing 19 MCP tools), and can generate context summaries from the command line using local models such as Llama and Mistral.
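For context, MCP servers are typically registered with a client through a JSON configuration file (for example, Claude Desktop's `claude_desktop_config.json` uses an `mcpServers` map). The entry name, command, and package below are hypothetical; consult the MemPalace repository for the actual install instructions.

```json
{
  "mcpServers": {
    "mempalace": {
      "command": "npx",
      "args": ["-y", "mempalace-mcp"]
    }
  }
}
```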


Jovovich's crossover has taken the tech community by surprise. The repository sits under her GitHub account, and she authored 4 of its 7 commits, including the initial commit containing all the core code. She also posted a video introducing the project on Instagram.
