MemoryPort
Persistent memory across every LLM. Every session picks up where you left off. Local-first and open source.
Runs on your machine. LanceDB vector index, zero cloud dependency. Your data stays yours.
Optional Arweave storage. Survives disk failures, syncs across devices, and can be rebuilt from the chain.
AES-256-GCM with Argon2id key derivation. Even if you use cloud backup, only you can read it.
Three-gate system with query expansion. Your AI gets the right context, not everything.
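The gating idea can be sketched in a few lines of Rust. This is an illustrative sketch only: the gate criteria, thresholds, and names (`passes_gates`, `Memory`) are assumptions, not MemoryPort's actual implementation. It shows one way a memory could be required to clear a similarity floor against the raw query, match at least one expanded query variant, and then beat a stricter combined cutoff.

```rust
// Illustrative three-gate retrieval filter with query expansion.
// Thresholds and struct/function names are hypothetical.

fn cosine(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let na: f32 = a.iter().map(|x| x * x).sum::<f32>().sqrt();
    let nb: f32 = b.iter().map(|x| x * x).sum::<f32>().sqrt();
    if na == 0.0 || nb == 0.0 { 0.0 } else { dot / (na * nb) }
}

struct Memory {
    text: String,
    embedding: Vec<f32>,
}

fn passes_gates(query: &[f32], expansions: &[Vec<f32>], m: &Memory) -> bool {
    // Gate 1: similarity floor against the raw query.
    let base = cosine(query, &m.embedding);
    if base < 0.3 {
        return false;
    }
    // Gate 2: must also match at least one expanded query variant.
    let best_exp = expansions
        .iter()
        .map(|e| cosine(e, &m.embedding))
        .fold(0.0f32, f32::max);
    if best_exp < 0.3 {
        return false;
    }
    // Gate 3: stricter cutoff on the combined score.
    (base + best_exp) / 2.0 >= 0.5
}

fn main() {
    let query = vec![1.0, 0.0];
    let expansions = vec![vec![0.9, 0.1]];
    let relevant = Memory { text: "project uses Rust".into(), embedding: vec![0.8, 0.2] };
    let noise = Memory { text: "unrelated note".into(), embedding: vec![0.0, 1.0] };
    println!("relevant passes: {}", passes_gates(&query, &expansions, &relevant));
    println!("noise passes: {}", passes_gates(&query, &expansions, &noise));
}
```

The point of stacking gates rather than ranking by a single score is that each gate can reject on a different failure mode, so marginal matches that happen to score well on one axis still get filtered out.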
Claude Code, Cursor, Ollama, OpenAI — one memory across all your AI tools.
MIT licensed. Full Rust codebase. Read every line, self-host, or contribute.
Integrations: MCP server (standard protocol), Ollama proxy, native API proxy, OpenAI proxy.
Full AI memory on your machine, with optional cloud backup and cross-device sync.
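For MCP-based clients such as Claude Code, hooking up a memory server typically means adding an entry to the client's MCP configuration. The shape below follows the standard `mcpServers` schema; the `memoryport` command name and arguments are assumptions for illustration, not documented MemoryPort options.

```json
{
  "mcpServers": {
    "memoryport": {
      "command": "memoryport",
      "args": ["serve", "--mcp"]
    }
  }
}
```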
Start building with persistent memory. With MemoryPort, your AI remembers every conversation you've ever had.