LLM-Wiki
What does memory look like? And, what happens when it changes?
Let's find out using Obsidian as the view-pane.
This is what Andrej Karpathy described in his viral LLM Wiki gist -- a persistent, incrementally growing collection of documents, searchable and indexed as it grows -- with two interesting features:
- you interact with a knowledge base by talking to your agent,
- things that you talk about are automatically collected, summarized, and linked.
In short: what you talk about is remembered. That's Keep!
The Obsidian graph view, and the chat pane, are just a convenient UI over the agent's memory.
How To
You'll need Hermes Agent and Obsidian.
Install and configure the Keep plugin for Hermes:
```shell
curl -sSL https://keepnotes.ai/scripts/install-hermes.sh | bash
```
When keep is first installed, I recommend saying to your agent:
Follow the keep instructions in your system prompt.
Enable the API Server in Hermes, so that the chat panel can connect to the agent. Do this by editing ~/.hermes/.env:
```
API_SERVER_ENABLED=true
API_SERVER_HOST=127.0.0.1
API_SERVER_PORT=8642
API_SERVER_KEY=change-me
```
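If you are scripting your setup, the same edit can be applied non-interactively with a heredoc. A minimal sketch -- it writes to a temp file here rather than the real ~/.hermes/.env, so you can try it safely:

```shell
# Sketch: append the API server settings non-interactively.
# ENV_FILE would normally be ~/.hermes/.env; a temp file is used here.
ENV_FILE="$(mktemp)"
cat >> "$ENV_FILE" <<'EOF'
API_SERVER_ENABLED=true
API_SERVER_HOST=127.0.0.1
API_SERVER_PORT=8642
API_SERVER_KEY=change-me
EOF
# Confirm all four settings landed.
grep -c '^API_SERVER_' "$ENV_FILE"   # prints 4
```

Remember to change `API_SERVER_KEY` from the placeholder before exposing the port beyond localhost.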
Then restart your Hermes gateway.
When keep is running in Hermes, the default store is at ~/.hermes/keep, and we want to use that same store from the command line too. Make a new directory to hold the Obsidian vault, and configure keep to sync its memory into that directory. The "PT10S" is an ISO-8601 duration giving a 10-second debounce interval: a longer interval means a laggier sync.
```shell
export KEEP_STORE_PATH=~/.hermes/keep
mkdir -p ~/play/keep-vault
keep data export --sync --include-versions --interval PT10S ~/play/keep-vault
```
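If you want to tune the debounce, note that the interval is ISO-8601 duration syntax: PT10S means ten seconds, PT1M one minute, and so on. A tiny sketch that unpacks the simple seconds-only form (this helper is illustrative, not part of keep):

```shell
# Sketch: read the seconds out of a simple ISO-8601 duration like PT10S.
# Handles only the PT<n>S form used above, not the full ISO-8601 grammar.
iso_seconds() {
  printf '%s\n' "$1" | sed -n 's/^PT\([0-9][0-9]*\)S$/\1/p'
}
iso_seconds PT10S   # prints 10
```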
In Obsidian, install the ObsidianClaw plugin (aka Clawdian).
Find the plugin settings and configure its connection to Hermes, using the API server host, port, and key you set in ~/.hermes/.env above.
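The exact field names in the plugin settings may differ, but expect something like the following, mirroring the .env values (the labels here are assumptions; the values match the earlier example):

```
Host:    127.0.0.1
Port:    8642
API Key: change-me
```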
Now you should be able to start chatting, and see your chats (and anything mentioned in them) show up in Obsidian, linked as a graph.
