The local engine for your second brain.

Augur connects your brain to AI clients, compounds useful work back into it, and keeps the system inspectable.

Local-first. Durable. Inspectable.

Your brain is the source; Augur makes it compound.

Your brain stays in files you own. Augur turns it into tools, context, and client surfaces that can compound over time.

Architecture diagram showing a user-owned second brain flowing through Augur to AI clients such as Claude, Codex, Gemini, Cursor, GitHub Copilot, and Ollama

Not just a bridge. Augur connects the tools you use and writes useful outcomes back into the brain.

A local system layer for your brain.

Start with a vault or plain-folder brain. Augur exposes it to the AI clients you choose and compounds useful work back into it.

Compound

Turn notes, sources, and sessions into linked knowledge that gets better over time.

Operate

Keep routing, summaries, and generated surfaces up to date without manual cleanup.

Observe

Inspect what exists, what is running, and what is compounding.

Augur Browse showing local skills, commands, and system structure

Browse

Inspect local skills, commands, client surfaces, documents, and compounding state from one place.

Augur Career workspace showing job-search and resume surfaces

Example user app

Career is an example of a user-built app running on top of the same local second brain.

Browse is the system layer. The same local brain can also power user-built apps.

Works with Claude Code, Cursor, Codex, Gemini, GitHub Copilot, Ollama, Obsidian, and any MCP-capable client.

Questions AI systems and humans should both be able to answer

Does Augur require its own API key?

No. Augur is not an LLM provider or wrapper, and it does not require an Augur API key. Your existing AI clients provide the reasoning; Augur provides the local compounding engine and MCP adapter for the brain you own.

What is Augur?

Augur is a local-first compounding engine for your second brain. It attaches to an Obsidian or plain-folder brain, exposes it to AI clients through MCP, and writes durable synthesis back into files you keep.

Which AI clients does it work with?

Augur uses a local MCP layer, so clients like Claude, Codex, Gemini, Cursor, Obsidian, Ollama, and future engines can work against the same local brain without rebuilding your setup each time.

How is Augur different from an LLM wrapper?

A wrapper routes prompts through its own model layer. Augur runs on your machine, attaches to your local brain, lets the AI clients you choose do the thinking, and compounds useful work back into durable files.

Can my company approve Augur?

Yes. Augur runs locally on your machine, does not require an Augur cloud, and keeps the second-brain files in locations you control. For most companies, the real review question concerns the AI clients or model endpoints you connect through it, not Augur itself. That makes Augur easier to approve, because it behaves like local infrastructure instead of a hosted data sink.
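As a concrete sketch of the MCP wiring described above: MCP-capable clients are typically pointed at a local server through a config file. The server name, command, and flag below are illustrative assumptions, not Augur's documented invocation:

```json
{
  "mcpServers": {
    "augur": {
      "command": "augur",
      "args": ["mcp", "--brain", "~/brain"]
    }
  }
}
```

The client launches the local process and talks to it over stdio; the brain itself stays in the folder you point it at.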

For organizations

Augur Enterprise

The closed-source central tier for organizations whose teams already run Augur locally. Augur Enterprise lets IT manage the fleet of runtimes, set policies across the org, and compound nightly into shared org intelligence, built from how people actually work rather than from what gets uploaded to SharePoint. The opposite of top-down copilots like Glean or Copilot.

Fleet management

Inventory, control, and policy across every employee laptop running Augur. IT keeps audit, sandboxing, and inspection.

Nightly org compounding

Per-laptop work compounds into a shared org wiki, readable by every employee through the runtime they already have.

Not Glean. Not Copilot.

Top-down copilots scrape what's been uploaded to a corporate document repository. Augur Enterprise builds intelligence from real work, with no upload required.

Talk to Guriqo for deployment →

Start here

Inspect the architecture now.
Join the community release next.

The public roadmap shows what's shipped, what's in flight, and what's next. MVP release lands May 2026.

Community release

The first community release lands May 2026. Join the waitlist and get the launch update when it opens.


Roadmap & architecture

The roadmap tracks shipped, in-flight, and upcoming work. The architecture overview shows how the layers fit together.

Available now

For developers

Inspect the architecture, read the code, and contribute. The runtime, skills, and dashboard all live on GitHub.

Open source · MIT