Resonant — Voice for AI-native work
Resonant captures everything you say — dictations, meetings, memos — and makes it queryable by any AI agent, including Claude and Codex, via MCP. Your voice becomes a knowledge base your AI tools can search.
11 MCP tools. Ambient workspace context. On-device speech recognition. No audio leaves your Mac, ever.
MCP — the memory layer
Resonant exposes an MCP server with 11 tools. Claude Code, Codex, and more can query your meetings, dictations, memos, ambient context, and daily journal — automatically.
No copy-pasting transcripts. No “let me find my notes.” Your AI tool asks Resonant directly and gets structured data back — with timestamps, speaker labels, and app context.
“What did I commit to in this morning's standup?”
search("standup", type: "meeting")
From your 9:30am standup: You committed to finishing the JWT migration by Thursday and asked Sarah to review the webhook retry PR.
“I described an API design earlier — find it and use it as the spec.”
search("API design", type: "dictation")
Found dictation from 2:14pm in VS Code: "The endpoint should accept a Bearer token, validate against the JWKS endpoint, return a 401 with retry-after..."
“What was I working on yesterday afternoon?”
ambient_timeline(date: "yesterday", start: "12:00")
VS Code (auth-service) 12:00–14:30, Slack (eng-team) 14:30–14:45, Chrome (Grafana) 14:45–15:20, VS Code (webhook-retry) 15:20–17:00.
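Under the hood, MCP tool calls are JSON-RPC 2.0 requests. A query like the ones above could be sketched as follows; the tool name and argument names come from this page, but the exact argument schema Resonant expects is an assumption for illustration:

```python
import json

# Build an MCP "tools/call" request for Resonant's search tool.
# "search" and the "type" filter appear on this page; the precise
# argument schema is assumed, not taken from Resonant's docs.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "search",
        "arguments": {"query": "standup", "type": "meeting"},
    },
}

# An MCP client sends this over stdio or another transport and gets
# structured results back (transcript snippets, timestamps, speakers).
print(json.dumps(request))
```

In practice you never write this by hand: Claude Code or Codex issues the call itself once the server is registered.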
The speed gap
Every Cursor prompt you abbreviate because typing the full thing takes too long — that's a worse outcome. Every Claude message where you cut the context because the keyboard made it feel like work — that's a worse response.
Voice removes the bottleneck between your thinking and the model. Say everything. Every constraint, every edge case. Resonant transcribes it locally in under a second.
fix token validation order in auth middleware
Missing: which file, which tokens, which environment, what you already tried, why the order matters.
open the auth middleware in the api folder — there's an edge case where tokens issued before the schema migration aren't being validated correctly, the expiry check runs before the version check, swap the order and add a log when it catches an old token so we can see how often it's hitting in prod
Full context. Dictated in 8 seconds. Runs locally on your Mac.
Ambient context
Resonant passively records which apps you use, window titles, URLs, and dwell time — all locally. This data feeds your daily journal, makes dictation context-aware, and is queryable by your AI tools via MCP.
“What was I working on before the meeting?” is a question your AI tool can answer — because Resonant saw the apps, the files, the time you spent. Learn more →
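To make the timeline data concrete, here is a minimal sketch of how an agent might aggregate per-app dwell time from entries like the ones shown above; the tuple shape is a hypothetical stand-in for whatever structure `ambient_timeline` actually returns:

```python
from collections import defaultdict

# Hypothetical timeline entries: (app, window context, start, end).
# Values mirror the example timeline on this page; the field layout
# is an assumption, not Resonant's documented response format.
entries = [
    ("VS Code", "auth-service", "12:00", "14:30"),
    ("Slack", "eng-team", "14:30", "14:45"),
    ("Chrome", "Grafana", "14:45", "15:20"),
    ("VS Code", "webhook-retry", "15:20", "17:00"),
]

def minutes(t: str) -> int:
    """Convert 'HH:MM' to minutes since midnight."""
    h, m = map(int, t.split(":"))
    return h * 60 + m

# Sum dwell time per app across all entries.
dwell = defaultdict(int)
for app, _context, start, end in entries:
    dwell[app] += minutes(end) - minutes(start)

print(dict(dwell))  # → {'VS Code': 250, 'Slack': 15, 'Chrome': 35}
```

This is the kind of summarization an AI tool can do once the raw timeline is queryable.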
Where it fits
Talk through the tradeoffs before you commit to anything. Resonant captures the reasoning — not just the conclusion. Query it later via MCP when you need to remember why.
Say everything the function needs to do. Every constraint, every edge case, every 'oh and also.' Voice removes the pressure to be brief — and brief prompts produce worse code.
Your AI tool queries your meeting transcripts via MCP. What did we decide? Who owns what? What was the deadline? Exact quotes with timestamps.
Describe the stack trace, the reproduction steps, the thing you already tried. Give Claude the full picture. Stop getting answers to the question you typed instead of the problem you have.
"What was I working on before lunch?" Your AI tool queries the ambient timeline and gets a real answer — which apps, which files, how long.
Record a voice memo during a walk. It's transcribed, titled, and searchable. Next week when you need that insight, your AI tool finds it via MCP.
Real prompts
“fix token validation order in auth middleware”
“open the auth middleware in the api folder — there's an edge case where tokens issued before the schema migration aren't being validated correctly, the expiry check runs before the version check, swap the order and add a log when it catches an old token so we can see how often it's happening in prod”
“help me design a notifications data model”
“I'm thinking through the data model for the new notifications service — we have three event types, user actions, system events, and scheduled digests, and they have different retention policies and different fan-out patterns, I'm leaning toward three separate tables rather than a polymorphic design but I want to think through the foreign key implications before I commit to that”
“why is my test failing with nil pointer”
“I'm looking at this stack trace — it's a nil pointer on the cache layer, but only in the test environment, and only when the test suite runs in parallel — I think it's a race condition in the mock setup but I want to understand if there's a pattern here before I start patching individual tests”
Architecture
Resonant processes everything on your Mac using Apple Neural Engine. Audio never leaves your device. The MCP server runs locally — queries and responses stay on your machine.
The finished text — the prompt, the comment, the memo — is the only thing that leaves your machine. Everything else stays local. How the on-device AI works →
How it works
Works in any text field — Cursor, ChatGPT, Claude, Slack, Notion, a terminal. No app to open.
Say everything. Every constraint, every edge case. Voice removes the pressure to be concise.
Apple Neural Engine transcribes locally. No cloud, no round-trip, no exposure.
Text pastes into the active field. The dictation is saved to your workspace — searchable and MCP-queryable.
Free. Local. Always.
Voice workspace with MCP. No subscription. No cloud. Just a hotkey, a Mac, and AI tools that remember.
Download for Mac
Requires macOS 14+ · Apple Silicon