# DevNote v0.5.0 – Context Bridge Brainstorm

Date: 2026-04-19
Status: In discussion – no decisions locked yet
Purpose: Share with collaborators the full exploration of how DevNote could integrate with AI coding assistants (Claude Code, Cursor, Copilot, Codex, etc.)
## Where we are
v0.4.2 just shipped. DevNote now has:
- ✅ Semantic search over all past notes (v0.4.0)
- ✅ Python worker with `gemini-embedding-001` (3072-dim, L2-normalized)
- ✅ SQLite local memory (notes + embeddings)
- ✅ Windows Python detection hardening (v0.4.1)
- ✅ Threshold recalibration 0.35 → 0.70 for `gemini-embedding-001` noise floor (v0.4.2)
The retrieval layer is done. v0.5.0 is the question: how do we get the retrieved notes to the AI tool that's actually coding?
## The spark idea
Original user framing:
> This turns DevNote into the memory bridge between past dev work and the current AI coding session.
Stronger positioning than "DevNote has its own chat" (which competes with Claude Code): this makes DevNote infrastructure that feeds ALL AI tools.
## First conceptual unlock – why we still need markdown
Confusion that came up: "We've already embedded everything. Why do we need to send markdown to the AI? Why not vectors?"
The answer
Embeddings have ONE job: retrieval. Markdown has ONE job: delivery.
Library analogy:
- Embeddings = the catalog that helps you find the right book
- Markdown = the actual book you hand to the reader (the LLM)
Why LLMs can't read vectors
LLMs tokenize text. A 3072-dim float array is meaningless to Claude/Codex; they need words. So the pipeline MUST be:
- Retrieval: query → embed → similarity search → top-k note IDs
- Delivery: pull markdown of those notes from SQLite → send text to LLM
Nothing from v0.4.0 is wasted. The embedding layer is what makes retrieval efficient (send 5 relevant notes instead of 500). The markdown layer is how content actually reaches the LLM.
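Since the stored vectors are L2-normalized, the retrieval half can be a plain dot-product top-k. A minimal sketch (the function name, note shape, and threshold default are illustrative, not DevNote's actual internals):

```typescript
// With L2-normalized embeddings, dot product equals cosine similarity.
export function topK(
  query: number[],
  notes: { id: string; embedding: number[] }[],
  k: number,
  threshold = 0.70  // v0.4.2 noise-floor recalibration
): string[] {
  return notes
    .map(n => ({
      id: n.id,
      score: n.embedding.reduce((sum, v, i) => sum + v * query[i], 0),
    }))
    .filter(r => r.score >= threshold)   // drop noise-floor matches
    .sort((a, b) => b.score - a.score)   // best match first
    .slice(0, k)
    .map(r => r.id);                     // IDs feed the delivery step
}
```

The delivery step then looks those IDs up in SQLite and ships the markdown, never the vectors.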
The pipeline for v0.5.0

```
+-----------------------------------------------+
| User types query → DevNote embeds →           |
| semantic search → top 5 notes shown           |   ← v0.4.0 (DONE)
+-----------------------------------------------+
                       ↓
+-----------------------------------------------+
| User picks checkboxes on results              |
| → "Add to Context" button                     |   ← v0.5.0 (THIS)
+-----------------------------------------------+
                       ↓
+-----------------------------------------------+
| DevNote pulls markdown of selected notes      |
| → delivers to Claude / Codex / Cursor etc.    |
+-----------------------------------------------+
```

## The 5 delivery mechanisms – how markdown actually reaches the AI tool
Every option transports the same markdown payload. They differ in who initiates, whether the user is in the loop, and which AI tools accept it.
### Mechanism A – Clipboard
What: DevNote writes markdown to the OS clipboard. User pastes into AI chat.
Analogy: Photocopier. You copy pages, walk to the reader, hand them over.
User flow:
- Search in DevNote → select 3 notes → click "Add to Context"
- Toast: "Copied 3 notes to clipboard"
- User switches to Claude Code / Cursor / Copilot chat panel
- User pastes (Ctrl+V) → markdown appears in input
- User hits Enter → AI has the context
Code sketch:

```ts
const markdown = selectedNotes.map(n =>
  `## ${n.title}\n\n${n.content}\n`
).join('\n---\n\n');
await vscode.env.clipboard.writeText(markdown);
```

Works with: 🟢 Every AI tool that accepts pasted text – universal
Pros:
- Simplest possible implementation (~10 lines)
- Zero tool-specific code
- User fully in control
- No permissions, no config, no protocol setup
Cons:
- User must manually paste (one extra step)
- Not "magical"
- Clipboard can be overwritten if user copies something else
### Mechanism B – Workspace file drop
What: DevNote writes markdown to a file in the workspace (e.g., `.devnote/context.md`). User types `@.devnote/context.md` in AI chat → the AI tool reads the file.
Analogy: Post-it on a shared whiteboard. Whoever looks at the whiteboard sees it.
User flow:
- Search → select 3 notes → "Add to Context"
- DevNote writes `.devnote/context.md` in workspace (gitignored)
- Toast: "Context ready – type `@.devnote/context.md` in your AI chat"
- User types `@` in Claude Code / Cursor → file autocomplete → pick it → Enter
- AI tool reads the file as context
Code sketch:

```ts
// vscode.workspace.rootPath is deprecated – use workspaceFolders instead
const root = vscode.workspace.workspaceFolders?.[0]?.uri.fsPath;
if (root) {
  const targetDir = path.join(root, '.devnote');
  fs.mkdirSync(targetDir, { recursive: true });
  fs.writeFileSync(path.join(targetDir, 'context.md'), markdown);
}
```

Works with: 🟢 Claude Code, Cursor, Copilot, Windsurf, Continue (all @file-aware tools)
Pros:
- Works with most modern AI IDE tools
- User can re-reference same file across multiple turns
- Zero protocol complexity
Cons:
- Pollutes workspace (one file, gitignored, but still)
- User must remember the `@.devnote/context.md` syntax
- Need cleanup/stale-file logic
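The stale-file logic could be as simple as deleting an old bundle on activation. A sketch (the 30-minute cutoff and the activation hook are assumptions, not decided behavior):

```typescript
import * as fs from 'fs';
import * as path from 'path';

// Assumption: a context bundle older than 30 minutes is stale.
const CONTEXT_MAX_AGE_MS = 30 * 60 * 1000;

export function isStale(mtimeMs: number, nowMs: number,
                        maxAgeMs: number = CONTEXT_MAX_AGE_MS): boolean {
  return nowMs - mtimeMs > maxAgeMs;
}

// Called on extension activation: remove a leftover bundle from a past session.
export function cleanupStaleContext(workspaceRoot: string, nowMs = Date.now()): void {
  const file = path.join(workspaceRoot, '.devnote', 'context.md');
  if (fs.existsSync(file) && isStale(fs.statSync(file).mtimeMs, nowMs)) {
    fs.unlinkSync(file);
  }
}
```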
### Mechanism C – MCP Server (Model Context Protocol)
What: DevNote runs a small MCP server. AI tools connect and call tools like `devnote_search_notes(query)` or `devnote_get_notes(ids)`. The AI decides when to query, not the user.
Analogy: Installing a new department at a company. Other teams (AI tools) can send work requests and get responses.
User flow:
- User adds DevNote MCP server to AI tool config (one-time, ~30s)
- User chats normally: "I need to fix this auth bug"
- AI notices: "I should check DevNote for relevant past work"
- AI calls `devnote_search_notes("auth bug")` → DevNote returns top-5 note markdowns
- AI uses those as context: "Based on your note from 2 weeks ago about token refresh…"
Biggest philosophical shift: the user doesn't manually search or pick checkboxes. The AI does it autonomously.
Code sketch:

```ts
// Sketch only – check the MCP SDK docs for exact import paths and signatures
import { Server } from '@modelcontextprotocol/sdk/server';

const server = new Server({ name: 'devnote', version: '0.5.0' });
server.tool('search_notes', {
  description: 'Search your past dev notes semantically',
  parameters: { query: { type: 'string' } },
  handler: async ({ query }) => {
    const results = await searchService.searchQuery(query);
    const notes = await memoryStore.getByIds(results.map(r => r.id));
    return notes.map(n => n.contentMarkdown);
  },
});
server.listen();
```

Works with: 🟡 Claude Desktop ✅, Claude Code ✅, Cursor (custom MCP), Copilot 🚧. Not Codex/ChatGPT yet.
Pros:
- Truly agentic: AI pulls context when it decides it's needed
- No manual checkboxes, no button clicks; natural conversation
- Standardized protocol (industry converging on MCP)
- Future-proof
Cons:
- Bigger implementation (~100–200 lines)
- User config friction (add to AI tool config)
- Doesn't work with non-MCP tools yet (Codex, some Cursor setups)
- Doesn't match the "checkbox + button" UX; different philosophy
### Mechanism D – VS Code Extension API (direct extension-to-extension)
What: DevNote calls INTO Claude Code / Copilot / Cursor extension APIs directly via VS Code's extension messaging.
Analogy: Hotline phone between two offices.
User flow:
- Search → select 3 notes → "Add to Context → Claude Code" (dropdown picks target)
- DevNote calls `vscode.commands.executeCommand('claudeCode.addContext', { content: markdown })`
- Target extension receives it → adds to chat's context pool
- User's next message to AI includes that context automatically
Code sketch:

```ts
// Command ID is illustrative – verify against the target extension's published API
const copilot = vscode.extensions.getExtension('GitHub.copilot-chat');
if (copilot) {
  await vscode.commands.executeCommand(
    'github.copilot.chat.attachContext',
    { type: 'text', value: markdown, label: 'DevNote Context' }
  );
}
```

Works with: 🟡 Tool-specific. Copilot has a public attach-context API. Claude Code CLI doesn't (separate process). Cursor has less-documented APIs.
Pros:
- Most "native feel": context just appears in AI chat
- No pasting, no file references, no MCP setup
Cons:
- Requires per-tool integration code (~50–100 lines per AI tool)
- AI tool must expose a public API (most don't, fully)
- Brittle: API changes break it
- Limited to VS Code (not Claude Desktop, Codex, web-based tools)
### Mechanism E – AI Tool Plugin (like context-mode)
What: DevNote ships as a plugin inside the AI toolβs own ecosystem (e.g., Claude Code plugin), NOT a separate VS Code extension.
Analogy: Instead of DevNote calling the AI tool from outside, DevNote moves into the AI tool's building and becomes a new department.
How plugins work (context-mode reference):
User flow:
- User installs "DevNote" plugin from Claude Code plugin marketplace (one click)
- User chats normally: "help me debug this auth issue"
- Claude (with DevNote plugin's skills loaded) auto-calls `devnote.search("auth")`
- Plugin queries local SQLite → returns top 3 notes as markdown
- Claude uses context and responds

Plus: slash commands like `/devnote-recent` for manual triggering.
Works with: 🟡 One AI tool per build; need separate plugins for Claude Code, Cursor, Continue, Windsurf, Zed, etc.
Pros:
- Best distribution (plugin marketplaces = one-click install)
- Most native UX
- Can do token optimization via hooks (like context-mode)
- Works across Claude Code CLI, Desktop, web – not just VS Code
- Aligned with where the AI tooling landscape is heading
Cons:
- Per-AI-tool packaging required (biggest cost)
- ~20–30 hours per tool, × multiple tools = big commitment
- Different UX (agentic, not checkbox-driven)
- Shared-state problem: plugin needs to read SQLite DB that VS Code extension writes
## Full comparison table
## The bigger question – DevNote's architectural identity
During the brainstorm, a deeper question surfaced:
This reframes the entire decision. DevNote's core value is a searchable developer memory. The VS Code extension is just one UI for it.
Three possible architectural futures
### Future 1 – Extension-centric (current state)

```
+---------------------------------+
|        VS Code Extension        |
|  +---------------------------+  |
|  |  SQLite + Python worker   |  |
|  |  Notion + Git + UI        |  |
|  |  all logic bundled        |  |
|  +---------------------------+  |
+---------------------------------+
```

- Tied to VS Code entirely
- If AI tools take over (Claude Code CLI, Cursor, web IDEs), DevNote is stuck
- Mobile/web access: impossible without full rewrite

Verdict: 🔴 Fragile long-term
### Future 2 – MCP-server addition

```
+---------------------------------+       +----------------+
|        VS Code Extension        |       |   MCP Server   |
|  +---------------------------+  | <---> |   (sidecar)    |
|  | SQLite + Python + Notion  |  |       |  wraps SQLite  |
|  +---------------------------+  |       +----------------+
+---------------------------------+
```

- Extension stays 100% as-is, no refactor
- MCP server reads the same SQLite the extension writes
- Claude Code, Claude Desktop, MCP-enabled Cursor can all connect
- Answer to "will the extension be saved?" = YES, untouched

Verdict: 🟡 Better, but still VS-Code-anchored (new notes only generated from VS Code)
### Future 3 – CLI-first architecture ⭐

```
            +----------------------+
            |  devnote CLI (core)  |
            |  SQLite + Python     |
            |  Notion + Git        |
            |  all business logic  |
            +----------------------+
               ^        ^        ^
               |        |        |
   +-------------+ +-------------+ +-------------+
   | VS Code ext | | MCP Server  | |   Plugin    |
   | (thin UI)   | | (wraps CLI) | | (wraps MCP) |
   | calls CLI   | |             | |             |
   +-------------+ +-------------+ +-------------+
```

- CLI is the source of truth. Everything else wraps it.
- Extension stays, but becomes a thin UI layer
- Run `devnote search "auth"` in a terminal without any IDE
- MCP server = ~100-line wrapper around CLI commands
- Plugin = packaging of MCP + skills for distribution
- Tomorrow's AI tool? Write a new wrapper in a few hours.

This is the architecture context-mode uses. One core, many frontends.

Verdict: 🟢🟢 Future-proof. Tool-agnostic.
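To make "wrapper" concrete: any frontend (MCP server, extension, plugin) could shell out to the CLI and parse its output. A sketch under an assumed contract – `devnote search <query> --json` printing a JSON array of `{ id, title, markdown }` is illustrative, not a locked interface:

```typescript
import { execFile } from 'child_process';
import { promisify } from 'util';

const run = promisify(execFile);

// Assumed CLI contract: `devnote search <query> --json` prints a JSON array
// of { id, title, markdown } objects on stdout.
export function parseSearchOutput(stdout: string): string[] {
  const notes: { markdown: string }[] = JSON.parse(stdout);
  return notes.map(n => n.markdown);
}

export async function searchViaCli(query: string): Promise<string[]> {
  const { stdout } = await run('devnote', ['search', query, '--json']);
  return parseSearchOutput(stdout);
}
```

Every frontend reuses this one seam, which is why a new AI tool costs only a thin wrapper.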
## Refactor cost: "Do we rewrite everything?"
NO: we refactor, not rewrite.
~80% of DevNote's code is already tool-agnostic and can move to the CLI with minimal changes:
Only ~20% is VS-Code-specific:
Total refactor: ~15–20 hours of focused work. Not a rewrite. Most imports just change paths.
## Proposed phased roadmap
Don't pick ONE future – pick the path that gets to Future 3 gradually, with each step shippable on its own.
Why this sequence
- v0.5.0 ships YOUR idea fast: validates user demand, delivers something this week
- v0.6.0 does the strategic refactor: future-proofs DevNote
- v0.7.0 unlocks agentic AI: AI tools autonomously pull context
- v0.8.0 is distribution: plugin marketplace = one-click install
- v0.9.0+ is expansion: each new AI tool = small wrapper, not big rewrite
Nothing is wasted. Each release builds on the last. The extension never dies; it evolves into a thin UI layer on top of the CLI core.
## The 2026 reality check
Where the AI tooling landscape is likely headed in 12–18 months:
- MCP becomes table-stakes across AI tools
- Plugin marketplaces drive distribution (Claude Code plugins, Cursor plugins)
- Terminal-first AI workflows keep growing (`claude`, `gh copilot`, etc.)
- Multiple IDEs coexist: VS Code won't die but won't dominate
- AI tools proliferate: can't predict which is dominant in 2 years
CLI-first is the only architecture that survives all scenarios. Extension-centric dies if VS Code loses share. MCP-only works but only for AI-driven contexts. CLI-first works everywhere.
## Open decisions for collaboration
### Decision 1 – v0.5.0 scope
- (a) Checkbox UX + Clipboard + File drop (ship fast, small, validates concept)
  - Comment: this will work, but clipboard paste gets ugly really fast and the UX is poor. I'd prefer a much cleaner user input field.
  - File drop is the better alternative. Instead of writing everything to one context file and loading it, why not create a context folder in the codebase itself that can be git-synced? Add context files like any other files. (Simplest ever.)
- (b) Skip checkbox UX, go straight to CLI extraction (bigger bet, future-proof sooner)
  - One more question: do we couple the embedding inside the CLI, or can the CLI work with just markdown contexts?
- (c) Do BOTH in v0.5.0 (double scope, slower ship)
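The context-folder idea from the comment on (a) could be tiny. A sketch (the `.devnote/context/` layout and slug naming are assumptions, not decisions):

```typescript
import * as fs from 'fs';
import * as path from 'path';

// Stable filename from a note title, so re-exports overwrite instead of piling up.
export function slugify(title: string): string {
  return title.toLowerCase().replace(/[^a-z0-9]+/g, '-').replace(/^-|-$/g, '');
}

// One markdown file per note under .devnote/context/ – git-syncable like any other file.
export function writeContextFolder(
  root: string,
  notes: { title: string; content: string }[]
): string[] {
  const dir = path.join(root, '.devnote', 'context');
  fs.mkdirSync(dir, { recursive: true });
  return notes.map(n => {
    const file = path.join(dir, `${slugify(n.title)}.md`);
    fs.writeFileSync(file, `## ${n.title}\n\n${n.content}\n`);
    return file;
  });
}
```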
### Decision 2 – 5-release phased roadmap
Are we comfortable with the v0.5 → v0.9 plan above? Any stages to swap/skip/combine?
### Decision 3 – within v0.5.0 (if chosen)
If we ship checkbox UX in v0.5.0:
- D15: Delivery = Clipboard (A) + File drop (B) together, or just one?
  - See if we can avoid the next four questions entirely with the approach above: store each context as its own file and load those files directly.
- D16: Default action on "Add to Context" click?
- D17: Selection UX: checkboxes + bottom button, or right-click menu, or multi-select with Shift?
- D18: Markdown format: concatenated with `---` separators? Per-note files in a folder? Include metadata (branch, date) in the bundle?
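For D18, a concatenated bundle with per-note metadata could look like this sketch (the `branch`/`date` fields and the blockquote metadata line are illustrative, not decided):

```typescript
interface Note {
  title: string;
  content: string;
  branch?: string;
  date?: string;
}

// Concatenate notes with --- separators; optional metadata rides in a blockquote line.
export function bundleNotes(notes: Note[]): string {
  return notes.map(n => {
    const meta = [n.branch && `branch: ${n.branch}`, n.date && `date: ${n.date}`]
      .filter(Boolean)
      .join(', ');
    return `## ${n.title}\n${meta ? `> ${meta}\n` : ''}\n${n.content}\n`;
  }).join('\n---\n\n');
}
```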
### Decision 4 – plugin ecosystem scope
Which AI tools should we target in v0.8.0+, in priority order?
- Claude Code
- Cursor
- Continue.dev
- Windsurf
- Zed
- Claude Desktop (MCP only, no plugin needed)
## Appendix – terms and references
- RAG (Retrieval-Augmented Generation): Pattern where you retrieve relevant context from a knowledge base, then pass it to an LLM as grounding. DevNote v0.4.0 built the retrieval layer.
- MCP (Model Context Protocol): Anthropic's standard protocol for AI tools to interact with external data sources. modelcontextprotocol.io
- Embeddings: Numerical vector representations of text, used for semantic similarity search. DevNote uses `gemini-embedding-001` (3072-dim, L2-normalized, dot-product similarity).
- Context window: The chunk of text an LLM can "see" in a single call. Efficient retrieval = fit only relevant notes into this window.
- context-mode: Reference plugin for Claude Code that does token-budget management via MCP + hooks. Exemplar of what a DevNote plugin could look like.
End of brainstorm doc. Share this with collaborators for feedback before we lock decisions.