欣淇
Published 2026-05-14

🔥 Context Mode: 14.6k-Star MCP Server for Context Optimization, 98% Token Savings Across 15 Platforms

Context Mode is an MCP server that does one thing: keep an AI coding agent's context window under control. One Playwright snapshot is 56 KB, 20 GitHub issues are 59 KB, 500 access-log lines are 45 KB. After 30 minutes, 40% of the context is gone. Context Mode's sandboxed tools compress 315 KB of raw output down to 5.4 KB, a 98% saving.

It doesn't filter output; it keeps raw data from ever entering the context in the first place. Data is processed in a sandboxed subprocess, and only the summary is fed to the AI.

Solving the Context Problem in Four Dimensions

  1. Sandboxed saving — tool output runs in isolation, and raw data never enters the window. 315 KB → 5.4 KB
  2. Session continuity — every edit, git operation, task, error, and user decision is logged to SQLite. On context compaction the data isn't dumped back in; an FTS5 index is built, BM25 retrieves what's relevant, and the model picks up the thread from last time
  3. Think in Code — don't make the LLM read 50 files to count function lines; write a script and console.log the result. One script replaces ten tool calls, a 100x context saving
  4. Hands off output style — controls only where data goes, never how the model writes its answers
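A toy sketch of the session-continuity loop from point 2, under loud assumptions: a plain array stands in for the SQLite event log, and naive keyword counting stands in for FTS5 with BM25 (which additionally weights terms by rarity and document length). Only the record-then-retrieve flow is the point here.

```javascript
// Toy record-then-retrieve loop. NOT Context Mode's implementation:
// an array stands in for the SQLite event log, and naive keyword
// counting stands in for FTS5 + BM25 ranking.
const events = [];
const record = (type, text) => events.push({ type, text });

record("edit", "refactored auth middleware to verify JWT expiry");
record("git", "commit 'fix: reject expired tokens in auth middleware'");
record("error", "TypeError: cannot read properties of undefined in parser.ts");
record("decision", "user chose SQLite over Postgres for local persistence");

// Score = number of query terms an event mentions; return top-k texts.
function retrieve(query, k = 2) {
  const terms = query.toLowerCase().split(/\s+/);
  return events
    .map(e => ({ e, score: terms.filter(t => e.text.toLowerCase().includes(t)).length }))
    .filter(r => r.score > 0)
    .sort((a, b) => b.score - a.score)
    .slice(0, k)
    .map(r => r.e.text);
}

// After compaction, ask for what mattered instead of replaying everything.
console.log(retrieve("auth middleware tokens"));
```

The design point survives the simplification: after compaction the model queries for relevant state instead of having the full event history re-injected.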

Try It (Claude Code)

    # Install from the plugin marketplace
    /plugin marketplace add mksglu/context-mode
    /plugin install context-mode@context-mode
    
    # Verify after restart
    /context-mode:ctx-doctor
    

Gemini CLI / VS Code Copilot / JetBrains / Cursor / OpenCode / KiloCode / OpenClaw / Codex CLI / Pi Agent… all 15 platforms are supported, each with its own install path.

Real-World Commands

    // Before: 47 × Read() = 700 KB.  After: 1 × ctx_execute() = 3.6 KB.
    ctx_execute("javascript", `
      const files = fs.readdirSync('src').filter(f => f.endsWith('.ts'));
      files.forEach(f => console.log(f + ': ' + fs.readFileSync('src/'+f,'utf8').split('\n').length + ' lines'));
    `);
    

    # Run diagnostics from the terminal
    context-mode doctor
    context-mode upgrade
    context-mode insight  # open the personal analytics dashboard
    

Benchmarks

| Scenario | Raw size | Context size | Saved |
|---|---|---|---|
| Playwright snapshot | 56.2 KB | 299 B | 99% |
| GitHub Issues ×20 | 58.9 KB | 1.1 KB | 98% |
| Access log, 500 requests | 45.1 KB | 155 B | 100% |
| Analytics CSV, 500 rows | 85.5 KB | 222 B | 100% |
| Git log, 153 commits | 11.6 KB | 107 B | 99% |
| Full session total | 315 KB | 5.4 KB | 98% |
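The Saved column is arithmetic you can recheck. This quick script (mine, not the project's, assuming 1 KB = 1024 B) reproduces the per-row percentages and shows that the 100% rows are rounding, not literally zero context cost:

```javascript
// Recompute the "Saved" column: saved = round((1 - context / raw) * 100).
// The 100% rows are rounding artifacts; some bytes always enter context.
const KB = 1024;
const rows = [
  ["Playwright snapshot", 56.2 * KB, 299],
  ["GitHub Issues ×20", 58.9 * KB, 1.1 * KB],
  ["Access log, 500 requests", 45.1 * KB, 155],
  ["Analytics CSV, 500 rows", 85.5 * KB, 222],
  ["Git log, 153 commits", 11.6 * KB, 107],
];
const saved = rows.map(([name, raw, ctx]) =>
  `${name}: ${Math.round((1 - ctx / raw) * 100)}%`);
console.log(saved.join("\n"));
```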

Session length stretches from ~30 minutes to ~3 hours.

Privacy

Nothing leaves your machine. No telemetry, no cloud sync, no usage tracking. SQLite stays entirely local and is gone when you're done.

Hit #1 on Hacker News (570+ points); teams at Microsoft, Google, Meta, Amazon, NVIDIA, ByteDance, Stripe, and others use it.


🔥 Context Mode: 14.6k-Star MCP Server for Context Optimization — 98% Token Savings Across 15 Platforms

Context Mode is an MCP server that solves one thing: keeping your AI coding agent's context window under control. A Playwright snapshot costs 56 KB. Twenty GitHub issues cost 59 KB. One access log — 45 KB. After 30 minutes, 40% of your context is gone. Context Mode's sandboxed tools compress 315 KB of raw output down to 5.4 KB — a 98% reduction.

Not filtering output — preventing raw data from ever entering the context window. Data is processed in isolated subprocesses; only summaries reach the model.

Four-Sided Solution

  1. Context Saving — Sandbox tools keep raw data out. 315 KB → 5.4 KB
  2. Session Continuity — Every event tracked in SQLite with FTS5 indexing. After compaction, BM25 search retrieves relevant state; the model picks up where it left off
  3. Think in Code — Write a script instead of reading 50 files. One ctx_execute() replaces ten tool calls, saves 100x context
  4. No prose-style enforcement — Routes data away from context without telling the model how to talk

Installation (Claude Code)

    /plugin marketplace add mksglu/context-mode
    /plugin install context-mode@context-mode
    
    # Verify
    /context-mode:ctx-doctor
    

Supports 15 platforms: Claude Code, Gemini CLI, VS Code Copilot, JetBrains Copilot, Cursor, OpenCode, KiloCode, OpenClaw, Codex CLI, Antigravity, Kiro, Zed, Pi, OMP, Qwen Code.

Benchmarks

| Scenario | Raw | Context | Saved |
|---|---|---|---|
| Playwright snapshot | 56.2 KB | 299 B | 99% |
| GitHub Issues (20) | 58.9 KB | 1.1 KB | 98% |
| Access log (500 req) | 45.1 KB | 155 B | 100% |
| Analytics CSV (500 rows) | 85.5 KB | 222 B | 100% |
| Git log (153 commits) | 11.6 KB | 107 B | 99% |
| Full session total | 315 KB | 5.4 KB | 98% |

Session time extends from ~30 min to ~3 hours.

Privacy

Nothing leaves your machine. No telemetry, no cloud sync, no account. SQLite lives in your home directory, dies when you're done.

Hacker News #1 (570+ points). Used across teams at Microsoft, Google, Meta, Amazon, NVIDIA, ByteDance, Stripe.

GitHub: mksglu/context-mode | Install: npm install -g context-mode

