欣淇
Published 2026-05-11

🦀 ZeroClaw: a 31k-Star Autonomous AI Assistant Runtime in Rust, One Binary for Every Platform

Project: zeroclaw-labs/zeroclaw | ⭐ 31,224 | 🛠 Rust | By: ZeroClaw Labs


Honestly, there are an absurd number of AI agent frameworks out there. Python has CrewAI and LangChain, Node has ElizaOS, and every one of them wants a pile of dependencies and a stack of environment variables. The pitfalls are painful: version conflicts, some package that breaks on Python 3.11, a pip install that grinds for ten minutes and then crashes. The most absurd part is that all you wanted was an AI assistant you control, not a microservice architecture.

ZeroClaw takes a different approach: a single Rust binary that runs straight from the download, with all configuration in ~/.zeroclaw/config.toml and no runtime dependencies.

1. Install: one command

curl -fsSL https://raw.githubusercontent.com/zeroclaw-labs/zeroclaw/master/install.sh | bash

Prefer to build it yourself?

git clone https://github.com/zeroclaw-labs/zeroclaw.git
cd zeroclaw
./install.sh

After installation, zeroclaw onboard launches the setup wizard automatically.

A few useful flags:

./install.sh --prebuilt        # skip the build prompt, download the binary directly
./install.sh --source          # force a source build
./install.sh --minimal         # kernel only (~6.6 MB)
./install.sh --skip-onboard    # install only, configure manually later

2. Startup: three steps

zeroclaw onboard                  # wizard: pick a model provider + set up channels
zeroclaw agent                    # chat directly in the terminal
zeroclaw service install          # register as a system service (systemd/launchctl/Windows Service)

Once configured, your AI assistant is up and running, with support for 30+ communication channels (Discord, Telegram, Matrix, email, webhooks, CLI) and 20+ model providers (Anthropic, OpenAI, Ollama, local models).

3. Configuration: one TOML file

# ~/.zeroclaw/config.toml
default_provider = "openai-codex"
default_model = "gpt-5-codex"

Want to add a local Ollama model?

[providers.models.ollama]
api_url = "http://localhost:11434/v1"
api_key = ""
default_model = "llama-4"
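The api_url above points at Ollama's OpenAI-compatible endpoint, so anything that speaks the OpenAI chat API can talk to it directly. A minimal Python sketch of such a request, kept outside ZeroClaw for illustration (the model name "llama-4" just mirrors the config above; the request only fires if you actually run the script with Ollama listening locally):

```python
import json
import urllib.request

def build_chat_request(model, prompt, base_url="http://localhost:11434/v1"):
    """Build an OpenAI-style chat completion request (no network I/O here)."""
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )

if __name__ == "__main__":
    # Requires a running Ollama instance on localhost:11434.
    req = build_chat_request("llama-4", "Hello!")
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp)["choices"][0]["message"]["content"])
```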

Each communication channel gets its own [channels.] block. Want Telegram? Add a [channels.telegram] block with your bot token and you're done.
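Since the post doesn't show the channel block itself, here is a sketch of what it could look like. Only the [channels.telegram] table name and the need for a token come from the text; the key name is an assumption, so check the project docs for the real schema:

```toml
# ~/.zeroclaw/config.toml (sketch; key name assumed)
[channels.telegram]
token = "123456:ABC-your-bot-token"
```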

4. Why ZeroClaw?

  • Security-first: supervised mode by default, so high-risk operations need your approval. Sandboxing via Landlock/Bubblewrap/Seatbelt.
  • Hardware-native: GPIO / I2C / SPI / USB support, so a Raspberry Pi can drive peripherals directly.
  • SOP engine: event-triggered standard operating procedures (MQTT/webhook/cron/peripheral events) with approval flows and resumable execution.
  • ACP protocol: IDE/editor integration via JSON-RPC 2.0 over stdio.
  • Best of all, the entire runtime is one Rust binary: no npm node_modules, no Python virtualenv, no Docker. Once it's configured, it's yours.
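The ACP bullet is concrete enough to sketch: JSON-RPC 2.0 over stdio means newline-delimited JSON objects written to the process's stdin and read back from its stdout. A minimal Python sketch of that framing, independent of ZeroClaw (the method name "session/new" is a placeholder, not a documented ZeroClaw method):

```python
import json

def make_request(req_id, method, params):
    """Serialize a JSON-RPC 2.0 request as one newline-delimited line."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    }) + "\n"

def parse_response(line):
    """Parse one JSON-RPC 2.0 response line; raise on a protocol error."""
    msg = json.loads(line)
    if msg.get("jsonrpc") != "2.0":
        raise ValueError("not a JSON-RPC 2.0 message")
    if "error" in msg:
        raise RuntimeError(msg["error"]["message"])
    return msg["result"]
```

In a real integration, the editor would write make_request output to the agent's stdin and feed each stdout line to parse_response.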

Summary

  • One Rust binary, runs on every platform, zero runtime dependencies
  • One-command install, three-step startup
  • 30+ communication channels + 20+ model providers, plug-and-play TOML configuration
  • Security sandbox + hardware support + SOP automation, from personal assistant to edge devices