欣淇
Published 2026-05-11

🔀 Langflow: a 147k-Star Visual AI Agent Orchestration Platform — Build Workflows by Drag-and-Drop, Then Serve Them as an MCP Server

Every time you want to build an AI workflow, it's the same routine: install LangChain, configure API keys, write dozens of lines of callback functions. Honestly, I've fallen into that trap too many times. Then I found Langflow, an AI agent orchestration platform that replaces code with drag-and-drop: 147k Stars, written in Python, with React Flow as its visualization engine, and support for all the major LLMs and vector databases.

Project stats: 147,926 Stars | Python | MIT License | 100k+ daily downloads | Windows/macOS Desktop apps

What it does

🔀 Visual drag-and-drop builder: No code required. Drop in an LLM node, drop in a Prompt node, wire them together, and it runs. Every component is written in Python, so you can open the source and tweak the logic anytime.

Built-in MCP Server: This is the killer feature. A workflow you build can be deployed directly as an MCP Server, so other AI agents can call your flow as a tool over the MCP protocol. Build a chain, get an API.
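To make that concrete, an MCP client points at the Langflow instance's MCP endpoint. The snippet below is a hypothetical client configuration sketch: the `mcpServers` key follows the common MCP client config shape, and the endpoint path is an assumption that varies by Langflow version, so check the docs for your release.

```json
{
  "mcpServers": {
    "langflow": {
      "url": "http://127.0.0.1:7860/api/v1/mcp/sse"
    }
  }
}
```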

🛠 Multi-agent orchestration: Conversation management across multiple agents plus retrieval-augmented generation. Not a single chain, but genuine multi-agent collaboration.

📦 One-click API deployment: Once a workflow is ready, one click deploys it as a REST API, or you can export it as JSON for use in a Python application.
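The JSON export is plain data, so a Python app can inspect it without Langflow installed. A minimal sketch, assuming the export follows a React-Flow-style nodes/edges shape (the sample flow and node types below are made up for illustration):

```python
import json

# Hypothetical exported flow: Langflow builds on React Flow, whose
# graphs are dicts of "nodes" and "edges" lists.
sample_flow = {
    "data": {
        "nodes": [
            {"id": "Prompt-1", "data": {"type": "Prompt"}},
            {"id": "Model-1", "data": {"type": "LanguageModel"}},
        ],
        "edges": [{"source": "Prompt-1", "target": "Model-1"}],
    }
}

def list_node_types(flow: dict) -> list[str]:
    # Walk the exported graph and collect each node's component type.
    return [node["data"]["type"] for node in flow["data"]["nodes"]]

print(list_node_types(sample_flow))  # ['Prompt', 'LanguageModel']
```

The same loop works on a file loaded with `json.load`, which is handy for auditing which components a shared flow pulls in before running it.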

Get started in 3 steps

Before installing Langflow you need Python 3.10–3.13 and uv (recommended):

# 1. 安装
uv pip install langflow -U

# 2. 启动
uv run langflow run

Open http://127.0.0.1:7860 in your browser and the visual editor is right there.
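Once the server is up, a flow deployed as a REST API can be called over plain HTTP. A minimal standard-library sketch: the `/api/v1/run/{flow_id}` path and payload field names follow Langflow's run endpoint, but the flow id here is a placeholder you'd replace with your own.

```python
import json
import urllib.request

LANGFLOW_URL = "http://127.0.0.1:7860"  # default local instance

def build_payload(message: str) -> dict:
    # Chat-style input for the run endpoint; field names per the run API schema.
    return {"input_value": message, "input_type": "chat", "output_type": "chat"}

def run_flow(flow_id: str, message: str, base_url: str = LANGFLOW_URL) -> dict:
    # POST the payload to the flow's run endpoint and parse the JSON reply.
    req = urllib.request.Request(
        f"{base_url}/api/v1/run/{flow_id}",
        data=json.dumps(build_payload(message)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

if __name__ == "__main__":
    # Requires a running Langflow instance and a real flow id.
    print(run_flow("your-flow-id", "Hello, Langflow"))
```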

Prefer Docker? One line:

docker run -p 7860:7860 langflowai/langflow:latest

Don't want to fuss with environments? There's an official Desktop edition: download the installer for Windows or macOS with all dependencies bundled.

Similar to ComfyUI, but aimed elsewhere

If you've used ComfyUI, Langflow will look familiar: both are node-based orchestration. But ComfyUI drives Stable Diffusion image-generation pipelines, while Langflow orchestrates LLM and agent logic. One draws pictures; the other gets work done.

Things to watch in practice

Langflow's MCP Server feature arrived after 2.0, so upgrade to the latest release if you're on an older version. Also, the UI can get sluggish once a flow has many nodes; split complex workflows into sub-flows.


Key takeaways:

  • Langflow turns AI workflow building into a drag-and-drop game, and 147k Stars says it's no niche toy
  • Built-in MCP Server support means a finished flow can be called directly by other agents
  • Docker, Desktop, and pip deployment options cover most environments
  • Great for rapid prototyping; for complex production scenarios, pair it with LangSmith for observability