Environment:

pnpm: can be installed via npm with the command npm install -g pnpm

Python 3.8 or later (the project's pyproject.toml pins python = "^3.8")

  1. Install Poetry. Download the installer script: use Invoke-WebRequest (Windows PowerShell) to save the script to a local file.

Invoke-WebRequest -Uri https://install.python-poetry.org -OutFile install-poetry.py

Run the downloaded script with Python:

python install-poetry.py

Manually update the PATH environment variable

Follow these steps to add C:\Users\24558\AppData\Roaming\Python\Scripts to the system PATH:

  1. Press Win + R to open the Run dialog, type sysdm.cpl, and press Enter to open the System Properties window.
  2. In the System Properties window, switch to the Advanced tab and click the Environment Variables button.
  3. In the Environment Variables window, find the Path variable under System variables, select it, and click Edit.
  4. In the Edit Environment Variable window, click New and add C:\Users\24558\AppData\Roaming\Python\Scripts to the list.
  5. Click OK in each window to save the changes.
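The GUI steps above can also be scripted. A PowerShell sketch that appends the Scripts directory to the user-level PATH (this edits the per-user variable rather than the system one, which avoids needing an elevated prompt):

```powershell
# Append Poetry's Scripts directory to the user-level PATH
$scripts = "C:\Users\24558\AppData\Roaming\Python\Scripts"
$current = [Environment]::GetEnvironmentVariable("Path", "User")
if ($current -notlike "*$scripts*") {
    [Environment]::SetEnvironmentVariable("Path", "$current;$scripts", "User")
}
```

Open a new terminal afterwards (existing terminals keep the old PATH) and verify with poetry --version.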
  2. Clone the repository

git clone https://github.com/CopilotKit/CopilotKit.git

cd CopilotKit/examples/coagents-starter

  3. Enter the ui directory and install the frontend dependencies

cd ui

pnpm install

  4. Install the backend dependencies (Python agent)

cd ../agent-py

poetry install

  5. Run the backend service (Python agent): poetry run uvicorn sample_agent.demo:app --reload
  6. Run the frontend service

Open a new terminal window, change into the ui directory, and start the frontend service:

cd ui

pnpm run dev

Once both services are running, open http://localhost:3000 in your browser.
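Before starting the backend, the Python agent needs a DeepSeek API key; agent.py (shown later) reads it via os.getenv after load_dotenv(). A minimal .env sketch for the agent directory (the key value is a placeholder; PORT is optional since demo.py defaults to 8000; the exact file location is an assumption — load_dotenv() reads from the working directory you launch from):

```ini
# agent-py/.env  (placeholder value — substitute your own key)
DEEPSEEK_API_KEY=sk-xxxxxxxxxxxxxxxx
# Optional: port for uvicorn (demo.py falls back to 8000)
PORT=8000
```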

Key technical issues:

1. Modify and adjust demo.py (the FastAPI entry point of the Python agent):

"""

This serves the "sample_agent" agent.

"""

from langgraph.checkpoint.base import CheckpointMetadata

EXCLUDED_METADATA_KEYS = getattr(CheckpointMetadata, "EXCLUDED_KEYS", set())

import os

from dotenv import load_dotenv

load_dotenv()

from fastapi import FastAPI

import uvicorn

from copilotkit.integrations.fastapi import add_fastapi_endpoint

from copilotkit import CopilotKitRemoteEndpoint, LangGraphAgent

from sample_agent.agent import graph

from langgraph.checkpoint.memory import MemorySaver

app = FastAPI()

# 创建检查点实例

checkpointer = MemorySaver()

# 正确配置代理(通过graph配置检查点)

graph.checkpointer = checkpointer  # 直接在graph上设置检查点

sdk = CopilotKitRemoteEndpoint(

    agents=[

        LangGraphAgent(

            name="sample_agent",

            description="An example agent",

            graph=graph  # 只传递graph参数

        )

    ],

)

add_fastapi_endpoint(app, sdk, "/copilotkit")

def main():

    port = int(os.getenv("PORT", "8000"))

    uvicorn.run(

        "sample_agent.demo:app",

        host="0.0.0.0",

        port=port,

        reload=True,

    )

if __name__ == "__main__":

    main()

2. Modify and adjust agent.py (configure the agent to call the DeepSeek API):

"""

This is the main entry point for the agent.

It defines the workflow graph, state, tools, nodes and edges.

"""

from dotenv import load_dotenv

import os

from typing_extensions import Literal

from langchain_openai import ChatOpenAI  

from langchain_core.messages import SystemMessage, AIMessage

from langchain_core.runnables import RunnableConfig

from langchain.tools import tool

from langgraph.graph import StateGraph, END

from langgraph.types import Command

from langgraph.prebuilt import ToolNode

from copilotkit import CopilotKitState

load_dotenv()

class AgentState(CopilotKitState):

    """

    Here we define the state of the agent

    In this instance, we're inheriting from CopilotKitState, which will bring in

    the CopilotKitState fields. We're also adding a custom field, `language`,

    which will be used to set the language of the agent.

    """

    proverbs: list[str] = []

    # your_custom_agent_state: str = ""

@tool

def get_weather(location: str):

    """

    Get the weather for a given location.

    """

    return f"The weather for {location} is 70 degrees."

# @tool

# def your_tool_here(your_arg: str):

#     """Your tool description here."""

#     print(f"Your tool logic here")

#     return "Your tool response here."

tools = [

    get_weather

    # your_tool_here

]

async def chat_node(state: AgentState, config: RunnableConfig):

    """

    Standard chat node based on the ReAct design pattern. It handles:

    - The model to use (and binds in CopilotKit actions and the tools defined above)

    - The system prompt

    - Getting a response from the model

    - Handling tool calls

    For more about the ReAct design pattern, see:

    https://www.perplexity.ai/search/react-agents-NcXLQhreS0WDzpVaS4m9Cg

    """

    # 1. Define the model

    model = ChatOpenAI(

        openai_api_key=os.getenv("DEEPSEEK_API_KEY"),

        openai_api_base="https://api.deepseek.com/v1",

        model_name="deepseek-chat",

        temperature=0.7,

        max_tokens=2048,

   

    )

    # 2. Bind the tools to the model

    model_with_tools = model.bind_tools(

        [

            *state["copilotkit"]["actions"],

            get_weather,

            # your_tool_here

        ],

        # 2.1 Disable parallel tool calls to avoid race conditions,

        #     enable this for faster performance if you want to manage

        #     the complexity of running tool calls in parallel.

        parallel_tool_calls=False,

    )

    # 3. Define the system message by which the chat model will be run

    system_message = SystemMessage(

        content=f"You are a helpful assistant. Talk in {state.get('language', 'english')}."

    )

    # 4. Run the model to generate a response

    response = await model_with_tools.ainvoke([

        system_message,

        *state["messages"],

    ], config)

    # 5. Check for tool calls in the response and handle them. We ignore

    #    CopilotKit actions, as they are handled by CopilotKit.

    if isinstance(response, AIMessage) and response.tool_calls:

        actions = state["copilotkit"]["actions"]

        # 5.1 Check for any non-copilotkit actions in the response and

        #     if there are none, go to the tool node.

        if not any(

            action.get("name") == response.tool_calls[0].get("name")

            for action in actions

        ):

            return Command(goto="tool_node", update={"messages": response})

    # 6. We've handled all tool calls, so we can end the graph.

    return Command(

        goto=END,

        update={

            "messages": response

        }

    )

# Define the workflow graph

workflow = StateGraph(AgentState)

workflow.add_node("chat_node", chat_node)

workflow.add_node("tool_node", ToolNode(tools=tools))

workflow.add_edge("tool_node", "chat_node")

workflow.set_entry_point("chat_node")

# Compile the workflow graph

graph = workflow.compile()
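The tool-call routing in step 5 of chat_node is easy to get wrong, so here is a dependency-free sketch of the same decision in isolation (plain dicts stand in for LangChain messages; the function and variable names are illustrative, not part of any API):

```python
END = "__end__"  # stand-in for LangGraph's END sentinel in this sketch

def route_tool_call(copilotkit_actions, tool_calls):
    """Mirror chat_node's step 5: if the first tool call does NOT match any
    CopilotKit action, route to tool_node so the local tool runs; otherwise
    (or when there are no tool calls) end the graph and let CopilotKit
    handle the action on the frontend."""
    if tool_calls:
        first_name = tool_calls[0].get("name")
        if not any(a.get("name") == first_name for a in copilotkit_actions):
            return "tool_node"
    return END

actions = [{"name": "setThemeColor"}]  # example CopilotKit frontend action
print(route_tool_call(actions, [{"name": "get_weather"}]))    # tool_node
print(route_tool_call(actions, [{"name": "setThemeColor"}]))  # __end__
print(route_tool_call(actions, []))                           # __end__
```

Note that only the first tool call is inspected, which is why the model is bound with parallel_tool_calls=False above.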

3. langchain-openai dependency conflict:

Edit pyproject.toml to force a single, unified version constraint.

# Edit with Notepad (or another editor such as VS Code)
notepad pyproject.toml

[tool.poetry.dependencies]
python = "^3.8"
langchain-openai = ">=0.3.9,<0.4.0"  # make sure this is the only constraint

After saving, run poetry lock and then poetry install so the lockfile picks up the unified constraint.
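To see why ">=0.3.9,<0.4.0" resolves the conflict, note that it accepts every 0.3.x release from 0.3.9 upward but rejects 0.4.0. A small stdlib sketch of that range check (real resolvers implement the full PEP 440 rules; this handles plain numeric versions only):

```python
def parse(v):
    """Turn '0.3.12' into (0, 3, 12) so versions compare numerically."""
    return tuple(int(part) for part in v.split("."))

def in_range(v, low="0.3.9", high="0.4.0"):
    """True if low <= v < high, i.e. v satisfies '>=0.3.9,<0.4.0'."""
    return parse(low) <= parse(v) < parse(high)

print(in_range("0.3.9"))   # True
print(in_range("0.3.12"))  # True  (tuple compare, not string compare)
print(in_range("0.4.0"))   # False
```

The tuple comparison matters: as strings, "0.3.12" would sort before "0.3.9", which is exactly the kind of mistake that makes hand-rolled version checks misfire.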
