
full-stack-langgraph-chat -- Streaming Response Client

langgraph

https://github.com/langchain-ai/langgraph/tree/main/examples

 

LangGraph is a library for building stateful, multi-actor applications with LLMs, used to create agent and multi-agent workflows. Check out an introductory tutorial here.

LangGraph is inspired by Pregel and Apache Beam. The public interface draws inspiration from NetworkX. LangGraph is built by LangChain Inc, the creators of LangChain, but can be used without LangChain.

Why use LangGraph?

LangGraph provides fine-grained control over both the flow and state of your agent applications. It implements a central persistence layer, enabling features that are common to most agent architectures:

  • Memory: LangGraph persists arbitrary aspects of your application's state, supporting memory of conversations and other updates within and across user interactions;
  • Human-in-the-loop: Because state is checkpointed, execution can be interrupted and resumed, allowing for decisions, validation, and corrections at key stages via human input (see the sketch after this list).
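Both of these rest on the checkpointer. A minimal sketch of thread-scoped memory, using langgraph's StateGraph, MessagesState, and MemorySaver (the echo node below stands in for a real LLM call):

from langgraph.checkpoint.memory import MemorySaver
from langgraph.graph import END, START, MessagesState, StateGraph


def chatbot(state: MessagesState):
    # A stand-in for an LLM call: echo the latest user message
    return {"messages": [("assistant", f"You said: {state['messages'][-1].content}")]}


builder = StateGraph(MessagesState)
builder.add_node("chatbot", chatbot)
builder.add_edge(START, "chatbot")
builder.add_edge("chatbot", END)

# The checkpointer persists state per thread_id, which is what enables
# cross-turn memory and interrupt/resume for human-in-the-loop flows
graph = builder.compile(checkpointer=MemorySaver())

config = {"configurable": {"thread_id": "1"}}
graph.invoke({"messages": [("user", "hello")]}, config)
# A second call with the same thread_id resumes from the saved conversation
graph.invoke({"messages": [("user", "what did I just say?")]}, config)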

Standardizing these components allows individuals and teams to focus on the behavior of their agent, instead of its supporting infrastructure.

Through LangGraph Platform, LangGraph also provides tooling for the development, deployment, debugging, and monitoring of your applications.

LangGraph integrates seamlessly with LangChain and LangSmith (but does not require them).

To learn more about LangGraph, check out our first LangChain Academy course, Introduction to LangGraph, available for free here.

full-stack-langgraph-chat

A streaming response client.

Full Stack LangGraph Example

This is an example app that uses LangGraph to create an AI agent. It is a simple application that uses React, Python/FastAPI, Postgres, and LLMs to analyze customer feedback on products for a fictional company called 'Acme'.

main.py

import logging
from contextlib import asynccontextmanager

from fastapi import FastAPI
from sqlalchemy import create_engine
from sqlalchemy.orm import sessionmaker

from app.core.database import Base
from app.core.settings import get_settings
from app.router.agent import router as agents_router
from app.router.auth import router as auth_router
from app.router.user import router as users_router
from app.scripts.seed_db import seed_users
from app.services.user_service import get_user_service

logging.basicConfig(level=logging.INFO)

settings = get_settings()
ENV = settings.env
DATABASE_URL = settings.database_url
# Build the engine and session factory from the configured database URL
engine = create_engine(DATABASE_URL)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


@asynccontextmanager
async def lifespan(app: FastAPI):
    # Create tables on startup
    Base.metadata.create_all(bind=engine)
    # Seed test users in dev, making sure the session is always closed
    if ENV == "dev":
        db = SessionLocal()
        try:
            user_service = get_user_service(db)
            seed_users(db=db, user_service=user_service)
            logging.info("Test users seeded.")
        finally:
            db.close()
    yield
    # Clean up resources on shutdown (nothing to do yet)


app = FastAPI(lifespan=lifespan)

app.include_router(router=agents_router)
app.include_router(router=users_router)
app.include_router(router=auth_router)


@app.get("/")
def read_root():
    return {"message": "Welcome to the FastAPI with LangGraph example"}


logging.info(f"API Running in {ENV} mode")
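Assuming the standard FastAPI layout (the app.main module path is an assumption about this repo), the server would be started with uvicorn app.main:app --reload.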

 

agent_router.py

import logging

from fastapi import APIRouter, Depends
from fastapi.responses import StreamingResponse

from app.schema.agent import AgentRequest
from app.services.agent_service import AgentService, get_agent_service

router = APIRouter(prefix="/agents", tags=["agents"])


@router.post("/stream")
async def stream(
    request: AgentRequest,
    agent_service: AgentService = Depends(get_agent_service),
) -> StreamingResponse:
    logging.info(request)
    # NOTE: the thread_id is hardcoded; a real app would derive one per conversation
    config = {"configurable": {"thread_id": "1"}}
    stream = agent_service.stream_agent_output(request.user_input, config)
    return StreamingResponse(stream, media_type="text/plain")
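Since the post is about the streaming client, here is a minimal sketch of consuming this endpoint with httpx (the localhost:8000 address is an assumption; the user_input field matches AgentRequest above):

import asyncio

import httpx


async def main() -> None:
    async with httpx.AsyncClient(timeout=None) as client:
        async with client.stream(
            "POST",
            "http://localhost:8000/agents/stream",  # assumed local dev address
            json={"user_input": "Summarize the feedback for product X"},
        ) as response:
            # Print each chunk as the server yields it
            async for chunk in response.aiter_text():
                print(chunk, end="", flush=True)


asyncio.run(main())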

 

 

agent_service.py

import logging

from fastapi import HTTPException
from langchain_core.messages import HumanMessage
from langgraph.graph.state import CompiledStateGraph

from app.agent.agent import get_agent


class AgentService:
    def __init__(self, agent: CompiledStateGraph) -> None:
        self.agent = agent

    async def stream_agent_output(self, user_input: str, config: dict[str, dict[str, str]]):
        try:
            inputs = {"messages": [HumanMessage(content=user_input)]}
            # stream_mode="updates" yields each node's state delta as it runs
            async for output in self.agent.astream(
                inputs, config=config, stream_mode="updates"
            ):
                for value in output.values():
                    # Yield text, not message objects: StreamingResponse expects str/bytes
                    yield value["messages"][-1].content
        except Exception as e:
            logging.exception(e)
            # Note: once streaming has begun, this can no longer change the HTTP status
            raise HTTPException(
                status_code=500, detail="Oops! An error occurred. Try again later."
            )


def get_agent_service() -> AgentService:
    return AgentService(get_agent())
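The post does not show app/agent/agent.py. A plausible minimal get_agent, sketched with langgraph's prebuilt ReAct agent (the gpt-4o-mini model and the empty tool list are assumptions, not the post's code):

from langchain_openai import ChatOpenAI
from langgraph.checkpoint.memory import MemorySaver
from langgraph.prebuilt import create_react_agent


def get_agent():
    # Any LangChain chat model works here; gpt-4o-mini is only an example
    model = ChatOpenAI(model="gpt-4o-mini")
    # The checkpointer is what makes the router's thread_id config meaningful
    return create_react_agent(model, tools=[], checkpointer=MemorySaver())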

 
