
assistant-ui


https://langgraph.com.cn/cloud/how-tos/use_stream_react/index.html

The useStream() React hook provides a seamless way to integrate LangGraph into your React applications. It handles all the complexities of streaming, state management, and branching logic, letting you focus on building great chat experiences.

Key features

  • Messages streaming: handle a stream of message chunks to form a complete message
  • Automatic state management for messages, interrupts, loading states, and errors
  • Conversation branching: create alternate conversation paths from any point in the chat history
  • UI-agnostic design: bring your own components and styling

Let's explore how to use useStream() in your React application.

useStream() provides a solid foundation for creating bespoke chat experiences. For pre-built chat components and interfaces, we also recommend checking out CopilotKit and assistant-ui.

 

https://docs.langchain.com/langgraph-platform/use-stream-react

How to integrate LangGraph into your React application

 
 

The useStream() React hook provides a seamless way to integrate LangGraph into your React applications. It handles all the complexities of streaming, state management, and branching logic, letting you focus on building great chat experiences. Key features:

  • Messages streaming: Handle a stream of message chunks to form a complete message
  • Automatic state management for messages, interrupts, loading states, and errors
  • Conversation branching: Create alternate conversation paths from any point in the chat history
  • UI-agnostic design: bring your own components and styling

Let's explore how to use useStream() in your React application. The useStream() hook provides a solid foundation for creating bespoke chat experiences. For pre-built chat components and interfaces, we also recommend checking out CopilotKit and assistant-ui.
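
As a minimal usage sketch (closely following the pattern shown in the docs above), the hook can drive a bare-bones chat component. The apiUrl and assistantId values below are placeholders: they assume a local LangGraph dev server on http://localhost:2024 with a graph registered as "agent", and should be replaced with your own.

"use client";

import { useStream } from "@langchain/langgraph-sdk/react";
import type { Message } from "@langchain/langgraph-sdk";

export default function Chat() {
  // Placeholders: point these at your own server and registered graph.
  const thread = useStream<{ messages: Message[] }>({
    apiUrl: "http://localhost:2024",
    assistantId: "agent",
    messagesKey: "messages",
  });

  return (
    <div>
      {/* Streamed messages; this sketch only renders plain string content. */}
      {thread.messages.map((message, idx) => (
        <div key={message.id ?? idx}>{String(message.content)}</div>
      ))}

      <form
        onSubmit={(e) => {
          e.preventDefault();
          const form = e.currentTarget;
          const text = new FormData(form).get("message") as string;
          form.reset();
          // submit() starts a streaming run; isLoading is true while chunks arrive.
          thread.submit({ messages: [{ type: "human", content: text }] });
        }}
      >
        <input type="text" name="message" />
        {thread.isLoading ? (
          <button type="button" onClick={() => thread.stop()}>
            Stop
          </button>
        ) : (
          <button type="submit">Send</button>
        )}
      </form>
    </div>
  );
}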

 

https://www.npmjs.com/package/@langchain/langgraph-sdk

To get started with the JS/TS SDK, install the package:

yarn add @langchain/langgraph-sdk

You will need a running LangGraph API server. If you're running a server locally using langgraph-cli, the SDK will automatically point at http://localhost:8123; otherwise, you will need to specify the server URL when creating a client.

import { Client } from "@langchain/langgraph-sdk";

const client = new Client();
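// By default this targets a local server; for a remote deployment, pass the
// URL (and, if needed, an API key) explicitly, for example:
// const client = new Client({ apiUrl: "https://<your-deployment-url>", apiKey: "<key>" });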

// List all assistants
const assistants = await client.assistants.search({
  metadata: null,
  offset: 0,
  limit: 10,
});

// We auto-create an assistant for each graph you register in config.
const agent = assistants[0];

// Start a new thread
const thread = await client.threads.create();

// Start a streaming run
const messages = [{ role: "human", content: "what's the weather in la" }];

const streamResponse = client.runs.stream(
  thread["thread_id"],
  agent["assistant_id"],
  {
    input: { messages },
  }
);

for await (const chunk of streamResponse) {
  console.log(chunk);
}

 

https://github.com/assistant-ui/assistant-ui

The UX of ChatGPT in your React app 💬🚀

assistant-ui is an open source TypeScript/React library to build production-grade AI chat experiences fast.

  • Handles streaming, auto-scrolling, accessibility, and real-time updates for you
  • Fully composable primitives inspired by shadcn/ui and cmdk — customize every pixel
  • Works with your stack: AI SDK, LangGraph, Mastra, or any custom backend
  • Broad model support out of the box (OpenAI, Anthropic, Mistral, Perplexity, AWS Bedrock, Azure, Google Gemini, Hugging Face, Fireworks, Cohere, Replicate, Ollama) with easy extension to custom APIs

Why assistant-ui

  • Fast to production: battle-tested primitives, built-in streaming and attachments
  • Designed for customization: composable pieces instead of a monolithic widget
  • Great DX: sensible defaults, keyboard shortcuts, a11y, and strong TypeScript
  • Enterprise-ready: optional chat history and analytics via Assistant Cloud
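
To make the "fully composable primitives" point concrete, here is a rough sketch of a thread assembled from the library's primitive components. It assumes an AssistantRuntimeProvider with a configured runtime already wraps this component higher in the tree, and it illustrates the composition style rather than a finished UI.

import {
  ThreadPrimitive,
  MessagePrimitive,
  ComposerPrimitive,
} from "@assistant-ui/react";

// One renderer reused for both user and assistant turns in this sketch.
const ChatMessage = () => (
  <MessagePrimitive.Root>
    <MessagePrimitive.Content />
  </MessagePrimitive.Root>
);

export const MinimalThread = () => (
  <ThreadPrimitive.Root>
    {/* The viewport handles auto-scrolling as new chunks stream in. */}
    <ThreadPrimitive.Viewport>
      <ThreadPrimitive.Messages
        components={{ UserMessage: ChatMessage, AssistantMessage: ChatMessage }}
      />
    </ThreadPrimitive.Viewport>

    {/* The composer wires the text input and send button to the runtime. */}
    <ComposerPrimitive.Root>
      <ComposerPrimitive.Input placeholder="Write a message..." />
      <ComposerPrimitive.Send>Send</ComposerPrimitive.Send>
    </ComposerPrimitive.Root>
  </ThreadPrimitive.Root>
);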

 

https://www.assistant-ui.com/docs/getting-started

UX of ChatGPT in your own app

assistant-ui is the TypeScript/React library for AI Chat.
Built on shadcn/ui and Tailwind.

About assistant-ui

assistant-ui helps you create beautiful, enterprise-grade AI chat interfaces in minutes. Whether you're building a ChatGPT clone, a customer support chatbot, an AI assistant, or a complex multi-agent application, assistant-ui provides the frontend primitive components and state management layers so you can focus on what makes your application unique.

Key Features

 

Instant Chat UI

Beautiful, customizable chat interfaces that are pre-built and work out of the box, making it easy to iterate quickly on your idea.

 

Chat State Management

Powerful state management for chat interactions, optimized for streaming responses and efficient rendering.

 

High Performance

Optimized for speed and efficiency with minimal bundle size, ensuring your AI chat interfaces remain responsive.

 

Framework Agnostic

Easily integrate with any backend system, whether using Vercel AI SDK, direct LLM connections, or custom solutions. Works with any React-based framework.
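
As a hedged sketch of the custom-backend path: useLocalRuntime accepts a ChatModelAdapter whose run() can call whatever backend you like, and AssistantRuntimeProvider makes that runtime available to the chat components. The /api/chat endpoint and its { text } response shape below are hypothetical stand-ins for your own API.

import type { ReactNode } from "react";
import {
  AssistantRuntimeProvider,
  useLocalRuntime,
  type ChatModelAdapter,
} from "@assistant-ui/react";

// Hypothetical backend: POST /api/chat returns { text: string }.
const MyAdapter: ChatModelAdapter = {
  async run({ messages, abortSignal }) {
    const res = await fetch("/api/chat", {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ messages }),
      signal: abortSignal, // lets the UI cancel an in-flight response
    });
    const data = await res.json();
    return { content: [{ type: "text", text: data.text }] };
  },
};

export function MyRuntimeProvider({ children }: { children: ReactNode }) {
  const runtime = useLocalRuntime(MyAdapter);
  return (
    <AssistantRuntimeProvider runtime={runtime}>
      {children}
    </AssistantRuntimeProvider>
  );
}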

 

 

https://github.com/assistant-ui/assistant-ui/tree/main/examples/with-langgraph

LangGraph Example

Hosted Demo

This example demonstrates how to use LangChain LangGraph with assistant-ui.

It is meant to be used with the backend found at LangGraph's Stockbroker example: https://github.com/bracesproul/langgraphjs-examples/tree/main/stockbroker

You need to set the following environment variables:

NEXT_PUBLIC_API_URL=https://stockbrokeragent-bracesprouls-projects.vercel.app/api
NEXT_PUBLIC_LANGGRAPH_ASSISTANT_ID=stockbroker
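
Roughly speaking, the frontend uses these variables to decide which backend to call and which assistant to run. A hypothetical sketch with the LangGraph JS SDK (the example's actual wiring may differ):

import { Client } from "@langchain/langgraph-sdk";

// Hypothetical wiring: the example's real code may structure this differently.
const client = new Client({ apiUrl: process.env.NEXT_PUBLIC_API_URL });
const assistantId = process.env.NEXT_PUBLIC_LANGGRAPH_ASSISTANT_ID!; // "stockbroker"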
 

To run the example, run the following commands:

npm install
npm run dev
 

 

LangGraph examples:

https://github.com/fanqingsong/assistant-ui-langgraph-fastapi

https://github.com/fanqingsong/assistant-ui-langgraph-interrupt/tree/main/backend

https://github.com/fanqingsong/assistant-ui-stockbroker

Stockbroker Human in the Loop

The code for the Stockbroker Human in the Loop video can be found in this directory. It's set up as a monorepo-style project, with frontend and backend directories. The frontend directory contains a Next.js application which allows you to interact with the Stockbroker agent via a chat interface. The backend contains a LangGraph agent which powers the core functionality of the stockbroker.
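
For orientation, the human-in-the-loop flow on the LangGraph side typically means pausing a run before a sensitive node, letting the user review or edit the thread state, and then resuming. Below is a hedged sketch with the JS SDK; the purchase_stock node name and the approved field are made up for illustration, and the repo's own code is the authoritative version.

import { Client } from "@langchain/langgraph-sdk";

const client = new Client({ apiUrl: process.env.NEXT_PUBLIC_API_URL });
const assistantId = "stockbroker";
const thread = await client.threads.create();

// Pause the run before a sensitive node (the node name here is hypothetical).
const stream = client.runs.stream(thread.thread_id, assistantId, {
  input: { messages: [{ role: "human", content: "buy 10 shares of AAPL" }] },
  interruptBefore: ["purchase_stock"],
});
for await (const chunk of stream) console.log(chunk);

// Let the human review and, if desired, edit the paused thread state.
await client.threads.updateState(thread.thread_id, {
  values: { approved: true }, // hypothetical state field
});

// Resume the interrupted run by passing a null input.
const resumed = client.runs.stream(thread.thread_id, assistantId, { input: null });
for await (const chunk of resumed) console.log(chunk);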

https://github.com/fanqingsong/open-canvas?tab=readme-ov-file

 

posted @ 2025-10-06 12:06  lightsong