60-Day AI Learning Plan | Day 40: Frontend AI SDK Abstraction (aiClient + hooks)

Day 40: Frontend AI SDK Abstraction (aiClient + hooks)

Learning Goals

  • Abstract a generic aiClient interface so callers never depend on backend implementation details
  • Wrap the common hooks: useChat (non-streaming) and useStreamingChat (streaming)
  • Lay the groundwork for quickly wiring AI into any future project

Core Concepts

  • The aiClient abstraction

    • Core idea: the frontend depends on one unified client instead of scattering fetch('/api/xxx') calls across the codebase
    • The interface can be defined as:
      interface AIClient {
        chat(payload: ChatRequest): Promise<ChatResponse>
        streamChat(payload: ChatRequest): AsyncIterable<ChatChunk>
      }

    • Concrete implementations can target different backends (your own service / OpenAI / another gateway)
  • The hooks layer

    • useChat(client): a one-shot request; suits short answers and non-streaming scenarios
    • useStreamingChat(client): reads the stream via AsyncIterable or SSE; suits chat UIs
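Because components only depend on the AIClient interface, the backend can be swapped out freely — including for a fake in unit tests. A minimal sketch of that idea (MockAIClient is our name for illustration, not part of the post's code; the types are inlined so the snippet is self-contained):

```typescript
// Minimal inline copies of the interface types, so this sketch stands alone.
interface ChatMessage { role: 'system' | 'user' | 'assistant'; content: string }
interface ChatRequest { messages: ChatMessage[] }
interface ChatResponse { answer: string }
interface ChatChunk { type: 'delta' | 'final' | 'error'; content?: string; error?: string }

interface AIClient {
  chat(req: ChatRequest): Promise<ChatResponse>
  streamChat(req: ChatRequest): AsyncIterable<ChatChunk>
}

// A mock implementation that echoes the last user message.
// Hooks and components built against AIClient work with it unchanged,
// which makes them testable without any backend.
class MockAIClient implements AIClient {
  async chat(req: ChatRequest): Promise<ChatResponse> {
    const last = req.messages[req.messages.length - 1]
    return { answer: `echo: ${last?.content ?? ''}` }
  }

  async *streamChat(req: ChatRequest): AsyncIterable<ChatChunk> {
    // Reuse chat() and re-emit the answer one character at a time.
    const { answer } = await this.chat(req)
    for (const ch of answer) yield { type: 'delta', content: ch }
    yield { type: 'final' }
  }
}
```

Swapping HttpAIClient for MockAIClient is then a one-line change at the injection point.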

Hands-On Assignments (complete code included)

Assignment 1: Define the AIClient interface + an HTTP implementation

// aiClient.ts
export interface ChatMessage {
  role: 'system' | 'user' | 'assistant'
  content: string
}

export interface ChatRequest {
  messages: ChatMessage[]
  meta?: Record<string, any>
}

export interface ChatResponse {
  answer: string
  usage?: {
    promptTokens?: number
    completionTokens?: number
    totalTokens?: number
  }
}

export interface ChatChunk {
  type: 'delta' | 'final' | 'error'
  content?: string
  error?: string
}

export interface AIClient {
  chat(req: ChatRequest): Promise<ChatResponse>
  streamChat(req: ChatRequest): AsyncIterable<ChatChunk>
}

// A simple HTTP-based implementation (assumes the backend exposes /api/chat and /api/chat/stream)
export class HttpAIClient implements AIClient {
  constructor(private baseUrl = '') {}

  async chat(req: ChatRequest): Promise<ChatResponse> {
    const res = await fetch(this.baseUrl + '/api/chat', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(req)
    })
    if (!res.ok) throw new Error(`HTTP ${res.status}`)
    const data = await res.json()
    return {
      answer: data.answer ?? '',
      usage: data.usage
    }
  }

  async *streamChat(req: ChatRequest): AsyncIterable<ChatChunk> {
    const res = await fetch(this.baseUrl + '/api/chat/stream', {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(req)
    })
    if (!res.body) throw new Error('No response body')
    const reader = res.body.getReader()
    const decoder = new TextDecoder()
    let done = false
    let buffer = ''

    while (!done) {
      const chunk = await reader.read()
      done = chunk.done
      if (chunk.value) {
        buffer += decoder.decode(chunk.value, { stream: true })
        const parts = buffer.split('\n\n')
        buffer = parts.pop() || ''
        for (const part of parts) {
          const line = part.trim()
          if (!line.startsWith('data:')) continue
          const jsonStr = line.slice(5).trim()
          if (!jsonStr || jsonStr === '[DONE]') continue
          let data: ChatChunk
          try {
            data = JSON.parse(jsonStr) as ChatChunk
          } catch {
            continue // skip malformed frames instead of killing the whole stream
          }
          yield data
        }
      }
    }
  }
}
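The trickiest part of streamChat is the frame handling: split the buffer on blank lines, and keep the trailing partial frame for the next network chunk. That logic can be exercised without a network. A standalone sketch of the same parsing (SSEBuffer is our name, not from the post):

```typescript
interface ChatChunk { type: 'delta' | 'final' | 'error'; content?: string; error?: string }

// Standalone version of the frame-splitting logic used in streamChat.
// feed() accepts arbitrary (possibly mid-frame) text chunks and returns
// only the events whose frames have fully arrived.
class SSEBuffer {
  private buffer = ''

  feed(text: string): ChatChunk[] {
    this.buffer += text
    const parts = this.buffer.split('\n\n')
    // The last element is an incomplete frame; keep it for the next feed().
    this.buffer = parts.pop() || ''
    const out: ChatChunk[] = []
    for (const part of parts) {
      const line = part.trim()
      if (!line.startsWith('data:')) continue
      const jsonStr = line.slice(5).trim()
      if (!jsonStr || jsonStr === '[DONE]') continue
      out.push(JSON.parse(jsonStr) as ChatChunk)
    }
    return out
  }
}
```

A frame split across two reads is emitted exactly once, after its closing blank line arrives — which is why the `parts.pop()` step matters.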

Assignment 2: useChat (non-streaming) hook

// useChat.ts
import { useState, useCallback } from 'react'
import type { AIClient, ChatMessage } from './aiClient'

interface UseChatOptions {
  client: AIClient
  initialMessages?: ChatMessage[]
}

export function useChat({ client, initialMessages = [] }: UseChatOptions) {
  const [messages, setMessages] = useState<ChatMessage[]>(initialMessages)
  const [loading, setLoading] = useState(false)
  const [error, setError] = useState<string | null>(null)

  const send = useCallback(
    async (content: string) => {
      const text = content.trim()
      if (!text || loading) return
      setError(null)

      const userMsg: ChatMessage = { role: 'user', content: text }
      const newMessages = [...messages, userMsg]
      setMessages(newMessages)
      setLoading(true)

      try {
        const res = await client.chat({ messages: newMessages })
        const aiMsg: ChatMessage = {
          role: 'assistant',
          content: res.answer
        }
        setMessages((prev) => [...prev, aiMsg])
      } catch (e: any) {
        setError(e?.message || 'Request failed')
      } finally {
        setLoading(false)
      }
    },
    [client, messages, loading]
  )

  return { messages, loading, error, send }
}

Assignment 3: useStreamingChat hook (built on streamChat)

// useStreamingChat.ts
import { useState, useCallback } from 'react'
import type { AIClient, ChatMessage } from './aiClient'

interface UseStreamingChatOptions {
  client: AIClient
  initialMessages?: ChatMessage[]
}

export function useStreamingChat({
  client,
  initialMessages = []
}: UseStreamingChatOptions) {
  const [messages, setMessages] = useState<ChatMessage[]>(initialMessages)
  const [loading, setLoading] = useState(false)
  const [error, setError] = useState<string | null>(null)

  const send = useCallback(
    async (content: string) => {
      const text = content.trim()
      if (!text || loading) return
      setError(null)

      const userMsg: ChatMessage = { role: 'user', content: text }
      const baseMessages = [...messages, userMsg]

      // Append the user message plus an empty assistant message that
      // will be filled in as deltas arrive.
      let currentAI: ChatMessage = { role: 'assistant', content: '' }
      setMessages([...baseMessages, currentAI])
      setLoading(true)

      try {
        // No AbortController on the fetch itself: the client's AsyncIterable
        // already wraps the underlying request.
        for await (const chunk of client.streamChat({ messages: baseMessages })) {
          if (chunk.type === 'delta' && chunk.content) {
            currentAI = {
              ...currentAI,
              content: currentAI.content + chunk.content
            }
            // Replace the trailing assistant message with the updated copy.
            setMessages((prev) => {
              const next = [...prev]
              next[next.length - 1] = currentAI
              return next
            })
          } else if (chunk.type === 'error') {
            throw new Error(chunk.error || 'Streaming error')
          }
        }
      } catch (e: any) {
        setError(e?.message || 'Streaming request failed')
      } finally {
        setLoading(false)
      }
    },
    [client, messages, loading]
  )

  return { messages, loading, error, send }
}
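The streaming hook has no way to cancel an in-flight response yet. One way to add that — sketched here as an assumption, since AbortSignal is not part of the AIClient interface defined above — is to thread an AbortSignal through the stream source and check it between chunks (in HttpAIClient the same signal would also be passed to fetch via its `signal` option):

```typescript
interface ChatChunk { type: 'delta' | 'final' | 'error'; content?: string; error?: string }

// Hypothetical abortable stream source: stops yielding as soon as the
// consumer's AbortController fires. A real client would wrap the SSE
// reader the same way.
async function* abortableStream(
  chunks: ChatChunk[],
  signal: AbortSignal
): AsyncIterable<ChatChunk> {
  for (const chunk of chunks) {
    if (signal.aborted) return // cancellation observed between chunks
    yield chunk
  }
}
```

The hook would hold the AbortController in a ref and expose a `stop()` that calls `controller.abort()`; the for-await loop then ends cleanly instead of throwing.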

Preview: Tomorrow's Plan (Day 41)

  • Topic: complex LangChain chains (Router / Parallel / Map-Reduce)
  • Direction
    • Use a Router Chain to route "Q&A / code / report" requests to different pipelines
    • The frontend only needs to know a single unified /smart-chat endpoint; the backend Chain decides the internal flow
posted @ 2025-12-17 10:42  XiaoZhengTou