[Paper Notes] A Prompt Pattern Catalog to Enhance Prompt Engineering with ChatGPT

Introduction : A well-crafted prompt can noticeably improve an LLM's output; prompting can be engineered in much the same way as software; the paper's main contribution is a catalog of prompt patterns for prompt engineering.

Comparing software patterns with prompt patterns : The framework the paper proposes for structuring prompts helps users communicate with an LLM more effectively. Prompt patterns are documented in analogy with software design patterns; the two share the following structure:

| | Software pattern | Prompt pattern |
| --- | --- | --- |
| A name and classification | Identifies which pattern it is and classifies it | Identifies which pattern it is and classifies it |
| The intent | Describes the intent of the design pattern | Describes the problem the prompt solves, i.e. its purpose |
| The motivation | Describes the underlying problem the pattern targets and why it matters | Explains the motivation to the LLM |
| The structure and participants / key ideas | Describes the participants of the pattern and how they collaborate | Describes a set of fundamental information, e.g. the key ideas |
| Example code / implementation | Sample code | Gives the LLM an example |
| Consequences | Summary | Summary |

The same analysis framework is used later when the individual patterns are introduced. In software design, structure is usually documented with UML class diagrams; in prompt engineering one could in principle define a grammar for prompts, although doing so brings its own challenges, so the patterns are instead described through key ideas and example prompts.
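As an illustration only (my own wording, not copied from the paper), the documentation skeleton above could be filled in for the meta language creation pattern roughly like this:

# Illustrative sketch of the documentation skeleton applied to one pattern.
# The field names mirror the table above; the wording is mine, not the paper's.
meta_language_creation = {
    "name_and_classification": "Meta Language Creation (input semantics)",
    "intent": "Let the user define a custom notation that the LLM can interpret",
    "motivation": "Custom shorthand (e.g. graph edges written as A->B) is ambiguous "
                  "unless its semantics are explained to the LLM up front",
    "key_ideas": [
        "When I type X, I mean Y",
        "Give one worked example of the notation",
    ],
    "example": 'When I type two identifiers separated by "->", I am describing a graph traversal.',
    "consequences": "The LLM can answer questions phrased in the custom notation",
}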

A catalog of prompt patterns for conversational LLMs : The paper groups the patterns into the catalog below:

| Pattern category | Prompt pattern | Description |
| --- | --- | --- |
| Input semantics | Meta Language Creation | How the LLM understands and translates the input |
| Output customization | Output Automater, Persona, Visualization Generator, Recipe, Template | Focuses on the type and format of the output |
| Error identification | Fact Check List, Reflection | Identifies and resolves errors in the output |
| Prompt improvement | Question Refinement, Alternative Approaches, Cognitive Verifier, Refusal Breaker | Improves the quality of questions and answers |
| Interaction | Flipped Interaction, Game Play, Infinite Generation | Changes the style of interaction between user and LLM |
| Context control | Context Manager | Controls the context the LLM operates on |

The meta language creation pattern : define a custom notation and explain its semantics to the LLM before using it

Experiment tool : iFlytek Spark (讯飞星火)
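The original post only shows the messages lists and the printed results; the setup below is my reconstruction of the surrounding LangChain code (class and parameter names are assumptions and may differ across LangChain versions):

# Assumed setup (not shown in the original post): LangChain's iFlytek Spark wrapper.
from langchain_community.chat_models import ChatSparkLLM
from langchain_core.messages import ChatMessage

chat = ChatSparkLLM(
    spark_app_id="...",       # credentials from the iFlytek open platform
    spark_api_key="...",
    spark_api_secret="...",
)

# Each experiment builds a `messages` list and then runs:
#     result = chat.generate([messages])
#     print(result)
# which produces the LLMResult lines quoted below.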

# Teach the model a small graph-traversal meta language, then ask it to interpret the notation.
messages = [
    ChatMessage(
        role='system',
        content='When I type two identifiers separated by "->", I am describing a graph traversal. For example, "A->B" means "from A, go to B".',
    ),
    ChatMessage(
        role='user',
        content='What does "A->B->C" mean?',
    ),
]
generations=[[ChatGeneration(text='"A->B->C" means "from A, go to B, and then go to C".', message=AIMessage(content='"A->B->C" means "from A, go to B, and then go to C".'))]] llm_output={'token_usage': {'question_tokens': 11, 'prompt_tokens': 50, 'completion_tokens': 23, 'total_tokens': 73}} run=[RunInfo(run_id=UUID('bd98c7b1-26c9-4c7d-96ab-1dffd4ef18be'))]

Here the model seems to have done nothing more than translate the notation literally.

# Same meta language, but now a question that requires reasoning over the notation.
messages = [
    ChatMessage(
        role='system',
        content='When I type two identifiers separated by "->", I am describing a graph traversal. For example, "A->B" means "from A, go to B".',
    ),
    ChatMessage(
        role='user',
        content='There is "A->B->C" and "A->D->E->C". What is the shortest path from A to C? Why?',
    ),
]
generations=[[ChatGeneration(text='The shortest path from A to C is "A->B->C". \n\nThis is because it only involves two edges, whereas the other path "A->D->E->C" involves three edges. In graph theory, the shortest path between two vertices is the one with the minimum number of edges.', message=AIMessage(content='The shortest path from A to C is "A->B->C". \n\nThis is because it only involves two edges, whereas the other path "A->D->E->C" involves three edges. In graph theory, the shortest path between two vertices is the one with the minimum number of edges.'))]] llm_output={'token_usage': {'question_tokens': 33, 'prompt_tokens': 72, 'completion_tokens': 68, 'total_tokens': 140}} run=[RunInfo(run_id=UUID('40c21154-9949-4109-bbd5-826baf7a713d'))]

The output automater pattern : have the LLM emit, alongside its answer, an artifact (here, the HTML for each step) that automates the steps it describes

# System (translated): 'When the output contains one or more steps, also output the HTML code for each step.'
# User (translated): 'Create a web page with a holy grail layout.'
messages = [
    ChatMessage(
        role='system',
        content='当输出包含一个或多个步骤时,输出每个步骤对应的html代码',
    ),
    ChatMessage(
        role='user',
        content='创建一个圣杯布局的网页',
    ),
]

The reply is too long to include here.
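As a follow-up sketch of my own (not part of the original experiment), the per-step HTML could be pulled out of the reply and written to files, which is the kind of automation this pattern aims at; it assumes the model wraps its code in ```html fenced blocks, which may not hold for every reply:

# My own sketch (not in the original post): extract the per-step HTML from the
# reply and write each snippet to a file. Assumes ```html fenced code blocks.
import re

result = chat.generate([messages])
reply = result.generations[0][0].text

for i, snippet in enumerate(re.findall(r"```html\s*(.*?)```", reply, re.S), start=1):
    with open(f"step_{i}.html", "w", encoding="utf-8") as f:
        f.write(snippet)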

The flipped interaction pattern : have the LLM ask the user questions instead of the other way around
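The original notes stop here; the prompt below is only my own sketch of what a flipped-interaction experiment in the same setup could look like (the wording is illustrative, not taken from the paper or the post):

# My own sketch of a flipped-interaction prompt; wording is illustrative only.
messages = [
    ChatMessage(
        role='system',
        content='From now on, ask me one question at a time until you have enough '
                'information to recommend a page layout, then output your recommendation.',
    ),
    ChatMessage(
        role='user',
        content='Please start asking your questions.',
    ),
]
# result = chat.generate([messages])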
