Using Few-Shot Examples
This section covers how to use few-shot examples with chat models (Chat Models). There is no clear consensus yet on the best way to do few-shot prompting for chat models, so no dedicated abstraction has been settled on; instead, the existing abstractions are used.
Alternating AI/Human Messages
The first way to do few-shot prompting is with alternating AI/Human messages. Here is an example:
from langchain.chat_models import ChatOpenAI
from langchain import PromptTemplate, LLMChain
from langchain.prompts.chat import (
    ChatPromptTemplate,
    SystemMessagePromptTemplate,
    AIMessagePromptTemplate,
    HumanMessagePromptTemplate,
)
from langchain.schema import AIMessage, HumanMessage, SystemMessage

chat = ChatOpenAI(temperature=0)
template = "You are a helpful assistant that translates english to pirate."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
example_human = HumanMessagePromptTemplate.from_template("Hi")
example_ai = AIMessagePromptTemplate.from_template("Argh me mateys")
human_template = "{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)

chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, example_human, example_ai, human_message_prompt])
chain = LLMChain(llm=chat, prompt=chat_prompt)
# get a chat completion from the formatted messages
chain.run("I love programming.")
Output:
"I be lovin' programmin', me hearty!"
System Messages
OpenAI provides an optional name parameter, which is also recommended for use together with system messages when doing few-shot prompting. Here is an example of how to use this feature:
template="You are a helpful assistant that translates english to pirate."
system_message_prompt = SystemMessagePromptTemplate.from_template(template)
example_human = SystemMessagePromptTemplate.from_template("Hi", additional_kwargs={"name": "example_user"})
example_ai = SystemMessagePromptTemplate.from_template("Argh me mateys", additional_kwargs={"name": "example_assistant"})
human_template="{text}"
human_message_prompt = HumanMessagePromptTemplate.from_template(human_template)chat_prompt = ChatPromptTemplate.from_messages([system_message_prompt, example_human, example_ai, human_message_prompt])
chain = LLMChain(llm=chat, prompt=chat_prompt)# 从格式化的消息中获取聊天完成结果
chain.run("I love programming.")
Output:
"I be lovin' programmin', me hearty!"
Streaming Responses
This section covers how to use streaming with chat models:
from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

chat = ChatOpenAI(streaming=True, callbacks=[StreamingStdOutCallbackHandler()], temperature=0)
resp = chat([HumanMessage(content="Write me a song about sparkling water.")])
Output:
Verse 1:
Bubbles rising to the top
A refreshing drink that never stops
Clear and crisp, it's pure delight
A taste that's sure to excite

Chorus:
Sparkling water, oh so fine
A drink that's always on my mind
With every sip, I feel alive
Sparkling water, you're my vibe

Verse 2:
No sugar, no calories, just pure bliss
A drink that's hard to resist
It's the perfect way to quench my thirst
A drink that always comes first

Chorus:
Sparkling water, oh so fine
A drink that's always on my mind
With every sip, I feel alive
Sparkling water, you're my vibe

Bridge:
From the mountains to the sea
Sparkling water, you're the key
To a healthy life, a happy soul
A drink that makes me feel whole

Chorus:
Sparkling water, oh so fine
A drink that's always on my mind
With every sip, I feel alive
Sparkling water, you're my vibe

Outro:
Sparkling water, you're the one
A drink that's always so much fun
I'll never let you go, my friend
Sparkling
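StreamingStdOutCallbackHandler simply prints each token to stdout as it arrives. If you need to handle tokens differently, you can supply your own callback handler. The sketch below uses a hypothetical CollectTokensHandler and assumes the legacy BaseCallbackHandler interface with an on_llm_new_token hook; it collects the streamed tokens in a list instead of printing them.

from langchain.chat_models import ChatOpenAI
from langchain.schema import HumanMessage
from langchain.callbacks.base import BaseCallbackHandler

# Minimal sketch (hypothetical handler name, assumes the legacy callback interface):
# collect streamed tokens instead of printing them like StreamingStdOutCallbackHandler.
class CollectTokensHandler(BaseCallbackHandler):
    def __init__(self):
        self.tokens = []

    def on_llm_new_token(self, token: str, **kwargs) -> None:
        # Called once for every new token streamed from the model.
        self.tokens.append(token)

handler = CollectTokensHandler()
chat = ChatOpenAI(streaming=True, callbacks=[handler], temperature=0)
resp = chat([HumanMessage(content="Write me a song about sparkling water.")])
print("".join(handler.tokens))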