LangChain Routing 学习笔记

devtools/2024/10/16 2:30:45/


  • 0. Introduction
  • 1. Using a prompt
  • 2. Using RunnableLambda

0. Introduction

When building applications with large language models, a common scenario is calling (or routing to) different logic depending on the input, much like the if ... else ... branches we have always used in everyday programming.

There are several ways to implement routing; two simple ones are described below.
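Before involving an LLM at all, the idea can be sketched as ordinary if ... else ... dispatch in plain Python. The handler names here are purely illustrative; in the LangChain versions below, the LLM's job is only to produce the topic label:

```python
def handle_langchain(question):
    # Placeholder for LangChain-specific logic
    return f"[langchain handler] {question}"

def handle_anthropic(question):
    # Placeholder for Anthropic-specific logic
    return f"[anthropic handler] {question}"

def handle_other(question):
    # Fallback for everything else
    return f"[general handler] {question}"

def route(topic, question):
    # Routing itself is just branching on the topic label
    if "anthropic" in topic.lower():
        return handle_anthropic(question)
    elif "langchain" in topic.lower():
        return handle_langchain(question)
    else:
        return handle_other(question)
```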

1. Using a prompt

Put simply, this approach uses a prompt to have the LLM produce a specific output (here, a one-word topic label) for a given input.

Sample code:

from dotenv import load_dotenv, find_dotenv
_ = load_dotenv(find_dotenv())

from langchain_openai import ChatOpenAI
# from langchain_anthropic import ChatAnthropic
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate

# Classifier chain: maps a question to one of three topic labels
chain = (
    PromptTemplate.from_template(
        """Given the user question below, classify it as either being about `LangChain`, `Anthropic`, or `Other`.

Do not respond with more than one word.

<question>
{question}
</question>

Classification:"""
    )
    # | ChatAnthropic(model_name="claude-3-haiku-20240307")
    | ChatOpenAI(model="gpt-4", temperature=0)
    | StrOutputParser()
)

chain.invoke({"question": "how do I call Anthropic?"})

Output:

Anthropic
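Because of StrOutputParser, the chain returns a bare string. In practice the model's one-word reply can vary slightly (trailing period, different case), so it may be worth normalizing it before branching on it. `normalize_label` below is a hypothetical helper, not part of LangChain:

```python
def normalize_label(raw, labels=("LangChain", "Anthropic", "Other"), default="Other"):
    # Strip surrounding whitespace/punctuation and map the reply onto a known label
    cleaned = raw.strip().strip(".`'\"").lower()
    for label in labels:
        if label.lower() == cleaned:
            return label
    return default

normalize_label("  Anthropic. ")  # → "Anthropic"
```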

2. Using RunnableLambda

Sample code:

from dotenv import load_dotenv, find_dotenv
_ = load_dotenv(find_dotenv())

from langchain_openai import ChatOpenAI
# from langchain_anthropic import ChatAnthropic
from langchain_core.output_parsers import StrOutputParser
from langchain_core.prompts import PromptTemplate
from langchain_core.runnables import RunnableLambda

# Classifier chain: maps a question to one of three topic labels
chain = (
    PromptTemplate.from_template(
        """Given the user question below, classify it as either being about `LangChain`, `Anthropic`, or `Other`.

Do not respond with more than one word.

<question>
{question}
</question>

Classification:"""
    )
    # | ChatAnthropic(model_name="claude-3-haiku-20240307")
    | ChatOpenAI(model="gpt-4", temperature=0)
    | StrOutputParser()
)

langchain_chain = PromptTemplate.from_template(
    """You are an expert in langchain. \
Always answer questions starting with "As Harrison Chase told me". \
Respond to the following question:

Question: {question}
Answer:"""
    # ) | ChatAnthropic(model_name="claude-3-haiku-20240307")
) | ChatOpenAI(model="gpt-4", temperature=0)

anthropic_chain = PromptTemplate.from_template(
    """You are an expert in anthropic. \
Always answer questions starting with "As Dario Amodei told me". \
Respond to the following question:

Question: {question}
Answer:"""
    # ) | ChatAnthropic(model_name="claude-3-haiku-20240307")
) | ChatOpenAI(model="gpt-4", temperature=0)

general_chain = PromptTemplate.from_template(
    """Respond to the following question:

Question: {question}
Answer:"""
    # ) | ChatAnthropic(model_name="claude-3-haiku-20240307")
) | ChatOpenAI(model="gpt-4", temperature=0)

def route(info):
    # Select the downstream chain based on the classifier's one-word answer
    if "anthropic" in info["topic"].lower():
        return anthropic_chain
    elif "langchain" in info["topic"].lower():
        return langchain_chain
    else:
        return general_chain

full_chain = {
    "topic": chain,
    "question": lambda x: x["question"],
} | RunnableLambda(route)

full_chain.invoke({"question": "how do I use Anthropic?"})

Output:

AIMessage(content="As Dario Amodei, the co-founder of Anthropic, explained, using Anthropic's language models typically involves accessing their APIs to generate text or analyze inputs. While exact steps depend on the specific application and whether you're working with a public or private API, generally, you would:\n\n1. **Sign up for access**: Visit Anthropic's website and sign up for an account if they offer public access or reach out to them for partnership if their services are not publicly available.\n\n2. **Obtain an API key**: Once your account is set up, you'll receive an API key that authorizes your application to interact with their models.\n\n3. **Understand the API documentation**: Familiarize yourself with Anthropic's API documentation which outlines how to structure requests, what parameters are available, and how to interpret responses.\n\n4. **Make API calls**: Using a programming language of your choice (like Python), write code that constructs API requests. This usually involves specifying the prompt you want the model to respond to, the maximum length of the response, and other optional settings.\n\n5. **Process the response**: The API will return a response which is typically in JSON format. Your code should parse this response to extract the generated text or any other data provided.\n\n6. **Integrate into your application**: Depending on your use case, integrate the generated text or insights into your software, whether it's for chatbots, content generation, language translation, or analysis.\n\n7. 
**Respect usage guidelines and ethical considerations**: Always adhere to Anthropic's terms of service, be mindful of the ethical implications of using AI, and ensure you're handling user data responsibly.\n\nRemember that the specifics might change as Anthropic evolves its services, so always refer to their latest documentation for the most accurate instructions.", response_metadata={'token_usage': {'completion_tokens': 364, 'prompt_tokens': 47, 'total_tokens': 411}, 'model_name': 'gpt-4', 'system_fingerprint': 'fp_ollama', 'finish_reason': 'stop', 'logprobs': None}, id='run-636a9a22-389b-478f-8938-51309df9a3d1-0')

Sample code:

full_chain.invoke({"question": "how do I use LangChain?"})

Output:

AIMessage(content="As Harrison Chase explained, using LangChain involves several steps:\n\n1. **Understand Your Use Case**: First, determine what problem you want to solve or what task you aim to accomplish with LangChain. It's a framework designed to create powerful language models and AI applications.\n\n2. **Choose Components**: LangChain is modular, so you'll select the appropriate components for your use case. This might include LLMs (Large Language Models), vector databases, prompt engineering tools, and more.\n\n3. **Set Up Environment**: You need a development environment that supports the technologies used by LangChain, typically Python with libraries like Langchain, Hugging Face Transformers, or other necessary dependencies.\n\n4. **Integrate APIs**: If you're using external models or services, set up API keys and integrate them into your project.\n\n5. **Design Workflows**: Define how data will flow through the system, from input to processing by language models to output. This might involve creating chains of different components.\n\n6. **Write Code**: Implement your design using LangChain's APIs and modules. Start with simple scripts or move on to more complex applications as you become comfortable.\n\n7. **Test and Iterate**: Use sample inputs to test your setup, analyze the outputs, and refine your implementation based on the results.\n\n8. **Deploy and Monitor**: Once satisfied with the performance, deploy your application to a server or cloud platform. Continuously monitor its performance and make adjustments as needed.\n\nRemember, LangChain is about combining different AI components effectively, so it's crucial to have a clear understanding of each part you're using and how they interact. 
Always refer to the official documentation for the most up-to-date guidance and examples.", response_metadata={'token_usage': {'completion_tokens': 344, 'prompt_tokens': 44, 'total_tokens': 388}, 'model_name': 'gpt-4', 'system_fingerprint': 'fp_ollama', 'finish_reason': 'stop', 'logprobs': None}, id='run-abe4f2fd-9d7c-4f08-8e48-d32ff173d6e3-0')

Sample code:

full_chain.invoke({"question": "whats 2 + 2"})

Output:

AIMessage(content='4', response_metadata={'token_usage': {'completion_tokens': 2, 'prompt_tokens': 23, 'total_tokens': 25}, 'model_name': 'gpt-4', 'system_fingerprint': 'fp_ollama', 'finish_reason': 'stop', 'logprobs': None}, id='run-3c6d5a95-cac9-4dc8-a600-63180f655196-0')
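As the three invocations show, route() is plain Python, so any dispatch style works inside the RunnableLambda. An equivalent data-driven variant replaces the if/elif ladder with a lookup table; it is sketched here without the LLM pieces, using stand-in strings for the three chains:

```python
# Stand-ins for the three chains built earlier (illustrative only)
anthropic_chain = "anthropic_chain"
langchain_chain = "langchain_chain"
general_chain = "general_chain"

# Keyword -> chain lookup table; anything unmatched falls through to general_chain
TOPIC_CHAINS = {
    "anthropic": anthropic_chain,
    "langchain": langchain_chain,
}

def route(info):
    topic = info["topic"].lower()
    for keyword, chain in TOPIC_CHAINS.items():
        if keyword in topic:
            return chain
    return general_chain
```

If I recall the API correctly, langchain_core.runnables also provides a RunnableBranch for this pattern, but a plain function wrapped in RunnableLambda, as in this post, is the more direct approach.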

Done!


http://www.ppmy.cn/devtools/24028.html
