OpenAI Development Guide: A First Q&A Chatbot Demo with the GPT API


Table of Contents

Overview

Python Code

C++ Code

WORKSPACE File

BUILD File

Java Code

Maven File

Execution Results


Overview

Using the OpenAI API, we implement a looping GPT question-and-answer session and record the full dialogue to a file.
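The core loop is the same in every language below: read a question, get an answer, collect the pair, and dump everything to a timestamped file on exit. Here is a minimal sketch of that skeleton, with a stub `get_answer` (hypothetical, for illustration) standing in for the real API call:

```python
import time
from pathlib import Path

def get_answer(question):
    # Stub standing in for the real OpenAI API call.
    return "echo: " + question

def run_dialogue(questions):
    """Answer each question and save the Q/A pairs to a timestamped markdown file."""
    pairs = [(q, get_answer(q)) for q in questions]
    timestamp = time.strftime("%Y%m%d-%H%M-%S", time.localtime())
    file_name = Path("output") / f"Chat_{timestamp}.md"
    file_name.parent.mkdir(parents=True, exist_ok=True)
    with open(file_name, "w", encoding="utf-8") as f:
        for q, a in pairs:
            f.write(f"You: {q}\nGPT-3.5-turbo: {a}\n\n")
    return file_name

saved = run_dialogue(["hello", "how are you"])
```

The three implementations that follow flesh out this skeleton with a real HTTP call and interactive input.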

Python Code

# -*- coding: utf-8 -*-
import time
from pathlib import Path

import openai

# Set the OpenAI API key
openai.api_key = ""


class ChatBot:
    def __init__(self, model):
        self.user = "\nYou: "
        self.bot = "GPT-3.5-turbo: "
        self.model = model
        self.question_list = []
        self.answer_list = []
        self.text = ''
        self.turns = []
        self.last_result = ''

    def save_dialogue(self):
        # Generate a timestamp and create a filename for saving the dialogue
        timestamp = time.strftime("%Y%m%d-%H%M-%S", time.localtime())  # Timestamp
        file_name = 'output/Chat_' + timestamp + '.md'  # Filename
        path = Path(file_name)
        path.parent.mkdir(parents=True, exist_ok=True)
        # Save the dialogue to the file
        with open(file_name, "w", encoding="utf-8") as f:
            for q, a in zip(self.question_list, self.answer_list):
                f.write(f"You: {q}\nGPT-3.5-turbo: {a}\n\n")
        print("Dialogue content saved to file: " + file_name)

    def generate(self):
        print('\nStart your conversation, type "exit" to end.')
        while True:
            question = input(self.user)
            if question == 'exit':
                break
            # Record the question only after the exit check, so "exit" itself
            # never ends up in the transcript.
            self.question_list.append(question)
            prompt = self.bot + self.text + self.user + question
            try:
                response = openai.ChatCompletion.create(
                    model=self.model,
                    messages=[{"role": "user", "content": prompt}],
                )
                result = response["choices"][0]["message"]["content"].strip()
                print(result)
                self.answer_list.append(result)
                self.last_result = result
                # Keep at most the last 10 turns as context for the next prompt
                self.turns += [question] + [result]
                if len(self.turns) <= 10:
                    self.text = " ".join(self.turns)
                else:
                    self.text = " ".join(self.turns[-10:])
            except Exception as exc:
                print(exc)
        self.save_dialogue()


if __name__ == '__main__':
    bot = ChatBot('gpt-3.5-turbo')
    bot.generate()

This code implements a chatbot that converses with the GPT-3.5-turbo model. The execution flow is:

  1. First, we import the required libraries: openai, time, and pathlib.
  2. We set the OpenAI API key so that we can call the GPT-3.5-turbo model.
  3. We define a class named ChatBot that handles the interaction with the model.
  4. The class's __init__ method initializes the chatbot, setting the user and bot prompts (self.user and self.bot) along with other state such as the question list and the answer list.
  5. The save_dialogue method writes the chat history to a file. It first builds a timestamped filename, then writes the contents of the question and answer lists to that file.
  6. The generate method drives the interaction with the model. In a loop, it reads the user's question and appends it to the question list; it then builds the prompt, calls the GPT-3.5-turbo model through the OpenAI API, prints the answer, and appends it to the answer list. The loop runs until the user types "exit", at which point save_dialogue is called to persist the conversation.
  7. In the main block (if __name__ == '__main__'), we create a ChatBot instance and call generate to start the conversation.
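Step 6's context handling is worth isolating: the bot keeps at most the last 10 turns (questions and answers interleaved) and joins them into the next prompt, so old exchanges silently fall out of the window. A standalone sketch of that windowing logic, mirroring the `self.turns`/`self.text` fields:

```python
def build_context(turns, window=10):
    """Join the most recent `window` turns into a single context string."""
    recent = turns if len(turns) <= window else turns[-window:]
    return " ".join(recent)

turns = []
for i in range(8):
    turns += [f"q{i}", f"a{i}"]  # each exchange adds a question and an answer

# With 16 turns and a window of 10, only q3/a3 .. q7/a7 survive.
context = build_context(turns)
```

Because each exchange contributes two turns, a window of 10 means the model sees roughly the last five exchanges.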

C++ Code

// main.cpp
#include <cpprest/http_client.h>
#include <cpprest/filestream.h>
#include <cpprest/json.h>
#include <iostream>
#include <fstream>
#include <sstream>
#include <string>
#include <vector>
#include <ctime>
#include <iomanip>

using namespace utility;
using namespace web;
using namespace web::http;
using namespace web::http::client;
using namespace concurrency::streams;

namespace {
const std::string api_key = "";
}

class ChatBot {
 public:
  /**
   * @brief Construct a new ChatBot object
   *
   * @param model The OpenAI GPT model to use (e.g., "gpt-3.5-turbo")
   */
  explicit ChatBot(std::string model) : model_(std::move(model)) {}

  /**
   * @brief Save the dialogue content to a Markdown file
   *
   * @note This function saves the generated dialogue to a Markdown file named
   *       "Chat_timestamp.md" in the "output" directory. Unlike the Python
   *       version, it does not create the directory, so "output/" must exist.
   */
  void SaveDialogue() {
    auto timestamp = std::time(nullptr);
    std::stringstream ss;
    ss << "output/Chat_"
       << std::put_time(std::localtime(&timestamp), "%Y%m%d-%H%M-%S") << ".md";
    std::string file_name = ss.str();
    std::ofstream out(file_name);
    if (!out) {
      std::cerr << "Could not create the output file.\n";
      return;
    }
    for (size_t i = 0; i < question_list_.size(); ++i) {
      out << "You: " << question_list_[i]
          << "\nGPT-3.5-turbo: " << answer_list_[i] << "\n\n";
    }
    out.close();
    std::cout << "Dialogue content saved to file: " << file_name << std::endl;
  }

  /**
   * @brief Generate answers to the user's questions using the GPT model
   *
   * @note This function prompts the user for their questions, generates answers
   *       using the GPT model, and saves the dialogue to a file when the user exits.
   */
  void Generate() {
    std::string question;
    std::cout << "\nStart your conversation, type \"exit\" to end." << std::endl;
    while (true) {
      std::cout << "\nYou: ";
      std::getline(std::cin, question);
      if (question == "exit") {
        break;
      }
      question_list_.push_back(question);
      auto answer = GetAnswer(question);
      if (!answer.empty()) {
        std::cout << "GPT-3.5-turbo: " << answer << std::endl;
        answer_list_.push_back(answer);
      }
    }
    SaveDialogue();
  }

 private:
  /**
   * @brief Get the GPT model's answer to the given question
   *
   * @param question The user's question
   * @return The GPT model's answer as a string
   *
   * @note This function sends the user's question to the GPT model and
   *       retrieves the model's answer as a string. If an error occurs, it
   *       returns an empty string.
   */
  std::string GetAnswer(const std::string& question) {
    // gpt-3.5-turbo is served by the chat completions endpoint, not the
    // legacy engines/completions endpoint.
    http_client client(U("https://api.openai.com/v1/chat/completions"));
    http_request request(methods::POST);
    request.headers().add(U("Authorization"),
                          U("Bearer ") + conversions::to_string_t(api_key));
    request.headers().add(U("Content-Type"), U("application/json"));

    json::value message;
    message[U("role")] = json::value::string(U("user"));
    message[U("content")] = json::value::string(conversions::to_string_t(question));

    json::value body;
    body[U("model")] = json::value::string(conversions::to_string_t(model_));
    body[U("messages")] = json::value::array({message});
    request.set_body(body);

    try {
      auto response = client.request(request).get();
      auto response_body = response.extract_json().get();
      auto result = response_body.at(U("choices")).as_array()[0]
                        .at(U("message")).at(U("content")).as_string();
      return utility::conversions::to_utf8string(result);
    } catch (const std::exception& e) {
      std::cerr << "Error: " << e.what() << std::endl;
      return "";
    }
  }

  std::string model_;
  std::vector<std::string> question_list_;
  std::vector<std::string> answer_list_;
};

int main() {
  ChatBot bot("gpt-3.5-turbo");
  bot.Generate();
  return 0;
}

Since I have been using Bazel a lot recently, here is a Bazel-based way to build the C++ version.

WORKSPACE File

load("@bazel_tools//tools/build_defs/repo:http.bzl", "http_archive")

# Add cpprestsdk as a dependency
http_archive(
    name = "cpprestsdk",
    build_file = "@//:cpprestsdk.BUILD",
    sha256 = "106e8a5a4f9667f329b8f277f8f25e1f2d31f1a7c7f9e1f366c5b1e3af2f8c4c",
    strip_prefix = "cpprestsdk-2.10.18",
    urls = ["https://github.com/microsoft/cpprestsdk/archive/v2.10.18.zip"],
)

BUILD File

cc_binary(
    name = "chatbot",
    srcs = ["main.cpp"],
    deps = ["@cpprestsdk//:cpprest"],
    linkopts = [
        # Required for cpprestsdk
        "-lboost_system",
        "-lssl",
        "-lcrypto",
    ],
)

cpprestsdk.BUILD file content (create this file in the same directory as WORKSPACE):

load("@rules_cc//cc:defs.bzl", "cc_library")

cc_library(
    name = "cpprest",
    hdrs = glob(["cpprestsdk/cpprestsdk/Release/include/**/*.h"]),
    srcs = glob([
        "cpprestsdk/cpprestsdk/Release/src/http/client/*.cpp",
        "cpprestsdk/cpprestsdk/Release/src/http/common/*.cpp",
        "cpprestsdk/cpprestsdk/Release/src/http/listener/*.cpp",
        "cpprestsdk/cpprestsdk/Release/src/json/*.cpp",
        "cpprestsdk/cpprestsdk/Release/src/pplx/*.cpp",
        "cpprestsdk/cpprestsdk/Release/src/uri/*.cpp",
        "cpprestsdk/cpprestsdk/Release/src/utilities/*.cpp",
        "cpprestsdk/cpprestsdk/Release/src/websockets/client/*.cpp",
    ]),
    includes = ["cpprestsdk/cpprestsdk/Release/include"],
    copts = ["-std=c++11"],
    linkopts = [
        "-lboost_system",
        "-lssl",
        "-lcrypto",
    ],
    visibility = ["//visibility:public"],
)

Java Code

// ChatBot.java (src/main/java/com/example/ChatBot.java, matching the
// mainClass declared in the Maven manifest below)
package com.example;

import okhttp3.*;
import com.google.gson.*;

import java.io.*;
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.util.ArrayList;
import java.util.List;

public class ChatBot {
    private static final String API_KEY = "your_openai_api_key_here";

    private final String model;
    private final ArrayList<String> questionList;
    private final ArrayList<String> answerList;

    public ChatBot(String model) {
        this.model = model;
        this.questionList = new ArrayList<>();
        this.answerList = new ArrayList<>();
    }

    public void saveDialogue() {
        LocalDateTime timestamp = LocalDateTime.now();
        DateTimeFormatter formatter = DateTimeFormatter.ofPattern("yyyyMMdd-HHmm-ss");
        String fileName = "output/Chat_" + timestamp.format(formatter) + ".md";
        new File("output").mkdirs();  // FileWriter does not create parent directories
        try (BufferedWriter writer = new BufferedWriter(new FileWriter(fileName))) {
            // Iterate over answered questions only, so a failed API call
            // cannot push the two lists out of alignment.
            int pairs = Math.min(questionList.size(), answerList.size());
            for (int i = 0; i < pairs; ++i) {
                writer.write("You: " + questionList.get(i)
                        + "\nGPT-3.5-turbo: " + answerList.get(i) + "\n\n");
            }
            System.out.println("Dialogue content saved to file: " + fileName);
        } catch (IOException e) {
            System.err.println("Could not create the output file.");
            e.printStackTrace();
        }
    }

    public void generate() {
        System.out.println("\nStart your conversation, type \"exit\" to end.");
        BufferedReader reader = new BufferedReader(new InputStreamReader(System.in));
        String question;
        while (true) {
            System.out.print("\nYou: ");
            try {
                question = reader.readLine();
            } catch (IOException e) {
                System.err.println("Error reading input.");
                e.printStackTrace();
                break;
            }
            if ("exit".equalsIgnoreCase(question)) {
                break;
            }
            questionList.add(question);
            String answer = getAnswer(question);
            if (!answer.isEmpty()) {
                System.out.println("GPT-3.5-turbo: " + answer);
                answerList.add(answer);
            }
        }
        saveDialogue();
    }

    private String getAnswer(String question) {
        OkHttpClient client = new OkHttpClient();
        Gson gson = new Gson();
        String requestBodyJson = gson.toJson(new ChatRequest(model, question));
        RequestBody requestBody = RequestBody.create(requestBodyJson, MediaType.parse("application/json"));
        // gpt-3.5-turbo is served by the chat completions endpoint, not the
        // legacy engines/completions endpoint.
        Request request = new Request.Builder()
                .url("https://api.openai.com/v1/chat/completions")
                .addHeader("Authorization", "Bearer " + API_KEY)
                .addHeader("Content-Type", "application/json")
                .post(requestBody)
                .build();
        try (Response response = client.newCall(request).execute()) {
            if (!response.isSuccessful()) {
                throw new IOException("Unexpected code " + response);
            }
            String responseBodyJson = response.body().string();
            ChatResponse chatResponse = gson.fromJson(responseBodyJson, ChatResponse.class);
            return chatResponse.choices.get(0).message.content.trim();
        } catch (IOException e) {
            System.err.println("Error: " + e.getMessage());
            e.printStackTrace();
            return "";
        }
    }

    public static void main(String[] args) {
        ChatBot bot = new ChatBot("gpt-3.5-turbo");
        bot.generate();
    }

    // Named ChatRequest/ChatResponse rather than RequestBody/ResponseBody so
    // the nested classes do not shadow the okhttp3 types of the same names.
    private static class ChatRequest {
        String model;
        List<Message> messages;

        ChatRequest(String model, String question) {
            this.model = model;
            this.messages = new ArrayList<>();
            this.messages.add(new Message("user", question));
        }
    }

    private static class Message {
        String role;
        String content;

        Message(String role, String content) {
            this.role = role;
            this.content = content;
        }
    }

    private static class ChatResponse {
        ArrayList<Choice> choices;

        private static class Choice {
            Message message;
        }
    }
}
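Whatever the client language, the response-side work reduces to pulling `choices[0].message.content` out of the returned JSON and trimming it. A short Python sketch against a sample payload (the field layout matches a chat completions response; the values are made up):

```python
import json

# Hypothetical sample of a chat completions response (values made up).
sample = json.dumps({
    "choices": [
        {"message": {"role": "assistant", "content": "  Hello there!  "}}
    ]
})

def extract_answer(response_json):
    """Pull the assistant's text out of a chat completions response body."""
    data = json.loads(response_json)
    return data["choices"][0]["message"]["content"].strip()

answer = extract_answer(sample)
# -> "Hello there!"
```

The C++ `at(U("choices")).as_array()[0]` chain and the Java Gson POJOs are just typed spellings of the same two-step lookup.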

Maven File

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>

    <groupId>com.example</groupId>
    <artifactId>gpt-chatbot</artifactId>
    <version>1.0-SNAPSHOT</version>

    <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>1.8</maven.compiler.source>
        <maven.compiler.target>1.8</maven.compiler.target>
    </properties>

    <dependencies>
        <dependency>
            <groupId>com.squareup.okhttp3</groupId>
            <artifactId>okhttp</artifactId>
            <version>4.9.3</version>
        </dependency>
        <dependency>
            <groupId>com.google.code.gson</groupId>
            <artifactId>gson</artifactId>
            <version>2.8.9</version>
        </dependency>
    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>3.8.0</version>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-jar-plugin</artifactId>
                <version>3.1.0</version>
                <configuration>
                    <archive>
                        <manifest>
                            <addClasspath>true</addClasspath>
                            <classpathPrefix>lib/</classpathPrefix>
                            <mainClass>com.example.ChatBot</mainClass>
                        </manifest>
                    </archive>
                </configuration>
            </plugin>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-dependency-plugin</artifactId>
                <version>3.1.1</version>
                <executions>
                    <execution>
                        <id>copy-dependencies</id>
                        <phase>package</phase>
                        <goals>
                            <goal>copy-dependencies</goal>
                        </goals>
                        <configuration>
                            <outputDirectory>${project.build.directory}/lib</outputDirectory>
                        </configuration>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>
</project>

Execution Results

