
Function Calling with OpenAI ChatGenerator


⚠️ This example is deprecated as of Haystack 2.9.0. For the same example, follow the tutorial: Building a Chat Agent with Function Calling.

Notebook by Bilge Yucel (LI & X (Twitter))

This guide is designed to help you understand function calling and how to use OpenAI's function calling feature with Haystack.

📚 Useful Resources

Overview

Here are some use cases for function calling, taken from the OpenAI documentation:

  • Create assistants that answer questions by calling external APIs (e.g. ChatGPT Plugins-style assistants), for example by defining functions like send_email(to: string, body: string) or get_current_weather(location: string, unit: 'celsius' | 'fahrenheit')
  • Convert natural language into API calls, e.g. convert "Who are my top customers?" into get_customers(min_revenue: int, created_before: string, limit: int) and call your internal API (a minimal sketch of such a tool follows this list)
  • Extract structured data from text, e.g. define a function called extract_data(name: string, birthday: string) or sql_query(query: string)
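To make the second use case concrete, here is a minimal sketch (not part of the original guide) of how "Who are my top customers?" could map to a function plus a matching entry in OpenAI's tool schema. The get_customers name and parameters come from the bullet above; the description, stub body, and return value are hypothetical placeholders.

# Hypothetical stub: in a real application this would call your internal API
def get_customers(min_revenue: int, created_before: str, limit: int):
    return [{"name": "ACME Corp", "revenue": 120000}][:limit]

# Matching tool definition, following the same schema pattern used later in this guide
get_customers_tool = {
    "type": "function",
    "function": {
        "name": "get_customers",
        "description": "List top customers filtered by revenue and creation date",
        "parameters": {
            "type": "object",
            "properties": {
                "min_revenue": {"type": "integer", "description": "Minimum revenue"},
                "created_before": {"type": "string", "description": "ISO date, e.g. 2024-01-01"},
                "limit": {"type": "integer", "description": "Maximum number of customers to return"},
            },
            "required": ["min_revenue", "created_before", "limit"],
        },
    },
}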

Set Up the Development Environment

%%bash

pip install haystack-ai==2.8.1
import os
from getpass import getpass
from google.colab import userdata

os.environ["OPENAI_API_KEY"] = userdata.get('OPENAI_API_KEY') or getpass("OPENAI_API_KEY: ")

Learn about the OpenAIChatGenerator

OpenAIChatGenerator is a component that supports OpenAI's function calling feature.

OpenAIChatGenerator communicates through a list of ChatMessage objects. So, create a ChatMessage with the "USER" role using ChatMessage.from_user() and send it to the OpenAIChatGenerator:

from haystack.dataclasses import ChatMessage
from haystack.components.generators.chat import OpenAIChatGenerator

client = OpenAIChatGenerator()
response = client.run(
    [ChatMessage.from_user("What's Natural Language Processing? Be brief.")]
)
print(response)
{'replies': [ChatMessage(content='Natural Language Processing (NLP) is a branch of artificial intelligence that deals with the interaction between computers and humans in natural language. It focuses on the understanding, interpretation, and generation of human language to enable machines to process and analyze textual data efficiently.', role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-4o-mini-2024-07-18', 'index': 0, 'finish_reason': 'stop', 'usage': {'completion_tokens': 50, 'prompt_tokens': 16, 'total_tokens': 66}})]}
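The generator returns a dictionary with a replies list of ChatMessage objects. As a small aside (not in the original guide), you can pull the generated text and metadata out of the reply like this:

reply = response["replies"][0]
print(reply.content)                 # the generated text
print(reply.meta["finish_reason"])   # "stop" for a normal completion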

Basic Streaming

OpenAIChatGenerator supports streaming. Provide a streaming_callback function and rerun the client to see the difference.

from haystack.dataclasses import ChatMessage
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.components.generators.utils import print_streaming_chunk

client = OpenAIChatGenerator(streaming_callback=print_streaming_chunk)
response = client.run(
    [ChatMessage.from_user("What's Natural Language Processing? Be brief.")]
)
Natural Language Processing (NLP) is a field of artificial intelligence that focuses on the interaction between humans and computers using natural language. It involves the development of algorithms and methods to enable computers to understand, interpret, and generate human language in a manner that is meaningful and useful.
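print_streaming_chunk simply prints each chunk's text as it arrives. If you need custom behavior, for example collecting the streamed text yourself, you can pass your own callback. Here is a minimal sketch; the accumulate_chunk helper is illustrative, not part of Haystack:

from haystack.dataclasses import ChatMessage, StreamingChunk
from haystack.components.generators.chat import OpenAIChatGenerator

collected = []

def accumulate_chunk(chunk: StreamingChunk):
    # Each chunk carries a piece of the generated text in chunk.content
    collected.append(chunk.content)
    print(chunk.content, flush=True, end="")

client = OpenAIChatGenerator(streaming_callback=accumulate_chunk)
client.run([ChatMessage.from_user("What's Natural Language Processing? Be brief.")])
full_text = "".join(collected)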

Function Calling with OpenAI ChatGenerator

Let's try to recreate the example from the OpenAI documentation.

Define a Function

We'll define a get_current_weather function that mocks a weather API call.

def get_current_weather(location: str, unit: str = "celsius"):
  ## Do something
  return {"weather": "sunny", "temperature": 21.8, "unit": unit}

available_functions = {
  "get_current_weather": get_current_weather
}

Create the tools

Then, we'll add the information about this function to our tools list, following OpenAI's tool schema.

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The temperature unit to use. Infer this from the users location.",
                    },
                },
                "required": ["location", "unit"],
            },
        }
    }
]

Run the OpenAIChatGenerator with Tools

We'll pass the list of tools in the run() method through generation_kwargs.

Let's define the messages and run the generator:

from haystack.dataclasses import ChatMessage

messages = []
messages.append(ChatMessage.from_system("Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous."))
messages.append(ChatMessage.from_user("What's the weather like in Berlin?"))

client = OpenAIChatGenerator(streaming_callback=print_streaming_chunk)
response = client.run(
    messages=messages,
    generation_kwargs={"tools":tools}
)

It's a function call! 📞 The response gives us information about the function name and the arguments to call it with.

response
{'replies': [ChatMessage(content='[{"index": 0, "id": "call_fFQKCAUba8RRu2BZ4v8IVYPH", "function": {"arguments": "{\\n  \\"location\\": \\"Berlin\\",\\n  \\"unit\\": \\"celsius\\"\\n}", "name": "get_current_weather"}, "type": "function"}]', role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'gpt-4o-mini-2024-07-18', 'index': 0, 'finish_reason': 'tool_calls', 'usage': {}})]}

Optionally, you can append this message containing the function information to the messages list:

messages.append(response["replies"][0])

See how we can extract the function_name and function_args from the message:

import json

function_call = json.loads(response["replies"][0].content)[0]
function_name = function_call["function"]["name"]
function_args = json.loads(function_call["function"]["arguments"])
print("function_name:", function_name)
print("function_args:", function_args)
function_name: get_current_weather
function_args: {'location': 'Berlin', 'unit': 'celsius'}

Make a Tool Call

Let's locate the function in our available_functions dictionary that corresponds to function_name and call it with function_args. Once we get the response from the tool, we'll append it to our messages so it can be sent to OpenAI later.

function_to_call = available_functions[function_name]
function_response = function_to_call(**function_args)
function_message = ChatMessage.from_function(content=json.dumps(function_response), name=function_name)
messages.append(function_message)

Make one last OpenAI call with the response from the function and see how OpenAI uses the provided information:

response = client.run(
    messages=messages,
    generation_kwargs={"tools":tools}
)
The current weather in Berlin is sunny with a temperature of 21.8°C.

Improve the Example

Let's add more tools to the example to improve the user experience 👇

We'll add another tool, use_haystack_pipeline, for OpenAI to use when there's a question about countries and capitals.

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_current_weather",
            "description": "Get the current weather",
            "parameters": {
                "type": "object",
                "properties": {
                    "location": {
                        "type": "string",
                        "description": "The city and state, e.g. San Francisco, CA",
                    },
                    "unit": {
                        "type": "string",
                        "enum": ["celsius", "fahrenheit"],
                        "description": "The temperature unit to use. Infer this from the users location.",
                    },
                },
                "required": ["location", "unit"],
            },
        }
    },
    {
        "type": "function",
        "function": {
            "name": "use_haystack_pipeline",
            "description": "Use for search about countries and capitals",
            "parameters": {
                "type": "object",
                "properties": {
                    "query": {
                        "type": "string",
                        "description": "The query to use in the search. Infer this from the user's message",
                    },
                },
                "required": ["query"]
            },
        }
    },
]
def get_current_weather(location: str, unit: str = "celsius"):
  return {"weather": "sunny", "temperature": 21.8, "unit": unit}

def use_haystack_pipeline(query: str):
  # It returns a mock response
  return {"documents": "Cutopia is the capital of Utopia", "query": query}

available_functions = {
  "get_current_weather": get_current_weather,
  "use_haystack_pipeline": use_haystack_pipeline,
}

Start the Application

Have fun chatting with OpenAI! 🎉

Some example queries you can try:

  • "What's the capital of Utopia", "Is it sunny there?": to test whether the messages are recorded and sent
  • "What's the weather like in the capital of Utopia?": to force two function calls
  • "What's the weather like today?": to force OpenAI to ask for clarification
import json
from haystack.dataclasses import ChatMessage, ChatRole

response = None  # no OpenAI response yet on the first iteration
messages = []
messages.append(ChatMessage.from_system("Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous."))

print(messages[-1].content)

while True:
  # if the OpenAI response is a tool call
  if response and response["replies"][0].meta["finish_reason"] == 'tool_calls':
    function_calls = json.loads(response["replies"][0].content)
    for function_call in function_calls:
      function_name = function_call["function"]["name"]
      function_to_call = available_functions[function_name]
      function_args = json.loads(function_call["function"]["arguments"])

      function_response = function_to_call(**function_args)
      function_message = ChatMessage.from_function(content=json.dumps(function_response), name=function_name)
      messages.append(function_message)

  # Regular Conversation
  else:
    # If the conversation has started (the last message isn't the system prompt), record the assistant's reply
    if not messages[-1].is_from(ChatRole.SYSTEM):
      messages.append(ChatMessage.from_assistant(response["replies"][0].content))

    user_input = input("INFO: Type 'exit' or 'quit' to stop\n")
    if user_input.lower() == "exit" or user_input.lower() == "quit":
      break
    else:
      messages.append(ChatMessage.from_user(user_input))

  response = client.run(
    messages=messages,
    generation_kwargs={"tools":tools}
  )
Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous.
INFO: Type 'exit' or 'quit' to stop
What's the weather like today?
Sure, can you please tell me your current location?INFO: Type 'exit' or 'quit' to stop
utopia
The weather in Utopia today is sunny with a temperature of 21.8 degrees Celsius.INFO: Type 'exit' or 'quit' to stop
exit

This section can help you understand the order of the messages:

print("\n=== SUMMARY ===")
for m in messages:
  print(f"\n - {m.content}")
=== SUMMARY ===

 - Don't make assumptions about what values to plug into functions. Ask for clarification if a user request is ambiguous.

 - What's the weather like today?

 - Sure, can you please tell me your current location?

 - utopia

 - {"weather": "sunny", "temperature": 21.8, "unit": "celsius"}

 - The weather in Utopia today is sunny with a temperature of 21.8 degrees Celsius.