Maintained by deepset

Integration: Anthropic

Use Anthropic models with Haystack


Overview

This integration supports Anthropic Claude models, such as Claude Haiku 3.5, Claude Sonnet 3.7, and Claude Sonnet 4.5, served through Anthropic's inference infrastructure. For a complete list of available models, see the Anthropic Claude documentation.

You can use Anthropic models with either AnthropicGenerator or AnthropicChatGenerator.

Installation

pip install anthropic-haystack

Usage

Depending on your use case, you can choose between AnthropicGenerator and AnthropicChatGenerator to work with Anthropic models. To learn more about the difference between the two, visit the Generators vs Chat Generators guide.
Before using them, make sure the ANTHROPIC_API_KEY environment variable is set.
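For example, you can export the variable in your shell before starting Python (the value below is a placeholder; substitute your own API key):

```shell
# Placeholder value -- replace with your real Anthropic API key.
export ANTHROPIC_API_KEY="sk-ant-your-key-here"
```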

Using AnthropicChatGenerator

Below is an example RAG pipeline that answers a predefined question using content fetched from the URL of Anthropic's prompt engineering guide. We fetch the content of the URL and use AnthropicChatGenerator to generate an answer.

# To run this example, you need to set the `ANTHROPIC_API_KEY` environment variable.
# !pip install trafilatura

from haystack import Pipeline
from haystack.components.builders import ChatPromptBuilder
from haystack.components.converters import HTMLToDocument
from haystack.components.fetchers import LinkContentFetcher
from haystack.components.generators.utils import print_streaming_chunk
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret

from haystack_integrations.components.generators.anthropic import AnthropicChatGenerator

messages = [
    ChatMessage.from_system("You are a prompt expert who answers questions based on the given documents."),
    ChatMessage.from_user(
        "Here are the documents:\n"
        "{% for d in documents %} \n"
        "    {{d.content}} \n"
        "{% endfor %}"
        "\nAnswer: {{query}}"
    ),
]
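To see what the ChatPromptBuilder will actually send to the model, the Jinja template in the user message can be rendered by hand. This is a minimal sketch using the jinja2 package directly (which Haystack uses under the hood), with a stand-in document class since only the `.content` attribute is referenced in the template:

```python
from jinja2 import Template

# The same template string as the user message above.
template = Template(
    "Here are the documents:\n"
    "{% for d in documents %} \n"
    "    {{d.content}} \n"
    "{% endfor %}"
    "\nAnswer: {{query}}"
)

# Minimal stand-in for a Haystack Document (only .content is used here).
class Doc:
    def __init__(self, content):
        self.content = content

rendered = template.render(
    documents=[Doc("Prompt engineering iterates on instructions instead of weights.")],
    query="When should we use prompt engineering and when should we fine-tune?",
)
print(rendered)
```

The rendered string interleaves each document's content between the "Here are the documents:" header and the final "Answer: …" line, which is what the LLM receives as the user message.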

rag_pipeline = Pipeline()
rag_pipeline.add_component("fetcher", LinkContentFetcher())
rag_pipeline.add_component("converter", HTMLToDocument())
rag_pipeline.add_component("prompt_builder", ChatPromptBuilder(variables=["documents"]))
rag_pipeline.add_component(
    "llm",
    AnthropicChatGenerator(
        api_key=Secret.from_env_var("ANTHROPIC_API_KEY"),
        streaming_callback=print_streaming_chunk,
    ),
)


rag_pipeline.connect("fetcher", "converter")
rag_pipeline.connect("converter", "prompt_builder")
rag_pipeline.connect("prompt_builder.prompt", "llm.messages")

question = "When should we use prompt engineering and when should we fine-tune?"
rag_pipeline.run(
    data={
        "fetcher": {"urls": ["https://docs.anthropic.com/en/docs/build-with-claude/prompt-engineering/overview"]},
        "prompt_builder": {"template_variables": {"query": question}, "template": messages},
    }
)

Using AnthropicGenerator

Below is an example of using AnthropicGenerator:

from haystack_integrations.components.generators.anthropic import AnthropicGenerator

client = AnthropicGenerator()
response = client.run("What's Natural Language Processing? Be brief.")
print(response)

>>{'replies': ['Natural language processing (NLP) is a branch of artificial intelligence focused on enabling
>>computers to understand, interpret, and manipulate human language. The goal of NLP is to read, decipher,
>> understand, and make sense of the human languages in a manner that is valuable.'], 'meta': {'model':
>> 'claude-2.1', 'index': 0, 'finish_reason': 'end_turn', 'usage': {'input_tokens': 18, 'output_tokens': 58}}}
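Since run() returns a plain dict, the answer text and token counts can be pulled out directly. The sketch below uses a hard-coded dict matching the shape of the example output above, rather than making a live API call:

```python
# Response dict shaped like the example output above (no API call made).
response = {
    "replies": [
        "Natural language processing (NLP) is a branch of artificial "
        "intelligence focused on enabling computers to understand, "
        "interpret, and manipulate human language."
    ],
    "meta": {
        "model": "claude-2.1",
        "index": 0,
        "finish_reason": "end_turn",
        "usage": {"input_tokens": 18, "output_tokens": 58},
    },
}

answer = response["replies"][0]
usage = response["meta"]["usage"]
print(answer)
print(f"Tokens used: {usage['input_tokens']} in, {usage['output_tokens']} out")
```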