Integration: Arize AI
Trace and monitor your Haystack pipelines with Arize AI
Overview
Arize is an AI observability and evaluation platform designed to help you troubleshoot, evaluate, and experiment with LLM and ML applications. Developers use Arize to get applications running quickly, evaluate performance, detect and prevent production issues, and curate datasets.
Installation
pip install openinference-instrumentation-haystack haystack-ai arize-otel opentelemetry-sdk opentelemetry-exporter-otlp
Usage
To trace any Haystack pipeline with Arize, simply initialize OpenTelemetry and the HaystackInstrumentor. Haystack pipelines running in the same environment will then send their traces to Arize.
from openinference.instrumentation.haystack import HaystackInstrumentor

# Import OpenTelemetry dependencies
from arize_otel import register_otel, Endpoints

# Set up OpenTelemetry via the convenience function
register_otel(
    endpoints=Endpoints.ARIZE,
    space_id="<your-space-id>",  # from the space settings page
    api_key="<your-api-key>",  # from the space settings page
    model_id="<your-haystack-app-name>",  # name this whatever you would like
)

# Instrument Haystack so every pipeline run emits traces
HaystackInstrumentor().instrument()
You can now run Haystack pipelines in the same environment, and the resulting traces will appear in Arize.
To run the example below, export your OpenAI key to the OPENAI_API_KEY environment variable.
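If you are working in a notebook rather than a shell, you can set the variable from Python instead. This is a small convenience sketch using only the standard library; it is not part of the Arize or Haystack setup:

import os
from getpass import getpass

# Prompt for the key at runtime if it is not already set in the environment
if "OPENAI_API_KEY" not in os.environ:
    os.environ["OPENAI_API_KEY"] = getpass("Enter your OpenAI API key: ")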
from haystack import Document, Pipeline
from haystack.components.builders.prompt_builder import PromptBuilder
from haystack.components.generators import OpenAIGenerator
from haystack.components.retrievers.in_memory import InMemoryBM25Retriever
from haystack.document_stores.in_memory import InMemoryDocumentStore

# Write a few documents to an in-memory document store
document_store = InMemoryDocumentStore()
document_store.write_documents([
    Document(content="My name is Jean and I live in Paris."),
    Document(content="My name is Mark and I live in Berlin."),
    Document(content="My name is Giorgio and I live in Rome.")
])

prompt_template = """
Given these documents, answer the question.
Documents:
{% for doc in documents %}
    {{ doc.content }}
{% endfor %}
Question: {{question}}
Answer:
"""

retriever = InMemoryBM25Retriever(document_store=document_store)
prompt_builder = PromptBuilder(template=prompt_template)
llm = OpenAIGenerator()

# Assemble the RAG pipeline: retriever -> prompt builder -> LLM
rag_pipeline = Pipeline()
rag_pipeline.add_component("retriever", retriever)
rag_pipeline.add_component("prompt_builder", prompt_builder)
rag_pipeline.add_component("llm", llm)
rag_pipeline.connect("retriever", "prompt_builder.documents")
rag_pipeline.connect("prompt_builder", "llm")

# Each run is captured as a trace in Arize
question = "Who lives in Paris?"
results = rag_pipeline.run(
    {
        "retriever": {"query": question},
        "prompt_builder": {"question": question},
    }
)
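The generated answer is in the llm component's output; OpenAIGenerator returns its completions under the replies key. A minimal way to inspect the result:

# Print the model's answer (the first completion returned by the generator)
print(results["llm"]["replies"][0])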
