A helper package to interact with Arize AI APIs.
Arize is an AI engineering platform. It helps engineers develop, evaluate, and observe AI applications and agents.
Arize has both Enterprise and OSS products to support this goal:
- Arize AX - an enterprise AI engineering platform from development to production, with an embedded AI Copilot
- Phoenix - a lightweight, open-source project for tracing, prompt engineering, and evaluation
- OpenInference - an open-source instrumentation package to trace LLM applications across models and frameworks
We log over 1 trillion inferences and spans, 10 million evaluation runs, and 2 million OSS downloads every month.
- Tracing - Trace your LLM application's runtime using OpenTelemetry-based instrumentation.
- Evaluation - Leverage LLMs to benchmark your application's performance using response and retrieval evals.
- Datasets - Create versioned datasets of examples for experimentation, evaluation, and fine-tuning.
- Experiments - Track and evaluate changes to prompts, LLMs, and retrieval.
- Playground - Optimize prompts, compare models, adjust parameters, and replay traced LLM calls.
- Prompt Management - Manage and test prompt changes systematically using version control, tagging, and experimentation.
Install Arize via `pip` or `conda`:

```bash
pip install arize
```
Install the `arize-otel` package for auto-instrumentation of your LLM library (see https://pypi.org/project/arize-otel/):

```bash
pip install arize-otel
```
```python
import os

from arize.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

# Arize credentials (your Space ID and API key)
SPACE_ID = os.environ["ARIZE_SPACE_ID"]
API_KEY = os.environ["ARIZE_API_KEY"]

# Setup OpenTelemetry via our convenience function
tracer_provider = register(
    space_id=SPACE_ID,
    api_key=API_KEY,
    project_name="agents-cookbook",
)

# Start instrumentation
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)
```
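Once the instrumentor is attached, ordinary OpenAI calls emit spans to the project configured above with no further code changes. A minimal sketch (assumes the `openai` package is installed and `OPENAI_API_KEY` is set; the model name is only an example):

```python
from openai import OpenAI

# Any call made through the instrumented SDK produces an LLM span that is
# exported to the "agents-cookbook" project registered above.
client = OpenAI()  # reads OPENAI_API_KEY from the environment
response = client.chat.completions.create(
    model="gpt-4o-mini",  # example model name
    messages=[{"role": "user", "content": "Write a haiku about tracing."}],
)
print(response.choices[0].message.content)
```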
Use `arize.pandas.logger` to log spans, evaluations, and annotations in bulk. See https://arize-client-python.readthedocs.io/en/latest/llm-api/logger.html for details.
```python
import os

from arize.pandas.logger import Client

arize_client = Client(
    space_key=os.environ["ARIZE_SPACE_KEY"],
    api_key=os.environ["ARIZE_API_KEY"],
)

# Log OpenInference spans in bulk from a pandas DataFrame
arize_client.log_spans(
    dataframe=spans_df,
    project_name="your-llm-project",
)

# Attach evaluation results to previously logged spans
arize_client.log_evaluations_sync(
    dataframe=evals_df,
    project_name="your-llm-project",
)

# Attach human or automated annotations to spans
arize_client.log_annotations(
    dataframe=annotations_df,
    project_name="your-llm-project",
)
```
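The dataframes passed above must follow the span, evaluation, and annotation schemas described in the logger documentation linked earlier. As a rough illustration only (the column names below are assumptions to verify against that documentation, not an authoritative schema), an evaluations dataframe attaches labels and scores to already-logged spans by span ID:

```python
import pandas as pd

# Illustrative only: evaluation results keyed to previously logged spans.
# Check the exact required column names against the logger documentation.
evals_df = pd.DataFrame(
    {
        "context.span_id": ["0f2a18...", "7c1b42..."],       # IDs of spans already in Arize
        "eval.correctness.label": ["correct", "incorrect"],  # categorical label
        "eval.correctness.score": [1.0, 0.0],                # numeric score
        "eval.correctness.explanation": [
            "Answer matches the reference.",
            "Answer contradicts the retrieved context.",
        ],
    }
)
```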
Use `arize.experimental.datasets` to create datasets and run experiments. See https://arize-client-python.readthedocs.io/en/latest/llm-api/datasets.html for details.
```python
import os

from arize.experimental.datasets import ArizeDatasetsClient

datasets_client = ArizeDatasetsClient(api_key=os.environ["ARIZE_API_KEY"])

# Create a versioned dataset from a pandas DataFrame of examples
dataset_id = datasets_client.create_dataset(
    space_id=os.environ["ARIZE_SPACE_ID"],
    dataset_name="llm-span-dataset",
    data=spans_df,
)
```
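The `data` argument is a plain pandas DataFrame of examples. A minimal illustration (these column names are arbitrary placeholders, not a required schema):

```python
import pandas as pd

# Each row becomes one dataset example; columns hold whatever fields you
# want to experiment on or evaluate against.
spans_df = pd.DataFrame(
    {
        "input": ["What is Arize?", "Summarize this support ticket."],
        "output": [
            "Arize is an AI engineering platform.",
            "The customer reports a login failure after the latest update.",
        ],
    }
)
```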
Arize is built on top of OpenTelemetry and is vendor, language, and framework agnostic. For details about tracing integrations and example applications, see the OpenInference project.
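For example, other OpenInference instrumentors from the tables below plug into the same `tracer_provider` returned by `register()`; a sketch using the LangChain integration (assumes `openinference-instrumentation-langchain` is installed):

```python
from openinference.instrumentation.langchain import LangChainInstrumentor

# Reuses the tracer_provider created by arize.otel.register() above, so
# LangChain runs are exported to the same Arize project as OpenAI calls.
LangChainInstrumentor().instrument(tracer_provider=tracer_provider)
```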
Python Integrations
Integration | Package
---|---
OpenAI | `openinference-instrumentation-openai`
OpenAI Agents | `openinference-instrumentation-openai-agents`
LlamaIndex | `openinference-instrumentation-llama-index`
DSPy | `openinference-instrumentation-dspy`
AWS Bedrock | `openinference-instrumentation-bedrock`
LangChain | `openinference-instrumentation-langchain`
MistralAI | `openinference-instrumentation-mistralai`
Google GenAI | `openinference-instrumentation-google-genai`
Guardrails | `openinference-instrumentation-guardrails`
VertexAI | `openinference-instrumentation-vertexai`
CrewAI | `openinference-instrumentation-crewai`
Haystack | `openinference-instrumentation-haystack`
LiteLLM | `openinference-instrumentation-litellm`
Groq | `openinference-instrumentation-groq`
Instructor | `openinference-instrumentation-instructor`
Anthropic | `openinference-instrumentation-anthropic`
Smolagents | `openinference-instrumentation-smolagents`
JavaScript Integrations

Integration | Package
---|---
OpenAI | `@arizeai/openinference-instrumentation-openai`
LangChain.js | `@arizeai/openinference-instrumentation-langchain`
Vercel AI SDK | `@arizeai/openinference-vercel`
BeeAI | `@arizeai/openinference-instrumentation-beeai`
Join our community to connect with thousands of AI builders.
- Join our Slack community.
- Read our documentation.
- Ask questions and provide feedback in the #arize-support channel.
- Follow us on X.
- Deep dive into everything Agents and LLM Evaluations on Arize's Learning Hubs.
Copyright 2025 Arize AI, Inc. All Rights Reserved.