Inspect traces, manage prompts, curate datasets, and run experiments using Arize Phoenix, an open-source AI and LLM observability tool.
Phoenix is an open-source AI observability platform designed for experimentation, evaluation, and troubleshooting. It provides tracing, evaluation, dataset curation, experiment tracking, and prompt management.
Phoenix is vendor- and language-agnostic, with out-of-the-box support for popular frameworks (🦙LlamaIndex, 🦜⛓LangChain, Haystack, 🧩DSPy, 🤗smolagents) and LLM providers (OpenAI, Bedrock, MistralAI, VertexAI, LiteLLM, Google GenAI, and more). For details on auto-instrumentation, check out the OpenInference project.
Phoenix runs practically anywhere: on your local machine, in a Jupyter notebook, in a containerized deployment, or in the cloud.
Install Phoenix via pip or conda:

```shell
pip install arize-phoenix
```
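For a quick local start, Phoenix can also be launched directly from Python (for example inside a notebook). A minimal sketch, assuming the arize-phoenix package is installed and the default port 6006 is free:

```python
import phoenix as px

# Launch a local Phoenix instance in the background
# (the UI is served at http://localhost:6006 by default).
session = px.launch_app()
print(session.url)  # open this URL in a browser to inspect traces
```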
Phoenix container images are available via Docker Hub and can be deployed using Docker or Kubernetes.
The arize-phoenix package includes the entire Phoenix platform. However, if you have already deployed the Phoenix platform, there are lightweight Python sub-packages and JavaScript/TypeScript packages that can be used in conjunction with the platform.
| Package | Language | Description |
|---|---|---|
| arize-phoenix-otel | Python | Provides a lightweight wrapper around OpenTelemetry primitives with Phoenix-aware defaults (see the sketch after this table) |
| arize-phoenix-client | Python | Lightweight client for interacting with the Phoenix server via its OpenAPI REST interface |
| arize-phoenix-evals | Python | Tooling to evaluate LLM applications, including RAG relevance, answer relevance, and more |
| @arizeai/phoenix-client | JavaScript | Client for the Arize Phoenix API |
| @arizeai/phoenix-evals | TypeScript | TypeScript evaluation library for LLM applications (alpha release) |
| @arizeai/phoenix-mcp | JavaScript | MCP server implementation for Arize Phoenix, providing a unified interface to Phoenix's capabilities |
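As an example of how these sub-packages are used, arize-phoenix-otel exposes a register helper that configures an OpenTelemetry tracer provider with Phoenix-aware defaults. A minimal sketch, assuming a local Phoenix server on the default endpoint, a hypothetical project name, and a recent release that accepts the auto_instrument flag:

```python
from phoenix.otel import register

# Configure OpenTelemetry with Phoenix defaults.
# Assumption: a local Phoenix server on port 6006; for remote deployments,
# set PHOENIX_COLLECTOR_ENDPOINT or pass endpoint=... explicitly.
tracer_provider = register(
    project_name="my-llm-app",   # hypothetical project name
    auto_instrument=True,        # auto-apply any installed OpenInference instrumentors
)
```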
Phoenix is built on top of OpenTelemetry and is vendor, language, and framework agnostic. For details about tracing integrations and example applications, see the OpenInference project.
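Because Phoenix speaks standard OTLP, traces can also be sent with the vanilla OpenTelemetry SDK and no Phoenix-specific packages at all. A minimal sketch, assuming opentelemetry-sdk and the OTLP/HTTP exporter are installed and a local Phoenix server is listening on the default port:

```python
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter

# Export spans to a local Phoenix collector (assumption: default endpoint below).
provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:6006/v1/traces"))
)
trace.set_tracer_provider(provider)

tracer = trace.get_tracer("readme-example")
with tracer.start_as_current_span("hello-phoenix"):
    pass  # this span appears in the Phoenix UI
```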
Python Integrations
| Integration | Package |
|---|---|
| OpenAI | openinference-instrumentation-openai (see the sketch after this table) |
| OpenAI Agents | openinference-instrumentation-openai-agents |
| LlamaIndex | openinference-instrumentation-llama-index |
| DSPy | openinference-instrumentation-dspy |
| AWS Bedrock | openinference-instrumentation-bedrock |
| LangChain | openinference-instrumentation-langchain |
| MistralAI | openinference-instrumentation-mistralai |
| Google GenAI | openinference-instrumentation-google-genai |
| Google ADK | openinference-instrumentation-google-adk |
| Guardrails | openinference-instrumentation-guardrails |
| VertexAI | openinference-instrumentation-vertexai |
| CrewAI | openinference-instrumentation-crewai |
| Haystack | openinference-instrumentation-haystack |
| LiteLLM | openinference-instrumentation-litellm |
| Groq | openinference-instrumentation-groq |
| Instructor | openinference-instrumentation-instructor |
| Anthropic | openinference-instrumentation-anthropic |
| Smolagents | openinference-instrumentation-smolagents |
| Agno | openinference-instrumentation-agno |
| MCP | openinference-instrumentation-mcp |
| Pydantic AI | openinference-instrumentation-pydantic-ai |
| Autogen AgentChat | openinference-instrumentation-autogen-agentchat |
| Portkey | openinference-instrumentation-portkey |
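Each Python integration follows the same pattern: install its package and call its instrumentor. A minimal sketch for the OpenAI integration listed above, assuming openai and openinference-instrumentation-openai are installed, with a hypothetical project name and tracing configured via phoenix.otel.register (instrumenting explicitly rather than via auto_instrument):

```python
from phoenix.otel import register
from openinference.instrumentation.openai import OpenAIInstrumentor

# Route OpenAI SDK calls through OpenInference so they appear as traces in Phoenix.
tracer_provider = register(project_name="my-llm-app")  # hypothetical project name
OpenAIInstrumentor().instrument(tracer_provider=tracer_provider)

# From here on, calls made with the openai SDK are traced automatically, e.g.:
# from openai import OpenAI
# OpenAI().chat.completions.create(model="gpt-4o-mini",
#                                  messages=[{"role": "user", "content": "hi"}])
```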
JavaScript Integrations

| Integration | Package |
|---|---|
| OpenAI | @arizeai/openinference-instrumentation-openai |
| LangChain.js | @arizeai/openinference-instrumentation-langchain |
| Vercel AI SDK | @arizeai/openinference-vercel |
| BeeAI | @arizeai/openinference-instrumentation-beeai |
| Mastra | @arizeai/openinference-mastra |
Java Integrations

| Integration | Package |
|---|---|
| LangChain4j | openinference-instrumentation-langchain4j |
Platform Integrations

| Platform | Description | Docs |
|---|---|---|
| Dify | Open-source LLM app development platform | Integration Guide |
| LangFlow | Visual framework for building multi-agent and RAG applications | Integration Guide |
| BeeAI | AI agent framework with built-in observability | Integration Guide |
| LiteLLM Proxy | Proxy server for LLMs | Integration Guide |
Join our community to connect with thousands of AI builders.
See the migration guide for a list of breaking changes.
Copyright 2025 Arize AI, Inc. All Rights Reserved.
Portions of this code are patent protected by one or more U.S. Patents. See the IP_NOTICE.
This software is licensed under the terms of the Elastic License 2.0 (ELv2). See LICENSE.