A lightweight Model Context Protocol (MCP) server that enables AI assistants to interact with ClearML experiments, models, and projects. Get comprehensive ML experiment context and analysis directly in your AI conversations.
You need a configured ClearML environment with your credentials in `~/.clearml/clearml.conf`:
```
[api]
api_server = https://api.clear.ml
web_server = https://app.clear.ml
files_server = https://files.clear.ml
credentials {
    "access_key": "your-access-key",
    "secret_key": "your-secret-key"
}
```
Get your credentials from ClearML Settings.
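As a quick sanity check before wiring up the server, the snippet below looks for the conf file and pulls out the access key with a regular expression. This is only an illustrative sketch: `clearml.conf` is a HOCON-style file that the ClearML SDK parses properly, and the `read_access_key` helper is a hypothetical name, not part of clearml-mcp.

```python
import re
from pathlib import Path

def read_access_key(conf_text: str):
    """Extract the access_key from a clearml.conf-style credentials block.

    Illustrative only -- the ClearML SDK parses this file for real use.
    """
    match = re.search(r'"access_key"\s*:\s*"([^"]+)"', conf_text)
    return match.group(1) if match else None

conf_path = Path.home() / ".clearml" / "clearml.conf"
if conf_path.exists():
    print("access_key:", read_access_key(conf_path.read_text()))
else:
    print(f"No config found at {conf_path} -- create one first.")
```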
```bash
# Install from PyPI
pip install clearml-mcp

# Or run directly with uvx (no installation needed)
uvx clearml-mcp
```
Add to your Claude Desktop configuration:
macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
Windows: %APPDATA%/Claude/claude_desktop_config.json
```json
{
  "mcpServers": {
    "clearml": {
      "command": "uvx",
      "args": ["clearml-mcp"]
    }
  }
}
```
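If you already have other MCP servers configured, you may prefer to merge this entry into the existing file rather than overwrite it. A minimal sketch using only the standard library; the `add_clearml_server` helper is an assumption for illustration, not part of clearml-mcp:

```python
import json
from pathlib import Path

# The server entry from the configuration above.
CLEARML_ENTRY = {"command": "uvx", "args": ["clearml-mcp"]}

def add_clearml_server(config: dict) -> dict:
    """Add the clearml server to an MCP config dict, keeping existing servers."""
    servers = config.setdefault("mcpServers", {})
    servers["clearml"] = CLEARML_ENTRY
    return config

def update_config_file(path: Path) -> None:
    """Read the config if present, merge the clearml entry, write it back."""
    config = json.loads(path.read_text()) if path.exists() else {}
    path.write_text(json.dumps(add_clearml_server(config), indent=2))

# Example (macOS path; adjust for your platform):
# update_config_file(Path.home() / "Library/Application Support/Claude/claude_desktop_config.json")
```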
Alternative with pip installation:
```json
{
  "mcpServers": {
    "clearml": {
      "command": "python",
      "args": ["-m", "clearml_mcp.clearml_mcp"]
    }
  }
}
```
Add to your Cursor settings (Ctrl/Cmd + `,` → Search "MCP"):
```json
{
  "mcp.servers": {
    "clearml": {
      "command": "uvx",
      "args": ["clearml-mcp"]
    }
  }
}
```
Or add to `.cursorrules` in your project:
When analyzing ML experiments or asking about model performance, use the clearml MCP server to access experiment data, metrics, and artifacts.
Add to your Continue configuration (`~/.continue/config.json`):
```json
{
  "mcpServers": {
    "clearml": {
      "command": "uvx",
      "args": ["clearml-mcp"]
    }
  }
}
```
Add to your Cody settings:
```json
{
  "cody.experimental.mcp": {
    "servers": {
      "clearml": {
        "command": "uvx",
        "args": ["clearml-mcp"]
      }
    }
  }
}
```
For any MCP-compatible AI assistant, use this configuration:
```json
{
  "mcpServers": {
    "clearml": {
      "command": "uvx",
      "args": ["clearml-mcp"]
    }
  }
}
```
The ClearML MCP server provides 14 comprehensive tools for ML experiment analysis:

- `get_task_info` - Get detailed task information, parameters, and status
- `list_tasks` - List tasks with advanced filtering (project, status, tags, user)
- `get_task_parameters` - Retrieve hyperparameters and configuration
- `get_task_metrics` - Access training metrics, scalars, and plots
- `get_task_artifacts` - Get artifacts, model files, and outputs
- `get_model_info` - Get model metadata and configuration details
- `list_models` - Browse available models with filtering
- `get_model_artifacts` - Access model files and download URLs
- `list_projects` - Discover available ClearML projects
- `get_project_stats` - Get project statistics and task summaries
- `find_project_by_pattern` - Find projects matching name patterns
- `find_experiment_in_project` - Find specific experiments within projects
- `compare_tasks` - Compare multiple tasks by specific metrics
- `search_tasks` - Advanced search by name, tags, comments, and more

Once configured, you can ask your AI assistant questions about your experiments, metrics, and models.
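To illustrate what a comparison tool like `compare_tasks` does conceptually, here is a hypothetical sketch of the ranking step, operating on metric values already fetched from ClearML. The `compare_metrics` helper and the sample data are assumptions for illustration, not the server's actual implementation:

```python
def compare_metrics(tasks, metric):
    """Rank tasks by a single metric, highest value first.

    `tasks` maps task name -> {metric name: last reported value}.
    Hypothetical helper -- the real compare_tasks tool queries ClearML directly.
    """
    scored = [(name, values[metric]) for name, values in tasks.items() if metric in values]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

# Sample data standing in for metrics fetched from three experiments.
runs = {
    "baseline": {"accuracy": 0.91, "loss": 0.31},
    "tuned-lr": {"accuracy": 0.94, "loss": 0.24},
    "no-aug": {"accuracy": 0.89, "loss": 0.35},
}
print(compare_metrics(runs, "accuracy"))
# [('tuned-lr', 0.94), ('baseline', 0.91), ('no-aug', 0.89)]
```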
```bash
# Clone and setup with UV
git clone https://github.com/prassanna-ravishankar/clearml-mcp.git
cd clearml-mcp
uv sync

# Run locally
uv run python -m clearml_mcp.clearml_mcp

# Run tests with coverage
uv run task coverage

# Lint and format
uv run task lint
uv run task format

# Type checking
uv run task type

# Run examples
uv run task consolidated-debug  # Full ML debugging demo
uv run task example-simple      # Basic integration
uv run task find-experiments    # Discover real experiments

# Test the MCP server directly
npx @modelcontextprotocol/inspector uvx clearml-mcp
```
**"No ClearML projects accessible"**

- Verify your credentials in `~/.clearml/clearml.conf`
- Test your connection: `python -c "from clearml import Task; print(Task.get_projects())"`

**Module not found errors**

- Try `bunx clearml-mcp` instead of `uvx clearml-mcp`
- Or fall back to `python -m clearml_mcp.clearml_mcp`

**Large dataset queries**

- Use `list_tasks` filters to limit results
- Pass `project_name` to narrow the scope
- Apply `task_status` filters (`completed`, `running`, `failed`)

**Slow metric retrieval**

- Use `compare_tasks` with specific metric names for focused analysis

Contributions are welcome!
See our testing philosophy and linting approach for development guidelines.
MIT License - see LICENSE for details.
Created by Prass, The Nomadic Coder