CLI¶
Install the opentraceai command-line tool to index repositories and run an MCP server from your terminal.
Prerequisites¶
- Python 3.12+
- uv (recommended): makes uvx and isolated installs trivial. See the uv install guide.
Install¶
Run OpenTrace without installing it globally — uvx downloads and caches it on first use.
Best for: kicking the tires, or using it from a CI job.
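As a sketch (assuming the package is published under the name opentraceai, as used throughout this page), a one-off run looks like:

```shell
# Fetches and caches the package on first use, then runs it.
uvx opentraceai --help

# To run the shorter "opentrace" entry point through uvx,
# tell uvx which package provides it:
uvx --from opentraceai opentrace index /path/to/repo
```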
Install globally in an isolated environment managed by uv. Re-running the command upgrades it in place.
Best for: daily use from any shell. This is the recommended permanent install.
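Assuming the same opentraceai package name, the install command would be:

```shell
# Installs into an isolated uv-managed environment
# and links the binaries onto your PATH.
uv tool install opentraceai

# Later, to pick up a newer release explicitly:
uv tool upgrade opentraceai
```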
Install into your current Python environment (ideally a venv).
Best for: an environment you already manage with pip.
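A minimal sketch with a fresh virtual environment (the package name opentraceai is assumed from this page):

```shell
# Create and activate a venv, then install into it.
python -m venv .venv
source .venv/bin/activate
pip install opentraceai
```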
Using It¶
The package is named opentraceai, but the CLI entry point is the shorter opentrace (the full opentraceai also works as an alias).
```
opentrace index /path/to/repo   # index a repo into a knowledge graph
opentrace mcp                   # start an MCP server over stdio
opentrace --help                # see all commands
```
The graph is stored at .opentrace/index.db at the repo root. Every opentrace command walks up from your current directory to find it, so you can run commands from any subdirectory.
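For example, assuming a repo already indexed at /path/to/repo (a hypothetical path):

```shell
cd /path/to/repo/src/deep/subdir
# Walks up from the current directory and finds
# /path/to/repo/.opentrace/index.db automatically.
opentrace mcp
```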
MCP Server¶
opentrace mcp starts a Model Context Protocol server over stdio. Any MCP-compatible client (Claude Code, Cursor, etc.) can connect to it to query the graph.
If you're using Claude Code, the plugin handles this for you.
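For other clients, a stdio server entry typically looks like the sketch below. The exact file location and schema depend on the client; this uses the common mcpServers shape, and the server name "opentrace" is an arbitrary label:

```json
{
  "mcpServers": {
    "opentrace": {
      "command": "opentrace",
      "args": ["mcp"]
    }
  }
}
```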
What Next¶
- Run it inside Claude Code? → Claude Code Plugin (installs the CLI automatically)
- Something not working? → Troubleshooting
- See what the graph exposes → Graph Tools