
CLI

Install the opentraceai command-line tool to index repositories and run an MCP server from your terminal.

Prerequisites

  • Python 3.12+
  • uv (recommended): makes uvx and isolated installs trivial. See the uv install guide.
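If you're unsure whether your interpreter meets the version floor, a quick self-contained check (the helper name below is just for illustration, not part of the tool):

```python
# Check that the running interpreter satisfies the Python 3.12+ requirement.
import sys

def meets_requirement(info: tuple = tuple(sys.version_info[:3])) -> bool:
    """True when the given version tuple is at least (3, 12)."""
    return info >= (3, 12)

if __name__ == "__main__":
    status = "OK" if meets_requirement() else "too old, need 3.12+"
    print(f"Python {sys.version.split()[0]}: {status}")
```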

Install

Run OpenTrace without installing it globally — uvx downloads and caches it on first use.

uvx opentraceai index .

Best for: kicking the tires, or using it from a CI job.

Install globally in an isolated environment managed by uv. Re-running the command upgrades it in place.

uv tool install opentraceai --upgrade
opentrace index .

Best for: daily use from any shell. This is the recommended permanent install.

Install into your current Python environment (ideally a venv).

pip install opentraceai
opentrace index .

Best for: an environment you already manage with pip.

Install globally in an isolated environment. Similar to uv tool install but via pipx.

pipx install opentraceai
opentrace index .

Best for: existing pipx users who'd rather not install uv.

Using It

The package is named opentraceai, but the CLI binary is opentrace; the longer opentraceai also works as an alias.

opentrace index /path/to/repo   # index a repo into a knowledge graph
opentrace mcp                   # start an MCP server over stdio
opentrace --help                # see all commands

The graph is stored at .opentrace/index.db at the repo root. Every opentrace command walks up from your current directory to find it, so you can run commands from any subdirectory.

MCP Server

opentrace mcp starts a Model Context Protocol server over stdio. Any MCP-compatible client (Claude Code, Cursor, etc.) can connect to it to query the graph.
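Under the hood, an MCP stdio transport is newline-delimited JSON-RPC: the client launches `opentrace mcp` as a subprocess and writes an `initialize` request to its stdin. A sketch of that first message (field values, including the protocol version string, are illustrative):

```python
# Build the first JSON-RPC message an MCP client sends over the server's
# stdin after launching it. Field values here are illustrative examples.
import json

def initialize_request(client_name: str, client_version: str) -> str:
    """Serialize an MCP 'initialize' request as one newline-delimited line."""
    request = {
        "jsonrpc": "2.0",
        "id": 1,
        "method": "initialize",
        "params": {
            "protocolVersion": "2024-11-05",  # example protocol revision
            "capabilities": {},
            "clientInfo": {"name": client_name, "version": client_version},
        },
    }
    return json.dumps(request) + "\n"
```

In practice your MCP client performs this handshake for you; the sketch just shows why a plain stdio pipe is all the server needs.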

If you're using Claude Code, the plugin handles this for you.

What Next


Other install paths: Browser · Plugin · Source