MCP quickstart guide🔗
This guide walks you through connecting your AI assistant to Lightrun using the Model Context Protocol (MCP). Once connected, your tool can enrich code analysis with real-time runtime context from Lightrun, such as the values of code expressions, execution duration of code sections, and the call stack at specific points. For an overview of Lightrun MCP concepts and architecture, see Lightrun MCP Overview.
Prerequisites🔗
Before you begin, ensure that:
- You have a Lightrun user account within your company’s Lightrun organization.
- You are assigned to a role that has permission to view and copy Lightrun MCP server setup details and access the MCP page in the Lightrun Management Portal. This permission is available for the following roles:
  - Developer
  - Company Admin
  - Group Admin
Connecting the MCP server🔗
These general guidelines apply to most AI assistants that support MCP. Lightrun’s MCP follows the standard for remote MCP via Streamable HTTP (as defined by Anthropic). Many clients use an `mcp.json` (or similar) file to store connection parameters.
Finding the actual MCP URL
The actual MCP URL and configuration snippet for your environment are on the MCP page in the web management portal of your Lightrun server (Platform → Lightrun MCP).
Remote MCP (Streamable HTTP)🔗
The connection entry for Lightrun’s remote MCP server has this form:
"Lightrun": {
"url": "https://your-lightrun-server-domain/mcp"
}
Different parameter naming
If the connection fails, note that some clients expect different parameter names: try `serverUrl` or `httpUrl`, or add `"type": "http"` to the same object.
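As a sketch of where this entry fits: many clients nest server entries under a top-level `mcpServers` key in their `mcp.json` file. The key name is an assumption here and varies by client, so check your assistant's documentation; a complete file might look like:

```json
{
  "mcpServers": {
    "Lightrun": {
      "url": "https://your-lightrun-server-domain/mcp"
    }
  }
}
```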
Local MCP server (proxy)🔗
The following clients support only locally running MCP servers and cannot use a remote MCP URL. For these, use a local MCP proxy such as the mcp-remote Node package to connect to Lightrun’s remote MCP server.
Clients known to support only local MCP
Google: Gemini Code (VSCode and JetBrains plugins), Antigravity. JetBrains: JetBrains AI assistant, Junie.
Use the proxy configuration below in your client's MCP settings:
"Lightrun": {
"command": "npx",
"args": ["-y", "mcp-remote", "https://your-lightrun-server-domain/mcp"]
}
Ensure Node.js and `npx` are available in the environment where the AI assistant runs.
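Assembled into a full configuration file (again assuming a top-level `mcpServers` key, which varies by client), the proxy entry might look like this sketch:

```json
{
  "mcpServers": {
    "Lightrun": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://your-lightrun-server-domain/mcp"]
    }
  }
}
```

With this configuration, the client launches the `mcp-remote` proxy locally over stdio, and the proxy forwards requests to Lightrun’s remote Streamable HTTP endpoint.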
Authenticating to the MCP server🔗
Lightrun’s MCP is authenticated. After you add the MCP server in your AI assistant, you must sign in with your Lightrun user credentials. Once authenticated, the MCP and its tools are available to the assistant.
Action attribution
Actions created in Lightrun by the MCP tools are attributed to your user.
Connecting common AI assistants🔗
The following are step-by-step instructions for some widely used AI assistants.
Cursor🔗
- Go to Cursor > Settings > Cursor Settings.
- Click Tools & MCP.
- Click New MCP Server (or Add Custom MCP) to open the MCP configuration JSON.
- Add Lightrun’s MCP entry to the JSON. For example:

  ```json
  "Lightrun": {
    "url": "https://your-lightrun-server-domain/mcp"
  }
  ```

- Save the file and return to Tools & MCP.
- In Installed MCP Servers, find Lightrun and click Connect.
- When prompted, open the authentication page and complete the sign-in and consent flow.
- When complete, the MCP server shows as enabled and displays the number of loaded tools.
- The Lightrun MCP server is then available to Cursor.
Reconnect the MCP
Use the toggle to turn the MCP server off and on again to reconnect and reload the tools.
GitHub Copilot (VS Code)🔗
Copilot is VS Code’s default AI assistant. Use the MCP extension to add Lightrun.
- Open the Command Palette and run `>MCP: Add Server` (the leading `>` is required).
- Choose HTTP (or HTTP or Server-Sent Events). The MCP server configuration JSON opens.
- Add the Lightrun MCP server to the JSON. Include `"type": "http"` if your configuration requires it:

  ```json
  "Lightrun": {
    "type": "http",
    "url": "https://your-lightrun-server-domain/mcp"
  }
  ```

- Save the configuration file.
- Open the Command Palette and run `>MCP: List Servers`.
- Select the Lightrun MCP server, then choose Start Server.
- When prompted, open the authentication page and complete the sign-in and consent flow.
- The Lightrun MCP server is then available to Copilot.
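As a sketch of the resulting file: VS Code stores workspace MCP configuration in `.vscode/mcp.json` under a top-level `servers` key (the exact layout here is an assumption based on VS Code's MCP configuration format, so verify against the file your installation generates):

```json
{
  "servers": {
    "Lightrun": {
      "type": "http",
      "url": "https://your-lightrun-server-domain/mcp"
    }
  }
}
```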
GitHub Copilot (JetBrains)🔗
- Open Copilot's MCP settings at Settings → Tools → GitHub Copilot → Model Context Protocol (MCP).
- Click Configure under Model Context Protocol (MCP) to open the MCP configuration file.
- Add the Lightrun MCP server to the JSON and save. For example:

  ```json
  "Lightrun": {
    "url": "https://your-lightrun-server-domain/mcp"
  }
  ```

- In the configuration file, a clickable Start link appears above the Lightrun entry. Click Start.
- When prompted, open the authentication page and complete the Lightrun sign-in flow.
- When complete, the Start control above the Lightrun entry changes to running and the loaded tools are shown.
- The Lightrun MCP server is then available to Copilot.
Claude Code🔗
- In a terminal, add the Lightrun MCP server with HTTP transport:

  ```shell
  claude mcp add --transport http Lightrun https://your-lightrun-server-domain/mcp
  ```

  - `--transport http`: use HTTP (Streamable HTTP) for the MCP connection.
  - `Lightrun`: name used for this MCP server in Claude Code.
  - `https://your-lightrun-server-domain/mcp`: your Lightrun MCP endpoint URL.
Control MCP availability
Use `--scope local` to make the server available in the current project only (the default). Use `--scope user` to make it available in all projects.
- Start the MCP by running `/mcp` in Claude Code.
- When prompted, open the authentication page and complete the OAuth flow in the browser.
- The Lightrun MCP server is then available to Claude Code.
Kiro🔗
- Open the Kiro menu (ghost icon on the left side menu).
- In the MCP Servers section, click the edit icon to open the configuration JSON.
- Add the Lightrun MCP server to the JSON and save. For example:

  ```json
  "Lightrun": {
    "url": "https://your-lightrun-server-domain/mcp"
  }
  ```

- In MCP Servers, find Lightrun and click Authenticate to open the authentication page.
- When prompted, open the authentication page and complete the sign-in and consent flow.
- When complete, the Authenticate link is replaced by a checkmark and you can view the loaded tools.
- The Lightrun MCP server is then available to Kiro.
Gemini Code🔗
Gemini Code is available as a VS Code extension and a JetBrains plugin. Both versions support local MCP servers only and cannot connect directly to a remote MCP URL.
To use Lightrun’s MCP, configure the Local MCP server (proxy).
Google Antigravity🔗
Antigravity supports local MCP servers only and cannot connect directly to a remote MCP URL.
To use Lightrun’s MCP, configure the Local MCP server (proxy).
JetBrains AI Assistant🔗
JetBrains AI Assistant supports local MCP servers only and cannot connect directly to a remote MCP URL. MCP settings: Settings → Tools → AI Assistant → Model Context Protocol.
To use Lightrun’s MCP, configure the Local MCP server (proxy).
JetBrains Junie🔗
JetBrains Junie supports local MCP servers only and cannot connect directly to a remote MCP URL. MCP settings: Settings → Tools → Junie → MCP Settings.
To use Lightrun’s MCP, configure the Local MCP server (proxy).
Next steps🔗
You’re ready to debug live applications directly from your AI assistant.
To continue, explore the Supported Lightrun Tools to see which runtime inspection and debugging capabilities are available through Lightrun MCP.