
MCP quickstart guide🔗

This guide walks you through connecting your AI assistant or automation to Lightrun using the Model Context Protocol (MCP). Once connected, your tool can enrich code analysis with real-time runtime context from Lightrun, such as the values of code expressions, execution duration of code sections, and the call stack at specific points. For an overview of Lightrun MCP concepts and architecture, see Lightrun MCP Overview.

Version availability

  • OAuth (personal sign-in): Lightrun MCP with OAuth authentication is available starting Lightrun version 1.76.
  • API key (Bearer token): API key authentication for Lightrun MCP is available starting Lightrun version 1.80.

Finding your MCP URL

The actual MCP URL to use in your configuration depends on your Lightrun deployment:

  • SaaS: Use https://app.lightrun.com/mcp.
  • Single-tenant and on-premises: See the MCP page in the web management portal (Platform → Lightrun MCP).

Choose your setup path🔗

For a comparison of OAuth and API key authentication, see OAuth vs API key in the overview.

AI assistants (OAuth)🔗

Follow these steps when you use an interactive AI assistant and sign in personally with OAuth: you add the MCP server, then complete a browser sign-in flow with your Lightrun credentials.

Step 1: Prerequisites🔗

  • You have a Lightrun user account within your company’s Lightrun organization.
  • Your user is assigned one of the following Lightrun roles: Developer, Company Admin, or Group Admin.
  • Confirm your deployment’s MCP endpoint URL (for example https://app.lightrun.com/mcp on SaaS).

Step 2: Connect Lightrun MCP to the AI assistant🔗

Lightrun’s MCP follows the standard for remote MCP via Streamable HTTP (as defined by Anthropic). Adding a new MCP server is usually done by editing the client’s mcp.json file; some clients also offer a dedicated UI.

Most current clients support remote MCP servers. If yours does not, use a local MCP proxy (such as mcp-remote) to reach Lightrun’s remote MCP server.

These guidelines apply to most AI assistants. Step-by-step instructions for common tools (Cursor, GitHub Copilot, Amazon Q, Claude Code, and others) appear in Detailed instructions for common AI assistants below.

Remote MCP (Streamable HTTP)🔗

The connection entry for Lightrun’s remote MCP server has this form:

"Lightrun": {
  "url": "https://your-lightrun-server-domain/mcp"
}

Different parameter naming

Some clients expect different parameter names. If the connection fails, try serverUrl or httpUrl instead of url, or add "type": "http" to the same object.
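If you prefer to script the configuration change, the following sketch merges the Lightrun entry into a client's mcp.json without clobbering existing servers. It assumes the client nests entries under a top-level mcpServers key, which is common but varies by client; adjust the key and file path for your tool.

```python
import json
from pathlib import Path

def add_lightrun_entry(config_path: str, mcp_url: str) -> dict:
    """Merge a Lightrun entry into a client's mcp.json, keeping existing servers."""
    path = Path(config_path)
    config = json.loads(path.read_text()) if path.exists() else {}
    # Many clients nest server entries under "mcpServers"; adjust for yours.
    servers = config.setdefault("mcpServers", {})
    servers["Lightrun"] = {"url": mcp_url}
    path.write_text(json.dumps(config, indent=2))
    return config

config = add_lightrun_entry("mcp.json", "https://your-lightrun-server-domain/mcp")
print(config["mcpServers"]["Lightrun"]["url"])
```

After running it, restart or reconnect the client so it picks up the new entry.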

Local MCP server (proxy)🔗

Some clients support only locally running MCP servers and cannot use a remote MCP URL. For these, use a local MCP proxy such as the mcp-remote Node package to connect to Lightrun’s remote MCP server.

The connection entry for the local proxy has this form:

"Lightrun": {
  "command": "npx",
  "args": ["-y", "mcp-remote", "https://your-lightrun-server-domain/mcp"]
}

Ensure Node.js and npx are available in the environment where the AI assistant runs.
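You can check this prerequisite with a few lines of standard-library Python (node and npx are the standard Node.js binaries the proxy entry above invokes):

```python
import shutil

# The mcp-remote proxy runs via npx, so Node.js tooling must be on PATH
# in the environment where the AI assistant launches the command.
required = ("node", "npx")
missing = [tool for tool in required if shutil.which(tool) is None]

if missing:
    print(f"Install Node.js first; missing on PATH: {', '.join(missing)}")
else:
    print("Node.js tooling found; the mcp-remote proxy can run.")
```

Note that some assistants launch MCP commands with a minimal environment, so a tool found in your shell may still be missing for the assistant.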

Clients known to support local MCP only

  • Google: Gemini Code (VS Code and JetBrains plugins), Antigravity.
  • JetBrains: JetBrains AI assistant, Junie.

Step 3: Authenticate with OAuth🔗

Lightrun’s MCP is an authenticated MCP server. For personal assistants, after you add the MCP server you must sign in with your Lightrun user credentials when the client prompts you. Once authenticated, the MCP and its tools are available to the assistant.

Step 4: Start using Lightrun MCP🔗

Some AI assistants do not use MCP tools proactively until they have been prompted to do so. To familiarize the assistant with Lightrun MCP and confirm that the tools are available, start a new chat and try one of the following:

  • List tools: Ask the assistant to list all available tools, or to list the Lightrun MCP tools. Confirm that the Lightrun runtime tools appear (for example, the tools described in Supported Lightrun MCP tools).
  • Describe capabilities: Ask what the assistant can do with Lightrun, or how it can inspect a live runtime.
  • Use a tool: Ask the assistant to use specific Lightrun capabilities.
    • Start with a simple one (for example, lightrun__get_runtime_sources to list available Lightrun sources).
    • Advance to other tools, such as getting a call stack or expression values.
    • Use explicit instructions, including choosing tool parameters, so the model learns how the tools are structured.
    • See Supported Lightrun MCP tools for the full list.
  • Add Lightrun to system prompt: Add an instruction to the system prompt to recommend using Lightrun tools whenever relevant.

Once the assistant has listed and used the Lightrun tools, it is more likely to suggest using them on its own when you discuss runtime issues or debugging.

Common AI assistants (Detailed instructions)🔗

The following sections provide step-by-step instructions for connecting specific AI assistants to Lightrun MCP.

Cursor🔗

  1. Go to Cursor → Settings → Cursor Settings.
  2. Click Tools & MCP.
  3. Click New MCP Server (or Add Custom MCP) to open the MCP configuration JSON.
  4. Add Lightrun’s MCP entry to the JSON. For example:

    "Lightrun": {
      "url": "https://your-lightrun-server-domain/mcp"
    }
    
  5. Save the file and return to Tools & MCP.

  6. In Installed MCP Servers, find Lightrun and click Connect.
  7. When prompted, open the authentication page and complete the sign-in and consent flow.
  8. When complete, the MCP shows as enabled and the number of tools loaded.
  9. The Lightrun MCP server is then available to Cursor.

Reconnect the MCP

Use the toggle to turn the MCP server off and on again; this reconnects the server and reloads the tools.


GitHub Copilot (VS Code)🔗

Copilot is VS Code’s default AI assistant. Use VS Code’s MCP support to add Lightrun.

  1. Open the Command Palette and run >MCP: Add Server (the leading > is required).
  2. Choose HTTP (the option may be labeled HTTP or Server-Sent Events). The MCP server configuration JSON opens.
  3. Add the Lightrun MCP server to the JSON. Include "type": "http" if your configuration requires it:

    "Lightrun": {
      "type": "http",
      "url": "https://your-lightrun-server-domain/mcp"
    }
    
  4. Save the configuration file.

  5. Open the Command Palette and run >MCP: List Servers.
  6. Select the Lightrun MCP server, then choose Start Server.
  7. When prompted, open the authentication page and complete the sign-in and consent flow.
  8. The Lightrun MCP server is then available to Copilot.

GitHub Copilot (JetBrains)🔗

  1. Open Copilot’s MCP settings at Settings → Tools → GitHub Copilot → Model Context Protocol (MCP).
  2. Click Configure under Model Context Protocol (MCP) to open the MCP configuration file.
  3. Add the Lightrun MCP server to the JSON and save. For example:

    "Lightrun": {
      "url": "https://your-lightrun-server-domain/mcp"
    }
    
  4. In the configuration file, a clickable Start link appears above the Lightrun entry. Click Start.

  5. When prompted, open the authentication page and complete the Lightrun sign-in flow.
  6. When complete, the Start control above the Lightrun entry changes to running and the loaded tools are shown.
  7. The Lightrun MCP server is then available to Copilot.


Claude Code🔗

  1. In a terminal, add the Lightrun MCP server with HTTP transport:

    claude mcp add --transport http Lightrun https://your-lightrun-server-domain/mcp
    
    • --transport http: use HTTP (Streamable HTTP) for the MCP connection.
    • Lightrun: name used for this MCP server in Claude Code.
    • https://your-lightrun-server-domain/mcp: your Lightrun MCP endpoint URL.

    Control MCP availability

    Use --scope local for the current project (default). Use --scope user for all projects.

  2. Start the MCP by running /mcp in Claude Code.

  3. When prompted, open the authentication page and complete the OAuth flow in the browser.
  4. The Lightrun MCP server is then available to Claude Code.

Amazon Q Developer (VS Code and JetBrains)🔗

  1. Open the Amazon Q / Q Developer panel.
  2. Open the Chat panel.
  3. Click the tools icon to open the MCP configuration UI.
  4. Click the plus (+) symbol to add a server.
  5. Select the scope for this MCP server (Global or Local).
  6. In the Name field, enter Lightrun.
  7. Select http as the transport protocol.
  8. In the URL field, enter your Lightrun MCP endpoint, for example:

    https://your-lightrun-server-domain/mcp
    
  9. Leave Headers empty unless your environment requires custom headers.

  10. Set Timeout to 0 (zero) to disable the timeout. The Lightrun MCP server enforces its own timeouts.
  11. Click Save.
  12. When prompted, complete the Lightrun sign-in flow.
  13. The Lightrun MCP server is then available to Amazon Q in the chat panel.

Connection issues

If a configuration alert appears at the top of the panel, choose Fix Configuration to edit the MCP server settings. Tools from that server may not work until you resolve the alert.


Kiro🔗

  1. Open the Kiro menu (ghost icon on the left side of the window).
  2. In the MCP Servers section, click the edit icon to open the configuration JSON.
  3. Add the Lightrun MCP server to the JSON and save. For example:

    "Lightrun": {
      "url": "https://your-lightrun-server-domain/mcp"
    }
    
  4. In MCP Servers, find Lightrun and click Authenticate to open the authentication page.

  5. Complete the sign-in and consent flow.
  6. When complete, the Authenticate link is replaced by a checkmark and you can view the loaded tools.
  7. The Lightrun MCP server is then available to Kiro.

Gemini Code🔗

Gemini Code is available as a VS Code extension and a JetBrains plugin. Both versions support local MCP servers only and cannot connect directly to a remote MCP URL.
To use Lightrun’s MCP, configure the Local MCP server (proxy).


Google Antigravity🔗

Antigravity supports local MCP servers only and cannot connect directly to a remote MCP URL.
To use Lightrun’s MCP, configure the Local MCP server (proxy).


JetBrains AI Assistant🔗

JetBrains AI Assistant supports local MCP servers only and cannot connect directly to a remote MCP URL. MCP settings: Settings → Tools → AI Assistant → Model Context Protocol.
To use Lightrun’s MCP, configure the Local MCP server (proxy).


JetBrains Junie🔗

JetBrains Junie supports local MCP servers only and cannot connect directly to a remote MCP URL. MCP settings: Settings → Tools → Junie → MCP Settings.
To use Lightrun’s MCP, configure the Local MCP server (proxy).

AI agents and systems (API key)🔗

These steps apply only when the integration runs without an interactive browser login: automation platforms, agent frameworks, or custom services. You authenticate by sending your Lightrun MCP API key in the Authorization header as a Bearer token.

Protect your API key

Store the key in a secret manager or environment variable. Do not commit it to source control. Rotate the key if it is exposed.

Step 1: Prerequisites🔗

  • Obtain a Lightrun system API key with the Dev scope. See Generate an API Key.
    • Generating a system API key requires administrative permissions in the Lightrun Management Portal.
  • Confirm your deployment’s MCP endpoint URL (for example https://app.lightrun.com/mcp on SaaS).

Step 2: Connect Lightrun MCP to the AI agent or system🔗

Lightrun’s MCP follows the standard for remote MCP via Streamable HTTP (as defined by Anthropic).

AI agents and systems cover a wide mix of frameworks, products, and custom code, so how you attach an MCP server differs by product. The subsections below describe common patterns.

Detailed step-by-step instructions for representative clients (LangChain, LlamaIndex, n8n, and others) appear in Detailed instructions for common AI agents and systems below.

Code-based clients🔗

Names such as LangChain and LlamaIndex refer to software frameworks: libraries you use to build or embed agents in your application. Your code must create an MCP client (or equivalent) and pass the usual connection details.

Typical constructor or builder parameters include:

  • The MCP URL
  • Headers that include an Authorization value of the form Bearer <API_KEY> (your Lightrun MCP API key)
  • Sometimes an MCP server name. We recommend Lightrun

Clients that use mcp.json🔗

When your client reads MCP configuration from JSON, add a headers object with Authorization set to Bearer plus your API key (one space after Bearer):

"Lightrun": {
  "url": "https://your-lightrun-server-domain/mcp",
  "headers": {
    "Authorization": "Bearer YOUR_MCP_API_KEY"
  }
}
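The header format is easy to get wrong (missing space, wrong scheme, lowercase bearer). The following sketch builds and sanity-checks the header in Python, reading the key from the LIGHTRUN_MCP_API_KEY environment variable used elsewhere in this guide; the fallback value here is only a placeholder:

```python
import os

# "example-key" is a placeholder; in practice the variable should be set
# from a secret manager, never hard-coded or committed to source control.
os.environ.setdefault("LIGHTRUN_MCP_API_KEY", "example-key")

api_key = os.environ["LIGHTRUN_MCP_API_KEY"]
headers = {"Authorization": f"Bearer {api_key}"}

# Exactly one space between the "Bearer" scheme and the key.
scheme, _, credential = headers["Authorization"].partition(" ")
assert scheme == "Bearer" and credential == api_key
print(headers["Authorization"])
```

Pass the resulting headers dictionary wherever your client accepts custom headers for the MCP connection.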

Clients that provide a dedicated UI🔗

Some products (for example n8n) offer a UI for adding MCP servers. Wording differs by vendor, but you typically set:

  • MCP endpoint or URL
  • Transport or server type (labels vary; look for options such as remote, HTTP, or streamable HTTP)
  • Authentication type (labels vary; look for Bearer token, Bearer auth, or API key)

Common AI agents and systems (Detailed instructions)🔗

The following sections provide step-by-step instructions for connecting specific AI agents and systems to Lightrun MCP.

LangChain (Java)🔗

Use Streamable HTTP transport and pass the Bearer token via custom headers. The following example reads the key from the environment variable LIGHTRUN_MCP_API_KEY:

// Requires java.util.Map plus the langchain4j MCP client classes
// (McpTransport, StreamableHttpMcpTransport, McpClient, DefaultMcpClient)
// from the langchain4j-mcp module; import paths vary by version.
import java.util.Map;

// Create a streamable HTTP transport
McpTransport transport = StreamableHttpMcpTransport.builder()
  .url("https://your-lightrun-server-domain/mcp")
  .customHeaders(Map.of("Authorization", "Bearer " + System.getenv("LIGHTRUN_MCP_API_KEY")))
  .build();

// Create an MCP client using the transport
McpClient mcpClient = DefaultMcpClient.builder()
  .key("Lightrun")
  .transport(transport)
  .build();

Replace the URL with your Lightrun MCP endpoint when not on SaaS. Ensure LIGHTRUN_MCP_API_KEY is set in the process environment before creating the client.


LlamaIndex (Python)🔗

Use BasicMCPClient with a headers dictionary that carries the Bearer token, then expose MCP tools to your agent with McpToolSpec. The example reads the endpoint from the environment variable LIGHTRUN_MCP_URL and the key from LIGHTRUN_MCP_API_KEY:

import os

from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

MCP_URL = os.environ["LIGHTRUN_MCP_URL"]
MCP_HEADERS = {
    "Authorization": f"Bearer {os.environ['LIGHTRUN_MCP_API_KEY']}",
}

client = BasicMCPClient(MCP_URL, headers=MCP_HEADERS)

# to_tool_list_async is a coroutine: run it inside an async function or event loop.
tools = await McpToolSpec(client=client).to_tool_list_async()

n8n🔗

n8n can call Lightrun MCP using the Lightrun MCP Client node as a tool wired into an AI Agent node. The agent decides when to invoke Lightrun tools (for example get_runtime_sources); the Lightrun node runs the tool call and returns structured JSON that the agent can pass back to the user.

  1. Add an AI Agent node to the flow.
  2. Connect a Chat Model node to the AI Agent node’s Chat Model input.
  3. Connect an MCP Client node to the AI Agent node’s Tool input.
  4. Open the MCP client node and set:
    • Endpoint — your Lightrun MCP URL.
    • Server Transport — HTTP Streamable.
    • Authentication — Bearer Auth.
    • Credential for Bearer Auth — your Lightrun MCP API key.
    • Tools to Include — All (recommended), or only the tools you need.
  5. Save the workflow.

Troubleshooting🔗

My client doesn't support remote MCP

Solution

If your AI assistant does not support connecting to a remote MCP URL, it can only use MCP servers that run locally. In that case, use a local proxy to reach Lightrun’s remote MCP server. For setup, see Local MCP server (proxy) under Step 2: Connect Lightrun MCP to the AI assistant above.

Cursor shows the MCP as connected with all tools, but I can't use them in chat

Solution

Sometimes Cursor displays the Lightrun MCP as connected and shows the tool count, but the tools are not available in the current chat. To fix this:

  1. Open a new chat.
  2. In the new chat, ask the assistant to list all available tools (or similar).
  3. Confirm that the Lightrun runtime tools (for example, the tools described in Supported Lightrun MCP tools) appear in the list.
  4. Once the tools are listed, you can start using them in that chat.

If the tools still do not appear, use the Reconnect the MCP tip: in Tools & MCP, turn the Lightrun MCP server off and on again, then open a new chat and repeat the steps above.


Next steps🔗

You’re ready to debug live applications directly from your AI assistant or agent.

To continue, explore Supported Lightrun MCP tools to see which runtime inspection and debugging capabilities are available through Lightrun MCP.


Last update: March 19, 2026