Observability MCP server
Overview
This topic describes the LaunchDarkly observability MCP server, which lets AI clients query observability data, build dashboards, and triage incidents using natural language.
The observability MCP server is one of three hosted MCP servers that LaunchDarkly offers. Each server is focused on a different product area, and the three servers are independent. You can enable any combination of them.
If you use LaunchDarkly’s observability product and want to query it from your AI client, enable the observability server.
Related to Vega
The observability MCP server and Vega for auto-remediation are complementary. Vega runs inside LaunchDarkly and investigates issues from observability views or alerts. The observability MCP server exposes the same underlying data to your AI client, so the agent can query it directly while it’s working on your code. To learn more about the relationship, read Observability MCP server and Vega.
Prerequisites
To use the observability MCP server, you need:
- An AI client that supports MCP, such as Cursor, Claude Code, VS Code with Copilot, or Windsurf.
- A LaunchDarkly account with the observability product enabled. To learn more, read Observability.
Enable the observability MCP server
The observability MCP server is a hosted server that connects your AI client to LaunchDarkly using OAuth. To enable it, follow the instructions in LaunchDarkly hosted MCP server and use the URL https://mcp.launchdarkly.com/mcp/observability.
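Many MCP-capable clients accept a JSON entry that points at a hosted server's URL. The following is a sketch only: the exact file location and field names vary by client, and the server name "launchdarkly-observability" is an arbitrary label, so check your client's MCP configuration documentation for the precise shape.

```json
{
  "mcpServers": {
    "launchdarkly-observability": {
      "url": "https://mcp.launchdarkly.com/mcp/observability"
    }
  }
}
```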
For example, to manually configure the observability MCP server in Claude Code:
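The following is a minimal sketch using the `claude mcp add` command. The server name `launchdarkly-observability` is an arbitrary local label, and flag names may differ across Claude Code versions, so verify with `claude mcp add --help` before running it:

```shell
# Register the hosted observability MCP server over HTTP.
# "launchdarkly-observability" is an arbitrary local name for the server.
claude mcp add --transport http launchdarkly-observability \
  https://mcp.launchdarkly.com/mcp/observability
```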
The first time you invoke a tool, the AI client prompts you to authorize access to your LaunchDarkly account. After that, tokens are stored and refreshed automatically.
Use the observability MCP server
After you enable the observability MCP server, you can prompt your agent to query your observability data. Typically, your AI client asks you to click Run tool (or its equivalent) before it executes each tool call.
For example, you could try asking:
Show me error groups from the last 24 hours for the checkout service
or
Find the slowest traces in the last hour where service_name is “gonfalon-web”
or
Create a dashboard that shows error rate and p95 latency for the payments service
or
Which flag evaluations happened during session <session-id>?
Available tools
The observability MCP server exposes the tools described in the following sections.
Querying data
- query-logs: Query project logs with a date range and an optional query filter (for example, level=error AND service_name="api").
- query-traces: Query project traces with a date range and filter. Supports sorting by duration to find the slowest traces.
- query-error-groups: Query project error groups with filters like error_type, exception.message, or service_name.
- query-sessions: Query project sessions, including filtering by identifier, errors, or session attributes.
- query-aggregations: Retrieve bucketed, aggregated counts over time for a product type. Useful for charts and trends rather than individual records.
- query-flag-evaluations: Query flag evaluations for a specific session.
- query-timeline-events: Query timeline indicator events for a specific session.
Discovering the schema
- get-keys: Discover available data keys and dimensions for a product type (logs, traces, sessions, errors, metrics). Use this before building dashboards or crafting complex filters.
Dashboards
- list-dashboards: List existing dashboards for a project.
- get-dashboard: Get detailed information about a specific dashboard.
- create-dashboard: Create a new, empty dashboard for organizing charts.
- create-graph: Add a chart or graph to an existing dashboard.
To learn about each tool’s inputs and outputs, review the tools list in your AI client after you enable the server.
Observability MCP server and Vega
The observability MCP server and Vega both help you understand observability data with AI assistance, but they run in different places and support different workflows. Vega runs inside LaunchDarkly and investigates issues from observability views or alerts, while the observability MCP server runs in your AI client and queries the same underlying data on demand.
You can use both the observability MCP server and Vega. For example, you might let Vega triage an alert inside LaunchDarkly and open a draft pull request, then switch to your AI client and use the observability MCP server to dig deeper into the affected traces while you review the fix.
Additional resources
- LaunchDarkly MCP server overview
- LaunchDarkly hosted MCP server
- Vega for auto-remediation
- Observability
- Search specification
For issues or feature requests, visit the LaunchDarkly MCP server GitHub repository.