Every project ships with an MCP server.
One URL, one token, twelve tools. Plug your API docs straight into Claude Desktop, Cursor, Cline, or any MCP-aware tool — models can list endpoints, generate typed clients, run real calls, and verify responses against the spec, all without you copy-pasting a single snippet.
What is MCP?
An open standard for AI assistants to call your tools.
Where every AI tool used to need a custom integration, the Model Context Protocol defines one shared interface — and every MCP-aware client (Claude, Cursor, Cline, Continue) can use it.
Outworx already has a parsed spec, generated TypeScript types, and a hosted Try-It engine. MCP is the natural fit: every capability becomes a tool the model can invoke.
The model wants to operate your API
A user asks Claude or Cursor to do something with your API. Without MCP, the model can only describe what to do; with MCP, it can actually do it.
Outworx exposes 12 tools
Discovery (list / search / get), code generation (types / SDK / examples), live ops (execute_endpoint, get_response_schema), Q&A (ask_docs), and audit (list_recent_calls, get_activity).
Calls go through your token
execute_endpoint runs with the user's own credentials — Outworx never sees them. Per-tool permission toggles let you keep destructive operations out of the model's reach.
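To make the pass-through model concrete, here is a hypothetical sketch (not Outworx source code) of how a server could forward the caller's token on execute_endpoint without persisting it, with a per-tool policy map gating destructive operations. The `toolPermissions` table and tool names other than execute_endpoint are illustrative.

```typescript
// Hypothetical sketch: forward the caller's credential, never store it,
// and gate each tool behind an explicit permission toggle.
type ToolPolicy = { allowed: boolean };

const toolPermissions: Record<string, ToolPolicy> = {
  execute_endpoint: { allowed: true },
  // Destructive operations can be toggled off per tool:
  delete_resource: { allowed: false },
};

function buildUpstreamRequest(
  tool: string,
  url: string,
  userToken: string,
): { url: string; headers: Record<string, string> } {
  if (!toolPermissions[tool]?.allowed) {
    throw new Error(`tool ${tool} is disabled by policy`);
  }
  // The token is passed straight through to the target API.
  return { url, headers: { Authorization: `Bearer ${userToken}` } };
}
```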
Responses refine the schema
Every live call captures the response shape, infers a JSON schema, and merges it into the stored types. Generated TypeScript improves automatically.
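The capture-and-infer step can be sketched as follows — an illustrative TypeScript function (not Outworx source) that derives a minimal JSON-schema-like shape from one sampled response body:

```typescript
// Illustrative: infer a minimal schema-like shape from a sampled response.
type Shape =
  | { type: "string" | "number" | "boolean" | "null" }
  | { type: "array"; items: Shape }
  | { type: "object"; properties: Record<string, Shape> };

function inferShape(value: unknown): Shape {
  if (value === null) return { type: "null" };
  if (Array.isArray(value)) {
    // Sample the first element; an empty array carries no shape information.
    return { type: "array", items: value.length ? inferShape(value[0]) : { type: "null" } };
  }
  if (typeof value === "object") {
    const properties: Record<string, Shape> = {};
    for (const [k, v] of Object.entries(value as object)) properties[k] = inferShape(v);
    return { type: "object", properties };
  }
  return { type: typeof value as "string" | "number" | "boolean" };
}
```

Each inferred shape would then be merged into the stored schema for that endpoint, so the generated types grow toward what the API actually returns.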
Connect in 30 seconds
Pick your IDE. Paste the snippet.
The dashboard's MCP tab generates a one-click connection snippet for every supported client. Paste it into the config and your AI assistant has 12 new tools.
claude_desktop_config.json

```json
{
  "mcpServers": {
    "outworx-acme": {
      "url": "https://docs.outworx.io/api/mcp/acme",
      "headers": {
        "Authorization": "Bearer <token>"
      }
    }
  }
}
```

.cursor/mcp.json

```json
{
  "mcpServers": {
    "outworx-acme": {
      "url": "https://docs.outworx.io/api/mcp/acme",
      "headers": {
        "Authorization": "Bearer <token>"
      }
    }
  }
}
```

VS Code · settings.json

```json
"cline.mcpServers": {
  "outworx-acme": {
    "url": "https://docs.outworx.io/api/mcp/acme",
    "headers": {
      "Authorization": "Bearer <token>"
    }
  }
}
```

.continue/config.json

```json
{
  "mcpServers": [
    {
      "name": "outworx-acme",
      "transport": "http",
      "url": "https://docs.outworx.io/api/mcp/acme"
    }
  ]
}
```

The 12 tools
Discovery, codegen, live ops, Q&A, audit.
Discovery
list_endpoints: Browse the spec.
search_endpoints: Natural-language search.
get_endpoint: Schema + parameters.
list_versions: Multi-version aware.
Code generation
generate_types: Whole spec or one endpoint.
generate_client: TS or Python SDK file.
generate_example: cURL, Python, Go, etc.
Live ops
execute_endpoint: Real call, user's creds.
get_response_schema: Inferred from samples.
Q&A
ask_docs: RAG-grounded chat as a tool.
Audit
list_recent_calls: Last 50 invocations.
get_activity: Per-tool / per-day stats.
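On the wire, MCP tool calls are JSON-RPC 2.0 requests with the `tools/call` method. The envelope below follows the MCP spec; the tool name matches the list above, but the argument key (`query`) is an illustrative assumption, not a documented Outworx parameter:

```typescript
// Build an MCP tools/call request (JSON-RPC 2.0 envelope per the MCP spec).
function toolCall(id: number, name: string, args: Record<string, unknown>) {
  return {
    jsonrpc: "2.0" as const,
    id,
    method: "tools/call",
    params: { name, arguments: args },
  };
}

const req = toolCall(1, "search_endpoints", { query: "create a user" });
// Sent to the server URL with the same Authorization header as the config:
// fetch("https://docs.outworx.io/api/mcp/acme", {
//   method: "POST",
//   headers: { Authorization: "Bearer <token>", "Content-Type": "application/json" },
//   body: JSON.stringify(req),
// });
```

Your MCP client builds these requests for you; the sketch only shows what crosses the wire.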
Self-healing TypeScript
Types that improve as the API runs.
When the spec under-documents a response, MCP calls infer the shape from real traffic. Every sample merges into the stored schema, so generated TypeScript stays accurate as your API evolves — without you re-authoring the spec.
12 tools out of the box
Standard MCP, fully scoped to one project. Per-tool permission toggles for destructive ops.
Sparse-spec recovery
An OpenAPI file with missing response schemas slowly fills in. Every live call captures, samples, and merges.
Drift detection on every call
If the API returns a new field or changes a type, the next call surfaces it, widens the union, and invalidates the cached types.
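A widening merge of this kind can be sketched as follows — an illustrative TypeScript fragment (not Outworx source) that folds a newly sampled response into a stored per-field type map, turning conflicts into unions instead of overwriting:

```typescript
// Illustrative drift handling: each field maps to the set of primitive type
// names seen so far; a conflicting sample widens the set, never shrinks it.
type FieldTypes = Record<string, Set<string>>;

function mergeSample(stored: FieldTypes, sample: Record<string, unknown>): FieldTypes {
  const merged: FieldTypes = { ...stored };
  for (const [field, value] of Object.entries(sample)) {
    const t = value === null ? "null" : typeof value;
    const existing = merged[field] ?? new Set<string>();
    merged[field] = new Set(existing).add(t); // widen: never drop a seen type
  }
  return merged;
}

// A field stored as "string" that now arrives as a number becomes the union
// string | number, which the next generate_types run would emit.
```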
Calls go through the user's credentials
Outworx never sees the credential. Per-tool permission toggles + per-IP rate limits cap the blast radius.
The spec declares:

```
{
  "id": string,
  "email": string
}
```

A live call returns:

```json
{
  "id": "u_8f3a",
  "email": "ada@acme.io",
  "created_at": "2026-05-07T10:32:00Z",
  "preferences": { "theme": "dark" }
}
```

The merged type becomes:

```
{
  "id": string,
  "email": string,
  /** sampled */
  "created_at": string,
  /** sampled */
  "preferences": { "theme": string }
}
```

generate_types emits the wider type. The cached version is invalidated if the next sample widens the schema again.

Pairs well with
AI Chat
Same retrieval engine, exposed as a public docs drawer. Visitors get answers; engineers get the MCP tools.
SDK Generator
MCP's generate_client tool emits the same drop-in TS / Python SDK you'd download from the dashboard.
Mock Server
When the model wants to iterate without touching prod, point execute_endpoint at the mock URL instead.
Make your API native to AI assistants.
Pro plan unlocks an MCP server per project. Connect it to Claude, Cursor, Cline, or Continue in 30 seconds.