If your AI client supports MCP, it works with UniversalBench. The protocol is open, and the connection details are always the same: one URL.

## Documentation Index
Fetch the complete documentation index at: https://docs.universalbench.dev/llms.txt
Use this file to discover all available pages before exploring further.
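As an illustration, an agent can pull that index with a plain HTTP GET — a stdlib sketch (the helper name is ours; the docs host needs no auth):

```python
import urllib.request

DOCS_INDEX_URL = "https://docs.universalbench.dev/llms.txt"

def fetch_docs_index(url: str = DOCS_INDEX_URL) -> list[str]:
    """Download llms.txt and return its non-empty lines, one page entry each."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        text = resp.read().decode("utf-8")
    return [line for line in text.splitlines() if line.strip()]
```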
## Connection details
| Field | Value |
|---|---|
| Personal MCP URL | https://mcp.universalbench.dev/u/ub_live_xxxxxxxxxxxxxxxxxxxxx |
| Protocol | MCP over HTTP, SSE transport supported |
| Auth | Baked into the URL path. No headers required |
| Content type | application/json |
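Because auth is baked into the URL, a typical client entry needs only that one field. A hypothetical `mcpServers` config sketch (exact key names vary by client):

```json
{
  "mcpServers": {
    "universalbench": {
      "url": "https://mcp.universalbench.dev/u/ub_live_xxxxxxxxxxxxxxxxxxxxx"
    }
  }
}
```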
Power user alternative: if you are building a server that needs to multiplex requests across customers, you can use the generic endpoint https://mcp.universalbench.dev plus an X-UB-Key header for each customer. Most users do not need this; use the personal URL unless you are explicitly building a router.

## Confirmed working clients
### Continue

Open source AI coding assistant. Add UB to ~/.continue/config.json under mcpServers with just a url field.

### Zed Editor
Set up via Zed’s MCP extension. Paste your URL. UB tools appear in the assistant panel.
### Cline (VS Code)
Add UB under “MCP Servers” in Cline’s settings panel with the URL field only.
### ChatGPT Custom GPTs
Use the OpenAPI bridge described below to expose UB as Custom GPT Actions.
### Bolt, Lovable, v0
Any agentic builder that supports MCP. Paste your URL in their integrations panel.
### Your own agent
Build with @modelcontextprotocol/sdk (Node) or mcp (Python). See below.

## Build your own agent (Python)
## Build your own agent (Node, TypeScript)
## Direct HTTP, no MCP SDK
If you cannot or do not want to use the MCP protocol, hit the underlying workbench HTTP endpoint directly:

## ChatGPT bridge
ChatGPT does not speak MCP natively yet, but you can expose UniversalBench as a Custom GPT Action via OpenAPI. The OpenAPI spec is auto-generated and lives at:

## What if my client is not listed?
If your AI client supports MCP, UniversalBench works. If you hit any compatibility issue, email hi@universalbench.dev with the client name and we will publish a working config.