Documentation Index

Fetch the complete documentation index at: https://docs.universalbench.dev/llms.txt

Use this file to discover all available pages before exploring further.

If your AI client supports MCP, it works with UniversalBench. The protocol is open and the connection details are always the same: one URL.

Connection details

Personal MCP URL: https://mcp.universalbench.dev/u/ub_live_xxxxxxxxxxxxxxxxxxxxx
Protocol: MCP over HTTP; SSE transport supported
Auth: baked into the URL path; no headers required
Content type: application/json
Power user alternative: If you are building a server that needs to multiplex requests across customers, you can use the generic endpoint https://mcp.universalbench.dev plus an X-UB-Key header for each customer. Most users do not need this. Use the personal URL unless you are explicitly building a router.
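A minimal sketch of that multiplexing pattern in Python. The customer key is a placeholder, and it assumes the generic endpoint exposes the same /workbench/execute path as the personal URL; verify the path against the docs index before relying on it.

```python
import json

# Generic endpoint for multi-customer routers; per-customer keys go in a header.
GENERIC_ENDPOINT = "https://mcp.universalbench.dev"

def build_request(customer_key: str, code: str):
    """Return (url, headers, body) for one customer's workbench call.

    Assumes the generic endpoint uses the same /workbench/execute path
    as the personal URL (an assumption, not confirmed by the docs above).
    """
    url = f"{GENERIC_ENDPOINT}/workbench/execute"
    headers = {"Content-Type": "application/json", "X-UB-Key": customer_key}
    body = json.dumps({"code": code})
    return url, headers, body

url, headers, body = build_request("ub_live_customer_a", "print('hi')")
```

Each customer's key rides in the X-UB-Key header while the URL stays constant, which is what lets one server fan requests out across accounts.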

Confirmed working clients

Continue

Open source AI coding assistant. Add UB to ~/.continue/config.json under mcpServers with just a url field.
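One plausible shape for that ~/.continue/config.json entry, following the "mcpServers with just a url field" description above; the exact nesting and the server name are assumptions and may differ across Continue versions:

```json
{
  "mcpServers": {
    "universalbench": {
      "url": "https://mcp.universalbench.dev/u/ub_live_xxxxxxxxxxxxxxxxxxxxx"
    }
  }
}
```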

Zed Editor

Set up via Zed’s MCP extension. Paste your URL. UB tools appear in the assistant panel.

Cline (VS Code)

Add UB under “MCP Servers” in Cline’s settings panel with the URL field only.

ChatGPT Custom GPTs

Use the OpenAPI bridge described below to expose UB as Custom GPT Actions.

Bolt, Lovable, v0

Any agentic builder that supports MCP. Paste your URL in their integrations panel.

Your own agent

Build with @modelcontextprotocol/sdk (Node) or mcp (Python). See below.

Build your own agent (Python)

import asyncio

from mcp import ClientSession
from mcp.client.sse import sse_client


async def main():
    # The key is baked into the URL path, so no auth headers are needed.
    async with sse_client(
        "https://mcp.universalbench.dev/u/ub_live_xxxxxxxxxxxxxxxxxxxxx"
    ) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover the UB tools exposed by the server.
            tools = await session.list_tools()

            # Run a snippet in the remote workbench.
            result = await session.call_tool(
                "workbench_execute",
                {"code": "print(sum(range(1000)))"}
            )
            print(result.content)


asyncio.run(main())

Build your own agent (Node, TypeScript)

import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { SSEClientTransport } from "@modelcontextprotocol/sdk/client/sse.js";

const transport = new SSEClientTransport(
  new URL("https://mcp.universalbench.dev/u/ub_live_xxxxxxxxxxxxxxxxxxxxx")
);

const client = new Client(
  { name: "my-agent", version: "1.0.0" },
  { capabilities: {} }
);
await client.connect(transport);

// Discover the UB tools exposed by the server.
const tools = await client.listTools();

// Run a snippet in the remote workbench.
const result = await client.callTool({
  name: "workbench_execute",
  arguments: { code: "print('hello from my agent')" }
});
console.log(result.content);

Direct HTTP, no MCP SDK

If you cannot or do not want to use the MCP protocol, hit the underlying workbench HTTP endpoint directly:
curl -X POST https://mcp.universalbench.dev/u/ub_live_xxxxxxxxxxxxxxxxxxxxx/workbench/execute \
  -H "Content-Type: application/json" \
  -d '{
    "code": "import requests; print(requests.get(\"https://api.github.com\").status_code)"
  }'
This works from any language with an HTTP client. Use it for CI pipelines, webhook handlers, server-side cron jobs, bash scripts, or anything that is not strictly "an AI agent".
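The same call sketched with Python's standard library, for scripts and cron jobs. The token in the URL is the same placeholder used above; nothing is sent over the network until you call urlopen.

```python
import json
import urllib.request

# Personal endpoint; replace the placeholder token with your real key.
URL = "https://mcp.universalbench.dev/u/ub_live_xxxxxxxxxxxxxxxxxxxxx/workbench/execute"

def make_request(code: str) -> urllib.request.Request:
    """Build the POST request; pass it to urllib.request.urlopen to execute."""
    return urllib.request.Request(
        URL,
        data=json.dumps({"code": code}).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = make_request("print('hello from a cron job')")
# urllib.request.urlopen(req)  # uncomment to actually send the request
```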

ChatGPT bridge

ChatGPT does not speak MCP natively yet, but you can expose UniversalBench as a Custom GPT Action via OpenAPI. The OpenAPI spec is auto-generated and lives at:
https://penantia-mcp-production.up.railway.app/openapi.json
In the Custom GPT builder, paste that URL under "Actions → Import from URL". ChatGPT reads the schema, and your GPT can then call every UB capability. For auth, set the Authentication type to "API Key" and supply your UB token.

What if my client is not listed?

If your AI client supports MCP, UniversalBench works. If you hit any compatibility issue, email hi@universalbench.dev with the client name and we will publish a working config.