Documentation Index
Fetch the complete documentation index at: https://docs.universalbench.dev/llms.txt
Use this file to discover all available pages before exploring further.
Step 1: Get your personal MCP URL
Sign up at universalbench.dev; your dashboard shows your personal MCP URL.

Why one URL instead of an endpoint plus an API key?

Most MCP services give you a shared endpoint and ask you to add an X-API-Key header in your client config. That works, but every AI client handles custom headers differently: some break silently when headers are present, some require an extra plugin, and some force the header through OAuth flows.

UniversalBench bakes your identity into the URL itself. One paste. Works in every MCP client. No header config, no env vars, no config bugs. Your URL is your auth, your scope, and your routing all in one.

Step 2: Paste it into your AI
Add the URL to your AI client’s MCP config.

- Claude Desktop
- Cursor, Claude Code, Windsurf
- Direct API
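For Claude Desktop, which launches MCP servers over stdio, one common way to wire in a URL-based server is the mcp-remote bridge. This is a sketch, assuming the standard claude_desktop_config.json shape; the server name and placeholder URL are illustrative, not UniversalBench's documented values:

```json
{
  "mcpServers": {
    "universalbench": {
      "command": "npx",
      "args": ["mcp-remote", "https://<your-personal-mcp-url>"]
    }
  }
}
```

Clients with native remote-server support (such as Cursor or Windsurf) may accept the URL directly in their own config format instead of going through a bridge.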
Edit ~/Library/Application Support/Claude/claude_desktop_config.json on macOS or %APPDATA%\Claude\claude_desktop_config.json on Windows, then restart Claude Desktop. UniversalBench tools appear in the tool list within seconds.

Step 3: Run your first call
Ask your AI something that requires execution. For example:

"Use UniversalBench to find all prime numbers under 10000 and tell me the largest gap between consecutive primes."

Your AI routes this to UniversalBench, which runs the Python and returns just the answer. Your AI then synthesizes a clean response. Token usage drops 60 to 95 percent compared to the AI doing this in its own context window.
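Under the hood, the executed code could be as small as this sketch (plain Python; not necessarily what UniversalBench generates):

```python
def primes_below(n):
    """Sieve of Eratosthenes: all primes strictly below n."""
    sieve = bytearray([1]) * n
    sieve[0:2] = b"\x00\x00"  # 0 and 1 are not prime
    for i in range(2, int(n ** 0.5) + 1):
        if sieve[i]:
            sieve[i * i::i] = b"\x00" * len(sieve[i * i::i])
    return [i for i, is_prime in enumerate(sieve) if is_prime]

primes = primes_below(10000)
largest_gap = max(b - a for a, b in zip(primes, primes[1:]))
print(largest_gap)  # 36 (between 9551 and 9587)
```

Only the final number travels back to the AI, instead of 1,229 primes crowding its context window.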
Step 4: What to try next

- Query a database: use db_select to read from Supabase with structured filters, ordering, and adaptive caching.
- Ship code safely: validate_and_push and safe_deploy catch syntax errors and auto-roll back if smoke tests fail.
- Store secrets: secrets_vault keeps your API keys encrypted server-side so your AI never sees the value.
- Run things in parallel: parallel_blocks runs up to eight code blocks concurrently for faster pipelines.
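The parallel_blocks API itself isn't shown here, but the idea it names, running independent code blocks concurrently and collecting their results in order, can be sketched in plain Python (this is a conceptual analogue, not the tool's interface):

```python
from concurrent.futures import ThreadPoolExecutor

# Three independent "blocks" of work, stand-ins for real pipeline steps.
blocks = [
    lambda: sum(range(1000)),
    lambda: len("universalbench"),
    lambda: max(3, 1, 4, 1, 5),
]

# Up to eight workers, mirroring the concurrency limit mentioned above.
with ThreadPoolExecutor(max_workers=8) as pool:
    futures = [pool.submit(block) for block in blocks]
    results = [f.result() for f in futures]  # results keep submission order

print(results)  # [499500, 14, 5]
```

Because the blocks are independent, total wall time approaches that of the slowest block rather than the sum of all of them.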