Alpic Tunnel: test your MCP app directly on ChatGPT & Claude before deploying
A stable dev URL, reachable from anywhere.
When building an MCP app, there's always a gap between local dev and testing on a real LLM client. Your server runs fine locally, but ChatGPT and Claude can't reach localhost, so at some point you have to expose it to the internet, and to the various desktop, mobile and web versions of your target platforms.
The workaround most developers reach for is ngrok, which has two concrete problems: the free tier injects an interstitial page that Claude refuses to bypass, and the URL changes on every restart, which means reinstalling your custom connector each time you run the command.
On top of that, WebSocket traffic, which HMR relies on, is often unreliable through generic tunnels like ngrok or cloudflared, so hot reload on a remote device tends to break too.
Luckily, Alpic Tunnel now gives you a stable, authenticated public URL on *.alpic.dev, tied to your Alpic account, that proxies traffic directly to your local server. Install it once in ChatGPT or Claude and leave it: the URL stays the same across sessions.
What you get
In practice, here’s what changes for your dev workflow:
- A stable subdomain, generated from your Alpic account and persisted across sessions. No reinstalling your connector on every run.
- Full HTTP and WSS support, so HMR works through the tunnel on any device, including mobile.
- Access to the Alpic playground at /try: a real LLM chat interface served directly on your tunnel URL (e.g. https://superb-marmot-fondue-420.alpic.dev/try), so you can test your local MCP server or MCP app in context without installing it in ChatGPT or Claude first.
- Auth built in, tied to your Alpic login with no extra setup required.
- Request logging: timestamp, method, and path for the last few requests, visible directly from the CLI.
Get started
If you're starting from scratch, we highly recommend our open-source TypeScript framework Skybridge as the fastest path to a working MCP app. One command scaffolds your project with the full dev environment already wired up: devtools, tunnel, and a deployment pipeline that goes straight to Alpic.
If you’re already using Skybridge, just add --tunnel to your dev command:
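A minimal sketch, assuming a standard npm-based Skybridge project whose dev server runs via an npm script; the script name in your project may differ:

```shell
# Start the Skybridge dev server with the Alpic tunnel enabled.
# (`npm run dev` is an assumed script name; `--` passes the flag
# through to the underlying dev command.)
npm run dev -- --tunnel
```

The extra `--` is standard npm behavior for forwarding arguments to the script itself rather than to npm.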
And if you're using a different framework entirely, Alpic tunnel works standalone. Simply install the Alpic CLI and point the tunnel at your local port:
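A hedged sketch of the standalone flow; the install command, subcommand, and flag names below are assumptions for illustration, so check docs.alpic.ai/cli/tunnel for the exact invocation:

```shell
# Install the Alpic CLI (assumed package name; see the docs for
# the official install instructions).
npm install -g @alpic/cli

# Point the tunnel at the port your local MCP server listens on
# (port 3000 and the flag syntax here are assumptions).
alpic tunnel --port 3000
```

Once the tunnel is up, the CLI prints your stable *.alpic.dev URL, which you can paste into ChatGPT or Claude as a custom connector.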
Either way, your server will be reachable on Claude and ChatGPT before you finish your coffee.
The full documentation can be found at docs.alpic.ai/cli/tunnel.