
Launch week recap: Alpic releases and industry news

Friday, December 12, 2025

This launch week was about closing the gap between building MCP servers and ChatGPT Apps, and getting them in front of real users. Over the past months, Alpic has become a reliable place to deploy, monitor, and distribute MCP servers. What we kept hearing, though, was consistent: "we're still struggling to build."

That insight shaped our most important announcement of the week.

Skybridge: helping you build ChatGPT Apps

We formally introduced Skybridge, our open-source TypeScript framework designed to make it easier to build ChatGPT Apps. Unlike a typical launch-week “feature,” Skybridge is a deeper investment in developer experience.

ChatGPT Apps represent a shift in how AI-powered experiences are built. Through MCP, developers can now render interactive widgets directly inside conversations. Rich UI elements live alongside the model’s reasoning, turning ChatGPT into a real distribution surface rather than just an interface.

When we started building real apps on top of the OpenAI Apps SDK, it became clear that important pieces were missing. The SDK gives powerful primitives through the window.openai interface, but it leaves developers to solve most of the modern frontend and state-management problems on their own. More importantly, ChatGPT Apps introduce two parallel surfaces: what users see in the UI and what the model understands through context. Keeping those aligned is essential.
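To ground the state-management point, here is roughly what a widget calling back into its server looks like against the raw window.openai surface. The refresh_board tool, its result shape, and the inline Window typing are hypothetical; the part to notice is the loading, error, and result plumbing that every widget ends up rewriting by hand.

```tsx
// Sketch of the manual approach on the raw window.openai surface.
// The tool name, result shape, and the Window typing below are assumptions.
import { useState } from "react";

declare global {
  interface Window {
    openai: {
      callTool: (name: string, args: Record<string, unknown>) => Promise<unknown>;
    };
  }
}

function RefreshButton({ boardId }: { boardId: string }) {
  const [isLoading, setIsLoading] = useState(false);
  const [error, setError] = useState<unknown>(null);
  const [board, setBoard] = useState<unknown>(null);

  async function refresh() {
    setIsLoading(true);
    setError(null);
    try {
      // callTool is the low-level primitive widgets use to invoke server tools.
      const result = await window.openai.callTool("refresh_board", { boardId });
      setBoard(result);
    } catch (err) {
      setError(err);
    } finally {
      setIsLoading(false);
    }
  }

  return (
    <div>
      <button disabled={isLoading} onClick={refresh}>
        {isLoading ? "Refreshing…" : "Refresh board"}
      </button>
      {error != null && <p>Something went wrong.</p>}
      {board != null && <pre>{JSON.stringify(board, null, 2)}</pre>}
    </div>
  );
}
```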

Skybridge exists to address exactly that. It provides a React runtime with hooks and components to render widgets inside ChatGPT’s iframe environment, a drop-in replacement for the official MCP SDK that adds widget registration and type inference, and a Vite-based dev server with proper hot reload.

One of the core ideas behind Skybridge is dual-surface alignment. With the data-llm attribute, any React node can declaratively expose context to the model. Only what is actually rendered, and therefore visible to the user, is shared with the LLM. This avoids brittle imperative approaches and ensures UI and model context stay synchronized by construction.
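As a sketch of what that looks like in a component (the FlightCard component and its props are made up, and the exact value semantics of data-llm are Skybridge's to define; it is shown here carrying a short description string as an assumption):

```tsx
// Illustrative only: FlightCard and its props are hypothetical.
type Flight = { from: string; to: string; price: number };

function FlightCard({ flight }: { flight: Flight }) {
  return (
    // data-llm declaratively exposes this node's context to the model.
    <div data-llm={`Flight from ${flight.from} to ${flight.to}, ${flight.price} USD`}>
      <h3>
        {flight.from} → {flight.to}
      </h3>
      <p>${flight.price}</p>
      {/* If this card is filtered out and never rendered, nothing about it
          reaches the model either: UI and model context stay aligned. */}
    </div>
  );
}
```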

Skybridge also wraps low-level APIs like window.openai.callTool into higher-level React hooks such as useCallTool, removing a large amount of repetitive state management code. The goal is simple: let developers focus on product logic and user flows, not plumbing.
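With the hook, the earlier manual sketch collapses to something like this. The import path and the hook's return shape are assumptions for illustration; refresh_board is the same hypothetical tool as above.

```tsx
// Hedged sketch: useCallTool's import path and return shape are assumptions.
import { useCallTool } from "skybridge/react"; // hypothetical import path

function RefreshButton({ boardId }: { boardId: string }) {
  const { callTool, data, error, isLoading } = useCallTool("refresh_board");

  return (
    <div>
      <button disabled={isLoading} onClick={() => callTool({ boardId })}>
        {isLoading ? "Refreshing…" : "Refresh board"}
      </button>
      {error != null && <p>Something went wrong.</p>}
      {data != null && <pre>{JSON.stringify(data, null, 2)}</pre>}
    </div>
  );
}
```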

This framework sets the foundation. Distribution comes next.

Letting people actually try your app: the playground

The second feature of the week was the Alpic playground.

Building ChatGPT Apps and MCP servers is iterative, and collecting feedback early is critical. Most existing playgrounds target developers and assume familiarity with MCP tooling. They are useful for debugging, but not for sharing.

By appending /try to your server URL, the Alpic playground opens a minimal MCP client in the browser that anyone can use.
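For example, with a placeholder deployment URL:

```
https://my-server.example.com        ← deployed MCP server (placeholder URL)
https://my-server.example.com/try    ← shareable playground for that server
```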

The playground runs on our own MCP client with OpenAI models, and includes token credits so it can be tested immediately. This makes it easy to collect feedback from teammates, reviewers, or early users without requiring client installation or connector setup. It’s designed to validate user journeys and your app’s design with real user input.

Publishing MCP servers to the official registry

Finally, we announced support for publishing directly to the official MCP Registry from Alpic.

While our focus has shifted toward helping teams during the build phase, pushing servers to real users remains just as important. The MCP Registry is quickly becoming the ecosystem’s primary source of truth for publicly available MCP servers. It replaces fragmented lists with a standardized, open, and maintained catalog.

From Alpic, publishing is intentionally simple. You create a new version from the Publishing tab, review the generated server.json, and publish. Versioning, naming conventions, and schema compliance are handled automatically.
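For reference, a server.json for a remotely hosted server looks roughly like this. The example is simplified and illustrative: the name and URL are placeholders, and the authoritative field set is defined by the registry's schema.

```json
{
  "name": "com.example/launch-week-demo",
  "description": "Example MCP server published from Alpic",
  "version": "1.0.0",
  "remotes": [
    {
      "type": "streamable-http",
      "url": "https://launch-week-demo.example.com/mcp"
    }
  ]
}
```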

Once published, your server becomes discoverable through a single authoritative entry point and automatically propagates to downstream catalogs such as GitHub MCP, VS Code, and Pulse MCP. From there, you can submit it to additional directories like Claude or Cursor, and prepare for upcoming distribution channels such as the ChatGPT App Store.

The broader context: MCP goes to the Linux Foundation

This launch week happened alongside important ecosystem news.

Anthropic announced that MCP has been donated to the Linux Foundation, under the newly formed Agentic AI Foundation. This move places MCP under neutral, open governance and signals long-term commitment from a broad set of industry players. 

At the same time (and most likely not by coincidence), Google announced native support for MCP across Google and Google Cloud services. Managed MCP servers now expose services like Maps, BigQuery, and Kubernetes directly to agents, without developers needing to deploy or maintain their own MCP infrastructure. 

Taken together, these announcements confirm what we’ve been seeing firsthand: MCP is solidifying as the standard interface between AI agents and services.

Our goal at Alpic is to make that transition easier. Skybridge helps you build. The registry and playground help you ship and share. And as new distribution channels and standards emerge, we’ll keep reducing the friction between an idea and a working, usable app.
