Sixteen months ago, connecting an AI agent to your company's tools meant writing bespoke integrations for every provider. OpenAI had function calling. Anthropic had tool use. Google had function declarations. Every framework demanded its own glue code, and switching providers meant rewriting your entire tool layer. That era is over — and it ended remarkably fast.

Anthropic's Model Context Protocol crossed 97 million monthly SDK downloads in March, the first MCP Dev Summit just wrapped in New York with 95+ sessions, and every major AI lab now ships MCP-compatible tooling. If you're building agents and haven't adopted this standard yet, you're writing code you'll throw away.

From 2 Million to 97 Million

The protocol launched in November 2024 as a quiet Anthropic side project. Two million downloads in month one. By March 2025, OpenAI adopted it across the Agents SDK and ChatGPT desktop. Google DeepMind followed a month later with Gemini support. In December 2025, the Linux Foundation created the Agentic AI Foundation to house MCP alongside Block's goose and OpenAI's AGENTS.md.

That trajectory makes Kubernetes look sluggish — and Kubernetes took nearly four years to reach a comparable level of adoption. As of this week, there are over 10,000 public MCP servers, and the ecosystem includes native support from AWS, Microsoft, Cloudflare, and practically every AI startup with a product page.

Why This Standard Won

The answer is boring in the best way: it solved an actual problem that every agent builder hit daily.

Before the protocol existed, building a Slack integration for an AI agent meant writing one connector for Claude, a different one for GPT, and yet another for Gemini. Same underlying API, three separate wrappers. Multiply that across every tool in your stack — databases, CRMs, monitoring dashboards, code repositories — and you're drowning in integration code that has nothing to do with your actual product.

The protocol flipped the model. You build one server that exposes your tool's capabilities via a standard interface. Any compatible client can discover and call those tools. Capability negotiation, authentication, transport — all handled at the protocol layer. Build once, connect everywhere.
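To see the shape of that "one standard interface," here is a toy dispatcher in Python. This is not the real SDK — the tool name and payload are invented for illustration — but `tools/list` and `tools/call` are the two JSON-RPC methods the protocol actually defines for discovery and invocation:

```python
# Toy sketch of the discovery/invocation model (NOT the real SDK).
# Tools are registered once; any client can discover them via "tools/list"
# and invoke them via "tools/call" — the protocol's actual method names.
TOOLS = {
    "get_weather": {  # hypothetical tool for illustration
        "description": "Look up current weather for a city (stubbed).",
        "handler": lambda args: {"city": args["city"], "temp_c": 21},
    },
}

def handle(request: dict) -> dict:
    """Dispatch a single JSON-RPC request against the tool registry."""
    method, params = request["method"], request.get("params", {})
    if method == "tools/list":
        result = {"tools": [{"name": n, "description": t["description"]}
                            for n, t in TOOLS.items()]}
    elif method == "tools/call":
        tool = TOOLS[params["name"]]
        result = tool["handler"](params.get("arguments", {}))
    else:
        return {"jsonrpc": "2.0", "id": request["id"],
                "error": {"code": -32601, "message": "Method not found"}}
    return {"jsonrpc": "2.0", "id": request["id"], "result": result}
```

The point of the shape: the client never needs provider-specific glue, because discovery and invocation look identical no matter whose server is on the other end.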

The "USB-C for AI" comparison gets thrown around constantly, and I usually hate hardware analogies for software protocols, but this one genuinely fits. Just as USB-C ended the era of carrying five different cables, this standard ended the era of maintaining parallel tool integrations per provider.

The competing approaches — OpenAI's function calling format, Google's function declarations — weren't bad technically. They just created vendor lock-in that nobody wanted to maintain at scale. When OpenAI itself adopted the Anthropic-originated protocol, the signal was unmistakable: even the company with the most market leverage didn't think proprietary tool interfaces were worth defending.

What the Dev Summit Actually Shipped

The first MCP Dev Summit ran April 2-3 in New York. Beyond the usual conference energy, there were concrete spec and SDK changes worth knowing about.

Python SDK v1.27.0 dropped on day one with RFC 8707 resource validation for OAuth clients, idle timeouts for Streamable HTTP sessions, and — finally — conformance test integration. The TypeScript SDK shipped a 2.0.0-alpha with Standard Schema support and a Fastify adapter that makes building remote servers dramatically simpler.

OAuth 2.1 is now first-class. This was the single biggest gap choking enterprise adoption. Running servers behind corporate auth used to mean a nightmare of custom middleware. The updated spec bakes in OAuth with proper token refresh and scope management. If your team has been stalling on adoption because of auth concerns, that excuse just evaporated.
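The RFC 8707 piece mentioned above is about resource indicators: a token minted for one server must be rejected by every other server. A simplified sketch of that audience check — illustrative only, and a real implementation would follow the spec's full canonicalization rules (default ports, percent-encoding, and so on):

```python
from urllib.parse import urlsplit

def resource_matches(requested: str, canonical: str) -> bool:
    """Simplified RFC 8707-style resource check: the resource URI a token
    was requested for must match this server's canonical URI.
    Illustration only — production code should canonicalize default
    ports and percent-encoding per the spec."""
    a, c = urlsplit(requested), urlsplit(canonical)
    return (a.scheme.lower() == c.scheme.lower()
            and a.netloc.lower() == c.netloc.lower()
            and a.path.rstrip("/") == c.path.rstrip("/")
            and not a.fragment)  # resource URIs must not carry fragments
```

The design point is defense against token replay: without this check, a token issued for your CRM server could be presented to your database server, and nothing at the protocol layer would object.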

Streamable HTTP transport is replacing stdio. The original transport model — spawning a local process and communicating over stdin/stdout — worked fine for development but collapsed in production. Remote HTTP means servers can run in the cloud, scale horizontally, and work across team boundaries without every developer running local processes. This is what turns the protocol from a developer convenience into actual infrastructure.
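To make the transport shift concrete, here is a toy Streamable-HTTP-style endpoint using only the Python standard library: one POST route that accepts a JSON-RPC body and returns a JSON-RPC response. This shows the shape of the transport, not the real SDK — the actual spec also covers Server-Sent Events streaming, session headers, and server-initiated messages, all omitted here. `ping` is a real protocol method with an empty result:

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class ToyMCPHandler(BaseHTTPRequestHandler):
    """Sketch of a Streamable-HTTP-style JSON-RPC endpoint (illustrative only)."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        request = json.loads(self.rfile.read(length))
        # "ping" gets an empty result; anything else gets the standard
        # JSON-RPC "method not found" error. A real server would route
        # initialize, tools/list, tools/call, and friends here.
        if request.get("method") == "ping":
            reply = {"jsonrpc": "2.0", "id": request.get("id"), "result": {}}
        else:
            reply = {"jsonrpc": "2.0", "id": request.get("id"),
                     "error": {"code": -32601, "message": "Method not found"}}
        body = json.dumps(reply).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # keep request logging quiet

def serve(port: int = 0) -> HTTPServer:
    """Bind the toy endpoint; port 0 asks the OS for a free port."""
    return HTTPServer(("127.0.0.1", port), ToyMCPHandler)
```

Because this is plain HTTP, the server can sit behind a load balancer and scale horizontally — exactly what a spawned stdio child process never could.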

MCP Apps (SEP-1865) is the wildcard nobody expected. It's a new extension for interactive UI via a ui:// URI scheme — essentially, servers that render small interfaces inside AI clients. Too early to call whether this takes off, but if it does, it blurs the line between "AI tool" and "AI app" in a way that could reshape how we think about agent capabilities.

The Rough Edges

I don't want to oversell this. Real gaps remain.

Observability is thin. When an agent chains three servers together and something silently fails halfway through, debugging the interaction is brutal. The 2026 roadmap lists observability as a top priority, and Datadog's Cansu Berkem gave a pointed talk at the summit about the problem. But production-grade distributed tracing for tool calls isn't here yet.

Agent coordination is out of scope. The protocol handles connecting agents to tools — databases, APIs, services. It does not handle agents talking to each other. Google's Agent2Agent (A2A) protocol targets that layer, and the two protocols are complementary rather than competing. But the boundary confuses newcomers who expect one standard to cover everything.

Server quality is all over the map. Ten thousand servers sounds impressive until you start evaluating them. Some are battle-tested production code maintained by the tool vendor. Others are weekend projects that choke on edge cases. Anthropic's Paul Carleton presented a conformance testing framework at the summit ("One Spec, Ten SDKs, Zero Excuses"), but a real trust and discovery system — verified publishers, security audit trails, reputation scores — doesn't exist yet. When you install a community server, you're on your own for vetting it.

So What Do You Do Monday Morning?

If you're already building on the protocol, the Dev Summit outputs matter: upgrade your Python SDK to 1.27.0 for the OAuth improvements, start migrating any stdio-based servers to Streamable HTTP, and watch the SEP-1865 proposal if you're doing anything with agent UIs.

If you haven't started, the on-ramp is the official developer guide. The @modelcontextprotocol npm scope and the mcp PyPI package are your entry points. Realistically, you can have a working server exposing a custom tool in an afternoon. The spec is smaller than you'd expect.

The protocol war ended not with a dramatic showdown but with everyone quietly adopting the same thing because the alternative — maintaining parallel integrations forever — was just too expensive. Sixteen months from side project to Linux Foundation infrastructure. That's the kind of adoption curve that doesn't leave room for second-place finishers.