When Giants Join Forces: The History of Tech Standardization and What MCP Means for AI Agents
Anthropic just donated MCP to the Linux Foundation alongside OpenAI, Google, and Microsoft. I went down a rabbit hole of tech history — USB, Bluetooth, OAuth, Kubernetes — and found a pattern that predicts what happens next for AI agents.

December 2025
Today, Anthropic announced it is donating the Model Context Protocol (MCP) to the Linux Foundation and co-founding the Agentic AI Foundation with OpenAI and Block, with Google, Microsoft, AWS, Cloudflare, and Bloomberg joining as supporting members.
When I read this, something felt familiar. I went down a rabbit hole of tech history, and what I found was a pattern that repeats itself every decade or so — and it tells us a lot about what's coming next for AI agents.
There's a moment in every major technology shift when fierce competitors suddenly decide to sit at the same table. It seems counterintuitive — why would companies that are fighting for market dominance agree to share their core technology?
The answer is consistent across every case I studied: fragmentation hurts adoption more than proprietary control benefits any single player.
Let's look at the evidence.
In the early 1990s, connecting peripherals to PCs was a mess. Every device required a different connector — serial ports, parallel ports, PS/2 connectors, game ports, proprietary interfaces. If you wanted to connect a printer, a keyboard, a mouse, and a scanner, you needed four different types of cables and four different configuration processes.
Seven companies that were actively competing with each other — Intel, Microsoft, IBM, Compaq, Digital Equipment Corporation, NEC, and Nortel — came together in 1995 to form the USB Implementers Forum. The goal was simple: one universal connector for everything.
The USB-IF was formed on December 5, 1995, and USB 1.0 launched in January 1996. But here's the interesting part — it wasn't immediately successful. The first version was barely adopted. It took until USB 1.1 in 1998, and Apple's decision to make the iMac G3 USB-only (dropping all legacy ports), for USB to reach critical mass.
The key insight: Intel could have tried to push a proprietary connector standard. So could Microsoft or IBM. But they recognized that the market itself would grow faster with an open standard than any single company's share of a fragmented market.
- USB Implementers Forum - Wikipedia
- USB History - Wikipedia
- 25 Years of Making Connections With USB - How-To Geek
Ericsson had been developing short-range wireless technology since 1994. By 1997, IBM approached Ericsson about integrating mobile phone connectivity into ThinkPad notebooks. But here's the crucial detail: neither company was a market leader — Ericsson wasn't dominating phones, and IBM wasn't dominating laptops.
Rather than try to create a proprietary advantage, Adalio Sanchez from IBM and Nils Rydbeck from Ericsson made a surprising decision: they would make the short-link technology an open industry standard "to permit each player maximum market access."
In May 1998, the Bluetooth Special Interest Group was launched with five founding members: Ericsson, IBM, Intel, Nokia, and Toshiba. By the end of that first year, there were already 400 member companies. Today, Bluetooth SIG has over 35,000 members, and billions of Bluetooth devices ship annually.
The key insight: Companies that aren't market leaders have even more incentive to push for open standards — it levels the playing field and lets them compete on implementation rather than proprietary lock-in.
- Bluetooth - Wikipedia
- Bluetooth Special Interest Group
- The Bluetooth Special Interest Group | InformIT
In November 2006, Blaine Cook was developing Twitter's OpenID implementation. Meanwhile, Ma.gnolia needed a way for users to authorize widgets to access their service. When Cook, Chris Messina, Larry Halff, and David Recordon met to discuss solutions, they realized something: there were no open standards for API access delegation.
Every major company — Google, Yahoo, AOL, Flickr — had built its own proprietary authorization protocol, and Twitter's API still relied on users handing over their passwords. The result was that developers had to implement a different authentication flow for every service.
The OAuth discussion group was created in April 2007. DeWitt Clinton from Google joined, bringing Google's support to the effort. By December 2007, OAuth Core 1.0 was released.
Here's the fascinating part: OAuth wasn't invented by a standards body. It emerged from developers at competing companies who were tired of solving the same problem differently. The OAuth 1.0 protocol was published as RFC 5849 in April 2010, and by August 2010, Twitter required all third-party applications to use OAuth.
The key insight: Standards often emerge bottom-up from practitioners who are frustrated with fragmentation, not top-down from corporate strategy meetings.
In June 2015, Docker did something that seemed almost irrational: they donated their container format and runtime (runC) to the newly formed Open Container Initiative under the Linux Foundation. This was the core technology that had made Docker the most exciting infrastructure company in years.
Docker's stated reason: "to avoid fragmentation and encourage innovation."
A month later, Google announced they were donating Kubernetes to the Cloud Native Computing Foundation. Kubernetes was based on Google's internal Borg system — technology that had given Google a massive competitive advantage in running distributed systems at scale.
The CNCF was founded with members including Google, CoreOS, Mesosphere, Red Hat, Twitter, Huawei, Intel, Cisco, IBM, Docker, and VMware. Today, CNCF has over 450 member organizations, and Kubernetes runs in more than 50% of Fortune 100 companies.
The key insight: When a technology becomes foundational infrastructure, trying to maintain proprietary control becomes a liability. Your customers want interoperability more than they want your specific implementation.
- Open Container Initiative
- Demystifying the Open Container Initiative - Docker Blog
- 10 Years of Kubernetes - Kubernetes Blog
- Cloud Native Computing Foundation - Wikipedia
To understand why companies choose standardization, it helps to look at what happens when they don't.
In the late 1990s, Microsoft and Netscape were locked in the "browser wars." Both companies kept adding proprietary features to their browsers, creating incompatible versions of JavaScript, different DOM implementations, and competing approaches to dynamic HTML.
The result? Web developers had to build two versions of every website, or choose to support only one browser. The W3C and the Web Standards Project spent years trying to convince both companies to support common standards.
As the Web Standards Project described it: "If Netscape and Microsoft persisted in building ever-more incompatible browsers, the cost of development would continue to skyrocket, tens of millions would find themselves locked out, and the Web would fragment into a tower of digital Babel."
The W3C eventually succeeded, and by the mid-2000s, browser standards compliance became a competitive advantage rather than a limitation. But it took nearly a decade of fragmentation before the industry learned its lesson.
Now let's look at where we are with MCP.
Since Anthropic introduced the Model Context Protocol one year ago, adoption has been remarkable:
- Over 10,000 active public MCP servers
- Adoption by ChatGPT, Cursor, Gemini, Microsoft Copilot, and Visual Studio Code
- Enterprise deployment support from AWS, Cloudflare, Google Cloud, and Microsoft Azure
- 97 million+ monthly SDK downloads across Python and TypeScript
If Anthropic had kept MCP proprietary, the likely outcome would have been predictable: OpenAI would create a competing protocol, Google would create another, Microsoft would create yet another. We'd be heading toward "browser wars" for AI agents — developers forced to implement multiple protocols, tools that only work with specific AI providers, fragmented ecosystems.
Instead, Anthropic chose the USB/Bluetooth/OAuth/Kubernetes path: donate the protocol to a neutral foundation and invite competitors to co-govern it.
The Agentic AI Foundation includes:
- Co-founders: Anthropic, Block, OpenAI
- Supporting members: Google, Microsoft, AWS, Cloudflare, Bloomberg
This is essentially every major player in AI agreeing to a common standard for how AI agents connect to external systems.
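To make that concrete: an MCP server is just a small program that exposes tools over the shared protocol, and any MCP-capable agent can discover and call them. Below is a minimal sketch using the official Python SDK's FastMCP helper. The server name and the note-taking tools are invented for illustration, and the exact API surface may vary slightly between SDK versions.

```python
# A minimal MCP server sketch (hypothetical "notes" tools) using the
# official Python SDK's FastMCP helper. Install with: pip install mcp
from mcp.server.fastmcp import FastMCP

# The server name is arbitrary; clients see it during the handshake.
mcp = FastMCP("notes-demo")

# In-memory stand-in for a real data source.
NOTES: dict[str, str] = {}

@mcp.tool()
def save_note(title: str, body: str) -> str:
    """Store a note so the agent can retrieve it later."""
    NOTES[title] = body
    return f"Saved note '{title}'"

@mcp.tool()
def read_note(title: str) -> str:
    """Return a previously saved note, or a not-found message."""
    return NOTES.get(title, f"No note titled '{title}'")

if __name__ == "__main__":
    # Runs over stdio by default, so any MCP-capable client can launch
    # this process and call the tools above.
    mcp.run()
```

The point is not this particular server: the same few dozen lines work unchanged whether the caller is Claude, ChatGPT, Copilot, or an IDE, because they all speak the same protocol.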
Here's the part that most people miss: the standard itself is just the beginning.
The USB devices market is now worth over $40 billion. The cloud infrastructure market — built on standardized container orchestration — grew from $16 billion in 2016 to $160 billion today.
When there's a shared standard, companies can build confidently on top of it. Investment flows in. Ecosystems develop.
After Kubernetes became the standard for container orchestration, companies like Datadog, HashiCorp, and dozens of others emerged to solve the new problems that the standard created. They didn't compete with Kubernetes — they built on top of it.
As one industry analyst put it: "Providers like New Relic, Splunk, and Datadog grew to address new sprawling infrastructure observability demands. Docker helped to manage the container creation process. Kubernetes managed the orchestration process across containers, and HashiCorp abstracted away infrastructure in the form of code."
Here's the most interesting outcome: successful standards become so ubiquitous that people stop thinking about them.
As Betty Junod from Salesforce recently said about Kubernetes: "It is no longer the focal point of the conversation, because it is the default."
When was the last time you thought about the USB standard? You just expect things to connect. That's what successful standardization looks like.
If MCP follows the pattern of previous standardization moments, we should expect:
- Rapid proliferation of MCP servers for every major service and data source
- Competition shifts from "which protocol" to "which implementation"
- Early-stage companies building tooling on top of MCP
- MCP becomes the expected way for AI agents to connect to external systems
- New categories of companies emerge: agent observability, agent security, agent orchestration
- The protocol itself becomes largely invisible to end users
- Just as you don't think about TCP/IP when browsing the web, you won't think about MCP when using AI agents
- Competition happens entirely at higher layers of abstraction
- The real question becomes: what do we build once connectivity is solved?
The pattern is clear: standardization accelerates adoption, expands markets, and shifts competition to higher layers of abstraction.
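To see what "higher layers of abstraction" means in practice, here is a rough sketch of the other side of the connection: a client that connects to an MCP server, lists its tools, and calls one. It uses the same official Python SDK; the server command and tool names are placeholders carried over from the earlier sketch, and details may differ across SDK versions.

```python
# A rough client-side sketch: connect to any MCP server over stdio,
# discover its tools, and call one. The server command and tool name
# are placeholders for illustration.
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Launch the (hypothetical) server from the earlier sketch.
    params = StdioServerParameters(command="python", args=["notes_server.py"])

    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Tool discovery works the same way for every MCP server.
            tools = await session.list_tools()
            print("Available tools:", [t.name for t in tools.tools])

            # Calling a tool is also uniform, regardless of who built it.
            result = await session.call_tool(
                "save_note", {"title": "idea", "body": "Build on the standard."}
            )
            print(result.content)

asyncio.run(main())
```

The plumbing is deliberately boring. The interesting work, and the interesting companies, live above this layer: what agents actually do with the tools once every system can be reached the same way.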
So the interesting question isn't whether MCP will become the standard — the major players have already made that decision by joining the Agentic AI Foundation.
The interesting question is: what gets built on top of it?
Every previous standardization moment created opportunities for entirely new categories of companies:
- USB enabled the entire peripheral industry
- Bluetooth enabled wireless audio, fitness trackers, smart home devices
- OAuth enabled the entire "Login with..." ecosystem
- Kubernetes enabled the cloud-native monitoring, security, and DevOps tool industry
What will MCP enable? What problems will emerge when every AI agent can connect to every external system through a common protocol? What companies will be built to solve those problems?
History suggests that the biggest opportunities aren't in the standard itself — they're in everything that becomes possible once the standard exists.
- Anthropic: Donating the Model Context Protocol
- Model Context Protocol Documentation
- Agentic AI Foundation
- The Development of the USB Standard - Frontier Economics Case Study (PDF)
- W3C Official History
- 10 Years of Kubernetes - Official Retrospective
If you're navigating AI agent adoption in your organization and want to discuss what this means for your team, join us at Squid Club — a community of practitioners working through these challenges together.
About the Author
Sahar Carmel is a Director of AI Enablement and AI-First Coding Consultant. After building 10+ production AI agents, he now helps engineering teams navigate the transition to AI-first development. He founded Squid Club, a community for practitioners navigating the AI coding revolution.