Thoughts on MCP & The Future of AI Agents
Before the App Store's launch in 2008, smartphones were powerful but constrained—limited to whatever apps the manufacturer provided. The App Store changed that overnight. By opening the doors to developers, it unleashed a wave of innovation, and users reaped the rewards. Suddenly, anyone could transform their phone into exactly what they needed—whether that was a fitness tracker, a remote control, or something entirely new. It redefined not just what our phones could do, but how we expected to interact with technology.
Since 2023, something similar has been happening in the world of AI assistants. Tools like ChatGPT, Gemini, and Claude have evolved beyond conversation—connecting to external services, pulling in data, and performing real-world tasks through plugins. It feels like a new platform moment.
However, much like the early days of mobile, the first wave of LLM plugins has emerged inside closed ecosystems. OpenAI and others have introduced their own plugin formats and frameworks, but a plugin built for one assistant won’t necessarily work with another. Under this model, developers must either duplicate effort or commit to a single platform, echoing the fragmentation we see with iOS and Android.
That’s where the Model Context Protocol (MCP) comes in. Open-sourced by Anthropic in late 2024, MCP is a universal standard for connecting language models to the systems where data and functionality live—content repositories, business tools, APIs, and developer environments. Instead of building custom integrations for each model, developers implement MCP once, and any assistant that supports it can connect.
MCP introduces a clean, consistent interface: a shared infrastructure layer replacing the patchwork of proprietary plugins and fragmented APIs. Developers expose capabilities through secure MCP servers. Assistants, acting as clients, interact with those services—regardless of vendor. The result? Interoperable AI systems that can move freely, speak a common language, and access real-world data and services at scale.
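To make that concrete, here's a minimal sketch of what an MCP server can look like, using the official Python SDK (`pip install mcp`). The server name and the `get_forecast` tool are illustrative placeholders, not part of any real service:

```python
# Minimal MCP server sketch using the official Python SDK.
# The "weather" name and get_forecast tool are illustrative placeholders.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("weather")

@mcp.tool()
def get_forecast(city: str) -> str:
    """Return a short forecast for the given city."""
    # A real server would call a weather API; hardcoded for the sketch.
    return f"Sunny and 22°C in {city}"

if __name__ == "__main__":
    # Serve over stdio so any MCP-capable assistant can launch and connect.
    mcp.run()
```

Any assistant that speaks MCP can launch this script, discover `get_forecast`, and call it—without knowing anything about how it works internally.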
The analogy that’s quickly taken hold is that MCP is the USB-C port for AI. One plugin, written once, can now work across many assistants. Any assistant that supports MCP can connect to any compatible tool—calendar, thermostat, content repo—regardless of who built it.
This vendor-agnostic approach is central to MCP’s promise. It’s not controlled by Anthropic or any single company. Open-sourcing the protocol signalled a clear intent to build shared infrastructure the entire AI ecosystem can rely on—offering an escape from the walled gardens that have limited previous platform shifts.
Frameworks like LangChain and Semantic Kernel also help developers build AI-powered apps, focusing on agent behaviour and orchestration—managing prompts, tools, and memory. MCP complements these by standardising how agents connect to external systems. These frameworks could incorporate MCP to simplify tool interactions under the hood.
MCP supports two-way communication and context sharing—enabling assistants to carry relevant data across steps (e.g., checking a calendar, reviewing tasks, drafting an email). Previously, you had to stitch together APIs manually. MCP replaces that patchwork with a unified orchestration layer.
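Here's a rough sketch of the client side with the same Python SDK: the assistant launches a server, discovers its tools at runtime, and chains calls so the output of one step feeds the next. The server script and the `check_calendar`/`draft_email` tools are assumptions for illustration:

```python
# Client-side sketch using the official MCP Python SDK.
# The server command and the check_calendar/draft_email tools are
# hypothetical; substitute whatever an actual server exposes.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    params = StdioServerParameters(command="python", args=["assistant_server.py"])
    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # Discover what the server offers -- no hardcoded integration.
            tools = await session.list_tools()
            print([t.name for t in tools.tools])

            # Chain steps, carrying context from one call into the next.
            events = await session.call_tool("check_calendar", {"day": "today"})
            await session.call_tool("draft_email", {"context": str(events.content)})

asyncio.run(main())
```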
Imagine:
- Travel Planning: The AI calls MCP servers for flights, hotels, and your calendar—resolving conflicts and scheduling automatically.
- Smart Home: You say "I’m on my way with mates, can you tidy up?"—and your AI adjusts the thermostat and lights and starts the vacuum (see the sketch after this list).
- Business Workflows: “Email the sales report” could pull data, generate a summary, and send the message.
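To sketch that smart-home scenario, a single MCP server might expose each device action as a tool and leave the orchestration to the assistant. The device tools below are hypothetical placeholders (a real server would call a vendor API or a local hub):

```python
# Hypothetical smart-home MCP server sketch (official Python SDK).
# Tool bodies are placeholders standing in for real device calls.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("smart-home")

@mcp.tool()
def set_thermostat(celsius: float) -> str:
    """Set the target temperature."""
    return f"Thermostat set to {celsius}°C"

@mcp.tool()
def set_lights(room: str, on: bool) -> str:
    """Turn a room's lights on or off."""
    return f"Lights in {room} {'on' if on else 'off'}"

@mcp.tool()
def start_vacuum() -> str:
    """Start the robot vacuum."""
    return "Vacuum started"

if __name__ == "__main__":
    mcp.run()
```

The point is that the assistant, not the developer, decides which tools to combine and in what order—the server just describes what's possible.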
This is the shift from static assistants to agentic AI—systems that don’t just respond to prompts, but act across tools with autonomy. And because MCP is open, the ecosystem grows rapidly. Developers are already building connectors for Microsoft 365, GitHub, Jira, and more.
A thought I’ve kept coming back to over the past few weeks: what if Apple integrated MCP into Siri?
Siri, one of the earliest voice assistants, has lagged in the AI race. Apple recently said that major upgrades—including more autonomous features—are delayed until 2026. The proposed features aim to give Siri the ability to “take action for you within and across your apps,” such as pulling up a podcast your friend mentioned or fetching flight info from an email. It’s a vision that closely mirrors what MCP already enables.
If Apple embraced MCP, Siri could instantly become a gateway to the growing open ecosystem of AI tools. Rather than building every integration in-house, Apple could tap into community-built MCP connectors—just like it did with the App Store.
Want Siri to control a non-HomeKit device? If there’s an MCP connector, it can. Want it to manage your Google Calendar or enterprise tools? MCP bridges the gap.
Of course, Apple’s track record with open standards is mixed. But even a curated, secure set of MCP plugins would give Siri new reach—aligning it with the future of AI assistants, not sidelining it.
Picture a future where Siri, Google Assistant, ChatGPT, and other custom agents all share plugins. That’s MCP’s vision. As one tech writer put it, these AI assistants could “work together seamlessly rather than competing in isolated silos” if they plugged into MCP.
And the momentum is building. Sam Altman recently confirmed (sometime between me starting and finishing this blog!) that OpenAI's Agents SDK, ChatGPT desktop app, and Responses API will all support MCP. The vision is already becoming reality.
We’re entering a new era. AI is about to become the connective tissue of our digital lives. MCP could be the open protocol that makes that transformation seamless, scalable—and universal.
Sources
- The App Store turns 10 - Apple
- Introducing the Model Context Protocol - Anthropic
- Model Context Protocol (MCP) and OpenAI’s Stance - Frank Goortani
- The Future of AI: Why Agentic Systems and Open Integration Will Redefine the Race - Mark Jones
- The USB-C Moment For AI: Introducing The Model Context Protocol (MCP) - Spearhead
- Apple says some AI improvements to Siri delayed to 2026 - Reuters