As agentic AI explodes, Amazon doubles down on MCP

At the MCP Summit in New York City, Clare Liguori of Amazon Web Services discussed the rapid rise of the Model Context Protocol (MCP), now a leading way to connect AI agents with tools and data.

At the recent MCP Summit in New York City, The New Stack sat down with Clare Liguori, Senior Principal Software Engineer at AWS and core maintainer of the open-source Model Context Protocol project. We discussed the hyperscaler’s contributions to MCP, how the technology is being used today, and its future.

Since its unveiling in late 2024, Model Context Protocol (MCP) has become the de facto method for connecting AI agents to tools and data. Anthropic, MCP’s progenitor, gave control of MCP to the Linux Foundation in late 2025. As agentic AI expands its enterprise footprint, corporate interest in MCP has reached a fever pitch. After all, if your AI application can’t reach for the fact or shovel it needs, what good is it?
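Under the hood, MCP is built on JSON-RPC 2.0: an agent asks a server to run a tool by sending a `tools/call` request. A minimal sketch of what such a message looks like on the wire — the `get_weather` tool name and its arguments here are hypothetical, not part of any real server:

```python
import json

# Build a JSON-RPC 2.0 request of the kind MCP uses for tool invocation.
# The tool name "get_weather" and its arguments are illustrative only.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "get_weather",
        "arguments": {"city": "New York"},
    },
}

# Serialize to the JSON string an MCP client would actually transmit.
wire_message = json.dumps(request)
print(wire_message)
```

The server replies with a JSON-RPC response carrying the tool’s result, which the agent then folds back into its model’s context.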

Amazon’s commitment to the MCP project

Liguori’s status as both an AWS denizen and MCP core maintainer means that she has a foot firmly planted in both the enterprise and open-source worlds. Per the developer, her role as an MCP maintainer includes helping decide what to add to the MCP spec (and what not to). In practical terms, she told TNS, she’s currently working on how best to bring webhooks, events, and notifications into MCP.

If that sounds a little bit OpenClaw-ish, you’re not off the mark. AI agents were once kept on a very short leash, but, Liguori said, “we’re starting to see – especially with things like OpenClaw and some other agent runtimes that are coming about – agents that are always on, agents that are waiting for events to come in, and they will start acting on them.”

MCP is far from done, in other words. 

Given that AWS offers managed MCP servers, having one of its brightest lights aboard the Model Context Protocol is good synergy. In fact, AWS has made several contributions to MCP, including Tasks and Elicitations (support for longer-running requests, and for asking the human prompter for more context, respectively).
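To make the Elicitations idea concrete: mid-request, a server can turn around and ask the client (and ultimately the human) for a missing piece of information. A rough sketch of that request’s shape, assuming the `elicitation/create` method introduced in the spec’s 2025 revisions — the question text and schema below are hypothetical:

```python
import json

# Sketch of an MCP elicitation request: the server asks the client for
# more context before continuing. The message and requestedSchema are
# illustrative examples, not from any real server.
elicitation = {
    "jsonrpc": "2.0",
    "id": 2,
    "method": "elicitation/create",
    "params": {
        "message": "Which region should this deployment target?",
        "requestedSchema": {
            "type": "object",
            "properties": {"region": {"type": "string"}},
            "required": ["region"],
        },
    },
}

print(json.dumps(elicitation, indent=2))
```

The client validates the human’s answer against the requested schema and returns it, letting the original tool call resume with the extra context.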

We can expect more collaboration between AWS and MCP. Per Liguori, the cloud provider acts as an “experimental playground for some of these new and upcoming concepts in MCP that are still in the draft spec, that we’re still tuning and working on with feedback, but it’s great to have an official implementation of that somewhere that people can actually get their hands on and play with.” 

This is where corporate sponsorship and participation via proxy in OSS projects can bear fruit; not only do sponsor funds keep open-source foundations afloat, but their platforms can also help drive, or even shape, adoption.

What’s ahead

The role of MCP in the agentic stack is clear: agents need its connective tissue to bring context into their work and that of their underlying models. But what about companies that aren’t tech-forward, that perhaps don’t have an in-house development team?

Liguori noted that Amazon recently made its Kiro AI development tool available to all roles in all job families. The company had expected that only its engineers would want access to the service, but that didn’t turn out to be the case. Combine that rising interest in AI-powered development tools with beginner-friendly, MCP-powered tools like Amazon Quick, and you can glimpse a not-too-distant future in which AI breaks free of the technology world and brings real automation to even the smallest businesses.
