
Amazon Web Services on Wednesday introduced Kiro powers, a system that allows software developers to give their AI coding assistants fast, specialized expertise in specific tools and workflows — addressing what the company calls a fundamental bottleneck in how artificial intelligence agents operate today.
AWS made the announcement at its annual re:Invent conference in Las Vegas. The capability marks a departure from how most AI coding tools work today. Typically, these tools load every possible capability into memory upfront — a process that burns through computational resources and can overwhelm the AI with irrelevant information. Kiro powers takes the opposite approach, activating specialized knowledge only at the moment a developer actually needs it.
"Our goal is to present the agent specialized context so it may possibly reach the suitable end result faster — and in a way that also reduces cost," said Deepak Singh, Vice President of Developer Agents and Experiences at Amazon, in an exclusive interview with VentureBeat.
The launch includes partnerships with nine technology firms: Datadog, Dynatrace, Figma, Neon, Netlify, Postman, Stripe, Supabase, and AWS's own services. Developers can even create and share their very own powers with the community.
Why AI coding assistants choke when developers connect too many tools
To understand why Kiro powers matters, it helps to look at a growing tension in the AI development tool market.
Modern AI coding assistants depend on something called the Model Context Protocol, or MCP, to connect with external tools and services. When a developer wants their AI assistant to work with Stripe for payments, Figma for design, and Supabase for databases, they connect MCP servers for each service.
The problem: each connection loads dozens of tool definitions into the AI's working memory before it writes a single line of code. According to AWS documentation, connecting just five MCP servers can consume more than 50,000 tokens — roughly 40 percent of an AI model's context window — before the developer even types their first request.
Developers have grown increasingly vocal about this issue. Many complain that they don't want to burn through their token allocations just to have an AI agent figure out which tools are relevant to a particular task. They want to get to their workflow immediately — not watch an overloaded agent struggle to sort through irrelevant context.
This phenomenon, which some in the industry call "context rot," leads to slower responses, lower-quality outputs, and significantly higher costs — since AI services typically charge by the token.
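To make the arithmetic concrete, here is a minimal sketch in Python. The 50,000-token and roughly 40 percent figures come from the AWS documentation cited above; the per-server token cost and the 128,000-token context window are illustrative assumptions chosen to be consistent with those figures.

```python
# Illustrative arithmetic only. The 50,000-token total and ~40 percent share are
# reported by AWS; the per-server split and context window size are assumptions.

TOKENS_PER_MCP_SERVER = 10_000   # assumed average cost of one server's tool definitions
CONTEXT_WINDOW = 128_000         # assumed model context window

servers_connected = 5
upfront_tokens = servers_connected * TOKENS_PER_MCP_SERVER
share_of_context = upfront_tokens / CONTEXT_WINDOW

print(f"{upfront_tokens:,} tokens consumed before the first request "
      f"({share_of_context:.0%} of the context window)")
# -> 50,000 tokens consumed before the first request (39% of the context window)
```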
Inside the technology that loads AI expertise on demand
Kiro powers addresses this by packaging three components into a single, dynamically loaded bundle.
The first component is a steering file called POWER.md, which functions as an onboarding manual for the AI agent. It tells the agent what tools are available and, crucially, when to use them. The second component is the MCP server configuration itself — the actual connection to external services. The third includes optional hooks and automation that trigger specific actions.
When a developer mentions "payment" or "checkout" in their conversation with Kiro, the system automatically activates the Stripe power, loading its tools and best practices into context. When the developer shifts to database work, Supabase activates while Stripe deactivates. The baseline context usage when no powers are active approaches zero.
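The activation behavior can be pictured with a short sketch. This is not Kiro's implementation; the bundle fields, registry, and keyword matching below are hypothetical stand-ins for the mechanism described above, in which each power carries its own steering text, MCP configuration, and optional hooks, and only powers relevant to the current request contribute anything to the context.

```python
from dataclasses import dataclass, field

@dataclass
class Power:
    """Hypothetical stand-in for a power bundle: steering doc, MCP config, triggers, hooks."""
    name: str
    steering_doc: str                 # e.g. the contents of a POWER.md file
    mcp_config: dict                  # connection details for the external service
    keywords: tuple[str, ...]         # terms that should activate this power
    hooks: list = field(default_factory=list)  # optional automations

# A small registry of illustrative powers (names and configs are made up).
REGISTRY = [
    Power("stripe",   "Use Stripe's payment APIs...", {"server": "stripe-mcp"},   ("payment", "checkout")),
    Power("supabase", "Use Supabase for Postgres...", {"server": "supabase-mcp"}, ("database", "postgres")),
]

def active_powers(prompt: str) -> list[Power]:
    """Activate only the powers whose trigger keywords appear in the developer's prompt."""
    text = prompt.lower()
    return [p for p in REGISTRY if any(k in text for k in p.keywords)]

# With no relevant keywords, nothing is loaded and baseline context stays near zero.
print([p.name for p in active_powers("add a checkout flow")])   # ['stripe']
print([p.name for p in active_powers("refactor the README")])   # []
```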
"You click a button and it robotically loads," Singh said. "Once an influence has been created, developers just select 'open in Kiro' and it launches the IDE with every little thing able to go."
How AWS is bringing elite developer techniques to the masses
Singh framed Kiro powers as a democratization of advanced development practices. Before this capability, only the most sophisticated developers knew how to properly configure their AI agents with specialized context — writing custom steering files, crafting precise prompts, and manually managing which tools were active at any given time.
"We've found that our developers were adding in capabilities to make their agents more specialized," Singh said. "They wanted to give the agent some special powers to do a specific problem. For example, they wanted their front end developer, and they wanted the agent to become an expert at backend as a service."
This observation led to a key insight: if Supabase or Stripe could build the optimal context configuration once, every developer using those services could benefit.
"Kiro powers formalizes that — things that people, only the most advanced people, were doing — and allows anyone to get those kinds of skills," Singh said.
Why dynamic loading beats fine-tuning for many AI coding use cases
The announcement also positions Kiro powers as a more cost-effective alternative to fine-tuning, the process of training an AI model on specialized data to improve its performance in specific domains.
"It's less expensive," Singh said, when asked how powers compare to fine-tuning. "Fine-tuning is very expensive, and you can't fine-tune most frontier models."
This is a significant point. The most capable AI models from Anthropic, OpenAI, and Google are typically "closed source," meaning developers cannot modify their underlying training. They can only influence the models' behavior through the prompts and context they provide.
"Most people are already using powerful models like Sonnet 4.5 or Opus 4.5," Singh said. "What those models need is to be pointed in the right direction."
The dynamic loading mechanism also reduces ongoing costs. Because powers only activate when relevant, developers aren't paying for token usage on tools they're not currently using.
Where Kiro powers fits in Amazon's bigger bet on autonomous AI agents
Kiro powers arrives as part of a broader push by AWS into what the company calls "agentic AI" — artificial intelligence systems that can operate autonomously over extended periods.
Earlier at re:Invent, AWS announced three "frontier agents" designed to work for hours or days without human intervention: the Kiro autonomous agent for software development, the AWS security agent, and the AWS DevOps agent. These represent a different approach from Kiro powers — tackling large, ambiguous problems rather than providing specialized expertise for specific tasks.
The two approaches are complementary. Frontier agents handle complex, multi-day projects that require autonomous decision-making across multiple codebases. Kiro powers, in contrast, gives developers precise, efficient tools for everyday development tasks where speed and token efficiency matter most.
The company is betting that developers need both ends of this spectrum to be productive.
What Kiro powers reveals about the future of AI-assisted software development
The launch reflects a maturing marketplace for AI development tools. GitHub Copilot, which Microsoft launched in 2021, introduced tens of millions of developers to AI-assisted coding. Since then, a proliferation of tools — including Cursor, Cline, and Claude Code — have competed for developers' attention.
But as these tools have grown more capable, they've also grown more complex. The Model Context Protocol, which Anthropic open-sourced last year, created a standard for connecting AI agents to external services. That solved one problem while creating another: the context overload that Kiro powers now addresses.
AWS is positioning itself as the company that understands production software development at scale. Singh emphasized that Amazon's experience running AWS for 20 years, combined with its own massive internal software engineering organization, gives it unique insight into how developers actually work.
"It's not something you'll use just for your prototype or your toy application," Singh said of AWS's AI development tools. "If you want to build production applications, there's a lot of knowledge that we bring in as AWS that applies here."
The road ahead for Kiro powers and cross-platform compatibility
AWS indicated that Kiro powers currently works only within the Kiro IDE, but the company is building toward cross-compatibility with other AI development tools, including command-line interfaces, Cursor, Cline, and Claude Code. The company's documentation describes a future where developers can "build a power once, use it anywhere" — though that vision remains aspirational for now.
For the technology partners launching powers today, the appeal is simple: rather than maintaining separate integration documentation for every AI tool on the market, they can create a single power that works everywhere Kiro does. As more AI coding assistants crowd into the market, that kind of efficiency becomes increasingly valuable.
Kiro powers is available now to developers using Kiro IDE version 0.7 or later at no additional charge beyond the standard Kiro subscription.
The underlying bet is a familiar one in the history of computing: that the winners in AI-assisted development won't be the tools that try to do everything at once, but the ones smart enough to know what to forget.
