Microsoft’s Big Bet: AIs That Talk, Remember, and Unite

Key Takeaways

  • Microsoft is pushing for a future where AI agents from different companies can work together seamlessly.
  • The company is also focused on improving AI agents’ ability to remember past interactions and tasks.
  • Microsoft supports an open-source technology called Model Context Protocol (MCP) to help AI agents collaborate, envisioning an “agentic web.”
  • To tackle the high cost of better AI memory, Microsoft is developing a new approach called “structured retrieval augmentation.”

Microsoft’s Chief Technology Officer, Kevin Scott, shared a vision for a more collaborative and intelligent future for artificial intelligence. Speaking ahead of the company’s annual Build developer conference, he highlighted efforts to let AI agents built by different companies work together and to help them remember those interactions more effectively, as reported by Investing.com.

The company is championing industry-wide standards that would allow these AI systems, which can independently perform tasks like fixing software bugs, to cooperate regardless of their origin. This is a key theme expected at the upcoming Build conference in Seattle, where new tools for AI developers will likely be unveiled.

A significant part of this plan involves backing a technology known as Model Context Protocol (MCP). This open-source protocol, initially introduced by Google-backed Anthropic, could pave the way for an “agentic web,” Scott explained. He drew a parallel to how hypertext protocols spurred the growth of the internet in the 1990s.

“It means that your imagination gets to drive what the agentic web becomes, not just a handful of companies that happen to see some of these problems first,” Scott remarked, emphasizing a more democratic development of AI capabilities.
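For readers curious what agent-to-tool traffic under MCP actually looks like, here is a minimal Python sketch of the JSON-RPC 2.0 messages the protocol is built on. The method names ("tools/list", "tools/call") come from the open MCP specification, but the tool name, arguments, and payload shapes below are hypothetical and meant only as an illustration, not a working client.

```python
import json

# Minimal sketch of the JSON-RPC 2.0 style messages that MCP agent/tool
# interactions are built on. Method names follow the open MCP spec;
# the tool name and arguments are hypothetical.

def make_request(request_id, method, params=None):
    """Build a JSON-RPC 2.0 request envelope."""
    msg = {"jsonrpc": "2.0", "id": request_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# 1. An agent asks a server which tools it exposes.
list_tools = make_request(1, "tools/list")

# 2. The agent invokes one of those tools, e.g. a hypothetical
#    bug-fixing tool, with structured arguments.
call_tool = make_request(2, "tools/call", {
    "name": "fix_bug",                      # hypothetical tool name
    "arguments": {"repo": "example/app", "issue_id": 42},
})

print(list_tools)
print(call_tool)
```

Because every vendor speaks the same envelope, an agent from one company can discover and call tools exposed by another without bespoke integrations, which is the interoperability Scott is describing.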

Another major focus for Microsoft is enhancing the memory of AI agents. Scott pointed out that current AI interactions often feel very “transactional,” lacking long-term recall of user requests. However, improving AI memory is expensive due to the increased computing power required.

To address this, Microsoft is exploring an innovative method called “structured retrieval augmentation.” This technique involves the AI extracting key snippets from conversations, creating a kind of roadmap of what was discussed. This allows for more efficient recall without needing to process vast amounts of data each time.

Scott compared this to how biological brains work, stating, “you don’t brute force everything in your head every time you need to solve a particular problem.” This approach aims to make AI memory more efficient and less costly.
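Microsoft has not published implementation details for structured retrieval augmentation, but the idea Scott describes can be sketched in a few lines of Python: extract key snippets as a conversation unfolds, then retrieve only the ones relevant to a new request instead of re-processing the whole history. The class names, scoring method, and example data below are hypothetical; a production system would likely use embeddings rather than simple keyword overlap.

```python
import re
from dataclasses import dataclass, field

# Hypothetical sketch of the idea described above: rather than replaying an
# entire conversation, the agent stores short extracted snippets (a "roadmap"
# of what was discussed) and retrieves only the ones relevant to the current
# request. Names and scoring are illustrative, not Microsoft's implementation.

def _tokens(text: str) -> set[str]:
    """Lowercased word tokens with punctuation stripped."""
    return set(re.findall(r"[a-z]+", text.lower()))

@dataclass
class SnippetMemory:
    snippets: list[str] = field(default_factory=list)

    def remember(self, snippet: str) -> None:
        """Save a key fact extracted from a past exchange."""
        self.snippets.append(snippet)

    def recall(self, query: str, top_k: int = 2) -> list[str]:
        """Return the stored snippets sharing the most words with the query.
        Keyword overlap keeps this sketch dependency-free."""
        query_words = _tokens(query)
        ranked = sorted(
            self.snippets,
            key=lambda s: len(query_words & _tokens(s)),
            reverse=True,
        )
        return ranked[:top_k]

memory = SnippetMemory()
memory.remember("User prefers Python for backend services.")
memory.remember("Deployment target is Azure Kubernetes Service.")
memory.remember("User asked to avoid breaking the public API.")

# Only the relevant roadmap entries are pulled into the next prompt,
# instead of the full conversation history.
print(memory.recall("What language and deployment target should the backend use?"))
```

The cost saving comes from the retrieval step: only a handful of short snippets are fed back to the model, rather than the "brute force" replay of everything that came before.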
