Summary
Microsoft's Copilot Reorganisation
On 17 March 2026, Satya Nadella announced the most significant restructuring of Microsoft's AI organisation since the company hired Mustafa Suleyman two years earlier. Consumer and commercial Copilot were unified under one leadership team. Suleyman stepped back from day-to-day product leadership to focus on building in-house AI models, and a new generation of executives took the reins. Five days before that, Rajesh Jha, the Executive Vice President (EVP) who had led Microsoft's Experiences + Devices division for years, announced his retirement after 35 years at the company.

These are structural changes to how Microsoft builds and delivers AI. They arrived against a backdrop of mixed adoption data, a tech stock downturn, $37.5 billion in quarterly AI capital expenditure, and a partnership with OpenAI that has evolved from exclusivity towards mutual independence through four publicly renegotiated stages. Here is what happened, who the new leaders are, what their track records tell us about the direction ahead, and what it all means for the organisations that depend on Microsoft's products every day.
Two separate announcements, five days apart
On 12 March 2026, Microsoft announced that Rajesh Jha, Executive Vice President of Experiences + Devices, would retire after 35 years at the company. He will step down on 1 July 2026 and stay on in an advisory role. Jha had led the Experiences + Devices division responsible for Microsoft 365 Copilot, Windows, and Office. His responsibilities were divided among four EVPs reporting directly to Nadella: Perry Clarke, Charles Lamanna, Pavan Davuluri, and Ryan Roslansky. Jeff Teper was promoted to EVP, and Sumit Chauhan and Kirk Koenigsbauer were promoted to President. Jha's memo said the full cascade of organisational details would be finalised between March and June, ready for the start of FY27 (Microsoft's fiscal year 2027, beginning 1 July 2026).
On 17 March 2026, five days later, Nadella published a second announcement, accompanied by a separate memo from Suleyman. The core structural change was the unification of consumer and commercial Copilot into one organisation, spanning four connected pillars.
Copilot experience
Copilot platform
Microsoft 365 apps
AI models
A new five-person Copilot Leadership Team was formed:
| Name | Role | Reports to |
|---|---|---|
| Jacob Andreou | EVP, Copilot (experience across consumer and commercial) | Nadella (solid line), Suleyman (dotted line) |
| Ryan Roslansky | M365 apps and Copilot platform (collectively) | Nadella |
| Perry Clarke | M365 apps and Copilot platform (collectively) | Nadella |
| Charles Lamanna | M365 apps and Copilot platform (collectively) | Nadella |
| Mustafa Suleyman | AI models / Superintelligence | Nadella |
The March 2026 restructuring replaced Jha's single-leader model with five EVP-level direct reports to Nadella. This is the most significant change to Microsoft's AI organisation since Suleyman was hired in 2024.
Nadella's memo assigned Roslansky, Clarke, and Lamanna collectively to "M365 apps and the Copilot platform" without specifying individual pillar ownership. Jha's earlier memo noted the full cascade of details would be finalised between March and June, ready for the start of FY27. The transition is in progress.
In Nadella's memo, everything starts with models. "Progress at the AI model layer is more critical than ever to our success as a company over the next decade and is foundational to everything we build above it." The reorganisation brings commercial and consumer Copilot together, commits more resources to the superintelligence mission, and ties model development directly to product benchmarks, serving costs, and frontier research. On Andreou, Nadella wrote that he "has accelerated our user-focused AI-first product making and growth framework" at Microsoft AI. On the redesign itself, Nadella wrote that org boundaries should "simply reflect system architecture and product shape" so Microsoft can "deliver more coherent and competitive experiences."
Tom Warren of The Verge summarised the two announcements: "Microsoft has a new Copilot boss who is about to focus on unifying the consumer and commercial Copilots. The changes will see Microsoft AI CEO Mustafa Suleyman dedicated to creating Microsoft's own AI models instead." That two-part framing captures the March 2026 reorganisation accurately: a unified Copilot product experience under Andreou, and in-house model development under Suleyman.
Five restructurings in two and a half years
| Date | Change |
|---|---|
| September 2023 | Microsoft announced "Microsoft Copilot" as the unified AI brand. Yusuf Mehdi led consumer Copilot marketing alongside Search, Ad & News. Commercial Copilot (M365 Copilot) fell under Jha's Experiences + Devices division. |
| March 2024 | Microsoft hired Mustafa Suleyman and most of the Inflection AI team, creating a new organisation called Microsoft AI (MAI). Suleyman became EVP and CEO, reporting to Nadella. |
| October 2024 | Jay Parikh, former Meta global head of engineering and former Lacework CEO, joined Microsoft's Senior Leadership Team. In January 2025, Nadella appointed him EVP of the new CoreAI division, covering Platform and Tools. |
| June 2025 | LinkedIn CEO Ryan Roslansky's role was expanded to also lead Microsoft Office and the M365 Copilot app, initially reporting to Jha. |
| March 2026 | The current restructuring. Jha announced his retirement, consumer and commercial Copilot were unified under Andreou, and Suleyman pivoted to focus on in-house model development and the superintelligence mission. |
The organisation behind the March 2026 restructuring looks very different from the one that launched M365 Copilot in late 2023.
The February 2026 security and quality changes
A month before Jha's retirement announcement, on 4 February 2026, Microsoft made separate leadership changes in security and engineering quality.
Hayete Gallot rejoined Microsoft as EVP of Security, reporting directly to Nadella. She had previously served as President, Customer Experience for Google Cloud. With this appointment, security has its own EVP-level leader on Nadella's Senior Leadership Team.
Charlie Bell moved from his role as an organisational leader to an individual contributor engineering role focused on the Quality Excellence Initiative (QEI), partnering with Scott Guthrie and Mala Anand on engineering quality across the company. Nadella's post framed the move as "given his desire to move from being an org leader to being an IC engineer" (IC meaning individual contributor, a hands-on technical role rather than a management one). Ales Holecek became Chief Architect for Security, reporting to Gallot.
Nadella noted "great momentum in security, including progress with Security Copilot agents, strong Purview adoption, and continued customer growth."
Jha's departure and the four-way succession
Jha's career at Microsoft spanned the company's transformation from on-premises software to cloud services. He joined Microsoft in 1990 after completing a Bachelor of Technology in Computer Science from the Indian Institute of Technology Madras (1988) and a Master of Science from UMass Amherst (1990). Over 35 years, his career path took him through Microsoft Works, the Consumer Division (multimedia), Director of Development for early cloud services, Corporate Vice President (CVP) of Exchange, SharePoint, Project, and Outlook, and CVP of Office 365 cross-platform.
He led the conversion of those flagship products from on-premises software to cloud services, oversaw Office 365's evolution into the Microsoft 365 subscription model that now serves hundreds of millions of users, and saw M365 Commercial cloud revenue grow 17% in the December 2025 quarter. Under his watch, M365 Copilot reached 15 million paid seats, and his division oversaw the incorporation of both OpenAI and Anthropic models into the M365 Copilot add-on.
At Build 2023, his division announced it "will adopt the same open plugin standard that OpenAI introduced for ChatGPT, enabling interoperability across ChatGPT and the breadth of Microsoft's copilot offerings." At Build 2024, his team announced Team Copilot, which "expands Copilot beyond a personal assistant to work on behalf of a team."
His retirement memo was methodical, as you would expect from someone who ran this division for years. "We're announcing these top-level changes today, and between now and June, my leadership team and I will work together to finalize the full cascade of details needed in this kind of transition. This includes aligning operating rhythms, decision ownership, and details on the future org structure, all so we'll be fully aligned and ready to run at the start of FY27. Our intent in taking this approach is to minimize changes and not lose the great momentum we have."
His leadership philosophy, as he described it on the WorkLab podcast: "It's very hard for teams to rally behind something when the leader themselves is half-hearted." And: "Whenever we are looking at a hard strategic call, we start by asking, how would the customer react to the decision that we are making?"
From Nadella's tribute: "Rajesh has been a constant throughout my entire life at Microsoft. From our earliest days working together, I have admired his unwavering commitment to his team, to our customers, to the products we build, and to the company." And: "I have always been struck by his operational rigor, his ability to make the hard strategic calls, lead through the grind, and emerge stronger on the other side. That, to me, is what true leadership looks like." And: "When I think about the pantheon of leaders who have truly shaped this company, Rajesh stands firmly among them. He embodies the commitment that helped build and transform Microsoft into the company it is today, and it is on the strength of that foundation that we will continue to move forward." Nadella closed: "And as we look to the future, the opportunity ahead is expansive. We have the depth of talent, the product ethos, and a clear sense of purpose as a company to ensure our technology advancements accrue to our mission of empowerment." And personally: "Rajesh – I am deeply grateful for all you have done for Microsoft and our customers and for all you have taught me personally. On behalf of all of us – Thank You."
Jha's own parting words to his team: "Our priorities around SFI, QEI, and Copilot remain unchanged – let's keep the intensity here." (SFI is the Secure Future Initiative, Microsoft's company-wide security programme; QEI is the Quality Excellence Initiative.)
Nobody replaced Jha. The cloud transformation chapter had one leader. The AI chapter has five: four EVP successors plus Andreou for unified Copilot.
His four successors all report directly to Nadella, each with a distinct specialisation: Clarke for M365 Core infrastructure, Lamanna for business applications and agents, Davuluri for Windows and Devices, and Roslansky for productivity apps. Five days later, Andreou became the fifth leader, taking the unified Copilot experience.
Beyond the four EVP successors, three further promotions accompanied the announcement: Jeff Teper was elevated to EVP, and Sumit Chauhan and Kirk Koenigsbauer were both promoted to President.
Jha was not the only long-serving leader to step down in the period. Phil Spencer, head of Gaming, announced his retirement on 20 February 2026, with Asha Sharma replacing him. Between Spencer, Jha, and Bell's move to an IC role, three of Nadella's longest-serving senior leaders changed status in one quarter.
3.3% of the addressable market
Fifteen million seats in a market of 450 million
Microsoft reported 15 million paid M365 Copilot seats at the Q2 FY2026 earnings call in January 2026, with seat growth up more than 160 percent year-over-year. Microsoft has approximately 450 million commercial M365 users (cited by The Register, sourcing Directions on Microsoft analyst Mary Jo Foley). That puts paid Copilot seat penetration at 3.3% of the commercial base. Many of those 450 million now have access to Copilot Chat at no additional cost, but the paid add-on, the product that generates direct per-seat revenue, has reached a fraction of the addressable market.
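The penetration arithmetic is straightforward; a quick sketch using the two figures quoted above (rounding is ours):

```python
# Paid M365 Copilot seat penetration, using the figures reported in the text.
paid_copilot_seats = 15_000_000       # Q2 FY2026 earnings call, January 2026
commercial_m365_users = 450_000_000   # Directions on Microsoft estimate

penetration = paid_copilot_seats / commercial_m365_users
print(f"Paid seat penetration: {penetration:.1%}")  # Paid seat penetration: 3.3%
```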
On the consumer side, the picture is starker still. As of February 2026, the Microsoft Copilot app had 6 million daily active users (DAUs) (Sensor Tower data). For comparison, per the same Sensor Tower data reported by CNBC: OpenAI's ChatGPT had 440 million daily active users, Google Gemini had 82 million, and Anthropic's Claude had 9 million (March 2026).
In the paid AI chatbot subscriber market (January 2026), Copilot's share contracted from 18.8% in July 2025 to 11.5% (Recon Analytics data), a 39% contraction in just six months, while ChatGPT held 55.2% and Gemini 15.7%.
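The 39% figure is a relative contraction (share lost as a fraction of the starting share), not a percentage-point drop, and the distinction matters when comparing market-share headlines. A quick check of the Recon Analytics figures:

```python
# Copilot's paid AI chatbot subscriber share, per the figures quoted above.
share_jul_2025 = 18.8  # percent
share_jan_2026 = 11.5  # percent

absolute_drop = share_jul_2025 - share_jan_2026   # in percentage points
relative_drop = absolute_drop / share_jul_2025    # as a fraction of the start

print(f"Absolute drop: {absolute_drop:.1f} percentage points")  # 7.3 points
print(f"Relative contraction: {relative_drop:.0%}")             # 39%
```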
Bing, which underpins Copilot's search capabilities, holds approximately 5% of the search engine market as of February 2026, against Google's 90% (StatCounter data, reported by CNBC).
On the Windows side, Copilot integrations into Settings, File Explorer, and notifications are "no longer in active development", per Windows Central's Zac Bowden.
As of January 2026, paid M365 Copilot seats represent 3.3% of the commercial M365 base. Consumer Copilot has 6 million daily active users, compared to ChatGPT's 440 million. Copilot's share of the paid AI chatbot subscriber market contracted 39% in six months.
Windows Central also reported that "virtually no one is using Microsoft Copilot" internally at Microsoft, despite Nadella wanting executives to embrace AI. The internal non-usage report is a single-source claim, but it sits alongside the 3.3% seat penetration, the 6 million DAU figure, and the Forrester assessment below, all painting a picture of early-stage adoption.
Nadella's optimism
At the Q2 FY2026 earnings call, Nadella described Copilot as "becoming a true daily habit," stating daily active users were up 10x year-over-year and that average conversations per user had doubled. On measuring Copilot's business impact: "We don't want to maximize just one business of ours. You should think about M365 Copilot, GitHub Copilot, Dragon Copilot, Security Copilot – all of those have a GM [gross margin] profile and lifetime value."
A tenfold increase in daily active users sounds substantial until you note the base: 6 million daily active users against ChatGPT's 440 million, 3.3% paid seat penetration, and a 39% contraction in paid subscriber market share. Nadella is optimistic. The numbers are not.
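Taken at face value, the two figures imply a very small prior-year base. Note the caveat: the 10x claim comes from the earnings call and the 6 million DAU figure from Sensor Tower, so they may not measure exactly the same thing; treat this as an order-of-magnitude sketch only:

```python
# Rough implication of combining the two figures above. The 10x multiple
# (earnings call) and the 6M DAU figure (Sensor Tower) are different sources
# and may use different definitions, so this is an order-of-magnitude check.
current_dau = 6_000_000
growth_multiple = 10

implied_prior_year_dau = current_dau / growth_multiple
print(f"Implied DAU a year earlier: ~{implied_prior_year_dau:,.0f}")  # ~600,000
```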
Two views of the same data
Opinions on where Copilot adoption leads vary considerably among analysts and research firms.
Forrester's "The Copilot Reality Check" (Q1 2026) found enterprise adoption "measured – even cautious." Their assessment: "Most organizations are still piloting rather than scaling." And: "Billions are flowing into AI supply-side capacity while enterprise demand remains disciplined, governed, and conditional." CIOs were demanding "outcome-led use cases up front, not generic productivity claims."
Forrester sees cautious enterprise piloting. Goldman Sachs projects $35 EPS by FY2030. Both views can coexist if you believe AI adoption will eventually scale but has not done so yet.
On the other side, Wall Street remains bullish on Microsoft's long-term AI value. Goldman Sachs holds a buy rating with a $655 price target, arguing the market is "underappreciating the long-term value" of Microsoft's AI, with an upside scenario of more than $35 earnings per share by FY2030. Wedbush has a $625 target (outperform), projecting Copilot and Azure could add $25 billion in sales by FY2026. Morgan Stanley has a $650 target (overweight), and separately warned via Fortune that "a transformative leap in artificial intelligence is imminent, driven by an unprecedented accumulation of compute at America's top AI labs."
Forrester's cautious assessment and the investment banks' bullish long-term outlook sit at opposite ends of the range, but they are not necessarily contradictory: disciplined piloting today and large-scale adoption later describe the same trajectory at different points.
E7 at $99 and a July price rise
Microsoft 365 E7 (announced 9 March 2026), a new $99 per user per month bundle, launches in May 2026. It rolls the $30 M365 Copilot add-on, $12 Entra identity tools, and the new $15 Agent 365 into a single bundled offering at one per-user price.
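For a first-pass budgeting comparison, the named add-ons can be summed against the bundle price. This sketch uses only the list prices quoted above; the "implied base-suite component" line is our inference, and it ignores volume discounts and negotiated pricing entirely:

```python
# First-pass E7 comparison using the per-user list prices quoted in the text.
# Deliberately ignores discounts, negotiated rates, and suite-level differences.
e7_price = 99.0  # USD per user per month

add_ons = {
    "M365 Copilot": 30.0,
    "Entra identity tools": 12.0,
    "Agent 365": 15.0,
}

add_on_total = sum(add_ons.values())
print(f"Named add-ons alone: ${add_on_total:.2f}/user/month")          # $57.00
print(f"Implied base-suite component: ${e7_price - add_on_total:.2f}") # $42.00
```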
π SAMexpert advises on M365 licensing, including E7 migration and Copilot bundling. We don't sell Microsoft licences β our advice is independent. Learn more: Microsoft 365 Planning and Optimisation.
Separately, Microsoft 365 price increases across the board take effect in July 2026.
The new Copilot leadership team
Jacob Andreou, EVP, Copilot
Andreou spent eight years at Snap Inc. (2015β2023) as Senior Vice President of Growth, helping scale the company from 80 million to over 360 million DAUs and $4.5 billion in revenue. At Snap, he created the growth team, established the "Core-Product-Value" growth framework, and built the international growth playbook. Per Greylock's announcement, he "built App Install, Web View, and Video View Ad Units and led design for ads and content." Under his product leadership, Snap launched Spotlight and "My AI, Snap's own AI chatbot powered by ChatGPT."
He left Snap in March 2023 to become General Partner at Greylock, a venture capital firm, where he partnered with "early-stage founders who are building the next generation of consumer platforms and social experiences." From Greylock, not directly from Snap, he joined Microsoft AI in 2025 as Corporate Vice President of Product and Growth, reporting to Suleyman.
Before the promotion, Andreou built several products at Microsoft AI:
Mico, described by AP as "a new artificial intelligence character" and "a floating cartoon face shaped like a blob or flame that will embody the software giant's Copilot virtual assistant." Introduced at the Copilot Sessions event on 22 October 2025 in Los Angeles, it "changes colors, spins around and wears glasses when in 'study' mode." Andreou said the character should be "genuinely useful" and not so validating that it would "tell us exactly what we want to hear, confirm biases we already have, or even suck you in from a time-spent perspective and just try to kind of monopolize and deepen the session and increase the time you're spending with these systems."
The Copilot Fall Release (23 October 2025), a consumer Copilot update with 12 major features: Mico avatar, Groups (up to 32 participants), Memory & Personalisation, Proactive Actions, Copilot for Health, Copilot Mode in Edge, Copilot on Windows, Learn Live, Connectors, Imagine, Real Talk, Copilot Pages and Search.
GroupMe received Chat Summaries powered by Copilot, voice messages, and shared event albums during 2025. Profile views hit 1.2 billion.
Copilot Tasks (26 February 2026). Per the announcement: "You describe what you need in natural language. Copilot plans and goes to work... Tasks works in the background, with its own computer and browser." Tasks can be "recurring, scheduled, or run once." The examples Microsoft gives are practical: surfacing urgent emails with draft replies, tracking apartment listings weekly, and Monday morning briefings. The system is "designed to ask for consent before taking meaningful actions like spending money or sending a message on your behalf." Limited research preview as of March 2026.
Andreou spent eight years scaling Snap from 80 million to 360 million DAU. His entire track record is consumer growth. Now he leads both consumer and commercial Copilot under one product vision.
Now, as EVP of Copilot, Andreou reports to Nadella with a dotted line to Suleyman. Dotted-line reporting is standard practice in large organisations and should not be read as anything more than an organisational coordination mechanism. His job is to unify the consumer and commercial Copilot experience under one product vision.
Mustafa Suleyman, EVP and CEO of Microsoft AI
Suleyman co-founded DeepMind in 2010, which Google acquired four years later, and subsequently co-founded Inflection AI. In March 2024, Microsoft hired Suleyman together with Inflection's Chief Scientist Karén Simonyan and several other Inflection team members. Together they formed Microsoft AI as a new organisation, with Suleyman as its EVP and CEO. Mikhail Parakhin's team (Copilot, Bing, Edge) and Misha Bilenko's Generative AI team moved to report to him.
Over the two years since joining Microsoft, Suleyman has been saying the same thing about building models in-house. He calls it "self-sufficiency," meaning the ability to build models without depending on any single external provider. That does not mean breaking with OpenAI. The partnership was reaffirmed in February 2026. But it does mean Microsoft wants to stand on its own feet when it comes to models.
His public statements trace the idea from aspiration to product. At Microsoft's 50th anniversary event in Redmond in April 2025, he said: "Look, it's absolutely mission-critical that long-term, we are able to do AI self-sufficiently at Microsoft." He also laid out the cost logic: "It's cheaper to give a specific answer once you've waited for the first three or six months for the frontier to go first." And: "That's actually our strategy, is to really play a very tight second, given the capital-intensiveness of these models." In plain terms, let the leaders spend the money pushing the frontier, then build something cheaper that does the same job well enough for enterprise customers.
Four months later, he had something to show for it. When MAI-1-preview launched in August 2025, he called it "our first foundation model trained end to end in house." And: "We have big ambitions for where we go next β model advancements, an exciting roadmap of compute, and the chance to reach billions of people through Microsoft's products."
At the Paley International Council Summit in October 2025, he gave a glimpse of where the products are heading. "Copilot on Windows can take control of your mouse and keyboard if you've granted permission… It will use your operating system for you." And: "In Edge, our browser, it can use the search engine. It can buy things and book things." And: "You know, it really is going to be working on your behalf with you as a kind of overseer."
Then in November, he made the ambition explicit. On 6 November 2025, Suleyman announced the MAI Superintelligence Team within Microsoft AI, joined by Microsoft AI Chief Scientist Karén Simonyan. He called the mission "Humanist Superintelligence" (HSI), which he defines as "incredibly advanced AI capabilities that always work for, in service of, people and humanity more generally." On the approach, per CNBC: "We are not building a superintelligence at any cost, with no limits." And: "We are doing this to solve real concrete problems and do it in such a way that it remains grounded and controllable." WinBuzzer described the effort as pursuing "true self-sufficiency."
On the day of the reorg (17 March 2026), he told CNBC: "I'm genuinely thrilled about this change precisely because most of the future value is going to accrue to the model layer, and my job is to create highly COGS-optimized, highly efficient enterprise specific model lineages for Microsoft over the next three to five years. That is singularly the objective, precisely because the model is the product, right? That is the future direction of all the IP." (COGS is cost of goods sold, in this context the compute cost to serve AI workloads; IP is intellectual property.)
His memo (edited slightly for external use per Microsoft) said the restructuring would let him "focus all my energy on our Superintelligence efforts and be able to deliver world class models for Microsoft over the next 5 years." Those models would do two things: improve Microsoft's own products through enterprise-tuned variants, and bring down the cost of serving AI at the scale Microsoft expects. He also committed to staying involved in day-to-day Microsoft AI operations, attending Meetups, MMMs, and Leadership Team meetings, and supporting Andreou on product strategy. Andreou retains a dotted line to him. On unifying Copilot, he wrote that he had been working with other leaders in the background for some time and that the consumer-commercial merger made sense.
Suleyman's cost logic: it's cheaper to wait three to six months for the frontier to move, then build something efficient for enterprise. That is the stated strategy behind Microsoft's in-house model portfolio.
Nadella's memo called Suleyman uniquely qualified to lead the model effort, citing his focus on advancing model science while keeping human control, agency, and economic opportunity at the centre.
Ryan Roslansky, EVP, M365 apps
Roslansky has been CEO of LinkedIn since June 2020. He has spent 16 years at the company in total. Under his leadership, LinkedIn grew to over 1 billion members and more than $17 billion in annual revenue.
On 4 June 2025, Nadella gave him an additional role: EVP of Office, responsible for Word, Excel, PowerPoint, Outlook, and the M365 Copilot app. He initially reported to Jha; after Jha's retirement announcement, he reports directly to Nadella.
When he accepted the dual role, he told The Register: "Office is one of the most iconic product suites in history. It has shaped how the world works, literally. The reach and impact of Office are unmatched." And: "Productivity, connection, and AI are converging at scale. Both Office and LinkedIn are used daily by professionals globally and I'm looking forward to redefining ourselves in this new world."
His view of what AI means for the workforce is worth hearing, because it shapes how he thinks about both LinkedIn and Office. At the Paley International Council Summit (22β24 October 2025, Menlo Park), he said: "Here's the reality: the skills that got you here won't get you there." LinkedIn data shows that ten years ago the average job changed roughly 25% in required skills; by 2030 that could reach 70%. The skills emerging fastest, in his view, are "AI literacy, adaptability and human-centered leadership." His advice to leaders: "Stop hiring for yesterday's job descriptions and start hiring for potential." And: "They need to create cultures of continuous learning: microlearning, real-world projects and personalized development."
Roslansky co-authored "Open to Work" with Aneesh Raman, LinkedIn's Chief Economic Opportunity Officer. The book is described as "LinkedIn's first book". Per the announcement: "The book explores how AI is reshaping work and what that shift means for the people navigating it every day." Backed by "insights from experts, LinkedIn's global network, Microsoft customers and the Work Trend Index." It publishes 31 March 2026.
Roslansky runs LinkedIn (1 billion members) and Office (hundreds of millions of users). The integration potential is obvious. As of March 2026, no concrete data integration has shipped.
Under his watch, the M365 E7 bundle was announced, and Agent 365 and Copilot Cowork entered preview. But the big question hanging over the dual role is LinkedIn-to-Office integration, and so far no concrete data integration has shipped. GeekWire's coverage of the dual-role announcement (June 2025) discussed potential use cases but did not report any shipped features.
Perry Clarke, EVP, M365 Core
Perry Clarke holds a PhD in Physics from Queen's University Belfast. He joined Microsoft around 1996 (inferred from "15 years at Microsoft" in a September 2011 blog post) and spent approximately 15 years in the Exchange Product Group, leading the Mailbox Server Engineering Team during Exchange 2007/2010 development. He was promoted to Distinguished Engineer (at the time, the blog post noted "there are less than 60 distinguished engineers at Microsoft"), then Corporate Vice President of Office 365 Online Services. His background is also documented by Queen's University.
His track record at Microsoft is in infrastructure engineering.
The Exchange storage redesign reduced disk input/output requirements by approximately 90% compared to Exchange 2003, enabling cheap commodity drives instead of expensive enterprise storage systems. This was an infrastructure revolution that made Exchange economically viable at cloud scale.
Clarke is an infrastructure engineer with a PhD in physics. His track record is Exchange storage and high-availability systems at cloud scale. He now runs M365 Core β the infrastructure layer of Microsoft 365.
Database Availability Groups (DAGs), introduced in Exchange 2010 under his team, provided automatic database-level failover in 30 seconds or less.
His "Geek Out with Perry" series covers high availability architecture, migrations versus in-place upgrades, archiving and storage economics, data protection, Exchange Online security, email data immutability, and disk I/O efficiency. As of March 2026, Clarke is EVP reporting to Nadella, leading M365 Core. He is an infrastructure engineer who thinks at cloud scale.
Charles Lamanna, President, Business Applications & Agents
Lamanna holds a BS in Computer Science from the University of Notre Dame. He first joined Microsoft in 2009. From 2012 to 2013, he founded MetricsHub, described as "one of the first offerings for public cloud cost management and service health monitoring". Microsoft acquired it in 2013. After the acquisition, he led engineering teams that created Azure Resource Manager, Azure Autoscale, Azure Logic Apps, and Azure Activity Logs. He rose to CVP leading Dynamics 365 and Power Platform, and from there to his current title of President, Business Applications & Agents. He sits on the board of Danaher Corporation (appointed February 2025). Per his Danaher bio: "He oversees the design, product development, and engineering of some of Microsoft's most transformative technologies, including Power Platform, Dynamics 365, and Copilot Studio." Jha's succession memo listed him as an "EVP direct report to Satya"; his functional title remains President.
He has been more specific about where Microsoft is heading with agents than Nadella's memo was.
Writing on the Microsoft Cloud Blog in April 2025, he offered an analogy: "If computers are the bicycle [for the mind], AI is the jetpack of the mind." He sees it unfolding in stages. "Initially, we're seeing AI augment people in an organization β everyone has a powerful AI assistant that deeply understands their specific work." Then: "AI agents will evolve to become key team members, capable of autonomously managing complex workflows and boosting efficiency considerably. People will set high-level strategies, provide direction and manage these agents."
By October 2025, at the Power Platform Community Conference 2025 (PPCC25) keynote, the message had sharpened. "Low-code as we know it is dead." The Power Platform has been "reborn in the era of AI." Per VladTalksTech's recap: "Makers now use AI-infused tools to build full-stack apps faster, smarter, and with less manual effort. You can still drag and drop, but now you can also tweak the React code behind the scenes if needed." Power Platform had reached 56 million monthly active users (up from 48 million in 2024 and 33 million in 2023, representing 70% growth over two years). Copilot Studio has reached 200,000 organisations creating agents, up from 50,000 in one year. Lamanna stated he would love to hit 500 million monthly active users (per VladTalksTech).
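The growth figures in that recap are internally consistent; a quick verification of the arithmetic:

```python
# Power Platform monthly active users, per the PPCC25 figures quoted above.
mau = {"2023": 33_000_000, "2024": 48_000_000, "2025": 56_000_000}

two_year_growth = mau["2025"] / mau["2023"] - 1
print(f"Two-year MAU growth: {two_year_growth:.0%}")  # 70%

# Copilot Studio organisations building agents, year over year.
orgs_multiple = 200_000 / 50_000
print(f"Copilot Studio orgs: {orgs_multiple:.0f}x in one year")  # 4x
```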
On the Madrona podcast in May 2025, he went further. "Business apps as we know it are indeed dead. It's going to be like mainframes." What replaces them, in his view, is a complete rethinking of how enterprise software works. "Business agents. You're going to have a generative UI, which AI dynamically authors and renders on the fly to exactly match what the person's trying to do. You're going to replace workflows with AI agents, which can take a goal and an outcome and find the best way to accomplish it, and you're going to move from static relational databases to things like vector databases, and search indexes, and relevant systems, which are a whole new class of technology." His timeline for the agent-first model becoming the default: "By 2030, this will be the prevalent pattern for business applications and business solutions."
The implications go beyond software architecture. If agents handle the workflows, people's roles change too. "You can be a generalist with a team of expert AI supporting you. What that translates to is probably de-specialization in the enterprise, de-specialization in companies where you have less distinct roles and disciplines, more generalists powered by AI." And eventually: "In the future, probably most knowledge work and most information work will be done by AI agents. And a knowledge worker's main responsibility will be the management and upkeep."
Power Platform: 56 million monthly active users, up from 33 million two years ago. Copilot Studio: 200,000 organisations building agents, up from 50,000 in one year. Lamanna's target: 500 million MAU.
He also pointed to the infrastructure enabling all this. The Model Context Protocol (MCP) and Agent-to-Agent protocol (A2A) represent "probably 30 years since we've had such an industry-wide convergence on an open standard." Copilot Studio already supports MCP servers. And inside Microsoft, promotion now depends on it: "This year, you won't be promoted unless you use AI tools if you're an engineer."
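Part of MCP's appeal is that tools describe themselves in a machine-readable way, so any compliant agent can discover and invoke them without bespoke integration. The following is not the real MCP SDK; it is a stdlib-only sketch of the pattern, with every name (ToyToolServer, book_meeting) invented for illustration.

```python
import json
from typing import Callable

class ToyToolServer:
    """Illustrative stand-in for an MCP-style tool server: tools register
    with a name, a description, and a parameter schema, and clients
    discover the schemas before deciding what to call."""

    def __init__(self) -> None:
        self._tools: dict[str, tuple[dict, Callable]] = {}

    def tool(self, name: str, description: str, params: dict):
        def register(fn: Callable) -> Callable:
            schema = {"name": name, "description": description, "parameters": params}
            self._tools[name] = (schema, fn)
            return fn
        return register

    def list_tools(self) -> str:
        # What a client sees during discovery: schemas only, no code.
        return json.dumps([schema for schema, _ in self._tools.values()])

    def call(self, name: str, **kwargs):
        _, fn = self._tools[name]
        return fn(**kwargs)

server = ToyToolServer()

@server.tool("book_meeting", "Book a meeting room", {"room": "string", "hour": "integer"})
def book_meeting(room: str, hour: int) -> str:
    return f"Booked {room} at {hour}:00"

print(server.list_tools())
print(server.call("book_meeting", room="4A", hour=14))
```

The industry-wide convergence Lamanna describes is essentially everyone agreeing on the shape of that discovery-then-call handshake, so a booking service and a CRM can expose tools the same way.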
Lamanna's vision statements are backed by a product catalogue that has grown rapidly since late 2023.
Copilot Studio is where organisations build their own agents. It started as a replacement for Power Virtual Agents at Ignite in November 2023, gained agent capabilities at Build 2024, and by March 2026 supports multiple AI models (GPT-5, Claude Opus 4.6, Claude Sonnet 4.5), Computer-Using Agents for desktop automation, multi-agent orchestration, and MCP server support. It has evolved from a chatbot builder into something considerably more ambitious.
Agent 365 sits above it as the management layer. Lamanna calls it "the control plane for AI agents" with "a focus on registration, access control, visualization, interoperability, and security." It works with agents built in Copilot Studio, Microsoft Foundry, open-source frameworks, and from partners including LexisNexis, ServiceNow, Adobe, Box, NVIDIA, ZenDesk, and Databricks. It reaches general availability on 1 May 2026 at $15 per user per month, and is also included in qualifying M365 plans (Microsoft Agent 365 page).
And there is Copilot Cowork, announced on 9 March 2026 alongside Agent 365 and built in partnership with Anthropic. Cowork takes a task, turns it into a plan, and executes it in the background across M365 apps. It was in research preview as of March 2026.
Real customers are already using these tools. Estée Lauder built a ConsumerIQ agent in Copilot Studio. Dow deployed a freight invoice anomaly detection agent that caught a billing error, $30,000 charged against an expected $5,000. Microsoft's own Azure.com website uses an AI assistant that drove 70% more pages per session and a 21.5% increase in conversion rates. In a Fortune op-ed in December 2025, Lamanna wrote that "80% of leaders said their company plans to integrate agents into their AI strategy in the next 12 to 18 months, with more than one-third planning to make them central to major business processes."
Lamanna's current scope covers Copilot Studio, Power Platform, and Agent 365. Whether the market follows his stated direction at the pace he projects is a separate question, and one that the cautious enterprise adoption data from Forrester keeps open.
From chat to agents
Nadella and Suleyman's 17 March memos describe the same turn. In Nadella's words, "AI experiences rapidly evolve from answering questions and suggesting code, to executing multi-step tasks with clear user control points." He pointed to Copilot Tasks, Copilot Cowork, agentic capabilities in Office, and Agent 365 as evidence that the evolution from chat to autonomous execution is already happening, and said the opportunity for customers is more time on higher-value work, less manual coordination, and proper governance and security controls. Suleyman put it more bluntly: "We really do have an incredible opportunity to redefine Microsoft for this agentic revolution."
The products shipped between October 2025 and March 2026 support the claim. Copilot Tasks, Copilot Cowork, Agent 365, and Computer-Using Agents in Copilot Studio all work autonomously on the user's behalf, asking for consent before taking actions like spending money or sending messages.
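The "clear user control points" Nadella describes boil down to classifying each step an agent proposes and pausing on the sensitive ones. Here is a minimal sketch of that pattern; the action names, the SENSITIVE set, and the approve callback are all invented for illustration and stand in for a real consent prompt.

```python
# Actions an agent might take; which ones require consent is a policy decision.
SENSITIVE = {"send_message", "spend_money"}

def run_plan(steps, approve):
    """Execute agent steps, pausing for approval on sensitive ones.
    `approve` is a callback standing in for a real user prompt."""
    log = []
    for action, detail in steps:
        if action in SENSITIVE and not approve(action, detail):
            log.append(f"skipped {action}: {detail}")
            continue
        log.append(f"did {action}: {detail}")
    return log

plan = [
    ("draft_email", "quote request to dealer"),
    ("send_message", "email to dealer A"),
    ("spend_money", "deposit $500"),
]

# Simulated user decisions: approve messages, refuse spending.
result = run_plan(plan, approve=lambda action, detail: action == "send_message")
print(result)
```

The interesting design question, which the products above each answer slightly differently, is where to draw the SENSITIVE line: too many checkpoints and the agent is no faster than doing the work yourself, too few and it spends money you did not authorise.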
The March 2026 reorganisation positions Microsoft for exactly what Nadella's memo describes. The unified Copilot experience under Andreou, the agent infrastructure under Lamanna, and the model layer under Suleyman all point in the same direction: agents that act on behalf of users.
The evolution from chat to autonomous action is not limited to Microsoft. For a sense of where autonomous agents already are, consider developer AJ Stuyvenberg, who built an open-source personal AI agent called OpenClaw (originally Clawdbot, then Moltbot) and set it loose to buy him a car. While Stuyvenberg sat in a meeting, the agent emailed dealerships on his behalf, forwarded competing quotes between them to drive the price down, and negotiated a $4,200 discount on a Hyundai Palisade. The dealerships had no idea they were dealing with software.
Suleyman's five-year model mission
Suleyman wants to build in-house AI models. He has said so repeatedly, and the March 2026 restructuring gives him the room to do it. His memo describes the mission as creating superintelligence that delivers "a transformative, positive impact for millions of people," requiring frontier models built at scale. The models would do two things. First, provide enterprise-tuned lineages that improve products across the company. Second, deliver the cost efficiencies needed to serve AI workloads at the immense scale Microsoft anticipates.
What has shipped
As of March 2026, Microsoft has released a catalogue of in-house models, none of which are frontier-class large language models (LLMs) in the manner of GPT-5 or Claude Opus 4.6.
🔹 MAI-Voice-1, announced in August 2025, is a speech generation model that generates a full minute of audio in under a second on a single GPU, handling single and multi-speaker scenarios. It is available in Copilot Daily, Podcasts, and Copilot Labs.
🔹 MAI-1-preview is a mixture-of-experts foundation model, pre-trained and post-trained on approximately 15,000 NVIDIA H100 GPUs. Suleyman described it as "our first foundation model trained end to end in house." Available on LMArena for public testing, rolling out "for certain text use cases within Copilot," with API access for trusted testers. Microsoft confirmed "a working cluster of Nvidia GB200 chips" alongside this announcement, also in August 2025.
🔹 MAI-Image-1 is Microsoft's first in-house image generation model. It debuted in the top 10 on the LMArena Text-to-Image leaderboard and is available in Bing Image Creator and Copilot Labs.
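The mixture-of-experts design behind MAI-1-preview is worth a short illustration. The idea is that a router sends each input to only a few specialised sub-networks ("experts") rather than running the whole model, which is where the cost efficiency comes from. The experts, router scores, and weights below are toy values invented for this sketch, not anything from Microsoft's model.

```python
import math

def softmax(xs: list[float]) -> list[float]:
    """Convert raw scores into a probability distribution."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

# Toy experts: each is just a linear function standing in for a sub-network.
experts = [
    lambda x: 2.0 * x,   # pretend "maths" expert
    lambda x: x + 1.0,   # pretend "text" expert
    lambda x: -x,        # pretend "code" expert
]

def router_scores(x: float) -> list[float]:
    """Toy router: one score per expert, derived from the input."""
    return [x, 1.0 - x, 0.1 * x]

def moe_forward(x: float, top_k: int = 2) -> float:
    """Route to the top_k experts only and mix their outputs by weight."""
    weights = softmax(router_scores(x))
    top = sorted(range(len(experts)), key=lambda i: weights[i], reverse=True)[:top_k]
    renorm = sum(weights[i] for i in top)
    # Only the selected experts execute; the rest of the network stays idle.
    return sum(weights[i] / renorm * experts[i](x) for i in top)

print(moe_forward(2.0))
```

A real mixture-of-experts model applies this routing per token inside each layer, so total parameter count can be very large while the compute per token stays close to that of a much smaller dense model.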
The Phi family of small language models (what Suleyman calls "off-frontier"; 3 to 15 billion parameters) is Microsoft's most prolific in-house line:
Phi-4, a 14-billion-parameter small language model specialising in complex reasoning and mathematics. Released January 2025 on Azure AI Foundry and Hugging Face.
Phi-4-mini and Phi-4-multimodal, smaller variants for on-device and multimodal scenarios.
Phi-4-reasoning, a 14-billion-parameter reasoning model trained via supervised fine-tuning on demonstrations from o3-mini. Outperforms OpenAI o1-mini and DeepSeek-R1-Distill-Llama-70B on most benchmarks.
Phi-4-reasoning-plus, a variant with outcome-based reinforcement learning for higher performance via longer reasoning traces.
Phi-4-mini-reasoning and Phi-4-mini-flash-reasoning, optimised for edge and device deployment (Copilot+ PCs). Available since May 2025.
Phi-4-reasoning-vision-15B, a 15-billion-parameter multimodal reasoning model, early 2026.
Microsoft has not publicly announced a frontier-class large language model. Bloomberg reported that Microsoft's in-house models produced test results "competitive with state-of-the-art rivals, including products from OpenAI and Anthropic," but the specific benchmarks and models were not named publicly. Models in active development cover source code generation, images, audio, and reasoning (per CNBC). In his memo, Suleyman said the goal is "enterprise tuned lineages that help improve all our products across the company" and "the COGS efficiencies necessary to be able to serve AI workloads at the immense scale required in the coming years."
As of March 2026, Microsoft has not shipped a frontier-class large language model. Its entire in-house portfolio (Phi, MAI-Voice, MAI-Image) consists of specialised and small models. The frontier LLM gap is where Suleyman's five-year mission sits.
Suleyman's memo commits to delivering world-class models for Microsoft over the next five years and describes the frontier-scale compute roadmap as locked and ready. What has shipped so far is a portfolio of specialised and small models. The Phi family excels in reasoning benchmarks at its parameter class. MAI-Voice-1 and MAI-Image-1 are competitive in their specific domains. But Microsoft has not yet shipped a large language model that competes directly with GPT-5, Claude Opus 4.6, or Gemini at the frontier. That gap is where the stated five-year mission sits.
Suleyman's off-frontier strategy explains the current portfolio: build cost-efficient, specialised models now while the frontier compute roadmap matures.
Custom silicon, GB200 clusters, and Fairwater
The models need hardware, and Microsoft has been building it. Suleyman's memo says the "frontier scale compute roadmap" is "locked," meaning the infrastructure needed to train state-of-the-art models is in place or committed.
On the silicon side, the Maia 200 is Microsoft's own AI accelerator chip, described by WinBuzzer as a "second-generation AI-accelerator chip designed for cost-efficient inference." Microsoft is designing its own chips specifically for AI workloads.
On the GPU side, Microsoft confirmed "a working cluster of Nvidia GB200 chips" (Grace Blackwell architecture) as of the MAI-1-preview announcement in August 2025. The GB200 cluster sits alongside the approximately 15,000 NVIDIA H100 GPUs that were used to train MAI-1-preview.
And the Fairwater network of purpose-built AI data centres houses the lot. Taken together (custom silicon, state-of-the-art GPU clusters, and dedicated data centres), this is the compute foundation on which Suleyman's five-year model roadmap sits.
Copilot Cowork and the Anthropic connection
On 9 March 2026, eight days before the reorganisation announcement, Charles Lamanna announced Copilot Cowork. Lamanna did not mince words: "The era of Copilot execution is here." What matters most is what followed: "Working closely with Anthropic, we have integrated the technology behind Claude Cowork into Microsoft 365 Copilot."
In practice, "when you hand off a task to Cowork, it turns your request into a plan. The plan continues in the background, with clear checkpoints so you can confirm progress, make changes, or pause execution at any time." It draws on signals from Outlook, Teams, Excel, and the rest of Microsoft 365, "so it can act with the same understanding you bring to your job."
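The task-to-plan-with-checkpoints pattern described above can be sketched as a simple state machine: a request becomes an ordered plan, each step yields a checkpoint, and the caller can confirm or stop. Everything below (the step names, the generator protocol) is an invented illustration of the pattern, not Cowork's implementation.

```python
def make_plan(request: str) -> list[str]:
    """Turn a request into an ordered plan (toy version: fixed steps)."""
    return [
        f"gather context for: {request}",
        f"draft output for: {request}",
        f"finalise: {request}",
    ]

def execute(plan):
    """Generator that pauses at a checkpoint after every step,
    so the caller can confirm progress or stop execution."""
    done = []
    for step in plan:
        done.append(step)
        keep_going = yield f"checkpoint after '{step}'"
        if keep_going is False:  # caller paused the task
            break
    return done

task = execute(make_plan("Q3 budget summary"))
next(task)           # step 1 runs; the agent pauses at the first checkpoint
task.send(True)      # confirm; step 2 runs
try:
    task.send(False) # pause the task; step 3 never runs
except StopIteration as stop:
    completed = stop.value
print(completed)
```

The design point is that control stays with the user between steps, which is exactly the property that separates a background "coworker" from an unsupervised script.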
As of March 2026, Cowork is in Research Preview with a limited number of customers, with broader availability planned through the Frontier programme in late March 2026. In November 2025, Anthropic was valued at approximately $350 billion following an investment deal involving Microsoft and Nvidia.
Beyond Cowork, Anthropic's models are already available in Copilot Studio, where organisations can build their own agents using GPT-5, Claude Opus 4.6, and Claude Sonnet 4.5 alongside Microsoft's own models.
The Anthropic integration is one data point in a broader multi-model strategy. Microsoft now integrates OpenAI models, Anthropic models (Claude in M365 Copilot via Cowork, and multiple Claude variants in Copilot Studio), and its own in-house models throughout its product line. The in-house Phi models handle edge and on-device scenarios. The MAI models target specific domains including voice, image generation, and text. It is a portfolio approach to model supply, not a winner-take-all bet on any single provider.
The OpenAI partnership: evolution, not a breakup
The Microsoft-OpenAI partnership has been renegotiated four times in public since January 2025, and each round loosened the ties while deepening the money.
It started in January 2025, when Microsoft said: "Key elements of our partnership remain in place for the duration of our contract through 2030," but moved compute exclusivity to a right-of-first-refusal model and approved OpenAI to build its own capacity for research and training. A non-binding memorandum of understanding followed in September 2025.
The definitive agreement came in October 2025, and it rewrote the relationship substantially. Microsoft's equity stake dropped from 32.5% to approximately 27% (still roughly $135 billion) as OpenAI transitions to a public benefit corporation. IP rights were extended through 2032, now including models developed after the achievement of Artificial General Intelligence (AGI), though Microsoft keeps the IP needed to build and run models (architecture, weights, inference and finetuning code, data centre IP) while OpenAI's confidential research methods have a time-limited licence. The old arrangement where OpenAI could unilaterally declare AGI (triggering contractual changes) was replaced by independent expert verification. OpenAI's API products stay exclusive to Azure, but non-API products like ChatGPT can now run on any cloud provider. Microsoft lost its right of first refusal as compute provider. And both parties can now independently pursue AGI with other partners.
The financial commitment actually got bigger. OpenAI committed to purchase an additional $250 billion in Azure services. Revenue share continues until AGI is independently verified. OpenAI can release open-weight models and serve US government national security customers on any cloud.
In February 2026, the two companies issued a joint statement reaffirming that the October terms are unchanged, alongside OpenAI's $110 billion funding round (with Amazon backing). Azure remains the exclusive cloud provider for stateless OpenAI APIs.
📌 SAMexpert provides independent Microsoft licensing advice: no reselling, no vendor bias. Learn more: Microsoft Licensing Services for Enterprises.
The direction of travel is clear. Each renegotiation reduced how much either party was locked to the other, while the money flowing between them grew. Microsoft can now pursue AGI on its own. OpenAI can run its consumer products wherever it likes. Yet the financial ties remain deep: $250 billion in committed Azure purchases, a ~27% equity stake worth roughly $135 billion, and a joint reaffirmation issued as recently as February 2026.
Windows Central's original headline framed the reorg as Microsoft "building toward an OpenAI-free future" (the phrasing is preserved in the URL slug). The publication subsequently changed the page title to "Microsoft Copilot Changes: Moving Away From OpenAI?" Windows Central's coverage also noted Suleyman's new role "could also help Microsoft shift away from its reliance on OpenAI."
The fuller picture that emerged over the following days is less dramatic. Microsoft confirmed to Reuters: "We are integrating different models of OpenAI and Microsoft, depending on the product and experience we want to provide." Microsoft is also integrating its own in-house Phi models to reduce serving costs. The overall contractual evolution is consistent with diversification and optionality rather than decoupling.
As of March 2026, Microsoft has in-house model development capabilities, a $250 billion compute partnership with OpenAI, and Anthropic integration in its productivity suite.
$37.5 billion in a single quarter
The March 2026 Copilot reorganisation did not happen in a financial vacuum. The iShares Expanded Tech-Software Sector exchange-traded fund (IGV) was down approximately 19% year-to-date in March 2026. Microsoft's own stock was down 17% over the same period.
$37.5 billion in capital expenditure in a single quarter on AI infrastructure, while paid Copilot seats cover 3.3% of the 450-million commercial M365 base.
Microsoft spent $37.5 billion in capital expenditure in the single quarter ending December 2025, a figure The Register described as a "quarterly AI splurge." M365 Commercial cloud revenue increased 17% in the same quarter.
At the Q2 FY2026 earnings call (January 2026), CFO Amy Hood addressed the correlation investors were drawing between CapEx and Azure revenue: "I think many investors are doing a very direct correlation... between the capex spend and seeing an Azure revenue number." She said a large share of AI capacity is allocated to Microsoft's own products first (M365 Copilot, GitHub Copilot) before being made available to external Azure customers (The Register). The gap between capital expenditure and visible AI revenue is a legitimate investor concern. Nadella's answer was that the investment serves M365 Copilot, GitHub Copilot, Dragon Copilot, and Security Copilot, each with its own gross margin profile and lifetime value. The spending is not reducible to one Azure revenue number.
Voices from the field
The day after the reorg announcement, I sat down with Alexander Yashukov, SAMexpert's senior analyst, to work through what it all means. What follows is what came out of that conversation.
Self-sufficiency as a return to form
Alexander Yashukov observes that Microsoft was a serious force in machine learning research as far back as 2015–2016, competing with Google, Facebook AI Research (FAIR), Baidu Research under Andrew Ng, and major academic labs. Microsoft Research was the first to claim human-level accuracy in speech recognition (October 2016, 5.8% word error rate on the Switchboard benchmark versus their measured human baseline of 5.9%, though IBM later disputed the baseline, arguing true human parity is 5.1% WER). They had the Cognitive Toolkit (CNTK), a C++ deep learning framework released January 2016 that was competitive with TensorFlow on benchmarks, particularly in multi-GPU scaling. Their ML R&D never stopped; what changed was appetite for public-facing AI products after the 2016 Tay chatbot disaster, where coordinated 4chan users exploited the bot's "repeat after me" feature and learning mechanism to produce offensive content. Microsoft did not ship another consumer AI chatbot for seven years. They retreated to letting third parties build the consumer-facing layer while continuing research internally. The self-sufficiency language in Suleyman's memo is a return to a position Microsoft held before OpenAI entered the picture.
Quality frustration as a driver
Microsoft got burned by Copilot's quality problems no less than its customers did: hallucinations, poor output, the lot. Nadella himself said in an internal email (December 2025, originally reported by The Information) that Copilot's Gmail and Outlook integrations "don't really work" and are "not smart." In my view, the subtext of the reorg is "thank you OpenAI, but we'll write our own models now." Microsoft's framing is that its models are specialised for specific tasks, and that specialisation is exactly in line with what industry analysts have been predicting: a gradual move away from LLMs as a horizontal tool towards vertical solutions. Gartner predicts that by 2027, organisations will use small, task-specific AI models at least three times more than general-purpose LLMs. Sumit Agarwal, VP Analyst at Gartner: "The variety of tasks in business workflows and the need for greater accuracy are driving the shift towards specialised models fine-tuned on specific functions or domain data."
A horizontal model is a spork, a spoon and fork in one, adequate at neither. A vertical model is a specific fork for a specific dish. The Phi family, with its reasoning variants, multimodal variants, and edge-optimised variants, fits the vertical mould. So does the stated direction of "enterprise tuned lineages" that Suleyman describes in his memo.
Vertical specialisation and platform lock-in
In my view, horizontal models don't allow Microsoft to create real lock-in. When you use M365 Copilot, you can take all your prompts, your entire workflow, and migrate to Claude in a week. Got tired of Claude? Migrate to OpenAI in a week. There is no platform stickiness right now. Microsoft is coming out with something that will tie customers to Microsoft so tightly they won't be able to unstick themselves. Quality is part of it, but the deeper driver is a burning desire to lock the customer in.
Executing agents and security as the real motivation
Alexander Yashukov observes that the primary driver may be executing agents and security. Microsoft Research published UFO, a UI-focused agent that controls Windows applications via keyboard and mouse rather than through a browser plugin, in February 2024. By October 2025, Suleyman was describing this publicly at the Paley Summit: "Copilot on Windows can take control of your mouse and keyboard if you've granted permission... It will use your operating system for you." By March 2026, Computer-Using Agents shipped in Copilot Studio. In Yashukov's view, the computer-using agent approach is fundamentally different from browser-based automation: it's a remote-desktop model where the agent is the operator.
What agents mean for the browser-based web
The computer-using agent approach is an intermediate solution. MCP (Model Context Protocol) is already everywhere. Services are writing their own MCP servers for bookings, transactions, and other workflows. Within roughly a decade, much of what we currently call the World Wide Web will change entirely. The browser-based web, built around human eyeballs and ad-supported content, is a transitional form. If agents buy, book, and transact for users, users stop visiting websites.
Creator economics is done. Why would anyone visit a website, wade through hundreds of lines of advertising to find a five-line pancake recipe, when they can just ask ChatGPT? All that contextual advertising wrapped around a five-line recipe? You can say goodbye to it. And the problem isn't even that people aren't supporting creators. The problem is that the creators themselves made the experience intolerable. If it had been two or three banners and some in-text ads, nobody would have minded. But they pushed it so far that the reader's experience became unbearable.
As Alexander Yashukov adds: the creators themselves started actively using AI to generate content. Why would you pay for something they generated with AI when you can generate it yourself?
In 2026 people are still publishing articles that are just unedited ChatGPT drafts, no fact-checking, no style-checking. Meanwhile, a properly produced AI-assisted article takes just as long as a manual one. The difference is whether you treat AI as a production tool with a rigorous QA pipeline, or as a content printer.
The open question
In a closed-door interview on 24 March 2026, one week after the reorg announcement, I posed a question that sits behind all of the above. It is not about whether the reorganisation was the right move, or whether Suleyman's models will reach the frontier, or whether Copilot's adoption will scale. It is more fundamental than that.
Should we continue seeing AI as a separate vector in Microsoft's business portfolio, asking "how much is AI contributing to its growth?" Or is it time to switch the mindset to recognising that Microsoft is now systematically an AI company, where AI is an integral part of every significant offering?
M365 Copilot, GitHub Copilot, Security Copilot, Agent 365, Copilot Studio, Phi on device, Azure as backbone, E7 at $99 bundling it all. At what point does "AI strategy" stop being a separate question?
Consider the evidence. M365 Copilot is AI embedded in productivity. GitHub Copilot is AI embedded in development. Security Copilot is AI embedded in security operations. Agent 365 is AI embedded in business process automation. Copilot Studio is AI embedded in application development. The Phi models run on Copilot+ PCs, bringing AI to the device level. Azure is the compute backbone for all of it. The M365 E7 bundle at $99 per user per month packages Copilot, identity tools, and agent management into one subscription, making AI inseparable from the productivity suite itself.
At some point in the near future, asking "what is Microsoft's AI strategy?" may become the wrong question entirely. The question for readers is whether what we have covered in this article still looks like a company that is adding AI to its existing products, or whether it has become a company that is rebuilding its products around AI.
Microsoft often changes their licensing and pricing rules. If you need help, get in touch. We don't sell Microsoft licences or cloud services, so our advice is independent.