Why is Microsoft the best target for "bottom-fishing" in AI right now? Goldman Sachs: AI profit margins will replicate the expansion miracle of the cloud era

Wallstreetcn
2026.01.19 13:50

Goldman Sachs strongly recommends Microsoft after on-site research, arguing that the stock's correction is a good opportunity to position in the leader of the AI era. The report emphasizes that Microsoft is replicating the high-margin conversion path of the cloud era, leveraging its unique advantages from the OpenAI partnership, tight infrastructure control, and accelerating enterprise AI demand, and is positioned to deliver stable long-term profit growth.

Microsoft's stock price has fallen 16% from its peak in mid-December last year and still struggles to recover the key $500 level.

For traders on the sidelines, one core question remains: does this pullback represent a genuine buying opportunity? Goldman Sachs analyst Gabriela Borges, after visiting Microsoft's headquarters in Redmond for in-depth discussions with executives, answered with an emphatic yes.

Goldman Sachs' conclusion is straightforward: among all the tech giants it covers, Microsoft is the "best target" for compounding through the AI product cycle. This judgment rests not on vague vision but on a concrete financial pathway: Goldman Sachs expects Microsoft's earnings per share (EPS) to approach $35 by fiscal year 2030, implying a compound annual growth rate of more than 20%. For investors, this means the current panic may be the ticket to entry, as Microsoft establishes its dominance in the AI era through infrastructure flexibility and unique profit margin advantages.
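As a rough sanity check on the implied growth rate, the arithmetic can be sketched as below. The $13.6 starting figure is an assumption (approximately Microsoft's FY2025 diluted EPS), not a number from the report; only the $35 FY2030 target comes from the article.

```python
# Hypothetical sanity check of the implied compound annual growth rate (CAGR).
# start_eps is an assumed FY2025 baseline, not a figure cited in the article.
start_eps = 13.6    # assumed FY2025 diluted EPS (approximate)
target_eps = 35.0   # Goldman Sachs FY2030 estimate cited in the article
years = 5           # FY2025 -> FY2030

cagr = (target_eps / start_eps) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 21%, consistent with "over 20%"
```

Under that assumed baseline, the $35 target does indeed imply a growth rate slightly above 20% per year.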

Replaying the Miracle of the Cloud Era: The Inevitable Path from High Costs to High Profits

The core logic presented by Microsoft's management to Goldman Sachs is that the current AI cycle bears a striking similarity to the early cloud cycle, so investors need not panic over today's high AI investment costs. The early stages of the cloud transformation were likewise marked by high costs and weak unit economics, but as scale effects, utilization improvements, and engineering efficiencies kicked in, profit margins subsequently expanded significantly. During the cloud transformation, Microsoft set mid-term gross margin targets and kept actual performance within 100 basis points of those targets each year; management argues this precision of execution will repeat in the AI era.

In fact, Microsoft believes its leadership position in the AI cycle is even stronger than it was in the cloud cycle. This confidence stems from an almost fanatical operational discipline and drive for efficiency. One telling example: Microsoft identified inefficiencies in a model that were causing unexpected compute consumption; in the cloud era, resolving such an issue might have taken 2-3 months, whereas under the current urgency of AI operations the team delivered an optimization in a single weekend. As scale expands, Microsoft is confident that the margins of its core cloud business have further room to expand, and that the margins of its AI business will improve over time as well.

The Essence of the Moat: Gross Margin Advantages from OpenAI Collaboration and LLM Abstraction Layer

At the software level, Microsoft's competitive advantage is turning into a tangible financial barrier. Goldman Sachs specifically pointed out that Microsoft's partnership with OpenAI gives it a unique gross margin advantage. Because Microsoft holds intellectual property (IP) rights to the OpenAI models, it does not need to pay additional API fees when using them, effectively eliminating a significant "gross margin tax" and creating a notable edge over other software providers.

In addition, Microsoft is redefining the role of large language models (LLMs). Just as virtual machines abstract hardware and containers abstract operating systems, Microsoft sees LLMs as the next generation of abstraction layer, one that abstracts application logic itself. Future applications will no longer rely on hard-coded rules but will shift toward intent-driven execution. Microsoft's Foundry platform has the opportunity to become the control hub of this layer, responsible for routing, governance, and cost optimization.
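The routing-and-governance role described above can be illustrated with a toy sketch. Everything here is hypothetical: the `ModelSpec` type, the model names, the prices, and the `route_request` function are invented for illustration and bear no relation to the actual Foundry API. The point is only the abstraction-layer idea: callers state constraints, and the platform picks the backend model by policy and cost.

```python
# Illustrative only: a toy "LLM router" showing how an abstraction layer might
# pick a backend model by cost and governance policy. All names and numbers
# here are hypothetical, not the actual Microsoft Foundry API.
from dataclasses import dataclass

@dataclass
class ModelSpec:
    name: str
    cost_per_1k_tokens: float  # assumed pricing, for illustration
    max_context: int           # maximum tokens the model can handle
    approved_for_pii: bool     # stand-in governance flag

CATALOG = [
    ModelSpec("small-efficient", 0.0002, 16_000, True),
    ModelSpec("large-frontier", 0.0050, 128_000, False),
]

def route_request(tokens_needed: int, handles_pii: bool) -> ModelSpec:
    """Pick the cheapest model that satisfies context and governance limits."""
    candidates = [
        m for m in CATALOG
        if m.max_context >= tokens_needed
        and (m.approved_for_pii or not handles_pii)
    ]
    if not candidates:
        raise ValueError("no model satisfies the constraints")
    return min(candidates, key=lambda m: m.cost_per_1k_tokens)

print(route_request(8_000, handles_pii=True).name)    # small-efficient
print(route_request(64_000, handles_pii=False).name)  # large-frontier
```

In a design like this, the application never names a model; it states intent and constraints, which is what lets the platform layer capture the optimization value as token costs fall.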

Although the market is currently overly focused on the absolute cost increase of new models (such as ChatGPT 5.2 vs. 5.1), Microsoft points out that next-generation models are becoming more efficient by design. As token costs fall, value will increasingly be embedded in the platform layer, and the LLM-related cost of goods sold (COGS) will become negligible.

The "Universality" Strategy of Infrastructure: Rejecting Customer-Supplied Chips to Maintain Control

In infrastructure construction, Microsoft has demonstrated strong strategic resolve, rejecting seemingly tempting short-term compromises. Management has made it clear that the "Bring Your Own Chip" (BYOC) model holds neither economic appeal nor strategic advantage for Microsoft. BYOC would isolate the infrastructure stack and undermine the core drivers of cloud profitability—namely, scaled procurement, full-stack integration, and end-to-end optimization.

Microsoft's profit advantage stems from the overall optimization of data centers, power, cooling, networking, and silicon layers, rather than from individual components. Therefore, Microsoft has not reached a BYOC agreement with Anthropic but insists on leveraging its procurement and balance sheet advantages to provide customers with the necessary chip architecture, thereby maintaining overall system efficiency.

At the core of this strategy is "universality." Microsoft implements a "deferred binding" strategy in data center design and supply chains, delaying design and deployment decisions as much as possible to retain flexibility. For example, its new "Fairwater" design features a two-layer structure and 3D rack layout, shortening cable distances to enhance GPU performance. To achieve this capital agility that allows for flexible switching between different workloads and chips, Microsoft is even willing to sacrifice minor performance gains from customized cooling or chip designs. This strategy enables Microsoft to flexibly shift capacity from training tasks to inference tasks based on demand signals, thereby minimizing utilization risk.

Turning Point in Enterprise Adoption: Shifting from "Whether to Use" to "When to Scale"

On the demand side of the market, Goldman Sachs has observed a significant shift in sentiment. Compared to a year ago, conversations with enterprise customers about Copilot have shifted from discussing return on investment (ROI) and "whether" to adopt, to focusing on "when" and "to what extent" to adopt. Budget uncertainties are fading, and customers are no longer holding back budgets to defend against macro risks as they did in the fourth quarter of last year.

Microsoft points out that enterprise AI adoption is already widespread and follows a "land and expand" pattern. Customers typically start with pilot programs of a few hundred licenses and quickly expand to thousands as familiarity grows. On pricing, Microsoft has adopted a value-based strategy: it is currently offering a low-priced commercial SKU at $21/user to broaden top-of-funnel adoption, with the long-term goal of supporting pricing above $30/user through feature expansion.

Although many customers are currently trying to build AI agents in-house (DIY), Microsoft believes that as the complexity of maintaining models, managing updates, and building reliable connectors compounds over time, customers will eventually return to Microsoft's platform solutions. Sales incentives have also been adjusted, shifting from an initial focus on pricing to accelerating customers' "time to value," a sign that Microsoft is moving from pure selling toward deep ecosystem lock-in.