
$AMD(AMD.US) hits a new all-time high, but what's truly undervalued is the reversal of the CPU's role in AI.
The market's signal is already unmistakable.
Advanced Micro Devices rose nearly 7% in a single day, closing at $275.91, a new all-time high. But what interests me more than the gain itself is the logic driving it: this isn't a wave of sentiment, but a change in the structure of demand.
AMD's data center business head Forrest Norrod has described the past 6–9 months as a period of "unprecedented demand growth," with no signs of slowing. Statements like that are rare in the semiconductor cycle.
To be more specific:
EPYC "Turin" server CPU capacity is in short supply
Annual capacity is almost sold out
Lead times for high-end models are extended to 8–10 weeks
This points to one thing: these aren't orders testing the waters; demand is already pressing against capacity limits.
The real variable comes from the evolution of AI itself.
SemiAnalysis chief analyst Dylan Patel mentioned a key judgment: AI workloads are shifting from "text generation" to "agents + reinforcement learning."
This might seem like just a model change, but its impact on hardware is structural.
In the past, AI demand was concentrated almost entirely in GPUs, a single explosive point.
Now, it increasingly resembles a complete system:
Task scheduling
Data preprocessing
Multi-model collaboration
Real-time decision-making
These tasks don't map well to GPUs; they fall to CPUs.
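As a stylized illustration of why agent-style workloads lean on the CPU (all function names and the loop itself are hypothetical, for illustration only), note that in an agent pipeline only the model call itself runs on the accelerator; scheduling, preprocessing, and decision-making are all host-side work:

```python
# Stylized sketch of an agent-style AI pipeline (hypothetical, illustrative only).
# Only model_call() represents GPU-bound work; every other step is CPU-bound.

def schedule(tasks):               # CPU: decide execution order
    return sorted(tasks)

def preprocess(task):              # CPU: parse, tokenize, fetch context
    return {"prompt": f"plan: {task}"}

def model_call(payload):           # GPU: the one accelerator-bound step (stubbed)
    return payload["prompt"].upper()

def decide(results):               # CPU: real-time decision on what to do next
    return [r for r in results if r]

def run_agent(tasks):
    cpu_steps, gpu_steps = 0, 0
    results = []
    for task in schedule(tasks):
        cpu_steps += 1                       # scheduling
        payload = preprocess(task)
        cpu_steps += 1                       # preprocessing
        results.append(model_call(payload))
        gpu_steps += 1                       # inference
        cpu_steps += 1                       # per-step decision bookkeeping
    return decide(results), cpu_steps, gpu_steps

results, cpu_steps, gpu_steps = run_agent(["a", "b"])
```

Even in this toy loop, CPU-side steps outnumber GPU calls three to one, which is the structural point: more orchestration per unit of inference.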
This also explains another key data point.
TrendForce's estimates show:
Current AI data center CPU : GPU ratio ≈ 1:4 to 1:8
But in the era of agent AI, this ratio may converge to 1:1 to 1:2
This isn't a minor adjustment; it's a "flip" in the demand structure.
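A rough back-of-the-envelope shows the scale of that flip. Assuming a hypothetical fixed fleet of 8,000 GPUs and taking representative points inside TrendForce's ranges (1:6 today, 1:1.5 in the agent era), the implied CPU demand per fleet roughly quadruples:

```python
# Back-of-the-envelope: implied CPU demand for a fixed GPU fleet under the
# old vs. projected CPU:GPU ratios. All numbers are illustrative assumptions.

gpus = 8_000                  # hypothetical accelerator fleet size

old_ratio = 1 / 6             # representative point in the 1:4 .. 1:8 range
new_ratio = 1 / 1.5           # representative point in the 1:1 .. 1:2 range

old_cpus = gpus * old_ratio   # CPUs implied by today's mix
new_cpus = gpus * new_ratio   # CPUs implied by the agent-era mix

multiplier = new_cpus / old_cpus
print(f"CPU demand multiplier: {multiplier:.1f}x")  # → 4.0x
```

The exact multiplier depends on where within each range the ratio lands, but any convergence toward 1:1 implies a multi-fold step-up in CPU units per data center, not an incremental one.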
In other words:
In the past, the CPU was a supporting player; now it's starting to become a core resource again.
And AMD is positioned right at the center of this change.
Its chiplet architecture gives it advantages in scaling core counts, controlling costs, and ramping production. This is also why the "Turin" generation is in short supply.
But more important is the timing.
The bottleneck for GPUs lies in advanced processes and packaging, while the bottleneck for CPUs is starting to become "overall system demand."
This means that as AI moves from training to deployment, and from models to applications, CPU demand will surface earlier than the market expects.
So reading this rally as merely "AI lifting semiconductors" understates what's happening.
A closer understanding is:
AI is reallocating computing power weightings, and the CPU is a component being repriced.
Next, I'll be watching two signals:
Whether EPYC lead times continue to extend
And whether cloud providers start increasing their CPU procurement share
If both happen simultaneously, it indicates this isn't a short-term shortage, but a structural change that has already taken hold.
Which judgment are you leaning towards?
Is this a short-cycle supply-demand mismatch, or the long-term return of the CPU in the AI era?
The copyright of this article belongs to the original author/organization.
The views expressed herein are solely those of the author and do not reflect the stance of the platform. The content is intended for investment reference purposes only and shall not be considered as investment advice. Please contact us if you have any questions or suggestions regarding the content services provided by the platform.

