Article by Sleepy.md
On April 29, 2026, Microsoft, Google, Meta, and Amazon all released their first-quarter earnings on the same day. Add up the capital expenditure guidance provided by these four companies and the combined figure approaches $650 billion, a scale roughly equivalent to the annual GDP of Sweden.
In other words, the world's four wealthiest tech companies are preparing to spend an amount equivalent to a mid-sized developed country's annual economy to buy a ticket to the AGI era.
Now, all eyes are fixed on that ticket to AGI. At this moment, jokingly dubbed the "Global AI Asset Showdown Night," if we shift our gaze slightly away from those grand narratives and look into the inconspicuous corners, we find a covert war raging over physical constraints, capital anxiety, and industrial reorganization, one that has reached a desperate pitch.
What truly controls market sentiment might not be the most profitable companies on paper but rather the enterprise that everyone sees as a "symbol of faith."
April 29 was supposed to be the most significant day of the U.S. stock market's earnings season. However, before public companies submitted their reports, the market experienced an unforeseen stampede. According to Goldman Sachs data, this was the second-worst trading day for AI assets since the beginning of this year.
The trigger was not a major financial underperformance by any listed company but an article in The Wall Street Journal the day before. The article reported that OpenAI had failed to meet its 2025 revenue target, and the goal of reaching over 1 billion weekly active users remained distant. What further unnerved the market was the mention that OpenAI's CFO, Sarah Friar, had internally warned that if revenue growth continued to fall short of expectations, the company might struggle to sustain its commitment to a $600 billion computing power purchase.
A company that is not publicly traded and releases no financial reports, on the strength of a single rumor, sent Oracle's stock plummeting 4%, CoreWeave down 5.8%, and even SoftBank, across the Pacific, crashing 12% in over-the-counter trading.
When the $600 billion computing power commitment collided with sluggish revenue growth, the market suddenly realized that the most dangerous aspect of the AI narrative is not that no one believes in the future but that the future is too costly.

Over the past two years, OpenAI has been Silicon Valley's religion.
Graphics card purchases, data center construction, cloud provider expansion, startup company valuations—many seemingly unrelated decisions are all based on the same underlying assumption: that model capabilities will continue to leap forward, user bases will continue to expand, and AGI will eventually turn today's costly investments into tomorrow's tickets.

The strongest aspect of this logic is its ability to self-reinforce. The more people believe, the higher the valuation; the higher the valuation, the more people dare not disbelieve.
However, around April 29, the market for the first time seriously questioned the cash flow reality behind this belief system. Even OpenAI had to face customer acquisition costs, user retention, revenue growth, and computing power bills.
The most fascinating aspect of the internet age is that growth seems almost infinite.
Write a piece of code, give it to ten million users, and the marginal cost will be spread extremely thin. For the past two decades, Silicon Valley's audacity to use "burn money for growth" to disrupt traditional industries was based on this belief: as long as the network effect is strong enough, scale will swallow up costs.
But in the AI era, the digital world's money printer has been firmly choked by the physical world's cooling pipes.
During the April 29 earnings call, confronted with the astonishing 63% growth of the cloud business (quarterly revenue breaking $20 billion for the first time), Google CEO Sundar Pichai's tone was resigned: "If we could meet demand, cloud revenue could have been higher."

Behind this statement lies the most peculiar business dilemma of the AI era: demand far exceeds supply, but growth is mercilessly constrained by the physical world.
Google holds a massive backlog of $462 billion in cloud orders, almost doubling quarter-over-quarter. AI solution product growth is nearly 800% year-over-year, Gemini Enterprise paid users grew 40% quarter-over-quarter, and API token usage surged from 10 billion per minute to 16 billion.
These numbers, if attributed to any other internet company, would signify celebratory growth. However, within Pichai's words, we can hear a new type of dilemma emerging in the AI era: customers are already in line, money is on the way, but the servers haven't been set up, electricity hasn't been connected, and the cutting-edge chips haven't even been produced in the fab yet.
It's not a lack of demand, but too much demand, so much so that growth is being dragged back into the physical world.
Microsoft is facing a similar dilemma. Azure's growth rate has reached 40%, with AI's annualized revenue surpassing $37 billion. This number was only $13 billion in January 2025, nearly tripling in 15 months.
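The run-rate arithmetic behind those figures is simple to check. The sketch below uses only the numbers cited above; the convention that "annualized revenue" means the latest quarter's AI revenue times four is an assumption, not something the article states:

```python
# Back-of-the-envelope check of the AI revenue growth cited above.
# Assumption: "annualized revenue" = latest quarterly AI revenue x 4.

ai_run_rate_jan_2025 = 13e9   # reported annualized AI revenue, January 2025
ai_run_rate_now      = 37e9   # reported annualized AI revenue this quarter

growth_multiple = ai_run_rate_now / ai_run_rate_jan_2025
print(f"growth over ~15 months: {growth_multiple:.2f}x")   # ~2.85x, i.e. "nearly tripling"

# Implied compound monthly growth rate over those 15 months
monthly_rate = growth_multiple ** (1 / 15) - 1
print(f"implied compound monthly growth: {monthly_rate:.1%}")
```

At roughly 7% compound growth per month, the "nearly tripling" claim holds up, which is exactly why the market treats any slowdown in this curve as a repricing event.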
However, Microsoft's capital expenditure has decreased to $31.9 billion compared to the previous quarter's $37.5 billion, a reduction of nearly $6 billion. Microsoft explained in its financial report that this was due to the "timing of infrastructure development." The implicit meaning of this statement is that money can be approved today, but data centers will not sprout up tomorrow; GPUs can be ordered, but power, land, cooling systems, and construction timelines cannot be expedited by the capital markets.
Just when everyone thought we were sprinting towards the virtual world, what will ultimately determine success or failure is still the oldest heavy asset and the laws of physics.
Computing power is becoming a new form of "land resource," limited in the short term, slow to build, location-dependent, and subject to a first-come, first-served locking of supply. In this land rush, the reason the four tech giants dare to push capital expenditure to the $650 billion level is not because they have all calculated the returns, but because they are more afraid that if they do not hoard this "land" in their hands, they may not even be able to sit at the table tomorrow.
After-hours on April 29, Google's stock price rose 7% after beating expectations and raising capital expenditure, while Meta, which had done much the same, plummeted 7%.
Meta, to be fair, presented a rather impressive report card, with revenue of $56.31 billion, a 33% year-on-year increase, marking the fastest growth since 2021; EPS reached $10.44, far exceeding Wall Street's expectations.
However, Mark Zuckerberg made a taboo move by guiding Meta's 2026 capital expenditure to $125 billion to $145 billion. The better the performance, the more anxious the market becomes. Because what investors are truly worried about is not whether Meta is making money now, but that it is planning to use the cash earned from its advertising business today to support an AI gamble with an unclear path to monetization.
The market's punishment is ruthless, and the difference lies in the granularity of commercial monetization.
Google, Amazon, and Microsoft's AI expenditures can at least be included in a relatively clear ledger.
Google has a $462 billion cloud order backlog, Amazon has AWS's AI annualized revenue, and Microsoft has Copilot's paying users and a strong RPO. Every dollar they burn, while not necessarily immediately profitable, at least Wall Street knows roughly where this money will come back from: enterprise customers, cloud contracts, software subscriptions, and computing power leasing.
This is why the capital markets are willing to continue listening to their stories. The story can be compelling, but the path to monetization cannot be completely invisible.
The trouble with Meta is that it does not have a cloud business to sell to the outside world.
The tens of billions of dollars it has poured in will ultimately materialize through another, even more convoluted path: the Meta AI assistant to increase user stickiness, the recommendation algorithm to boost ad conversion, AI-generated content to lengthen user engagement, and smart glasses and future hardware to become new entry points.
This logic is not flawed; the chain is simply too long. Cloud providers burn money by slotting GPUs into already-signed orders; Meta burns money by slotting GPUs into an advertising-efficiency model that is not yet fully proven. The former can be discounted to present value; the latter can only be taken on faith first. And Wall Street does not have that much patience.
Patience is a luxury in the capital markets. Especially when capital expenditure is pushed to the hundred-billion-dollar level, investors are willing to pay for the future but not indefinitely for the vague.
Even more worrying is the time lag.
Amazon CEO Andy Jassy admitted in a conference call that most of the money invested in 2026 will not yield returns until 2027 or even 2028.
This means that the giants are squeezing today's cash flow into capacity realization two years later. In between are data center construction, chip supply, power access, customer demand, and model iteration. Any deviation in any link will be repriced by the capital markets.
The most dangerous part of the AI arms race lies here: money is spent today, the story is told today, but the answer will not be revealed until two years later.
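The repricing risk created by that two-year lag can be seen with simple discounted-cash-flow arithmetic. The figures below are illustrative assumptions, not numbers from any earnings report:

```python
# Illustrative: why a two-year gap between spend and returns is fragile.
# A dollar of capex today must be justified by cash flows arriving years later;
# a higher discount rate, or a one-year slip, moves the math substantially.

def present_value(cash_flow: float, rate: float, years: float) -> float:
    """Discount a future cash flow back to today."""
    return cash_flow / (1 + rate) ** years

future_return = 1.30  # assumption: $1 of capex is expected to return $1.30
for rate in (0.08, 0.12):
    on_time = present_value(future_return, rate, 2)   # returns arrive in 2 years
    delayed = present_value(future_return, rate, 3)   # ...or slip by one year
    print(f"rate {rate:.0%}: PV on time = {on_time:.3f}, delayed = {delayed:.3f}")
```

Under these assumptions, at a 12% discount rate a single year of delay pushes the present value of that dollar of capex below one, turning a projected profit into a loss, which is what "repriced by the capital markets" looks like in arithmetic.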
AI did not, as many expected two years ago, quickly push search off the table.
When ChatGPT first appeared, the market once believed that search ads would be directly cannibalized by direct answers, and companies like Perplexity were therefore highly anticipated. However, in the April 29 earnings report, Google's data showed a record-high search query volume, with advertising revenue reaching $77.25 billion, a 15% year-on-year increase.
This is more like the "Jevons Paradox" of the AI era. In 1865, British economist William Stanley Jevons found that the improvement in steam engine efficiency did not reduce coal consumption but instead led to a significant increase in coal consumption because the efficiency improvement made more people afford steam engines, thus triggering an overall demand explosion. Similarly, AI has made search more complex and has also led users to ask more questions.
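The Jevons mechanism reduces to two lines of arithmetic. The elasticity figure below is an illustrative assumption, chosen only to show how an efficiency gain can raise total consumption:

```python
# Jevons-style arithmetic: when demand is elastic enough, a fall in the
# per-unit cost of a resource raises total consumption of that resource.
# All numbers are illustrative assumptions.

cost_drop  = 0.5   # per-query compute cost falls 50%
elasticity = 2.0   # assumed: demand scales with cost^(-elasticity)

# With constant elasticity, demand scales as (new_cost / old_cost)^(-elasticity)
demand_multiple = (1 - cost_drop) ** (-elasticity)    # 4x more queries
total_compute   = demand_multiple * (1 - cost_drop)   # 2x total compute consumed
print(f"queries: {demand_multiple:.1f}x, total compute consumed: {total_compute:.1f}x")
```

Whenever the elasticity exceeds 1, cheaper queries mean more total compute burned, not less, which is why cheaper AI inference has so far expanded, rather than cannibalized, search volume.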
This is also where Google is more convincing to the market compared to Meta. It has both the cash flow from its old portal and the new ledger from its cloud business; it can make money from advertising and also from enterprise computing demand. AI has not dismantled its walls; at least for now, it has actually added another layer.
A similar boundary reshaping is also happening in the chip industry. On the same day, the king of mobile chips, Qualcomm, reported revenue of $10.6 billion. During the earnings call, CEO Cristiano Amon made a major announcement: Qualcomm is officially entering the data center market with a custom chip developed in collaboration with a top hyperscale cloud provider, expected to begin shipping later this year.

Qualcomm's main battlefield has always been mobile devices. But as AI workloads begin to shift between the cloud and the edge, it also has to redefine its position.
If all future AI is monopolized by big cloud models, the value of mobile chips will be compressed; if edge AI becomes standard, Qualcomm must prove that it belongs not only in smartphones but also in inference, terminals, and low-power data centers.
Its entry into the data center market is more a defense than an offense.
As AI transitions from being a "luxury of the cloud" to a "standard at the edge," all industry boundaries are starting to blur. Mobile chip companies are trying to enter the data center space, cloud providers are developing their own chips, and chip companies are exploring models. Qualcomm's "defection" is just the tip of the iceberg in this major reshuffle.
In the same AI gold rush, the U.S. stock market has entered a rigorous "cash-in verification period." Even the leader in semiconductor process control and inspection equipment will be repriced the moment it shows a hint of geopolitical or tariff risk. After-hours on April 29, KLA Corporation reported outstanding revenue of $3.415 billion, with non-GAAP EPS of $9.40, surpassing the expected $9.16.
However, the stock price plummeted by 8% in after-hours trading.
The reason was not poor performance but the market's concerns about tariffs and exposure to China. KLA's client list includes many Chinese wafer fabs. Against the backdrop of U.S.-China tech decoupling, this "exposure to China" is like the sword of Damocles hanging over its head. Even with stellar performance, it cannot offset the market's instinctual fear of geopolitical risks.
Meanwhile, in the A-share market, a different language is being spoken.
Performance is important, but often, performance is just the fuel. What really ignites the fire is the narrative, whether you hold the ticket named "Self-Sufficiency Through Domestic Production."
On the evening of April 29, Cambricon delivered a remarkable first-quarter report: revenue of 2.885 billion yuan, a year-on-year increase of 159.56%, breaking the 2 billion mark for the first time in a single quarter in history; net profit of 1.013 billion yuan, a year-on-year growth of 185.04%. The next day, Cambricon's stock price soared, with a total market value exceeding 670 billion yuan, reaching a historic high, and the year-to-date increase has exceeded 62%.

Mu Technologies, which released its financial report the same day, posted revenue of 562 million yuan, up 75% year-on-year, and narrowed its loss from 233 million yuan a year earlier to 98.84 million yuan. It was the first quarterly report from the GPU company, which only went public in December 2025.
For companies positioned in the same AI infrastructure chain, the US stock market and the A-share market produced completely different pricing reactions.
KLA faces the complicated ledger of a globalized supply chain, where performance, orders, tariffs, China exposure, and export controls could all enter the valuation model.
Cambricon and Mu Technologies face a different narrative environment, where the stronger the external restrictions, the easier the strategic value of domestic computing power is amplified. The US market discounts risk, while the A-share market prices in scarcity.
However, just as the market cheered for Cambricon, one detail seemed somewhat glaring.
By the end of 2025, the super bull investor Zhang Jianping still held 6.8149 million shares of Cambricon, worth about 9.2 billion yuan, making him the second-largest individual shareholder of the company. In this quarter's report, he has quietly exited the list of the top ten shareholders.
If roughly estimated based on the first-quarter stock price range, the reduction in holdings corresponds to billions of yuan. The exact price is unknown to the outside world, but what can be confirmed is that the earliest beneficiary of this narrative dividend chose to secure gains before the performance blowout and before the stock price hit a new high.
There are always two types of people in the market: those who pay for the narrative and those who price the narrative.
Zhang Jianping clearly belongs to the latter. He entered Cambricon before it became a consensus among the public and exited after it was written into the grand story of "Leading Domestic Computing Power."
On this $650 billion financial report night, Silicon Valley giants are anxious about the shortage of computing power, Wall Street analysts are agonizing over the timing of cashing out, while the A-share market is busy repricing domestic computing power.
In the same AI gold rush, each market is speaking its own language. The US stock market talks about ROI, the A-share market talks about domestic substitution; cloud companies discuss order backlogs, while Meta talks about ad efficiency; OpenAI has not released financial reports, but still tugs at the nerve of the entire computing power chain.
Everyone believes they have bought the ticket to the AGI era. But no one knows when this performance will end, and where the exit is. The ticket to the AI era is indeed expensive. But even more costly than the ticket is knowing when to exit.