Original Title: If you don't understand AI by the end of this, the next decade will confuse you
Original Author: Anish Moonka
Translation: Peggy, BlockBeats
Editor's Note: When people talk about AI, attention often focuses on the most visible areas: chatbots, AI assistants, and various new applications. However, behind these products, a deeper industrial restructuring is taking place. From power, chips to data centers, and from models to applications, AI is actually a technology stack composed of multiple layers of infrastructure, and the flow of capital and profits is far more complex than meets the eye.
This article, from the perspective of the "AI Five-Layer Structure," systematically sorts out this value chain: why billions of dollars are flowing into energy, chips, and cloud infrastructure; why model companies are burning a lot of money while experiencing rapid growth; and in this technological revolution, where the real value may initially be concentrated.
By comparing AI to historical cycles such as the electricity revolution and the build-out of internet infrastructure, the author attempts to answer a key question: in a technological wave that may reshape the global industrial structure, where is capital flowing, and how can ordinary people take part?
The following is the original text:
Most people think AI is just a chatbot.
I can understand this idea. You open ChatGPT, ask it to help you revise an email, and it can do it instantly. It feels like magic. So you close the page, thinking you already understand what AI is all about. But that's like swiping a Visa credit card at a restaurant and then thinking you understand how Visa makes money. You just used the product but didn't see the system behind it.
For most of last year, I was trying to figure out where the real profits of AI were actually going. And a somewhat embarrassing fact is: it took me a long time to realize that I had been looking at the wrong level. I was fixated on ChatGPT, Claude, Gemini – the things you can directly interact with.
Meanwhile, $700 billion was quietly flowing into another set of infrastructure that I couldn't even name: chips I've never heard of, packaging tech acronyms that sound made up, cooling systems, power plants. In Texas, Iowa, and Hyderabad, massive amounts of concrete are being poured for data center construction.
A year ago, almost no one around me was talking about these things. And now, everyone has started talking.
This article will be quite long. If you don't have time to finish reading it now, you can bookmark it and read it later.
I want to take you through the complete AI value chain: from the electricity powering the data centers to the application on your phone.
And I will explain it in a way you can understand even if you have never read a public company's annual report in your life. I will explain every term, back every judgment with real data, and be honest about the areas where I am still unsure, because there are some.
So let's get started.
AI is infrastructure. Just like the internet, just like electricity, it needs factories. — Jensen Huang
Most people understand AI this way: a smart computer answering questions.
That's like saying the internet is "a place to watch videos." Technically correct but completely missing the point.
In January 2026, at the World Economic Forum, Jensen Huang described AI as a five-layer system:
· Energy
· Chips
· Cloud
· Models
· Applications
He referred to this entire system as "the largest infrastructure project in human history."
First, think about this term: Infrastructure.
Roads. Power grids. Water supply systems. These things keep modern civilization running, but people usually only notice them when they go wrong.
AI is becoming the same thing—invisible, indispensable, and extremely costly to build. I refer to this entire structure as the AI Stack. It consists of five layers, one on top of the other, with each layer supporting the one above it, and money flows bidirectionally between these layers.
The simplest version I can provide is this:
· Energy: you need electricity to power the computers, and a lot of it.
· Chips: you need processors specifically designed for this computation. Not the CPU in your laptop.
· Cloud: you need massive warehouse-scale data centers filled with these chips, interconnected by ultra-high-speed networking.
· Models: you need the actual AI software, a "smart brain" that learns patterns from data.
· Applications: you need products that people actually use, like ChatGPT, Google Search, or a bank's anti-fraud system.
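For readers who think in code, the five layers above can be written down as a tiny data structure. The layer names and one-line roles are taken directly from the list; nothing here is new information:

```python
# The five-layer AI stack described above, as a simple data structure.
# Layer 1 is the foundation; layer 5 is what users actually see.
AI_STACK = [
    ("Energy",       "electricity to power the computers"),
    ("Chips",        "processors specialized for AI computation"),
    ("Cloud",        "warehouse-scale data centers full of those chips"),
    ("Models",       "the AI software that learns patterns from data"),
    ("Applications", "products people actually use (e.g. ChatGPT)"),
]

for depth, (name, role) in enumerate(AI_STACK, start=1):
    print(f"Layer {depth}: {name} -- {role}")
```

Keeping the layers ordered like this matters for the rest of the article: money entering at layer 5 flows down toward layer 1.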
Any AI discussion that only talks about the fifth layer (application layer) ignores a full 80% of reality. And if you're an investor, entrepreneur, or just someone trying to understand where the world is headed, the crucial point is that money does not flow evenly across these five layers. It concentrates, compounds, and flows to very few key nodes.
And today, that money is flowing into places most people aren't paying attention to at all.

People's attention is almost always on the application layer. ChatGPT, GitHub Copilot, Claude, Perplexity.
These are products you can directly use, so it's easy to think that's the AI story.
But most people miss one thing. By 2026, the world's top four cloud computing companies (Amazon, Microsoft, Google, Meta) are expected to spend between $650 billion and $700 billion in capital expenditures (CapEx) in a single year.
That's for one year for all four companies combined.
This number is roughly equivalent to Switzerland's GDP for a whole year. And around 75%, about $450 billion, will directly go into AI infrastructure.
Not chatbots, not applications. But the buildings, chips, fiber optics and networking, cooling systems—things that hardly anyone talks about at cocktail parties. This precisely shows where the money is.
Because think about it, before anyone can use ChatGPT, someone had to do one thing first: build a shopping mall-sized data center, then install tens of thousands of specialized processors in it, connect them with networking equipment worth more than most companies' market caps, and power the whole system with enough electricity to supply a small city. And this needs to run like that every single day.
This is the first to third layer: energy, chips, cloud infrastructure, all of which are invisible layers and where real deployment of massive capital takes place.
Someone might ask: "What about OpenAI? Haven't they already made tens of billions of dollars?"
Indeed.
By the end of 2025, OpenAI's Annual Recurring Revenue (ARR) had reached $20 billion. A year before, it was $6 billion, and just a year before that, it was only $2 billion.
That is 10x growth in two years; very few companies in commercial history have grown revenue this fast at this scale.
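Those growth figures are easy to sanity-check. A minimal sketch, using only the ARR numbers quoted above (treated as approximate):

```python
# Back-of-the-envelope check on OpenAI's reported ARR trajectory.
# Figures (in $B) are the ones cited above; treat them as approximate.
arr = {2023: 2.0, 2024: 6.0, 2025: 20.0}  # year -> ARR in $B

# Year-over-year growth multiples
yoy = {y: arr[y] / arr[y - 1] for y in (2024, 2025)}
print(yoy)  # 2024: 3x, 2025: ~3.3x

# Compound annual growth rate over the two-year span
cagr = (arr[2025] / arr[2023]) ** (1 / 2) - 1
print(f"2-year CAGR: {cagr:.0%}")  # ~216% per year
```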
But the problem lies in the equally staggering costs.
· 2025: OpenAI is estimated to burn through $9 billion in cash
· 2026: Projected burn rate of $17 billion
Just the cost of inference, i.e., when you ask AI a question, the actual cost of running the model:
· 2025: $8.4 billion
· 2026 Projected: $14.1 billion
Based on current forecasts, OpenAI may not achieve positive cash flow until 2029 or 2030.
So the question is: Where is all this money going?
The answer is: flowing down the AI tech stack.
Flow:
· Microsoft Azure (under the agreement, OpenAI pays Microsoft 20% of revenue through 2032)
· Nvidia's GPUs
· Engineering firms building data centers
· And energy companies providing power
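To make one of those flows concrete: applying the 20% revenue-share figure from the agreement to the roughly $20 billion ARR cited earlier gives an implied annual payment to Microsoft. The dollar result below is derived arithmetic, not a reported figure:

```python
# Sketch of one contractual flow named above: OpenAI's revenue share
# owed to Microsoft. The 20% rate and the ~$20B ARR come from the
# article; the product is just arithmetic, not a disclosed number.
arr = 20e9          # OpenAI ARR, end of 2025 (~$20B, cited above)
azure_share = 0.20  # share of revenue owed to Microsoft through 2032

payment = arr * azure_share
print(f"Implied annual payment at that run rate: ${payment / 1e9:.0f}B")
```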
If you stare at this system a bit longer, you'll notice an almost cyclical structure:
· Microsoft invests in OpenAI
· OpenAI uses that money to purchase Azure cloud services
· Azure uses revenue to buy Nvidia chips
· Nvidia reports record profits
Applause all around
Then, the funds continue to flow downward.
There is a crucial structural fact in the AI technology stack:
The vast majority of users are at the top layer (the application layer).
The vast majority of profits are at the bottom layer (the infrastructure layer).
This misalignment between user location and profit location is at the core of the entire AI investment logic.
This is the first law of the AI value chain: revenue is collected at the top, but profit settles at the bottom.

All human problems are essentially engineering problems, and engineering problems can ultimately be solved. —Buckminster Fuller
If you want to truly understand what is happening with AI, you can look back at the history of the electricity revolution from 1880 to 1920.
In 1882, Thomas Edison built the first commercial power station on Pearl Street in Manhattan, New York. At that time, most people thought of electricity as just a novelty, a more "sophisticated" way of lighting. After all, gas lamps worked just fine. Who really needs this thing?
But in just 40 years, electricity completely reshaped nearly every industry: manufacturing, transportation, communication, healthcare, entertainment.
The true winners of this revolution were not the inventors of the light bulb, but those who built the infrastructure: General Electric, Westinghouse Electric, power companies, copper mining companies, engineering firms.
Today, AI is repeating the same pattern, only the pace has been compressed to a few years instead of decades.
Compare these two value chains:
· AI System: AI → Data Centers → Chips → Raw Materials → Energy
· Power System: Electricity → Factories → Machinery → Raw Materials → Coal/Hydro
The two paths are nearly identical. And once again, the primary winners are not at the application layer but at the infrastructure layer.
I call this phenomenon Infrastructure Gravity. Whenever a new computing platform emerges, those who first create wealth are always the "ones selling shovels."
The application will eventually catch up, the application will receive all the media attention. But the infrastructure takes away most of the profit.
For example, in fiscal year 2026 (ending January 2026), Nvidia's annual revenue was $215.9 billion, up 65% year over year. Of that, the data center business alone generated $62.3 billion in the most recent quarter, up 75% year over year. This business now accounts for 91% of Nvidia's total revenue.
In other words, a single company booked roughly $68 billion of revenue in one quarter, with over 90% of it coming from a single business line.
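Those two Nvidia figures can be cross-checked against each other: if the data-center segment's $62.3 billion was about 91% of the quarter, the implied total is close to the roughly $68 billion mentioned:

```python
# Cross-checking the Nvidia figures quoted above: data-center segment
# revenue divided by its share of total revenue gives total revenue.
dc_revenue = 62.3  # $B, data-center segment, most recent quarter
dc_share = 0.91    # data-center share of total quarterly revenue

total_quarter = dc_revenue / dc_share
print(f"Implied total quarterly revenue: ${total_quarter:.1f}B")  # ~$68.5B
```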
Now, look at chip manufacturing. In 2025, TSMC held about 70% of the global semiconductor foundry market, with sales of $122.5 billion. Second-place Samsung Electronics held just 7.2%. That degree of concentration makes even Standard Oil's heyday look modest.
Infrastructure always wins first. The real question is, how long will this window of opportunity last?
Ask anyone what the Internet revolution is, and they will say Google, Amazon, Facebook.
But if you ask where the earliest money was made, the answer is actually Cisco Systems, Corning, the companies that laid the fiber optic networks.
The same story, just in a different era.
The stock market is a machine to transfer money from the impatient to the patient. — Charlie Munger
I have to admit one thing. When I first started looking at AI from an investor's perspective, I also made the same mistake as most people; I was looking at the application layer. I saw the growth of ChatGPT, saw Anthropic raising billions of dollars. So I thought, AI companies will win, let's invest in AI companies.
Later on, three things changed my view, and they happened in sequence.
First, I found that almost all "AI companies" are burning cash at a furious rate: OpenAI, Anthropic, Mistral AI, xAI. All of them are spending money far faster than they earn it. The reason is not a bad business model; the cost of compute is structural.
Every time you ask AI a question, the system must perform real computation. Computation requires a GPU, and GPUs require electricity. The more powerful the model, the higher the computing power requirements, so the operating cost will only get higher and higher.
In other words: the so-called winners in AI are actually its biggest spenders.
Second, I noticed that the infrastructure companies are printing money. Nvidia's gross margin is close to 75%, and TSMC is raising prices even as it ramps up production, because demand far exceeds supply.
These companies do not have a "when will we be profitable" problem. Their problem is that they cannot build fast enough. Those are two completely different problems.
Third, I realized that I had been thinking about AI like a consumer.
Consumers see the application. Engineers see the tech stack. Once you see the entire tech stack, you can no longer ignore it.
Every AI release will turn into a Capital Expenditure (CapEx) announcement. Every model upgrade will turn into new chip orders. Every new feature will turn into a new data center lease.
The entire industry is starting to look like concentric circles: the closer to the center, the more concentrated the profits.
Maybe you are: a software engineer focusing on AI models, a retail investor who bought Nvidia at $300, or someone in India observing this revolution from afar (perhaps you are all three at the same time—now, that's the most interesting position.)
Regardless of where you are positioned, the principle is the same. Consumers see the product, investors see the supply chain. And the best investors see what has already formed in the supply chain before the product is even released.
The article is already long, so I will pick up the pace.
Below is the structure of each layer of the AI Stack, the main participants, and potential opportunities.
AI data centers are extremely power-hungry. A single large model training run could consume a small town's one-year electricity usage. By 2026, global AI data centers are expected to consume around 90 terawatt-hours of electricity per year. This is approximately a tenfold increase from 2022.
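The two figures quoted above (about 90 TWh per year by 2026, roughly tenfold 2022's level) imply a 2022 baseline near 9 TWh and a compound growth rate of nearly 80% per year. A quick check:

```python
# Implied growth behind the data-center power figures cited above.
twh_2026 = 90.0                        # ~90 TWh/yr expected by 2026
growth_multiple = 10.0                 # "tenfold increase from 2022"
twh_2022 = twh_2026 / growth_multiple  # implied ~9 TWh baseline

years = 2026 - 2022
cagr = growth_multiple ** (1 / years) - 1
print(f"Implied growth: {cagr:.0%} per year")  # ~78% per year
```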
This brings a very simple investment thesis: whoever can provide stable power to data centers will benefit. This includes nuclear power companies, natural gas companies, renewable energy companies, grid companies, especially energy companies near data center clusters.
Jensen Huang said in October 2025 that data centers may be able to generate their own power faster than they can get it from the grid. In fact, many tech companies have already built power generation facilities directly next to their data centers, bypassing the grid entirely.
This point really shocked me. These tech companies are turning into their own power companies.
The beneficiaries include utility companies, independent power producers, power equipment manufacturers (transformers, switchgear, etc.). In Asia, for example, in India, as hyperscaler data centers expand, power equipment and transmission companies will also benefit.
This is the layer most familiar to the public because of Nvidia. But in reality, this layer is far more complex than just one company.
The chip layer can be further divided into several sublayers:
Design Companies
· Nvidia (GPU), AMD, Broadcom, Qualcomm
· And an increasing number of cloud giants' in-house chip designs: Google TPU, Amazon Trainium, Microsoft Maia
Manufacturing Companies
Almost monopolized by TSMC, with a market share of around 70%, followed by Samsung Electronics (7.2%). Intel is trying to rebuild its foundry business, but this will take several years.
Equipment Companies
The machines that manufacture chips come from ASML (the only company producing EUV lithography machines), as well as Applied Materials, Lam Research, and Tokyo Electron.
Memory Companies
AI models require a large amount of high-bandwidth memory (HBM). Key players: SK Hynix, Samsung, Micron Technology
System Integration Technology
Advanced system integration technologies (such as TSMC's CoWoS) have become a new bottleneck.
The most astonishing aspect of this layer is actually concentration:
· Nvidia: Around 92% AI GPU market share
· TSMC: Manufactures nearly all AI chips
· ASML: Sole EUV equipment supplier
One company designs. One company manufactures. One company produces manufacturing machines. This concentration is both an investment opportunity and a geopolitical risk.
This is where the chips truly come alive.
Huge warehouse-like facilities:
· Thousands of servers
· High-speed network connections
· Liquid cooling systems (which have gone from optional to standard)
Market dominated by three major cloud players:
· Amazon Web Services (31%)
· Microsoft Azure (24%)
· Google Cloud (11%)
Oracle is also rapidly expanding, with plans for $50 billion in capital expenditure by 2026. But this layer is far more than just hyperscalers. For example:
· Foxconn assembles 40% of AI servers
· Arista Networks provides network equipment
· Credo Technology (stock price up 117% in 2025)
· Vertiv provides liquid cooling
Data center real estate firms:
· Equinix
· Digital Realty
Even concrete suppliers play a role. There is a complete supply chain at every layer.
According to Bank of America's estimate, by 2026, hyperscalers will dedicate 90% of their operating cash flow to capital spending. In 2025, this ratio was 65%.
Morgan Stanley forecasts that these companies will issue over $400 billion in debt this year to build data centers. In 2025, this figure was $165 billion.
When I first read this number, I stopped in my tracks. More than $400 billion in debt just to build more warehouses filled with computers.
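Putting the financing figures from the last few paragraphs side by side (all inputs are the Bank of America and Morgan Stanley estimates quoted above):

```python
# The hyperscaler financing figures cited above, side by side.
debt_2025 = 165.0  # $B of debt issued for data centers in 2025
debt_2026 = 400.0  # $B forecast for 2026 ("over $400 billion")

# Capex as a share of operating cash flow (Bank of America estimates)
capex_share = {2025: 0.65, 2026: 0.90}

print(f"Debt issuance growth: {debt_2026 / debt_2025:.1f}x")  # ~2.4x
print(f"Capex share of cash flow: {capex_share[2025]:.0%} -> {capex_share[2026]:.0%}")
```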

This layer is the "brain layer," responsible for training and building actual AI models.
Main players include:
· OpenAI (GPT series, annual recurring revenue of over $20 billion)
· Anthropic (Claude, reportedly with early-2026 annualized revenue of around $19 billion)
· Google DeepMind (Gemini)
· Meta AI (Llama, open-source model)
· Mistral AI
· xAI (developing Grok)
This layer fascinated me because it is both the most hyped and the least profitable.
For example:
· OpenAI has seen unprecedented revenue growth, but is still projected to burn $17 billion in cash in 2026.
· Anthropic has also experienced rapid growth but is highly reliant on financing—early 2026 saw a $5 billion funding round with a valuation of about $170 billion.
The issue lies in the structural contradiction of this layer's business model. As models become stronger, they require more computing power, and the cost of computing power often grows faster than revenue.
It's a bit like running a restaurant where each new dish requires more expensive ingredients, but customers expect prices to remain the same.
The result is that profit margins are constantly squeezed.
When will this change? I'm not sure, perhaps not in the near term.
For investors, this layer represents high risk, high reward. The challenge is that most companies are still private.
Therefore, exposure to the public markets mainly comes through two channels:
Cloud Computing Companies
For example, Microsoft holds a significant stake in OpenAI and provides computing power to it through Microsoft Azure.
Chip Companies
Because model training consumes large quantities of their hardware.
This is the layer you see every day: ChatGPT, Google Search powered by Gemini, Microsoft Copilot in Office, banks' AI anti-fraud systems, Netflix's recommendation algorithm, and the AI image enhancement on your phone.
The application layer is the broadest and most crowded layer. Thousands of startups and large companies are competing here. In the long run, it could be the largest layer by market size. Some forecasts suggest that by the early 2030s, the market size of the application layer could surpass $2 trillion.
However, at this stage, this layer also has the thinnest margins and the most uncertain competition.
In this layer, true differentiation comes from data. Companies with unique, proprietary data will build lasting advantages.
For example:
· Salesforce: enterprise CRM data
· Bloomberg: financial market data
· Epic Systems: medical record data
Companies that have this data moat can deeply fine-tune AI models, something that general chatbots cannot do.
For investors, the application layer may ultimately offer the most significant return potential but will also destroy the most capital.
Most AI startups will fail, with only a few survivors achieving exponential growth.
The most likely investment thesis for the next 3 to 5 years is to bet on infrastructure now and on applications later. The smartest funds have already positioned themselves this way.
Companies that will ultimately win at Layer 5 are likely those that have data no one else can access.
Interestingly, many of these companies do not even consider themselves AI companies.


The investor's chief problem—and even his worst enemy—is likely to be himself. —Benjamin Graham
Let's address head-on the most common question. "What about the internet bubble? Isn't this the same thing? Massive infrastructure investment, no profits, everyone caught up in the hype."
It's a good question and deserves a thoughtful answer.
The key difference is that during the internet bubble era, when companies were building infrastructure, the actual demand had not yet materialized. Back then, businesses were feverishly laying fiber optic networks and constructing server farms, but real internet users were still mostly on dial-up.
The result was that the infrastructure was built, but the demand didn't materialize until 5 to 7 years later. During that interim period, many companies went bankrupt.
By 2026, however, the demand for AI is already here. Nvidia's chips are in short supply, TSMC's advanced packaging capacity is fully booked, and cloud computing leasing prices are rising instead of falling. At the same time, OpenAI added 400 million weekly active users between March and October 2025. Models are being used.
Compute power is being consumed. Customers are paying. This does not mean there's no risk. In fact, the risk is significant, and I likely think about it more often than I'm willing to admit even to myself.
There are three particular points worth noting.
In 2026, tech companies will spend over $650 billion on data centers.
If the growth in AI service revenue is not sufficient to support these investments, many companies will face severe margin compression. Even Amazon could see negative free cash flow this year.
And this is Amazon, a company that virtually pioneered the cloud computing business model.
The AI supply chain is highly concentrated.
· TSMC produces around 70% of the world's chips
· ASML is the sole supplier of EUV lithography machines
· Nvidia designs 92% of AI data center GPUs
Any major disruption, whether geopolitical, natural disaster, or shifts in the competitive landscape, could impact the entire AI industry chain.
For example, a major earthquake in Hsinchu, Taiwan, could set back global AI development by years. This idea should be unsettling.
In January 2025, the Chinese AI lab DeepSeek released a model whose performance rivaled state-of-the-art models at a small fraction of the usual training cost.
This challenged a core assumption that more computing power always leads to better AI.
If future open-source and high-efficiency models continue to narrow the gap, the logic of infrastructure investment will be weakened.
I don't think DeepSeek overturns the entire AI investment logic. But it does introduce a previously non-existent variable. And once this variable emerges, it will not disappear.
But I will always come back to a larger framework.
The long-term forecasts provided by consulting firms are as follows: McKinsey & Company estimates that global data center investment will reach $6.7 trillion cumulatively by 2030; PwC estimates that AI will contribute $15.7 trillion to the global GDP by 2030; the International Data Corporation (IDC) estimates that AI-related solutions will have a cumulative economic impact of $22.3 trillion.
Even if these numbers are overestimated by 50%, we are still facing the largest technology-driven economic transformation since the internet. The issue is not the direction, but the scale.
I often hear people say, "I'm skeptical about AI."
Of course, you can be.
You can doubt the model's capabilities, doubt the development timeline, but do not overlook the supply chain structure.
These are two completely different things. One is a healthy, rational skepticism, the other will make you miss opportunities.
In five years, the winners of this cycle will look obvious in hindsight.
History has always been like this. And the key to this game now is: understand the structure before others see it clearly.
Imagine AI as a five-level video game. Each level is a different stage.
Layer 1, Energy, is the Beginner's Guide level. Important, straightforward, and almost foolproof as long as you follow standard procedure. Low risk, stable returns.
Similar to a game's quest NPC: won't die and keeps rewarding.
Layer 2, Chips, is the Boss Battle. The most power, the highest profits. But also the highest technological and geopolitical risk.
Huge rewards, but in Hard mode.
Layer 3, Cloud, is the multiplayer server, where all players are active. The hyperscalers are like server admins, taking a cut of every transaction.
Layer 4, Models, is the PvP arena. Competition is extremely fierce, and innovation happens rapidly.
Most players will be eliminated, only the best-equipped will survive.
Layer 5, Applications, is the open-world map. Infinite possibilities, but no fixed rewards; you have to find the missions yourself.
The true Meta Strategy is simple. You don't need to complete all levels.
Most people will go play Level 5 because it's the most prominent. But the smartest money is currently grinding levels 2 and 3 because, at this stage, that's where the highest returns are.
Where you are in the tech stack determines what you should focus on.
For Non-Techies
You don't need to understand how GPUs work. You just need to know that someone has to manufacture GPUs, someone has to build data centers for them, someone has to power them. And these companies are all publicly traded, so you can read their financial reports.
For Techies
You already know models are getting stronger. But you might be underestimating one thing: the real bottleneck is becoming physical: power, cooling, chip packaging. The AI competition of the next decade may be more an engineering problem than a model-architecture problem.
For Investors
The AI value chain is actually five different trades. Different risks, different time horizons, different winners. Treating AI as an industry is like treating "technology" as an industry in 1998. Huge internal differences.
This situation will not last forever. One day, infrastructure construction will mature, the application layer will integrate, and value will shift back up.
The Internet age was the same. In the end, the real moneymakers were Amazon, Google, Facebook, not just fiber optic companies and server manufacturers.
But AI has not reached that stage yet. It is still the infrastructure stage, the stage of selling shovels.
And right now, the shovel sellers are making enormous money. Those who understand the full technology stack will see the signals before the turning point.
Others will be surprised time and time again, wondering where the money is flowing to.
Ten years from now, understanding the AI technology stack will be as fundamental as understanding a balance sheet.
Remember three things: Understand the technology stack. Draw out the hierarchy. Track the flow of capital.
That's the game.