According to analyst firm Omdia, AI research firm Epoch AI has estimated the cost of owning a typical 1 GW AI data center: building one requires an initial capital expenditure of $38 billion, plus roughly $0.9 billion per year in operating costs.
If that capital expenditure is amortized evenly over the assets' useful lives, the data center's annualized total cost comes to $8.5 billion. Servers (assuming the build-out uses NVIDIA GB200 NVL72 systems throughout) account for the lion's share, with annual depreciation of about $5 billion, roughly 60% of the annualized total.
By comparison, day-to-day operating costs barely register on the books: even the largest line item, energy, comes to only about $0.6 billion per year.
The calculation is highly sensitive to hardware depreciation periods. Epoch currently assumes a 5-year life for IT equipment and a 14-year life for the data center facility. Shortening the IT equipment lifecycle to 3 years pushes the annualized total cost up to $12 billion; extending it to 7 years brings it down to $7 billion.
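As a rough consistency check, the sensitivity figures above can be reproduced with simple straight-line depreciation. The IT-hardware capex and the "everything else" annual cost in this sketch are not Epoch's published breakdown; they are assumptions backed out of the article's own numbers.

```python
# Sketch of the depreciation-sensitivity arithmetic (illustrative
# assumptions, not Epoch AI's official cost breakdown).

def annualized_cost(it_capex, it_life_years, other_annual):
    """Straight-line IT depreciation plus all other annualized costs ($B/yr)."""
    return it_capex / it_life_years + other_annual

# The $12B (3-yr life) vs $7B (7-yr life) spread implies the IT capex
# being depreciated: it_capex * (1/3 - 1/7) = $5B  ->  about $26B.
IT_CAPEX = 5 / (1 / 3 - 1 / 7)     # ~26.25 ($B), inferred assumption

# Everything else (facility amortization, opex, etc.) is backed out so
# the 5-year baseline matches the article's $8.5B annualized total.
OTHER_ANNUAL = 8.5 - IT_CAPEX / 5  # ~3.25 ($B/yr), inferred assumption

for life in (3, 5, 7):
    total = annualized_cost(IT_CAPEX, life, OTHER_ANNUAL)
    print(f"{life}-year IT life: ${total:.1f}B per year")
```

Running this reproduces the article's three scenarios ($12.0B, $8.5B, and $7.0B per year), which suggests the sensitivity figures were computed by varying only the IT depreciation period while holding everything else fixed.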
