NVIDIA to Spend $26 Billion Over Five Years Building Open-Source AI Models: The Strongest Open Models All Come From China, and NVIDIA's Moat Is at Risk If They Run on Huawei Chips

According to 1M AI News monitoring, the core driver behind NVIDIA's in-house open-source models is an imminent threat: most of the world's top open-weight models come from China—DeepSeek, Alibaba's Qwen, Moonshot AI, Z.ai, MiniMax—and numerous overseas startups and researchers have built applications on these Chinese models. More crucially, industry rumors are circulating that DeepSeek is about to release a new model trained entirely on Huawei chips. If true, this would prove that top models can be trained without NVIDIA, potentially prompting more developers to try Huawei hardware and directly challenging NVIDIA's chip monopoly. NVIDIA's response is to develop in-house open-source models deeply optimized for its own hardware, keeping developers within the NVIDIA ecosystem.

WIRED found in NVIDIA's 2025 SEC filings that the company plans to invest $26 billion over the next five years in building open-weight AI models, a move confirmed by executives and not previously disclosed. Bryan Catanzaro, NVIDIA's VP of applied deep learning research, said: "Helping the ecosystem grow serves our interests. We are a U.S. company, but we collaborate with companies worldwide, because a diverse and robust global ecosystem serves our interests."

On the same day, NVIDIA released its most powerful open-weight model to date, Nemotron 3 Super, with 1.28 trillion parameters, comparable to the largest version of OpenAI's GPT-OSS. The company claims the model scored 37 on the AI Index (a composite score across 10 benchmark tests), surpassing GPT-OSS's 33 but trailing several Chinese models. NVIDIA also said Nemotron 3 Super ranked first on PinchBench, a new benchmark evaluating how well models control OpenClaw. In addition, the company has completed pre-training for a 550-billion-parameter model. Nathan Lambert, head of the Allen Institute for AI's ATOM project, declared himself a "loyal fan of Nemotron" and called on the U.S. government to fund open-source models as well.
