
Meituan Quietly Releases New Large-scale Model: No Announcement, No Open Source

According to monitoring by Dongcha Beating, Meituan has launched a new model, LongCat-2.0-Preview, on the LongCat API platform. The update log is dated April 20, but Meituan has not yet published any official announcement or technical report. Previously, each model in the LongCat series (Flash-Chat, Flash-Thinking, Flash-Lite, Flash-Omni, Next) was accompanied by an official blog post and technical report and was simultaneously open-sourced on Hugging Face and GitHub. The update log for 2.0-Preview includes no open-source links; the model is available only through the API.

The update log lists three main capabilities: agent-oriented development, with native support for tool invocation, multi-step reasoning, and long-context tasks; proficiency in code generation, workflow automation, and complex command execution; and deep integration with Claude Code, OpenClaw, OpenCode, and Kilo Code.
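The update log does not document the API's request schema. As a sketch only, assuming the platform follows the OpenAI-compatible tool-calling convention common among model API providers (an assumption, not a confirmed detail of LongCat's API), a tool-invocation request for an agent task might look like this; the `run_shell` tool is purely illustrative:

```python
import json

# Hypothetical request body assuming an OpenAI-compatible chat-completions
# schema. LongCat's actual API contract has not been published; the model
# name below is the one reported in the update log.
payload = {
    "model": "LongCat-2.0-Preview",
    "messages": [
        {"role": "user", "content": "List the files changed in the last commit."}
    ],
    "tools": [
        {
            "type": "function",
            "function": {
                "name": "run_shell",  # illustrative tool, not part of any real API
                "description": "Execute a shell command and return its output.",
                "parameters": {
                    "type": "object",
                    "properties": {"command": {"type": "string"}},
                    "required": ["command"],
                },
            },
        }
    ],
}

print(json.dumps(payload)[:40])
```

In this style of API, the model responds with a structured tool call rather than plain text, and the client executes the tool and feeds the result back as a follow-up message, which is the loop that agent-focused workflows build on.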

On April 24, several media outlets, citing insiders, reported more details: the model's total parameter count exceeds one trillion, it uses a MoE architecture, it supports a 1M-token context window, and its parameter count is comparable to that of DeepSeek V4, released the same day. Insiders said that training and inference for LongCat-2.0-Preview were completed entirely on domestic computing power, using 50,000 to 60,000 domestic accelerator cards, making it the largest-scale training task completed on domestic compute to date. During the testing phase, a free daily quota of 10 million tokens is provided.
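The reported trillion-parameter total is consistent with a MoE design, where a router activates only a few experts per token, so the active parameter count per forward pass is far below the total. A minimal toy sketch of top-k expert routing (illustrative dimensions only; nothing here reflects LongCat's actual architecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy sizes; a real trillion-parameter MoE model is vastly larger.
d_model, n_experts, top_k = 8, 4, 2

# Each expert is a simple linear layer; the router scores experts per token.
experts = [rng.normal(size=(d_model, d_model)) for _ in range(n_experts)]
router = rng.normal(size=(d_model, n_experts))

def moe_forward(x):
    """Route each token to its top-k experts and mix their outputs.

    Only k of the n experts run per token, which is why a MoE model's
    active parameter count is a fraction of its total parameter count.
    """
    scores = x @ router                            # (tokens, n_experts)
    top = np.argsort(scores, axis=-1)[:, -top_k:]  # indices of top-k experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = scores[t, top[t]]
        weights = np.exp(sel - sel.max())
        weights /= weights.sum()                   # softmax over selected experts
        for w, e in zip(weights, top[t]):
            out[t] += w * (x[t] @ experts[e])
    return out

tokens = rng.normal(size=(3, d_model))
y = moe_forward(tokens)
print(y.shape)  # (3, 8)
```

With top_k=2 of 4 experts, each token touches only half the expert weights, so compute per token scales with the active, not the total, parameter count.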
