According to 1M AI News monitoring, MiniMax recently uploaded the weights of its 229-billion-parameter model M2.7 to Hugging Face and officially billed it as "open source." However, the license prohibits all commercial use without prior written authorization from MiniMax, and its definition of "commercial use" is broad, covering paid services, commercial APIs, and any commercial use of fine-tuned derivatives. Third-party platforms such as OpenRouter therefore cannot deploy M2.7 for inference services. The license is labeled "Modified-MIT," yet the actual MIT license carries no usage restrictions at all. Both M2 and M2.5 shipped under a standard MIT license, making M2.7 the first time MiniMax has tightened its terms.
The open-source community has pushed back strongly. A popular post on the Hugging Face discussion forum describes M2.7 as "proprietary code with viewable weights, not open source," while a Reddit r/LocalLLaMA thread is titled "MiniMax M2.7 is NOT open source - DOA License." Developer JoeSmith245 compared the terms to the widely criticized advertising clause of the original 4-clause BSD license (the clause the BSD 3-clause license was created to remove), noted the license's incompatibility with the GPL, called M2.7 "a junk release," and said he was switching to Qwen 3.5. Hugging Face's commit history also shows that a Llama-style license permitting commercial use briefly appeared in the repository about six hours before the weights were uploaded, then was replaced by the current non-commercial version.
The AI industry has long blurred the concepts of "public weights" and "open source," but the two differ fundamentally: public weights let you download and inspect model parameters, while open source implies the freedom to use, modify, and commercialize. MiniMax went public on the Hong Kong Stock Exchange in January this year at an IPO valuation of around $6.5 billion, and its market capitalization now stands at approximately $38 billion. The open-source releases of M2 and M2.5 helped build its reputation in the developer community, but the stricter M2.7 license follows a path common among newly public companies: converting developer traffic into API revenue. Zhipu AI's GLM-5 Turbo has likewise gone closed-source, signaling a collective shift by China's leading AI companies from openness toward closedness. The cost of restrictive licensing is already evident, however: Google's Gemma 3 saw adoption far below what its performance warranted, prompting Gemma 4 to revert to Apache.
