
Google's TurboQuant Paper Rebutted Line-by-Line by Pre-Emptive Algorithm Author

According to monitoring by 1M AI News, Dr. Gao Jianyang, a postdoctoral researcher at ETH Zurich (the Swiss Federal Institute of Technology in Zurich), has published an open letter accusing Google's ICLR 2026 paper TurboQuant of three serious problems in its treatment of his prior work, RaBitQ. Gao is the first author of RaBitQ, an algorithm published in 2024 at SIGMOD, a top-tier database conference. Its core method applies a random rotation (a Johnson-Lindenstrauss transform) to vectors before quantizing them, and it has been rigorously proven to achieve an asymptotically optimal error bound. Gao was invited to present the work at a workshop at FOCS, a top theoretical computer science conference.

The three accusations are as follows:

1. Evading methodological similarity: TurboQuant's core method also relies on random rotation, yet the paper categorizes RaBitQ as "grid-based PQ," systematically omitting the direct connection between the two methods. ICLR reviewers independently noted that both methods use random projection and requested further discussion; rather than adding it, the TurboQuant team moved the description of RaBitQ from the main text to the appendix.
2. Misrepresenting theoretical results: without providing evidence, the paper characterizes RaBitQ's theoretical guarantee as "suboptimal," attributing this to "loose analysis." An extended version of the RaBitQ paper has already proven that its error bound matches the asymptotically optimal bound of Alon-Klartag (FOCS 2017).
3. Unfair experimental comparison: TurboQuant benchmarked RaBitQ using a self-translated Python port running single-threaded on one CPU core (multi-threading disabled), while evaluating its own algorithm on an NVIDIA A100 GPU. As a result, RaBitQ was reported as slower by several orders of magnitude, and this mismatched setup was not disclosed in the paper.

Gao Jianyang revealed that TurboQuant's second author, Majid Daliri, had proactively contacted the RaBitQ team in January 2025 to ask for help debugging his Python translation of the RaBitQ C++ code. In a May 2025 email, Daliri personally acknowledged that the experimental setup was unfair and stated that he had informed all co-authors about the RaBitQ team's clarification of the theory. However, these issues were never addressed at any point during the TurboQuant paper's submission, review, and acceptance, or during Google's subsequent large-scale official promotion of it.

The RaBitQ team has posted a public comment on ICLR OpenReview and filed a formal complaint with the ICLR conference chairs and ethics committee. TurboQuant's first author, Amir Zandieh, replied that he was willing to address the second and third issues but declined to add a discussion of the methodological similarity, agreeing to make corrections only after the ICLR 2026 conference concludes. Independent researcher Jonas Matthias Kübler also pointed out on OpenReview that the paper is inconsistent with Google's blog post in both the speed-benchmark framework (PyTorch vs. JAX) and the quantization baseline (FP32). Following Google's large-scale official promotion of TurboQuant, the stock prices of storage-chip companies such as Micron and Western Digital plummeted.
