According to Omdia Research, Meta has announced an expanded collaboration with Broadcom to jointly develop the next generation of its MTIA (Meta Training and Inference Accelerator) chip. MTIA is Meta's in-house AI accelerator, designed specifically to optimize inference and recommendation workloads. In March of this year, Meta announced plans to develop and deploy the fourth generation of MTIA chips within two years, and the agreement with Broadcom is intended to accelerate that timeline.
The collaboration is built on Broadcom's XPU platform (a technology suite for customized AI accelerators) and spans chip design, advanced packaging, and network interconnect. Broadcom's Ethernet technology will provide high-bandwidth communication across Meta's rapidly expanding AI compute clusters.
Meta CEO Mark Zuckerberg stated that the initial deployment will exceed 1 GW of in-house chip compute power, with future expansion to multi-gigawatt levels. Broadcom CEO Hock Tan described the agreement as just the "beginning of a sustained multi-generation roadmap."
Given the expanded scale of the collaboration, Hock Tan, who has served on Meta's board for two years, will step down from the board and move into an advisory role, providing guidance on Meta's in-house chip roadmap and infrastructure investments.
