Catenaa, Wednesday, March 11, 2026 - Meta has debuted four new AI chips as part of its Meta Training and Inference Accelerator (MTIA) family of in-house processors, providing competition to Nvidia and AMD.
The new chips are part of the social media giant’s strategy of combining commercial GPUs from Nvidia and AMD with its own silicon, both to meet its artificial intelligence demands and to avoid becoming overly reliant on any one vendor.
The chips, the MTIA 300, MTIA 400, MTIA 450, and MTIA 500, are designed for different parts of Meta’s AI business, ranging from its ranking and recommendations (R&R) models to high-end inferencing.
The MTIA 400 is meant for generative AI as well as R&R workloads and, according to Meta, can be strung together into a larger server rack of 72 chips, a similar idea to Nvidia’s NVL72 or AMD’s Helios racks.
Meta claims the MTIA 400 is its first chip to deliver cost savings alongside “raw performance competitive with leading commercial products.” The company doesn’t specify which products it means, but the only major commercial chips comparable to the MTIA 400 come from Nvidia and AMD.
Interestingly, Meta recently signed multiyear, multigenerational deals for chips from both companies.
The MTIA 450 processor goes a step beyond the MTIA 400 with faster high-bandwidth memory, while the MTIA 500 adds more memory at even higher speeds.
The company says it has already begun deploying some of the chips and plans to roll out the rest in 2026 or 2027. Importantly for Meta, they all share the same basic infrastructure, so the company can swap chips out as they need to be upgraded.
