Meta Platforms Trials In-House Developed AI Training Chip

Wed 12th Mar, 2025

Meta Platforms is reportedly conducting tests on its own custom-built chip designed for artificial intelligence (AI) training, marking a significant step towards reducing reliance on external suppliers like Nvidia. This initiative aims to enhance the efficiency and cost-effectiveness of AI operations within the company.

Last year, Meta began deploying its homegrown AI processors, primarily as inference accelerators. Recent reports indicate that the social media giant is now exploring the use of its first self-developed AI chip for training models as well. If successful, the move could yield substantial energy savings and reduce the company's dependency on third-party hardware.

In April 2024, Meta unveiled the Meta Training and Inference Accelerator (MTIA), a chip designed for both training and running AI models. Initially, the focus was on inference, where an already trained model makes predictions on new data. The latest developments suggest that training capabilities are now being brought to the MTIA line as well.

According to sources cited by Reuters, Meta has started testing a small number of these AI accelerators for training purposes. While specific details about the chips remain undisclosed, they are said to be manufactured by Taiwan Semiconductor Manufacturing Company (TSMC) and are expected to be more energy-efficient than the general-purpose GPUs typically used for AI workloads.

Meta has said that the MTIA, produced on TSMC's 5-nanometer process, draws about 90 watts per chip. In contrast, flagship AI accelerators from Nvidia and AMD consume around 700 to 750 watts per unit. This gap in power draw highlights the potential efficiency advantage of Meta's custom silicon.
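To put those figures in perspective, here is a minimal back-of-the-envelope sketch. It assumes the reported per-chip numbers (90 W for MTIA, roughly 700 W for a competing accelerator) and ignores throughput differences, so it reflects raw power draw only, not real-world efficiency:

```python
# Cited per-chip power figures (assumed to be draw under load)
mtia_watts = 90
competitor_watts = 700  # lower end of the 700-750 W range cited above

# Per-chip power ratio
ratio = competitor_watts / mtia_watts
print(f"Per-chip power ratio: ~{ratio:.1f}x")  # ~7.8x

# Energy saved per chip over a 24-hour day, in kilowatt-hours
hours = 24
saved_kwh = (competitor_watts - mtia_watts) * hours / 1000
print(f"Energy saved per chip per day: {saved_kwh:.2f} kWh")  # 14.64 kWh
```

Scaled across the tens of thousands of accelerators a data center operates, even this crude per-chip difference suggests why power draw matters to Meta; actual savings would depend on performance per watt, which neither company discloses in comparable terms.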

The MTIA chips are intended to enhance the recommendation systems on platforms like Facebook and Instagram, utilizing already trained AI models to deliver personalized content to users. The effectiveness of these recommendations is crucial, as improved relevance can increase user engagement, thereby boosting advertising revenue.
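The inference workload described above can be illustrated with a hypothetical sketch: a trained model has already produced embedding vectors for a user and for candidate posts, and serving reduces to ranking candidates by similarity. All names, shapes, and the dot-product scoring are illustrative assumptions, not a description of Meta's actual system:

```python
import numpy as np

rng = np.random.default_rng(0)
user_embedding = rng.normal(size=16)          # one user's learned vector
post_embeddings = rng.normal(size=(100, 16))  # 100 candidate posts

# Inference step: one dot product per candidate post, no training involved
scores = post_embeddings @ user_embedding

# Rank candidates from highest to lowest score and keep the top five
top5 = np.argsort(scores)[::-1][:5]
print("Top recommended post ids:", top5)
```

Serving this kind of scoring at Facebook/Instagram scale is exactly the inference workload an accelerator like the MTIA is built to run cheaply and repeatedly.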

Although neither Meta nor TSMC has commented on the recent reports, Meta previously indicated plans to deploy its custom chips for AI training by 2026. Initially, these chips will focus on enhancing recommendation systems, with future applications anticipated in generative AI, including Meta's own chatbot technology. Chris Cox, Meta's chief product officer, recently confirmed that the company is actively working on training methodologies for these systems, as well as on how to leverage them for generative AI.

