The world’s biggest social media company had begun a small deployment of the chip and planned to ramp up production for wide-scale use if the test went well, the sources said.
The push to develop in-house chips is part of a long-term plan at Meta to bring down its mammoth infrastructure costs as the company places expensive bets on AI tools to drive growth.
Meta, which also owns Instagram and WhatsApp, has forecast total 2025 expenses of US$114 billion to US$119 billion, including up to US$65 billion in capital expenditure largely driven by spending on AI infrastructure.
One of the sources said Meta’s new training chip was a dedicated accelerator, meaning it was designed to handle only AI-specific tasks. This can make it more power-efficient than the general-purpose graphics processing units (GPUs) typically used for AI workloads.