Meta starts testing its 1st in-house chip for AI training
What's the story
Meta, the owner of Facebook, Instagram, and WhatsApp, is testing its first in-house chip for training artificial intelligence (AI) systems.
The move is a key milestone in Meta's plan to design more of its own custom silicon and reduce its reliance on external suppliers such as NVIDIA.
The tech giant has begun a small deployment of the chip and wants to ramp up production for wide-scale use if the test goes well.
Cost reduction
Chip development to reduce infrastructure costs
The shift toward in-house processors is part of Meta's long-term plan to bring down its huge infrastructure costs.
This is especially pertinent as the firm places expensive bets on AI tools to drive growth.
For 2025, the company has forecast total expenses of $114 billion to $119 billion, including up to $65 billion in capital expenditure largely driven by spending on AI infrastructure.
Technical specifications
Processor designed for AI-specific tasks
Meta's latest training chip is a dedicated accelerator, meaning it is designed to handle only AI-specific tasks.
This can make it more power-efficient than the general-purpose graphics processing units (GPUs) typically used for AI workloads.
Meta is working with Taiwan-based TSMC to produce the chip.
The test deployment began after Meta completed its first "tape-out" of the processor, a major milestone in chip development in which a finished design is sent to a fabrication plant for manufacturing.
Future plans
Chip to be used for recommendation systems
The new chip is part of Meta's Meta Training and Inference Accelerator (MTIA) series, which has had a wobbly start.
However, Meta last year started using an MTIA chip to conduct inference for the recommendation systems that determine which content shows up on Facebook and Instagram news feeds.
Meta executives have said they plan to start using their own chips for training by 2026.
Initially, the chips will be used for recommendation systems before being extended to generative AI products such as Meta AI.