Amazon making its own AI chip to take on NVIDIA
Amazon is working on its next-generation artificial intelligence (AI) chip, the Trainium 2. The project is being spearheaded by Annapurna Labs, an Israeli chip start-up that Amazon purchased in 2015 for $350 million. The new chip supports Amazon's plan to lessen its reliance on NVIDIA and to bring down operational costs for itself and its Amazon Web Services (AWS) customers.
Trainium 2: A step toward self-reliance and cost reduction
The development of the Trainium 2 marks a major step forward for Amazon's cloud computing division, AWS. The company is making a big investment in custom-designed processors to enhance the efficiency of its massive data center network. The new chip, designed for training large-scale AI models, is tipped to be unveiled next month as part of Amazon's AI chip lineup.
Trainium 2 under testing by leading companies
The Trainium 2 is already being tested by several leading companies, including Anthropic, an Amazon-backed AI start-up. The chip poses a direct challenge to NVIDIA's dominance in the AI processor market. Dave Brown, Vice President of Compute and Networking Services at AWS, said that while AWS aims to be the best place to run NVIDIA chips, the company also believes it is healthy for customers to have an alternative.
Amazon's Inferentia AI chips offer cost-effective solutions
Amazon has claimed that its "Inferentia" AI chips, which are used to generate responses from AI models, are already 40% cheaper to run than similar solutions. The move highlights the company's focus on offering cost-effective alternatives in the AI processor space.
AWS to offer free computing power to researchers
In a bid to promote wider adoption of its AI processors among the research community, AWS has announced plans to provide free computing power to researchers using its custom-designed AI chips. The firm will offer researchers credits worth an estimated $110 million for access to its cloud data centers and use of "Trainium," its specialized chip for developing AI models.