Amazon is poised to roll out its newest artificial intelligence chips as the Big Tech group seeks returns on its multibillion-dollar semiconductor investments and looks to reduce its reliance on market leader Nvidia. Executives at Amazon's cloud computing division are spending big on custom chips in the hope of boosting efficiency across its dozens of data centres, ultimately bringing down its own costs as well as those of Amazon Web Services' customers.

The effort is spearheaded by Annapurna Labs, an Austin-based chip start-up that Amazon acquired in early 2015 for $350mn. Annapurna's latest work is expected to be showcased next month when Amazon announces widespread availability of 'Trainium 2', part of a line of AI chips aimed at training the largest models. Trainium 2 is already being tested by Anthropic, the OpenAI competitor that has secured $4bn in backing from Amazon, as well as by Databricks, Deutsche Telekom, and Japan's Ricoh and Stockmark.

AWS and Annapurna's target is to take on Nvidia, one of the world's most valuable companies thanks to its dominance of the AI processor market. "We want to be absolutely the best place to run Nvidia," said Dave Brown, vice-president of compute and networking services at AWS. "But at the same time we think it's healthy to have an alternative." Amazon said 'Inferentia', another of its lines of specialist AI chips, is already 40 per cent cheaper to run for generating responses from AI models.
Full report: Amazon's chip R&D arm, Annapurna Labs, is to spend $100 million to build custom chips that lessen its reliance on Nvidia chips.