Jeremy Laird
NVIDIA IS CURRENTLY SELLING AI chips at a rate of about $15 billion every three months. By 2027, analysts predict that figure will have more than doubled, taking annual revenues to about $200 billion; and again, that's just for the chips sold into data centers, excluding other income such as gaming GPUs.
But as clever as Nvidia's CEO Jensen Huang is, the company's AI boom is a bit of an accident. The AI chatbot revolution, the whole large language model thing, is thanks to PC gaming, at least as far as Nvidia is concerned.
Almost everything about Nvidia’s AI boom is kind of fascinating. It’s intriguing, for instance, to consider who Nvidia’s biggest customers are. You probably already knew that Microsoft was big on AI, so the fact that it’s thought to have bagged no fewer than 150,000 of Nvidia’s H100 AI-optimized GPUs in the most recent three-month period at a cost of around $5 billion isn’t a huge surprise.
But did you know that Facebook is reckoned to have bought just as many? It's hard to imagine what it's doing with them all when it can't even get search in Facebook Marketplace to work properly. In both cases, that's three times as many H100 GPUs as Google and Amazon, each of which is estimated to have bought about 50,000 units.