LAB NOTES
ZAK STOREY, CONTRIBUTOR
Nvidia’s stranglehold on AI
Taking advantage of UL Procyon’s AI Benchmark suite
If AI is the name of the game, it’s Nvidia.
ARTIFICIAL INTELLIGENCE has gone from buzzword to fascinating topic. Whether that’s in the form of data analysis, processes to make workflows more efficient, or debating with LLMs pretending to be Socrates, the ramifications are clear.
I’d heard how much of Nvidia’s hardware stock is being pushed into AI development. I needed to understand why Nvidia had an edge. Thanks to the team at UL Solutions, I got access to their AI Computer Vision and Image Generation benchmarks.
It is terrifying how much of an advantage Nvidia has. Against AMD’s RX 7800 XT, there’s a price difference of around 96 percent (if you pick up a 4080 Super rather than our OC variant), and a performance delta of around 93 percent with ray tracing in stock testing. But in AI processing, the gap is another matter entirely.
Take the Windows Machine Learning tests running on the GPU. The 4080 OC scores 1,945 versus 1,084 for the 7800 XT, a lead of about 79 percent. Enable those Tensor cores via Nvidia’s TensorRT, though, and that lead grows to 148 percent.
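For the curious, the percentage leads quoted above come from a simple relative-difference calculation. Here’s a minimal sketch, using only the two Windows ML GPU scores given in the text (the function name and structure are my own, not part of the Procyon suite):

```python
def percent_lead(winner: float, loser: float) -> float:
    """Percentage by which the higher score exceeds the lower one."""
    return (winner - loser) / loser * 100

# Windows Machine Learning (GPU) scores quoted in the text
rtx_4080_oc = 1945
rx_7800_xt = 1084

lead = percent_lead(rtx_4080_oc, rx_7800_xt)
print(f"RTX 4080 OC lead: {lead:.0f} percent")  # about 79 percent
```

The same formula applied to the TensorRT result is what yields the 148 percent figure, since the 7800 XT’s score stays fixed as the baseline.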