Intel CEO Pat Gelsinger has taken a shot at his main rival in high performance computing, dismissing Nvidia’s success in providing GPUs for AI modelling as “extraordinarily lucky.” Gelsinger also implied that it would have been Intel, not Nvidia, currently coining it in AI hardware had the company not killed one of his pet projects nearly 15 years ago.
During a wide-ranging discussion at MIT, Gelsinger was asked what Intel is doing to drive the development of AI. His answer? Among other things, to point out how lucky he thinks Nvidia has been with its AI-accelerating GPUs.
Discussing the emergence of GPUs as the weapon of choice for the latest large AI models, Gelsinger explained how he thinks Nvidia and its CEO, Jensen Huang, just happened to be in the right place at the right time.
“Jensen worked super hard at owning throughput computing, primarily for graphics initially, and then got extraordinarily lucky,” Gelsinger said. Gelsinger also emphasised that AI wasn’t part of Nvidia’s original plan for GPGPU or general purpose computing on GPUs. “They didn’t even want to support their first AI project,” Gelsinger observed.
What’s more, Gelsinger also claimed things would have been very different had Intel not cancelled the Larrabee project shortly after he left the company for an 11-year stint elsewhere, before returning as CEO in February 2021.
“When I was pushed out of Intel 13 years ago, they killed the project that would have changed the shape of AI,” Gelsinger said of Larrabee.
Larrabee was an Intel GPU project that long predated its current Arc graphics cards and was intended to take Nvidia on head-to-head in the gaming and GPGPU markets, courtesy of scores of tiny x86 CPU cores. The gaming graphics cards were cancelled in late 2009 and the rest of the Larrabee project withered thereafter.
Exactly when Nvidia really began bigging up its GPUs as tools for AI models is debatable. For sure, AI wasn’t in the mix when the company first began promoting GPGPU.
Here’s a passage from an early Nvidia document from 2007 describing the usage and benefits of its original CUDA 1.0 platform, the software that enables high performance computing on Nvidia GPUs, beyond mere graphics and image processing.
“Many algorithms outside the field of image rendering and processing are accelerated by data-parallel processing, from general signal processing or physics simulation to computational finance or computational biology.”
That was absolutely typical of Nvidia’s messaging at the time. In other words, no mention of AI as a key application for GPGPU. To an extent, then, Gelsinger has a point.
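For a sense of what that early messaging was describing, the kind of data-parallel workload Nvidia was pitching can be sketched in a few lines. The SAXPY loop below is a standard BLAS-style operation (used here purely as an illustration, not taken from the CUDA document): one arithmetic step applied independently to every element of an array, which is exactly the pattern GPUs accelerate.

```python
def saxpy(a, x, y):
    """Scale-and-add: computes a*x[i] + y[i] for every element.

    Each iteration is independent of the others, which is what makes
    the loop 'data parallel': a GPU can assign one thread per element
    and run them all at once instead of looping sequentially.
    """
    return [a * xi + yi for xi, yi in zip(x, y)]

# On a CPU this runs element by element; on a GPU the same
# arithmetic would be launched as thousands of parallel threads.
result = saxpy(2.0, [1.0, 2.0, 3.0], [10.0, 20.0, 30.0])
print(result)  # [12.0, 24.0, 36.0]
```

Signal processing, physics simulation, finance, biology and, as it turned out, neural network training all reduce to operations of broadly this shape, just at vastly larger scale.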
On the other hand, it was Nvidia that pushed GPGPU forward, and that developed CUDA and Nvidia GPUs into the mighty force they are today. Nvidia might not necessarily have seen the AI revolution coming from the beginning. But the company did bet far bigger on GPUs than anybody else.
So was Nvidia lucky? In part, certainly. Nvidia and Huang did not see the AI revolution coming when they first embarked on the GPGPU project. Or if they did, they conspicuously failed to mention it. But there’s also a saying attributed to everyone from Founding Father Thomas Jefferson to pro golfer Gary Player that surely applies here and goes something like this: “I’m a great believer in luck, and I find the harder I work the more I have of it.”
It’ll be interesting to see how lucky Intel gets over the next few years of Gelsinger’s leadership.