Amazon’s Graviton server chips run high-performance Arm cores.
ARM VERSUS X86
THE END OF AN ERA
Is it a case of when, not if, Arm chips replace x86 CPUs in our PCs? Jeremy Laird investigates, and wonders whether the very question is outdated in an age of AI and heterogeneous, open-source computing
When it comes to power efficiency, Arm-based CPUs have set the standard for decades. But for pure performance? Accept no substitute. You need a proper x86 chip—at least, that’s how the classic Arm versus x86 CPU contest used to stack up.
In the here and now? It’s not quite so simple. CPUs based on Intel’s x86 instruction set still top the performance charts, at least in the consumer space. But when it comes to the performance per clock cycle of a single CPU core, the very latest Arm chips arguably have the edge.
In fact, Apple now makes Arm CPU cores so powerful that its iPad Pro offers comparable single-threaded performance to a high-end desktop PC processor. Indeed, Apple’s latest Arm-based CPUs are so good, they can beat native x86 processors when running x86 code. That’s ridiculous, and it raises some pretty pressing questions about the long-term viability of x86.
In short, is Arm now proving not only superior when it comes to power efficiency, but equal to or better than x86 as a platform for pure performance? If it is, should Arm chips eventually replace x86 in PCs, and indeed will they? And does Intel’s new IDM 2.0 strategy prove that even Intel knows the end is nigh for x86?
FAILURE OF ATOM
If Arm processors do eventually assimilate the PC, replacing Intel’s venerable x86 CPUs, hindsight will reveal that the seeds for x86’s ultimate demise—and possibly Intel’s, too—were sown by the failed Atom project. Atom was the low-power chip that was meant to get x86 into smartphones, and it was an abject failure.
Had Intel managed to turn x86 into a competitive ultra-mobile architecture for smartphones, not only would the company be dramatically richer and more successful, but x86 would also look much more viable for the future. So was the failure of Intel’s move into smartphones a function of x86’s fundamental unsuitability for the most low-power of general purpose applications? Or did Intel just mess things up?
In 2012, the first smartphones running an Intel x86 chip went on sale. Models like the Lava Xolo looked pretty plausible for the day: a high-resolution screen, 1GB of RAM, 16GB of storage, hardware video decode supporting 1080p video playback, multiple cameras, the works.
It also ran Android on its 1.6GHz Intel Atom Z2460 CPU, a chip otherwise known by the Penwell codename, paired with a decent PowerVR SGX540 GPU. To put this into context, in 2012, Apple was transitioning from the iPhone 4S to the iPhone 5. It just so happens that the Apple A6 chip in the iPhone 5 was the first to use cores designed in-house at Apple rather than off-the-shelf cores licensed from Arm, albeit the A6’s “Swift” cores were still very much based on the Arm instruction set.
Atom was single-core and Hyper-Threaded where the Arm competition in the Apple A6 and Qualcomm Snapdragon S4 was dual-core. But by 2015, Intel had a quad-core Atom chip running in handsets like the Asus Zenfone 2, complete with a 5.5-inch 1080p screen and 4GB of RAM. Performance-wise, versus competition like Apple’s iPhone 6 or the Google Nexus 9, it was a bit of a mixed bag. But the overall picture, including battery life, was far from a disaster. In other words, it’s hard to argue that the x86 instruction set and the perceived baggage it brings is what did for Intel in smartphones.
Yet by April 2016, Intel had officially cancelled Broxton, its then-upcoming quad-core SoC for smartphones, in effect putting an end to its smartphone ambitions. So what happened? Part of the problem was that Qualcomm, the biggest player in smartphone SoCs, was making it very hard, through various licensing and purchasing terms, for smartphone makers to experiment with Intel chips.
Given Intel’s record for sharp practices when taking on the likes of AMD in the old-school x86 market, you could argue that Intel got a taste of its own medicine. At the same time, the two biggest players in the smartphone market, Apple and Samsung, were focussed on using their own SoC designs in their handsets, effectively ruling them out as customers for Intel SoCs. The combined consequence was that Intel’s potential customer base for smartphone chips was awfully narrow, at least without a costly and drawn-out battle with Qualcomm.