Apple has announced that it will complete the transition to Apple Silicon within two years. Excitement around the performance of its chips is extremely high, and there is even talk about whether Wintel PCs will make a similar transition in the near future. Here, I would like to review what history tells us about such transitions and evaluate whether this is indeed a likely scenario, or whether the Intel x86 architecture will prevail.
Chicken and egg
Before we start, I would like to emphasise that this is a typical chicken-and-egg problem. End-users will purchase the new devices only if they run their favourite applications faster. However, application software developers will only invest resources to re-write their applications for the new CPU architecture if they are confident that enough end-users will purchase the new devices.
The situation is the same at other levels of the value chain.
The transition from 68k to PowerPC Macs
A good summary of this transition can be found here. Two factors made this transition possible. First, the performance gap was already very wide, and with all the excitement around RISC, many people were convinced that it would get even wider in the future. Second, Apple had developed a very capable 68k-to-PowerPC emulator. In fact, the emulator was so capable that a good portion of the operating system itself was not yet written in PowerPC code and had to be emulated. It was easy to mix 68k code with PowerPC code, so developers could gradually port their applications to PowerPC, focusing on the portions that affected performance the most.
Thus the chicken and egg problem was solved.
- End-users were confident that their apps would run on the new PowerPC Macs and would also see performance benefits.
- Application developers gained confidence that the PowerPC Mac users would be their main customers in the future, and they had a strong incentive to make their apps perform as best they could on the new architecture.
- Application developers did not have to spend huge resources upfront. They could gradually transition their software by focusing on the parts that mattered the most.
- The CPU vendors, IBM and Motorola, had their own agendas for investing in the PowerPC architecture and for continuing to provide high-performance designs.
The transition from PowerPC to Intel
Again, by the time Apple decided to make the switch, PowerPC lagged Intel in raw performance. Even more severe was its sheer inferiority in performance per watt, which was gaining importance as computing shifted from desktops to laptops. It was evident that the transition to Intel would provide significant performance benefits.
Apple also stressed that Intel-based Macs would be reliable since Apple had been running an Intel version of MacOS X for quite some time (“a secret double life”). Unlike the 68k to PowerPC transition, the underlying operating system was already written in Intel code.
Emulation software (Rosetta) was also available, which allowed most applications written for PowerPC to run on Intel Macs without modification. For the most part, you would not know whether an app was running natively or under Rosetta unless you took a peek with the “Activity Monitor” utility. Emulation was that good.
Furthermore, the PowerPC to Intel transition had great developer tool support. Ever since the transition from the classic MacOS to MacOS X, the development tool of choice had been Apple’s Xcode suite. Apple provided the capability to easily recompile your existing Xcode project into Intel binary code, touting that all it took was ticking a checkbox. Even if that was an exaggeration, Intel-optimised versions of pre-existing applications emerged within months.
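That checkbox produced “universal binaries”: a single file carrying both PowerPC and Intel code, and the same fat Mach-O container format carries x86_64 and arm64 slices in today’s transition. As a rough illustrative sketch (not Apple tooling; on a real Mac you would simply run `lipo -info` or `file` on the binary), the header of such a file just lists its architecture slices and can be decoded like this:

```python
import struct

# Magic number at the start of a universal ("fat") Mach-O binary, stored big-endian.
FAT_MAGIC = 0xCAFEBABE

# A small subset of CPU_TYPE_* values from Apple's <mach/machine.h>;
# the 0x01000000 bit marks the 64-bit ABI variant of an architecture.
CPU_NAMES = {
    7: "i386",
    0x01000007: "x86_64",
    12: "arm",
    0x0100000C: "arm64",
    18: "ppc",
}

def fat_architectures(data: bytes) -> list:
    """Return the architecture names declared in a fat Mach-O header."""
    if len(data) < 8:
        return []
    magic, nfat_arch = struct.unpack(">II", data[:8])
    if magic != FAT_MAGIC:
        return []  # not a universal binary (could be a "thin" single-arch file)
    archs = []
    offset = 8
    for _ in range(nfat_arch):
        # struct fat_arch: cputype, cpusubtype, offset, size, align (all big-endian)
        cputype, _sub, _off, _size, _align = struct.unpack(
            ">IIIII", data[offset:offset + 20]
        )
        archs.append(CPU_NAMES.get(cputype, hex(cputype)))
        offset += 20
    return archs
```

The key point is how cheap the packaging was: the build system compiles each slice separately and the loader picks the matching one at launch, which is why “tick a checkbox” was not far from the truth.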
Apple also announced that the full transition of their line-up would be completed within two years.
So here again, the chicken-and-egg problem was solved.
- End-users were confident that their favourite applications would run on the new Intel Macs smoothly, with performance gains coming in the future.
- Application developers knew that Apple would fully transition within two years and that if they did not move quickly, they would miss out on the performance benefits.
- Xcode made it easy for application developers to transition to Intel code with minimal effort.
- Intel had a strong vested interest in improving CPU performance and already had chips that performed far better than PowerPC.
The transition to Apple Silicon
The transition to Apple Silicon builds upon Apple’s experience with the 68k-to-PowerPC and PowerPC-to-Intel transitions. All the ingredients that made those two past events successful are in place: the Apple M1 touts vastly superior performance over Intel chips, Rosetta 2 is here, Apple has announced that the transition will be complete within two years, and the M1 is based on the same technology as the iPhone SoCs, so there is no doubt that Apple will continue to invest in it.
Nobody has any doubts about the success of this transition.
Will Windows transition to ARM?
Microsoft has shown interest in moving Windows to the ARM architecture for almost a decade. However, to this day, it remains a very small niche project. Microsoft ported Windows 10 to the ARM platform in 2017, providing the capability to run regular Windows apps on ARM, not just UWP apps. It does provide an emulator for running Intel binaries; however, this is severely limited since it only supports 32-bit code. Reportedly, 64-bit support will come at last, after almost three years on the market.
The chips that currently power ARM PCs are the SQ2 in the Surface Pro X and the Qualcomm 8cx in the Samsung Galaxy Book S and others. While they do provide battery-life benefits, in terms of raw performance they are at the bottom of the pack, even when running native apps.
Therefore, it comes as no surprise that neither Microsoft nor any other PC vendor has made any commitment to fully transitioning their PC line-up. ARM-based PCs remain a small niche segment for customers who value mobility above all else and are willing to sacrifice performance and compatibility. Nobody expects ARM-based PCs to make up a significant portion of the total user base.
With the above, we can analyse the chicken-and-egg problem for Windows moving to ARM.
- End-users have no confidence today that their favourite apps will run on Windows for ARM. With the emulator supporting only 32-bit Intel code today, a large number of apps simply do not run. Even if Microsoft announces 64-bit Intel code support, it will take a while for consumers to believe that Microsoft is committed to good compatibility.
- End-users have no confidence that their apps will run with good performance. Even when apps are rewritten in ARM code, given the ARM processors currently available, they will only see performance similar to low-power Intel laptops.
- Application developers have no confidence that a significant number of users will purchase ARM hardware. On the contrary, they know well that ARM will remain a small niche market. Nor will ARM users purchase many apps, since the compromise they made in the first place was to use their devices only for casual browsing and email. This makes the niche unattractive for developers.
- Even Microsoft itself does not seem convinced that recompiling for ARM is worthwhile; the glacial pace at which it is recompiling its own productivity apps proves this. MS-Office is still not available as ARM binaries, and MS-Teams arrived only a few weeks ago.
- Qualcomm has been developing the SQ2 and the 8cx for PCs, but their performance is in the low-power PC range. We are not seeing anything like the Apple M1, which rivals or exceeds every Intel MacBook in single-core performance. Successful transitions by Apple have always involved multifold jumps in raw CPU performance, and it is clear that the Qualcomm ARM chips are not delivering in this regard.
- There is also the question of whether non-Apple CPU vendors will have the incentive to develop ARM chips as powerful as Apple Silicon, with the same power efficiency. If there were an independent market for powerful non-Apple ARM chips, the incentive might be there. As of today, I cannot identify such a market.
Given the above, it seems unlikely that the chicken-and-egg problem will be solved, and so I doubt that we will see a successful transition of Windows to ARM.
What needs to happen for a Windows transition?
Looking at the above analysis, my conclusion is that for Windows to transition to ARM, there needs to be a market, independent of PCs, where demand for high-performance ARM chips is strong. This market has to be big enough to fund the development costs of bringing features like powerful GPUs and Unified Memory onto the SoC. One candidate is the datacentre. If this happens, and ARM delivers performance very significantly ahead of Intel, then the market might be convinced that ARM is the future. Both Microsoft and software developers would start to take ARM seriously, and as a result we would get better emulators and more native code. This, in turn, would convince consumers that it is safe to buy the new devices.
For ARM to succeed, Intel will also have to stand still. Intel, however, is no slouch at incorporating new technologies into its platform, and CISC is no longer a fundamental disadvantage, since its current CPUs are RISC at the core. From a market-demand perspective, you could argue that Intel today has much better access to the markets that would value the high performance that the concepts in the Apple M1 would bring. Intel would somehow have to forfeit this advantage.
Unfortunately, I cannot see either of these happening. My current conclusion is that it is highly unlikely that Windows will transition to ARM in a significant way, and that it is more likely that the x86 Intel architecture will improve dramatically by incorporating the concepts pioneered in the Apple M1 before non-Apple ARM can make major inroads into the market.