Several people seem to have had difficulty applying Clayton Christensen’s Disruption Theory to the tech world, and have proposed that the theory simply does not apply to the tech market or to consumer goods. However, Horace Dediu, who now works at the Clayton Christensen Institute specifically on applying Disruption Theory-based analysis to the tech market, has argued in a podcast with Ben Bajarin that there is no reason why it cannot be done. Horace argues that the challenge lies in defining what constitutes the “jobs-to-be-done”, and that a failure to do this successfully is why Disruption Theory sometimes seems to fail.
This point is very important and is worth reiterating. The reason Disruption Theory occasionally fails to explain a certain situation is not that the theory itself is limited in scope; it is that identifying the jobs-to-be-done is extremely difficult. In fact, in the canonical jobs-to-be-done example, the milkshake story, the job is so unintuitive that it is unlikely even an industry expert would have accurately predicted it. It is no wonder that Clayton Christensen himself failed to predict how Disruption Theory would affect Apple.
More often than not, the people who attempt to expand, supplant or even discredit Disruption Theory have simply neglected to carefully analyze the jobs-to-be-done.
To further complicate things, if you look at the brief history of personal computing, which spans four decades at most, you can also observe that the jobs-to-be-done has shifted extremely rapidly. A given jobs-to-be-done has rarely stayed mainstream for more than five years.
For example, the PC started out in the Apple I era as a hardware hobbyist’s kit. With the Apple II, the PC became a platform for hobbyist software programmers. Then, with the advent of packaged software like VisiCalc, the PC became a business tool for performing large numbers of calculations. With the arrival of the Macintosh and the LaserWriter, the PC became a tool for creative professionals, and then, with the Internet, a tool for communication and collaboration. After the year 2000, the PC became a tool to manage digital photos, music and video.
With each shift in the jobs-to-be-done, the hardware specifications required to perform the task well increased. At the same time, the customer base continuously expanded to less tech-savvy users, which required the user interface to improve. This meant that the PC rarely reached the good-enough threshold, because the bar was constantly being raised.
The exact same thing can be said for smartphones. The original iPhone started out as a phone, an iPod and an Internet communicator. In a short amount of time, it quickly became your main camera, your gaming console, your map, your photo album, your fitness tracker, your newsreader, the pacifier for your kids, your TV and so much more. And now with Touch ID and Apple Pay, Apple is making your iPhone your ID and credit card. From its initial humble jobs-to-be-done, the smartphone is now the center of a huge portion of your life. The jobs-to-be-done of smartphones has exploded.
And as with PCs, each shift in the jobs-to-be-done has required new and better hardware. Being your main camera has demanded better optical and image processing hardware and software. Being your gaming console has demanded better 3D graphics performance, which technologies like Metal and better embedded GPUs can provide. Being your fitness tracker has resulted in technologies like the M7 motion co-processor, which can constantly track your movements with minimal battery drain. And of course Touch ID and Apple Pay required new biometric hardware.
By now it should be plainly obvious why Apple has avoided being disrupted; Apple has consistently been at the forefront of the shifts in the jobs-to-be-done in personal computing (except for the years when Apple was run by John Sculley and R&D was run by Jean-Louis Gassée). That is why new Apple hardware has constantly been in high demand and can still command a high premium.
On the other hand, the reason why Samsung is being disrupted at the low-end is because Android is not expanding the frontiers of smartphone jobs-to-be-done. Other than UI tweaks that work equally well on less capable devices, Android has recently failed to introduce compelling features that require new or better hardware. In fact, this might have been intentional on Google’s part as an initiative to reduce fragmentation. As a result, Android phones have become as good-enough as they can be, even on the hardware that can be bought for $200-300. The Android OS is holding Samsung back.
This also means that Apple products will be disrupted if they stop creating new jobs-to-be-done. It also means that the resurgence of the Mac could be attributed to a new jobs-to-be-done that the Mac can uniquely satisfy. Strong integration with iPhones, which is not available on Windows PCs, is an obvious candidate for one of these jobs-to-be-done.
Actually, if you look at computing from a jobs-to-be-done standpoint, the idea of a “new category” device starts to look rather ridiculous. It becomes clear that the emphasis should be on whether or not a new jobs-to-be-done has emerged. Sometimes this might require a new device, but more often it simply requires new hardware added to a preexisting device, and that alone is enough to turn the existing hardware into a new category of device. For example, Touch ID has transformed the iPhone into a digital ID and wallet with unprecedented security and convenience, something that was hitherto impossible with a smartphone. Focusing solely on new category devices completely misses the point.
In fact, you could even argue that every few years Apple has introduced a new category product in the guise of a new iPhone or a new iOS version: a product that enables new jobs-to-be-done to emerge.