Is your computer more stable today than it was in 1990?
As a piece of computing hardware, a 1990s desktop computer is essentially worthless today. By every metric anyone cares about, they are vastly exceeded by not just today’s desktop computers, but also by tablets and phones.
But unlike hardware, software has not gotten universally better. While features and variety have improved, trust and reliability have, if anything, hit an all-time low. Even when software isn’t
malfunctioning due to proliferating bugs, it is forcing restarts, installing mandatory updates, becoming riddled with viruses and malware, covering every surface with ads, and routinely betraying the user’s confidentiality. Unlike the software of yesteryear, software today may well work in the morning and suddenly stop working in the afternoon, even if the user has done absolutely nothing in the interim.
The lack of platform competition is one important reason why computing has deteriorated.
For consumers, both the desktop and phone markets are now controlled by two operating systems each (Windows/Mac OS and Android/iOS, respectively). In all four cases, the operating system is maintained by a company whose revenue does not primarily come from selling operating systems, and which has many incentives to pursue goals other than delivering the most stable, reliable, trustworthy experience. The operating system is no longer a product in its own right; it is merely something users are forced to use based on the hardware they have chosen, and it is increasingly treated solely as a vehicle for pursuing the platform holder’s other business goals.
The unchallenged duopoly on both major computing form factors likely stems from the fact that interfacing with consumer hardware is now so difficult that no new operating system can realistically enter the market. This may also be a major reason Linux, while incredibly successful as a server operating system, has yet to make significant inroads as a consumer desktop OS.
But what if hardware weren’t
so hard to control?
In this lecture, I lay out the case for instruction set architectures (ISAs) as a possible first step out of the software quality quagmire. While shipping a stable ISA is more expensive than shipping continually updated buggy drivers, the rewards to both hardware companies and users could be significant, and would allow those of us who care about reliability to start tackling what I have come to call “The Thirty Million Line Problem.”
Eventually, I’d really like to see a return to sensible, simple interface standards for hardware, and it seems like pushing for consolidation to a few reasonable ISAs is the best way to start.