The shift from interesting geek toy to mainstream consumer product has also brought a serious set of problems. Consumers are bewildered by a dizzying choice of brands and hyped-up claims across all price ranges. Online users are assaulted by spam email, which proliferates largely because of security holes found in every major operating system and most productivity applications. Real innovations are less frequent, replaced by a steady stream of eye candy, useless claims, and greedy one-upmanship between competitors. Even the out-of-the-box experience with most store-bought computers is frustrating, with a deluge of pre-installed demo software, banner ads, and cryptic messages. Software developers themselves are equally frazzled, having to write code that supports numerous operating systems, various microprocessor architectures, and countless hardware configurations.
To varying degrees, operating systems suffer similar dilemmas, which cause longer and longer OS release cycles and painful upgrades. Just look at the ever-increasing gestation times of recent operating system upgrades such as Mac OS X Leopard, Windows Vista, and even the Linux kernel. I know firsthand from spending many years as a developer at Microsoft how much time and how many resources are expended just hunting for security holes and subtle code bugs. And before you think I'm on a Microsoft witch hunt, know that I am not. As I shall explain in detail over the next few weeks, this is an industry-wide problem rooted not just in Microsoft, but in the fundamental design decisions made in today's microprocessors that leave code susceptible to these problems, in the design philosophies of today's programming languages, in the money-driven greed that causes products to be rushed to market, and in the lack of policing and accountability in the software development process.
I will elaborate in the coming weeks, but for now allow me to give a high-level overview of the issues before I delve into the technical details.
In my opinion, the personal computer industry is moving backwards in terms of what it set out to deliver in the 1980s. Take a look at where personal computers were 20 years ago and where they are today, and compare that to other technologies such as VCRs. 20 years ago, most personal computers, whether an Apple II or an Atari 400 or an IBM PC or an Apple Macintosh, booted up in under 5 seconds. On my Atari 400, for example, you press the power button and in about 2 seconds there is a nice blue screen with a READY prompt. Within seconds you can type in a simple computer program and run it, draw graphics, or be running a video game. I am still in awe of the engineers behind the technology in the Atari ST and Apple Macintosh of the mid-1980s. With just black-and-white displays and mere kilobytes of RAM, these machines were able to run graphical user interfaces not unlike what Windows or Mac OS offer today, were able to run word processing and paint programs not unlike what we use today, and even had entire integrated software development environments that edited, compiled, and debugged programs using less than 1 megabyte of memory. That's less memory than a typical JPG photo uses today.
During that same time period, VHS and Betamax video cassette recorders became popular despite being difficult to use. How many people just left the clock blinking "12:00" "12:00" "12:00" because it was hard to figure out how to set it? I remember I had an early VCR with only two buttons for programming anything. One button selected a digit of a programming function (such as a digit of the date, time, or channel number), and the other button set that digit. It was tedious to program in even one show to record. Since then VCRs have evolved to have remote controls with displays, allowing one to set up the programming entirely on the remote and then upload it in one shot to the VCR. Then came on-screen displays. Next, VCR Plus reduced the programming down to a few keystrokes. Today, we have hard-disk-based DVR devices such as TiVo and set-top cable boxes that are trivially easy to program. I have no desire to ever go back to my old two-button VCR.
So why is it that 20 years later the typical personal computer takes 30 to 60 seconds just to boot up, that I need a gigabyte of memory just to compile and debug a simple C program, and that every day I am bombarded with security warnings telling me to download this patch and that patch or else risk having my computer hacked and my identity stolen? This isn't just a Windows issue, as most people believe it to be. My "Apple Software Update" goes off every few days with another patch for Mac OS X as well. About every two days, one of my Macs dies mysteriously with a "You must reboot" message, not unlike a Windows "blue screen of death". Special-purpose computers such as wireless phones can be hacked. I'm just waiting for my DVR to get hacked and to lose my unwatched season of The Office.
Personal computer users have somehow become either desensitized or oblivious to the fact that they're getting the shaft from the industry. This really puzzles me and has been an ongoing source of frustration, which I've occasionally vented about on my Darek's Secrets blog (http://www.emulators.com/secrets.htm). Why do paying consumers actually play into the crisis?
For example, when I walk into the local Best Buy or Fry's computer store, most of the software being sold is not productivity software but rather the paranoia-driven garbage of anti-spam, anti-virus, and disk-cleaning security software. I hate this kind of software, I hate that people profit from it, and I have for years advocated against this kind of garbage. Yet the industry of paranoia thrives and the consumer complicity continues.
Hardware makers contribute to the paranoia, and to their own wallets, by encouraging the practice of upgrading computers every 2 or 3 years. I still have old 486 laptops and Gateway 2000 desktop computers from the early 1990s that run just fine, even running Windows 98 or Windows XP. I usually read my email on a 500 MHz Pentium III machine that runs XP. I still to this day do not own even a 3.0 GHz computer (which hit the market for the Christmas 2002 shopping season) because I have no need for that clock speed, let alone 3.6 GHz or whatever artificial number will be pushed at us this Christmas season.
Add to the confusion the press, which frequently runs newspaper articles such as the Lee Gomes column which appeared a few days ago in the August 29, 2007 issue of the Wall Street Journal. Lee took it upon himself to open every piece of spam and surf every web site that the spam directed him to in the hunt for "zombies". He was literally trying to infect his PC, which is a pretty dumb thing to do if you're not actually an engineer trying to analyze the zombie code. After a few days, his computer was still running, and therefore he declared that spam is "not so scary after all". This is similar to me saying that I drove my car today without wearing a seatbelt and didn't die, and therefore not wearing a seatbelt is not so bad. It is this pointless and reckless journalism that simply adds to consumer confusion. What Mr. Gomes failed to point out is that clever malware can hide itself from virus detection software, which is the primary reason I tell people not to waste their money on it. It was widely reported in 2001, for example (http://www.theregister.co.uk/2001/06/25/flaw_means_virus_could_disable/), how Norton Anti-Virus had a ridiculously stupid "off" switch which viruses could access. Or consider the more recent news that Sony Music was putting a dangerous "rootkit" on its music CDs (http://www.pcpitstop.com/spycheck/sonyxcp.asp). Rootkits are very hard to detect, because by design they aim to fool the operating system into not detecting them.
Journalists who run these naive experiments to play down the severity of these issues are doing consumers a huge disservice and are spreading misinformation. These security issues are very real and manifest themselves as real-world problems which affect global economies and real lives. They are not unlike serious issues in other industries, such as exploding airplanes and collapsing bridges. When a bridge collapses, people die. Heads roll. Steps are taken to prevent further bridge collapses, such as the recent bridge inspection frenzy here in the U.S. When an aircraft explodes, entire fleets of planes are grounded until the root cause is discovered and fixed. This happened recently when a Japanese airliner caught fire; it happened when the Concorde blew up; it happened after each of the two space shuttle disasters. In each case, the root cause turned out to be some seemingly harmless item such as a small piece of metal or foam. In my experience, the vast majority of computer crashes and malware exploits are caused by very subtle technical errors deep down in the bowels of the computer code, which sadly most software developers don't know how to look for and testers don't know how to find. Just as you wouldn't think that a loose bolt could take down an airplane, an innocent misplaced semicolon buried deep in the source code of an operating system can take down millions of computers.
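To make that semicolon analogy concrete, here is a contrived C sketch (my own illustration, not code from any real operating system) of how one stray semicolon can silently disable a safety check:

    #include <stdio.h>

    /* Contrived example: the stray semicolon after the "if" gives it an
       empty body, so the block below runs whether the check passes or not. */
    static int length_is_valid(int len)
    {
        return len >= 0 && len < 256;
    }

    void process_packet(int len)
    {
        if (length_is_valid(len));    /* <-- the misplaced semicolon */
        {
            /* Intended to run only for valid lengths; it now always runs. */
            printf("processing %d bytes\n", len);
        }
    }

    int main(void)
    {
        process_packet(100000);   /* should be rejected, but gets processed */
        return 0;
    }

Most compilers can warn about an empty "if" body, but in a code base of tens of millions of lines that one warning is easily drowned out or switched off.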
It Is Time For Consumers To Demand Higher Standards
The one thing that differentiates the computer industry from other industries is the lack of accountability, and even the lack of any certification needed to call oneself a software engineer. Any bozo can slap together 10 lines of JavaScript for a web site, call himself a developer, and then get hired by Google. I've worked with such bozos at a number of companies. The tragedy is that millions of lines of buggy computer code out there are written by such bozos, and any one such line might be the next root cause of a Code Red or Blaster worm that takes down millions of machines. And then the world reboots, installs a patch, and keeps running on its buggy software. This is just an unacceptable way to keep going into the future.
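Code Red and Blaster, by the way, both spread through buffer overruns, which is exactly the kind of one-line bug I am talking about. The C sketch below is purely my own illustration (it is not the actual vulnerable code in IIS or the Windows RPC service), but it shows the pattern: one unchecked copy into a fixed-size buffer.

    #include <string.h>

    /* Illustration of the classic unchecked copy that worms exploit. */
    void handle_request(const char *input)
    {
        char name[64];

        strcpy(name, input);   /* overruns "name" if input is 64+ characters,
                                  letting an attacker smash the stack */
    }

    /* The boring, correct version: bound the copy and terminate the string. */
    void handle_request_safely(const char *input)
    {
        char name[64];

        strncpy(name, input, sizeof(name) - 1);
        name[sizeof(name) - 1] = '\0';
    }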
It seems to me the only time the industry really stopped and cared was during the Y2K scare, when it was obvious that severe software failures could lead to loss of life. I wish the industry would be this vigilant all the time. Y2K was considered a dud because nothing happened. THAT'S THE POINT. When computers keep running instead of crashing, and nobody dies, and people aren't getting their files hacked and their information stolen, that's good. It is boring, and it doesn't make for great headlines in the newspapers, but that's what consumers should expect and demand.
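For those who missed the fun, the Y2K problem itself boiled down to decades of code storing the year as just two digits. A contrived C illustration (hypothetical, not taken from any real system) of what happens to such code at the century rollover:

    #include <stdio.h>

    int main(void)
    {
        /* Years stored as two digits, as countless programs did to save memory. */
        int birth_year   = 85;   /* meaning 1985 */
        int current_year = 0;    /* meaning 2000, after rolling over from 99 */

        int age = current_year - birth_year;
        printf("computed age: %d\n", age);        /* prints -85 */

        /* Carrying the full four-digit year makes the arithmetic come out right. */
        int age_fixed = (2000 + current_year) - (1900 + birth_year);
        printf("corrected age: %d\n", age_fixed); /* prints 15 */

        return 0;
    }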
This summer, while traveling through France on my way to see a Metallica concert, I had the good fortune of driving across the Millau Viaduct, a truly amazing marvel of civil engineering. I know that the viaduct (I keep wanting to call it a bridge; it sure looks like a bridge to me!) was built by competent engineers with years of experience. I know that probably every single nut and bolt was in place and secured, and if a few were missed, the bridge was over-engineered to handle that. And I know that it was built using tried and tested techniques used to build other bridges and viaducts. There are codes and standards that civil engineers must follow or risk losing their jobs and facing lawsuits. There is no room for hacking and last-minute changes.
Yet that is precisely how computer code is written and hardware is designed. There is a lot of tweaking and experimentation and flying by the seat of the pants. In fairness, computer engineering and software engineering are very young fields. They are not thousands of years old as is, say, civil engineering. I think of the computer industry as being at the same stage of development today as was, say, the automobile industry or the airplane industry early in the 20th century. People died at horrific rates compared to today. There was no such thing as safety glass; you got your head sliced open instead. There were no seatbelts. Even running lights were optional at first. There weren't really even standards for roads, or traffic rules.
So here in the year 2007, we have an industry that is widely prevalent yet ships fairly fragile and buggy products. Multiple microprocessor vendors push competing yet incompatible architectures at us. Multiple OS vendors push competing and mostly incompatible operating systems at us. Application vendors create dozens of incompatible file formats, some of which even silently change every year. Going back 21 years to when I owned the 4 incompatible computer systems that drove me into the field of emulation, it is my opinion that the industry needs to stop focusing on short-term profits and thinking only about what it is shipping this coming Christmas. I've heard it called "falling prey to short-term maximization", and this needs to stop. Just as other industries have evolved to last for decades, the personal computer industry needs to lay down standards and a vision of the future for the next 50 years. To do anything less is immoral in my opinion.
The recent rush by AMD, IBM, Intel, and Sun to multi-core processors, 64-bit instruction set extensions, hardware virtualization, and other gimmicks is unfortunately not as well thought out as it may seem. These technologies scare the hell out of me, and you should be worried too. I've come to realize that some of these so-called "innovations" are actually incompatible with each other, and I wish the marketing bozos at these microprocessor companies would just shut up. Otherwise we are going to have another Pentium 4 fiasco, and I'll explain why in two weeks, including why AMD and Intel need to call a truce for the good of the whole industry.
In coming weeks I'll also take the "hype" out of "hypervisors", explain why Microsoft should dump the entire Windows programming model, and pose the question "with Gateway gone, should Dell worry?".
I would love to hear about your own personal computer horror stories and your opinions on what you read in this blog. Email me directly at darek@emulators.com or simply click on one of these two links to give me quick feedback: