NO EXECUTE! A weekly look at personal computer technology issues.

(C) 2007 by Darek Mihocka, founder, Emulators.com.

September 03 2007



Introduction - From Mainframes to iPods in 30 years

 
In just 30 years, the personal computer has gone from being a geeky toy for hobbyists to an appliance as common as the television and telephone. As the PC has become ubiquitous, prices have dropped steadily while computing power has increased exponentially year after year. The PC is the poster child of "bang for the buck", a marketing term I first remember from 1980's home computer ads, when kilobytes and floppy disks ruled. As featured in a recent "60 Minutes" piece about the One Laptop Per Child initiative (http://laptop.org/), today it is possible for school children to surf the Internet wirelessly from the middle of the jungle using a hand-cranked $200 laptop. How many millions of dollars would that kind of communications capability have cost the military even 30 years ago? The size reductions, cost reductions, and corresponding increases in computational capability are mind-boggling.
 
Indeed, the personal computer era has led to the decline of minicomputers, mainframes, and supercomputers. Gone are the room-sized central computers, replaced by millions (and now billions) of interconnected personal computers. Computation and data storage can be distributed across thousands of machines (as in the SETI@home (http://setiathome.ssl.berkeley.edu/) and PRIME95 (http://www.mersenne.org/) projects). Two hundred years ago it would have taken months to send a letter, but today one can instantly exchange photos and music around the world, or pull down almost infinite on-demand content to a television or iPod. As is frequently pointed out, the mainframe computing power available for the 1969 moon landing was something along the lines of what one finds today in a calculator or digital watch.
 
The widespread use of computers has certainly changed the way people interact and do business, even when they are not directly using a computer. I remember 20 years ago, as a college student looking for work, having to explain to interviewers what my 9-digit "CompuServe" account was, or why I was even listing it on my resume. Today, I can't imagine someone not putting their Internet email address on a resume or business card. Similarly, 20 years ago I had to explain to my friends and family why I would sit up late into the night in front of an Atari 400 that made weird noises on the telephone line, just so I could "chat" with people on my custom-written "bulletin board". This was all a weird and foreign concept to many people until AOL and MSN popularized the widespread use of email. Today, most people in the western world are in one way or another addicted to email and other communication technologies pretty much 24/7.
 
1986 - My Hobby Turns Into a Career
 
For me, personal computers have been an obsessive hobby since 1980. 1980? Yes, I am in fact a dinosaur, it being my 41st birthday today. Star Trek geeks can immediately do the math and realize that I was born just days before the original Star Trek series appeared on television, when much of today's technology - holographic data storage, wireless communications, flat screen displays, instant access to the world's information, and human space travel to name a few - was considered to be 23rd century science fiction. In 1980 I was a high school teenager when I first used the futuristic-looking Commodore PET computer. Being a math and science geek, I was completely awed by programming in BASIC and wanted to understand how the machine actually worked. The following year, for my 15th birthday, I had saved enough money to buy my own Atari 400 home computer, which included 48K of RAM and a cassette tape drive for storage. I earned my first paycheck with the "Graphics Utility Package" published in the June 1985 issue of the Atari programming magazine ANTIC (http://www.atarimagazines.com/v4n2/GUP.html), launching what ultimately became a lifelong career.
 
My first steady paycheck started in April 1986, the week of the Chernobyl accident, when I was hired as a summer intern by the board of education in Toronto to write Apple II software for the coming school year. That same year, BYTE magazine featured the new Atari ST, a computer that I purchased immediately. I was soon a fluent programmer on four different computer platforms: the IBM AT at university running MS-DOS and Turbo Pascal, the Atari 400 which I had owned for 5 years, the new Atari ST with its Mac-like graphical user interface, and the Apple II.
 
I had my life-changing "aha!", light-bulb-over-the-head kind of moment sometime in July 1986, when I realized that fundamentally all four of these different computers worked the same way. On the hardware side, the microprocessors (the "CPU chips" that perform millions of calculations per second) differed mainly in the "machine language" they understood. Machine language is the code of ones and zeroes, the "bits" and "bytes", that describes a computer program. It is slightly different for the Intel 8086 microprocessor used in an IBM PC than for the Motorola 68000 microprocessor used in the Atari ST, but they perform essentially the same computations. Similarly, the high level programming languages such as Applesoft BASIC, Atari BASIC and Turbo Pascal differed mainly in how one needed to specify the grammar of the language and the use of punctuation, not unlike, say, the difference between writing in Spanish and English. What I had naively stumbled into was the concept of translating computer code written for one type of computer so that it runs on a completely different computer. The concept is known as "emulation", although at the time I was so naive I didn't even know it had a name.
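 
To make the idea concrete, here is a minimal sketch of the kind of "fetch, decode, execute" loop at the heart of a software emulator. It is purely illustrative and not code from any of my actual emulators: the three-opcode guest CPU and its opcode values are invented for this column. An emulator for a real 6502 or 68000 is this same loop scaled up to hundreds of instructions, plus emulated memory, video, and I/O.

    /* A toy "guest CPU" with one register and three opcodes, interpreted in C.
       Purely illustrative - the opcodes and the tiny guest program are made up. */
    #include <stdio.h>
    #include <stdint.h>

    enum { OP_LOAD = 0x01,    /* load the next byte into the accumulator */
           OP_ADD  = 0x02,    /* add the next byte to the accumulator    */
           OP_HALT = 0xFF };  /* stop the guest program                  */

    int main(void)
    {
        uint8_t memory[] = { OP_LOAD, 5, OP_ADD, 7, OP_HALT };  /* guest program */
        uint8_t acc = 0;      /* emulated accumulator register */
        size_t  pc  = 0;      /* emulated program counter      */

        for (;;)
        {
            uint8_t opcode = memory[pc++];          /* fetch  */
            switch (opcode)                         /* decode and execute */
            {
            case OP_LOAD: acc = memory[pc++];  break;
            case OP_ADD:  acc += memory[pc++]; break;
            case OP_HALT: printf("acc = %u\n", acc); return 0;
            default:      printf("illegal opcode\n"); return 1;
            }
        }
    }

The host machine never runs the guest's bytes directly; it merely reads them as data and performs the equivalent work itself, which is how a 68000-based Atari ST can pretend to be a 6502-based Apple II.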
 
I also had no background in 1970's mainframe technology or the hardware methodologies invented years earlier to do a type of emulation in hardware known as "virtualization". This is a technique that in recent years has been rediscovered and popularized by companies such as VMware (http://www.vmware.com/) in its server virtualization products, and it is used today by similar products such as Virtual PC 2007 (http://www.microsoft.com/windows/products/winfamily/virtualpc/), Xen (http://www.xensource.com/), and Windows Server 2008. I didn't have a clue about any of this in 1986 and so went down my own path of solving the problem, a path that led to many emulation products over the years and of course to this web site (http://www.emulators.com/).
 
My first project was to try to emulate an Apple II computer on the Atari ST entirely in software. I completed it in about two months and submitted the code to COMPUTE! magazine (http://www.atarimagazines.com/compute/), only to receive a rejection letter from its editor stating, in effect, that their readers would have no interest in an emulator. Ha! How times would change.
 
2007 - The Personal Computer In Crisis
 
The shift from interesting geek toy to mainstream consumer product has also brought about a serious set of problems. Consumers are now bewildered by a dizzying choice of brands and hyped-up claims across all price ranges. Online users are assaulted with an onslaught of spam email, which proliferates largely because of security holes found in every major operating system and most productivity applications. Real innovations are less frequent, replaced by a steady stream of eye candy, empty claims, and greedy one-upmanship between competitors. Even the out-of-the-box experience with most store-bought computers is frustrating, with a deluge of pre-installed demo software, banner ads, and cryptic messages. Software developers themselves are equally frazzled, having to write code that supports numerous operating systems, various microprocessor architectures, and countless hardware configurations.
 
Operating systems face similar dilemmas to various degrees, which cause longer and longer release cycles and painful upgrades. Just look at the ever increasing gestation times of recent operating system upgrades such as Mac OS X Leopard, Windows Vista, and even the Linux kernel. I know first hand from spending many years as a developer at Microsoft how much time and how many resources are expended just hunting for security holes and subtle code bugs. And before you think I'm on a Microsoft witch hunt, know that I am not. As I shall explain in detail over the next few weeks, this is an industry-wide problem rooted not just in Microsoft, but also in the fundamental design decisions being made in today's microprocessors that make code susceptible to these problems, in the design philosophies of today's programming languages, in the money-driven greed that causes products to be rushed to market, and in the lack of policing and accountability in the software development process.
 
I will elaborate in coming weeks, so for now allow me to give a high-level overview of the issues before I delve into the technical details.
 
In my opinion, the personal computer industry is moving backwards in terms of what it set out to deliver in the 1980's. Take a look at where personal computers were 20 years ago and where they are today, and compare that to other technologies such as VCRs. 20 years ago, most personal computers, whether an Apple II or an Atari 400 or an IBM PC or an Apple Macintosh, booted up in under 5 seconds. On my Atari 400, for example, you press the power button and in about 2 seconds there is a nice blue screen with a READY prompt. Within seconds you can type in a simple computer program and run it, draw graphics, or be running a video game. I am still in awe of the engineers behind the technology in the Atari ST and Apple Macintosh of the mid-1980's. With just black and white displays and mere kilobytes of RAM, these machines were able to run graphical user interfaces not unlike what Windows or Mac OS offer today, were able to run word processing and paint programs not unlike what we use today, and even had entire integrated software development environments that edited, compiled, and debugged programs using less than 1 megabyte of memory. That is less memory than a typical JPG photo uses today.
 
During that same time period, VHS and Betamax video cassette recorders became popular despite being difficult to use. How many people just left the clock blinking "12:00" "12:00" "12:00" because it was hard to figure out how to set it? I remember having an early VCR with only two buttons for programming anything. One button selected a digit of the entry being programmed (a digit of the date, time, or channel number), and the other button set that digit. It was tedious to program in even one show to record. Since then VCRs have evolved to have remote controls with displays, allowing one to set up the programming entirely on the remote and then upload it in one shot to the VCR. Then came on-screen displays. Next, VCR Plus reduced the programming down to a few keystrokes. Today, we have hard-disk based DVR devices such as TiVo and set-top cable boxes that are trivially easy to program. I have no desire to ever go back to my old two-button VCR.
 
So why is it that 20 years later the typical personal computer takes 30 to 60 seconds just to boot up, that I need a gigabyte of memory just to compile and debug a simple C program, and that every day I am bombarded with security warnings telling me to download this patch and that patch or else risk having my computer hacked and my identity stolen? This isn't just a Windows issue, as most people believe. My "Apple Software Update" goes off every few days with another patch for Mac OS X as well. About every two days, one of my Macs dies mysteriously with a "You must reboot" message, not unlike a Windows "blue screen of death". Special purpose computers such as wireless phones can be hacked. I'm just waiting for my DVR to get hacked and to lose my unwatched season of The Office.
 
Personal computer users have somehow managed to become either desensitized or oblivious to the fact that they're getting the shaft from the industry. This really puzzles me and has been an ongoing source of frustration, which I've occasionally vented about on my Darek's Secrets blog (http://www.emulators.com/secrets.htm). Why do paying consumers actually play into the crisis?
 
For example, when I walk into the local Best Buy or Fry's computer store, most of the software being sold is not productivity software but rather the paranoia-driven garbage of anti-spam, anti-virus, and disk cleaning security software. I hate this kind of software, I hate that people profit from it, and I have advocated against it for years. Yet the industry of paranoia thrives and consumers remain complicit.
 
Hardware makers feed the paranoia, and their own wallets, by encouraging the practice of upgrading computers every 2 or 3 years. I still have old 486 laptops and Gateway 2000 desktop computers from the early 1990's that run just fine, even running Windows 98 or Windows XP. I usually read my email on a 500 MHz Pentium III machine that runs XP. To this day I do not own even a 3.0 GHz computer (a speed which hit the market for the Christmas 2002 shopping season) because I have no need for that clock speed, let alone 3.6 GHz or whatever artificial number will be pushed at us this Christmas season.
 
Add to the confusion the press, which frequently runs newspaper articles such as the Lee Gomes column that appeared a few days ago in the August 29, 2007 issue of the Wall Street Journal. Lee took it upon himself to open every piece of spam and visit every web site the spam directed him to, in the hunt for "zombies". He was literally trying to infect his PC, which is a pretty dumb thing to do if you're not actually an engineer trying to analyze the zombie code. After a few days, his computer was still running, and therefore he declared that spam is "not so scary after all". This is like me saying that I drove my car today without wearing a seatbelt and didn't die, therefore not wearing a seatbelt is not so bad. It is this pointless and reckless journalism that simply adds to consumer confusion. What Mr. Gomes failed to point out is that clever malware can hide itself from virus detection software, which is the primary reason I tell people not to waste their money on it. It was widely reported in 2001, for example (http://www.theregister.co.uk/2001/06/25/flaw_means_virus_could_disable/), how Norton Anti-Virus had a ridiculously stupid "off" switch which viruses could access. Or take the more recent news that Sony Music was putting a dangerous "rootkit" on its music CDs (http://www.pcpitstop.com/spycheck/sonyxcp.asp). Rootkits are very hard to detect, because by design they aim to fool the operating system into not detecting them.
 
Journalists who run these naive experiments to play down the severity of these issues are doing consumers a huge disservice and are spreading misinformation. These security issues are very real and do manifest themselves as real-world problems which affect global economies and real lives. It is not unlike serious issues in other industries, such as exploding airplanes and collapsing bridges. When a bridge collapses, people die. Heads roll. Steps are taken to prevent further bridge collapses, such as the recent bridge inspection frenzy here in the U.S. When an aircraft explodes, entire fleets of planes are grounded until the root cause is discovered and fixed. This happened recently when a Japanese airliner caught fire, it happened when the Concorde blew up, and it happened after each of the two space shuttle disasters. In each case, the root cause turned out to be some seemingly harmless item such as a small piece of metal or foam. In my experience, the vast majority of computer crashes and malware exploits are caused by very subtle technical errors deep down in the bowels of the computer code, which sadly most software developers don't know how to look for and most testers don't know how to find. Just as you wouldn't think that a loose bolt could take down an airplane, an innocent misplaced semicolon buried deep in the source code of an operating system can take down millions of computers.
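 
To illustrate just how innocent such a bug can look, here is a hypothetical C fragment (invented for this column, not taken from any real operating system) in which a single stray semicolon silently disables a buffer length check:

    #include <string.h>

    /* Hypothetical example of a one-character bug: the stray semicolon after
       the "if" ends the statement right there, so the length check guards
       nothing and the copy below it runs unconditionally. */
    void copy_packet(char *dest, size_t dest_size,
                     const char *src, size_t src_len)
    {
        if (src_len <= dest_size);       /* <-- the misplaced semicolon */
            memcpy(dest, src, src_len);  /* always executes, even when the
                                            source is larger than the buffer */
    }

At best a compiler warns about the empty "if" body, and a tester feeding the code normal-sized data will never see a failure; it takes a maliciously oversized input to turn that one character into a buffer overflow and, potentially, the next worm.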
 
 
It Is Time For Consumers To Demand Higher Standards
 
The one thing that differentiates the computer industry from other industries is the lack of accountability, even the lack of any certification one needs to call oneself a software engineer. Any bozo can slap together 10 lines of JavaScript for a web site, call himself a developer, and then get hired by Google. I've worked with such bozos at a number of companies. The tragedy is that millions of lines of buggy computer code out there were written by such bozos, and any one such line might be the next root cause of a Code Red or Blaster worm that takes down millions of machines. And then the world reboots, installs a patch, and keeps running on its buggy software. This is just an unacceptable way to keep going into the future.
 
It seems to me the only time the industry really stopped and cared was during the Y2K scare, when it was obvious that severe software failures could lead to loss of life. I wish the industry would be this vigilant all the time. Y2K was considered a dud because nothing happened. THAT'S THE POINT. When computers keep running instead of crashing, and nobody dies, and people aren't getting their files hacked and their information stolen, that's good. It is boring, and it doesn't make for great headlines in the newspapers, but that's what consumers should expect and demand.
 
This summer, while traveling through France on my way to see a Metallica concert, I had the good fortune of driving across the Millau Viaduct, a truly amazing marvel of civil engineering. I know that the viaduct (I keep wanting to call it a bridge; it sure looks like a bridge to me!) was built by competent engineers with years of experience. I know that probably every single nut and bolt was in place and secured, and that if a few were missed, the structure was over-engineered to handle it. And I know that it was built using tried and tested techniques used to build other bridges and viaducts. There are codes and standards that civil engineers must follow or risk losing their jobs and facing lawsuits. There is no room for hacking and last-minute changes.
 
Yet that is precisely how computer code is written and hardware is designed. There is a lot of tweaking and experimentation and flying by the seat of the pants. In fairness, computer engineering and software engineering are very young fields. They are not thousands of years old as is, say, civil engineering. I think of the computer industry as being at the same stage of development today as, say, the automobile or airplane industry was early in the 20th century. People died at horrific rates compared to today. There was no such thing as safety glass; you got your head sliced open instead. There were no seatbelts. Even running lights were optional at first. There weren't really even standards for roads, or traffic rules.
 
So here in the year 2007 we have an industry that is widely prevalent yet ships fairly fragile and buggy products. Multiple microprocessor vendors push competing yet incompatible architectures at us. Multiple OS vendors push competing and mostly incompatible operating systems at us. Application vendors create dozens of incompatible file formats, some of which even change silently every year. Looking back 21 years to when I had the 4 incompatible computer systems which drove me into the field of emulation, it is my opinion that the industry needs to stop focusing on short-term profits and thinking only about what it will ship this coming Christmas. I've heard this called "falling prey to short-term maximization", and it needs to stop. Just as other industries build products and standards that last for decades, the personal computer industry needs to lay down standards and a vision of the future for the next 50 years. To do anything less is immoral in my opinion.
 
The recent rush by AMD, IBM, Intel, and Sun to multi-core processors, 64-bit instruction set extensions, hardware virtualization, and other gimmicks is unfortunately not as well thought out as it may seem. These technologies scare the hell out of me, and they should worry you too. I've come to realize that some of these so-called "innovations" are actually incompatible with each other, and I wish the marketing bozos at these microprocessor companies would just shut up. Otherwise we are going to have another Pentium 4 fiasco, and I'll explain why in two weeks, including why AMD and Intel need to call a truce for the good of the whole industry.
 
In coming weeks I'll also take the "hype" out of "hypervisors", explain why Microsoft should dump the entire Windows programming model, and pose the question "with Gateway gone, should Dell worry?".
 
I would love to hear about your own personal computer horror stories and your opinions on what you read in this blog. Email me directly at darek@emulators.com or simply click on one of these two links to give me quick feedback:
 
Darek, keep writing, this is gravy!
 
Darek, shut up and go back to your Atari!
