NO EXECUTE! A weekly look at personal computer technology issues.

(C) 2007 by Darek Mihocka, founder, Emulators.com.

September 10 2007


 
Consumers Are Losing
 
Imagine if George Lucas declared that old versions of Star Wars would no longer be distributed, television broadcasts and theatrical showings of the films would cease, and all existing Star Wars VHS tapes, laserdiscs, and DVDs would self-destruct 6 months from now. If you wished to keep enjoying Star Wars you would need to pay additional money for new digitally modified releases of Star Wars. You would also need to purchase a new video player due to the use of a new disc format. Oh, and likely in 10 years you would go through this whole mess again, paying new money to watch movies you've already paid for.
 
Seems absurd, right? This is a hypothetical money grab that I trust George Lucas will never stoop to in real life. Some aspects of this scenario are real, however. It is common practice to release multiple versions of a DVD - a stripped down original, then perhaps a director's cut, then some limited edition box set. Each time, we are asked to pay full price for content that we already own. I've lost count of how many releases of Lord Of The Rings and Blade Runner there have been, with more no doubt on the way. Thankfully, because of open standards, I am free to watch the original Star Wars trilogy on VHS tape on my 20 year old VHS player on a 20 year old Sony Trinitron television.
 
It is dangerous when a single profit-driven corporation or entity controls content, controls a standard, or controls a product line. The consumer electronics industry has discovered the cash cow of subscriptions and now frequently charges you a monthly fee where in the past it would simply have sold you a product. This is not limited to software or content; it extends to actual hardware as well. Who hasn't been locked into a 2 year contract when purchasing a cell phone, or cable TV service, or even Internet service? Even if you leave for an extended vacation and don't use the product for several months, you pay up or risk hefty early-termination fees.
 
My worst "tech blackmail" story happened in the fall of 2005. For five years I was a rabid ReplayTV DVR customer and supporter. I preferred it over Tivo. After being spammed by ReplayTV to upgrade my previous 80-hour ReplayTV 4000 unit to their latest 300-hour model 5500, I dutifully did so. When it arrived I plugged it in, used it, and was happy for 10 days. The user interface did nag me to purchase either a monthly subscription or a lifetime subscription, but I had already paid hundreds of dollars to ReplayTV for subscriptions and lifetime activations for my first 3 units. I intended to use the 5500 for exactly what ReplayTV had advertised for 5 years - a tapeless digital VCR - to record daily showings of The Simpsons. Same channel, same time every day. I was fully prepared to program the unit manually as a digital VCR in order to fill up a hard disk with every episode of The Simpsons. Every previous ReplayTV model I owned allowed me to do this without an activation.
 
I was shocked to discover that after 10 days the ReplayTV 5500 simply stopped working, displaying only one screen of text telling me to call a number and activate my unit. I did call, and found out that the folks at ReplayTV had pulled a bait-and-switch: they were no longer selling the unit as just a tapeless VCR. Instead, it was now mandatory to either subscribe monthly or pay for the lifetime subscription. The unit was crippled, an upgrade that was less than what I had upgraded from! It was not possible to use it as a manual VCR or even to watch live TV, period! Literally, the $300 piece of hardware that I had paid for 2 weeks earlier was already utterly useless, electronically locked out by a 10-day "time bomb". After 5 years of being a loyal customer and supporter, I sold off my ReplayTV units, gutted the 5500 for its 300 GB SATA hard disk, and switched to Windows Media Center. ReplayTV conned me (and no doubt other loyal customers) out of $300, and I hope that ReplayTV goes bankrupt as soon as possible.
 
These tactics would be unacceptable in other industries. Imagine if gasoline were reformulated every year, requiring all motorists to purchase a new car every 12 months. Or if you had to "subscribe" to a specific brand of gasoline whether you drove your car or not. This seems absurd, so why do electronics and computer companies do exactly that?
 
Beyond how insulting this is to the consumer, these types of actions merely perpetuate the west's image as a disposable, materialistic society, always greedy for new things and dumping its technological garbage on other countries. I find it hypocritical that Americans are so up in arms over lead-based paint in toys manufactured in China, when North America exports tons of its polluted garbage to countries such as China. China is choking in its pollution and poverty, yet Americans demand clean toys and a place to dump their old computers and cell phones. From an ethical and moral point of view, disposing of technology every few years to replace it with something newer and more complex for technology's sake is not a sustainable road to the future.

Inventor Dean Kamen (http://www.brainyquote.com/quotes/authors/d/dean_kamen.html) said, "Everybody has to be able to participate in a future that they want to live for. That's what technology can do." Yet as the television show 60 Minutes reminded us just last night (http://www.cbsnews.com/stories/2007/01/26/60minutes/main2401726.shtml) when it covered the ever-increasing frustration over the complexity of consumer electronics technology...

"The revolution is still a work in progress... Part of the problem, when it comes to computers at least, is that there are so many cooks for what you are using. Microsoft made the operating system, some company in Taiwan made the equipment, you're running software from a company in California, now you're installing the driver for a digital camera from a fourth company. What are the odds that all of these are going to work flawlessly together?... Zip... You get unhappy. You develop 'software rage'." - 60 Minutes, September 9 2007


 
Standards Need To Be Forward-Looking
 
The mess that is today's personal computer industry was avoidable, just as the Blu-ray vs. HD-DVD mess was avoidable.

A long time ago in a place not too far away, governments and companies did co-operate to make sure their products were usable for the long haul. The interstate highway system was built across America fifty years ago, and the German Autobahn decades before that, and both are still widely used today. One can drive a 1957 Chevy as easily on the interstate today as a 2007 Toyota. Despite having numerous competing manufacturers, the automobile industry has set standards and specifications allowing cars to use common gasoline, tires, batteries, and accessories.

 
A tinkerer from an early age, I started fixing televisions and stereo equipment when I was about 12 years old. Friends and neighbors would donate "dead" televisions and amplifiers. I would then go to the public library, find the specific schematics for the product I was repairing, go to Radio Shack and purchase the necessary replacement vacuum tubes or transistors, and with a high rate of success get the product repaired. I could do this because manufacturers used off-the-shelf parts, they documented their hardware in the form of parts lists and schematics, and I didn't need to worry about any crazy Digital Millennium Copyright Act that would land me in jail for daring to disassemble a television set.
 
Analog television is a great example of something that was designed to last. The NTSC video standard used to broadcast television in America was first developed in the 1950's. As is detailed in this wiki (http://en.wikipedia.org/wiki/NTSC), multiple standards were put forth and companies competed with each other to set one standard. Once that standard was defined they all used it, even if it meant dropping their own proprietary formats already in use. Doing so was good for the consumer.
 
The result of this 50-year-old standard is 50 years of compatibility between different devices from different companies. It is this 50 years of compatibility which allows me to still connect my ancient Atari 800 computer to the exact same modern flat screen television that I connect my Playstation 3 to. I do not need to buy a new television each time I buy a new game console. I can similarly use the same television to watch Betamax tapes on my obsolete Betamax video player. Short of mechanical failure, I don't have to worry that I will wake up tomorrow and my collection of video games and videos will no longer be usable.
 
Part of the longevity of NTSC is that the original signal was designed to use 6 MHz of bandwidth, of which at the time only about 3 MHz was really needed to broadcast the blurry black and white signal on the television sets of the day. The standard had some headroom built into it to make it somewhat "future proof". Numerous updates were made over the past 50 years - to add color to the video signal, to add stereo sound, to add subtitled text for the deaf, etc. - yet the standard has remained both backward compatible with older televisions and future-proof enough to allow new features to be added without making existing television sets unusable.
 
NTSC's legacy actually touches a lot more than just television. The master clock frequency that NTSC is based around is 3579545 Hz, or roughly 3.58 MHz. This clock frequency is almost a fundamental constant of physics, as it has over the past 5 decades worked itself into all sorts of other specifications. For example, the original IBM PC used the 3.58 MHz clock signal to drive both the high resolution timer (1.193 MHz, or 1/3 of 3.58 MHz) and the 8088 CPU (4.77 MHz, which is 4 times 1.193). The same 1.193 MHz frequency is further subdivided by a factor of 2^16 (65536) to give the 18.2 Hz BIOS timer frequency. A couple of years earlier, the Atari 400/800 used a CPU clock speed of 1.79 MHz (half of 3.58 MHz). In the 1980's, the early Macintosh models were clocked at slightly over 7.1 MHz (which is twice 3.58 MHz) and not 8 MHz as commonly believed. Even today, many PCs have a high resolution timer running at 14.318 MHz (which is 4 times 3.58 MHz). These clock frequencies have determined how fast various personal computers have run over the years, they've played a part in determining the graphics resolutions used by video cards, and they play a part in the very accurate timings needed by today's operating systems, from the length of a time slice to the synchronization of video and sound.
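For the curious, every one of these derived frequencies falls out of that one constant. Here is a minimal C sketch (my own illustration, not anyone's official reference) that computes them from the NTSC color subcarrier using the divisor ratios described above:

    /* ntsc_clocks.c - derive the classic PC clock frequencies from the
       NTSC color subcarrier, as described above. Build: cc ntsc_clocks.c */
    #include <stdio.h>

    int main(void)
    {
        const double ntsc    = 3579545.0;      /* NTSC color subcarrier, Hz */
        const double crystal = ntsc * 4.0;     /* 14.31818 MHz PC crystal   */
        const double pit     = crystal / 12.0; /* 1.193 MHz high res timer  */
        const double cpu8088 = pit * 4.0;      /* 4.77 MHz original PC CPU  */
        const double bios    = pit / 65536.0;  /* 18.2 Hz BIOS timer tick   */
        const double atari   = ntsc / 2.0;     /* 1.79 MHz Atari 400/800    */

        printf("PC crystal = %11.2f Hz\n", crystal);
        printf("PIT timer  = %11.2f Hz\n", pit);
        printf("8088 CPU   = %11.2f Hz\n", cpu8088);
        printf("BIOS tick  = %11.4f Hz\n", bios);
        printf("Atari CPU  = %11.2f Hz\n", atari);
        return 0;
    }

Run it and all the familiar numbers tumble out of that single 3.58 MHz constant.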
 
Even the relative dimensions of a television screen - the 4 by 3 aspect ratio - come from an arbitrary decision made by Thomas Edison (http://www.gizmohighway.com/hifi/wide_screen_tv.htm) which very much affects us today, more than a hundred years later. It is why screen resolutions on computer monitors have always tended to be 4 by 3 (think 640x480, 1024x768, 1600x1200, etc). The moral here is that what may seem like a simple and arbitrary design decision can easily have repercussions and side effects 50 or even 100 years after the fact.
 

Corporations With Selfish Interests Must Not Control Standards

 
Apple, Microsoft, and Sony are three examples of overly ambitious companies that have their fingers in a lot of markets and try to gain vertical control of those markets at every level. Sony for example is both a music studio that produces albums and a manufacturer and distributor of music CDs. It is both a movie studio and a manufacturer and distributor of movies. Sony makes computers and audio equipment and Playstation game consoles. Recently it even got into the banking business by issuing its own credit card and lines of credit. Owning dozens of Sony products, I am a card-carrying Sony customer, but I feel fleeced by the $19 per $100 late fees and 26.99% interest rate.
 
Apple of course went from being a niche computer manufacturer to being in bed with a movie studio, running an online music and video store, selling iPod players to play that content on, and now getting closer and closer into bed with Google. Is an Apple-Google merger imminent? I hope not, as I have a horror story about Google which I'll discuss some other time.
 
My former employer Microsoft is similarly expanding from being just a software company into the businesses of video game consoles, video game development, music and video distribution, competing with the iPod, and trying to get its fingers into just about every standard there is, from file formats to computer languages.
 
These are very worrying trends, as Sony, Apple, and Microsoft play a game of "winner take all". Some of us still remember the videotape wars between VHS and Betamax (http://en.wikipedia.org/wiki/Videotape_format_war). While most of the consumer electronics industry backed the VHS standard, Sony was pushing its own proprietary Betamax format, just as it still does today in pushing its Memory Stick, MiniDisc, and other Sony-only formats. Sony ultimately lost that battle and itself switched to manufacturing VHS players.
 
In the 1980's there were similarly multiple design proposals for the laser audio disc, which ultimately became the "Red Book" audio CD standard used for the past 25 years (http://en.wikipedia.org/wiki/Red_Book_%28audio_CD_standard%29). A format war was averted for music CDs as this time Sony went along with the industry.
 
10 years ago in 1997 another format war was brewing for the video equivalent of the CD, which we now all know and love as the DVD. I remember that months before the Christmas 1997 shopping season there were still disagreements over the DVD video format that threatened to delay or even derail the launch of DVD. While that crisis was averted, the DVD format did splinter for non-movie uses, resulting in various formats such as DVD-RAM (used by Apple in the PowerMac G4), DVD-R, DVD+R, and others (http://www.videoguys.com/DVDformat.html). What consumer is not confused by the various DVD format choices?
 
Quite stupidly, the industry has not learned from its past battles, and once again in 2007 consumers are faced with the choice between HD-DVD and Blu-ray. On one side you have Microsoft-backed HD-DVD, supported by Microsoft's Xbox 360 and some manufacturers. On the other, Sony's Blu-ray, supported by the Sony Playstation 3 and the remaining manufacturers. There aren't vast differences between the two formats, yet be it ego, or the need to control the universe, or what have you, the companies simply refused to get together and compromise on a single solution for consumers.
 
Recently, while shopping for the movie "300", my girlfriend and I were faced with the daunting decision of which of the three formats to purchase. Should we buy "300" in the "lowest common denominator" format - DVD - for $13? Or on HD-DVD to watch on the Xbox 360 for $20? We finally went with the Blu-ray version since we use the Playstation 3 the most, unfortunately putting another $20 into Sony's pocket.
 
This kind of consumer confusion continues with other formats. Should music be stored in the widely adopted MP3 format, or Microsoft's proprietary WMA, or Apple's own funky format that seems to require a new iTunes upgrade every few months? Apple iTunes Store vs. Rhapsody? Should manuals be posted in PDF format or Microsoft Word format? In many of these format standards battles it is always one of these three companies - Apple, Microsoft, or Sony - that is the stick in the mud.
 

Are We Designing for 5 Years or 500 Years?

 
The point I'm getting at is how the storage and transmission of data, music, and video has slowly been hijacked, moving from open formats such as, well, ink and paper, to proprietary formats which are dictated to us by corporations looking to grab as much money as possible from us. This trend should worry every citizen of the planet Earth. Imagine if ancient people, whether cavemen or Incas or Egyptians or Romans, had not written things down on stone or on paper. Imagine if all the centuries of literature and knowledge before us were lost. Or worse, owned by some corporation which kept the information locked up. Luckily we do not live in such a world... yet. Over the centuries man has written down his ideas on paper and in stone, and that record survives, such that today in 2007 we can read the works of Shakespeare, learn about Julius Caesar, view cave paintings made thousands of years ago, and in doing so piece together mankind's history and evolution.
 
I have friends who collect National Geographic, with some issues dating back to the 19th century. I have collected various computer, electronics, and science magazines since the 1970's. I have issues of Scientific American which discuss the wonders of computers that will one day play chess! Yes, play chess! I am free to read my collection of magazines at any time without cost. I do not have to pay anyone to read magazines that I already paid for more than 3 decades ago. I do not need a special device to read those magazines. I do not need to pay a monthly subscription fee for the right to retain those magazines. If all goes well, perhaps 500 years from now somebody will find my magazines and learn about the life and technology of the late 20th century.
 
I can't say the same about digital files, whether music or video or word processor documents. Will I even have working hardware to watch the "300" Blu-ray disc in 10 years? How forward-looking are today's corporations?
 
Here in the United States, only after 50 years is the NTSC standard being replaced by a new standard for digital and high definition television. For almost a decade the new standard has been phased into new televisions and video hardware, such that in February 2009, when analog broadcasts of NTSC signals cease, millions of people will continue to have television. It is because of the co-operation of competing manufacturers that this transition will be fairly painless. Only a small percentage of people, mainly those living in rural areas, will lose TV coverage and need to upgrade their hardware. 300 million people will not all of a sudden lose television coverage and be forced to buy a new television immediately.
 
Why then do we allow the personal computer space to be dictated by arbitrary standards and forced upgrades?

The Lessons of Office 97

 
I worked at Microsoft at a time when the company had pretty much taken the market away from Lotus 1-2-3, Word Perfect, and dBase. By the mid-1990's, Microsoft all but owned the word processing and spreadsheet spaces. And with that control it regularly modified the file formats for applications such as Word and Excel, and even Outlook and PowerPoint. The Office file formats change just about every 2 or 3 years, and I have lost track of how many unique Office file formats there have been.
 
After owning over 90% of the word processor and spreadsheet market, they apparently started to assume that all their customers would just blindly upgrade Microsoft Office every year. Case in point: the original release of Word 97 had the ability to import Word 95 documents but not to write them back out. Users had the option to save their documents in the new Word 97 format, or use the "Save As" option to save the document in a bloated and slow text format called "Rich Text Format" (RTF) which usually loses some formatting information in the conversion process. The decision not to include the ability to save in the previous file format saved some development and testing time I'm sure, but it ultimately backfired for Microsoft and for its customers.
 
As their corporate customers evaluated Office 97, they chose not to upgrade en masse, because what usually happens in a large company is that perhaps 10% of its computers get upgraded to a new product right away while the remaining 90% are upgraded gradually over a long time. This saves money and minimizes the risk that the business would be disrupted by switching products overnight. As they did this gradual upgrade, companies soon discovered that employees using Word 97 were creating documents that could no longer be imported by employees running Word 95. The workaround of using RTF both bloated the file sizes and lost information, forcing people to constantly reformat their documents.

An important test of any cross-platform file format such as Word, Excel, Adobe Acrobat, Photoshop, JPEG, MP3, etc. is the ability to "round trip" a file or document between several different applications or different versions of the same application. For example, compose a document in Mac Word 5.1, load it into Word 6.0, save it in Word 6.0 format, then load that into Mac Office 98, save it out in Office 97 format, and so on. Does the document still load, display, and print exactly as it was originally composed? As almost any Microsoft Office user knows, and as I have run into myself writing documents both on the Mac and on Windows, Office does a terrible job of round-tripping documents. It is made worse when Microsoft deliberately pulls out that support, as it did in Office 97!
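This kind of round-trip testing is easy to automate, which makes the lack of it all the more inexcusable. Below is a minimal sketch in C of such a test; the doc2rtf and rtf2doc converter commands are hypothetical placeholders for whatever export/import path is actually under test:

    /* round_trip.c - a minimal round-trip test sketch. The converter
       commands below are hypothetical placeholders; substitute the real
       import/export tools for the format being tested. */
    #include <stdio.h>
    #include <stdlib.h>

    /* Return 0 if the two files are byte-for-byte identical. */
    static int files_differ(const char *a, const char *b)
    {
        FILE *fa = fopen(a, "rb"), *fb = fopen(b, "rb");
        int ca, cb, differ = 1;
        if (fa && fb) {
            do { ca = getc(fa); cb = getc(fb); } while (ca == cb && ca != EOF);
            differ = (ca != cb);
        }
        if (fa) fclose(fa);
        if (fb) fclose(fb);
        return differ;
    }

    int main(void)
    {
        /* Convert out of the native format and back again. */
        if (system("doc2rtf original.doc temp.rtf") != 0) return 1;
        if (system("rtf2doc temp.rtf roundtrip.doc") != 0) return 1;

        if (files_differ("original.doc", "roundtrip.doc"))
            printf("round trip FAILED: documents differ\n");
        else
            printf("round trip OK\n");
        return 0;
    }

In practice a byte-for-byte comparison is too strict - most document formats embed timestamps and other incidental differences - so a real test harness would compare rendered pages or extracted text instead, but the skeleton is the same.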

 
After a wave of bad press and customer pressure, Microsoft issued the "Office 97 Service Release 1" patch, the primary "feature" of which was the ability to save in the previous file formats (http://office.microsoft.com/en-us/help/HA010449921033.aspx). Given that the Office file formats have continued to change numerous times since 1997, it seems to me that Office management only thinks ahead 6 or 12 months, just enough to think about next year's profits and how to sell yet another Office upgrade. I am therefore very skeptical of their intention to now push their document formats through as an international standard.
 
Everybody should be familiar by now with the fiasco between Microsoft and various governments over the issue of file formats (http://www.betanews.com/article/Massachusetts_Declares_Office_Open_XML_Suitable_Format/1183407051). This is not an issue to be taken lightly. How do we as a society plan to realize the vision of "The Paperless Office" if the electronic file formats we use change every year and become unreadable after 10 or 20 years? Governments are right to be thinking about this issue for the long term, to be thinking 20, 50, 100 or more years ahead. Microsoft is currently losing its file format battle (http://www.regdeveloper.co.uk/2007/09/05/microsoft_ooxml_defeat/) but who knows when lobbying efforts on its part will swing the pendulum the other way.

Microsoft itself flat out has a policy of supporting any given product for only about 5 years (http://www.microsoft.com/windows/lifecycle/default.mspx). Try going to the Windows Update site from a Windows 98 or Windows 2000 machine and you are greeted by a message advising you to buy a new version of Windows. Windows XP is not immune: hundreds of millions of Windows XP users are about to be dropped within a few months. Unfortunately what replaced Windows XP, Windows Vista, is rather expensive, and people consider it to be garbage since it dumped many of the features which Microsoft had advertised for "Longhorn" during its 5 years of development, such as WinFS. Many people have found that Windows Vista even has trouble copying large files across a home network and generally runs much slower than Windows XP. Not worth $400!

Those of you thinking that the answer is Adobe's Acrobat PDF format, think again. My personal experience with Adobe Acrobat is that it too suffers from round-trip problems. I purchased Acrobat 4.0 a long time ago and used it to create various documents. I also have even older PDF files, downloaded from various sites, containing technical specifications that I use to design and test my emulators. I have found that as I upgrade to newer releases of Acrobat Reader (currently at version 8 something) I get rendering errors displaying the old documents. Or worse, I've actually gotten error messages from Acrobat Reader telling me my PDF format is too old and that it is not even going to load the file. Thankfully, I can use some of the very emulators which I wrote to run old versions of Acrobat to read those documents. Is this what we expect governments and businesses to do?

Adobe showed its true colors when it threatened legal action against Microsoft (http://www.pcworld.com/article/id,125960-page,1/article.html). It struck me as very odd and ironic that Adobe would attempt to squeeze money out of Microsoft for promoting its supposedly free and open PDF standard.

Not to pick on just Microsoft and Adobe: Apple also has a very long record of very short support cycles for its products. Mac OS releases get dumped about every 3 or 4 years. Mac OS 8 was launched to much fanfare in the fall of 1997 and was already dumped by 2001 when Mac OS X shipped. Apple did briefly support backward compatibility with Mac OS 9 but dropped that. Today, with Mac OS X "Leopard" 10.5 imminent, most of Apple's support is already limited to Mac OS X 10.3.9 and later. It's pretty much an annual $99 forced upgrade cycle for Mac OS, not unlike the forced upgrade cycle of Windows.

 
If we are to truly eliminate paper as the standard by which documents are archived, then we must not allow our future to rest in the hands of a Bill Gates or a Steve Jobs or any other person or corporation. Where is the NTSC-style competition between companies to develop standards for the mutual benefit of consumers? Instead we have companies such as Apple, Microsoft, and Sony simply shoving their format-of-the-day, their standard-of-the-day, down our throats. You want a certain movie in high definition? Then you may be forced to buy it only on Blu-ray. When Sun introduced the Java computer language in the 1990's, Microsoft decided it would develop an almost identical language called "C#" and declare it a standard. Why not just support Java? These multi-billion dollar corporations are behaving like 2-year-old children!
 
The future does not look bright for consumers and governments unless they demand real standards that will last for decades or centuries. Perhaps it may even be necessary for governments to step in and legislate to stop these companies from behaving the way they do. Or perhaps, as I am coming to realize, the world truly should adopt open source software and the Linux operating system as the basis on which the economy runs. I am seeing the moral benefits of open source, and I for one have committed to porting my emulation software from Microsoft-only platforms over to Mac OS as well as to the open source Linux world.

That is all I am going to say for now about the ethics and policies of certain corporations. I'll return to open source and Linux in a few weeks. Now, let's get our hands dirty with the technical issues that keep me awake at night, starting at the very bottom of the software stack: the x86 microprocessors which power the majority of the world's personal computers.


 
 
One matter that is very near and dear to my heart is microprocessors. Having taken Computer Engineering at university, I love to understand not only how microprocessors work but also the design motivations behind them, since that helps me write better, faster code. I've spent over 20 years now simulating various microprocessors in software in various emulators. And when I see things done poorly, done more for marketing reasons and profit than to advance the state of the art, I raise a big stink about it (http://www.emulators.com/docs/pentium_1.htm).
 
After the launch of the ill-fated Pentium 4 in 2000, the personal computer industry fell into a rather boring pattern of one-upmanship, with AMD and Intel battling each other in a futile marketing game of ever-increasing CPU clock speeds. Memory makers engaged in a similar battle of memory speeds and standards (SDRAM vs. RAMBUS vs. DDR). Video card manufacturers battled each other over frame rates and 3-D polygons until ultimately the field was reduced to just ATI and NVIDIA.
 
Looking back at what I consider the "Dark Ages" of the personal computer desktop, the years 2000 through 2005, one can see a lot of marketing driven product releases but very little innovation. I've argued for years, and still believe today, that the average consumer does not need more than a Pentium III based system similar to what we all used back in 1999: a 500 MHz system with maybe a few hundred megabytes of RAM and a 40 gigabyte hard drive, capable of email, web surfing, listening to music, etc. Of course the manufacturers have conspired against us such that it is not even possible to buy hardware with those specs, hard drives having long since jumped into the hundreds of gigabytes range (even on iPods now), memory going into the gigabytes, and the slowest CPU speed even on battery power being about 1000 MHz.
 
Were we really better served in 2005 by laptops that ran so fast and so hot that they caught fire than by laptops from 1999 that used less power and even ran longer on one battery charge?

It was refreshing then that in 2005, Intel waved the white flag (http://www.eweek.com/article2/0,1895,1848394,00.asp) and conceded defeat over the Pentium 4's failed "NetBurst" micro-architecture and the whole stupidity of the clock speed race. The Dark Ages were over as NetBurst went NetBust.

In March 2005, I attended the Intel Developer Forum in San Francisco and sat through several days of presentations about new technologies that Intel was planning to bring to market, including WiMAX wireless broadband, 64-bit multi-core processors, "Vanderpool" hardware virtualization, and the long-range "Platform 2015" roadmap.

Since then, WiMAX has become a reality with providers such as ClearWire (http://www.clearwire.com/) allowing people to dump their DSL and Comcast cable in favor of a less expensive WiMAX service that covers an entire city wirelessly.

64-bit multi-core processors became a reality, first from competitor AMD, then going mainstream in February 2006 in Apple's new product line of Intel based computers, and are now pretty much standard on all computers. Both Windows and Linux already support 64-bit and multi-core.

Vanderpool spurred a new age of competition in virtual machines, bringing about new and improved versions of VMware for the PC and the Mac, and competitors such as Xen.

Performance has never been better, with today's Core 2 based computers delivering 2 to 4 times the speed of Pentium 4 based computers on a per-core, per-clock basis. Even with all that speed I can't complain about battery life, easily getting over 3 hours per charge out of my Apple MacBook while running at full CPU speed, with wireless on, and with Windows running in a VMware virtual machine.

 
The future is bright... right? Well, unfortunately there are a lot of worrying storm clouds on the horizon that keep me awake at night.

For one thing, Intel has started to remove references to the "PC 2015" or "Platform 2015" vision that it announced in March 2005. The intel.com web site still has dead links to http://www.intel.com/technology/architecture/platform2015/, so why is Intel backtracking on its grand vision? Is it that after 2.5 years, 1/4 of the time to the 2015 vision is already gone and we are still sitting at only 4 cores per CPU on the desktop and 2 cores in the laptop? Did Intel overshoot its vision?

Recently some serious bugs came to light concerning the Core 2, bugs which are causing alarm in the developer community (http://www.theregister.co.uk/2007/06/28/core_2_duo_errata/). Is Intel pushing forward too quickly? Not one but two upgrades to the SSE instruction set have already been announced, aptly named SSE4.1 and SSE4.2, which strikes me as very much like the near-sighted, marketing-driven evolution of the Office file formats. Where is the 50 year plan, or even the 10 year plan, for the x86 instruction set when in just the past 10 years developers and consumers have had no fewer than 9 versions of MMX and SSE thrust their way?
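To see what this churn means for developers, consider that any program wanting to use these instructions must probe for each generation separately at run time. Here is a minimal sketch using the CPUID instruction (GCC inline assembly for x86; the feature bits are the documented ones for CPUID leaf 1):

    /* sse_probe.c - enumerate the MMX/SSE generations a CPU supports,
       using CPUID leaf 1. Build with GCC on an x86 machine. */
    #include <stdio.h>

    static void cpuid(unsigned leaf, unsigned *eax, unsigned *ebx,
                      unsigned *ecx, unsigned *edx)
    {
        __asm__ volatile("cpuid"
                         : "=a"(*eax), "=b"(*ebx), "=c"(*ecx), "=d"(*edx)
                         : "a"(leaf));
    }

    int main(void)
    {
        unsigned a, b, c, d;
        cpuid(1, &a, &b, &c, &d);   /* leaf 1: processor feature bits */

        printf("MMX    %s\n", (d >> 23) & 1 ? "yes" : "no");
        printf("SSE    %s\n", (d >> 25) & 1 ? "yes" : "no");
        printf("SSE2   %s\n", (d >> 26) & 1 ? "yes" : "no");
        printf("SSE3   %s\n", (c >>  0) & 1 ? "yes" : "no");
        printf("SSSE3  %s\n", (c >>  9) & 1 ? "yes" : "no");
        printf("SSE4.1 %s\n", (c >> 19) & 1 ? "yes" : "no");
        printf("SSE4.2 %s\n", (c >> 20) & 1 ? "yes" : "no");
        return 0;
    }

And this still isn't the whole story: AMD-only extensions such as 3DNow! and SSE4a report through a different CPUID leaf (0x80000001) entirely, so the probing code grows with every new variant a vendor dreams up.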

At the other extreme, AMD now sits at 8 years and counting without a new micro-architecture to replace that of the 1999 AMD Athlon, and it has begun to take the old Intel approach of pre-announcing technologies long before they are available, shooting itself in the foot in the process. Why does the world need SSE5, and who is AMD to even define SSE standards?

After all the hype, when exactly can we expect Barcelona (http://multicore.amd.com/us-en/AMD-Multi-Core/Products/Barcelona.aspx)? This morning's September 10 2007 issue of the Wall Street Journal runs a quarter-page piece on page B5 about the AMD Barcelona, yet tells us nothing new - no actual release date, no list price, no performance figures. Another pointless piece of shallow journalism by the WSJ that should really have been labeled as advertising.

Why is AMD so arrogant as to call one recently announced useless technology "a gift" (http://www.amd.com/us-en/Corporate/VirtualPressRoom/)? Unless AMD is planning to send every developer in the world a free AMD processor, there is no "gift". And as I will explain next week, the technology they propose is completely unnecessary.

AMD has already mucked things up enough by introducing a number of technologies in the past few years which really served no benefit, including the pointless "NO EXECUTE" (NX) feature for which this blog is named, and the arbitrary and inexplicable removal of critical instructions from the x86 instruction set. Is it any wonder that AMD executives now seem to be quitting like rats fleeing a sinking ship? I had one amusing incident this summer where AMD contacted me asking me to come work for them, and then just weeks later the very person who contacted me quit the company!

Don't even get me started (at least not until next Monday) about the whole "Vanderpool" thing and AMD's competing and incompatible "Pacifica" technology. Not only is it a mistake for companies like VMware and XenSource to bet the farm on this technology, but I seem to be the only person who realizes that virtualization hardware is not even compatible with the whole "Platform 2015" vision in the first place!

There is a lot of detail to cover to explain the issues I just listed. I will spend all of next week's blog focusing specifically on the technical issues afflicting this new battle brewing between AMD and Intel, as well as why both companies need to stop behaving in the immature and counter-productive manner that Sony and Microsoft do. You the consumer end up suffering as a result of the AMD-Intel war, suffering crashes and blue screens and virus infections that you did not deserve. Despite all the hype, not much improves.

The question I pose to you to think about this week is: given that the personal computer is now an integral part of the lives of about 1 billion paying customers, should AMD or even Intel be in control of the "Intel x86" specifications? Or, should the responsibility and planning for the future be handed over to a neutral standards body?

As before, I would love to hear from you. Email me directly at darek@emulators.com or simply click on one of these two links to give me quick feedback:

 
Darek, keep writing, this is gravy!
 
Darek, shut up and go back to your Atari!
