NO EXECUTE! A weekly look at personal computer technology issues.
(C) 2007 by Darek Mihocka, founder, Emulators.com.
September 10 2007
Inventor Dean Kamen (http://www.brainyquote.com/quotes/authors/d/dean_kamen.html) said, "Everybody has to be able to participate in a future that they want to live for. That's what technology can do." Yet the television show 60 Minutes reminded us just last night (http://www.cbsnews.com/stories/2007/01/26/60minutes/main2401726.shtml), in a segment covering the ever-increasing frustration over the complexity of consumer electronics technology...
"The revolution is still a work in progress... Part of the problem, when it comes to computers at least, is that there are so many cooks for what you are using. Microsoft made the operating system, some company in Taiwan made the equipment, you're running software from a company in California, now you're installing the driver for a digital camera from a fourth company. What are the odds that all of these are going to work flawlessly together?... Zip... You get unhappy. You develop 'software rage'." - 60 Minutes, September 9 2007
A long time ago in a place not too far away, governments and companies did co-operate to make sure their products were usable for the long haul. The interstate highway system was built across America fifty years ago, and the German Autobahn decades before that, and both are still widely used today. One can drive a 1957 Chevy as easily on the interstate today as a 2007 Toyota. Despite having numerous competing manufacturers, the automobile industry itself has set standards and specifications allowing them to use common gasoline, tires, batteries, and accessories.
Corporations With Selfish Interests Must Not Control Standards
Are We Designing for 5 Years or 500 Years?
The Lessons of Office 97
An important test of any cross-platform file format such as Word, Excel, Adobe Acrobat, Photoshop, JPEG, MP3, etc. is the ability to "round trip" a file or document between several different applications or different versions of the same application. For example, compose a document in Mac Word 5.1, load it into Word 6.0, save it in Word 6.0 format, then load that into Mac Office 98, save it out in Office 97 format, and so on. Does the document still load, display, and print exactly as it was originally composed? As almost any Microsoft Office user knows, and as I have run into myself as I write documents both on the Mac and on Windows, Office does a terrible job of round-tripping documents. It is made even worse when Microsoft deliberately pulls that support back out, as it did in Office 97!
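To make the idea concrete, here is a minimal sketch of what an automated round-trip regression test could look like. The converter and renderer commands in it (doc_convert, doc_render) are purely hypothetical placeholders for whatever tools or application automation you would actually drive; the point is only that after a chain of conversions, the rendered output should compare equal to the original.

/* Hedged sketch of a round-trip regression test.
 * The doc_convert and doc_render commands below are hypothetical
 * placeholders, not real tools; substitute whatever automation you
 * use to drive the actual applications.                            */
#include <stdio.h>
#include <stdlib.h>

/* Return nonzero if the two files are not byte-for-byte identical. */
static int files_differ(const char *a, const char *b)
{
    FILE *fa = fopen(a, "rb"), *fb = fopen(b, "rb");
    int ca, cb, differ = 1;
    if (fa && fb) {
        do {
            ca = fgetc(fa);
            cb = fgetc(fb);
        } while (ca == cb && ca != EOF);
        differ = (ca != cb);
    }
    if (fa) fclose(fa);
    if (fb) fclose(fb);
    return differ;
}

int main(void)
{
    /* Hypothetical conversion chain: old format -> newer format -> back. */
    system("doc_convert --to=word60 original.doc step1.doc");
    system("doc_convert --to=word97 step1.doc    step2.doc");
    system("doc_convert --to=word51 step2.doc    roundtrip.doc");

    /* Render both documents and compare the renderings, since byte
     * equality of the files themselves is not the right test -
     * rendering fidelity is.                                        */
    system("doc_render original.doc  original.bmp");
    system("doc_render roundtrip.doc roundtrip.bmp");

    if (files_differ("original.bmp", "roundtrip.bmp")) {
        printf("FAIL: document did not survive the round trip\n");
        return 1;
    }
    printf("PASS: round trip preserved the rendered document\n");
    return 0;
}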
Microsoft itself flat out has a policy of supporting any given product for only about 5 years (http://www.microsoft.com/windows/lifecycle/default.mspx). Try to visit the Windows Update site from a Windows 98 or Windows 2000 machine and you are greeted by a message advising you to buy a new version of Windows. Windows XP is not immune: hundreds of millions of Windows XP users are about to be dropped within a few months. Unfortunately what replaced Windows XP, Windows Vista, is rather expensive, and many people consider it to be garbage since it dropped many of the features which Microsoft had advertised for "Longhorn" during its 5 years of development, such as WinFS. Many people have found that Windows Vista even has trouble copying large files across a home network and generally runs much slower than Windows XP. Not worth $400!
Those of you thinking that the answer is Adobe's Acrobat PDF format, think again. My personal experience is that Adobe Acrobat too suffers from round-trip problems. I purchased Acrobat 4.0 a long time ago and used it to create various documents. I also have even older PDF files, downloaded from various sites, containing the technical specifications which I use to design and test my emulators. I have found that as I upgrade to newer releases of Acrobat Reader (currently at version 8-something) I get rendering errors displaying the old documents. Worse, I have actually gotten error messages from Acrobat Reader telling me my PDF format is too old and that it is not even going to load the file. Thankfully, I can use some of the very emulators which I wrote to run old versions of Acrobat and read those documents. Is this what we expect governments and businesses to do?
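For what it's worth, the version a PDF document claims to conform to is sitting right in its first bytes: every PDF begins with a header line such as %PDF-1.3. The little sketch below is my own illustration (nothing from Adobe); it simply reads that header so you can tell how old a file claims to be before a newer reader refuses it.

/* Minimal sketch: read the %PDF-x.y header from a PDF file.
 * Every PDF starts with a line such as "%PDF-1.3"; the number is
 * the format version the file claims to conform to.              */
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    char header[16] = {0};
    FILE *f;

    if (argc < 2) {
        fprintf(stderr, "usage: %s file.pdf\n", argv[0]);
        return 1;
    }
    f = fopen(argv[1], "rb");
    if (!f) {
        perror("fopen");
        return 1;
    }
    if (fgets(header, sizeof(header), f) && strncmp(header, "%PDF-", 5) == 0)
        printf("%s declares PDF version %.3s\n", argv[1], header + 5);
    else
        printf("%s does not look like a PDF file\n", argv[1]);
    fclose(f);
    return 0;
}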
Adobe showed their moral side when they threatened legal action against Microsoft (http://www.pcworld.com/article/id,125960-page,1/article.html). It struck me as very odd and ironic that Adobe would attempt to squeeze money out of Microsoft for promoting its supposedly free and open PDF standard.
Not to pick on just Microsoft and Adobe: Apple also has a very long record of very short support cycles for its products. Mac OS releases get dumped about every 3 or 4 years. Mac OS 8 was launched to much fanfare in the fall of 1997, and was already dumped by 2001 when Mac OS X shipped. Apple did briefly support backward compatibility with Mac OS 9 but dropped that. Today, with Mac OS X "Leopard" 10.5 imminent, most of Apple's support is already limited to Mac OS X 10.3.9 and later. It is pretty much an annual $99 forced upgrade cycle for Mac OS, not unlike the forced upgrade cycle of Windows.
That is all I am going to say for now about the ethics and policies of certain corporations. I'll return to open source and Linux in a few weeks. Now, let's get our hands dirty with the technical issues that keep me awake at night, starting at the very bottom of the software stack: the x86 microprocessors which power the majority of the world's personal computers.
It was refreshing then that in 2005 Intel waved the white flag (http://www.eweek.com/article2/0,1895,1848394,00.asp) and conceded defeat over the Pentium 4's failed "Netburst" micro-architecture and the whole stupidity of the clock speed race. The Dark Ages were over as Netburst went Netbust.
In March 2005, I attended the Intel Developer Forum in San Francisco and sat through several days of presentations about new technologies that Intel was planning to bring to market, including:
64-bit and multi-core processors became a reality, first from competitor AMD, then went mainstream in February 2006 with Apple's new line of Intel-based computers, and are now pretty much standard on all computers. Both are already supported by Windows and Linux.
Vanderpool spurred a new age of competition in virtual machines, bringing about new and improved versions of VMware for the PC and the Mac, and competitors such as Xen.
Performance has never been better, with today's Core 2 based computers delivering 2 to 4 times the speed of Pentium 4 based computers on a per-core, per-clock basis. With all that extra speed I can't complain about battery life either, easily getting over 3 hours per charge out of my Apple MacBook even while running at full CPU speed, with wireless on, and with Windows running in a VMware virtual machine.
For one thing, Intel has started to remove references to the "PC 2015" or "Platform 2015" vision that it announced in March 2005. Dead links to http://www.intel.com/technology/architecture/platform2015/ still remain on the intel.com web site, so why is Intel backtracking on its grand vision? Is it that after 2.5 years, a quarter of the time to 2015 is already gone and we are still sitting at only 4 cores per CPU on the desktop and 2 cores in the laptop? Did Intel overshoot its vision?
Recently some serious bugs came to light concerning the Core 2, bugs which are causing real alarm in the developer community (http://www.theregister.co.uk/2007/06/28/core_2_duo_errata/). Is Intel pushing forward too quickly? Intel has already announced not one but two upgrades to the SSE instruction set, aptly named SSE 4.1 and SSE 4.2, which strikes me as very much like the near-sighted, marketing-driven evolution of the Office file format. Where is the 50 year plan, or even the 10 year plan, for the x86 instruction set, when in just the past 10 years developers and consumers have had no fewer than 9 versions of MMX and SSE thrust at them?
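To see what that proliferation means in practice, consider that every application wanting to use these instructions has to probe for each extension individually with the CPUID instruction. Here is a rough sketch using GCC/Clang's <cpuid.h> helper; the bit positions are those documented by Intel for CPUID leaf 1, but treat the listing as illustrative rather than exhaustive.

/* Sketch: enumerate the MMX/SSE generations a CPU reports via CPUID.
 * Build with GCC or Clang on x86; bit positions are from Intel's
 * CPUID documentation (leaf 1, EDX and ECX).                        */
#include <stdio.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    if (!__get_cpuid(1, &eax, &ebx, &ecx, &edx)) {
        printf("CPUID leaf 1 not supported\n");
        return 1;
    }

    printf("MMX    : %s\n", (edx >> 23) & 1 ? "yes" : "no");
    printf("SSE    : %s\n", (edx >> 25) & 1 ? "yes" : "no");
    printf("SSE2   : %s\n", (edx >> 26) & 1 ? "yes" : "no");
    printf("SSE3   : %s\n", (ecx >>  0) & 1 ? "yes" : "no");
    printf("SSSE3  : %s\n", (ecx >>  9) & 1 ? "yes" : "no");
    printf("SSE4.1 : %s\n", (ecx >> 19) & 1 ? "yes" : "no");
    printf("SSE4.2 : %s\n", (ecx >> 20) & 1 ? "yes" : "no");

    /* And this still omits 3DNow!, SSE4a, and whatever comes next -
     * each new extension adds yet another bit for software to check. */
    return 0;
}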
At the other extreme, AMD now sits at 8 years and counting without a new micro-architecture to replace that of the 1999 AMD Athlon. It has begun to take the Intel approach of pre-announcing technologies long before they are available, and in doing so is shooting itself in the foot. Why does the world need SSE5, and who is AMD to even define SSE standards?
After all the hype, when exactly can we expect Barcelona (http://multicore.amd.com/us-en/AMD-Multi-Core/Products/Barcelona.aspx)? This morning's September 10 2007 issue of the Wall Street Journal runs a quarter-page piece on page B5 about the AMD Barcelona, yet tells us nothing new - no actual release date, no list price, no performance figures. Another pointless piece of shallow journalism by the WSJ that should really have been labeled as advertising.
Why is AMD so arrogant as to call one recently announced useless technology "a gift" (http://www.amd.com/us-en/Corporate/VirtualPressRoom/)? Unless AMD is planning to send every developer in the world a free AMD processor, there is no "gift". And as I will explain next week, the technology they propose is completely unnecessary.
AMD has already mucked things up enough by introducing a number of technologies in the past few years which really served no benefit, including the pointless "NO EXECUTE" (NX) feature for which this blog is named, and the arbitrary and inexplicable removal of critical instructions from the x86 instruction set. Is it any wonder that AMD executives now seem to be quitting like rats fleeing a sinking ship? I had one amusing incident this summer where AMD first contacted me asking me to come work for them, and then just weeks later the very person who contacted me quit the company!
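For readers unfamiliar with it, NX simply lets the operating system mark a page of memory as non-executable, so that jumping into it faults instead of running whatever data happens to be there. Here is a minimal Linux-flavored sketch of the idea, my own illustration and nothing AMD-specific; it assumes a 4 KB page size and an NX-capable CPU running a 64-bit or PAE kernel.

/* Sketch: what the NX (no-execute) bit looks like from software.
 * A page mapped without PROT_EXEC cannot be executed on NX-capable
 * hardware - jumping into it raises a fault instead of running the
 * bytes as code. Assumes Linux and a 4 KB page size.               */
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>

int main(void)
{
    unsigned char code[] = { 0xC3 };   /* a single x86 RET instruction */

    /* Readable and writable, but deliberately NOT executable. */
    void *page = mmap(NULL, 4096, PROT_READ | PROT_WRITE,
                      MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
    if (page == MAP_FAILED) {
        perror("mmap");
        return 1;
    }
    memcpy(page, code, sizeof(code));

    printf("Calling into this page now would crash with SIGSEGV on NX hardware.\n");
    /* ((void (*)(void))page)();    <- uncommenting this is the experiment */

    /* Granting PROT_EXEC (as a JIT or an emulator must) makes it legal. */
    if (mprotect(page, 4096, PROT_READ | PROT_EXEC) == 0)
        ((void (*)(void))page)();    /* now returns harmlessly */

    munmap(page, 4096);
    printf("Done.\n");
    return 0;
}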
Don't even get me started (at least not until next Monday) about the whole "Vanderpool" thing and AMD's competing and incompatible "Pacifica" technology. Not only is it a mistake for companies like VMware and XenSource to bet the farm on this technology, but apparently I am the only person who realizes that virtualization hardware is not even compatible with the whole "Platform 2015" vision in the first place!
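As a small illustration of just how far apart the two camps are, even detecting hardware virtualization requires vendor-specific checks: Intel's VT-x ("Vanderpool") is reported in CPUID leaf 1, while AMD's AMD-V ("Pacifica") is reported in extended leaf 0x80000001, and the underlying VMX and SVM instruction sets share nothing. A hedged sketch of just the detection step, again using GCC/Clang's <cpuid.h> with bit positions from the vendor manuals:

/* Sketch: vendor-specific detection of hardware virtualization.
 * Intel VT-x ("Vanderpool"): CPUID leaf 1, ECX bit 5 (VMX).
 * AMD  AMD-V ("Pacifica"):   CPUID leaf 0x80000001, ECX bit 2 (SVM). */
#include <stdio.h>
#include <cpuid.h>

int main(void)
{
    unsigned int eax, ebx, ecx, edx;

    if (__get_cpuid(1, &eax, &ebx, &ecx, &edx) && ((ecx >> 5) & 1))
        printf("Intel VT-x (VMX) reported\n");

    if (__get_cpuid(0x80000001, &eax, &ebx, &ecx, &edx) && ((ecx >> 2) & 1))
        printf("AMD-V (SVM) reported\n");

    /* Detection is only the start: the VMX and SVM instruction sets
     * themselves (VMXON/VMLAUNCH vs. VMRUN, and their different
     * control structures) have nothing in common, so a VMM such as
     * VMware or Xen has to carry two separate code paths.           */
    return 0;
}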
There is a lot of detail to cover to explain the various issues I just listed. I will spend all of next week's blog focusing specifically on the technical issues afflicting this new battle brewing between AMD and Intel, as well as why both companies need to stop behaving in the immature and counter-productive manner that Sony and Microsoft do. You the consumer end up suffering as a result of the AMD-Intel war, suffering in terms of crashes and blue screens and virus infections that you did not deserve. Despite all the hype, not much improves.
The question I pose to you to think about this week is: given that the personal computer is now an integral part of the lives of about 1 billion paying customers, should AMD or even Intel be in control of the "Intel x86" specifications? Or, should the responsibility and planning for the future be handed over to a neutral standards body?
As before, I would love to hear from you. Email me directly at darek@emulators.com or simply click on one of these two links to give me quick feedback: