During his keynote address at the Worldwide Developers Conference (WWDC) 2003 in San Francisco earlier this week, Apple CEO Steve Jobs told the excitable audience that the company's new PowerMac G5 systems--due in August--would not just match but surpass the performance of similarly equipped high-end Pentium 4 and Xeon-based PCs from companies such as Dell. "We are delivering today the world's fastest personal computer," Jobs said of the G5 during the keynote, though the systems won't ship for two months. Apple has made similar performance claims over the years, but it has always relied on rather spurious evidence, such as hand-picked benchmarks that highlighted specific strengths of the PowerPC platform. However, this time, Jobs touted a number of so-called industry standard benchmarks from Veritest which reputedly backed up Apple's claims. Has the Mac really surpassed the PC, after years of lagging behind?
Sadly, Apple's claims are as questionable as ever, but what's astonishing is how quickly the truth has come out. Almost immediately after the keynote, while Mac fanatics worldwide continued chortling over their perceived victory, people around the Web began looking into the benchmarks Apple used to prove the G5's prowess. Predictably, things aren't as simple as Apple's followers would like to believe. More alarmingly, even dual-processor G5 machines still don't match the processing power of a single-processor Pentium 4 system, contrary to what Apple announced Monday.
Here's why: In a bit of classic benchmark trickery, Apple's systems were highly tuned in non-standard ways that would produce the best scores on specific benchmarks. Meanwhile, the PCs pitted against the G5 were saddled with generic tools. Furthermore, advanced Pentium 4 features such as Hyper-Threading were actually turned off, artificially lowering that system's scores. What's most interesting is that Veritest actually has results for various Pentium 4 systems in which these features are enabled. Guess which system, Mac or PC, comes out ahead when those results are used.
"Apple's test results are invalidated by severely lopsided testing conditions," InfoWorld's Tom Yager writes in his Web log. "Among them, Apple used a prototype G5 running its special GNU compiler and an unreleased version of OS X. The Dells used shipping hardware, vanilla GNU compilers and Red Hat 9. None of this would be a problem if Apple and Veritest didn't claim the tests were objective. An apples-to-apples test, so to speak, would require that Dell, like Apple, be allowed to tune its systems and software for best-case performance. Dell's published results on the SPEC site--regarded as the definitive repository for SPEC results--are best-case. They're far better than the results cited by Veritest in the Apple report."
Sure enough, in each of the benchmarks in which Apple claims victory over the Pentium 4- or Xeon-based systems, various Pentium 4, Xeon, and even AMD Athlon XP systems routinely beat the G5 when the tested machines are properly configured and don't have features turned off.
What's most bizarre about all this, of course, is that Apple makes good products. Let's be clear on this point: Mac OS X is excellent, and the Panther release, while not overly exciting, looks solid. The company's hardware is of tremendous quality (I own two Macs and an iPod), and the PowerMac G5 clearly continues this trend. There are still excellent reasons to pick a Mac over a PC in certain situations. But Apple has been stretching the bounds of credibility with its performance claims for years now, and this latest example is by far the boldest. This situation, ultimately, is an embarrassment for both Apple and its customers. Perhaps the company needs to rethink its claim that the PowerMac G5 is the "world's fastest computer." Quite clearly, that is not the case.