Graphics Cards for Every Budget

Price, packaging, and performance considerations for your next upgrade

If you've been using your PC solely for the purpose of getting work done, you're missing out on a whole new world of technology. Let's be realistic—the gaming industry is driving many of today's computer upgrades and advancements. Do you really need a 1.67GHz CPU to run Microsoft Office? Probably not, but that 1.67GHz AMD Athlon, coupled with a few sticks of Double Data Rate (DDR) RAM, makes for some blisteringly fast Quake III gaming.

When it comes to hardware, the graphics market paves the way for the most exciting emerging technologies. Because most graphics chipmakers work on a 6-month product cycle, they can introduce new features faster than even Intel or AMD can roll out new CPUs. In fact, today's graphics chips have earned the Graphics Processing Unit (GPU) designation on the strength of their die size and capabilities. At the top of the graphics food chain, GPUs boast more transistors than high-end Intel CPUs do and are every bit as essential to a PC's gaming performance as the CPU itself.

If you currently use a 3dfx Voodoo3-based or NVIDIA TNT2-based graphics card, you might want to consider an upgrade. These older cards are fine if you're playing games such as Diablo, Civilization III, and Age of Empires II, but you're probably stuck viewing low resolutions and 16-bit color depths. And when you move to more demanding 3-D titles, today's games simply demand more from your graphics card than these older chips can deliver.

Don't be fooled into thinking that these cards are one-trick ponies just because they offer outstanding 3-D performance. As you'll see, some of these boards can handle DVD decoding and TV output (TV-out). And, for those of you living in High-Definition Television (HDTV)-capable areas who don't want to shell out the bucks for an HDTV decoder, some of these cards can decode and display HDTV signals in their native resolutions. Table 1 provides a comparison of the cards' features.

Whether you're looking for a card to take advantage of the latest and greatest graphics advances or considering a more budget-minded purchase, having an understanding of the technology behind these cards is helpful. Let's focus on the two primary considerations when you upgrade your graphics card: performance and features. With this knowledge in hand, we'll look at three high-end graphics cards that deliver 3-D graphics faster than any console available today and a couple of budget cards for those who don't want to break the bank to upgrade to the latest 3-D accelerators.

High-End Graphics


NVIDIA's GeForce4 GPU is the current reigning champ for powering high-end graphics cards. On a technical level, the GeForce4 is an evolutionary step beyond the XGPU in Microsoft's Xbox console and represents the first genuine leap in 3-D technology since 3dfx first introduced the Voodoo chipset. And, as with every new GPU that NVIDIA introduces, the usual suspects (such as Leadtek Research, Creative Technology, and ASUSTek Computer) are back and slugging it out for a piece of your upgrade budget.

The GeForce4 GPU boasts 63 million transistors, 20 percent more than Intel's Pentium 4 CPU. Thanks to a new, thinner fabrication process, the GeForce4 can hit higher clock speeds, consume less power, and generate less heat. The GeForce4 GPU also fully supports every DirectX 8.0 feature in hardware.

Like the GeForce, GeForce2, and GeForce3 before it, the GeForce4 features a full Transform and Lighting (T&L) engine. This feature lets games off-load geometric processing from the CPU to the GPU, leaving your Pentium 4 free to handle other gaming functions. The benefits of the GeForce4 GPU are evident in its three primary features: Lightspeed Memory Architecture, High Resolution Anti-Aliasing (HRAA), and DirectX 8.0 support. For a closer look at these features, see the sidebar "Inside the GeForce4."
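
If you're curious about what the T&L engine actually computes for each vertex, here's a rough Python sketch. The function and variable names are mine, not NVIDIA's, and the real work happens in dedicated silicon rather than software; the point is simply to show the kind of per-vertex math (matrix transform plus a basic diffuse lighting term) that the GPU takes off the CPU's hands.

```python
import numpy as np

def transform_and_light(vertices, normals, mvp, light_dir, base_color):
    """Toy version of the fixed-function T&L stage: project each vertex
    with a combined model-view-projection matrix and compute a simple
    diffuse (N dot L) lighting term for it. Normals are unit length."""
    light_dir = light_dir / np.linalg.norm(light_dir)
    results = []
    for v, n in zip(vertices, normals):
        clip = mvp @ np.append(v, 1.0)              # transform to clip space
        ndc = clip[:3] / clip[3]                    # perspective divide
        diffuse = max(0.0, float(n @ light_dir))    # Lambertian lighting
        results.append((ndc, base_color * diffuse))
    return results
```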

Although the GeForce4 is the very definition of high-end 3-D technology, NVIDIA offers the core GPU at various price points that target a wide market. At the highest end of the spectrum, the GeForce4 Ti 4600 targets hard-core gamers who crave uncompromising performance above all else. For those who want a balance between price and performance, the GeForce4 Ti 4400 sells for roughly $100 less than the GeForce4 Ti 4600, but with a loss of about 100MHz of memory clock speed.

Still, both cards are pricey. You get a lot, but depending on your needs, these features might not be worth the money. If you can get past the sticker shock, a GeForce4-equipped card gives you a lot of bang for the buck and helps future-proof your purchase.

VisionTek Xtasy GeForce4 Ti 4600


VisionTek is the primary manufacturer of NVIDIA's reference boards, so its Xtasy GeForce4 Ti 4600 follows NVIDIA's reference design to the letter. With the exception of an S-video TV-out port and a Digital Visual Interface (DVI-I) connector for flat-panel displays, you won't find any amenities such as video capturing or even a software bundle. Then again, if you're willing to shell out $399 for a 3-D card, you probably already own any game that VisionTek might have considered including with this card.

The Xtasy GeForce4 Ti 4600 features a 300MHz core paired with a staggering 128MB of 325MHz DDR RAM (effectively 650MHz). With a massive 10.4GBps of memory bandwidth to boot, the Xtasy GeForce4 Ti 4600 provides a ridiculous amount of polygon-pushing power.
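
In case you're wondering where that 10.4GBps figure comes from, here's the back-of-the-envelope math. I'm assuming the 128-bit memory bus typical of cards in this class; the variable names are mine.

```python
# Rough peak memory bandwidth calculation for the Xtasy GeForce4 Ti 4600,
# assuming a 128-bit memory bus.
memory_clock_mhz = 325          # physical memory clock
transfers_per_clock = 2         # DDR moves data on both clock edges
bus_width_bits = 128            # assumed bus width

effective_rate_mhz = memory_clock_mhz * transfers_per_clock            # 650
bandwidth_gbps = effective_rate_mhz * 1e6 * (bus_width_bits / 8) / 1e9

print(f"Effective memory clock: {effective_rate_mhz}MHz")
print(f"Peak memory bandwidth: {bandwidth_gbps:.1f}GBps")              # ~10.4
```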

For testing, I matched up the Xtasy GeForce4 Ti 4600 with NVIDIA's 23.48 drivers for Windows XP. I installed the card in a 1GHz Pentium III system in less than 5 minutes (including the time it took to remove my PC's cover). I removed the drivers for my old graphics card and plugged in the new card; Windows automatically detected the card and installed NVIDIA's drivers. The drivers provide a multitabbed control panel to configure the card, including radio buttons to enable and disable features such as Vsync and HRAA. Users who want to really tweak the card (e.g., calibrate the color, set display-related hotkeys, configure refresh rates) will want to use a third-party program, but the drivers' control panel lets you handle the most common functions.

In every test I ran, from Quake III, Max Payne, Serious Sam, and Return to Castle Wolfenstein to the 3DMark 2001 benchmark, the Xtasy GeForce4 Ti 4600 returned better frame rates than any card I've ever tested. If raw power is your thing, you must run out and get this card, even before you finish reading this article. The GeForce4 Ti 4600 is the fastest consumer-level GPU available on the PC platform today.

What's especially impressive about the GeForce4 Ti 4600 is that we finally have a GPU that can run games at ultra-high resolutions without compromising performance. Running Quake III at a screen resolution of 1600 x 1200 x 32, the Xtasy GeForce4 Ti 4600 makes every other graphics architecture look silly by sustaining an average of 115 frames per second (fps), which is simply unprecedented.

The GeForce4 Ti 4600's superlative raw performance is no surprise, thanks to its new memory architecture. The real test of the GPU's mettle comes when you measure HRAA performance in legacy games. So what can you expect from the GeForce4 Ti 4600's HRAA? Excellent performance, even at high resolutions. In every game I tested, the GeForce4 Ti 4600's Quincunx anti-aliasing mode didn't cause a single hiccup. Even at a reasonably high resolution (e.g., 1280 x 1024 x 32), Quake III played at a more-than-adequate 90fps. Although the visual quality of Quincunx doesn't match the Voodoo5 running at 4X Full Scene Anti-Aliasing (FSAA), you won't suffer the performance hit associated with that level of anti-aliasing.

With DirectX 8.0-enhanced games, the Xtasy GeForce4 Ti 4600 pulls far ahead of the competition. Using AquaMark as a benchmark, the card blew away a still-respectable GeForce2 Ultra while leaving the former king—the GeForce3 Ti 500—in the dust. This type of performance is important because DirectX 8.0 games are already hitting the shelves—and nothing can run them as well as a GeForce4-based card.

Finally, I used an S-video cable to connect the card to a standard TV so that I could test the card's TV output. Because TV-out is limited to an 800 x 600 resolution, you probably won't want to use the Xtasy GeForce4 Ti 4600's TV-out for work-related tasks, but games are extremely playable, and DVD playback (especially when you connect to a display that supports a progressive scan signal) is excellent. The visual quality of the Xtasy GeForce4 Ti 4600 doesn't quite match the quality of ATI Technologies' RADEON 8500, but most users won't have much to complain about unless they're serious about their home theater.

Because the GeForce4 architecture doesn't include a hardware DVD decoder, the Xtasy GeForce4 Ti 4600 supports motion compensation to alleviate the load on the CPU when it's decoding Moving Pictures Experts Group (MPEG)-2 video. Motion compensation exploits the fact that most of a frame looks much like the frame before it: the MPEG-2 stream records how blocks of pixels move from one frame to the next, and the chip rebuilds each new frame by shifting blocks of the previous frame according to those motion vectors instead of forcing the CPU to decode every pixel from scratch. The result is a smoother picture with less video artifacting and fewer dropped frames.
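
To give you a feel for what that block shuffling looks like, here's a deliberately simplified Python sketch. It handles integer-pixel motion vectors only, skips the residual correction that a real MPEG-2 decoder adds afterward, and uses invented names (motion_vectors is assumed to be a dictionary keyed by macroblock row and column); it's an illustration of the idea, not the card's actual implementation.

```python
import numpy as np

BLOCK = 16  # MPEG-2 macroblocks are 16 x 16 pixels

def motion_compensate(reference, motion_vectors):
    """Build a predicted frame by copying each macroblock from the
    reference frame, shifted by that block's motion vector (dx, dy).
    Simplified: integer-pixel vectors only, no residual added."""
    h, w = reference.shape
    predicted = np.zeros_like(reference)
    for by in range(0, h, BLOCK):
        for bx in range(0, w, BLOCK):
            dx, dy = motion_vectors[(by // BLOCK, bx // BLOCK)]
            sy = min(max(by + dy, 0), h - BLOCK)   # clamp to frame edges
            sx = min(max(bx + dx, 0), w - BLOCK)
            predicted[by:by + BLOCK, bx:bx + BLOCK] = \
                reference[sy:sy + BLOCK, sx:sx + BLOCK]
    return predicted
```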

The Xtasy GeForce4 Ti 4600 is as future-proof as today's graphics cards come. Sure, you can buy other cards that offer excellent performance at a fraction of this card's cost, but by doing so you limit your options. Simply put, every 3-D game currently in development for the PC platform leverages the power and features of DirectX 8.0, and the GeForce4 Ti 4600 supports every one of those features in hardware. If you have the money, the Xtasy GeForce4 Ti 4600 lets you play today's and tomorrow's games fast enough to satisfy your adrenaline rushes while bombarding you with the most visceral eye candy around. What more can you ask for?

Xtasy GeForce4 Ti 4600
Contact: VisionTek
Web: http://www.visiontek.com
Price: $399
Decision Summary
Pros: A solid card with the fastest Graphics Processing Unit available, the GeForce4
Cons: Very expensive and lacks any semblance of a software bundle


VisionTek Xtasy 6564


OK, so maybe you don't have $399 to drop on what might be the ultimate in luxury items. Fortunately, NVIDIA offers another GPU, the GeForce3 Ti 200, that provides performance for the rest of us.

Technically, the GeForce3 Ti 200 is a mid-range product, but it might offer the best dollar-for-dollar performance of any card available. By using a GeForce3 core clocked at 175MHz and pairing it with 64MB of 200MHz DDR RAM (effectively equivalent to 400MHz Single Data Rate, or SDR, RAM), the VisionTek Xtasy 6564 gives us all the benefits of DirectX 8.0, loads of RAM to store textures, performance that would have been considered top-of-the-line a year ago, and a $200 price point.

In testing, the Xtasy 6564 comes in right behind the GeForce4 Ti 4600 and RADEON 8500, trailing the RADEON 8500 by 8 to 10 percent in frame rate. Although the gap between the Xtasy 6564 and the Xtasy GeForce4 Ti 4600 is considerable, I was surprised to see the GeForce3 Ti 200-equipped product fare so well given the price difference. Expect to see close to 100fps when playing Quake III at a 1600 x 1200 x 32 screen resolution.

To keep the price as low as possible, VisionTek skipped any value-added extras when manufacturing the Xtasy 6564. The card has an S-video output for TV-out, but that's it: no flat-panel support and no video capturing. Because the Xtasy 6564 handles DVD playback the same way the Xtasy GeForce4 Ti 4600 does (motion compensation rather than a full hardware decoder), you can expect the same level of support for DVDs: very decent quality, but playback that's not as crisp as what the RADEON 8500 provides.

In the end, the Xtasy 6564 is a genuine GeForce3 product that's priced to sell. If you don't suffer from PC envy and don't find yourself overclocking every part of your system to eke out a few extra frames per second, the Xtasy 6564 offers the perfect balance between features and price. Sure, compared with GeForce4 Ti 4600 cards, you're giving up a few megahertz here and there, a more efficient memory architecture, and an extra vertex shader, but the GPU (the real meat of the product) is intact so that you can play games the way they're meant to be played. And in the end, that's what makes this product superior to the GeForce4 MX line of cards. As you'll see, the "new" GeForce4 MX 440 boards can't compare.

Xtasy 6564
Contact: VisionTek
Web: http://www.visiontek.com
Price: $199
Decision Summary
Pros: A genuine GeForce3 solution that offers the best dollar-for-dollar performance of any graphics card
Cons: Not the fastest kid on the block and lacks any value-added extras


ATI Technologies RADEON 8500


As ATI's second shot in the enormously competitive graphics card market, the RADEON 8500 is a serious contender marred by the same problems ATI faces every time it launches a new product. In essence, the RADEON 8500 presents a curious blend of powerful silicon coupled with dodgy drivers. The result is a good graphics card that hasn't quite reached its potential.

The RADEON 8500 is an AGP 4X part that includes VGA, DVI-I, and S-video outputs. You can use the included DVI-I to VGA adapter to take advantage of ATI's HydraVision technology for setting up dual displays. Unlike the older RADEON 64MB ViVo card, the RADEON 8500 doesn't support video capturing—a feature that's best left to a dedicated solution. The software bundle includes a demo of Half-Life and full versions of Team Fortress and Counter-Strike.

Going on specifications alone, the RADEON 8500 is very impressive. With the core and memory both clocked at 275MHz (effectively 550MHz with the use of DDR RAM), the RADEON 8500's four rendering pipelines and 8.8GBps memory bandwidth give ATI the muscle to win the raw fill-rate battle. As NVIDIA's only competition, ATI shows that its product engineering is more than capable of going 12 rounds.

Like the original RADEON architecture, the core of the RADEON 8500's GPU consists of multiple components. The CHARISMA ENGINE II serves as the card's T&L engine. Closely coupled with the CHARISMA ENGINE II is the SMARTSHADER engine, ATI's programmable vertex shader. If you're familiar with the GeForce4's vertex shader, you won't find anything new here: SMARTSHADER handles the geometry and math required to render a 3-D scene and works like a traditional co-processor. By off-loading this work to a dedicated engine, developers can take advantage of features such as key frame animation and vertex lighting, resulting in more realistic character models.

The RADEON 8500's PIXEL TAPESTRY II architecture lets the card make an evolutionary leap beyond its predecessor. As a DirectX 8.1-compliant programmable pixel shader, PIXEL TAPESTRY II's primary job is to apply textures. The catch is that the pixel shader lets designers apply far more advanced effects (such as Shadow Mapping to create more realistic shadow effects) to the textures within a 3-D scene. PIXEL TAPESTRY II can also perform reflective bump mapping, a technique that overlays a realistic bumpy surface over a polygon that also reflects light. Going a step beyond the GeForce4, PIXEL TAPESTRY II lets the RADEON apply six textures in one pass. By comparison, the GeForce4 can apply only four textures in one pass. This feature will be important when games such as the texture- and lighting-heavy DOOM III hit the market. The benefit? Pure eye candy.
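
Why do textures per pass matter? Roughly speaking, any time a surface uses more texture layers than the GPU can apply in a single pass, the geometry has to be drawn again. The following snippet is a rule of thumb rather than an exact model of how developers structure their rendering passes, and the six-layer surface is hypothetical, but it shows the arithmetic.

```python
import math

def passes_needed(texture_layers, textures_per_pass):
    """Rough rule of thumb: if a surface uses more texture layers than the
    GPU can apply in one pass, the geometry must be submitted again."""
    return math.ceil(texture_layers / textures_per_pass)

# A hypothetical surface with 6 layers (base, detail, light map, gloss,
# bump, and environment maps):
print(passes_needed(6, 6))  # RADEON 8500 -> 1 pass
print(passes_needed(6, 4))  # GeForce4    -> 2 passes
```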

Where the RADEON 8500 offers the most potential over the GeForce4 is in ATI's TRUFORM technology—a technique that promises to drastically improve the visual quality of new and existing games. Before I explain how TRUFORM accomplishes that, let's use computer-generated imagery (CGI)-rendered films as a point of reference.

The reason that films such as Final Fantasy: The Spirits Within; Monsters, Inc.; Toy Story; and Toy Story 2 look worlds better than what PC graphics offer is that they're pre-rendered. By rendering every scene ahead of time, animators can use curved surfaces to make 3-D objects look more realistic. For example, look at your hand. Now, fire up any 3-D game and look at the hand of any character model. Notice how the character model's hand looks blocky and suffers from strange joint anomalies? That's because accurately representing a curved, real-world shape requires hardware support for curved surfaces, and today's 3-D graphics technology treats curved surfaces and polygons as mutually exclusive: a typical consumer graphics card can render only flat polygons.

ATI's approach is an elegant solution that could give us the best of both worlds. TRUFORM works by taking the triangle information that tells the GPU how to render each polygon and internally converting that information into curved surfaces. TRUFORM then uses the converted information to draw the polygons on screen.

The details behind this technology are complex, but in a nutshell, to produce curved surfaces, TRUFORM takes each triangle's vertex information and uses N-Patches to form a curved surface mesh over each triangle. It places additional control points above or below the plane of the triangle and, using these control points, draws additional triangles until the original flat triangle approximates a curved surface. Because the GPU's T&L engine handles these calculations, the enhanced visual quality doesn't come at the cost of performance.
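
To make the idea more concrete, here's a heavily simplified Python sketch of one refinement step. This is not ATI's actual N-Patch math (the real hardware evaluates a curved patch derived from the triangle's vertices and normals, and the function names here are mine), but it demonstrates the principle: use the vertex normals to lift new vertices off the flat triangle, then rebuild the triangle from smaller ones that follow the implied curve.

```python
import numpy as np

def curved_midpoint(p1, n1, p2, n2):
    """Lift the flat edge midpoint toward the curved surface implied by
    the two (unit-length) vertex normals."""
    m = (p1 + p2) / 2.0
    proj1 = m - np.dot(m - p1, n1) * n1   # midpoint projected onto the
    proj2 = m - np.dot(m - p2, n2) * n2   # tangent plane at each vertex
    return (proj1 + proj2) / 2.0

def tessellate(tri):
    """One refinement step: split a triangle (three (position, normal)
    pairs of numpy arrays) into four smaller triangles that bulge toward
    the curved surface the normals suggest."""
    (p0, n0), (p1, n1), (p2, n2) = tri
    m01 = curved_midpoint(p0, n0, p1, n1)
    m12 = curved_midpoint(p1, n1, p2, n2)
    m20 = curved_midpoint(p2, n2, p0, n0)

    def avg(a, b):
        # Normals for the new vertices are simply averaged here for brevity.
        n = a + b
        return n / np.linalg.norm(n)

    return [
        [(p0, n0), (m01, avg(n0, n1)), (m20, avg(n2, n0))],
        [(p1, n1), (m12, avg(n1, n2)), (m01, avg(n0, n1))],
        [(p2, n2), (m20, avg(n2, n0)), (m12, avg(n1, n2))],
        [(m01, avg(n0, n1)), (m12, avg(n1, n2)), (m20, avg(n2, n0))],
    ]
```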

So how well does TRUFORM perform? Recent games such as Return to Castle Wolfenstein and Serious Sam: The Second Encounter use TRUFORM to "flesh out" character models. By adding polygons to round off joints, RADEON 8500-rendered characters appear less blocky than GeForce4-rendered characters, and with no performance hit. Counter-Strike junkies will be glad to know that Valve Software's most recent game patch includes TRUFORM support.

To improve on the original RADEON's performance, the RADEON 8500 uses the new HyperZ II architecture. Rather than adopting a true tile-based rendering architecture, ATI takes a three-pronged approach to reducing wasted memory bandwidth. First, the RADEON 8500 uses a technique called Hierarchical Z to determine which pixels are visible and which are obscured when the graphics card renders a scene; the card culls obscured pixels before they enter the rendering pipeline. Second, the card uses Z Compression to fit more data into the Z-buffer. Third, Fast Z Clear clears the Z-buffer to begin rendering the next frame. Traditionally, graphics cards clear the Z-buffer by overwriting the existing data with zeros; Fast Z Clear simply marks the data in the Z-buffer as clear in one pass, which on paper is 64 times as fast as the conventional approach. The benefit? Blazingly fast performance.
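
The following sketch illustrates the Fast Z Clear idea. It's my own simplified model, not ATI's implementation, and the class and tile size are illustrative: instead of rewriting every depth value, clearing the buffer just flips a flag per tile, and a tile's depth values are lazily reset the first time the tile is actually touched.

```python
import numpy as np

TILE = 8  # track "cleared" state per 8 x 8 block rather than per pixel

class FastClearZBuffer:
    """Concept sketch of Fast Z Clear: clearing the buffer only flips a
    small flag per tile instead of rewriting every depth value."""
    def __init__(self, width, height, far=1.0):
        self.far = far
        self.depth = np.full((height, width), far, dtype=np.float32)
        self.cleared = np.ones((height // TILE, width // TILE), dtype=bool)

    def clear(self):
        # Clearing only flips the per-tile flags; the depth array is untouched.
        self.cleared[:] = True

    def test_and_write(self, x, y, z):
        ty, tx = y // TILE, x // TILE
        if self.cleared[ty, tx]:
            # Lazily reset the tile the first time it's actually used.
            self.depth[ty*TILE:(ty+1)*TILE, tx*TILE:(tx+1)*TILE] = self.far
            self.cleared[ty, tx] = False
        if z < self.depth[y, x]:      # closer fragment wins
            self.depth[y, x] = z
            return True               # fragment is visible
        return False                  # fragment is hidden, so cull it
```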

Finally, the RADEON 8500 includes an FSAA component dubbed SMOOTHVISION. Rather than taking a multisampling approach to FSAA as the GeForce4 does (see the sidebar "Inside the GeForce4"), SMOOTHVISION uses the old supersampling method to reduce jagged edges in a scene. The benefit of supersampling is that the displayed textures are generally less blurry than those anti-aliased with a multisampling technique. The drawback is the performance hit that comes with internally rendering a scene at a higher resolution than is displayed, which is one reason supersampling was phased out. Luckily for ATI, the RADEON 8500 has enough raw power to perform 2X FSAA using SMOOTHVISION without degrading performance to the point where games are unplayable. However, 4X FSAA results in abysmal performance. As SMOOTHVISION shows, supersampling was put out to pasture for a good reason.
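
Supersampling itself is conceptually simple, as this short sketch shows (render_at stands in for whatever function draws the scene at a requested size; the names are illustrative, not anyone's API): render the frame at a higher internal resolution, then average each block of samples down to one displayed pixel. The averaging smooths edges, but the card has to render several times as many pixels, which is exactly the performance hit described above.

```python
import numpy as np

def supersample(render_at, width, height, factor=2):
    """Supersampling FSAA in a nutshell: render the scene at a higher
    internal resolution, then average each factor x factor block of
    samples down to one displayed pixel. `render_at` is any function
    returning an RGB image (numpy array) of the requested size."""
    hi = render_at(width * factor, height * factor).astype(np.float32)
    # Group the oversized image into blocks and average each block.
    hi = hi.reshape(height, factor, width, factor, 3)
    return hi.mean(axis=(1, 3)).astype(np.uint8)
```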

Beyond gaming, the RADEON 8500 also provides several additional functions. The RADEON 8500 features superb TV-out quality, a staple of ATI technology. Even at a screen resolution of 800 x 600, using an S-video connection makes text as readable as it gets on TV screens. For those of you who already have HDTV, the RADEON 8500 features impressive de-interlacing technology that lets it output 480, 720, and 1080 progressive scan (along with 1080 interlace) signals. Just plug a DVI-I to Component adapter between the card and your HDTV's component inputs, and you're set.

Home theater enthusiasts will find a lot to like about the RADEON 8500. By including a hardware DVD decoder (which supports all the expected features, such as motion compensation and Inverse Discrete Cosine Transform, or IDCT, acceleration to take the load off the CPU), the RADEON 8500 offers a picture comparable to most mid-range DVD players. Because the RADEON 8500 can output a progressive scan signal, the card fares well even against high-end DVD players. Simply put, if you're looking for a video-decoding solution, the RADEON 8500 is an obvious choice.

So what's the problem with the RADEON 8500? In a word: drivers. As anyone who has owned a recent ATI video card can attest, ATI has always been light years behind NVIDIA when it comes to driver development. The original drivers that shipped with the RADEON 8500 were horribly broken, to the point that features either didn't work or were intentionally disabled because ATI's software engineers hadn't implemented them yet. Performance was also an issue: given its specifications, the RADEON 8500 should have outperformed the GeForce4 Ti 4600, but it didn't, and it still doesn't. To ATI's credit, later driver releases fixed some issues, but performance still lags behind the GeForce4 Ti 4600 in most games.

The RADEON 8500 offers good performance, excellent 3-D technology, and advanced video playback features. Aside from the driver issue, the RADEON 8500 is superior to the GeForce4 Ti 4600 in almost every way. ATI plans to aggressively release beta drivers on a monthly basis, so the product can only get better from here. As it stands, the RADEON 8500 is an excellent product if you're willing to deal with growing pains.

RADEON 8500
Contact: ATI Technologies
Web: http://www.ati.com
Price: $299
Decision Summary
Pros: Very fast; best visual quality available; supports DirectX 8.0; includes a true Moving Pictures Experts Group (MPEG)-2 decoder
Cons: Questionable driver quality


Graphics on a Budget


If the VisionTek Xtasy GeForce4 Ti 4600, VisionTek Xtasy 6564, and ATI RADEON 8500 cards are considered high-end cards, the VisionTek Xtasy GeForce4 MX 440 and Guillemot Hercules 3D Prophet 4500 would be low-end technology (i.e., technology for those on a budget). Keep in mind that by buying a budget graphics card you're sacrificing many of the features found in the high-end boards. To keep costs down, card manufacturers cut corners and remove many features (e.g., TV-out, DirectX 8.0 support) that are commonplace on more expensive cards. Despite these cost-saving measures, a budget graphics card can offer decent performance at the lowest possible prices.

VisionTek Xtasy GeForce4 MX 440


For all the unbridled genius of NVIDIA's engineers, sometimes the company's marketing strategy makes me wonder. On one hand, NVIDIA offers the GeForce4 Ti 4600, easily the world's most powerful GPU; on the other, the company slaps us in the face with a sister GPU that it calls the GeForce4 MX 440.

Forget the GeForce4 prefix; the GeForce4 MX 440 is actually a GeForce2 MX GPU, the same GeForce2 MX that we've relegated to backup boxes that occasionally run something moldy such as Quake II, and the same GeForce2 MX that you just shouldn't be buying, no matter how tempting the "GeForce4" name might be.

Sure, the Xtasy GeForce4 MX 440 has advantages over a vanilla GeForce2 MX card. NVIDIA has squeezed out some performance from the MX 440 GPU by bumping up the core clock speed to 270MHz and using a thinner 0.15-micron fabrication to keep the chip running cooler. VisionTek was kind enough to supply the card with 64MB of DDR RAM with an effective memory clock speed of 400MHz.

So what's the problem? The Xtasy GeForce4 MX 440 is yesterday's technology wrapped in a shiny package, hoping to lure in suckers who think they're getting cutting-edge gear at bargain prices. Don't be misled: the entire GeForce4 MX line lacks the vertex and pixel shaders found in the real GeForce4. In other words, the GeForce4 MX 440 is missing all the cool stuff that games are finally starting to support.

Yes, the $149 price tag might attract impulse buyers. Yes, you'll be able to run your existing library of games at high resolutions and see decent performance. And yes, it has the DVD features you need to turn your PC into a home theater centerpiece. However, you shouldn't buy this card if you want your hardware to last you more than a year before you're forced to upgrade because the industry has passed you by.

Xtasy GeForce4 MX 440
Contact: VisionTek
Web: http://www.visiontek.com
Price: $149
Decision Summary
Pros: The GeForce4 logo looks nice ...
Cons: ... but it's not a GeForce4; it's not even a GeForce3


Guillemot Hercules 3D Prophet 4500


Built around STMicroelectronics' KYRO II chip, Guillemot's Hercules 3D Prophet 4500 delivers a full 64MB of RAM and a unique architecture that strikes an appealing, but ultimately limiting, balance between price and performance. The KYRO II chip in the Hercules 3D Prophet 4500 is an AGP 2X part (as opposed to the current, faster AGP 4X standard) that uses SDR RAM clocked at 175MHz with a matching core clock speed. The chip also lacks a T&L engine. As a final indictment, the KYRO II features a slow 270MHz RAM Digital-to-Analog Converter (RAMDAC). Going on specifications alone, the Hercules 3D Prophet 4500 looks more like yesterday's throwaway than a current contender. However, just this once, don't go on the specifications alone, because the Hercules 3D Prophet 4500 holds its own against far more expensive graphics cards at a fraction of the price.

The KYRO II performs so well thanks to its tile-based rendering architecture. Rather than filling in every polygon in a given scene, the KYRO II draws only visible objects, usually the polygons that appear in the foreground. By breaking up a scene into multiple blocks (all rendered independently), the KYRO II can perform all Z-calculations (checking the forward, or Z, location of a given polygon relative to the other onscreen polygons) internally. As a result, the KYRO II doesn't waste valuable time traveling across the memory bus to access an external Z-buffer. This approach frees up processing power because the chip doesn't have to waste cycles drawing objects that end up being obscured anyway.
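
Here's a rough sketch of that idea in Python. It's a conceptual illustration rather than STMicroelectronics' implementation, and the function names and data shapes are mine: for each small tile, visibility is resolved first (the equivalent of the on-chip Z test), and only the one winning fragment per pixel ever receives the expensive texturing work.

```python
def render_tile(fragments, shade):
    """Concept sketch of tile-based deferred rendering: resolve visibility
    for one small tile entirely "on chip," then shade only the single
    winning fragment per pixel.

    `fragments` is a list of (x, y, z, polygon_id) covering this tile;
    `shade` is the expensive texturing step we want to avoid wasting."""
    nearest = {}                      # (x, y) -> (z, polygon_id)
    for x, y, z, poly in fragments:   # Z test happens in fast local memory
        if (x, y) not in nearest or z < nearest[(x, y)][0]:
            nearest[(x, y)] = (z, poly)
    # Only now do we spend texture bandwidth, and only on visible pixels.
    return {pixel: shade(poly) for pixel, (z, poly) in nearest.items()}
```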

To test just how this unique architecture compensates for the low specifications, I installed the card in a mid-range 600MHz Pentium III Windows-based system using the included 7.89 reference drivers. Rather than following NVIDIA's approach of spreading configuration options across multiple tabs, the KYRO II control panel lets you configure the card's settings (e.g., disabling VSync, enabling FSAA) from just two tabs: one for Direct3D and the other for OpenGL.

With the card installed, I ran the usual benchmark gauntlet. With the color depth set to 16-bit, the Hercules 3D Prophet 4500 didn't come close to matching the pure speed of a GeForce2 GTS board. Interestingly, things changed as soon as I kicked the display into 32-bit color. At every tested screen resolution (from 800 x 600 up to 1600 x 1200), the Hercules 3D Prophet 4500 checked in just a few frames per second short of the GeForce2 GTS. And when I say a few frames per second short, I'm talking about no more than a 5fps gap between the two boards.

To clean up the jagged edges on screen, the Hercules 3D Prophet 4500 supports 2X and 4X FSAA. Because the KYRO II uses supersampling to smooth lines, the card takes a large performance hit with FSAA enabled. To keep frame rates from dropping too much, Guillemot imposes a 1280 x 1024 limit on the 2X mode and 1024 x 768 on the 4X mode. Impressively, the Hercules 3D Prophet 4500 performs almost as well as the GeForce2 GTS with both 2X and 4X FSAA enabled. In fact, the KYRO II-based card even pulled ahead of the GeForce2 GTS board at lower resolutions with a 32-bit color depth.

But as far as 3-D splendor goes, that's about it. Because the Hercules 3D Prophet 4500 doesn't support DirectX 8.0 features, the card can't handle the advanced effects found with other GPUs. And that's a problem because the future of 3-D gaming is in DirectX 8.0. The Hercules 3D Prophet 4500 might be able to push polygons faster than any other budget card, but that really won't mean much as games start to require features that the KYRO II GPU doesn't have.

For the film geeks, the Hercules 3D Prophet 4500 supports motion compensation to assist with DVD playback. Unfortunately, it's not a full hardware MPEG-2 decoding implementation, and the lack of TV-out means you'll have to watch movies on your monitor. If you're dead-set on outputting your movies to a TV, Guillemot ships a version of the Hercules 3D Prophet 4500 with TV-out for $169.

If speedy frame rates in your current games are all you care about, the Hercules 3D Prophet 4500 is a decent solution, but only if you want to spend the bare minimum to get the 3-D experience. By embracing the KYRO II GPU, Guillemot effectively brings last year's high-end gaming performance to the mainstream. The only significant knock against the card lies in its older architecture: the card supports neither a T&L engine nor DirectX 8.0. If you buy into technology that can't handle complex T&L routines, your shiny new graphics card might have a shorter life span than you expected. Ultimately, budget graphics cards are built around compromises, but recommending the Hercules 3D Prophet 4500 is hard (even with its exceptional performance) when you can pick up a GeForce3 Ti 200-based card for only a few dollars more.

Hercules 3D Prophet 4500 64MB
Contact: Guillemot
Web: http://www.guillemot.com
Price: $100
Decision Summary
Pros: Unique tile-based rendering architecture makes this the fastest budget card available; delivers a full 64MB solution; inexpensive
Cons: Lacks a Transform and Lighting engine; no support for DirectX 8.0 features; antiquated specifications; uses Single Data Rate RAM


The Moral: Features Before Performance


So what have we learned? For one thing, you get what you pay for. Also, specifications are occasionally irrelevant to the true performance of a graphics card. In the end, you should base your buying decision on the features you want, rather than going on performance alone. If you prefer real-time strategy games, Diablo clones, or the current flavors-of-the-month such as Counter-Strike, and you don't anticipate playing 3-D games that take advantage of DirectX 8.0, you're better off buying a less expensive card with value-added features, such as the Xtasy GeForce4 MX 440 or Hercules 3D Prophet 4500, rather than purchasing a true GeForce4-based card. If you're a performance junkie who salivates at the prospect of games leveraging programmable shaders to provide realistic effects such as water and leaves, you'll want the best GPU technology available, and that's the GeForce4. If you're a home theater maven who wants the best in DVD output and a powerful DirectX 8.0-compliant GPU, the RADEON 8500 is the card for you.
