Author: Mark "Ratchet" Thorne
Date: November 27th, 2006
You’ve seen the enormity of the smack-down laid upon the previous generations of graphics hardware in both performance and image quality, and by now some of you lucky folks have even got one or, if you are a particularly devout gamer, two of them sitting in your system right now. Those who do know firsthand the qualities of this chip, those who don’t wish they were those who do.
You know all about its Unified Shader Architecture and its DX10 capability. You know it has some pretty cool features, some very much needed improvements to the GeForce image quality reputation and, most obviously, you also know that it’s fast. Fucking fast.
Yes, I swore. I feel bad. However, to properly project the idea of how fast the G80 is I need to color things a bit. Grab your attention. Take you by the shoulders, as it were, and shake you till you get the enormity of what I’m saying here. Others would tell you that it’s “Blazing Fast” or “Fast as Lightning” or, maybe, in an attempt to go colorful themselves, would say something naughty like “It’s Fast as Hell!”. I feel those types of literary turns of wit do not convey the idea properly, however, so bear witness to my words, fine reader, and take them to heart: it is fucking fast. You’ll see plenty of evidence of exactly how fast the thing is throughout this review, I do think.
Don’t tell my Mom I cussed though.
Regardless of your foreknowledge of all things G80, I will restate this simple fact of recent history here: on November the 8th NVIDIA released the G80 GPU, which is arguably the biggest advancement in graphics technology since Ernie Coombs invented the pixel in his tickle trunk during his spare time. It does away with so many of the standard conventions of GPU design that the G80 can almost be considered a whole new approach to graphics hardware. It’s akin to the technological leap from piston-powered engines to jet engines: faster, more efficient, and it immediately obsoletes everything that came before.
Of course, you don’t “release” a GPU and expect people to buy it (though I’m sure some would!). NVIDIA knows this, which is why they’ve cleverly put the G80 on a PCB and made two new SKUs that they’ve ingeniously called the GeForce 8800 GTX and GeForce 8800 GTS. Good stuff, Mr. Dressup would approve.
Here’s what NVIDIA’s lineup looks like today:
As you can see the 8800 GTX and the 8800 GTS replace the 7950 GX2 and 7900 GTX, respectively, in NVIDIA’s enthusiast class segment. The 7-series cards from the 7950 GT down remain the same for now, but there are rumors that NVIDIA is already hard at work creating GPUs and cards for a complete top-to-bottom 8-series lineup.
The GeForce 8800 GTS, being the more affordable of the two, has less overall performance than the GTX but is no less impressive in the grand scheme of things. Its G80 GPU is clocked at 500MHz, it has 20 ROPs, and it uses an equally unconventional 320-bit-wide memory bus connected to 640MB of GDDR3 running at 800MHz (1.6GHz effective).
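For the curious, that memory spec is enough to work out the GTS’s peak memory bandwidth yourself. A quick back-of-the-envelope calculation (our own arithmetic from the numbers above, not a quote from NVIDIA’s spec sheet):

```python
# Peak memory bandwidth of the 8800 GTS, from the specs above.
bus_width_bits = 320      # width of the memory bus
clock_mhz = 800           # GDDR3 base clock
transfers_per_clock = 2   # GDDR3 is double data rate, hence the "1.6GHz" figure

effective_mhz = clock_mhz * transfers_per_clock            # 1600 MHz effective
bytes_per_transfer = bus_width_bits / 8                    # 40 bytes per transfer
bandwidth_gb_s = effective_mhz * 1e6 * bytes_per_transfer / 1e9

print(bandwidth_gb_s)  # 64.0 GB/s
```

That 64 GB/s of raw bandwidth is a big part of why the thing is, well, you know how fast.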
At $449 the GTS is priced right in line with the other enthusiast-class cards these days, but at $599 the 8800 GTX will set you back quite a bit more. Is the extra power worth the extra money? You’ll have to decide that one for yourself (we’ll help by providing you with some nice benchmarks to mull over).

With no response in sight, AMD’s graphics division (which we’ll call “ATI” in this article, for the sake of convenience) finds itself in the disappointingly familiar position of having to play catch-up once again. Something they should be quite accustomed to by now, unfortunately, having been given so much practice in that position these past couple of years.