Author: Mark "Ratchet" Thorne
Date: June 29th, 2006
Prey Demo Performance

Released just a couple of weeks ago, the Prey demo has already built up a lot of hype. It's easy to see why, as the graphics, atmosphere, and especially the gameplay experience are all pretty spectacular. The game has already gone gold and should be on shelves as soon as July 17th, so start saving your pennies for what will surely be one of the big hits of 2006.

After playing through the demo I decided to check out whether it could be benchmarked (benchmarking is what I spend the majority of my time doing, it seems, so it's always on my mind). After a few failed attempts, and with some tips from friends, I managed to get it working and before long had a script spitting out tons of results. From there it didn't take long to build things up into a full-blown benchmarking session.

I ended up testing 18 cards from both ATI and NVIDIA (Crossfire/SLI included) over three timedemos, five resolutions, and four separate settings. All told, more than 1,000 benchmark passes went into the charts below.

Test Procedure

It seems as though Human Head/3D Realms or whoever was in charge of the demo production didn't really want people benchmarking it so they attempted to cripple the timedemo functionality. Thankfully, getting it working is pretty easy. All you have to do is create an autoexec.cfg file with seta fs_restrict "0" in it and drop it in your Prey Demo/base/ directory. Once it's in there you can timedemo using the same methods you use for other Doom 3 engine games. For my purposes, I modified my Quake 4 benchmark script, which worked fine.
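For reference, the complete autoexec.cfg needs only that single line:

```
seta fs_restrict "0"
```

With that file sitting in Prey Demo/base/, recorded demos play back via the same timedemo console command the other Doom 3 engine games use (the demo name you pass it is whatever you recorded yours as).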

I recorded three separate timedemos for benchmarking purposes, and then averaged the results of those three timedemos to get a final score for each resolution and setting. Admittedly there isn't a whole lot of variance between the timedemos, but considering it's only a game demo and not the full game you can't really ask for much in that regard. If you want to record your own timedemos in Prey, make sure you don't shoot or attack any of the enemies. If you do, the timedemo playback will abort at that point in your recording (it will still give results, however). That's either a bug or another measure to discourage benchmarking the demo. Either way I hope this is fixed for the full version of the game.
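The scoring step itself is simple to sketch. This is a loose illustration rather than my actual script: it pulls the fps figure out of three hypothetical timedemo result lines and averages them. The exact wording and numbers in the engine's output are assumptions here and may differ from what Prey actually prints.

```python
import re

# Matches the fps figure at the end of a hypothetical timedemo result line.
RESULT_RE = re.compile(r"=\s*([\d.]+)\s*fps")

def average_fps(log_lines):
    """Extract the fps score from each timedemo result line and average them."""
    scores = [float(m.group(1)) for line in log_lines
              if (m := RESULT_RE.search(line))]
    return round(sum(scores) / len(scores), 1)

# Made-up output from three passes at one resolution/setting combination:
runs = [
    "2134 frames rendered in 41.0 seconds = 52.0 fps",
    "2134 frames rendered in 41.5 seconds = 51.4 fps",
    "2134 frames rendered in 40.8 seconds = 52.3 fps",
]
print(average_fps(runs))  # prints 51.9
```

One averaged number like this per resolution/setting combination is what feeds each bar in the charts below.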

To keep things from getting too overwhelming, I set Prey to the highest possible graphics setting (basically everything turned on) and went from there. As a result, the scores for the slower cards are unrealistically low.

Here's a rundown of the test setup. Be sure to check out the Notes listed below as well.

Resolutions

  • 1024x768
  • 1280x1024
  • 1600x1200
  • 1920x1200
  • 2048x1536

Image Quality Settings

  • No AA & No AF
  • No AA & 16x AF
  • 4x AA & No AF
  • 4x AA & 16x AF

Test System Specs

  • CPU: AMD Athlon 64 FX-60 @ 2.6GHz
  • Motherboard:
    • ASUS A8R32-MVP, Radeon Xpress 3200, BIOS 0404
    • ASUS A8N-SLI, nForce4 SLI, BIOS 1306 (beta)
  • Videocards:
    • ATI GPU based
      • ATI Radeon X1900 XTX
      • ATI Radeon X1900 XT Crossfire Master
      • ATI Radeon X1800 XT (512MB)
      • ATI Radeon X1600 XT
      • HIS Radeon X1600 XT
      • Sapphire Radeon X1600 Pro x2
      • ATI Radeon X1300 Pro
    • NVIDIA GPU based
      • NVIDIA GeForce 7950 GX2 (reference)
      • BFG GeForce 7900 GTX OC x2
      • NVIDIA GeForce 7800 GTX x2 (reference)
      • NVIDIA GeForce 7800 GTX 256MB (reference)
      • NVIDIA GeForce 7800 GT (reference)
      • BFG GeForce 7600 GS OC x2
  • Video Drivers:
    • ATI Catalyst 6.6 WHQL
    • NVIDIA ForceWare 91.31 WHQL
  • Memory: 2GB (2x1024MB) Corsair TWINX2048 PC3500LL Pro @ 400MHz 2-3-2-6 1T
  • HDD: Western Digital Caviar SE16 250GB SATA2 7200RPM 16MB
  • Sound: Onboard
  • PSU: Silverstone Strider ST56F 560w
  • OS: Microsoft Windows XP Pro SP2


Notes

  • You will have noticed that the motherboard I used for the ATI based cards and the one I used for the NVIDIA based cards are not really of the same ilk: the A8R32-MVP is a dual x16 PEG board while the A8N-SLI is not. I had originally intended to use ASUS' dual-x16 A8N32-SLI, which would have made much more sense, but the one I bought arrived DOA (or damn well close to it). In the interest of not delaying things for an extra week or more while my RMA was processed, I decided to use my older A8N-SLI instead.
  • Which leads me to the next note: try as I might, I couldn't get the 7950 GX2 to run in dual-GPU mode on the A8N-SLI (even with the beta BIOSes that supposedly added support for the card). I ended up benchmarking the GX2 on the A8R32-MVP instead, which worked fine.
  • To get SLI working with Prey I made a custom SLI profile for the demo based on the Quake 4 profile. As is typical right after you've finished a day's worth of benchmarking, new beta drivers were released that add a Prey SLI profile. NVIDIA tells me, however, that using a custom profile based on Quake 4 as I did is just as good, so I didn't bother retesting.
  • ATI added a Crossfire profile to the official Catalyst 6.6 driver set that was released a few days ago, which I used for all the ATI testing. No prey.exe renaming was needed to get Crossfire working, as was required with the previous Catalyst 6.5 drivers.


Here are the charts. Click the text links above each chart to show the Anti-Aliasing and Anisotropic results. I'll refrain from commenting and let you make up your own mind on what is presented here:


[ No AA / No AF ] [ No AA / 16x AF ] [ 4x AA / No AF ] [ 4x AA / 16x AF ]

[ No AA / No AF ] [ No AA / 16x AF ] [ 4x AA / No AF ] [ 4x AA / 16x AF ]

[ No AA / No AF ] [ No AA / 16x AF ] [ 4x AA / No AF ] [ 4x AA / 16x AF ]

[ No AA / No AF ] [ No AA / 16x AF ] [ 4x AA / No AF ] [ 4x AA / 16x AF ]

