Rage3D Discussion Area (http://www.rage3d.com/board/index.php)
-   AMD Radeon Discussion and Support (http://www.rage3d.com/board/forumdisplay.php?f=45)
-   -   Official Navi2x/6x00 series thread (http://www.rage3d.com/board/showthread.php?t=34051860)

demo Nov 20, 2020 12:52 AM

Quote:

Originally Posted by Nunz (Post 1338245994)
Tessellation disabled.. and I'd chalk a good portion of that score to the LN2 5950X. You'd have to look at graphics scores to compare the GPUs.

AMD has historically been superior in Firestrike, for whatever reason. They are slower in Timespy though.. no clue why.

Yeah exactly, tess off and who knows what else, in 1080p Fire Strike with a 5950X @ 5.4 GHz. As you know, AMD does unnaturally well in FS for whatever reason, and the scores don't translate well to other benchmarks or games.

In his comments he states the GPU score with tess on is 'close to 50k' (for reference, I just scored 51k), and he has that benchmark run hidden, presumably because it doesn't look so hot. He then goes on to say raster gaming is around 3080 level and DXR gaming around 2080 Ti level. No surprises there. He also predicts single-digit gains with the 6900 XT.

pax Nov 20, 2020 01:11 AM



@18:02

They are doing something interesting with Infinity Cache that they haven't announced yet. But some games have had insanely high 1% lows. IC has some real potential.

Other time stamps:


Quote:

@7:50 Kudos to AMD for launching a competitive card, with a trifecta of a staggered launch: AMD is launching CPUs/GPUs/consoles.

@9:35 Both the CPU and GPU teams collaborated to bring the RX 6000 series. Better power-to-performance ratio.

@14:25 AMD will have laptop variants but wouldn't comment on or elaborate about when/how/etc.

@18:02 Reason for using Infinity Cache. In particular, they have another announcement coming about Infinity Cache. He didn't commit to a time frame, but said it will be pretty soon (tm).

@19:49 VRAM discussed: using 16GB of VRAM and Infinity Cache, the 6000 series still uses less power than Ampere.

@22:00 AMD will ensure that developers take advantage of the AMD ecosystem across consoles/PC, as developers don't want to worry about a myriad of PC configurations; they prefer a closed form factor. Because AMD's architecture is used in consoles, game developers will use more VRAM, RT (the way AMD wants it done), and all the other things AMD wants in games.

@26:40 RT/DLSS discussed. He stated that as new games are launched they will improve RT performance, but he emphasized new titles. He also emphasized that when you are coding for the console you are coding for AMD.

@28:30 DLSS discussed. AMD was originally going to develop their own API. Developers begged AMD not to create another API (it would only work with Radeon), so they are going for an open solution. Developers do not like having to fetch AMD/Nvidia reps to come on site to help code for their games (that's some juicy gossip right there, I didn't know that). They are still working on this with developers, which is why it's not ready yet (or else they would have launched this API he speaks of).

@32:25 (several minutes) Smart Access Memory discussed. AMD never said that SAM wouldn't work with other hardware; AMD simply focused their efforts on this generation of hardware. Validation work, communication protocol work, tweaking, etc. He said they are still undecided on older hardware but they love backward compatibility... they are still evaluating it. MORE PERFORMANCE IS COMING AS IT MATURES. (Is that tied into the Infinity Cache announcement he mentioned earlier??? Hmmm...) But it is clear that Intel has to be involved with their BIOS, etc., to get it to work for Nvidia. It's not just an Nvidia thing; it's more than just a driver update. Nvidia told PCWorld they felt that AMD would hard block them. Scott said they wouldn't hard block them. (Perhaps soft block them, lol /s.) See the sketch after this list for what SAM looks like at the PCI level.

@37:45 Question: Why are you doing that (allowing Nvidia in for SAM on AMD) when Nvidia tried to flip FreeSync to G-Sync and out-marketed AMD in branding? Do you run the risk of Nvidia re-marketing/rebranding SAM as something Nvidia will claim as their own? Answer: To be determined...
(I'm flabbergasted. Perhaps he can't say in front of the camera, but I hope AMD isn't dumb enough to let Nvidia in without paying monthly royalty fees (SAM for rent).)

@38:40 SmartShift discussion.

@41:00 Is Infinity Cache just L3 cache? Mainly L3 cache with special sauce. He won't reveal exactly what it is yet.

@45:00 Sapphire does not help design the reference PCB.

@51:45 Availability discussed. They are shipping cards every day. They like the EVGA queue system, but wouldn't elaborate. AMD's website was able to stop scalpers; other partners were able to stop some of them.

@55:50 RT will be rolled out across the entire RX 6000 series stack.

@56:30 Variable Rate Shading discussed (a few minutes), along with Radeon Boost with ray tracing to improve performance.

@57:10 DirectStorage coming... but not elaborated on.
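
(Not from the interview, just a hedged aside: Smart Access Memory is AMD's branding of PCIe Resizable BAR, which lets the CPU map the GPU's entire VRAM instead of the legacy 256 MiB window. A minimal Linux-only sketch in Python, assuming the sysfs path /sys/class/drm/card0/device/resource for the first graphics card, prints the size of each PCI region; with SAM/Resizable BAR active, one region should roughly match the card's full VRAM rather than 256 MiB.)

Code:

# Rough sketch: read the PCI resource table for a GPU from sysfs and print the
# size of each region. The sysfs path is an assumption (first DRM card); adjust
# it for your system. Linux only.
from pathlib import Path

def region_sizes(resource_path="/sys/class/drm/card0/device/resource"):
    sizes = []
    for line in Path(resource_path).read_text().splitlines():
        start, end, _flags = (int(field, 16) for field in line.split())
        # Unused regions are reported as all zeros.
        sizes.append(end - start + 1 if end else 0)
    return sizes

if __name__ == "__main__":
    for index, size in enumerate(region_sizes()):
        if size:
            print(f"region {index}: {size / 2**20:.0f} MiB")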

demo Nov 20, 2020 01:26 AM

This part was interesting:

Quote:

DLSS discussed. AMD was originally going to develop their own API. Developers begged AMD not to create another API (it would only work with Radeon), so they are going for an open solution.
DLSS may end up dead, with devs opting for an open alternative used on consoles. Nvidia's only hope would be to market DLSS as having superior IQ.

NWR_Midnight Nov 20, 2020 02:00 AM

Quote:

Originally Posted by Nagorak (Post 1338246106)
So, after watching some more reviews:

It looks to me like for 4K you want the RTX 3000 series. For 1440p and below the 6800 series looks good, at least if we're talking about rasterization. I have to wonder whether it's the limited memory bandwidth that's holding the cards back at 4K?

Ray tracing performance is passable, but a little disappointing. The lack of a DLSS equivalent really hurts, since ray tracing is basically unplayable in some games without it. I don't fully cut AMD slack on this based on it being their "first gen of ray tracing", because Nvidia didn't really see much improvement in ray tracing with Ampere outside of it simply scaling with the cards being faster. I think it's still basically 1st gen vs 1st gen and AMD's implementation is not as good. It could be that newer drivers and tweaks by game developers will improve it, but the jury is still out on that one.

The efficiency is nice compared to Ampere, however not all places saw the large difference that TechPowerUp did. I guess it depends on the game? Regardless, greater efficiency is still a plus.

Overall I think the cards are priced correctly, and are a decent product, not an amazing product. (However by that same measure, Ampere is also only a decent product.)

It's still impressive that AMD is back in the game and actually is more efficient for a change. This is their most competitive card released in a long time, I'd say going back all the way to 290X released 7 years ago. Hopefully this is the start of consistent improvement on the part of AMD, which means competition will heat up again in the GPU market.

In the end how well the cards do will come down to availability. If the AIBs show up in force next week then AMD will get a lot of sales. If the GPU shortage continues then the real determination will come down to how many cards AMD and Nvidia can deliver, with both selling everything they make.

I don't think you are necessarily being fair in your view on ray tracing. Nvidia has been working on their generation 1 for 2 years, and when their ray tracing was first introduced 2 years ago its performance was much, much worse than it is today, as well as running into bugs. So I don't think you can call it first gen against first gen unless you go back and compare day 1 AMD to day 1 Nvidia. Nvidia required multiple patches in BF5 to deal with some of the issues, along with performance optimizations, and that wasn't even full illumination. They have also had 2 years to optimize it via drivers. The fact that Ampere isn't really an improvement is a huge black mark on Nvidia. Out of the gate, AMD is expected to do full illumination and shine? Yet full illumination didn't even happen for Nvidia until Metro Exodus. Not to mention that Nvidia had their hand in the cookie jar the whole time, having all these games optimized for them. So there should be a little more slack given to AMD than what it appears you are giving. Just my opinion.

bill dennison Nov 20, 2020 02:07 AM

Quote:

Originally Posted by demo (Post 1338246119)
This part was interesting:



DLSS may end up dead, with devs opting for an open alternative used on consoles. Nvidia's only hope would be to market DLSS as having superior IQ.

they won't, but the smart thing for NV to do would be a preemptive strike and open DLSS to all

but AMD will do it open with Sony and M$ consoles and the game devs, and it will take over just like FreeSync did

they could scream superior IQ all day, but if people can't really see it, it won't matter,
and if you really want superior IQ you want DLSS off.

bill dennison Nov 20, 2020 02:25 AM

Quote:

Originally Posted by NWR_Midnight (Post 1338246122)
I don't think you are necessarily being fair in your view on ray tracing. Nvidia has been working on their generation 1 for 2 years, and when their ray tracing was first introduced 2 years ago its performance was much, much worse than it is today, as well as running into bugs. So I don't think you can call it first gen against first gen unless you go back and compare day 1 AMD to day 1 Nvidia. Nvidia required multiple patches in BF5 to deal with some of the issues, along with performance optimizations, and that wasn't even full illumination. They have also had 2 years to optimize it via drivers. The fact that Ampere isn't really an improvement is a huge black mark on Nvidia. Out of the gate, AMD is expected to do full illumination and shine? Yet full illumination didn't even happen for Nvidia until Metro Exodus. Not to mention that Nvidia had their hand in the cookie jar the whole time, having all these games optimized for them. So there should be a little more slack given to AMD than what it appears you are giving. Just my opinion.

:lol:

you can't; it was close to two months after the 2080 Ti came out before Battlefield V, the first ray tracing game, arrived and you could do more than a short demo with a 1200-buck RT card

and then it was months between new RT games for NV to tweak them, and with some games RT and/or DLSS was added 6 months after the game came out

and let's get real

Quote:

Ray tracing games you can play right now:


Amid Evil
Battlefield V
Bright Memory
Call of Duty: Modern Warfare (2019)
Control
Crysis Remastered
Deliver Us The Moon
Fortnite
Ghostrunner
Justice
Mechwarrior V: Mercenaries
Metro Exodus
Minecraft
Moonlight Blade
Pumpkin Jack
Quake II RTX
Shadow of the Tomb Raider
Stay in the Light
Watch Dogs Legion
Wolfenstein: Youngblood
https://www.rockpapershotgun.com/202...ss-games-2020/

and some of them just plain suck.

badsykes Nov 20, 2020 02:43 AM

Quote:

Originally Posted by bill dennison (Post 1338246126)
:lol:

you can't; it was close to two months after the 2080 Ti came out before Battlefield V, the first ray tracing game, arrived and you could do more than a short demo with a 1200-buck RT card

and then it was months between new RT games for NV to tweak them, and with some games RT and/or DLSS was added 6 months after the game came out

and let's get real



https://www.rockpapershotgun.com/202...ss-games-2020/

and some of them just plain suck.


And another matter is....
When only a $700-800 GPU plus a fancy CPU can play RT at more than 60 fps at 1080p or 1440p, and these guys usually have 4K setups... what do you do about native resolution if you want to play RT smoothly? Change the monitor? Rendering below native resolution looks bad, from what I know?

Megaman Nov 20, 2020 02:55 AM

Quote:

Originally Posted by badsykes (Post 1338246128)
And another matter is....
When only a $700-800 GPU plus a fancy CPU can play RT at more than 60 fps at 1080p or 1440p, and these guys usually have 4K setups... what do you do about native resolution if you want to play RT smoothly? Change the monitor? Rendering below native resolution looks bad, from what I know?

You wait for the 3085. Then you wait for the 4080, then the 4085 and so on.

LordHawkwind Nov 20, 2020 06:22 AM

Quote:

Originally Posted by acroig (Post 1338246066)
Anyone see Jay roast the 6800 launch? Ouch...

Ouch indeed :lol: I particularly like the graph re vapourware :lol: :lol:

acroig Nov 20, 2020 06:27 AM

Quote:

Originally Posted by LordHawkwind (Post 1338246139)
Ouch indeed :lol: I particularly like the graph re vapourware :lol: :lol:

Yeah, lol. I'm hoping for lots more inventory on the 25th.

KAC Nov 20, 2020 06:43 AM

Anyone flashed a 6800 to a 6800 XT?

LordHawkwind Nov 20, 2020 08:03 AM

Quote:

Originally Posted by acroig (Post 1338246140)
Yeah, lol. I'm hoping for lots more inventory on the 25th.

This article makes for interesting reading. We know whose SoCs are in the new consoles, don't we?

https://investorsfreshnews.com/plays...ds-into-chaos/

This is also an interesting read from Gibbo at Overclockers UK:

"The CPU's are simple, we are buying most in small batches from grey resources because official stock is slim to zero and all that grey stock actually cost us above what you the customer is paying, the current AMD 5000 CPU's we are shipping out we are losing money on, but we buy these more expensive chips to ship some orders to do the best we can for our customers. We only turned one batch of 5800X away because the supplier wanted 100 more per CPU than we had sold at, we can afford to take a 10 or even 20 hit per CPU, but 100 hit per CPU we cannot afford that. That is why the CPU's were slightly above MSRP which makes us around 10% margin on official stock but at the same time means we don't loose too much when we buy none official stock. I've being doing this for twenty years and I know that at every CPU launch there will not be enough official product, I will have to buy grey product which loses us money but it is our priority to ship customers orders at the prices they paid, we always honour the price even if we lose money and we simply knew we would lose money on a large percentage of the pre-orders due to buying grey. So in short we charge a little extra, around 5% to give us a buffer to allow us to buy more expensive stock, because our main priority is shipping our customers orders, not how much money we make on those orders but at the same time we don't want to be shipping all the orders at losses so that buffer means we can take more expensive stock and still break even. By us taking official and grey stock means in the end we ship more orders meaning we please more customers who were happy to pay that small premium.

The customer has the choice, to shop around, the MSRP is also a suggestion not a given and after being recently burnt selling at MSRP's we are now far more cautious.

The GPU MSRP's are near impossible to hit for board partners and for ourselves, after operational cost we'd be lucky to break even and considering the huge amount of workload on all departments GPU's are creating selling product at a loss on several hundred or a thousand plus cards on a brand new in mega short supply product is simply not possible for ourselves. When the custom cards are released people will see that launch MSRP's are very distant compared to actual selling prices, that is not resellers gouging that is just board partners selling product at fair margin so they can survive.

We are always fair in what we do and if we can hit an MSRP we will do so, but to be quite frank after the 3080 MSRP that has lost us potentially thousands we are now a lot more cautious. As 24hr after launch of 3080, in some cases, minutes/hours no reseller anywhere was selling a board partner card at 649, ask yourself why and it was not because of gouging."

He's also commented that there will be slim to no cards from 6800 AIBs next Wednesday. I'd only expect cards from Sapphire, PowerColor, XFX and whoever else is an AMD-only partner. Forget Asus, MSI & Gigabyte; they've got too many pre-ordered 3080s to deliver.
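
(A hedged aside to make Gibbo's arithmetic concrete: the MSRP and cost figures below are made-up examples; only the ~10% margin, ~5% buffer and the £10/£20/£100 per-CPU "hits" come from the quote above.)

Code:

# Rough sanity check of the retailer margin argument quoted above. The baseline
# MSRP is a made-up example; the percentages and per-CPU cost "hits" are the
# ones mentioned in the post.
MSRP = 300.0                        # hypothetical CPU MSRP in GBP
sell_price = MSRP * 1.05            # ~5% buffer over MSRP, as described
official_cost = sell_price * 0.90   # ~10% margin when buying official stock

for grey_hit in (10, 20, 100):      # extra cost per CPU on grey-market stock
    profit = sell_price - (official_cost + grey_hit)
    print(f"grey stock at +{grey_hit}: profit per CPU = {profit:+.2f}")

# A 10-20 hit still leaves a small profit; a 100 hit is a heavy loss per unit,
# which is why that batch was turned away.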

Riptide Nov 20, 2020 08:09 AM

Was the whole "way more aib inventory on 25th" thing emanating from some guy on hardocp or is there an actual statement from a company to back this?

Because at this point, I'm skeptical. Sounds to me like we'll have another 60-120 second rat race to check out with a whole bunch of sad faces afterwards.

CyanBlues Nov 20, 2020 08:23 AM

Sounds like the AMD-only AIBs will get most of them vs the non-AMD vendors like MSI/Gigabyte/Asus etc...

LordHawkwind Nov 20, 2020 08:36 AM

Quote:

Originally Posted by Riptide (Post 1338246153)
Was the whole "way more aib inventory on 25th" thing emanating from some guy on hardocp or is there an actual statement from a company to back this?

Because at this point, I'm skeptical. Sounds to me like we'll have another 60-120 second rat race to check out with a whole bunch of sad faces afterwards.

Regarding availability, it's simple maths & economics really.

Apparently MS has sold 1.5m Xboxes since launch. Sony, being the bigger brand, will have sold at least the same if not more. So that's 3m consoles.

In the UK, AMD have shipped about 200 Zen 3s, mostly the 5600. Again in the UK, AMD have shipped at best 200 6800s, with only about 20% of them being XTs.

It's really not rocket science to work out where all the 7nm wafers are going, is it? You can't blame AMD; consoles are money in the bank. It's just the way they've tried to cover up the supply issues. If you're honest and upfront, people can respect that. Instead, what they're doing is like Chinese water torture: drip, drip, drip, and as you said this just leaves them with frustrated consumers. Obviously the same goes for Nvidia, but as this is an AMD thread I'm not going to comment on that.
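
(To put rough numbers on the "simple maths": a back-of-the-envelope sketch using approximate public die sizes (roughly 360 mm2 for a console SoC, 520 mm2 for Navi 21, 81 mm2 for a Zen 3 chiplet) and the standard gross-dies-per-wafer estimate, ignoring yield. Illustrative only, not AMD's actual allocation.)

Code:

# Back-of-the-envelope gross dies per 300 mm wafer, ignoring yield, binning and
# packaging. Die areas are approximate public figures; output is illustrative.
import math

WAFER_DIAMETER_MM = 300.0

def gross_dies_per_wafer(die_area_mm2):
    """Common estimate: wafer area / die area minus an edge-loss correction."""
    d = WAFER_DIAMETER_MM
    return int(math.pi * d ** 2 / (4 * die_area_mm2)
               - math.pi * d / math.sqrt(2 * die_area_mm2))

dies = {
    "console SoC (~360 mm^2)": 360.0,
    "Navi 21 (~520 mm^2)": 520.0,
    "Zen 3 chiplet (~81 mm^2)": 81.0,
}

for name, area in dies.items():
    print(f"{name}: ~{gross_dies_per_wafer(area)} gross dies per wafer")

# Every million console SoCs ties up wafers that can't become Navi 21 or Zen 3:
print(f"wafers per 1M console SoCs: ~{1_000_000 / gross_dies_per_wafer(360.0):.0f}")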

LordHawkwind Nov 20, 2020 08:47 AM

Here's another interesting article on 6800 availability outside of the UK.

https://www.guru3d.com/news-story/dr...80-launch.html

Still think there'll be stock of the 6900 XT, Billy?

Mangler Nov 20, 2020 09:01 AM

What's the status on the $10 bet?

pax Nov 20, 2020 09:51 AM

Quote:

Originally Posted by Riptide (Post 1338246153)
Was the whole "way more aib inventory on 25th" thing emanating from some guy on hardocp or is there an actual statement from a company to back this?

Because at this point, I'm skeptical. Sounds to me like we'll have another 60-120 second rat race to check out with a whole bunch of sad faces afterwards.

The only thing he said was there would be 7 times more than reference, but how many reference cards were there? And how much vs demand?

It's just that if you preferred reference with AMD, you would have to be lucky to get one.

LordHawkwind Nov 20, 2020 10:11 AM

Quote:

Originally Posted by Mangler (Post 1338246166)
What's the status on the $10 bet?

He's trying to weasel out of the bet :lol:

https://www.world-today-news.com/the...-be-paperless/

This is also very funny.

https://www.reddit.com/r/Amd/comment...ief_architect/ Some of the comments are great.

"Sack Frank Azor" is trending heavily, especially for that 2nd tweet. "Brain dead" comes to mind when this guy is mocking potential AMD consumers who couldn't buy the card. Was he really that desperate to try and win a $10 bet? :lol:

Shapeshifter Nov 20, 2020 10:35 AM

https://videocardz.com/newz/powercol...d-devil-tested

Quote:

With manual overclocking, ExtremeIT was able to achieve a 2750 MHz game clock and a 2800 MHz boost clock, according to the GPU-Z monitoring software. By default, however, the graphics card runs at 2090/2340 MHz respectively. That manual overclock did not appear to be very stable (multiple display signal losses), so eventually he backed it down to 2600/2650 MHz.
Quote:

This allowed the RX 6800 XT Red Devil to score 56,756 points in Fire Strike (1080p preset, Graphics score).

NWR_Midnight Nov 20, 2020 10:52 AM

Quote:

Originally Posted by bill dennison (Post 1338246126)
:lol:

you can't it was close to two months after the 2080 ti came out before the first ray tracing game Battlefield V came out and you could do more than a short demo with a 1200 buck RT card

and then it was months between new RT games for NV to tweak them
then with some games RT and or DLSS was added 6 months after the game came out

and lets get real



https://www.rockpapershotgun.com/202...ss-games-2020/

and some of them just plain suck .

Exactly. Someone gets it!
Quote:

Originally Posted by Shapeshifter (Post 1338246211)

Thank you! Had to scroll through 2/3 of a page of posts about availability talk that belongs in the designated topic for it to get to something that has to do with performance and the actual card itself.

pax Nov 20, 2020 10:52 AM

Well, I was worried that having the same memory architecture from the 6800 to the 6900 was a problem, but this test says it isn't:

https://twitter.com/CapFrameX/status...23250088603657


Quote:

RAM frequency scaling test from @ComputerBase:

→ 17 games at 4K
→ standard vs. 8600 MHz
→ 1% difference

Big Navi is not bandwidth limited thanks to 128MB Infinity Cache.

NWR_Midnight Nov 20, 2020 10:58 AM

Quote:

Originally Posted by pax (Post 1338246219)
Well, I was worried that having the same memory architecture from the 6800 to the 6900 was a problem, but this test says it isn't:

https://twitter.com/CapFrameX/status...23250088603657

So why the drop-off at 4K if it's not memory bandwidth limited? At 1080p/1440p the cards are beasts; at 4K they fall flat relative to the 1080p/1440p results.

LordHawkwind Nov 20, 2020 11:06 AM

To try and be a bit more fair, I think it's clear that in AMD's early planning they knew they had 0% of the enthusiast GPU market, and they knew that Nvidia, with 100% of the segment, was launching the 30 series in the same quarter. They will have then set themselves a target of how many 6800 sales they could realistically make coming from a non-existent base. They would have realistically set a target of 2 to 3% over, say, the first 6 months and an optimistic market share of 10% over the life of the card (obviously these are just made-up numbers). That would then have fed into their production schedule, and this would have been set months in advance of the launch.

As they are also producing the Xbox, PS5 and Zen 3, it would probably have left them very little wriggle room for re-crunching the numbers. Then they saw what happened with the 3080/90 and, two weeks later, the 3070 launch. I bet they thought oh sh*t. Now, instead of maybe 2% of the market being a target, there were probably 10x that number who couldn't get a 30 series card and would now consider a 6800. Unfortunately, they would have known there was no way they could meet this huge demand, as the production numbers would have been written in stone and unable to be changed. So this was a perfect sh*t storm that they would have had no way of predicting during their production scheduling.

At this point they could have come out, told the truth about the situation and asked for people's patience whilst they tried to address the issue. Like they mention in the article Pax posted, they could have come up with an EVGA-type queue system to ensure the right people got the cards they did have, without all the frustration.

The rest, as they say, is history, as we all know what they chose to do. Add in the stupid bet from Frank Azor and then his moronic tweet about buying a 6800, and the whole pack of cards collapsed. Don't lie to your consumers, as that never ends well, as Apple found out with the Jobs aerial issues and bendgate, and of course Nvidia with its gimped 970 memory, to name but a few.

So where does that leave us with 6800 availability? In the short term, little to non-existent, and in the medium term a little better, but the chances of any of us getting one in 2020 are looking remote unless we get very, very lucky.

Obviously, all opinions are my own. Just my 2c.

pax Nov 20, 2020 11:28 AM

Quote:

Originally Posted by NWR_Midnight (Post 1338246221)
So why the drop-off at 4K if it's not memory bandwidth limited? At 1080p/1440p the cards are beasts; at 4K they fall flat relative to the 1080p/1440p results.

That's the $64,000 question... maybe drivers, but maybe some other bottleneck that we don't know of yet. It's not ROPs, that's for sure; 128 is overkill. We see the 6800 outperform what its 17% lower CU count vs the 6800 XT would suggest, and that's with the 6800 XT running higher clocks, so whatever it is, adding CUs isn't giving a 100% scaling increase in perf. So not ROPs or CUs...

TUs maybe? But I'm thinking more and more it's drivers.
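
(One more candidate worth keeping in mind is the Infinity Cache hit rate itself: a 128 MB cache covers less of the working set as resolution climbs, so more traffic spills out to GDDR6. A back-of-the-envelope sketch: the 512 GB/s figure is the real 256-bit / 16 Gbps GDDR6 number, but the cache bandwidth and per-resolution hit rates below are approximate, illustrative assumptions, not measurements.)

Code:

# Simple weighted model of effective bandwidth for a cached memory system.
# The GDDR6 figure is the real 256-bit @ 16 Gbps number for the RX 6800/6800 XT;
# the Infinity Cache bandwidth and hit rates below are rough assumptions.
GDDR6_BW = 512    # GB/s
IC_BW = 1940      # GB/s, approximate figure cited for Infinity Cache

def effective_bandwidth(hit_rate, cache_bw=IC_BW, dram_bw=GDDR6_BW):
    """Hits are served from the cache, misses fall through to GDDR6."""
    return hit_rate * cache_bw + (1.0 - hit_rate) * dram_bw

# Illustrative hit rates only: the bigger the framebuffer and working set,
# the smaller the fraction of it the 128 MB cache can hold.
for res, hit in (("1080p", 0.80), ("1440p", 0.70), ("4K", 0.58)):
    print(f"{res}: ~{effective_bandwidth(hit):.0f} GB/s effective")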

EYE4LYFE Nov 20, 2020 11:34 AM

ASUS Radeon RX 6900 ROG STRIX series spotted at EEC

There is a small confusion around the new submission, though. For reasons unknown, ASUS used the Radeon RX 6900 non-XT codename instead of the RX 6900 XT (the only SKU AMD officially introduced). AMD did not confirm, however, that the 6900 series would be available as custom designs, but this is the second AIB known to be preparing such graphics cards, so it is safe to assume that this is indeed happening. Whether ASUS used the wrong code, or there is indeed an RX 6900 non-XT SKU planned, we don't know yet.

According to the list, ASUS would launch the ROG STRIX LC series of the RX 6900 (XT) SKU. The regular ROG STRIX is also planned accompanied by the mid-range TUF Gaming series. Meanwhile, the DUAL and DUAL EVO series might be exclusive to the slowest Navi 21 SKU, the RX 6800 non-XT.



videocardz

bill dennison Nov 20, 2020 11:35 AM

Quote:

Originally Posted by LordHawkwind (Post 1338246161)
Here's another interesting article on 6800 availability outside of the UK.

https://www.guru3d.com/news-story/dr...80-launch.html

Still think there'll be stock of the 6900 XT, Billy?

in the US, yes, by Xmas

no one cares about Europeans

bill dennison Nov 20, 2020 11:47 AM

Quote:

Originally Posted by Shapeshifter (Post 1338246211)

Damn, will AMD break 3000 MHz? :lol:

Higgy10 Nov 20, 2020 12:37 PM

Guys, I created a thread to discuss availability but it seems we all have literacy issues. I'm just gonna start deleting posts from here on in now, or, if it amuses me, I'll edit the post to say things such as "I'm illiterate" or "I didn't graduate from primary school", etc.

Nagorak Nov 20, 2020 07:39 PM

Quote:

Originally Posted by NWR_Midnight (Post 1338246122)
I don't think you are necessarily being fair in your view on ray tracing. Nvidia has been working on their generation 1 for 2 years, and when their ray tracing was first introduced 2 years ago its performance was much, much worse than it is today, as well as running into bugs. So I don't think you can call it first gen against first gen unless you go back and compare day 1 AMD to day 1 Nvidia. Nvidia required multiple patches in BF5 to deal with some of the issues, along with performance optimizations, and that wasn't even full illumination. They have also had 2 years to optimize it via drivers. The fact that Ampere isn't really an improvement is a huge black mark on Nvidia. Out of the gate, AMD is expected to do full illumination and shine? Yet full illumination didn't even happen for Nvidia until Metro Exodus. Not to mention that Nvidia had their hand in the cookie jar the whole time, having all these games optimized for them. So there should be a little more slack given to AMD than what it appears you are giving. Just my opinion.

No, this is a fair counterpoint. I was thinking mostly in hardware terms, not software optimization, which Nvidia has had a lot more time to do (both on the driver and developer level).

I still have mixed feelings about the importance of ray tracing this gen, because the performance hit is usually quite high for any substantial effects, and even now not that many games support it.

