RTX 3090 Reviews

Here is a reliable review of the 3090 STRIX from ASUS. Looks like the 3090 really can achieve 20% faster performance than a 3080, which equates to about 20 FPS or more at 4K resolution. It's beating the 2080 Ti by 56%, which is solid. Very nice card, but unless you use the quiet BIOS it's pretty loud, and that price is pretty bad. It's pretty impressive that they can keep this 390-watt card at 68C, and when overclocked it pulls 480 watts and still stays at 75C or lower. Very nice cooling potential.

https://www.techpowerup.com/review/asus-geforce-rtx-3090-strix-oc/
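
Just to spell out the math behind "20% faster equals about 20 FPS", here's a quick sketch. The 100 FPS baseline is a made-up round number for illustration; the real 3080 figure varies per game.

[code]
# Rough sketch: what a relative uplift means in FPS terms.
# The 100 FPS baseline is a hypothetical round number, not a measured result.
baseline_3080_fps = 100.0   # assumed 3080 average at 4K in some title
uplift = 0.20               # 3090 Strix ~20% faster per the review

fps_3090 = baseline_3080_fps * (1 + uplift)
print(f"3090 estimate: {fps_3090:.0f} FPS (+{fps_3090 - baseline_3080_fps:.0f} FPS)")
# -> 120 FPS, i.e. roughly +20 FPS whenever the 3080 sits near 100 FPS
[/code]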
 

Wait, so an OC'd 3090 at 480W beating a base-clock 3080 by 20% = the 3090 is 20% faster?

Sorry, if the average OC'd 3090 isn't 20% faster than the average OC'd 3080, it's not 20% faster. If you ask me, overclocking will more likely pull them closer together in the real world than base clock/boost does.

3090 Strix owner: "Haha, my $1800 3090 is 20% faster than your $700 3080."

Base 3080 owner: "Just a sec, let me load my Afterburner profile... Hmm, looks like your $1800 3090 is only 7% faster than my $700 3080."
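
To put rough numbers on that scenario, here's a small sketch of how overclocking the cheaper card shrinks the gap. The ~12% Afterburner gain on the 3080 is an assumed illustrative figure, not a benchmark result.

[code]
# Illustrative only: the 3080 overclock figure is an assumption, not a measurement.
base_3080 = 100.0            # normalize a stock 3080 to 100
strix_3090 = 120.0           # ~20% faster at stock, per the review
oc_3080 = base_3080 * 1.12   # assume a ~12% Afterburner overclock on the 3080

gap = strix_3090 / oc_3080 - 1
print(f"3090 Strix vs OC'd 3080: {gap:.1%} faster")   # ~7.1%
[/code]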
 
Well, it's 18% faster at its base OC, and then they can squeeze another 3% out of it. Really, I don't think the overclocking numbers are worth the extra power draw even if the temps are under control. Still, I'm happy to see that the 3090 can go 20% faster than a 3080 as long as the board partner is willing to allow 390-watt power delivery. 390-watt power delivery with 20% more performance than a 3080 and more than double the VRAM at 68C is pretty sweet. The 3080 is locked at 320 watts, and that is what separates the two more than anything else. It's 50% or more above the 2080 Ti with the VRAM that I wanted. I'm just not liking that price. Of course we won't be able to buy one anyway so... Not going to lie, if I could buy one right now I probably would.
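
To put the overclocking trade-off in perspective, here's a quick sketch using the figures above: 18% plus another 3% stacked multiplicatively, and the jump from 390 W to 480 W.

[code]
# Perf-per-watt sketch using the figures quoted above.
stock_gain = 0.18            # Strix 3090 over a 3080 at its factory OC
oc_gain = 0.03               # extra squeezed out by a manual overclock
total_gain = (1 + stock_gain) * (1 + oc_gain) - 1
print(f"Total over a 3080: {total_gain:.1%}")          # ~21.5%

power_stock, power_oc = 390.0, 480.0                   # watts, from the review
print(f"{power_oc / power_stock - 1:.0%} more power for "
      f"{oc_gain:.0%} more performance")               # ~23% more power for 3%
[/code]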
 
I know this seems like crazy talk, but I'm starting to lean away from all the reviewers ragging on these things and on Nvidia over not actually being able to game at 8K. I'm sitting here watching 4K/8K gaming results and people are playing a bunch of games at 8K just fine.

I realize it's not the popular opinion, but I mean... they ARE playing these things. :bleh: A surprising number of them are at 45-60 FPS. AC Odyssey was 45 FPS in their whole demo. I played through all of AC Origins at 4K on my 1080 Ti with the frame-cap slider locked at 45 FPS. What's the difference? :lol: Tons of console games ran at 30 FPS...

[yt]jyZbArP-Y2w[/yt]
 
I think it's because reviewers are knocking Nvidia for trying to sell Titan cards as gaming cards. In my opinion, Nvidia is pushing the 8K gaming scenario because consoles are pushing the 4K gaming scenario, and Nvidia is trying to separate itself from console gaming. Most reviewers are overlooking that. 8K is something I would do with DSR on old games, and yes, it's possible in a handful of games with DLSS, but Nvidia's 8K gaming claims are similar to the consoles' 4K gaming claims. It can be done at playable levels with resolution hacks, but most of us would probably rather have better frame rates at 4K. It's the 3090's VRAM that separates it from the 3080 at 8K, which is why they chose to market it that way. Nvidia is just trying to fight console marketing with those 8K claims.
 
I agree that I'd certainly want better frame rates at 4K. The whole point is moot because you still need to get an 8K display. My only point is, if someone really wanted to play at 8K and had the means, a bunch of these games DO seem actually playable. Without DLSS and without resolution hacks.

Again, the key word here is playable. Obviously none of us would call 30-45 FPS a great example of a PC gaming experience. :p
 
I've been checking out 5K and 8K and I'm pleasantly surprised; I thought it would be much worse. Seeing this has changed my thinking toward waiting for the 20GB SKU, so I have the choice to use 5K and 8K with DSR without memory limitations in some titles.
 

That's a smart move in my opinion. The extra VRAM could come in handy with future titles two years from now, but we don't know. If you are the type of gamer who wants to use high-resolution texture packs from modders, or you are interested in using DSR at resolutions above 4K, then more VRAM is a must.
 

Yeah, I agree. If 30 FPS is considered playable, then it can be done without DLSS. Even with DLSS, most titles will struggle for 8K60, which is why I'm not really worried about 8K. If I have the power to achieve 60+ FPS with 8K DSR on my 4K panel, then I will use it. That is why I'm happy about 8K DLSS. I still have plenty of old games like STALKER and GTA that I wouldn't mind trying to force to 8K DSR/DLSS. Will anyone be able to add DLSS support for older titles via modding, though? Doubtful. I'm still pumped about playing the Command & Conquer remaster at 8K DSR 120 FPS though. :D

I wouldn't be surprised to see a 3090 powering older games like STALKER at 8K DSR without the need for DLSS, and that would be very cool for me. I always love pushing every ounce of quality out of my old favorite games. I still play a lot of older titles.
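
For a sense of why 8K DSR is such a heavy ask, here's a quick pixel-count sketch; the resolutions are just the standard 4K and 8K figures, nothing game-specific.

[code]
# Pixel-count comparison: 8K DSR on a 4K panel renders 4x the pixels.
w4k, h4k = 3840, 2160
w8k, h8k = 7680, 4320

factor = (w8k * h8k) / (w4k * h4k)
print(f"8K renders {factor:.2f}x the pixels of 4K")   # 4.00x (DSR factor 4.00)
# Roughly four times the shading work per frame, before any DLSS help,
# which is why only older or lighter titles hold up at 8K DSR.
[/code]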
 

What I was getting at is: how does it compare to a base OC 3080 Strix?

It's the same logic in reverse as the people who claim card A isn't better than card B because you can OC card B and get the same performance as card A, completely ignoring the fact that you can OC card A too.
 

The RTX 3080 really handles 4K for me, which is why I was curious about 5K and 8K gaming as a viable DSR choice for some titles. Obviously 8K wasn't going to deliver ideal gaming for all titles, and I feel Nvidia was too aggressive with the 8K claim, using best-case examples with the RTX 3090. I can see playing titles at 5K, and some at 8K DSR, with an RTX 3080 20GB SKU.
 

I don't know, I haven't seen a review of the 3080 Strix, but I do know that Gamers Nexus got a 16% OC with liquid nitrogen on the ASUS TUF. It really comes down to how much power the board allows for overclocking, but the OC isn't doing much for the results. You get better performance on a stock 3090 than you do with the LN2-cooled ASUS 3080 card, from what I have seen. Either way, the 3090 isn't worth double the price for 20% more performance anyway. If you can't use the extra VRAM, then it's pointless, because that is what you are paying for.

My goal has been to get more than 50% over the 2080 Ti with double the VRAM for roughly $1400 to $1500. The 3090 gets close, and that's why I'm not upset with it. I need the VRAM for my art projects, and if I can get an extra 20 FPS at 4K when I game, then that will help add value to the card.
 
I think 8K is just a pipe dream at this point. It's not clear that public demand is high even for 4K, let alone 8K. We're probably at least a decade off from that being mainstream, if it ever is. At some point the difference gets small enough that the average consumer just isn't going to care, so it's possible 8K never catches on.

Even as a super tiny niche, anyone trying to run 8K right now is basically looking for an excuse to make themselves miserable, because the performance just isn't there.
 

Until we have lots of 8K TV content available on disc and on DirecTV, it won't be mainstream.
 

I agree that 8K displays are niche, but AMD, Nvidia, Microsoft, Sony, and many developers are targeting 4K gaming, coupled with much more affordable 4K displays. Kind of a clue about trending demand for 4K gaming, one may imagine.
 

Overclocking doesn't really net that much performance for me.

10600 points vs 10900 points in Fire Strike Ultra.
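
For reference, the quick math on those scores (as quoted above) looks like this:

[code]
# Percentage gain from the overclock, using the Fire Strike Ultra scores above.
stock_score, oc_score = 10600, 10900
gain = oc_score / stock_score - 1
print(f"Overclock gain: {gain:.1%}")   # ~2.8%
[/code]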
 

I agree. I've seen an 8K TV in action playing video that was encoded at 8K, and I couldn't tell the difference between it and the 4K TV standing next to it until I got within 5 feet of the display, and these were 85-inch TVs I was looking at.

In my room, I'm sitting about 7 feet away from a 77-inch display, and I simply can't go any bigger without moving my home theater into the basement. I can't see myself buying an 85-inch screen or larger, so I don't see any real reason to use 8K. The jump to 4K was noticeable with screen sizes larger than 40 inches or on monitors that you sit right up against. Even then, HDR added more to movies and games than resolution did.

I've been fighting for 4K 60 FPS since I owned 980 Ti SLI, and the 3000 series is the first one that can actually do it without SLI. I'm done moving up in resolution. Higher resolution will be useful for VR, but that's about it. Movies are being shot at 2K and 4K; Hollywood doesn't want to move up to 8K because it costs too much and has little to no benefit. The only reason I would want 8K is for less aliasing, but you can get that with DSR in the rare title that will actually run at 8K 60 FPS. The 3000 series is already struggling for 4K 60 FPS in games like Control, even with DLSS enabled. There is no way I'm moving up to 8K until my TV dies and 8K is simply the new standard that you are forced to buy if you want the best-looking screens. Until then, I don't really care about it.
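
That "within 5 feet" observation lines up with the usual visual-acuity math. Here's a hedged sketch that estimates the distance beyond which 20/20 vision (roughly 60 pixels per degree) can no longer resolve individual pixels on an 85-inch 16:9 panel; it's a simplified model of human vision, not a hard perceptual rule.

[code]
import math

# Estimate the farthest distance at which 20/20 vision (~60 pixels per degree)
# can still resolve individual pixels. Simplified acuity model, not a hard rule.
def max_useful_distance_ft(diagonal_in, horizontal_px):
    width_in = diagonal_in * 16 / math.hypot(16, 9)    # panel width from diagonal
    pixel_pitch_in = width_in / horizontal_px
    # At the threshold distance, one pixel subtends about 1/60 of a degree.
    distance_in = pixel_pitch_in / math.tan(math.radians(1 / 60))
    return distance_in / 12

for label, px in [("4K", 3840), ("8K", 7680)]:
    print(f"{label}: pixel detail blends away beyond ~{max_useful_distance_ft(85, px):.1f} ft")
# -> roughly 5.5 ft for 4K and 2.8 ft for 8K on an 85-inch screen, which matches
#    only being able to tell the panels apart from within about 5 feet.
[/code]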
 