Official Nvidia RTX 3080/3090 review thread

I can't recall Nvidia ever releasing GPU sales numbers. How many 2080 Tis did they actually sell? Probably a lot fewer than 1080 Tis, and maybe that's why they've gone back to this pricing. How many people really need a 3090 at $1,400, beyond the fact that they can afford it? Not many.

If you want some insight from a big survey, the Steam hardware survey is one place to look.


https://store.steampowered.com/hwsurvey/videocard/

Based on this survey, the RTX 2080 Ti did surprisingly well, imho.
 
What do you guys think about Hardware Unboxed's finding regarding VRAM in Doom Eternal? He says VRAM is limiting it, but he also says the game was showing 9GB of VRAM usage, and the 3080 has 10GB, so that's kind of odd.
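If you want to sanity-check what a game is actually allocating yourself, something like the sketch below works (this assumes the pynvml package is installed and is just a rough way to poll the driver, not how Hardware Unboxed measured it; note the number is memory allocated on the card, which isn't necessarily memory the game strictly needs):

Code:
# Rough sketch: poll VRAM allocation on GPU 0 every few seconds while a game runs.
import time
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

try:
    while True:
        mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
        print(f"VRAM allocated: {mem.used / 1024**3:.1f} GiB of {mem.total / 1024**3:.1f} GiB")
        time.sleep(5)
finally:
    pynvml.nvmlShutdown()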

I have no doubt that games will be more CPU and VRAM intensive in the future, but will it really matter if you are planning on using a 3080 for a couple of years and then rebuilding your system? I think for me it will, because I use some "not so professional" mods with monstrous texture packs as it is, but I'm thinking it probably won't matter to your average user.

Another thought I just had was that Red Dead Redemption 2 could still use a little boost to its minimums.:)
 
Fair enough :) but I should point out that each game has CPU bottlenecks in different areas. I've experienced them with the i9-9900K in Metro Exodus and Hitman 2. It comes down to where you benchmark the game. Jayztwocents hasn't posted his follow-up to the benchmarks he ran on the quoted tests, so we don't know about that specific test group just yet.

Also, I was right about the i9-9900K having a 20% lead at 1080p, and that lead will continue when we start to see those same bottlenecks at higher resolutions. From the tests I have seen, the 3080 didn't reach them, but the 3090 has yet to be tested, and 1440p is now showing a solid 10% advantage to the i9-9900K over the 3900X across the board.

I was correct about the real-world clock advantages of the i9-9900K, I was correct about the lack of advantages from the extra cores on the 3900X, and I was correct about PCIe 4.0 not having any advantages. I was also correct about CPU bottlenecks starting to appear more often at higher resolutions like 1440p and 4K. It's not my logic you have been arguing with; it's undisputed facts. The bottlenecks just haven't hit a critical enough point at 4K for it to matter in the areas tested so far with the 3080.

I will admit I was wrong about calling it a slaughter; even if it did turn out to be one, that comment was childish, which happens sometimes when I'm being trolled.

In the end, the 8-core, 14nm i9-9900K that came out in 2018 is still faster at 1080p and 1440p than the 12-core, 7nm 3900X from 2019, and there will be areas in games at 4K where a bottleneck occurs here and there and it will gain an advantage as well. I know this because I have experienced them personally.

Here is a pretty good review that shows the I9 9900K and 3900X being tested with the 3080 if you are interested.
https://www.techpowerup.com/review/nvidia-geforce-rtx-3080-amd-3900-xt-vs-intel-10900k/

According to your TechPowerUp article the 9900K was overclocked, so I wouldn't say you were right per those benchmarks, as it wasn't running at stock speeds. Overclocked is a different story, but not everyone chooses to overclock.

I also don't think Intel being 20% faster in SOME of the benchmarks is all tied to the CPU, but rather the game's code. There are quite a few games where the difference is between 2% and 5% at 1080p.
 
According to your TechPowerUp article the 9900K was overclocked, so I wouldn't say you were right per those benchmarks, as it wasn't running at stock speeds. Overclocked is a different story, but not everyone chooses to overclock.

I also don't think Intel being 20% faster in SOME of the benchmarks is all tied to the CPU, but rather the game's code. There are quite a few games where the difference is between 2% and 5% at 1080p.

My original post back when the 3900X came out was that the i9-9900K was 32% faster overall, based on Gamers Nexus' review clearly stating it. Then when the 10900K review came out, the i9-9900K was 20% faster at Gamers Nexus due to changes in their tests and game lineup. I usually cite Gamers Nexus review info because I trust them to really stress the CPU when they do CPU stress tests.

The TechPowerUp article shows a 10% advantage to the i9-9900K at 1080p, and you can see those same tests now show a 7% advantage at 1440p, so the bottleneck is now starting to rise up to 1440p. How much ahead it is comes down to the games tested and where in the game the testing was done. That is why it's still possible for Jayztwocents's follow-up video to show a substantial gain across all resolutions; it just depends on how CPU-limited his tests are.

EDIT:
Overclocking isn't for everyone, but if you buy an AIO then all it takes is a minute to give an Intel chip a significant boost. That isn't the case with AMD, but as you can see, it's all relative to how CPU-bound the tests actually are: if they are not very CPU-bound, then the OC won't show much advantage, and neither will faster CPUs in general.
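To put that in rough terms, here's a toy model (all the numbers below are made up for illustration, not benchmark data): frame rate is capped by whichever of the CPU or the GPU runs out of headroom first, so a CPU lead only shows up once the GPU cap rises above the CPU cap.

Code:
# Toy model: fps is limited by the slower of the CPU frame cap and the GPU frame cap.
# All numbers are hypothetical, just to show why a CPU lead at 1080p can shrink
# at 1440p and disappear at 4K.
def fps(cpu_cap, gpu_cap):
    return min(cpu_cap, gpu_cap)

cpu_a, cpu_b = 180, 150                              # hypothetical CPU-limited fps
gpu_caps = {"1080p": 220, "1440p": 160, "4K": 90}    # hypothetical GPU-limited fps

for res, gpu_cap in gpu_caps.items():
    a, b = fps(cpu_a, gpu_cap), fps(cpu_b, gpu_cap)
    print(f"{res}: CPU A {a} fps vs CPU B {b} fps ({(a / b - 1) * 100:+.0f}%)")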
 
My original post back when the 3900X came out was that the i9-9900K was 32% faster overall, based on Gamers Nexus' review clearly stating it. Then when the 10900K review came out, the i9-9900K was 20% faster at Gamers Nexus. I usually cite Gamers Nexus review info because I trust them to really stress the CPU when they do CPU stress tests.

The TechPowerUp article shows a 10% advantage to the i9-9900K at 1080p, and you can see those same tests now show a 7% advantage at 1440p, so the bottleneck is now starting to rise up to 1440p.

You do realize that TechPowerUp is basically Gamers Nexus in print format, right? So the 9900K's lead went from 32% to 10%, and you believe this all has to do with the CPU and can't possibly be due to game code and such affecting it? Especially when there are many games that are only 2% to 5% apart. I will give you that some of it is due to BIOS updates and microcode changes, but software (games and Windows) has more to do with it than the CPU.

But that is enough on that subject... You guys enjoy your 3080s.
 
I think it has more to do with the games tested. Gamers Nexus was finding CPU limitations at extremely high frame rates at 1080p, but how long before 1440p would actually reach those extremely high FPS? That is why they changed their game lineup to include more demanding titles, ones that don't show such an exaggerated FPS outcome.

Also, I agree, enjoy the 3080s; based on the benchmarks I have seen, you can't go wrong with either CPU at 4K with a 3080.:)
 
Everything is exactly like it was stated before the reviews. Nothing underwhelming about it. Looks like a nice upgrade for us 1080 Ti owners.

I think the 1440p results are fairly disappointing. It's only offering around a 20% improvement on average over the 2080 Ti; I'd have liked to see at least a 30% improvement. Frankly, if I still had my 2080 Ti I'd probably just push the power slider up to max, which would put me at only 10% slower than the 3080 at the same power draw, and then just skip this gen.

The problem also is that some of the stuff Nvidia was putting out was suggesting up to 40% faster than the 2080 Ti in cherry picked scenarios. So, that doesn't help things.

For people running 4K it looks better, but not everyone is running 4K.
 
I think the 1440p results are fairly disappointing. It's only offering around a 20% improvement on average over the 2080 Ti; I'd have liked to see at least a 30% improvement. Frankly, if I still had my 2080 Ti I'd probably just push the power slider up to max, which would put me at only 10% slower than the 3080 at the same power draw, and then just skip this gen.

The problem also is that some of the stuff Nvidia was putting out was suggesting up to 40% faster than the 2080 Ti in cherry picked scenarios. So, that doesn't help things.

For people running 4K it looks better, but not everyone is running 4K.

My 2080 Ti @ 2150/8000 on an AIO is not within 10% of the 3080 numbers, so pushing your power limit up would not have got you that close. Gamers Nexus' numbers were using an overclocked 2080 Ti Strix, and it was still getting slapped around like a child.

I think people expecting huge gains at resolutions below 4K are disappointed, but in reality 1440p really isn't that hard a resolution to push anymore. The 2080 Ti has handled it extremely well. There's only one game I've seen it struggle with, and that was Kingdom Come: Deliverance.
 
I think the 1440p results are fairly disappointing. It's only offering around a 20% improvement on average over the 2080 Ti; I'd have liked to see at least a 30% improvement. Frankly, if I still had my 2080 Ti I'd probably just push the power slider up to max, which would put me at only 10% slower than the 3080 at the same power draw, and then just skip this gen.

The problem also is that some of the stuff Nvidia was putting out was suggesting up to 40% faster than the 2080 Ti in cherry picked scenarios. So, that doesn't help things.

For people running 4K it looks better, but not everyone is running 4K.

Look at my benchies; my power slider is up there too, and the 3080 is still 20 to 30 percent faster.
 
I think the 3080 is delivering your standard 20% to 30% for $699. The FE has a great cooler from the looks of it, too. The only disappointments for me are that it took them two years to deliver it and that more VRAM would have been nice. The GDDR6X probably costs a lot, which might be why they cut it down from the 1080 Ti and 2080 Ti. Dedicated RTX and DLSS 2.0 aren't bad features to have this time around either; let's hope game developers support them more now that the tech has matured a bit.
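As quick back-of-the-envelope value math (the $699 is from above; the roughly $1,200 for a 2080 Ti and the +25% performance figure are my own rough assumptions, purely for illustration), the per-dollar jump looks big even if the raw uplift is the usual 20-30%:

Code:
# Back-of-the-envelope performance per dollar; the 2080 Ti price and the +25%
# uplift are assumptions for illustration only, not figures from a review.
perf_2080ti, price_2080ti = 1.00, 1200.0
perf_3080,   price_3080   = 1.25, 699.0   # "standard 20% to 30%", split the difference

ratio = (perf_3080 / price_3080) / (perf_2080ti / price_2080ti)
print(f"3080 delivers roughly {ratio:.1f}x the performance per dollar")  # ~2.1x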
 
So you're going to play Vegas roulette for an RTX 3080 :p

good luck :heart:


Thank you sir! :D :heart:


What do you guys think about Hardware Unboxed's finding regarding VRAM in Doom Eternal? He says VRAM is limiting it, but he also says the game was showing 9GB of VRAM usage, and the 3080 has 10GB, so that's kind of odd.

I have no doubt that games will be more CPU and VRAM intensive in the future, but will it really matter if you are planning on using a 3080 for a couple of years and then rebuilding your system? I think for me it will, because I use some "not so professional" mods with monstrous texture packs as it is, but I'm thinking it probably won't matter to your average user.

Another thought I just had was that Red Dead Redemption 2 could still use a little boost to its minimums.:)

Honestly, I run High settings a lot of the time because Ultra settings are such a small improvement for such a big loss in performance that it's not justified. That's also where Nvidia got the "2x" performance in Doom: it was a memory bottleneck. :rolleyes: :lol:

I know my 1700 is going to be a problem now, even at 1440p, but I had money set aside for a 3090. Now I can upgrade to an R7 4700X/mobo/memory and come in under that 3090. I really just want this card for Cyberpunk 2077; I want the RTX reflections and shadows, etc.

AMD's Navi 21 might come close to matching the 3080, but it doesn't have the RTX enhancements in games that the Nvidia cards do. :)


I think the 3080 is delivering your standard 20% to 30% for $699. The FE has a great cooler from the looks of it, too. The only disappointments for me are that it took them two years to deliver it and that more VRAM would have been nice. The GDDR6X probably costs a lot, which might be why they cut it down from the 1080 Ti and 2080 Ti. Dedicated RTX and DLSS 2.0 aren't bad features to have this time around either; let's hope game developers support them more now that the tech has matured a bit.

Samsung is the reason they don't have faster cards and better yields, but they honestly made do with what they had and produced an enticing product.

DLSS 2.0 will be in Cyberpunk and hopefully will be adopted in more games; after running Control with it, it looks pretty damn good!
 
Thank you sir! :D :heart:

Honestly, I run High settings a lot of the time because Ultra settings are such a small improvement for such a big loss in performance that it's not justified. That's also where Nvidia got the "2x" performance in Doom: it was a memory bottleneck. :rolleyes: :lol:

I know my 1700 is going to be a problem now, even at 1440p, but I had money set aside for a 3090. Now I can upgrade to an R7 4700X/mobo/memory and come in under that 3090. I really just want this card for Cyberpunk 2077; I want the RTX reflections and shadows, etc.

AMD's Navi 21 might come close to matching the 3080, but it doesn't have the RTX enhancements in games that the Nvidia cards do. :)

Samsung is the reason they don't have faster cards and better yields, but they honestly made do with what they had and produced an enticing product.
DLSS 2.0 will be in Cyberpunk and hopefully will be adopted in more games; after running Control with it, it looks pretty damn good!

I'm interested in Cyberpunk as well, and I'm curious how DLSS 2.0 will actually look and perform in it. I'm a "native is best" type of dude, but I just upgraded to a 4K 120Hz panel and I'm thinking I might try DLSS 2.0 in an attempt to get higher FPS. It's hard to tell how CPU demanding it will be, but an upgrade to the 4000 series will alleviate any problems, if I had to guess.

I'm going after a 3090 as well. Probably won't be lucky enough to get one though.:lol:
 
My 2080 Ti @ 2150/8000 on an AIO is not within 10% of the 3080 numbers, so pushing your power limit up would not have got you that close. Gamers Nexus' numbers were using an overclocked 2080 Ti Strix, and it was still getting slapped around like a child.

The GN reviews showed that the 3080 was only about 10-15% faster than the OC'd Strix, which I would consider pretty standard for an OC'd 2080 Ti.

I think with most AIB 2080 Tis with an increased power limit, you can get about 10% more performance out of them compared to an FE (not like I've tested hundreds of cards, but it seemed true for mine). That's not even a great overclock, honestly, but it's enough to make the difference smaller.

Hardware Unboxed/Techspot only show a 21% increase on average at 1440p, and they're using a 2080 Ti FE as far as I can tell. So that's how I get only around 10% faster once accounting for the overclock, which oddly in this case only brings the power draw up to about the same level.
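For what it's worth, the rough math behind that 10% figure (just plugging in the 21% and 10% numbers from above, nothing re-measured) works out like this:

Code:
# Rough arithmetic for the claim above; the 21% and 10% figures come from the post.
stock_2080ti = 1.00
rtx_3080     = 1.21   # ~21% over a stock 2080 Ti FE at 1440p
oc_2080ti    = 1.10   # ~10% from raising the power limit / overclocking

gap = rtx_3080 / oc_2080ti - 1
print(f"3080 vs overclocked 2080 Ti: ~{gap * 100:.0f}% faster")  # ~10%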

So, anyway, to me it looks like this generational improvement is at the low end of average. If you guys are super impressed by it then more power to you. I'm not that impressed but I'll be getting one anyway. ;)

Look at my benchies; my power slider is up there too, and the 3080 is still 20 to 30 percent faster.

You haven't actually compared a 3080 in your own system though, right? Anyway, it is what it is, I think the 3080 is a decent card, I'm just not blown away by it.

To be fair my performance at 1440P was already fine with a 2080 Ti, so I'm not exactly desperate for huge improvements, unlike those running at 4K!
 
Hey... for me, an upgrade from a GTX 1080 will be massive. I just can't wait to start RDR 2...:D


:lol: I went from a 1080 to a 2080S and had a pretty big performance uplift. Just wait till I buy my 3080 first! :lol:


I'm interested in Cyberpunk as well, and I'm curious how DLSS 2.0 will actually look and perform in it. I'm a "native is best" type of dude, but I just upgraded to a 4K 120Hz panel and I'm thinking I might try DLSS 2.0 in an attempt to get higher FPS. It's hard to tell how CPU demanding it will be, but an upgrade to the 4000 series will alleviate any problems, if I had to guess.

I'm going after a 3090 as well. Probably won't be lucky enough to get one though.:lol:


Wow, nice monitor! What monitor did you get? I got my 1440p 144Hz a few years ago, and despite it being a TN panel, I love it!

I am actually going after the 3080, not the 3090. I want to do a CPU/mobo/memory upgrade, so it will cost me less to upgrade my whole system than to get a 3090. :) I would love to have Hap money to spend on a 3090, but really, a 20% increase for $800 more doesn't sit well with me.
That is more for people who want extreme speed at any cost, or people who render or run workstation apps that can utilize 24GB of memory.
 
You haven't actually compared a 3080 in your own system though, right? Anyway, it is what it is, I think the 3080 is a decent card, I'm just not blown away by it.

To be fair my performance at 1440P was already fine with a 2080 Ti, so I'm not exactly desperate for huge improvements, unlike those running at 4K!

Nope, just the 2080 Ti.
 