RTX 30X0 power issue thread.

From this side, it seems like you've done as much assumption/speculating as I have, lol.

Why don't you go take a walk to the UPS driver or drive to him? I used to do that all the time when I lived in the city. Throw the guy a $10 and grab your package. Saves him time later
 

I don't see how the well-known phenomenon of content creation and synthetic benchmarks ALL agreeing with the expected performance while gaming does not has EVER historically pointed to anything OTHER than drivers needing optimization.

Literally, you have no historic ground to stand on; this is not assumption. This is fact....

And no need to chase UPS man! He is HERE!!!!! :D I will abstain from this debate for now, I think you and I see eye to eye on THIS point :p
 

Congrats! Pics of the card, benches, gaming .. you know the drill. Enjoy :)
 
The problem is that the 3090 shouldn't have been called the 3090. It should have been called some kind of Titan variant, as it's not optimized for games.

We saw this happen before with the Titan RTX: a higher core count than the 2080 Ti, yet not much faster in games. In content creation apps, however, the difference was much more significant.

So unless Nvidia starts giving dual driver options, one optimized for gaming and one for compute/content creation, I don't think we'll see much more improvement with the 3090.

The 3090, interestingly, should have been priced HIGHER (in line with prior Titan cards). It looks as though Nvidia saw an opportunity to sell more cards (as limited as supply is) by giving it the 3090 tag and dropping the price to midway between the gaming and prosumer segments. To put it in perspective, 20 x $1499 brings in more than 10 x $2500. If you're a prosumer, the 3090 is probably a steal. If you're just a gamer, stay away from this.
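The arithmetic there checks out; a quick sketch (the unit counts are the post's hypothetical example, not real sales figures):

```python
# Hypothetical volumes: lower price, double the units sold.
titan_pricing = 10 * 2500    # 10 cards at a Titan-class $2,500
rtx3090_pricing = 20 * 1499  # 20 cards at the 3090's $1,499

print(titan_pricing)    # 25000
print(rtx3090_pricing)  # 29980
```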

Of course I could completely KAC on this and games do end up being optimized better in a single driver package.
 
Oh Newegg. TUF shows the add to cart button. Me: YES, I want that one!!! Clicks add to cart. Clicks checkout. Sorry, just messing with you man. It's out of stock now. :cry: :nag:
 

Newegg is doing a ton of site shuffling tonight, and it's actually causing some bots to stop working. A lot of speculation about a big drop tonight. But we'll see.
 
A 20% shader count difference results in a 20% OR MORE average gain in many content creation tests. That is clear scaling working as intended, and EVERYBODY AND THEIR DOG knows that gaming optimization in drivers is the FIRST FACTOR AFTER TFLOPS to consider when looking for bottlenecks.
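For reference, a back-of-envelope check of that scaling claim (the core counts are the public Ampere specs; the gaming uplift figure is a placeholder for illustration, not a measurement):

```python
# Public CUDA core counts for the GA102-based cards.
cores_3090, cores_3080 = 10496, 8704

theoretical_uplift = cores_3090 / cores_3080 - 1  # ~20.6%
observed_gaming = 0.10  # hypothetical example figure, not measured

print(f"theoretical: {theoretical_uplift:.1%}")
print(f"left on the table in games: {theoretical_uplift - observed_gaming:.1%}")
```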

In this case, the bottleneck CLEARLY points to the drivers for the 3090 not having as many game optimizations as the 3080's so far, and the week between the two launches MIGHT play into that difference, at least THIS early in both products' lives.

So I stand firmly on the 3090 having plenty of driver performance lifts coming, and your stance that GeForce historically LACKS fine-wine improvements is false in itself. Having experienced those improvements over the life of my 1080 Ti, I'm shocked you can even type points to the contrary with a straight face.

Edit: Am I wrong in assuming you have had a 1080 Ti as well? I assumed you owned one (or a 2080 Ti?) to be making broad statements about the lifetime performance of such products. But the direction of your gist here makes me wonder, because you cannot have owned a high-end GPU without experiencing such issues before and recognizing the clear signs. A lack of driver optimization for gaming ALWAYS shows up as content creation apps performing as expected while games perform like crap. This is known behavior from well before this generation.

I don't think your driver optimization argument makes any sense at all. They are both the same chip; one is just cut down more than the other. Why would drivers favor the 3080 over the 3090?

I highly doubt you're going to see any changes in relative gaming performance between the 3080 and 3090 with drivers. They are too similar. What benefits one will benefit the other equally.

There is some other bottleneck that is clearly holding the 3090 back, and the most likely explanation is the power limit is holding it to lower clocks. Or it could scale to lower clocks in general due to having more CUs and being held back by the worst ones, which on the 3080 are disabled.

Content creation doesn't utilize the GPU in the same way and that can explain how it scales better on the 3090. It could be the larger memory bandwidth is more important in content creation, or simply the larger memory capacity.
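On the bandwidth point, the launch specs make the gap easy to quantify (bus widths and per-pin data rates below are the public numbers for both cards):

```python
# GB/s = bus width (bits) * data rate (Gbps per pin) / 8 bits per byte.
def bandwidth(bus_bits, gbps_per_pin):
    return bus_bits * gbps_per_pin / 8

bw_3090 = bandwidth(384, 19.5)  # 936.0 GB/s
bw_3080 = bandwidth(320, 19.0)  # 760.0 GB/s
print(f"3090 has {bw_3090 / bw_3080 - 1:.1%} more bandwidth")  # ~23.2%
```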
 

Holding one in my hand: 1950s MHz for the 3090 is the same average as the 3080, so it is not clock speed related. And the ONLY part you got right is content creation apps working differently... everything else was wrong. Yanno how content creation apps work differently than games? Driver optimizations. Content creation apps generally don't need much driver optimization and grab hardware resources fine on their own. Unlike games.

Your argument is invalid all around.
 
Yeah, ok, not going to waste time arguing with you. I highly doubt you're going to see the gains you expect in games relative to the 3080, but feel free to believe what you want.
 


Vega had the same problem: it did great in creation apps and crap in games,
and AMD, which is known for "fine wine" and getting better with drivers months later, never got it fixed

:hmm: it was Samsung also


and like Vega, the 3090 is a good miner; maybe gaming was not its focus

but keep dreaming of the magic driver; all the Vega 64 users that still have them still are
 

Nvidia did go out of their way to promote 8K gaming and even offered new flexible DLSS settings for 8K, so there was some gaming focus and some resources spent. One may argue about the relevance of 8K, though.
 
Wun dae !___!



 

I think it's actually a good strategy by Nv, because most people automatically ignore Titans.. but 3090? That's just a next tier gaming GPU over 3080, right? :bleh:
 

I said it before on this forum about the $1,200 price of the 2080 Ti: if it had been called a Titan with the same performance, I would have bought a 2080. But being called the 2080 Ti changed my perception.
 
You are too gullible. :bleh:
 
Looks like the Nvidia driver fix took no performance away from the cards, so this was a whole lot about nothing.
 