Nvidia Has a Driver Overhead Problem, GeForce vs Radeon on Low-End CPUs

Yeah... now if only the used GPU market weren't in complete chaos, I would get something to keep my little Haswell system going :(
 

Yes, because when I spend $700 to $800 on a 3080 I plan on playing at 1080p medium details. :lol:
 

Maybe I live in a parallel universe, but I thought everyone understood that Nvidia's high-end/enthusiast cards are not designed for 1080p gaming. They're squarely aimed at 1440p/UWQHD/4K, and that's where they shine. AMD cards have always been better at lower resolutions like 1080p but fall behind at 1440p/UWQHD and die a dreadful death at 4K. Plus, can anyone tell the difference between 161 and 131 fps if there were no frame counter? Really?
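For perspective, here's the frame-time math behind those two numbers (just arithmetic on the quoted figures, nothing taken from the charts themselves):

[code]
# Convert the quoted frame rates to per-frame times.
for fps in (161, 131):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")

# 161 fps -> 6.2 ms per frame
# 131 fps -> 7.6 ms per frame
# The gap works out to roughly 1.4 ms per frame.
[/code]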
 
The bigger deal is that a 3080 is no faster than a 3060 Ti, and that an RX 580 is faster than both, by a lot.

So in my scenario I was considering a 3060 Ti. But based on the chart above, and the rest, it really might not have been worth it.
 
At 1080p medium settings, though. Do you play at 1080p medium? If not, then I wouldn't worry about it much. The overhead issue is directly related to CPU-limited scenarios, and with a 3060 Ti you'd more than likely be GPU limited at all times regardless of your CPU.
 
At 1080p medium settings, though. Do you play at 1080p medium? If not, then I wouldn't worry about it much. The overhead issue is directly related to CPU-limited scenarios, and with a 3060 Ti you'd more than likely be GPU limited at all times regardless of your CPU.

It would depend. But the story doesn't change much at 1440p, which is a little closer in difficulty to my 3840x1080 situation. If I'm trying to keep the FPS as close to 144 as I can, then I would probably drop to medium settings to pull it off.

Of course, what this really means is that I need to upgrade the core of my system. But I've known that for a while; it's just not a great time to do it.
 
For an enthusiast-class gamer this may be a non-issue, but if you're an entry-level or even midrange gamer it may be an issue, and I would like to see this improve.
 
It would depend. But the story doesn't change much at 1440p, which is a little closer in difficulty to my 3840x1080 situation. If I'm trying to keep the FPS as close to 144 as I can, then I would probably drop to medium settings to pull it off.

Of course, what this really means is that I need to upgrade the core of my system. But I've known that for a while; it's just not a great time to do it.


The story does change at 1440p, and it changes significantly when the bottleneck shifts from the CPU to the GPU. If someone wants to pair an $800 3080 with a 10-year-old CPU or a $100 new bottom-of-the-barrel CPU, then that person should stop focusing on being CPU limited with medium details and instead shift the bottleneck from the CPU to the GPU.

That means cranking up the details, applying supersampling AA, DSR, etc., whatever, instead of focusing on medium. Being CPU limited at 130 fps on a 3080 means you can hold that same 130 fps while cranking up the settings until the GPU becomes the bottleneck.
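A quick sketch of that frame-time logic, with made-up millisecond numbers just to show the shape of it:

[code]
# If the CPU side of each frame takes ~7.7 ms (about 130 fps), piling on
# GPU work costs nothing until the GPU's per-frame time passes the CPU's.
# The millisecond figures here are invented for illustration only.
cpu_ms = 7.7  # CPU-limited at roughly 130 fps

for gpu_ms in (3.0, 5.0, 7.0, 7.7, 10.0):  # cranking up the settings
    fps = 1000 / max(cpu_ms, gpu_ms)       # the slower side sets the frame rate
    print(f"GPU {gpu_ms:4.1f} ms/frame -> {fps:5.1f} fps")

# Everything up to 7.7 ms still prints ~129.9 fps; only past that does fps drop.
[/code]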

This is really making a mountain out of a molehill. Yes, under certain conditions you can make a 580 faster than a 3080, but under realistic conditions you can make a 3080 absolutely stomp a 580 into the dirt where it belongs on the exact same machine/display setup.
 
For an enthusiast-class gamer this may be a non-issue, but if you're an entry-level or even midrange gamer it may be an issue, and I would like to see this improve.

I doubt this is even a problem for midrange gamers. You need to have basically 100% CPU utilization across all cores for this issue to even pop up.
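If anyone wants to sanity-check whether they're actually in that boat, a rough per-core utilization log while the game is running will show it. Something like this (assumes Python with the psutil package installed; just a quick sketch, not a proper monitoring tool):

[code]
# Print per-core CPU utilization once a second while a game runs.
# If every core sits near 100%, you're in the CPU-limited territory
# where the driver overhead difference can actually show up.
import psutil  # pip install psutil

try:
    while True:
        per_core = psutil.cpu_percent(interval=1.0, percpu=True)
        note = "  <- all cores pegged" if min(per_core) > 90 else ""
        print(" ".join(f"{p:5.1f}%" for p in per_core) + note)
except KeyboardInterrupt:
    pass
[/code]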
 
The story does change at 1440p, and it changes significantly when the bottleneck shifts from the CPU to the GPU. If someone wants to pair an $800 3080 with a 10-year-old CPU or a $100 new bottom-of-the-barrel CPU, then that person should stop focusing on being CPU limited with medium details and instead shift the bottleneck from the CPU to the GPU.

That means cranking up the details, applying supersampling AA, DSR, etc., whatever, instead of focusing on medium. Being CPU limited at 130 fps on a 3080 means you can hold that same 130 fps while cranking up the settings until the GPU becomes the bottleneck.

This is really making a mountain out of a molehill. Yes, under certain conditions you can make a 580 faster than a 3080, but under realistic conditions you can make a 3080 absolutely stomp a 580 into the dirt where it belongs on the exact same machine/display setup.

Erm, I don't care about the 3080; I cared about the 3060 Ti. But you still see it from the bottom of the lineup on up.

Skip to the 4-minute mark and you can see the same story play out at 1440p in Watch Dogs.
[yt]JLEIJhunaW8[/yt]

It's not a huge deal. But it's a funny oversight, and one that matters for low-to-mid-tier gamers like myself.
 
IMO, some here are minimizing the problem. For many of us, whole-system upgrades are not possible, so people put their money where they think the current bottleneck is, which is often the GPU.

It is not remotely inconceivable that people with an older quad-core CPU will buy a high-end Nvidia 3000-series card to replace their aging Radeon, only to find that their FPS went down in the process, not realizing that the increased driver overhead has tanked the potential advantage. This phenomenon was unknown to me until recently, and I suspect that most people assume that adding a high-end GPU will automatically improve frame rates. Yet the evidence shows the opposite in CPU-limited scenarios, which really are not that uncommon for the many people who can't afford to constantly upgrade to the latest and greatest CPU/motherboard/RAM/monitor.
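A toy frame-time model makes the mechanism clearer: whichever side takes longer per frame, CPU (game logic plus driver submission) or GPU, sets the frame rate, and on the GeForce side the CPU carries extra driver time. The millisecond numbers below are invented purely for illustration:

[code]
# Toy model of how a much faster GPU can still deliver lower fps on a
# weak CPU once the driver's CPU cost is accounted for. Numbers invented.

def fps(game_cpu_ms, driver_cpu_ms, gpu_ms):
    # The slower of the two pipelines gates each frame.
    frame_ms = max(game_cpu_ms + driver_cpu_ms, gpu_ms)
    return 1000 / frame_ms

game_cpu_ms = 6.0  # hypothetical per-frame game logic cost on an old quad-core

# Aging Radeon: slow GPU, light CPU-side driver cost -> GPU limited.
print(f"{fps(game_cpu_ms, driver_cpu_ms=1.0, gpu_ms=9.0):.0f} fps")  # ~111 fps

# New GeForce: much faster GPU, heavier CPU-side driver cost -> CPU limited.
print(f"{fps(game_cpu_ms, driver_cpu_ms=3.5, gpu_ms=4.0):.0f} fps")  # ~105 fps
[/code]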

IMO, HWUB is providing important information for the community here, and it should be shared widely so that others can make informed system upgrade decisions.
 
It's not just the 3080; they simply use it for shock value on the charts. It's the entire Nvidia lineup that's affected on older or many mid/lower-range CPUs. The point is, if you have a 4-core or a mid- or low-end 6-core, you can get hit with this.
 
3060 Ti or 3090, it doesn't matter. If you're CPU limited on Nvidia with an old CPU, then you shift the bottleneck from the CPU to the GPU. It's funny to see the arguments cling to medium/low settings when cranking up to higher settings makes this "issue" basically disappear.

If cost were really a concern, you'd be much better off just getting a console, and you wouldn't be limited to 1080p medium. You'd be running a game like Watch Dogs Legion at (faux) 4K with much higher fidelity.

Or just quit being stingy and upgrade your CPU. It cost me $500 to upgrade my office 4770K system to an i7-10700F. That's CPU, motherboard, 16GB of memory and a $40 cooler, paired with a 2080 Ti before and after the upgrade. Everything else, including the power supply and drives, stayed the same. Now I can play Watch Dogs 2 and Legion at something higher than 1080p/1440p medium, a much better prospect than switching out the 2080 Ti for an RX 580 and being stuck with those low details. People for some reason forget that for a gaming system, the CPU is just as important as the GPU.
 
IMO, some here are minimizing the problem. For many of us, whole-system upgrades are not possible, so people put their money where they think the current bottleneck is, which is often the GPU.

It is not remotely inconceivable that people with an older quad-core CPU will buy a high-end Nvidia 3000-series card to replace their aging Radeon, only to find that their FPS went down in the process, not realizing that the increased driver overhead has tanked the potential advantage. This phenomenon was unknown to me until recently, and I suspect that most people assume that adding a high-end GPU will automatically improve frame rates. Yet the evidence shows the opposite in CPU-limited scenarios, which really are not that uncommon for the many people who can't afford to constantly upgrade to the latest and greatest CPU/motherboard/RAM/monitor.

IMO, HWUB is providing important information for the community here, and it should be shared widely so that others can make informed system upgrade decisions.

It's not just the 3080; they simply use it for shock value on the charts. It's the entire Nvidia lineup that's affected on older or many mid/lower-range CPUs. The point is, if you have a 4-core or a mid- or low-end 6-core, you can get hit with this.

Bingo

3060 Ti or 3090, it doesn't matter. If you're CPU limited on Nvidia with an old CPU, then you shift the bottleneck from the CPU to the GPU. It's funny to see the arguments cling to medium/low settings when cranking up to higher settings makes this "issue" basically disappear.

If cost were really a concern, you'd be much better off just getting a console, and you wouldn't be limited to 1080p medium. You'd be running a game like Watch Dogs Legion at (faux) 4K with much higher fidelity.

Or just quit being stingy and upgrade your CPU. It cost me $500 to upgrade my office 4770K system to an i7-10700F. That's CPU, motherboard, 16GB of memory and a $40 cooler, paired with a 2080 Ti before and after the upgrade. Everything else, including the power supply and drives, stayed the same. Now I can play Watch Dogs 2 and Legion at something higher than 1080p/1440p medium, a much better prospect than switching out the 2080 Ti for an RX 580 and being stuck with those low details. People for some reason forget that for a gaming system, the CPU is just as important as the GPU.

Wow? I just wanted to get a little more out of my Haswell rig while I could, and there was a point where a 3060 Ti to pair with it was where I really wanted to go. Seemed like a perfectly reasonable combo, with room to grow later. But as HWUB has made clear... I would have partially screwed myself for a bit.

Again, this isn't a huge deal. But you're overly minimizing the issue, Exposed.
 
IMO, some here are minimizing the problem. For many of us, whole-system upgrades are not possible, so people put their money where they think the current bottleneck is, which is often the GPU.

It is not remotely inconceivable that people with an older quad-core CPU will buy a high-end Nvidia 3000-series card to replace their aging Radeon, only to find that their FPS went down in the process, not realizing that the increased driver overhead has tanked the potential advantage. This phenomenon was unknown to me until recently, and I suspect that most people assume that adding a high-end GPU will automatically improve frame rates. Yet the evidence shows the opposite in CPU-limited scenarios, which really are not that uncommon for the many people who can't afford to constantly upgrade to the latest and greatest CPU/motherboard/RAM/monitor.

IMO, HWUB is providing important information for the community here, and it should be shared widely so that others can make informed system upgrade decisions.


It actually crossed my mind to pair a 3080 with my Ryzen 7 1700. I wanted to play with the full RTX implementation at 1080p.
 
It actually crossed my mind to pair a 3080 with my Ryzen 7 1700. I wanted to play with the full RTX implementation at 1080p.

Yeah... not everyone looking at anything from a 3060 Ti to a 3090 is necessarily in the same situation. A significant percentage will be people with older CPUs thinking that a new GPU will remove a bottleneck rather than add one. This is important info that consumers should know about.

Note, I'm not against Nvidia... I just have an issue with those who seek to minimize this problem, perhaps because they wrongly assume that others have the means (or desire) to constantly upgrade system components like they do.
 

The 3600X comes close to a lot of Intel 6-cores, but a high-end Intel 6-core might be able to pull off some upgrades. I wouldn't be sure about longevity, though.

Seeing this, you have to wonder at what point the evolving drivers and ever-growing games will keep taxing CPUs more and more.

Even as a casual gamer myself, I wouldn't touch a 6-core if I were to consider Nvidia. And I would consider them if they ever got to a point where they aren't such a high-price queen in normal times, much less these days.
 