Nvidia Has a Driver Overhead Problem, GeForce vs Radeon on Low-End CPUs

I wanted to move away from the 16.7M-colour 8-bit panels that are being phased out ... plain and simple. Also, 1.07 billion colours (10-bit) is not worthless; it helps with gradients and fine detail. Regarding monitors, the gaming community has been so brainwashed into accepting any misery and all sorts of compromises. (I am pointing at the Lurk situation.)

https://www.rtings.com/tv/tests/picture-quality/gradient --- For me personally, it is noticeable.


A 3090 with 62 fps minimums will not last many years at 1440p Ultra, never mind 4K. This is a $2,000 card! It will be quickly overwhelmed. I would not give the 3090 more than two years before it can't sustain a 60 fps minimum in fully ray-traced games. The deep reliance on DLSS also smells very bad; I have a feeling that if you deactivate DLSS at 1440p Ultra you may not like what you see. If you have deep enough pockets to pay $2,000 every two years to keep a 60 fps minimum in RTX games at 1440p, be my guest.

Mangler has a valid perspective, but I still don't fully believe we will be "stuck" for the next few years. We will see.

badsykes, you're taking my comments out of context. When I mentioned 62 fps at 1440p I wasn't talking about the 3090. It was a graphics-card-agnostic statement relating purely to frames per second and the monitor's native resolution. In BF5 with everything on Ultra settings and Ultra RT at 1440p, no DLSS, I was getting >100 fps. Now that I've got another ~1.2M pixels to push it's probably about 90.
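For the curious, a quick back-of-the-envelope sketch of that pixel maths. The resolutions are the ones mentioned above; the linear fps-per-pixel scaling is only a rough assumption, since real games are rarely purely pixel-bound.

[CODE]
# Back-of-the-envelope pixel arithmetic for the resolutions mentioned above.
# Assumes fps scales linearly with pixel count, which real games only
# approximate (CPU limits, memory bandwidth, etc. all interfere).

def pixels(width: int, height: int) -> int:
    return width * height

qhd = pixels(2560, 1440)       # 3,686,400 pixels
uwqhd = pixels(3440, 1440)     # 4,953,600 pixels

extra = uwqhd - qhd            # roughly the "1.2M extra pixels" quoted above
naive_fps = 100 * qhd / uwqhd  # 100 fps at 2560x1440, scaled to 3440x1440

print(f"Extra pixels: {extra:,}")              # 1,267,200
print(f"Naively scaled fps: {naive_fps:.0f}")  # ~74 if purely pixel-bound
[/CODE]

Purely pixel-bound scaling would predict roughly 74 fps rather than ~90, which only suggests the game isn't entirely resolution-limited at those settings.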

The few other games I have that have RT are more than playable at acceptable frame rates at my native resolution of 3440x1440 without DLSS, so I'm not sure what you're talking about. The lowest I've seen is in SOTTR, where I averaged 75 fps with Ultra settings and native Ultra RT. And that's a poorly optimised RT game. I think you're reading too many clickbait articles and YouTube videos where they're talking out of their ars**.

Ray tracing is nowhere near ready for the mainstream yet; why do you think Nvidia has DLSS and AMD are looking to launch their own version? IMHO you're looking at maybe 4 to 5 years until you get 4K >100 fps with no DLSS, and that's probably a conservative estimate.

To quote Cyberpunk as an example of RT is a bit of a joke really. They released a half baked game that was so buggy it was embarrassing. Even the people who worked on it are trying to distance themselves from the mess. It's a bit like quoting the Titanic as a good example for a cruise ship :lol:

There are a lot of people like me who aren't really interested in RT in the games we play, and I, for one, didn't buy the 3090 because of its RT performance. I bought it because I think I can play games on it for at least 3 to 4 years at Ultra settings (no RT) no problem. Well, the first-person shooters and sports sims I play, anyway; I'm just totally uninterested in role-playing crap, no offence.
 

I do plan to play games with the full RTX implementation and have that experience, which is why I kind of take Cyberpunk as the baseline for 1440p Ultra. At 1080p Ultra I can get away with a 2080 Ti/3070, but at 1440p Ultra a 3090 is needed.
From your history of buying AMD in recent years, you jumped from a Vega 64 to a 5700 XT to a 3090. Your cards didn't last very long. You were buying in the $400-500 segment and jumped to the $2,000 segment. It doesn't feel like you were very pleased with the performance, otherwise you would have stayed in that segment. I don't think all the YouTube reviewers, big or small, lie; they wouldn't last if they lied.
 
Well ... at least I know that at 1280x720 my Vega 56 has better performance than an RTX 3090. This is where a CRT would be an awesome investment. Dropping the resolution to 720p gives me better performance than an RTX 3090. I don't actually need to buy a new GPU... I have a GPU that is better than the most expensive GPU :)
 
Enjoy low resolution I guess? That's kind of an absurd statement.

We are living in "absurd" times.
I buy graphics monitors for playing Doom Eternal in black and white...
KAC bought AMD ...
GPUs have "absurd" prices. I can sell my Vega 56 for the same price I bought it for.


Should I add more "absurd" stuff? :D


Later edit: I moved the 6700 XT vs 5700 XT comparison into the 6700 XT thread.
 
Well ... at least I know that at 1280x720 my Vega 56 has better performance than an RTX 3090. This is where a CRT would be an awesome investment. Dropping the resolution to 720p gives me better performance than an RTX 3090. I don't actually need to buy a new GPU... I have a GPU that is better than the most expensive GPU :)

Yes... OK... if you can play at 720p, then fine. I'm sure you can find a great CRT that you can play at that res, in B&W.
 
Enjoy low resolution I guess? That's kind of an absurd statement.

Silly question, because I'm not one of the knowledgeable guys here... but at 720p wouldn't the CPU, not the GPU, be doing most of the heavy lifting?

:bleh:
 

Yeah, at that res the bottleneck shifts to the CPU, as the GPU ends up waiting for draw calls and triangles to be submitted by the CPU.
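A rough way to check this for yourself, sketched below: benchmark the same scene at a low and a high resolution, and if the frame rate barely improves at the lower one, the CPU/driver is the limiter rather than the GPU. The function name, threshold, and numbers are just placeholders for illustration.

[CODE]
# Rough heuristic for spotting a CPU (or driver-overhead) bottleneck:
# benchmark the same scene at a low and a high resolution. If the frame
# rate barely improves when the pixel count drops, the GPU is waiting on
# the CPU/driver. All numbers below are made-up placeholders.

def likely_cpu_bound(fps_low_res: float, fps_high_res: float,
                     tolerance: float = 0.10) -> bool:
    """True if cutting the resolution gained less than ~10% fps."""
    gain = (fps_low_res - fps_high_res) / fps_high_res
    return gain < tolerance

print(likely_cpu_bound(fps_low_res=142.0, fps_high_res=138.0))  # True  -> CPU-bound
print(likely_cpu_bound(fps_low_res=210.0, fps_high_res=120.0))  # False -> GPU-bound
[/CODE]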

The thing is, one can always go down in resolution, but the IQ would certainly suffer. Aliasing and texture filtering would become huge issues for me. Not to mention that most CRTs are locked at 60Hz (not all).

To each his own.
 

:up: Thanks man, maybe badsykes should just upgrade his CPU :D
 
I do plan to play games with the full RTX implementation and have that experience, which is why I kind of take Cyberpunk as the baseline for 1440p Ultra. At 1080p Ultra I can get away with a 2080 Ti/3070, but at 1440p Ultra a 3090 is needed.

From your history of buying AMD in recent years, you jumped from a Vega 64 to a 5700 XT to a 3090. Your cards didn't last very long. You were buying in the $400-500 segment and jumped to the $2,000 segment. It doesn't feel like you were very pleased with the performance, otherwise you would have stayed in that segment. I don't think all the YouTube reviewers, big or small, lie; they wouldn't last if they lied.

When I owned my AMD Fury X I bought my first FreeSync monitor, which only AMD supported at the time, so I became "locked" into the AMD ecosystem. That's why I bought the Vega 64 and then the 5700 XT, as I liked using FreeSync for no tearing or stuttering. I even briefly bought the Radeon VII for £700 but cancelled my order when I read the reviews, because of the noisy fans. For 1440p 144Hz, the Fury, Vega 64 and 5700 XT were very good and I had no issues with any of them in playing games.

For this generation my initial thoughts were to buy a 6800 XT or a 3080, whichever I was able to get. Obviously that didn't work out well, so I broadened my horizons and budget and moved on to a 6900 XT or 3090. The 6900 XT was basically vapourware, but I was able to purchase a 3090, so I did. Quite simple really. I didn't need to upgrade my 5700 XT but I WANTED to, which is totally different.

Personally I wouldn't play Cyberpunk if it were free; I just don't like that genre of games. I've just purchased Metro Exodus for £9.00 because they're bringing out a whole new enhanced ray-tracing version for PC (which is free if you own the game), so I might be able to see what all the fuss is about with that game.
 
Well ... at least I know that at 1280x720 my Vega 56 has better performance than an RTX 3090. This is where a CRT would be an awesome investment. Dropping the resolution to 720p gives me better performance than an RTX 3090. I don't actually need to buy a new GPU... I have a GPU that is better than the most expensive GPU :)

Just to give you an idea of what a 3090 can really do, I fired up the Forza Horizon 4 demo and cranked everything up to Ultra (including AF) and 4x MSAA. I unlocked the frame rates and in the demo I averaged 157 fps at 3440x1440. I then did some racing and the fps never dropped below 145 and in some places hit 211, so I probably averaged >175 fps.

TBH the graphics in the game look very good, and at those frame rates it's buttery smooth. Basically, if I limit the fps to 144 to match my monitor, the frame counter won't move. All that, and the GPU utilisation was only in the high 80s, so the card still has more to give. This is with everything in my system at stock. As I've said in other posts, gaming is fun again.
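Since the thread keeps comparing averages and minimums, here is a minimal sketch of how those numbers are usually derived from a frame-time log. The sample values are invented; a real capture from a frame-time logger would have thousands of entries.

[CODE]
# Minimal sketch: deriving average fps and a "1% low" figure from a log of
# per-frame times in milliseconds. The sample list is invented; a real
# capture from a frame-time logger would have thousands of entries.

frame_times_ms = [6.4, 6.6, 6.3, 7.1, 6.5, 9.8, 6.4, 6.7, 6.5, 12.3]

avg_fps = 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

# "1% low": average fps over the slowest 1% of frames (with only ten
# samples this degenerates to the single slowest frame).
slowest = sorted(frame_times_ms, reverse=True)
slowest_1pct = slowest[:max(1, len(slowest) // 100)]
low_1pct_fps = 1000.0 * len(slowest_1pct) / sum(slowest_1pct)

print(f"Average fps: {avg_fps:.1f}")
print(f"1% low fps:  {low_1pct_fps:.1f}")
[/CODE]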
 
LordHawkwind, I have edited your post to remove your unnecessary commentary. I know badsykes plays at a much lower resolution due to his hardware limitations and there's no need to ridicule him for it.

Let's keep it less antagonistic from here on please.
 
LordHawkwind: The full RTX implementation was the most exciting thing to me and worth spending $600 on. I have been waiting for something like RTX since the GTX 480 era, and Cyberpunk finally made it real. This is why I am baselining on it. The industry is not there yet.
I can consider Cyberpunk a technological demo. ;)
 