GeForce Ti Showdown: GTX 980 Ti vs GTX 1080 Ti vs RTX 2080 Ti

acroig

Just another Troll
https://www.tweaktown.com/articles/9420/geforce-ti-showdown-gtx-980-vs-1080-rtx-2080/index.html

The road of Titanium has been long, and the Ti cards represent some of the best of the best among GeForce graphics cards.

But how does the older GeForce GTX 980 Ti perform in 2020? Surprisingly well. The same goes for the GeForce GTX 1080 Ti but we already knew that... and the GeForce RTX 2080 Ti? Well, we know that thing is a beast in its own right.

However, I've thrown in a bunch of other graphics cards into the benchmark charts to see how the Ti series graphics cards match up today in 2020 against the likes of AMD's new Navi-based Radeon RX 5000 series, as well as the older HBM2-powered Vega-based Radeon RX Vega graphics cards.
 
I'm still using a 980 Ti with a 1440p 60Hz monitor, and it's been fine for most games at max settings. Of course, in newer games with demanding graphics I need to turn a few settings down to maintain 60 fps, but that's not every game, nor does it bother me. I feel bad for people who get bothered by not being able to max out every game.

Going on 5 years now since I bought it, so I'd say it's probably the longest-lasting video card I've owned. I'm likely due for a whole new PC once Cyberpunk gets released, though. Still running a Sandy Bridge i7 CPU!
 
The Radeon VII is now beating the 1080 Ti in all 4K benches :hmm:

The 980 Ti is down with the RX 5500 XT, a 150-buck card
 
I picked up a used 2080ti from the local Facebook Marketplace for $800 IRSbux (tax return dollas). Apparently the guy bought it at the local BestBuy for $1250 and a few weeks later his wife said "I'm pregnant". So apparently the 2080ti causes babies as well as buyer's remorse, because now he has to buy a crib & stuff. Win-win for me, because I'm pretty sure I can't get pregnant. :bleh: No issues with the card. Time to get a 4K monitor to put to good use.
 

Killer deal!!
 
The Radeon VII was always faster than a 1080 Ti, even at launch.

Yet, who did you give your monies to? :lol:

Asus, as always, going back to my pair of 7970s, with both AMD and NV.

And since the Radeon VII came out five months after I got my Asus 2080 Ti Strix, and CrossFire is as dead as SLI, there was no point in buying one or two.

To your point: if Navi 2X is faster than my 2080 Ti by 15% or more, I will go back to AMD, only this time I'll move the 2080 Ti to my secondary system and drop NV like a bad case of the clap.
I don't have to have the fastest card (I never bought a Titan), and 15% to 25% more will be fine for 4K for the next year or two.

And I miss a real post-Y2K driver UI.
 
The Radeon VII is now beating the 1080 Ti in all 4K benches :hmm:

The 980 Ti is down with the RX 5500 XT, a 150-buck card

Well, I wouldn't consider either "playable" at 4K since neither can sustain 60 fps, but the Radeon VII does a bit better thanks to roughly double the memory bandwidth of a 1080 Ti.
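That bandwidth gap is simple to sanity-check from the published specs (352-bit GDDR5X at 11 Gb/s per pin on the 1080 Ti, 4096-bit HBM2 at 2 Gb/s per pin on the Radeon VII). A back-of-the-envelope sketch, not an official benchmark:

```python
# Peak memory bandwidth: bus width (bits) * per-pin data rate (Gb/s) / 8 bits-per-byte.
def bandwidth_gb_s(bus_width_bits: int, data_rate_gbps: float) -> float:
    return bus_width_bits * data_rate_gbps / 8

cards = {
    "GTX 1080 Ti (GDDR5X)": bandwidth_gb_s(352, 11.0),   # -> 484 GB/s
    "Radeon VII (HBM2)":    bandwidth_gb_s(4096, 2.0),   # -> 1024 GB/s
}

for name, bw in cards.items():
    print(f"{name}: {bw:.0f} GB/s")
```

So the Radeon VII's 1024 GB/s is a bit more than double the 1080 Ti's 484 GB/s, which lines up with the "double the memory bandwidth" claim above.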
 
Some may define "playable" differently. I'd call a sustained 60 fps for the lows the "ideal" frame rate rather than the bar for "playable", and anything over 60 more "diminishing returns" than a must-have for enjoying a game. Heck, 30 fps is even playable to some, and I would settle for some sustained dips below 60 if the experience improved dramatically.

I understand the need for, and advantages of, higher frame rates: the responsiveness, smoothness, and overall latency improvements are desired by some, shouldn't be ignored, and there's a market for them.
 
I'm able to tolerate 45fps if the game engine allows for locking it at that framerate. I played all of AC Origins that way, and it was perfectly fine. If I lowered the resolution and swapped to 60fps, I could immediately notice how much smoother it was. However, if I stuck with 45fps for a while it became normal and seemed great.

However...I could NOT get used to 30fps no matter how much I tried. It was absolutely terrible. I realize some console games run at that and it seems OK, but on my PC...I could just not get used to it. It was awful.
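The 45-vs-30 difference is easier to see in frame times than in raw fps: the per-frame budget is 1000 ms divided by the target rate, so dropping from 60 to 45 fps only costs about 5.5 ms per frame, while 30 fps doubles the 60 fps frame time. A quick sketch of that arithmetic:

```python
# Per-frame budget (ms) at common target frame rates: 1000 ms / fps.
for fps in (30, 45, 60):
    print(f"{fps} fps -> {1000 / fps:.1f} ms per frame")
# 30 fps -> 33.3 ms, 45 fps -> 22.2 ms, 60 fps -> 16.7 ms
```

Which roughly matches the experience above: 45 fps sits much closer to 60 than to 30.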
 
It all depends on the level of fidelity I can achieve and how much this fidelity can improve the gaming experience. Pc gaming is so subjective and each gamer has their own balancing act or threshold between fidelity and performance and why I've been so vocal on features and tools to improve flexibility over the years.
 

Throw G-Sync or FreeSync into that mix and sub-60fps dips into the 40's feel fine in a lot of games. Especially if it's not competitive FPS stuff.
 

Yep, absolutely. Or so I've read...I've never had the pleasure of experiencing it yet. :cry:
 