Navi Reviews

I haven't looked into it, but does that involve a BIOS flash?

Misread that; they just used the PowerPlay tables...


https://forums.overclockers.co.uk/threads/5700xt-hitting-2-3ghz-tomshw-de.18859391/page-6

For those who are asking.

Yes, I used the PowerPlay tables to bypass the 5700 lock.

Yes, I am using the stock 5700 cooler, with max temps around 70°C at 3500-4000 RPM.

There are many PowerPlay tables, but I suggest the 50% power target one and not the 90%, as you'll end up nuking the GPU.

As for gains, I saw around a 20 fps improvement in some games, not all of course. But I feel I am around overclocked 2070/2070 Super levels with a 5700.
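
For anyone wondering how the table actually gets applied: the usual route on Windows is to write the modified table into the registry as a binary value that the driver picks up on the next restart. A rough Python sketch, with the adapter subkey index and file name as placeholders (check which subkey is your card and back up the registry before writing anything):

```python
# Rough sketch: apply a "soft PowerPlay table" by writing it as a REG_BINARY
# value named PP_PhmSoftPowerPlayTable under the display adapter's class key.
# The "0000" subkey index and the .bin file name are placeholders.
import winreg

ADAPTER_KEY = (r"SYSTEM\CurrentControlSet\Control\Class"
               r"\{4d36e968-e325-11ce-bfc1-08002be10318}\0000")  # adjust the index

def apply_soft_powerplay_table(table_path: str) -> None:
    with open(table_path, "rb") as f:
        blob = f.read()
    # Needs an elevated (administrator) prompt to write under HKLM.
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, ADAPTER_KEY, 0,
                        winreg.KEY_SET_VALUE) as key:
        winreg.SetValueEx(key, "PP_PhmSoftPowerPlayTable", 0,
                          winreg.REG_BINARY, blob)

apply_soft_powerplay_table("rx5700_50pct_power_target.bin")  # placeholder file name
```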
 
https://www.anandtech.com/show/14618/the-amd-radeon-rx-5700-xt-rx-5700-review/15

1.2v is standard for the 5700xt...


At +50% PL it barely consumes more power... ~20 W more... nipping at the heels of the 2080. Not sure I'd leave it at +95% PL, but +50%, sure.
 
Definitely, but the fact that they can take the lousiest cooler and get this kind of an OC tells us how good the AIB versions will be.
 
Will likely be a little better, but at some point you hit the voltage/clock limit regardless of temps. It will be good for a more consistent boost clock and likely a lot quieter.
 
It's a sharpening filter, the same thing you can do with your monitor's sharpness setting, or ReShade, or SweetFX. DLSS is something different.

It's also like saying sharpening is better than DSR. You're just sharpening the image, whereas DSR renders at a higher internal resolution.
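
To make the distinction concrete: a sharpening filter only amplifies local contrast around edges that are already in the frame; it can't create detail that was never rendered. A rough unsharp-mask example in Python with Pillow (not AMD's actual CAS/RIS shader, just the same general class of operation; the file names are placeholders):

```python
# Plain unsharp mask: boosts local contrast around existing edges.
# No new detail is created -- it only emphasizes what is already there.
from PIL import Image, ImageFilter

frame = Image.open("screenshot_1440p.png")   # placeholder input
sharpened = frame.filter(
    ImageFilter.UnsharpMask(radius=2, percent=120, threshold=2))
sharpened.save("screenshot_1440p_sharpened.png")
```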

Metro Exodus at 4K with DLSS is the best example of DLSS so far and produces a much better image than RIS, because DLSS is used properly in that game. Hardware Unboxed even admitted this.

DLSS was a classic example of over(software)engineering, IMO. It was an interesting idea that might have turned out to have merit. But instead it turned out not to do anything more than a standard dumb filter could do, all the while being 1000 times more complicated. It's possible to come up with very intricate ways of solving simple problems, and that's what Nvidia managed to do with DLSS. The result simply isn't that great, and meanwhile it's very impractical to implement since it requires machine learning.

I also suspect it will quietly be dropped in the next year or so, and we just won't hear anything more about it. So, I certainly wouldn't buy a 2000 series card based on DLSS.
 

Nobody should buy a 2000 series card based on DLSS. But when it's implemented properly (Metro Exodus), it's superior to any sharpening filter, because it isn't a sharpening filter. A sharpening filter won't magically add in texture detail, because all it's doing is sharpening. The idea behind super resolution image processing is to take a lower resolution image and produce a near-identical higher resolution image. This isn't new; this kind of image processing has existed for years (medical, satellite, photography), requiring machine learning as well. Nvidia calls their version DLSS; they're just the first to bring it to gaming. Metro Exodus does this well, and even Hardware Unboxed, who love to crap on Nvidia, had to admit the DLSS image looked better than RIS, but then went on to complain about DLSS performance (which doesn't matter with a 2080 Ti).
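
That kind of super resolution is easy to try outside of games, too. For example, OpenCV's contrib module ships an interface to pre-trained upscaling models; the sketch below assumes opencv-contrib-python is installed and the ESPCN model file has been downloaded separately (both are assumptions, and this is obviously not DLSS itself):

```python
# "Classic" learned super resolution (not DLSS): upscale with a pre-trained
# model and compare against plain bicubic interpolation.
# Requires opencv-contrib-python; ESPCN_x2.pb comes from OpenCV's dnn_superres
# model collection (file names are placeholders).
import cv2

low_res = cv2.imread("frame_1080p.png")

sr = cv2.dnn_superres.DnnSuperResImpl_create()
sr.readModel("ESPCN_x2.pb")
sr.setModel("espcn", 2)          # model name and scale factor
upscaled = sr.upsample(low_res)  # learned 2x upscale

bicubic = cv2.resize(low_res, None, fx=2, fy=2, interpolation=cv2.INTER_CUBIC)
cv2.imwrite("frame_sr_x2.png", upscaled)
cv2.imwrite("frame_bicubic_x2.png", bicubic)
```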

Lastly, if you really want to discuss this further, there's another thread for that in the other graphics card section.
 
After spending the weekend getting the house ready for the in-laws' visit, I finally got to do some gaming last night with the RX5700XT.

Initial impressions (coming from a GeForce 970):

  • Skipping two generations is one helluva upgrade.
  • The card is bloody quick.
  • The Radeon Software (Adrenalin) suite is polished and fantastic.
  • I like the look of the blower card, dent and all. Reminds me of a Formula One car's shrink-wrapped body panels.
  • Case temps improved with the replacement of the 970.
  • Temperatures for the card and case are perfectly within scope.
  • I can play all my titles at Ultra at 1080p.
  • So far, I have been able to play Cars in VR at ultra settings and it's smooth.
  • DCS World: ramped the graphics all the way up in VR, with the exception of some settings (shadows flat, etc.).


I'm putting this last entry separate from the list above because I want it to stand out. I spent the early part of my PC gaming life enjoying ATi video cards and, before that, 3DFx. After 2004, I moved exclusively to Nvidia.

This is my first ATi (AMD) card in 14 years, and I can conclusively say, without a shadow of a doubt, that ATi cards have the best image quality, especially in terms of color saturation and black levels. For 14 years Nvidia has delivered washed-out garbage and I've just gotten used to it, like battered wife syndrome, even after setting the correct HDMI color profile, etc. The visuals I witnessed last night across numerous titles were bold, colorful, contrast-y heaven.

I know there will be arguments on both sides, and it is a subjective thing, but to my darling eyes, ATi just plain has better visuals. It's downright gorgeous and I could not be happier with my purchase of the RX5700XT.
 
AMD has been known to have higher color saturation compared to NV. NV has more of a "natural" appearance while AMD has more "pop." If you like the vibrant look, you could have just raised the "Digital Vibrance" setting in the NVCP and achieved the exact same thing.
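
For what it's worth, a vibrance slider is essentially a driver-level saturation boost on the final output. A rough approximation of the effect on a screenshot with Pillow (the 1.3 factor is arbitrary, and this is not the actual driver implementation):

```python
# Rough stand-in for a driver "vibrance" control: boost color saturation
# of the final image. Purely illustrative, not NVIDIA's or AMD's pipeline.
from PIL import Image, ImageEnhance

frame = Image.open("desktop_capture.png")         # placeholder input
vibrant = ImageEnhance.Color(frame).enhance(1.3)  # ~30% saturation boost
vibrant.save("desktop_capture_vibrant.png")
```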
 
Digital Vibrance can get things close, but it's not the same thing. This has been well known for over a decade.
 
No ****, I said in my post that it’s been a known thing. This has been debunked for years; Nvidia has a lower dynamic contrast by default, but that can be changed to match AMD’s appearance. There’s nothing going on that can’t be edited to match, it’s just different default color settings.

The biggest issue used to be the NVCP defaulting to Limited RGB range for some reason, but I haven't seen that happen in years. It doesn't help that a lot of the videos that claim major IQ differences are from people who couldn't be bothered to buy a capture card.
 
Color differences these days will be due to the color settings and monitor. All color information is sent digitally over HDMI/DVI/DisplayPort. AMD did have better color back in the analog VGA days because they had a better 2D processor, but any perceived color differences now will be due to the initial color settings or the monitor used. Or the age-old placebo effect.

First thing I do when I get a new card or monitor is calibrate the card to make sure it's sending out a baseline signal, calibrate the monitor accordingly, then recalibrate the card again to my tastes, which is generally a neutral output (not warm or cool) with slightly higher contrast/color vibrance than baseline (it really takes advantage of LG's superior color display over PC IPS). Then I enjoy that setup for years to come without touching it again.
 
I was going to say I must be getting old (eyes going bad) because I can't tell the difference between nVidia and AMD based on 2D color output. I have both in a single system and can switch between the two without rebooting. I can't tell the difference in 2D.
 
Sorry, that reads off now that I reread it. I was just trying to say that setting alone has never been enough.
 

Eh... I don't really want to discuss DLSS further, since my opinion is that it's largely worthless (too much effort for too little payoff) and I don't even care enough to argue about it. We're obviously not going to agree, so what's the point of saying anything more about it?

The one thing I'll say about RIS is that at least it's a straightforward implementation that should pretty much work with every game out of the box. I doubt I'd ever use it either, myself. (For that matter, since I now have a G-Sync monitor I'm "locked in" to Nvidia, so it's kind of a moot point anyway. :lol: )
 