Nvidia bringing G-Sync to FreeSync monitors

Eisberg

New member
https://www.theverge.com/circuitbreaker/2019/1/7/18171631/nvidia-g-sync-support-freesync-monitors

Nvidia has announced at CES 2019 that it will bring G-Sync support to several FreeSync monitors, removing the need to buy a monitor specifically certified and branded by the leading GPU maker. 12 FreeSync displays have been confirmed to get G-Sync compatibility through a driver that will be released on January 15th.

Tested and approved monitors so far:

Acer XFA240
Acer XG270HU
Acer XV273K
Acer XZ321Q
Agon AG241QG4
AOC G2590FX
Asus MG278Q
Asus VG258Q
Asus VG278Q
Asus XG248
Asus XG258
BenQ XL2740
 
Their certification is meaningless as long as they let you manually enable VRR.

We’ll continue to test monitors and update our support list. For gamers who have monitors that we have not yet tested, or that have failed validation, we’ll give you an option to manually enable VRR, too.

https://blogs.nvidia.com/blog/2019/01/06/g-sync-displays-ces/


And it had better work as well as it does with an AMD card, or they are sandbagging it.
 
I'm curious, does AMD test FreeSync monitors to make sure the gamer receives a good experience?
Afaik it's basically Adaptive Sync as part of the DisplayPort standard, so it's either compatible or it isn't.
It's not really their job to validate whether screen makers follow the VESA standards correctly, I think.

They have software features outside of that, but in the end it's pretty much just the GPU telling the screen "here's a new frame, go ahead and do a refresh", as per the VESA standard.
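To make that concrete, here is a toy sketch in Python of the "refresh when the frame is ready" behaviour. It is only the concept, not actual driver or scaler code, and the 48-144 Hz panel range is an assumption for illustration:

import time

# Hypothetical panel VRR range; a real monitor reports its own range via EDID.
PANEL_MIN_HZ = 48
PANEL_MAX_HZ = 144

def present(frame, render_time_s):
    # Toy model of adaptive sync: the display holds the current image
    # (stretches the vertical blanking interval) until the GPU delivers the
    # next frame, clamped to the panel's supported refresh window.
    min_interval = 1.0 / PANEL_MAX_HZ   # can't refresh faster than the panel's max
    max_interval = 1.0 / PANEL_MIN_HZ   # must refresh at least this often
    interval = min(max(render_time_s, min_interval), max_interval)
    time.sleep(interval)                # stand-in for the panel scanning out the frame
    print(f"refreshed after {interval * 1000:.1f} ms (~{1 / interval:.0f} Hz): {frame}")

# A frame that took 12 ms to render drives a ~83 Hz refresh instead of
# waiting for a fixed 60 Hz or 144 Hz tick.
present("frame 42", render_time_s=0.012)

The point is simply that the refresh is driven by frame delivery rather than a fixed clock.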

I suspect that what Nvidia is doing with their G-Sync Compatible badge is mostly marketing and possibly asking for an "administrative fee" or some such.
As mentioned, if your monitor isn't certified but supports Adaptive Sync you can enable it manually.
 
Afaik it's basically Adaptive Sync as part of the DisplayPort standard, so it's either compatible or it isn't.
It's not really their job to validate whether screen makers follow the VESA standards correctly, I think.

They have software features outside of that, but in the end it's pretty much just the GPU telling the screen "here's a new frame, go ahead and do a refresh", as per the VESA standard.

I suspect that what Nvidia is doing with their G-Sync Compatible badge is mostly marketing and possibly asking for an "administrative fee" or some such.
As mentioned, if your monitor isn't certified but supports Adaptive Sync you can enable it manually.

This only applies to HDMI 2.1, not the countless other FreeSync monitors of varying quality that have existed for years before it.

G-Sync was around longer than FreeSync and offered a better experience throughout the years because it was a closed standard.

VRR/HDMI 2.1 isn't a guarantee against piss-poor cheap FreeSync monitors, just like the HDR standard didn't prevent crappy cheap HDR panels either.
 

It's funny how, when it came to the actual motion quality, it was pretty much unanimous that G-Sync was better.

However, it was also blatantly apparent they had system 1 running at a higher detail setting, because most kept saying how system 1 had better detailed textures/lighting, and so on: things that have nothing to do with refresh rate but everything to do with graphics/game settings.

So the comparison was loaded in the first place. And then there was the loaded question later about whether the $200 difference was worth it, to take the focus away from the qualitative difference in the first place. :lol:

Also, these are good monitors they tested. Maybe they should have tested on one of those cheaper FreeSync monitors and then asked if the price difference was worth it. Do they still make 60 Hz/90 Hz FreeSync monitors these days? :lol:
 
I would not be surprised if most FreeSync 2 displays turn out to be fine.

Didn't AMD themselves put stricter requirements on FreeSync 2 compatible monitors compared to FreeSync 1 compatible displays?
 
It's funny how, when it came to the actual motion quality, it was pretty much unanimous that G-Sync was better.

I think people tend to forget that the main benefit of G-Sync over FreeSync is the ability to maintain smoothness at lower frame rates.
For me, the ability to have a better experience with fps in the 30s than with vsync at 60+ fps is by far the best thing about it.
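To put rough numbers on that, here is a toy sketch of the frame-doubling / low framerate compensation idea that keeps things smooth when fps drops below the panel's minimum VRR rate. The 30 Hz minimum is an assumed value, and this is only the general concept, not Nvidia's actual implementation:

def effective_refresh_hz(fps, panel_min_hz=30, panel_max_hz=144):
    # Toy model of low framerate compensation: when the game's frame rate
    # falls below the panel's minimum VRR rate, show each frame multiple
    # times so the panel still refreshes inside its supported window.
    if fps >= panel_min_hz:
        return min(fps, panel_max_hz), 1       # one scan-out per frame
    repeats = -(-panel_min_hz // fps)          # ceiling division: repeats needed
    return fps * repeats, repeats

print(effective_refresh_hz(35))   # (35, 1): 35 fps maps straight to a 35 Hz refresh
print(effective_refresh_hz(20))   # (40, 2): each frame shown twice, panel runs at 40 Hz

Either way the panel's refresh stays tied to the game's frame delivery, which is why it still feels smoother than vsync at a fixed 60 Hz.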

However, it was also blatantly apparent they had system 1 running at a higher detail setting, because most kept saying how system 1 had better detailed textures/lighting, and so on: things that have nothing to do with refresh rate but everything to do with graphics/game settings.

Pretty sure that had everything to do with the FreeSync monitor they used supporting HDR.
 
I would not be surprised if most FreeSync 2 displays turn out to be fine.

Didn't AMD themselves put stricter requirements on FreeSync 2 compatible monitors compared to FreeSync 1 compatible displays?

Yes.
And I think screen makers like LG and Samsung will have something to say about this if they meet the VRR standard and it works fine on AMD and not Nvidia.

My next screen will be FreeSync 2/VRR, and I will try to pick one that Nvidia likes, but G-Sync is dead.
 
It's funny how, when it came to the actual motion quality, it was pretty much unanimous that G-Sync was better.

However, it was also blatantly apparent they had system 1 running at a higher detail setting, because most kept saying how system 1 had better detailed textures/lighting, and so on: things that have nothing to do with refresh rate but everything to do with graphics/game settings.

So the comparison was loaded in the first place. And then there was the loaded question later about whether the $200 difference was worth it, to take the focus away from the qualitative difference in the first place. :lol:

Also, these are good monitors they tested. Maybe they should have tested on one of those cheaper FreeSync monitors and then asked if the price difference was worth it. Do they still make 60 Hz/90 Hz FreeSync monitors these days? :lol:

I think you are confusing better graphics with smoothness (which is where FreeSync/G-Sync matters). The better graphics most likely had nothing to do with G-Sync, but could very well be due to the fact that they were running a water-cooled, overclocked 1080 Ti, while the Vega was air-cooled and at stock speeds. Did you remember to take that into account?

Next, you are accusing them of using different settings because you somehow believe it is impossible for AMD to look better than Nvidia. That is biased BULLSHIT! You also obviously missed the one guy who said that the AMD system was grainier; granted, it was only one person out of all of them who made that comparison, but it contradicts your different-settings theory. Also, the comments about "leaves floating" etc. on the Nvidia card could have something to do with PhysX, because that has always been an effect of PhysX (added debris, paper flying by, etc.).

Why would it be fair to use a cheaper FreeSync monitor? They tested using two comparable monitors. Your suggestion of using a lower-quality/cheaper monitor would definitely stack the deck in Nvidia's favor, and the test already had one huge factor stacking the deck for Nvidia (mentioned next).

If it was a loaded test, as you say, why didn't they use a stock 1080 against the air-cooled Vega 64? If anything, the very fact that they used an overclocked, water-cooled 1080 Ti against an air-cooled stock Vega 64 gave Nvidia/G-Sync a large advantage from the start. So your "the comparison was loaded in the first place" view is biased and wrong. The deck was already stacked in Nvidia's favor by this fact alone.

When it comes to graphics quality, both vendors are about equal overall. What that means is that there are some games/scenes etc. that AMD displays better, and the same goes for Nvidia. Now, when it comes to G-Sync vs FreeSync 2, from the comments made they are about equal, and the bottom line is that G-Sync is not worth the extra $200.
 
I think people tend to forget that the main benefit of G-Sync over FreeSync is the ability to maintain smoothness at lower frame rates.
For me, the ability to have a better experience with fps in the 30s than with vsync at 60+ fps is by far the best thing about it.



Pretty sure that had everything to do with the FreeSync monitor they used supporting HDR.

Actually, on HardOCP they have instructions where you can edit some settings and change the FreeSync frame rate window. I am not sure how well it works, but some have mentioned getting it to work as low as 22 fps.
 
Actually, on HardOCP they have instructions where you can edit some settings and change the FreeSync frame rate window. I am not sure how well it works, but some have mentioned getting it to work as low as 22 fps.

With HDMI 2.1, we want that window to be more like 50 to 90 fps at 4K, even if it needs CFX.
 
?? I don't understand your thought process on this. You wouldn't want adaptive sync to work below 50 fps?

Yes, but with a 2080 Ti or Navi CFX it is less important, as it should not get that low.

It's great if it does go as low as 22 fps, but what I want more now is a higher top end, 90 or even 120 at 4K.

Or Radeon VII CFX.
 
I think you are confusing better graphics with smoothness (which is where FreeSync/G-Sync matters). The better graphics most likely had nothing to do with G-Sync, but could very well be due to the fact that they were running a water-cooled, overclocked 1080 Ti, while the Vega was air-cooled and at stock speeds. Did you remember to take that into account?

Next, you are accusing them of using different settings because you somehow believe it is impossible for AMD to look better than Nvidia. That is biased BULLSHIT! You also obviously missed the one guy who said that the AMD system was grainier; granted, it was only one person out of all of them who made that comparison, but it contradicts your different-settings theory. Also, the comments about "leaves floating" etc. on the Nvidia card could have something to do with PhysX, because that has always been an effect of PhysX (added debris, paper flying by, etc.).

Three or four clearly said the detail and lighting looked better. If you think that has something to do with AMD simply looking better than Nvidia, rather than graphics/game settings, then you are the one with "biased BULLSHIT!"

I would expect even you would call that into question, especially since something as substantial as textures and lighting would have been noticed in every game and video card review prior. But it's better to go against the grain and declare AMD just looks better, based on this single questionable comparison, right? :rolleyes:
 
Pretty sure that had everything to do with the FreeSync monitor they used supporting HDR.

I suspected that as well. Some of these gamers made their decision based on that image quality difference, which skews the tally by itself. So in essence it was not really a FreeSync 2 vs G-Sync comparison, but a Samsung vs Asus monitor comparison.
 