Nvidia bringing G-Sync to Freesync monitors

Yes. Freesync 1 is all over the place in terms of variable refresh range, low framerate compensation (LFC), etc. So you have a lot of crap 60Hz Freesync panels where adaptive sync only works between 48 and 60 FPS. G-Sync requires LFC and adaptive sync from 30Hz up to the monitor's maximum refresh rate.
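Rough sketch of why that range matters (toy numbers and logic of my own, not any vendor's actual driver code): LFC just repeats frames to multiply the effective framerate back up into the VRR window, which only works if the window is wide enough (max at least roughly double the min). A 48-60Hz panel can't do it.

```python
# Toy sketch of low framerate compensation (assumed behaviour, not real driver code).
# When FPS falls below the VRR floor, the driver shows each frame multiple times
# so the panel stays inside its variable refresh window.

def effective_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Refresh rate the panel ends up running at for a given game FPS."""
    if fps >= vrr_min:
        # Inside the window: refresh simply tracks the framerate.
        return min(fps, vrr_max)
    # Below the window: find the smallest frame multiplier that lands back inside it.
    multiplier = 2
    while fps * multiplier < vrr_min:
        multiplier += 1
    if fps * multiplier <= vrr_max:
        return fps * multiplier
    # Window too narrow for LFC (needs roughly vrr_max >= 2 * vrr_min):
    # adaptive sync drops out and you're back to a fixed refresh rate.
    return vrr_max

print(effective_refresh(25, 30, 144))  # 50 -> frames doubled, still synced
print(effective_refresh(35, 48, 60))   # 60 -> no valid multiple, LFC fails
```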

You're correct about Freesync 1 monitors, and there are a lot of crap monitors out there. TBH any VRR monitor that only supports up to 60Hz is a complete joke. When originally reviewed, my monitor supported 48-144Hz, but at release they included a firmware update that widened the range to 30-144Hz. It also has LFC, so to be honest I've had no problems whatsoever. The reviews on my monitor said it was one of the best Freesync monitors available, so I'm pretty sure it will handle an Nvidia card no problem; I don't need NV's certification.

This now means I'm no longer tied into the AMD ecosystem, which opens up more possibilities going forward with regard to graphics card purchases. Pity about the high cost of NV cards :lol:
 
I’d hardly call this supporting an open standard. The supported list is a joke.

All one has to do is enable the feature in the control panel for non-certified monitors. I could see your point if non-certified monitors were locked out entirely.
 
I don't really give Nvidia props for finally enabling adaptive sync because they should have done it years ago. Glad they are finally supporting it, but there's no excuse for it taking this long. I hope the support actually works reasonably well and isn't unnecessarily gimped.

Also, I don't need another company to "approve" of my monitor choice. I can do my own research about my monitor, as I do with any other component, and make sure I buy a good one. Most of the guys mentioning "cheap Freesync monitors" are those buying 2080 Tis who would never have bought a cheap monitor to begin with, so it really has no impact on them. If someone else buys a cheap monitor and has a bad experience then that's their problem.
 
Proprietary makes sense when there are no standards, or when the standards are not as mature or robust, imho. That's why AMD spent resources on Mantle.
 
I never understood your venom for proprietary tech overall, because at times it creates awareness and innovation that moves the industry forward. G-Sync has helped move the industry forward.
 
Respect the standards. Otherwise there is always anarchy and chaos.
 
I want to know when Nvidia plans on supporting Freesync on G-Sync monitors. That way, if AMD does release a powerful card, I can use it on my G-Sync display and make use of Freesync.
^This when?

Come on novideo whip out your dick and make it happen.
 

Is AMD going to dedicate entire server farms to make use of this? DirectML is machine learning, and any machine-learned upscaling like DLSS is going to require dedicated server farms to train the network on, open or not.
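To be clear about where the server farms come in (purely a toy illustration of mine, not DLSS's or DirectML's actual code): the expensive training pass happens once, offline, on big hardware; the only thing that ships to the player is the per-frame inference step run on the local GPU.

```python
# Toy illustration (placeholder model, not real DLSS/DirectML code) of the
# training/inference split: training is done once offline, inference per frame locally.

import numpy as np

def train_upscaler(low_res_frames, high_res_frames):
    """Offline step (the 'server farm' part). Placeholder model: one linear
    layer fit with least squares instead of a real convolutional network."""
    X = low_res_frames.reshape(len(low_res_frames), -1)
    Y = high_res_frames.reshape(len(high_res_frames), -1)
    weights, *_ = np.linalg.lstsq(X, Y, rcond=None)
    return weights

def upscale(frame, weights):
    """Runtime step: the only part a game runs per frame, whether through
    DirectML, tensor cores, or plain compute shaders."""
    return frame.reshape(1, -1) @ weights

# Tiny fake dataset: 8 "frames" at 4x4, targets at 8x8.
rng = np.random.default_rng(0)
lo = rng.random((8, 4, 4))
hi = rng.random((8, 8, 8))

w = train_upscaler(lo, hi)   # expensive, done once, offline
out = upscale(lo[0], w)      # cheap, done every frame, locally
print(out.shape)             # (1, 64)
```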


Also, the article you linked shows just how little expertise AMD has in this area, because Microsoft used Nvidia hardware to demonstrate DirectML. Which means Nvidia is DirectML-ready out of the gate while AMD is just "compliant".



BTW, what does this have to do with Freesync/G-Sync?
 
I don't know; the article talked about developing hardware for inference. Nvidia spent resources there with tensor cores, and I don't think AMD has similar hardware.
 