I recently reinstalled Windows on my PC, and saw a post on PCMR (Reddit) of someone complaining about G-Sync not being enabled by default everywhere on Nvidia systems.
That reminded me how painful it is to help someone enable it, following an ugly Nvidia tutorial that's hard for beginners to understand, and it's even worse with a FreeSync display.
On my system, with an AMD card and a FreeSync Premium display, FreeSync was enabled as soon as the drivers were installed: no issues, nothing to fiddle with, it was just turned on automatically for the whole system and Windows to use.
I wonder why Nvidia can't do that.
It even automatically set my display to 165 Hz (though maybe that's because it was already at 165 Hz before the reinstall?).
There's still the trick of capping the max FPS 3-4 fps below the display's max refresh rate, which you have to teach people, for better smoothness (it keeps the frame rate inside the VRR range, so it never hits the ceiling where V-Sync behavior or tearing kicks in). But that's an easy trick to apply.
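Just to make the arithmetic concrete, here's a minimal sketch of that cap calculation; the function name and the 3-fps margin are purely illustrative, and in practice you'd enter the resulting number in a frame limiter (e.g. RTSS or the driver's max frame rate setting) rather than run any code:

```python
# Illustrative only: compute an FPS cap for a VRR display.
# Capping a few fps below the panel's max refresh keeps frametimes
# inside the variable-refresh window, so the GPU never hits the
# ceiling where V-Sync (or tearing) takes over.

def vrr_fps_cap(max_refresh_hz: int, margin_fps: int = 3) -> int:
    """Suggested frame-rate limit for a G-Sync/FreeSync display."""
    return max_refresh_hz - margin_fps

if __name__ == "__main__":
    print(vrr_fps_cap(165))  # 162 fps for a 165 Hz panel
```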
I just checked: it's 5 clicks (or 6 if it's not your main display) to enable G-Sync, nothing hard.
For tech-illiterate people, the difficulty of a task is not measured by the number of clicks it takes. Literally the first step you listed is enough to lose most people on their first PC.
Tech-illiterate people are screwed either way, because Windows will leave their high-refresh-rate screens at 60 Hz (unless something has changed).
But why not just enable it by default? The PC knows it's a G-Sync panel; it would be so easy. And it's kind of ironic, since Nvidia specifically advertised the deep panel-PC integration of their G-Sync modules.
I have no clue. If it were up to me, I'd enable it by default, or pop up a dialog the first time you connect a new G-Sync-capable screen where you could set it up.
I didn't have any trouble with it either. It might have been set up by default too, but that was long ago, so I can't recall clearly.