G-SYNC 101: G-SYNC vs. V-SYNC OFF w/FPS Limit


At the Mercy of the Scanout

Now that the FPS limit required for G-SYNC to avoid V-SYNC-level input lag has been established, how do G-SYNC + V-SYNC and G-SYNC + V-SYNC “Off” compare to V-SYNC OFF at the same framerate?
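
For reference, a minimal sketch of deriving that cap, assuming the margin of roughly 3 FPS below the maximum refresh rate recommended earlier in this series (the helper name is hypothetical):

```python
# Minimal sketch (assumption: a cap of ~3 FPS below max refresh, per the
# limit established earlier in this series; adjust if your own testing differs).

def gsync_fps_cap(max_refresh_hz: int, margin_fps: int = 3) -> int:
    """Return an FPS limit intended to keep the framerate inside the G-SYNC range."""
    return max_refresh_hz - margin_fps

for hz in (60, 100, 120, 144, 240):
    print(f"{hz}Hz display -> cap at {gsync_fps_cap(hz)} FPS")
```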

[Charts: Blur Busters G-SYNC 101: Input Latency & Optimal Settings, one input lag chart per tested refresh rate]

The results show a consistent difference between the three methods across most refresh rates (at 240Hz the three are nearly equalized in every scenario), with V-SYNC OFF (and, to a lesser degree, G-SYNC + V-SYNC “Off”) appearing to have a slight edge over G-SYNC + V-SYNC. Why? The answer is tearing…

With any vertical synchronization method, the delivery speed of a single, tear-free frame (barring unrelated frame delay caused by many other factors) is ultimately limited by the scanout. As mentioned in G-SYNC 101: Range, the “scanout” is the total time it takes a single frame to be physically drawn, pixel by pixel, left to right, top to bottom, on-screen.
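
To put rough numbers on the scanout, a back-of-the-envelope sketch (it ignores the vertical blanking interval, so the true active scan is slightly shorter than one full refresh cycle):

```python
# Rough approximation: treat one scanout as one full refresh cycle (1/Hz).
# In practice a small slice of each cycle is the vertical blanking interval,
# so the active scan is marginally shorter than this.

def approx_scanout_ms(max_refresh_hz: float) -> float:
    return 1000.0 / max_refresh_hz

print(f"~{approx_scanout_ms(144):.2f} ms per scanout at 144Hz")  # ~6.94 ms
```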

With a fixed refresh rate display, both the refresh rate and scanout remain fixed at their maximum, regardless of framerate. With G-SYNC, the refresh rate is matched to the framerate: the scanout speed remains fixed, but the refresh rate controls how many times the scanout is repeated per second (60 times at 60 FPS/60Hz, 45 times at 45 FPS/45Hz, etc.), along with the duration of the vertical blanking interval (the span between the previous and next frame scan), which is where G-SYNC calculates and performs all overdrive and synchronization adjustments from frame to frame.
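
A hedged sketch of that relationship: under G-SYNC the scanout always runs at the panel’s maximum speed, and whatever remains of each frame interval is spent in the variable-length vertical blanking interval (the figures below are approximations that ignore the small fixed blanking present even at max refresh):

```python
# Sketch: with G-SYNC, the scanout runs at max-refresh speed regardless of
# framerate, and the rest of the frame interval becomes the variable-length
# vertical blanking interval (VBI). Approximation only; the small fixed VBI
# present even at max refresh is ignored.

def gsync_timings_ms(framerate: float, max_refresh_hz: float):
    frame_interval = 1000.0 / framerate   # time between frames
    scanout = 1000.0 / max_refresh_hz     # fixed-speed scanout
    vbi = frame_interval - scanout        # what G-SYNC "waits out"
    return scanout, vbi

scan, vbi = gsync_timings_ms(framerate=60, max_refresh_hz=240)
print(f"60 FPS on a 240Hz panel: ~{scan:.2f} ms scanout + ~{vbi:.2f} ms VBI")
```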

The scanout speed itself, both on a fixed refresh rate and variable refresh rate display, is dictated by the current maximum refresh rate of the display:

[Diagram: Blur Busters G-SYNC 101: Scanout Speed at various refresh rates]

As the diagram shows, the higher the refresh rate of the display, the faster the scanout speed becomes. This also explains why V-SYNC OFF’s input lag advantage, especially at the same framerate as G-SYNC, is reduced as the refresh rate increases; single frame delivery becomes faster, and V-SYNC OFF has less of an opportunity to defeat the scanout.
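
A simplified sketch of why the gap narrows: the waiting a tearline can skip is at most on the order of one scanout period, and that period shrinks as the refresh rate climbs (approximations only, blanking interval ignored):

```python
# Simplified model: the latency V-SYNC OFF can "skip" by tearing is bounded
# by roughly one scanout period, which gets shorter at higher refresh rates.

for hz in (60, 100, 120, 144, 240):
    scanout_ms = 1000.0 / hz
    print(f"{hz:>3}Hz: ~{scanout_ms:.2f} ms scanout "
          f"(rough upper bound on V-SYNC OFF's advantage)")
```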

V-SYNC OFF can defeat the scanout by starting the scan of the next frame(s) within the previous frame’s scanout anywhere on screen, and at any given time:

[Diagram: Blur Busters G-SYNC 101: Input Lag & Optimal Settings]

This results in the simultaneous delivery of more than one frame scan in a single scanout (tearing), but also a reduction in input lag; how much is reduced depends on the position and number of tearline(s), which is in turn dictated by the refresh rate/sustained framerate ratio (more on this later).
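
As a rough illustration of that ratio (a simplified model; real tearline count and position also depend on frametime consistency):

```python
# Simplified model: with V-SYNC OFF, roughly (framerate / refresh rate) new
# frames begin during each scanout, and each mid-scanout switch appears as a
# tearline. Frametime variance shifts the real count and positions.

def approx_tearlines_per_refresh(framerate: float, refresh_hz: float) -> float:
    return framerate / refresh_hz

print(approx_tearlines_per_refresh(framerate=300, refresh_hz=144))  # ~2.08
print(approx_tearlines_per_refresh(framerate=140, refresh_hz=144))  # ~0.97
```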

As noted in G-SYNC 101: Range, G-SYNC + V-SYNC “Off” (a.k.a. Adaptive G-SYNC) can have a slight input lag reduction over G-SYNC + V-SYNC as well, since it will opt for tearing instead of aligning the next frame scan to the next scanout when sudden frametime variances occur.
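
A hedged sketch of that policy difference (purely illustrative pseudologic, not NVIDIA’s actual driver implementation):

```python
# Illustrative pseudologic only: how G-SYNC + V-SYNC and G-SYNC + V-SYNC "Off"
# differ when a new frame completes while the previous scanout is still running.

def handle_new_frame(scanout_in_progress: bool, vsync_on: bool) -> str:
    if not scanout_in_progress:
        return "start a new scanout immediately (no tearing either way)"
    if vsync_on:
        return "hold the frame until the current scanout completes (no tear, slight delay)"
    return "splice the frame into the current scanout (tearline, slightly lower lag)"

print(handle_new_frame(scanout_in_progress=True, vsync_on=True))
print(handle_new_frame(scanout_in_progress=True, vsync_on=False))
```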

To eliminate tearing, G-SYNC + V-SYNC is limited to completing a single frame scan per scanout, and it must follow the scanout from top to bottom, without exception. On paper, this can give the impression that G-SYNC + V-SYNC has increased latency over the other two methods. However, the delivery of a single, complete frame with G-SYNC + V-SYNC is actually the lowest possible, or “neutral,” delivery time, and the advantage seen with V-SYNC OFF is a further, “negative” reduction below that baseline, due to its ability to defeat the scanout.

Bottom line: within its range, G-SYNC + V-SYNC delivers single, tear-free frames to the display as fast as the scanout allows; any faster, and tearing would be introduced.



3105 Comments For “G-SYNC 101”

BlurryAlienGoZ00mXD

If I change Vertical sync to “On” in “Manage 3D settings”, it forces “Adjust image settings with preview” from “Let the 3D application decide” to “Use the advanced 3D image settings”.

Ideally I don’t want this as I don’t want any artificial enhancements on my game, but it seems impossible to have both at the same time.

It’s not mentioned in the guide that it does this. Any advice / insight on this? Is it best to set as much of the “enhancing” part of “Manage 3D settings” (for example: “Antialias – Gamma correction”) to “Off”, to make it as close to “Let the 3D application decide” as possible?

Let me know if I’m misunderstanding. Thanks.

dandyjr

Hello there! I recently bought a 500Hz G-Sync monitor and it’s the first time I’ve owned a true G-Sync monitor with a module. The exact model is the AW2524H. One of the first things I noticed with this monitor is that the frames never hit the max range of the monitor, as if the module has its own way to prevent leaving the G-Sync range. The monitor is technically a 480Hz panel, but it has a factory overclock you can set to 500Hz. In stock form (with all of the proper G-Sync settings in the control panel) the frames are automatically capped at 477 and never hit 480. Overclocked to 500Hz, the frames will never go above 496. I tested multiple games just to make sure and they all produced the same results. What I’m wondering is if this is an effect of this specific model or if this is true for all G-Sync native monitors. Does this mean that I never have to cap my FPS because the monitor won’t allow the frames to hit the ceiling anyway? I noticed that if I enable Reflex in supported games, the frames will be capped at 438 instead of 496. My guess is that Nvidia set such a low limit to be safe for G-Sync Compatible monitors that happen to be 500Hz (since I’ve heard they are less accurate and will leave range more often). What are your thoughts on this? I’d be stoked to hear that I don’t have to cap my FPS anymore, but it seems too good to be true haha!

IggyRex

Hello, you probably get asked this a lot, but I game on an LG C1 at 120Hz, and I was wondering what would be best for my setup. I am currently using Vsync – On in the control panel, but I am unsure if I should set Low Latency Mode to Ultra or if setting it to On would be the better option for me. Having it set to On would require me to set an FPS limit on a per-game basis, correct? So something like Elden Ring would work better capped to 59 or 57 FPS, as opposed to setting it to Ultra, which would do it for me. What are the benefits of either option? Sorry, and please let me know. I just want to get the most out of my hardware and out of G-SYNC.

Kaffik

Hi!
I have 165 Hz G-Sync Compatible (ls27ag500nuxen) monitor.
I used all your tips for this type of display, but something irritates me a lot.
60 FPS on my screen looks like 30-35 at best; I need around 95-100 FPS to make it look like smooth 60.
For example, my boyfriend has a 144Hz VRR TV and 60 FPS looks like 60 FPS.
I have DP 1.4 and VRR on.
Do I really need a G-Sync module to make it better?

Indignified

Hello, is there any point in having G-SYNC or Reflex on if FPS is uncapped?
