G-SYNC 101: Range


Blur Busters' G-SYNC 101: Range Chart

Exceeds G-SYNC Range

G-SYNC + V-SYNC “Off”:
G-SYNC disengages, tearing begins display-wide, no frame delay is added.

G-SYNC + V-SYNC “On”:
G-SYNC reverts to V-SYNC behavior when it can no longer adjust the refresh rate to the framerate; 2-6 frames of delay (typically 2 frames; approximately an additional 33.2ms @60 Hz, 20ms @100 Hz, 13.8ms @144 Hz, etc.) is added as rendered frames begin to over-queue in both buffers, ultimately delaying their appearance on-screen.
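
For a rough sense of scale, the figures above follow from simple arithmetic: each over-queued frame adds roughly one full refresh cycle of latency. A minimal Python sketch of that model (the 2-frame queue depth is the typical case cited above; actual depth varies by game and driver):

# Sketch: added V-SYNC back-pressure delay above the G-SYNC range,
# modeled as queued frames x per-refresh duration.
def vsync_backpressure_delay_ms(refresh_hz, queued_frames=2):
    frame_duration_ms = 1000.0 / refresh_hz  # one full refresh cycle per queued frame
    return queued_frames * frame_duration_ms

for hz in (60, 100, 144):
    print(f"{hz} Hz: ~{vsync_backpressure_delay_ms(hz):.1f} ms added with 2 queued frames")
# 60 Hz -> ~33.3 ms, 100 Hz -> ~20.0 ms, 144 Hz -> ~13.9 ms (matching the figures above)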

G-SYNC + Fast Sync*:
G-SYNC disengages, Fast Sync engages, 0-1 frame of delay is added**.
*Fast Sync is best used with framerates in excess of 2x to 3x that of the display’s maximum refresh rate, as its third buffer selects the “best” (most recently completed) frame to display as the final render; the higher the sample rate, the better it functions. Do note, even at its most optimal, Fast Sync introduces uneven frame pacing, which can manifest as recurring microstutter.
**Refresh rate/framerate ratio dependent (see G-SYNC 101: G-SYNC vs. Fast Sync).
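
To sanity-check the ratio rule of thumb in the first footnote, a minimal Python sketch (the 2x threshold mirrors the guidance above; the function names are illustrative, not from any driver API):

def fast_sync_sample_ratio(framerate, refresh_hz):
    # Rendered frames available per displayed refresh (the "sample rate")
    return framerate / refresh_hz

def fast_sync_advisable(framerate, refresh_hz, threshold=2.0):
    # Follows the 2x-3x guidance above; below it, expect uneven frame pacing
    return fast_sync_sample_ratio(framerate, refresh_hz) >= threshold

print(fast_sync_sample_ratio(300, 144))  # ~2.08: just past the 2x mark on a 144 Hz display
print(fast_sync_advisable(200, 144))     # False: only ~1.39x, microstutter likely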

Within G-SYNC Range

Refer to the “Upper & Lower Frametime Variances” section below…

Upper & Lower Frametime Variances

G-SYNC + V-SYNC “Off”:
The tearing inside the G-SYNC range with V-SYNC “Off” is caused by sudden frametime variances output by the system, which will vary in severity and frequency depending on both the efficiency of the given game engine and the system’s ability (or inability) to deliver consistent frametimes.

G-SYNC + V-SYNC “Off” disables the G-SYNC module’s ability to compensate for sudden frametime variances, meaning, instead of aligning the next frame scan to the next scanout (the process that physically draws each frame, pixel by pixel, left to right, top to bottom on-screen), G-SYNC + V-SYNC “Off” will opt to start the next frame scan within the current scanout. This results in simultaneous delivery of more than one frame in a single scanout (tearing).
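
As a rough illustration of why a late frame swap tears toward the bottom of the screen in the Upper FPS range (next paragraph), a minimal Python sketch, assuming a linear top-to-bottom scanout and an illustrative 1440-row panel (and ignoring the blanking interval):

def tear_line_row(arrival_ms_into_scanout, scanout_ms, vertical_pixels=1440):
    # Linear top-to-bottom scanout: a frame swapped mid-scanout tears at the
    # row currently being drawn when the new frame arrives.
    fraction = arrival_ms_into_scanout / scanout_ms
    return int(fraction * vertical_pixels)

# On a 144 Hz panel (~6.9 ms scanout), a frame arriving 6.0 ms into the
# scanout tears near the bottom of the display (the Upper FPS range case):
print(tear_line_row(6.0, 6.9))  # ~1252 of 1440 rows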

In the Upper FPS range, tearing will be limited to the bottom of the display. In the Lower FPS range (<36), where frametime spikes can occur (see What are Frametime Spikes?), full tearing will begin.

Without frametime compensation, G-SYNC functionality with V-SYNC “Off” is effectively “Adaptive G-SYNC,” and should be avoided for a tear-free experience (see G-SYNC 101: Optimal Settings & Conclusion).

G-SYNC + V-SYNC “On”:
This is how G-SYNC was originally intended to function. Unlike G-SYNC + V-SYNC “Off,” G-SYNC + V-SYNC “On” allows the G-SYNC module to compensate for sudden frametime variances by adhering to the scanout, which ensures the affected frame scan will complete in the current scanout before the next frame scan and scanout begin. This eliminates tearing within the G-SYNC range, in spite of the frametime variances encountered.

Frametime compensation with V-SYNC “On” is performed during the vertical blanking interval (the span between the previous and next frame scan), and, as such, does not delay single frame delivery within the G-SYNC range and is recommended for a tear-free experience (see G-SYNC 101: Optimal Settings & Conclusion).

G-SYNC + Fast Sync:
Upper FPS range: Fast Sync may engage, 1/2 to 1 frame of delay is added.
Lower FPS range: see “G-SYNC + V-SYNC ‘On’” above.

What are Frametime Spikes?

Frametime spikes are an abrupt interruption of frame output by the system. On a capable setup running an efficient game engine, they typically occur due to loading screens, shader compilation, background asset streaming, auto saves, network activity, and/or the triggering of a script or physics system. They can also be exacerbated by an incapable setup, an inefficient game engine, poor netcode, low RAM/VRAM and page file overuse, misconfigured (or limited game support for) SLI setups, faulty drivers, specific or excess background processes, in-game overlay or input device conflicts, or a combination of them all.

Not to be confused with other performance issues, like framerate slowdown or V-SYNC-induced stutter, frametime spikes manifest as an occasional hitch or pause, and usually last mere microseconds to milliseconds at a time (seconds, in the worst cases), plummeting the framerate to as low as single digits and concurrently raising the frametime to upwards of 1000ms before it re-normalizes.
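
For readers capturing frametime logs with a tool such as RTSS or CapFrameX, spikes of this kind are straightforward to flag after the fact. A minimal Python sketch (the 4x-median threshold is an illustrative choice, not a standard heuristic):

from statistics import median

def find_spikes(frametimes_ms, factor=4.0):
    """Return (frame index, frametime, instantaneous FPS) for each spike."""
    baseline = median(frametimes_ms)
    return [(i, ft, 1000.0 / ft)
            for i, ft in enumerate(frametimes_ms)
            if ft > factor * baseline]

log = [6.9] * 10 + [250.0] + [6.9] * 10  # steady ~144 FPS with one hitch
for i, ft, fps in find_spikes(log):
    print(f"frame {i}: {ft:.0f} ms (~{fps:.0f} FPS instantaneous)")
# frame 10: 250 ms (~4 FPS instantaneous)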

G-SYNC eliminates traditional V-SYNC stutter below the maximum refresh rate, which is caused by repeated frames due to delayed frame delivery, but frametime spikes still affect G-SYNC, since it can only mirror what the system is outputting. As such, when G-SYNC has nothing new to sync to for one or more frames at a time, it must repeat the previous frame(s) until the system resumes new frame output, which results in the visible interruption observed as stutter.

The more efficient the game engine, and the more capable the system running it, the fewer frametime spikes there are (and the shorter they last), but no setup can fully avoid their occurrence.

Minimum Refresh Range

Once the framerate falls to the approximate 36 FPS and below mark, the G-SYNC module begins inserting duplicate refreshes per frame to maintain the panel’s minimum physical refresh rate, keep the display active, and smooth motion perception. If the framerate is at 36 FPS, the refresh rate will double to 72 Hz; at 18 FPS, it will triple to 54 Hz; and so on. This behavior will continue down to 1 frame per second.
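
A minimal Python sketch of the duplication rule implied by those examples (the ~36 FPS threshold and the smallest-multiple behavior are inferred from the figures above; actual module behavior may vary per panel):

def effective_refresh_hz(framerate, min_refresh_hz=36.0):
    """Smallest integer multiple of the framerate above the minimum refresh."""
    multiplier = 1
    while framerate * multiplier <= min_refresh_hz:
        multiplier += 1  # each step inserts one more duplicate refresh per frame
    return framerate * multiplier

print(effective_refresh_hz(36))  # 72.0 Hz (doubled, as in the example above)
print(effective_refresh_hz(18))  # 54.0 Hz (tripled)
print(effective_refresh_hz(60))  # 60.0 Hz (within range, no duplication)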

Regardless of the reported framerate and variable refresh rate of the display, the scanout speed will always match the display’s current maximum refresh rate: 16.6ms @60 Hz, 10ms @100 Hz, 6.9ms @144 Hz, and so on. G-SYNC’s ability to detach the framerate and refresh rate from the scanout speed can have benefits such as faster frame delivery and reduced input lag on high refresh rate displays at lower fixed framerates (see G-SYNC 101: Hidden Benefits of High Refresh Rate G-SYNC).
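
A minimal Python sketch of that benefit (assuming the scanout duration is simply the reciprocal of the maximum refresh rate, per the figures above):

def scanout_ms(max_refresh_hz):
    # Scanout speed is tied to the panel's maximum refresh rate,
    # not the current variable refresh rate.
    return 1000.0 / max_refresh_hz

capped_fps = 60  # same framerate cap on both panels
for panel_hz in (60, 144):
    print(f"{capped_fps} FPS on a {panel_hz} Hz panel: frame scans out in ~{scanout_ms(panel_hz):.1f} ms")
# 60 Hz panel: ~16.7 ms; 144 Hz panel: ~6.9 ms (same cap, faster frame delivery)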



Comments For “G-SYNC 101”
higorhigorhigor:

In my tests with an AMD RX 6750 XT and an LG 180 Hz IPS monitor, on both Linux and Windows 11, I noticed that when I cap the FPS at 60 in scenarios that could deliver 160 uncapped, my GPU significantly reduces its clock speeds, and this creates instability in frame production, causing the refresh rate to fluctuate widely.

On Linux, in LACT (a program for managing GPUs on Linux), I created a profile for the specific game and activated a performance level option that keeps the clock speeds higher. This completely solved the problem that occurs when I limit the FPS far below what my GPU can produce.

On Windows, I haven’t found a similar option, but I also haven’t looked much since I’m not using Windows a lot. I came here to comment so that, in case you weren’t aware of this, it might help other users who feel they have this VRR disengagement issue even when the FPS seems stable in RTSS.

COLEDED:

Thanks for the detailed guide. Sorry if this is a mostly unrelated question; I ask because the power plan is mentioned in the conclusion section.

My default Window’s “High performance” plan puts my minimum processor state at 0% for some reason.

Is it a good idea to just use Bitsum’s Highest Performance plan from ParkControl, which sets the minimum processor state at 100%, all of the time? I haven’t seen an increase in idle CPU power consumption or utilization after changing to this profile.

Does this setting actually change anything?

PODDAH:

Before you read this, I’m sorry for wasting your time if this question has already been answered in the article or in the comments. I tried to read everything to the best of my ability, but I’m still a bit confused because my English is not the best.

Hey, I’m just writing this to make sure that I’m using the best settings. I have adaptive sync enabled in my monitor’s settings, which lets me use G-SYNC, and then I have “G-SYNC Compatible” and V-SYNC enabled in the NVCP, with the preferred refresh rate set to application-controlled. I tried checking the delay in Fortnite, because it has a setting that lets me do that. This gives me the least amount of delay, and even if I change my preferred refresh rate to the highest available, it still gives pretty much the same delay. I also have my FPS cap in Fortnite set to 144, just in case. I tried other things, and they either give me screen tearing or more delay. I only have one question: is this good enough to get the least amount of delay without any screen tearing?

CyclesOfJosh:

Hi there! I just stumbled upon this through a YouTube comment section, thank you so much for your hard work!

I was about to test the optimal settings in Overwatch when I noticed that in the NVIDIA settings there is a second option for V-SYNC called “Fast.”
Is there any information on how that interacts with G-SYNC, and whether it will have the same effect as it does with regular V-SYNC? Would love to see if there’s more information on this!

kdog1998:

I have a question about my monitor’s VRR: do you know if there’s a fix for this, or do I possibly have a bad monitor?

I have G-SYNC on as recommended by you, using an RTSS FPS cap of 60 for Final Fantasy XVI. I have a perfectly flat frametime graph per RTSS, but my game feels extremely jittery when I move around, especially when moving the camera. I figured out that my monitor’s on-screen display, which shows what it’s refreshing at, is constantly bouncing around when G-SYNC and adaptive sync are on. It will bounce from 60 to 52 to 67 to 48 to 180 (which is my max refresh), back to 60, to 77, etc. So despite my game holding a locked 60 FPS with a flat frametime graph, my monitor doesn’t seem to be refreshing at 60 and seems to be bouncing around.

Is this normal? Or did I just happen to get a bad monitor or graphics card? My monitor is the ASUS VG27AQ3A; I’ve had it for about six months and have been thinking from the beginning that something may be off with it. Any help would be great!
