G-SYNC 101: In-game vs. External FPS Limiters


Closer to the Source*

*As of Nvidia driver version 441.87, Nvidia has made an official framerate limiting method available in the NVCP; labeled “Max Frame Rate,” it is a CPU-level FPS limiter, and as such, is comparable to the RTSS framerate limiter in both frametime performance and added delay. The Nvidia framerate limiting solutions tested below are legacy, and their results do not apply to the “Max Frame Rate” limiter.

Up until this point, an in-game framerate limiter has been used exclusively to test FPS-limited scenarios. However, in-game framerate limiters aren’t available in every game. They aren’t needed when the framerate can’t meet or exceed the maximum refresh rate, but if the system can sustain a framerate above the refresh rate and no in-game limiter is available, an external framerate limiter must be used to prevent V-SYNC-level input lag instead.

In-game framerate limiters, being at the game’s engine level, are almost always free of additional latency, as they can regulate frames at the source. External framerate limiters, on the other hand, must intercept frames further down the rendering chain, which can result in delayed frame delivery and additional input latency; how much depends on the limiter and its implementation.
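
To illustrate the distinction, the sketch below shows where each type of limiter inserts its wait in a simplified game loop. This is illustrative Python pseudocode under my own assumptions, not any actual engine’s or RTSS’s implementation, and every function name in it is a hypothetical stand-in:

    import time

    TARGET_FRAMETIME = 1.0 / 142  # hypothetical 142 FPS cap on a 144Hz display

    # Hypothetical stand-ins for real engine work:
    def sample_input(): pass   # read mouse/keyboard state
    def simulate():     pass   # advance the game state
    def render():       pass   # build the frame
    def present():      pass   # hand the finished frame to the display

    def engine_level_loop(frames=500):
        # In-game limiter: the wait comes BEFORE input sampling, so each
        # frame is built from input read right at the frame boundary.
        deadline = time.perf_counter()
        for _ in range(frames):
            while time.perf_counter() < deadline:
                pass  # busy-wait for precise frame pacing
            deadline += TARGET_FRAMETIME
            sample_input(); simulate(); render(); present()

    def external_level_loop(frames=500):
        # External limiter (RTSS-style): the wait is injected around the
        # present call, AFTER input was sampled and the frame rendered, so
        # the finished frame can age by up to one frametime before display.
        deadline = time.perf_counter()
        for _ in range(frames):
            sample_input(); simulate(); render()
            while time.perf_counter() < deadline:
                pass
            deadline += TARGET_FRAMETIME
            present()

The ordering difference is the whole story: both loops pace frames identically, but only the external variant lets an already-finished frame sit before display, which is consistent with the up-to-1-frame delay measured below.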

RTSS is a CPU-level FPS limiter, which is the closest an external method can get to the engine level of an in-game limiter. In my initial input lag tests for my original thread, RTSS appeared to introduce no additional delay when used with G-SYNC. However, it was later discovered that disabling CS:GO’s “Multicore Rendering” setting, which limits the game to a single CPU core, had caused the discrepancy; once the setting was enabled, RTSS introduced the expected 1 frame of delay.

Seeing as CS:GO still uses DX9 and is natively a single-core performer, I opted to test the more modern “Overwatch” this time around, which uses DX11 and features native multi-threaded/multi-core support. Will RTSS behave the same way in a natively multi-core game?

[Charts: Blur Buster’s G-SYNC 101 input latency results, RTSS FPS limiter]

Yes, RTSS still introduces up to 1 frame of delay, regardless of the syncing method used, or lack thereof. To prove that a -2 FPS limit was enough to avoid the G-SYNC ceiling, a -10 FPS limit was also tested, with no improvement. The V-SYNC scenario also shows that RTSS delay stacks with other types of delay, retaining FPS-limited V-SYNC’s 1/2 to 1 frame of cumulative delay.
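
For a concrete sense of why a small offset below the refresh rate is enough to avoid the G-SYNC ceiling, here is the arithmetic (a 144Hz display is assumed purely for illustration; it is not one of the tested configurations):

    # At 144Hz, the display refreshes every 1000 / 144 = ~6.94 ms.
    # A -2 FPS limit (142 FPS) forces frametimes of 1000 / 142 = ~7.04 ms,
    # so frames never arrive faster than the display can refresh, and
    # G-SYNC stays below its ceiling instead of reverting to V-SYNC behavior.
    refresh_hz = 144
    cap_fps = refresh_hz - 2
    print(f"refresh period:   {1000 / refresh_hz:.3f} ms")  # 6.944 ms
    print(f"frametime at cap: {1000 / cap_fps:.3f} ms")     # 7.042 ms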

Next up is Nvidia’s FPS limiter, which can be accessed via the third-party “Nvidia Inspector” utility. Unlike RTSS, it is a driver-level limiter, one step further removed from the engine level. My original tests showed the Nvidia limiter introduced 2 frames of delay across V-SYNC OFF, V-SYNC, and G-SYNC scenarios.

[Charts: Blur Buster’s G-SYNC 101 input latency results, Nvidia Inspector FPS limiter]

Yet again, the results for V-SYNC and V-SYNC OFF (“Use the 3D application setting” + in-game V-SYNC disabled) show that standard, out-of-the-box usage of both Nvidia’s v1 and v2 FPS limiters introduces the expected 2 frames of delay. The limiter’s impact on G-SYNC appears to be particularly unforgiving: 2 to 3 1/2 frames of delay, with maximums increasing at -2 FPS compared to -10 FPS. This suggests that a -2 FPS limit with this limiter may not be enough to keep the framerate below the G-SYNC ceiling at all times, an issue that may be worsened by the effect of the Nvidia limiter’s own frame pacing behavior on G-SYNC functionality.

Needless to say, even when an in-game framerate limiter isn’t available, RTSS introduces only up to 1 frame of delay, which is still preferable to the 2+ frames of delay added by Nvidia’s limiter with G-SYNC enabled, and a far superior alternative to the 2-6 frames of delay added by uncapped G-SYNC.



3745 Comments For “G-SYNC 101”

higorhigorhigor (Member)

In my tests with an AMD RX 6750 XT and an LG 180Hz IPS monitor, on both Linux and Windows 11, I noticed that when I cap the FPS at 60, for example, in scenarios that could deliver 160 FPS uncapped, my GPU significantly reduces its clock speeds, and this creates instability in frame production, causing the refresh rate to fluctuate widely.

On Linux, in LACT (a program for managing GPUs on Linux), I created a profile for the specific game and activated a performance level option that keeps the clock speeds higher. This completely solved the problem that occurs when I limit the FPS far below what my GPU can produce.

On Windows, I haven’t found a similar option, but I also haven’t looked much, since I’m not using Windows a lot. I’m commenting so that, in case you weren’t aware of this, it might help other users who feel they have this VRR disengagement issue even when the FPS seems stable in RTSS.

COLEDED (Member)

Thanks for the detailed guide. Sorry if this is a mostly unrelated question; I ask because the power plan is mentioned in the conclusion section.

My default Windows “High performance” plan puts my minimum processor state at 0% for some reason.

Is it a good idea to just use Bitsum’s Highest Performance plan from ParkControl, which sets the minimum processor state at 100%, all of the time? I haven’t seen an increase in idle CPU power consumption or utilization after changing to this profile.

Does this setting actually change anything?

PODDAH (Member)

Before you read this, I’m sorry for wasting your time if this question has already been answered in the article or in the comments. I tried to read everything to the best of my ability and am still a bit confused, because my English is not the best.

Hey, I’m just writing this to make sure that I’m using the best settings. I have adaptive sync on in my monitor’s settings, which enables me to use G-SYNC, and I have “G-SYNC Compatible” enabled and V-SYNC enabled in the NVCP, with the preferred refresh rate set to application-controlled. I tried checking the delay in Fortnite, because it has a setting that lets me do that. This gives me the least amount of delay, and even if I change my preferred refresh rate to the highest available, it still gives pretty much the same delay. I also have my FPS cap in Fortnite set to 144, just in case. I tried other things, and they either give me screen tearing or more delay. I only have one question: is this good enough to get the least amount of delay without getting any screen tearing?

CyclesOfJosh (Member)

Hi there! I just stumbled upon this through a YouTube comment section. Thank you so much for your hard work!

I was about to test the optimal settings in Overwatch when I noticed that in the Nvidia settings there is a second option for V-SYNC called “Fast.”
Is there any information on how that interacts with G-SYNC, and whether it will have the same effect as regular V-SYNC? Would love to see if there’s more information on this!

kdog1998 (Member)

I have a question about my monitor’s VRR: do you know if there’s a fix for this, or do I possibly have a bad monitor?

I have G-SYNC on as recommended by you, using a RivaTuner (RTSS) FPS cap of 60 for Final Fantasy XVI. I have a perfectly flat frametime graph per RTSS, but my game feels extremely jittery when I move around, especially when moving the camera. I figured out that my monitor’s on-screen display, which shows what it’s refreshing at, is constantly bouncing around with G-SYNC and adaptive sync on. It will bounce from 60 to 52 to 67 to 48 to 180 (which is my max refresh), back to 60, to 77, etc. So despite my game holding a locked 60 FPS with a flat frametime graph, my monitor doesn’t seem to be refreshing at 60 and seems to be bouncing around.

Is this normal? Or did I just happen to get a bad monitor or graphics card? My monitor is the ASUS VG27AQ3A; I have had it for about 6 months and have been thinking from the beginning that something may be off with it. Any help would be great!
