G-SYNC 101: In-game vs. External FPS Limiters


Closer to the Source*

*As of Nvidia driver version 441.87, Nvidia has made an official framerate limiting method available in the NVCP; labeled “Max Frame Rate,” it is a CPU-level FPS limiter, and as such, is comparable to the RTSS framerate limiter in both frametime performance and added delay. The Nvidia framerate limiting solutions tested below are legacy, and their results do not apply to the “Max Frame Rate” limiter.

Up until this point, an in-game framerate limiter has been used exclusively to test FPS-limited scenarios. However, in-game framerate limiters aren’t available in every game. They aren’t needed in games where the framerate can’t meet or exceed the maximum refresh rate, but if the system can sustain a framerate above the refresh rate and no such option is present, an external framerate limiter must be used instead to prevent V-SYNC-level input lag.

In-game framerate limiters, being at the game’s engine-level, are almost always free of additional latency, as they can regulate frames at the source. External framerate limiters, on the other hand, must intercept frames further down the rendering chain, which can result in delayed frame delivery and additional input latency; how much depends on the limiter and its implementation.

RTSS is a CPU-level FPS limiter, which is the closest an external method can get to the engine level of an in-game limiter. In my initial input lag tests on my original thread, RTSS appeared to introduce no additional delay when used with G-SYNC. However, it was later discovered that disabling CS:GO’s “Multicore Rendering” setting, which runs the game on a single CPU core, caused the discrepancy, and once it was enabled, RTSS introduced the expected 1 frame of delay.
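To illustrate the general idea, a CPU-level limiter of this kind can be sketched as a pacing loop that blocks the game thread until the next frame deadline, typically with a coarse sleep followed by a precise busy-wait. This is a simplified illustration of the technique, not RTSS’s actual implementation:

```python
import time

def run_capped(render_frame, fps_cap, frames):
    """Sketch of a CPU-level frame limiter: sleep most of the
    interval, then busy-wait the last ~1 ms for precise pacing."""
    frame_time = 1.0 / fps_cap
    deadline = time.perf_counter() + frame_time
    stamps = []
    for _ in range(frames):
        render_frame()
        # Coarse sleep, leaving ~1 ms of margin for accuracy.
        remaining = deadline - time.perf_counter()
        if remaining > 0.001:
            time.sleep(remaining - 0.001)
        # Fine spin until the deadline to minimize jitter.
        while time.perf_counter() < deadline:
            pass
        stamps.append(time.perf_counter())
        deadline += frame_time
    return stamps
```

Because the loop blocks before the next frame is even simulated, a frame already in flight still has to be displayed, which is consistent with the up-to-1-frame delay measured for RTSS below.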

Seeing as CS:GO still uses DX9 and is a native single-core performer, I opted to test the more modern “Overwatch” this time around, which uses DX11 and features native multi-threaded/multi-core support. Will RTSS behave the same way in a native multi-core game?

Blur Buster's G-SYNC 101: Input Latency & Optimal Settings

Yes, RTSS still introduces up to 1 frame of delay, regardless of the syncing method used (or lack thereof). To prove that a -2 FPS limit was enough to avoid the G-SYNC ceiling, a -10 FPS limit was also tested, with no improvement. The V-SYNC scenario further shows that RTSS delay stacks with other types of delay, retaining FPS-limited V-SYNC’s 1/2 to 1 frame of cumulative delay.

Next up is Nvidia’s FPS limiter, which can be accessed via the third-party “Nvidia Inspector.” Unlike RTSS, it is a driver-level limiter, one further step removed from engine-level. My original tests showed the Nvidia limiter introduced 2 frames of delay across V-SYNC OFF, V-SYNC, and G-SYNC scenarios.

Blur Buster's G-SYNC 101: Input Latency & Optimal Settings

Yet again, the V-SYNC and V-SYNC OFF results (“Use the 3D application setting” + in-game V-SYNC disabled) show that standard, out-of-the-box usage of both Nvidia’s v1 and v2 FPS limiters introduces the expected 2 frames of delay. The limiter’s impact on G-SYNC appears particularly unforgiving: a 2 to 3 1/2 frame delay, with maximums increasing at -2 FPS compared to -10 FPS. This suggests a -2 FPS limit with this limiter may not be enough to keep the framerate below the G-SYNC ceiling at all times, and the Nvidia limiter’s own frame pacing behavior may further disrupt G-SYNC functionality.
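To put the -2 FPS margin in perspective, the per-frame slack it buys shrinks as refresh rate rises, which is why a limiter with looser frame pacing can still spike past the G-SYNC ceiling. A quick illustrative calculation (not taken from the test data above):

```python
def headroom_ms(refresh_hz, fps_cap):
    """Per-frame slack (in ms) between a given FPS cap
    and the display's G-SYNC ceiling."""
    return (1000.0 / fps_cap) - (1000.0 / refresh_hz)

# At 144 Hz, a 142 FPS cap leaves roughly 0.098 ms of slack per frame;
# at 240 Hz, a 238 FPS cap leaves only about 0.035 ms.
print(round(headroom_ms(144, 142), 3))
print(round(headroom_ms(240, 238), 3))
```

A limiter whose frame times jitter by more than that slack will intermittently exceed the refresh rate, which matches the increased maximums seen at -2 FPS with the Nvidia limiter.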

Needless to say, even when an in-game framerate limiter isn’t available, RTSS introduces only up to 1 frame of delay, which is still preferable to the 2+ frame delay added by Nvidia’s limiter with G-SYNC enabled, and a far superior alternative to the 2-6 frame delay added by uncapped G-SYNC.



3105 Comments For “G-SYNC 101”


BlurryAlienGoZ00mXD

If I change Vertical sync to “On” in “Manage 3D settings”, it forces “Adjust image settings with preview” from “Let the 3D application decide” to “Use the advanced 3D image settings“.

Ideally I don’t want this as I don’t want any artificial enhancements on my game, but it seems impossible to have both at the same time.

It’s not mentioned in the guide that it does this. Any advice / insight on this? Is it best to set as many of the “enhancing” “Manage 3D settings” options (for example: “Antialias – Gamma correction”) to “Off” to make it as close to “Let the 3D application decide” as possible?

Let me know if I’m misunderstanding. Thanks.

dandyjr

Hello there! I recently bought a 500Hz G-sync monitor and it’s the first time I’ve owned a true G-sync monitor with a module. The exact model is the AW2524H. One of the first things I noticed with this monitor is that the frames never hit the max range of the monitor, as if the module has its own way to prevent leaving the G-Sync range. The monitor is technically a 480Hz panel but it has a factory overclock you can set to 500Hz. In stock form (with all of the proper G-sync settings in the control panel) the frames are automatically capped at 477 and never hit 480. Overclocked to 500Hz, the frames will never go above 496. I tested multiple games just to make sure and they all produced the same results. What I’m wondering is if this is an effect of this specific model or if this is true for all G-Sync native monitors. Does this mean that I never have to cap my fps because the monitor won’t allow the frames to hit the ceiling anyway? I noticed that if I enable Reflex in supported games, the frames will be capped at 438 instead of 496. My guess is that Nvidia set such a low limit to be safe for G-Sync Compatible monitors that happen to be 500Hz (since I’ve heard they are less accurate and will leave range more often). What are your thoughts on this? I’d be stoked to hear that I don’t have to cap my fps anymore but it seems too good to be true haha!

IggyRex

Hello, you probably get asked this a lot, but I game on an LG C1 at 120Hz, and I was wondering what would be best for my setup. I am currently using V-SYNC “On” in the control panel, but I am unsure if I should set Low Latency Mode to Ultra or if setting it to On would be the better option for me. Having it set to On would require me to set an fps limit on a per-game basis, correct? So something like Elden Ring would work better capped to 59 or 57 fps, as opposed to Ultra, which would do it for me. What are the benefits of either option? Sorry, and please let me know. I just want to get the most out of my hardware and out of G-SYNC.

Kaffik

Hi!
I have 165 Hz G-Sync Compatible (ls27ag500nuxen) monitor.
I used all your tips for this type of display but something irritates me a lot.
60 fps on my screen looks like 30-35 at best, I need to have like 95-100 fps to make it look like smooth 60.
For example my boyfriend has 144 Hz VRR TV and 60 fps looks like 60 fps.
I have DP 1.4 and VRR on.
Do I really need a G-Sync module to make it better?

Indignified

Hello, is there any point of having gsync or reflex on if fps is uncapped?
