Nvidia – higher framerates mean a better Kill-Death ratio

Kill-Death ratio

What is a Kill-Death ratio, I hear non-gamers asking? It measures how many kills a player racks up for each death.

Gadgeteer gamer Arun Lal writes: a Kill-Death ratio is kills divided by deaths. For example, if a player gets ten kills and five deaths in a game, they have a Kill-Death ratio of two. The higher, the better.
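For the arithmetic-minded, here’s a minimal Python sketch of that calculation (our illustration, not Nvidia’s or Arun’s; the max(deaths, 1) guard is an assumed convention, not part of any official formula):

```python
def kd_ratio(kills: int, deaths: int) -> float:
    # Deaths are floored at 1 (an assumed convention) so a deathless
    # game yields a finite ratio instead of dividing by zero.
    return kills / max(deaths, 1)

print(kd_ratio(10, 5))  # 2.0, matching the example above
```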

The casual gamer might not understand all this, so Nvidia released a video showing how 60, 144, and 240 FPS/Hz affect gameplay in fast-paced shooters like Counter-Strike: Global Offensive (CS: GO). Nvidia also published a study on the relationship between framerate and Kill-Death ratio.

The truth is 60 FPS (frames per second) is ideal for shooting games. But some games in the eSports scene run at insanely high rates – in the 200+ FPS range.

What’s the difference between framerate (FPS) and monitor refresh rate (Hz)?

FPS is the average number of frames rendered per second by your GPU. Note that while the human eye can theoretically detect up to 1000 FPS, anything over about 150 FPS sits beyond the ‘critical flicker fusion’ threshold – the point at which individual frames blur together and the eye is essentially guessing.

The FPS can go as high as your GPU (and CPU) allow. The FPS in a game rarely stays constant and varies from scene to scene.

Hz (hertz) is your monitor’s refresh rate – how many times per second the screen redraws the image. Feed it frames out of sync with that rate and you get screen tearing (straight lines appearing jagged where two frames meet). It’s often claimed the human eye cannot see anything above 60Hz, though as the framerate figures above suggest, that’s contested.

[Image: screen tearing example]

What’s the relationship between the two?

It’s easy to think of it like this: The GPU renders a bunch of frames per second and passes them on to the display.

The display picks up these frames at its refresh rate. So, if it’s a 60Hz monitor, it’ll pick up 60 frames per second and output at 60Hz.

If the GPU produces fewer frames than the refresh rate, it leads to screen tearing and choppy gameplay. If it produces more frames than the display refresh rate, you still get screen tearing because the screen cannot process all the extra information.
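To see why mismatched rates look bad, here’s a toy Python model (our own sketch – real display pipelines are far more involved): at each refresh tick, the display shows whichever frame the GPU finished most recently. A framerate below the refresh rate repeats frames, while one above it skips frames the GPU rendered.

```python
# Toy model (our illustration, not a real display pipeline): at each
# refresh tick the display samples the most recently completed GPU frame.
def frames_shown(fps: float, hz: float, seconds: float = 1.0) -> list[int]:
    shown = []
    for tick in range(int(hz * seconds)):
        t = tick / hz                # wall-clock time of this refresh
        shown.append(int(t * fps))   # index of the newest finished frame
    return shown

# 45 FPS on a 60Hz panel: frame indices repeat, which looks like judder.
print(frames_shown(45, 60)[:12])   # [0, 0, 1, 2, 3, 3, 4, 5, 6, 6, 7, 8]
# 144 FPS on a 60Hz panel: indices skip, so rendered frames are dropped.
print(frames_shown(144, 60)[:12])  # [0, 2, 4, 7, 9, 12, 14, 16, 19, 21, 24, 26]
```

The tear itself appears when the GPU swaps in a new frame partway through a refresh, which is why keeping the two rates in step matters.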

For the best performance, frame rate and refresh rate should both be as high as possible and closely matched.

There are eSports-oriented monitors out there that go all the way up to 240Hz.

But is it worth getting one of these for typical gaming? 

According to the Nvidia study, higher frame rates and refresh rates make game animations smoother, allow better tracking of opponents, and reduce choppiness and latency.

[Image: latency comparison]

Nvidia uses CS: GO, a popular competitive shooter, as its example. At 180 FPS, the average Kill-Death ratio is 90% higher than at 60 FPS.

[Graph: Kill-Death ratio vs framerate, from the Nvidia study]

There could be other factors in play affecting the Kill-Death ratio

For starters, the market segment that buys (well, can afford) high refresh-rate monitors includes a disproportionately high number of hardcore professional eSports players. Their Kill-Death ratios are high anyway, so we can’t be sure that correlation implies causation here. Still, noticeably better latency and smoothness are likely to translate into more competitive play.

GadgetGuy’s take – is 240Hz just like the 5G hype?

You bet. Telcos continuously tell us that 5G is the panacea for all our mobile ills. By the time you see any respectable 5G download speeds, you will be a lot older but no wiser for swallowing the telco Kool-Aid that says you need it.

We’ll level with you here. High framerate monitors do not matter for the typical GadgetGuy reader. I’m an avid gamer. I have a professional setup and game every day. But I don’t game at higher than 60Hz.

Why is that? 60Hz is already very smooth. And, unless you are a competitive multiplayer gamer, there is no real advantage to higher refresh rates.

For casual gamers, image quality trumps higher framerates. The choice is between running at high framerates or running at a higher resolution. No graphics card in existence can do, for example, 4K at 144 FPS. I’d pick 60 FPS at a stunning 4K resolution any day over higher framerates at a lower one.