That is surely untrue, I can see a massive difference between 30fps and 60fps.
Why didn't you mention that with 300 fps you have a lot less input lag, even on a 60 Hz monitor?
Why are we arguing over how many frames the human eye can see? It doesn't even work in frames per second, and technically everything you see is upside down and our brains flip it around. Regardless, there is no generally accepted "FPS" humans can see. I've heard that pilots have been tested to pick out a single frame at 300 FPS, but I wouldn't be able to provide a source if someone asked, while other sources suggest what Wyoming stated.
But anyway, there's very little reason to go any higher than 144 Hz. Think about it: at 30 FPS that's one frame every 33.33 ms, and plenty of people's ping is lower than that; at 60 FPS it's a frame every 16.67 ms; at 120 FPS, a frame every 8.33 ms; at 240 FPS, one frame every 4.17 ms.
Now think about it: do you notice any difference playing on 5 ping versus 10 ping? No? Because that's roughly the difference between 120 FPS and 240 FPS.
Yeah, human eyes do not see in frames, they see in constant waves of light.
A frame is a still image; 30 frames means 30 images updated per second. 30 frames, or 60, isn't enough information to distort the image in your eyes into something like motion blur. If human eyes could only see 30 frames, then that would be all you'd need for motion blur as a natural phenomenon, and thus it wouldn't be something you can toggle on and off in games... To say we can only see in 30 fps, 60, or whatever number you want to throw out, is factually incorrect.
You can wave your hand in front of your face and get motion blur. It won't be blurry if you do the same on camera, because the 60 frames per second the camera records isn't enough to fool our eyes. We don't see in frames.
Here are the frame times for each value:
30 fps → 33.33 ms
60 fps → 16.67 ms
75 fps → 13.33 ms
120 fps → 8.33 ms
144 fps → 6.94 ms
165 fps → 6.06 ms
240 fps → 4.17 ms
300 fps → 3.33 ms
360 fps → 2.78 ms
390 fps → 2.56 ms
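The numbers above are just 1000 divided by the frame rate. A quick sketch to reproduce them (Python, chosen here purely for illustration):

```python
# Frame time in milliseconds: one new frame every 1000 / fps ms.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (30, 60, 75, 120, 144, 165, 240, 300, 360, 390):
    print(f"{fps} fps -> {frame_time_ms(fps):.2f} ms")

# The 120 fps -> 300 fps jump:
print(f"delta: {frame_time_ms(120) - frame_time_ms(300):.2f} ms")  # -> delta: 5.00 ms
```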
So the jump from 120 fps to 300 fps makes a 5 ms difference.
The difference is probably extremely hard to notice, but you don’t necessarily need to notice it to profit from it. Your brain will still receive a new image 5 ms faster, even if you can’t tell.
Whether it's worth it is a question only you can answer for yourself, though.
Note that your monitor has to actually support the higher refresh rate.
If it doesn't, then all that exceeding the max supported frequency of your monitor/TV does is make the image on screen a bit more "up to date" on average; it doesn't improve the displayed frame time itself.
I personally would rather get a 240 Hz monitor with backlight strobing (like the BenQ ZOWIE XL2546K) than a 300+ Hz one for the same price, because that actually makes a visible difference.
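To make the "a bit more up to date" point concrete, here is a rough back-of-the-envelope model (my own simplification, ignoring tearing, vsync queuing, and the rest of the input pipeline): with an uncapped frame rate, the frame the monitor scans out is on average about half a render interval old, so rendering faster shrinks that average age even on a fixed 60 Hz panel:

```python
# Simplified model: the frame being scanned out is, on average,
# half a render interval old (assumes vsync off, ignores tearing).
def avg_frame_age_ms(render_fps: float) -> float:
    return (1000.0 / render_fps) / 2.0

# Same 60 Hz panel, different render rates:
print(f"{avg_frame_age_ms(60):.2f} ms")   # rendering at 60 fps  -> 8.33 ms
print(f"{avg_frame_age_ms(300):.2f} ms")  # rendering at 300 fps -> 1.67 ms
```

So under these assumptions the average "staleness" of what you see drops by several milliseconds, even though the screen still only updates 60 times a second.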
Not really
However, 60 Hz vs 144 Hz is like 30 fps vs 60 fps.
You won't want to go back.
I'm pretty sure the human eye can't see above 75 fps.
I have a 144 Hz monitor, I cap at 72 FPS, and I can't tell the difference between that and 144 as far as smoothness goes. Input lag may be reduced, however.
This is… Not correct… At all…
Framerates are strange. Since I was a console guy for so many years, I never really cared about framerates. Then I got a PC and still didn't have a clue what they were, until later, when graphics cards got more advanced, I learned about 60 fps. That's when I started noticing framerates and always tried to hit that 60 fps. Then one day I got a 144 Hz monitor and could afford top-of-the-line graphics cards. Now I'm spoiled: I need to be near 144 fps, and I really notice it if I drop any lower. Forget about playing anything under 60 fps, that's just painful.
For some context, I ran an experiment and found that my accuracy increased drastically up to about 90-100 FPS. After that it started to plateau, and it didn't fully flatten until around 200 FPS.
If you want to maximise your gameplay, 120/144Hz is a minimum, 60Hz is way too low. 240Hz is a luxury and not really needed, but still helps a little.
Interesting point: old chonky CRT monitors ran at around 75 Hz, sometimes up to 110 Hz. It wasn't really until LCD and LED screens came about that displays were limited to 60 Hz.
I can say from experience that the differences between 30/60/144/240 are distinct, even in blind tests.
Ping-compensation exists. Input-lag compensation doesn’t.
If you are a mutant like Dafran or some other pro player, yes. For your average player with 130-160 ms reaction time, no. What matters more than frame rate for most people in actual play is frame stability: 300 fps that keeps dropping to 240 or lower and back up is, for many, actually worse than a steady 150 fps.
I've heard that 30 to 60 is very noticeable, as is 60 to 144 Hz. Anything above 144 starts to become redundant, or you can't see the difference. I don't have a clue though, since I've never had a monitor above 144 Hz. Anyone who does... do you notice the difference?
The difference between 144 and 240 is noticeable, but it's not really a bother. I'm fine playing at 144, but I picked up a 240 just so my monitor wasn't the bottleneck.
I'd say 90 is the minimum; anything above 240 is just "don't bother unless you're in esports."
I can tell a slight difference from 120 to 144, but it's slight, and I get used to it very fast. I think the higher you go, the less difference most people can perceive.
The difference between 120-144 Hz and 300 Hz is almost nonexistent when it comes to giving you an edge, but it will look more fluid.
If you've made the transition from 60 Hz to 144 Hz, you can kind of imagine how it could be "more" fluid if you really pay attention, but the gain is much, much smaller.
You get diminishing returns very quickly above 120/144, but there is still a difference. It's not an easily noticeable one even if you're looking for it, though. It's more one of those things you just get used to, and then if you ever dip, things feel a bit off.
To take full advantage of it you need a monitor that can actually display at that refresh rate, but even on a limited display like a 60 Hz monitor, a higher framerate still reduces input lag. So running at 300 fps on a 60 Hz display does still have a benefit; it's just not even close to as good as having a monitor that is >=300 Hz.
Pretty sure you are either blind or you didn't actually change your monitor's refresh rate, because the difference between 72 Hz and 144 Hz is literally night and day, and I think even the most untrained noob would instantly tell the difference.

Pretty sure you are either blind
Please refrain from personal attacks.

72hz and 144hz is like literally night and day and I think even the most untrained noob would instantly tell the difference
It's not at all. This is known as the placebo effect, coupled with post-purchase rationalization.
It looks very, very slightly smoother, but hardly.
I have a 144hz screen and it’s set to 144hz.
Humans can detect changes at up to 600 fps, but obviously there are huge diminishing returns.