Mua ha ha ha …
Yes, it is precisely for that reason that people are quite capable, with unaided eyes (no help from high-speed photography), of distinguishing every frame in a battle between a cobra and a mongoose, or every beat of a hummingbird's wings.
Such nonsense ("Human eyes can see separate frames well past 100 fps") is used by half-educated shysters to sell overpriced gear to the sheeple.
The internet is a dangerous place without a minimum of formal intellectual preparation and critical thinking.
Higher frame rate also reduces input latency. Even if your eyes cannot see the difference between 60fps and 120fps, there is still less latency from the time you push the button until the action executes. On my display, gaming at 60Hz, the input latency is 10ms. If I game at 120Hz, it's 5.3ms.
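For anyone who wants the back-of-the-envelope version: here is a minimal sketch, assuming input latency scales roughly with the display's frame time. The 10ms and 5.3ms figures above are measurements from my own display, not something this math produces.

```python
# Frame-time arithmetic: one reason input latency drops at higher refresh rates.
# Illustrative only; real latency depends on the whole chain (controller, game
# engine, display processing), not just the refresh rate.

def frame_time_ms(refresh_hz: float) -> float:
    """Time between two display refreshes, in milliseconds."""
    return 1000.0 / refresh_hz

for hz in (60, 120, 240):
    print(f"{hz:>3} Hz -> {frame_time_ms(hz):.1f} ms per refresh")
# 60 Hz -> 16.7 ms per refresh
# 120 Hz -> 8.3 ms per refresh
# 240 Hz -> 4.2 ms per refresh
```

The measured numbers won't match exactly because the display's own processing adds overhead, but the roughly-halving from 60Hz to 120Hz lines up with the frame time halving.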
Large rooms.
This, however, is correct. Very small screens with very high resolutions have huge pixel density. This is the only spec here that you gain no benefit from… it's just marketing.
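If anyone wants to see why the density on a phone-sized panel blows past anything useful, here's a quick sketch of the standard pixels-per-inch formula. The sizes and resolutions are examples I picked, not specs from this thread.

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size in inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Example sizes/resolutions (my picks, purely illustrative):
examples = [
    ('6.8" phone, 3088x1440', 3088, 1440, 6.8),
    ('27" monitor, 4K',       3840, 2160, 27.0),
    ('55" TV, 4K',            3840, 2160, 55.0),
]
for name, w, h, diag in examples:
    print(f"{name:24s} ~{ppi(w, h, diag):.0f} PPI")
# 6.8" phone, 3088x1440    ~501 PPI
# 27" monitor, 4K          ~163 PPI
# 55" TV, 4K               ~80 PPI
```

The point being that ~500 PPI on a phone is far past what you can resolve at normal viewing distance, while the same resolution spread over a TV is a completely different story.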
Maybe, but your brain can only process 60 images per second. Or are you trying to claim that your brain can process images faster than a peregrine falcon's?
To you, anything below 60Hz flickers.
To a peregrine falcon, anything below 100Hz flickers.
I don't think that's what he's saying. I can play at 60Hz and 120Hz and tell very easily which is which. My wife says she cannot tell the difference. It's the same concept as audiophiles who have trained their ears to pick up on better-quality audio. Some people don't have the "know-how" to distinguish higher quality. Most people like that have settled for low quality for so long that their body sets the bar there.
Yet I can't see a difference between 60fps and 240fps on YouTube, not even in slow motion. It's all 60fps.
I can't even see a difference between 60Hz and 120Hz on my phone.
YouTube does not support video above 60fps. There are people who record things at slower speeds to demonstrate the difference. To see the difference, one would have to actually sit down and use it. On my Series X, when I switch between 60Hz and 120Hz in Doom, I can instantly tell the difference. It's buttery smooth and more responsive.
Now… I said earlier that the input latency went down from 10ms to 5.3ms. Going to 240Hz, it would probably drop to a little under 3ms. At a certain point you won't be able to discern the added responsiveness, but going from 10ms to 5ms, you can tell.
Maybe a human brain can only process up to 30 full frames' worth of visual information per second, but that doesn't mean people can't see a difference between 30 vs 60 vs 90 vs 120 etc. framerates. The brain processes different amounts of visual information at different rates; small changes can be perceived at a much higher rate. Because of that, higher framerates provide much smoother visualisation of motion, for example.
Obviously every person is an individual, and how well people see the differences can vary greatly. But saying that 30 frames is the actual limit is completely false. There is no hard limit on what one can perceive; in reality it's a very dynamic function.
As I've said, even if your eyes cannot see the difference, the responsiveness of the lower input latency IS there. There are varying sources of input latency, and keeping everything as low as possible is good because it all adds up: a wireless controller, a lower frame rate, a budget-model TV, etc.
And I say that to the people who believe there is no reason to go above 60Hz. They don't understand that there is more to it.
Can you explain why I can't see a difference between 60Hz and 120Hz on one of the best displays (S22 Ultra)?
Try running something at 60Hz vs 120Hz. Reminder: YouTube videos run at 60fps max, so 60fps material on a 120Hz display really won't show the difference.
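Here's a minimal sketch of why that is, assuming the display simply repeats frames for content below its native rate (no motion interpolation): 60fps material on a 120Hz panel just shows each frame twice, so the sequence of images you actually see is identical.

```python
# Mapping 60 fps source frames onto a 120 Hz display, assuming plain frame
# repetition (no motion interpolation). At 60 fps on a 120 Hz panel, every
# source frame is simply shown twice.

def frames_shown(source_fps: int, display_hz: int, n_source_frames: int):
    """Index of the source frame on screen at each display refresh."""
    refreshes = n_source_frames * display_hz // source_fps
    return [r * source_fps // display_hz for r in range(refreshes)]

print(frames_shown(60, 120, 4))   # [0, 0, 1, 1, 2, 2, 3, 3] -> nothing new to see
print(frames_shown(120, 120, 4))  # [0, 1, 2, 3]             -> new image every refresh
```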
When someone says they cannot see the difference between 60Hz and 120Hz, it's always one of two reasons…
- They're not watching "native" 120Hz content. This goes along with those TVs marketed as "Effective 120Hz", which is fake 120Hz.
- Their eyes actually cannot tell the difference. Not everyone's can. Some people's eyes are not trained to tell. My dad cannot tell the difference between 30fps and 60fps, which is crazy to me, because if I switch from 60 to 30, it makes my eyes water.
There are devices for the mass market and specialized devices for laboratories or scientific research. Simply because someone (0.01% of customers) can "perceive" something at the boundaries of human physiological limits (30-60Hz, or 16-18,000Hz in the case of hearing) does not mean that the mass market should be saturated with overkill devices. Why do they try to do that? Simple: high-end devices are much more expensive. It's just making the sheeple pay double for something.
Have you noticed how the corporations try to sell reading devices that can… stay underwater for prolonged periods, at depths of up to 10 or 100m?
If someone needs personal protection, he gets a handgun, not a heavy flamethrower or a 250mm howitzer.
I'm not arguing, I'm trying to understand your perspective. Who are the 0.01%? Is that the percentage of customers who can tell the difference between 60Hz and 120Hz?
Nobody is suggesting going to extremes. When it comes to visual tech, there are caveats. I'm an expert on displays and I'm happy to help anyone understand. I've done this for years to help people select a display that's not perfect but perfect for them.
60 (2x30) or 120 (2x60) FPS screens were produced for stereoscopic 3D. In 3D, as you maybe know better than me, the screen has to generate images separately for the left and the right eye, hence the doubling of the framerate.
Now, telling what percentage of people are able to go beyond the 30fps limit is a task for researchers of human vision. I guess it is a very small number.
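To make the doubling arithmetic explicit, here's a tiny illustrative sketch, assuming a simple alternating left/right shutter scheme (my assumption of how the frames are scheduled, not a spec from this thread):

```python
# Active-shutter 3D alternates left-eye and right-eye images, so each eye
# effectively gets half the panel's refresh rate. Illustrative numbers only.

def per_eye_images(panel_hz: int):
    """Count left/right images delivered in one second of L,R,L,R,... alternation."""
    sequence = ["L" if i % 2 == 0 else "R" for i in range(panel_hz)]
    return sequence.count("L"), sequence.count("R")

for panel_hz in (60, 120):
    left, right = per_eye_images(panel_hz)
    print(f"{panel_hz} Hz panel -> {left} left-eye + {right} right-eye images per second")
# 60 Hz panel -> 30 left-eye + 30 right-eye images per second
# 120 Hz panel -> 60 left-eye + 60 right-eye images per second
```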
Active 3D is a fun technology for sure. It would have been awesome to see how far they would have gone with it had it not gone the way of the Dodo. I have a 47" 3DTV and a handful of movies that I plan to keep.
Alternating images are perceived by the eyes a little differently than a progressive set of images. It's kinda sorta similar to going from 1080i to 1080p. They're both just as sharp, but one is a little cleaner.
I do believe the average person may not immediately know the difference. Heck, even movies are 24fps. Most people who can tell are usually gamers who have had a lot of practice/use with higher refresh rates. Some jumps in tech are not immediately noticeable to some people. I remember my mom couldn't tell the difference between DVD and Blu-ray when my dad upgraded. Now if you go back to DVD, she can tell.
Most gamers say they can't see a difference between 1080p and 2160p, or between SDR and HDR on a 1700-nit display.
As far as resolution goes, if you're talking about gaming monitors, then yes. The screen is so small compared to a TV that the pixel density makes it hard for most people to see any difference. As for HDR vs SDR, it depends on the implementation and calibration settings. On my gaming TV, I can adjust the HDR settings to make it look like crap or make it pop.
A friend of mine recently bought a 49" screen, so they aren't really that small nowadays, unless you're comparing to a real behemoth of a TV.
I'm referring to the average monitor size, which is 27". Some people do have those >40" ultra-wide monitors, and some will even use the new LG OLED TVs as monitors. Above 40" (in my experience) is where most people will start to notice the difference between HD and 4K. Below that, not so much, and I never recommend it because all it does, for the most part, is make your GPU work a lot harder for little to no gain.
A note about ultra-wides: they don't increase pixel density, since they use non-standard resolutions. The horizontal pixel count may be similar to a 4K screen's, but the vertical is no better.
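To put rough numbers on that, using the same PPI formula as the earlier sketch: the sizes here are my assumption of typical monitors (a 34" 3440x1440 ultrawide next to a 27" 2560x1440 panel), not figures from this thread.

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch from resolution and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# Assumed typical sizes: the ultrawide adds horizontal pixels and horizontal
# inches in roughly the same proportion, so the density barely changes.
print(f'34" ultrawide, 3440x1440: ~{ppi(3440, 1440, 34):.0f} PPI')
print(f'27" 16:9,      2560x1440: ~{ppi(2560, 1440, 27):.0f} PPI')
# ~110 PPI vs ~109 PPI
```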
Yeah, personally I don't really get the appeal of a 40"+ screen for gaming. I used one for a while, but man, it's not pleasant in the slightest. (My primary gaming screen died, so for a while I had to use my secondary, the one I use to watch media from my couch, as my primary.)