I agree if you’re sitting at a desk with that big of a screen in front of you. I game mostly on my 65" TV but that’s sitting about 8 feet from my face.
If someone enjoys doing that… I wish them the best.
Some of us do have less than stellar eyesight. I actually need a 40-48" TV as my display. The 32" TV I used to use was readable at 1600x900, but not at 1080p. My current 46" Samsung TV in PC Mode is very readable at 1080p. I haven't had the opportunity to upgrade to a 4K TV yet. As to why I don't use a standard computer monitor: I also watch movies, and I prefer motion smoothing on my Blu-rays and DVDs. I can't stand the awful judder presented by 24/29.97 FPS material. Hell, on a few JRPGs I'll eat the small bit of input latency and activate motion smoothing, because timing isn't super critical there and I can use my prepro's audio delay to sync up the visuals with the audio.
I’m gonna have to wall mount my next TV though. The only Samsung TVs that have the proper RGB layout are 55" and above, which is way too big for a desktop. Being able to play WoW on a large screen does feel nice though.
Case in point, thank you!
Man, I don’t know how you stand the motion smoothing. It looks so weird to me. Yeah, the option is there because many people do like it! Maybe if I used it more, it’d grow on me.
And the JRPG example is a good point; in games like that, input latency isn't so critical. Some people want it on everything.
I did plug my gaming laptop into my 65" OLED and I agree, it was very nice and felt fresh. Wish I could make it easier and play it simply on my Series X but yeah, we know that’ll never happen.
Yeah, a giant 40" monitor on your desk would require you to actually move your head in order to see the extreme corners. I guess that is what you're implying, besides the extra stress on the eyes themselves.
Refresh rates.
I could easily tell the difference between 60 Hz and 144 Hz. I switched to a 144 Hz display a few years back and was blown away by the visual improvement. (I had picked up a 24" Acer Predator with G-Sync.)
Now, could I tell the difference between that and, say, a 200+ Hz monitor? I don't know. But I could easily see the difference between a 60 and a 100.
My older gaming laptop that I sold a few years ago had built-in 3D support and did 120 Hz in non-3D mode. That thing was impressive, for a Dell.
I used to play D3 on it, IN 3D! Complete with the glasses and everything. It was actually surprisingly cool looking, and you could tell the game was designed around it a bit.
But it was graphically demanding, hardware-limited, and ultimately I moved on from it.
Still.
Yes, people can tell the difference, but not everyone has the same set of eyes. So just because you can't tell doesn't mean someone else can't.
Game on.
Yes, that's what I'm talking about. I'm sure there are varying reasons for wanting that. I'm sure some do it as "look what I got!" material, not thinking that now they'll have to move their neck muscles, etc.
It's an acquired taste if you've never used it before or don't use it enough to become accustomed to it. The reason many don't like it is that it can (and does) introduce artifacts every so often depending on the scene, especially in vertical panning scenes. Newer TVs are better with that, though. I only activate that feature if the content is a divisor of 120 (24/30 FPS); otherwise you get hitching, such as with 60 FPS material, since this is a 60 Hz panel.
Mini-LEDs are where it’s really going to go in the near future. OLEDs still suffer burn-in risks aplenty, which are always front and center with games that have static HUDs.
Star Ocean 4 is really nice with motion smoothing on since even the remastered version is more often 30 FPS than 60 FPS. But the downside to motion smoothing is that if the source material is 4:4:4 chroma you lose some color fidelity because outside of PC Mode all TVs use 4:2:2 chroma subsampling.
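For anyone wondering what that 4:4:4 vs 4:2:2 difference actually costs, here's a toy sketch (mine, not from anyone's post; the color names are just placeholders) of how 4:2:2 shares one chroma sample across each horizontal pixel pair, which is why fine single-pixel color detail from a PC gets smeared outside PC Mode:

```
# Toy illustration of chroma subsampling. 4:4:4 keeps one chroma sample per
# pixel; 4:2:2 shares one chroma sample across each horizontal pair of pixels.
pixels = ["red", "blue", "red", "blue"]                 # hypothetical chroma per pixel

yuv444 = list(pixels)                                   # every pixel keeps its own chroma
yuv422 = [pixels[i & ~1] for i in range(len(pixels))]   # each pair reuses the left sample

print(yuv444)  # ['red', 'blue', 'red', 'blue']
print(yuv422)  # ['red', 'red', 'red', 'red']  -> the alternating blue detail is lost
```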
Sadly I’m stuck with this scenario because I have just one eye to see with. Even with a 24" screen like the Westinghouse monitor I had 15 years ago I can’t see the whole screen with just one eye at “proper” distance, and can’t read at further distance. I can see the pixels at proper distance if I’m paying attention thanks to the cataract surgery I had this year, but I’m still stuck with just one eye.
Oh how I wish I could use VR. Sadly again, one eye.
I know the whole mini-LED thing is supposed to take away the fear of burn-in, but I always tell anyone who asks me about getting one to just take care of it. I don't leave mine on constantly or with static images. You get used to that very fast. It would be cool if they could improve mini-LED to compete with OLED's contrast ratio in full, and especially its response time. If they could do the latter, that would really make going OLED a harder sell.
Have you ever watched a movie on a screen that can properly refresh at 24 Hz? Hint: computer monitors can't unless you specifically set the refresh rate that low.
A computer monitor can't display 24 FPS content like movies properly. I guess in theory 120 Hz and 144 Hz monitors should be able to.
I’m guessing that it’s because my eyes have been so used to watching 24hz content for movies.
You clearly didn’t understand.
A 60 Hz monitor can't display 24 FPS properly even if you tried.
60 ÷ 24 = 2.5
To get a smooth experience you would have to show every frame 2.5 times. Result: screen tearing or choppy playback.
Now watch a movie on a console or Blu-ray player attached to a TV and it's a lot smoother, because the player is outputting at 24 Hz instead of the 60 Hz that your GPU does.
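To put numbers on that 2.5 figure: a fixed 60 Hz panel can only hold each frame for a whole number of refreshes, so 24 FPS ends up in the uneven 3:2 cadence (one frame shown for roughly 50 ms, the next for roughly 33 ms), which is the choppiness being described. A quick illustrative sketch, not from anyone's post:

```
# How 24 fps film maps onto a fixed 60 Hz refresh (3:2 pulldown).
# Each film frame "owes" 60/24 = 2.5 refreshes, but only whole refreshes
# can be shown, so frames alternate between 2 and 3 refresh cycles.
REFRESH_HZ = 60
FILM_FPS = 24

def pulldown_pattern(num_frames):
    """Return how many refresh cycles each film frame occupies."""
    pattern, error = [], 0.0
    for _ in range(num_frames):
        error += REFRESH_HZ / FILM_FPS   # 2.5 refreshes owed per frame
        shown = round(error)             # whole refreshes actually used
        pattern.append(shown)
        error -= shown
    return pattern

print(pulldown_pattern(8))  # [2, 3, 2, 3, 2, 3, 2, 3] -> ~33 ms / ~50 ms hold times
```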
A 60 Hz display can easily show 24 FPS material properly. 60 Hz is merely the upper limit to the refresh rate. Computer monitors can be brought down to 24 Hz, though you're likely to get a headache doing so, just like when TVs do so without motion smoothing (that has more to do with 24 Hz/FPS being the lower bound for perceiving motion). That doesn't mean there won't be judder - there will be, especially on computer displays that don't have anti-judder circuitry like TVs do. I actually have a resolution hotkey preset for 24 FPS that I use for film-based DVDs (video-based DVDs require 29.97/30 Hz).
This works very well with Smooth Video Project (SVP) since modern GPUs are far more powerful than those found in TVs.
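If anyone wants to script that refresh-rate hotkey rather than digging through display settings every time, here's a rough sketch using pywin32 (my assumption for illustration; not necessarily how TheTias's preset works). Whether the panel actually accepts 24 Hz depends on the modes it exposes:

```
# Rough sketch (assumes pywin32 is installed): drop the primary display to 24 Hz
# for film playback, then restore 60 Hz afterwards. The display driver will refuse
# the change if the monitor/TV does not advertise a 24 Hz mode.
import win32api
import win32con

def set_refresh(hz):
    devmode = win32api.EnumDisplaySettings(None, win32con.ENUM_CURRENT_SETTINGS)
    devmode.DisplayFrequency = hz
    devmode.Fields = win32con.DM_DISPLAYFREQUENCY
    result = win32api.ChangeDisplaySettings(devmode, 0)
    if result != win32con.DISP_CHANGE_SUCCESSFUL:
        raise RuntimeError("Display refused %d Hz (code %d)" % (hz, result))

set_refresh(24)   # before starting the film
# ... watch the movie ...
set_refresh(60)   # restore the desktop rate
```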
Well, this had glasses that flickered each eye at 60 Hz, alternating with a 120 Hz screen to produce the 3D effect on the laptop screen directly, but no matter. No stereoscopic 3D gear will work with your condition, unfortunately.
Then how do YouTube videos that play back at a max of 30 FPS look fine on a 60 Hz display, or a 100, or a 144? And those aren't lowered to match?
24 FPS video would definitely look choppier than, say, 30 FPS, and I can tell the difference between them, but that's when referencing a PC monitor. Hooking a computer to a TV is a completely different experience, and yes, the results are going to be a lot uglier. I have a 48" 4K Samsung TV (which I completely hate for a different reason) that, when playing video back on it from the PC, looks horrid compared to my PC monitor. No matter what resolution the video is actually playing at, it still looks bad. I can even force the TV to standard HD resolution and it can still be jittery and whatnot.
So I agree with the aspect that some things don’t work well when coming from a PC to a TV, but that isn’t going to translate the same for every device, every TV, or every computer.
As for hating my Samsung TV, it's the fact that they install apps you don't want and can't remove. I used to be able to remove them by entering Developer mode and tricking them off, but they keep getting reinstalled, and now even that doesn't work. As a result, I have like 4 apps that I wanted to put on, a few that I can't remove but do use, and a crap ton I don't want and can't delete, which leaves me with almost no storage left on the dumb thing.
Bottom line, Samsung TVs suck. Kinda like their phones. (yeah, you can’t delete their crap from the phones either).
Game on.
You need to go into the device name and change it to “PC”. If you’re using Game Mode it’s still applying post-processing and using chroma 4:2:2 (4:2:0 for older TVs). Samsung TVs are weird like that. You must use the device name change function to specifically select PC mode. That’s the only way to access it. I know this because I have a Samsung UN46H7150 TV as my display. Everything is pixel perfect and butter smooth. Once PC mode was activated I was able to calibrate a proper 2.2 gamma curve as well.
Edit: To edit the device name you hit the Source button on the remote, then hold the remote’s “Enter” button down until you see the Edit menu pop up. Select “PC” for use with a PC. That is the only mode that allows 4:4:4 chroma on Samsung TVs.
As for the bloatware, I don’t use the apps on the TV. This is after all a computer/console display. If I want Netflix or whatnot I’ll do it from the console or computer itself and have a smooth experience (oh who am I kidding - I’ve never had good luck with streaming outside of youtube videos).
In that case, why did the Dell monitor I used to watch movies on with my PC have choppy playback?
Compared to really smooth playback (DVDs/Blu-rays/streaming) on a Series X attached to a Sony XE70?
That can vary from display to display. Some displays with the same refresh rate don't look the same as others, even though on paper they should.
Also, I’ve seen displays where person A says it looks like crap and person B says it looks perfectly fine to him. Maybe you’re more sensitive to that. Some people are more sensitive to various display attributes… motion blur, etc.
Might have more to do with the player or the source than the FPS difference. I’ve watched 24FPS videos on my system without problems before.
Yeah I seem to recall a setting for that. I may or may not have changed it, I don’t remember. I don’t really care, I normally don’t connect to that TV for much in that fashion. lol
What TheTias is referring to can be an issue not only on Samsung but on other displays too... monitors and TVs. It varies a lot. On my LG TV, I don't get full 4:4:4 chroma sampling unless I set the input to PC. It's something you shouldn't HAVE to do. In this day and age, with HDMI 2.1 being so smart and able to detect certain things, you'd think it could "detect" what we're doing and adjust accordingly.
If you left the display at 60 Hz while playing 24 FPS material, of course you're going to get 3:2 pulldown judder. Setting the refresh rate to 24 Hz solves that, as the display is then only showing each frame in its native 41.666 ms window. The reason your console appears smoother is because it's setting its GPU to output at 24 Hz with the appropriate content. The same is achievable by manually switching refresh rates on the PC. Of course, unless your computer monitor has anti-judder circuitry in it (some do, some don't), you'll get the judder you experienced.
Believe me, I’m quite well versed in this after many years of using both TVs and monitors as displays and learning the quirks of both. Oh, and panel type does play a role. IPS will be better for color but worse for both viewing angle and potential judder. VA panels are the best compromise for gaming/video mixed use.
IPS typically has 178°/178° viewing angles, and so does the Dell U2312HM I'm talking about. Show me a VA or TN panel with wider viewing angles.
Manually change to 24 Hz? That requires too much effort when you can just check during initial setup that you have "allow 50hz" and "allow 24hz" enabled in the settings. So nice to jump from Ni No Kuni to Disney+ and back to the game on the Series X.