The giant black bars on screen can be removed; it’s a feature on the GPU software side (image scaling). You won’t gain any extra field of view, and the game could feel a bit weird because it loses the correct proportions. The black bars exist to keep the aspect ratio without distorting the image. If you want, you can enable scaling in your GPU software and see if you like it.
Try getting some glasses or just getting your eyes checked in general.
There’s a reason why the pros and pretty much everyone in the high ranks use a high refresh rate monitor. It helps a lot.
And I literally posted visual proof that it helps a ton. Keep your little anecdotal opinions to yourself.
You suggest stretching the 16:9 image, which is not a solution. The game would just get completely distorted and unplayable.
You could stretch everything to fit everywhere that way.
My monitor is the same; it’s a 3840x1080 resolution and it still has black bars on the side. It drives me up a wall and ruins the gameplay for me.
Yep, that’s why I said: “see if you like it.”
If the game doesn’t support that resolution, you either keep those borders or you stretch the image. His complaint is about the black borders; maybe he doesn’t know about that feature, or maybe he’d actually prefer it to stretching. It’s his choice to make; our job is to propose something useful.
We can’t change the internal resolution of the game; the devs can, but that doesn’t mean they will. So until they change it (or don’t), he can actually try that solution. If he likes it, good; if not, he can stay as is. Don’t throw away a possible solution for him just because it doesn’t suit you. I know a lot of people who actually prefer having the screen stretched instead of black borders. I just play in a window to avoid all of those issues, and because I use 2 monitors, sometimes even 3. So maybe stretching or windowed mode could help him.
I never said that high refresh rate isn’t beneficial. I wouldn’t even play some games without having access to 120Hz+ gear (especially when it comes to VR). The visual proof you posted is about two games in which high refresh rate matters a lot, much more than in a game like OW. I mentioned this in my previous post but you probably forgot to read it in full.
In some games high refresh rate isn’t as important as in CS or Quake. Good examples are StarCraft and Diablo, but I’d add OW to that list too, depending on the hero you play. 144Hz has become so inexpensive that most people can afford it (especially for OW, which doesn’t require as strong a CPU+MOBO+etc as some other games do at 144fps). However, if I had to recommend games to a child in a poor family that can’t afford 144Hz gear, OW would be on that list.
BTW, my eyes are fine, thank you. Your attitude isn’t. Whatever crap is going on in your life, I’m hoping that things will sort out for you eventually.
I believe you, because it’s a fact: it has held true for years and hasn’t been disproven yet.
The difference in Hz is easiest to notice at the edge of your vision, because that’s where our eyes perceive changes most accurately. In the center it’s really bad, but most people won’t notice. The extra frames only make the image smoother for us because we only “catch” a few of them (the ones with sudden changes).
Sure, 30Hz vs 60Hz is really noticeable because of how our eyes work: central vision can register changes up to around 65Hz, though most of the time it’s far less (with light bulbs, in certain conditions, you can actually notice the flicker). A stable refresh rate is also far more comfortable than an unstable one. Above that point the experience improves mainly because your peripheral vision picks up more of the changing stuff, but that doesn’t mean your reflexes or precision improve. In the central area it only helps if the frames are stable and the surroundings of your target change, because you perceive that with peripheral vision; our focused vision is just garbage at this.

Most people who say “more fps is better” say it because they can notice sudden changes more accurately at the edge of their vision, and that part is true: when you focus on a target, you notice changes around it more than changes on the target itself (blinking dots, vibrating images and optical illusions are effects based on this). In FPS games, as you noted, hitscan feels more “comfortable” because you focus on a point and your peripheral vision catches the sudden changes.
When you add input lag and delay to the picture, more fps can reduce them, but if they shrink by an unstable amount it’s as bad as unstable fps. So when people think “more fps means better players”, that’s not true; it just gives them more “hints” in their peripheral vision. If their reflexes are bad or their input lag is bad, they’ll perform even worse, and that’s where the placebo effect comes in. Most people who play at high fps play at unstable fps, so for them it’s placebo. Those who lock their frames at a stable value are the ones who get the advantage. But like I said, more perception doesn’t mean a better player.
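To put rough numbers on that (a quick sketch of my own, nothing measured in any game): the latency slice contributed by rendering is simply the frame time, 1000/fps milliseconds, so higher fps shrinks it, but only a steady rate shrinks it consistently.

```python
# Rough illustration (my own numbers, nothing measured in OW):
# each rendered frame contributes about 1000 / fps ms of latency.
def frame_time_ms(fps):
    return 1000.0 / fps

for fps in (30, 60, 144, 240):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")

# An unstable rate means this slice keeps changing; a swing between
# 144 and 100 fps adds ~3 ms of unpredictable jitter per frame:
print(frame_time_ms(100) - frame_time_ms(144))  # ~3.1 ms
```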
Most of the time in Overwatch you won’t notice anything different if you aren’t shooting at anything. You’re not focusing on anything, so your “vision” takes in the whole screen, and in that scenario anything above 65Hz appears almost the same. You perceive more when you try to shoot things, like I said before.
Central vision is where your eyes focus; it notices changes at roughly 7 to 13 Hz. Certain conditions can push that perception up to around 65Hz, but the baseline is still 7-13Hz.
Peripheral vision is what keeps us aware of our surroundings; there are reports it can pick up flicker at around 800Hz, sometimes even more. So when you focus on a single point with a scope (which blocks a lot of your overall vision area), you will notice changes around that point better, but at the center it will still be the same 7-13Hz.
Experiencing that isn’t equivalent to better player skill: perceiving something doesn’t mean you will react to it in time. It doesn’t make you a better player, but it gives you a slight advantage in certain scenarios.
This would’ve been niche years ago, but since then almost all phones have gone bezel-less with rounded screens, so now pretty much all apps are optimized for it. But yeah, if more and more people become interested in 21:9 curved screens, developers will cater to that, as they already have with our phones.
Blizz has double standards. They’ll never admit it, but they do.
Interestingly most people don’t even check (or know about) the actual input lag of their monitors… Two different monitors at the same refresh rate can have very different input lag values and some standard 60Hz desktop monitors (similarly to TVs) have unusually bad input lag. This contributes to the (unfairly) bad reputation of 60Hz.
Games themselves can have very bad input lag (several frames!!!) just as a result of poor implementation. (I fixed quite a few bugs like this when I worked as a game programmer.) For this reason a bad implementation might require a multiple of 60 FPS just to be able to achieve an input lag similar to that of a well implemented game at 60 FPS. Fortunately OW doesn’t seem to have this kind of problem.
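As a back-of-the-envelope sketch of why that hurts (my own toy model, not OW’s or any engine’s actual pipeline): each buffered frame adds one full frame time on top of the monitor’s own lag.

```python
# Toy model (mine, not any engine's real pipeline): input lag from
# frame buffering is roughly buffered_frames * frame_time plus the
# monitor's own processing delay.
def input_lag_ms(fps, buffered_frames, monitor_lag_ms):
    return buffered_frames * (1000.0 / fps) + monitor_lag_ms

print(input_lag_ms(60, 3, 10))   # badly implemented game: 60.0 ms
print(input_lag_ms(180, 3, 10))  # needs 3x the fps to fix it: ~26.7 ms
print(input_lag_ms(60, 1, 10))   # well implemented game at 60: ~26.7 ms
```

That’s the “multiple of 60 FPS just to catch up” situation in three lines.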
I can’t tell the difference between a fixed 120Hz and a fixed 144Hz, but I know something’s wrong when I try to flick aim with an FPS that fluctuates between 144 and slightly lower values.
Stable frame rate requires video settings that result in less than 100% average GPU load outside of a fight (I aim for 60-80% in case of OW). This way there is headroom for extra effects/explosions/objects during battle without dropping the framerate. Another thing I like about less than 100% average GPU load is less heat, slower fans and less noise.
Those who put constant 100% load on the GPU without locking the FPS usually end up with the worst FPS fluctuation during battle (because of the extra effects/objects/explosions on the screen) when it hurts the most.
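For anyone wondering what an fps cap actually does: it essentially pads every frame out to a fixed time budget so the GPU gets idle time each frame. A minimal sketch (illustrative only; real limiters are more precise about timing):

```python
import time

TARGET_FPS = 120
FRAME_BUDGET = 1.0 / TARGET_FPS  # seconds per frame

def update_and_render():
    pass  # stand-in for the real simulation + draw calls

# Minimal frame-limiter sketch (not how OW implements its cap):
# do the frame's work, then sleep off the leftover budget so the
# GPU idles instead of sitting at 100% load.
while True:
    frame_start = time.perf_counter()
    update_and_render()
    elapsed = time.perf_counter() - frame_start
    if elapsed < FRAME_BUDGET:
        time.sleep(FRAME_BUDGET - elapsed)
```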
Yes, monitors can be better or worse than each other regardless of refresh rate, and it’s good to keep in mind that software and drivers add lag too. If something is using too much of your CPU it can create a big bottleneck, and the same goes for the GPU. It’s worth pointing out that most applications can’t use the CPU’s full potential; most of the time my CPU sits idle because the application doesn’t scale across cores. I have an FX-8350 and I noticed several improvements when DirectX 12 and Vulkan increased their support for more cores, though the practical limit is still around 6 cores.

One of the most common tests I do: if I can swap between applications with only a small performance penalty, it means the application isn’t using enough of my CPU, because if it were, I couldn’t swap that fast and lose only a little fps (I use a heterogeneous GPU pair with an individual display for each). I tried Intel i7 CPUs (2nd to 5th gen) several times and got bad results (freezes, crashes and bad multitasking performance); to my surprise the FX-8350 stays really stable. Sure, it usually won’t hit top FPS in many games, but I can play at a stable rate while doing other stuff. Sometimes it slows down, but it has never frozen or crashed. Even today it performs really well: it’s not the fastest, but it can handle almost anything without issue.
It’s easier to notice in fast-paced situations and on heroes with a scope, like Widow, Ana and Ashe. Sure, each FPS increase gives less noticeable returns past 65Hz, but in certain moments, under certain conditions, you can still notice it. In VR and the like you need 120Hz or more for a decent experience.
It is; I almost never get saturation on my GPU side. Currently I have a GTX 1080 and an R9 380 (which sometimes suffers when I scale up certain settings or force 4K), but normally it’s things that bottleneck my CPU: shadows and anything too reliant on the CPU. Partly because it’s not a well-optimized architecture, but also because software mostly can’t use its full performance.
My issues vanished when I put a water cooler on my CPU; I had a lot of thermal problems in the past. I normally aim for stability, and I apply that rule more or less everywhere. First I check every setting and gauge whether it leans more CPU- or GPU-bound, then I tweak for optimal performance with a decent look. I don’t care much about raw FPS if the experience turns out poor. Some features I really don’t care about (excessive brightness, certain types of blur); others, like resolution and antialiasing, I like, but I can compromise on them if the textures still look great. The rest I don’t care much about.
With the FX-8350 and GTX 1080, I set everything to ultra except shadows (medium/low) and dynamic lighting (medium). (100fps)
With the FX-8350 and R9 380, I set everything to high except shadows (medium/low) and dynamic lighting (medium). (80fps)
If I drop every setting I can reach 180-240-ish, but it’s unstable, falling to 120-160-ish, and the game gets really ugly.
I think Overwatch uses DX11 on Windows, so if OW2 gets DX12 or Vulkan it could be great for multi-core CPUs. My FX-8350 would love it. Also, I noticed recent patches weirdly made my CPU perform a bit better; not exactly sure why.
About DX12 and Vulkan: I really notice the benefits when I play something based on them; getting use out of those extra cores is really cool.
People switching from 60Hz to 120Hz+ often hit some kind of CPU limit. The FPS is a multiplier on both the GPU and CPU load but unfortunately only the GPU has another multiplier (video settings) that can be lowered if necessary.
This is why the first thing I usually check in a game is the limitations of the CPU+MOBO+RAM: the max (stable) FPS that can be reached on lowest video settings without GPU bottleneck.
If the CPU isn’t enough for a given frame rate then usually there isn’t much the player could do (other than lowering the frame rate). Perhaps some OC if the CPU allows it and has a good enough cooler and VRM on the MOBO (but this assumes some kind of high-end hardware that is perhaps a bit old/outdated). In case of low end hardware the only solution is replacing the CPU+MOBO+RAM combo (it usually isn’t only the CPU’s fault…).
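A simple mental model for that check (my own sketch, assuming per-frame costs stay roughly constant): the frame rate is capped by whichever of the CPU or GPU takes longer per frame, and video settings mostly shrink only the GPU side.

```python
# Toy bottleneck model (my sketch, not a profiler): a frame completes
# only when both the CPU and GPU work for it is done, so fps is
# limited by the slower side. Settings mostly scale the GPU cost.
def max_fps(cpu_ms, gpu_ms):
    return 1000.0 / max(cpu_ms, gpu_ms)

print(max_fps(cpu_ms=6.0, gpu_ms=12.0))  # ~83 fps, GPU-bound
print(max_fps(cpu_ms=6.0, gpu_ms=6.0))   # lower settings: ~167 fps
print(max_fps(cpu_ms=6.0, gpu_ms=3.0))   # still ~167 fps, CPU is the ceiling
```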
Certain settings can be implemented on the CPU side, like physics, shadows and some games’ audio features; those are the ones I usually tweak. But some games put other features on the CPU. In Diablo 3, for example, it’s mostly the physics and audio features that burden the CPU.
But sure, reducing everything and then testing is one of the best ways to see if the problem lies in the rest of the hardware.
I’ve seen a lot of PC setups that perform badly because their components don’t work in sync, mostly RAM + MOBO or RAM + CPU. Too little RAM, a bad internet connection or a mechanical disk can cause drops at certain moments. It’s worth mentioning that low-cost MOBOs usually have VRM problems; I had them on my first board for this CPU, and when I changed to a more reliable one the problems went away.
I literally tuned each part of my gear to prevent those issues; my CPU runs at 4.3GHz because I synced it with the MOBO and RAM. I needed that because I use 2 GPUs (a heterogeneous pair) and 5 disks (2 of them SSDs): one for the OS and certain software, another for games, and the other 3 for long-term storage and backups.
I think this gear can handle at least 3-5 more years and perform OK for 10. If I got a decent deal and needed the extra performance, I could set this PC up as a multi-station machine for other stuff and build another one. But I don’t think it’s a good time to buy new gear right now; too much groundbreaking stuff is happening in a short period of time.
They did instantly change the fps cap of Overwatch when 360Hz monitors came out.
Not everything is racist, calm down.
They’ve been inactive for 2 years, dude; they aren’t gonna see your comment. But hey, OW2 supports ultrawide!