Hmmm, really? I'm kinda confused now ~ ~
Anti-vax nonsense. Hipster kids throwing tomatoes at their dorky parents.
Loss of vision – really? You really believe that in the total absence of evidence? I’m substantially older than you and my vision is better now than it has ever been before, thanks to corrective surgery. I see an advantage in 144 FPS over 60 FPS, and there is nothing special about me in this regard.
I was competitive in martial arts. I fought in tournaments, and I had enough experience to know that the younger stronger guys were the opponents I looked forward to. The old guys who joked around and didn’t look nervous? That’s where your day ends.
Age and treachery > youth and strength.
I know you will think I'm nuts, but HD CRT monitors are far better. Those things have nearly instant response times. I'm performing better on my 60Hz CRT than on my 144Hz LCD.
That's the major point I'd like to make. Not everyone has the means for such things, and you're specifically stating that you have better sight now due to that. If you wish to argue from personal experience: I have diabetes, and my eyesight has been steadily getting worse since I was 16. I'm not saying that things can't be corrected by going under the knife. I'm saying that, BASED ON BIOLOGY, the older you get the more likely you are to have factors that hamper your vision, which in turn lowers the value of having a 144Hz monitor. I'm not trying to dissuade people from buying 144Hz monitors.
I’m literally saying “the older you get the more likely it is that having a 144hz+ monitor will not be the thing that pushes you to the next skill rank”
Take your bias, put it on the hanger, and come back when you wish to have a civil discussion for, as it stands, you’re the one acting crazy - not I.
Some monitors can get 0.5ms response times nowadays. It's crazy. Typically a gaming monitor will have 1ms - the extra 0.5ms can be HUGE when gaming.
If you require "evidence" of my claims, I'd like to point you to this article from NVISION Eye Centers: https://www.nvisioncenters.com/education/20-30-year-olds/
Okay, I guess I could have worn contact lenses or glasses forever, but I thought it was a better idea to just get the corrective surgery. My nearsightedness had not changed at all between about 16 years of age and 35 when I got the corrective surgery.
Regardless, it’s not an issue of vision. Show me some evidence here. Convince me with reliable data that I’m a useless old man when it comes to Overwatch, and I should just give up now and only ever play Solitaire on my Windows 95 Gateway computer with a ball mouse.
I wonder how David Gilmour manages to get better with age. How can such an insanely old man still play Pink Floyd solos – actually play them better now than when he wrote them in the 1970s?
This is talking about presbyopia, which is the medical term for "I need reading glasses." That's not an issue in Overwatch. If you can't see the screen clearly, that's farsightedness, and you can get corrective lenses for that… which is what the article you linked to was selling. Or you can just move the monitor back to a reasonable distance… you shouldn't have "reading glasses" issues with something that sits farther away than arm's length.
The folly of your entire position is that you will soon succumb to it. You're over 30? lol, tomorrow you are going to wake up and be over 40, and you're going to move the goalposts ahead another decade to the "old" generation. "40's fine," you'll say ridiculously, "but people over 50 can't play video games - they're too old!"
That's not what I'm trying to say at all. Play FPS games as much and for as long as you want - try to get every advantage you can. I simply wish for people to understand that getting a 144Hz monitor isn't some sort of holy grail that makes you play better. Upgrading your computer itself, especially to a mid-to-high-range graphics card, can have a much higher return on investment.
Your monitor should be the LAST thing you seek to upgrade (as long as you’re at or over 60hz) if what you want is to improve your gaming performance - especially if you’re a slightly older gamer.
What really matters is your input latency (Ctrl+Shift+N shows it in-game): the higher your FPS, the lower your input latency is going to be. For example, my RX 580 can push 300 FPS if I wanted it to and give me an input latency of around 3.5ms, but at 144Hz it's more like 5ms. Having a monitor that can push 144Hz gives you a more fluid experience but isn't, by itself, a huge advantage.
Ultimately we're limited by the server's tick rate anyway, so this whole topic is kinda moot.
144hz will give you a more fluid visual experience but not much more than that. Period.
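The arithmetic behind those FPS-vs-latency numbers is easy to sanity-check. A minimal sketch (the ~3.5ms and ~5ms figures above are in-game readings, so the plain frame-time formula only approximates them):

```python
# Frame interval in milliseconds for a given frame rate or refresh rate.
# A higher FPS means each frame is newer when it reaches the screen, which
# is why uncapped 300 FPS reports lower input latency than a 144 FPS cap.
def frame_time_ms(fps: float) -> float:
    return 1000.0 / fps

for fps in (60, 144, 240, 300):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):.2f} ms per frame")
```

At 300 FPS a frame takes about 3.33ms; at 144 FPS, about 6.94ms. The in-game counter reports somewhat different numbers because it measures the whole input pipeline, but the trend (more FPS, less latency) is the same.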
About a month ago I bought a curved ASUS 31.5in QHD 1440p 165Hz monitor with insanely low latency. It was about $390. Well worth the price. The size is super optimal, and everything runs clean and smooth at both 144 and 165Hz.
The only problem is we are not getting the full benefit of our monitors with most games. It's not just resolution and better colors. As others have said, it's the latency of your chosen monitor plus the input latency plus the server latency (and, in games like this, the RTT, or round-trip time) that add up.
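Those stages happen in sequence, so a rough "click to server" budget is just a sum. A minimal sketch with illustrative placeholder numbers (none of these are measurements of any particular setup):

```python
# Rough end-to-end latency budget. The stages happen one after another,
# so the contributions add; they don't multiply.
def total_latency_ms(input_ms: float, render_ms: float,
                     display_ms: float, half_rtt_ms: float) -> float:
    return input_ms + render_ms + display_ms + half_rtt_ms

budget = total_latency_ms(
    input_ms=1.0,      # USB polling / OS input handling (assumed)
    render_ms=6.9,     # one frame at 144 fps
    display_ms=5.0,    # panel response + scanout (assumed)
    half_rtt_ms=15.0,  # one-way trip to the server, half of a 30 ms RTT
)
print(f"~{budget:.1f} ms from click to server")
```

The point is proportion: with a 30ms round trip in the chain, shaving a millisecond or two off the display only moves the total by a few percent.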
My argument is that, for most people with a 60Hz monitor, upgrading to a 144Hz monitor will probably have the biggest impact for under $250. Do you know of a GPU around $250 that could have a bigger impact? I mean, if you are going from a $190 GPU to a $250 one, how worthwhile would that upgrade be? I honestly don't know that much about GPUs, so this is a genuine question. Of course, the best area to upgrade is going to vary from setup to setup; in general, the GPU is going to be the next most important thing.
If someone is running a cheap, outdated GPU, they are probably already aware that they should save the $250 and put it towards a graphics card before a monitor. However, I don't think most $250 GPUs are very good, especially right now when everything is out of stock and you have to pay an extra 20%-50% to get one off eBay. Most people probably already have a GPU with decent input lag, and the monitor can have a major impact on input lag as well (though much less than the GPU).
I do need to say that before this I was using a Huion drawing tablet as my monitor, which is not at all designed for gaming. I should have mentioned that in my original post, since there are probably things other than just FPS that improved when I switched to a monitor actually designed for gaming, which might be why my skills improved so much.
However, I do want to add that the higher frame rates have also majorly impacted my endurance. I get fatigued pretty easily and stop playing well after 3-5 games (depending on how well I have slept). I think this monitor is much easier on my eyes and brain, as I don’t feel myself getting tired nearly as quickly.
So your argument that a monitor won't help someone with poor eyesight doesn't make sense. Poor eyesight makes you strain, and anything that reduces that strain will make you game better as well. The fact that it "looks smoother" feels SO MUCH better for good reason: it's literally easier on your body to operate. If you have poor eyesight, upgrading the monitor is probably even more important.
As for the debate on whether you improve with age, I do believe cognitive ability has a major impact on your skill ceiling. If a 20-year-old plays Overwatch for 700 hours, they are probably going to be better than a 50-year-old who plays Overwatch for 700 hours. So what we are really talking about here is the "skill ceiling" a person has.
The vast majority of people are not anywhere close to their “skill ceiling” (basically the maximum level of mechanical skill they can achieve). I would say the average person would need to play a minimum of 3000 hours to reach their skill ceiling. For all players who have reached their “skill ceiling” genetics and age start to have a bigger impact. It’s the same in all high levels of sport.
The difference between the Gold Medal and last place in Olympic swimming might just be that one person is 6 months younger than the other or has longer toes. What I am saying is that the players are so close together in skill, minor differences play a bigger role in who wins. No one is making dumb mistakes anymore and every tiny movement counts.
It doesn't seem like cognition starts to severely decline until you hit the age of 50. I found a very interesting study on this; google "When Does Cognitive Functioning Peak? - BaCH Tech Lab" to find it, since I can't post links. Before that age, a 25-year-old has a slight advantage over a 40-year-old, but again, this can still be compensated for with a little extra training and practice.
You can also compensate by working within your limitations. Cognition is complex: some people are naturally better at precise aim, and others are naturally better at tactics. Overwatch is forgiving because you can really customize which skills you need to have a major impact in the game, based on your unique cognitive abilities. Having superior vision and/or aim does not mean that you will have the biggest impact in this game.
Anyways, I just thought you might find that study on cognitive peak and decline interesting. Some skills peak in your 40s-50s, some peak in your early 20s, some in your 30s, etc. Empathy peaks in your 50s, and isn't communication the most important aspect of this game at high levels of play?
It breaks it all down, and I think it's useful to see how the human brain tends to develop and change over the years. IQ has been shown to impact video game skill, and IQ changes over your life. This is why I think the best way to improve your gaming (outside of practicing the game itself) is actually cardiovascular exercise, as regular exercise generally has the biggest impact on IQ.
If you’re in the market for a GPU I would recommend taking a look at https://www.tomshardware.com/reviews/gpu-hierarchy,4388.html
and comparing what you have to what you’re in the market for.
Upgrading your monitor can be a tricky situation. If ANYTHING in your computer's setup is more than 4 years old, it's probably time for an upgrade no matter what part it is. Especially now, with people buying new GPUs, there will be a ton of people dumping their previous-generation cards onto eBay - just be careful with Nvidia Titans, as people used them to mine crypto hard.
It's probably a good idea to combine upgrading your GPU with a new monitor for compatibility's sake, or buy an adapter.
BTW, I can't post links either, but if you enclose the link in backticks (`) you'll be able to post it.
1920x1080, 24", 1ms GtG, 144Hz, DisplayPort cable = the smoothest upgrade ever, and worth it even if it's a used or refurbished monitor. It will blow your minds.
People say 144Hz is the go-to and that 240Hz is overkill even if your PC is godlike and can hold a solid 300, let's say (average FPS isn't what you look for - it's the 0.1% lows).
Yes and no.
144Hz will do you just fine: go for it even if your PC can handle 240 - the price difference isn't worth it, IMO (I have the 240Hz BenQ Zowie XL2546 with DyAc).
BUT the actual factor 99% of the time with the “240hz is a waste of $” haters iiiiiiiis…
They've never played on 240Hz on an insane machine that can peg a solid 300/400 FPS. Tell them to spend a week on 240Hz on an insane PC, then swap the 240Hz screen for their beloved 144Hz. I guarantee you, as someone who owns both (the 144 is now my second monitor), you will see such a HUGE difference you'll want the 240 back ASAP. You won't notice much going from 144 to 240 in OW, really - but aaayyy, try and go backwards, boys - it's just a gap you won't expect.
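The "0.1% lows" point above can be made concrete: instead of the average FPS, look at the FPS implied by your worst frame times. A minimal sketch (the frame-time list is made up for illustration):

```python
# FPS implied by the average of the worst pct% of frame times.
def percentile_low_fps(frame_times_ms, pct=1.0):
    worst = sorted(frame_times_ms, reverse=True)  # slowest frames first
    n = max(1, int(len(worst) * pct / 100))       # how many frames count
    avg_worst_ms = sum(worst[:n]) / n
    return 1000.0 / avg_worst_ms

# 990 smooth frames plus 10 stutters: the average looks great,
# while the 1% lows reveal the hitching you actually feel.
times = [3.3] * 990 + [20.0] * 10
print(f"avg fps: {1000 * len(times) / sum(times):.0f}")
print(f"1% low:  {percentile_low_fps(times, 1.0):.0f} fps")
```

This is why "holds a solid 300" means holding the lows, not just the mean.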
That feeling when your eyes cannot handle more than 50 FPS. These sticky slow locked eyes @__@.
Generally, what GPU you use will be determined by games other than OW, as it’s a low-req eSports shooter.
I have 2x 144Hz displays, though. One Good IPS main display and a decent TN secondary-display (for running my NLE/DAW/etc. across two screens).
You don’t need a big GPU to run OW at 144Hz 1080p.
You only need a mid-range card for 144Hz QHD.
And I’m talking Medium-High Settings, maybe some things on Ultra.
It has low system requirements. Other games and software will dictate GPU choice, not Overwatch (unless it’s the only game you play).
There is a difference between 240 and 144, IME. But it's not big enough to even consider spending the extra money, to me.
Also, buying a new monitor always puts you at risk of monitor roulette, where you get multiple monitors with permanently stuck or dead pixels, or other issues. I remember going through 4-5 different displays when I upgraded from 75 to 144Hz because I kept getting bad panels (across multiple manufacturers, regardless of how well they were reviewed).
I would even recommend 240Hz if you can afford it.
haha,…
I got a 144Hz monitor 3 weeks ago and did all my PC settings to switch it to 144Hz…
I read your post and realized I have to change a setting in OW itself too…
HAHA, I'm an idiot.
I refused to believe this for years, thinking it was a marketing gimmick like so many other things. Today I bit the bullet and ordered the new Samsung ultrawide 144Hz.
Better save me from being stuck in Silver lol…
Huge difference from 240 to 144 when you try to go backwards (talking OW here, playing as a 4.4-peak player and GM for ~20 seasons). Slight difference going forward, as expected, with a PC that can utilize it with headroom on top. If we're talking other FPS games, with a PC that has the headroom to not only utilize the display but outperform it - heck yeah, 144 to 240 is nasty if you're a top-1% player with the reaction timing to put the extra refresh rate to use on corner peeks and such.
The TL;DR of this thread should be: 144Hz on a budget is the best upgrade ever for PC OW. If you've got unlimited funds and your PC is stout, 240Hz is the hands-down option, of course. But if someone is debating between peripherals and a 144Hz screen, and their PC can utilize the 144Hz, it's a no-brainer - just get the 144Hz and never look back. It gets to the point where you won't even want to use a school or work PC with a 60Hz screen for basic tasks, because the way the cursor moves will drive you up the wall haha.
With my mid-of-mid-tier RX 580 I can only get a stable 144 when playing on low. You'd need at least a $300 card to get medium+ settings at a consistent frame rate.