This makes me even more convinced the GPU is not being utilized and that the game is running entirely on the CPU, causing other tasks, like processing mouse input, to hang.
It’s been experimentally shown that in the presence of motion blur, people can’t tell the difference between frame rates above ~40 FPS. Games generally don’t have motion blur, so 40 can be distinguished from higher frame rates, but not by much. At about 60–80 FPS and higher without motion blur, people can’t tell the difference in motion. So all these “I had 400 FPS and now 300 or 200 or 100 is ‘unplayable’”… mhm.
As for input lag between frame rates of, say, 60 FPS and 400 FPS: who here thinks they’re superhuman enough to notice the difference between a 16 ms and a 2.5 ms frame interval?
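To put those numbers in perspective, the frame interval is just 1000 / FPS milliseconds. A quick sketch (plain Python, purely illustrative):

```python
# Frame interval in milliseconds at a given frame rate: 1000 / fps.
def frame_interval_ms(fps):
    return 1000.0 / fps

for fps in (60, 144, 200, 400):
    print(f"{fps} FPS -> {frame_interval_ms(fps):.1f} ms per frame")
# 60 FPS -> 16.7 ms per frame
# 400 FPS -> 2.5 ms per frame
```

Going from 400 to 200 FPS changes the per-frame delay by about 2.5 ms, far below anything a human can feel as input lag.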
I’m not saying that frame rates didn’t drop, or that some people didn’t drop low enough to SEE the difference. There’s a report in the Tech Support forum from someone getting 7–14 FPS. That’s legit.
However, there is some seriously deluded whining going on around those legit posts.
From 400 to 200/130.
200 FPS = WTF.
These made my day.
You are mistaken.
Just load the game without doing anything, and the FPS will fluctuate between 140 and 210. Previously, the fluctuation was around ±5. Do you understand the difference between the program’s performance at idle, which was stable within ±5, and jumps of ±70? Can’t you see the difference between 140 and 210 with your own eyes? Yes, but no.
This indicates poorly optimized code. It shows that the program is running in bursts. It indicates that something is interrupting or disabling the main application thread for milliseconds. During this time, the game becomes unresponsive.
I assume something is broken. Warcraft was most likely designed to run on a single thread. There is a function marked as “synchronized” that doesn’t allow a second thread in while one is already executing it. And Blizzard added a second thread that calls it, probably polling for their super-useless observer panel (if that’s the case, please remove it, because smooth gameplay is more important than any observer panel).
So now execution looks like this: (game, game, game, game, statistics, game, game, game…). Let’s say the statistics call takes 10 ms to execute. It runs once per second and hangs the game for 10 ms each time. That’s why the game runs in bursts, affecting both the FPS and the mouse.
In a micro-patch, they simply reduced the polling frequency of this statistics function. That’s why not everyone experiences drops below 80.
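A minimal sketch of the burst model described above, with made-up numbers (a 5 ms frame cost and a hypothetical 10 ms blocking “statistics” call once per second; nothing here is Blizzard’s actual code):

```python
# Hypothetical model: a steady game loop whose main thread is blocked
# for stats_ms once every stats_period_ms by a synchronized call.
def simulate(frame_ms=5.0, stats_ms=10.0, stats_period_ms=1000.0, total_ms=3000.0):
    t = 0.0
    next_stats = stats_period_ms
    frame_times = []
    while t < total_ms:
        cost = frame_ms
        if t >= next_stats:           # the stats call blocks this frame
            cost += stats_ms
            next_stats += stats_period_ms
        frame_times.append(cost)
        t += cost
    return frame_times

times = simulate()
print(min(times), max(times))  # steady 5 ms frames with occasional 15 ms spikes
```

The average frame time barely moves, but the periodic spike is exactly the kind of thing an on-screen FPS counter shows as wild fluctuation and the player feels as stutter.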
You’re missing the difference between these two statements:
- The human eye cannot distinguish between 140 and 210 FPS.
- The game runs in bursts, and when it does, it registers FPS between 140 and 210.
I’m not sure why you added all the explanation of how the code executes. That was never part of my point. How it was before or after, how the game runs its instructions, how the FPS spikes with executions, etc had nothing to do with my point.
I’m still stunned to see people complaining about FPS they shouldn’t be getting in the first place. 240 Hz+ monitors are not very common, and unless you have one, you should be capping the frame rate to your monitor’s refresh rate to avoid putting undue stress on your GPU. You will never see those extra frames, because your monitor isn’t displaying them.
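For anyone wondering what a frame cap actually does, here’s a minimal sketch (illustrative only; real engines use higher-precision timers and often busy-wait near the deadline):

```python
# Minimal frame-rate cap: sleep out the remainder of each frame's time
# budget so the GPU isn't rendering frames the monitor never displays.
import time

def run_capped(render_frame, max_fps=60, n_frames=10):
    budget = 1.0 / max_fps            # seconds allotted per frame
    for _ in range(n_frames):
        start = time.perf_counter()
        render_frame()                # do the actual work
        elapsed = time.perf_counter() - start
        if elapsed < budget:
            time.sleep(budget - elapsed)
```

With `max_fps=60`, each loop iteration takes at least ~16.7 ms, so the GPU idles instead of churning out frames that a 60 Hz panel would simply discard.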
This sort of thing has been largely disproven. Humans do not see “frames” the way a computer monitor displays them. And what people are actually capable of perceiving depends on their own biology, with some people having more capable eyes than others.
The numbers we’re talking about in this thread are beyond the range that almost anyone will notice, but having numbers that are below what one is used to is still an issue, whether the person can see it or not. That’s fine and I don’t dispute that at all. I just can’t get over the fact that people are trying to pretend that a framerate beyond what the average computer monitor displays is “unplayable.”
We all know they’re just using words like that to try to pressure a fix.
I can’t believe you continue to talk in this thread when you’re not experiencing the symptoms that make playing impossible. Your comments are irrelevant noise
For someone that is actually here to provide feedback and share their experience read every comment and ignore FPS numbers for a moment. You’ll see over and over symptoms mentioned trying to describe the same problem … recent impact to responsiveness of game, stuttering, etc coming from 1.36 patch (personally playing on classic graphics)
It hasn’t been disproven. It’s the natural accounting for variability, in the same way that humans average 5'5" in height but many are above and below. It’s well established that the time from retinal activation to the final cognitive experience corresponds to an average of about 40 FPS with motion blur, and higher without it.
Right, most computer-rendered images don’t use motion blur. It was a significant field of study in digital rendering, born from the fact that the slower frame rates used in analog didn’t produce the same effect in digital renderings.
But all have upper limits, the highest of which are not telling the difference between, say, 200 and 400.
Agree. Especially since unless people have super-high-frame-rate monitors, they were never even seeing those higher rates anyway. For example, a complaint that rates dropped from 400 to 200 is irrelevant if the monitor maxes out at 60, 75, or 144 Hz.
I already acknowledged that I observed a performance difference? Your comments complaining about my comments are the irrelevant noise.
This isn’t really feedback, it’s an adventure in hyperbole. If people wanted to give feedback, it would be a lot more useful to just say “hey, I’ve noticed a performance decrease since the patch. Any plans to fix this?” than to try to make the problem look more urgent by claiming 200 FPS is “unplayable.”
No, they are not. It’s absolutely reasonable to call out statements like ‘60FPS is unplayable’. And attempts like yours to cancel rebuttals like that are worse than noise.
You can find people and articles all over the internet claiming “your eyes can’t see more than X (usually 30) FPS.” Motion blur is meaningless, as it is displayed in frames just like everything else is; the blur is part of the image.
We already know most of us can see well beyond that. This supposed FPS “barrier” is really just the point at which our brain processes the signals from the eyes as continuous motion instead of individual images.
Thanks. I’m glad at least some people can see how adding all this fluff just distracts from what’s actually important about an issue.
No Captain, that’s 100% wrong. Motion blur is everything. It’s what allows the brain to tie discrete images together at slower rates, because that’s how we naturally see. All motion we naturally see is blurred, due to the refractory period of each retinal cell as it changes state. It’s the very foundation of the research done to help the transition from analog to digital several decades ago.
Video game rendered images typically don’t use motion blur. They are crisp, unblurred images in rapid succession. The primary reason is the processing power needed to create the blur. That’s why we could record just about any game, pause the recording, advance frame by frame and see crisp unblurred images.
What we can see aside, there are some seriously deluded claims going around about playability.
The motion blur produced by a computer on a display is artificial. It’s a graphical effect that alters the images produced. It’s not real. Actual motion blur (better known as persistence of vision) is different from pixels on a screen.
It doesn’t matter if it’s artificial; the retina and the rest of the visual pathway don’t care. And as I said, it can be replicated by pixels, and if it is not, higher frame rates are needed.
The human nervous system, including the visual system, is my career. Specifically, I have been, and still am, intimately working with the motor, somatosensory, auditory and visual systems for 20 years. Much of the testing of the visual system had to be revamped when analog changed over to digital, hence the research I’ve mentioned a couple times.
I do like your logical and rational approach to things when you post. That’s hard to find when the emotional, knee-jerk types congregate. But one has to at least consider the background of the person they are talking to, especially if the conversation takes a deeper, more technical path.
Of course I know that people’s eyes can’t distinguish frame rates above 200 FPS. But my frame rate dropped from 400 to 200, and others may have dropped to 100, or even to 50.
Late in a game there will be many units on screen. A stable 200 would be no problem; the problem is that it often drops to 80, and with the frame spikes, anything below 130 feels laggy.
Find the problem and fix it, instead of spreading nonsense here.
In images that don’t use motion blur (e.g. most video games), the vast majority of the population can’t sense individual frames above 80 Hz. And…
But that’s immaterial. The real argument here in the thread is what some people are calling “unplayable”. 200, 100, 80, 60 are all playable no matter what it was before the patch. Even lower, like 30 Hz, is playable. Maybe ugly, but absolutely playable.
I have a 60 Hz monitor. Thus, for each Bliz game I play (that has the option), I cap the frame rate at 70 Hz. They are all perfectly playable. Diablo 2’s frame rate is locked at 60 Hz, and the frame calculations are made at 25 Hz (if that hasn’t changed since D2R’s release), and it’s perfectly playable.
Not sure why you buffoons are having this off-topic discussion about visual perception.
Currently, the game is experiencing discrete hangs, i.e. drops in FPS, every couple of seconds, which makes it unplayable. A constant 30 FPS might be playable; the spiking is the problem.
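This is exactly why average FPS is a misleading metric here: two traces can have the same average while one of them hangs. A small illustrative sketch (made-up numbers):

```python
# Two frame-time traces (in milliseconds) with identical average FPS:
# one perfectly steady, one mostly fast but with a single long hang.
steady = [10.0] * 100          # 10 ms every frame
spiky  = [5.0] * 99 + [505.0]  # 99 fast frames, then a half-second hang

def avg_fps(frame_times_ms):
    return 1000.0 * len(frame_times_ms) / sum(frame_times_ms)

def worst_frame_ms(frame_times_ms):
    return max(frame_times_ms)

print(avg_fps(steady), worst_frame_ms(steady))  # 100.0 FPS, 10 ms worst
print(avg_fps(spiky), worst_frame_ms(spiky))    # 100.0 FPS, 505 ms worst
```

Both traces read “100 FPS” on a counter, but the second one stutters hard; the worst-frame (or 1% low) number is what actually matches what the player feels.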
Now why would you insult these people like that? That solves nothing.
He did everything correctly; the others just muddied the thread, and reading it could make it seem like everyone complaining about the performance drop is deceiving themselves.