Why are we still not at 64 tick?

It’s really frustrating when I get at least 10 no-regs with Widow/Hanzo in a single game, and that’s with 30ms ping.

2 Likes

Ping doesn’t have much to do with network ticks; your frame rate does.

If you’re talking about hitting stuff on your end but not damaging your enemies as Hanzo, that’s a projectile prediction bug. For Widow, if you hit, it will always register.

3 Likes

Your framerate has nothing to do with network ticks. That’s why they’re called network ticks – everything is being handled through Blizzard’s first-party dedicated server.

4 Likes

It might not be on Blizzard’s side. There are more factors to take into consideration when it comes to your own network. Have you noticed any of the orange symbols on the left side of your screen? And are you using WiFi?

I think they meant that it didn’t seem right for them to be getting no-regs at 30ms ping. A lot of people, myself included, usually play with higher ping.
Neither ping nor frame rate affects the tick rate they’re talking about. It’s purely on Blizzard’s side. I know their servers were running at a lower tick rate before, but I thought they had updated it to what’s considered standard… at least from my understanding. Please correct me if I’m wrong.

Just click the head, silly.

Your common sense should tell you that if you are playing at 24 frames per second, you can’t send at 60 frames per second.

You have no idea what you are talking about. At least server engineers have enough brains to not flood a player with 60 frames of data when he can’t even manage 24. Networking tick rates have everything to do with frame rate.

Blizzard uses 60 for people with decent computers that can steadily support 60 fps and lowers it if required.

Also, no, Blizzard rents their servers.

Common sense has absolutely nothing to do with this. You’ve made some stupidly incorrect assumptions. Having worked with Unity and Unreal Engine for the last three years of my university degree, I have 100% confidence that you’re wrong.

You sweet summer child. “60 frames of data” isn’t that much data for a video game. You seem to think that game data is something like an HD video stream off of Twitch or Netflix.

It isn’t. The data being transmitted is NOT a bunch of still images compressed into a video stream.

When you’re running a game executable, your computer already has local models, particle systems, and timers. All of them are given instructions BY THE SERVER to display things on screen in a particular order and fashion.

The “frames of data” you’re talking about are LITERALLY just numbers and instructions: world locations as XYZ coordinates, forward vectors as floating-point values, and event/input triggers as booleans (literal true-or-false statements). All of these instructions are just individual number values that tell YOUR CLIENT to play a local animation file packed in the game files on your computer, to play a particle system on mouse button press which is packed in the game files on your computer, or to play a HUD animation that changes the value of your healthbar which is packed in the game files on your computer.

All of these numbers and instructions add up to well under a megabyte per second. 60 frames of data is literally just numbers (rough math in the sketch after this list)…

  • Each player has an aiming vector, an XYZ world position, a gravity impulse, health, ammo, and 3+ ability timers. That’s maybe 8-ish numbers. 11 other players… that’s 88 number values per tick.
    • Going from 30 tick to 60 tick puts it from 2640 number values to 5280 number values.
  • Your CPU crunches these numbers in no time at all; a modern Intel i7 (or AMD equivalent) can execute hundreds of billions of operations per second, so a few thousand values per tick is a rounding error.
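To put rough numbers on that, here’s a back-of-the-envelope sketch. The 4 bytes per value is my own assumption (real netcode bit-packs and delta-compresses its state, so actual traffic is smaller still), but even this naive estimate comes out to a couple dozen kilobytes per second:

```python
# Back-of-the-envelope estimate of per-tick snapshot size.
# Assumes ~8 values per player and 4 bytes per value -- both are rough
# assumptions; real netcode bit-packs and delta-compresses its state.

VALUES_PER_PLAYER = 8      # aim vector, XYZ position, gravity impulse, health, ammo, timers
OTHER_PLAYERS = 11
BYTES_PER_VALUE = 4        # a single-precision float or a 32-bit int

values_per_tick = VALUES_PER_PLAYER * OTHER_PLAYERS    # 88
bytes_per_tick = values_per_tick * BYTES_PER_VALUE     # 352 bytes

for tick_rate in (30, 60):
    print(f"{tick_rate} tick: {values_per_tick * tick_rate} values/s, "
          f"{bytes_per_tick * tick_rate / 1024:.1f} KiB/s")
# 30 tick: 2640 values/s, 10.3 KiB/s
# 60 tick: 5280 values/s, 20.6 KiB/s -- nowhere near a megabyte per second
```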

The bottleneck here has nothing to do with the incoming data from the server. The bottleneck is solely how fast your graphics card can push things out. And whether or not your graphics card happens to display something has nothing to do with whether or not that something happened at a specific time.

Game developers haven’t tied crucial multiplayer actions to framerate for DECADES.

Your framerate and your inputs are completely independent of each other. Full stop. I can prove this.


Here’s a live recording of 30 fps gameplay. I played this a few minutes ago at 30 fps; you can confirm it with the FPS counter in the top left.

  • https://gfycat.com/DiligentQuerulousArizonaalligatorlizard

Here’s the highlight capture of that exact moment at 60 fps

  • https://gfycat.com/IdealisticMagnificentBunny

Highlights are server replays. That means the information your game client sent to the server is being replayed in the order it was received from you, adjusted to the server’s tickrate.

Here is the same clip again, slowed down frame by frame, synced up by the sniper shot that I fired at the spawnroom drone

  • https://gfycat.com/BlankWindyGermanspaniel

The 30 fps footage has fewer frames, as expected. The 60 fps footage has movements in the in-betweens of the 30 fps footage. These are ARCING movements.

I’m PLAYING at 30 fps, but the SERVER is showing ARCS at 60 fps.

There’s absolutely ZERO way that you could have arcing movement in the 60 fps footage, unless the server received your inputs from your client at a rate of 60 tick.

The server cannot auto-generate arcs by interpolating frames. The most that the server could do is auto-generate LINEAR movement… it’s called linear interpolation.

If they were auto-generated by the server’s linear interpolation, they’d be robotically straight, linear movements (not arcs, like you can see in the video). You can’t Bézier-smooth interpolated motion, so this MUST be mouse inputs transmitted to the server.
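If you want to see why a lerp can’t give you those in-between arc points, here’s a quick sketch with a made-up arcing aim path (nothing from the actual game): the linearly interpolated points sit on straight chords between the 30 Hz samples, slightly off the curve that genuine 60 Hz samples trace.

```python
import math

# Toy sketch: sample a made-up arcing aim path at 60 Hz and at 30 Hz, then try
# to rebuild the missing 60 Hz frames from the 30 Hz samples with linear
# interpolation (lerp). The lerped points land on straight chords between the
# 30 Hz samples, not on the arc, so they can't match a real 60 Hz capture.

def aim_path(t):
    """Hypothetical arcing mouse movement (a fast flick tracing part of a circle)."""
    angle = 10.0 * t                      # quick flick so the curvature is visible
    return (math.cos(angle), math.sin(angle))

def lerp(a, b, f):
    return (a[0] + (b[0] - a[0]) * f, a[1] + (b[1] - a[1]) * f)

dt = 1 / 60
for i in (1, 3, 5):                       # 60 Hz frames that fall between 30 Hz samples
    true_frame = aim_path(i * dt)         # what a genuine 60 Hz capture contains
    prev_sample = aim_path((i - 1) * dt)  # the surrounding 30 Hz samples
    next_sample = aim_path((i + 1) * dt)
    guessed = lerp(prev_sample, next_sample, 0.5)     # best a pure lerp can do
    print(f"frame {i}: lerp misses the arc by {math.dist(true_frame, guessed):.4f}")
```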


Regardless of what your framerate is set to, you are sending your aiming coordinates (and by extension, all other inputs) at 60 tick… you are also receiving 60 ticks’ worth of information. If you’re at 30 fps, you’re only seeing 30 “frames’ worth of data”, but rest assured the other 30 “frames’ worth of data” were processed; they just couldn’t be shown visually because your graphics card didn’t draw them.

At 24 fps, your reaction time is vastly impaired, but the data isn’t being restricted. It’s still continuously flowing.
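Here’s a tiny sketch of that decoupling. It’s the generic pattern, not Overwatch’s actual loop: the network side samples and sends input on its own fixed 60 Hz timer while the render loop draws at whatever rate the GPU manages.

```python
import threading
import time

# Toy sketch of the standard decoupling pattern: the network loop sends input
# state at a fixed 60 Hz on its own timer, while the render loop draws at
# whatever rate the GPU can manage. Throttling the render rate doesn't touch
# the send rate. This is NOT Overwatch's actual code, just the general idea.

SEND_RATE = 60        # packets per second, fixed
RENDER_RATE = 30      # pretend the GPU can only manage 30 fps

sent_packets = 0
drawn_frames = 0
running = True

def network_loop():
    global sent_packets
    while running:
        sent_packets += 1            # sample the current input state and "send" it
        time.sleep(1 / SEND_RATE)

def render_loop():
    global drawn_frames
    while running:
        drawn_frames += 1            # "draw" whatever the latest state is
        time.sleep(1 / RENDER_RATE)

threading.Thread(target=network_loop, daemon=True).start()
threading.Thread(target=render_loop, daemon=True).start()
time.sleep(2)
running = False
print(f"in 2s: sent ~{sent_packets} packets, drew ~{drawn_frames} frames")
# roughly twice as many packets sent as frames drawn
```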

No, they don’t.

They explicitly said in blueposts and a developer update that they only adaptively lower your send/receive rates if, and only if, you’re experiencing network interruptions. And as soon as those interruptions stop happening, it goes right back up to the maximum.

If your network connection is rock solid (which mine was in the video), then the only way to limit those in/outbound rates is to manually, intentionally limit them by enabling the “Limit Client/Server Send Rate” settings in the options.

Plenty of companies rent servers from Amazon. That doesn’t mean those servers aren’t dedicated servers, or that Blizzard doesn’t own the content running on them.

4 Likes

Damn dude, I can’t believe you took the time to do it to them like this.

Good read, though. I learned some cool stuff. You should teach.

3 Likes

I used to. Was an unofficial TA for my major’s 3D authoring classes for 2 years.

3 Likes

It’s funny how you wrote all that only to be wrong. It’s funny how confidence in past experience, without knowing how netcode differs between environments, makes people get defensive like this.

One simple test: set your frame rate to 30 fps, press Ctrl + Shift + N, and your send rate immediately drops to 30. And yes, if you think 60 frames of data per second isn’t a lot, you’re simply generalizing too much. There are tons and tons of games that can’t run a full 60 fps send rate because it simply isn’t possible, especially battle royales.

Having worked in Unity and Unreal Engine for the last three years has obviously narrowed your view to your own experience alone. Every game does its netcode differently.

Edit: Also, I’m sorry if I misunderstood you, but you mentioned “first-party” in terms of servers, so I imagined you meant that Blizzard owns its own servers.

Edit 2: Even though you’re still wrong in the end, it’s nice to see someone who at least shows a decent understanding of netcode before trying to correct someone for the benefit of others.

Learn to lose gracefully, kid.

Judging from your post count, I hope you’re not the same guy who wrote the long post.

Well then, you’d be sorely mistaken, considering I just played a game at 30 FPS with “PPS OUT” fluctuating randomly between 30 and 60.

Here it is at 49. Top left FPS counter, 29.

https://imgur.com/xWuS2fL

Equally important to note, you’re still RECEIVING 60 tick information, which is at ~64 PPS IN.

I don’t know what to tell you here, other than there’s a possibility that different servers are rated for different tickrate loads and priorities… or the similar possibility that the client misreports things, based on my arcing footage above.

The data capacity is not the problem in a Battle Royale game.

The time it takes for the server to resolve arguments is the problem. The amount of data in a Battle Royale with 100-ish players means the server has to reassemble packets from 10x the normal player count… it has to parse through 100 different clients on varying network conditions and latencies, and then streamline them into one single coherent timeline.

And that’s a problem because packets can be received from various clients out of order. It’s like a pizza delivery route, where depending on traffic conditions, pizza delivery drivers may opt for different open roads to reach their destination.

While packets ARE sent from clients in sequential fashion, unless you opt for a very specific networking model, packets are NOT received in the same sequential order.

Packets take various routes through the network to reach the server. Packet #101 can be received AFTER Packet #102, and if Packet #101 carries the event information that says a player fired while the server is already receiving Packet #103+, the server has an argument, and it has to backdate the firing event in its official timeline.

When you’re dealing with 100+ clients doing that, it’s not the data load or the amount of information. It’s the stress that comes from resolving 100+ different arguments at once.
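To make that backdating concrete, here’s a toy sketch (my own illustration, nothing to do with Blizzard’s actual code) of a server slotting a late-arriving packet into its timeline by sequence number:

```python
import bisect

# Minimal illustration of reordering by sequence number: packets arrive out of
# order, and a late packet carrying a "fire" event gets slotted back into the
# timeline ahead of packets that were received earlier but sequenced later.

timeline = []   # list of (sequence_number, event), kept sorted by sequence

def receive(seq, event):
    bisect.insort(timeline, (seq, event))   # backdate into the correct slot

# Packets arrive in the order 100, 102, 103, then the delayed 101:
receive(100, "move")
receive(102, "move")
receive(103, "move")
receive(101, "fire")    # arrived last, but belongs between 100 and 102

print(timeline)
# [(100, 'move'), (101, 'fire'), (102, 'move'), (103, 'move')]
```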

No. That isn’t me.

That just proves that framerate does affect your PPS. I don’t know why you want to defend it so much. You just have to go into a simple training game, look at the PPS, limit your FPS to 30, and you get the cold hard evidence. Still, I was wrong about Blizzard not stepping the send rate down for weaker computers. The 49 PPS you’re seeing is a moving-window average, not the actual PPS, because of the fluctuation in frame rate.
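For anyone wondering what a moving-window average means here: the counter reports how many packets were sent over roughly the last second, so a send rate that flip-flops between ~30 and ~64 shows up as a number in between. A toy illustration (my own, not how the in-game netgraph is actually implemented):

```python
from collections import deque

# Toy moving-window PPS counter: stamp each sent packet and report how many
# fall inside the last 1-second window. If the real send rate flip-flops
# between ~64 and ~30 packets per second as the frame rate fluctuates, the
# displayed number settles somewhere in between.

WINDOW = 1.0                      # seconds
timestamps = deque()

def record_send(now):
    timestamps.append(now)
    while timestamps and now - timestamps[0] > WINDOW:
        timestamps.popleft()
    return len(timestamps)        # what a "PPS OUT" style counter would display

# Simulate half a second at ~64 pps, then half a second at ~30 pps:
t = 0.0
for _ in range(32):
    t += 1 / 64
    pps = record_send(t)
for _ in range(15):
    t += 1 / 30
    pps = record_send(t)
print(f"displayed PPS over the last second: {pps}")   # ~47, not 30 or 64
```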

Also, I have first-hand experience implementing a game server, so don’t assume I’m speaking from speculation; I’m just too lazy to bother elaborating on the specifics. Though I do appreciate the long posts that teach other people what you’re talking about (there are still other points that are wrong, but I won’t nitpick).

Packet sequencing actually isn’t that much of a problem; any decent netcode implements a buffer window that delays processing and sending the info it receives (which is what Overwatch does on both the client and the server side). The real problem for battle royale games is streamlining the data so each player only gets the top-priority information, because of the insane amount of data those kinds of games require, which forces the server to sacrifice simulation speed. Again, 60 fps of data isn’t as cheap as you think. I have no idea what kind of games you’ve made in your three years to think that.
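To be clear about what I mean by a buffer window: incoming packets are held for a short, fixed delay and kept sorted by sequence number, so late arrivals inside the window slot in cleanly before anything gets processed. A rough sketch with made-up numbers (not Overwatch’s actual values):

```python
import heapq

# Rough sketch of a jitter/buffer window: incoming packets are held for a
# short fixed delay, ordered by sequence number, and only processed once they
# are older than the window. Late arrivals inside the window slot in cleanly.
# Numbers and structure are made up for illustration.

BUFFER_WINDOW = 0.050   # hold packets for 50 ms before processing (made-up value)

buffer = []             # min-heap of (sequence, arrival_time, payload)

def on_packet(seq, arrival_time, payload):
    heapq.heappush(buffer, (seq, arrival_time, payload))

def process_ready(now):
    """Pop and process every buffered packet older than the window, in sequence order."""
    while buffer and now - buffer[0][1] >= BUFFER_WINDOW:
        seq, _, payload = heapq.heappop(buffer)
        print(f"processing seq {seq}: {payload}")

# Packets 7 and 9 arrive, then 8 arrives 20 ms late -- still inside the window.
on_packet(7, 0.000, "move")
on_packet(9, 0.010, "move")
on_packet(8, 0.020, "fire")
process_ready(now=0.100)    # processes 7, 8, 9 in sequence order
```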

Then again, if you really have to order the data and simulate the game with each packet accordingly, you’re dealing with games where determinism is a high priority. Overwatch simply doesn’t require that much, and battle royale games certainly can’t handle that level of processing.