Post by exaltedvanguard on Dec 8, 2019 11:41:17 GMT -5
I'm curious about how netcode handles variation in ping. I'm posting this in the CoD section for visibility, but it probably applies to most modern shooters.
For simplicity, let's assume the following:
Game states are sent to and from the server every 16ms (62.5Hz). We will ignore computation time and assume the server/client does not vary from this 16ms tickrate.
Our ping starts at a rock-solid 32ms, with symmetric 16ms send and receive times.
There is zero packet loss.
So in our starting scenario, pulling the trigger will fire a shot. The packet will take 16ms to reach the server, we'll have a buffer delay of 1 tick (another 16ms), and the hit confirm will be sent by the server to the client, adding another 16ms of delay. Total time from fire to hit marker is 48ms.
(BTW, this scenario almost exactly matches Battle(non)sense's testing.)
This scenario repeats 80ms later when our full-auto gun fires another shot at 750rpm.
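To make those numbers concrete, here's my own back-of-the-envelope math for that baseline scenario (just the assumptions above, not anything pulled from actual netcode):

```python
# Back-of-the-envelope check of the timing above (my own numbers, not from any real netcode).
TICK_MS = 16          # server tick interval (62.5Hz)
SEND_MS = 16          # client -> server one-way latency
RECV_MS = 16          # server -> client one-way latency
BUFFER_TICKS = 1      # assumed 1-tick buffer delay on the server

fire_to_hitmarker_ms = SEND_MS + BUFFER_TICKS * TICK_MS + RECV_MS
print(fire_to_hitmarker_ms)       # 48 -> matches the 48ms fire-to-hit-marker above

shot_interval_ms = 60_000 / 750   # 750rpm full auto
print(shot_interval_ms)           # 80.0 -> one shot every 80ms
```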
Now here's where it gets fuzzy for me and I'm hoping someone can clear things up.
What if the player suddenly experiences a very small, transient increase in latency? No packet loss and no packet reordering.
Our first shot fires, but now takes 90ms to arrive at the server. Our second shot fires 80ms after the first and that packet is sent, arriving 16ms after it's fired, because for whatever reason our connection returned to normal. That means our first shot and second shot arrive at the server at 90ms and 96ms, respectively. We are assuming that any other packets sent by the client between the shots also arrive, in order, between the two shots.
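Just to lay that timeline out explicitly (made-up numbers matching the scenario above):

```python
# Toy timeline for the jitter case above: shot 1 hits a 90ms spike,
# shot 2 goes out 80ms later over a link that's back to a 16ms one-way trip.
shots = [
    {"fired_ms": 0,  "one_way_ms": 90},   # transient latency spike
    {"fired_ms": 80, "one_way_ms": 16},   # connection back to normal
]

arrivals = [s["fired_ms"] + s["one_way_ms"] for s in shots]
print(arrivals)                   # [90, 96]
print(arrivals[1] - arrivals[0])  # 6 -> the server sees both shots 6ms apart
```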
What happens? The server just received 2 shots 6ms apart.
Presumably, if there's consistent jitter on a connection, you could use an input buffer, but in this case there is none.
Does it honor both shots immediately? In that case being on the receiving end means getting instamelted...
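For what it's worth, here's roughly how I'd picture that kind of input buffer working if a game did use one. This is pure speculation on my part, not how any particular title actually does it: commands queue up as they arrive, the server drains at most one per tick, and a burst like the 90ms/96ms arrivals above gets spread back out over two ticks instead of resolving in the same one.

```python
from collections import deque

# Hypothetical per-player input buffer (entirely my own sketch, not any real game's code):
# packets queue up as they arrive, and the server consumes at most one command per tick,
# so a jitter burst gets smoothed back out instead of being resolved all at once.

class InputBuffer:
    def __init__(self):
        self.pending = deque()

    def on_packet(self, command):
        # Called whenever a client packet arrives, in whatever clumps the network delivers.
        self.pending.append(command)

    def pop_for_tick(self):
        # Called once per 16ms server tick; returns None if nothing is queued.
        return self.pending.popleft() if self.pending else None


buf = InputBuffer()
buf.on_packet("shot 1 (arrived at 90ms)")
buf.on_packet("shot 2 (arrived at 96ms)")     # both land within the same 16ms tick

for tick_ms in (96, 112):                     # the next two server ticks
    print(tick_ms, buf.pop_for_tick())
# The 96ms tick processes shot 1 and the 112ms tick processes shot 2,
# so the two shots resolve one tick apart again instead of 6ms apart.
```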