Shawn Hargreaves Blog
"The time has come", the Walrus said, "to talk of networking"...
The network game programmer has three mortal enemies:
- Latency
- Packet loss
- Bandwidth
I shall write about all three, in that order.
Latency is caused by physics. Decades of science fiction notwithstanding, physicists have yet to figure out how to surpass the speed of light. Because of their failure (the fools! I mean really, how hard can this be?), we are unable to send any data faster than 186282 miles per second.
That's still pretty fast, though, right?
I live in Seattle. My colleague Eli used to live in New York. That is 2413 miles away, which is 13 milliseconds at the speed of light.
I used to live in England. From Seattle to England is 4799 miles, which is 26 milliseconds.
In a 60 frames per second game, each frame gets 16 milliseconds. So I already have nearly two frames of lag when I play with my friends in England.
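To make the arithmetic above concrete, here is a small Python sketch using the distances quoted in this post, converting a one-way distance into light-speed latency and 60 fps frames:

```python
# Convert a one-way distance into light-speed latency and 60 fps frames.
# Distances (in miles) are the ones quoted in the post.
SPEED_OF_LIGHT_MPS = 186282   # miles per second
FRAME_MS = 1000 / 60          # ~16.7 ms per frame at 60 fps

def light_latency_ms(miles):
    """One-way latency in milliseconds at the speed of light."""
    return miles / SPEED_OF_LIGHT_MPS * 1000

for place, miles in [("New York", 2413), ("England", 4799)]:
    ms = light_latency_ms(miles)
    print(f"{place}: {ms:.0f} ms = {ms / FRAME_MS:.1f} frames")
```

Note this is the best case: real packets travel through fiber, routers, and less-than-straight routes, so actual round trips are considerably worse than the raw speed-of-light figure.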
But wait! This is not the whole story...
How bad can it get?
How can you try this at home?
NetworkSession.SimulatedLatency = TimeSpan.FromMilliseconds(200);
What can you do about it?
Talking about Wikipedia:
So, if each frame gets 16 milliseconds in a 60 fps game, and Xbox games are expected to work with latencies up to 200 milliseconds, that means that on average a "jerk" could happen every 12-13 frames, which in turn is roughly the number of frames per second the human eye can perceive.
Taking 200 ms as the worst-case scenario, we would be rendering 12 to 13 frames based on our predictions, assuming no changes in the opponents' behavior.
Thus, the question is how many of those 12-13 frames we should use to smoothly interpolate from the wrong predicted values to the correct data.
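The smoothing the comment above asks about is commonly done by blending from the predicted state toward the authoritative one over several frames instead of snapping instantly. A minimal Python sketch of that idea (the function name and the 6-frame correction window are my assumptions, not anything from the post):

```python
def smooth_correction(predicted, correct, frames):
    """Yield one position per frame, linearly interpolating from the
    predicted (wrong) value to the correct value over `frames` frames."""
    for i in range(1, frames + 1):
        t = i / frames                       # blend factor, 0 -> 1 over the window
        yield predicted + (correct - predicted) * t

# Example: we predicted an opponent at x=100, but the network says x=112.
# Spread the 12-unit error over 6 of the 12-13 frames of slack.
positions = list(smooth_correction(100.0, 112.0, 6))
# The final frame lands exactly on the authoritative value.
```

Using fewer frames corrects errors faster but looks jerkier; using more frames looks smoother but leaves the object visibly wrong for longer, which is exactly the trade-off the comment is asking about.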
This is a very interesting topic, so I cannot wait to get that sample!
Could you please tell me where to find the sample?