Network Prediction Sample
This sample shows how to use prediction and smoothing algorithms to compensate for network lag. This makes remotely controlled objects appear to move smoothly even when there is a significant delay in packets being delivered over the network.
Sample Overview
The Peer-to-Peer and Client/Server samples demonstrate two different network topologies. Each uses an example of a tank that the player can drive around the screen. In both samples, tank data is sent over the network every frame (60 times per second). That is a lot of data! When playing on a local network, these packets are delivered quickly enough to achieve smooth and continuous movement, but things do not work so well over the Internet. Most Internet connections do not have enough bandwidth to send data so often, and are slow enough that players will see delays and jerkiness in the movement of the tank.
This sample shows how to make the tank example from the Peer-to-Peer sample work over the Internet. It uses the NetworkSession.SimulatedLatency and NetworkSession.SimulatedPacketLoss properties to artificially emulate a typical Internet connection so that you can see the effects of lag even when testing over a fast local network. It then applies prediction and smoothing algorithms to compensate for this lag, making the tanks move smoothly even though the underlying network data is far from smooth.
Sample Controls
This sample uses the following keyboard and gamepad controls.
| Action | Keyboard control | Gamepad control |
|---|---|---|
| Create a session | A | A |
| Join a session | B | B |
| Change network latency simulation | A | A |
| Change packets per second | B | B |
| Toggle motion prediction | X | X |
| Toggle motion smoothing | Y | Y |
| Move the tank | UP ARROW, DOWN ARROW, LEFT ARROW, RIGHT ARROW | Left thumb stick |
| Aim the turret | K, O, L, ; | Right thumb stick |
| Exit the sample | ESC or ALT+F4 | BACK |
How the Sample Works
Problem #1: Bandwidth
Bandwidth refers to the amount of data that can be sent over a network connection. The XNA Framework measures bandwidth in bytes per second, but you will sometimes also see this measured in kilobits per second (Kbps). A kilobit is 1,024 bits, or 128 bytes.
If you try to send more network data than there is available bandwidth, some of that data will be discarded. If you keep sending way too much data, eventually your connection will be dropped entirely and the network session will end.
So how much is too much? That depends on how good your network connection is. As a rule, you should assume a worst case of 8 KB (64 kilobits) per second. Many people will have better connections than this, but if your game requires more than 8 KB, some people will be unable to play it.
It is surprisingly easy to use too much bandwidth. For example, the Peer-to-Peer sample sends the following data over the network.
| Data | Type | Size (bytes) |
|---|---|---|
| Position | Vector2 | 8 |
| Tank rotation | float | 4 |
| Turret rotation | float | 4 |
That may not seem like much: it is only 16 bytes in total. But things quickly add up. Every time we send a network packet, approximately 50 more bytes are used for the packet header (28 bytes for a standard UDP packet header, plus ~22 for LIVE and the XNA Framework). Including this header data, our packets are now 66 bytes. We are sending 60 packets per second, which makes 3960 bytes per second. Finally, we must send packets to every other player in the session. If there are three players in total, each of them sends packets to both the others, making a total bandwidth usage of 7920 bytes per second. We have used almost all of our 8 KB total bandwidth, with only three people in the session! The Peer-to-Peer sample is therefore not efficient enough to support even a four-player game over a typical Internet connection.
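If you want to sanity-check those numbers, a quick back-of-the-envelope calculation looks like this (the 50-byte header is the approximate figure quoted above, not a value exposed by the XNA Framework):

```csharp
using System;

class BandwidthEstimate
{
    static void Main()
    {
        // Figures quoted above; the 50-byte header is an approximation.
        const int gameDataBytes = 16;     // position + tank rotation + turret rotation
        const int headerBytes = 50;       // UDP header plus LIVE/XNA Framework overhead
        const int packetsPerSecond = 60;
        const int otherPlayers = 2;       // three players in the session

        int bytesPerPacket = gameDataBytes + headerBytes;           // 66
        int bytesPerRecipient = bytesPerPacket * packetsPerSecond;  // 3960
        int bytesPerPlayer = bytesPerRecipient * otherPlayers;      // 7920

        Console.WriteLine("Each player sends {0} bytes per second", bytesPerPlayer);
    }
}
```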
The best way to reduce bandwidth usage is to send packets less often. Rather than sending every frame, most games send packets only 10 times per second or 20 times per second, and in some cases even less. It might seem as though you could achieve a similar result by compressing your packet data, but that tends to be less useful when you think about the packet header overhead. The Peer-to-Peer sample sends 66-byte packets, which contain 16 bytes of game data and 50 bytes of header. Even if we could halve the size of our game data, that would only reduce the packets from 66 bytes to 58: not much of an improvement. Decreasing the send rate has a bigger impact because this cuts down on the number of packet headers as well as the amount of game data.
When you send packets less often, you can expect remotely-controlled objects to move less smoothly. Hence, you need smoothing algorithms to cover up the resulting jerkiness.
To see the effect of a low packet send rate without any latency or correction algorithms, press X and Y to disable prediction and smoothing, press A twice to change the "Network simulation" setting to 0 ms, and then use B to cycle through the available "Packets per second" settings. Notice how locally-controlled tanks always remain smooth. However, if you join a second computer into the session, and then drive a tank on the first computer while watching the screen of the second, this becomes increasingly jerky the less often you send packets.
Problem #2: Latency
Latency refers to the time delay between a packet being sent and when it is received. Local networks have low latencies (small enough to completely ignore), but the lag can be much higher when playing over the Internet.
Latency comes from many sources. First, there is the speed of light itself, which is 186,282 miles per second. Seattle is 2,413 miles from New York, so even if everything else were perfect, data sent from one side of the United States to the other cannot possibly arrive in less than 13 milliseconds. Also, network data usually travels down fiber optic or copper cables, in which light slows to only 60 percent of its speed in a vacuum. Finally, every piece of hardware along the way causes additional latency. A modem typically adds around 10 milliseconds (there will be two modems, one at each end), while a router (of which there could be several, depending on your ISP) adds between 5 and 50 milliseconds. As a rule, you should assume a worst case of 200 milliseconds of latency.
To see the effect of poor network latency without any correction algorithms, press X and Y to disable prediction and smoothing, press B twice to change the "Packets per second" setting to 60, and then use A to cycle through the available "Network simulation" settings. Notice how locally controlled tanks always remain smooth, but if you join a second computer into the session, then drive a tank on the first computer while watching the screen of the second, this becomes increasingly delayed and jerky the more latency you introduce. Unlike the evenly-spaced jerks caused by a low packet send rate, the effect of latency is more random, as every packet will be delayed by a different amount of time.
Solution #1: Smoothing
Although limited bandwidth (which causes you to send packets less often) and latency (which delays the delivery of packets) are different problems, their symptoms are similar. Both prevent network data from arriving in time to produce a smooth movement. Fortunately, the same solutions can be used to fix both problems.
Take this example of a game that sends packets every 0.2 seconds. If the tank drives upward, then turns to the right, following the blue line, it will send three packets at the times and positions shown in this diagram.
If latency delays each packet by 0.1 seconds, a network client will see the tank behaving as follows.
The dotted red lines represent jumps, where the tank moves instantly to the position described in the most recently received packet. Note how the motion is not only jerky, but also delayed. The tank does not move at all for the first 0.3 seconds, as it must wait first for a packet to be sent at 0.2 seconds, and another 0.1 seconds for that packet to be delivered.
Smoothing is a simple concept. When a network packet is received, rather than teleporting immediately to the new position, we can interpolate gradually from the previous position toward this new location, giving the illusion of continuous motion. With smoothing in place, our remotely controlled tank will move as follows.
The jerkiness has been removed, but this is still not perfect. The main problem is that the movement delay is even worse than before. Not only must we wait for packets to arrive, but now we must also wait while we smooth the tank gradually toward the new position, even after the network has told us what position that is.
Smoothing without prediction is simple to implement (see the Tank.UpdateRemote and Tank.ApplySmoothing methods in this sample), and may be adequate for some game objects. Still, objects controlled in this way are always going to be delayed slightly.
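To make the idea concrete, here is a minimal smoothing sketch. It is not the sample's actual Tank.ApplySmoothing code: the field names and the per-frame smoothing amount are invented for illustration, but the shape of the logic is the same, namely store the newly received state, then blend the displayed state toward it a little each frame.

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Net;

// Hypothetical smoothing sketch: instead of teleporting to each new packet,
// blend the displayed state toward the most recently received state a little
// more every frame.
class RemoteTank
{
    Vector2 displayPosition;    // what we draw
    Vector2 networkPosition;    // most recent position received over the network
    float displayRotation;
    float networkRotation;

    public void ReadNetworkPacket(PacketReader packetReader)
    {
        // Just store the new data; smoothing blends toward it over the next
        // several frames instead of jumping immediately.
        networkPosition = packetReader.ReadVector2();
        networkRotation = packetReader.ReadSingle();
    }

    public void ApplySmoothing(float amount)    // for example, 0.25f per frame
    {
        displayPosition = Vector2.Lerp(displayPosition, networkPosition, amount);

        // A real implementation would also wrap angles so the tank does not
        // spin the long way around when the rotation crosses zero.
        displayRotation = MathHelper.Lerp(displayRotation, networkRotation, amount);
    }
}
```

Larger smoothing amounts track the network data more closely but look jerkier; smaller amounts look smoother but add even more of the delay described above.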
Solution #2: Prediction
Prediction algorithms attempt to remove the delay caused by more naive smoothing implementations. Because packets do not arrive instantaneously, it is impossible to ever know exactly where an object is located, but if we know where it was a short while ago when the packet was sent, and also know which direction it was moving and whether it was turning, we can make a good guess as to how it is likely to have moved since that time.
To implement prediction, we must include more data in our network packets. If we only sent the tank position, it would be impossible to predict how that might move, so we must also send the velocity and user input controls. Using this extra data, we can make some guesses. For instance, if the first packet tells us "the tank is facing upward, and is not moving, but the user is pressing Up," we can predict it will accelerate upward along this blue line.
After 0.3 seconds, at the same time as our locally simulated tank reaches the end of the blue line, we receive a new network packet. The position in this new packet is somewhat behind where we predicted the tank should be, because the packet took 0.1 seconds to arrive, so our prediction has already moved on past the position where the packet was sent. Also, the new packet contains updated input values, telling us the player is now providing a steer input that will make them curve to the right. Using this new information, we can reset our simulation to the state described in the new packet, and predict the tank will curve to the right along the green line.
But here we have a problem. We have already moved our local tank prediction all the way to the end of the original blue line! That prediction turned out to be close, but slightly wrong. If we move the tank backward to the position described in this second packet, the user will see an ugly jerk, so we cannot directly use this new position. That tells us where the tank was at 0.2 seconds, but we are now at 0.3 seconds, and we don't want to jerk backward. Instead, we assume the tank has been moving along the green line since this packet was sent 0.1 seconds ago, so we take a point 0.1 seconds forward along the green line, which is only a small distance to the right of our original blue prediction. Rather than just teleporting the tank from our old and slightly wrong blue prediction to the newer and hopefully more accurate green prediction, we can use the same smoothing technique described earlier to gradually interpolate from the old position toward the new, covering up this minor misprediction in hopes that nobody will notice it.
This process repeats every time a new packet arrives. For instance, at 0.5 seconds, we get a packet telling us the user is now steering straight to the right.
We incorrectly predicted the tank would continue turning downward along the green line, so we update that guess with the new and more accurate orange prediction, and gradually smooth our position from the incorrect green guess toward the new orange version.
The great thing about prediction is that the object is no longer delayed. When the prediction works well, it will move smoothly and be in the right place at the right time. When things work less well, you may notice the adjustment after a prediction that turned out to be inaccurate. The easiest way to demonstrate this is to move in a straight line, then stop suddenly. The prediction assumed the tank would continue moving, but then a network packet says it has stopped, so the smoothing must move the tank slightly backward to correct for this mistaken guess. Network programming is all about compromises, and even the best prediction algorithms cannot be right 100 percent of the time!
So far we have skimmed over the details of how to actually implement a prediction algorithm. This turns out to be trivially easy for most games, because you can simply reuse your existing object movement code. You already have a method that updates the movement of your locally controlled objects. What better way to predict how a remote object is likely to move than by running the exact same code that is controlling that movement in the first place? This occurs in two places:
- When a new network packet is received, Tank.ReadNetworkPacket calls ApplyPrediction. This calls the UpdateState helper method as many times as necessary to compensate for the packet delivery latency. In order to know how much compensation to apply, ApplyPrediction must estimate how long each packet took to arrive. The XNA Framework provides an estimated average network latency via the NetworkGamer.RoundtripTime property, but not all packets will take exactly this average amount of time to arrive. To handle varying latencies per packet, we include the send time as part of our packet data. Unfortunately, there is no guarantee that the clock will be set the same on every computer! The sender tells us the clock time when the packet was sent, but this is meaningless unless our local clock is in sync with the sender's clock. To compensate for any clock skew, we maintain a rolling average of the difference between the local receive time and the remote send time over the last 100 incoming packets. If this average difference is, say, 50 milliseconds, but one specific packet arrives with a difference of 70 milliseconds, we can deduce that this particular packet was delivered 20 milliseconds later than usual (see the sketch after this list).
- Tank.UpdateRemote also calls the UpdateState helper, keeping the tank moving smoothly along its predicted path during the gaps between one packet and the next.
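Here is a rough sketch of how those two pieces fit together. It is not the sample's exact ApplyPrediction code: the names are invented, the packet is trimmed down to a few fields, and a simple exponential average stands in for the 100-packet rolling average described above, but it shows the essential steps of estimating per-packet latency and then rerunning the movement code to catch up.

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Net;

// Illustrative prediction sketch: when a packet arrives, estimate how long it
// spent in transit, then rerun the normal tank movement update once per frame
// of estimated delay so the remote tank catches up to "now".
class PredictedTank
{
    Vector2 position, velocity, tankInput;
    float averageClockDelta;               // rolling average of (localTime - remoteSendTime)
    const float FrameLength = 1f / 60f;

    public void ReadNetworkPacket(PacketReader packetReader,
                                  NetworkGamer sender, float localTime)
    {
        // Rotation and turret fields are omitted from this sketch.
        float remoteSendTime = packetReader.ReadSingle();
        position = packetReader.ReadVector2();
        velocity = packetReader.ReadVector2();
        tankInput = packetReader.ReadVector2();

        // The sender's clock is not synchronized with ours, so the raw
        // difference is meaningless by itself. Averaging it over many packets
        // gives a baseline: packets above the baseline arrived later than
        // usual, packets below it arrived sooner.
        float clockDelta = localTime - remoteSendTime;
        averageClockDelta = MathHelper.Lerp(averageClockDelta, clockDelta, 0.05f);

        float oneWayTrip = (float)sender.RoundtripTime.TotalSeconds / 2;
        float latency = oneWayTrip + (clockDelta - averageClockDelta);

        // Rerun the movement simulation to cover the time the packet spent
        // in transit.
        for (float time = 0; time < latency; time += FrameLength)
        {
            UpdateState(FrameLength);
        }
    }

    void UpdateState(float elapsed)
    {
        // The same style of movement code used for locally controlled tanks:
        // accelerate toward the current input, then integrate the velocity.
        const float tankSpeed = 200;    // illustrative tuning value
        velocity = Vector2.Lerp(velocity, tankInput * tankSpeed, 0.1f);
        position += velocity * elapsed;
    }
}
```

Between packets, the same UpdateState logic keeps the tank moving along its predicted path (as the second bullet above describes), and the smoothing technique from the previous section covers up the small correction whenever a new packet proves the prediction slightly wrong.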
One of the many paradoxes of network programming is that prediction algorithms can reduce our total bandwidth usage, even though they require sending more data over the network. Remember how the Peer-to-Peer sample sent only the position and rotation (a total of 16 bytes), but had to do this 60 times a second, making a total of 3,960 bytes per player per second? To implement prediction, we must also send time, velocity, and controller input data. However, thanks to the prediction, we can now get away with sending packets less often. Look how the math works out.
| Data | Type | Size (bytes) |
|---|---|---|
| Packet send time | float | 4 |
| Position | Vector2 | 8 |
| Velocity | Vector2 | 8 |
| Tank rotation | float | 4 |
| Turret rotation | float | 4 |
| Tank input | Vector2 | 8 |
| Turret input | Vector2 | 8 |
That is 44 bytes of game data. Adding the 50-byte packet header and multiplying by 10 packets per second gives 940 bytes per second for each other player in the session, so we can now support up to nine players within our 8 KB total bandwidth.
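For reference, writing those 44 bytes with PacketWriter might look something like this (a sketch with illustrative field names, not the sample's exact code):

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Net;

// Sketch of writing the 44 bytes of game data listed in the table above.
// PacketWriter derives from BinaryWriter and provides Vector2 overloads.
class TankState
{
    public Vector2 Position, Velocity, TankInput, TurretInput;
    public float TankRotation, TurretRotation;

    public void WritePacket(PacketWriter packetWriter, float currentTime)
    {
        packetWriter.Write(currentTime);      // 4 bytes: packet send time
        packetWriter.Write(Position);         // 8 bytes
        packetWriter.Write(Velocity);         // 8 bytes
        packetWriter.Write(TankRotation);     // 4 bytes
        packetWriter.Write(TurretRotation);   // 4 bytes
        packetWriter.Write(TankInput);        // 8 bytes
        packetWriter.Write(TurretInput);      // 8 bytes, for 44 bytes in total
    }
}
```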
Extending the Sample
You can further reduce bandwidth usage by using smaller data types to transmit the tank state. For example, you could convert the position and velocity to HalfVector2 format, the input vectors to NormalizedByte2, and then scale the rotations to fit into a single byte. This reduces the game data to only 18 bytes, which enables you to fit 13 players into your total 8 KB.
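As a sketch of that smaller encoding (the class and helper names are invented; HalfVector2 and NormalizedByte2 come from Microsoft.Xna.Framework.Graphics.PackedVector):

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Graphics.PackedVector;
using Microsoft.Xna.Framework.Net;

// Sketch of the smaller 18-byte encoding: HalfVector2 packs a Vector2 into
// 4 bytes, NormalizedByte2 packs a [-1, 1] input vector into 2 bytes, and a
// rotation scaled into [0, 255] fits in a single byte.
class CompactTankState
{
    public Vector2 Position, Velocity, TankInput, TurretInput;
    public float TankRotation, TurretRotation;    // both in the range [0, 2*pi)

    public void WritePacket(PacketWriter packetWriter, float currentTime)
    {
        packetWriter.Write(currentTime);                                   // 4 bytes
        packetWriter.Write(new HalfVector2(Position).PackedValue);         // 4 bytes
        packetWriter.Write(new HalfVector2(Velocity).PackedValue);         // 4 bytes
        packetWriter.Write(PackAngle(TankRotation));                       // 1 byte
        packetWriter.Write(PackAngle(TurretRotation));                     // 1 byte
        packetWriter.Write(new NormalizedByte2(TankInput).PackedValue);    // 2 bytes
        packetWriter.Write(new NormalizedByte2(TurretInput).PackedValue);  // 2 bytes
    }                                                                      // 18 bytes total

    static byte PackAngle(float angle)
    {
        // Scale [0, 2*pi) down to a single byte; the reader reverses this.
        return (byte)(angle / MathHelper.TwoPi * 255);
    }
}
```

The reading side must of course apply the reverse conversions (ToVector2 on the packed types, and rescaling the byte back to radians), trading a little precision for the smaller packet size.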