From the TF2 page:
The Source engine runs internal simulation at a tick interval of 15 msec (66.67 ticks per second). Each client "usercmd" is a 15 msec timeslice (which makes it possible for prediction to work, since the same inputs on both the client and server should result in the same outputs in most cases). The cl_cmdrate ConVar determines how many physical packets per second the client will try to send to the server. Note that this is decoupled from the tick interval. If the cl_cmdrate setting is low, or the client's actual framerate is low, then a single physical packet from the client to the server can contain multiple "usercmd" payloads. Conversely, if cl_cmdrate is higher than 66.67, some physical packets will be sent to the server without any "usercmds" in them. Furthermore, if the client sets a low "rate" setting, fewer physical packets may be sent to the server. The frequency of cl_cmdrate updates generally doesn't impact a player's ability to hit opponents, since lag compensation factors in the player's latency to the server and interpolation amount when checking whether shots would have hit opponents.
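The packing behavior above can be illustrated with a small back-of-the-envelope calculation (a hypothetical sketch, not engine code; the function name is made up):

```python
TICK_INTERVAL = 0.015  # one simulation tick / usercmd timeslice, in seconds

def avg_usercmds_per_packet(cl_cmdrate):
    """Average number of 15 msec usercmds generated per outgoing client packet."""
    packet_interval = 1.0 / cl_cmdrate
    return packet_interval / TICK_INTERVAL

# At a low cmdrate, each packet carries several usercmds; above ~66.67
# packets per second, the average drops below 1 and some packets are empty.
print(avg_usercmds_per_packet(20))   # several usercmds per packet (~3.33)
print(avg_usercmds_per_packet(100))  # below 1: some packets carry no usercmd
```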
From the server side, the client's cl_updaterate setting determines how often the server will attempt to send a physical packet to the client. The basic formula is:
next packet time = current time + max( 1.0/cl_updaterate, bytes sent/rate setting )
Note: "bytes sent" includes the UDP packet header overhead of 28 bytes. In other words, if the player is requesting an updaterate of 20 packets per second, then the minimum time interval between physical packets is 50 milliseconds. However, if the player has a rate setting of 10000 and we just sent the player a 1000-byte packet, then the minimum time will be 1000/10000, or 100 milliseconds, instead. If 1.0/cl_updaterate has elapsed and the server checks the "rate" part of the above equation and finds that it cannot yet send a packet, then the "choke" counter is incremented for the player. All this means is that the combination of rate, cl_updaterate, and physical packet size has forced the server to defer sending the player another packet. Thus, artificially setting cl_updaterate to a high number will usually increase "choke", especially in a busy scene, since the server is always trying to obey the user's specified rate setting. Choke is not necessarily a negative indicator; it could just mean that the cl_updaterate setting is too high.
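The formula and its worked example can be sketched as follows (hypothetical helper names; not Valve's actual implementation):

```python
UDP_HEADER_BYTES = 28  # per-packet overhead counted against the rate setting

def next_packet_time(now, cl_updaterate, rate, payload_bytes):
    """Earliest time the server may send this client another packet."""
    # "bytes sent" includes the 28-byte UDP packet header overhead.
    bytes_sent = payload_bytes + UDP_HEADER_BYTES
    return now + max(1.0 / cl_updaterate, bytes_sent / rate)

# updaterate 20 alone gives a 50 msec floor, but a 1000-byte packet at
# rate 10000 stretches the interval to 1000/10000 = 100 msec (choke territory).
print(next_packet_time(0.0, 20, 10000, 972))  # 0.1  -> rate-limited
print(next_packet_time(0.0, 20, 10000, 100))  # 0.05 -> updaterate-limited
```

When the rate term wins, as in the first call, the server defers the packet and increments the player's "choke" counter.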
The cl_updaterate, cl_interp_ratio, and cl_interp ConVars control interpolation (and lag compensation) in the following relationship. By default, Source games are tuned for an updaterate of 20 packets per second. This leads to an expected delta between packets of 50 msecs. Because packet loss can occur, the interpolator was tuned to allow for a single dropped packet without any hitch in the smoothness of motion perceived by the client. Thus, 100 milliseconds was chosen as the default for cl_interp (0.1 s = 2 x ( 1.0f / cl_updaterate default of 20 ) ). cl_interp_ratio defines the lower bound on the actual interpolation amount used on the client. Specifically, the interpolation amount is:
min( max( cl_interp, cl_interp_ratio / cl_updaterate ), 0.25f )
Note: Server operators can clamp the allowable cl_interp_ratio and cl_updaterate settings, and the clamped values are factored into the above calculations. The Source netgraph now includes a "lerp" indicator which shows the actual interpolation amount (usually 100 msec unless the server or user is forcing non-default settings). The indicator will turn yellow if the server's framerate (actual framerate on the remote machine) drops below this interval. This can be used to figure out why all of the objects in the world are no longer moving smoothly. In addition, the indicator will turn orange if the user or server has tuned the ConVars such that the interpolation amount is less than 2 / updaterate. This indicates that if there is any packet loss (or possibly choke, if the choke is occurring for long periods of time due to large packets being sent over a low bandwidth rate setting) the player will likely see sluggishness in the game.
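The relationship above can be written out as a small function (an illustrative sketch; the function name is made up, and server-side clamping of the inputs is assumed to happen before the call):

```python
def interpolation_amount(cl_interp, cl_interp_ratio, cl_updaterate):
    """Effective client-side interpolation delay in seconds, capped at 0.25 s."""
    return min(max(cl_interp, cl_interp_ratio / cl_updaterate), 0.25)

print(interpolation_amount(0.1, 2.0, 20))  # 0.1: the stock defaults
print(interpolation_amount(0.0, 1.0, 66))  # ~0.015: below 2/updaterate, so any
                                           # packet loss or choke risks hitching
```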
From the network engineer at Valve:
The default for cl_interp_ratio is two to allow for an occasional dropped packet w/o a visual hitch. I think amount of interp is max( cl_interp, cl_interp_ratio/cl_updaterate ) ... max( 0.1, 2.0/20.0 ) = max( 0.1, 0.1 ) = 0.1, etc.
If update rate is 20, then packets come in at 50 msec. If interpolation is 100 msec, you can drop one packet, but get the next one, and not have run out of data to interpolate up to. That was the design of the original system and why interp was set to 0.1 seconds.
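The dropped-packet arithmetic works out as follows (a hypothetical sketch with made-up names, assuming packets arrive at exactly 1/updaterate intervals):

```python
def dropped_packets_tolerated(cl_updaterate, lerp):
    """Consecutive lost packets the interpolation window can bridge without a hitch."""
    packet_interval = 1.0 / cl_updaterate  # 50 msec at updaterate 20
    # The client renders `lerp` seconds in the past, so it only runs out of
    # data to interpolate up to when the gap exceeds the interpolation window.
    return int(lerp / packet_interval) - 1

print(dropped_packets_tolerated(20, 0.100))  # 1: one lost packet goes unnoticed
print(dropped_packets_tolerated(20, 0.050))  # 0: any loss causes a visible hitch
```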
If they have a higher updaterate, then a lower interpolation amount can be fine. If they hardly ever have lost packets, I could see living with interp_ratio of 1.0. If they go less than 1.0, then the game will be somewhat stuttery, though at a high cl_updaterate, with a fast server giving them updates close to the 66 Hz max tickrate, the visual hitching is less noticeable. The other thing that impacts packet frequency is the amount of data being sent and the user's rate setting. We will "choke" packets if the rate setting for the client is too low, which will delay them. If they are only interpolating for 1/updaterate seconds, then if there is any choke at all, they will run out of data for interpolation, again causing visual hitching. This is another reason that keeping the default interp_ratio > 1 is good and why 2 is a reasonable compromise. Note that the lag compensation system takes into account the user's computed interpolation amount, so from an aiming point of view it shouldn't matter.
Updated--
Here's some good community (unverified by us) information:
Sheyster's post on tradeoffs with cl_interp:
http://forums.steampowered.com/forum....php?t=1358325