I started using fps_max 100 to limit the choke caused when your fps jumps around. Then I realized my monitor runs at 60 Hz, so I set it to fps_max 62, since it seemed pointless to have the game render at a higher fps than my monitor could show. However, I've noticed my update rate in net_graph 3 (at least, that's what I think it is) is capping out at my fps_max limit. The number I'm referring to is in the yellow box.
Am I correct in my understanding of what this number means? If so, should I raise my fps_max to match my cl_updaterate 101? It doesn't make sense to me that my fps limit would cap the number of times I send info to the server...
Actually, in writing this I think I figured out the connection... if the game is only running at 60 fps, then it can't process 100 "events" per second to send to the server. I'll leave this up anyway so you can tell me whether I'm correct, and recommend whether it's better to match the fps of the monitor or boost the number in the yellow box. Oh, and so Turtles can laugh and call me a noob or some other stupid name he came up with at his "job".
Edit: Also, I had fps_max set to 45 when I took this screenshot... de_tides wreaks havoc on my shitty graphics card.
A NEW HIGH SCORE! What does "high score" mean? New high score, is that bad? What does that mean? Did I break it?
Correct: your update rate can never exceed your current fps. So even though your monitor only shows 60 fps, you want the game producing more than 100 fps so your rates can take full advantage of a 100-tick server.
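Here's the relationship as a quick Python sketch (a simplified model, not actual engine code; treating the out rate as the min of your fps, cl_cmdrate, and the server tick rate is my own approximation):

```python
# Simplified model of the caps on your outgoing update rate; the real
# engine internals differ, but the relationship is the one described above.
def effective_out_rate(fps, cl_cmdrate, tickrate):
    """Approximate client -> server updates per second (net_graph 'out')."""
    return min(fps, cl_cmdrate, tickrate)

print(effective_out_rate(fps=60, cl_cmdrate=101, tickrate=100))   # 60  (fps-bound)
print(effective_out_rate(fps=150, cl_cmdrate=101, tickrate=100))  # 100 (tick-bound)
```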
Here is the logic behind it. As you can see in net_graph, your in stays at 100 because that is what your cl_updaterate is set to (the out is what cl_cmdrate controls). As a result (assuming your connection can handle it), the server will send you a flow of packets equal to its current tick rate. Under heavy CPU load this tick can fluctuate just like your fps does, but it will usually be 100 for a 100-tick server, 45 for a 45-tick server, etc.

Now, the out is limited by your computer: if your computer only performs 60 calculations per second (60 fps), then you can only send updates to the server 60 times per second. This is why a good computer is recommended for better shot registration, because then you can send the max of 100 updates per second. Also note that you need to set your interp to 2/cl_updaterate.
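Putting numbers on it, under the usual assumptions (a stock 100-tick server, and the server clamping requested rates to its tick rate):

```python
tickrate = 100                      # assumed 100-tick server
cl_updaterate = min(101, tickrate)  # server -> client packets/sec ("in")
cl_cmdrate = min(101, tickrate)     # client -> server packets/sec ("out"), also fps-bound
cl_interp = 2 / cl_updaterate       # the 2/cl_updaterate rule from above

print(cl_updaterate, cl_cmdrate, cl_interp)  # 100 100 0.02
```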
Interp controls how your computer fills in the gaps between server updates, and here is how. Let's say you or any other player gets some lag or drops some packets. Because there is a delay before the server, and then your system, can collect this data, your computer may be forced to predict where you or another player is supposed to be at a given tick of server time (this is one of the places tick rate comes into play). In some cases interp may cause the client to render the actual hitboxes behind the player's visible model (this is known as one of the netcode problems).

Despite many people complaining about this, it is not as bad in CSS as in other games, because you CAN adjust your rates. Although bad rates can fuck you, correct rates can minimize such errors. All you really need to understand is that setting interp correctly helps your computer properly interpret incoming tick and command packets, and thus accurately estimate where the player model actually is, so that when you shoot you can aim at the actual target instead of having the hitbox lag behind.
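If it helps, here is a toy version of that interpolation in Python, under the simplifying assumption that the client just blends between the two snapshots that bracket the render time (the real engine is more involved):

```python
# Toy snapshot interpolation: draw other players cl_interp seconds in the
# past, blending between the two snapshots that bracket the render time.
def interpolate(snapshots, render_time):
    """snapshots: (timestamp, x_position) pairs, oldest first."""
    for (t0, x0), (t1, x1) in zip(snapshots, snapshots[1:]):
        if t0 <= render_time <= t1:
            frac = (render_time - t0) / (t1 - t0)
            return x0 + frac * (x1 - x0)
    return snapshots[-1][1]  # no bracketing pair: hold the last known position

cl_interp = 0.02  # 2 / cl_updaterate on a 100-tick server
snaps = [(0.96, 480.0), (0.98, 486.0), (1.00, 492.0)]  # sparse snapshots (drops)
print(round(interpolate(snaps, 1.00 - cl_interp), 3))  # 486.0, drawn 0.02 s back
```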
On another note, this is why on my older computers you will see me headshot players when I'm not aimed at the model. If you can read your rates properly and do the delay calculation for your computer (assuming you are not able to get 100 fps), then you can estimate where the hitbox is in relation to the model. Logically, this only helps if they are running by, not if they are coming straight toward you. It's also not easy to estimate, and not something I do all the time, since it takes a lot of effort. It is a skill you acquire and pull out when you want to pull off a rape fest in the pub, or if you have been playing and studying this game for many years.
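The back-of-the-envelope math looks like this; every number here is an illustrative assumption (approximate run speed, a machine stuck at 60 fps), not a measured value:

```python
# Rough estimate of how far the hitbox sits from the drawn model for a
# player running past you. All numbers are illustrative assumptions.
run_speed = 250.0     # units/sec, roughly CS:S top run speed
cl_interp = 0.02      # interpolation delay: 2 / cl_updaterate at 100
frame_delay = 1 / 60  # one frame of extra delay on a 60 fps machine

offset_units = run_speed * (cl_interp + frame_delay)
print(round(offset_units, 1))  # ~9.2 units between model and hitbox
```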