HLDS (Counter-Strike 1.6) Server
HLDS is not giving 1000 FPS with sys_ticrate 1000. 3.5 GHz quad-core Intel Xeon E3, 8 GB RAM, Intel motherboard, 1 TB SSD. Operating systems tried: Windows 2000, XP, 2008 R2, 2012 R2. How do I get 1000 FPS on the server, or did Steam update anything regarding this?
Blyte Feb 10, 2017 @ 12:34pm 
This old post might help you, mainly the part about high-resolution multimedia timers.

-----Original Message-----
From: Kevin Ottalini [EMAIL PROTECTED]
Sent: Wednesday, May 17, 2006 11:01 AM
To: hlds@list.valvesoftware.com
Subject: Re: [hlds] more than 1000fps at HLDS


HLDS (HL1 servers) can easily, and with little burden, run at either ~500 fps
or ~1000 fps. There is no control over the actual maximum FPS, since it is a
motherboard-chipset-related issue.

This is controlled by the "sys_ticrate" CVAR so the max setting is:
sys_ticrate 1000

Win32 servers will also need to run some sort of high-resolution timer
(please see other mail threads about this).
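
As a rough sketch, the server-side piece of this is a single line in
server.cfg (or the server console); the high-resolution timer itself has to
be provided outside HLDS, e.g. by a booster or anything else that raises the
Windows multimedia timer resolution. The value here is illustrative only:

// server.cfg (HLDS)
sys_ticrate 1000    // request up to ~1000 server frames per second; the
                    // chipset and OS timer decide what you actually get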

We are only talking about HLDS here (HL1 servers). Source (SRCDS) servers
are quite different and (at the moment) appear to run the best at their
default settings.

This is not really FPS in the sense of visual FPS, but rather how often the
server will process the available event information (take a "snapshot") and,
if needed, send an update to the clients that need one. The more updates the
server sends out, the more bandwidth the server will use on the uplink.

Clients can receive a maximum of 100 updates per second regardless of the
server sys_ticrate setting.

A client getting a server update is not the same thing as the video FPS that
the client is actually viewing.

The client graphics FPS, which is controlled by the scene and event
complexity and the "fps_max" CVAR, could indeed be set to fps_max 1000, but
anything above 100 is quite silly. Again, this "viewing FPS" has nothing to
do with the server sys_ticrate setting.

The client has a CVAR that tells the server how often to send updates: the
cl_updaterate CVAR. cl_updaterate 100 is the maximum (fastest) setting, which
the server may or may not allow. The server can limit the client maximum via
the sv_maxupdaterate CVAR.
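
Put together, a minimal sketch of the settings described above (using the
maximum values mentioned) would be:

// client side (console or autoexec.cfg)
cl_updaterate 100       // ask the server for up to 100 updates per second

// server side (server.cfg)
sv_maxupdaterate 100    // the highest cl_updaterate the server will honor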

Again, this has nothing to do with the client's VISUAL FPS.

OK, so why would a server operator want to run his/her server at sys_ticrate
1000?

In the case of HL1 servers only, running a faster ticrate on the server can
slightly improve the apparent client latency (sometimes called ping, but ping
is a little different). If the server is running sys_ticrate 100, then there
is a 10ms interval between server snapshots that can be sent to clients. If a
client has an 80ms ping distance from the server (real ping this time), then
the maximum latency is 80ms (ping) + 10ms (snapshot interval), or 90ms
(latency).

If the same server is running at sys_ticrate 1000, then the snapshot
interval is only 1ms, so that same player will only see an 81ms latency.
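
As a rough formula for the worst case:

worst-case latency = real ping + (1000 / sys_ticrate) ms
sys_ticrate  100:  80ms + 10ms = 90ms
sys_ticrate 1000:  80ms +  1ms = 81ms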

Is a 9 ms savings important during game play? Probably not, although there
are internet players who claim to be able to feel the difference. In a LAN
setting this may be different: the extra 10ms may be 10x what the ping is on
a LAN (but still, is this important? Probably not).

Running an HLDS server at a higher sys_ticrate should have the overall
effect of keeping what players see on that server more accurate. This
appears to be a real and valuable effect at the cost of much higher CPU
utilization.

The real reason that a server operator might want to run his HLDS server at
sys_ticrate 1000, though, is that it gives the server the ability to send
updates to individual clients on a more timely basis. Again, this is not more
updates, just updates that don't have to wait very long for the next server
snapshot to happen.

This has the overall effect on the server of spreading out client updates so
they don't all happen for all clients at the same time. This can slightly
lower the demand on the server uplink and might help the server to run a
little smoother.

Extensive testing on my HLDM server led to the conclusion that running
sys_ticrate 1000 actually allowed me to add one additional player slot (out
of 10 total), and the server had a much tighter "feel" to events, with a
slight improvement in accuracy.

Of course, running sys_ticrate 1000 also took my average CPU utilization for
a 10-player server from around 3% to around 40% for some maps.

Even my old 800MHz Intel P3 server was able to run sys_ticrate 1000; the
real question is whether you are overloading your server CPU. This is a
function of the number of players, the map you are running, and the
sys_ticrate setting.

If your CPU is running at more than 50% with sys_ticrate 1000, then decrease
sys_ticrate to 500.
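
In other words, a conservative fallback is simply:

// server.cfg -- if CPU load stays above ~50% at sys_ticrate 1000
sys_ticrate 500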

For testing purposes, use the Server GUI (don't use -console) and look at
the utilization graph.

qUiCkSiLvEr
Last edited by Blyte; Feb 10, 2017 @ 12:39pm
Blyte Feb 10, 2017 @ 12:35pm 
Have Windows Media Player open (but not playing any media) and minimize it. Then start your HLDS.
76561198158091308 Feb 11, 2017 @ 12:38am 
I tried opening Windows Media Player but the FPS is still at 496 to 514.
76561198158091308 Feb 11, 2017 @ 12:38am 
I also want to know: will mmtimer or HL booster increase the FPS?
76561198158091308 Feb 11, 2017 @ 12:40am 
And for playing online, how much bandwidth is needed per month for one server if it stays full at 32/32 for 20 hours daily?
~TiEsTo-PuNkZ~ Jun 3, 2017 @ 5:13pm 
Yes, if you use a booster and mmtimer you can increase the FPS, and if you want even more FPS, set the priority of HLDS to high ;)
It's not about FPS, I need help with bandwidth.
Blyte Jun 4, 2017 @ 3:24pm 
Lower your sv_maxupdaterate and/or cmd_rate on your server for less bandwidth usage. I use sv_maxupdaterate 20 for Half-Life Deathmatch servers.

But keep in mind I run my servers from a VPS (virtual private server). It is rented, and I do not have to worry about bandwidth too much. The fast downloads for the game materials are redirected to / run from a separate FTP/HTTP web host. This keeps the game servers from feeling choppy when people connect and disconnect.
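
A sketch of what these suggestions might look like in server.cfg (the
download URL is only a placeholder for your own web host, and
sv_allowdownload is just assumed here):

// server.cfg -- bandwidth-saving settings (illustrative)
sv_maxupdaterate 20                           // fewer snapshots per client = less uplink use
sv_allowdownload 1                            // let clients fetch missing custom content
sv_downloadurl "http://example.com/cstrike/"  // redirect those downloads to a separate HTTP host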

A booster for HLDS can help in most cases, because boosters usually come with their own high-res timers.

FYI:
The maximum ticrate a server can produce is determined by the motherboard chipset being used.
Last edited by Blyte; Jun 4, 2017 @ 4:04pm
Blyte Jun 4, 2017 @ 4:03pm 
If you start seeing problems in the maps, things like doors not opening or sounds starting to stutter / glitch, it is most likely that the high-res timer settings are too high. Lower your sys_ticrate.
OK, but can you help me with how much data transfer will be needed for 30 days for a 32/32 server online 24/7?
Blyte Jun 5, 2017 @ 3:48pm 
The game itself does not use much data transfer. Custom maps, sounds, etc. are where most of the data transfer is used.

Here is a server rate calculator:
http://www.reece-eu.net/drekrates.php
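
For a very rough upper bound you can also work it out from the rate cap.
Assuming sv_maxrate 20000 (20,000 bytes per second per client, a common
setting not discussed above) and all 32 slots saturating it around the clock:

32 clients x 20,000 bytes/s = 640,000 bytes/s  (about 5.1 Mbit/s upstream)
640,000 bytes/s x 86,400 s/day = about 55 GB/day
55 GB/day x 30 days = roughly 1.7 TB/month, absolute worst case

Real usage will be far lower, since clients rarely use their full rate and
the server will not be full the entire time.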
Last edited by Blyte; Jun 5, 2017 @ 3:59pm