host_timescale messes with FPS counters and CPU usage counters

BroPikapi

Member
Good evening, everyone.

I found a bug where the FPS counter gets divided by the host_timescale and the CPU usage counter gets multiplied by it.
Example:
Let's say we set host_timescale 5 on a local server (for whatever purpose). The game is actually running at 60 FPS, but the counter apparently uses the host timescale to calculate the client's FPS, so we would see 12 FPS (60 / 5) in the bottom-right corner.
The server CPU reading is also multiplied by the host_timescale value. Using the same example with host_timescale 5, if an external benchmark tool shows 50% CPU usage (I used ShadowPlay), the in-game counter would show around 50 * 5 = 250% CPU usage.
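To make the arithmetic explicit, here is a minimal sketch (plain C++, not actual engine code) of how these numbers would come out if the counters were measured against scaled game time instead of real wall-clock time. The variable names and values are just the ones from the example above, not anything taken from the game itself.

```cpp
// Minimal sketch of the suspected scaling: made-up values, not engine code.
#include <cstdio>

int main() {
    const double host_timescale = 5.0;  // value from the example above
    const double realFps = 60.0;        // what is actually rendered
    const double realCpu = 50.0;        // percent, as seen in an external tool

    // If frames are divided by *scaled* elapsed time, the counter shrinks:
    double displayedFps = realFps / host_timescale;  // 60 / 5 = 12 FPS

    // If CPU time per *scaled* second is reported, the counter grows:
    double displayedCpu = realCpu * host_timescale;  // 50 * 5 = 250 %

    std::printf("displayed FPS: %.0f, displayed CPU: %.0f%%\n",
                displayedFps, displayedCpu);
    return 0;
}
```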

Proof:
("Uso de CPU" in the screenshot means "CPU usage")
1720158724993.png
I will be reading the replies and adding any valuable information about this bug. I invite the community to try to reproduce it and post their results.
 
Hi, please look at the load on each of the cores. This can be done in the Task Manager.
Perhaps the game uses only one logical core.
 
I first changed host_timescale to 5 and then back to 1. When I changed it, the game showed it was running at 5 FPS with the CPU at 207%, which is completely false. 1720404505888.png 1720404602625.png
 