Efficient Timing
#1

Suggest the most efficient (for performance) global timing system:

1) A timer gets the server time (GetTime) each second and compares it with custom server variables (hh, mm, ss); if they are not the same, it syncs them, and updates the time (SetPlayerTime) for all players in a loop every following minute.

2) The same system, but GetTime is replaced with a counter (ss++, mm++, and so on).

I know the 1st variant would be more accurate, but the 2nd variant makes it possible to set the time directly in game.
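The two variants could be sketched roughly like this. This is a hypothetical C sketch (the real code would be Pawn running in a SA:MP 1-second timer callback, calling GetTime/SetPlayerTime); the variable names are assumptions for illustration.

```c
#include <time.h>

static int hh, mm, ss;   /* custom server time variables */

/* Variant 1: resync against the real clock each second. */
static void tick_variant1(void) {
    time_t now = time(NULL);           /* stands in for GetTime */
    struct tm t;
    localtime_r(&now, &t);
    if (t.tm_hour != hh || t.tm_min != mm || t.tm_sec != ss) {
        hh = t.tm_hour; mm = t.tm_min; ss = t.tm_sec;
        /* once per minute: loop over players and call SetPlayerTime(playerid, hh, mm) */
    }
}

/* Variant 2: pure counter. Allows setting hh/mm/ss freely in game,
   but any timer jitter accumulates over time. */
static void tick_variant2(void) {
    if (++ss == 60) {
        ss = 0;
        if (++mm == 60) { mm = 0; hh = (hh + 1) % 24; }
    }
}
```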

EDIT: seems like no one has a clue; never mind then.
#2

Timers tend not to run very precisely: if you use SetTimer() with exactly 1000 ms, it sometimes takes 1001 ms, sometimes 1010 ms.
With the method of incrementing a variable (+=1) in a non-precise timer, those delays add up:
a 1000 ms timer repeated 60 times can take 61 or 62 seconds, while the variable only got incremented 60 times.

Your first method will give better results, since it compares the non-precise timer against the actual clock rather than trusting the timer: as soon as the real time changes, your function reacts.

The triggering call will always happen slightly delayed, but the delay of each call stays roughly constant: 0 to 10 ms (caused by the timer itself). It will NOT drift off by something like 2000 ms, which is often the case if you let the timer "decide" how much time has passed.
#3

Thanks for the reply, but actually I was wondering which method is better for script performance (less memory usage, etc.). Is it normal to call GetTime that often, like every second?
#4

If you want to know how much time has passed, something like GetTickCount() or the Unix timestamp returned by gettime() might be useful.
#5

I am currently running a system like what you said: I use GetTickCount() in each of my different systems, checking the time every 500 ms. I noticed a drop in CPU usage, mainly because I went from 8+ timers down to one main player timer and one main server timer; memory didn't change from what I've seen.

Performance-wise I guess it is better in a way: fewer timers, and things get executed with better accuracy. Players' time online, game time, or any other time-related thing becomes more accurate, since it goes off at a fixed point.
#6

pawn Code:
millisecond_time_counter += GetTickCount() - lastMillis; // += time passed since last timer call
lastMillis = GetTickCount();
As this adds only the time that passed since the last timer call, you get the accurate time AND the ability to set the time ingame (by modifying millisecond_time_counter). GetTickCount should be quite lightweight and shouldn't cause any lag.
This is my favourite method for measuring time, whatever the purpose.
#7

Quote:
Originally Posted by Mauzen
pawn Code:
millisecond_time_counter += GetTickCount() - lastMillis; // += time passed since last timer call
lastMillis = GetTickCount();
As this adds only the time that passed since the last timer call, you get the accurate time AND the ability to set the time ingame (by modifying millisecond_time_counter). GetTickCount should be quite lightweight and shouldn't cause any lag.
This is my favourite method for measuring time, whatever the purpose.
Important note: GetTickCount will cause problems on servers with an uptime of over ~24 days (the physical server, not the SA:MP server), because the tick count will eventually wrap past the 32-bit integer range.

