27.10.2009, 18:41
I don't know if this bug has already been confirmed, but I couldn't find anything related to ******.
I'm creating a gamemode that uses a timer. In fact, a very important timer, so it would be lovely if it worked accurately. But it doesn't. SA:MP 0.2X already had a slight timer inaccuracy, but in 0.3a it's even bigger.
I've made an almost completely empty gamemode with nothing in it but a 60 second (60000 ms) timer:
pawn Code:
// This is a comment
// uncomment the line below if you want to write a filterscript
//#define FILTERSCRIPT

#include <a_samp>

forward test();

main()
{
    print("\n----------------------------------");
    print(" Blank Gamemode by your name here");
    print("----------------------------------\n");
}

public OnGameModeInit()
{
    // Don't use these lines if it's a filterscript
    SetGameModeText("Blank Script");
    AddPlayerClass(0, 1958.3783, 1343.1572, 15.3746, 269.1425, 0, 0, 0, 0, 0, 0);
    SetTimer("test", 60000, false);
    print("start");
    return 1;
}

public OnGameModeExit()
{
    return 1;
}

public OnPlayerRequestClass(playerid, classid)
{
    SetPlayerPos(playerid, 1958.3783, 1343.1572, 15.3746);
    SetPlayerCameraPos(playerid, 1958.3783, 1343.1572, 15.3746);
    SetPlayerCameraLookAt(playerid, 1958.3783, 1343.1572, 15.3746);
    return 1;
}

public OnPlayerConnect(playerid)
{
    return 1;
}

public OnPlayerDisconnect(playerid, reason)
{
    return 1;
}

public OnPlayerSpawn(playerid)
{
    return 1;
}

public OnPlayerDeath(playerid, killerid, reason)
{
    return 1;
}

public OnVehicleSpawn(vehicleid)
{
    return 1;
}

public OnVehicleDeath(vehicleid, killerid)
{
    return 1;
}

public OnPlayerText(playerid, text[])
{
    return 1;
}

public OnPlayerCommandText(playerid, cmdtext[])
{
    if (strcmp("/mycommand", cmdtext, true, 10) == 0)
    {
        // Do something here
        return 1;
    }
    return 0;
}

public OnPlayerEnterVehicle(playerid, vehicleid, ispassenger)
{
    return 1;
}

public OnPlayerExitVehicle(playerid, vehicleid)
{
    return 1;
}

public OnPlayerStateChange(playerid, newstate, oldstate)
{
    return 1;
}

public OnPlayerEnterCheckpoint(playerid)
{
    return 1;
}

public OnPlayerLeaveCheckpoint(playerid)
{
    return 1;
}

public OnPlayerEnterRaceCheckpoint(playerid)
{
    return 1;
}

public OnPlayerLeaveRaceCheckpoint(playerid)
{
    return 1;
}

public OnPlayerRequestSpawn(playerid)
{
    return 1;
}

public OnPlayerKeyStateChange(playerid, newkeys, oldkeys)
{
    return 1;
}

public test()
{
    print("End");
}
Results:
The duration of the 60 second timer in 0.2X was 1 minute and 3 seconds, which means 63 seconds. Not perfect, but OK.
Then I tested the same script with 0.3a: 1 minute and 6 seconds, so 66 seconds. A difference of 3 seconds between the versions. OK, that's not really much, but it is a fact that the timer runs too slow. Then I tested it with 10 minutes (600000 ms).
0.2X: 10 minutes, 31 seconds
0.3a: 11 minutes, 2 seconds
I don't like this. A 10 minute timer that takes 11 minutes? That's messed up. I think this is a bug; I couldn't find any reason why it should behave like this.
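Until this gets fixed, one way to work around it is to not trust a single long timer at all: run a short repeating timer and compare GetTickCount() against a target tick, so the drift of the individual intervals doesn't add up over the whole period. Again only a rough sketch, not tested; CheckDeadline and the variable names are just for this example:

pawn Code:
// Rough sketch (not tested): a "10 minute" period driven by a 1 second
// repeating timer that checks a GetTickCount() deadline, so per-interval
// drift does not accumulate over the full 10 minutes.
#include <a_samp>

new gDeadline;   // tick value (ms) at which the 10 minutes are over
new gCheckTimer; // id of the repeating check timer

forward CheckDeadline();

main() {}

public OnGameModeInit()
{
    gDeadline = GetTickCount() + 600000;                  // 10 minutes from now
    gCheckTimer = SetTimer("CheckDeadline", 1000, true);  // check once a second
    return 1;
}

public CheckDeadline()
{
    if (GetTickCount() >= gDeadline)
    {
        KillTimer(gCheckTimer);
        print("10 minutes are over");
    }
}

The end can still be off by up to one check interval (here one second), but at least it doesn't drift by half a minute over 10 minutes.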