Okay, I'm working on porting some code from Windows to Linux and I've gotten stuck. It's a game server, and we have a Timer class which needs to be a high-resolution timer (hence why I can't just use time.h).
Currently, here's how the game's creator describes the Timer class:
When POG starts up, it first calls Timer::init_frequency() to set ms_frequency to however many "hardware timer ticks" there are per second. This frequency is different on every computer, but it never changes while the computer is running.
QueryPerformanceCounter() reads the current value of the hardware timer. Suppose you call this function once and put the result value into a variable. Then you call it again at a later time, put the result into another variable. Then you subtract the first variable's value from the second, and you'll get however many hardware ticks have passed. Divide that by the frequency and you get however many seconds have passed. m_lengthOneTick holds however many hardware ticks there are in the Timer's tick.
I think those functions ultimately wind up using the x86 processor's RDTSC instruction to read the processor's high resolution hardware timer. You'll have to find whatever Linux functions do the same thing.
And here's how the class is laid out:
Code:
class Timer
{
public:
    Timer(float ticksPerSecond);
    bool TickHasPassed();
    bool PeekTickHasPassed();
    bool PeekTickHasAlmostPassed();
    bool PeekLessThanQuarterTickHasPassed();
    bool PeekHalfTickHasPassed();
    bool PeekTickPlusAQuarterHasPassed();
    bool PeekTwoTicksHavePassed();
    void MarkTickHasPassed();
    void Reset();
    void HalfSleep();
    float count_elapsed_ticks();
    void AdjustOnFly(float ticksPerSecond);
    UINT GetSecondsLeftInTick();
    UINT GetSecondsInTick();
    float get_seconds_left_in_tick_float();
    static UINT GetRandom(UINT modNumber);

    static LARGE_INTEGER ms_frequency;
    static LARGE_INTEGER ms_freq_div_30;
    static bool ms_frequency_set;

    static __int64 get_time_code();

    static void init_frequency()
    {
        QueryPerformanceFrequency(&ms_frequency);
        ms_freq_div_30.QuadPart = ms_frequency.QuadPart / 30;
        ms_frequency_set = true;
    }

private:
    LARGE_INTEGER m_startCounter, m_counter;
    __int64 m_lengthOneTick;
};
If someone could point me in the right direction for a high-res timer that works with GCC so I can redo the class, it would be much appreciated.
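To show what I mean, here's a rough sketch of how the static pieces of the class might be reworked for Linux, assuming LARGE_INTEGER and __int64 get swapped for int64_t, and the "hardware frequency" becomes a fixed 1e9 since clock_gettime() already reports nanoseconds. This is just my guess at a port, not the original author's code:

```cpp
#include <cstdint>
#include <ctime>

// Hypothetical Linux port of the static parts of Timer.
class Timer
{
public:
    static int64_t ms_frequency;
    static int64_t ms_freq_div_30;
    static bool    ms_frequency_set;

    // Replaces the QueryPerformanceCounter()-based reader.
    static int64_t get_time_code()
    {
        timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        return static_cast<int64_t>(ts.tv_sec) * 1000000000LL + ts.tv_nsec;
    }

    static void init_frequency()
    {
        // The monotonic clock counts in nanoseconds, so the frequency
        // is a constant 1e9 ticks per second on every machine.
        ms_frequency     = 1000000000LL;
        ms_freq_div_30   = ms_frequency / 30;
        ms_frequency_set = true;
    }
};

int64_t Timer::ms_frequency     = 0;
int64_t Timer::ms_freq_div_30   = 0;
bool    Timer::ms_frequency_set = false;
```

The rest of the class should then compile with only minor changes, mainly dropping the .QuadPart accesses since int64_t is a plain integer.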