This was originally going to be the answer to a rather long thread about accurate millisecond delays in QuickBASIC, so it goes rather abruptly into an explanation of timing loops based on previously calibrated counters (i.e., for this machine, you determine that a given FOR loop runs some precise number of iterations per millisecond), since that is what the original answer to the post was about. That said, it also contains a lot of useful information, explaining how the TIMER function works and how the hardware it depends upon behaves. So, here goes...
Linux does its internal timing using loops that it calibrates at startup. This is what the 'BogoMIPS' number is all about. However, Linux's success with this stems from the fact that it has complete control of the processor at the time (i.e., there are no other executing tasks), and the timing loops (both the calibration loop and the delay loop that uses its results) are hand-written in assembly language, so the kernel knows exactly how many instructions will be executed between the top and the bottom of each loop. In QuickBASIC, neither of these conditions holds (unless you're running under pure DOS): the loop used to calibrate the timer will not run with exactly the same characteristics as the loop which delays the system.
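For illustration, a calibration in QuickBASIC might look something like the sketch below (loopsPerTick& is a name made up for this example). As just explained, though, the count it measures will not reliably describe whatever loop you later use to delay, which is exactly the problem:
[tt]
[color #000080]FUNCTION[/color] loopsPerTick& [color green]'Hypothetical calibration: count loop iterations in one TIMER tick[/color]
  st# = [color #000080]TIMER[/color]
  [color #000080]WHILE[/color] st# = [color #000080]TIMER[/color]: [color #000080]WEND[/color] [color green]'Align to the start of a tick[/color]
  n& = 0
  st# = [color #000080]TIMER[/color]
  [color #000080]WHILE[/color] st# = [color #000080]TIMER[/color]
    n& = n& + 1
  [color #000080]WEND[/color]
  loopsPerTick& = n& [color green]'Iterations per 1/18.2 s; divide by 55 or so for loops per millisecond[/color]
[color #000080]END FUNCTION[/color][/tt]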
Another problem with the delay loops method is that frequently, the user wants to wait for an interval that contains some other executing code. For example, the following frameloop will restrict the engine to 18.2 frames per second. However, it will not drop it to 9.1 frames per second if a frame takes longer than 1/18.2 of a second to complete.
[tt]
[color #000080]DO[/color]
  st# = [color #000080]TIMER[/color]
  drawframe
  [color #000080]WHILE[/color] st# = [color #000080]TIMER[/color]: [color #000080]WEND
LOOP UNTIL[/color] quit%[/tt]
However, a loop in this style depends on an external, independent timer -- a timer that updates concurrently with your code. The TIMER variable is the only access that QuickBASIC gives to such a timer. Without modifying the frequency of TIMER's updates, you are therefore limited to waiting in increments of 1/18.2 of a second.
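You can watch this granularity directly; the following snippet should print roughly .0549 (that is, 1/18.2 of a second), give or take a tick:
[tt]
st# = [color #000080]TIMER[/color]
[color #000080]WHILE[/color] st# = [color #000080]TIMER[/color]: [color #000080]WEND[/color] [color green]'Busy-wait for TIMER to change once[/color]
[color #000080]PRINT[/color] [color #000080]TIMER[/color] - st# [color green]'Roughly .0549 seconds per update[/color][/tt]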
The reason for this increment is that the system timer that the TIMER function depends upon is a chip on the motherboard (the 8253/8254 programmable interval timer) with an internal clock rate of 1,193,181 ticks per second (&H1234DD). Every time this clock ticks, a counter inside the chip goes up by 1, and the chip then checks the counter against a "trigger" value, which starts out at 0 when you turn on your computer. The counter and the trigger are 16 bits wide, and since the counter increments before the check, it "misses" the trigger at zero. Because the chip tests for equality, not for the counter being greater than or equal to the trigger, the counter runs right the way up to the maximum unsigned 16-bit number, which is 65,535. On the next tick, it does what most counters do when they reach their limit: it overflows and wraps back to zero, at which point it is equal to the trigger. This is just the default behaviour; the value of the trigger can be changed to reduce the number of internal ticks between triggers.
When the counter is equal to the trigger, two things happen: first, the counter is reset to zero; second, a signal is sent to the processor. The processor receives this signal (called a "hardware interrupt request") and runs a little subroutine (called an "interrupt handler") to service it. The handler installed for the timer's interrupt request increments a 24-bit number in memory, which is the same number that TIMER reads its value from (except that TIMER divides by 18.2 before returning the result, so that you get a number of seconds rather than a number of clock ticks).
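Incidentally, if you want the raw count rather than seconds, you can read that number straight out of the BIOS data area and skip TIMER's division entirely. A minimal sketch, assuming the standard layout (the tick count lives at 0040:006C; note that an interrupt arriving between the PEEKs can garble the result, so read twice and compare if it matters):
[tt]
[color #000080]DEF SEG[/color] = 0 [color green]'BIOS data area: 0040:006C is &H46C from segment 0[/color]
ticks& = [color #000080]PEEK[/color](&H46C) + 256& * [color #000080]PEEK[/color](&H46D) + 65536& * [color #000080]PEEK[/color](&H46E) [color green]'Low 24 bits of the tick count[/color]
[color #000080]DEF SEG[/color] [color green]'Restore the default segment[/color][/tt]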
So, to get back to the original topic, the reason the increment is 1/18.2 of a second is that the counter/trigger mechanism is effectively dividing the clock rate of 1,193,181 by 65,536, since there are 65,536 ticks from one interrupt request to the next. If you divide 1,193,181 by 65,536, you'll find that the quotient is roughly 18.2.
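A quick sanity check of that arithmetic:
[tt]
[color #000080]PRINT[/color] 1193181 / 65536& [color green]'Prints roughly 18.20651[/color][/tt]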
Now, as mentioned earlier, you can change the value of the trigger to decrease the interval between interrupt requests and thus increase the frequency of TIMER's updates. First, you have to calculate the value of the trigger. You can get an approximate result by dividing the timer chip's clock rate by the desired frequency (note that 1,193,181 / 18.2 ~= 65,536), after which it is simply a matter of sending the new trigger value to the chip. The following QuickBASIC function will set the TIMER frequency to the closest possible match for the specified frequency, returning the actual frequency that was set:
[tt]
[color #000080]FUNCTION[/color] setupTimer#(frequency#) [color green]'Returns the actual frequency that was set[/color]
  [color #000080]IF[/color] (frequency# > 1193181) [color #000080]OR[/color] (frequency# < 18.3) [color #000080]THEN ERROR[/color] 5 [color green]'Illegal function call[/color]
  ticksPerTimer& = 1193181# / frequency# [color green]'The trigger value: chip ticks per interrupt[/color]
  [color #000080]OUT[/color] &H43, &H34 [color green]'Control word: channel 0, low byte then high byte, rate-generator mode[/color]
  [color #000080]OUT[/color] &H40, ticksPerTimer& [color #000080]AND[/color] 255 [color green]'Send low 8 bits first[/color]
  [color #000080]OUT[/color] &H40, ticksPerTimer& \ 256 [color green]'Then send high 8 bits[/color]
  setupTimer# = 1193181 / ticksPerTimer& [color green]'And finally, return the resulting clock rate[/color]
[color #000080]END FUNCTION[/color][/tt]
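For example, to ask for roughly millisecond resolution -- the returned value differs slightly from 1,000 because the trigger is rounded to a whole number of chip ticks:
[tt]
freq# = setupTimer#(1000#) [color green]'Request ~1000 TIMER updates per second[/color]
[color #000080]PRINT[/color] freq# [color green]'Prints roughly 1000.15 (1193181 / 1193)[/color][/tt]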
The following is the inverse, returning the timer to normal operation. Note that setupTimer#() is not capable of doing this itself: the default trigger of 65,536 has to be sent to the chip as a zero, which its range check disallows. Be sure to call this before your program exits, or the BIOS time-of-day count will keep advancing at the faster rate.
[tt]
[color #000080]SUB[/color] unSetupTimer() [color green]'Returns the timer to normal operation[/color]
  [color #000080]OUT[/color] &H43, &H34 [color green]'Same control word as in setupTimer#()[/color]
  [color #000080]OUT[/color] &H40, 0 [color green]'Set low 8 bits of trigger to 0[/color]
  [color #000080]OUT[/color] &H40, 0 [color green]'Set high 8 bits of trigger to 0 -- a zero trigger means 65,536 ticks[/color]
[color #000080]END SUB[/color][/tt]
Once you have changed the clock rate of the timer (and thus of TIMER), using it is a little bit tricky. You have to remember that QuickBASIC's TIMER function has no way of knowing that the clock rate has changed, so it assumes the count is still advancing 18.2 times a second and, in trying to return a number of seconds, still divides the value by 18.2. Correcting this back to ticks is pretty much out of the question, because the method used to perform the division is not perfectly accurate (remember, QuickBASIC dates from the days when floating-point units were optional, so it can end up doing the floating-point division using only integer math). At 1,000 ticks per second (for example), multiplying back up only magnifies the error. However, the value returned by TIMER still changes as fast as the actual timer ticks, so you can check whether a previously returned value is still equal to the current value; by doing this repeatedly, you can wait for a given number of ticks to pass. The following loop will wait for the specified number of seconds, given the frequency returned by setupTimer#():
[tt]
[color #000080]SUB[/color] delay(numSeconds#, frequency#)
  numTicks& = numSeconds# * frequency# [color green]'Convert seconds to ticks at the new rate[/color]
  [color #000080]FOR[/color] i& = 1 [color #000080]TO[/color] numTicks&
    st# = [color #000080]TIMER
    WHILE[/color] st# = [color #000080]TIMER[/color]: [color #000080]WEND[/color] [color green]'Wait for one TIMER change[/color]
  [color #000080]NEXT[/color] i&
[color #000080]END SUB[/color][/tt]
Using these three functions, it should be possible to write accurate inline delays. It should also be possible, using the type of frameloop shown above, to set an upper bound on the framerate of a game (though you won't be able to tell exactly how many clock ticks have passed if a given frame takes longer than its allotted time to process).
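For instance, a quarter-second pause at roughly millisecond resolution might look like this (a sketch, assuming the usual DECLARE statements):
[tt]
freq# = setupTimer#(1000#) [color green]'Speed TIMER's updates up to ~1000 per second[/color]
delay .25, freq# [color green]'Busy-wait for about a quarter of a second[/color]
unSetupTimer [color green]'Restore the 18.2 Hz default before exiting[/color][/tt]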