Hi,
paulbarker wrote:
An interesting question is, given that both the PIT and RTC can generate interrupts at a 256 Hz frequency, which is a better choice? Is one more accurate than the other, are they both similar, or does accuracy differ depending on the computer (i.e. one computer may have a more accurate PIT and another may have a more accurate RTC)?
Both timers can be relatively inaccurate (depending on the motherboard, a $2.00 kids' watch can be more accurate), but I would hope the RTC is at least as accurate as the PIT (or has less drift), as it's intended to measure real time. It's hard to get good statistics on the accuracy of the available timers, though. In any case I'd recommend having some method of adjusting for drift, like the UNIX adjtime() function, which could be used to correct drift manually or used in conjunction with NTP to correct it automatically.
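As a rough illustration of what adjtime()-style correction does, here's a minimal sketch in C: instead of stepping the clock by the full correction at once, the correction is "slewed" in small amounts over many ticks. All names and the 5 µs-per-tick limit are assumptions for the example, not values from any real implementation:

```c
#include <stdint.h>

/* Maximum correction applied per tick (assumed value for this sketch). */
#define SLEW_PER_TICK_US 5

static int64_t offset_remaining_us = 0;  /* set by an adjtime()-like call */
static int64_t clock_us = 0;             /* the system clock being maintained */

/* Called from the timer IRQ handler: advance the clock by one tick,
 * plus a small slice of any pending drift correction. */
void timer_tick(int64_t tick_length_us)
{
    int64_t adj = offset_remaining_us;
    if (adj > SLEW_PER_TICK_US)  adj = SLEW_PER_TICK_US;
    if (adj < -SLEW_PER_TICK_US) adj = -SLEW_PER_TICK_US;
    offset_remaining_us -= adj;
    clock_us += tick_length_us + adj;
}
```

At 256 Hz each tick is roughly 3906 µs, so a 12 µs correction would be spread over three ticks (5 + 5 + 2) rather than appearing as a jump.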
For timer resolution, the maximum possible resolutions are:
RTC - 122070 ns (8192 Hz)
PIT - 838 ns (1.193182 MHz)
HPET - 100 ns (10 MHz)
Local APIC - usually between 40 ns (25 MHz bus) and 1.25 ns (800 MHz bus), depending on front side bus speed
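The periods above follow directly from the frequencies; a one-liner (integer nanoseconds, truncated) reproduces them:

```c
#include <stdint.h>

/* Convert a timer's tick frequency (Hz) into its period/resolution
 * in nanoseconds, truncated to an integer. */
uint64_t period_ns(uint64_t hz)
{
    return 1000000000ULL / hz;
}
```

For example, period_ns(8192) gives 122070 for the RTC and period_ns(1193182) gives 838 for the PIT.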
Of course the maximum possible resolutions aren't practical. For the ISA devices (RTC and PIT), the chipset often can't handle the higher frequencies and IRQs are lost. For all timers there are problems with IRQ overhead and IRQ latency.
For the "best possible" timing, I'd use the highest-resolution timer the computer has, with drift adjustment, and then dynamically set the frequency used by that timer. For example, on a slow 80486 you might want to reduce the timer frequency to reduce timer IRQ overhead, while on a computer with four 3 GHz CPUs you might want to increase the timer frequency to increase system timer resolution.
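For the PIT, "setting the frequency" means choosing a reload value (divisor) so that input_clock / divisor is as close as possible to what you asked for. A sketch, assuming the usual 1193182 Hz input clock (function names are mine, and actually programming the reload value via the I/O ports is omitted):

```c
#include <stdint.h>

#define PIT_INPUT_HZ 1193182UL

/* Pick the PIT reload value (divisor) closest to the requested frequency.
 * The hardware accepts divisors 1..65536 (a reload value of 0 means 65536). */
uint32_t pit_divisor_for_hz(uint32_t hz)
{
    uint32_t div = (PIT_INPUT_HZ + hz / 2) / hz;  /* round to nearest */
    if (div < 1) div = 1;
    if (div > 65536) div = 65536;
    return div;
}

/* The frequency you actually get from a given divisor (rounded). */
uint32_t pit_actual_hz(uint32_t divisor)
{
    return (PIT_INPUT_HZ + divisor / 2) / divisor;
}
```

Note that the achievable frequencies are quantized: asking for 100 Hz gives a divisor of 11932 and an actual frequency of about 100.007 Hz, which is one more reason to have drift adjustment on top.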
Cheers,
Brendan