The story behind the
55ms timer tick rate goes all the way back to the
original IBM PC BIOS.
The original IBM PC drove its timer chip at 1.19MHz,
and 65536 cycles at 1.19MHz equals approximately 55ms.
(More accurately, it was more like 1.19318MHz and 54.92ms.)
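If you want to check the arithmetic, here's a quick C sketch. (The exact divisor frequency of 1,193,182Hz is my assumption; the post only gives the rounded 1.19318MHz figure.)

```c
#include <stdio.h>

int main(void)
{
    /* Timer input clock on the original IBM PC.
       Assumed exact value; the post rounds it to 1.19318MHz. */
    double clock_hz = 1193182.0;

    /* The timer fires once per full 16-bit countdown,
       i.e. every 65536 input cycles. */
    double tick_ms = 65536.0 / clock_hz * 1000.0;

    printf("tick period = %.4f ms\n", tick_ms); /* ~54.925 ms */
    return 0;
}
```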
But that just pushes the question to another level.
Why 1.19...MHz, then?
With that clock rate, 2^16 ticks equals
approximately 3600 seconds, which is one hour.
(If you do the math it's more like 3599.59 seconds.)
[Update: 4pm, change 2^32 to 2^16; what was I thinking?]
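The hour claim checks out the same way (same assumed input clock as the sketch above):

```c
#include <stdio.h>

int main(void)
{
    /* One timer tick, using the assumed 1,193,182Hz input clock. */
    double tick_seconds = 65536.0 / 1193182.0;

    /* A 16-bit tick counter overflows after 2^16 ticks. */
    double seconds_per_overflow = 65536.0 * tick_seconds;

    printf("2^16 ticks = %.2f seconds\n", seconds_per_overflow); /* 3599.59 */
    return 0;
}
```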
What's so special about one hour?
The BIOS checked once an hour to see whether the clock
had crossed midnight.
When it had, it needed to increment the date.
Making the hourly check happen precisely when a 16-bit
tick count overflowed saved a few valuable bytes in the BIOS.
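Here's a hypothetical C sketch of what that trick amounts to; the real BIOS was assembly, and clock_is_past_midnight is a made-up stand-in for its midnight test:

```c
#include <stdint.h>

static uint16_t tick_count;  /* wraps every 2^16 ticks, i.e. ~1 hour */
static uint32_t days;        /* date counter bumped at midnight */

/* Made-up stand-in for the BIOS's "did we cross midnight?" test. */
static int clock_is_past_midnight(void) { return 0; }

/* The point of the trick: there is no separate "has an hour
   passed?" comparison. The 16-bit counter wrapping back to
   zero on increment *is* the hourly signal. */
void timer_tick(void)
{
    if (++tick_count == 0) {        /* 16-bit overflow: an hour elapsed */
        if (clock_is_past_midnight())
            days++;                 /* advance the date */
    }
}
```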
Another reason for the 1.19MHz clock speed was that it was
exactly one quarter of the original CPU speed, namely 4.77MHz,
which was in turn 4/3 times the NTSC color burst
frequency of 3.579545MHz.
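You can verify the whole chain from the color burst down (using the nominal NTSC value of 3,579,545Hz):

```c
#include <stdio.h>

int main(void)
{
    double color_burst_hz = 3579545.0;             /* NTSC color burst */
    double cpu_hz   = color_burst_hz * 4.0 / 3.0;  /* 8088 clock */
    double timer_hz = cpu_hz / 4.0;                /* timer input clock */

    printf("CPU clock   = %.5f MHz\n", cpu_hz / 1e6);   /* 4.77273 */
    printf("timer clock = %.5f MHz\n", timer_hz / 1e6); /* 1.19318 */
    return 0;
}
```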
Recall that back in those days, personal computers sent their
video output to a television set. Monitors were for the rich kids.
Using a timer related to the video output signal saved a few
dollars on the motherboard.
Calvin Hsia has another view of the story behind the 4.77MHz clock.
(Penny-pinching was very common at the time.
The Apple ][ had its own share of penny-saving hijinks.)