Hello, All.
This post is to ask the group for some general advice and answers to
some questions about Python Timer modules and the Windows operating
system.
Please note that this isn't just a question of how to write a timer
application in Python; several people have already pointed me to
examples to study in that regard, and I thank you for that.
These questions are meant to give me a clearer understanding of the
Timer modules themselves - their limitations and characteristics, and
what they connect to "behind the scenes" on the PC. I am also trying
to get a basic understanding of how timing works on the PC in general,
because I believe the choice of one Timer module over another might
make an important difference in my particular application.
So forgive me if my questions are a bit unusual, but hopefully, I have
made my concerns clear enough to understand and answer.
The best way to begin is by explaining what I ultimately want to do
in Python:
I want to run a GUI program as the main Python application; it will
collect User Commands and display Operational Status on a laptop
screen. Meanwhile, in the background, on a regular timed basis,
control will pass to a routine that sends out the User Commands and
reads back new Operational Status by means of a very short packet
exchange on the Ethernet port. The routine will then return control to
the main GUI application, handing the new parameters to the GUI so it
can update its display.
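In rough outline, here is the kind of structure I picture (the device
address, port, and packet contents below are just placeholders for my
real ones):

import socket
import wx

DEVICE_ADDR = ("192.168.0.50", 5000)   # placeholder target device
PERIOD_MS = 33                         # roughly 30 updates per second

class ControlFrame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, None, title="Operational Status")
        self.status_text = wx.StaticText(self, label="waiting...")
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.settimeout(0.010)    # don't let a lost packet stall the GUI
        self.timer = wx.Timer(self)
        self.Bind(wx.EVT_TIMER, self.OnTimer, self.timer)
        self.timer.Start(PERIOD_MS)    # interval is given in milliseconds

    def OnTimer(self, event):
        # Send the latest User Commands, read back new Operational Status.
        self.sock.sendto(b"CMD", DEVICE_ADDR)        # placeholder payload
        try:
            reply, _ = self.sock.recvfrom(64)
        except socket.timeout:
            return                                   # skip this frame
        self.status_text.SetLabel(repr(reply))       # refresh the display

app = wx.App(False)
ControlFrame().Show()
app.MainLoop()

Whether wx.Timer can actually hold that 33 ms period steadily is
exactly what the rest of this post is about.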
This background process will operate continually at a regular, fixed
interval. I am currently targeting an update rate of about 33.33 ms
(30 times a second), but this value isn't etched in stone; I could
probably go to 66.67 ms (15 times a second) if absolutely necessary.
What is critical is that the interval doesn't vary much from update to
update (variation referred to as "Data Frame Jitter"). That is the
problem I must put maximum effort into avoiding.
And that brings up my first set of questions, regarding how PCs can
reliably produce this time interval:
For example, "wxPython in Action" makes it clear that, regarding the
wx.Timer class, "Not all operating systems have system clocks with
millisecond precision." So if I can't set the timer interval in
milliseconds, what does it get set to - SECONDS? How can I know which
systems work in milliseconds and which don't? I can't put out a
distribution disk with unknowns like that.
By way of explanation, I have been told, and have read, that the
Windows operating systems have no truly reliable clock sources for
software to hook into, and that behavior varies from PC to PC and from
OS version to OS version. Because of this, most Python-based Timer
modules are either very coarse in resolution or sloppy in performance
from machine to machine.
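The only way I can think of to find out what a given machine actually
delivers is to measure it. Here is a rough probe I have in mind, on
the assumption that time.sleep() is limited by the same system clock
the Timer modules depend on:

import time

try:
    tick = time.perf_counter      # Python 3.3 and later
except AttributeError:
    tick = time.clock             # Python 2 on Windows: high-res counter

# Ask for a 1 ms sleep many times and see what we actually get; the
# spread should reveal the effective timer granularity on this machine.
N = 200
total = 0.0
worst = 0.0
for _ in range(N):
    t0 = tick()
    time.sleep(0.001)
    dt = tick() - t0
    total += dt
    worst = max(worst, dt)
print("requested 1 ms sleep: average %.1f ms, worst %.1f ms"
      % (total / N * 1000.0, worst * 1000.0))

Is something along these lines a reasonable way to qualify a machine
before trusting it with a 33 ms timer?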
Okay, second set of questions: video player software must put out
video frames at a fixed, reliable rate, and wave sound files decode at
typical sample rates of 44.1 kHz. These rates must be stable and
reliable, or all sorts of distortion creeps into the media. How do
those programs keep stable update times across the various Windows
versions? Do multimedia packages like PyGame have better Timer modules
to access?
I have heard there was a similar problem in Windows-based "C"
programming: the standard API timer's resolution would vary from 10 ms
to 50 ms steps, depending on the OS version. Many programmers got
around the resolution problem by switching to the Multimedia Timer.
Has Python done something similar?
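The closest thing I have found myself is that Python can apparently
reach that same multimedia-timer machinery through ctypes. A sketch of
what I mean (Windows only; timeBeginPeriod asks winmm.dll to raise the
system timer resolution to 1 ms until it is released):

import ctypes
import time

try:
    tick = time.perf_counter      # Python 3.3 and later
except AttributeError:
    tick = time.clock             # Python 2 on Windows

winmm = ctypes.windll.winmm
winmm.timeBeginPeriod(1)          # request 1 ms system timer resolution
try:
    t0 = tick()
    time.sleep(0.010)             # should now return much closer to 10 ms
    print("slept %.1f ms" % ((tick() - t0) * 1000.0))
finally:
    winmm.timeEndPeriod(1)        # always release the request when done

Whether doing this also tightens up wx.Timer, or only time.sleep(), is
exactly the sort of thing I am hoping someone here can confirm.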
Does wx.Timer come with any known guarantees, where a Windows version
such as XP assures millisecond resolution? Is there a wxPython
GetData()-style method that reports the fastest reliable rate that can
be used for timer intervals? And are there ways a programmer can write
his own code to get around the problem, as a more "long-handed"
alternative to using the Timer modules in the normal way?
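To show what I mean by "long-handed", here is the sort of scheme I
have been sketching: instead of trusting each individual timer delay,
pace a loop against absolute deadlines so the error of any one sleep
does not accumulate. (The packet exchange would go where the comment
is, and in the real program I assume the results would be handed back
to the GUI with something like wx.CallAfter.)

import time

PERIOD = 1.0 / 30.0                  # 33.33 ms target

try:
    tick = time.perf_counter         # Python 3.3 and later
except AttributeError:
    tick = time.clock                # Python 2 on Windows

prev = tick()
next_deadline = prev + PERIOD
for frame in range(150):             # run for about five seconds
    delay = next_deadline - tick()
    if delay > 0:
        time.sleep(delay)            # sleep until the absolute deadline
    now = tick()
    # ... the short packet exchange would go here ...
    print("frame %3d  interval %5.1f ms" % (frame, (now - prev) * 1000.0))
    prev = now
    next_deadline += PERIOD
    if next_deadline < now:          # fell badly behind; resynchronize
        next_deadline = now + PERIOD

Is that the kind of workaround people actually use, or is there a more
standard trick?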
I can't believe a programming language as powerful as Python could be
stymied by something as fundamental as this. It's obviously just my
incredible ignorance in this area; hopefully, a little explanation
will help eliminate my confusion.
This is what I meant when I mentioned that my questions were more
fundamental than just requesting examples on implementing the Timer
module itself.
Another issue is how to empirically verify that the Timer routines
operate correctly, and at the right interval. I'm not entirely sure of
the best way to do that purely in software, but here are some ideas I
wanted to run by everyone:
One test was this: I took the Timer example in "wxPython in Action",
deactivated the clock frame display, and set up the "OnTimer" method
to count the number of timer events it received. I printed that count
to the command prompt window each time it corresponded to another 15
seconds of elapsed time; the 15-second spacing was meant to minimize
any processing beyond the timing itself. I started the Timer with
different millisecond settings to see how it handled them, and after
30 minutes of elapsed time I compared the printed count against a
stopwatch.
I found that a Windows 98 machine was "slipping" horribly at Timer
intervals of 100, 250, and 500 milliseconds, while the same tests on a
Windows XP machine tracked the actual time much more closely. The
problem is that this was a pretty crude way to check it, and I'm not
sure I really measured anything conclusively.
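One software refinement I have thought of since is to timestamp every
Timer event with the high-resolution counter and report the spread
directly, instead of eyeballing a stopwatch. Something like this (the
interval and sample count are arbitrary):

import time
import wx

try:
    tick = time.perf_counter        # Python 3.3 and later
except AttributeError:
    tick = time.clock               # Python 2 on Windows

INTERVAL_MS = 33
SAMPLES = 300                       # roughly ten seconds' worth

class JitterFrame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, None, title="Timer jitter test")
        self.deltas = []
        self.last = None
        self.timer = wx.Timer(self)
        self.Bind(wx.EVT_TIMER, self.OnTimer, self.timer)
        self.timer.Start(INTERVAL_MS)

    def OnTimer(self, event):
        now = tick()
        if self.last is not None:
            self.deltas.append(now - self.last)
        self.last = now
        if len(self.deltas) >= SAMPLES:
            self.timer.Stop()
            ms = [d * 1000.0 for d in self.deltas]
            print("requested %d ms: min %.1f  max %.1f  mean %.1f ms"
                  % (INTERVAL_MS, min(ms), max(ms), sum(ms) / len(ms)))
            self.Close()

app = wx.App(False)
JitterFrame().Show()
app.MainLoop()

Even so, I am not sure how far a software clock should be trusted to
judge another software clock.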
However, as a mostly hardware-based designer, I came up with two
alternative ideas that I believe would take all the guesswork out of
the issue:
One idea is to install the PySerial module and, on every Timer event,
toggle a control line on the serial port. I then put an oscilloscope
on that pin and measure the resulting square wave. This would also
show me any Data Frame Jitter, if it is occurring.
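As a sketch of that first idea (the port name is a placeholder, and
the scope probe would go on the RTS control line):

import serial                      # PySerial
import wx

PORT = "COM1"                      # placeholder port name
INTERVAL_MS = 33

class ScopeFrame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, None, title="RTS toggle test")
        self.ser = serial.Serial(PORT)   # defaults are fine for control lines
        self.level = False
        self.timer = wx.Timer(self)
        self.Bind(wx.EVT_TIMER, self.OnTimer, self.timer)
        self.timer.Start(INTERVAL_MS)

    def OnTimer(self, event):
        self.level = not self.level
        self.ser.setRTS(self.level)      # newer PySerial: self.ser.rts = self.level

app = wx.App(False)
ScopeFrame().Show()
app.MainLoop()

Each full square-wave period on the scope should then span two timer
intervals, and any jitter shows up directly as edge movement.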
Another idea is to send a brief UDP packet out the Ethernet port
instead of toggling the serial line, and then monitor the packets with
Wireshark, which stamps each received packet with a
microsecond-resolution arrival time. By the way, how are THEY able to
get such accurate time within THEIR program?
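The UDP variation would be nearly the same skeleton, just sending a
numbered packet instead of toggling a pin (the broadcast address and
port are arbitrary):

import socket
import wx

DEST = ("192.168.0.255", 9999)     # placeholder broadcast address and port
INTERVAL_MS = 33

class BeaconFrame(wx.Frame):
    def __init__(self):
        wx.Frame.__init__(self, None, title="UDP beacon test")
        self.sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        self.sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        self.count = 0
        self.timer = wx.Timer(self)
        self.Bind(wx.EVT_TIMER, self.OnTimer, self.timer)
        self.timer.Start(INTERVAL_MS)

    def OnTimer(self, event):
        self.count += 1
        self.sock.sendto(("tick %d" % self.count).encode("ascii"), DEST)

app = wx.App(False)
BeaconFrame().Show()
app.MainLoop()

Wireshark's inter-packet times would then give me the same jitter
picture without any extra hardware.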
These are the main issues I am in the dark about, so I should stop
here. If this post is too long, I apologize. I don't expect any one
person to try to chew on this whole issue themselves. If someone has
an obvious answer to one or two of the above questions, perhaps they
could post their thoughts on just those. That way I can get the
answers over time, without taking too much of anyone's time.
Thanks for your kind attention,
Regards,
Kit