Originally Posted by nighthawk
Lessee... All common microcontrollers will clock at 20 MHz or better, and we'll concede four clock cycles per machine cycle. So that gets us a timing resolution of 1/5e6, or 200 nanoseconds per tick. Assume a velocity of 3,200 fps and a screen spacing of 1 foot. So you get 1/3,200 = 312.5 microseconds between the screens. Now it is in the nature of digital systems to be accurate, all things being equal, to plus or minus one tick, or 200 nanoseconds here. That's 312.3 to 312.7 microseconds, or 3197.953 to 3202.049 fps, a worst-case spread of about 4.1 fps (call it ±2 fps). If your chronograph is old and its timer only resolves a full microsecond (say a 4 MHz clock), the spread is about 20 fps, or roughly ±10 fps. Remember that's worst case; the mean is still 3200 fps, so it depends on sample size.


Thank you, very informative.

And assuming "ceteris paribus" (all things being equal) holds, of course.
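
If anyone wants to check the arithmetic, here's a quick Python sketch of the ±1-tick worst case. The 20 MHz clock, 4 cycles per machine cycle, 1 ft screen spacing, and 3,200 fps are just the figures assumed in the quoted post, not measurements from any particular chronograph:

CLOCK_HZ = 20_000_000          # assumed 20 MHz crystal
CYCLES_PER_TICK = 4            # assumed 4 clock cycles per machine cycle
SCREEN_SPACING_FT = 1.0        # assumed distance between the two screens
VELOCITY_FPS = 3200.0          # assumed true bullet velocity

tick_s = CYCLES_PER_TICK / CLOCK_HZ           # timer resolution: 200 ns
transit_s = SCREEN_SPACING_FT / VELOCITY_FPS  # true flight time: 312.5 us

# Worst case: the counted time is off by one tick in either direction
v_low  = SCREEN_SPACING_FT / (transit_s + tick_s)
v_high = SCREEN_SPACING_FT / (transit_s - tick_s)

print(f"tick       = {tick_s * 1e9:.0f} ns")
print(f"transit    = {transit_s * 1e6:.1f} us")
print(f"velocity   = {v_low:.3f} to {v_high:.3f} fps")
print(f"worst case = +/- {(v_high - v_low) / 2:.2f} fps")

That prints 3197.953 to 3202.049 fps, about ±2 fps. Changing CLOCK_HZ or CYCLES_PER_TICK shows how the error scales; with a 1 microsecond tick it comes out to roughly ±10 fps.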

