Calibration of a chronograph is interesting...

At one time, NIST had a container of speed standards, but when they opened it, the standards sped off into the bush and have not been seen since. That's a joke, of course. But it is true that there is no such thing as an NIST speed standard.

Speed is a derived quantity, computed from distance and time. We do have NIST-traceable standards for those two quantities.

The distance between the photosensors is fixed.

Crystal-controlled oscillators are used to measure time. Even a cheap crystal is very stable; it will age a bit when first put into service, but it will not change enough to matter. If you want to check the oscillator frequency, you can make a little pickup loop, place it near the circuitry, and feed the signal to a frequency counter. But that won't be very revealing: the crystal is so stable over time that you'll always get an answer close enough to nominal that the difference won't matter.
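
To put numbers on it, here is a minimal sketch in Python of how the velocity falls out of the fixed sensor spacing and the counted clock ticks, and how little a realistic crystal error moves the answer. The 2 ft spacing, 4 MHz clock, and 50 ppm error figure are illustrative assumptions, not the specs of any particular chronograph.

    # Sketch: how an optical chronograph turns a tick count into velocity.
    # All numbers are illustrative assumptions, not any maker's specifications.
    SPACING_FT = 2.0        # assumed distance between the photosensors (fixed)
    CLOCK_HZ = 4_000_000    # assumed nominal crystal oscillator frequency

    def velocity_fps(ticks, clock_hz=CLOCK_HZ, spacing_ft=SPACING_FT):
        """Velocity = distance / time, where time = ticks / clock frequency."""
        return spacing_ft / (ticks / clock_hz)

    # A bullet near 3000 ft/s crosses 2 ft in about 667 microseconds,
    # which a 4 MHz clock counts as roughly 2667 ticks.
    ticks = 2667
    print(f"nominal clock : {velocity_fps(ticks):.1f} ft/s")

    # A 50 ppm shift in the crystal changes the answer by the same 50 ppm:
    print(f"clock +50 ppm : {velocity_fps(ticks, clock_hz=CLOCK_HZ * (1 + 50e-6)):.1f} ft/s")
    # The two results differ by about 0.15 ft/s at 3000 ft/s, which is why
    # checking the crystal with a frequency counter is not very revealing.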

Once the chronograph is calibrated at the factory, it pretty well stays calibrated.

Now the LabRadar works very differently. The outgoing signal is mixed with the return signal to create a "beat frequency" that is proportional to the bullet's speed, and it is trivial to measure that frequency to four or five significant digits. Ken Oehler has said that getting the effective spacing between photosensors to better than 1/8" was difficult. The LabRadar overcomes that problem.
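
For comparison, here is a sketch of the radar arithmetic. The relation f_beat = 2 * v * f_tx / c is standard continuous-wave Doppler physics; the 24.1 GHz transmit frequency below is only an assumed K-band value for illustration, not a quoted LabRadar specification.

    # Sketch: velocity from a Doppler beat frequency (CW radar chronograph).
    # The transmit frequency is an assumed K-band value, for illustration only.
    C_FPS = 983_571_056.0   # speed of light, in feet per second
    F_TX_HZ = 24.1e9        # assumed transmit frequency

    def velocity_from_beat(f_beat_hz, f_tx_hz=F_TX_HZ):
        """CW Doppler: f_beat = 2 * v * f_tx / c, so v = f_beat * c / (2 * f_tx)."""
        return f_beat_hz * C_FPS / (2.0 * f_tx_hz)

    # A bullet receding at about 3000 ft/s produces a beat near 147 kHz:
    f_beat = 2 * 3000.0 * F_TX_HZ / C_FPS
    print(f"beat frequency: {f_beat / 1e3:.1f} kHz")
    print(f"velocity      : {velocity_from_beat(f_beat):.1f} ft/s")
    # Measuring a ~147 kHz tone to four or five significant digits is routine
    # signal processing, and no physical sensor spacing has to be held to 1/8".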


Be not weary in well doing.