Originally Posted by ChrisF
I’m not convinced that’s a realistic assumption. Shooting groups at 100 yards with all load increments, where trajectory differences are negligible, would reveal that the precision of the various increments is often inconsistent enough to be comparable to the trajectory dispersion between increments at longer distance.
Yes, the signal gets lost in the noise, as they say. Hence, as Audette described, he shot at 300 yards to boost the signal, and to reduce the noise he shot prone off a front bag, with a scope (even though the end use was iron sights for competition), during the early morning when the wind was calmest.

Curious what the results of your testing were like?
There are two problems with that. The first is that dispersion due to the rifle’s limited precision and dispersion due to barrel position at the moment the bullet is released are both angular quantities, so the signal and the noise are amplified equally as distance increases. The second is that increased distance brings additional sources of noise, like wind, as you mentioned.
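The angular-scaling point can be sketched with a quick Monte Carlo. The 0.2 mrad trajectory offset ("signal") and 0.3 mrad rifle dispersion ("noise") below are assumed numbers for illustration only: because both quantities grow linearly with distance, the signal-to-noise ratio comes out the same at 100 meters as at 300 meters.

```python
import random
import statistics

def snr(distance_m, offset_mrad=0.2, sigma_mrad=0.3, shots=2000, seed=1):
    """Ratio of mean vertical offset to vertical spread at a given distance."""
    rng = random.Random(seed)
    # Angular deviation of each shot: a fixed trajectory offset between
    # increments plus random angular dispersion from the rifle and shooter.
    angles = [offset_mrad + rng.gauss(0.0, sigma_mrad) for _ in range(shots)]
    # 1 mrad subtends 1 mm per meter of distance, so impacts scale linearly.
    impacts_mm = [a * distance_m for a in angles]
    return statistics.mean(impacts_mm) / statistics.pstdev(impacts_mm)

print(snr(100), snr(300))  # prints two essentially identical ratios
```

Wind, by contrast, deflects the bullet more than linearly with distance, so it adds noise at long range without adding any signal.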

I experimented with the Audette method with a few different rifles at various distances. In a couple of cases, I repeated identical tests at 600 meters to investigate repeatability, and the weak correlation between the results left me with little confidence in the test’s repeatability.
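For anyone who wants to quantify repeatability the same way, here is a minimal sketch of correlating two repeated ladder tests. The impact heights are hypothetical placeholder numbers, not my actual results; a repeatable test should produce a correlation near 1.

```python
import statistics

# Hypothetical vertical impact heights (mm) for the same ten charge
# increments fired on two different days -- placeholders for illustration.
day1 = [0, 4, 9, 11, 12, 12, 13, 16, 21, 25]
day2 = [2, 8, 5, 14, 10, 18, 12, 15, 26, 22]

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.mean(x), statistics.mean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    return cov / (statistics.pstdev(x) * statistics.pstdev(y) * len(x))

print(f"correlation between repeats: r = {pearson(day1, day2):.2f}")
```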

If the method is truly valid, it should be repeatable. My experience and sample size are very limited, which is why I’m interested in the repeatability of others’ results as well.