Somebody already mentioned some of my previous writings on this, but we can actually figure out the difference in potential aiming ability with various scope magnifications.

Most people think 20/20 vision is "perfect," but in reality it's only average: it's the level of detail most humans can make out at 20 feet. Some people have 20/15 vision (meaning they can see at 20 feet what most people can only see at 15) and a few have even better vision.

Somebody with 20/20 vision can "define" just about an inch at 100 yards, meaning they can tell the difference between alternating half-inch black-and-white lines. Beyond 100 yards the lines disappear and look gray, just like a zebra a long way away.
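
For anybody who wants to check that figure: 20/20 vision is usually defined as resolving detail that spans about one minute of arc, and one arcminute at 100 yards works out to just over an inch. A quick back-of-the-envelope sketch in Python (nothing here comes from a ballistics program; it's just trigonometry):

```python
import math

# 20/20 vision is usually defined as resolving detail that subtends 1 minute of arc.
ARCMINUTE_RAD = math.radians(1 / 60)

range_inches = 100 * 36  # 100 yards expressed in inches

# Size of a 1-arcminute detail at that range (small angle, so tan is essentially the angle).
detail_inches = range_inches * math.tan(ARCMINUTE_RAD)
print(f"1 arcminute at 100 yards is about {detail_inches:.2f} inches")  # ~1.05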

Scopes shrink how much somebody with 20/20 vision can define at 100 yards by the amount of magnification: a 10x scope shrinks it to 1/10 inch, and a 6x scope to 1/6 inch. 1/10 of an inch is, of course, .1", and 1/6 of an inch is .1666". The difference in potential aiming error is .0666", slightly more than 1/16 of an inch (.0625"), which ain't much.
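
Here's that same arithmetic in a couple of lines of Python, so you can plug in whatever magnifications you like; the 1-inch starting figure is just the 20/20 number from above:

```python
unaided_inches = 1.0  # what a 20/20 eye can define at 100 yards, per the above

for power in (6, 10):
    print(f"{power}x scope: {unaided_inches / power:.4f} inch")

difference = unaided_inches / 6 - unaided_inches / 10
print(f"difference between 6x and 10x: {difference:.4f} inch (1/16 inch = {1/16:.4f})")
```

That prints .1000" for 10x, .1667" for 6x, and a difference of .0667", right beside 1/16" (.0625").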

Consequently, that difference is usually obscured by other factors, such as not firing enough groups to detect it, which is common among handloaders who settle on a "pet load" on the strength of one small group.

Other factors are scope parallax (not many shooters know how to look for it, much less compensate for it), the built-in error of most handloads, and unsuitable targets for the reticle.

But probably the biggest is wind, something very few handloaders consider when testing ammo at 100 yards. A 1-mph breeze (which most would consider dead calm) will drift a typical .30 caliber, 150-grain spitzer boattail fired at 3000 fps just about as much as the difference in aiming error between 6x and 10x.
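
One way to check the wind claim is the old "lag time" rule: crosswind drift equals the crosswind speed multiplied by the difference between the bullet's actual time of flight and the time of flight it would have in a vacuum. The sketch below assumes a 100-yard retained velocity of about 2770 fps for a 150-grain .30-caliber boattail started at 3000 fps; that number is my estimate, not from any particular load:

```python
# Lag-time approximation: drift = crosswind speed x (actual time of flight - vacuum time of flight).

MPH_TO_IN_PER_SEC = 5280 * 12 / 3600  # 1 mph = 17.6 inches per second

range_ft = 300          # 100 yards
muzzle_fps = 3000
retained_fps = 2770     # assumed velocity at 100 yards for a 150-gr .30 boattail (my estimate)

vacuum_tof = range_ft / muzzle_fps                         # 0.100 s
actual_tof = range_ft / ((muzzle_fps + retained_fps) / 2)  # ~0.104 s, using the average velocity

wind_mph = 1.0
drift_inches = wind_mph * MPH_TO_IN_PER_SEC * (actual_tof - vacuum_tof)
print(f"1-mph drift at 100 yards: about {drift_inches:.3f} inch")  # ~0.07
```

With those assumptions it comes out around .07 inch, right in line with the .0666-inch aiming difference above.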
