The maximum error, i.e. the error when your eye is located right at the edge of the exit pupil, can be calculated as:

Maximum error = 0.5 × d × abs(t - p)/p

where d is the objective lens diameter, t is the target distance, and p is the parallax-free distance. The ratio (t - p)/p is dimensionless, so t and p just need to be in the same unit (yards here); if d is given in mm, the error comes out in mm as well.

So, for example, if we take the OP's question and insert some numbers: say the scope is set to be parallax free at 250 yards and has a 40 mm objective lens. At 100 yards, the maximum error would be 0.5 × 40 × 150/250 = 12 mm; at 500 yards, 20 mm.

If on the other hand our scope were set to be parallax free at 150 yards, the maximum error at 500 yards would be 46 2/3 mm. Since 1 MOA at 500 yards is roughly 133 mm (about 5.2 inches), that is less than 0.4 MOA, and that is with the eye displaced as far as it can be from the axis of the scope without losing the image.
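If you want to sanity-check these numbers, here is a minimal Python sketch of the maximum-error formula above (the function name and argument names are just mine for illustration):

    def max_parallax_error(d_mm, target, parallax_free):
        # Worst-case parallax error in mm, with the eye at the edge of
        # the exit pupil. d_mm is the objective diameter in mm; target
        # and parallax_free can be in any one consistent unit (yards here).
        return 0.5 * d_mm * abs(target - parallax_free) / parallax_free

    # 40 mm objective, parallax-free at 250 yards:
    print(max_parallax_error(40, 100, 250))  # 12.0 mm
    print(max_parallax_error(40, 500, 250))  # 20.0 mm

    # Parallax-free at 150 yards instead:
    print(max_parallax_error(40, 500, 150))  # 46.67 mm, i.e. 46 2/3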

You can also calculate the error for a given eye displacement x from the scope axis if you also plug in the magnification m, as follows:

Error = x × m × abs(t - p)/p, provided that x < 1/2 the exit pupil diameter

(The exit pupil diameter is d/m, so the condition is x < d/(2m); setting x = d/(2m) gives back the maximum-error formula above.)

So, for example, with a scope set to be parallax free at 150 yards, shooting at 500 yards, and with your eye, say, 1 mm from the scope axis at the eye-relief distance, you can calculate at 3× magnification:

Error = 1 × 3 × abs(500 - 150)/150 = 7 mm

While at 15×:

Error = 1 × 15 × abs(500 - 150)/150 = 35 mm
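And a matching Python sketch of the displacement formula, with the exit-pupil limit checked (again the names are mine, and the default d_mm=40 just assumes the 40 mm objective from the examples):

    def parallax_error(x_mm, m, target, parallax_free, d_mm=40):
        # Parallax error in mm for an eye displaced x_mm off the scope
        # axis at magnification m. Only valid while the eye stays inside
        # the exit pupil, whose diameter is d_mm / m.
        if x_mm >= 0.5 * d_mm / m:
            raise ValueError("eye is outside the exit pupil")
        return x_mm * m * abs(target - parallax_free) / parallax_free

    # Parallax-free at 150 yards, target at 500 yards, eye 1 mm off axis:
    print(parallax_error(1, 3, 500, 150))   # 7.0 mm
    print(parallax_error(1, 15, 500, 150))  # 35.0 mm

Note that at 15× the exit pupil of a 40 mm objective is only 40/15 ≈ 2.7 mm across, so a 1 mm offset is already three quarters of the way to its edge, which is why 35 mm is 75% of the 46 2/3 mm maximum.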