It all comes down to how close you get the scope aligned to the trajectory. The flatter the trajectory, the less it matters.
Say your scope is canted by only 1 degree and you’re shooting a 45-70 @ 1500 FPS at 400 yards. Your bullet will drop roughly 143 inches at that range, so once you’ve adjusted for that drop with the turrets or holdover, the cant will put you off in windage at that range by about 2.5 inches, left or right depending on which way the scope is tilted.
On the other hand, if you’re shooting a 270 @ 3050 FPS at the same range, your bullet drop will only be 23 inches, and your error will only be 0.4 inches.
It’s a simple right-triangle trig calculation: side b is your bullet drop, and angle A is your scope’s deviation in degrees from plumb. Solve for side a (error = drop × tan A) to find your error. It doesn’t matter how far away the target is; all that matters is how far your bullet drops. Obviously, with a particular rifle, the farther the target is from you, the more important this becomes, since the increasing bullet drop magnifies the scope alignment error on target.
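That triangle is easy to check yourself. Here’s a quick Python sketch using the drop figures above (the function name is mine, just for illustration):

```python
import math

def cant_error(drop_inches, cant_degrees):
    """Horizontal error (inches) on target from a canted scope.

    The bullet drop is the side adjacent to the cant angle;
    the windage error is the side opposite it.
    """
    return drop_inches * math.tan(math.radians(cant_degrees))

# The two examples from above, both with a 1-degree cant:
print(round(cant_error(143, 1), 1))  # 45-70 @ 1500 FPS, 400 yd -> ~2.5 in
print(round(cant_error(23, 1), 1))   # 270 @ 3050 FPS, 400 yd -> ~0.4 in
```

Plug in your own drop chart numbers and whatever cant you suspect, and you’ll see how fast the error grows with a slow, loopy trajectory.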
Does this apply to you?
Depends on how fast your rifle gets the bullet to the target at 400 yards, and how jacked up your scope is. Remember, the figures above are for a deviation of only 1 degree, which is pretty hard to eyeball. I often replace scopes for gents who mounted them by “using the Force,” and, judging by my plumb line, they’re off by as much as 10 degrees.
Check it out.