In the area I hunt elk in Colorado, most of my shots are from 200-300 ft above the elk at ranges of 300 yds or more. I was considering getting an angle cosine indicator, but I ran some math using the Pythagorean theorem. Assuming I am 300 ft (100 yds) above the elk, when the lasered (line-of-sight) distance is 300 yds, the horizontal distance works out to about 283 yds. As the distance to target increases, the difference between the line-of-sight distance and the horizontal distance gets smaller, assuming the same 300 ft of vertical separation.
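For anyone who wants to check the numbers, here's a quick sketch of that Pythagorean arithmetic (the 100 yd elevation figure is from my situation above; horizontal distance is the square root of the lasered distance squared minus the vertical distance squared):

```python
import math

ELEVATION_YDS = 100  # 300 ft above the elk, converted to yards

# Compare lasered (line-of-sight) distance to horizontal distance
# at a few ranges, holding the vertical separation constant.
for slant in (300, 400, 500):
    horizontal = math.sqrt(slant**2 - ELEVATION_YDS**2)
    print(f"lasered {slant} yd -> horizontal {horizontal:.0f} yd "
          f"(difference {slant - horizontal:.1f} yd)")
```

Running it shows the difference shrinking from about 17 yds at a lasered 300 to about 10 yds at 500, which is what got me wondering whether the indicator is worth it.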
If my reasoning about the relationship between line-of-sight and horizontal distance is valid, I don't think I need to worry about compensating for the difference between the two when I figure bullet drop, unless I'm shooting at a much steeper uphill or downhill angle than I ever would in that area. Am I correct? What say you guys?
Thanks in advance,
Sam