jamiebolseth
Well-Known Member
- Joined: Feb 12, 2015
- Messages: 401
I've got a question/topic that I'd like to get people's thoughts on...
I live at about 1000 ft above sea level (ASL) and hunt at about 8000 ft ASL. I zero all my rifles at 100 yds and dial elevation corrections in MOA for shots beyond 100 yds. The specific rifle I'm talking about is a super accurate 7SAUM shooting 175 gr Bergers at 2770 fps. I'm using the Applied Ballistics app for ballistic solutions.
The question is whether my 100 yd zero will change from 1000 ft ASL to 8000 ft ASL. Common sense tells me it has to: the bullet hits more air molecules over the first hundred yds of flight in denser air, so it has to slow down more. The software computes a velocity about 35 fps slower at 100 yds in the denser air, which seems to back up my theory. What I don't know is whether the shift is big enough to even observe, given this rifle's (and my) ability to shoot roughly 0.2-0.3 MOA groups.
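To put rough numbers on that intuition, here's a quick back-of-the-envelope script. It's a minimal sketch, not the AB solver: it assumes an ISA atmosphere and a crude exponential-drag model, and the ~130 fps velocity loss over 100 yds used to calibrate the drag constant is my guess for this bullet, not a number from the app.

```python
# Toy sanity check, NOT the AB solver: ISA air density at both elevations
# plus a crude exponential-drag model, dv/dx = -k*rho*v.
import math

G = 9.80665                  # gravity, m/s^2
X = 91.44                    # 100 yd in meters
V0 = 2770 * 0.3048           # muzzle velocity, m/s
MOA = math.radians(1 / 60)   # one MOA in radians

def isa_density(alt_ft):
    """ISA troposphere air density (kg/m^3) at an altitude given in feet."""
    h = alt_ft * 0.3048
    return 1.225 * (1.0 - 2.25577e-5 * h) ** 4.2561

def flight(rho, k, rng):
    """Velocity (m/s) and flat-fire gravity drop (m) at range rng (m),
    for the toy drag model dv/dx = -k*rho*v."""
    decay = k * rho * rng                            # dimensionless
    tof = (math.exp(decay) - 1.0) / (k * rho * V0)   # time of flight, s
    return V0 * math.exp(-decay), 0.5 * G * tof ** 2

rho_lo, rho_hi = isa_density(1000), isa_density(8000)
# Calibrate k: ASSUME 2770 fps drops to ~2640 fps over 100 yds at 1000 ft.
k = -math.log(2640 / 2770) / (rho_lo * X)

v_lo, drop_lo = flight(rho_lo, k, X)
v_hi, drop_hi = flight(rho_hi, k, X)
print(f"air density:     {rho_lo:.3f} vs {rho_hi:.3f} kg/m^3")
print(f"100 yd velocity: {v_lo / 0.3048:.0f} vs {v_hi / 0.3048:.0f} fps")
print(f"zero shift:      {(drop_lo - drop_hi) / X / MOA:.3f} MOA")
```

For me this prints a 100 yd velocity split of about 25 fps (same ballpark as the ~35 fps the app shows) and a zero shift of roughly 0.02 MOA, an order of magnitude below one 1/4 MOA click and far below what a 0.2-0.3 MOA rifle can resolve.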
The AB software has an option called "Enable Zero Atmosphere", which seems like exactly the feature I'm looking for. My understanding is that I enter my zero conditions for the 1000 ft ASL environment, and when I travel to 8000 ft the software accounts for the fact that the shot conditions are significantly different from the zero conditions. For example, I expected a small correction of, say, D0.25 (0.25 MOA down) at 100 yds when shooting at 8000 ft ASL, since the bullet is traveling through less dense air and should hit slightly high. But that's not what I see: I always get a 0.0 correction at 100 yds.
Then I thought, well, maybe 100 yds just isn't far enough to see the difference (the bullet is still flying too fast and flat to notice). So I set my zero distance in the software to 1500 yds, with "Enable Zero Atmosphere" turned on, and calculated a solution for 1500 yds in exactly the same conditions as my zero atmosphere (1000 ft ASL). Not surprisingly, I got a correction of 0.0 at 1500 yds. Then I changed the shot conditions to 8000 ft ASL, expecting a correction at 1500 yds to account for the difference in air density. BUT - I still got a correction of 0.0 to hit at 1500 yds.
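Appending to the sketch above (same toy model, same assumptions), here's what I'd expect the 1500 yd experiment to show if the zero atmosphere were really held at 1000 ft:

```python
# Continuing the script above: zero at 1500 yds in the 1000 ft atmosphere,
# then launch at that same angle at 8000 ft and see where the bullet
# lands relative to the sight line.
R = 1500 * 0.9144                            # 1500 yd in meters

theta = flight(rho_lo, k, R)[1] / R          # small-angle zero at 1000 ft
high = theta * R - flight(rho_hi, k, R)[1]   # height above sight line at 8000 ft
print(f"1500 yd impact at 8000 ft: {high:.1f} m high ({high / R / MOA:.1f} MOA)")
```

The toy model puts the impact several meters (on the order of 10 MOA) high at 8000 ft with a 1000 ft zero, so a 0.0 correction at 1500 yds is nothing like what I'd expect if the zero conditions were actually being carried over.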
Either I don't understand what the "Enable Zero Atmosphere" feature does, or it doesn't work, or I'm just confused altogether.
All this started because my perfect 100 yd zero is "between clicks" on the 100 yd target at 1000 ft ASL, and I got to wondering whether it made sense to zero slightly low, given that I'd be hunting at higher elevation. So I set out to confirm my thinking with the software, but I can't make sense of what I'm seeing.
Can anybody enlighten me?