So far I have spent zero time on the G1 vs. G7 issue. Tell me, though: if I check my drops using G1 out to my self-imposed max range, does it then matter whether I ever switch over to using G7 instead of G1? In other words, is G7 an advantage only for longer-than-drop-verified distances? Or are there interpolation errors inherent in the use of G1 that don't exist when using G7?
When shooting at long range, bullets will cover a wide velocity range. If you've "tweaked" the drops at multiple velocity bands (as Sierra publishes), you can't really say you're using the G1 model, only a series of interpolations. Doing that may give better results than switching to the G7 model. There are several ways to fudge the numbers to get a better match to a set of measured data points, and doing that may or may not give better matches to points in between. The ratio between the G1 and G7 models changes slowly down to transonic velocities, typically below 1300 fps; below that it can vary wildly because of the way the supersonic shock waves form and detach on the two model shapes. It returns to being a roughly constant ratio below around 1000 fps, but what that ratio is depends on the individual bullet shape.
If you take sufficient data points and velocity intervals, you'll get the same quality fit by adjusting either G1 or G7 coefficients. What you're really doing is making another model to match your particular bullet. The argument for using the G7 model instead of the G1 is that it's a closer starting point for long-ogive boattail bullets. Either model can be tweaked to give a better fit. Some programs, like those from Pejsa, don't use BCs at all; if you're going to curve-fit to match actual drop measurements, that's a simpler (and more logical) method. Whichever method you tweak, if it gives correct drops it will also give correct wind deflection. A sketch of the stepped-BC idea follows below.
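To make the "series of interpolations" point concrete, here's a minimal Python sketch of a Sierra-style stepped BC lookup. The velocity bands and BC values are made-up numbers, not data for any real bullet:

```python
# Hypothetical Sierra-style stepped G1 BCs: (lower velocity bound in fps, BC used above it).
# These numbers are illustrative only, not for any real bullet.
STEPPED_G1_BC = [
    (2600.0, 0.505),
    (2100.0, 0.496),
    (1600.0, 0.485),
    (0.0,    0.470),
]

def g1_bc_for_velocity(v_fps: float) -> float:
    """Return the banded G1 BC for the bullet's current velocity."""
    for lower_bound, bc in STEPPED_G1_BC:
        if v_fps >= lower_bound:
            return bc
    return STEPPED_G1_BC[-1][1]

print(g1_bc_for_velocity(2800.0))  # 0.505
print(g1_bc_for_velocity(1800.0))  # 0.485
```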
It may not be obvious, but drop is directly locked to time of flight:
D = 1/2 * G * T^2

where D = drop, G = acceleration of gravity, and T = time of flight to that point on the trajectory. I believe Isaac Newton first published it.
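As a quick sanity check of that relationship, here's a small Python sketch; the 1.2 second flight time is just an assumed example value:

```python
G_FPS2 = 32.174  # acceleration of gravity in ft/s^2

def drop_from_tof(tof_s: float) -> float:
    """Gravity drop in feet for a given time of flight: D = 1/2 * G * T^2."""
    return 0.5 * G_FPS2 * tof_s ** 2

# Example: an assumed 1.2 s time of flight gives about 23.2 ft of drop.
print(drop_from_tof(1.2))
```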
Crosswind deflection is also locked to time of flight by:
Dw = Vw * (Tf - X/Vm)
where Dw is the deflection caused by wind, Vw is the crosswind velocity, Tf is the actual time of flight to that point on the trajectory, X is the distance to that point on the trajectory, and Vm is muzzle velocity. That is Didion's equation.
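And a sketch of Didion's equation in Python, with assumed example numbers (10 mph full-value crosswind, 600 yards, 2800 fps muzzle velocity, 0.80 s time of flight):

```python
MPH_TO_FPS = 5280.0 / 3600.0  # about 1.467 fps per mph

def wind_deflection_ft(vw_fps: float, tof_s: float, x_ft: float, vm_fps: float) -> float:
    """Didion's equation, Dw = Vw * (Tf - X/Vm). Distances in feet, times in seconds."""
    return vw_fps * (tof_s - x_ft / vm_fps)

# Assumed example: 10 mph crosswind, 600 yd (1800 ft) target,
# 2800 fps muzzle velocity, 0.80 s actual time of flight.
dw = wind_deflection_ft(10.0 * MPH_TO_FPS, 0.80, 1800.0, 2800.0)
print(round(dw * 12, 1), "inches")  # roughly 28 inches
```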
Notice that BCs don't appear in either equation. They are only needed if you don't know the time of flight and need to calculate it from the bullet's drag characteristics, the atmosphere, and the muzzle velocity. A BC is just a number that scales the drag-vs-velocity table of a standard model of one bullet shape. Those standard US Army models are numbered 1 through 7. They could have been any projectiles the US Army was using at the time, but they made the firing tests and measurements to determine the drag curve for each one. Those weren't easy tests to make with the instrumentation of the early 20th century. Today it's much easier to measure the drag function for each projectile design, using millimeter-wave radar to get a continuous velocity, distance, and time profile.
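Here's a minimal sketch of what "scaling a standard drag table" works out to in practice. The Mach/retardation numbers below are placeholders, not the real G1 or G7 data; the point is only that the bullet's deceleration is looked up from the standard curve and scaled by the BC (a higher BC means proportionally less drag than the reference shape):

```python
# Placeholder stand-in for a standard drag table: Mach number -> retardation of the
# reference projectile in ft/s^2 (values are made up, NOT the real G1 or G7 data).
STANDARD_DRAG_TABLE = [
    (0.8, 90.0),
    (1.0, 260.0),
    (1.5, 240.0),
    (2.0, 200.0),
    (2.5, 170.0),
]

def standard_retardation(mach: float) -> float:
    """Linearly interpolate the standard-projectile retardation at a given Mach number."""
    pts = STANDARD_DRAG_TABLE
    if mach <= pts[0][0]:
        return pts[0][1]
    for (m0, r0), (m1, r1) in zip(pts, pts[1:]):
        if mach <= m1:
            return r0 + (r1 - r0) * (mach - m0) / (m1 - m0)
    return pts[-1][1]

def bullet_retardation(mach: float, bc: float) -> float:
    """The actual bullet decelerates at the standard rate scaled by 1/BC."""
    return standard_retardation(mach) / bc

print(bullet_retardation(2.0, 0.5))  # twice the (made-up) standard value
```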
If you have enough data points you can correct either the G1 or the G7 model to give an exact match to the actual drag function. But then what would be the point of using BCs at all, instead of just using the actual drag function for that particular projectile design?
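For illustration, here's a toy curve-fit in the spirit of that tweaking: grid-search the single BC that best reproduces a few "measured" drops. The drag model and the measured numbers are invented for the example; only the fitting idea carries over:

```python
# Toy illustration of "tweaking" a single BC to match measured drops.
# The drag term (v^2 / BC times a made-up constant) is a crude stand-in, NOT the
# real G1 or G7 function, and the "measured" drops below are invented numbers.

def simulate_drops(bc: float, mv_fps: float, ranges_yd: list[float],
                   k: float = 0.000046) -> list[float]:
    """Flat-fire point-mass integration: gravity drop (inches) at each range."""
    g = 32.174          # ft/s^2
    dt = 0.001          # s
    drops = []
    x, v, t = 0.0, mv_fps, 0.0
    targets = iter(sorted(ranges_yd))
    target = next(targets, None)
    while target is not None:
        if x >= target * 3.0:                      # reached this range (yards -> feet)
            drops.append(0.5 * g * t * t * 12.0)   # D = 1/2 * G * T^2, in inches
            target = next(targets, None)
            continue
        v -= k * v * v / bc * dt                   # toy drag deceleration
        x += v * dt
        t += dt
    return drops

# Invented "measured" drops at 300/500/700 yd for a 2800 fps load.
ranges = [300.0, 500.0, 700.0]
measured = [21.7, 64.0, 132.0]

# Brute-force search for the BC that minimizes the summed squared drop error.
best_bc, best_err = None, float("inf")
for i in range(30, 71):
    bc = i / 100.0
    err = sum((p - m) ** 2 for p, m in zip(simulate_drops(bc, 2800.0, ranges), measured))
    if err < best_err:
        best_bc, best_err = bc, err
print("best-fit BC:", best_bc)  # about 0.50 for these invented numbers
```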