In the context of wind's contribution toward drift, I'm pretty sure you got it backwards, though I'd have to look around a bit to find supporting evidence.
Drift occurs because a bullet slows. If the bullet did not slow from the muzzle to the target, no amount of wind would cause drift. So the wind value is applied to the difference between the time of flight of a hypothetical bullet that never slows and your actual time of flight. And that slowing happens at the highest rate nearest the muzzle.
For example:
In space, a bullet might take 0.50 seconds to travel 1,000 yds.
In our atmosphere, it might take 1.00 seconds.
Any wind would apply to that 1/2 sec of TOF difference.
And by far, most of that difference accumulates nearest the muzzle, since a bullet slows relatively little further downrange.
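Here's a rough sketch of that lag-time idea in Python, using the made-up numbers from the example above (0.50 s with no slowing, 1.00 s actual) and an assumed 10 mph full-value crosswind; none of these values come from real dope, they just show how the arithmetic works.

```python
# Lag-time (Didion-style) drift sketch with made-up numbers.
# Drift ~ crosswind speed x (actual TOF - vacuum TOF).

MPH_TO_FPS = 1.46667            # 1 mph is about 1.4667 ft/s

tof_vacuum_s = 0.50             # hypothetical bullet that never slows
tof_actual_s = 1.00             # real bullet in atmosphere
lag_s = tof_actual_s - tof_vacuum_s   # the 1/2 sec the wind "works on"

crosswind_mph = 10.0            # assumed full-value crosswind
drift_ft = crosswind_mph * MPH_TO_FPS * lag_s
drift_in = drift_ft * 12.0

print(f"lag time: {lag_s:.2f} s, drift: {drift_in:.1f} in at 1,000 yds")
# ~88 inches with these made-up numbers. The point is that drift scales with
# lag time, and most of the lag builds where the bullet slows fastest,
# i.e. nearest the muzzle.
```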
This passes the test with regard to the mechanics of drift, which hold that base drag pulls a bullet downwind while its nose is pointing into the wind.
That base drag would be highest at the highest velocity, so it affects a bullet's direction earliest and therefore for the longest portion of its travel.
Many point to the apparent increase in MOA of drift with distance to support the notion that drift accumulates at an increasing rate downrange, and so downrange wind must be a bigger factor.
But demonstrated drift can't be easily quantified in MOA, because nothing here is linear.
It can't be suggested that 1" of drift printing at 100 yds is ~1 MOA, because much of that deviation may have occurred between 90 and 100 yds. If 0.8" of that 1" deviation happened in the 10 yds between 90 and 100, the bullet could carry on and print roughly 8 MOA further downrange, even with no wind contributing beyond 100 yds.
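Quick check of that arithmetic, just as a sketch: 0.8" over the last 10 yds is a crossing rate of 8" per 100 yds, which works out to roughly 8 MOA.

```python
# Implied angular rate if 0.8 in of the 1 in total deviation happened
# in the last 10 yds before the 100 yd target (hypothetical numbers).

lateral_last_10yds_in = 0.8
rate_in_per_100yds = lateral_last_10yds_in * (100.0 / 10.0)   # 8 in per 100 yds
angle_moa = rate_in_per_100yds / 1.047                        # 1 MOA ~ 1.047 in/100 yds

print(f"implied angular rate: {angle_moa:.1f} MOA")           # ~7.6 MOA, call it ~8
```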
Seems like a lot for a bullet to 'steer' so fast, but consider drifts at 1,000 yds; they don't correlate to wind speed at all.
Anyway, I'm thinkin out loud I guess.
Pretty sure near wind is a bigger factor than far wind.