To my mind, 150 fps is practically meaningless at 1,000 yds. Here's how I look at it. When shooting long distance (typically defined as 1,000+ yds), we dial up for our shots. Gravity is a constant, so for slower bullets we simply dial another MOA or two. No big deal. However, it is the wind and not gravity that gets us into trouble at longer ranges.
Assumption: 300 gr. Berger, shot at 7,000' elevation, 49 degrees F, and a 10 mph wind.
2,950 fps muzzle velocity yields a wind call of 3.05 MOA and 2,133 fps at 1k yds.
2,800 fps muzzle velocity yields a wind call of 3.30 MOA and 2,008 fps at 1k yds.
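To put those two wind calls in linear terms, here is a minimal sketch that converts them to inches at 1,000 yds. The only numbers used are the solver outputs above; the conversion assumes the standard 1 MOA ≈ 1.047 inches per 100 yds.

```python
# Compare the two wind calls quoted above and express them in inches
# at 1,000 yds, using the standard MOA-to-inches conversion.

RANGE_YDS = 1000
MOA_INCHES_PER_100YDS = 1.047

loads = {
    "2,950 fps MV": {"wind_call_moa": 3.05, "vel_1k_fps": 2133},
    "2,800 fps MV": {"wind_call_moa": 3.30, "vel_1k_fps": 2008},
}

def moa_to_inches(moa: float, range_yds: float) -> float:
    """Convert an angular hold in MOA to linear inches at the given range."""
    return moa * MOA_INCHES_PER_100YDS * (range_yds / 100)

for label, d in loads.items():
    inches = moa_to_inches(d["wind_call_moa"], RANGE_YDS)
    print(f"{label}: {d['wind_call_moa']:.2f} MOA wind call = {inches:.1f} in at {RANGE_YDS} yds")

delta_moa = loads["2,800 fps MV"]["wind_call_moa"] - loads["2,950 fps MV"]["wind_call_moa"]
print(f"Difference: {delta_moa:.2f} MOA = {moa_to_inches(delta_moa, RANGE_YDS):.1f} in at {RANGE_YDS} yds")
```

Running it shows the two calls differ by 0.25 MOA, or roughly 2.6 inches at 1,000 yds.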
Out west, the winds are rarely constant in speed or direction, so most shooters dial for elevation (gravity/air density) but hold for the wind, making last-second adjustments based on what the wind is doing at the moment of the shot. The 0.25 MOA difference between the two muzzle velocities is almost impossible to compensate for with most reticles when holding for wind. In the end, we just add it to our fudge factor.
The only other criterion I look at when analyzing ballistic data is the downrange velocity, to ensure the bullet still has enough speed to open properly and perform as it should. For Bergers, that tends to be a minimum of 1,800 fps. Given the same bullet, both muzzle velocities mentioned above produce acceptable wind deflection and retain enough downrange speed to ensure optimum bullet performance. That is my reasoning behind the statement that a 150 fps muzzle velocity variance doesn't really matter much.
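For completeness, a quick sketch of that last check: it simply compares the 1,000 yd velocities quoted above against the ~1,800 fps expansion floor cited for Berger bullets. Nothing here comes from a new solver run; it is just the arithmetic on the numbers already in the post.

```python
# Check the 1,000 yd impact velocities against the ~1,800 fps minimum
# needed for reliable bullet expansion (figure cited above for Bergers).

MIN_EXPANSION_FPS = 1800

velocities_at_1k = {"2,950 fps muzzle": 2133, "2,800 fps muzzle": 2008}

for label, v in velocities_at_1k.items():
    margin = v - MIN_EXPANSION_FPS
    status = "OK" if margin >= 0 else "too slow"
    print(f"{label}: {v} fps at 1,000 yds ({status}, {margin:+} fps vs threshold)")
```

Both loads clear the threshold with 200-300 fps to spare, which is the point: either one opens reliably at 1,000 yds.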