Thank you for this link. I've been keeping up with this thread and find it interesting to see all the theories and test data many of you have witnessed. Playing with the numbers in the software calculation, with moly on or off, leads me to a theory I have no test data to support: reduced barrel friction from the bullet will increase barrel life. The only way to reach the same peak chamber pressure with a lower-friction bullet of the same weight is for it to accelerate harder. Moly and HBN do this, as do certain designs that reduce friction by nature; our bullets do this, as does the OP's. I'm curious how that plays out overall, since barrel erosion isn't a function of just one variable but an interaction between a series of variables, and probably not a linear one at that, which makes it very difficult to wrap a design of experiments around any test analysis and actually understand all the puts and takes and how they interact.
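To make the acceleration claim concrete, here's a minimal sketch using a simple lumped force balance (my assumption, not from the thread): net force on the bullet = chamber pressure times bore area minus bore friction. All numbers are illustrative, not measured data.

```python
# Hypothetical lumped model: a = (P*A - F_friction) / m
# Values below are rough illustrations (~.308" bore, 175 gr bullet),
# not load data.

def bullet_acceleration(pressure_pa, bore_area_m2, friction_n, mass_kg):
    """Newton's second law applied to the bullet in the bore."""
    return (pressure_pa * bore_area_m2 - friction_n) / mass_kg

bore_area = 3.14159 * (0.00782 / 2) ** 2   # bore cross-section, m^2
mass = 0.01134                             # 175 gr bullet in kg
peak_pressure = 430e6                      # roughly 62,000 psi in Pa

a_bare = bullet_acceleration(peak_pressure, bore_area, friction_n=900.0, mass_kg=mass)
a_coated = bullet_acceleration(peak_pressure, bore_area, friction_n=600.0, mass_kg=mass)

# At the same peak pressure, the lower-friction bullet accelerates harder.
print(a_coated > a_bare)
```

The point is only the direction of the effect: holding peak pressure and bullet weight fixed, any reduction in the friction term shows up directly as added acceleration.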
If I give it my best go, I think what matters is the sum of kinetic energy dumped into the barrel, spread over the barrel's surface area, wherever the fire-cracking phenomenon is an issue. A larger-diameter caliber obviously has more surface area, which distributes that load over a much larger region. At some point, whatever that is, the fire cracking falls below a critical threshold, which is why we don't see it farther down the bore; that threshold is probably a function of a combined chemical and mechanical reaction. Back to the kinetic energy above: a lower-friction bullet has more of that energy physically imparted into it as acceleration rather than into the barrel as friction, hence the theorized life improvement from less energy put into the barrel, especially during the first part of the engraving process.
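The energy-partition idea above can be sketched the same way. This assumes (my simplification, not from the thread) a constant average driving pressure and a constant average friction force over the bullet's travel, with friction work treated as heat left in the bore; the numbers are made up for illustration.

```python
import math

def energy_split(avg_pressure_pa, bore_area_m2, friction_n, travel_m):
    """Split the gas work into bullet KE and friction heat left in the barrel."""
    gas_work = avg_pressure_pa * bore_area_m2 * travel_m
    barrel_heat = friction_n * travel_m          # friction work heats the bore
    bullet_ke = gas_work - barrel_heat
    return bullet_ke, barrel_heat

def heat_per_bore_area(barrel_heat_j, bore_diameter_m, travel_m):
    """Friction heat divided by the bore surface it is spread over."""
    return barrel_heat_j / (math.pi * bore_diameter_m * travel_m)

travel = 0.6  # m of bullet travel, illustrative
ke_bare, heat_bare = energy_split(150e6, 4.8e-5, 900.0, travel)
ke_coated, heat_coated = energy_split(150e6, 4.8e-5, 600.0, travel)

print(ke_coated > ke_bare)       # lower friction: more energy in the bullet...
print(heat_coated < heat_bare)   # ...and less left in the barrel

# The same friction heat spread over a larger bore gives a lower load per
# unit surface area, consistent with bigger calibers distributing the load.
flux_small = heat_per_bore_area(heat_bare, 0.00782, travel)  # ~.30 cal
flux_large = heat_per_bore_area(heat_bare, 0.00940, travel)  # ~.375 cal
print(flux_large < flux_small)
```

Again, only the directions matter here: less friction shifts energy from the barrel into the bullet, and a larger bore spreads whatever heat remains over more steel.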