cleardatum said:
... that reducing bullet "jump" (the distance from the bullet to the lands) usually improves accuracy - I've corroborated this phenomenon firsthand. The question is, why? I have my theories. What are yours? Or are there hard, proven answers?
Hi,
One thing you're going to learn about "common knowledge" in this hobby is there is a LOT of supposition and very little actual hard facts/data/science behind much of it.
First off, one must test, test, and then test, before making definitive statements. The guys in the white lab coats with the millions of dollars worth of equipment behind them do just that, 40 hours a week, for entire working careers, and are generally less likely to make a definitive statement about a lot of this stuff than the guy on that barstool down at the end who "knows" the answer, even if his only testing was to check if that last beer was cold when the barkeep handed it to him!
The most common "explanation" for this phenomenon I've heard is the bullet enters the barrel straighter. (Jimbo's heard the same story!) Ok, maybe it does. But even with my impaired olfactory system, it's hard to get that to pass the sniff test:
First off, how much "wiggle room" IS there in the chamber, forcing cone/leade and initial rifling of the gun? While the bullet's nose may go unsupported for the first few thousandths of an inch of travel, until the ogive engages the rifling, its body and tail ARE being held "in line" by the mouth of the case the whole time. Frankly, there's just not that much wiggle room, best I can tell!
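To put a rough number on that "wiggle room" point, here's a quick geometry sketch. The dimensions are purely assumptions for illustration (the post gives none): suppose the case mouth and throat together allow about 0.002" of total lateral clearance over roughly 0.3" of supported bullet length.

```python
import math

# Rough geometry check on the "wiggle room" point. All dimensions are
# assumptions chosen for illustration, not measurements from the post.

def max_tilt_degrees(clearance_in: float, support_length_in: float) -> float:
    """Largest in-bore tilt angle the clearance geometrically allows:
    the bullet can cock over by at most atan(clearance / support length)."""
    return math.degrees(math.atan(clearance_in / support_length_in))

# ~0.002" total clearance over ~0.3" of supported length:
print(f"{max_tilt_degrees(0.002, 0.3):.2f} degrees")  # ~0.38 degrees
```

Under those assumed numbers the bullet can't tilt even half a degree, which is the gist of the "not much wiggle room" argument, even if the real clearances in any given chamber will differ.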
From there, the bullet is accelerated to a rotational velocity somewhere over 150,000 rpm, mach schnell! It exits the barrel a traveling gyroscope, and spinning at that speed, it's pretty darn stable. Doesn't really make much difference where it was when it got started. Yet the phenomenon is still observed! Why?
My own theory, which can't be proven with the test equipment available to me (which is good only to determine if that last beer was cold!) has to do with the harmonics of the barrel/bullet combo. If you've ever seen high speed photography of a barrel being fired, and the lens is powerful enough to see it happen, there's a "wave" that travels down the barrel with the bullet. There's pressure in front of the bullet as it compresses the previously still air in the barrel to get it out of the way. There's pressure behind the bullet from the burning propellant. And all the various factors involved combine to make the barrel move kinda like watching a garden hose jumping about as you put water pressure to it.
So the "secret" to accuracy is to get the bullet to exit the barrel at exactly the right moment (or millisecond?) the barrel is pointed exactly where it needs to be to place the bullet on target as desired. Not the biggest problem on paper; that task is a touch more of a challenge in the field. This is one of the reasons for heavy bull barrel designs--to minimize that "whip" effect. It's also one of the reasons for barrels being bedded in some cases, free floated in others. Various barrel tapers are also used for the same reason. It goes on and on, but the fewer variables one has, the easier it is to control the rest. So... seating the bullet out close to the rifling is simply a way of minimizing one variable. And combined with the proper charge of the right powder for his chosen bullet, it MAY enable the shooter to sorta control that barrel whip in his favor. If he hits on the "magic" combo, he sees positive results! Goal achieved, even if he might be off a bit in attributing the observed result to the proper cause.
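The timing idea above can be sketched with a toy model. Everything here is an assumption picked only for illustration (a 2 kHz vibration mode, 1 MOA peak deflection, 20 microseconds of shot-to-shot timing jitter; none of these come from the post): treat the muzzle's vertical pointing error as a simple sine wave, then compare the shot spread when the bullet exits near a crest of that wave (angle barely changing) versus near a zero crossing (angle changing fastest).

```python
import math

# Toy model of the barrel-whip timing idea. All numbers are assumptions
# chosen for illustration: a single sinusoidal vibration mode, not a
# real barrel's full mode structure.

FREQ_HZ = 2000.0   # assumed barrel vibration frequency
AMP_MOA = 1.0      # assumed peak muzzle deflection, minutes of angle

def muzzle_angle_moa(t_s: float) -> float:
    """Muzzle pointing error (MOA) at time t seconds after ignition."""
    return AMP_MOA * math.sin(2 * math.pi * FREQ_HZ * t_s)

def spread_moa(exit_time_s: float, jitter_s: float) -> float:
    """Worst-case angular spread for shots whose exit time varies
    by +/- jitter_s around exit_time_s."""
    angles = [muzzle_angle_moa(exit_time_s + dt)
              for dt in (-jitter_s, 0.0, jitter_s)]
    return max(angles) - min(angles)

# Same +/-20 microsecond timing jitter, two different exit times:
crest = 1.0 / (4 * FREQ_HZ)  # exit at a crest of the wave
crossing = 0.0               # exit at a zero crossing

print(f"exit near crest:         {spread_moa(crest, 20e-6):.3f} MOA spread")
print(f"exit near zero crossing: {spread_moa(crossing, 20e-6):.3f} MOA spread")
```

With these made-up numbers the crest exit gives roughly a tenth the spread of the zero-crossing exit for the same timing jitter, which is the mechanism the "magic combo" idea relies on: tune the load so exit time lands where the muzzle angle is least sensitive to timing variation.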
That's my story, and I'm stickin' to it until someone can show me some hard data to the contrary! I'll have another beer now, thank you.
Rick C