wildernessmaster
Lil-Rokslider
I have been reading up on arrow straightness tolerances and whether there are really practical differences between match (typically .001"), standard (typically .003"), and economy (typically .006") arrows.
Tons of sites and forum posts go back and forth on this issue, with opinions ranging from "definitely makes a huge difference beyond 40 yards" to "makes no difference - the average shooter cannot shoot to .001".
There are also a lot of posts claiming that if you cut both ends off your arrow, you will get (close to) a .001. I actually found a site with a great article where they tested everything from bare new arrows to cut and fully built arrows, in both .001's and .006's, and wow, the actual differences were stunningly small. In the original pre-built batch the tester had (12 of each), the .006's were actually tighter to spec than the .001's.
All that said... When I think about it from an engineering perspective, it seems to boil down pretty simply... The difference between a .001 and a .006 is math. The .006 is going to potentially (note I say potentially, because it is the worst-case tolerance difference) inject 6 times more variance than the .001.
What does that mean pragmatically? Well, if a .001 arrow injects 1/2 inch of error at 40 yards, then the .006 will inject 3 inches of error - WORST CASE. Again, worst case. Since most archers would take a 3 inch group at 40 yards, I think that makes it a moot difference.
Before someone does some wonky math and says "no, wait, that would be an 18 inch group if your .001 group is 3 inches"... Wrong. For it to mutate into an 18 inch group, every arrow would have to be off from every other arrow (in the exact same way) by the full tolerance. Given that we build and tune our own arrows, that is very highly unlikely. More so, remember that the tolerances are worst case - and if a manufacturer is making every arrow at worst case, they won't be in business long.
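If anyone wants to sanity-check that "it doesn't all stack up" argument, here's a toy Monte Carlo sketch (my assumptions, not anyone's published test: each arrow lands some random distance up to the worst-case error from point of aim, in a random direction). The point it illustrates is just that random per-arrow errors don't produce a group anywhere near "everybody maxed out in opposite directions":

```python
import math
import random

def simulated_group_spread(worst_case_in, n_arrows=12, trials=1000):
    """Toy Monte Carlo: each arrow lands a random distance (0 to worst
    case, in inches) from the point of aim, in a random direction on a
    2-D target face. Returns the average extreme spread (widest
    arrow-to-arrow distance) of the group across many trials."""
    total = 0.0
    for _ in range(trials):
        pts = []
        for _ in range(n_arrows):
            r = random.uniform(0.0, worst_case_in)       # error magnitude
            a = random.uniform(0.0, 2.0 * math.pi)       # error direction
            pts.append((r * math.cos(a), r * math.sin(a)))
        # extreme spread = max pairwise distance in the group
        spread = max(math.dist(p, q) for p in pts for q in pts)
        total += spread
    return total / trials

# Using the post's worst case for a .006" arrow at 40 yards (3" of error):
# the spread is hard-capped at 6" (two arrows maxed out in opposite
# directions), and the simulated average comes in below that ceiling.
print(round(simulated_group_spread(3.0), 2))
```

Obviously the real error distribution isn't uniform (most shafts measure well inside spec, per the article above), so this if anything overstates the spread.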
Taking this out further... Even at 80 yards the .001 would be off by an inch (extrapolating the angle) and the .006 would be off by 6 inches. On most animals this is tolerable. Again, I think most archers (particularly bowhunters) would be happy shooting 6 inch groups at 80 yards.
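For anyone who wants to play with the numbers, the back-of-napkin scaling above can be sketched in a few lines (my assumption, same as the post: worst-case error grows linearly with both the tolerance ratio and the distance, and the 1/2" at 40 yards starting point is just the hypothetical figure used above):

```python
def worst_case_error(base_error_in, base_tol, tol, base_dist_yd, dist_yd):
    """Scale a known worst-case error (inches) linearly by the ratio of
    straightness tolerances and the ratio of distances."""
    return base_error_in * (tol / base_tol) * (dist_yd / base_dist_yd)

# Starting assumption: a .001" arrow injects 1/2" of error at 40 yards.
for tol in (0.001, 0.003, 0.006):
    for dist in (40, 80):
        err = worst_case_error(0.5, 0.001, tol, 40, dist)
        print(f'{tol:.3f}" arrow at {dist} yd: up to {err:.1f}" off')
```

Which reproduces the numbers above: 3" for the .006 at 40 yards, 6" at 80, and 1" for the .001 at 80.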
Now, once again, before anyone jumps the string... I know that in a bow system the arrow tolerance is not the only error injected, and that some of the downrange accuracy issues are compounded from multiple sources (arrow, bow, string, cam...).
Given the single factor of arrow tolerance, is a .003 or .001 arrow really worth that much more? Typically you see a 25-40% price difference... Or am I missing something beyond the basic math?