"No, they are not. How are they measuring zero retention in this?"

Welllll .... they had zero retention from me after about 40 seconds ...
"This is hilarious! Freaking cardboard boxes have a standardized test for drop durability, but scopes don't! And worse yet, the riflescope industry runs and hides from such testing."

Now here's a thought: wouldn't it be ironic if Leupold's shipping cartons had been drop-tested, but their scopes hadn't?
"Does anyone have a link to what Koshkin said about these drop tests, etc.? I think he's an optical engineer if I'm not mistaken. I find them interesting personally, but not something I'd 'test' myself."

Not sure there's much to be gained going down that path, to be honest ... I've read and listened to some of his critiques of the process (some of it is only a sentence or two, buried in 1-2 hour rambling videos, so it would take you dozens, if not hundreds, of hours to track it down), and he made a bunch of unfounded and unhelpful claims ... it didn't really advance anything.
You mean optic engineers who shoot fewer rounds in a year than some will shoot this week? Or one who makes statements about how great scopes are with a couple hundred rounds from a bench?
I will bet you $5,000 that we can go to any outdoor store and you can buy a NF NXS, I’ll mount it on my rifle, zero it, you can drop it per the stated eval, I’ll shoot it, and it will hold zero.
"Manufacturers use collimation tables. Scientific equipment that completely eliminates shooter error."

Except they don't replicate actually dropping the rifle or induce the same zero shifts that would be possible from even small drops, which are bound to happen from time to time if you carry a rifle long enough. I see you're always on here pushing back against the drop tests and their results, despite Form offering you the chance to come out and test it, with the opportunity to get a brand new optic out of the deal. Why wouldn't you take him up on that? Well, it's because deep down you know he's probably right.
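For anyone unfamiliar with what a collimator-style check actually measures: it reads the angular displacement of the reticle rather than holes on paper. Here is a minimal sketch of converting that angular shift into a point-of-impact shift, assuming the usual 1 MOA ≈ 1.047 inches at 100 yards; the example numbers are arbitrary, not anyone's test data.

```python
# Illustrative only: converting the angular reticle shift a collimator reads
# into a point-of-impact shift at a given range. Example numbers are arbitrary.

def poi_shift_inches(shift_moa: float, range_yards: float) -> float:
    """Point-of-impact shift in inches for an angular shift given in MOA."""
    # 1 MOA subtends roughly 1.047 inches at 100 yards.
    return shift_moa * 1.047 * (range_yards / 100.0)

for moa in (0.25, 0.5, 1.0, 2.0):
    print(f'{moa:.2f} MOA -> {poi_shift_inches(moa, 100):.2f} in at 100 yd, '
          f'{poi_shift_inches(moa, 300):.2f} in at 300 yd')
```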
Don't let the guys get to you. I really liked your test and appreciate what you put into it. It's a real-world application with results that show trends worth considering when purchasing a rifle scope (my opinion).
I would modify that to state that optics companies do not have a requirement for side impacts, and thus it is never tested in DT or OT. I actually think many optics companies do not do any sort of OT.

In the military we do two basic types of testing:
(1) developmental testing where engineers see if the weapon system meets all the specs
(2) operational testing where the actual user in an operational environment sees if it’s suitable (doesn’t break where he uses it), effective (gets the job done), sustainable (can be kept working). This also includes live fire test and evaluation to confirm lethality (disables/disrupts enemy capability) and survivability (doesn’t get disabled/disrupted)
It’s totally conceivable that we develop something that meets all specs but the operator rejects as useless.
Perhaps the optics companies are doing DT&E but not OT&E
The Leupold statement tells us that they are testing zero retention: 5,000 impacts at a g-force of three times that of a .308 Winchester recoil, with full quality control and quality assurance before and after.
Leupold is telling us that they are using industry-standard quality processes to provide a high-quality product.
This is simply how things are done in very professional engineering organizations.
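To make the "QC before and after" idea concrete, here is a minimal sketch (not Leupold's actual procedure) of a before/after zero-retention gate: record the angular zero offset, run the impact cycle, re-measure, and fail anything that shifts more than a tolerance. The 0.25 MOA tolerance is a placeholder assumption, not a published spec.

```python
# A minimal sketch (not Leupold's actual procedure) of a before/after
# zero-retention gate: record the angular zero offset, run the impact
# cycle, re-measure, and flag any shift beyond a tolerance.

from dataclasses import dataclass

@dataclass
class ZeroCheck:
    pre_moa: float   # offset measured before the impact cycle (e.g., on a collimator)
    post_moa: float  # offset measured after the impact cycle

    def shift(self) -> float:
        return abs(self.post_moa - self.pre_moa)

    def passes(self, tolerance_moa: float = 0.25) -> bool:
        # The 0.25 MOA tolerance is a placeholder, not a published spec.
        return self.shift() <= tolerance_moa

# A scope that walked 0.4 MOA over the cycle fails a 0.25 MOA gate.
print(ZeroCheck(pre_moa=0.0, post_moa=0.4).passes())  # False
```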
"On your flawed test that a SWFA fixed power somehow passed? I knocked one of those things over one time and the elevation knob took the brunt of it and bent the turret enough that it was binding. It wasn't anywhere near the height of your drop test."

You must be gifted at breaking scopes. I dropped mine onto a very thin Midway mat on top of rocks from waist height about 6 times, and the most physical damage it had was a few nicks from breaking through the mat.
Without some big scientific test to correlate the drop test to real-world use and real-world failures, it's all conjecture to say where the line SHOULD be between pass and fail, and what level of abuse is required during a test to identify that.
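For what it's worth, the correlation study being described wouldn't have to be exotic; something as simple as tallying pass/fail on a zero check at each drop height would show where failures start. A sketch under that assumption, with made-up placeholder observations rather than real results:

```python
# Sketch of the correlation study the post asks for: tally pass/fail on a
# zero check at each drop height. The observations are made-up placeholders.

from collections import defaultdict

observations = [  # (drop height in inches, held zero?)
    (18, True), (18, True), (18, False),
    (36, True), (36, False), (36, False),
]

totals = defaultdict(lambda: [0, 0])  # height -> [failures, trials]
for height, held in observations:
    totals[height][0] += 0 if held else 1
    totals[height][1] += 1

for height in sorted(totals):
    fails, trials = totals[height]
    print(f'{height} in drop: {fails}/{trials} failed ({fails / trials:.0%})')
```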
"I think the biggest reason Leupold continues to sell buttloads of scopes is because probably 95% of game animals are killed under 250 yards and dialing isn't necessary."

Also within a few days of the annual trip to the range to sight in said scope. A few inches up and over is SOP.
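Back-of-the-envelope on why that matters even inside 250 yards: a zero shift seen at the 100-yard bench scales roughly linearly with range. The 3-inch figure below is just an example.

```python
# Back-of-the-envelope: a zero shift measured at 100 yards scales roughly
# linearly with range. The 3-inch shift is an arbitrary example.

def shift_at_range(shift_at_100_in: float, range_yd: float) -> float:
    return shift_at_100_in * range_yd / 100.0

print(shift_at_range(3.0, 250))  # 7.5 -> a 3 in shift at 100 yd is ~7.5 in at 250 yd
```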
I think you should quit your day job and design riflescopes!
I'm an engineer for a major car manufacturing company. I manage part quality concerns and warranty issues. We have drop test requirements on some parts, and I have found cases where the "engineering specification" for the drop test can be met, but when technicians at the plant accidentally drop the same part on the floor before installing it into the vehicle, it breaks. One case I can think of: we require the part to still work after being dropped from 3 feet (I'll give you the English conversion, because everything in automotive is metric). The way the part is oriented before it's dropped is not specified. The supplier of the part has data showing the part passes. I did the test and it failed. When I approached our engineer and the supplier about this, they told me I wasn't orienting the part correctly. Where was the part orientation specified? Nowhere. They had just figured out how to make the part pass our specification by holding it a certain way.
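A sketch of how a spec can close that loophole: require the drop in every orientation rather than leaving it to the supplier. The run_matrix/drop_test names and the six-face list are illustrative, not part of any real standard.

```python
# Sketch of closing the orientation loophole: run the spec-height drop in
# every orientation instead of the one the supplier picks. run_matrix and
# drop_test are illustrative names, not part of any real standard.

from itertools import product
from typing import Callable, Dict, Tuple

FACES = ["top", "bottom", "left", "right", "front", "back"]
HEIGHTS_FT = [3.0]  # the spec height from the post

def run_matrix(drop_test: Callable[[str, float], bool]) -> Tuple[bool, Dict[Tuple[str, float], bool]]:
    """drop_test is the physical check: does the part still work after this drop?"""
    results = {(f, h): drop_test(f, h) for f, h in product(FACES, HEIGHTS_FT)}
    # Require a pass in every orientation, not just a favorable one.
    return all(results.values()), results

# Dummy stand-in: pretend the part only survives drops onto its top face.
ok, detail = run_matrix(lambda face, height: face == "top")
print(ok)  # False -> an under-specified orientation would have hidden this
```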
If folks don't want to use your test method when evaluating which scope they buy, that's ok. If they want to tell you your method was stupid, that's ok too. I looked at your report yesterday for the first time, and I've been rethinking my next purchase because of it.
@Reburn I don't disagree with that ^^. My main point was to address many of the people who are dismissing the testing being done because they have not seen personal failures with scopes that have failed. They are saying that the test is irrelevant because they don't personally drop their stuff. I am saying that if you want to figure out what the most durable products are, you have to make the test harsh enough to see which ones rise to the top. And that leaves everyone in the middle wondering how to apply it to their scope purchase.

It would be perfectly reasonable, if you had a real scientific version of this test, to use an "A" rating and a "B" rating, where the B rating applies to those scopes that are above average (maybe that means they pass the 18 inch drops but not the 36 inch drops), and the A rating applies to those very few products that pass the most rigorous part of the test. As an industry-wide test, something like this might make sense and allow better transparency for most people, while for a personally developed test by someone who clearly has very high standards for durability, it makes much more sense to focus only on the products that will meet their particular needs.

Ultimately what I'm saying is that there are valid criticisms of this testing, but at the same time those criticisms don't invalidate what it DOES tell us. People really have no other choice; they just have to use the testing in context for what it is and understand what it can and cannot tell them, or come up with a better test.
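A toy version of that A/B scheme, using the 18-inch and 36-inch drops mentioned above; the cutoffs, function name, and "no rating" label are just illustrative.

```python
# Toy version of the A/B rating idea: grade a scope by the tallest drop it
# survived while holding zero. The 18 in / 36 in cutoffs come from the post;
# the function name and "no rating" label are just illustrative.

from typing import Optional

def rate_scope(max_passed_drop_in: Optional[float]) -> str:
    """max_passed_drop_in: tallest drop (inches) after which the scope still
    held zero, or None if it failed even the lowest drop."""
    if max_passed_drop_in is None or max_passed_drop_in < 18:
        return "no rating"
    return "A" if max_passed_drop_in >= 36 else "B"

print(rate_scope(36))  # A
print(rate_scope(18))  # B
print(rate_scope(12))  # no rating
```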
"Manufacturers use collimation tables. Scientific equipment that completely eliminates shooter error."

If your statement were true (manufacturers testing for scope function), this thread wouldn't exist.
"No, they are not. How are they measuring zero retention in this?"

If nothing else, it's good to know they are gay-friendly.