Backcountry Rifle Scope

Formidilosus

Super Moderator
Shoot2HuntU
Joined
Oct 22, 2014
Messages
9,981
I don’t know how to make it much simpler.

You developed your standard.
You developed your procedure to test.
You executed your procedure.
You (and others) can’t recognize the difference between a pass or a fail based on your own results.

I’ve performed material testing professionally in the private sector. Wrote testing and performance specs. Worked with various govt agencies/departments. It’s never this hard. Ever.

I can take a quick look at any of the results of the legitimate standardized (and repeatable) tests that I’ve been involved in and go into more detail than anyone cares about. Having 6 results lined up next to each other would be extremely helpful and easy to explain the nuances, differences, similarities to the layperson.


Neat. Which targets correlate to your shot overlay picture?

Here’s your picture-
[attached image: shot group overlay]


In your picture,

Group

#1- does not match any of those targets I posted.

#2- does not match any of the targets I posted.

#3- does not match any of the targets I posted.

#4- does not match any of the targets I posted.

#5- does not match any of the targets I posted.

#6- Seems to match the Leupold 3.6-18x. If that is it, it does not show the second target from that scope, nor any of the rezero targets.
 

pwrdbycotn

FNG
Classified Approved
Joined
Oct 4, 2020
Messages
64
Sorry, I didn't intend to imply that you are beholden to anyone, for anything. You said you would never do it, and my questions were honest questions about why you would have that view. I even crafted my question with one reason that makes complete sense to me - that you don't know the guy. If you are reading more into my post than that, I apologize - but maybe read it again and my prior posts in which I'm trying to bend over backward in this discussion/debate to get to a civil discourse on varying viewpoints.



I did.

No one has to answer anyone's questions. Just asking a question in my opinion doesn't mean anything more than that. But I do get information from all threads in which people ask each other questions - whether they are answered or not.
My only response would be that as hunters we should have confidence in our equipment to make clean, efficient kills. If some random guy’s test makes you question your gear, sell it and move on to something you can put your trust in. Reviews and “tests” are a dime a dozen, so I don’t put a lot of stock in them, especially when there’s just one sample.
I had a pickup one time that was a lemon and the dealership had to buy it back. Since then I’ve probably had 15-20 of the same brand of truck and they’ve been relatively flawless.
 
Joined
Jan 26, 2017
Messages
3,113
Location
PA
@TK-421 ah, you did comment; I didn't see it because you're on and off my block list, and apparently back then you were on. I'll put ya back there soon so we can let the neighborhood rest a bit.

Yup, 8-10 MOA shooting off a toolbox. Same before, during, and after the drops. The gun does quite a bit better with a real rest setup, but my main goal that day was breaking in a Glock; the .22 was just because I had time left over.

Here, as in that post, I'm advocating for people to test their crap, because a lot of it is just that: crap. The inability to trust manufacturers to design, produce, or cull nonperforming scopes before consumers get them is the actual problem.
 

fwafwow

WKR
Joined
Apr 8, 2018
Messages
5,522
My only response would be that as hunters we should have confidence in our equipment to make clean, efficient kills.

Agree completely.
If some random guy’s test makes you question your gear, sell it and move on to something you can put your trust in.
I’ve been on here long enough that I don’t think he’s random, but putting that aside, if there’s any reason I question my equipment and I move on, what’s the best way to choose something to trust?

Reviews and “tests” are a dime a dozen, so I don’t put a lot of stock in them, especially when there’s just one sample.
Agree about reviews, but short of Consumer Reports doing a test, I’m not sure of the alternatives. And tests done by the manufacturer are no more dependable or trustworthy to me.
I had a pickup one time that was a lemon and the dealership had to buy it back. Since then I’ve probably had 15-20 of the same brand of truck and they’ve been relatively flawless.
 

pwrdbycotn

FNG
Classified Approved
Joined
Oct 4, 2020
Messages
64
Agree completely.

I’ve been on here long enough that I don’t think he’s random, but putting that aside, if there’s any reason I question my equipment and I move on, what’s the best way to choose something to trust?


Agree about reviews, but short of Consumer Reports doing a test, I’m not sure of the alternatives. And tests done by the manufacturer are no more dependable or trustworthy to me.
How you choose a product is up to you. You don’t want me spending your money for you.
If you trust the test done by this gentleman, more power to you. And again, I’m sure he’s a great guy, and I’ve only looked at one of his tests just as a result of reading this thread, but I haven’t seen anything that would make me trust his opinion over my own.
Best of luck in your endeavors. Have a good one.
 

Formidilosus

Super Moderator
Shoot2HuntU
Joined
Oct 22, 2014
Messages
9,981
Every single group is taken from your test results.


No. I posted every single group from the stickied scope drop evals in this thread. Only one matches up- you are avoiding it. The pictures are in this thread- point out which scope shot groups 1-5 on the drop portion. Also, point out on which drop portion a scope “passed” but shot a larger group than the stated baseline.


You’re right that I didn’t include additional targets from a couple of those above. Some were said to function correctly after a single test. Some performed similarly, but were put through additional tests until a failure was claimed. Some performed worse, but were put through additional testing until a pass was claimed. All shot outside of the stated baseline cone regardless of the stated pass/fail. Some maintained a zero (for the larger drop group) and were said to have failed. It’s all over the place.


You mean like the Tangent Theta that was shot for over 900 rounds, giving it every chance to magically start working? Or the Leupold Mark 5 3.6-18x that was repeatedly shot, remounted, rezeroed, and shot again, giving it every chance to magically start working? Or the other Mark 5 that was the same, and is still being shot, remounted, and shot again to see if it will magically start working? Or the Trijicon Credo that failed and wasn’t shot again? Or the Vortex LHT that was shot for 400+ rounds, remounted multiple times, sent back to Vortex, and then, with two Roksliders present and on video, failed epically, repeatedly? Or the Maven, which again failed epically, consistently, and was remounted and shot multiple times? Or maybe the Zeiss LRP that, on over 1” of padded mat, sheared a turret clean off? (Of course that was after totally failing to stay zeroed.)

Which one exactly doesn’t follow what I wrote in the Standards and Evaluation? It’s very easy- just say “x scope didn’t follow the protocol and you didn’t explain why”.
 

Formidilosus

Super Moderator
Shoot2HuntU
Joined
Oct 22, 2014
Messages
9,981
That’s cute. You’ve been dodging and avoiding my initial question. I’m surprised that nobody (especially you) is able to say ‘these are good, those are bad, I’d reshoot these. And here’s why’.

It’s all your data, man. Some of those groups were sufficient to make an immediate claim of pass/fail. Some prompted you to test further. Can you explain why? If you’re being objective, you won’t need to know which logo is which; you should be able to just say what it is, and the standard should be clear enough for everyone to recognize it.


Here, I’ll make this easy- you can’t say which shot group in your picture corresponds to which drop group, because they don’t correspond. Those groups aren’t from the drop tests. I’ve looked at every one of them- they are in this thread.
 

waldo9190

WKR
Joined
Jul 10, 2018
Messages
311
Location
Minnesota
Dude, those groups are your data. I brought in 6 of your photos to create that exhibit. This isn’t a gotcha attempt. In fact, it should be an opportunity to show how objective and consistent these tests can be. No logos are shown, just the results from the test that you devised and performed. I’m shocked at how hard this is.

Some were claimed to pass/fail solely from those posted groups. Some were tested further. Can you explain why some passed or failed, and others were shot more before deciding?
No. I posted every single group from the stickied scope drop evals in this thread. Only one matches up- you are avoiding it. The pictures are in this thread- point out which scope shot groups 1-5 on the drop portion. Also, point out on which drop portion a scope “passed” but shot a larger group than the stated baseline.





You mean like the Tangent Theta that was shot for over 900 rounds, giving it every chance to magically start working? Or the Leupold Mark 5 3.6-18x that was repeatedly shot, remounted, rezeroed, and shot again, giving it every chance to magically start working? Or the other Mark 5 that was the same, and is still being shot, remounted, and shot again to see if it will magically start working? Or the Trijicon Credo that failed and wasn’t shot again? Or the Vortex LHT that was shot for 400+ rounds, remounted multiple times, sent back to Vortex, and then, with two Roksliders present and on video, failed epically, repeatedly? Or the Maven, which again failed epically, consistently, and was remounted and shot multiple times? Or maybe the Zeiss LRP that, on over 1” of padded mat, sheared a turret clean off? (Of course that was after totally failing to stay zeroed.)

Which one exactly doesn’t follow what I wrote in the Standards and Evaluation? It’s very easy- just say “x scope didn’t follow the protocol and you didn’t explain why”.
Feeling spicy this morning.
 
Joined
Sep 8, 2014
Messages
1,807
Location
Front Range, Colorado
If you don’t have adequate samples (again, we’re just assuming the procedure is legit), then you’re unable to ascribe the meaning that some very desperately want to see.

It does not make sense that one outcome has more significance than another. You don’t get to pick significance of a result. That flies completely in the face of what’s trying to be accomplished, and is evidenced by the “what are the odds” responses. Exactly. What are the odds? That’s exactly why you need a statistically significant sample size based on each production run to understand what your “failure rate” will be. Like I said, this has a tendency to become more about personalities than anything.
I disagree about the significance of one outcome vs. the other (fail vs. pass). Coming from a background in manufacturing engineering, we approach new processes and products from the same standpoint Form does when it comes to fail vs. pass. A failure results in a complete stop to the process, and nothing proceeds until a root cause analysis (RCA) is complete and the problem is resolved. There is no continuing with a process that has proven itself to be unreliable. A pass doesn't guarantee perfection, but it allows us to move forward with the process. One outcome is far more significant than the other; failure means full stop, pass means proceed with further observation. Because of the need for an extremely high yield in production (99.7% or higher as an ideal), failures are mathematically far more significant than a pass.

The application of that mindset works ideally for aiming devices. Anything that produces a failure on its first test cannot be used in the real world until it is evaluated, fixed, and tested again. An aiming device that passes may not be guaranteed to be perfect in every instance, but it can be used in the field. Unfortunately, it appears that most manufacturers aren't interested in fixing their products' inherent flaws.

While it would be nice to have a greater sample size for each optic, a 1:1 failure ratio is significant enough to consider any optic that fails as being too unreliable to trust in the field (another 1:1 opportunity).
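For what it's worth, the single-sample math above can be put in rough numbers. This is just a generic binomial sketch for illustration (the `passes_needed` function name and the example rates are mine, not part of anyone's test protocol): one failure in one trial can't pin down a failure rate, but a long run of all-pass results can bound one.

```python
import math

def passes_needed(p_fail, confidence=0.95):
    """Consecutive passes needed before you can claim the true failure
    rate is below p_fail at the given confidence, assuming independent
    trials: find the smallest n with (1 - p_fail)**n < (1 - confidence)."""
    return math.ceil(math.log(1 - confidence) / math.log(1 - p_fail))

# Chance that the single unit you test fails, for a few assumed rates:
# it is simply the rate itself, so one sample barely discriminates.
for p in (0.01, 0.05, 0.20):
    print(f"true rate {p:.0%}: chance your one sample fails = {p:.0%}")

print(passes_needed(0.05))  # scopes that must all pass to bound the rate at 5%
```

The flip side is the point made above: a single *failure* is already strong evidence against a very low failure rate, since a 1-in-1000 lemon almost never shows up in a one-unit test.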
 

Deywalker

FNG
Joined
Sep 18, 2021
Messages
84
Here, I’ll make this easy- you can’t say which shot group in your picture corresponds to which drop group, because they don’t correspond. Those groups aren’t from the drop tests. I’ve looked at every one of them- they are in this thread.
In fairness to TK, I believe group 2 is your Tenmile 3-18 group rotated 90°.
 

thinhorn_AK

"DADDY"
Joined
Jul 2, 2016
Messages
11,215
Location
Alaska
That’s cute. You’ve been dodging and avoiding my initial question. I’m surprised that nobody (especially you) is able to say ‘these are good, those are bad, I’d reshoot these. And here’s why’.

It’s all your data, man. Some of those groups were sufficient to make an immediate claim of pass/fail. Some prompted you to test further. Can you explain why? If you’re being objective, you won’t need to know which logo is which; you should be able to just say what it is, and the standard should be clear enough for everyone to recognize it.
You're getting really emotional about this. That's fine; I would be too if I'd spent $5k on a scope that wouldn't pass tests that other scopes pass, but bringing emotion into discussions of facts just doesn't work.
 

thinhorn_AK

"DADDY"
Joined
Jul 2, 2016
Messages
11,215
Location
Alaska
LOL, don't project your feelings onto me friend. None of the scopes "tested" were mine. I didn't establish the standard, the test, I didn't draw any conclusions, or partake in any way. I have zero tie (emotional, financial, or otherwise) to the outcomes.
But you are getting defensive about the possibility of your scope not passing those same tests. I'm not projecting at all. I don't have a Tangent Theta scope.
 

atmat

WKR
Joined
Jun 10, 2022
Messages
3,184
Location
Colorado
Anyone know of a bombproof scope in 1” tube that weighs <20oz? I really wish NF made one.

I prefer holdover and don’t need the turrets or lots of travel. But I want bombproof without adding a ton of weight to my 5.25-lb rifle.
 

cmahoney

WKR
Joined
Jun 18, 2018
Messages
2,448
Location
Minden Nevada
Anyone know of a bombproof scope in 1” tube that weighs <20oz?
I prefer holdover and don’t need the turrets or lots of travel. But I want bombproof without adding a ton of weight to my 5.25-lb rifle.

The NXS 2.5-10 weighs 20.5 oz. You can’t beat it.

[attached image]



 

atmat

WKR
Joined
Jun 10, 2022
Messages
3,184
Location
Colorado
The NXS 2.5-10 weighs 20.5 oz. You can’t beat it.
Yeah, I know about the NXS and I’d consider it (or the SHV 3-10x42 for 2 oz more but half the cost)… but it’s still heavier than I’d like. I’d like the same rugged features but less weight (possibly by dropping the big turrets or going to a smaller 1” tube).

It seems like everyone I know of who is building really robust scopes is building robust dialing scopes, which obviously tacks on extra weight.
 
Joined
Sep 8, 2014
Messages
1,807
Location
Front Range, Colorado
What are you manufacturing? Are you talking about testing the concept, design, or production run? Do you QC every product or batch test? If production, what’s your lot size and AQL? Are you performing something like an astm standardized test with calibrated equipment, etc?

In the context of the scope tests we’re talking about, we (the consumers) are buying products that are typically past concept and design and are in full production. Most of these companies are brands that use third-party OEMs, and there’s a failure rate with any of these scopes. Not one single brand is 100%. It’s reasonable to expect that on occasion you’ll get a lemon; it happens. When you do see a failure (again, completely putting aside the procedure), you have no clue if that’s indicative of the entire population, that production run, or a 1:xxxxx event. Many don’t care, and that’s fine too.

Edit to add:
This is one of the many reasons why I was saying let’s put everything else aside, the basis, procedure, personalities, etc. Look solely at the results.

And yet, the results are only able to be interpreted if we know the logo first…

We manufacture the most accurate and precise Coriolis flow meters on the market. What I'm referring to is primarily the design and early production phase for new products. In terms of the type of testing, we do a wide variety, ranging from CMM measurement of components and locations to final calibration of the finished product. That includes field-condition testing of new designs to ensure they hold a calibrated zero (imagine that!). Either way, fail and pass do not have the same value at any stage of design or production.

While I agree with your point about scopes being past design and production, we have an unfortunate case wherein the vast majority of scopes don't do what they are supposed to. That leads to us having to do the manufacturer's job and determine capability ourselves. The reality is that there isn't a standardized test for doing so; I'd argue that the industry needs just that. So far, it appears that manufacturers aren't interested in being held to such a standard. In the meantime, this is the closest thing anyone has bothered to pursue.

Would you like to outline an alternative method of testing and put in the time and effort to implement it? I think we've all latched on to Form's results because (1) field failures suck, and (2) this is the only current source of any consistent testing of zero retention.

I don't think it has anything to do with logo. Some of these failures are completely catastrophic, interpreting that data isn't ambiguous at all.
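To put rough numbers on the AQL/lot-sampling point quoted above, here's a quick acceptance-sampling sketch. It's generic binomial math with illustrative numbers (the `p_accept` name and the sample sizes are mine, not from any industry standard), showing why a sample of one says almost nothing about a production run:

```python
from math import comb

def p_accept(p_defect, n, c=0):
    """Probability a lot passes a simple attributes sampling plan:
    draw n units at random and accept the lot if at most c are defective."""
    return sum(comb(n, k) * p_defect**k * (1 - p_defect)**(n - k)
               for k in range(c + 1))

# A lot that is 10% defective still passes a one-unit check 90% of the time,
# but the acceptance probability collapses as the sample grows.
for n in (1, 13, 50):
    print(f"n={n:2d}: P(accept 10%-defective lot) = {p_accept(0.10, n):.3f}")
```

With c=0 and n=1, the plan only rejects when the single unit happens to fail, which is exactly the "entire population, production run, or 1:xxxxx event" ambiguity being described.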
 
Joined
Aug 2, 2021
Messages
741
Yeah, I know about the NXS and I’d consider it (or the SHV 3-10x42 for 2 oz more but half the cost)… but it’s still heavier than I’d like. I’d like the same rugged features but less weight (possibly by dropping the big turrets or going to a smaller 1” tube).

It seems like everyone I know of who is building really robust scopes is building robust dialing scopes, which obviously tacks on extra weight.
Trijicon AccuPoints seem to have a reputation for being tough for a small, light scope; however, I have no experience with them.
 