Finished the Rokslide podcast with @Aaron Davidson. Don't fear, there were fewer "panty-wad bombs" in there to trigger folks.

I thought it was pretty good, and there was a little more detail on the scope evals and scope testing in general. As I understood it, Gunwerks does not test for impacts at all, only for recoil-induced stress of various sorts, although their scope fared very well in the eval here recently. Aaron's feeling was that much of the problem is recoil induced; he talked about the OEM's standard test being 1000 foot-pounds, while some of the bigger cartridges in a lighter rifle produced half again or double as much (quick back-of-the-envelope recoil math at the bottom of this post). They also discussed bonding some of the parts into the scope, and how that can both help and hurt performance depending on some variables.

One thing I found interesting: in the Cliff Gray podcast he talked about "identical" scopes from the same OEM testing differently as a reason to suspect a problem with the evaluation itself, while in this case he talked about changing specifications with the OEM, which resulted in a different eval result. I've worked with OEM manufacturers before, and in my experience you can absolutely specify different tolerances or procedures even for an off-the-shelf option. So while I haven't worked with this particular manufacturer, I think that, in general, scopes made to the same general pattern by the same factory testing differently is actually very realistic.

Aaron also acknowledged in the Cliff Gray podcast that impacts can be really rough on a scope. I know that, Aaron knows that, so maybe there's some sort of collab potential here to develop something that can evaluate various scopes' ability to handle impacts as well, the only trick being funding the equipment and figuring out who does something like that…

I really hope Ryan takes him up on the offer to go through the scope testing. It would be really interesting to hear the micro-specifics from an engineer in this field on what could be done to the Rokslide eval to "tighten up" the eval, or perhaps to provide some calibration to it, while keeping it approachable to an independent or to crowdsourcing it, which is critical to its usefulness.

Ultimately, Aaron's motivation for testing, or any scope company's motivation for testing, is not to produce a standardized evaluation across the industry so much as it is to test or develop their own product. Obviously many companies do testing on competitor products, but I'm not aware of anyone that publishes this or makes it available in a format for consumers. So this consumer testing is really focused on a totally different goal than a scope manufacturer's testing is. That's the source of the friction, I think: everybody talks about testing, but various entities within the industry have very different goals for it.

As a consumer, I want standardized, published results across the entire industry for all of the models I'm considering. That would be a ton of wasted time and energy for Gunwerks, whose product assortment doesn't include many of the scope footprints that I find critical in my own hunting, and publishing results from competitors could easily result in either animosity, or maybe even legal friction, that simply isn't helpful for them. But with a transparent, "open source" test that any engineer could reproduce with simple equipment…?
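For anyone curious about the "half again or double" bit, that's really just how free recoil energy scales with cartridge and rifle weight. Here's a quick sketch using the standard free-recoil-energy formula. The loads and rifle weights are made-up examples of mine, not numbers Aaron quoted, and this isn't a claim about whatever metric the OEM's 1000 foot-pound test actually measures.

```python
# Free recoil energy from conservation of momentum:
#   rifle momentum = bullet momentum + powder-gas momentum
#   E = 1/2 * m_rifle * v_rifle^2
# All load/rifle numbers below are hypothetical examples.

G = 32.174            # ft/s^2, converts pounds to slugs
GRAINS_PER_LB = 7000.0
V_GAS = 4700.0        # ft/s, commonly assumed average powder-gas velocity

def free_recoil_energy(bullet_gr, muzzle_fps, powder_gr, rifle_lb):
    """Free recoil energy in ft-lbf."""
    # Rearward rifle velocity from the momentum balance.
    v_rifle = (bullet_gr * muzzle_fps + powder_gr * V_GAS) / (GRAINS_PER_LB * rifle_lb)
    return 0.5 * (rifle_lb / G) * v_rifle ** 2

# Hypothetical "standard" load: 180 gr bullet at 2700 fps, 58 gr powder, 9 lb rifle.
baseline = free_recoil_energy(180, 2700, 58, 9.0)

# Hypothetical big magnum in a 7.5 lb mountain rifle: 212 gr at 2850 fps, 78 gr powder.
magnum = free_recoil_energy(212, 2850, 78, 7.5)

print(f"baseline: {baseline:.1f} ft-lbf  magnum/light rifle: {magnum:.1f} ft-lbf  "
      f"ratio: {magnum / baseline:.2f}x")
```

Recoil energy scales inversely with rifle weight, so dropping a pound or two of rifle while stepping up the cartridge gets you into that half-again-to-double range pretty quickly (the example above lands right around 2x).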
Probably wishful thinking, and certainly rambling. 
