Hello,

I am not a math major, and I need to show mathematically, with some supporting results, how the probability of detecting errors improves with scanning frequency. Can anyone help, or share some literature about this?

For better understanding:

I want to do something similar to bar code scanning: what is the probability of reading a bar code on the first attempt, and how does it improve with repeated attempts? A scanner often misses the bar code the first time, but will eventually read it as you increase the number of scanning attempts (moving the same bar code through the scanner again and again). So, in the context of this example, I want to show the probability of a true positive and a true negative on the first attempt, and how that probability improves with more scanning attempts on the same product. A sketch of what I have in mind so far is below.
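For what it is worth, here is a minimal sketch of one standard way this situation is modeled, under the assumption that each scan is an independent Bernoulli trial with the same per-scan read probability p (that independence assumption is mine, not something established above). Under it, the probability of at least one successful read in n scans is 1 - (1 - p)^n, which approaches 1 as n grows; the number of scans until the first success follows a geometric distribution, which may be a useful search term for literature. The values of p and n below are made-up illustrative numbers.

```python
import random

def detect_probability(p: float, n: int) -> float:
    """Closed-form probability of at least one successful read
    in n independent scans, each succeeding with probability p:
    1 - (1 - p)^n."""
    return 1.0 - (1.0 - p) ** n

def simulate(p: float, n: int, trials: int = 100_000) -> float:
    """Monte Carlo check: fraction of trials in which at least
    one of n simulated scans succeeds."""
    hits = 0
    for _ in range(trials):
        if any(random.random() < p for _ in range(n)):
            hits += 1
    return hits / trials

if __name__ == "__main__":
    p = 0.6  # hypothetical per-scan read probability (illustrative only)
    for n in (1, 2, 3, 5, 10):
        print(f"n={n:2d}  closed-form={detect_probability(p, n):.4f}  "
              f"simulated={simulate(p, n):.4f}")
```

If the true-negative side matters too (correctly rejecting a bad or wrong code), the same repeated-trial argument applies with a second per-scan parameter for the false-positive rate, though how those combine depends on the decision rule used across scans.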

Thank you!!!
