My question is about how the input/output equations for sensors are found, for example a pressure sensor that relates force [N] to resistance [ohms]. Would this equation be found by measuring the resistance of the sensor at different force values, multiple times, then plotting the averages as a curve, and fitting an equation to that curve (linear, power function, or polynomial)? How are the sensor uncertainties then calculated?
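To make my mental picture concrete, here is a small sketch of what I imagine the calibration step looks like. All the numbers are made up, and I'm assuming a simple ordinary least-squares straight-line fit of averaged readings:

```python
# Hypothetical calibration sketch: fit resistance-vs-force data to a
# straight line R = slope*F + intercept by ordinary least squares.
# Force values and resistance readings below are invented for illustration.

forces = [0.0, 5.0, 10.0, 15.0, 20.0]  # applied force [N]
# three repeated resistance readings [ohm] at each force value
readings = [
    [100.1, 99.9, 100.0],
    [109.8, 110.2, 110.0],
    [120.1, 119.9, 120.0],
    [130.2, 129.8, 130.0],
    [140.0, 140.1, 139.9],
]

# average the repeated readings at each force value
avg_r = [sum(r) / len(r) for r in readings]

# ordinary least-squares slope and intercept
n = len(forces)
mean_f = sum(forces) / n
mean_r = sum(avg_r) / n
slope = (sum((f - mean_f) * (r - mean_r) for f, r in zip(forces, avg_r))
         / sum((f - mean_f) ** 2 for f in forces))
intercept = mean_r - slope * mean_f

print(f"R(F) = {slope:.3f}*F + {intercept:.3f}")  # -> R(F) = 2.000*F + 100.000
```

Is the real procedure essentially this, just with more points and a goodness-of-fit check to decide between linear, power, and polynomial models?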
Textbooks and online sources often say the industry standard for uncertainty is given at a confidence interval of 95%. As I understand it, this means that out of 100 measurements, 95 would fall within the stated +/- uncertainty of the specified value. For example, a temperature sensor states an uncertainty of 1 deg C within a certain range, so 95% of the time the measurement is within 1 deg C of the true value.
However, I am not sure how this uncertainty is calculated. Is the uncertainty 2 times the standard deviation?
If, for example, a temperature sensor claims an uncertainty of +/- 1 deg C, does that mean the manufacturer tested the sensor many times, say a thousand times at a controlled temperature of 25 C, found the mean of those 1000 measurements to be 25 C, and then took the uncertainty as +/- 2 times the standard deviation of the 1000 data points, giving the final uncertainty of +/- 1 deg C at 95% confidence?
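In other words, is the calculation I'm guessing at something like this sketch? (The 1000 readings are simulated with made-up Gaussian noise of 0.5 deg C standard deviation, so that 2 standard deviations comes out near the +/- 1 deg C in my example.)

```python
import random

random.seed(0)

# Hypothetical manufacturer test: 1000 readings at a controlled 25 deg C,
# with sensor noise modeled as Gaussian with 0.5 deg C standard deviation.
readings = [random.gauss(25.0, 0.5) for _ in range(1000)]

n = len(readings)
mean = sum(readings) / n
# sample standard deviation (with Bessel's correction, n - 1)
std = (sum((x - mean) ** 2 for x in readings) / (n - 1)) ** 0.5

# 2 standard deviations ~ 95% coverage for Gaussian noise
uncertainty = 2 * std
print(f"mean = {mean:.2f} deg C, uncertainty = +/- {uncertainty:.2f} deg C")
```

If that is roughly right, I'd also like to know whether manufacturers repeat this at several temperatures across the stated range, or quote the worst case.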