I'm interested in Simplified Calibration Interval Analysis for various types of measuring instruments, and in pointers to appropriate literature... Thank you in advance...
I am not familiar with Simplified Calibration Interval Analysis (I'm assuming you're referring to a specific protocol), but one approach I am familiar with is measuring the Allan variance curve. Basically, it is total noise versus integration time. For example, you can collect a hundred samples of a nominally zero signal at a range of integration times. The shorter times will be dominated by white noise, and the longer times will eventually be dominated by 1/f noise. Usually there is a minimum between these extremes, which is the optimum integration time.
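To make that concrete, here is a minimal Python sketch of the non-overlapping Allan deviation computed from a raw zero-signal record; the simulated noise parameters and the list of averaging factors are illustrative assumptions, not part of any standard protocol.

```python
import numpy as np

def allan_deviation(samples, dt, m_values):
    """Non-overlapping Allan deviation of a zero-signal sample stream.

    samples  : 1-D array of raw readings taken at a fixed interval dt
    dt       : base sampling interval in seconds
    m_values : averaging factors; each integration time is tau = m * dt
    """
    samples = np.asarray(samples, dtype=float)
    taus, adevs = [], []
    for m in m_values:
        n_blocks = len(samples) // m
        if n_blocks < 2:
            break  # need at least two block averages to difference
        # Average the raw stream in consecutive blocks of m samples.
        blocks = samples[: n_blocks * m].reshape(n_blocks, m).mean(axis=1)
        # Allan variance: half the mean squared successive difference.
        avar = 0.5 * np.mean(np.diff(blocks) ** 2)
        taus.append(m * dt)
        adevs.append(np.sqrt(avar))
    return np.array(taus), np.array(adevs)

# Simulated zero-signal record: white noise plus a random-walk drift,
# so the Allan deviation curve dips and then rises again.
rng = np.random.default_rng(0)
n = 100_000
dt = 0.01  # 10 ms per raw sample (an assumed rate)
raw = rng.normal(0.0, 1.0, n) + 1e-2 * np.cumsum(rng.normal(0.0, 1.0, n))

taus, adevs = allan_deviation(raw, dt,
                              m_values=[1, 2, 5, 10, 20, 50, 100, 200, 500, 1000])
best = taus[np.argmin(adevs)]
print(f"optimum integration time ~ {best:.2f} s")
```

The minimum of the returned curve is the integration time where averaging stops helping and drift starts to dominate.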
Instruments that age even faster (e.g. spectrographic instruments with fast-aging light sources) may require daily calibration, or calibration after every x hours of operation; a simple bookkeeping sketch is below.
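A minimal sketch of that bookkeeping, assuming hypothetical limits of one calendar day or 8 operating hours, whichever comes first; both limits are placeholders to adapt to the instrument at hand.

```python
from datetime import datetime, timedelta

# Illustrative limits, not manufacturer values: recalibrate after one
# calendar day or after 8 accumulated operating hours, whichever is first.
MAX_DAYS = 1
MAX_OPERATING_HOURS = 8.0

def calibration_due(last_cal, hours_since_cal, now=None):
    """Return True when either the calendar or operating-hours limit is hit."""
    now = now or datetime.now()
    return (now - last_cal) >= timedelta(days=MAX_DAYS) \
        or hours_since_cal >= MAX_OPERATING_HOURS

# Same day, 5.5 operating hours: not yet due.
print(calibration_due(datetime(2024, 3, 1, 9, 0), hours_since_cal=5.5,
                      now=datetime(2024, 3, 1, 17, 0)))  # False
```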
Thinking of pH meter electrodes, a calibration (one-point 'check') prior to every single measurement is highly recommended to prevent measurement error.
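For illustration, a one-point check could be as simple as the sketch below; the pH 7.00 buffer and the ±0.05 pH tolerance are assumptions for the example, not prescribed values.

```python
# One-point check against a pH 7.00 buffer; tolerance is illustrative.
BUFFER_PH = 7.00
TOLERANCE = 0.05

def one_point_check(reading_ph):
    """Return True if the electrode reading of the buffer is within tolerance."""
    return abs(reading_ph - BUFFER_PH) <= TOLERANCE

reading = 7.03  # electrode reading in the pH 7 buffer
if one_point_check(reading):
    print("check passed; proceed with the measurement")
else:
    print("check failed; recalibrate (two-point) before measuring")
```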
For radiometric instruments such as pyranometers and UV meters, the typical calibration interval suggested by the manufacturers is around every 6 months.
Spectroradiometers with a photomultiplier tube as detector require more frequent verification, for instance with a standard source. It is also useful to verify angular response, linearity, stray light, etc.
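As one example of such a verification, a linearity check can be sketched as below; it assumes a setup where the irradiance on the detector is varied by a known factor (here, inverse-square scaling of the lamp distance), and the readings are made-up numbers for illustration.

```python
import numpy as np

# Linearity check: scale the irradiance by moving a point-like lamp to
# known distances (inverse-square law, an assumed setup) and test that
# the detector signal scales proportionally. Data are illustrative.
distances_m = np.array([0.5, 0.7, 1.0, 1.4, 2.0])
signals = np.array([4.02, 2.05, 1.00, 0.512, 0.248])  # detector readings

relative_irradiance = 1.0 / distances_m**2
relative_irradiance /= relative_irradiance[2]  # normalise to the 1 m point

# Fit signal = gain * irradiance through the origin; residuals expose
# any departure from linearity at each level.
gain = np.sum(signals * relative_irradiance) / np.sum(relative_irradiance**2)
residuals_pct = 100 * (signals - gain * relative_irradiance) / signals
print("nonlinearity per point (%):", np.round(residuals_pct, 2))
```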