In the 21st century, we can safely say that absolutely all modern achievements in science rest on the successes of modeling theory, from which practical recommendations useful in physics, technology, biology, sociology, etc. are extracted. Moreover, applying the principles of measurement theory to the determination of the fundamental constants allows us to check the consistency and correctness of the basic physical theories. In addition, the quantitative predictions of the main physical theories depend on the numerical values of the constants that enter them: each new significant digit can lead to the detection of a previously unknown inconsistency or, conversely, can eliminate an existing inconsistency in our description of the physical world. At the same time, scientists have come to a clear understanding of the limits of our efforts to achieve very high measurement accuracy.
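As a minimal sketch of such a consistency check (the numbers below are purely illustrative, not actual constant values), two independent determinations of the same quantity can be compared through their normalized difference:

# Sketch of a consistency check between two independent determinations
# of the same constant (illustrative, hypothetical numbers only).

def normalized_difference(x1, u1, x2, u2):
    """Return |x1 - x2| divided by the combined standard uncertainty."""
    return abs(x1 - x2) / (u1**2 + u2**2) ** 0.5

# Hypothetical values: a theoretical prediction and an experimental result,
# each with its standard uncertainty.
theory, u_theory = 1.23456, 0.00004
experiment, u_exp = 1.23462, 0.00005

score = normalized_difference(theory, u_theory, experiment, u_exp)
print(f"normalized difference = {score:.2f} sigma")
# A value well above about 2 would hint at an inconsistency between theory
# and experiment; below that, the two results are mutually compatible.

Under this convention, a more precise measurement (smaller uncertainties) can turn an apparently compatible pair of values into a significant discrepancy, which is exactly how "each new significant digit" can expose a hidden inconsistency.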

The very act of measurement already presupposes the existence of a physical and mathematical model describing the phenomenon under study. Measurement theory focuses on the process of experimentally determining the values of quantities using special equipment called measuring instruments. This theory therefore covers only data analysis and the procedure for measuring the observed quantity, i.e., the stages that come after the mathematical model has been formulated. Thus, the uncertainty that exists before any experiment or computer simulation, caused by the limited number of quantities recorded in the mathematical model, is usually ignored in measurement theory.
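A minimal sketch of this distinction, assuming standard GUM-style root-sum-of-squares propagation and purely hypothetical numbers: measurement theory propagates the uncertainties of the quantities that appear in the model, while any contribution from quantities left out of the model must be budgeted separately and, in practice, is often simply omitted.

import math

# Sketch: uncertainty of a derived quantity y = f(a, b) when the model
# includes only the quantities a and b (all values are hypothetical).

def combined_uncertainty(sensitivities, uncertainties):
    """GUM-style combination: u_y = sqrt(sum((c_i * u_i)**2))."""
    return math.sqrt(sum((c * u) ** 2 for c, u in zip(sensitivities, uncertainties)))

# Quantities recorded in the model, with their standard uncertainties.
u_a, u_b = 0.02, 0.05          # instrument (input) uncertainties
c_a, c_b = 1.0, 0.3            # sensitivity coefficients dy/da, dy/db

u_model_inputs = combined_uncertainty([c_a, c_b], [u_a, u_b])

# What measurement theory usually does NOT account for: the a priori
# uncertainty due to quantities that were never written into the model.
u_missing_quantities = 0.04     # hypothetical; unknown in practice

u_total = math.hypot(u_model_inputs, u_missing_quantities)
print(f"propagated input uncertainty : {u_model_inputs:.3f}")
print(f"with neglected model term    : {u_total:.3f}")

The point of the sketch is only that the second term exists before any measurement is made: no improvement of the measuring instruments reduces it, because it comes from the choice of the model itself.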
