When we run a measurement, depending on the model we use (and the way we obtain the optical constants, e.g. a dispersion law or NK files), we get different thicknesses and different goodness-of-fit (GOF) values for the same sample. How do we know which result is more accurate? Is GOF the only parameter we should consider when choosing between these results?

For example, I just ran a test on a sample with SiO2 on top (approx. 300 Å thick) over Nb2O5 (approx. 100 Å thick) on a glass substrate. The machine software, WinSE, gave me a GOF of 89% (SiO2 54.76 Å and Nb2O5 100 Å!), which seems very off. I then pulled the results into Winelli (the data-analysis software), modeled the stack with NK values from the database, and got R^2 = 0.70 (SiO2 355.4 Å and Nb2O5 233.2 Å). After changing the model in several ways I was able to reach R^2 = 0.80 (SiO2 383 Å and Nb2O5 164 Å), and I could not get R^2 any higher than that. I'm not sure how accurate our results are. Even a 5 Å tolerance would be acceptable for us, but the result fluctuates a lot between models, and we don't know which one is more accurate or how accurate the best result is. As you can see, there is a roughly 30 Å difference in thickness between the R^2 = 0.80 and R^2 = 0.70 fits, which means that even my best result might be VERY far from the actual values.

What tool should the end user use to verify how good the results are?

(Definition of R^2 or GOF: a measure of how much knowing X (in this case, our model) helps you predict Y (in this case, the thickness).)
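For concreteness, here is a minimal sketch of what R^2 computes when comparing measured data against model-predicted data. The function name and the toy data are my own illustration, not part of WinSE or Winelli; the point is only that R^2 compares the fit residuals to the total variance of the measurement, so a value of 0.80 still leaves a lot of unexplained residual:

```python
import numpy as np

def r_squared(measured, modeled):
    """Coefficient of determination: 1 - SS_residual / SS_total."""
    measured = np.asarray(measured, dtype=float)
    modeled = np.asarray(modeled, dtype=float)
    ss_res = np.sum((measured - modeled) ** 2)          # residual sum of squares
    ss_tot = np.sum((measured - np.mean(measured)) ** 2)  # total sum of squares
    return 1.0 - ss_res / ss_tot

# Toy example: "measured" curve with a small perturbation vs. an ideal model.
x = np.linspace(0.0, 1.0, 50)
y_measured = 2.0 * x + 0.01 * np.sin(40.0 * x)
y_model = 2.0 * x
print(r_squared(y_measured, y_model))  # close to 1 for a good fit
```

Note that two different models can reach similar R^2 values while disagreeing on thickness, because R^2 says nothing by itself about parameter correlation or uniqueness of the fit.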

Thanks.
