I have various concentrations of heavy metals determined by the AAS method. Is there any specific formula for estimating the sensitivity of the analytical methods?
You should analyze a STANDARD substance, i.e. a substance of known composition, and determine the metal content in it by AAS. In this way you can assess the reliability of your results.
Sensitivity is the amount of signal per unit of analyte. If you are referring to AAS, the sensitivities for many analytes are already documented. On its own it is not a very useful concept. Please look up the ISO/IEC guidelines for test methods.
There are several key metrics used to determine the sensitivity of an analytical method:
Limit of detection (LOD): This is the lowest amount of an analyte that can be detected with a high degree of confidence. It is usually estimated from the standard deviation of blank measurements at a set confidence level, most commonly 3 times the standard deviation of the blank (converted to concentration units via the calibration slope). A lower LOD means higher sensitivity.
Limit of quantification (LOQ): This is the lowest amount of an analyte that can be quantified with acceptable precision and accuracy. It is usually estimated from calibration standards at low concentrations, with 10 times the standard deviation of the blank being a common convention. A lower LOQ also indicates higher sensitivity.
Calibration curve: A sensitive method has a steep calibration curve, i.e. a high slope, meaning a large change in response for a small change in concentration (a worked example combining the slope, LOD, and LOQ is sketched after this list).
Signal-to-noise ratio (S/N): A higher S/N ratio means the analyte signal can be more easily distinguished from the baseline noise, enabling detection of lower concentrations.
Continuing calibration verification (CCV): CCV standards run at low concentrations provide evidence that the method can accurately and precisely measure low levels of the analyte. Larger errors at the low CCV levels indicate lower sensitivity.
Spike recovery: Analyzing samples spiked at low concentration levels provides a measure of sensitivity; recoveries close to 100% at low spike levels demonstrate the ability to detect small additions of the analyte.
Detection limit (DL): The DL is traditionally defined as 3 times the standard deviation of the blank. A lower DL indicates higher sensitivity. In practice it is essentially the same quantity as the LOD, and the term LOD is now generally preferred.
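To make the formulas above concrete, here is a minimal Python sketch of how the calibration slope, LOD, LOQ, S/N, and spike recovery are commonly calculated from AAS data. All of the absorbance and concentration values below are made-up illustrative numbers, and the 3σ/10σ factors are simply the common conventions mentioned above; your own acceptance criteria may differ.

```python
import numpy as np

# Hypothetical AAS calibration data: absorbance vs. concentration (mg/L)
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
absorbance = np.array([0.002, 0.051, 0.103, 0.208, 0.411])

# Sensitivity = slope of the calibration curve (absorbance units per mg/L)
slope, intercept = np.polyfit(conc, absorbance, 1)

# Replicate blank readings (absorbance) used to estimate baseline noise
blanks = np.array([0.001, 0.003, 0.002, 0.002, 0.001, 0.003, 0.002])
sigma_blank = np.std(blanks, ddof=1)   # standard deviation of the blank

# Common conventions: LOD = 3*sigma/slope, LOQ = 10*sigma/slope (concentration units)
lod = 3 * sigma_blank / slope
loq = 10 * sigma_blank / slope

# Signal-to-noise ratio for a low-level sample reading
sample_abs = 0.025
snr = (sample_abs - blanks.mean()) / sigma_blank

# Spike recovery: (spiked result - unspiked result) / amount spiked * 100
unspiked = 0.12        # mg/L measured in the unspiked sample
spiked = 0.61          # mg/L measured after adding a 0.50 mg/L spike
spike_added = 0.50
recovery_pct = (spiked - unspiked) / spike_added * 100

print(f"Calibration slope (sensitivity): {slope:.4f} A per mg/L")
print(f"LOD ~ {lod:.3f} mg/L   LOQ ~ {loq:.3f} mg/L")
print(f"S/N of low-level sample: {snr:.1f}")
print(f"Spike recovery: {recovery_pct:.0f}%")
```

In practice you would replace these arrays with your own calibration standards and replicate blank readings for each metal and instrument setting.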
Those are some of the key metrics analytical chemists use to determine and optimize the sensitivity of their methods, especially for trace analysis of heavy metals or other contaminants. Let me know if you have any other questions!