Normally the calibration data allow us to calculate the response function used in the validation step. The choice of levels depends on the analyst and the capacity of the laboratory, but the minimum number of calibration levels is generally 3; you can work with more if you have enough time and reagents. Before starting a validation study, you should make a preliminary assessment of your needs in terms of time and reagents.
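For illustration, here is a minimal sketch of fitting a linear response function from three calibration levels and back-calculating an unknown; the concentration and response values (and the assumption of a linear response) are hypothetical:

```python
import numpy as np

# Hypothetical calibration data: 3 concentration levels (e.g. ng/mL)
# and the instrument response measured at each level.
conc = np.array([10.0, 50.0, 100.0])
response = np.array([0.21, 1.02, 2.05])

# Fit a linear response function: response = slope * conc + intercept
slope, intercept = np.polyfit(conc, response, deg=1)

# Back-calculate the concentration of an unknown from its response
unknown_response = 0.85
unknown_conc = (unknown_response - intercept) / slope
print(f"Estimated concentration: {unknown_conc:.1f} ng/mL")
```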
It must be noted that what matters most is the set of levels selected for validation and the number of repetitions to be performed at each concentration level.
When validating with the total-error approach, you should calibrate exactly as you do for your routine method. The validation method should not influence your analytical method design. For example, some software packages let you choose a one-point calibration.
However, you have to choose your validation levels to cover your expected range of accuracy. Don't mistake calibration levels for validation levels.
The total-error approach is based on the routine method; don't forget this, it's the most important point. If there are other external guidelines in your field, follow them, of course.
I have used the total-error approach for ten years (and English for less... :-)).
No! When you validate by the total-error method, the calibration remains the calibration; it keeps its usual role.
The upper and lower limits are defined by the validation standards. Validation standards are treated like real samples in routine analysis. Think of your validation as a real sequence with calibration standards and unknown samples.
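As a rough sketch of that idea, here is how you might check one validation level quantified as an unknown against acceptance limits; the numbers are hypothetical, and the criterion is a simplified total-error sum rather than the full β-expectation tolerance interval of an accuracy profile:

```python
import numpy as np

# Hypothetical back-calculated results for one validation level
# (nominal 50 ng/mL), measured like unknowns over 3 runs x 3 replicates.
nominal = 50.0
results = np.array([48.9, 51.2, 49.5, 50.8, 47.6, 52.1, 49.9, 50.4, 48.7])

bias_pct = 100.0 * (results.mean() - nominal) / nominal  # trueness
cv_pct = 100.0 * results.std(ddof=1) / nominal           # precision

# Simplified total-error check against +/-15% acceptance limits
# (a full accuracy profile would use beta-expectation tolerance intervals)
total_error_pct = abs(bias_pct) + 2.0 * cv_pct
print(f"bias {bias_pct:+.1f}%, CV {cv_pct:.1f}%, total error {total_error_pct:.1f}%")
print("PASS" if total_error_pct <= 15.0 else "FAIL")
```

Repeating this at each validation level, over several runs, is what lets the validation mimic a real routine sequence.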
I have some posters and presentations about it but they are in French, sorry.
External calibration curve: a series of known concentrations, ideally spiked into the original matrix and processed the same way, where the lowest level is the lower limit of quantification (LLoQ) and the highest level is the upper limit of quantification (ULoQ).
You can claim a curve to be a calibration curve if you have a linear relationship between these two points.
It is common practice to determine these points both for the total method (which is what you are interested in) and for the instrument alone. They also define what is known as the linearity range.
If you want to know the linearity range of your instrumentation, you may use pure standard solutions.
If you want to know the linearity range of your method, you need to spike known amounts of calibrant into the matrix, process them as you would an unknown sample, and analyze them.
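A minimal way to inspect that linearity, with hypothetical spiked-matrix data (R² and relative residuals here are only a quick screen, not a formal lack-of-fit test):

```python
import numpy as np

# Hypothetical spiked-matrix calibrants, processed like unknown samples
spiked = np.array([5.0, 10.0, 25.0, 50.0, 100.0])    # known amounts
measured = np.array([0.09, 0.21, 0.52, 1.01, 2.08])  # instrument response

slope, intercept = np.polyfit(spiked, measured, deg=1)
predicted = slope * spiked + intercept

# Quick linearity screen: R^2 and relative residuals per level
ss_res = np.sum((measured - predicted) ** 2)
ss_tot = np.sum((measured - measured.mean()) ** 2)
r_squared = 1.0 - ss_res / ss_tot
rel_resid = 100.0 * (measured - predicted) / predicted
print(f"R^2 = {r_squared:.4f}")
print("relative residuals (%):", np.round(rel_resid, 1))
```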
Alternatively, you might use an external calibration curve with pure standards and a separate approach to measure recovery during sample preparation, although this approach is more prone to error.
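That recovery measurement reduces to a simple ratio; a minimal sketch with hypothetical numbers:

```python
# Hypothetical recovery check: spike a known amount into the matrix,
# process it, and quantify the result on the pure-standard curve.
spiked_amount = 50.0        # ng/mL added to the matrix
measured_after_prep = 43.5  # ng/mL found after sample preparation

recovery_pct = 100.0 * measured_after_prep / spiked_amount
print(f"Recovery: {recovery_pct:.0f}%")  # here ~87%, applied as a correction
```

Every extra correction step like this adds its own uncertainty, which is why the matrix-matched curve is preferred when feasible.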