I am fairly new to Real Time PCR and have some questions regarding the protocol.
Question 1 - When should input RNA and/or cDNA be normalized?
I was told that one should normalize the RNA concentration prior to making cDNA, and then ALSO normalize the cDNA concentrations prior to performing Real Time PCR. We use a NanoDrop to read OD and get nucleic acid concentrations, but is the NanoDrop accurate enough? The reverse transcriptase master mix contains a high concentration of dNTPs and primers, so can these cDNA readings be trusted? I have been blanking the spec with the no-RNA-input control, then reading the concentrations of my samples, which often come back very different from one another. I have recently read that it is not necessary to normalize the cDNA if the RNA input was normalized.
Question 2 - Should the delta Ct taken from housekeeping gene and GOI at 10-fold dilutions be the same?
I performed a serial dilution on the cDNA and ran Real Time PCR to make a standard curve and check efficiency. I think everything looked fine: R^2 was above 0.98, though efficiency was only around 0.6. I did this for both GAPDH (the HK gene) and our GOI. Out of curiosity, I checked the dCt values, and they are not consistent across the different concentrations. Are dCt values expected to be consistent across a 10-fold dilution series?
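For what it's worth, here is how I understand the efficiency calculation from the standard curve: fit Ct against log10(input), then E = 10^(-1/slope) - 1. This is only a sketch with made-up placeholder Ct values (not my actual run), using numpy:

```python
import numpy as np

# Illustrative 10-fold dilution series (ng/uL) and Ct values.
# Placeholder numbers for the sketch, not real data.
conc = np.array([5, 0.5, 0.05, 0.005, 0.0005])
ct = np.array([18.0, 21.4, 24.8, 28.2, 31.6])

# Fit Ct vs log10(input); a slope near -3.32 corresponds to ~100% efficiency.
slope, intercept = np.polyfit(np.log10(conc), ct, 1)

# Amplification efficiency: E = 10^(-1/slope) - 1
efficiency = 10 ** (-1.0 / slope) - 1

# R^2 of the linear fit
pred = slope * np.log10(conc) + intercept
r2 = 1 - np.sum((ct - pred) ** 2) / np.sum((ct - np.mean(ct)) ** 2)

print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}, R^2 = {r2:.3f}")
```

With these placeholder values the slope is -3.40, which works out to roughly 97% efficiency; my actual curve gave a much lower number, which is part of what confuses me.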
For example, the following refers to the dilution series with corresponding dCt values.
ng/uL     dCt
5        -0.149
0.5       0.378
0.05     -0.109
0.005    -0.270
0.0005    2.360
0.00005   0.312
I feel as though the dCt should not differ so greatly across the various cDNA concentrations.
Any help is appreciated. I am still learning a lot about this, and I learn something new every time I run another experiment.
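My expectation, sketched out below with illustrative Ct values (not my real run): if the HK gene and the GOI amplify with matching efficiency, dCt = Ct(GOI) - Ct(HK) should stay roughly constant across the dilution series, because both Cts shift by the same amount per 10-fold dilution.

```python
# Sanity check: dCt = Ct(GOI) - Ct(HK) across a dilution series.
# Illustrative Ct values only -- not my actual data.
dilutions = [5, 0.5, 0.05, 0.005, 0.0005]   # ng/uL cDNA input
ct_hk  = [18.0, 21.4, 24.8, 28.2, 31.6]     # housekeeping gene (e.g. GAPDH)
ct_goi = [20.1, 23.5, 26.9, 30.3, 33.7]     # gene of interest

for conc, hk, goi in zip(dilutions, ct_hk, ct_goi):
    print(f"{conc:>8} ng/uL  dCt = {goi - hk:+.2f}")
# With matching efficiencies, every dCt here comes out the same (+2.10),
# which is the flat pattern I expected but did not see in my data.
```

My real dCt values bounce around by more than 2 cycles, which is why I am asking whether that spread is normal or a sign of an assay problem.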