I know it can be done by first converting glucose to carbon dioxide and using an isotope-ratio MS (or by Isotope Ratio Infrared Spectroscopy). But can I do it by recording mass spectra of glucose?
For a number of reasons, or perhaps better, a combination of reasons, it is not possible to quantitatively detect differences in isotopic composition at natural abundance levels using a molecular mass spectrometer (MS) equipped with an electron multiplier (EM) as detector. These reasons chiefly comprise (1) the ion statistics of a standard electron impact source, (2) the detector characteristics and counting statistics of an EM, and (3) the fact that scanning MS instruments use a single detector and therefore cannot monitor any particular isotopomeric ion pair (or triplet, etc.) of X+ and (X+1)+ (and (X+2)+, etc.) simultaneously and continuously, which means data are lost.
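To put a rough number on the ion-statistics point (1), here is a back-of-envelope Python sketch of my own, assuming nothing beyond ideal Poisson (shot-noise-limited) counting of the two isotopologue ions:

```python
# Back-of-envelope illustration (my own, not from the answer above):
# for an isotope ratio R = n13/n12 measured by counting ions, Poisson
# statistics give a relative standard error of
#     sigma_R / R = sqrt(1/n13 + 1/n12),
# so at natural abundance the rare isotope dominates the uncertainty.

R_NATURAL = 0.0112372  # approx. natural 13C/12C ratio (PDB standard)

def ions_needed(target_rel_precision, r=R_NATURAL):
    """Total ions (n12 + n13) needed to reach a given sigma_R/R."""
    # sigma_R/R = sqrt((1/r + 1) / n12)  with n13 = r * n12
    n12 = (1.0 / r + 1.0) / target_rel_precision**2
    return n12 * (1.0 + r)

# 0.1 permil relative precision (what IRMS routinely delivers):
print(f"{ions_needed(1e-4):.2e} ions")  # ~9.10e+09
# 10 % relative precision, for comparison:
print(f"{ions_needed(0.10):.2e} ions")  # ~9.10e+03
```

The ~10^10 ions required for per-mil-level precision are simply not available when a scanning instrument dwells only milliseconds per m/z on a transient peak, which is why points (1) and (3) compound each other.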
In addition, repeat MS analysis of subsamples of the same compound will yield mass spectra that may look the same but which upon closer inspection show the signal intensities of key fragment ions varying with an RSD of about 10 %. This in turn translates into an achievable precision for isotope abundance measurement of approx. 0.2 atom% (on a good day, with a favourable wind), which is not enough to distinguish between the 13C abundance of beet sugar (typically -26 ‰ or 1.08265 atom%) and that of cane sugar (typically -11 ‰ or 1.09915 atom%), a difference of only 0.0165 atom%.
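For reference, the conversion from δ13C (per mil) to atom% behind those beet/cane figures is straightforward; a minimal sketch, assuming the PDB standard ratio 13C/12C = 0.0112372 (my assumption, but it reproduces the numbers quoted above):

```python
R_PDB = 0.0112372  # 13C/12C isotope ratio of the PDB standard

def delta_to_atom_percent(delta_permil):
    """Convert a delta-13C value (per mil vs PDB) to 13C atom percent."""
    r = R_PDB * (1.0 + delta_permil / 1000.0)  # sample 13C/12C ratio
    return 100.0 * r / (1.0 + r)               # fraction of all C that is 13C

beet = delta_to_atom_percent(-26.0)  # ~1.08265 atom%
cane = delta_to_atom_percent(-11.0)  # ~1.09915 atom%
print(f"beet: {beet:.5f}, cane: {cane:.5f}, "
      f"difference: {cane - beet:.4f} atom%")  # difference: 0.0165 atom%
```

The 0.0165 atom% difference is more than an order of magnitude below the ~0.2 atom% precision estimated above, which is the whole problem in one number.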
All these confounding factors taken together mean that isotopic abundance measurements in SIM mode are limited to isotopically labelled compounds with an enrichment of at least 0.5 atom% excess (APE); e.g., in the case of 13C, an abundance level of 1.61 atom% (natural abundance of about 1.11 atom% plus 0.5 APE).
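In other words, atom% excess is just the measured atom% minus the natural-abundance baseline; a trivial check of the 1.61 atom% figure, taking ~1.11 atom% natural 13C abundance as the baseline (my assumption):

```python
ATOM_PERCENT_13C_NATURAL = 1.11  # approx. natural 13C abundance, atom%

def atom_percent_excess(measured_atom_percent,
                        baseline=ATOM_PERCENT_13C_NATURAL):
    """Atom percent excess (APE) above the natural-abundance baseline."""
    return measured_atom_percent - baseline

# The 0.5 APE detection limit quoted above corresponds to:
print(ATOM_PERCENT_13C_NATURAL + 0.5)   # 1.61 atom%
print(atom_percent_excess(1.61))        # 0.5 APE
```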
Thanks for your answer and explanations; they are really helpful. I am considering using IRIS (isotope ratio infrared spectroscopy) for measuring 13CO2/12CO2 ratios in breath samples. One instrument has, according to its specifications, an SD of
Based on 20+ years of experience in stable isotope analysis, I would always advise visiting the manufacturer's application lab to see, feel and test instruments first-hand before making a purchasing decision. If you are planning to carry out both qualitative (yes/no) and quantitative (e.g. substrate oxidation) breath tests, I would advise going for the most sensitive instrument, since labelled substrates cost money and it makes a difference whether you have to use 10 APE or 20 APE in order to see a measurable and significant change above baseline. If memory serves, Thermo are currently quoting ± 0.05 ‰ as a measure of precision for their Delta Ray, but even at ± 0.1 ‰, if I had to make a purchasing decision, this is the instrument I would go for. It is a mid-infrared instrument, a technique that to my mind is superior to CRDS. It has been designed for 13C analysis of CO2 in air, so analysis of CO2 in breath will be a piece of cake. In addition, its software interface is set up in such a way that all QC/QA procedures can be run automatically and conditionally before sample runs can proceed.
Yes, I will look closer at the Delta Ray. Right now I have access to an IRIS 3 instrument (Kibion/Wagner Analysen Technik), but as you point out, 13C-labeled substances are quite expensive, so in the long run it is not economical to use a cheaper instrument. The day I have enough projects and financing for our own instrument, we will certainly do hands-on testing. Thanks once again for sharing your experience.