Mass spectrometry has long been a valuable quantitative tool used across many industries for numerous applications including food [1], beverage [2], pharmaceutical [3] and environmental analysis [4]. Its utility ranges from use as an alternative detector to UV in LC for difficult non-chromophoric analytes to those applications that fully exploit the extra resolving power or sensitivity that mass spectrometry can afford.
Whilst some industries have fixed methodologies and technologies to deliver their analyses, the plethora of possible approaches and instrumentation can be somewhat daunting for a new application or a less-experienced user. This article provides a short overview of some analytical considerations when performing quantitative LC/MS and GC/MS.
Analytical method development
Choosing an MS technique for method development involves a careful compromise between considerations including, but not limited to:
- Development time
- Numbers of samples
- Matrix composition
- Analysis time/throughput
- Required sensitivity and accuracy
The compromise reached will be a balance between the time and money invested in method development, standard preparation and instrumentation juxtaposed against the benefits of having an accurate, robust, specific and valid method. The unique business considerations will determine what constitutes a fit-for-purpose method for each case.
Whilst most error sources in the experiment can be minimized, the one intrinsic source of error is the measurement of the signal intensity. For good measurements, the maximum number of ions should be sought.
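Why ion count matters can be sketched with simple counting statistics: for a shot-noise-limited (Poisson) measurement, the relative standard deviation of the signal scales as 1/√N. A minimal illustration (the ion counts below are arbitrary):

```python
import math

def relative_error(ion_count):
    """Relative standard deviation of a shot-noise-limited (Poisson) ion count: 1/sqrt(N)."""
    return 1.0 / math.sqrt(ion_count)

# Tenfold more ions buys roughly a sqrt(10) (~3.2x) improvement in precision.
for n in (100, 1_000, 10_000):
    print(f"{n:>6} ions -> {relative_error(n):.1%} RSD")
```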
Ways to maximize ion count include:
- Sample pre-enrichment through physical methods; e.g., solid-phase extraction (SPE)
- Changing the eluent composition to improve ionization
- Chemical derivatization
- Increasing the proportion of time an analyte spends in the detector. Moving from full-scan spectrum monitoring (Table 1) to single ion monitoring (SIM) would typically increase the time the analyte ion spends in the detector, with a corresponding gain in sensitivity. This is normally the method of operation with a single quadrupole instrument when quantifying trace-level (ppm) impurities. In practice, the instrument would typically operate in a compromise multiple-ion detection mode to allow simultaneous quantification of more than one analyte, with a corresponding loss of sensitivity.
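The duty-cycle trade-off above can be sketched numerically. Assuming, purely for illustration, that the total cycle time is shared evenly across all monitored channels:

```python
def sim_gain(full_scan_channels, sim_ions):
    """Approximate per-ion dwell-time gain of SIM over a full scan, assuming
    the total cycle time is shared evenly across all monitored channels."""
    return full_scan_channels / sim_ions

# e.g., a 500 m/z-unit full scan versus monitoring one or four ions:
print(sim_gain(500, 1))  # all dwell time on the single ion of interest
print(sim_gain(500, 4))  # multiple-ion mode shares the gain across four ions
```

The second call shows the compromise the text describes: each extra ion monitored divides the available dwell time, costing sensitivity per channel.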
Table 1: Some general MS terms for LC and GC.
With an ever-increasing need to improve equipment utilization, maximizing the throughput of analytical equipment is important. In many MS-based quantitative analyses the rate-limiting steps are chromatographic run-time and sample preparation.
For any given chromatographic analysis, shortening the run-time will compromise resolution. In a carefully controlled manufacturing environment, the sample components are generally fixed, and a chromatographic method will be developed to give the required resolution. In less well-characterized samples, resolution cannot be guaranteed, and the added confidence provided by mass selectivity becomes paramount. High-throughput analysis labs will typically employ run-times of 3 minutes or less, allowing 20+ samples to be run per hour. Even the improvements in resolution afforded by ultra-high-pressure liquid chromatography (UHPLC) may not guarantee complete separation in many complex media.
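The throughput arithmetic behind those figures is simple; a sketch (the half-minute overhead value is an assumption for illustration):

```python
def samples_per_hour(run_time_min, overhead_min=0.0):
    """Throughput for back-to-back injections, with optional per-sample
    overhead (autosampler cycle, column re-equilibration, etc.)."""
    return 60.0 / (run_time_min + overhead_min)

print(samples_per_hour(3.0))       # a 3-minute run gives 20 samples/hour
print(samples_per_hour(3.0, 0.5))  # ~17/hour once per-sample overhead is added
```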
An extra layer of selectivity can be incorporated into the experiment through Selected Reaction Monitoring (SRM), typically via a triple quadrupole instrument (Figure 1). SRM not only selects a single mass ion to quantify but then filters further, via a distinct fragment ion, to discriminate it from other components that share the initial mass ion. SRM data are relatively simple, typically producing a single peak. SRM is particularly suited to complex matrices such as blood plasma, where good specificity is needed. A robust and discrete fragmentation is required, and the analyst will need to optimize the collision energy and collision gas to maximize the desired fragmentation.
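The double mass filter an SRM transition applies can be sketched as a simple predicate; the transition m/z values and tolerance below are hypothetical, chosen only for illustration:

```python
def matches_transition(precursor_mz, product_mz, transition, tol=0.5):
    """True when an observed precursor/product ion pair falls within
    tolerance of the monitored SRM transition (Q1 m/z, Q3 m/z)."""
    q1, q3 = transition
    return abs(precursor_mz - q1) <= tol and abs(product_mz - q3) <= tol

# Hypothetical transition: precursor m/z 310.2 fragmenting to product m/z 148.1
transition = (310.2, 148.1)
observed = [(310.2, 148.1), (310.2, 212.0), (309.1, 148.1)]
hits = [ion for ion in observed if matches_transition(*ion, transition)]
print(hits)  # only pairs passing BOTH mass filters survive
```

A component sharing the precursor mass but fragmenting differently (second pair), or one with a different precursor mass (third pair), is rejected, which is the extra discrimination the text describes.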
Figure 1: Schematic of an SRM experiment showing parent mass selection, collisionally activated dissociation, fragmentation and product ion formation.
The other main problem with poor chromatographic resolution is variation in analyte response caused by matrix effects. In particular, ion suppression from interfering components and cross-talk effects (interference from product ions of a preceding SRM transition) can have a dramatic effect on the precision of the method.
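One common way to quantify ion suppression is to compare the response of an analyte spiked into blank matrix extract against the same amount in neat solvent; the peak areas below are invented for illustration:

```python
def matrix_effect_percent(area_in_matrix, area_in_solvent):
    """Matrix effect as a percentage, comparing analyte spiked into blank
    matrix extract against the same amount in neat solvent.
    Values below 100% indicate ion suppression; above 100%, enhancement."""
    return 100.0 * area_in_matrix / area_in_solvent

# Invented peak areas: the matrix response is 72% of the solvent response,
# i.e., 28% ion suppression.
print(matrix_effect_percent(7.2e5, 1.0e6))
```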
Whilst traditional HPLC analysis uses interspersed standards and samples for quantitative analysis, the problems of ion suppression and of variability in cone voltage and eluent composition cannot be completely overcome, due to technical limitations of the instrument.
Cone voltages: Variation in cone voltage can have a significant effect on how an analyte ionizes. The ratio of the adducts can vary significantly across a range of voltages; e.g., in positive-ion electrospray, the ratio of M+H, M+NH4, M+Na and M+K ions will vary, thus affecting the intensity of the ion being monitored and quantified.
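The practical consequence is that the quantified ion is only one of several possible species. A small sketch computing the m/z of each common positive-ion electrospray adduct for a hypothetical neutral mass (adduct masses are monoisotopic):

```python
# Monoisotopic masses (Da) of common positive-ion electrospray adduct species.
ADDUCT_MASS = {
    "M+H": 1.00728,     # proton
    "M+NH4": 18.03383,  # ammonium
    "M+Na": 22.98922,   # sodium cation
    "M+K": 38.96316,    # potassium cation
}

def adduct_mz(neutral_mass, adduct):
    """m/z of a singly charged adduct of a neutral molecule."""
    return neutral_mass + ADDUCT_MASS[adduct]

# Hypothetical analyte with a neutral monoisotopic mass of 300.1000 Da:
for name in ADDUCT_MASS:
    print(f"{name}: {adduct_mz(300.1000, name):.4f}")
```

If conditions shift intensity between these channels, a method monitoring only one of them will see an apparent change in analyte response even at constant concentration.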
Eluent composition: Small changes in eluent composition can affect ionization processes [5] and thus, where possible, the use of isocratic HPLC methods is strongly recommended.
The sample introduction method is of key importance in GC/MS quantitation and depends upon the analyte volatility in the sample matrix. The use of headspace analysis to reduce contamination of the GC inlet can be highly beneficial in routine quantitative analysis. EI GC/MS, being a high-energy ionization process, produces a large amount of fragmentation, and thus the electron energy needs to be very carefully controlled to ensure a consistent fragment distribution when developing methods.
The most reliable way to improve quantitation is to use an internal rather than external standard. The best internal standard is one that has exactly the same physical and chemical properties as the analyte; in this case it will be subject to the same ion suppression, adduct distribution, etc., as the analyte. The optimum standard is thus normally a stable-isotope-labelled variant of the analyte doped into the sample. To avoid the complexity of untangling the peaks of the labelled standard from the natural isotope peaks of the analyte, the standard would normally be prepared with at least an additional 3 mass units (more for halogenated species) – typically a CD3 in place of a CH3. Whilst the synthesis of such a standard may be a valid activity for long-term analysis of large sample numbers, e.g., polychlorinated biphenyls in food products, it is an uneconomical luxury for many applications. A compromise can often be reached by using a very similar structural homologue. An isomer of the parent molecule is common, as it typically has similar chemistry and retention to the analyte. An internal standard should be used when optimum accuracy/precision is required in MS quantitation. The internal standard will correct for errors in sample preparation and subsequent analysis; however, an internal standard needs to be carefully selected and monitored in use [6, 7].
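The ratio-based correction an internal standard provides can be sketched as single-point quantitation; the peak areas and concentrations below are invented for illustration:

```python
def analyte_concentration(area_analyte, area_is, conc_is, response_factor=1.0):
    """Single-point internal-standard quantitation: the analyte/IS peak-area
    ratio is scaled by the known spiked IS concentration. response_factor
    corrects for any difference in ionization efficiency between analyte and
    standard (1.0 for an ideal stable-isotope-labelled IS)."""
    return (area_analyte / area_is) * conc_is / response_factor

# Invented areas, with a labelled standard spiked at 50 ng/mL; losses that
# affect analyte and IS equally cancel out of the ratio.
print(analyte_concentration(8.0e5, 4.0e5, 50.0))  # -> 100.0 ng/mL
```

Because both peak areas appear only as a ratio, any loss in extraction or suppression in the source that scales analyte and standard equally leaves the reported concentration unchanged, which is precisely why a well-matched internal standard is so effective.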
Developing quantitative LC/MS and GC/MS methods is a complex task and involves many technical considerations during the development of the chromatographic separation and careful set up of the MS instrumentation. However, despite its complexity, quantitative MS has enabled the analyst to determine previously impossible-to-quantify compounds with the requisite specificity, sensitivity and robustness.
References
1. S. Bogialli and A. Di Corcia, Anal. Bioanal. Chem., 2009, 395, 947.
2. E. Garcia-Beneytez et al., J. Agric. Food Chem., 2003, 51, 5622.
3. C.K. Lim and G. Lord, Biol. Pharm. Bull., 2002, 25, 547.
4. P. Loconto, Trace Environmental Quantitative Analysis, CRC Press/Taylor & Francis Group, Second Edition, 2006.
5. R. Kostiainen and T. Kauppila, J. Chromatogr. A, 2009, 1216, 685.
6. J. Wieling, Chromatographia, 2002, 55, S-107.
7. A. Tan et al., J. Chromatogr. B, 2009, 877, 3201.