Liquid chromatography–mass spectrometry (LC–MS) is the standard instrumental procedure for quantitating nitrosamine drug substance-related impurities (NDSRIs) due to its superior specificity and sensitivity. Electrospray (ESI) is the most used ionization source in LC–MS. However, analytes can undergo fragmentation directly within the ESI source before reaching the collision cell. This phenomenon is known as in-source fragmentation (ISF). To our knowledge, the impact of ISF on analytical procedure performance for NDSRI measurements has not been explored. Thus, here, we present a case study on an NDSRI (nitroso-bumetanide) to illustrate how efforts can be taken during analytical procedure development to minimize ISF while still achieving the analytical target profile (ATP) measurement goals. In addition, we share some thoughts about incorporating risk assessment and leveraging prior knowledge for analytical procedure development for NDSRI LC–MS technology-based testing purposes.
If you follow the work of FDA’s Office of Pharmaceutical Quality Research (OPQR) closely, you will recognize Dr. Zhang from most agency-published analytical work related to nitrosamines.
This time, the team shines a light on a potential problem known as in-source fragmentation (ISF), which occurs when the analyte molecule fragments before reaching the collision cell.
The authors suggest that orthogonal methods using different detection systems, such as ultraviolet (UV) absorption, can be used, where available, to confirm the mass spectrometry data. This will be of particular value when developing an analytical procedure for a newly identified NDSRI, since no prior knowledge or reference dataset is available.
In the case of nitroso-bumetanide, increasing the fragmentor voltage was found to cleave the N–NO bond, which is something to watch for.
There was minimal ISF when the fragmentor voltage was below 100 V; however, 15% and 30% of the analyte lost the NO group due to ISF at fragmentor voltages of 120 V and 135 V, respectively. Orthogonal methods rest on different scientific principles (MS vs. UV), which can reveal potential inherent limitations of one or both methods if a difference in resultant measurement values exists.
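As a minimal sketch (not from the article), the extent of ISF at a given fragmentor setting can be estimated from the extracted-ion intensities of the intact precursor and the in-source NO-loss product; the function name and intensity values below are hypothetical:

```python
def isf_fraction(intact_intensity: float, no_loss_intensity: float) -> float:
    """Fraction of analyte signal lost to in-source NO cleavage.

    Assumes the only relevant channels are the intact precursor and
    the NO-loss product (hypothetical simplification).
    """
    total = intact_intensity + no_loss_intensity
    if total == 0:
        raise ValueError("no signal recorded")
    return no_loss_intensity / total

# Hypothetical intensities at three fragmentor voltages
for volts, intact, no_loss in [(100, 9.9e5, 1.0e4),
                               (120, 8.5e5, 1.5e5),
                               (135, 7.0e5, 3.0e5)]:
    print(f"{volts} V: {isf_fraction(intact, no_loss):.0%} ISF")
```

With the hypothetical intensities above, the 120 V and 135 V settings reproduce the 15% and 30% NO-loss figures reported for nitroso-bumetanide.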
During method development I observed a small amount of NO‑cleavage for my analyte (approximately 10%). This made me wonder whether it might actually be advantageous to include a transition that specifically monitors this fragment.
For example:
Classical approach:
300 → 180 m/z (Quantifier)
300 → 200 m/z (Qualifier)
Alternative approach including NO‑cleavage monitoring:
300 → 180 m/z (Quantifier)
270 → 180 m/z (Qualifier / monitoring the NO‑loss channel)
By tracking the NO‑cleavage pathway (e.g., using the 270 m/z precursor), you could generate a ratio that is monitored throughout the sequence. If the fragmentation behavior is sensitive to small variations in source conditions, this ratio might help detect those fluctuations and provide an additional layer of control.
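A minimal sketch of what that ratio check could look like, assuming per-injection peak areas for the quantifier (300 → 180) and the NO-loss channel (270 → 180); the tolerance, function names, and peak areas are all hypothetical:

```python
from statistics import mean

def transition_ratio(quantifier_area: float, no_loss_area: float) -> float:
    """Ratio of the NO-loss channel to the quantifier channel."""
    return no_loss_area / quantifier_area

def flag_drift(ratios: list[float], tolerance: float = 0.20) -> list[int]:
    """Return injection indices whose ratio deviates from the
    sequence mean by more than the relative tolerance."""
    baseline = mean(ratios)
    return [i for i, r in enumerate(ratios)
            if abs(r - baseline) / baseline > tolerance]

# Hypothetical peak areas per injection: (300→180 quantifier, 270→180 NO-loss)
injections = [(1.00e6, 1.00e5), (0.98e6, 0.98e5),
              (1.00e6, 1.80e5), (1.02e6, 1.02e5)]
ratios = [transition_ratio(q, nl) for q, nl in injections]
print(flag_drift(ratios))  # prints [2]: the third injection's ratio jumps
```

A flagged injection would not by itself invalidate a result, but it would prompt a look at source conditions (temperature, gas flows, fragmentor voltage) around that point in the sequence.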
Has anyone used this approach, or seen potential drawbacks? In theory, it seems like a useful way to evaluate the stability of signals over time.