@Nitesh You highlight some real challenges faced by API manufacturers. In your experience, what would you say is the #1 problem API manufacturers face when conducting their risk assessments? Do you think there is a harmonized approach to resolve it?
I believe there is a lot of dilemma within API manufacturers, as the risk assessment is usually done only on the seven published nitrosamine impurities. The industry may struggle to anticipate other nitrosamine derivative impurities based on their routes of synthesis. There is also a challenge in obtaining reference standards for impurities over and above the known seven.
My understanding is that the published guidelines include a flow chart showing the path forward for performing the process risk assessment. However, the major onus lies with the synthetic chemist, who must rigorously identify the relevant nitrosamine derivative impurities (over and above the published ones) that can form at every stage of the reaction, adopt a corresponding control strategy, demonstrate the purge factor, and assess the API against the agreed AI limits. I think this is the only harmonized and systematic way to resolve it.
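To make the purge-factor idea concrete, here is a toy calculation in the spirit of the Teasdale-style scoring approach. All scores, levels, and the acceptable limit below are purely hypothetical illustrations, not values from any guideline or tool:

```python
# Toy purge-factor calculation (Teasdale-style scoring).
# All scores and levels here are hypothetical, not regulatory values.

def overall_purge(stage_scores):
    """Overall purge factor = product of per-stage purge scores."""
    purge = 1.0
    for scores in stage_scores:
        for s in scores.values():
            purge *= s
    return purge

# Hypothetical process: impurity formed at 1000 ppm relative to API,
# with an acceptable level of 0.03 ppm derived from dose and AI limit.
stages = [
    {"reactivity": 10, "solubility": 3},      # quench + aqueous wash
    {"solubility": 10, "chromatography": 1},  # recrystallisation
]
predicted = overall_purge(stages)  # 10 * 3 * 10 * 1 = 300
required = 1000 / 0.03             # purge needed to reach the limit
print(f"predicted purge {predicted:.0f}, required {required:.0f}")
# A ratio >= 1 (usually with a safety margin) would support a purge
# argument; here it is well below 1, so analytical control is needed.
print(f"purge ratio {predicted / required:.4f}")
```

The key comparison is predicted purge versus the purge required to get from the worst-case formation level down to the acceptable limit; tools like Mirabilis automate and document this kind of reasoning.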
@Nitesh you bring up a couple of really good points for the current discussion: What happens with any nitrosamine outside the 'core' known list? And can purging factors alone justify the inclusion or exclusion of those nitrosamines? Maybe @David @fernandaw can add some perspective on this.
Any potential nitrosamine has to be evaluated - not just the 7/8 (regulator-dependent) published ones… it's slightly concerning to hear that risk assessment is in some cases only done on the set with published limits, especially as carcinogenicity data is known for around 120 nitrosamines (carcdb.lhasalimited.org), of which about half are potent enough that their acceptable intakes fall below the general TTC of 1.5 µg/day. Given the current regulatory concern, it is better to consider all nitrosation reactions and then show that the nitrosamine formed is not a concern, due either to purge or to negative toxicity data, rather than have an assessment pushed back because a regulator has noticed a hypothetical nitrosamine that wasn't considered…
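For context on how potency maps to a limit: the acceptable intake (AI) is derived by linear extrapolation from the TD50, dividing by 50,000 to reach a 1-in-100,000 lifetime risk and scaling to a 50 kg body weight. A quick sketch (the NDEA TD50 used is the commonly cited value, shown here for illustration only):

```python
# Linear extrapolation from a TD50 to an acceptable intake (AI).
# Convention: TD50 / 50,000 approximates the dose associated with a
# 1-in-100,000 lifetime cancer risk; scale by a 50 kg body weight.

def acceptable_intake_ng_per_day(td50_mg_per_kg_day, body_weight_kg=50.0):
    daily_dose_mg = td50_mg_per_kg_day / 50_000 * body_weight_kg
    return daily_dose_mg * 1e6  # mg -> ng

# NDEA's commonly cited TD50 of 0.0265 mg/kg/day yields the familiar
# ~26.5 ng/day limit; a 100x less potent nitrosamine supports 100x more.
print(acceptable_intake_ng_per_day(0.0265))  # ~26.5 ng/day
print(acceptable_intake_ng_per_day(2.65))    # ~2650 ng/day
```

This is why the spread of >4 orders of magnitude in nitrosamine TD50s matters so much: the resulting AI limits span an equally wide range.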
Purge factor analysis and tools to automate it (e.g. Mirabilis) can help significantly with this, ideally both showing that nitrites/amines (and other formation routes) do not come into contact at appreciable levels, and also showing that, if that isn't the case and a nitrosamine is formed, the nitrosamine is appropriately purged. Theoretically, if M7 Option 4 can be used as the control method, no reference standard should be needed - though anecdotally this has not always been accepted, given the level of concern, and confirmatory testing has been requested.
Work on the safety/risk assessment side is ongoing, and we believe we have solid mechanistic arguments and carcinogenicity data to show that many of the AI limits can be quite a lot higher than 26.5 ng/day without a significant increase in risk, and thus purging may not need to be as extreme.
@David thank you for sharing such insightful information. Indeed @AndyTeasdale authored a wonderful paper introducing and laying out this approach. I believe we are all curious about the evolution of ‘limits’.
Some of the organizations we are trying to help and guide here will likely not have full expertise on this. What would be your suggestion or advice to those stakeholders (e.g. small ingredient manufacturers)? I believe leveraging knowledge of all these tools will help advance the conversation and adoption.
Thanks David for sharing very useful reading material on nitrosamines and Mirabilis; this can certainly help industry a lot.
For information, we are regular users of Derek Nexus to estimate GTIs. I have one query: when we upgrade the software version, we sometimes find that impurities show a positive alert where they were previously negative. Is there a specific reason why this happens? It completely changes the strategy, and regulators then expect us to demonstrate the content analytically.
Apologies for the slow reply, I didn’t see that you’d replied to my message.
In terms of changing predictions in Derek Nexus, this occurs when additional toxicity data - whether positive or negative, proprietary or public - becomes available to us; we then update the relevant alerts based on this data and, if the data are public, also update the training set for the negative predictions model. As a result, the accuracy of Derek has steadily increased over time, so the more recent prediction should be considered to supersede the older one.
We’ve also recently looked at the overall impact of updating models - which is of course required under M7 - and published on it here alongside industry and other software vendors: https://doi.org/10.1016/j.yrtph.2020.104807
Absolutely no problem - in fact, you have clarified my doubts about the change in predictions, which is due to updates with additional toxicity data that increase the accuracy of Derek's predictions.
Actually, these are frequently asked questions in industry, since regulators are keen for predictions to be reconfirmed with the latest version before filing, so as to avoid deficiencies.
@David @Nitesh thanks for the great insight and for providing clarity on the update mechanism. I recognize (Q)SAR is a new concept for me. Can you help us understand how and where this type of tool fits into the overall risk assessment strategy?
(Q)SAR tools fit, or should fit, into a nitrosamine risk strategy in essentially the same way that they fit into a more general M7 (or indeed other use case) workflow. Specifically for M7: The use of two (Q)SAR tools, with different underlying technology - typically expert (like Derek Nexus) and statistical (like Sarah Nexus) - and appropriate expert review of the results is permitted and indeed encouraged in lieu of an Ames test, broadly resulting in an M7 class 5 impurity for negatives and class 3 for positives.
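As a minimal sketch of that two-tool workflow (assuming simple boolean outputs for each system; real tools also report confidence and supporting evidence, and expert review can override either call):

```python
# Simplified ICH M7 two-tool (Q)SAR outcome - a sketch only.
# Real workflows include expert review, which can override either call.

def m7_class(expert_positive: bool, statistical_positive: bool) -> int:
    """Two concordant negatives -> class 5 (treat as non-mutagenic);
    any positive alert -> class 3 (mutagenic alert, no carcinogenicity
    data), pending expert review and possible follow-up (e.g. Ames)
    testing."""
    if not expert_positive and not statistical_positive:
        return 5
    return 3

print(m7_class(False, False))  # two negatives -> 5
print(m7_class(True, False))   # any positive  -> 3
```

The value of using two methodologically different systems is that their errors are largely independent, which is why a concordant negative carries enough weight to stand in for an Ames test.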
The data for nitrosamines being biased towards positives, a broad “NN=O = hazard” (Q)SAR would give reasonable performance by some definitions (c. 80% accuracy, 100% sensitivity but 0% specificity). However, there are clear areas of chemical space - for example a tert-butyl or di-iso-propyl nitrosamine* or derivative - with negative results, and thus more refined QSAR is able to identify those. Negative QSAR results like these can also be used to reinforce a negative Ames result (i.e. “this negative was expected rather than due to issues with study design”). Efforts are underway in the working group I coordinate to both confirm these areas of chemical space and find any more, increasing specificity without compromising sensitivity.
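Those performance figures follow directly from the class imbalance. Assuming, for illustration, a hypothetical dataset that is 80% Ames-positive, a blanket "NN=O = hazard" rule scores like this:

```python
# Why "everything with N-N=O is a hazard" looks good on imbalanced data:
# hypothetical set of 100 nitrosamines, 80 Ames-positive, 20 negative.

def metrics(tp, fp, tn, fn):
    accuracy = (tp + tn) / (tp + fp + tn + fn)
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp) if (tn + fp) else 0.0
    return accuracy, sensitivity, specificity

# The blanket rule predicts positive for everything:
# 80 true positives, 20 false positives, no negatives called at all.
acc, sens, spec = metrics(tp=80, fp=20, tn=0, fn=0)
print(acc, sens, spec)  # -> 0.8 1.0 0.0
```

High accuracy and perfect sensitivity, but zero specificity: every genuinely negative compound is mislabelled, which is exactly the gap that more refined QSAR aims to close.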
An additional, and important, use for QSAR in a nitrosamine-specific context is analysis of carcinogenic potency data. The EMA and FDA permit a close analogue with reliable carcinogenicity data to be used to set an AI limit; however, given the wide range of potencies (>4 orders of magnitude) and the dramatic differences in potency between even apparently close analogues, a QSAR system can help identify the most appropriate read-across analogue. While some information to this end is encoded in Derek Nexus already, a second major focus of the working group is to develop rules for determining these - which we will ultimately encode.
In addition to ultimately publishing the work and using it in our QSAR tools, some of our initial work in this direction will be presented at QSAR2021 (registration is closed, but attendees should look for the Cross and Ponting talk), at a session of ToxForum that I am chairing (https://toxforum.site-ym.com/page/2021SummerMeeting), and at the ACT annual meeting (ACT AM2021).
*DIPNA itself is carcinogenicity-positive but stubbornly Ames-negative, and every superstructure thereof for which we have data is also Ames-negative. Methyl- and ethyl-tert-butyl nitrosamines are carcinogenicity-negative (Ames results unknown), and other tert-butyl nitrosamines are known to be Ames-negative in the sensitive strain TA1535.