Analytical Chemistry

The Infinitely Active Impurity

Yesterday’s post touched on something that all experienced drug discovery people have been through: the compound that works – until a new batch is made. Then it doesn’t work so well. What to do?
You have a fork in the road here: one route is labeled “Blame the Assay” and the other one is “Blame the Compound”. Neither can be ruled out at first, but the second alternative is easier to check out, thanks to modern analytical chemistry. A clean (or at least identical) LC/MS, a good NMR, even (gasp!) elemental analysis – all these can reassure you that the compound itself hasn’t changed.
But sometimes it has. In my experience, the biggest mistake is to not fully characterize the original batch, particularly if it’s a purchased compound, or if it comes from the dusty recesses of the archive. You really, really want to do an analytical check on these things. Labels can be mistaken, purity can be overestimated, compounds can decompose. I’ve seen all of these derail things. I believe I’ve mentioned a putative phosphatase inhibitor I worked on once, presented to me as a fine lead right out of the screening files. We resynthesized a batch of it, which promptly made the assay collapse. Despite having been told that the original compound had checked out just fine, I sent some out for elemental analysis, and marked some of the lesser-used boxes on the form while I was at it. This showed that the archive compound was, in fact, about a 1:1 zinc complex, for reasons that were lost in the mists of time, and that this (as you can imagine) did have a bit of an effect on the primary enzyme assay.
And I’ve seen plenty of things that have fallen apart on storage, and several commercial compounds that were clean as could be, but whose identity had no relation to what was on their labels (or their invoices for payment, dang it all). Always check, and always do that first. But what if you have, and the second lot doesn’t work, and it appears to match the first in every way?
Personally, I say run the assay again, with whatever controls you can think of. I think at that point the chances of something odd happening there are greater than the chemical alternative, which is the dreaded Infinitely Active Impurity. Several times over the years, people have tried to convince me that even though some compound may look 99% clean, that all the activity is actually down there in the trace contaminants, and that if we just find it, we’ll have something that’ll be so potent that it’ll make our heads spin. A successful conclusion to one of these snipe hunts is theoretically possible. But I have never witnessed one.
I’m willing to credit the flip side argument, the Infinitely Nasty Impurity, a bit more. It’s easier to imagine something that would vigorously mess up an assay, although even then you generally need more than a trace. An equimolar amount of zinc will do. But an incredibly active compound, one that does just what you want, but in quantities so small that you’ve missed seeing it? Unlikely. Look for it, sure, but don’t expect to find anything – and have ’em re-run that assay while you’re looking.
Update: I meant to mention this, but a comment brings it up as well. One thing that may not show up so easily is a difference in the physical form of the compound, depending on how it’s produced. This will mainly show up if you’re (for example) dosing a suspension of powdered drug substance in an animal. A solution assay should cancel these things out (in vitro or in vivo), but you need to make sure that everything’s really in solution…

33 comments on “The Infinitely Active Impurity”

  1. It could be the vehicle, a la valproate or a mischaracterized compound, like chlordiazepoxide.

  2. 2mrklr says:

    I have to admit that I have always hated the “infinitely active impurity” concern/argument, but I actually did witness a successful (and an unsuccessful) outcome to that scenario, which occurs all too frequently. In the unsuccessful case, a team was “optimizing” a compound in an autoimmunity program. With time the SAR was falling apart, which was the first sign. It turned out that all of the SAR was based on different amounts of an FK-506 impurity: a large batch had been made, and the chemist who made it hadn’t done a very effective job of cleaning his glassware. Calculations showed that FK-506 was so potent in the assay that truly minute amounts looked extremely active. At least one person lost some career momentum on that one, not because of the impurity, but because he insisted that the results were accurate and there was no chance of false positives due to contamination.
    In another program, a series of compounds were active and SAR apparent. Notably, it was a series of compounds prepared by combichem. On resynthesis it turned out they were inactive, and I was ready to abandon the project. However, one intrepid chemist investigated, traced it back to a common intermediate, and lo and behold, a new, more potent, viable lead structure was identified, which eventually resulted in a very promising clinical candidate.
    In yet another program, we were doing an SAR study, and one particular compound didn’t really fit the SAR the way we had predicted. Initially we took it in stride, and had all sorts of hand-waving arguments of why that compound might not fit the SAR. With time it became more obvious that something was rotten in Denmark – and a close look at the NMR spectrum revealed that in fact it wasn’t the compound depicted, but a regioisomer. To me it showed the power of a solid structure-activity relationship, and I also learned the lesson that you can’t always trust that your colleagues are doing their due diligence in characterizing their compounds. It always reminds me of Reagan’s quote – “trust but verify.” It’s something worth remembering in drug discovery.

  3. Ron Williamson says:

    Derek, I would offer up another fork in the road that even the most seasoned researchers sometimes don’t consider: the testing solution and the form of the compound.
    I refer you, as an example, to Anal. Chem. 76, 7278 (2004).
    The variability in what the biologists are testing can many times have nothing to do with the purity of the compound or the consistency of the assay:
    – manual vs. liquid-handling-robot dilution of solutions in DMSO and/or DMSO/buffer mixtures does not always provide the same level of mixing or the same equilibration times.
    – solutions that look completely dissolved can have clear precipitate or microprecipitate floating, sticking, or settled.
    – compounds that were speed-vac’d down from TFA/MeOH can show dramatically different solubility from ones lyophilized from a similar aqueous mixture.
    – a first batch that is amorphous from a small-scale reaction will typically dissolve more quickly than a larger-scale recrystallized batch, even though the dissolution procedure in the biology labs is the same.
    – serial dilution procedures (DMSO at each step vs. dilution in buffer down from a DMSO stock) can show variability with not-so-soluble compounds.
    Bottom line is that the chemists and biologists should keep these issues in mind as well when investigating such discrepancies.
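    Ron’s serial-dilution point can be made concrete with a quick back-of-the-envelope script (all numbers below are illustrative assumptions, not from the comment). Scheme A dilutes in neat DMSO and then makes one fixed dilution into buffer, so every well sees the same percent DMSO; Scheme B makes one DMSO-to-buffer dilution and then serially dilutes in buffer, so the DMSO content falls with each step:

    ```python
    # Sketch of two common serial-dilution schemes (illustrative numbers).
    #   Scheme A: 3-fold serial dilution in neat DMSO, each point then
    #             diluted 1:100 into assay buffer -> constant % DMSO.
    #   Scheme B: one 1:100 dilution of the DMSO stock into buffer, then
    #             3-fold serial dilution in buffer -> % DMSO falls per step.

    def scheme_a(stock_mM=10.0, steps=8, fold=3, buffer_dilution=100):
        points = []
        conc = stock_mM
        for _ in range(steps):
            # final-well compound conc in uM; DMSO fixed by the 1:100 step
            points.append((conc * 1000 / buffer_dilution, 100.0 / buffer_dilution))
            conc /= fold
        return points  # list of (conc_uM, pct_DMSO)

    def scheme_b(stock_mM=10.0, steps=8, fold=3, buffer_dilution=100):
        points = []
        conc = stock_mM * 1000 / buffer_dilution  # uM after the single DMSO->buffer step
        dmso = 100.0 / buffer_dilution
        for _ in range(steps):
            points.append((conc, dmso))
            conc /= fold
            dmso /= fold  # the DMSO gets diluted right along with the compound
        return points

    for (ca, da), (cb, db) in zip(scheme_a(), scheme_b()):
        print(f"A: {ca:8.3f} uM at {da:.3f}% DMSO | B: {cb:8.3f} uM at {db:.4f}% DMSO")
    ```

    The nominal compound concentrations come out identical, but the co-solvent environment does not, which is exactly where a not-so-soluble compound can quietly crash out of one series and not the other.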

  4. Dennis Fritz says:

    Years ago a program was started using a screen hit prepared via a Diels-Alder high temperature cyclization. The project progressed routinely but the initial SAR was very flat. All compounds were purified by flash chromatography.
    All of a sudden, the next several sets of compounds were completely inactive even though the structure variations were minimal. The only change was they were now purifying their compounds by HPLC.
    It wasn’t until quite a bit of detective work, and the usual it’s-the-chemist-no-it’s-the-biologist concerns, that a centrifugation of the silica-gel purified solution through a 3000-molecular weight membrane showed that the active component was a high MW polymeric material from the Diels Alder chemistry that was just consistently buggering up the assay.
    Lesson learned.

  5. Moody Blue says:

    We had a couple of incidents where a kinase-inhibitor hit, upon resynthesis, was flat-out dead. One was traced back to AlCl3: the chemist had done a Boc group removal (HCl/dioxane) in a vial with an Al-foil-lined cap! Another time the original hit (again for a kinase inhibitor) had been contaminated with Yb(OTf)3, which was used as a catalyst for the multicomponent-reaction library from which the hit had come!

  6. quintus says:

    It works the other way too, at least in development! I have had the experience of having only one polymorph, right up to medium-size Phase 2 batches. Suddenly, during the preparation of a large batch for formulation validation, the compound was much cleaner than previously, and another polymorph appeared; it turned out to be the most stable form. The impurity(ies) had stopped it from forming. Very annoying, but live and learn.

  7. milkshake says:

    The best contamination story I heard was about a chemist who made a startlingly potent compound, and since they were a small company, the management was excitedly telling their investors how great the discovery was.
    The guy who made it first took a crude NMR, then did prep HPLC, combined the purified fractions, checked them by HPLC and LCMS, and, satisfied that the compound was indeed pure, put it on the lyo. He used a glass vial with a Kimwipe over it for the lyophilization and put it into a shared lyo jar without cleaning the jar first.
    Some nucleotide dudes had been using the jar before him for lyophilizing stuff in Falcon tubes, and they had actually cut a chunk of the styrofoam rack that comes with Falcon tubes and used it to secure the Falcons in the lyo jars. So there were a few loose bits of styrofoam left in the jar after they put it back on the shelf, and our guy used the not-so-clean jar to dry his stuff. Unfortunately for him, the next day another colleague vented the lyo carelessly, and the onrush of air broke the Kimwipe cover of the already-dried sample, spilling the fluffy material out into the jar. To cover up the mishap, the colleague scooped the spilled stuff back into the vial, replaced the Kimwipe, and kept mum about it. You would not believe how a few bits of dissolved styrofoam can improve the potency in an enzymatic assay…

  8. CMCguy says:

    It’s probably less likely to occur these days, I think, due to modern tools and better upfront characterization as suggested, but I have heard similar stories throughout my career and was involved in several programs where the activity came from either a minor by-product or a misidentified structure. The by-product was more puzzling at first but did lead to an interesting series of analogs (before failing the tox screen, unfortunately). Of the misidentified compounds, one was a wrong (mislabeled) regioisomer from the supplier (so heed Derek’s advice to pre-check purchased material; in this case it was only obvious after a second bottle gave a slightly different spectrum), and another was an unanticipated chemical rearrangement during the synthesis (again only obvious after the second round gave less rearrangement). Although frustrating at the time, these events were some of the most interesting times in the lab.

  9. MedInformaticsMD says:

    In my experience, the biggest mistake is to not fully characterize the original batch, particularly if it’s a purchased compound, or if it comes from the dusty recesses of the archive.
    You point out a critical part of drug discovery: extreme levels of rigor. (Good medicine’s similar.)
    Do non-scientist executives understand this? Or do they prefer the “cheap and quick” to boost stock prices?
    I suggest the pressure is towards the latter via non-ending rounds of layoffs that sap corporate wisdom, cost-cutting, social engineering/”diversity” (instead of colorblind meritocracy, see BU professor Peter Wood’s book), age discrimination, and other talent mismanagement ills.
    Unfortunately, there seems “nobody home to talk to” in pharma regarding these problems.

  10. KB says:

    In my own short(ish) experience I’ve seen at least two instances of a super-active byproduct giving false positives. The first came from a coupling that sometimes gave a byproduct at less than 1% by LCMS and NMR. I think most of us were pretty amazed to find that it was picomolar-active. The thing is, even with only 1% in the batch you’re still looking at enough to make your compound look low nanomolar when it’s actually micromolar at best. It took us ages to pin that by-product down. The other example came from a compound being contaminated in the Genevac. It’s always a problem with shared equipment that many people don’t think about. My own rule now is to always have LCMS and NMR (so many vendors only do LCMS), and if you get a good hit, make a batch 2 on a bigger scale so you can really look at what’s in there. If I’m really serious I’ll try to make it a different way (normally at the last step) to remove further doubt. It’s also helpful to have two different assays (biochemical and cellular), since it’s unlikely that your byproduct will be active in both.
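    KB’s arithmetic is worth running through. A minimal sketch, with purely illustrative potency numbers (a 10 µM parent carrying 1% of a 100 pM impurity, each modeled as a simple one-site inhibitor acting independently), shows how the mixture’s apparent IC50 lands in the low nanomolar range:

    ```python
    # Illustrative model: residual activity is the product of two 1-site
    # inhibition terms (parent and impurity), all concentrations in nM.

    def activity(total_nM, frac_impurity=0.01,
                 ic50_parent_nM=10_000.0, ic50_impurity_nM=0.1):
        c_imp = total_nM * frac_impurity          # impurity share of the dose
        c_par = total_nM * (1 - frac_impurity)    # parent share of the dose
        return (1 / (1 + c_par / ic50_parent_nM)) * \
               (1 / (1 + c_imp / ic50_impurity_nM))

    def apparent_ic50(lo=1e-6, hi=1e6, **kw):
        # bisect on log-concentration for the 50% residual-activity point
        for _ in range(100):
            mid = (lo * hi) ** 0.5
            if activity(mid, **kw) > 0.5:
                lo = mid
            else:
                hi = mid
        return (lo * hi) ** 0.5

    # ~10 nM apparent potency from a 10 uM parent: the impurity does all the work
    print(f"apparent IC50 ~ {apparent_ic50():.1f} nM")
    ```

    Under these assumptions, the batch assays a thousand-fold more potent than the parent compound actually is, which is exactly the trap KB describes.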

  11. RM says:

    Derek’s zinc story reminds me of something a colleague had happen during grad school. They were working with various bleomycin derivatives, and chemical reactions that went perfectly for some were complete duds with others. It turned out that the compounds, as obtained from the supplier, varied in their metal coordination. Some had already been pre-loaded with copper ions (needed for biological activity), whereas others were metal-free. Careful analysis revealed that the successes/duds lined up perfectly with the with/without-copper results.

  12. DrSnowboard says:

    There’s a nice BMS story of a compound that degraded into another, more complicated structure in the assay but which was actually a better, more stable lead (and the biofractionation was good work). Similar to KB, I’ve had experience of a 1% impurity giving an apparent sub 200nM lead. As mentioned previously, contrary SAR (acidic and basic substituents both equally good), activity only associated with one hindered regioisomer, micromolar activity at best when prepared by a different sequence…all good hints that you are about to screw the pooch unless you pay attention.

  13. barry says:

    I’m all for properly characterizing materials before drawing conclusions about them, but we mustn’t get dogmatic about elemental analysis. All analytical techniques have their blindspots. Elemental analysis is uniquely blind to just the sort of decompositions that happen in a closed vial. Unless there is a volatile fragment lost in the decomposition, the elemental analysis will be unchanged by polymerization, isomerization, changes in crystal-form…

  14. Chris C says:

    This is as true in molecular biology as it is in chemistry. Most biologists will use plasmids given to them in an eppendorf tube labeled with Sharpie marker without checking. They are lucky if they have a hand drawn map. I have had to learn through experience that any plasmid obtained this way requires at least an analytical restriction digest if not a run of Sanger sequencing.

  15. RandDChemist says:

    barry (13) is spot on: all methods have their blind spots. To me, getting a good elemental analysis can be like winning the lottery. Then there are the tricks of adding solvent, fractions of acid equivalents (in the case of an amine of some basicity), etc., to make it fit.
    I’m aware of an entire project that ended up being based on an impurity rather than what was thought to be the active entity. The project had been going for a little while (a few months, as I recall), but it went away after that. Nothing to capitalize upon, I guess.

  16. anon the II says:

    I’ve been in Med Chem for over 25 years and was part of a lab inventing what we called combi-chem though in retrospect it looked a lot more like what they call parallel synthesis these days. Anyhow, I’ve seen all of the above mentioned scenarios played out at some time or another. My sense is that the cost of a compound (i.e. how pure?, how sure?) should parallel the cost of the biology being done on it. You don’t need EA, NMR, LC/MS, IR on each of 250,000 compounds going into a high throughput screen. And you better be mighty sure what you’re putting into a dog tox study or into the clinic. If you get a high throughput hit, you need to confirm (more $) before you move forward very far (more $). There’s nothing wrong with occasionally screening garbage if you don’t act stupid after it hits.

  17. Fred says:

    Doctor D said –
    “Blame the Assay” and the other one is “Blame the Compound”
    I’m sorry, I find it’s always “Blame the chemist”

  18. eternal optimist says:

    Here is an interesting story with a twist on the infinitely active impurity
    a real warning story!

  19. bitter pill says:

    My favorite was a project where all the compounds had some color to them. One of the chemists finally made the correlation of color to activity. He took a pH-strip color scale and wrote IC50 ranges next to the colors. It was startlingly accurate.

  20. ChemPharm says:

    Even if the chemist purified the compound, bioactive contaminants can leach from laboratory plasticware, as shown in a recent Science article (Science, 7 November 2008, Vol. 322, No. 5903, p. 917).

  21. cliffintokyo says:

    All of these horror stories should make the med chemists on this thread cringe.
    My experience is that, at most companies, discovery research needs to spruce up its sloppy attitude to characterisation. (I dread to think what happens in academia, where we used to wait weeks for MS because the people were too busy doing *research* to be *bothered* with mundane structure confirmations).
    The approach that works in most cases has already been hinted at; synthesise batch 2, for preference by a different chemist, on a gram scale, and fully analyse (tlc and hplc) and characterise – Yes! Use C-nmr to check that all the carbons are there (!), and that their 1-bond and 3-bond couplings are logical.
    This is also an excellent nmr learning exercise for new BSc chemists.
    At one company I worked for, many years ago, analytical support had a tlc screen using a battery of solvent systems which was run on all compounds before submission to the ‘bank’.
    A bit grim for the operator, but great for spotting those vfts.

  22. Jose says:

    Elemental! You must get elemental! IFF 1H, 13C, HRMS and HPLC purity all check out, then get a CHN before sending it on. It’s worth the extra money. I’ve seen too many people chase too many phantoms because they omitted this step.

  23. A Nonny Mouse says:

    It’s not only biology; we were doing a chiral reduction using a metal catalyst. The substrate that we made in the lab gave >99% ee. We received the bulk starting material from India which was made by a different process, though completely pure by all analytical methods. This only gave 93% ee and, even after a double recrystallisation, 97%. We did change the catalyst to improve things back, but we still have no explanation for this.

  24. milkshake says:

    Did the original batch (that gave good ees) also hydrogenate faster than the batch made in India?
    One thing that can erode ees in asymmetric hydrogenation with Ru or Rh chiral catalysts is the reversibility of the activated catalyst’s metal–H addition to C=C; the reverse elimination can lead to C=C E/Z isomerization and C=C migration, so you get undesired isomers of your substrate in the process. They all hydrogenate to the same product, but with different ees (and sometimes even the opposite sense of induction). Suppose you have a hindered substrate like a trisubstituted olefin and the rate-determining step is late in the cycle. Then you could have had a trace impurity in the original batch of starting material that was helping with the catalyst turnover, and by the same token with the isomerization problem, hence the better ees.

  25. GladToMoveToProcess says:

    A few more, from ~25 years ago:
    The stable crystal form that showed up in the first pilot plant run, after the tabletting guys had worked things out with the kilo-lab batches. Of course, the stable form was cotton-like needles and wouldn’t dry-mill. Rewrite a lot of the IND! At least it didn’t happen when the compound was in the clinic.
    Our first preps were usually a few grams, which was enough for quite a few in vivo tests if there was reasonable activity in enzyme and isolated tissue assays. First lot: great activity, so we made the enantiomers – NO activity at all. By then, nearly all the first lot had been used up. Heroic HPLC gave a trace (

  26. GladToMoveToProcess says:

    Sorry, lost a bit. Heroic HPLC gave a trace (

  27. GladToMoveToProcess says:

    Something in the ether doesn’t like a “less than” symbol; one last try. HPLC gave a trace of the highly potent impurity, not enough to get the structure. Many attempts to reproduce the first synthesis exactly failed to give the impurity. Presumably, an impurity in the commercial starting material for that first batch was to blame; of course, all of it had been used up, and we hadn’t recorded the Aldrich lot number.

  28. Hap says:

    The less than sign is the lead character for HTML coding – there are modified characters to use to display less than/greater than signs, but I don’t know what they are.

  29. partial agonist says:

    I once had the misfortune to work on a phosphatase inhibitor project. We followed all sorts of false leads, finding in the end that all sorts of trace metal impurities led to false positives. Many phosphatases just love metals, Zn, Pd, Pt, etc.
    I felt a little less miserable (or at least in good company) when I later saw a talk by a guy at Wyeth on the same phosphatase. He reported the same sad story about wasting time pursuing literally thousands of false positives.

  30. mouse says:

    I once outsourced a chiral final compound on 25g scale to a reputable vendor. The NMR, mass spec and HPLC retention time all checked out (and nice LCAP too). We sent it to our in vivo assay, it came back inactive…and then we checked the rotation after asking about the usual sorts of in vivo variables! They used the wrong amino acid isomer (natural instead of unnatural) at step one. They were nice enough when confronted with data to agree to re-do the synthesis for free.

  31. timo says:

    I’m dealing with an infinitely inactive impurity right now… No less frustrating.

  32. Sili says:

    The less than sign is the lead character for HTML coding – there are modified characters to use to display less than/greater than signs, but I don’t know what they are.

    Type &lt; to get <
    Type &gt; to get >
    Type &amp; to get an & that isn’t interpreted as coding.
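    (For the curious, the same escaping Sili spells out by hand can be done programmatically; here is a small sketch using Python’s standard-library `html` module, handy for pre-escaping a comment full of less-than signs before it gets eaten:)

    ```python
    # Escape/unescape the HTML special characters that ate the
    # "less than" signs in comments 25-27.
    from html import escape, unescape

    print(escape("IC50 < 1 nM & ee > 99%"))
    # -> IC50 &lt; 1 nM &amp; ee &gt; 99%

    print(unescape("&lt;0.1% impurity"))
    # -> <0.1% impurity
    ```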

  33. D.J. says:

    The original Infinitely Active Impurity horror story: The Strange Case of Dr. Jekyll and Mr. Hyde, by Robert Louis Stevenson.
