Here’s a warning about trying to write compounds off too quickly as PAINs (pan-assay interference compounds). The authors, from UNC-Chapel Hill, have gone through the PubChem database using software structural alerts for such motifs. Despite the original PAINs list being derived from AlphaScreen interfering compounds, they found that most of the compounds that get flagged are actually infrequent AlphaScreen hitters, and that PAINs alerts did not seem to reflect greater activity in the other assay techniques listed, either. Update: took the bold step of actually adding a link to the flippin’ paper. That’s what happens when I post and then spend the rest of the morning in meetings!
They go on to point out (as have others) that there are indeed marketed drugs that set off the same alerts, and that similar percentages of “dark chemical matter” compounds, the ones that don’t seem to hit in any assays, also set off such software warnings. That’s food for thought, I have to say. Here’s the conclusion:
(The) PAINS concept has been widely accepted by many experienced medicinal chemists both in academia and the pharmaceutical industry. Indeed, the original study from which the PAINS alerts were derived and the impetus behind it are an important step towards reproducibility and the appropriate use of resources in drug discovery. However, our findings based on the analysis of public data suggest that many compounds containing PAINS alerts do not actually show high assay promiscuity, leading to the conclusion that these alerts should not be blindly used, in the absence of orthogonal experimental assays, to deprioritize a compound.
At the same time, it is undeniable that pan-assay interference compounds exist and care must be taken to avoid these compounds. Moreover, we recognize that true “PAINS” may be present in the data analyzed herein but have not been classified as such because the current alerts do not cover these compounds. The issue of what constitutes a pan-assay interference compound thus remains unclear.
In my own experience, there are indeed such compounds, and there are structural motifs that are more likely (compared to random chance) to lead to such behavior. Quinones are number one on that list (and this paper notes that they seem to be the worst offenders as well), but there are a few others. I take their point, though, that you’re not going to be able to strip out the interfering compounds with a long list of substructure alerts, because the signal/noise of that approach decreases pretty quickly once you get past a few stinkers. Trying to do it that way is tempting but lazy.
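To make the "long list of substructure alerts" approach concrete, here is a deliberately simplified sketch of the flag-and-deprioritize workflow. Everything in it is made up for illustration: the alert names, the `TOY_ALERTS` table, and the `flag_compound` function are not from the paper or any real filter set, and the naive string matching is not chemically valid — real PAINS filters match SMARTS patterns with a cheminformatics toolkit (RDKit's FilterCatalog is the usual example).

```python
# Toy illustration of substructure-alert flagging. The patterns and the
# matching logic are hypothetical: real filters do graph-based SMARTS
# substructure searches, not SMILES string containment. This only shows
# the "flag it and cross it off" workflow discussed above.

# Hypothetical alert list: name -> SMILES fragment (illustrative only)
TOY_ALERTS = {
    "quinone-like": "O=C1C=CC(=O)",
    "rhodanine-like": "S1C(=S)NC(=O)C1",
}

def flag_compound(smiles: str) -> list[str]:
    """Return the names of any toy alerts whose fragment string appears
    verbatim in the SMILES -- a crude stand-in for a real substructure
    search."""
    return [name for name, frag in TOY_ALERTS.items() if frag in smiles]

if __name__ == "__main__":
    # 1,4-benzoquinone, written so the toy fragment matches verbatim
    print(flag_compound("O=C1C=CC(=O)C=C1"))  # ['quinone-like']
    print(flag_compound("CCO"))               # [] -- ethanol, no alerts
```

The point of the sketch is what it leaves out: a flag here is a cheap suspicion, not evidence, which is exactly why the orthogonal assays discussed below are the real arbiter.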
The only way to be sure that a compound is a false positive is to run orthogonal assays to check for aggregation, redox cycling, covalent reactivity, fluorescence interference, and all the other ways that things can go wrong. Those of us beating the drums for the PAINs concept are, for the most part, reacting against the papers we see that just seem to pretend that such problems don’t exist, and blithely publish on “active” compounds that excite great suspicion if you’ve seen a lot of screening hits before. At the same time, there is certainly an error in the opposite direction, where you blast through a list of compounds crossing things off automatically with no further evidence. That’s wrong, too.
In my view, presence of the (relatively few) structural motifs at the top of the PAINs lists should put a compound under suspicion. If you want to work on it because it’s the best and most promising thing out of your screen, go ahead – but run some other assays beforehand to make sure that you’re not wasting your time. These are the sorts of assays you should really be running on the best compounds from the other structural classes, too – these behaviors can crop up at any time. But while a rhodanine quinone should not be allowed to proceed without showing its papers, lesser offenders should certainly be given a chance to prove themselves. Assay data, real assay data, is the only way to be sure.
The paper finishes up with this call:
It would be of great value if a community-wide effort to screen and analyze a large set of commercially available compounds representing all the current PAINS alerts against multiple targets in various assays was performed by several independent groups.
I agree. I hope that something like this can be realized, but until then, the lesson is (and always has been) to get real numbers on your compounds, from several directions, before you trust them.