
New Sirtuin Inhibitors

And since that last post was about sirtuins, here’s a new paper in press at J. Med. Chem. from the Sirtris folks (or the Sirtris folks that were, depending on who’s making the move down to PA). They report a number of potent new sirtuin inhibitor compounds, which certainly do look drug-like, and there are several X-ray structures of them bound to SIRT3. It seems that they’re mostly SIRT1/2/3 pan-inhibitors; if they have selective compounds, they’re not publishing on them yet.
I should also note, after this morning’s post, that the activities of these compounds were characterized by a modified mass spec assay! I would expect sirtuin researchers in other labs to gladly take up some of these compounds for their own uses. . .
Note: I should make it clear that these are more compounds produced via DNA-encoded library technology, and that they represent yet another chemotype from this work.

17 comments on “New Sirtuin Inhibitors”

  1. Virgil says:

    I won’t be lining up to obtain these.
    Showing a bunch of chemicals can inhibit a purified enzyme in a test tube is one thing. Showing useful bioactivity in a cell or other complex system is another thing altogether. This is especially true in the case of SIRT3, locked away inside the mitochondrion where I doubt any of those compounds will go. There’s also that rather electrophilic looking functional group on the lead compound (11c) which is going to do wonders for cysteine residues all round.
    It’s a wonder this got into J. Med. Chem., since typically the term “medicinal” applies to, you know, things that might actually be used in or relevant to medicine. Again, show me this stuff in a mouse or a primate and I might be interested, but for now it’s just some rather neat chemistry. Move along now, nothing to see.

  2. Notice says:

    Another example of compounds found with the ELT method from Praecis.

  3. KNP says:

    Going to threadjack here, but what is the general opinion about this new bill dealing with helium sales? I was a little confused about what is actually being proposed and how it might affect our current consumption rates.

  4. Chemjobber says:

    KNP: For a more comprehensive look at the issue, click on my handle for a Reuters article on it. It’s good.
    Long story short:
    1) The USG is somehow the largest seller of helium. By law, it’s only allowed to sell He to get rid of the debt from building the stockpile of helium. Now that the debt is almost gone, they won’t be able to sell helium anymore. This is a way to fix the problem.
    2) The USG/Congress doesn’t really want to be in the business of selling helium and wants to get out, but doesn’t know quite how to do it.
    (stealing liberally from finance blogger Felix Salmon.)

  5. anon2 says:

    As most folks in the industry would recognize, a J.Med.Chem. paper may, or may not, include the lead(s) toward any human drug candidate. But, then again, this is Sirtris at Cambridge, and they have not seemed to be under any restrictions or requirements for review of possible publications within GSK.

  6. Anoninfinity says:

    Hmm, where is the triazine? Weren’t a bunch of supposed ex-GSK folk saying that was all ELT was good for?

  7. Calymene says:

    Not a triazine this time, but a thienopyrimidine instead. It’s also a great core structure if you want to churn out vast libraries. Just saying.

  8. freddy says:

    Glaxoid, your silence is deafening.

  9. exGlaxoid says:

    I’ll bite. GSK already had TWO previous systems that they spent many millions of dollars on to build large libraries of tagged compounds, one from Mario Geysen and one from Affymax (before they bought Praecis). My point is that those systems never produced any useful hits that led to drugs or even clinical candidates, as far as I ever heard, yet they continue to spend money on this concept. There is nothing about the compounds they found that could not also have been found in a simple sparse array made by other means, likely for less than the cost of buying Praecis.
    At the same time they spent millions more building large HTS systems and collecting well over 1,000,000 compounds to screen in them. Creating four or more different screening technologies at once might be considered hedging your bets, but I saw it as an inability to make a decision. Even after people inside GSK evaluated some of these systems, as well as other companies’, their recommendations were often ignored. Given that GSK’s stock has lost value since the merger was announced in ~2000 (a high near $72, I believe), I think their decisions are reflected in the stock’s poor performance since then.
    During the GW merger period, 1996-2000, the stock ran up to more than double its previous value, which I would say shows that that merger was much more successful, in part because the leadership used more common sense, kept some of the best of both companies, and did not waste nearly as much on purchasing magic beans.

  10. annonie says:

    #9: The reason that GW stock did so well from ’96 to ’00 was that the former Wellcome had some good compounds coming along, and the former Glaxo folks knew how to develop, (mis)advertise, push, and sell, sell, sell. The GW and SB merger, on the other hand, was one of hopeful size and efficiency, as neither had any pipeline of substance. At the same time, both sides had hidden liabilities in improper sales and advertising practices, upcoming product liabilities, manufacturing problems, etc.
    The “history” of the “diversification of libraries” is another line of mis-adventures, and then there was Sirtris…..

  11. freddy says:

    A reasonable perspective, thanks for engaging.
    I agree that Geysen and Affymax did not pan out, and that similar experiences were encountered at many other pharmas, though not quite so spectacularly as at GSK.
    I just believe that ELT is a different beast altogether.
    I’ll just note that the arguments against encoding appear to be morphing from “it’s only triazines” to “we could have found that by HTS”. That’s progress of a sort.
    Lastly, you’re preaching to the choir when it comes to the stupidity of pharma decision-making. But that doesn’t pertain to the question, “does the Praecis method have value?”

  12. Calymene says:

    freddy: “I just believe that ELT is a different beast altogether”
    When GSK bought Praecis the reaction from me and many of my colleagues was “It’s Affymax all over again!” What is it about ELT that leads you to your conclusion?

  13. freddy says:

    It’s all about the screening process. One bead or one compound per well activity assays are prone to give false positives due to aggregation, impurities, etc. On-bead affinity screens with labeled target are similarly difficult and error prone.
    But one-tube, solution-phase, mixture-based affinity screening with high throughput sequencing as the output is just a qualitatively different thing. Maybe better, maybe worse, definitely different.

  14. Anon1 says:

    What Affymax does is pasted below directly from their website. I don’t see any significant similarities between DNA encoding in general and what Affymax does.
    From Affymax website:
    Affymax has developed a discovery platform based on advanced recombinant peptide and peptide chemistry techniques that has been used to generate novel peptide alternatives to protein drugs. The approach has identified peptides with no amino acid sequence homology to other known naturally-occurring human sequences. In developing peptide drug candidates, our goal is to improve upon currently marketed peptide and recombinant protein-based drugs.

  15. freddy says:

    In fairness to previous posters, that blurb from Affymax is what they’re doing nowadays. Back in the 90’s they were doing big-time combichem, including (solid-phase) encoding with DNA.

  16. Anon1 says:

    High throughput sequencing is what makes this technology feasible. I don’t see how Affymax in the 90s could have done anything similar to what GSK currently does? Or does this technology work without high throughput sequencing?

  17. Calymene says:

    There are many similarities between what GW / Affymax were doing in the ’90s and the GSK / Praecis approach now: both rely on a particular technology to produce megalibraries of compounds for screening. And they have similar drawbacks: the technology restricts the chemistry you can do, it can take forever to optimise the chemistry so that it works with your platform, and the drive for big numbers makes it very tempting to work with scaffolds with three, four, or five points of diversity, so you end up with big, ugly molecules that are very difficult to turn into a drug when you get a screening hit. Furthermore, you get your “diversity” by making every methyl, ethyl, butyl, futyl analogue you can.
    Just because you can screen millions at a time doesn’t mean you have to.
