

Precision Oncology Isn’t Quite There Yet

Here’s a gauntlet thrown down – let’s see how many people show up to the duel. Vinay Prasad has a piece in Nature titled “The Precision Oncology Illusion”, with the subhead saying “Precision oncology has not been shown to work, and perhaps it never will”. (Here’s Prasad earlier this year, with co-authors in The Lancet on the same theme).

To get the definitions straight, by “precision oncology” he means (rightly) the idea of using DNA sequence information from individual patients to tailor what therapy they should get. (Keep in mind that there are a lot of stories in the popular press that have garbled that idea a bit over the years, and there are a lot of press releases that name-check it without having much connection to it at all – the signal/noise isn’t that great). Now, that idea would seem to make very good sense, because it’s become abundantly clear over the years that every tumor is slightly different when looked at on a genetic level. So what’s Prasad’s problem?

For one thing, we just don’t have many targeted therapies to offer yet. It’s gotten easier and easier to sequence human tissue, but it has not gotten correspondingly easier – to put it gently – to match that result with a therapy that’s been shown to have better outcomes. Prasad has some unsettling figures on this, especially for solid-tumor patients. In that area, he says that he has been able to find only 32 reports of exceptional responses over the years after sequencing and targeted therapy, and even many of these don’t seem to hold up as well as they should on closer inspection. In the big oncology picture, there aren’t many of those at all, so for most patients (arguably “nearly all patients”) there really isn’t a targeted therapy to offer. That problem, though, could in theory resolve itself over the years with a lot more work in the clinic (and a lot more oncology agents coming out of the labs). That’s not an easy fix, but the other problems may be even harder to deal with.

As Prasad mentions, one big problem is that even when you find a particular mutation that would seem to make a tumor susceptible to a targeted therapy, it often doesn’t seem to do any good. The mutation is present, but it’s not the thing driving the tumor’s growth, and most of the time there’s no good way to tell that up front. (See this report from a couple of months ago as an example). Other (even bigger) obstacles that Prasad doesn’t have the space to go into are the genetic diversity of most tumors (you’re not fighting a single cell type with a single set of mutations) and the mutation rates themselves. Too often, especially for many solid tumors, what we end up doing is allowing patients to die, a few weeks or months later, from another subset of their tumor cells after we have cleared the way by eliminating the cellular competition. That’s not a very nice way to put things, but from what I can tell, that’s about the size of it (I’m well aware that those extra few weeks or months may well be worth a lot, depending on what they’re like and on the patient’s own situation). Many people are aware that this is the situation, but my worry is that they associate this state of affairs with old-fashioned chemotherapy, not that new precision DNA-targeted therapy that you hear about. But the same thing is going to happen for many of those, too.

That’s a real problem. You have a lot of cancer treatment centers advertising their ability to tailor a therapy right for you, the patient, with cutting-edge techniques that have recently become available. Except that these mostly don’t exist yet, and mostly aren’t applicable to the people who show up for treatment, and may not (for the reasons given above) ever exist for some (many?) patients at all. This field has been proclaimed as the coming thing for many years now – fine. But proclaiming it as already having arrived is not a good move.

46 comments on “Precision Oncology Isn’t Quite There Yet”

  1. Sounds similar to a critique by MIT’s Michael Yaffe which I wrote about a few years ago (link in handle). Here’s what Yaffe had to say:

    “We biomedical scientists are addicted to data, like alcoholics are addicted to cheap booze. As in the old joke about the drunk looking under the lamppost for his lost wallet, biomedical scientists tend to look under the sequencing lamppost where the “light is brightest”—that is, where the most data can be obtained as quickly as possible. Like data junkies, we continue to look to genome sequencing when the really clinically useful information may lie someplace else.”

    1. grad student says:

      Are there any indications of where other clinically useful information might lie? Metabolism? Proteins? Lipids? RNA?

      1. Crick says:

        Reaction networks, signaling pathways…
        From a guy named Watson: “You could sequence 150,000 people with cancer and it’s not going to cure anyone. It might give you a few leads, but it’s not, to me, the solution. The solution is good chemistry. And that’s what’s lacking. We have a world of cancer biology trained to think genes. They don’t think chemistry at all.”

        1. Curious Wavefunction says:

          Agree with Crick. The best marker of a cancer driver might still be biochemical. As Watson noted, doing classical biochemistry on metabolic and signaling pathways might still be the best way to identify legitimate targets. The problem is that this kind of work is relatively slow and tedious and unflashy so it’s not always considered hot and fashionable like genomics. Nonetheless, this is not an all or none proposition. Genomics can help prioritize targets which could then be investigated rigorously using biochemistry.

  2. Conversely, there are some targeted therapies that clearly work – Gleevec being the poster child, EGFR inhibitors another, and ALK fusions yet another.

    What would be useful here would be a table or graphic showing, for each therapy-genotype pair, the strength of the evidence and the class of the target (several more cases of activated kinases being successful targets come to mind).

  3. bhip says:

    The statement about “the genetic diversity of most tumors (you’re not fighting a single cell type with a single set of mutations)” is, I think, the gist of the problem behind the history of failures with “precision” therapies.
    A cancer is rarely a single homogeneous population of cells, all sharing the same mutational load, though that can be the impression given by a single needle biopsy in solid tumors (this is exemplified in an excellent 2012 paper on intratumor heterogeneity in the New England Journal of Medicine).
    You will often see the statement “the cancer eventually became resistant to the treatment” – one explanation is ongoing mutation in the tumor(s) during the course of treatment. It seems to me more likely that the treatment in question effectively killed a dominant subset of the cancer cells, producing the initial positive response, but in doing so simply selected for the resistant cell populations, and the cancer rebounds.

  4. Anon says:

    Seems the only way to get real precision medicine in oncology is with a treatment that can evolve to attack the tumour as it evolves in situ. And that effectively means replicating the immune system, or indeed using it directly.

  5. Peter Kenny says:

    The precision medicine folk do need to get their heads around the idea that intracellular unbound concentration is not something that is currently measurable in live humans.

    1. Dr CNS says:

      Hi Peter – I agree with your comment.
      Even if we could measure it, do you think things would change so dramatically in terms of the probability of success of new drug programs?

      1. Peter Kenny says:

        Our inability to measure intracellular unbound concentration means that Phase I trials are incomplete when the target is intracellular. Being able to measure this in a clinical setting would reduce the guesswork going into Phase II, and projects could be put out of their (and our) misery as swiftly and mercifully as possible. Being able to measure intracellular concentrations in vivo is only likely to increase the probability of success in new drug programs if the information can be used to select compounds.

        However, this is all conjecture because in vivo measurement of intracellular drug concentrations is a long way off (although there have been some encouraging in vitro developments recently).

  6. Andre Brandli says:

    Interestingly enough, the recently published “Ten Targets” for Joe Biden’s Cancer Moonshot does not mention the term “precision oncology” at all.

  7. Janex says:

    I’ll agree with him that precision oncology is oversold and overhyped, but I’ll disagree on the “perhaps never will.” Assuming research continues in the area, I believe we will eventually achieve the dream of precision oncology. We’ll need (at a minimum): 1) more types of targeted therapies, 2) improved genetic screening so that these multiple cell types aren’t missed, and 3) better ways of pairing targeted therapies. I tend to feel that in the long run most precision therapies will work better in combination with one another rather than as stand-alones. With the current focus primarily on single therapies, we are missing a lot of potential success stories. Often there is redundancy: when we block one pathway, another substitutes. We need to get better at identifying those issues and pairing targeted therapies to block both pathways. It’s going to be a slow, step-by-step process with a series of small victories that will likely take decades, but I believe we will eventually get there (assuming funding doesn’t disappear).

  8. Brad Smith says:

    It’s unfortunate that Prasad cites incorrect figures (number of patients profiled at Foundation Medicine) and fails to cite studies (beyond the SHIVA study) that support precision medicine. Tumor heterogeneity means that we need to do a better job using targeted therapies, not that we should refute the hypothesis that oncogenic drivers exist and can be targeted to benefit patients.

    1. Anon says:

      I too am surprised at Dr. Prasad’s failure to cite positive studies in the field.

  9. Anon says:

    Even if precision medicine does eventually work, it means treating smaller and smaller patient populations at greater and greater cost. Which is completely unsustainable, and exactly the nub of the problem we have in healthcare right now.

    Stupid, stupid idea!

    1. Oliver H says:

      Don’t see why it would necessarily have to be at greater and greater costs. Yes, traditionally, smaller populations would mean lower economies of scale, but manufacturing has some quite interesting paradigm shifts in its own pipeline.

      I’d also suggest that the problems in healthcare are in an entirely different area, or rather not that generalizable, as evidenced by the fact that the US healthcare sector has problems alien to other countries’ sectors, which in turn have problems of their own.

      1. Kelvin says:

        “Don’t see why it would necessarily have to be at greater and greater costs.”

        Because each new drug still needs to go through clinical trials no matter how many (or few) patients it might treat, meanwhile success rates will continue to fall as each new drug raises the bar for the next. The result will be just an extrapolation of the law of diminishing returns that we already see.

        1. wlm says:

          I agree, if and only if cancer treatments continue to be palliative. If we can develop combination therapies that are curative (think testicular cancer), then the cost per patient *might* decline.

          1. Kelvin says:

            … except that any cancer survivors eventually die of some other more expensive disease like diabetes or Alzheimer’s. Despite the cost of cancer drugs, cancer is still one of the cheapest ways to die, at least for now. And we all die eventually. The only question is, how much will it cost for each of us to die?

          2. Mark Thorson says:

            So the secret to reducing health care costs is cheaper tobacco, and subsidies to those who can’t afford all they want.

          3. Kelvin says:

            Yes, or a cure for diabetes/AD.

    2. Some idiot says:

      Hmmm…. Is it? Disclaimer: I don’t know much about this field. And yes, I have my doubts that this targeting approach will work any time soon (although I hope it will). But let’s consider the following scenario:

      A group of (say) 100 general DNA-indicated targets has been found, (a) effective agents for treating each have been found, and (b) screening of (probably multiple) biopsy samples does a good job of determining which of the 100 DNA targets are relevant in any given case. In that case, the individual patient’s treatment could be tailored from a large number of off-the-shelf (or almost) options. Yes, the per-treatment cost will rise, but hopefully the effectiveness of the treatment would rise far more significantly. If you can actually rein in the tumor(s) enough that the body’s immune system can cope with what remains, then when you calculate the extra number of QALYs you get out of it, it would make a lot of sense, from both social and economic perspectives.

      Now, I am not stupid enough to think that we will be there anytime soon. The required wish-list for that scenario is pretty long (or tall, or both). But the point of the argument is that it would not necessarily lead to treatments helping smaller numbers of people at a higher cost (“higher” in terms of cost vs. benefit).
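      The cost-vs-benefit point can be made concrete with a toy incremental cost-effectiveness calculation (a sketch only; every number below is invented for illustration, and real QALY thresholds and prices vary widely):

```python
# Toy incremental cost-effectiveness ratio (ICER): the extra cost per
# extra quality-adjusted life year (QALY). All figures are invented.
def icer(cost_new: float, cost_old: float, qaly_new: float, qaly_old: float) -> float:
    """Extra cost divided by extra QALYs gained."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical tailored therapy: triple the price, but triple the QALYs.
ratio = icer(cost_new=150_000, cost_old=50_000, qaly_new=3.0, qaly_old=1.0)
print(f"ICER: ${ratio:,.0f} per QALY gained")
```

The point of the sketch is only that a higher per-treatment price can still come out favorably if the QALY gain grows faster than the cost does.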

      1. Mark S. says:

        IANA expert either.

        IMO: if we were repairing a man-made device, I’d agree with this reductionist approach.

        But from what I’ve learned here, even with perfect knowledge you’re not going to discover 100 essentially orthogonal treatment vectors. Each one will have spillover effects on the others, many of which will interfere destructively or constructively in non-linear ways. The problem is to deconstruct (living) soup, not to deconstruct a wristwatch. Orthogonality is a very fundamental goal of human design regardless of the field; somebody forgot to tell biology about it.

        If we arbitrarily assign 10 possible dosage levels (0 to 9 units) to each of the 100 assumed targets / compounds there are 10^100 possible regimens. Even simple yes/no levels of dosing for each compound will reduce the testing space to “only” 2^100 ~= 10^30.

        Obviously the total human population will never need them all. But after we get past the first few common combos, each patient will be his/her own personal Phase II trial.

        Right now in a very real sense each patient is his/her own Phase III trial. We know quite a lot about safety and something about efficacy. But will it work? How well and what’s the ideal dose? N=1 and no control. That’s a lousy space in which to expect high quality first-run results.
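        The dosing arithmetic above can be checked in a few lines of Python (purely illustrative; the 100 targets and 10 dose levels are the hypothetical numbers from this thread, not real trial parameters):

```python
# Hypothetical regimen space from the comment above: 100 targets,
# each dosed at one of 10 levels (0-9 units), or simply on/off.
targets = 100
dose_levels = 10

graded_space = dose_levels ** targets  # every graded regimen: 10^100
binary_space = 2 ** targets            # yes/no dosing only: 2^100

print(f"graded dosing: {graded_space:.2e} regimens")
print(f"on/off dosing: {binary_space:.2e} regimens")
```

Even the "reduced" on/off space is about 10^30 regimens, which is the comment's point: exhaustive combination testing is out of the question.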

  10. Casual Observer says:

    The Nature article seems downright misleading.

    “So far, precision oncology has been tested in only one such published study”? What about Herceptin, Xalkori, Tagrisso, Zelboraf, for starters? How does PD-L1 testing not qualify? Is he excluding any “precision” test that is not just for a point mutation? (And even then, hard to see how Tagrisso and Zelboraf would be excluded.)

  11. swattie91 says:

    What about the challenge of conducting statistically meaningful clinical trials with ever smaller numbers of patients possessing a specifically defined set of genetic alterations?

  12. Dr CNS says:

    I think there are two different points here:

    1. The heterogeneity of human disease defined as a phenotype (instead of departure from homeostasis defined at the molecular level).

    2. The false narrative driven by broad media reports on the progress made in life sciences aimed at mitigating human disease. Yes, dumbing down may be necessary to explain findings… but you’d expect scientists to know the difference…

  13. CMCguy says:

    Is it too early in this field’s development for precision oncology’s applications to be a meaningful argument? The technical ability to map the human genome is only a couple of decades old, I believe, and the routine, reliable ability to isolate variations and correlate them with specific diseases is even younger: rapidly improving (unfortunately the level of hype about delivering miracles has been extreme), but still a non-trivial exercise. Targeted therapies have most often lagged behind detection and the expansion of the key biological knowledge, and then involve substantial (years of) trial and error in discovering and refining what works, so is it surprising that there is not much to offer currently? Critical evaluation is worthwhile as long as its purpose is to guide better avenues of future effort, but the time scales here seem a little premature to disregard progress so far or to rule out greater potential (even if it can never live up to the hype and misleading claims that go on).

  14. Shanedorf says:

    Are we sequencing the right target? There’s a fair amount of work looking at cancer stem cells as the source of metastasis and mortality.

    Instead of sequencing the bulk tumor, perhaps looking at the CSCs in greater detail offers a path forward for both NGS and targeted therapies. It’s also unlikely that a monotherapy is going to be the answer; multiple agents and multiple modalities may be the way to bring precision to cancer treatments.

  15. steve says:

    To my mind, the complexity and heterogeneity of cancer leaves only one alternative – immunotherapy approaches. Complexity and heterogeneity are exactly what the immune system evolved to combat. Think about influenza, adenovirus or any of a host of other rapidly mutating pathogens. The issue is that tumors, particularly solid tumors, undergo immunoediting during their development to avoid provoking immune responses. It will take time and effort but once we are able to block each of these tumor immunosuppressive strategies (e.g., checkpoint inhibitors, IDO inhibitors, etc.) we’ll be able to tailor combination treatments that will enable the immune system to more efficiently do its job and reduce cancer to just another annoyance like the common cold.

  16. Clifford says:

    disclosure: I work for a precision oncology company.

    The trials Prasad cites (eg NCI-MATCH) actively *exclude* patients for which there is a known (meaning FDA approval or NCCN guidelines) relationship between a biomarker and a therapy. For instance, a NSCLC patient who is ALK fusion+ will not be accepted in the crizotinib arm, as this is an approved indication.

    Inevitably immunotherapies will also require molecular stratification. Consider the recent failure of Opdivo in first-line advanced NSCLC patients. Even though BMS had demonstrated greatly enhanced efficacy in high PDL1 patients (the high PD-L1 expressing group had median overall survival of 19.4 months compared to 8 months in the docetaxel arm), BMS gambled that they could get more market share if they did not stratify the population. How’d that work out?

    In other words, “precision oncology is an illusion” if you actively exclude those who are known to benefit from precision oncology.

    Where are the editors at Nature? Sheesh.

  17. Barry says:

    The heterogeneity of cancers is a huge problem. As has been noted, any single therapy may wipe out some or most of a tumor but just make way for other clones to proliferate. More basically, DNA is the wrong place to look. I don’t care that your p53 gene is wild-type if you’re not expressing any protein from it – or if you’re failing to fold/phosphorylate it correctly. We need to be looking at the RNA transcripts, if not at the proteins themselves.

  18. Mike B. says:

    Biologists and medicine are completely resistant to the fundamental changes that must occur in the way we think about life and disease. Life is NOT simply DNA –> protein –> disease. Mining genomics data for a lead on how to attack cancer could be barking up the wrong tree in many cases. Has it not occurred to many people that something like post-translational modifications (PTMs), of which there are over 300 types, could be driving disease? Abnormally distributed PTMs on proteins can respond to very minute changes in metabolism, and you’d never know it by looking at genetics, gene expression, or protein levels alone, since PTMs have no code to crack and are nowhere near as easily detectable. The number of permutations of a single protein, once the entire class of possible PTMs is taken into consideration, is mind-numbingly massive, and it is in THIS space that the answers which might give us breakthroughs in medicine and in our understanding of complex diseases may lie. DNA and proteins are akin to classical physics: it wasn’t until the quantum revolution that scientists could finally start explaining phenomena they were observing at the small scale. In just the same way, biology and medicine may need their own ‘quantum revolution’ and start focusing on non-classical molecular biology, beginning with cracking the massive realm of post-translational modifications and how they are altered in disease.

    1. steve says:

      Sorry but even if the eventual product is a post-translational modification it must be driven by a genetic change or it wouldn’t be transmitted every time the cell divides. Post-translational modifications have been well-described in cancer. There are a number of glycosylation differences, for example, but these are driven by genetic modifications that result in expression or silencing of the glycosyltransferases that are responsible for the modified glycosylation patterns. For example, many tumors up-regulate fucosyltransferases that increase selectin binding, allowing metastasis to occur. Post-translational modifications do not themselves drive disease; rather, they are manifestations of the disease process and wouldn’t occur without some underlying genetic or epigenetic change.

      1. Mike B. says:

        Steve, I have to disagree. I can immediately link to a number of papers describing how metabolic flux regulates patterns of PTMs. Flux-driven changes are almost impossible to predict by looking at the genetic code, transcript levels, or even protein quantity in a cell alone. It’s a completely different level of complexity.

      2. Mike B. says:

        BTW, you mention that epigenetic changes drive disease. It is well known that PTMs can regulate epigenetic patterns. For example, the TET family of enzymes is heavily modified by sugar – glycosylated – and a PTM like glycosylation of TET proteins can, in theory, regulate epigenetic patterns.

        1. steve says:

          Metabolic flux is also genetically controlled – the Warburg effect, for example, is largely caused by up-regulation of PKM2 instead of PKM1. Bottom line: cancer is a genetic disease. It has to be this way, as tumors are derived from a single clone that self-replicates. You’d have to figure out some other mechanism besides genetic or epigenetic changes (i.e., changes in gene expression) that can drive a single clone to self-replicate. Post-translational modifications and metabolic changes are all important drivers, but they are downstream events driven by changes in gene expression.

          1. Mike B. says:

            Steve, sure, there exist things like PKM2, but to claim that metabolic fluxes are entirely controlled DIRECTLY through genetics is misleading. If that were the case, we’d be able to pack our bags and go home: no one would need to do metabolomics and fluxomics work, because you’d be able to predict what would happen from the genetic instructions alone. Clearly that isn’t the case – the field of metabolomics exists for a reason. You know what the Warburg effect also does? It also causes something like PFK1 to be abnormally glycosylated at a site that possibly induces an allosteric change. The amount of glycosylation on PFK1 then goes on to regulate the amount of carbon from glucose that gets shunted toward the PPP. These sets of instructions are completely outside of the genetic code; how is a gene transcript, or even the total quantity of a protein, going to give you insight into when and where PFK1 will be modified by sugar to alter PPP flux? Here, PPP metabolism is being altered for reasons completely outside of mutations and gene expression.

            Secondly, yes, we’ve known for a long time that cancer has altered gene expression. But guess what kind of protein RNA polymerase II is. It’s actually a glycoprotein and without its sugar, RNA polymerase II doesn’t work right.

  19. Ex-Pharma says:

    In an inspiring plenary talk at a corporate retreat around the year 2000, an eminent and highly respected clinician put forward the view that by 2020 cancer would be treated as a chronic not an acute disease, with patients routinely surviving for 20 years or more. Oh well, maybe by 2040…

  20. steve says:

    In response to Mike B (there wasn’t a reply button): No, that’s incorrect. Of course blocking metabolic changes can influence the course of the disease. The question is why would a cell’s metabolism change in the first place? It doesn’t just occur magically. It occurs because of changes in levels of metabolic enzymes as in the PKM2 example I gave. Those enzymes are proteins, which are produced from mRNA, which is derived from changes in gene expression. That doesn’t mean that research into metabolic changes isn’t valuable but it does mean that it’s downstream from changes in genomic expression. Same thing with post-translational modifications. It’s great to target them but they only occur because of changes in gene expression of modification enzymes like glycosyltransferases, proteases, etc. If you know of any other mechanism that can drive a normal cell to become a tumorigenic clone besides changes in gene expression please elucidate.

  21. PTMiner says:

    @steve and @mike, just to clarify a misconception in the literature: the purported PKM2/PKM1 switch is a fallacy. Many papers published after those first splashy CNS-journal papers showed that PKM2 is the dominant isoform in nearly all tissues (as measured using targeted proteomics, not antibodies) – the first paper showed only one tissue with dominant PKM1 expression. There is no ‘switch’ in cancer cells. PKM2 does appear to be regulated differently in cancer cells, which, appropriately, appears to be driven by changes in PTM state and metabolite-binding profiles… we need to move away from genetics, and people are just too scared to dive into the proteome.

    1. steve says:

      I haven’t kept up with the latest publications (it’s not my primary field) but my understanding is that PKM2 is primarily expressed in the embryo and is upregulated by oncogenic expression of genes like myc. In any case, the proteome cannot be responsible for malignant transformation as it has no mechanism of transmitting information from cell to cell. Again, cancer is driven by a malignant clone. That clone divides and each daughter cell retains the oncogenic phenotype. The only way to account for that is from changes in gene expression. Someone would have to come up with a pretty convincing (and detailed!) mechanism to explain how cell metabolism, proteome, post-translational modification or whatever is transmitted each time a cell divides to have a convincing alternate theory of carcinogenesis. Those things are all important but they are secondary to, and downstream of, differential gene expression.

  22. JOlechno says:

    Great discussion. Here are my two cents (if you want to see my two bits, read ).
    Genetic analysis will be very useful as we continue to look for ways to cure cancer. I doubt that genetic analysis will ever be the complete solution. As others have already articulated, cancer is not the product of a single, static mutation; cancers mutate constantly. But there are other problems in precision (or personalized) medicine. First, it is not precise or personalized. A mutation is associated with a cancer, and oncologists have noted that people with this particular mutation tend to respond well to drug X. But that is a tendency. A percentage of the population does not go into remission or, if they do, a variant of the cancer arises via mutation to plague the patient anew. It would be a big step to eliminate chemotherapy for patients who will not respond to a certain treatment. That would save money (fewer rounds of futile chemo) and, more importantly, lives, as the cancer would not have the opportunity to mutate further during futile chemotherapy.
    There is also the problem that mutations linked to specific cancers have been identified in completely healthy tissue (see the link above). The mutation burden in that healthy tissue approximated the burden in tumors. You could be prescribing treatment based on the signature of a coincidental mutation.
    Then you have the problem that the Quackenbush lab reported. Even when two different labs reported the exact same mutations in cells, the response of those cells to specific chemotherapy agents was very different between the two labs.
    I would like to suggest that more attention be paid to the work done at the Institute for Molecular Medicine Finland (FIMM). They have revitalized ex vivo testing and drug sensitivity assays. In the US, both have fallen from favor, but the results FIMM has shown (as well as groups in Spain and Sweden) suggest that they have addressed many of the problems seen two decades ago. Note that their techniques are not done completely independently of genetic analysis: mutation analysis may suggest a particular treatment, and they can test that treatment (along with hundreds of others) with fast turn-around. They can amplify the likelihood of a DNA-based therapy working. Alternatively, if a drug fails to kill the cancer ex vivo, it is highly unlikely that it will work in vivo. Their process could dramatically reduce failed rounds of chemotherapy. Their technique could also be used to monitor the sensitivity of new mutational strains of the cancer: since the tests are relatively fast and inexpensive, patients could be tested regularly to see if a new resistant cancer line was emerging. While most of the work at FIMM has been limited to blood cancers, the work in Sweden has moved towards taking the FIMM techniques into solid tumors.
    The FIMM technique also offers the potential to identify new uses for old drugs. One example is the discovery that the anti-angiogenic renal cancer drug axitinib is a specific inhibitor of BCR-ABL1(T315I), a mutation with very limited drug sensitivity. Thus the FIMM technique can bring about more repurposing of drugs (something that should interest the pharmaceutical industry as well as the clinical oncologist). And while the FIMM methods are an N=1 approach, they do not have the harsh negative impact on patients that can occur when a drug is given based on intuition or mutational analysis alone.

  23. K Liu says:

    If the only problem is that we can diagnose more precisely and personally than we can treat, that is no argument against improving diagnosis – at the very least, we can avoid using a drug on the “wrong” patient.

  24. David Young MD says:

    There may be a role for Trojan-horse chemotherapy agents: use an antibody that attaches to most of the tumor cells and attach an ultrapotent chemotherapy drug to it, so that the antibody brings the drug into the vicinity of tumor cells that have mutated to the point where they no longer display the antigen.

    1. Barry says:

      Alas, most of what’s distinctive about cancer cells is cytosolic; they display few distinctive antigens on the surface for such an antibody to target. And just getting proximate isn’t good enough: you want to spare the nerves even as you kill the prostate tumor, but the two are intimately entwined.

Comments are closed.