Drug Assays

Outsourced Assays, Now a Cause For Wonder?

Here’s a look at Emerald Biotherapeutics (a name that’s unfortunately easy to confuse with several other former Emeralds in this space). They’re engaged in their own drug research, but they also have lab services for sale, using a proprietary system that they say generates fast, reproducible assays.

On July 1 the company unveiled a service that lets other labs send it instructions for their experiments via the Web. Robots then complete the work. The idea is a variation on the cloud-computing model, in which companies rent computers by the hour from the likes of Google and Microsoft instead of buying and managing their own equipment. In this case, biotech startups could offload some of their basic tasks—counting cells one at a time or isolating proteins—freeing their researchers to work on more complex jobs and analyze results. To control the myriad lab machines, Emerald has developed its own computer language and management software. The company is charging clients $1 to $100 per experiment and has vowed to return results within a day.
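To make the "instructions via the Web" model concrete, here is a minimal sketch of what such a submission might look like. To be clear, this is purely illustrative: the field names, step operations, and the `build_experiment_request` helper are all my own assumptions, not Emerald's actual language or API.

```python
# Hypothetical sketch of a cloud-lab experiment submission payload.
# Every field name and step operation here is an assumption for
# illustration; none of it comes from Emerald's real system.

import json

def build_experiment_request(protocol, samples, steps, max_cost_usd=100):
    """Assemble a JSON-serializable payload describing an automated assay run."""
    if not steps:
        raise ValueError("an experiment needs at least one step")
    return {
        "protocol": protocol,          # e.g. "cell_count" or "protein_isolation"
        "samples": samples,            # plate/well identifiers
        "steps": steps,                # ordered instructions for the robots
        "max_cost_usd": max_cost_usd,  # the article quotes $1 to $100 per experiment
        "return_within_hours": 24,     # the promised one-day turnaround
    }

request = build_experiment_request(
    protocol="cell_count",
    samples=["plate1:A1", "plate1:A2"],
    steps=[
        {"op": "dispense", "reagent": "stain", "volume_ul": 50},
        {"op": "incubate", "minutes": 30, "temp_c": 37},
        {"op": "image", "channel": "brightfield"},
    ],
)
print(json.dumps(request, indent=2))
```

The point of such a structured format is the one the article gestures at: once every "knob" is an explicit field rather than a setting buried in one vendor's instrument software, runs become repeatable and the results can land in a single database.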

The Bloomberg Businessweek piece profiling them does a reasonable job, but I can’t tell whether its author knows that there’s already a good amount of outsourcing of this type. Emerald’s system does indeed sound fast. But the speed of an assay is rarely the real bottleneck in a drug discovery effort, so I’m not sure how much of a selling point that is. The harder parts are the ones that can’t be automated: figuring out what sort of assay to run, and troubleshooting it so that it can be run reliably on high-throughput machines, are not trivial processes, and they can take a lot of time and effort. Even more difficult is the step before any of that: figuring out what you’re going to be assaying at all. What’s your target? What are you screening for? What’s the great idea behind the whole project? That part is never going to be automated, and it’s the key to the whole game.
But when I read things like this, I wonder a bit:

While pursuing the antiviral therapy, Emerald began developing tools to work faster. Each piece of lab equipment, made by companies including Agilent Technologies (A) and Thermo Fisher Scientific (TMO), had its own often-rudimentary software. Emerald’s solution was to write management software that centralized control of all the machines, with consistent ways to specify what type of experiment to run, what order to mix the chemicals in, how long to heat something, and so on. “There are about 100 knobs you can turn with the software,” says Frezza. Crucially, Emerald can store all the information the machines collect in a single database, where scientists can analyze it. This was a major advance over the still common practice of pasting printed reports into lab notebooks.

Well, that may be common in some places, but in my own experience, that paste-the-printed-report stuff went out a long time ago. Talking up the ability to have all the assay data collected in one place sounds like something from about fifteen or twenty years ago, although the situation can be different for the small startups who would be using Emerald (or their competitors) for outsourced assay work. But I would still expect any CRO shop to provide something better than a bunch of paper printouts!
Emerald may well have something worth selling, and I wish them success with it. Reproducible assays with fast turnaround are always welcome. But this article’s “Gosh everything’s gone virtual now wow” take on it isn’t quite in line with reality.

11 comments on “Outsourced Assays, Now a Cause For Wonder?”

  1. pgwu says:

    Sounds like a systems integrator doing some extra assays. Folks buying Zymark systems hoped to do this in the ’80s and ’90s (minus the internet part), and a few more iterations followed. Each time things got better, but I’m not sure it’s there yet. There is always some trivial manual step that trips you up in a robotic system. I do see it being used for some robust assays.

  2. Anonymous says:

    I have to say it’s a good way for ex-scientists and potential entrepreneurs to test their ideas when they no longer have access to a lab.

  3. Chrispy says:

    Lotsa luck with this plan. It turns out that biological assays need constant tweaking and a vigilant eye. As much as people have tried, they have proven to be stubbornly difficult to outsource because the data you get back is crap. Some assays could be run this way (like simple enzyme inhibition assays), but unlike chemistry, where LC/MS can tell you a lot about the quality of the product, if the product is data it can be very hard to QC after the fact.
    One could see a certain demand for expensive equipment (Cellomics, FLIPR, patch clamping, LC/MS/MS) but this is not the kind of thing you’d want to “fly by wire.” And routine assays (HERG, liver microsomes, etc.) are already outsourced.
    Anyone who guarantees results in a day is fiercely expensive, doing next to nothing, and/or generating garbage (pick two). Caveat emptor.

  4. db says:

    “Even more difficult is the step before any of that: figuring out what you’re going to be assaying at all. What’s your target? What are you screening for? What’s the great idea behind the whole project? That stuff is never going to be automated at all, and it’s the key to the whole game.”
    Sadly though, that’s what a lot of Big Data proponents see as the endgame–throwing enough at the wall to take a fundamental advance out of what manages to stick.
    The problem, obviously, is that no matter what trappings it is dressed in, that isn’t *science*, it’s tinkering. In addition, without designed experiments and a carefully collected data set made up of carefully selected types of data, it is hard to impossible to really sort out meaning.

  5. Dr Octopus says:

    Whoa, whoa, wait a minute Barry… I’ve heard that some of these so-called “Zeolites” are actually made of CHEMICALS!!!

  6. Sisyphus says:

    I read this at Businessweek before Derek posted and found it to be one of the most vacuous and uninformative articles I have read. Here is my take on it. Why did the antiviral idea not work out? Because it was a half-baked, highly academic ruse concocted by two individuals with zero experience in drug development and the inverse of zero in arrogance and self-confidence. If they really had something, they would be talking about it, not claiming it is a “stealth” project. So, on to another half-baked idea. What exactly are these guys doing now? Creating a new LIMS system? Or running assays? Or making molecules (as a rotovap would suggest)? Despite all the words and sentences in the article, I still don’t know what they are doing. What is the business model? What are the “experiments,” and what numbers of assays, clients, experiments, and revenue are required to make the business model sustainable? $1 to $100 per experiment? Is that one LC/MS run, one gel, one XRD, or one sequence? For a $100 experiment, how much profit is there?
    Reading this made me think of the combichem era when there were many promises made and even more money wasted. See this article for a flashback: Symyx made similar promises with a slight variation on the business model and where is Symyx now?

  7. John Wayne says:

    I’m still cringing because neither of the guys in the picture (first link) is wearing safety glasses in the lab.
    The strength of this idea lies in the weakness of most operating software for laboratory equipment. In my experience it ranges from average to terrible; given the low bar maintained by the market, any improvement would be heartily welcomed.

  8. anchor says:

    @8: I am on board with you on this. Nothing intellectual here other than having enough resources ($$$) to buy equipment and then asking for profit sharing if something pans out. People like these are a dime a dozen, and I have seen many in my lifetime!

  9. newnickname says:

    You can automate the squirting, mixing, heating, cooling and even the outcome measuring (spectro, mass, etc). You can’t automate the THINKING to know if any of the above was even done correctly or makes any sense without seeing it or being there.
    Today, I see grad students and post-docs who are nothing but “button pushers” who have no clue what their instruments are doing, let alone measuring, and they just wait for an answer to be spit out.
    I have asked, “What value did you use for parameter-X?” “I don’t know. I let the machine do it.” [Default setting is the wrong value.] And “Does that result even make any sense to you???” “Well, that’s what the machine says it is.” “Do you think the machine might be WRONG??” [crickets chirping]
    More worthless data and worthless publications and worthless investments when someone realizes that the Emerald — ooops, I mean the Emperor — has no clothes.

  10. hellooooooo says:

    They have some serious Flash horsepower in the background. I don’t see how that fits into their bio-cloud model. Those are significantly “hands-on” in my experience.

  11. rumtscho says:

    > To control the myriad lab machines, Emerald has developed its own computer language and management software.
    I’m not a chemist, I’m a computer scientist. “Developed its own computer language” raises a big red flag for me. How would you chemists react to an article about somebody synthesizing a compound and then analyzing its properties using a spectrometer they built for themselves?
    And then, are the tasks really uniform enough to be automated? They are talking about supporting “40 common experiments”. As always with automation, the devil is in the details. If each of the experiments runs the same way every time, it can indeed be automated. But what if there are subtle differences? What happens when a researcher gives them spindly cells for counting instead of round ones? Even if they have thought to cover that particular example, there will be many other such gotchas. My experience with life science is that the number of variations is usually too high for automation to make sense. Computers are only good when you want the same process repeated perfectly time after time on standardized input, no deviations. Biological material and scientific processes are both poorly suited for automation.
    There are some amazing things happening right now in bioinformatics. They usually involve heavy information processing at some point after we’ve converted analog information to digital data. If these people can pull it off at the analog-to-digital conversion point, hats off to them. But it sounds too complex to work out well.

Comments are closed.