
Understanding Precision Medicine And AI Within The Life Cycle Of Technology Revolutions

Powerful new technologies have the potential to radically transform both science and society. In science, as Douglas Robertson describes in Phase Change (2003), new technologies like the microscope, the telescope, and the calculus can profoundly alter the questions we ask and advance our ability to understand nature. Society, too, can be visibly transformed by technology, as we’ve seen with examples ranging from the steam engine and the telegraph to automation and the internet.

The catch is that this transformation doesn’t occur overnight – far from it. The remarkable and often maddening aspect of innovation (as I’ve discussed here, here) is the exceptionally long lag between a technology’s original invention and the point when people figure out how to use it most effectively.

In this three-part piece, I will first present a framework, developed by economist Carlota Perez, describing the life cycle of transformative technologies, and outline relevant refinements, introduced by columnist Daniel Gross. I’ll then locate our contemporary debate around the utility (or not) of precision medicine – and particularly, precision oncology – in the context of this framework; this section is richly informed by the perspective shared by key physician and physician-scientist thought leaders in this space. Finally, I’ll suggest that AI (as a proxy for the emerging excitement – and skepticism – around digital and data in health) seems to be entering the earliest stages of the technology diffusion trajectory, which may help explain both the frenzy and the confusion.

Part I: The Life Cycle Of Technology Revolutions

Coaxing technology through implementation is both thrilling and fraught – thrilling because of the enormous potential it represents, as envisioned by originators, advocates, and investors including venture capitalists; fraught because of the excessive hype it induces, which can range from overly optimistic claims to outright fraud, and most everything in between.

Seen in this context, the debates around successive emerging healthcare technologies – precision medicine, AI – are in a way entirely predictable, just the most recent iterations of the discussions that have surrounded the introduction of every new and potentially important technology. Moreover, like the famous rabbi joke, everyone may have a valid point. The advocates may correctly foresee the ultimate impact of the technology, while the critics may correctly identify where early, often aspirational claims of success have been greatly exaggerated. Despite the cognitive dissonance that may result from sustaining both views, the tension may ultimately help unlock the true potential of the technology.

A particularly useful framing of the innovation cycle was developed by economist Carlota Perez over twenty-five years ago. Her central contention (expressed in her 2002 book, Technological Revolutions and Financial Capital) is that “the full fruits of technological revolutions” are “only widely reaped with a time lag. Two or three decades of turbulent adaptation and assimilation elapse from the moment when the set of new technologies, products, industries, and infrastructures make their first impact to the beginning of a ‘golden age’ or ‘era of good feeling’ based on them.” 

As Perez observes, “the real possibilities of a radical innovation can be so difficult to envision, before the appropriate paradigm is there, that even those who carry them out may grossly underestimate their potential.” She cites the examples of Edison, who “thought the phonograph he invented in the 1870s would be useful for recording dying people’s wills,” and IBM, whose leadership in the 1950s “still thought a few computers would cover the world’s total demand.”

Perez divides the development and diffusion of technology into two parts: “installation,” brought on by the discovery of a new technology – a “big bang” – and “deployment,” the period when the technology finally achieves wide adoption and use. Installation itself consists of two phases: “irruption” – a period of “explosive growth and rapid innovation” – and “frenzy,” characterized by “flourishing of the new industries, technology systems, and infrastructures, with intensive investment and market growth.” The “evangelistic sale” – an approach to selling a vision described in my last column and championed by Silicon Valley VCs like Marc Andreessen – plays a critical role during this early phase.

Driven by such growth, the technology trajectory enters the “deployment” period, which Perez also divides into two phases. The first, “synergy,” is characterized by the continued growth of the technology and the “full expansion of innovation and market potential.” Ultimately, the technology enters the second phase of deployment, the “maturity” stage – essentially the squeezing of the last bits of toothpaste out of the tube.

To summarize the framework: “big bang” -> installation (irruption, frenzy) -> deployment (synergy, maturity).

As the late British economist Christopher Freeman pointed out in his concise introduction to Perez’s 2002 book, the “early upsurge of a new technology,” a defining feature of the installation stage, also leads to “great turbulence and uncertainty in the economy. Venture capitalists, delighted at the new possibility of very high profits first demonstrated by early applications … rush to invest in the new activities and often in new firms.”

However, Freeman cautioned, “the uncertainty which inevitably accompanies such revolutionary developments, means that many of the early expectations will be disappointed, leading to the collapse of bubbles created by financial speculation as well as technological euphoria or ‘irrational exuberance.’”

Twelve years ago, in Pop!, finance columnist Daniel Gross took a close look at these bubbles, and suggested there may be underappreciated, long-term benefits associated with the processes that drive bubble formation. He acknowledged that the excesses, which may be especially prominent in the U.S., can be stomach-churning and profoundly destructive in the short term, but argued they may ultimately help accelerate the intelligent and productive adoption of the technology. The gold-rush mentality can result in the rapid construction of infrastructure or critical components that are ultimately useful, even if most of the companies rushing to do this – like the many companies initially seeking to build telegraph connections or lay fiber optic cable – ultimately fall by the wayside.

At least as important, Gross notes, “during bubbles, a great deal of money and energy is spent building up the mental infrastructure surrounding a new technology, which ultimately helps prepare both businesses and individuals to do new things, like bank from home, jump into a stranger’s car, or use their phone as a camera.”

There may be other, underappreciated benefits of the frenzy as well. “Bubbles, entrepreneurial storms that disrupt the existing commercial order, provide shots of adrenaline. The enthusiasm they generate had led successive generations of entrepreneurs to open new territory for settlement, to create valuable new infrastructure, to spur innovation, to push people to work, invest, and spend at a higher level – all in pursuit of promised massive short-term gains.” It’s not so much that Gross endorses bubbles as that he sees them as part and parcel of the entrepreneurial energy driving innovation and economic growth that, on balance, benefits society – an argument similar to the one advanced by Andrew McAfee in his latest book, More From Less (see my recent Wall Street Journal review, here).

Gross cites the example of the telegraph, where American enthusiasm in the mid-1800s led to a boom in the construction of telegraph lines at an absurdly rapid pace. Most of the original companies doing this work went bust, as excess capacity led to debilitating price wars. Yet most of the lines they constructed were ultimately consolidated by a single company, Western Union, and the nation was suddenly on the grid – and ready to engage with it – due in large part not only to the infrastructure build, but also to the market preparation, in effect, done by the many now-bankrupt companies and entrepreneurs.

This context – intended as a useful conceptual framework, not as prescriptive instruction for innovation – may help us understand both some of the ongoing debate around the utility of precision medicine, and the debate we’re just getting into around the utility of AI in health.

Part II: Precision Medicine In Oncology - A Technology Revolution In Progress

As I’ve discussed, the molecular biology revolution, catalyzed by the sequencing of the human genome, has transformed science, and increasingly impacts how new medicines are discovered, developed, and administered to selected patients. Equally true, this revolution has at times been driven by exceptional hype, as I’ve explicitly called out as well – again, see here and references therein, including specifically Harvard geneticist Richard Lewontin’s classic writing on the subject. This tension is perhaps most apparent in our ongoing national discussion about the utility of “precision medicine.” Prominent critics, such as Dr. Vinay Prasad (see his website here, and references therein), Dr. Michael Joyner (see this 2019 Journal of Clinical Investigation article, and references therein), and others, argue the hype has far exceeded what’s been demonstrated, and what is likely to be demonstrated, and suggest that rather than celebrating meager victories, we might consider more gainfully redeploying our limited resources elsewhere.

A recent and accessible article pressing the case against precision medicine, by scientist and writer Drew Smith, was published in Tincture (here). Smith argues targeted cancer therapy is delivering modest benefit at immodest cost, and that the degree of benefit, if anything, seems to be decreasing over time – i.e., we’re spending huge amounts of research dollars to develop new cancer medicines that are only incrementally better than previous offerings, and the degree of improvement is getting smaller, not larger, even as the price tags get higher.

I solicited the views of several trusted experts in the field, physician-scientists who are deeply familiar with these issues; most (including the ones quoted below) spoke on the record, and their remarks have been minimally edited for clarity.

A few points emerged. First, most seemed thrilled to engage in what they see as a critically important dialog. Second, most specifically acknowledged the excessive hype associated with precision medicine. Third, more mechanistically, several experts suggested we will increasingly approach cancer the way we approach infectious diseases, both in terms of needing to attack multiple vulnerabilities simultaneously (as has proved so effective for HIV), and in terms of needing to energize the immune system as well as attack the cancer directly (as I recently discussed in a Wall Street Journal book review, here). Finally, even though I had specifically asked the experts about the progress and clinical impact of the science, several independently called out the cost of new medicines as an obviously important issue that likely requires its own serious discussion. In the words of geneticist Dr. Robert Nussbaum, “the biggest challenges are not scientific or technological, they are economic - treatments cost too damn much to develop and implement.” Adds Stanford oncologist (and recent Tech Tonics guest) Dr. Alison Kurian, “Drug cost is certainly a huge problem, but that is a separate question from whether we are seeing any progress.”

Dr. Hal Burstein, a breast cancer specialist at Harvard and the Dana-Farber Cancer Institute, emphasized the historical importance of the concept of “precision medicine,” noting that in the 19th century, this meant the Gram stain and bacteriological culture. This “proved very important,” says Burstein, because it “provided a clinical and bacteriological taxonomy - distinguishing streptococcal pneumonia from staphylococcal pneumonia from tuberculosis from likely viral pneumonia - in understanding pneumonia (then, the most common cause of death in children and adults) and counseling patients.”

The benefit of more precise disease classification has continued resonance today, even outside of cancer, notes Harvard geneticist Heidi Rehm (also the chief genomics officer of the Massachusetts General Hospital as well as a member of the Broad Institute), who points to “the many thousands of non-cancer diseases we can diagnose. And although only a small number have targeted therapeutics, a larger number have life-saving interventions as well as quality of life improvements through appropriate management. Furthermore, diagnosis can end the diagnostic odyssey, which is not only highly beneficial to families but also cost-saving, as one ends a very expensive and continued search for answers.”

The ability to categorize cancer has proved particularly useful, Burstein explains. “Genomic taxonomy has proven immensely valuable in dozens of cancer types, even if we do not always have a ‘target’ for ‘precision medicine’ as yet based on existing therapeutics or treatment paradigms. There is no doubt that ‘precision’ medicine has revolutionized the treatment of innumerable cancers.” 

In breast cancer, specifically, Burstein explains that “HER2 testing and trastuzumab [Herceptin] therapy has saved countless lives; genomic testing (e.g. OncotypeDX) has spared hundreds of thousands of women from chemotherapy; adjuvant endocrine therapy for ER+ cancers has saved more cancer-jeopardized lives than any other medical therapy in history.”

He points to other examples of efficacy, including imatinib (Gleevec) in chronic myelogenous leukemia (CML), which “has meant that people with CML have a normal life expectancy.”

Burstein acknowledges there “are many less positive examples – tumors where sequencing has not so far yielded a therapeutic target or classification that has enabled transformation of treatment. Some of that is because we don’t have drugs to pair with the target or the ‘target’ drug does less than we hope, and some is because not all cancer situations are driven by genomic mutations.”

Perhaps most interestingly, from an evolution of technology perspective, Burstein notes that “the price of genomic sequencing has plummeted and the methods are rapidly becoming a commodity. Many well-annotated catalogs of important mutations are freely available on shared public databases. Commercial sequencing for tumors is now available for prices very much on par with other diagnostic tests in medicine – CT scans, MRI scans, specialized lab tests, etc. The trends here – decreasing costs, better knowledge bases in shared domains – are a marked contrast to most of testing and treatment experiences in American medicine.”

Contextualizing recent advances, Burstein observes, “We are roughly ten years into the era of tumor genomics, and about five years into the clinical era of such testing. Precision medicine has proven extremely useful in many powerful instances, and to date, has less of a role in other diseases. I would say that so far, it is a very good return on a new technology, with more to come.”

Similarly, Nussbaum, a distinguished geneticist and internist who in 2015 left UCSF to become chief medical officer of the diagnostic testing company Invitae, locates medicine “at the beginning of a revolution in precision therapy.” He observes “strong parallels to infectious diseases,” and the importance of needing both “an immune system to fight an infection” as well as powerful treatments against the pathogenic entity itself. He suggests our current ability to “harness the immune system is just in its infancy and will have a major impact when combined with better precision therapies.”

He notes that “The somewhat disappointing results from checkpoint inhibitors is, to my mind, a result of inappropriate and poorly conceived assays, like total tumor mutation burden,” adding “We have NOT identified the genomic signature that will allow maximum use of immune checkpoint inhibitor treatment, nor do we have all steps by which tumors evade the immune system.” He points out that even when we know what to target, it can be challenging to develop solutions: “We also have identified RAS mutations as the most common driver mutations and yet we have no effective targeted therapies yet. There is a lot more to learn and a lot more to do.”

Hopkins physician-scientist Dr. Drew Pardoll defines precision medicine as “the guidance of therapies to subsets of patients based on a biomarker - genetic, immunologic, pathology or otherwise. This should be distinguished from ‘personalized medicine,’ which is the creation of a different therapy for each patient,” and cites the example of CAR-T cells, “which are made by engineering each patient's own T cells.”

While acknowledging “we have a long way to go in precision medicine,” Pardoll notes there are “specific examples of precision medicine in oncology that have been great successes that no reasonable person could deny. For example, 15% of lung cancer has an EGFR mutation.” There’s an inhibitor specific for this mutation, called osimertinib (Tagrisso). “All lung cancer patients, particularly those who are not smokers, should have their tumors tested for an EGFR mutation because those patients have extremely impressive and durable responses to osimertinib, while all the other lung cancers without the mutation do not respond.”

Pardoll also cites work from his lab (which I have previously discussed here), demonstrating that “patients whose tumors have a mutation in a DNA mismatch repair gene are highly responsive to anti-PD1 (pembrolizumab [Keytruda]), regardless of the origin of their tumor. It appears that anti-PD1 may cure 50% of these patients with advanced metastatic cancer. While DNA mismatch repair mutations occur in only 5-6% of cancers, all cancers should be tested for these because if your cancer has it, you have a 50% chance at cure. The test is very straightforward and not that expensive. Unless you think that $200 is too expensive to potentially save a life. Prior to this finding, all these patients had a guaranteed death sentence. The FDA appropriately approved this combination test/treatment without a randomized trial because the data were so dramatic.”
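Pardoll’s cost argument can be made concrete with a back-of-the-envelope calculation. The inputs below simply restate the figures he quotes (a ~$200 test, mutations in 5-6% of cancers, a ~50% chance of durable response among carriers); the resulting cost per expected cure is my own illustrative derivation, not a published estimate:

```python
# Illustrative arithmetic for universal mismatch-repair testing,
# using only the figures quoted in the text.

test_cost = 200        # dollars per patient tested (quoted)
prevalence = 0.055     # midpoint of the quoted 5-6% of cancers
cure_rate = 0.50       # quoted chance of cure among mutation carriers

patients_tested = 10_000
carriers = patients_tested * prevalence          # ~550 patients identified
potential_cures = carriers * cure_rate           # ~275 patients
testing_cost_per_cure = (patients_tested * test_cost) / potential_cures

print(f"Total testing cost: ${patients_tested * test_cost:,}")
print(f"Expected cures enabled: {potential_cures:.0f}")
print(f"Testing cost per expected cure: ${testing_cost_per_cure:,.0f}")
```

On these assumed figures, testing everyone works out to roughly $7,000 in testing spend per expected cure – which is the sense in which, as Pardoll puts it, $200 is not “too expensive to potentially save a life.”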

From the perspective of some oncologists, such as Kurian, progress tends to accrue slowly, over time: “I think the public hopes for an acute transformative advance – e.g., to wake up one day in a new world without deaths from cancer…while in fact we are more likely to see incremental progress.” This is of course similar to how progress is achieved in a range of other domains, from the power looms discussed by James Bessen in Learning By Doing (2015) to the transportation systems discussed by Robert Gordon in The Rise and Fall of American Growth (2016) (see discussion of both here). Citing work from her own group (here), Kurian notes that “the decline in mortality from breast cancer is a good example of chipping away at the problem incrementally. We’re still not where we need to be, but we cure many more women without exposing them to chemotherapy, and our patients with metastatic disease live longer, in a pattern that appears to follow the pace of new drug approval.”

Kurian points out that “we all see patients who gain much more than the average duration of benefit from a new drug, and those stories stand out in our minds and give us hope (and of course, the human brain is good at denial of bad news, namely that many patients gain less than the average benefit).”

Pardoll’s Hopkins colleague, pioneering cancer researcher Dr. Bert Vogelstein, offers a particularly useful synthetic perspective, informed, as he notes, “by studies from our group over the years.” First, he emphasizes that the sheer number of neoplastic cells generally present in advanced cancers means it “would be extremely difficult for a single drug to induce cures (rather than remissions that last several months),” essentially because in this large population there are likely already rare cells with a molecular “work-around” rendering them less susceptible to the initial treatment. However, he notes, based on mathematical modeling, “if it were possible to develop two drugs whose resistance mechanisms were distinct, cures would in theory be possible.”
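The intuition behind this modeling argument can be sketched with a toy calculation. The tumor size and resistance rate below are illustrative round numbers of my own choosing, not figures from Vogelstein’s papers: in a tumor of N cells where resistance to a given drug pre-exists in a fraction μ of cells, single-agent resistance is essentially guaranteed, while simultaneous resistance to two drugs with non-overlapping mechanisms requires a cell to carry both alterations, at a rate of roughly μ².

```python
# Toy model of pre-existing drug resistance in a large tumor.
# Assumed, illustrative parameters (not from Vogelstein's work):
# an advanced tumor of ~1e9 cells, and resistance to any one drug
# pre-existing in ~1 in 10 million cells per mechanism.

tumor_cells = 1e9          # cells in an advanced cancer (assumed)
resistance_rate = 1e-7     # fraction of cells resistant to one drug (assumed)

# Single-agent therapy: expected number of already-resistant cells.
single_drug = tumor_cells * resistance_rate            # ~100 cells

# Two drugs with independent, non-overlapping resistance mechanisms:
# a cell must carry both alterations to escape.
dual_drug = tumor_cells * resistance_rate ** 2         # ~0.00001 cells

print(f"Expected cells resistant to one drug:  {single_drug:.0f}")
print(f"Expected cells resistant to both:      {dual_drug:.0e}")
```

Under these assumptions, a single drug faces roughly a hundred pre-existing resistant cells (so relapse is expected), while the chance that even one cell resists both drugs is vanishingly small – which is why non-overlapping combinations could, in principle, cure.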

Vogelstein also highlights (as he’s recently reviewed here) the potential benefit of earlier detection, which would allow both existing as well as newly-discovered drugs to “work better.” This is important, he emphasizes, because “no therapy – conventional, targeted, immunotherapy – works well on advanced cancers.” His recent review cites the example of colorectal cancer (CRC), noting “conventional chemotherapy agents can cure 47% of patients with micrometastatic CRC, but nearly zero patients with bulky disease.”

Consequently, Vogelstein cautions the survival curves cited in the Tincture article aren’t likely “to change much until cancers are detected earlier and patients are able to be treated with two or more agents whose resistance mechanisms don’t overlap. It’s early days in that respect – single drugs must be developed and shown to be somewhat effective (at least inducing objective responses in many patients, even if the responses aren’t durable) before multiple drugs can be used together. The HIV story is informative in this regard.” Adds Vogelstein, “this doesn’t mean, of course, that it will be easy to develop such drug combinations, or detect cancers earlier, but I’m optimistic that these developments will occur with further intensive research.”

If there is a bottom line, it’s that the technological advances permitting more precise characterization of cancer have helped inform treatments, and have helped researchers begin to implement the sorts of strategies most likely to be useful – approaches that have significantly impacted the lives of some cancer patients. But there’s clearly still a long way to go, and leading experts see the field as still in its early days.

Part III: Digital, Data, and AI in Health - A Technology Revolution Just Beginning

If precision medicine is in the first stages of its journey, the application of AI to healthcare and, in particular, drug discovery, is just getting out of the starting gate. Predictably, this has led to the same patterns of contrasting perspectives that have emerged in response to precision medicine.

Advocates, such as entrepreneur Chris Gibson of Recursion Pharmaceuticals (Gibson was also a recent guest on Tech Tonics – see here), seem keen to demonstrate that AI has entered the mainstream, and has successfully crossed the so-called chasm. In this LinkedIn post, Gibson emphasized the AI-embracing messaging of Novartis (as I discussed earlier this year, here), which apparently aspires to “become an AI-driven organization where every single employee ‘thinks like a data scientist.’” Concluded Gibson, emphatically: “From Novartis the message is clear: the future of pharma will be fully integrated with machine learning.”

Gibson also cites two key challenges: the need to “create massive new datasets for the purpose of AI-driven discovery” and the need to “create a culture that truly values the contributions of highly diverse and cross-functional teams.”  While Gibson is naturally talking his book, he may also be right, and in any case is authentically, evangelistically, advocating for the vision of the future as he sees it.

In contrast, a skeptic like Medicxi’s David Grainger (also a Forbes contributor and occasional co-author) argues, “I think more data and better data science can revolutionize some human endeavors, but drug discovery is not one of them - it can help make better the parts of the process we do adequately now, but it doesn’t address the real rate-limiting step on innovation, which is understanding biology. Here, what is needed is not data and data science, but education about complex systems (a la the Santa Fe Institute). Current ‘AI’ approaches are so linear and reductionist, it makes me shake my head in disbelief at all the hype.”

And so we enter the “installation” phase of the technology cycle, as Perez called it, where visionary champions will emphasize the potential of new technology, skeptics will highlight the very real limitations of what’s been accomplished to date, and VCs captivated by the transformative potential of emerging technology will seek to identify the most compelling opportunities – the few breakout companies that tend to emerge from each cycle – while avoiding being taken in by the “great manias and outrageous swindles” Perez so vividly describes.
