Pharma is at a pivotal point in its evolution. The dearth of good new compounds in its pipeline is central to all its other problems, including its rising sales and marketing expenditure, poor financial performance and battered reputation. Moreover, though global demand for medicines is growing, as demographic, economic and epidemiological trends reshape the marketplace, soaring healthcare costs will force Pharma to engage in the dialogue on healthcare funding and work much harder for its dollars.
Clinical advances and financial constraints are already changing the way in which healthcare is delivered and will soon change the way in which it is measured. The political climate is likewise becoming much more aggressive, and no politician will stand up for an industry that does not win votes.
These trends will ultimately apply everywhere. The US is struggling to foot a healthcare bill that touches $2 trillion and cannot continue to generate the bulk of the industry’s profits. And though the E7 countries look increasingly promising, they cannot afford to match the prices the developed world has historically paid. Thus Pharma’s traditional strategy of placing big bets on a few molecules, marketing them heavily to primary-care physicians and turning them into blockbusters will no longer suffice. J.P. Garnier, chief executive of GlaxoSmithKline, admitted as much in February 2007, when he noted: “This is a business model where you are guaranteed to lose your entire book of business every 10 to 12 years.” The “first reflex” for many companies is to merge, and that buys them “a little time” to deal with patent expiries, but fundamental changes will ultimately be necessary, he concluded.
Some of these changes will depend on the nature of the products and services different companies offer, since there can be no one solution to the needs of an industry as complex as Pharma. The choices each organisation makes will have a bearing on the structure it adopts, alliances it forges, culture it espouses and people it employs.
But some common themes are likely to emerge. We believe that Pharma will have to use new technologies to improve its understanding of disease, reduce its R&D costs significantly and spread its bets to increase its productivity in the lab. It will also have to work more closely with governments, regulators and the healthcare community to make the medicines patients really need, test them as quickly and effectively as possible, and provide a more holistic healthcare service.
Lastly, it will have to tailor its sales, marketing and pricing strategies to new audiences and markets; show that its products are worth the money that is spent on them; and rebuild its reputation by adhering to the highest ethical standards. We shall discuss some of the changes we believe will be required in more detail in this next section of our report.
Access to basic research
Pharma will have to begin by expanding the pool in which it fishes for basic research. It has traditionally scoured the scientific literature to get leads or bought them in from academic institutions and niche biotech companies, but this approach is becoming increasingly unviable.
Most of the Western universities in which scientific research is performed are under huge pressure to commercialise their findings. Between 2000 and 2004, for example, there was a 70% increase in the number of patents the leading US research institutions filed (although the number of patents they were granted remained broadly the same). British universities are also getting much smarter about the value of their research.
There was a three-fold increase in the number of licences and options they executed, and a two-fold increase in the gross income they generated from intellectual property, over the same period. So, where basic research is available, it generally costs the industry considerably more.
The same is true in the biotech sector. Between 2000 and 2005, the average cost of an early-stage compound increased by a factor of eight, and the competition for assets is now so intense that valuations have started to overtake the figures recorded for Phase III deals just a few years ago.
Many biotech companies are also securing more favourable rights, in the form of co-promotion arrangements or other options, suggesting that they are keen to make the transition from pure R&D to commercialisation.
Much of the scientific research performed in the West is becoming prohibitively expensive, then, but the research base itself is also shifting east, and Pharma is not in a strong position to exploit these new sources of knowledge (see sidebar, Degrees of change).
Most of the industry leaders are trying to establish a foothold in Asia. Wyeth has, for example, opened a joint early development centre with Peking Union Medical College Hospital in Beijing; Roche has set up a research base at Zhangjiang Hi-Tech Park in Shanghai; and AstraZeneca is planning to do likewise. Meanwhile, Novartis is building an $83m R&D centre in Suzhou, near Shanghai; and GlaxoSmithKline is contemplating a move to China, too. Similarly, Eli Lilly, Novartis and GlaxoSmithKline have all set up research centres in Singapore.
Novartis has also just embarked on a new clinical research venture in Indonesia. AstraZeneca has opened a process R&D laboratory in Bangalore. And GlaxoSmithKline plans to set up a global drug development support centre in Mumbai with Indian software firm Tata Consultancy Services.
But these investments are tiny, compared with the amount Big Pharma is spending on R&D in the West. Moreover, although the majority of multinationals are keen to expand their presence in Asia, relatively few are focusing on research. In a survey recently conducted by PricewaterhouseCoopers, only 8% of respondents said they were interested in doing more research in Asia, whereas 50% wanted to increase their sales and marketing activities, and 25% to increase their manufacturing activities, in the region.
This may prove a rather short-sighted approach. If Pharma is to get access to the basic research it needs, it will either have to establish a much stronger footprint in Asia or forge close links with the most reputable centres of scientific excellence in the area.
That, in turn, means it will have to overcome barriers of language and culture. And, as experience in the IT sector shows, following the herd can prove a costly mistake. Many parts of India are now short of the very resources that prompted numerous companies to flock there in the first place, so it is essential to choose the right location.
But even if Pharma can get access to good basic research, it will still need to transform the way in which it performs R&D. At present, many companies concentrate on investigating new molecules before they have created a clear picture of the pathology of the diseases they are trying to address and the physiological responses those diseases cause.
This is too narrow a focus at such an early stage in the research process, and helps to explain why attrition rates in development are so high. We believe that, by 2020, the most successful companies will be those that focus on building a much better understanding of the pathophysiology of disease.
They will study the disease variability arising from multifactorial aetiology, the underlying disease mechanisms, the targets that are amenable to therapeutic intervention and the markers that could be used to distinguish between patients with similar clinical symptoms but distinct biological conditions.
Scientists currently use public-domain information on disease epidemiology, pathways, mechanisms and targets to formulate hypotheses about the likelihood of being able to alter the course of disease via pharmacological intervention.
They then use internally generated data derived from in vitro cellular models or in vivo animal studies to achieve limited validation of a specific target and, when they have established a certain degree of non-clinical “confidence in rationale” (CIR), they begin high throughput screening to find a molecule that can interact with the target protein.
Once they have identified a series of leads, they initiate a full programme of lead optimisation and experiments to test the physical and toxicological properties of a given molecule, but it is only after several more years have elapsed that the molecule is ready for studies in man.
Even then, early clinical studies do not test the central hypothesis that the target has any pathophysiological link to the disease being investigated; they focus on establishing what the human body does to the molecule. It is not until Phase II (some five to seven years after the first high throughput screen against the target) that the CIR is truly tested, and this is the point at which most compounds fail, although some fail at an even later stage in development (see sidebar, Molecular fallout).
The key to reducing the time and costs involved in researching new molecules is to test the hypotheses underpinning them in man as early as it is safe and practicable to do so, and to invest far more in creating a more holistic understanding of disease pathophysiology and epidemiology before embarking on expensive development programmes.
Today, it is clear that the real source of intellectual capital is a robust understanding of disease, and that the research process should no longer be limited to a specific therapeutic area, disease mechanism, target or biological pathway.
Recent research indicates, for example, that there are eight different disease mechanisms underlying Type 2 diabetes. In order to develop a treatment for patients with Type 2 diabetes, it is therefore necessary to understand the “context” of the disease, including:
- The nature and incidence of the various disease subtypes
- Whether all eight mechanisms are amenable to therapeutic intervention
- The relevant targets for therapeutic intervention
- The feasibility of developing biomarkers to identify which patients suffer from which disease subtypes
- The safety characteristics of different potential therapies; and
- The commercial viability of those therapies.

Once it has acquired an in-depth understanding of the pathophysiology of disease, a company can develop a probe molecule and biomarkers for early testing of the CIR in humans.
This will generate a steady, iterative increase in its knowledge about the relationship between molecular intervention and disease pathophysiology, as well as enabling it to create a more precise and sensitive set of biomarkers for determining disease subtypes, patient subpopulations, safety and efficacy. When it is confident that the mechanism of action in the probe molecule works as intended (based on iterative testing in man), it can move the molecule into “development” (see Figure 12).
Figure 12: In the R&D process of the future, a pharmaceutical company will only develop a molecule when it is confident that the mechanism of action works. Source: PricewaterhouseCoopers
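The iterative "test early in man" cycle described above can be sketched in a few lines of code. Everything below is a hypothetical illustration, not an industry tool: the evidence model, thresholds and study values are invented assumptions chosen only to show the develop/iterate/kill logic.

```python
# Hypothetical sketch of the iterative CIR (confidence in rationale) loop:
# small human probe/biomarker studies update confidence until the molecule
# is either moved into development or killed early at low cost.

def run_probe_study(confidence, biomarker_evidence):
    """One round of human probe data: positive biomarker evidence raises
    CIR, negative evidence lowers it (clamped to [0, 1])."""
    return max(0.0, min(1.0, confidence + biomarker_evidence))

def decide(confidence, develop_threshold=0.8, kill_threshold=0.2):
    if confidence >= develop_threshold:
        return "develop"   # mechanism of action confirmed in man
    if confidence <= kill_threshold:
        return "kill"      # hypothesis falsified before heavy spending
    return "iterate"       # run another probe/biomarker study

# Example: three successive probe studies with mixed biomarker readouts
# (the evidence values are invented for illustration).
cir = 0.5                  # initial non-clinical confidence in rationale
for evidence in [0.15, -0.05, 0.25]:
    cir = run_probe_study(cir, evidence)
    decision = decide(cir)
    if decision != "iterate":
        break

print(decision, round(cir, 2))   # prints: develop 0.85
```

The point of the sketch is the shape of the process, not the numbers: each cheap, early study in man either strengthens or weakens the central hypothesis, so most failures happen before the expensive "development" phase rather than in Phase II.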
Some biotech firms and specialised research organisations already use this approach to accelerate their research, establish the commercial viability of their molecules and reduce attrition rates, with obvious benefits. Big Pharma typically takes about 40 months and $25m to establish proof of concept.
Conversely, Chorus, the independent drug development unit set up by Eli Lilly, took just 12 months and $2.7m to show that an anticoagulant with a novel mechanism of action worked in 74 patients. We suggest that Pharma should emulate such pioneers, and that acquiring a much deeper knowledge of the pathophysiology of disease should become an early part of the research process.
Such an approach would alter the balance of risk dramatically by enabling the industry to pursue many more leads than it can currently afford and develop them with a much greater probability of success. Some of the new technologies now emerging will also help it to integrate the insights derived from the molecular sciences with other kinds of knowledge.
Semantic webs will, for example, enable scientists to move seamlessly from one database to another, analyse disparate forms of data spanning multiple disciplines and organisations, and connect genomic, proteomic and metabonomic data with clinical data.
They will also facilitate the mining and re-use of data from previous research projects and clinical studies to generate testable new hypotheses. The W3C Technology and Society domain has already developed a prototype development dashboard, BioDASH, which connects information about biological targets and compounds with data on the molecular biology of specific diseases. Several big pharmaceutical companies have also been conducting pilot studies, and some industry experts predict that use of semantic technologies could be widespread within the next five years.
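As a toy illustration of the kind of cross-database linkage semantic technologies enable, the sketch below expresses three separate "databases" as subject-predicate-object triples and walks the merged graph through shared identifiers. All identifiers, predicates and relationships are invented for illustration; a real system would use an RDF store, SPARQL and curated ontologies, as BioDASH does.

```python
# Toy semantic-web-style linkage: genomic, chemistry and clinical datasets
# expressed as triples and joined via shared identifiers. All data invented.

genomic = [
    ("GENE:IRS1", "encodes", "PROTEIN:IRS-1"),
    ("PROTEIN:IRS-1", "participates_in", "PATHWAY:insulin_signalling"),
]
chemistry = [
    ("COMPOUND:X-001", "binds", "PROTEIN:IRS-1"),
]
clinical = [
    ("PATHWAY:insulin_signalling", "implicated_in", "DISEASE:T2D_subtype_3"),
    ("DISEASE:T2D_subtype_3", "identified_by", "BIOMARKER:BM-17"),
]

graph = genomic + chemistry + clinical   # one unified triple store

def objects(subject, predicate):
    """All objects linked to `subject` by `predicate` in the merged graph."""
    return [o for s, p, o in graph if s == subject and p == predicate]

# Which disease subtype might compound X-001 be relevant to, and which
# biomarker would select the right patients?
for protein in objects("COMPOUND:X-001", "binds"):
    for pathway in objects(protein, "participates_in"):
        for disease in objects(pathway, "implicated_in"):
            for marker in objects(disease, "identified_by"):
                print("COMPOUND:X-001 ->", disease, "via", marker)
```

The value lies in the join itself: once genomic, chemical and clinical records share identifiers, a question that once required manual searches across disciplines becomes a mechanical graph traversal.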
Common data standards will clearly be necessary to support such technologies. But the Clinical Data Interchange Standards Consortium (CDISC) has already developed several data standards, and there are others in the pipeline. They include various labelling standards; the HL7 “family” of standards for discharge summaries, summary patient records and medical claim attachments; and the Digital Imaging and Communications in Medicine (DICOM) standard for transmitting medical images.
Other, more remarkable advances, such as machine-learning systems, are also on the horizon. “Autonomous experimentation”, as it is sometimes called, will ultimately allow Pharma to automate the entire cycle of scientific experimentation, including the origination of hypotheses to explain observations, the devising of experiments to test these hypotheses and the physical implementation of the experiments using laboratory robots (see sidebar, The road to robot scientists).
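The closed loop at the heart of autonomous experimentation (hypothesise, test, revise) can be shown in miniature. In the hypothetical sketch below the "laboratory" is a simulated assay; a robot scientist would replace `run_experiment()` with a physical experiment, and the gene names and single hidden ground truth are invented for illustration.

```python
# Minimal autonomous-experimentation loop: generate hypotheses, test the
# most plausible one, discard it if falsified, repeat until confirmed.
# The assay is simulated; all names are invented for illustration.

TRUE_TARGET = "gene_C"   # hidden ground truth the system must discover

def run_experiment(hypothesis):
    """Simulated assay: does knocking out this gene produce the phenotype?"""
    return hypothesis == TRUE_TARGET

# Candidate hypotheses with uniform prior weights.
hypotheses = {f"gene_{c}": 1.0 for c in "ABCDE"}

trials = 0
while True:
    # Pick the currently most plausible remaining hypothesis.
    candidate = max(hypotheses, key=hypotheses.get)
    trials += 1
    if run_experiment(candidate):
        break                      # hypothesis confirmed
    hypotheses.pop(candidate)      # falsified: discard and try the next

print(f"confirmed {candidate} after {trials} experiments")
# prints: confirmed gene_C after 3 experiments
```

Real systems replace the uniform weights with models that rank hypotheses by expected information gain and assay cost, which is what makes the loop cheaper than exhaustive human-directed screening.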
However, crucial though such new technologies will be in facilitating biopharmaceutical research, they cannot redress the cultural obstacles Pharma faces, and these are an even bigger roadblock. The corporate cultures and kinds of people the largest pharmaceutical companies employ often preclude them from being very innovative. Some companies are still wedded to the blockbuster model of R&D, and restrict their research agendas accordingly.
But even those that have abandoned the blind pursuit of blockbusters generally have very complex decision-making processes. They also reward research scientists for delivering candidate molecules to the clinic (most of which seem to come through just before the year-end) rather than for acquiring sufficient insight to determine whether those molecules are viable or not. It is therefore quite possible that new entities may emerge to fill the gap.
By 2020, for example, specialist organisations focusing exclusively on biological pathways and proofs of mechanism may sell their research, just as biotech firms now sell promising molecules. Indeed, given the cultural and organisational challenges the industry must tackle, it may be questionable whether pharmaceutical companies are even the right place in which to perform such work.