
Speech | In Person

Speech to the Regulatory Affairs Professionals Society (RAPS) 2017 Regulatory Conference
September 11, 2017


Remarks by Scott Gottlieb, M.D.
RAPS 2017 Regulatory Convergence Conference
Washington, DC

(Remarks as prepared for delivery)


Let me take a moment to reach back across the history of medicine, in order to draw a contrast to where we stand today.

The germ theory was first proposed by Girolamo Fracastoro in 1546, and expanded on by Marcus von Plenciz in 1762. But these ideas were initially met with skepticism. Humanity struggled to understand how germs caused disease. This disbelief had predictable consequences. Epidemics would continue to ravage society for centuries to come.

A transitional period began in the late 1850s as the work of Louis Pasteur and Robert Koch offered convincing evidence that germs caused disease. Eventually, this resulted in a golden era of bacteriology. The germ theory gained acceptance, and allowed us to identify the organisms that caused illness. The development of effective vaccines ensued.

By the 1870s, Joseph Lister was instrumental in developing practical applications of the germ theory with the advent of modern sanitation. In medical settings, we finally embraced aseptic surgical techniques.

It took 300 years from the first discovery of the germ theory of disease to see these principles established as a part of the practice of medicine.

It took four decades for our modern understanding of immunology to be firmly established, and then to be converted into tools that allow us to fully manipulate the human immune system as a way to cure cancer.

It’s taken just two decades from the first mapping of the human genome, and its translation into technologies that allow us to edit those gene sequences as a way to target deadly diseases that afflict children.

There’s a period of time between the discovery of a central principle of biological science, its fuller illumination, and its eventual translation into the tools of clinical medicine. This gap in time between the discovery of a basic scientific principle, and the adoption of new treatments based on the original insight, has been shrinking as science evolves. Primary ideas are more quickly reshaping how we approach illness. This simple truth is transforming human history.

But at the same moment that we’re shrinking the span of time between a basic science discovery and the creation of practical solutions for patients harnessed from these breakthroughs, the process for developing those opportunities is getting more costly and uncertain. In many cases, it’s also getting more prolonged.

This is especially true when it comes to new drug development.

We’re on an unsustainable path, where both the cost of drug development and the cost of the resulting medicines are growing enormously. We need to act now to make the entire process less costly and more efficient. Otherwise, we won’t continue to realize the practical benefits of advances in science, in the form of new and better medicines.

Even when we can afford to develop breakthroughs, more people will have a hard time paying for them if we can’t reduce the cost of drug development, and find better ways to capture those savings. The high cost of development also reduces competition. With the high costs, it can be less viable to develop some drugs, especially if you’re second or third to market. Yet we know this kind of competition lowers prices.

Also, because the costs are so high, sometimes a drug targeted to a less profitable circumstance might not be developed at all. This can increase overall healthcare costs by forestalling useful innovation. We need to reduce the risk and uncertainty that makes drug development increasingly costly. And we need to make sure that we have markets that are competitive, and that let us capture those savings in the form of lower prices.

At FDA, we’ll continue to take steps to bring more competition to the drug market, as one way to reduce costs and improve access. All of the efforts I’m going to discuss today are not just about advancing science that can treat illness. That’s a key part of our mission. But it’s also about making sure patients can have access to these opportunities.

The same policies that we pursue as a way to help advance the science of drug development, and make that process more efficient, also have to be geared toward making sure we can lower the cost of developing medicines. Those savings must be captured for the benefit of patients, and impact prices in a way that extends access to these opportunities.

We’ve already announced a series of FDA actions aimed at addressing the cost of medicine and access to drugs. We’re taking steps to boost generic competition, and will continue to build on those policy steps. We’ve also set our sights on the ways that branded firms sometimes game our rules to extend their drug monopolies past the point that Congress intended. We’re targeting these practices where we can. Soon, we’ll announce additional steps to address these same goals.

But we also need to address the other side of the continuum, where the costs start: With the high and rising expense of developing a novel drug.

It’s true that we’ve recently seen some targeted drugs move quickly through the development process. This is especially true in oncology. These are cases where the biological rationale for a treatment is firmly established, where proof of concept can be more easily obtained, and where the drugs are typically targeting significant unmet medical needs.

In these cases, the risks and benefits can be made obvious through more focused studies. The clinical trials to develop such a drug, and the time it takes for FDA to review the results, can unfold efficiently. This would suggest that drug development overall is getting more efficient.

But this experience is generally the exception. Even while some drug development programs move quickly, most take many years. The costs are also high, and growing. There’s been criticism of the various estimates of how much it costs to develop a new drug. But we know some drug programs can easily top $1 billion, just in direct outlays.

We also know that the average cost of developing a single new drug continues to increase at a pace that often dwarfs even the rate of increases in other healthcare costs. Take just one time period, between 2003 and 2013, and one estimate of these costs. Over that time period, the cost of developing a drug rose by 145 percent after correcting for inflation, according to the Tufts Center for the Study of Drug Development.

Moreover, on a relative basis, in many cases the cost of early-stage drug development has grown proportionally faster than the cost of late-stage drug development. In other words, inflation in early-stage drug trials is rising faster than inflation in late-stage development.

By front-loading the cost of drug discovery, the broader biomedical community is making it harder to advance new ideas. It’s economically harder to capitalize the cost of an early-stage drug program than to fund a later-stage project. So front-loading the costs is a recipe for reducing the number of new ideas that can be advanced.

When drugs succeed, they’re ultimately priced in a way that reflects a lot of factors, including their value to patients. But on some level, the prices also reflect some measure of the high development costs, and in particular, the cost of the capital that it took to develop them.

This is how the economics of our entrepreneurial model works. But that capital is fluid. It moves across different opportunities, whether it’s developing a new biotech drug, or a social media site like Twitter. At a time when people are rightly worried about the rising prices of drugs, and the impact on patient access, we also need to be thinking about these factors that contribute to the high cost of making new medicines.

That cost of capital isn’t just a reflection of the direct costs of doing the research to discover a medicine and the cost of running the clinical trials needed to test a new drug. In fact, in many cases, these direct costs are the least expensive consideration in the economic modeling that entrepreneurs shoulder when they’re pursuing a new drug idea.

That cost of capital is affected most by the risk of failure. As the risk of failure grows, entrepreneurs seek a higher potential return in order to support the initial investment. The cost of capital is also significantly affected by the anticipated time it will take to develop a new medicine.

In a dynamic and competitive market, the capital needed to fund a new drug program is always competing against the next best economic use of that money. In other words, any investment is measured against the risk-adjusted, projected return on the next best opportunity. It can be developing a different drug, or building the next Amazon.com.
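The risk-adjusted comparison described above can be sketched numerically. This is a toy illustration with entirely hypothetical figures — the payoffs, success probabilities, and discount rate below are invented for the sketch, not drawn from any FDA or industry model:

```python
# Toy sketch (hypothetical figures): how investors might weigh a risky,
# long-horizon drug program against an alternative use of the same capital.

def risk_adjusted_npv(payoff, p_success, years, discount_rate, cost):
    """Expected payoff, discounted for both time and probability of failure."""
    expected = payoff * p_success
    discounted = expected / (1 + discount_rate) ** years
    return discounted - cost

# Hypothetical drug program: big payoff, low odds, long horizon.
drug = risk_adjusted_npv(2_000_000_000, 0.10, 10, 0.12, 50_000_000)

# Hypothetical alternative venture: smaller payoff, better odds, shorter horizon.
alt = risk_adjusted_npv(300_000_000, 0.40, 5, 0.12, 50_000_000)

print(f"drug program NPV: ${drug:,.0f}")
print(f"alternative NPV:  ${alt:,.0f}")
```

With these made-up numbers, the alternative venture comes out ahead even though its headline payoff is far smaller — the point of the passage: higher failure risk and longer timelines raise the return a drug program must promise to attract the same capital.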

All of these factors affect how much investment we’re going to see in endeavors that advance human health. They also affect how the products that succeed are ultimately priced, and how we make sure patients have access to these opportunities. Access is a critical matter of public health concern. For these reasons, as a matter of public health, we must consider all of the factors that impact the cost of new products. That includes the cost of discovery and development.

Last week, I spoke about some of the steps we’re taking to make the process for the pre-clinical development of new technologies more scientifically advanced and efficient. We’ll have much more to say about other new steps we’re taking to achieve these pre-clinical goals. But today I want to talk about some of the steps we’re taking to address the clinical part of drug development -- the traditional three phases of trials.

To address these issues, our Center for Drug Evaluation and Research, under the leadership of Dr. Janet Woodcock, is taking steps to modernize its Office of New Drugs. The goal is to make sure that our workflows and policies are rooted in the best science and management principles, and that our staff has the support and tools they need to fully achieve their public health mission. I plan to talk more about some of these steps, later this month, at the National Press Club.

I’ll also advance a Strategic Policy Roadmap that will detail additional steps we’re pursuing. These, and similar, efforts are aimed, in part, at making sure that FDA is able to adopt the modern scientific tools we need to maintain the rigor of our programs. We need to make sure that our approach to regulation is efficient, and doesn’t become an obstacle to the translation of scientific discoveries into practical solutions for patients. We need to make sure that we’re using the best science so we maintain our gold standard for determining safety and benefit.

I’ll have more to say on these topics soon. Today I want to outline some of the new steps we’re taking when it comes to our policies related to clinical development. I want to focus on two efforts, in particular.

First, I want to address some of the steps we’re taking to modernize our approach to how we collect the clinical information that we use to make decisions about the safety and effectiveness of new drugs. And second, I’ll discuss other steps we’re taking to modernize how we evaluate that information. This includes improvements in the tools we use and parameters we adopt as a way to measure safety and benefit and properly evaluate new drugs.

I want to start with the first issue; the steps we’re taking to modernize the ways that clinical information is collected. FDA continues to advance the use of new tools and clinical trial designs. For example, we’re seeing wider use of adaptive approaches, which allow scientists to enrich trials for patient characteristics that correlate with benefits, or that help predict which patients are least likely to suffer a certain side effect.

This predictive information is valuable. It can be incorporated in a new drug’s label and help inform more careful prescribing.

As part of these approaches, we’re also seeing more use of combined-phase studies, what’s referred to as seamless trials. Instead of conducting the usual three phases of study, a seamless trial is one adaptive study whose phases are separated by interim looks. Using one large, continuous trial saves time and reduces costs. It also reduces the number of patients who have to be enrolled.
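The interim-look idea can be illustrated with a toy simulation. This is not a real trial design or FDA methodology — the arm names, response rates, and futility cutoff are all hypothetical — but it shows the mechanic: one continuous study pauses partway, checks accumulating data, and drops arms that look futile before enrolling further patients.

```python
import random

# Toy sketch of a seamless trial's interim look (hypothetical numbers).
random.seed(0)

TRUE_RESPONSE = {"arm_A": 0.40, "arm_B": 0.05}  # hypothetical true response rates
FUTILITY_CUTOFF = 0.20                          # drop arms responding below this

def responders(arm, n):
    """Simulate n patients on an arm; count how many respond."""
    return sum(random.random() < TRUE_RESPONSE[arm] for _ in range(n))

# Stage 1: enroll a small cohort on each arm, then take an interim look.
interim = {arm: responders(arm, 30) for arm in TRUE_RESPONSE}
surviving = [arm for arm, r in interim.items() if r / 30 >= FUTILITY_CUTOFF]
print("interim results:", interim)
print("arms continuing:", surviving)

# Stage 2: only surviving arms enroll further patients -- in the same trial,
# with no separate Phase II start-up -- saving time and total enrollment.
final = {arm: interim[arm] + responders(arm, 70) for arm in surviving}
print("final results (responders/100):", final)
```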

These methods are increasingly prominent in oncology drug programs. Under the leadership of Rick Pazdur, our Oncology Center of Excellence is taking steps to better evaluate and cultivate these new approaches as one part of our ongoing efforts to modernize our approaches.

Owing in part to these leadership efforts, we’ve seen more sponsors develop oncology drugs that forgo the conventional three sequential phases of drug development. They opt instead for seamless approaches. Under these trial designs, they’ll typically add cohorts to a first-in-human trial to investigate doses and activity in a variety of cancers.

These methods have been used with some of the newer immunological therapies. They may also be used with other drugs targeted against specific molecular defects. Seamless designs are particularly advantageous for drugs that work in a variety of diseases, allowing rapid evaluation of the drug and potential approval under our accelerated approval pathway. These new approaches are also highly consistent with the goals of the 21st Century Cures Act and the recently passed FDA Reauthorization Act.

We’ve seen examples where this approach has allowed the rapid development of drugs in multiple different tumor types. If we had to stop and start formal Phase II trials in each different organ system where a cancer arose, it could have been a protracted process. This approach is well suited to the kinds of drugs that are being developed now, where drugs intervene on common elements found across multiple kinds of disease states. At FDA, we’ve identified more than 40 active commercial investigational new drug applications for large first-in-human oncology trials alone that use these seamless strategies.

Other concepts that we’re seeking to better adapt to clinical development, using new guidance and policy work, include common control studies and the wider use of large simple trials.

We’re also advancing the use of ‘Master Protocols’ to enable more coordinated ways to use the same trial structure to evaluate treatments in more than one subtype of a disease or type of patient.

This approach is particularly relevant when it comes to targeted drugs. These are drugs that may intervene on markers that are relevant across many different disease subtypes. We may, for example, want to evaluate these different targets simultaneously, as part of one large study. This could give us a better way to understand the comparative benefits of a drug across different settings. To enable these master protocols, it’s often important to do molecular patient screening. This can lead to the development of a diagnostic that can also be used to guide patient care.

We’ll be talking more about all of these concepts and initiatives. We need to be mindful that these and similar new strategies also bring new uncertainties. These new uncertainties can create new risks. So, as in all of our regulatory efforts, we must also take additional steps to make sure that we’re protecting patients and ensuring drug safety.

For example, with adaptive and seamless clinical trial approaches, informed consent used in clinical trials must be updated as the trial progresses, not only to reflect new safety data, as is already standard practice, but also to incorporate data on the evolving view of efficacy.

First-in-human trials with expansion cohorts may also encompass an entire drug development program in a single trial. So potentially important regulatory interactions, such as the standard guidance meetings held at the end of Phase II and before the initiation of Phase III trials, may not automatically occur. Comparable regulatory milestones need to be built into the new seamless clinical trial process. We need to ensure we provide comparable interactions and oversight.

The Agency also needs to engage in more communication between sponsors, investigators, IRBs, and other stakeholders involved in the development program. This is not a “business as usual” approach. It may require a much more iterative process, with greater communication between all of the stakeholders involved in the clinical trial processes.

On the second point that I wanted to highlight today, we’re also taking new steps to modernize how sponsors can evaluate clinical information, and how FDA reviews this data as part of our regulatory process.

This starts with better use of more advanced computing tools, and more sophisticated statistical and computational methodologies, as part of the drug development and the drug review process. This includes more widespread use of modeling and simulation, and high performance computing clusters inside FDA.

FDA already has high performance computing clusters. These tools help us develop more sophisticated methods for evaluating the data that’s submitted to us from clinical trials. The computing tools also enable us to properly evaluate the more sophisticated components that are submitted to us as part of product review applications.

I’m directing an effort to increase our investment in these computing tools. They’re an increasingly important part of our work. But access to them at FDA is limited. We’re taking new steps to make sure our review staff has more access to these computing platforms.

These tools are especially important to our use of modeling and simulation as a part of drug review, not only in our Division of Pharmacometrics, in the Office of Clinical Pharmacology, but across our review program. Almost 100 percent of all new drug applications for new molecular entities have components of modeling and simulation.

Typically, these modules are focused on similar uses. They include modeling dose response as a way to better evaluate the safety and efficacy of different doses, and to help select the optimal dose for the general population or subgroups. They also include methods to estimate a new drug’s effect size to develop the appropriate sample size for pivotal trials. Finally, modeling and simulation is also commonly used to evaluate the reliability of endpoints, such as helping to demonstrate a relationship between a biomarker and a clinical endpoint.
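The effect-size-to-sample-size modeling mentioned above can be illustrated with the standard two-sample normal approximation for statistical power. This is a textbook sketch, not FDA review tooling; the effect sizes passed in at the bottom are arbitrary examples:

```python
# Minimal power-calculation sketch: patients needed per arm to detect a
# given standardized effect size in a two-arm trial (normal approximation).
from math import ceil
from statistics import NormalDist

def n_per_arm(effect_size, alpha=0.05, power=0.80):
    """Patients per arm for a two-sided test at the given alpha and power."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # ~1.96 for alpha = 0.05
    z_beta = NormalDist().inv_cdf(power)           # ~0.84 for 80% power
    return ceil(2 * ((z_alpha + z_beta) / effect_size) ** 2)

# A smaller projected effect size demands a much larger (costlier) trial,
# which is why a good effect-size estimate matters before a pivotal study:
print(n_per_arm(0.5))  # moderate effect -> 63 per arm
print(n_per_arm(0.2))  # small effect -> 393 per arm
```

The design point: quartering the effect-size estimate roughly sixteen-folds the required enrollment, so even modest improvements in effect-size modeling can substantially shrink pivotal trials.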

We’re going to be making more advanced use of these and similar tools as one part of our overall efforts to make our review process more efficient and scientifically advanced. Among other things, we plan to convene a series of workshops, publish guidance documents, and develop policies and procedures for translating modeling approaches into regulatory review. We’ll also be conducting a pilot program on these approaches. As part of this pilot, FDA will grant meetings to participating sponsors who use these approaches, as a way to provide more collaboration on model-informed drug development issues.

Our goal is to see how we can accelerate methods that improve our ability to use advanced tools to meet FDA’s gold standard for regulation.

These methods are being applied to both common and rare diseases. FDA is also collaborating with scientists to use similar computational tools to develop natural history models, based on placebo arms in Parkinson’s disease, Huntington’s disease, Alzheimer’s disease, and muscular dystrophy. If we’re able to make better use of rigorous, reliable natural history models, especially for rare diseases, it can help make clinical trial recruitment more efficient.

We’re also using similar methods to develop predictive algorithms, like those found in applications of artificial intelligence. To take one example, we’re developing classification algorithms in lung cancer, as a way to enable machine reading of CT scans based on widely accepted standards. These tools are still in development. But they can eventually be used to classify tumor dynamics like response to treatment, and increase the precision of our assessments. All of these advances help make the development process more efficient, and hopefully, less costly.

Once we design and publish these algorithms, depending on their stage of development, they can either be incorporated into clinical studies as an exploratory endpoint for validation or, if already validated, as means of helping to better inform the results of a primary endpoint.

Additionally, to better delineate how we’re going to approach the overall development and evaluation of drugs targeted to certain unmet medical needs, we plan to begin work on at least ten new disease-specific guidance documents over the next year. Some of these documents are already underway. Among the diseases we’re targeting are areas of significant unmet need like Amyotrophic Lateral Sclerosis (ALS).

This latter guidance is an outgrowth of a scientific document developed by the ALS Association that laid out some relevant principles. That document was developed with funds that the ALS Association generated from their Ice Bucket Challenge. FDA is grateful for their efforts, and the agency is responding with its own version of that guidance document.

These are just some of the steps we’re taking to make the drug development process more scientifically modern and efficient. We need to make sure that we’re giving our review staff access to the best scientific tools and opportunities to advance their work.

It’s also the case that drug development doesn’t stop with drug approval. We can and should learn a lot more about how treatments work in actual patients in real world settings, so that patients and payers are getting the highest possible value for their money.

In all these things, we need to make sure our own policies and approaches are keeping pace with the sophistication of the products that we’re being asked to review, and the methodologies being brought to these endeavors. Our work with high performance computing, and simulation, is a good example of an area where we need to make sure our methods match the sophistication and resources of the tools and approaches being adopted by sponsors. If we fall behind, it’ll be harder for innovators to bring forward novel products and approaches. It could also make it more challenging for FDA to adopt the most efficient and effective scientific methods, and continue to meet its high standards.

We need to do these things to make sure we’re providing an efficient path for the translation of cutting-edge science into practical treatments that are going to benefit patients. We need to do these things because the rising cost of drug development is unsustainable.

Unless we find ways to modernize how we approach our work, and make more efficient use of our resources, we’re going to get fewer medicines, and higher costs. We’re not going to realize the benefits of the scientific advances we’re seeing as quickly, if we see them at all.

Most of all, we have to do these things to make sure that we’re efficient in bringing new scientific opportunities to patients who need them. And that we actively uphold FDA’s gold standard for safety and effectiveness.

Thank you for the opportunity to be here today.
