Speech by Norman E. "Ned" Sharpless
Thank you, Gov. Castle. Good afternoon, everyone.
I want to thank Research!America for convening this important forum to highlight key issues we face in medical science today,
but also for its longstanding leadership in recognizing the need for quality research to improve medicine and health.
The Value of Research
I started my career as an academic biomedical researcher and a physician who treated patients with blood cancers for many years.
Given my background, I can’t think of anything more critical for medical progress than making sure that we have the best research producing the best data needed to develop new treatments for our patients.
My formative experience in this regard occurred when I took a year between my second and third years of medical school to participate in the HHMI Research Scholars Program at NIH, where I studied AIDS dementia.
That experience helped me appreciate the art of generating and analyzing research data, data that I generated with my own two hands.
It fostered a real love of science.
Within a few weeks of starting in the lab back then, I was already telling people that my time at NIH would be one of the best years of my life, which turned out to be true.
It also fostered my belief in the opportunities for doing good through science and medicine.
So not surprisingly, when I left the lab back then, and then finished medical school, and then did all my clinical training, and then became a medical oncologist, well, I quickly became frustrated.
Taking care of patients with cancer is an immense privilege, and I really have always loved the practice of medicine.
But no matter how much satisfaction I gained from clinical oncology, I was struck back then by how limited our options were for those patients.
Our drugs didn’t really work, at least not for most patients. That was clinical oncology in the 1990s.
This was very frustrating for the medical oncologists, and imagine how the patients felt.
That fact forced me to return to the lab – so I could do better for my patients.
I had to leave clinical oncology to become a molecular biologist and a mouse geneticist.
What a demotion that seemed like at the time.
Going back to the lab allowed me to focus on research and producing the science that we need to prevent disease and diagnose and treat our patients.
That effort is really what it’s all about: our commitment to the search for reliable and rigorous data, so that we can find answers for our patients who are looking for a treatment, a cure, or sometimes just the hope that one is on the horizon.
We are privileged to be a part of an extraordinary time of scientific progress that offers enormous transformational opportunities in medicine.
I’ve been particularly lucky in this regard, since I would argue that no area of medicine has seen more exciting progress than the treatment of cancer.
Those bad old days from the 1990s I described earlier are gone now in medical oncology.
We have seen a pace of progress in cancer in the last two decades that has been breathtaking.
Take a disease like malignant melanoma. When I was a fellow at the DFCI in 1995, this was the worst cancer imaginable.
Metastatic disease was terrible, nothing worked, and most patients were dead within a few months of diagnosis.
We made virtually no progress in this disease for decades despite hundreds of failed clinical trials – until around 2011.
With a new biological understanding of this disease and the immune response to cancer, we developed several effective therapies that all provided meaningful benefit for patients.
Over a five-year period back then, FDA approved five new melanoma drugs with remarkable activity.
Now, what was once the worst cancer imaginable has gone from a nearly 0% long-term cure rate to a cure rate above 60%.
The majority of people today who develop metastatic melanoma will be cured of it.
That’s remarkable progress based on basic science and clinical research.
To be clear, we still have a ways to go in cancer. It is perhaps the leading cause of death in this country, and there are still far too many people, especially children and young people, dying from cancer.
But nonetheless, one has to admit that the progress in this disease (or collection of related diseases) has been amazing.
And it is clear to me, that to make additional progress in cancer, we need more and better research.
And that view is not restricted to cancer, but also applies to all other areas of medicine.
In fact, I would argue that research is needed even more in other areas of medicine.
Our progress in certain other disease areas has not been as good as we’ve seen in cancer.
Just take the A’s: Alzheimer’s, antimicrobial resistance, arthritis, ALS, and aging. All areas where we need to see more progress, and that’s just one letter of the alphabet.
Which brings me back to this session.
Developing New Sources of Data
The title of the session itself, “Leveraging Data to Accelerate Medical Progress,” captures two of the most critical issues we face in the medical research community generally, and at the FDA specifically.
First, what types of data do we need? And how do we get those data? What methods do we use? We need approaches that are efficient and cost-effective and, for clinical research, ethical and respectful of patients’ right to privacy.
Second, how can we best use these data and their downstream technological advances to speed the development of treatments and cures to help patients?
At a time of limited resources, and enormous scientific challenges, we all want to make the most of the resources and opportunities at hand.
We want the maximum payoff for our efforts.
This rings especially true when it involves data science, where it’s crucial that we bring novel analytical techniques to bear.
Nothing seems more frustrating to me than having a huge data set, knowing the answer is in there somewhere, and not having the appropriate modern analytical capabilities to find that answer.
By gathering better quality data and mining that data in new ways, we can more effectively make scientific progress, find additional answers, develop new products, and, ultimately, help more people.
Few places depend more on high-quality data than the FDA.
It enables us to support scientific innovation and fulfill our unique role to help scientists and developers turn their vision for scientific advancement into reality for patients and consumers.
It helps us meet our responsibilities to ensure that these products are safe and effective for their intended use.
And high-quality data are needed throughout a product’s life cycle, from pre-market development to clinical trials, to post-market safety surveillance.
But to fulfill these responsibilities, we must be able to integrate the increasing wealth of available data into effective regulatory decision-making.
Data itself, while useful, is not necessarily transformative.
We must turn it into smart data that is usable to connect cutting-edge scientific discoveries to the real-world products and solutions that make a meaningful difference in people’s lives.
At the FDA, we’re working with researchers who are innovating in this area in a number of ways:
- establishing new linkages between complex and diverse data sets;
- harnessing real-world data; and
- using novel analytical approaches
-- all in the name of enhancing innovation and providing better information to those who need it to make medical choices.
As an example of novel data aggregation, the Center for Devices and Radiological Health (CDRH) at FDA has been working through a public-private partnership to develop the National Evaluation System for health Technology (or NEST), which merges several disparate sets of health systems data to allow studies of device safety and efficacy in real-world use.
We’re continuing to speed development of effective therapeutics by promoting innovative clinical trial designs such as platform trials, basket studies, adaptive trials, and pragmatic randomized controlled trials.
These designs can be more efficient, and can help lower costs and speed accrual.
And we’re also continuing to use the expedited pathways and designations Congress established for drugs, biologics, and devices, such as Fast Track, Breakthrough Therapy, Accelerated Approval, and Priority Review, to help develop and speed the review of products that treat serious conditions and fill unmet medical needs.
We’re not abandoning the traditional ways that researchers and FDA have been collecting and evaluating data.
Rather, we want to build on and strengthen that hierarchy, particularly as the quality and reliability of new forms of evidence grow and the methodologies for evaluating them improve.
And of course, increasingly, we are learning from the input of our patients.
We’re also building new structures to encourage and support multidisciplinary research collaborations across the FDA.
A well-known example is the Oncology Center of Excellence at FDA, which coordinates cancer activities from disparate parts of the agency to allow for faster and more efficient review of cancer applications.
But we have similar plans for other areas of the FDA that need trans-agency cooperation; for example, in products utilizing artificial intelligence or advanced manufacturing techniques.
One key technical question for these trans-agency efforts is how to use cloud computing and storage at the agency.
We’ve jumped into cloud computing in a big way, already using this technology in innovative ways, allowing sponsors and reviewers to exchange messages and datasets in real-time.
Cloud computing is part of the broader question of IT infrastructure and data storage and use at the FDA, which presents some challenges for the agency.
I’m pleased to say that in just a few weeks we’ll be announcing a new plan for modernizing FDA’s technology infrastructure.
This will have many advantages, including allowing us to support increasing use of real-world evidence (RWE) and other new types of data in our regulatory decisions.
Striking the Balance Between Speed and Safety
So, as you can see, the FDA is committed to advancing new approaches for gathering data.
This leads me to a final point today: the critical role FDA plays at the nexus of leveraging data for speedy development of new products and always ensuring the integrity of both the data and the process.
Everyone here understands that we are being pushed to approve newer, better, safer, more effective medical products – and to do so more quickly.
The key, of course, is to do so while maintaining FDA’s gold standard for safety and effectiveness.
This requires striking a balance. And the balancing point must be the health and well-being of the patient.
Unfortunately, finding this balance – or in some cases explaining it – can be problematic.
On one hand, it’s rare to find anyone who is searching for a new treatment for their disease complaining that FDA is moving too fast.
On the other hand, there are some who believe that faster approvals using some of the new approaches must involve lower standards.
Nothing could be further from the truth.
I believe we can have it both ways: faster approvals for our patients while preserving the standards of safety and efficacy.
What’s often misunderstood in this discussion is how much science has changed and how much more we know about developing effective treatments.
Take another example from cancer: when I started practicing, randomized, blinded, placebo-controlled drug trials were the standard.
Patients hated this, for obvious reasons. But we did it because we didn’t have any credible alternatives.
Over the years we learned a great deal about the biology of the disease and how to treat it, and made a lot of progress. In short, we developed a new paradigm.
And we came to understand that we did not need placebo controls to make progress in many areas of cancer research (and we just put out final guidance to that effect last week).
We still use placebo controls for certain types of trials (for example, in supportive oncology), but it is rare for a therapeutic endpoint.
We found we could make progress in cancer using other approaches.
And this is good news for patients. No one with pancreatic cancer wants to get randomized to a sugar pill.
It’s also more efficient: consider, for example, some pediatric cancers, where there just aren’t enough patients to effectively use traditional randomized trials.
So, we can use smaller trials with innovative designs.
And we’ve seen some enormous successes in cancer using these approaches, with impressive progress in melanoma, as I mentioned, but also in lung cancer, breast cancer, pediatric leukemia, and many other cancers.
The goal is to replicate this success in other disease areas. That’s been challenging in some areas, for example, neurodegenerative diseases, which continue to prove elusive to treat.
We don’t yet have an adequate understanding of the complex pathophysiology of these complex diseases.
All of which leads me back to where I started, and to where Research!America comes in: the critical importance of more biomedical research yielding good data.
And here let me dwell for a moment on what I mean by “GOOD” as opposed to “BAD” data.
Every day there’s new research that offers great promise. But with this desire for rapid progress comes a risk: the potential for taking a shortcut with the FDA, by collecting not GOOD but BAD data, and then submitting this “BAD” data to us in support of a medical product.
Data can be bad for a few reasons, but two important causes are sloppy, slipshod data collection and outright data falsification.
These can be difficult for the FDA to tell apart, because they differ in intent; that is, whether data are bad because of carelessness or because of fraud.
But from one point of view, it does not matter that much whether we are dealing with incompetence or malfeasance, because either one can lead to a regulatory decision that harms patients.
Let me specifically comment on the problem of data fraud: I learned long ago, when I was in academia and serving as an editor on the Journal of Clinical Investigation, that submission of false data could be a problem.
Later, as director of NCI, I saw some people mislead us in grant applications.
So it should not be surprising to me now at FDA that if people will lie to get their paper published or their grant funded, then people will also lie to get a billion-dollar medical product approved.
We simply cannot tolerate deception of any kind.
Now, I do not wish to imply this problem is rampant, or even that it is increasing in frequency (in fact we have no evidence of that).
But we do see data fraud at FDA, and more than I would have expected prior to coming to the agency.
And while these instances are rare, they are significant, because even a few examples can damage public confidence in the approval process.
Today’s modern FDA – the nimble, effective, fast FDA that everyone desires – requires that those who submit applications and data be truthful. When someone lies or falsifies information, that paradigm breaks down.
If someone comes to the FDA with data that is inaccurate, or if they submit an application that contains a false claim, it undermines the search for a treatment or cure, violates the public trust, raises costs, exposes people to needless therapies, gives science a bad name, and most importantly, it’s bad for patients.
At the FDA, we don’t have the resources to check every aspect of every bit of research. We have to trust sponsors at some level.
But we will be vigilant concerning the accuracy of the research we review.
And when we do identify data fraud, we will use the full range of our authorities to address it, including civil and criminal penalties.
I would argue that good data is the product of a good research culture, and building this culture is the work of the scientific community.
And this is one place where Research!America can clearly be of help, by continuing your work to promote a scientific culture that values data integrity to protect patients.
I know this is an area about which you care deeply.
Thanks for the invitation to speak today and I look forward to working with you in this area.