
Speech | In Person

Speech by Commissioner Robert M. Califf to the House of Medicine
June 16, 2023

Speech by
Robert M. Califf, M.D., MACC

(Remarks as prepared for delivery)

Good morning.  Thank you for inviting me to be with you today to speak on a topic that poses one of the most serious threats to public health today – the continuing and growing challenge of medical misinformation and disinformation and what the FDA is doing to help respond to it. 

As you are aware, information is a critical aspect of public health.  Throughout my career, I’ve been concerned with the ways that those of us in the medical profession use information to make decisions.  It’s been central to my search for new and better ways to collect data and generate evidence to better inform and strengthen the work we do.

My appreciation for this potential was advanced further during my recent work with Alphabet, where I was just prior to returning to the FDA.  There I saw first-hand some of the enormous technological innovations now being put into action – ways in which information technology and social media can be combined and used positively.  But that experience also gave me a ringside seat to the enormous growth in the use of misinformation and the harms it can cause.

At the FDA, producing fact-based information on which the public can rely is central to our work.  One of the most basic responsibilities of the agency is to disseminate facts about science and medicine to the public to help Americans make informed choices about their health.  This duty is reflected in our mission statement, which in relevant part states that we “help[] the public get the accurate, science-based information they need to use medical products and foods to maintain and improve their health.”

The FDA has long communicated authoritative and reliable information to the public about various public health issues related to medical products, food, and tobacco – outbreaks, diseases, treatments that are safe and effective, and products that are not what their manufacturers claim, among other responsibilities. 

But more recently, our approach and our effectiveness have been dramatically altered by the digitization of our world and the ability to accumulate and share resources and transcend previous boundaries via the Internet.  

Make no mistake, there are many pluses related to digitization and other changes in how we communicate.  For example, these changes are expanding opportunities to generate reliable evidence and to disseminate evidence and knowledge to almost all 340 million Americans.  And it doesn’t stop there.  In fact, we can reach most of the eight billion inhabitants of the world, almost all of whom now have, or soon will have, cell phone access.  We now realize that under the old system, which depended on communication through learned intermediaries, many people were left out simply because they had no access.

But the changes also have had a negative impact on how we communicate and share information.  The FDA today must compete with many other voices (some of which have no expertise in the topics on which they opine) in a 24/7 stream of information and, far too often, unsubstantiated, false, or misleading assertions.  These developments affect the work of the FDA (as well as other government agencies) and our effectiveness in advancing our mission to apply science to protect and promote the health of patients and consumers.

That’s no accident.  The disharmony of noise and unverified or misleading assertions, the torrent of misinformation, and, too often, disinformation that we see today is often intended or designed specifically to undermine and erode trust in science, scientists, and expert agencies such as the FDA. 

When the term “misinformation” first gained prominence, it was often labeled a “trend” in communications, a natural byproduct of the evolution of social media.  It has become increasingly clear, however, that this trend not only is not likely to fade away, but also has evolved into a full-fledged crisis.  

Indeed, the use of misinformation (and disinformation) has accelerated along with the growth of social media and social networks that increasingly drive communication based on social identities. It has become equally clear that there is no simple fix to countering and overcoming this misinformation crisis; rather, it will require a creative, sustained and broad-based collaborative effort, which is what I’d like to talk to you about today.

Misinformation affects virtually every corner of our society, from the interpretation of history to politics, from advertising to education.  Today, I want to focus my remarks on the impact misinformation has on the medical and science community and its impact on public health.  I do this not just because I bring the perspective of the FDA, but because of this issue’s importance to overall public health and well-being, and the key role all of you play in shaping that.

The preponderance and dissemination of medical misinformation is already having a significant negative impact on health outcomes, causing people to make plainly uninformed and adverse choices regarding their health. We see it across the spectrum -- from continuing use of tobacco products and vaping, to failure to use effective medical treatments, to eating an unhealthy diet.  Each of these decisions is made in the face of definitive facts that make clear these actions are harmful.

Perhaps the most recent and clearest example of the harm that comes from medical misinformation involves vaccines, and specifically COVID-19 vaccines.  It is one of the greatest (and continuing) tragedies of the pandemic that even after scientists developed vaccines against the COVID-19 virus in record time, and the FDA speedily authorized them after finding them safe and effective, hundreds of thousands of people died from this disease.  Staying up to date on vaccinations and being treated with an authorized or approved antiviral would have reduced their risk on the order of 80 percent, with no cost impediment.

The attacks on the integrity of the science and clinical research that led to the development of the COVID-19 vaccines are just the tip of the iceberg.  Vaccine opposition and hesitancy are not new phenomena, of course; but they have become far more dangerous as vaccine opponents have been able to disseminate their misinformation and disinformation more widely and quickly across the Internet.  The cumulative effect of these factors is an unprecedented decline in the life expectancy of Americans, which is now almost five years shorter than in other high-income countries.

Indeed, the consequences of misinformation are linked to a broader problem – the worrisome pattern in which our scientific knowledge and technological abilities continue to advance rapidly while our health outcomes stagnate or decline.  We are not successfully translating our knowledge into actions that result in better health.

Furthermore, this increasing divergence between the pace of knowledge generation about fundamental science and our translation of that science into policies and clinical strategies that can improve health is amplifying disparities among different portions of the population.  Demographic factors that play a role in the inequitable provision of health care generally – race, ethnicity, wealth, age, and education, as well as where someone lives – all can exacerbate the impact of misinformation and disinformation.  This is compounded because the information ecosystem is often linked with social determinants of health in ways that make these groups extremely susceptible to manipulation by people trying to sell adverse choices through persuasion and social identity.

We all know, as a matter of common sense, that lies or false explanations spread faster than the truth.  Those who tell falsehoods aren’t burdened with the need to be consistent with other information or, for that matter, even logical. This simplistic dynamic is only magnified on the Internet, which is well positioned to take advantage of the public’s lack of understanding of diseases and treatments, not to mention their fears.  As we saw at the outset of the pandemic, public anxiety and confusion led people to scour the Internet for answers, a demonstration of what might be called the magic potion character of the Web.

Unfortunately, those panic-induced searches can surface everything from well-meaning but quite dangerous medical advice to outright false claims rooted in more devious behaviors.  There are many reasons for this maliciousness, from a broad-based desire to sow distrust in the government to a basic profit motive among those who produce so-called alternatives to safe and approved treatments.

We know that it doesn’t take much to be a catalyst for misinformation. Consider the study done near the beginning of the pandemic by the Center for Countering Digital Hate, which found that just 12 people were responsible for 65 percent of the misleading claims and outright lies about COVID-19 vaccines that proliferate on Facebook, Instagram and Twitter.  The greater problem, however, is what this leads to -- the difficulty of challenging, eliminating, preventing, or debunking false information, especially when that information is spread electronically.  

All of this is particularly disheartening for us as scientists, since we were trained to understand that science and data, while not infallible, give us the most reliable information upon which decisions can be based.  We know that although the scientific process is designed to apply dependable evidence to reach informed conclusions, that process is a dynamic one.  We are always searching for even better evidence.  Nonetheless, at the FDA, when we reach a decision, we have confidence that we have done our best to base that decision on all of the available evidence.  It’s also the case that, while imperfect, our decisions can and should be relied on by the American public as the best available – the product of a time-tested, systematic approach.

Unfortunately, thanks to misinformation, that basic chain of trusted communication no longer always holds.  The sharing of misleading resources, including so-called evidence that may have some semblance of legitimacy, is increasingly the rule, not the exception.

Danielle Allen, a Washington Post contributor and Harvard professor, recently made a point that I found compelling.  Her view was that social media has knocked a pillar out from under our democratic institutions by making it exceptionally easy for people with extreme views to connect and coordinate. It turns out that the designers of the Constitution thought geographic dispersal would put a brake on the potential power of dangerous factions. But people no longer need to go through political representatives to get their views into the public sphere.  

A similar phenomenon is affecting the transmission of medical and public health information.  Purveyors of misinformation have always existed in our country, and an anti-science faction has long been a part of our culture.  But in the past these people were limited to constrained dissemination mechanisms – magazines, newspapers, or AM radio – all limited by geography.

We have to ask the question that Dr. Allen asked: in the absence of these limitations, what are the mechanisms that we as a society can develop to fill this gap and act as a brake on faction or misinformation?

There’s another important factor: these false communications often have a connection to a political agenda or a cultural identity, and may be tied to the delegitimization of a particular scientific fact or conclusion because that fact or finding in some way supports a political approach or cultural identity that plays into people’s deepest fears.

So we have clearly identified the problem.  But finding a solution is a bit trickier.  Where then does the medical community fit in?  And what are we doing at the FDA to deal with this situation?  

First, we are working to improve our general approach to scientific communication with the public.  This includes more frequent scientific communication, but a major component is also improving communication directly with the public, professional societies, and advocacy groups, paying close attention to the best language for each particular audience.

Second, when issues arise that either require decisions or complex actions like recalls or warning letters on tobacco, we routinely consider the opportunity for “pre-buttal”.   Anticipating the counter-arguments is an essential part of preparing even the most routine scientific decisions these days.  

Third, when misinformation appears in a way that we can see is having an impact, we do everything possible to rebut it quickly, before it is broadly disseminated.  One of the things we’ve done at the FDA is to start a “rumor control” page on our website.  It’s designed to address the rapid spread of false and potentially harmful information by providing specific facts about individual topics that may be trending.  For example, the currently featured topics on our page are COVID-19, sunscreen, and dietary supplements.  The page has several other important features as well, including general health information for consumers, resources about the FDA and what the agency does, and, perhaps most importantly, some basic tools to help stop the spread of misinformation, including ways to report it online.  The site also includes consumer-focused videos that we produced on how to identify and help stop the spread of health misinformation.

Fourth, we continue to work on our interactions with the media, since those interactions touch on all three elements I’ve already mentioned.  For example, it is not unusual to see a headline, designed to draw readers to an article, that does not represent the body of the article.  It’s important that we are responsive to media inquiries, providing truthful and scientifically reliable information to inform their work.

Fifth, we have commissioned the Reagan-Udall Foundation to help us develop a long-term strategy for dealing with this issue.  We expect the report to be available later this year.  The foundation is working with multiple parts of the ecosystem to get input on that strategy.

One thing we are NOT doing is suppressing free speech.  The First Amendment is fundamental to the fabric of the U.S.  But institutions can respond to misinformation, both to correct the record and to add to the vibrancy of public conversation.

Finally, we are actively working on the issue of the use of AI and large language models in the production and promulgation of misinformation.

Ultimately, what is within our control is the quality of the independent scientific work being done to support good decision-making in a polarized environment.  We open the door for people to misrepresent our results and conclusions when we don’t have the highest quality evidence for decision-making.  As I have said previously, the controversy generated by our decision-making process is inversely proportional to the quality of the evidence used to make the decision.

So, what then can we do together?  And by “we” I mean all of us, since this challenge can’t be solved by the FDA or any one government agency alone, or even by multiple government agencies working together.  Indeed, because so much misinformation and disinformation is designed to fan the flames of distrust in the government, countering it will require the involvement and leadership of those outside government, including the medical profession, universities, health systems, and industry.  It must come via the personal connections, networks, and affiliations that are the strength (or weakness) of the Internet and social media, and which can lend credibility to our findings.

As scientists, it’s disappointing that we have to think about this aspect of our work.  But this topic can’t be avoided in today’s world of public policy and public health.  And the medical community has a particularly important role to play – to be a powerful force for change – through the trust and respect you command and the opportunity you have to transmit reliable information to patients and consumers.  Study after study shows that personal connections – connections in which the listener trusts the speaker – make an enormous difference in getting people to believe accurate information (or to doubt disinformation).

We need to rebuild this trust.  Then we can continue our efforts to convey the results of our scientific, public health, and regulatory work in words that can be understood and embraced by the public.

Thank you. 
