
IV. Enhancing Safety and Health Through Informatics

Table of Contents: Advancing Regulatory Science for Public Health

FDA houses the largest known repository of clinical data — unique, high-quality data on the safety, efficacy, and performance of drugs, biologics, and devices, both before and after approval. But despite the availability of these data, questions about subpopulation responses and underlying placebo effects remain unanswered. FDA data could be used to address fundamental questions about patient subsets who respond in varying ways to new therapies, or for whom a drug is more or less safe. But we lack the infrastructure, tools, and resources to organize and analyze these large data sets across multiple studies and data streams. In other words, we have a valuable library full of information, but no indices or tools for translation.

What has FDA done?

FDA has been investing in the infrastructure necessary to receive study data electronically and to develop an environment conducive to analysis of large data sets. Creating a scientific computing environment in which multiple studies can be compared and analyzed requires data harmonization and standardization so that comparisons across studies can be made effectively. FDA has participated in efforts and initiated pilot projects to begin aligning its systems with the Health Information Technology (HIT) standards that are part of the national effort to develop a system to support electronic health records. In a world of accumulating data and information, FDA continues to pursue systems and approaches to house and analyze its vast data stores and the increasingly complex data arriving daily. As noted below, engagement in standards development is one of several efforts devoted to health information technology for assessing products both pre- and post-market.
 
• Clinical Data Interchange Standards Consortium (CDISC)
In 1997, CDISC began as a group of 25 individuals, including representatives from FDA, pharmaceutical companies, and vendors. These volunteers came together to support the development of standards that could be used in clinical trials to improve the ability to collect and analyze data across industry and academia, without bias toward any one sponsor or organization. These standards enable better and more efficient analysis of safety and effectiveness data and faster evaluation of important new medications. CDISC has evolved into a not-for-profit organization with hundreds of participants from around the globe and has produced a number of data standards, including the Study Data Tabulation Model (SDTM), a standard used for submitting electronic safety and clinical data to FDA. (In November 2009, FDA announced that it would accept submissions in SDTM version 3.1.2.)
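As an illustration of what standardized submission data look like, the sketch below builds a tiny, hypothetical fragment of an SDTM-style Demographics (DM) domain in Python with pandas. The study identifier, subject values, and the choice of pandas are assumptions made for illustration, not a prescribed FDA format or tool.

```python
# Illustrative sketch only: a hypothetical fragment of an SDTM-style
# Demographics (DM) domain held as a pandas DataFrame.
# The study identifier and subject values are invented.
import pandas as pd

dm = pd.DataFrame(
    [
        ("STUDYX01", "DM", "STUDYX01-001", 54, "F", "Drug A 10 mg"),
        ("STUDYX01", "DM", "STUDYX01-002", 61, "M", "Placebo"),
    ],
    columns=["STUDYID", "DOMAIN", "USUBJID", "AGE", "SEX", "ARM"],
)

print(dm)
```

Because every sponsor organizes the same domains and variable names in the same way, reviewers' tools can read and compare datasets from different submissions without study-specific reformatting.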

What can FDA do with increased investment in regulatory science?

The vast data stored at FDA must be transformed into a harmonized format and organized in a common database so that it can be queried by topic and analyzed to address key questions. These goals require investments in informatics hardware and software and the development of standardized data models for relational databases and scientific computing.

With a common platform in place, scientists could take advantage of existing historical data as well as the new data coming into FDA every day. For example, we would be able to look at 15 studies of HIV drugs at once to analyze which drugs work best for which patients and when, or use the data to detect a new or rare safety risk by capturing safety information from millions of medical records in months instead of years. These types of analyses will not only enhance review by applying lessons learned from one study to another, but also provide valuable information about diseases and therapies, along with unprecedented insight into the mechanisms that govern their successes or failures. These insights will benefit FDA and the biomedical and healthcare community at large, enabling physicians to make more informed decisions about the optimal use of FDA-regulated products.
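As a rough sketch of the kind of cross-study query a common platform would make routine, the snippet below pools several standardized study datasets and summarizes a response rate by treatment arm and age group. The file names, column names, and outcome variable are hypothetical and stand in for whatever harmonized schema such a platform would actually use.

```python
# Illustrative sketch: pooling standardized datasets from several studies
# and summarizing an outcome by treatment arm and patient subgroup.
# File names, column names, and the RESPONSE variable are hypothetical.
import pandas as pd

study_files = ["study01.csv", "study02.csv", "study03.csv"]  # hypothetical

# Each file is assumed to share the same standardized columns:
# STUDYID, USUBJID, ARM, AGEGRP, RESPONSE (1 = responder, 0 = non-responder).
pooled = pd.concat((pd.read_csv(f) for f in study_files), ignore_index=True)

# Response rate by treatment arm and age group across all pooled studies.
summary = (
    pooled.groupby(["ARM", "AGEGRP"])["RESPONSE"]
    .agg(n="size", response_rate="mean")
    .reset_index()
)
print(summary)
```

The same pattern scales from a handful of trials to the 15-study HIV example above once the underlying data share a common standard.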

A better understanding of the natural history of disease and the effects of specific interventions should make clinical development and evaluation of new products more efficient and rapid, and less costly and risky for patients. There are many areas for advancement:

• Real-time monitoring of safety data using healthcare data 
We need to expand and harmonize our electronic systems for receiving, processing, storing and analyzing adverse event reports and other safety information for FDA-regulated products, while at the same time ensuring we protect patient privacy. The database requires a portal through which external users can easily submit data to FDA for organization and analysis. (This system should ultimately be able to communicate with the Sentinel System, which is in development and will provide active surveillance for monitoring post-approval product safety using electronic data from healthcare information holders such as HMOs and other health systems.)  Additional investments to capture healthcare and other related surveillance data are also needed — as is continued collaboration with agencies like the Centers for Medicare & Medicaid Services (CMS) and the Department of Defense — to develop a system that can provide ongoing, accurate, real-time information about the safety of therapies in different patient subpopulations. Similarly, we require investments in infrastructure to support these complex and interconnected data systems and to promote development of electronic data standards to facilitate electronic submissions of readily interpretable sponsor data.

• Data mining and scientific computing 
Investments in new software tools are needed, as are collaborative projects that bring together the latest technologies and approaches for mining complex data from clinical trials, healthcare settings, and biological studies. These approaches will not only enhance review quality and efficiency but also provide FDA with knowledge that can move product development toward personalized medicine. Some key focus areas for scientific computing include, but are not limited to, the following:

  • Develop and implement an active post-market safety surveillance system that queries health system databases to identify and evaluate drug safety issues.
  • Expand the PRISM (Post-Licensure Rapid Immunization Safety Monitoring) system to other vaccines and biologics.
  • Employ advanced informatics, modeling, and data mining to better detect and analyze safety signals (a simplified sketch follows this list).
  • Apply computer-simulated modeling to risk assessment and risk communication strategies that identify and evaluate threats to patient safety; develop methods for quantitative risk-benefit assessments.
  • Enhance IT infrastructure to support the scientific computing required for meta-analyses and computer models for risk assessment.
  • Apply clinical trial simulation modeling and adaptive and Bayesian clinical trial design methods to facilitate development of novel products.
  • Apply human genomic science to the analysis, development, and evaluation of novel diagnostics, therapeutics, and vaccines.
  • Apply appropriate statistical analyses to genomic studies.
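
As a concrete example of the data mining referenced above, one widely used technique for detecting safety signals in spontaneous adverse event reports is disproportionality analysis, such as the proportional reporting ratio (PRR). The sketch below computes a PRR from hypothetical report counts; it is a simplified illustration of the general technique, not FDA's actual surveillance methodology.

```python
# Illustrative sketch: proportional reporting ratio (PRR), a simple
# disproportionality measure used in adverse event data mining.
# All report counts below are hypothetical.

def proportional_reporting_ratio(a: int, b: int, c: int, d: int) -> float:
    """PRR for a drug-event pair from a 2x2 table of report counts.

    a: reports of the event of interest for the drug of interest
    b: reports of all other events for the drug of interest
    c: reports of the event of interest for all other drugs
    d: reports of all other events for all other drugs
    """
    rate_drug = a / (a + b)    # event rate among reports for the drug
    rate_other = c / (c + d)   # event rate among reports for other drugs
    return rate_drug / rate_other

# Hypothetical counts: 40 event reports and 960 other reports for the drug;
# 200 event reports and 48,800 other reports for all other products.
prr = proportional_reporting_ratio(40, 960, 200, 48_800)
print(f"PRR = {prr:.1f}")  # values well above 1 may warrant further review
```

In practice, such screening measures are only a starting point; flagged drug-event pairs still require clinical and epidemiological evaluation.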

Partnerships with academia, industry, and other governmental agencies are an important part of the equation. These efforts, together with new paradigms for clinical trial design and surveillance of all FDA-regulated products, will enhance patient outcomes and bring FDA fully into the 21st century.
 

Next Section: Protecting the Food Supply