FDA Consumer magazine
The Centennial Edition/January-February 2006
By Philip J. Hilts
The Food and Drug Administration is a government body with a presence far beyond its size. That is because
it is not just an organization, but an idea. It was conceived a century ago to address a problem of modern society, and the creation has proved vital. There is now no democratic society that does not employ a body like it.
At the end of the 19th century and beginning of the 20th, there was revolution in Europe and revolt in America as well, against capitalism and the political leaders unable to tame its worst excesses. The idea, as President Theodore Roosevelt saw it, was to save capitalism by civilizing it.
When America was founded, the Industrial Revolution had not arrived; the telephone and telegraph, railroad, airplane, and auto had not been created. Nor had the great engines of free markets been set up to drive it all forward into the modern era. The dream of the founders had been of a placid agrarian society, where wealth and position were based on land ownership, and where decisions for the nation were taken not by lords but by citizens deliberating together.
But soon a rush of change swept away those visions as machines were invented and applied in every field; factories were built up around those hissing, tapping steam beasts. Workers left the farms for the cities to feed and operate the machines and to supply labor to the giant, national businesses--the first corporations, and America's first big bureaucracies.
The FDA regulates over 1 trillion dollars' worth of products, which accounts for 22 cents of every dollar spent annually by American consumers.
An Era of Business Changes
The upending of the settled, farming society had seemed impossible. "It was frightening and bewildering to many--that a whole society should be taken over by moneymaking and the pursuit of individual interest," wrote historian Gordon Wood. President Abraham Lincoln was aghast. He saw it as the coming of a new class of tyrants to replace the kings and nobles removed from their positions only 50 years before. "Corporations have been enthroned," Lincoln said. "An era of corruption in high places will follow and the money power will endeavor to prolong its reign by working the prejudices of the people … until wealth is aggregated in a few hands … and the Republic is destroyed." The era of the robber barons was at hand.
Harvey W. Wiley, the government's chief chemist and the first leader of what would become the Food and Drug Administration, wrote of the robber baron period: "Various and colorful terms have been applied to that next-to-last decade of the nineteenth century. However the era may be characterized, one thing is certain." The time "brought forth many changes in business life and left many evils that called to high heaven for remedy."
At the beginning of the century, more than two-thirds of the people lived and worked on farms, but by the end, fewer than half remained. Cheap, factory-made products flooded the land, from Sunday clothes to strawberry preserves; everything was ready-made in a distant city.
The New Art of Chemistry
After centuries of airy science, the link of knowledge to practical life had finally been made, and the chemists were the leaders. Louis Pasteur in France and Robert Koch in Germany showed the nature of disease in germs and how to defeat them. At the same time, they worked on commercial concerns--Pasteur for the wine and silk trades, Koch on industrial dyes.
With new powers, but no new sense of responsibility or rules to create it, commercial life began its race to the bottom. New forms of cheating were now possible on a large scale for the first time, at exactly the moment when a food or medicine maker did not have to face his customer directly.
Wiley, chief of the chemistry bureau at the Department of Agriculture, had his chemists lay out the issues involving food in a report to Congress. They described an array of chemical-cheapening ingredients, colorings, and preservatives being used to radically change the appearance, smell, and taste of bad food without setting off alarms in the senses.
Copper sulfate can make faded vegetables appear green again; sodium benzoate can prevent decayed tomatoes from rotting altogether; and borax can make odorous ham acceptable when canned.
The chemicals used were not required to be tested for their effects on human health, and were not. There were no penalties for selling chemical-laden food. A new industry grew up making and selling preservatives with names like Freezem, Freezine, Preservaline, and Rosaline. They carried no labels, and food company owners testified in Congress that they used them without ever asking the manufacturers what they were made of. Most were dilute solutions of formaldehyde, sulfites, borax, salicylic acid, and benzoic acid, among other things. All are somewhat toxic, and now restricted for human use to some degree; three of them are banned.
In addition to trying to prevent food from going bad, the new art of chemistry created opportunities for complete fakes. A bit of brown color and a dead bee or a honeycomb dropped into a jar of laboratory-manufactured glucose made a "honey" that was cheap to manufacture. Brown coloring and a pinch of flavoring could also "turn" glucose into cane or maple syrup. A spoonful of hayseeds and some pulped apple skins for color transformed the glucose into what could be called strawberry jam.
On the farm, foods that had to be stored, along with their containers, were sterilized and set aside--a practice that was not workable for the new businesses. Sterilization was difficult on a large scale, and while the woman at home might put up a dozen jars of pickles and accept that three of them might go bad, businesses could not accept such odds. In addition, they had to ship foods many miles while the foods broiled in the sun, cooled in the shade, and shook with the train and wagon. With sterilization not a viable option and refrigeration too expensive and not always practical, preservatives became vital for transport. Altering food became the easiest way of saving it.
Wiley suspected that many of the preservatives were harmless, but in an effort to find out, he created the first significant study of the effects of preservatives on humans. Using a small newspaper ad, he recruited a dozen "young, robust fellows" who might be expected to have "maximum resistance to deleterious effects of adulterated foods." He fitted out a basement mail room as a dining hall, complete with white table cloths, and asked the volunteers to take all their nourishment there, at scheduled times. He also made them submit to frequent medical exams and worse, required them to carry a satchel at all times with the necessary jars and sampling equipment to collect all their urine and stool for tests.
Over months, meals were given with and without preservatives to a series of groups at the "hygienic table." To Wiley's surprise, he had no trouble getting volunteers. And along with them came newspaper reporters who were completely absorbed in what they called the adventures of the Poison Squad.
Though the experiments were flawed scientifically, the results shocked Wiley. He expected to see little or no effect, but instead he found quite a few of the young men getting ill when fed higher doses of preservatives with meals. It made him question his long-held assumption that it was permissible to allow chemicals, and drugs for that matter, to be put on the market first, and tested only later.
Thousands of Remedies
The trouble was not just in food; it may have been worse in the medicine trade. There were very few medicines known to work effectively--one doctor said all could be counted on the fingers of one hand. But there were thousands of remedies on the market, divided roughly into two kinds--the inadequate but seriously intended treatments prescribed by doctors, and the commercial medicines sold by some for no other reason than for profit.
Medicine was one of the first fully national markets that used nationwide advertising, and so quack medicines, of which there had always been a trickle, suddenly became a flood as tradesmen, not doctors, saw the possibilities for profit. A new category of drugs and health supplements, shaped in England but exploited on the American side of the Atlantic, was created specifically to suit the new methods of business. Though called patent medicines, they were not actually patented. Rather, the words "patent" and "proprietary" referred to the secrecy in which their formulas were held. Ingredients were not disclosed to either the doctors who administered the "drugs" or the patients who took them.
This secrecy was a break from the past. In every developed nation, doctors and lawmakers had worked to construct a known, stable list of medicines and their ingredients. These were the pharmacopoeias and national formularies. They were put together in an effort to assure citizens that the medicines they were taking were standard mixtures of known ingredients, whether or not they could be said to be effective. The efforts to regularize medicine, to set universal formulas for each known remedy, had advanced from the 17th century until the 19th century.
The patent medicines often were created from the standard remedies doctors used, made from the same herbs and minerals. Their effects were very similar, though confounded by the multiple ingredients they contained, from a half dozen to 40. They were sold solely on the basis that someone, somewhere, was said to have been cured by them. The difference between them was one of marketing. It was the packages and the advertising sheets that were the key to the rise of the era of quack medicines.
This was a first for any product. Manufacturers, as America's leading historian of food and drugs, James Harvey Young, writes, patented neither the key medicine in these offerings nor the composition of the entire formula. Instead, the makers patented, trademarked, or copyrighted the distinctive shape of the bottle, the box the medicines came in, the type styles and pictures on the labels, and the advertising associated with it all.
They worked with more than 15,000 names, including the "Grand Restorative," the "Universal Vegetable Pill," and "Wheeler's Nerve Vitalizer." The claims for what they did were relatively modest and narrow in the 17th and 18th centuries, but became florid and aggressive by the 19th century. Swaim's remedy, for example, was sold with claims to cure "cancer, scrofula, rheumatism, gout, hepatitis, and syphilis," and the list expanded gradually over the decades it was on the market.
The makers of these remedies were the first to exploit the new communications boom and accounted for a large percentage of the income of many newspapers, often half of a newspaper's entire advertising income. The drugs "flourished in direct ratio to the availability of cheap newspapers and magazines," writes historian of medicine John Duffy.
The Troubled American Food and Drug Trades
Standards for known drugs were also low. At one point, concern about the medical drugs coming into the port of New York from Europe prompted a yearlong investigation by M.J. Bailey, an inspector appointed by the secretary of agriculture. At the end of the study, he told Congress: "More than one half of the most important chemical and medicinal preparations … come to us so much adulterated, or otherwise deteriorated, as to render them not only worthless as a medicine, but often dangerous." Rhubarb root was a common ingredient in medicines, and in one three-month period, Bailey recalled, he had reviewed 7,000 pounds of the root and found "not one pound of it fit, or safe." Opium was a vital painkiller, and when medical shipments arrived in New York, they had been cut to one-third natural strength and laced with Spanish anise and other bitter powders to disguise the dilution. Further, a substantial part of all opium shipments was infested with live worms. Massachusetts reported that of all medical drug samples taken there between 1882 and 1900, 37 percent were adulterated. In New York, of 343 purchased samples of one drug, phenacetin, 315 were diluted with acetanilide, a very hazardous painkiller.
The market in medicines, without any regulation, was essentially the same as the illicit trade today in heroin, cocaine, and other drugs. The supply was unreliable, the purity suspect, the price high and variable.
"The magic power of profits," said Edward R. Squibb, M.D., the founder of the drug giant of the same name, was apparently able to corrupt "a large majority" of the pharmacists, and many doctors. He felt sympathy for those who wanted to maintain high standards, and believed they deserved a fair chance to earn a good living, which they could not do in such an environment. Adulteration of drugs was enough to get the U.S. Navy's Bureau of Medicine and Surgery to stop buying drugs from pharmaceutical companies altogether. The bureau set up its own laboratory for making drugs to be sure they would be both potent and uncontaminated.
European nations increasingly came to fear American goods and started embargoes against food. Rampant contamination of American products gave European nations an opportunity to create trade barriers that would do wonders for their own farmers and food manufacturers. After a round of trichinosis infections in the United States caused by infected pork, the acting British consul in Philadelphia wrote to his foreign office, with an apparent combination of horror and glee, of the symptoms of one man in Kansas, ill with trichinosis: "Worms were in his flesh by the million, being scraped and squeezed from the pores of his skin. They are felt creeping through his flesh and are literally eating up his substance." Even in countries without boycotts, the sale of American products dropped drastically.
The year 1905 was one of crisis and change. By that time, national congresses on food and drugs had been held for five years running, and one bill after another was pressed in Congress, but House Republican leaders killed every measure without consideration. In 1905, a barrage of magazine articles appeared detailing the horrors of the American food and drug trades, from the alcohol-laden restoratives for adults to the deadly opiate-laced syrups for children, from the drug makers' bribes to newspapers not to cover the stories to the phony testimonials used to advertise products.
The final straw came with a novel, The Jungle, by Upton Sinclair. He had worked in the meat-packing industry to research the book, and intended to write a novel that would expose the awful life under "wage slavery." But in the book, he also spent a few pages describing the conditions in which meat was prepared for market: workers sick with tuberculosis spat on the floor and then dragged carcasses across it; meat rotting in storage rooms and carcasses covered with rat droppings were then made into sausage, detritus and all. The book sickened readers and became an instant best seller.
Roosevelt was outraged, doubting that anything like the conditions described could be real. He sent a team of investigators on a secret mission to Chicago to make an unannounced inspection in the meat yards. Company spies soon discovered the mission and had two weeks to clean and whitewash the plants. It wasn't enough. The investigators still found conditions revolting, the practices awful, and they even witnessed one carcass fall from the process line into a latrine. It was quickly hauled out, put back on the line, uncleaned, and sent to storage with the other meat.
Within weeks of the publication of The Jungle, meat sales dropped by half across the United States. "I aimed at the public's heart," Sinclair said, "and by accident, I hit it in the stomach." Meat companies were frightened, and ready for some rules to govern the trade, but leaders in Congress still refused to pass a bill. Women's organizations from around the country and the American Medical Association (AMA) threatened everything, including a march on the offices of the congressional leaders, if no action was taken that year. A watered-down version of a bill was finally released, but Roosevelt read it and found it worse than the status quo. He had held back releasing the report of his meat plant investigators, but now he turned it over to the newspapers. It verified in fact the conditions described in Sinclair's fiction, and it secured the passage of the nation's first broad food and drug acts. They were signed in June 1906.
New Laws Bring Hope, Disputes
The laws set new standards. For meat, they established inspection and approval before marketing to ensure safety. For drugs, the law did not go as far, but at least set into law the principle that the labels of medicines must be truthful and not misleading. It didn't require listing ingredients, but if they were listed, they had to be correct. The only requirement was listing poisons and dangerous ingredients such as cocaine, opium, alcohol, morphine, and chloroform. One loophole was the vagueness about cures. If a medicine claimed to cure cancer, was that misleading? The courts of the time said any claim was fine if the manufacturer believed it; evidence was not required. The understanding of what "scientific evidence" might be and what experimental trials could show remained weak for decades more.
Wiley was already the head of the new regulatory agency designated to carry out the laws, and he served from the law's passage in 1906 until his resignation in protest in 1912 over what he called its lax enforcement by his superiors.
Congratulations were printed across the country on editorial pages; the people's movement for pure food and drugs had won a great victory against the bad businesses; and the future was assured. "The purity and honesty," wrote the enthusiastic editorialists at The New York Times, "of the food and medicines of the people are guaranteed." It was believed that the quacks would be put out of business and the honesty in business would rise to the center of the food and drug trades.
It did not happen. Wiley had hoped that the passage of a law, after more than three decades of voluntary failure by professions and businesses, would give the FDA the influence needed to support the honest merchants and take the miscreants to court. But the wording of the law was simply too weak.
An important case on quack remedies that Wiley chose to pursue in court was one that seemed straightforward enough. The remedy, called, weirdly enough, Cuforhedake Brane-fude, was sold as a "brain tonic." It contained healthy doses of alcohol, caffeine, and a lethal pain reliever called acetanilide. In one of the magazine exposés published before the law was enacted, the writer had linked 22 deaths to this pain reliever; the AMA and the chemists at the Agriculture Department had evidence of a far greater number of fatalities. The label on the Brane-fude packages said the medicine was "a most wonderful, certain and harmless relief" and that it had "no … poisonous ingredients of any kind."
Simple enough: the ingredients were not harmless, and the only thing that might be called food for the brain in it was alcohol. A jury found the maker guilty. Then, the problem with the law became clear. The maker of the remedy was fined the maximum amount under the law, $700. As Wiley told reporters after the trial, the maker "had made two million on the product … and was [still] $1,999,300 ahead." The maker didn't even bother to file an appeal; he simply changed the label and carried on selling the product for years.
There was a steady beat of disputes over the law and its meaning, with the nadir coming in the U.S. Supreme Court's ruling that said anyone with a cancer remedy that they personally believed in was entitled to sell it, regardless of medical opinion or scientific evidence.
The fundamental problems of hazardous food and drugs had not been addressed directly.
The most famous example came in fall 1937. The Massengill Co. of Bristol, Tennessee, was selling a drug called sulfanilamide. It was a new drug in the first great family of anti-infectives, effective against several bugs. As these drugs came on the market, doctors and patients snapped them up instantly. Salesmen for Massengill reported back to headquarters that patients also would like to trade the bad-tasting pills for a more palatable liquid version.
The chief chemist at Massengill, Harold C. Watkins, tried one solvent after another before settling on diethylene glycol, a sweetish but largely tasteless fluid. The concoction was checked for flavor and fragrance, and then manufactured in batches totaling hundreds of gallons. The liquid, Elixir Sulfanilamide, was put into bottles of 4 ounces each and shipping began on Sept. 4, 1937.
Tulsa, Oklahoma, was the first city in which reactions were reported. By early October 1937, 10 patients in the practice of James Stephenson, M.D., had died immediately after taking the bright red liquid.
When FDA inspectors reached Massengill's Tennessee plant, they found that Tulsa would not be the only site of the problems. Two hundred forty gallons of "elixir" had been shipped across the country, from California to Virginia.
A short time later, Walter Campbell, the FDA's chief chemist, held a press conference in Washington, D.C., during which he said that 14 people had died after taking the Massengill product. He said that the FDA could not legally investigate or prosecute the matter unless it could be shown that there was something wrong with the label on the bottles. He had, however, begun a national investigation, as his agency was the only one with any possible jurisdiction. Campbell would go on to become Commissioner of Food and Drugs in 1940.
The 1906 law had no prohibition per se of dangerous drugs. Campbell was fortunate in that the medicine was labeled an "elixir," which technically is a liquid containing alcohol; the product contained none, so he went ahead in hopes that the mislabeling would give the agency sufficient grounds to act.
The full field force of the FDA in the United States, 239 inspectors, began to search out the druggists and doctors who had received the shipments. Massengill proved to be trouble. At the beginning of the crisis, on Oct. 15, 1937, the FDA had asked the company to recall from doctors, druggists, and distributors whatever was left of the shipments. The company sent out a notice that all should send back the preparation, but said nothing about the reason for the return or the emergency nature of the recall. The recall was largely ignored. It wasn't until four days later that the company was told it had to send out a second notice, indicating that the drug was life-threatening.
By the end of November 1937, 107 deaths had been reported, many of them children. Not counted in the statistics was Watkins, the chemist who had caused it all, who died while cleaning his gun. It is unclear how many more victims there were beyond those reported, but the FDA investigators kept the number of deaths down by recovering, within four weeks, more than 90 percent of the original shipment. About 6 gallons, apparently, accounted for all the deaths.
Less than a month later, when it was clear the episode was over, the question of prosecution arose. Samuel Massengill himself wrote to the AMA, staking out the company's position. The deaths were regrettable, he said, "but I have violated no law."
The AMA, pushing for a new law, issued a statement noting that the death toll from ill people taking useless quack medications was far higher than 107. A proposed law was already in Congress, and pressure to pass a stricter food and drug law had been building for four years. The mail poured into Congress and the bill, left for dead, was revived and readied for passage. The new Food, Drug, and Cosmetic Act was passed on June 15, 1938. It has provided the framework for drug research and marketing that stands to the present.
Meaning of the New Law
The crisis of 1937 had been a modern one. It was not about crank products and false claims, but rather technical knowledge and potent modern medicines, about checking potent substances before selling them.
The law came at a time when the drug industry was just beginning to realize that the nature of its business for the future would not be to stamp out millions of bottles of chemicals, but rather to fashion drugs that could attack the underlying bases of disease.
The 1938 law made it clear that companies could not survive without increasing their number of scientists and laboratories, without knowing something substantial about the drugs they were selling, and about human illness itself. They would have to, at least, produce scientific tests of safety for the FDA.
The 1938 law was a landmark in civil governance, not just for the United States, as it turned out, but for democratic governments around the world. In the years to come, each nation of the developed world would adopt its central principles. It was the first law to require the checking of drugs before they went to market. It put into law the notion that the scientific approach--not the commercial, not the anecdotal--would be the standard for modern society. It was, in fact, one key factor that created the modern pharmaceutical industry.
By the 1940s, when penicillin emerged from a British university laboratory, most pharmaceutical companies had not yet made that crucial transition from chemical factories to businesses based on medical research and development. When scientists offered commercial companies the right to penicillin, for free, one company after another turned the chance down. The work of taking penicillin from the lab to commercial quantities was carried out most prominently in a government laboratory in Peoria, Ill. But by the 1950s, the companies had made a big transition. They dropped thousands of useless patent medicines and similar products from manufacture--as in the case of Smith, Kline, which dropped 14,940 of its 15,000 products to concentrate on just 60. Until 1940, none of the important medicines had been created in industry, and the Sharp and Dohme catalog contained not a single exclusive prescription medicine.
By the 1950s, the situation had changed completely. The death rates for a variety of the worst human illnesses, such as tuberculosis, had been cut by 99 percent in the developed countries, and the pharmaceutical industry was now a leader in the campaign to conquer disease. With an array of new antibiotics, steroids, vitamins, and other truly useful products, the prospect of humanity actually conquering some of the most terrifying features of nature seemed possible, and the effort was under way.
Problems With Drug Potency
But soon, it became clear that the new potency of drugs would bring unexpected problems. The central fact of the new drugs was that they were biologically active, inducing real changes inside the body: against the agents of illness, certainly, but also, unintentionally, against some healthy systems as well.
The trouble came with useful drugs like chloramphenicol, a powerful antibiotic, which in a small number of cases also caused a deadly blood disorder. That case highlighted the fact that overuse of a good drug could have some bad consequences. And, as in the previous two great crises, a public debate began with magazine and newspaper articles, soon followed by proposals to change food and drug laws. And once again, the reform law was essentially defeated and headed for the dustbin when a medical crisis intervened.
A company called Richardson-Merrell in Cincinnati, a subsidiary of the Vick Company, had bought from a German company the right to make and sell a drug it labeled "Mer-32." Its trade name was Kevadon; its laboratory name was thalidomide. It was, basically, a relaxant used as a sleeping pill.
When the application to market the drug came to the FDA, it was reviewed by FDA Medical Officer Frances O. Kelsey, M.D., Ph.D. The application contained many testimonials but scant scientific information, and Kelsey asked the company to produce better safety data. The company then began a fierce lobbying campaign to get the drug approved, including at least 50 recorded meetings with FDA officials and sending top company officials and hired physician-lobbyists to urge that thalidomide be put on the American market. But while the pressure was on, the first reports of nerve damage from the drug surfaced. Kelsey knew that nerve damage also meant the possibility of damage to fetuses, and she and a colleague, John Archer, began to raise the question of whether it might cause birth defects. They recommended holding back on approval; after all, the drug was not life-saving and not much more useful than drugs already on the market.
Within months, the German company Grunenthal, which created the drug, began quietly dealing with the rising tide of complaints and lawsuits arising from the side effects of thalidomide. The drug did not reach the U.S. market, but Richardson-Merrell had sent it to 1,267 doctors to give to their patients as an "experiment." About 20,000 patients received the drug--it would have been the largest drug trial ever conducted in the United States, but data were not taken. It was largely an exercise in promotion to the doctors rather than testing on patients.
Hundreds of cases of nerve damage from the drug were reported before the first "thalidomide baby" was officially reported. The first case probably occurred in 1957--four years before--but the news didn't break publicly in Germany until November 1961. Doctors began seeing a syndrome called phocomelia, or "seal limbs," in which babies were born with missing arms or legs. It took until July 1962 for the news to spread in the United States.
By conservative estimates, about 8,000 grossly deformed infants were born as a result of thalidomide use, almost all of them in Germany or in other European countries. Another several thousand, perhaps 5,000 to 7,000, died of their deformities before birth. In the United States, no accurate count is possible because of the lack of company records, but 17 cases of birth deformities are known and another nine are likely. If the drug had made it past FDA scrutiny to the American market, it is estimated that an additional 10,000 babies might have been born with the deformities.
The crisis came, once again, just as a new drug bill apparently had failed in Congress after years of hearings and debate. It was soon revived and brought forward. In October 1962, the Kefauver-Harris Amendments to the food and drug law were passed and signed by President John F. Kennedy. The Secretary of Health, Education and Welfare wrote, in a letter to The New York Times, that "It is unfortunately true, as the thalidomide incident so well illustrates that the drug industry does not now always adhere to high standards, either in planning or in investigation, selecting the investigators, or providing the investigators with full information about the hazards … the proposed regulations would change this. And they would have made it impossible for Richardson-Merrell to distribute millions of tablets to physicians who made no pretense of being investigators."
New System for Drug Testing
The law passed in 1962 laid out a rational system for drug testing. The previous law had said that companies could sell drugs at will unless the FDA objected, within 60 days of being notified that a drug was to be marketed, on the basis of data submitted by the sponsor. The power rested with the companies, and the burden with the FDA. The tests of safety for any drug were vague, and the standards generally quite low.
The new law reversed that. It put into writing that companies wanting to sell new drugs must show that the drugs could be used safely and that they worked for their stated purposes. "Experiments" like the thalidomide distribution in America were prohibited; doctors and companies would now be required to keep records of what they gave to patients in experiments; and the patients must give their consent. The key result was that the FDA would be more closely involved in oversight.
The few words in Section 505 set the basic standards for evidence. The law said that safety and effectiveness had to be shown by "adequate" investigations, meaning studies large enough and numerous enough to be convincing. The studies had to be "well controlled," meaning testimonials and unverified results would no longer suffice. Controlled meant the studies should include comparisons of patients who took the treatment with those who did not.
It took some time for the full effect of this line in the law to be felt. Opinion was out; science was in; and companies and the FDA together would have to create a set of standards for what makes a good clinical experiment. It took more than a decade for the industry and the FDA to establish, in discussion and in practice, just what practical rules would fulfill the language: it was the birth of the modern clinical trial.
FDA Review and Approval Times
The next great change at the FDA came during the 1990s with the long dispute over the agency's efficiency and funding to get its work done. It was the natural result of the struggle over standards--how much study should be devoted to each drug's safety and effectiveness? How long should review take?
Scholars have explored the different parts of the problem and found several key issues. Are the drugs companies put forward worthwhile? Are the applications well done and complete? Many people assume that all drugs and all applications are worthwhile. Studies, however, show that is not always the case. The debate over the years has tended to ignore these fundamental questions, focusing not on whether the companies were doing their job effectively, but on whether the FDA was. The debate came to a head after the 1994 elections, when a movement for "FDA reform" was pressed in government and before the public. It centered on the question of how long FDA staff members take to review applications. What is a reasonable time?
Advertising appearing in major media put the matter in the most radical terms, calling for a retreat to the days before food and drug regulation. "If a murderer kills you, it's homicide. If a drunk driver kills you, it's manslaughter. If the FDA kills you, it's just being cautious." The argument, put forward by lawmakers as well as radical think tanks, suggested that the time taken to review the safety of a drug was lost time and that delays cost lives.
Congressional leaders and the FDA took the efficiency of the reviews seriously, despite the provocative rhetoric. Over several years, different approaches were discussed to produce the speediest feasible reviews. The Prescription Drug User Fee Act was passed in 1992, and quickly had an impact.
Before the act, review times were similar to those of other developed countries, but after it, review times were dramatically better. The General Accounting Office--now the Government Accountability Office--said that by 1994, "FDA review and approval times were faster than those in the United Kingdom--a country whose regulatory system many critics like to cite as a way of doing things faster and better."
The History of Food Standards
The advances, and battles, have continued on food as well. The first food standards to be issued under the 1938 act were for canned tomato products. By the 1960s, about half of the U.S. food supply was subject to a standard. As food technology changed and the number of possible ingredients grew, the agency developed recipe standards for foods, lists of ingredients that could lawfully be included in a product. A food that varied from the recipe would be required to be labeled as an imitation.
The FDA pursued many cases of food misbranding during the 1950s and 1960s, most of which stemmed from false nutritional claims and unscientific enrichment. In 1973, after hearings convened by the FDA to address the vitamin fortification of foods and the claims made for dietary supplements, the FDA issued regulations for special dietary foods, vitamins, and supplements. The public response to these regulations helped lead Congress in 1976 to prohibit the FDA from controlling the potency of dietary supplements. The agency maintains authority to regulate enriched foods.
During the 1990s, the agency took a step beyond just the wholesomeness of food, and created the landmark food labeling system. Using it, consumers could read the ingredients and nutritional values in processed food, and reasonably easily determine for themselves what was most wholesome, and how to avoid too much of the more deleterious fare.
Working in the Public Interest
Over the years, the FDA, because it is in the center of modern commercial and intellectual activity, has had to evolve to do its job and, in fact, to survive in the sometimes contentious political atmosphere in Washington. It began as a shop in which decisions were largely made behind the scenes; its decisions are now open to inspection and challenge, and its reviews routinely include many outside experts and some consumer voices. Through intellectual work it established the basic standards for scientific testing, and with the cooperation of industry, enforced them.
Its work has produced credibility for industry and a standard on which industry and government have built extraordinary progress. Much remains to be done. Because the agency remains near the front lines of discovery, new information challenges old practices. Today, the two areas that loom ahead clearly are the best way to deliver reliable drug information to citizens, and the best way to study and report on safety after a drug is marketed.
But the role is essentially the same as that imagined by Harvey Wiley--to bring science to bear in food and medicine, to advance commerce safely and effectively. It may be that one of the most important discoveries during the evolution of the FDA is not just that science can be used effectively to make food and medicine safe, but that it can be used to set high standards that launch business success as well.
Philip J. Hilts has covered health and science for The New York Times and The Washington Post. He published "Protecting America's Health: The FDA, Business and One Hundred Years of Regulation" in 2003.