The Hundred Years’ War over Toxic Chemicals
In America, chemicals are innocent until proven guilty. It’s a rule that’s been in place for one hundred years and still applies to compounds used every day in industry and in your home.
This may be changing at last. In April, Congressman Henry Waxman, chair of the House Committee on Energy and Commerce, made regulation of toxic chemicals a priority by proposing the Safe Chemicals Act of 2010. A companion bill was introduced in the Senate by Frank Lautenberg.
Under Waxman’s legislation, the Environmental Protection Agency would at last gain some real powers to control the chemical soup we live in. Manufacturers would be required to test their chemical products. Procedural obstacles that have hobbled regulators would be swept away. Vital safety information could no longer be kept secret. Testing of especially dangerous products would be required within eighteen months.
The initial lack of public attention to the Waxman bill should not obscure its importance in a world whose chemical complexity we have only started to comprehend. Chemical manufacturers publicly support the bill, but the industry is already working to weaken some provisions. The experience of health care reform shows that even a watered-down version of the legislation could be killed in the partisan atmosphere of the current Congress.
THIS LEGISLATION moves us forward along a path that began in the first months of the twentieth century, when advances in science and industry confronted the world with the problem of chemical safety. Legislators began to worry about toxic chemicals when a strange plague struck England 110 years ago. Seventy died, and many more became ill, poisoned by a commercial chemical that had inadvertently entered the food supply.
The English malady, diagnosed rapidly as arsenic poisoning, was tied to certain brands of beer. A Royal Commission of five scientists was empanelled to find the causes and prescribe remedies. At its head was the famous physicist Lord Kelvin, by then approaching eighty years of age, who a half-century earlier had framed the Second Law of Thermodynamics, invented the absolute temperature scale, and designed the first transatlantic telegraph cable.
Kelvin and his colleagues traced the impurity to its origin. Until a few years earlier, a beer purity law like those still extant in Germany today had allowed only natural ingredients. But the law was repealed in a parliamentary deal to pass a controversial tax increase. Brewers began using acid to quickly digest starchy grains. The source of England’s problem was contaminated sulfuric acid, made from batches of imported pyrite that were unusually rich in arsenic.
This finding brought the Royal Commission to what was then a novel question: how much arsenic should be allowed in food? Science alone offered no answer. Chemists of the day succeeded in measuring the poison in the beer that made its drinkers ill. But the investigators did not know whether smaller quantities, consumed over a lifetime in each glass of beer, would cause illness too. They had no choice but to balance the public’s health against uncertain dangers.
Here arose what has become known as the Precautionary Principle. “In the absence of fuller knowledge,” the commissioners wrote in 1901, “as to the possible effect of consumption of mere traces of arsenic, we are not prepared to allow that it would be right to declare any quantity of arsenic, however small, as admissible in beer or in any food…”
As an achievable approximation of this goal, Kelvin’s commission proposed an enforceable standard that could be met by manufacturers and measured by inspectors. Arsenic content was limited to one hundred parts per billion in beer, water, and other liquids, and one part per million in solid foods. Europe joined Britain in adopting this rule.
The United States soon needed to make its own decisions about chemicals whose effects were incompletely understood. Here, too, arsenic posed the first problem. The new continent’s enormous copper smelters spewed the element forth in prodigious quantities, denuding trees, killing livestock, and provoking a flood of lawsuits. But America’s pro-business courts insisted that damage must be incontrovertibly tied to a specific chemical perpetrator before they would act.
The legal rule requiring proof of actual damage was soon imported into science. When leaded gasoline was introduced, scientists aware of lead’s poisonous properties raised alarms. After a rash of illnesses and deaths in plants that made the lead additive, production was suspended in 1925 to allow time for studies. Researchers detected damage to the blood cells of chauffeurs and gas station attendants who used the leaded fuel, but found no acute illness. The new fuel returned to the market, confirming that chemicals in America, unlike in Europe, were innocent until proven guilty.
The two paradigms for dealing with the unknown—precaution and skepticism—came quickly into conflict. In 1925, an English family fell sick from eating American apples. When British authorities threatened to shut off fruit imports, the newly founded Food and Drug Administration began to seize apples laced with too much arsenic—though Americans were allowed to eat the fruit deemed too contaminated for export. This double standard was untenable even in the years of Herbert Hoover, and by 1932 the FDA was enforcing the one-part-per-million arsenic standard on all produce.
Apple growers fought back against the controls. The Pure Food and Drug Act, passed in 1906, had authorized the government to seize food when added chemicals were injurious to health, but it provided no way to decide what injured health. This allowed the merits of Kelvin’s limit to be argued anew each time a seizure of contaminated fruit was challenged in court. Juries in apple-growing regions often sided with the farmers, and enforcement bogged down.
The Waxman of an earlier day—Rex Tugwell, a key member of Franklin Roosevelt’s “Brains Trust”—offered a legislative solution. Tugwell’s new Federal Food, Drug, and Cosmetic Act aimed to make the FDA’s limits enforceable in court. The law enacted in 1938, after protracted struggles among a panoply of consumer and industry interests, was a compromise. The FDA was authorized to issue legally enforceable restrictions on chemicals in foods, but to exercise this power, it had to meet a high standard of proof and hold lengthy hearings. The apparent strengthening of the agency’s powers had little real effect; no arsenic standard was issued until 1950, and by then new products had all but driven arsenic off the market. By conferring seemingly broad regulatory powers while erecting nearly impassable barriers to their exercise, the 1938 statute set a pattern that would be repeated in years to come.
SYNTHETIC PESTICIDES were the next group of chemicals to attract regulators’ attention. The bug-killer DDT, introduced during the Second World War, was quickly understood to threaten the balance of nature by disrupting the food chain. The Army and War Production Board, in conjunction with the Agriculture Department, issued a blanket ban on civilian spraying in 1945, but it was enforceable only because of the wartime system of rationing scarce materials. Once the war ended, there was no way to keep it in force.
In theory, but not in practice, a law passed in 1947 shifted the burden of proof for chemicals used as pesticides. Before new products went on the market, manufacturers had to prove their chemical innocence by submitting evidence to the Agriculture Department that new pesticides would not make food unsafe. But there was a giant loophole. If the application was turned down, the product could still be sold—“registered under protest,” with a warning letter from the government.
New pesticides, some of them known to cause cancer, made their way into food, and complaints were quick to come from scientists and legislators. Attempts to tighten the 1938 act were rebuffed until 1958, when Kelvin’s concept that poisons should be kept out of food enjoyed a modest revival in the Delaney Clause, an amendment to the Federal Food, Drug, and Cosmetic Act. This provision, slipped in quietly by a powerful member of the House Rules Committee, required that processed food be free of any detectable quantities of cancer-causing substances. Because of the Delaney Clause, a year later the entire cranberry crop was seized just before Thanksgiving.
The evidence against environmental toxins was by then mounting rapidly, and in 1962 Rachel Carson’s best-seller Silent Spring brought them new attention. But chemical manufacturers and their allies were now well-entrenched in Congress. Not until 1972, in the wave of environmental activism that followed the first Earth Day, was the Precautionary Principle written into pesticide law. And even this was a compromise. Registration under protest was ended, but the fine print of the law still placed significant limits on the new EPA’s powers.
By then industrial chemicals had joined pesticides in the spotlight of public concern. PCBs, DDT-like compounds whose main use was in electrical equipment, were found in the late 1960s to have spread throughout the world. Environmentalists sought to give the precautionary approach wider application. Chemicals of all kinds, they argued, should be tested for safety before going into wide use.
Intense negotiations between industry and environmentalists eventually brought forth the Toxic Substances Control Act of 1976, which remains in effect today. The words of the law grant the EPA broad authority over chemicals used in industry and in consumer products. But, in an echo of the 1938 Federal Food, Drug, and Cosmetic Act, these powers are so constrained as to render them nearly useless. Before taking regulatory action, the government must prove that a chemical poses an “unreasonable risk.” As interpreted by skeptical judges, this becomes an all-but-insurmountable hurdle.
Waxman and Lautenberg hope to update this law. More than three decades have passed since the 1976 act, and we have learned much more about the escape of chemicals into the environment. Twenty years ago, when I found a barbiturate and a tranquilizer in a Florida homeowner’s well, it was a discovery. By now, drugs are found so often in drinking water that there are international conferences on the subject.
Oddly named substances, whose origins are sometimes obscure even to chemists, have spread throughout the globe. Scientists detect many of them among a myriad of artificial compounds in the blood of every living person. We are all guinea pigs in a vast experiment of chemical alteration of the environment, with consequences that are no clearer to us than the effects of life-long exposure to arsenic were to Kelvin.
In the face of this challenge, the transatlantic divide over chemical safety that opened so many years ago still yawns wide. The European Union has reaffirmed its decision to follow the Precautionary Principle. A law passed in 2007 requires widely used chemicals to be registered and tested for safety; its first deadline looms this November. At about the same time, the fate of Waxman’s precautionary bill will be known in the United States. Between now and then, Americans have a few more months to mull over the century-long debate between precaution and skepticism in a sea of toxic chemicals.
Benjamin Ross is author of The Polluters: The Making of Our Chemically Altered Environment, to be published in September by Oxford University Press.
Photo: Chemical Inspection Site in the 1920s (FDA History Office)