Kirchhelle C. Pyrrhic Progress: The History of Antibiotics in Anglo-American Food Production [Internet]. New Brunswick (NJ): Rutgers University Press; 2020.
Chapter 3: Chemical Cornucopia

Antibiotics on the Farm

This chapter reconstructs antibiotics’ adoption and ensuing conflicts in the US agricultural sphere. After 1945, antibiotics were rapidly integrated into all areas of US food production. This rapid introduction was facilitated by growing pressure to expand and intensify agricultural production and by a sophisticated sales network connecting pharmaceutical producers with farms. By the early 1960s, antibiotics had acquired infrastructural importance in many production sectors. Although US farmers shared public antibiotic optimism, they soon expressed concern about the negative effects of chemical-driven intensification on smaller producers. However, all-out rejection of agricultural chemicals remained limited to organic farmers. While conventional farmers attempted to curb antibiotic residues in sensitive products like milk, growing public concerns about chemicals instead provoked angry outbursts against “irrational” faddists. As in the US public sphere, AMR hazards were rarely discussed. Despite warnings by veterinary bacteriologists, most agricultural commentators did not discuss potential public health implications of AMR selection on farms.

The Origins of Agricultural Antibiotic Use

US agriculture’s rapid adoption of antibiotics was no coincidence. Beginning in the interwar period, the Taylorian logic of Henry Ford’s factories gradually spread throughout the countryside. A new generation of agricultural experts, officials, and producers wanted to apply the principles of quantification and mechanization to US farms. Already farming larger acreages and producing more animals than Europeans, US farmers further expanded and also began to rationalize production. Interwar farms employed accounting techniques alongside new technologies like tractors, hybrid seeds, and pesticides.1 Farmers also invested in more intensive ways of producing livestock. Conceptualizing animals as machine-like feed converters, farmers began to purchase premixed fortified rations from specialist producers like the Commercial Solvents Corporation or Pfizer. While farmers’ motivation was to produce more meat with less feed, manufacturers saw farms as lucrative outlets for industrial surplus and by-products. Connecting farmers with pharmaceutical manufacturers was an increasingly sophisticated network of veterinarians, local feed-mixers, and government extension officials.2

Rising output soon exceeded demand. Attempting to maintain incomes despite sinking commodity prices, US farmers increased their production by 13 percent between 1917 and 1929. Unsurprisingly, prices continued to sink. By the end of the 1920s, it cost more to produce many commodities than farmers earned from selling them. Unable to service their debts, many farmers suffered bankruptcy, and the US farm population declined from 32.5 million in prewar years to 30 million in 1930. Only very efficient or large farming operations remained profitable due to lower production costs. When commodity prices declined by another 37 percent during the Great Depression, even the most efficient producers struggled to survive.3

Reacting to farmers’ plight, the Roosevelt administration launched a comprehensive program of agricultural aid. Passed in May 1933, the Agricultural Adjustment Act (AAA) was designed to reduce surpluses, stabilize prices, and enhance farmers’ purchasing power. The AAA allowed the United States Department of Agriculture (USDA) to administer adjustment payments to farmers, who in turn agreed to reduce production of surplus commodities. Together with compensated slaughter programs and other initiatives, the AAA was supposed to restore the relative purchasing power—parity—of agricultural goods to prewar levels. However, in attempting to alleviate the Great Depression’s impact, New Deal measures increased agricultural intensification pressure and subsidy dependence: by 1941, one-third of US gross farm income was derived from direct or indirect federal payments.4 Paying producers to slaughter animals and take land out of production also incentivized them to produce more with remaining assets—thereby putting larger farmers at an advantage. As a consequence, the farms that survived the Great Depression were culturally and economically geared to strive for factory-like efficiency, scale, and technological sophistication.5

When commodity prices recovered, production boomed. Reacting to America’s entry into the Second World War, Congress passed the 1942 Emergency Price Control Act and the Steagall Amendment. Legislators guaranteed commodity prices at around full parity for the duration of hostilities and for two years afterward to incentivize farmers to maximize production and invest in productivity increases. Ensuing transformations were particularly dramatic in the livestock sector: whereas New Dealers had ordered the compensated slaughter of about 6 million hogs in 1934, the new price guarantees encouraged farm investment and a significant rise in production.6

Parallel war-induced grain, protein, and labor shortages, however, soon threatened rising outputs. US researchers were hastily commissioned to find solutions. One of these researchers, described by historian Mark Finlay, was Damon Catron. At Purdue University’s Work Simplification Laboratory, Catron launched a systematic attempt to overcome shortages in the pig sector with efficiency increases. Traditional pig production was seasonal: farrowed in spring, animals were fattened on pastures during summer and autumn and mass slaughtered ahead of winter. The resulting pork glut often overwhelmed processing facilities and depressed prices. By contrast, Catron’s vision for production resembled an integrated car assembly plant that divided a pig’s life into distinct stages: breeding, farrowing, weaning, rebreeding, and finishing. Removed from pastures, animals were to be “assembled” all year round in optimized indoor environments and fed tailored rations prior to their final disassembly in an abattoir.7

This Fordist vision of animal production faced significant challenges. While intensification was already underway in poultry farming, the overwhelming majority of US pigs were still held in outdoor or mixed indoor-outdoor systems. Most US cattle were kept on pastures.8 Convincing these producers to abandon their low-cost systems and invest in confined intensive production proved challenging. Disease posed another obstacle. On both sides of the Atlantic, previous attempts to increase animal densities in more confined systems had been stunted by a corresponding growth of disease pressure: infections had wiped out herds or diminished productivity.9 It was here that the advent of antibiotics had a significant impact.

During the late 1930s, producers had already used organoarsenics, inorganic sulfur compounds, sulfonamides, and biological antibiotics like gramicidin to treat individual animals (e.g., mastitis in cows) or larger groups of animals. However, drug prices were high, and incorrect dosages could poison animals. This situation began to change during the 1940s. The American poultry industry was a leader of this change. In 1939, Cornell veterinarian P. Philip Levine reported that sulfanilamide, which had only recently been marketed, showed efficacy against coccidiosis—a protozoal infection of the intestine. Researchers soon reported the successful use of other sulfonamides like sulfamethazine, sulfadiazine, and sulfaguanidine to both treat and prevent coccidiosis in poultry and foulbrood in bees. Developed with the support of pharmaceutical companies, safe sulfonamide ratios in water and feed soon enabled the mass medication of entire flocks and were also used against other diseases like fowl typhoid (Salmonella gallinarum) and Pullorum disease (Salmonella pullorum).10 What had once been devastating herd and flock diseases seemed increasingly controllable.

With the end of the war easing military demand, veterinary antibiotic treatments became cheaper and more widely available. As Susan Jones has described, ads for Lederle’s sulfathiazole began to appear in poultry magazines by 1946. In the same year, Merck patented sulfaquinoxaline. Originally developed as an antimalarial but licensed against coccidiosis in 1948, sulfaquinoxaline became the first antibiotic product officially approved for inclusion in animal feeds and proved extremely lucrative. Poultry farmers could now medicate entire flocks with only minimal veterinary supervision, and often with none at all.11

While sulfonamides paved the way for routine therapeutic antibiotic use, it was the nonspecific application of antibiotics that would transform agro-pharmaceutical relations. There was already a lucrative market for animal feed supplements in the United States (chapter 2).12 However, researchers continued to puzzle over why certain feeds supported animal growth and others did not. The animal protein factor (APF—also known as anti–pernicious anemia factor) was at the center of this puzzle. Nutritionists had long known that feeds enriched with fishmeal, cow and chicken manure, or fermentation wastes were more effective at promoting growth than feeds containing cheap vegetable protein alone. Meanwhile, expensive liver extracts and cod oil had been found to be both effective against pernicious anemia in humans and to promote animal growth.13 With animal protein imports from Japan and Norway interrupted during the Second World War, researchers redoubled efforts to isolate and find alternative APF sources.14

The APF hunt led to unexpected results: in 1946, a University of Wisconsin team including Peter Moore and the colorful Thomas Donnell (later Sir Samurai) Luckey reported that a combination of sulfasuxidine and streptomycin increased the growth of chicks when fed alongside folic acid.15 The Wisconsin researchers had been studying the role of B vitamins in animal growth by using antibiotics to “knock out” parts of the digestive system. They had expected streptomycin to sterilize the gut and create a growth-retarding vitamin deficiency. To their surprise, the opposite had happened. However, the Wisconsin team and other academic groups researching the feeding of waste products like mycelium from penicillin production failed to realize the practical and commercial implications of their observations.16

This changed three years later. In 1948, research by the University of Maryland and Merck as well as by Glaxo in the United Kingdom led to the isolation of vitamin B12 and its identification as the mysterious APF. In a lucrative spinoff, Merck discovered not only that Streptomyces griseus (the organism producing streptomycin) produced B12 but also that industrial fermentation wastes accruing after streptomycin extraction still contained large amounts of the vitamin. Far cheaper than other animal protein sources, fermentation wastes of biological antibiotics could be fed directly to animals as APF growth promoters. There was a particularly large market for B12/APF supplements in the Midwest, where animal husbandry and the production of cheap vegetable protein in the form of hybrid corn and soybeans were expanding rapidly.17

APF feeds also excited other companies. Working at Lederle Laboratories’ Pearl River station, Thomas Jukes and Robert Stokstad investigated whether vitamin B12 produced by Lederle’s Streptomyces aureofaciens (the organism producing chlortetracycline) could be commercialized too. Both Jukes and Stokstad were experienced B-vitamin researchers: trained as a biochemist, the group’s leader, Jukes, had previously studied relationships between B-complex vitamins and their effects on deficiency diseases and animal growth. During the 1930s, he had identified pantothenic acid and choline as growth factors in chickens and turkeys. Stokstad was a trained animal nutritionist whose research would lead to the isolation of vitamin K and who had previously isolated and purified folic acid.18 Over Christmas 1948, Jukes and Stokstad fed chicks a deficiency diet consisting of 75 percent soybean meal and supplemented it with sterilized S. aureofaciens. To their surprise, they found that chicks eating S. aureofaciens feeds grew quicker than those eating feeds only supplemented with purified vitamin B12. Antibiotics seemed to be “promoting” animals’ growth.

In contrast to the Wisconsin team, their previous work and commercial setting made Jukes and Stokstad keenly aware of their findings’ implications. To their dismay, however, they were denied access to further S. aureofaciens, which was needed for Lederle’s aureomycin production. For their feed experiments, they turned to alternative aureomycin sources like acetone cake—dried acetone solvent left over from purifying aureomycin fermentation liquid—and at one point dug out fermentation wastes from the Lederle dump. Friendly researchers who received unmarked feed samples soon began to confirm “growth promotion” in other species and also reported combating bloody diarrhea (scours) in pigs.19 The reliability of these early studies is debatable. Although this was not uncommon during the period, the AGP trials were conducted on small numbers of animals for short periods of time and did not produce sufficient data to establish statistical significance. Results were also not reproducible in other countries and in later studies (chapter 6).20 Most contemporaries, however, trusted Lederle’s claims.

Triggering a commercial storm, Jukes and Stokstad publicly announced the “antibiotic growth effect” in 1950. Despite the earlier Wisconsin report, it was the serendipitous combination of their observation, commercial focus, and access to fermentation wastes that gave birth to a new era of antibiotic mass consumption. The effects of this new era reverberated around the globe. While it is important to note that US animal production would have continued to intensify without them, antibiotics’ alleged ability to boost metabolisms, control disease, and reduce labor previously devoted to caring for individual animals soon led many agricultural commentators to consider them indispensable.21

Antibiotic Infrastructure

Economically, antibiotics’ mass introduction to US agriculture could not have come at a better time. Between 1940 and 1945, American farmers’ average per capita net income had increased from $706 to $2,063.22 Feeding the United States and large parts of postwar Europe and encouraged by the Korean War’s promise of stable prices, farmers paid off debts and invested in new technologies.23 Expensive animal protein supplements made them particularly well disposed toward new APF sources that would help turn cheap vegetable protein into lucrative animal products.24

Manufacturers were happy to oblige and launched advertising campaigns for “APF growth factors,” “AGP miracle additives,” and therapeutic applications in farming magazines.25 Using what Rima Apple has called “reason why” and “negative appeal”26 strategies to lure or scare farmers, early AGP commercials were nearly identical to previous APF and sulfonamide commercials. However, it was often unclear what farmers were actually buying. As discussed in more detail in chapter 4, Lederle’s decision to avoid FDA licensing by marketing AGPs as APF with an additional vaguely specified growth factor meant that drug concentrations varied. Without conducting assays for either B12 or aureomycin, Lederle sold “tankcars of brine containing residues from [aureomycin] fermentation”27 to feed merchants, who then repackaged the wastes. This procedure caused significant confusion. The history of the first AGP advertisement in the popular Iowan magazine Wallaces Farmer is telling. Featuring a proud farmer holding a feisty piglet, a June 1950 Gooch Feeds advertisement reported “amazing results” achieved with the new “Aureomycin APF” “wonder-worker.”28 However, it soon emerged that Gooch Feeds’ “Genuine Lederle Aureomycin APF”29 was not always effective. Two weeks after printing the ad, Wallaces Farmer warned readers: “crystalline aureomycin is not available at the present time to either the feed industry or the farmer.”30 According to a competing merchant: “no statement should be made … concerning the presence of the antibiotic since it is naturally inherent in the ingredient.”31

The story of Gooch’s advertisement is indicative of the gold rush atmosphere surrounding AGPs. In 1985, co-discoverer Thomas Jukes remembered: “The demand was such that the available supply was prorated among customers. On one occasion, we had to deal with a complaint from Senator Wherry of Nebraska that the supplies of APF were all going to Iowa rather than Nebraska …. In Austin, Minnesota, a local pharmacist purchased Lederle APF in bulk, repackaged it, and sold it at an inflated price. Allegedly, he made so much money that he retired and went to live in Florida.”32 Other companies were “right on [Lederle’s] heels,”33 and aureomycin was soon joined by a bewildering array of competing AGPs. For a while, it seemed as though farmers would buy any feed as long as it contained antibiotics, preferably in large doses. While companies like Ful-O-Pep or Kraft advertised their own antibiotic supplements,34 Allied Mills promised that its AGP would turn a “scrawny runt” into a “husky hog” in “just 81 days.”35 For farmers unwilling to trust only one antibiotic, a company called Occident advertised multimycin, an unspecified “combination of miracle antibiotics” offering “up to 18% greater gains than with single antibiotic feeds.”36 Trying to ward off competition, Lederle soon claimed that aureomycin was “the only antibiotic that has been proved highly effective for swine, poultry, calves and several kinds of small animals.”37

FIGURE 3.1. Gooch’s APF/AGP Feed, Wallaces Farmer, 1950.

US farmers trusted these claims. Although later surveys indicated that producers’ reasons for purchasing them were varied (chapter 9), antibiotic supplements had become standard ingredients of broiler and turkey mashes by mid-1951.38 Calves and pigs also received large amounts of low-dosed antibiotics. One year after the antibiotic growth effect was announced, 110,000 kilograms of antibiotics—about 16 percent of total US antibiotic sales—were already being used for unspecified nontherapeutic purposes.39 However, chaos over what constituted effective growth promotion and what might be therapeutically relevant persisted even after the FDA introduced AGP dosage requirements in 1951. Initially, officials recommended antibiotic dosages of up to 50 grams per ton of finished feed. However, industry-sponsored trials soon made some farmers disregard guidelines in favor of higher feed dosages, which could also be used to prevent and treat disease. Manufacturers also experimented with higher-dosed penicillin, bacitracin, and chlortetracycline implants. On farms, the boundaries between growth promotion and treatment were fast blurring. By the early 1960s, experts resignedly noted that “legally precise [dosage] boundaries are easier to establish but not always easier to maintain than biologically precise boundaries.”40

Reacting to exaggerated marketing claims and widespread confusion, agricultural advisors rushed to provide farmers with expertise. In popular farming magazines, articles promoted “rational” and cost-effective antibiotic use. Most experts agreed that antibiotics would reveal their true potential only on hygienic and modern farms. Soon familiar messages included: farmers should “follow-through”41 with treatments; antibiotics were no substitute for good management;42 using drugs to maintain outmoded husbandry systems would not pay off—“drugs can’t whip old lots.”43

A similar promotion of “rational” antibiotic use took place in contemporary agricultural manuals for both therapeutic and nontherapeutic purposes. As late as 1944, US pig and cattle manuals had stressed preventive health care but recommended little in the way of effective DIY therapeutics against microbial infection.44 However, by 1947, newer publications like the Eastern States Farmers Handbook featured therapeutic sulfonamide use.45 Following the announcement of the antibiotic growth effect, manuals soon reclassified antibiotics as a routine part of animal nutrition. In 1951, Interstate’s Livestock Feeding Manual contained guidance and exercises for pig farmers to calculate the right amount of “vitamin B12 or some antibiotic”46 for feeds. In the same year, the Midwest Farm Handbook contained an entire section devoted to B12 and AGPs. Penicillin, streptomycin, bacitracin, aureomycin, and terramycin were praised for boosting growth and controlling diseases like scours. Reflecting the drugs’ popularity, readers were explicitly cautioned that “a deficiency symptom does not develop by omitting antibiotics.”47 The next few years saw a further proliferation of handbooks giving advice on when, how much, and how long to administer antibiotics.48 In 1952, Hog Profits for Farmers listed “antibiotic and B12 supplements”49 among its seven essentials of a complete pig ration. The drugs would also act as an “insurance”50 against runts and many other problems. Government advisors joined the chorus with USDA and Farm Credit Administration publications praising AGPs and the feeding of antibiotics alongside cheap cottonseed rations.51

Although the boundaries between antibiotic growth promotion and treatment were always fluid, falling prices also triggered a boom of explicitly therapeutic applications. In pig husbandry, the postwar period saw antibiotics being used against atrophic rhinitis, suspected paratyphoid, edema, and respiratory diseases.52 In feedlots, beef cattle were given penicillin against bacterial footrot. Antibiotics were also used to aid artificial insemination and against infections of cows’ udders (bovine mastitis).53 Caused by several bacteria species and spread via hands and inadequately sterilized milking equipment, bovine mastitis is painful, occasionally fatal, reduces productivity, and can taint the flavor of milk. Drinking tainted milk can also cause septic sore throat and food poisoning in humans.54 In 1955, Farmers Weekly Review from Illinois estimated that preventable US mastitis losses amounted to $225 million.55 Given the disease’s bacterial causes, antibiotic mastitis treatments proved popular. In the 1940s, articles advised US farmers to “lick mastitis”56 with sulfonamide, penicillin-sulfonamide, or streptomycin infusions. In 1949, American Cyanamid began marketing collapsible broadspectrum antibiotic tubes for intramammary mastitis control. “Ready-to-use-one-treatment tube[s]” could be purchased without a veterinary prescription and were soon regularly advertised in US farming magazines.57 By 1953, a USDA survey reported that American farmers had access to a wide variety of antibiotic treatments containing sulfonamides, nitrofurazone, tyrothricin, penicillin, streptomycin, aureomycin, terramycin, neomycin, bacitracin, polymyxin, and chloromycetin: “combinations of the various antibiotics and sulfonamides are being widely used.”58

FIGURE 3.2. Boundaries between AGPs and therapeutics soon blurred. Pfizer advertisement, Farm Journal and Country Gentleman, 1956.

FIGURE 3.3. Cyanamid advertisement, Wallaces Farmer, 1949.

Antibiotic mastitis treatments proved so popular that dairies and creameries were complaining about drug residues in raw milk as early as 1948.59 According to Wallaces Farmer, “antibiotics not only kill mastitis germs, but also kill bacteria which ferment milk.”60 In April 1951, the “cheese state” Wisconsin ruled that mastitis ointments must carry labels listing drug withdrawal times. In addition to public concerns about allergies and invisible poisoning (chapter 2), agricultural commentators warned that excessive antibiotic use would select for resistant pathogens in udders and that residues might select for AMR when ingested by humans.61 However, such warnings could not dampen antibiotic enthusiasm. By 1956, US farmers annually used 75 tons of antibiotics against mastitis.62 Faced with widespread residue problems, Ohio State University researchers began to experiment with antibiotic-resistant lactic acid starter cultures to produce cheese from residue-laden milk.63 Meanwhile, Canadian studies revealed that Staphylococci and Streptococci isolated from cheese were becoming increasingly resistant to penicillin and dihydro-streptomycin—both popular mastitis treatments.64

Interestingly, US veterinarians had little control over the rapid proliferation of agricultural antibiotic use. Veterinary researchers’ success in developing effective antibiotic treatments “unwittingly created difficulties for their brethren practicing in the field.”65 Unlike human medicine, where antibiotics were prescription-only drugs from 1951 onward, labeled drugs like penicillin could be sold to farmers without a prescription (chapter 4). As a consequence, pharmaceutical companies increasingly bypassed veterinarians by selling and advertising easy-to-use antibiotic products directly to farmers. This strategy was not uncontroversial. Thomas Jukes later remembered that the decision to sell pure aureomycin to farmers was “strongly opposed by the veterinarians at Lederle”66 but was supported by Lederle president Wilbur G. Malcolm. Subsequent attempts by the American Veterinary Medical Association (AVMA) to convince officials to restrict antibiotic access via prescription requirements similarly failed.67 As a result, postwar veterinarians soon found themselves competing for farmers’ custom against pharmaceutical salesmen, feed merchants, and a new group of experts specializing in mass animal health management. Many veterinarians reacted by leaving the livestock sector. Others expanded their traditional purview to include preventive services against subclinical diseases and for animal productivity.68

While US veterinarians had reason to feel ambivalent, agricultural scientists, farming magazine commentators, and American officials all endorsed broadening antibiotic access and use. In 1951, Damon Catron conceded: “we don’t know why antibiotics do what the experiments indicate. But we do know that they prevent scours, increase rate of gains and reduce feed requirements.”69 There was also pride in antibiotics as part of a new American-led push for agricultural efficiency and nutritional plenty. According to Wallaces Farmer, AGPs and new production systems enabled poultry farmers to devote only “ten seconds per bird per day” and raise “flock profits by 110 per cent.”70 In Illinois, Farmers Weekly Review reported on awestruck British visitors’ reactions to Pfizer’s “Miracle Drug Pigs,”71 new breeds of “antibiotic-age chicks,”72 and trials of antibiotic-doused earth to boost crop production.73 In Pennsylvania, Lancaster Farming informed readers about antibiotic tree sprays and the use of antibiotic-doused bees to combat fire blight.74 Major companies like Merck, Cyanamid, and Pfizer did their best to promote this enthusiasm by regularly publishing glossy brochures, free manuals, and annotated bibliographies on antibiotics’ many benefits.75

Antibiotics’ popularity is best expressed in numbers: around 1955, the USDA estimated that 50 percent of American formula feeds for poultry, hogs, and cattle contained antibiotics and vitamin B12. Ninety-three percent of poultry feed manufacturers, 60 percent of hog feed manufacturers, 22 percent of dairy feed (mostly calf feed) manufacturers, and 4 percent of beef cattle feed manufacturers added antibiotics to their products. Drug concentrations ranged from an average of 16.5 and 27 grams (mostly broadspectrum antibiotics) per ton in pig and calf feeds, respectively, to 2.7 to 3.5 grams (mostly penicillin) per ton in poultry feeds.76 Having focused on large businesses, surveyors believed that antibiotic use was even more common at the local small business level. Between 1951 and 1960, US sales of antibiotics for nontherapeutic applications grew seven-fold from about 110,000 to 770,000 kilograms. Although total US antibiotic production grew significantly, the proportion of total production sold for nontherapeutic purposes expanded from 16 percent to 36 percent.77 Resulting profits encouraged manufacturers to professionalize, with many larger feed producers investing in new research facilities.78 By 1962, it was estimated that an astounding 99 percent of US poultry, 90 percent of pigs, and 30 percent of beef cattle were receiving antibiotic-supplemented feeds.79

Entering nearly all areas of animal production as well as food preservation and plant protection, antibiotics had rapidly achieved infrastructural relevance in US food production. The new antibiotic age was nearly universally welcomed. In the eyes of farmers and large parts of the US agricultural establishment, antibiotics had seemingly overcome the bacterial limitations previously imposed on the size of production facilities and the productivity of their inhabitants. Hardly anybody worried that the new antibiotic era might come at a price.

The Costs of Plenty

In 1951, US meat packer Swift sponsored large ads calling on farmers to throw aside fears of overproduction and produce as much meat as possible. The new ABCs of animal nutrition—A standing for antibiotics—would guarantee rising production and profits.80 According to Swift: “The problem’s never surplus meat—you can’t raise more than we can eat.”81 This trust in American appetites proved misguided.

Following the Korean War, agricultural commodity prices began to sink and the Eisenhower administration became concerned about overproduction and expensive subsidies. Between 1953 and 1954 alone, the US government’s Commodity Credit Corporation (CCC) was forced to purchase $1.5 billion of agricultural surpluses. However, CCC purchases were not enough to shield farmers from a return of the interwar cost-price squeeze. Forced to maintain federal subsidies, the Eisenhower administration attempted to dispose of surpluses abroad with the 1954 Food for Peace program—an opportune side effect of Cold War diplomacy.82 The 1956 Agricultural Act recycled the New Deal idea of paying farmers to reduce production. However, agricultural production continued to grow by an annual average of 2.1 percent throughout the 1950s.83 Concerned about annual CCC expenditure of $4 billion and daily storage costs of about $1 million,84 the new Kennedy administration established the US food stamp program and expanded the Food for Peace and school lunch and milk programs. Kennedy also reduced US acreage under cultivation and the quantity of marketed produce.85

The re-emerging cost-price squeeze hit the rural community hard. As in the Great Depression, small American farmers were worst affected. With polls showing that farmers themselves were upsizing definitions of a “family farm,”86 the number of US farms decreased from 3,710,503 in 1959 to 2,730,250 in 1969 while the average farm size increased from 302.8 to 389.5 acres.87 In the livestock sector, developments in poultry production were indicative of things to come: from the early 1960s onward, large, vertically integrated national firms controlled as much as 90 percent of now mostly confined US broiler production.88 US pig production also began to change. Although confinement remained far from ubiquitous, a growing number of hog producers experimented with new housing systems to save labor and reduce animal movement.89 In the beef sector, the 1960s saw an increasing number of cattle concentrated and fattened on large feedlots prior to slaughtering. This development had already gathered steam during the 1950s. As a result of cheap cereals and the fencing off of range land, a growing number of cattle were fed grain diets and new additives like diethylstilbestrol (DES) to fatten them quicker. By February 1955, 6 million US cattle were “on feed.” During the 1960s, capital injections and conflicts over ranching on public lands facilitated the spread of commercial feedlots from California to Colorado, the Texas panhandle, Kansas, and Nebraska. By 1970, nearly two-thirds of the US calf crop was placed in feedlots prior to slaughter.90

Interestingly, most agricultural commentators did not blame new production systems for the ensuing cost-price squeeze. Although commentators bemoaned family farms’ decline,91 the “factory farm” remained a utopia rather than a dystopia. In an age of superpower rivalry and overpopulation, technology-driven intensification was presented as essential for farmers’ and the nation’s long-term survival.92 This trajectory included further increases of antibiotic use. In 1956, Farm Journal and Country Gentleman wrote that farmer Hugh Fussell was getting everything right: “Detroit’s automobile factories have nothing on Hugh Fussell. This Georgia farmer raises hogs on a truly assembly-line basis. Every two weeks Fussell is on the market with 50 to 60 head of No. 1 hogs.”93 Significantly, Fussell was also a “fanatic on disease control”: every day, each of his finishing barn pens was disinfected; Fussell’s pigs were vaccinated, their “feeds [were] well laced with vitamins and antibiotics.”94 With further “mighty new germ killer[s]”95 on their way, who could blame farmers if—for their peace of mind—they invested in continuous antibiotic use to reduce feed costs and stay ahead of their competitors and infection?96

Only rarely was antibiotic use described not as a solution but as a problem. While Lancaster Farming erroneously thought that new antibiotic food preservatives would slow the advance of refrigeration,97 Farm Journal viewed them as a further step down the road to universal low-cost competition: “Acronize is doing it. The cheaper broiler areas can now sell anywhere …. It’s now one big national market with broiler prices, like water, seeking one level.”98 The article’s ambivalence is telling. Caught in a cost-price squeeze, the majority of US farmers, however, felt that they could no longer afford to stop and reconsider the antibiotic infrastructure growing around them. By the end of the 1950s, a path to dependency was emerging: falling prices led to greater herd densities associated with higher productivity and greater antibiotic use, which in turn led to a further fall of commodity prices. The price of this chemical cornucopia was agricultural insecurity. When public concerns about chemical exposure became more pronounced during the late 1950s, many farmers found themselves torn between shared health concerns and the perceived necessity of further production increases and antibiotic consumption.

The 1956 scandal surrounding penicillin residues in milk was the first to shake public trust in agricultural antibiotic use (chapter 2). Initially, it did the same in agricultural circles. Attempting to restore trust, experts exhorted farmers to adhere to withdrawal times and identify bacterial strains prior to using antibiotics. Not only would cows recover sooner, but farmers would also stop paying for ineffective antibiotics and prevent stricter regulations.99 Farm Journal warned that the FDA was merely asking “farmers to cooperate”: “If that doesn’t work, … they may either order that drug companies put dyes in mastitis treatments … or put a ban on penicillin.”100 Despite mentioning allergenic hazards, most commentators, however, described antibiotic residues as an isolated problem. Events seemingly proved them right. Bolstered by declining residue findings and blaming black sheep, dairy farmers averted statutory antibiotic restrictions.101

Public trust proved more difficult to win back. Despite agricultural campaigns to curb residue levels, US chemical criticism reached fever pitch in the wake of the 1959 cranberry scare and bestsellers like The Poisons in Your Food. Whereas agricultural commentators had formerly presented episodes like the 1956 penicillin scandal as a credible hazard to public health, the continuous expansion of public chemical fears was increasingly interpreted as a threat to modern agriculture. Commentators were especially apprehensive about public fears leading to substance bans. In 1959, Wallaces Farmer warned that a bigger “clamp down on all farm chemicals” was only a question of time: “a small army of FDA inspectors … have orders from Washington to go from farm to farm, if necessary, to find [antibiotic] violators.”102 Lancaster Farming cautioned producers to exercise chemical self-control: “In light of the tremendous publicity accorded to the recent cranberry situation, dairy industry leaders are very much concerned about the great damage which could be done to milk if FDA officials are forced to file lawsuits against dairies or producers in order to enforce rules.”103

Over time, ongoing public clashes about the safety of agricultural chemicals led to a radicalization of agricultural rhetoric. In the US farming media, experts and editors increasingly resorted to painting black and white pictures of efficient, hard-pressed family farmers falling victim to an “irrational” anti-chemistry campaign. In 1959, Wallaces Farmer accused consumers and officials of stirring a “Big Ruckus” about the Cranberry Scare and publicizing the “incident entirely out of proportion to the dangers involved.”104 According to Progressive Farmer from Alabama, FDA officials were guilty of spreading “fear and disfavor for the entire production of an industry.”105 The magazine warned: “the nation is being harassed by a number of food cranks who insist that a food is good only if no chemicals were used in growing it” and “No nation in the world has a more abundant food supply, one that is cleaner, safer, or more nutritious than ours …. unless farmers look out, the ‘food cranks’ and other misinformed people may pressure Congress into passing unreasonable restrictions—restrictions that may do serious damage to our food supply and to national welfare.”106 Commenting on The Poisons in Your Food, Lancaster Farming compared agricultural chemical use to the plowing of “virgin prairie sod”: “The use of hormonized and fortified feeds with antibiotics” was just as necessary if the “producer is to stay in meat production.”107 While other articles complained about official tests detecting “what is almost ‘less than zero’ amounts of residue,”108 Wallaces Farmer proclaimed a “battle for farmers”109 in 1962. A “worrisome new movement” no longer just included cranks but also ordinary people, “well-meaning for the most part, who have become overly alarmed at our growing use of chemicals in food production.”110

Attacks on “cranks” and overzealous inspectors did not mean that the American agricultural community was naïve about potential side-effects of the chemical technologies it was employing. In farming magazines, there was usually a sharp contrast between the fierce rhetoric directed against “external” critics and the concerned tone adopted for “internal” safety advice. Reacting to scientific warnings, commentators promoted cost-benefit strategies to reduce personal health risks without foregoing chemicals’ economic benefits. “Rational” farmers were expected to follow labeling instructions and limit direct exposure. If personal exposure remained below critical thresholds, agricultural chemical use was deemed safe.111 However, even agricultural expert advice could be contradictory. In June 1960, an issue of Progressive Farmer contained two very different articles: whereas one commentator advocated using various chemicals to fight yard pests on page 76,112 page 78 contained an article warning about “harmful residues”113 of similar chemicals on homegrown fruits and vegetables. The effects of such mixed messaging on farmers’ personal attitudes are difficult to judge. In 1964, Wallaces Farmer conducted a poll to see whether chemical warnings had changed farmers’ habits. According to the poll, half of farmers regularly using pesticides and insecticides reported having taken more precautions because of hazards to crop, livestock, and personal health. One interviewee confessed, “These chemicals are beginning to scare me to death and I wouldn’t be surprised if only experts will be allowed to apply them in the near future.”114 However, with overall pesticide, herbicide, and antibiotic use continuing to increase,115 most producers seem to have viewed personal safety measures as sufficient to contain potential hazards and justify ongoing use.

Organic producers were the only agricultural community to wholeheartedly endorse growing public chemical criticism. US organic producers were a small and heterogeneous community throughout the 1950s and 1960s. Within the community, publications by Jerome Irving (J. I.) Rodale (Organic Farmer, Organic Farming and Gardening, Prevention) enjoyed high visibility. Rodale, a former accountant and electrical equipment manufacturer, had purchased a farm in Emmaus, Pennsylvania, in 1940 and turned it into an organic experiment station. He also began publishing books and magazines on organic farming, personal health, and self-improvement in what would become a veritable publishing and direct-advertising empire. Inspired by English activists like Albert Howard and Eve Balfour and relying on a mix of experimental evidence, anecdotal accounts, and pseudo-science (chapter 6),116 Rodale was convinced that healthy food could only grow on “living” organic soil.117 He thus promoted “natural” farming methods like composting, mulching, and crop rotation in opposition to “artificial” production methods involving chemical fertilizers, pesticides, or hormones, which he blamed for problems ranging from polio to cavities.118 Although the wider ideals driving organic agriculture were more complex, Organic Farmer’s subtitle “farming without chemicals” was one of the budding movement’s most important maxims.

Because of their “natural” roots, biological antibiotics presented a classificatory problem for post-war organic producers. Rodale and his readers initially interpreted antibiotics not as chemicals but as proof of the health-giving properties of “living” soil: “One variety of actinomycete is known to produce vitamin B-12…. But of far greater importance is their production of antibiotics … there is a very strong antibiotic action in a soil loaded with organic matter, and thus many plant diseases are licked before they can even get near the plant.”119 “Natural” soil’s antibiotic benefits could allegedly also be passed on to humans and animals. In 1952, an Organic Farmer article noted that organic crops enabled a Michigan family to ingest “the benefits of … health-giving antibiotics from stable manures put into the soil.”120 In 1954, retired chemist Leonard Wickenden reiterated some of these claims in his popular Gardening with Nature.121 However, with antibiotic use growing in conventional agriculture, organic producers were soon forced to differentiate between beneficial and bad antibiotic exposure. For a while, this led to confusion. In 1952, Organic Farmer warned that “chicks fed antibiotics can poison consumers” by developing “resistant bacteria that will give food poisoning, enteritis and typhoidal infections to people eating their meat.”122 However, one year later, the magazine printed a feature on antibiotic soil’s benefits for poultry: “Nature has placed, and is constantly manufacturing antibiotics—germ killers—in the soil to promote the health of poultry. All one has to do to reap the benefits of such germ killers is to make sure that the range is not contaminated.”123 According to the article, “natural” antibiotics had to be distinguished from “artificially” manufactured antibiotics: “It is wrong to jump off the deep end in believing only in the merits of synthetically produced antibiotics, for Nature has given us antibiotics for lo, these many years.”124

Within the loosely organized organic community, a firmer front against agricultural antibiotics emerged only after the 1956 residue scandal. In addition to increasing attacks on “poisonous” and allergenic antibiotic residues,125 organic criticism also began to regularly encompass concerns about agricultural AMR selection. According to a 1959 article, AGPs caused senility and sterility, upset breeding patterns, and destroyed natural disease resistance in animals. For humans, risks resulting from agricultural antibiotic use included the destruction of sensitive organisms by “highly resistant strains of dangerous bacteria, notably a very lethal staphylococcus.”126 While organic farmers were producing a growing supply of “safe” chemical free meat, the conventional farmer “will find himself facing economic ruin—if his own mal-nourished [sic] body hasn’t given up the fight first.”127 Citing British research, Organic Farmer also began to warn about the on-farm selection of tetracycline-resistant E. coli from pigs and resistant mastitis in cows.128 This early joint discussion of residue and AMR hazards was exceptional both within the US agricultural and public spheres.

In conventional agriculture, only a small group of mostly veterinary bacteriologists warned about AMR as a public health hazard. In November 1951, University of California researchers Mortimer P. Starr and Donald M. Reynolds published a remarkable study on streptomycin AGPs’ effect on the microflora of turkeys. Feeding an oil meal ration containing 50 milligrams of streptomycin per kilogram of feed, the authors conducted sensitivity tests on E. coli isolated from the birds’ feces. Although resistant organisms were also isolated from control birds, E. coli from streptomycin-fed birds quickly proved “generally highly resistant to the drug.”129 The authors noted that AMR was probably of little relevance to poultry producers since it would be cheaper to cull than to treat flocks in the case of disease. However, they worried that antibiotics could foster the spread of resistant pathogens like salmonella. Thinking about public health more widely, the authors wondered “just how much indiscrimination should be permitted in the use of new chemotherapeutic agents.”130 AMR against penicillin and sulfonamides had proven “inevitable” even with the “best planned scheme of drug administration”:

It would be unfortunate if a large reservoir of drug-fast enteric pathogens potentially harmful to man accumulated unchecked in the poultry population. We hope that those charged with the protection of the public health will objectively evaluate the situation …. We grant that the poultry industry cannot readily forego [AGPs’] great economic advantages …. But … a few years of research are likely to elucidate the fundamental mechanism underlying this growth promoting effect and … will permit agriculturalists to secure more rapid animal growth without inflicting potential hazards on the public health.131

Corroborated by a 1953 Minnesota study on AGP-fed rats and Canadian research on AMR in oxytetracycline-fed pigs,132 Starr and Reynolds’ warnings, however, failed to inspire regulatory or agricultural action.

Instead, most commentators in conventional agriculture described AMR not as a general threat to public health but as an economic problem of inefficient overuse to be overcome not by bans but via “rational therapeutics”—a movement that was also gaining ground in human medicine.133 In US veterinary circles, the 1950s saw a growing number of textbooks and magazines complaining about “irrational” antibiotic use on farms caused by ignorance, lacking sanitation, and hasty disease diagnosis.134 The 1957 textbook Veterinary Pharmacology and Therapeutics also complained about manufacturers’ “dangerous tendency to add excessive levels of antibiotics”135 to animal feeds. However, most veterinary observers did not believe that resulting agricultural AMR selection was a serious health threat. Despite containing a separate section on AMR in 1961, the sixth edition of Veterinary Bacteriology and Virology maintained that AMR was “natural” and not an existential threat.136

Most non-veterinary agricultural observers also remained complacent about AMR.137 Of thirty analyzed US farming manuals and reviews published between 1955 and 1966, all stressed antibiotics’ benefits, many cautioned about inefficient drug use, but only two warned about potential health hazards resulting from AMR selection.138 And even these two warnings stressed that ill effects would be contained by rationalizing antibiotic use. According to veterinary pharmacologist L. Meyer Jones, “no patient should be deprived of the benefit of antibiotic therapy solely because of fear of inducing resistance in the disease germ.”139 Warning against inadequately dosed mastitis treatments and the practice of storing unsterilized needles in antibiotic bottles,140 a 1962 manual on Milking Machines and Mastitis similarly stressed that “rational” veterinary supervision would solve problems: “The only people benefiting from [inefficient antibiotic use] are the sellers and advertisers of antibiotics.”141

The focus on AMR as a problem not of health but of inefficiency was mirrored in contemporary debates about fluctuating AGP performance. Despite widespread endorsement in farming magazines and manuals, varying feed trial results led some observers to speculate that AMR might be diminishing AGPs’ efficacy. Industry reacted fiercely. In 1956, a Cyanamid booklet claimed that “reports that antibiotics are ‘losing’ their effect cannot be taken seriously.”142 According to Cyanamid, reductions in growth promotion were due to improved sanitation on farms—antibiotics simply had fewer bacteria to control. Three years later, the company repeated that “there is no evidence of a diminishing trend in production with time. On the contrary, the conversion of feedstuffs to pork has improved considerably throughout a decade of swine feeding.”143 Taking stock of the situation in 1962, Farmers Weekly Review acknowledged that “certain antibiotics used in a swine herd for several months may become less effective.”144 Ignoring the potential health implications of AMR selection, the article, however, noted that new antibiotics or antibiotic combinations still produced “a significant boost in rate of gain when fed to growing-finishing pigs.”145 In 1965, Iowa State University animal nutritionist Virgil Hays emphasized: “antibiotics are definitely of value in 98 percent of our farm situations.”146 The increasing use of higher-dosed AGPs was simply due to sinking antibiotic prices.

Sales figures show that routine antibiotic use remained popular on US farms. Between 1960 and 1970, sales of antibiotics added to animal feed and other applications grew by 330 percent from about 770,000 to 3,310,000 kilograms (about 43 percent of total US antibiotic sales).147 With the exception of organic farmers, an overwhelming majority of agricultural experts, commentators, and producers remained committed to the ideal of industrialized plenty—and to the chemical helpers enabling it. The antibiotic-facilitated drive for efficiency and growth remained stable despite the re-emerging cost-price squeeze and increasingly visible side effects like the decline of family farms and safety concerns about new agricultural chemicals. In the minds of many agricultural commentators, unfettered access to antibiotics had acquired almost infrastructural importance for productivity. The cultural and physical importance of expanding antibiotic infrastructures also meant that producers and commentators reacted first with alarm and then with indignation to “irrational” external antibiotic criticism. Mirroring public critics’ focus on residue hazards, US agricultural commentators rarely touched on AMR throughout the 1950s and early 1960s. Although the farming media reported on resistant pathogens in human medicine,148 AMR selection on farms was mostly portrayed as an efficiency problem to be overcome with “rational” drug use. As late as the mid-1960s, commentators reassured farmers that antibiotic use and medicated feeds would produce “little or no resistance, even when used over long periods of time.”149 This sanguine assessment seemed justified following the publication of the FDA’s 1966 report on veterinary and non-veterinary antibiotics (chapter 4). Industry journal Feedstuffs was happy to report: “scientific data now available do not show reason for alarm.”150 Overall, the US agricultural community remained confident in its ability to manage antibiotic risk.

Copyright © 2020 by Claas Kirchhelle.

All rights reserved

No part of this book may be reproduced or utilized in any form or by any means, electronic or mechanical, or by any information storage and retrieval system, without written permission from the publisher. Please contact Rutgers University Press, 106 Somerset Street, New Brunswick, NJ 08901. The only exception to this prohibition is “fair use” as defined by U.S. copyright law.

Except where otherwise noted, this work is open access under a Creative Commons Attribution-NonCommercial-No Derivatives 4.0 license (CC-BY-NC-ND 4.0).

Monographs, or book chapters, which are outputs of Wellcome Trust funding have been made freely available as part of the Wellcome Trust's open access policy
