
National Research Council (US) Subcommittee on Selenium. Selenium in Nutrition: Revised Edition. Washington (DC): National Academies Press (US); 1983.


7 Effects of Excess Selenium

The poisonous nature of many selenium compounds remained more or less a laboratory curiosity until the 1930s, when it was discovered that selenium was the active principle in forages and grains that caused alkali disease in livestock raised in certain areas of the American Great Plains. The practical nature of this problem stimulated a great deal of research on both chronic and acute selenosis (reviewed by Moxon and Rhian [1943] and Rosenfeld and Beath [1964]). This chapter deals with general aspects of the toxicity of selenium compounds to animals and humans.

SELENIUM TOXICITY IN LABORATORY ANIMALS

ACUTE TOXICITY

The minimum lethal dose of selenium as sodium selenite or selenate in rabbits, rats, and cats was 1.5 to 3.0 mg/kg body weight regardless of whether the salts were given orally, subcutaneously, intraperitoneally, or intravenously (Smith et al., 1937). Animals receiving such acute doses of selenium compounds develop a garlicky breath odor because of the exhalation of volatile methylated selenium metabolites. Dimethyl selenide, the primary volatile metabolite, and trimethylselenonium ion, a urinary metabolite, have relatively low orders of toxicity, their LD50 in rats being 1,600 mg and 49.4 mg selenium/kg, respectively (McConnell and Portman, 1952b; Obermeyer et al., 1971). However, these compounds should not be regarded as innocuous, since they can have strong synergistic toxicities with mercuric chloride (Parizek et al., 1971) and sodium arsenite (Obermeyer et al., 1971). Also, male rats have been reported to be much more sensitive to dimethyl selenide than female rats (Parizek et al., 1971), and male rats fed diets low in selenium apparently are extremely sensitive to the toxicity of dimethyl selenide (Parizek et al., 1980). This susceptibility to dimethyl selenide toxicity can be largely eliminated by pretreating the male rats with small injected doses of selenite or by increasing the prior oral intake of dietary selenite. Elemental selenium is quite nontoxic, since its oral LD50 in rats is 6,700 mg/kg (Cummins and Kimura, 1971).

Aside from garlicky breath odor, animals acutely poisoned with selenium exhibit vomiting, dyspnea, tetanic spasms, and death from respiratory failure (Franke and Moxon, 1936). Pathological changes include congestion of the liver, with areas of focal necrosis; congestion of the kidney; endocarditis; myocarditis; petechial hemorrhages of the epicardium; atony of the smooth muscles of the gastrointestinal tract, gallbladder, and urinary bladder; and erosion of the long bones, especially the tibia.

CHRONIC TOXICITY

Dietary selenium levels of 4 to 5 ppm are sufficient to cause growth inhibition in animals fed a normal diet (NRC, 1976b). However, the resistance or susceptibility of animals to chronic selenium poisoning can be markedly altered by a number of factors. For example, Harr et al. (1967) found that rats fed a commercial “laboratory chow” diet were two to three times more resistant to chronic selenium toxicity than rats fed a semipurified diet. On the other hand, diets low in protein quality or quantity potentiated chronic selenosis (Gortner, 1940; Lewis et al., 1940). The growth rate of vitamin E-deficient rats was depressed by only 1 mg selenium as sodium selenite/kg of diet (Witting and Horwitt, 1964), and swine deficient in vitamin E and selenium were shown to be more susceptible to acute selenium toxicity than pigs fed diets supplemented with vitamin E and selenium (Van Vleet et al., 1974). Weanling rats are more susceptible to selenium toxicity than older rats (Halverson et al., 1966), and rabbits are more sensitive to selenium poisoning than rats (Pletnikova, 1970). Jaffe and Mondragon (1969) obtained evidence suggesting that rats can adapt somewhat to a chronic selenium intake, since chronically poisoned rats from mothers previously exposed to selenium stored less of the element in their livers than rats from mothers fed a nonseleniferous stock diet.

Another factor that can influence the interpretation of chronic selenium toxicity experiments is the criterion used to assess the degree of response to a given dose of selenium. As indicated above, the most commonly used criteria in the past were growth inhibition and mortality. Other criteria used include liver damage, splenomegaly, pancreatic enlargement, anemia, and elevated serum bilirubin levels (e.g., see Halverson et al., 1966). Jaffe et al. (1972b) reported that excess selenium intake in rats decreased fibrinogen levels and prothrombin activities and elevated serum alkaline phosphatase and glutamic-pyruvic or glutamic-oxaloacetic transaminase activities. However, all these effects were observed only at selenium intakes that also depressed growth rate.

Pletnikova (1970) measured various biochemical indices in rabbits and rats that were given low doses of sodium selenite in aqueous solution for long periods of time. In this experiment, 32 rabbits and 16 rats were divided into 4 groups and given daily peroral doses of 0, 5.0, 0.5, and 0.05 µg selenium/kg body weight for 7½ months and 6 months, respectively. There was a significant increase in the concentration of oxidized glutathione in the blood of the rabbits given 5 µg/kg for 2 months, and hepatic sulfobromophthalein excretion and succinic dehydrogenase activity decreased after 7 months. Fewer and less-pronounced changes were caused by 0.5 µg/kg, while 0.05 µg/kg weakened the capacity for forming new conditioned reflexes. Although these responses were considered harmful effects of selenium, a peroral dose of 5 µg/kg in rats is roughly equivalent to the intake provided by a diet containing 0.063 ppm selenium. Since this level of dietary selenium intake is clearly within the nutritional range, the interpretation of this experiment is open to question. Most biologists would regard responses to selenium in the doses used by Pletnikova (1970) as physiological rather than pharmacological or toxicological effects.

SELENIUM TOXICITY IN FARM ANIMALS

Rosenfeld and Beath (1964) classified three distinct forms of selenium poisoning in livestock: (a) acute, (b) chronic of the blind-staggers type, and (c) chronic of the alkali-disease type.

In the field, acute selenium poisoning is caused by the ingestion of a large quantity of highly seleniferous accumulator plants in a short period of time. The experimental or accidental administration of selenium compounds has also produced acute poisoning in farm animals (NRC, 1976b). Signs of severe distress include labored breathing, abnormal movement and posture, and prostration and diarrhea, and are followed by death in a few hours. Acute selenosis is generally not a practical problem because livestock usually avoid the accumulator plants except when other pasture is not available.

Rosenfeld and Beath (1964) stated that blind staggers occurs in animals that consume a limited amount of selenium accumulator plants over a period of weeks or months, but this disease has not been produced in animals by the administration of pure selenium compounds. However, blind staggers can be mimicked experimentally by giving aqueous extracts of accumulator plants, so Maag and Glenn (1967) and Van Kampen and James (1978) suggested that alkaloids or other toxic substances in the accumulators may play a role in this syndrome. The affected animals have impaired vision, and they wander, stumble, and finally succumb to respiratory failure.

Animals that consume grains containing 5 to 40 mg selenium/kg over a period of several weeks or months suffer from chronic selenosis, known as alkali disease. Signs include liver cirrhosis, lameness, hoof malformations, loss of hair, and emaciation. Although Maag and Glenn (1967) were not able to demonstrate alkali disease experimentally in cattle given inorganic selenium, Olson (1978) cited several studies indicating that the disease is associated with the consumption of seleniferous grains or grasses and could be produced experimentally by feeding inorganic selenium salts.

There is no effective way to counteract selenium toxicity in livestock except to remove the animals from the areas with high selenium soils and close these areas to livestock production. Grains or grasses grown in these areas, however, can still be used if they are blended with crops from areas with lower selenium soils.

SELENIUM OVEREXPOSURE IN HUMANS

The public health aspects of excess selenium exposure first became of concern after the discovery that selenium caused alkali disease in livestock, since it was quickly realized that selenium from grains or vegetables grown on seleniferous soils could also enter the human food chain. Smith et al. (1936) surveyed rural farming and ranching families living in the Great Plains area of the United States known to have a history of alkali disease in livestock. No symptoms pathognomonic of human selenium poisoning were found, and no serious illness definitely attributable to selenium toxicity was observed. Vague symptoms of anorexia, indigestion, general pallor, and malnutrition were reported, and more pronounced disease states such as bad teeth, yellowish discoloration of the skin, skin eruptions, chronic arthritis, diseased nails, and subcutaneous edema were seen.

Since the results of this preliminary study were not clear, a second, more complete survey was carried out to delineate more exactly the symptomatology of human selenosis and its possible relationship to urinary selenium excretion (Smith and Westfall, 1937). To increase the probability of detecting effects of selenium overexposure, 100 subjects were selected from the earlier survey who had high levels of selenium in their urine. Once again it was concluded that none of the symptoms observed could be considered specific for selenium poisoning. However, numerous complaints of gastrointestinal disturbance were considered significant, and a high incidence of icteroid skin discoloration was thought perhaps to be related to liver dysfunction possibly caused by selenium ingestion. Bad teeth were also seen in 27 percent of the individuals surveyed. Other symptoms were reported so rarely that they did not appear to be associated with selenium.

Jaffe (1976) conducted a survey in Venezuela and compared children from a high-selenium region (Villa Bruzual) with those from Caracas. Average hemoglobin and hematocrit values were depressed in Villa Bruzual, but no correlation between blood and urine selenium levels and hemoglobin or hematocrit values was observed in specific individuals. Any differences in hemoglobin were thought to be more likely due to differences in nutritional or parasitological status rather than to differences in selenium intake (Jaffe et al., 1972a). Prothrombin and serum alkaline phosphatase and transaminase activities were normal in all children, and no correlation with blood selenium levels was found. Dermatitis, loose hair, and pathological nails were more common in children from the high-selenium region, and the clinical signs of nausea and pathological nails seemed to correlate with serum and urine selenium levels. But it was doubted that selenium was responsible for the increased incidence of those clinical signs since no differences attributable to selenium were seen in the various biochemical tests carried out (Jaffe et al., 1972a).

Nine cases of acute selenium intoxication were described by Kerdel-Vegas (1966) in persons who had consumed nuts of the “Coco de Mono tree” (Lecythis ollaria) from a seleniferous area in Venezuela. In most cases, nausea, vomiting, and diarrhea occurred a few hours after eating the nuts, followed by hair loss and nail changes a few weeks after the initial episode. Most patients appeared to make a satisfactory recovery, with eventual regrowth of hair and nails; but a 2-year-old boy died of severe dehydration. Samples of Brazil nuts marketed in Great Britain contained an average of 22 ppm of selenium (Thorn et al., 1978), and Chavez (1966) reported signs of selenium toxicity in rats fed diets that included defatted Brazil nut flour containing 51 ppm of selenium. Brazil nuts marketed in the United States also are high in selenium, with 6 percent of one sample containing 100 ppm or more (Palmer et al., 1982).

Although data from industrial exposure to selenium are limited, Glover (1976) has stated that “there have been no deaths or cases of irreversible pathological conditions due to selenium or its compounds being absorbed from industrial processes.”

A detailed description of an episode of endemic human selenosis was recently reported from the People's Republic of China (Yang et al., 1983). During the years of peak prevalence (1961 to 1964), selenium intoxication affected almost half of the 248 inhabitants of the 5 most heavily affected villages. Loss of hair and nails was the most common sign of the poisoning, but lesions of the skin, nervous system, and possibly teeth may have been involved in the areas of high incidence. The mean urinary selenium level found in this area of China with selenosis (2.68 µg/ml) was greater than even the maximum value (1.98 µg/ml) reported by Smith and Westfall (1937). The average blood selenium level in this high-selenium region of China (3.2 µg/ml) substantially exceeded the level that Jaffe et al. (1972a) concluded was hazardous to children (0.813 µg/ml). The selenium content of vegetable, cereal, hair, blood, and urine samples from the selenosis area was up to three orders of magnitude higher than that of corresponding samples from Keshan disease (selenium-deficiency) areas. The selenium entered the food chain from soils that had been contaminated by weathered coal of a very high selenium content (average greater than 300 µg/g).

Thus, with the exception of the Chinese experience, it has not been possible to identify any specific, definitive long-term human health problem due to selenium overexposure. This seems rather remarkable in light of the great inherent toxicity of selenium. However, it should be pointed out that others have held that human selenium poisoning is common, widespread, and, in certain localities, of importance to public health (Lemley, 1943). Kilness (1973) decried the fact that no subsequent systematic survey with appropriate controls had been made in South Dakota since the first surveys done over 30 years earlier. Moreover, Smith and Westfall (1937) were surprised by the absence of definite evidence of serious injury, especially in those subjects whose urinary selenium concentrations were markedly elevated.

Because of the lack of any well-documented selenium intake data during excess selenium exposure, a precise figure for an intake that would be harmful to humans cannot be given. Most of the subjects of Smith and Westfall (1937) were thought to be absorbing between 10 and 100 µg/kg body weight per day. This would be equivalent to a dietary selenium intake of 700 to 7,000 µg/day by a 70 kg man, if it can be assumed that all of the ingested selenium was absorbed. Tsongas and Ferguson (1977) could find no difference in the health status of two populations that drank water containing 50 to 125 µg selenium/liter or 1 to 16 µg selenium/liter, respectively. It was not possible to estimate the daily dietary intake of selenium in the endemic-selenosis area of China during the period of peak prevalence, but the dietary intake some time after the peak prevalence had subsided averaged 4.99 mg/day, with a range of 3.20 to 6.69 mg/day (Yang et al., 1983). A tentative maximum acceptable daily selenium intake of 500 µg for the protection of human health was proposed by Sakurai and Tsuchiya (1975). This intake was arrived at by first estimating that the usual average selenium intake by humans ranged between 50 and 150 µg/day. Intakes of 10 to 200 times normal were thought acceptable as an estimated range for the safety margin within which most persons could tolerate selenium. Multiplying the lower values of these two estimates gave the lowest level of potentially dangerous selenium intake, i.e., 50 × 10 or 500 µg/day.
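The estimates in the preceding paragraph reduce to simple arithmetic; the following sketch uses only values stated in the text:

```python
# Arithmetic behind the human selenium intake estimates discussed above.

# Smith and Westfall (1937): 10 to 100 ug Se per kg body weight per day,
# scaled to a 70 kg man (assuming complete absorption of ingested Se).
body_weight_kg = 70
low_intake = 10 * body_weight_kg    # 700 ug/day
high_intake = 100 * body_weight_kg  # 7,000 ug/day

# Sakurai and Tsuchiya (1975): the lower end of the usual intake range
# (50 ug/day) times the lower bound of the tolerable multiple (10x)
# gives the proposed maximum acceptable daily intake.
usual_intake_low = 50     # ug/day
safety_multiple_low = 10
max_acceptable = usual_intake_low * safety_multiple_low  # 500 ug/day

print(low_intake, high_intake, max_acceptable)  # prints: 700 7000 500
```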

Obviously, progress in selenium toxicology would be greatly enhanced if a more specific and sensitive test of selenium overexposure could be developed. Perhaps with the discovery of the role of selenium in GSH-Px (see Chapter 4) and the newly found inhibition of protein synthesis by selenodiglutathione (Vernie et al., 1979) such tests will be forthcoming in the future.

Copyright © National Academy of Sciences.
Bookshelf ID: NBK216723
