NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Institute of Medicine (US) Committee on Micronutrient Deficiencies; Howson CP, Kennedy ET, Horwitz A, editors. Prevention of Micronutrient Deficiencies: Tools for Policymakers and Public Health Workers. Washington (DC): National Academies Press (US); 1998.


2. Key Elements in the Design and Implementation of Micronutrient Interventions

The 1991 Montreal Conference1 focused worldwide attention on "Hidden Hunger." Not only were millions of individuals affected by deficiencies of vitamin A, iron, and iodine, but solutions to these micronutrient deficiencies were technologically possible. Since the 1991 conference, a number of donors have committed substantial financial resources to solving the problem of "hidden hunger." Other efforts, including the 1992 report of the International Committee on Nutrition,2 have helped stimulate national efforts to analyze nutritional problems and the resources available for their solution. Less attention, however, has been devoted to understanding the key elements needed to implement and sustain a micronutrient intervention on a fully operational scale, whether national, regional, or community-wide, as opposed to a pilot project scale.

Experience to date has shown that "how" an intervention is implemented may be as important as, or in some cases more important than, "what" is implemented. Some research has already been conducted on elements of successful nutrition interventions generally; in 1989, the USAID International Nutrition Planners Forum held a workshop in Seoul, South Korea, on "Elements of Successful Community Nutrition Programs."3 Similarly, a World Bank-funded project entitled "Successful Nutrition Programs in Africa: What Makes Them Work?"4 evaluated elements of effective nutrition interventions in the region.

Comparatively little work, however, has been done on the elements of effective micronutrient interventions. While the design and implementation elements of effective micronutrient interventions may be identical to those of nutrition interventions in general, it is worth taking a systematic look at the factors that account for, or constrain, their effectiveness.

This chapter draws on the discussions of the two working groups in the 5–7 December 1996 workshop (see the Appendix for agenda). Working Group I was charged with evaluating past experience with approaches to the prevention or correction of micronutrient malnutrition. The approaches examined included food-based strategies such as dietary change and fortification, supplementation, and other public health measures, including parasite control and delayed umbilical cord ligation. Working Group II looked more broadly at the major elements of success and constraint in past programs.

This chapter first describes the importance of iron, vitamin A, and iodine to health. It then considers options for successful interventions based on the level of development of the target country. The costs of interventions are also briefly reviewed.

In drafting this summary, the committee has followed two general rules. First, while elements of past experiences may differ among the specific micronutrients, the committee paid special attention to successful examples of strategies incorporating more than one micronutrient or including improvement in public health measures. Second, the committee and workshop participants agreed to base all findings and recommendations in this report on the data provided in three background papers, because these documents provided the substantive basis for discussion at the workshop. "Conventional wisdom" was not considered a sound basis for judgment in the absence of acceptable evidence. To streamline discussion, no references are provided in this chapter; the interested reader is encouraged to read the supporting papers.

The Importance Of Iron, Vitamin A, And Iodine To Health

The health and vitality of human beings depend on a diet that includes adequate amounts of certain vitamins and minerals that promote effective functioning of physiologic processes, including reproduction, immune response, brain and other neural functions, and energy metabolism. The body needs relatively minute quantities of these elements—i.e., measured in micrograms or milligrams—thus supporting their description as micronutrients. These elements are essential; they cannot be manufactured by the human body and must be obtained through dietary means. Deficiencies of most micronutrients are known to have devastating effects on health. They increase risk of overall mortality and are associated with a variety of adverse health effects, including poor intellectual development and cognition, decreased immunity, and impaired work capacity. The adverse effects of micronutrient malnutrition are most severe for children, pregnant women, and the fetus.

This report focuses on lessons learned from past interventions to address iron, vitamin A, and iodine malnutrition. The committee decided to limit its evaluation to these three micronutrients because it felt there was adequate experience with each and because iron, vitamin A, and iodine deficiencies alone are responsible for significant global mortality and morbidity. While discussion relates to these three micronutrients, workshop participants agreed that the lessons learned for improving future intervention strategies would also be applicable to prevention and control of malnutrition created by deficiencies of other micronutrients.

Iron

Iron is present in both heme and nonheme forms in the diet. Heme iron, the most bioavailable form, is found in the greatest quantities in animal sources such as red meat. Normal individuals absorb between 20 and 30 percent of dietary heme iron, while iron-deficient subjects absorb between 40 and 50 percent. Nonheme iron—which is absorbed less efficiently than heme iron—is most abundant in other sources of iron, including eggs and all vegetable roots, seeds, leaves, and fruits. Nonheme iron is also present in heme iron-containing and other animal tissues. Nonheme iron generally constitutes over 90 percent of dietary iron, particularly in the developing world. In contrast with heme iron, nonheme iron absorption is enhanced or inhibited by many dietary constituents. Heme-iron-containing proteins, ascorbic, malic, tartaric, and succinic acids, and many fermentation products are enhancers. Meat and alcohol, by promoting gastric acid production, also enhance nonheme iron absorption. Inhibitors include fiber, phytic acid and other polyphosphates, calcium, manganese, polyphenols such as tannins and other compounds present in food (seeds, other plant components, and many condiments), and beverages (e.g., tea, herbal infusions, coffee, and chocolate).

Iron deficiency (ID) has serious adverse consequences for health; the most evident is anemia. About one billion people worldwide suffer from clinical anemia. Severe anemia causes as many as one in five maternal deaths and is a major cause of childhood mortality in many developing countries. Other consequences of iron deficiency are impaired physical growth; potentially permanent adverse effects on neurological functions involving cognition, emotional behavior, reaction to and reception of stimuli, attention span, learning capacity, and neuromotor development and function; decreased capacity for physical work; lowered immunity, resulting in increased susceptibility to infections; and alterations in the reproductive process.

Vitamin A

Vitamin A activity is found in fruits and vegetables that contain green and yellow provitamin A carotenoid pigments and as the preformed vitamin in liver and breast milk. Humans need less than 1 milligram of vitamin A a day to maintain health, yet in 1995 an estimated 3 million children exhibited xerophthalmia annually; that is, they were clinically vitamin A-deficient and at risk of blindness. An additional 250 million children under 5 years of age were estimated to be subclinically vitamin A-deficient (based on the prevalence of serum retinol distributions below 0.70 µmol/L) and at risk of severe morbidities and premature death. These estimates do not include pregnant and lactating women living in areas of childhood vitamin A deficiency (VAD) endemicity, who are also likely to be in poor status but for whom epidemiological data are quite limited. A high prevalence of maternal night blindness and low breast milk levels of vitamin A are reported in such areas. A lack of sensitive, survey-applicable, nonclinical indicators specific to VAD, however, has hampered population-based evaluation of status among reproductive-age women and other age and sex groups.

Iodine

Iodine must be obtained from the environment, but it has been depleted from the soil and water in many areas of the world. WHO has estimated that over 1.5 billion persons in the world reside in regions of environmental iodine deficiency and are at risk of iodine deficiency disorders (IDD). The only recognized role of iodine in mammalian biology is as a component of the thyroid hormones, although there are data suggesting that iodine deficiency may be involved in fibrocystic disease of the breast. IDD is associated with goiter, cretinism, mental and neuromotor retardation, and reproductive impairment. Fetal and pre- and postnatal survival are also reduced by iodine deficiency.

The Continuum Of Population Risk

The nutrition status of all populations is in flux. Groups are in continuous movement along a continuum of nutritional risk, extending from a situation of severe micronutrient malnutrition, through a wide spectrum of presumed nutrient adequacy, to one of nutrient overload and toxicity. The latter state is not emphasized in this report. The committee has identified four levels of population risk.

  • Level IV (severe deficiency) is characterized by populations with severe micronutrient malnutrition. These populations can be said to be in a public health crisis with respect to vitamin A and iodine deficiency when clinical manifestations (xerophthalmia and goiter) are prevalent at the levels indicated in Table 2-1. Immediate therapeutic and prophylactic programs are needed. There is a lack of consensus on the prevalence cutoff for determining the urgency of therapeutic public health programs for anemia defined by hemoglobin level alone.5
  • Level III (moderate deficiency) is characterized by populations with moderate to severe micronutrient malnutrition where preventive or therapeutic public health programs geared to the level of severity are appropriate. This scenario is most frequently encountered in developing countries, but it can be found in regions of industrial countries as well.
  • Level II (mild and widespread deficiency) is characterized by populations with mild micronutrient malnutrition. This scenario is encountered in both developing and industrialized countries.
  • Level I (mild and clustered deficiency) is characterized by only selected, usually deprived, populations affected by micronutrient malnutrition. This scenario is most frequently encountered in regions of industrial countries.
TABLE 2-1. Population Prevalence of Clinical and Subclinical Signs of Iron, Vitamin A, or Iodine Deficiency by Level of Population Risk.


Table 2-1 provides the criteria for classifying populations into Levels IV to I for iron, vitamin A, and iodine status. In using this framework, program funders and implementers should note that populations within a single country may be at different levels of risk for these three micronutrients. For example, a given country may have populations at Level II with respect to iodine status, while the same or other populations are at Level III with respect to iron status. The criteria in Table 2-1 are based on prevalence rates of specific clinical and subclinical signs of iron, vitamin A, and iodine deficiency in specified subpopulations. In field studies, the standard clinical signs for deficiency of iron, vitamin A, and iodine, respectively, are: anemia in any high-risk group; xerophthalmia in preschool-age children, including night blindness; and presence of goiter in school-age children as determined by palpation or ultrasound. The subclinical indicators for deficiency of iron, vitamin A, and iodine, respectively, are: an iron deficiency indicator (usually serum ferritin); serum retinol level; and median urinary iodine concentration. The highest level of population risk assigned by any prevalence value takes precedence.
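The precedence rule above can be sketched in a few lines of Python. The per-indicator level assignments in the example are hypothetical illustrations only; they are not the prevalence cutoffs from Table 2-1.

```python
# Sketch of the precedence rule for assigning population risk level:
# each indicator (clinical or subclinical) maps a prevalence value to a
# level from I (mildest) to IV (most severe), and the most severe level
# assigned by any indicator takes precedence.
# The example assignments below are hypothetical, not Table 2-1 cutoffs.

SEVERITY = {"I": 1, "II": 2, "III": 3, "IV": 4}

def overall_risk(indicator_levels):
    """Return the most severe level among per-indicator assignments."""
    return max(indicator_levels, key=lambda level: SEVERITY[level])

# Hypothetical example: anemia prevalence alone suggests Level II, but
# serum ferritin suggests Level III, so the population is classified III.
print(overall_risk(["II", "III"]))  # -> III
```

The same rule applies within a single micronutrient (clinical vs. subclinical indicators) and, for program planning, across micronutrients when deciding where the most urgent need lies.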

The rationale for classifying populations across the three micronutrients examined in this report is an important one. Although the interventions for the three will differ, it was the consensus of the workshop participants that micronutrient deficiencies tend to cluster in populations. Thus, intervention programs should always consider strategies that meet the population needs with respect to multiple deficiencies where they may exist. While interventions on a single micronutrient may, in certain instances, be appropriate (for example, universal salt iodization, USI), the committee believes that strategies that focus only on a single micronutrient, without consideration of other micronutrient needs, should no longer be supported without careful consideration and justification.

In addition, the focus on both clinical and subclinical signs as criteria for determining severity of micronutrient status is a deliberate one. Workshop participants concluded that the prevailing focus of past intervention efforts on frank clinical signs of micronutrient malnutrition as a basis for determining need, while important for their severe health consequences and for serving as indices of risk in the community, has often resulted in neglect of the vast hidden problem of subclinical micronutrient malnutrition. The subclinical problems contribute to slowed human capital development and stagnant national economic development through reduced mental and work capacities and premature deaths. Future efforts should therefore place particular emphasis on the measurement, control, and ultimate prevention of subclinical deficiencies.

The purpose of any intervention program is to move at-risk populations along the continuum of risk, from higher to lower levels of risk (toward Level I). As this chapter will demonstrate, experience has shown that the timeframe for movement will depend both on the context of overall national development of the target population's country and the mix of interventions selected within that context.

Options For Successful Interventions

Past experience with strategies directed toward correction of iron, vitamin A, and iodine deficiencies demonstrates that there is a "toolchest" of potentially effective, complementary interventions available to both government and the private sector. Workshop participants agreed, however, that the key to past programmatic successes has not been just the availability of these tools—many programs have faltered even with their use. Experience suggests that it is the selection and adoption of the right mix of tools for a particular country or regional setting that can ensure success. Equally important, evaluation of limited past experience suggests that it is the persons and organizations using the tools that provide the critical means for building upon and extending the initial basis for success. Well-cited instances (see, for example, the description of the Thailand ivy gourd project in Chapter 4) support the contention that it is important that the people using the tools—ideally, providers at the local, regional, and country level—have a say in their choice, are educated as to their strengths and limitations in different circumstances, and are assured the means and capabilities to maintain the tools long after their donors have moved elsewhere. This may be achieved by building local "ownership," as would be done for many food-based or supplementation interventions, or by utilizing a viable industrial base and market, as might be the case for fortification. Although logical and popular, workshop participants agreed that this contention requires additional confirmation, given its limited testing and the considerable additional costs associated with securing and maintaining broad-based community involvement.

Availability of the toolchest alone, however, has not been sufficient to ensure program success. Workshop participants agreed that careful consideration and application of design and management strategies suited to local conditions and needs are critical to success. Unfortunately, these strategies have not often been given adequate consideration in the design of past interventions.

The following section describes some of the major tools available in the fight against micronutrient malnutrition. It should be emphasized that each has its strengths and limitations and domain of applicability, providing a powerful mix of options for improving micronutrient status in a population over varying periods of time.

Supplementation

Supplementation refers to the addition of pharmaceutical preparations of nutrients—capsules, tablets, or syrups—to the diet. Research has shown supplementation of adequate dosage and duration to be efficacious in treating, correcting, and preventing deficiencies of iron, vitamin A, and iodine in groups with serious health problems. A major challenge is to scale up supplementation to a program level that achieves adequacy in target group coverage, dosage, and frequency of dosing, thereby assuring effectiveness, which represents the combined impact of efficacy and the process of implementation. Supplementation has traditionally been considered "short-term," although it may usefully continue until effective alternatives are in place. Iron supplementation and iron therapy are currently part of the national health programs in the majority of developing countries and in many industrial countries. Periodic distribution of high-dose vitamin A supplements, either universal to all children of a specified age range or targeted to high-risk groups, has been the most widely applied intervention for treatment, prevention, and control of VAD. Iodine supplementation, whether by injection of iodized oil, drops of Lugol's solution, or tablets of iodine salts (sometimes disguised with chocolate), has been used less frequently than iron and vitamin A supplementation, but it can be an effective stopgap measure in populations with severe iodine deficiency until salt iodization becomes effective.

Strengths of supplementation include its immediate impact on micronutrient status, health, and survival. It can achieve rapid coverage in at-risk populations and be linked to the health care delivery system, and the cost of worker training is relatively low compared, for example, with that for dietary modification. A key limitation of supplementation—whether it is used to correct deficiencies of iron, vitamin A, or iodine—is that, because of inadequate targeting or coverage, deficient individuals may not be identified or reached routinely, and many at-risk persons, particularly in rural settings, can be missed. In addition, periodic high coverage has often not been sustained over time for a variety of reasons, including lack of sustained financial or political support or other overriding priorities in a limited health infrastructure. Finally, poor compliance by the target individual in taking a supplement has been a consistent reason for the low impact of many supplementation schemes. This is a particular problem in iron programs, which have traditionally relied on daily iron supplementation, although weekly dosing now appears to offer a cost-effective alternative. Thus, supplementation should be considered an essential and complementary bridge to more sustained measures such as food fortification, food-based approaches, and other supportive public health interventions. The example of Indonesia, which successfully shifted from an almost exclusive reliance on vitamin A supplementation to more varied strategies, such as fortification (see Chapter 4), supports this conclusion.

Fortification

Fortification refers to the addition of needed micronutrients to foods. Adequate consumption of fortified food has been shown to improve micronutrient status. The choice of a food vehicle or vehicles depends on a series of factors, including the target group, the food consumption patterns of the target group, and the availability and characteristics of the possible vehicle. With respect to the target group, food vehicles may differ when directed to the population as a whole (general fortification), to specific target groups (e.g., infants, schoolchildren, and refugees), or to defined socioeconomic or geographical areas (for example, urban, rural, and ethnic groups). The level of fortification should also be adjusted to match the food consumption practices of the target population, to avoid under- or overfortification. Foods should be selected for fortification on the basis of food consumption practices, stability, production and marketing characteristics, and cost.

Experience has shown food fortification to be a useful bridge to sustainable, long-term dietary change in populations at moderate and low levels of iron and vitamin A deficiencies (Level III to I). If the program is made universal through a commonly consumed product, fortification requires little government involvement in the creation of consumer demand or in the training of service delivery workers. In addition, it generally presents fewer logistical problems in supply than supplementation and its costs end up being borne almost exclusively by the private rather than the public sector. The experience with iron fortification is mixed. When cereal flour is used without long storage time, ferrous sulphate is cheap and effective, and ethylene diamine tetraacetic acid (EDTA) iron could potentially be a satisfactory fortificant once supply and cost issues are solved. Nevertheless, iron fortification of foods continues to be plagued by the absence of ideal compounds that would be favorably absorbed, stable and nonreactive, with little color and taste of their own, easily measurable for monitoring purposes, and inexpensive. The experience with vitamin A fortification of foods has also been mixed. Successful vehicles for vitamin A fortification have included sugar in Guatemala and margarine in the Philippines. Experience with the use of vitamin A-fortified monosodium glutamate (MSG) in the Philippines and Indonesia—where the product was highly effective, but showed color changes (yellowing) that the manufacturers feared would jeopardize sales—suggests that it is important to solve technical problems with the vehicle early on, while assuring that the population with the micronutrient problem consumes the vehicle in stable quantities on a regular basis. Selection of the form of the fortificant is important. For iodine and vitamin A, the options are more straightforward than for iron. With iron, no one form is superior in all vehicles and environmental conditions. 
Thus, careful evaluation of options is required before the most appropriate iron fortificant is selected. The advantages outlined for salt in the following paragraph also apply to iron fortification of flour in some countries and to vitamin A fortification of sugar. The same quality assurance precautions are required.

Fortification of salt with iodine has been a major public health success. It has a unique advantage among the micronutrient interventions because it requires no change in dietary habits in most instances: everyone uses salt. USI has therefore become the cornerstone for prevention of IDD. Programs, however, must take into account possible losses between the point of manufacture or importation and the consumer's table. Losses vary with the form of iodine used (iodide vs. iodate) and with heat, purity, humidity, packaging, shelf time, and cooking. Programs should also be designed around salt consumption patterns in order to ensure, as nearly as possible, an intake of iodine within the desired range. Similarly, a major need in the salt iodization process is keeping the level of added iodine within safe and effective limits. This means that, at the very least, the concentration of the commercial product must be measured at frequent intervals. Fortunately, the measurement technique is quite simple and reasonably accurate for practical purposes. Difficulties arise in implementing programs when the salt industry is widely dispersed among a large number of small producers. Ensuring distribution of iodine to all local producers is difficult, and compliance is a problem. Other foods that have been used successfully as vehicles for iodine fortification include bread and water. Salt, however, remains the preferred vehicle for fortification.
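The need to keep iodine intake within the desired range despite losses along the supply chain can be illustrated with a small back-calculation: the iodine level added at production must exceed the level needed at the table by enough to offset the expected losses. All figures in this sketch (target intake, salt consumption, loss fractions) are hypothetical illustrations, not recommended program values.

```python
# Sketch: choosing a salt iodization level that compensates for losses
# between production and consumption. All numbers used below are
# hypothetical illustrations, not recommended program values.

def required_iodine_ppm(target_intake_ug, salt_intake_g, loss_fractions):
    """ppm (mg iodine per kg salt) needed at production so that, after
    the given sequential losses (storage, transport, cooking, ...), a
    consumer of salt_intake_g per day still receives target_intake_ug."""
    retained = 1.0
    for loss in loss_fractions:      # e.g., 0.2 means 20% lost at a stage
        retained *= (1.0 - loss)
    needed_at_table_ppm = target_intake_ug / salt_intake_g  # µg/g == ppm
    return needed_at_table_ppm / retained

# Hypothetical: 150 µg/day target, 10 g salt/day, 20% storage loss and
# 20% cooking loss -> 15 ppm needed at the table, ~23.4 ppm at production.
print(round(required_iodine_ppm(150, 10, [0.2, 0.2]), 1))  # -> 23.4
```

The same logic explains why monitoring matters at more than one point: measuring concentration only at the factory says nothing about what survives storage and cooking.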

A less traditional form of intervention is the broad class of plant breeding strategies that are emerging to deal with micronutrient deficiencies. A number of pilot trials are now under way worldwide to examine nutrient-modified crops—high-iron rice is an example—to address serious micronutrient deficiencies. In addition, reduced phytate staple crops are being developed to increase the bioavailability of a range of micronutrients. If successful, these strategies are attractive because they do not require a modification in typical dietary patterns.

Food-Based Approaches

Food-based approaches attempt to correct the underlying causes of micronutrient deficiencies. These strategies are usually considered the ideal long-term goal toward which society strives: provision or assurance of access to a nutritionally adequate diet achieved through diversity of food availability, wise consumer selection, proper preparation, and adequate feeding. Nevertheless, the conventional assumption that food-based approaches represent the best strategy for correcting micronutrient deficiency in all circumstances needs to be reviewed carefully. If increases in homegrown foods can be effected, and this leads to increased intakes, then the dietary change minimizes the effect of lack of consumer access to markets and of fluctuations in market prices or food availability. Increased reliance on food-based approaches decreases reliance on the health care system as a means of nutrient supply (as with supplementation) and offers a source of nutrients through foodstuffs that may be available through foraging as well as through homegrown or procured foods. In the case of iodine deficiency resulting from the low iodine content of water supplies and locally produced foods, however, dietary change based on local foods is not an option. Food-based strategies also have only limited, short-term application for the prevention of iron deficiency when there are economic or religious constraints on increasing animal protein intake. In addition, efforts to change national consumption patterns of foods that interfere with iron uptake (e.g., tannin-containing teas, wheat-containing foods) have had limited success. Addition of iron absorption enhancers to the diet (e.g., vitamin C) is also being tried, but experience is limited. Studies have also shown that improving iron status as part of the complementary feeding of infants and very young preschoolers is impractical without fortification.
For improving vitamin A status, food-based approaches could be most effective where there is widespread availability, variability, adequacy, and acceptability of vitamin A-containing foods among targeted populations.

The cultivation of homegrown foods is not cost-free. In most cases the responsibility for cultivation rests with the females in the household. The issue of time constraints on women must be a key consideration in assessing the feasibility of home gardens to alleviate micronutrient deficiencies.

To at least some extent, past experience has shown that dietary modification can be brought about by nutrition education. To even begin to achieve such changes, however, may require five to ten years or more, assuming a stable economic and political environment. Dietary modification and changes depend not only on changes or modifications in agricultural activities such as crops raised and food production, but also on food marketing, preservation, and preparation. A critical institution is the infrastructure of the communications industry, including those who produce mass media and point-of-purchase advertising that may influence dietary change. Advertising and other advocacy measures should, however, be based on principles of sound nutrition.

Other Public Health Control Measures

Disease control methods—including immunization, parasite control, provision of sufficient water and public sanitation, control of diarrheal diseases and acute respiratory infections (ARI), and the teaching of personal hygiene and sanitation practices—are an important addition to, but should not be considered a replacement for, interventions that increase the micronutrient intake of deficient populations. For example, high measles immunization coverage can contribute importantly to VAD control, as documented in Tanzanian children by the threefold reduction in hospital admission for corneal ulceration associated with improved coverage (see Chapter 4). Similarly, treatment of hookworm infection and prevention of reinfection have been shown to decrease iron loss, and thus complement iron replenishment strategies (see Chapter 3). Since correction of micronutrient deficiency, in turn, improves response to immunization and other public health measures, simultaneous attention to improving nutrition status and ensuring effective public health measures can offer the most cost-effective interventions in deprived populations.

Costs Of Interventions

One of the most severe limitations in evaluating past strategies for prevention or correction of micronutrient malnutrition is the widespread lack of information on cost-effectiveness. The data that do exist suggest that interventions against micronutrient malnutrition, considered either separately or in any combination, offer a high return for a relatively low investment.

The following two tables derive from the 1994 World Bank report, Enriching Lives.6 Table 2-2 indicates that the direct costs of delivering nutrients as supplements or as fortified foods are low. In India and Guatemala, it cost US$0.12/year per person (in 1994 U.S. dollars) to fortify salt and sugar, respectively, with iron. In Guatemala, the costs of fortifying sugar with vitamin A were US$0.17/year per person. In India, it cost US$0.05/year per person to fortify salt with iodine.

TABLE 2-2. Costs of Micronutrient Control Programs.

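The per-person costs quoted above scale linearly with the population covered, which is how such figures are typically turned into program budgets. A minimal sketch follows; the figures are the 1994 U.S. dollar costs from the text, and the 10-million-person coverage in the example is a hypothetical illustration.

```python
# Sketch: scaling the per-person fortification costs quoted above
# (1994 US$) to an annual budget for a hypothetically covered population.

COST_PER_PERSON_YEAR = {                 # figures from the text above
    "salt + iron (India)": 0.12,
    "sugar + iron (Guatemala)": 0.12,
    "sugar + vitamin A (Guatemala)": 0.17,
    "salt + iodine (India)": 0.05,
}

def annual_cost(program, population):
    """Annual program cost, assuming cost scales linearly with coverage."""
    return COST_PER_PERSON_YEAR[program] * population

# Hypothetical 10 million people covered by salt iodization:
print(round(annual_cost("salt + iodine (India)", 10_000_000)))  # -> 500000
```

Linearity is an approximation: real programs carry fixed costs (equipment, quality assurance) that make small programs more expensive per person, a point relevant to the dispersed small-producer problem noted earlier.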

The costs of dietary change are less well documented than those of fortification and supplementation; most data derive from studies of vitamin A interventions (see Chapter 4). A project conducted in Nepal attempted a cost-effectiveness analysis of three vitamin A interventions: semiannual capsule distribution, capsule distribution piggy-backed onto primary health care (PHC), and nutrition education activities piggy-backed onto PHC. Distribution of vitamin A supplements was the least costly, followed by capsule distribution through PHC and then nutrition education. Similarly, when the costs and effectiveness of three vitamin A interventions—sugar fortification, capsule distribution, and gardening plus nutrition education—were examined in a study in Guatemala, the analysis reported the cost per high-risk person achieving adequate vitamin A status to be US$0.98/year for fortification, US$1.86/year for capsule distribution, and US$2.71/year to US$4.16/year for gardens. For the pilot HI Bangladesh gardening project, annual cost per target family averaged US$39.00. When disaggregated to the individual garden level, including operating costs for seeds/seedlings, crop protection (fencing), and irrigation, US$11.70/year was spent; without fencing and irrigation, the cost was US$3.00/year. The scaled-up national project, working through Bangladeshi nongovernmental organizations (NGOs), has reduced costs to an estimated US$8.33/garden, or US$1.50/individual.

Costs per life year gained from reductions in mortality and/or years lived free of illness and disability (disability-adjusted life years, or DALYs7) provide a similar picture (see Table 2-3). Fortification with iron costs US$4 per DALY saved; with iodine, US$8 per DALY saved; and with vitamin A, US$29 per DALY saved. The costs of supplementation are also relatively low. Providing pregnant women with supplemental iron costs US$13 per DALY saved, while supplementing all people under age 60 with iodine costs US$37 per DALY saved. Vitamin A supplementation of children under age 5 costs US$9 per DALY saved.
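The cost-per-DALY figures above can be tabulated and ranked in a few lines; the numbers are those quoted from Enriching Lives (1994 US$), and the labels are shorthand added for illustration:

```python
# Cost per DALY saved for the interventions discussed above (1994 US$).
cost_per_daly = {
    "Iron fortification": 4,
    "Iodine fortification": 8,
    "Vitamin A supplementation (children <5)": 9,
    "Iron supplementation (pregnant women)": 13,
    "Vitamin A fortification": 29,
    "Iodine supplementation (all <60)": 37,
}

# Rank from most to least cost-effective (lowest cost per DALY first).
for name, cost in sorted(cost_per_daly.items(), key=lambda kv: kv[1]):
    print(f"US${cost:>3}/DALY  {name}")
```

Ranking this way makes the text's point visible: fortification generally dominates supplementation on a cost-per-DALY basis, with iron fortification the least expensive option listed.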

TABLE 2-3. Returns on Nutrition Investments.


In summary, all the cost-effectiveness evaluations reviewed agree that fortified foods or capsule distribution are potentially the least expensive interventions, depending on whether a fortifiable food widely consumed by the high-risk group is available. Although fortified foods are likely to be the more sustainable investment, they require a food production or delivery infrastructure that may be lacking in many countries. Capsule distribution is a proven, time-limited (except in the case of iron supplementation in pregnancy), and cost-effective intervention if it is coupled with programs that deliver services effectively to target groups and the supply is consistent and adequate. Promotion of increased consumption and/or production of food is a viable option in most contexts where water is not critically scarce, but it requires a social marketing methodology to overcome socioeconomic and cultural barriers to behavior change, because the benefits are not always obvious. Quantifying nonmonetary benefits in order to estimate realistic cost-benefit ratios for each intervention is difficult. Nonetheless, over the long term, interventions that provide balanced, multinutrient improvements, such as nutrient-rich natural and/or fortified foods, are most likely to provide lasting benefit to recipients in deprived settings.

Feasibility Of Involving Key Societal Sectors In The Planning And Implementation Of Micronutrient Interventions: A Guide To Decisionmaking

Experience has shown that different strategies for prevention and control of micronutrient malnutrition are required for countries at different levels of development. Key elements for successful implementation of food fortification programs and for achieving sustainable longer-term dietary change include (1) the presence of a viable food industry; (2) available channels for food marketing and distribution; (3) a health care system that can help identify and monitor micronutrient malnutrition in the population and provide education and treatment for deficiencies; (4) an effective core community of persons that can provide necessary input into the planning, implementation, and evaluation of intervention programs and enhance the kinds of educational, marketing, and community outreach activities that will help ensure sustainability of the intervention over the longer term; and (5) for long-term dietary change, a level of population literacy that can allow for greater community involvement in decisionmaking and conduct of intervention programs (see Chapter 4 on vitamin A).

Table 2-4 presents the degree of availability and accessibility of these five elements across four levels of country development and for urban and rural populations as separate groups. The four levels of country development are: very poor countries, poor countries, middle-income countries, and industrial countries. This table can be used by local experts, in consultation with international donors, to help identify optimal approaches to intervention.

TABLE 2-4. Feasibility of Involving Key Societal Sectors in the Planning and Implementation of Micronutrient Interventions: A Guide to Decisionmaking.


Table 2-4 indicates that, for very poor countries, availability and access to established food industries, food marketing or distribution channels, health care systems, and developed community organizations are restricted. The overall level of literacy is also relatively low. Expertise that may be brought to bear on the planning and implementation of interventions tends to be more concentrated among relatively few individuals.

The situation in poor countries is slightly better overall; availability of and access to all five elements are increased. Access to the health care system is greater in urban settings, the presence of effective community organizations is greater, and the level of literacy is higher. There are notable exceptions, such as low-income countries like Cuba and Sri Lanka that have achieved broad access to health care, but these examples are rare. Generally, as national income increases, availability of and access to basic health services increase.8

It is in the middle-income countries that the presence of established food industries (in urban settings) and food marketing and distribution channels becomes more pronounced. Health care is more widely accessible in both urban and rural settings and there tends to be organized community activity. Rates of overall literacy are increased.

The five elements are well developed and established in industrial countries.

Elements Of Successful Interventions Across The Continuum Of Population Risk

This section and its accompanying tables combine information from the last three sections and offer a guide to the preferred initial approaches to prevention and control of iron, vitamin A, and iodine deficiencies in populations at the defined levels of risk. Workshop participants agreed on two guiding principles in developing these approaches. First, planning for sustained intervention should include consideration of all four strategies—supplementation, fortification, food-based approaches, and public health control measures—where appropriate and feasible. Only the relative emphases among the four approaches should differ, both by level of population risk and by phase of intervention. Second, the long-term goal of intervention should be to shift emphasis away from supplementation and toward a combination of food fortification (USI or iron-fortified flour, for example) and food-based approaches, where appropriate and feasible, to sustain change. In other words, as populations move along the continua of risk and resource availability, the relative mix of interventions should mirror those presented in the following four tables for the level indicated. Inherent in this categorization, however, is the caveat that the most appropriate intervention depends on the targeted age group. An obvious example is a 6-month-old, severely anemic child, who will likely be unaffected by a wheat iron-fortification program.

Interventions for Level IV Populations

For country planners and donors developing interventions for Level IV populations, Table 2-5a suggests that supplementation be given primacy as a first step. Supplementation should be universal in the case of iron and vitamin A. The potential for toxicity of iodine supplementation in individuals with normal iodine levels argues for an approach that is more closely targeted to vulnerable groups.

TABLE 2-5a. Preferred Initial Approaches to Prevention and Control of Iron, Vitamin A, and Iodine Deficiencies in Populations with Severe Micronutrient Malnutrition—LEVEL IV.


The mix of preferred approaches in Level IV populations should also stress USI where feasible, as well as opportunities for household-processed, food-to-food fortification of complementary and weaning foods that take advantage of traditional home and community preservation practices.

Intervention programs should also include elements directed toward changing the diets of the target populations. More attention should be paid at this stage to enhancing family capacity to grow nutrient-rich foods.

Public health measures that aim to control infectious diseases—for example, treatment of hookworm infection and prevention of reinfection and increasing measles immunization coverage—should be considered an essential complement to interventions for iron and vitamin A, respectively.

Interventions for Level III Populations

Preferred approaches to combating micronutrient malnutrition in Level III populations include movement away from complete reliance on universal supplementation to a greater emphasis on supplementation of vulnerable groups.

TABLE 2-5b. Preferred Initial Approaches to Prevention and Control of Iron, Vitamin A, and Iodine Deficiencies in Populations with Moderate to Severe Micronutrient Malnutrition—LEVEL III.

Approach    Iron    Vitamin A    Iodine
Supplementation
Targeted to vulnerable groups++++++++
Universal+++
Fortification
Targeted
Universal+++++++
Food-based approaches
Food, nutrition education++++++
Food production+++n.a.
Food-to-food++++++
Public health control measures
Immunization++++++++
Parasite control+++++
HW/S+++++
DD/ARI+++++
Personal sanitation/hygiene++++++++

NOTE: ++++, very strong emphasis; +++, strong emphasis; ++, moderate emphasis; +, light emphasis; —, no emphasis; food-to-food fortification, mixing of staple foodstuffs—e.g., mango with gruel—at the household level to enrich nutrient content; n.a., not applicable; HW/S, healthy water and public sanitation; DD/ARI, control of diarrheal diseases and acute respiratory infections.

Programs should shift to a greater emphasis on food fortification relative to supplementation, particularly with regard to USI and, to a lesser degree, vitamin A and iron. National capacity to fortify and distribute foods should be exploited whenever possible, with continued attention directed to opportunities for household-processed, food-to-food fortification.

Food and nutrition education should become a more important part of the mix, while programs to enhance family capacity to grow nutrient-rich foods should continue to be stressed.

Public health control measures should continue to be considered an essential complement to interventions for iron and vitamin A.

Interventions for Level II Populations

In Level II populations, the relative mix of approaches begins to emphasize interventions that promote dietary change and programs that make use of expanding national capabilities in food production and distribution. Supplementation programs directed toward vulnerable groups should continue for iron and vitamin A. USI remains the program of choice for iodine deficiency; fortification with vitamin A and iron can also be helpful, assuming some level of food processing—refining sugar, milling flour, and the like—in the target country.

Complementary public health measures to control parasitic and diarrheal diseases and ARI are generally not needed because of the usual absence of these diseases in Level II populations. Nevertheless, efforts directed toward maintaining immunization rates and ensuring sanitation and hygiene in home practices should continue to be stressed.

TABLE 2-5c. Preferred Initial Approaches to Prevention and Control of Iron, Vitamin A, and Iodine Deficiencies in Populations with Mild and Widespread Micronutrient Malnutrition—LEVEL II.

Approach    Iron    Vitamin A    Iodine
Supplementation
Targeted to vulnerable groups++++++
Universal
Fortification
Targeted
Universal+++++++++
Food-based approaches
Food, nutrition education++++++++
Food production+n.a.
Food-to-food
Public health control measures
Immunization++++++++
Parasite Control
HW/S
DD/ARI
Personal sanitation/hygiene++++++++

NOTE: ++++, very strong emphasis; +++, strong emphasis; ++, moderate emphasis; +, light emphasis; —, no emphasis; food-to-food fortification, mixing of staple foodstuffs—e.g., mango with gruel—at the household level to enrich nutrient content; n.a., not applicable; HW/S, healthy water and public sanitation; DD/ARI, control of diarrheal diseases and acute respiratory infections.

TABLE 2-5d. Preferred Initial Approaches to Prevention and Control of Iron, Vitamin A, and Iodine Deficiencies in Populations with Mild and Clustered Micronutrient Malnutrition—LEVEL I.

Approach    Iron    Vitamin A    Iodine
Supplementation
Targeted to vulnerable groups++++
Universal
Fortification
Targeted
Universal+++++++++
Food-based approaches
Food, nutrition education++++++++++
Food productionn.a.
Food-to-food
Public health control measures
Immunization++++++++
Parasite controln.a.n.a.
HW/Sn.a.n.a.
DD/ARIn.a.n.a.
Personal sanitation/hygiene++++++++

NOTE: ++++, very strong emphasis; +++, strong emphasis; ++, moderate emphasis; +, light emphasis; —, no emphasis; food-to-food fortification, mixing of staple foodstuffs—e.g., mango with gruel—at the household level to enrich nutrient content; n.a., not applicable; HW/S, healthy water and public sanitation; DD/ARI, control of diarrheal diseases and acute respiratory infections.

Interventions for Level I Populations

Food-based approaches and food fortification are the approaches of choice to address micronutrient malnutrition in selected, usually deprived, populations of Level I countries. Programs directed toward iron and, to a lesser degree, vitamin A supplementation of at-risk groups should be continued as needed, as should universal public health control measures such as immunization and education on personal hygiene and sanitation.

Balancing Approaches to Country-Specific Circumstances

Countries with micronutrient deficiencies at a public health level are usually confronted with multiple problems of underdevelopment and limited resources to deal with them. Setting priorities is essential, not a choice. A series of notable political events, beginning in 1990 with The World Summit for Children and the follow-up 1991 conference on Ending Hidden Hunger, focused world attention on micronutrient malnutrition. The preparatory process for the International Conference on Nutrition in 1992 and country-level follow-up actions have fostered national-level planning for micronutrient deficiency control that was virtually nonexistent in many countries before these high-profile political events. National planning often is done collaboratively with international and bilateral agencies because of reliance on their financial assistance for program follow-up. The caution is to ensure that internationally set, time-bound goals are driven by nationally determined, not donor-driven, considerations.

Coordinating Interventions Across Micronutrients

Malnutrition from a single micronutrient seldom occurs in isolation, but within the context of deprivation, including multiple vitamin/mineral deficits. Thus, it is attractive to conceive of dealing with all of these deficits concurrently. A careful analysis needs to be undertaken, however, to determine where program compatibility exists in areas of awareness, assessment, analysis of causes, and resources available for solutions. Coordinated strategies are technically feasible, but infrequently implemented.

Except for iodine, food-based approaches are the most logical vehicle for integrating micronutrient control programs. They avoid the potential incompatibilities of concentrated-dose supplements, such as solubility differences, susceptibility to oxidation, and competition for absorption. The situation with IDD control is different because the deficit cannot be corrected simply by growing more food, or a different variety of food, in the same iodine-depleted area. Furthermore, there is a proven, cost-effective IDD control intervention, universal iodization of salt, that should receive continued support, with oral iodine supplements used to control the problem in limited, unyielding situations. Nonetheless, there are opportunities for cost-saving, complementary activities in assessment, program selection and design, and delivery mechanisms to vulnerable groups where micronutrient deficiencies coexist.

Common Elements Of Successful Micronutrient Interventions

This section briefly details elements that the workshop participants identified as being common to all successful micronutrient interventions (see Table 2-6).

TABLE 2-6. Common Elements of Successful Micronutrient Interventions.


Political Will/Stability

Experience demonstrates that political will and stability are important factors in the control of micronutrient deficiencies. Political instability breeds failure, as demonstrated by the collapse of the initially successful Guatemalan salt fortification (iodine) and sugar fortification (vitamin A) programs following a period of political unrest. Working to ensure consistent signals from a broad spectrum of leadership affirming the importance of eliminating or reducing micronutrient disorders, however, can help catalyze both government and voluntary agency efforts. Key actors in this process are political and administrative leaders, those from the health sector, the business community, NGOs, and, when involved in such programs, international agencies. Respected and visionary local champions of the intervention should be sought and involved from the earliest stages of program development. These champions can be individuals, as was the case in the Ecuador salt fortification program, or industries, as was seen in the Nigerian salt fortification program. Political will can be further enhanced and maintained through the development of creative partnerships, such as that between the government of the Philippines and the private sector in the program to fortify margarine with vitamin A.

Strategic and Program Planning

A common element in the design and implementation of successful micronutrient programs is the development of effective strategies and planning processes. Strategic planning results in a clear set of impact objectives to be reached over a set timeframe, the choice of interventions, and the scale of operations necessary to achieve them within available resources. Program planning involves the formulation of process objectives and work plans. Decisions include choices of scale, targeting of particular beneficiaries, phasing and sequencing of activities, and selection of technologies. The planning process also addresses the development of effective systems for training, supervision, management, and logistics; the framing of work routines; and the allocation of tasks and functions. The successful experience in increasing use of the underutilized, vitamin A-rich ivy gourd in Thailand, which incorporated a majority of these elements, is a good example (see page 117). The flexibility to adapt program content to changing circumstances, including the lessons of implementation experience, is also a characteristic of successful intervention programs.

Community Involvement, Participation, and Consumer Demand

Involvement of the community at the point where interventions and beneficiaries intersect is a feature of some successful micronutrient programs; it also characterizes most of the programs that have had positive results against other forms of malnutrition. An excellent example is the program promoting horticultural interventions in home gardens in Bangladesh, in which committees at the state, district, block, and village levels provided guidance, coordination, and implementation (see page 123). Opinion is divided as to when and how best to involve individual communities: before the basic program framework is prepared or after. Both strategies appear to have helped generate appropriate levels of consumer demand for interventions. Involving each community in the original design of its own interventions, through techniques such as Participatory Rural Appraisal, may fully invest program ownership in the community; the price, however, might be a diversity of designs that the program management system cannot easily absorb. Presenting a community with a design that has proved workable in similar circumstances may limit its involvement to adaptation, but this has not proved a major impediment to beneficiary participation.

Physical and Administrative Infrastructure

Experience shows that when interventions have been "scaled up"—that is, increased in size and/or duration—results may be disappointing, in part because of the failure to anticipate the management and institutional capacity needed for ongoing operation. To ensure larger-scale or sustained accomplishment, the physical and administrative infrastructure must be appropriate. Among the indications that these conditions exist are the following, which apply regardless of the intervention being considered:

  • Physical infrastructure. This includes adequate communications capability (e.g., postal mail, telephones, faxes, e-mail), the presence of roads or other ways to reach the populations at risk, and special storage conditions where required.
  • Strategy and program design capability. These include the ability to identify optional strategies and program designs, to test them out, to choose best alternatives, and to evaluate and adjust programs on the basis of appropriate operations and management research. Selection of the most appropriate strategies and program designs also requires the capacity to adapt them to specific resource environments and constraints, along with the ability to measure program costs, efficiency, and effectiveness, as well as costs foregone through intervention outcomes. Part of the strategy and program design process requires clear specification of roles for concerned organizations and institutions, as well as administrative accountability at all levels of managerial and implementation responsibility.
  • Scaling-up skills. The initial success of many interventions is based primarily on the results of small-scale clinical trials. The ability to move to the national level from such small-scale endeavors needs to be validated through large-scale field demonstrations that include measures of effects.
  • Managerial capability. Training to strengthen or develop a management ethic and skills, and to promote management institutional development, including systems for administrative control, is an important but sometimes overlooked factor that influences program success. The establishment of appropriate process goals and a system and procedures for periodically assessing progress toward them is an important measure of managerial capability.
  • Budgetary resources. Resources consistent with achieving established impact objectives need to be made available. These include budgets adequate to develop, test, and choose among strategy options; to formulate and refine the program design; and to test and implement interventions at agreed operational levels for the time specified to reach program objectives.
  • Human resources. The capacity to define tasks and workloads realistically, and to train, deploy, supervise, and retain both employees and, where appropriate, volunteers must also be considered. Task-oriented training needs to take place initially and on an in-service basis, particularly for workers and supervisors in service delivery programs that involve supplementation and communications. Food-based approaches involving dietary modification require appropriate training of formal and informal educators in the use of both interpersonal and mass media resources. Supervisory tasks and ratios need to be geared to service delivery tasks and work routines.

Communications Strategies

Communications can play an important role in successful micronutrient programs by inducing target groups to improve their micronutrient-related behaviors. Depending on the specific operational context, successful communications strategies seek to (1) generate consumer demand for improved micronutrient status and/or (2) remove barriers to adoption of specific micronutrient-enhancing practices. Such strategies are critical to long-term program sustainability and effectiveness.

Communications is an important supportive measure in supplementation and fortification. It can be both a supportive and a leading intervention in the area of dietary modification. In micronutrient supplementation regimes, motivating consumers to demand improved micronutrient status as a personal benefit can lead to higher coverage rates, better compliance, and more efficient implementation. Regarding fortification, public demand for better micronutrient status plays a part in both consumption of the fortified product and in encouraging administrative bodies to adopt and enforce quality-control and other regulatory mechanisms. In the area of dietary practices, appropriate communications interventions can persuade consumers to prepare existing menus in micronutrient-favorable ways and/or to diversify their diets to include new sources of micronutrients.

Successful communications strategies include: (1) market segmentation, that is, identification of groups whose attitudes and behavior are to be affected; (2) definition of the specific changes sought for each group; (3) understanding of the barriers to such changes; (4) selection of suitable communications channels; and (5) the development and testing of appropriate messages.

In most cases, a comprehensive communications strategy will need to address specific segments of the general public, with attention to both target groups for a given intervention and those who influence the micronutrient behavior of such groups. Health workers and managers, from the community to the tertiary care levels, would often need to be included in the strategy, and in most cases would require reorientation, training, and materials support to do so. The potential role of policymakers, particularly in health, agriculture, education, industry, and finance also would need to be analyzed, particularly when such officials could affect resource flows, public perceptions, or other key aspects of the communications process. Two good examples of successful use of communications strategies in building support for and implementing effective interventions were the joint iodine fortification/supplementation program in Ecuador (see page 180) and the experience in applying social marketing methods to increase use of locally available vitamin A-rich foods in Thailand (see page 117).

Use of Appropriate Vehicle

The choice of an appropriate vehicle for the micronutrient and/or intervention strategy selected should take into account bioavailability, safety, side effects, and public acceptance. The vehicle should be consistent with best practice as determined by comparison with similar programs or well-documented research in pilot or clinical programs.

An example of an inappropriate choice of vehicle was the Indonesian experience of fortifying MSG with vitamin A (see page 133). Although the vehicle was universally consumed, the resulting fortified product was yellow rather than white and proved unacceptable. The widely accepted use of iodized oil in Ecuador (see page 180) and vitamin A-fortified margarine in the Philippines (see page 134) indicates, however, that selection of an appropriate food vehicle is an important determinant of program success. Genetically modified crops appear to offer opportunities to increase yield and to increase micronutrient content or bioavailability; their acceptance by the public, however, needs to be addressed.

Sustainability

Sustainability, as used here, refers both to the continuity of a successful intervention and to the continuation of a significant, positive impact on the intended beneficiaries. The first kind of sustainability thus relates to process, the second to outcomes.

Three factors are essential for sustainability: efficacy, appropriateness, and demonstrated feasibility. Clearly one would only want to sustain an intervention that has "worked." The assumption is that a policy or program has been implemented that addresses the micronutrient need of a particular population. In order to continue to effectively operate the intervention, an institutional structure is needed that will allow for ongoing capacity for management. One common finding in public health interventions, in general, is that successful approaches are the ones designed and managed as part of research and/or pilot projects.

Cost clearly influences sustainability. Programs based on permanent reliance on external funding are usually not viable in the long term; at the same time, precipitous withdrawal of external funding may also doom projects. A gradualist approach, consistently agreed upon by all parties, may be optimal. There are now examples of effective transitions from total donor funding to full national financing. The Indonesia vitamin A program is an excellent example of an intervention that evolved over a 20-year period from 100 percent donor support to a program funded entirely by government monies. The time period is also critical; for most countries, it is unrealistic to expect this transition to occur in 3 to 5 years. Micronutrient interventions such as the Indonesia vitamin A program, in which the donors and the host country plan for this transition from the initial stages, have been the most successful.

Micronutrient interventions that continue to achieve a significant impact on the target individuals are projects that are flexible enough to respond to the changing needs of the client. Typically this involves a combination of approaches to address a particular micronutrient. For vitamin A, as an example, a combination of strategies is most effective in reducing vitamin A deficiency in a given area. Each country must determine the most cost-effective mix of interventions.

Information Systems, Monitoring, and Evaluation

Monitoring and evaluation are essential program elements. They are vital for ensuring and improving efficiency of program operations—reaching the target group in a cost-effective fashion. Monitoring may provide early warning signs that either program operations are faltering or that prevalence of micronutrient malnutrition is rising in one or more groups. Protocols for monitoring and evaluation must be developed as part of the overall program design and implemented as part of the program. Programs that have not done so have inevitably failed. Projects that have incorporated strong monitoring and evaluation components, such as the two programs promoting home gardens in Bangladesh described in Chapter 4 (see page 123), have been successful and have been sustained.

Indicators appropriate for monitoring intervention impact will vary with the intervention objective. For example, program objectives may be to improve coverage of iron-supplement recipients; to ensure that a vitamin A-fortified food meets quality assurance standards or is selected for consumption by target groups; to change food consumption behaviors, such as the frequency of consumption of dark-green, leafy vegetables (DGLV); or to increase the year-round availability of vitamin A-rich food in household or community gardens. The appropriate intervention-specific impact indicator(s) for each of these objectives will differ; in some cases process indicators will be appropriate, and in others biological indicators will be the most useful. If the desired outcome of the intervention is to document a change in the vitamin A status of the recipient population, biological indicators are ideal. The use of impact indicators can be limited, however, where they are difficult to measure and a scaled-up intervention is unlikely to have the precision necessary to demonstrate impact. In such cases, process indicators should be substituted; for example, demonstrating that the target population received and ingested the vitamin A supplement is sufficient. Strong evidence is already available that vitamin A supplements reduce vitamin A deficiency and childhood mortality and morbidity; it is not necessary to repeat these impact measurements.

Resource availability can also limit the feasibility of direct biological evaluations, because these indicators are usually more costly to obtain and interpret than indirect indicator data. In such situations, outcomes from metabolic and/or controlled community studies lend credence to causal inferences drawn from similar outcomes of interventions implemented under less rigorously controlled conditions. Inability to perform biological evaluations should not, by itself, prevent the initiation, or force the termination, of vitamin A deficiency (VAD) control programs when and where such programs are needed.

Biological Indicators

Population monitoring of iron deficiency is difficult. The responsiveness of the left tail of a hemoglobin (Hb) distribution curve is probably the best and least expensive indicator of iron-deficiency anemia, but it is inadequate for measuring iron reserves. Despite the limitations noted in the background paper on iron, serum ferritin is likely the best indicator of iron status. In developing countries with high initial prevalences of anemia (and hence a prevalence of subclinical iron deficiency approaching 100 percent), however, assessing hemoglobin levels is sufficient.
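The left-tail approach described above amounts to a simple computation: anemia prevalence is the fraction of a survey sample whose hemoglobin falls below an age- and sex-specific cutoff. The sketch below illustrates this; the 11 g/dL cutoff and the sample values are illustrative assumptions, not data from this report.

```python
# Illustrative sketch: estimating anemia prevalence from the left tail
# of a hemoglobin distribution. Cutoff and sample values are hypothetical.

HB_CUTOFF_G_DL = 11.0  # assumed cutoff, e.g., for young children


def anemia_prevalence(hb_values, cutoff=HB_CUTOFF_G_DL):
    """Fraction of the sample with hemoglobin (g/dL) below the cutoff."""
    if not hb_values:
        raise ValueError("empty sample")
    below = sum(1 for hb in hb_values if hb < cutoff)
    return below / len(hb_values)


# Hypothetical survey sample (g/dL)
sample = [9.8, 10.5, 11.2, 12.0, 10.1, 11.8, 9.5, 12.4, 10.9, 11.5]
print(f"Anemia prevalence: {anemia_prevalence(sample):.0%}")
```

Note that this measures only the size of the left tail, which is why, as stated above, it indicates anemia prevalence but says nothing about iron reserves.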

VAD, like iron deficiency, is difficult to monitor. In the view of the workshop participants, process indicators can monitor most programs as accurately as any single biological indicator, and at less expense; which to use depends on the mix of program strategies. Clinical indicators require very large sample sizes because clinical signs are rare events. Night blindness is useful only in some populations and does not detect all subclinical VAD. The shift in the left tail of the serum retinol distribution curve among populations of young children is likely the best basis for biological monitoring.

Monitoring of progress against iodine deficiency will usually involve both process and biological indicators. In a highly endemic area, goiter prevalence might be an appropriate initial indicator, but as a control program progresses, the overall goiter rate becomes inadequate, because adult goiters are often fibrotic and thus persist even after iodine deficiency is corrected. Goiter incidence in school-age children, however, could be appropriate until it becomes quite low. In contrast, median urinary iodine reflects the current intake of a population. Coverage can usually be monitored adequately, and at least expense, by nonbiological process indicators such as the number of households in which iodized salt or another fortified food vehicle is consumed. To monitor quality control, however, quantitative laboratory measurement of iodine levels in batches of salt, or of median urinary iodine relative to cutoffs reflecting the desired level of iodine intake, is appropriate. Where the level of development favors institution-based deliveries, neonatal thyrotropin (TSH) would be possible, provided a screening program for congenital hypothyroidism is already in place, as it is in much of Europe and other developed regions of the world, but it is more expensive than urinary iodine. Median urinary iodine in representative school-age populations is likely the best indicator for long-term monitoring of iodine status and for quality assurance of adequate salt iodization levels.
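The median urinary iodine monitoring described above can be sketched as a small computation: take the median of a school-age survey sample and compare it with intake cutoffs. The category boundaries below follow commonly cited WHO-style criteria and, along with the sample values, are stated here as assumptions for illustration rather than as recommendations of this report.

```python
# Illustrative sketch: classifying population iodine status from the
# median urinary iodine (UI) concentration in a school-age sample.
# Cutoffs follow commonly cited WHO-style criteria (assumed here).
import statistics


def classify_median_ui(ui_ug_per_l):
    """Return (median, category) for a list of urinary iodine values (ug/L)."""
    median = statistics.median(ui_ug_per_l)
    if median < 20:
        category = "severe deficiency"
    elif median < 50:
        category = "moderate deficiency"
    elif median < 100:
        category = "mild deficiency"
    elif median < 200:
        category = "adequate intake"
    else:
        category = "above requirements"
    return median, category


# Hypothetical school-age survey sample (ug/L)
sample = [45, 130, 80, 95, 110, 60, 150, 70, 88, 120]
median, category = classify_median_ui(sample)
print(f"Median UI = {median} ug/L: {category}")
```

The median, rather than the mean, is used because urinary iodine distributions in populations are typically skewed, which is consistent with the text's emphasis on median urinary iodine as the long-term monitoring indicator.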

Footnotes

1. "Ending Hidden Hunger," UNICEF/WHO Conference, Montreal, Canada, October 1991.

2. International Conference on Nutrition, FAO/WHO, Rome, Italy, December 1992.

3. International Nutrition Planners Forum Workshop, "Elements of Successful Community Nutrition Programs," Seoul, Republic of Korea, August 1989.

4. Kennedy, E. 1991. Nutrition Interventions in Africa: What Makes Them Work? World Bank Working Series in Population, Health, and Nutrition. Washington, D.C.

5. Hemoglobin is a biochemical measurement that in some populations may not closely correlate with adverse health consequences at the cutoff used for defining anemia (e.g., when the hemoglobin distribution curve is narrow, reflecting most values at or only slightly below the cutoff). When the distribution curve is skewed to the left, the risk of adverse health effects and the need for public health interventions become increasingly urgent. Level IV, therefore, refers to a situation in which the hemoglobin distribution is substantially skewed to the left.

6. World Bank. 1994. Development in Practice: Enriching Lives: Overcoming Vitamin and Mineral Malnutrition in Developing Countries. Washington, D.C.: The World Bank.

7. The DALY is an indicator of the time lived with a disability and the time lost through premature mortality. Years lost due to premature mortality are estimated in the context of the standard expectation of life at each age. Years lived with disability are translated into an equivalent time loss through multiplication by a set of weights that reflect reduction in functional capacity. In both cases, the losses are weighted according to a particular set of "value choices": the value of time lived at different ages (age weights) and time periods (discounting). (For a fuller justification of the conceptual framework underlying the DALY, see Murray, 1994, and Murray and Lopez, 1996.)

Murray, C. J. L. 1994. Quantifying the Burden of Disease: The Technical Basis for Disability-Adjusted Life Years. Bulletin of the World Health Organization 72(3):429–445.

Murray, C. J. L., and Lopez, A. D. 1996. Global Burden of Disease and Injury, Vol. 1. Boston: Harvard University Press.
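The DALY described in footnote 7 sums years of life lost to premature mortality (YLL) and disability-weighted years lived with disability (YLD). A much-simplified sketch follows: it applies only constant continuous discounting and omits the age weights the footnote mentions, and the 3 percent rate, disability weight, and case values are illustrative assumptions rather than figures from the report.

```python
# Simplified DALY sketch: DALY = YLL + YLD with continuous discounting,
# omitting the age weights described in footnote 7. Discount rate and
# disability weight are illustrative assumptions.
import math


def discounted_years(duration, rate=0.03):
    """Present value of one healthy year per year over `duration` years,
    continuously discounted at `rate`."""
    if rate == 0:
        return duration
    return (1 - math.exp(-rate * duration)) / rate


def daly(years_of_life_lost, disability_duration, disability_weight, rate=0.03):
    """Years lost to premature death plus disability-weighted years lived
    with the condition."""
    yll = discounted_years(years_of_life_lost, rate)
    yld = disability_weight * discounted_years(disability_duration, rate)
    return yll + yld


# Hypothetical case: death 30 years early, preceded by 5 years lived
# with a disability weighted at 0.2
print(f"DALYs = {daly(30, 5, 0.2):.2f}")
```

Discounting means that a year of healthy life lost far in the future counts for less than one lost now, which is the "time periods (discounting)" value choice the footnote refers to.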

8. World Bank. 1993. World Development Report 1993: Investing in Health. New York: Oxford University Press for the World Bank.

Copyright 1998 by the National Academy of Sciences. All rights reserved.
Bookshelf ID: NBK230111
