
Institute of Medicine (US) Vaccine Safety Forum; Evans G, Bostrom A, Johnston RB, et al., editors. Risk Communication and Vaccination: Summary of a Workshop. Washington (DC): National Academies Press (US); 1997.


Risk Perception and Decisionmaking²

There are many influences on how people perceive and respond to risks. Several participants noted that individuals' values, beliefs, and attitudes, as well as broader social and cultural values, strongly influence how risks are perceived and accepted. A better understanding of risks, consequently, will not lead to a uniform response to them. As an expert in risk communication noted, information alone does not resolve controversy. Good risk communication depends on understanding more than quantitative risks and benefits; background experiences and values also influence the process. For example, people who have a general mistrust of government or big business may be less likely to accept the vaccine risk estimates published by government health agencies or vaccine manufacturers.

Decisions about health risks were described by one speaker as being made not only on a rational basis but also on emotional, psychological, religious, spiritual, philosophical, and intuitive bases. This "cultural rationality" recognizes a richer range of influences on decisionmaking than does the narrower concept of rationality commonly used by experts in the field, according to a speaker who studies risk communication.

Studies show that voluntary, natural, and controllable risks are generally more accepted than risks that are imposed, outside an individual's control, or due to human-made causes. Risks that are familiar are also usually more accepted than those that are unfamiliar or hypothetical (Slovic et al., 1979; Lichtenstein et al., 1978; Fischhoff et al., 1978). Morgan (1993) uses controllability and observability as the two dimensions that characterize a hazard's "dreadfulness" and the degree to which it is understood (see Figure 1).

Figure 1. Illustration of observability and controllability for some common health hazards. Hazards can be characterized according to their degree of "dreadfulness" or controllability (horizontal axis) and the degree to which they are understood or are observable ...

Heuristics and Biases

Cognitive shortcuts or rules of thumb known as heuristics affect people's quantitative estimates of risk. Risk scientists have shown that these heuristics operate in regular and predictable ways, and their use can bias estimates of risk.

Anchoring refers to a lack of feel for absolute frequency and a tendency for people to estimate frequencies for a new event on the basis of the frequencies presented for other events. For example, if a person is told that 1,000 people a year die from electrocution and then is asked to estimate how many people die from influenza, his or her number is likely to be lower than if the person is first told that 45,000 people a year die in automobile accidents (Kahneman and Tversky, 1972). The tendency is to "anchor" on the first number and not adjust far enough from it. Consequently, which risk estimates are presented, how they are presented, and in what order may all affect how risks are perceived because of anchoring effects.

Compression is the overestimation of low-frequency risks and the underestimation of high-frequency risks (Fischhoff et al., 1993). If this applied to vaccine risks, people would behave as if the risk of rare adverse effects from vaccines were higher than reported.

Availability means that events that are easily remembered or imagined are more accessible or "available" to people, so that their frequencies are overestimated (Tversky and Kahneman, 1973). If, for example, a particular risk has recently or often been reported in the popular press, people may well overestimate its frequency. A science writer commented that people pay more attention to dramatic, new, or unknown risks, or to risks conveyed within the context of a personal story. Most people will give disproportionate weight to a dramatic risk, such as dying in an airplane crash, compared with the risk of dying from lung cancer due to smoking, even though the latter is more likely. Drama, symbolism, and identifiable victims, particularly children or celebrities, the science writer said, also make a risk more memorable.

When risks are given as verbal probabilities (e.g., likely, unlikely, rare, and common), interpretation depends on the context (Budescu and Wallsten, 1985; Wallsten et al., 1986). The phrase "likely to catch a cold" will be interpreted differently from "likely to become infected with HIV," for example.

Exposure refers to the fact that people tend to underestimate the cumulative effect of multiple exposures to a risk (Linville et al., 1983). In many instances of risk, the concern is about exposure over time, not necessarily from a single exposure alone. Communication of cumulative risk can be helpful in these instances. Cigarette smoking is an example of an exposure in which cumulative risk is important.

Comparisons. Risk is multidimensional, but when a communicator makes a risk comparison on the basis of one or two dimensions, people may assume that many dimensions are being compared and draw conclusions based on the broader comparison rather than the one intended. For instance, experts may say that the risk of an environmental exposure is inconsequential because on average it is low, but ordinary people might call for action because they fear that the risk falls disproportionately, and thus unfairly, on vulnerable groups.

Omission bias is the tendency to believe that an error of omission is less serious than an error of commission. That is, people tend to be more averse to a risk incurred by taking an action than one incurred by taking no action. For example, a University of Pennsylvania study found that nonvaccinators (parents who chose not to vaccinate their children) were more likely to accept deaths caused by a disease (that is, omitting vaccination) than deaths caused by vaccination (an act of commission) (Meszaros et al., 1996).

Framing, the way in which information is presented or the context into which it is placed, affects how risk communication messages are received. Studies show that a different framing of the same options can induce people to change their preferences among options (Tversky and Kahneman, 1973; Lichtenstein and Slovic, 1971). This is known as a preference reversal. For example, data on lung cancer treatment suggest that surgery has a higher initial mortality rate but radiation has a higher five-year mortality rate. In one illustration, 10 percent of surgery patients die during treatment, 32 percent have died one year after surgery, and 66 percent have died by five years. For radiation, 23 percent have died by one year and 78 percent by five years. When people are given these mortality statistics, they tend to be evenly split between preferring radiation and preferring surgery. When the same statistics are given as life expectancies (6.1 years for surgery and 4.7 years for radiation), there is an overwhelming preference for surgery (McNeil et al., 1982).

How information is framed can also affect whether people allow an omission bias to be a prime motivator of a decision not to vaccinate. One study of university students found that when the issue of responsibility was removed, subjects were more likely to opt for vaccination. Responsibility was removed by reframing the question as "if you were the child, what decision would you like to see made?" (Baron, 1992).

Other research shows that people tend to have a preference for eliminating risk and for maintaining the status quo (Thaler, 1980; Samuelson and Zeckhauser, 1988). Consequently, people often have an aversion to increasing the probability of one type of risk to reduce that of another, even by the same amount. They may even prefer a riskier situation over a less risky situation if the former maintains the status quo (Fischhoff et al., 1981).

Influences on and Biases of Experts

Experts in a particular area may or may not be less likely to exhibit, within their own field of expertise, the specific heuristics and biases discussed above. Experts also have their own biases. Their values, beliefs, and attitudes influence the form and content of the risk and benefit information that they present. In addition, organizational biases (such as whether experts are affiliated with a government agency promoting vaccination, a vaccine manufacturer, or a consumer organization concerned with vaccine safety) can also influence how experts view an issue.

Because of their professional training, experts' mental models and approaches to problem solving can differ fundamentally from those of nonexperts (Chi et al., 1981). For example, in their search to draw conclusions or solve problems, they may sometimes rely inappropriately on limited data, impose order on random events, fit ambiguous evidence into their own predispositions, omit components of risk such as human errors, and be overconfident in the reliability of analyses (Fischhoff et al., 1982; Fischhoff and Merz, 1995; Freudenburg and Pastor, 1992).

Footnotes

² This section is based on information presented by Ann Bostrom, Jacqueline Meszaros, Douglas MacLean, and Cristine Russell, as well as discussion among other participants.

Copyright 1997 by the National Academy of Sciences. All rights reserved.
Bookshelf ID: NBK233844
