Newberry SJ, Shekelle PG, Vaiana M, et al. Reporting the Findings of Updated Systematic Reviews of Comparative Effectiveness: How Do Users Want To View New Information? [Internet] Rockville (MD): Agency for Healthcare Research and Quality (US); 2013 May.


Results

The results of both focus groups and of the individual assessments by community physicians are combined and presented according to the unique characteristics of each format.

The Importance of Seeing What Changed

The questionnaire began by asking respondents to rate the importance of seeing what had changed in an update review. All four focus group 1 participants who responded to this question in advance said that it was either very important (5 on a scale of 1 to 5) or somewhat important (4 on a scale of 1 to 5) to see what had changed. During the meeting of focus group 1, one participant, an academic research physician with experience in developing evidence-based indications for care, speculated that this desire to see “what's changed” may be particular to academic physicians and policymakers. He said that the average clinician would actually prefer version 1 of the summary, a version that presents only what is most up to date. This comment was one factor that prompted us to add the third focus group, composed of practicing clinicians, to this project.

Among the focus group 2 participants who responded in advance, one said it was somewhat important (4 on a scale of 1 to 5) and one said it was neither important nor unimportant (3 on a scale of 1 to 5) to see what had changed. During the discussion, participants all agreed that they would want to see what had changed.

Both of the community physicians who responded rated the importance of seeing what had changed as somewhat important (4 on a scale of 1 to 5).

Using Gray “Highlighting” To Show Changes

Asked about version 2, in which new findings were shaded in gray, four questionnaire respondents in focus group 1 found the highlighting unhelpful and three found it helpful. However, when asked whether the shading would be more helpful if all the questions remained the same from the earlier version to the later version and only the quantitative findings changed, three agreed that the shading would be somewhat helpful. During the discussion, one focus group 1 participant stated:

There are two kinds of readers: people who just want to know what to do, and people who want to know why. For people who just need to know what to do, you could put the changes in bold. For those who want to know why, show the evidence behind the changes, secondarily.

Focus group 2 participants found the gray shading not to be helpful, saying that they would want to see not only the changes but also the context for the changes. The community physicians also found this version not to be helpful.

Using Track Changes To Show Changes

The original Version 3, which showed all the changes in track-changes mode, was rated poorly by all respondents in focus group 1; we therefore dropped this version from subsequent assessments.

Summarizing Findings in a Quantitative or Narrative Summary Table

When asked to rate the versions of the summary that included one or both of the summary tables, focus group 1 participants rated both versions as somewhat to very helpful. During the discussion, they agreed that for a report with only narrative results, the narrative summary table would be more useful than the quantitative table (and vice versa). They also reiterated the point that many clinicians would want to see only the new conclusions, but that anyone involved in setting guidelines would want to see both the old and the new information.

Focus group 2 and the community physicians were asked to comment on a version of the summary that included both the quantitative and the narrative tables (Version 6; see Appendix C). During the discussion, focus group 2 participants found these tables to be helpful and suggested further steps to increase their utility. These suggestions included showing how the new findings contributed to changing the conclusions; using color coding or different fonts to increase the salience of what changed; and including some wording that would put the changes in context. This group also agreed that they wanted to see why conclusions changed, and that merely providing the tables with version 1 or 2 of the summary would not suffice.

The community physicians rated version 6 as 2 (somewhat unhelpful) and 3 (neither helpful nor unhelpful) on a scale of 1 to 5.

A Skeletal Summary/Report: Tables and Figures With Little Free Text

Focus group 2 was also asked to provide feedback on a version of the summary that included very little text; this version was intended to emulate the mini-evidence review described by the HMO physician manager who participated in focus group 1 (version 7; see Appendix C). This version presented the scope of the report and the context in the form of a table and a conceptual framework, respectively; used quantitative tables to present the results for Key Questions whose findings were reported quantitatively; used bulleted lists to present the results of questions that were answered narratively; and used a modification of the qualitative table to present the conclusions. Limitations of the review and future research recommendations were also presented as bullet points. Focus group 2 participants generally liked this version.

We then asked focus group 2, during the discussion, how they would react to this version being substituted for the full evidence review. They liked the idea, but with reservations: The consensus was that more information was needed than the format allowed. They reiterated that it would be important to emphasize what changed and to provide adequate information for the reader to understand the reasons for the changes. One participant thought this version would be preferable for a guidelines creator (in contrast to version 3, which would be preferable for a general reader), whereas another participant emphasized that guidelines creators would need the equivalent of a full evidence report. However, a third participant thought version 4 might serve as a useful pointer to the full report, that is, as a standalone executive summary, rather than as a substitute for the full report.

Both community physicians who assessed the four different summaries overwhelmingly preferred version 4. One attributed her choice to her preference for graphic presentations of the findings over narrative text. The other physician said:

From a nonacademic community doc perspective, just to have a short summary on the conclusion for the Key Question would be most helpful. Hence, I thought the bullet points (version 4) was the easiest to understand… I think community docs usually take for face value that the conclusion is correct, and there is no need to prove that the conclusion is correct with a ton of citations.

Additional Issues Raised

A question raised and discussed at length in focus group 1 was, “Who are the users of EPC evidence reviews?” Although everyone agreed that the needs of users should dictate the format(s) in which the information is presented, no one was quite sure who the primary users of evidence reviews or updates are. Input from several of the participants painted a picture of the typical users of full evidence reviews as individuals and groups charged with setting health care policy and/or guidelines, including the Centers for Medicare & Medicaid Services, the FDA, the National Institutes of Health, State and some local health agencies, insurers/health plans, professional practice societies (e.g., the American Academy of Pediatrics), and organizations such as the American Heart Association. The participants emphasized that these kinds of users would need a full-length evidence review, with the executive summary serving as little more than a detailed table of contents.

A different group of users of evidence reviews, namely researchers and funders, was at the center of a discussion by focus group 2. The issue they raised was how evidence reviews identify and present knowledge gaps and future research needs. One participant wanted to see a more complete discussion of the gaps in the research (which version 4 presented as bullet points), stating that her agency uses this information to make decisions about future research needs. Another participant said that such a discussion is sometimes helpful but that it would often be more helpful if greater attention were paid to the formulation of these recommendations. He added that the evidence reviews are one piece of information used to determine “next steps.” It was mentioned that some groups have begun incorporating modeling and decision analysis in an effort to improve the usefulness of “future research needs” discussions.
