Newberry SJ, Ahmadzai N, Motala A, et al. Surveillance and Identification of Signals for Updating Systematic Reviews: Implementation and Early Experience [Internet]. Rockville (MD): Agency for Healthcare Research and Quality (US); 2013 Jun.

Methods

This report covers the period from June 2011 to June 2012 and is an interim analysis that summarizes the assessment of 14 CERs. The objective of the surveillance assessment system is to identify signals of the potential need for updating and not to conduct the actual update.

Identifying New Evidence From Published Studies

Search Strategy

The surveillance assessment system was designed to be implemented at 6-month intervals. The process starts with the assessment of a CER 6 months after its publication on the AHRQ Web site. CERs determined to be up to date in the first cycle are reassessed 6 months after completion of the previous assessment; CERs determined to be clearly out of date after the first assessment are not reassessed.
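For illustration only, the reassessment cadence described above can be expressed as a simple scheduling rule; the sketch below is not part of the surveillance system, and the function name and status labels are assumptions.

    from datetime import date
    from dateutil.relativedelta import relativedelta  # third-party package: python-dateutil

    def next_assessment(last_event: date, status: str):
        """Return the date of the next surveillance assessment for a CER, or None.

        status reflects the outcome of the most recent cycle (labels are illustrative):
          "new"         -- CER just published; assess 6 months after publication
          "up_to_date"  -- reassess 6 months after completion of the previous assessment
          "out_of_date" -- clearly out of date; not reassessed
        """
        if status == "out_of_date":
            return None
        return last_event + relativedelta(months=6)

    # Example: a CER published June 1, 2011, is first assessed around December 1, 2011.
    print(next_assessment(date(2011, 6, 1), "new"))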

Starting with the search strategy employed in the original report, we conducted a limited literature search that included MEDLINE/PubMed and/or the Cochrane Library and, on a topic-specific basis, additional databases. The search included five general medical interest journals (Annals of Internal Medicine, British Medical Journal, Journal of the American Medical Association, Lancet, and the New England Journal of Medicine) and the specialty journals most relevant to the topic. The specialty journals were those most highly represented among the references of the original report. In general, we followed the search strategy from the original CER; however, we made some modifications. For example, if we were aware of new drugs for the condition, their names were added to the search terms. Search inception dates were set 6 to 12 months prior to the end date of the original CER search, in order to ensure overlap between the searches.
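As an illustration of a date-limited, journal-restricted search of this kind, the sketch below queries PubMed through the NCBI E-utilities using Biopython's Entrez module. The topic terms, dates, and e-mail address are placeholders, not the actual strategy used for any CER.

    from Bio import Entrez  # Biopython

    Entrez.email = "surveillance@example.org"  # required by NCBI; placeholder address

    # Placeholder topic terms, restricted to the five general medical interest journals.
    journals = (
        '"Ann Intern Med"[ta] OR "BMJ"[ta] OR "JAMA"[ta] OR '
        '"Lancet"[ta] OR "N Engl J Med"[ta]'
    )
    query = f"(example condition[tiab]) AND ({journals})"

    # Start the search 6 to 12 months before the original CER's search end date
    # so that the two searches overlap (dates below are illustrative).
    handle = Entrez.esearch(
        db="pubmed",
        term=query,
        datetype="pdat",
        mindate="2010/12",
        maxdate="2012/06",
        retmax=500,
    )
    record = Entrez.read(handle)
    handle.close()
    print(record["Count"], "candidate records; first PMIDs:", record["IdList"][:10])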

Study Selection and Abstraction

In general, we also used the same inclusion and exclusion criteria as the original CER. A single reviewer, experienced in systematic reviews, screened the titles and abstracts and requested any articles deemed relevant to the topic. A single reviewer then extracted relevant data from the articles that met the inclusion criteria and constructed an evidence table. These data included any study-level details extracted in the original CER (e.g., sample size, study design, and outcomes measured) as well as the outcomes themselves.
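A minimal representation of one evidence-table row is sketched below; the field names are assumptions based on the study-level details listed above, not a prescribed schema.

    from dataclasses import dataclass, field

    @dataclass
    class EvidenceTableRow:
        """One newly included study abstracted for the surveillance evidence table (illustrative)."""
        citation: str                          # e.g., first author, journal, year
        key_question: str                      # Key Question from the original CER
        study_design: str                      # e.g., "RCT", "prospective cohort"
        sample_size: int
        outcomes_measured: list = field(default_factory=list)
        findings: str = ""                     # brief summary of the outcomes themselves

    row = EvidenceTableRow(
        citation="Example et al., 2012",       # placeholder
        key_question="KQ1",
        study_design="RCT",
        sample_size=240,
        outcomes_measured=["primary efficacy outcome"],
        findings="No significant difference between arms at 24 weeks (illustrative).",
    )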

Identifying New Evidence From the U.S. Food and Drug Administration, Health Canada, and Medicines and Healthcare Products Regulatory Agency (MHRA) (UK)

At monthly intervals, the ECRI EPC, under contract with AHRQ, monitored the U.S. Food and Drug Administration (FDA MedWatch), Health Canada, and MHRA Web sites for any new regulatory information or safety alerts about drugs relevant to the CERs under review. This information was forwarded to the SCEPC or UOEPC as appropriate and included in the final summary tables if deemed relevant. Appendix B outlines the methods the ECRI EPC used.
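A generic way to poll agency announcements for drugs of interest is sketched below using the feedparser library; the feed URLs and drug names are placeholders, and this is not the procedure documented in Appendix B.

    import feedparser  # third-party RSS/Atom parser

    # Placeholder feed URLs; the real agency feeds or pages would be substituted here.
    FEEDS = {
        "FDA MedWatch": "https://example.org/fda-medwatch.rss",
        "Health Canada": "https://example.org/health-canada-advisories.rss",
        "MHRA": "https://example.org/mhra-drug-safety.rss",
    }

    DRUGS_OF_INTEREST = {"exampledrug", "otherdrug"}  # drugs relevant to the CERs under review

    def monthly_check():
        """Return (agency, title, link) for feed entries mentioning a drug of interest."""
        hits = []
        for agency, url in FEEDS.items():
            feed = feedparser.parse(url)
            for entry in feed.entries:
                text = (entry.get("title", "") + " " + entry.get("summary", "")).lower()
                if any(drug in text for drug in DRUGS_OF_INTEREST):
                    hits.append((agency, entry.get("title", ""), entry.get("link", "")))
        return hits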

Identifying New Evidence From Experts and Expert Opinion

For each topic, a questionnaire matrix that listed the Key Questions and conclusions from the original executive summary was created. The matrix was sent to experts in the field, including the original project leader, technical expert panel members, and peer reviewers. These experts were asked to complete the matrix, indicating whether each listed conclusion was, to their knowledge, still valid, and if not, to provide information about new evidence (see Table 1 below).

Table 1. Sample questionnaire matrix.

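One way to generate a blank matrix of this kind for distribution to experts is sketched below; the CSV format and column headings are assumptions for illustration.

    import csv

    def write_questionnaire_matrix(path, conclusions):
        """Write a blank expert questionnaire matrix.

        `conclusions` is a list of (key_question, conclusion_text) pairs taken from
        the original CER's executive summary.
        """
        with open(path, "w", newline="", encoding="utf-8") as f:
            writer = csv.writer(f)
            writer.writerow(["Key Question", "Original conclusion",
                             "Still valid? (Yes/No)", "New evidence (citation or note)"])
            for key_question, text in conclusions:
                writer.writerow([key_question, text, "", ""])

    write_questionnaire_matrix(
        "questionnaire_matrix.csv",
        [("KQ1", "Example conclusion text from the executive summary.")],  # placeholder
    )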

Check for Qualitative and Quantitative Signals

Once abstraction of the study conditions and findings for each new included study was completed and expert opinions were received, we assessed, on a conclusion-by-conclusion basis, whether the new findings provided a signal of the need to update, according to the Ottawa Method and/or the RAND Method. If new studies were deemed sufficiently similar to studies included in a pooled analysis in the original CER and were sufficiently large with respect to sample size, a new meta-analysis was conducted using the original pooled effect size as one data point in a random-effects model. Table 2 lists the criteria used for reaching conclusions.7,9

Table 2. Ottawa and RAND Method.


The Ottawa Method involved detection of qualitative and/or quantitative signals indicating the need for updating through the assessment of new evidence, using specific categories for qualitative (A1–A7) and quantitative (B1–B2) signals, as reported in Table 2. Qualitative signals include a finding from a newly published pivotal trial that is opposite to a corresponding conclusion in the original CER with respect to an efficacy outcome (e.g., effective vs. ineffective, or vice versa) or harm (e.g., risk of harm outweighs the previously observed benefits); a superior new treatment (e.g., a new treatment significantly more effective than one assessed in the CER); and a new population subgroup (the treatment assessed in the CER has been expanded to a new subgroup of participants). An example of a quantitative signal is the incorporation of a new trial (or trials) into a meta-analysis conducted for the original CER that turns a previously statistically nonsignificant pooled estimate into a statistically significant one, or vice versa.
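To make the quantitative check concrete, the sketch below pools the original CER's summary estimate (treated as one data point) with a new trial's estimate in a DerSimonian-Laird random-effects model and reports whether the 95 percent confidence interval excludes the null; all numbers are invented for illustration.

    import math

    def random_effects_pool(estimates, variances):
        """DerSimonian-Laird random-effects pooling of effect estimates (e.g., log odds ratios)."""
        w = [1.0 / v for v in variances]                              # fixed-effect weights
        fixed = sum(wi * yi for wi, yi in zip(w, estimates)) / sum(w)
        q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, estimates))
        c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
        tau2 = max(0.0, (q - (len(estimates) - 1)) / c)               # between-study variance
        w_star = [1.0 / (v + tau2) for v in variances]                # random-effects weights
        pooled = sum(wi * yi for wi, yi in zip(w_star, estimates)) / sum(w_star)
        se = math.sqrt(1.0 / sum(w_star))
        return pooled, pooled - 1.96 * se, pooled + 1.96 * se         # estimate, 95% CI

    # Invented example: the original pooled log odds ratio (-0.15, variance 0.010) is not
    # statistically significant on its own; adding one new trial (-0.40, variance 0.012)
    # yields a significant pooled estimate -- a quantitative signal of the kind described above.
    est, lower, upper = random_effects_pool([-0.15, -0.40], [0.010, 0.012])
    significant = upper < 0 or lower > 0   # 95% CI excludes the null (log OR = 0)
    print(f"pooled log OR = {est:.2f} (95% CI {lower:.2f} to {upper:.2f}); significant: {significant}")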

The specific steps involved in the Ottawa Method are shown in Figure 2.

Figure 2. Ottawa Method.

For each CER, we constructed a summary table that included the following for each Key Question: original conclusion(s), findings of the new literature search, summary of expert assessment, findings from the ECRI search of regulatory bodies, and our final assessment of the currency of the conclusion(s).

Determining Priority for Updating a CER

For each report, we provided an assessment as to whether each conclusion was up to date. We then needed to assign an overall judgment of the priority for updating. We used two criteria in making our final conclusion for a CER:

  • How much of the CER is possibly, probably, or certainly out of date?
  • How out of date is that portion of the CER? For example, would the potential changes to the conclusions involve refinement of the original estimates, or do they include the finding that some therapies are no longer favored or may no longer be in use? Does the portion of the CER that is probably or certainly out of date concern an issue of safety (a drug withdrawn from the market, a black box warning) or the availability of a new drug within class (the latter being less of a signal to update than the former)?

This final conclusion was a global judgment made by all the individuals working on each particular CER. We classified CERs as being low, medium, or high priority for updating, with a notation explaining the rationale for high priority updates. If a therapy was no longer favored, no longer in use, or in question because of a safety concern, we would have recommended that, pending a full update, the original CER be withdrawn; however, no CERs presented this issue during our surveillance.

Summary Dissemination of Reports to AHRQ

We developed a format for a short summary report that presents the findings from the surveillance process to AHRQ. This format includes a title page that lists the final classification (low, medium, or high) of the priority for updating the CER; the details of the literature search and its yield (with evidence tables); the findings from FDA, Health Canada, and MHRA; the results of any expert opinion that was provided; and a summary table that contains each conclusion from the original CER and our assessment of the degree to which it may be out of date. Examples of such reports (one each judged as being at low, medium, and high priority for updating) are included in Appendix A.

Peer Review and Public Commentary

Experts were invited to provide external peer review of this report; AHRQ and an associate editor also provided comments. The draft report was posted on the AHRQ Web site for 4 weeks to elicit public comment. We received comments from three reviewers and one public commenter. We have addressed all peer and public comments, revising the text as appropriate, and have documented all responses in a “disposition of comments report” that will be made available 3 months after the Agency posts the final report on the AHRQ Web site.
