
NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.

Saldanha IJ, Skelly AC, Ley KV, et al. Inclusion of Nonrandomized Studies of Interventions in Systematic Reviews of Intervention Effectiveness: An Update [Internet]. Rockville (MD): Agency for Healthcare Research and Quality (US); 2022 Sep.


7. Threats to Internal Validity of NRSIs

Potential threats to the internal validity of NRSIs are an important consideration when deciding whether NRSIs should be included in an SR. Risk of bias refers to the likelihood that the estimate of an intervention’s effect obtained from a study contains a systematic error (i.e., bias) that makes the estimate differ from the true effect.42 Concerns about bias in estimates of effect are distinct from concerns about applicability, imprecision, and quality of reporting.43 Applicability and imprecision are addressed in other SR processes when interpreting results, specifically in strength of evidence assessments. Assessments of risk of bias in NRSIs should therefore focus on study design, conduct, and analysis,43 with the caveat that poor study reporting can hinder risk of bias judgments.43

Sources of bias that are unique to NRSIs occur before or at the start of the intervention; sources of bias that occur after the intervention starts may be akin to those in RCTs.13 Depending on the types and extent of biases and confounding in an NRSI, the magnitude or even the direction of the observed effects may be distorted, leading to spurious conclusions. Thus, it may not be helpful, and may even be problematic, to include an NRSI if its results are likely to be highly biased.4 Results from biased studies can lead to misleading conclusions that decision makers may use inappropriately, especially if inadequate attention is paid to the potential biases.44, 45 Therefore, it is reasonable to exclude NRSIs that do not adequately account for various potential biases.16 Any such exclusions of NRSIs should be based on the most important design features for minimizing risk of bias. NRSIs should not be excluded based solely on study design labels (e.g., cohort study) because such labels are notoriously inconsistent (see Section 4).46, 47

The following subsections explore specific types of biases that are particularly relevant for comparative NRSIs. Assessing the quality of single-group NRSIs is beyond the scope of the current guidance.

7.1. Selection Bias

NRSIs may be subject to a high risk of selection bias if at study baseline some potentially eligible participants, or their followup time, were excluded from the treatment or comparator groups, and such exclusion may have led to a biased estimate of the treatment effect.14 For example, consider a study using electronic health records, in which researchers defined the treatment group as those receiving a certain treatment for a certain disease and the comparator group as those receiving no treatment for the same disease. Selection bias could occur if the researchers excluded (for any reason) a larger proportion of patients with a greater risk of death due to comorbidities from the treatment group than the control group.
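The mechanism in the example above can be illustrated with a minimal simulation. This sketch is hypothetical and not drawn from the report: it assumes an inert treatment and an illustrative 30 percent prevalence of severe comorbidity, with comorbid patients excluded at baseline far more often from the treated group (80%) than from the comparator group (10%). The differential exclusion alone makes the inert treatment appear protective.

```python
import random

random.seed(1)

# Hypothetical sketch (not from the report): the treatment has no true
# effect, but patients with severe comorbidity (high baseline death risk)
# are excluded from the treated group far more often than from the
# comparator group.
def simulate(n=100_000):
    treated_deaths, treated_n = 0, 0
    control_deaths, control_n = 0, 0
    for _ in range(n):
        comorbid = random.random() < 0.3        # 30% have severe comorbidity
        death_risk = 0.4 if comorbid else 0.1   # risk depends on comorbidity only
        treated = random.random() < 0.5
        # Differential exclusion at baseline: drop 80% of comorbid patients
        # from the treated group but only 10% from the comparator group.
        drop = 0.8 if treated else 0.1
        if comorbid and random.random() < drop:
            continue
        died = random.random() < death_risk
        if treated:
            treated_n += 1
            treated_deaths += died
        else:
            control_n += 1
            control_deaths += died
    return treated_deaths / treated_n, control_deaths / control_n

rt, rc = simulate()
# The treated group looks protective purely because of who was excluded
# (expected risks are roughly 0.12 vs. 0.18 under these assumptions).
print(f"death risk treated:    {rt:.3f}")
print(f"death risk comparator: {rc:.3f}")
```

Under these assumed parameters, the crude comparison suggests a benefit that is entirely an artifact of whose followup was excluded at baseline.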

7.2. Confounding

NRSIs may be subject to a high risk of confounding if the treatment and comparator groups were imbalanced in terms of factors that were common causes of both the choice of treatment and the outcome. A confounder is a third variable that is associated with the treatment and a cause of the outcome but is not in the causal pathway between the treatment and the outcome (i.e., is not a mediator).48 For example, in NRSIs of mental health treatments in pregnancy, women with greater symptom severity may be more likely to be treated with psychotropic medications and may also be more likely to experience adverse pregnancy outcomes (e.g., low infant birthweight or premature delivery) from the underlying condition (e.g., depression).49 In this example, the effect of the intervention on pregnancy outcomes is confounded by the underlying severity of the condition.
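The severity example above can be sketched numerically. In this hypothetical simulation (not from the report), symptom severity raises both the probability of being treated and the probability of an adverse outcome, while the treatment itself has no effect; the crude comparison nonetheless shows a large risk difference.

```python
import random

random.seed(0)

# Hypothetical sketch (not from the report): severity is a confounder,
# i.e., a common cause of both treatment choice and outcome. The
# treatment itself is inert.
n = 100_000
treated_outcomes, untreated_outcomes = [], []
for _ in range(n):
    severity = random.random()                   # 0 = mild, 1 = severe
    treated = random.random() < severity         # severe cases treated more often
    outcome = random.random() < severity * 0.5   # outcome driven by severity only
    (treated_outcomes if treated else untreated_outcomes).append(outcome)

risk_treated = sum(treated_outcomes) / len(treated_outcomes)
risk_untreated = sum(untreated_outcomes) / len(untreated_outcomes)
# The crude risk in the treated group (~0.33) is about double that in the
# untreated group (~0.17), even though the treatment does nothing.
print(f"risk treated:   {risk_treated:.3f}")
print(f"risk untreated: {risk_untreated:.3f}")
```

Stratifying or adjusting on severity would remove the spurious difference; the point of the sketch is that ignoring a common cause of treatment and outcome produces a biased crude estimate.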

7.3. Misclassification of Interventions

A source of bias that is unique to NRSIs, particularly retrospective NRSIs, relates to the misclassification of interventions. Intervention status may be misclassified (e.g., because of measurement error) either nondifferentially or differentially with respect to outcome status. Differential misclassification is a particular problem because it is related to the outcome.48 If data on intervention status are collected when the outcome, or the risk of the outcome, is known, differential misclassification of intervention status may occur. Depending on their knowledge or expectations of the outcome, participants or study personnel may overstate or understate participant exposure to the intervention. For example, participants in retrospective NRSIs of folic acid supplementation may be aware of the benefits of folic acid during early pregnancy.50 Their experience of the pregnancy outcome may influence their recall of the extent of exposure to folic acid supplementation in early pregnancy (an example of “recall bias”).
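The recall-bias mechanism can be made concrete with a hypothetical simulation (not from the report, and deliberately generic rather than about folic acid): true exposure has no effect on the outcome, but participants who experienced the adverse outcome report their exposure less completely (an assumed 60% recall versus 95% for everyone else). The differential misclassification alone produces an apparent protective association.

```python
import random

random.seed(2)

# Hypothetical sketch (not from the report): exposure is truly unrelated
# to the outcome, but recall of exposure is differential with respect to
# outcome status.
n = 100_000
exposed_outcomes, unexposed_outcomes = [], []
for _ in range(n):
    exposed = random.random() < 0.5    # true exposure, independent of outcome
    adverse = random.random() < 0.2    # outcome unrelated to exposure
    # Differential recall: exposed participants with an adverse outcome
    # report exposure only 60% of the time; all others report it 95% of
    # the time. Classification uses the *reported* exposure.
    recall = 0.6 if adverse else 0.95
    reported_exposed = exposed and random.random() < recall
    (exposed_outcomes if reported_exposed else unexposed_outcomes).append(adverse)

risk_reported_exposed = sum(exposed_outcomes) / len(exposed_outcomes)
risk_reported_unexposed = sum(unexposed_outcomes) / len(unexposed_outcomes)
# Reported exposure appears protective (~0.14 vs. ~0.25) despite a null
# true effect, purely because of outcome-dependent recall.
print(f"adverse-outcome risk, reported exposed:   {risk_reported_exposed:.3f}")
print(f"adverse-outcome risk, reported unexposed: {risk_reported_unexposed:.3f}")
```

Note that if recall were equally incomplete in both outcome groups (nondifferential misclassification), the two risks would converge rather than diverge; it is the outcome-dependence of the error that creates the spurious association.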

7.4. Other Sources of Bias

Other sources of bias in NRSIs are shared with RCTs; examples include bias due to missing data, bias in measurement of outcomes, and bias in selection of reported results.13 However, some of these biases, such as bias due to missing data and bias in selection of reported results, may be challenging to evaluate for NRSIs because study protocols may not be available and reporting may be suboptimal.
