Realities of replication: implementation of evidence-based interventions for HIV prevention in real-world settings

Implement Sci. 2014 Jan 6;9:5. doi: 10.1186/1748-5908-9-5.

Abstract

Background: To have public health impact, evidence-based interventions (EBIs) must be implemented appropriately and at meaningful scale. The Centers for Disease Control and Prevention's Replicating Effective Programs and Diffusion of Effective Behavioral Interventions programs disseminate select EBIs by providing program materials and training health providers on their appropriate use and implementation. Sociometrics' HIV/AIDS Prevention Program Archive (HAPPA) and Program Archive for Sexuality, Health, and Adolescents (PASHA) are the largest collections of EBIs targeting sexual risk behaviors in the private sector. This study examined the extent to which organizations that obtain EBIs from HAPPA and PASHA implement, adapt, and evaluate them, and the factors associated with program implementation.

Methods: Survey data were collected from 123 organizations that had acquired at least one EBI from HAPPA or PASHA between January 2009 and June 2011 and had been in possession of it for at least six months. Data regarding program characteristics and date of acquisition were obtained from Sociometrics' sales and marketing databases. Logistic regression was used to assess barriers to program implementation.
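
The abstract does not specify the variables or model specification used, so the following is only a minimal sketch of how implementation status might be regressed on reported barriers to obtain odds ratios of the kind reported in the Results. The predictor names and the synthetic data are hypothetical, not taken from the study.

```python
# Illustrative sketch only: variable names, coding, and data are hypothetical,
# since the abstract does not describe the study's actual model specification.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 123  # number of responding organizations in the study

# Hypothetical barrier indicators (1 = barrier reported by the organization)
df = pd.DataFrame({
    "lack_of_resources": rng.integers(0, 2, n),
    "staff_turnover": rng.integers(0, 2, n),
    "competing_priorities": rng.integers(0, 2, n),
})

# Hypothetical binary outcome: whether the organization implemented the EBI
logits = 0.8 - 1.7 * df["lack_of_resources"] - 0.3 * df["staff_turnover"]
df["implemented"] = rng.binomial(1, 1 / (1 + np.exp(-logits)))

# Logistic regression of implementation on reported barriers
X = sm.add_constant(df[["lack_of_resources", "staff_turnover", "competing_priorities"]])
model = sm.Logit(df["implemented"], X).fit(disp=False)

# Exponentiated coefficients are odds ratios, the metric reported in the abstract
print(np.exp(model.params))
print(model.summary())
```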

Results: Among organizations that obtained an EBI from Sociometrics intending to implement it, 53% had implemented the program at least once or were in the process of implementing it for the first time; another 22% were preparing for implementation. During the three-year period assessed, over 11,381 individuals participated in these interventions. Almost two-thirds (65%) of implementers made changes to the original program. Common adaptations included: editing content to be more current and locally relevant (81%); adding, deleting, or modifying incentives for participation (50%); changing the location in which the program takes place (44%); and/or changing the number, length, and/or frequency of program sessions (42%). In total, 80% of implementers monitored program delivery. Participant outcomes were tracked by 78% of implementers, 28% of whom used evaluation designs that included a control or comparison group. Lack of adequate resources was significantly associated with a decreased likelihood of program implementation (odds ratio = 0.180, p < 0.05).
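
As context for the reported effect size (a standard reading of an odds ratio, not a statement from the abstract), the estimate can be interpreted as

$$
\mathrm{OR} = \frac{\text{odds of implementation among organizations reporting inadequate resources}}{\text{odds of implementation among organizations not reporting this barrier}} = 0.180,
$$

i.e., organizations reporting a lack of adequate resources had roughly 82% lower odds (1 - 0.180 = 0.820) of implementing the program than organizations that did not report this barrier.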

Conclusions: Findings provide a greater understanding of implementation processes, barriers, and facilitators, which may be used to develop strategies to increase the appropriate use of EBIs.

Publication types

  • Research Support, N.I.H., Extramural

MeSH terms

  • Evidence-Based Medicine / organization & administration*
  • HIV Infections / prevention & control*
  • Humans
  • Information Dissemination
  • Program Development
  • Program Evaluation
  • Risk-Taking
  • Sexual Behavior