
Paynter R, Bañez LL, Berliner E, et al. EPC Methods: An Exploration of the Use of Text-Mining Software in Systematic Reviews [Internet]. Rockville (MD): Agency for Healthcare Research and Quality (US); 2016 Apr.

Appendix B. Interview Guide

Introduction

The overall mission of the Agency for Healthcare Research and Quality's (AHRQ) Effective Health Care (EHC) Program is to provide evidence-based information to health care stakeholders that is relevant to their needs, timely, objective, scientifically rigorous in construct, and developed and presented with transparency. In the production of systematic reviews, we aim to answer questions about the effectiveness of interventions and average population effects. We are aware that for certain conditions and behavioral interventions, these questions may miss important issues.

AHRQ engages stakeholders in all facets of its research enterprise, including the production of systematic reviews, to ensure that research findings reflect the needs of diverse users, are relevant to their unique challenges, and are applicable in real-world situations.

Purpose of the Stakeholder Interview

The goal of our project is to understand which text-mining tools you have used, how you have used them, and the challenges you have encountered.

We are very interested in learning from your experience.

There are no right or wrong answers, so please feel free to share your thoughts openly.

We welcome any materials that you would like to share with us either before or after the interview session. Please send any materials to Robin.Paynter@va.gov.

Ground Rules for the Stakeholder Interview

The interview will be tape recorded, transcribed, and analyzed for overarching themes.

Although the report may list the individuals who were interviewed, answers will not be attributed to specific individuals or organizations.

You may refrain from answering any questions and are welcome to end the interview at any time.

Interview Guide – Senior Investigator/Systematic Review Organization Key Informant Questions

  1. Why did your organization decide to start using text-mining software?
  2. Which text-mining software tool(s) has your organization used in past systematic review projects?
  3. In which step or steps of the process has your organization used it?
  4. What criteria were used to determine which software package to use?
  5. Does using text-mining software decrease or increase the time needed to complete the review process?
  6. How well did it fit within your existing organizational workflows?
  7. What were the expenses associated with its use in terms of staff training, software costs, etc.?
  8. Were there, or are there, any issues for staff in adapting to its use?
  9. How long did it take for your organization to implement it?
  10. Were there, or are there, technical facilitators or barriers to implementation?
  11. Were there, or are there, organizational facilitators or barriers to implementation?
  12. How do you evaluate the value that text-mining software adds to the review?
  13. How much confidence do you have in the results generated by the text-mining tools versus your previous process?
  14. Do you report anything differently because you used text-mining in your review?
  15. Would you recommend its use to other systematic review organizations?

Last question: Anything else you would like to add?

Interview Guide – Librarian Key Informant Questions

  1. Why did you begin using text-mining software to develop search strategies?
  2. How long have you been using text-mining software, or in roughly how many search strategies have you used it?
  3. Please describe how you use text-mining in your search process.
  4. When using text-mining software as compared to not using it, are there gains or losses in efficiency (i.e., amount of time needed to develop) and/or completeness of the strategy?
  5. Because search strategy topics vary widely, have you noted any types of questions for which text-mining software works particularly well or poorly?
  6. How do you evaluate the value of using text-mining?
  7. How much confidence do you have in the results generated by the text-mining tools versus your previous process?
  8. How do you report using text-mining in your review?

If using keyword/synonym text-mining tools OR concept identification text-mining tools:

  1. How did you evaluate this software to determine whether to use it?
  2. Were there any loading or technical issues in setting up or using the software?
  3. How long did it take to learn the software?
  4. How easy is it to import/input data into the tool?
  5. How do you use the output of the tool?
  6. Would you recommend its use?

If using filter text-mining tools:

  1. How did you evaluate the filter software to determine whether to use it?
  2. Were there any loading or technical issues in setting up or using the software?
  3. How long did it take to learn the software?
  4. How do you assess whether the test set of articles is valid? Reliable?
  5. Do you train on the citation/abstract or the full text or both?
  6. How long does it take to train your software on average?
  7. How easy is it to import/input data into the tool?
  8. How do you use the output of the tool?
  9. Would you recommend its use?

Last question: Anything else you would like to add?