This book is distributed under the terms of the Creative Commons Attribution-NonCommercial-NoDerivs License, which permits noncommercial use and distribution provided the original author(s) and source are credited. (See https://creativecommons.org/licenses/by-nc-nd/4.0/.)
NCBI Bookshelf. A service of the National Library of Medicine, National Institutes of Health.
Structured Abstract
Background:
Although children's health insurance coverage has expanded, some eligible children are still uninsured and others experience frequent coverage gaps. Community health centers (CHCs) care for low-income patients, many of whom are eligible for public coverage; thus, CHCs are an ideal setting for testing interventions to increase children's health insurance coverage rates.
Objectives:
We designed an intervention that used electronic health record (EHR)–based tools to help CHCs provide children's insurance enrollment assistance. We hypothesized that the group of children for whom the tool was used would have increased odds of gaining and maintaining coverage, increased likelihood of having return visit(s) after an uninsured visit, lower odds of being uninsured at subsequent visits, and increased rates of receiving recommended pediatric care.
Methods:
We implemented EHR-based tools in 4 intervention CHCs and selected 4 matched control CHCs that did not implement the tools. Using mixed methods, we assessed tool adoption and impact of tool use on insurance coverage, care utilization, and receipt of recommended care over 18 months after tool implementation, comparing 3 pediatric groups: (1) patients in the intervention clinics for whom the tools were used (n = 2240), (2) patients in the intervention clinics for whom the tools were not used (n = 12 784), and (3) patients from matched control clinics (n = 12 227). We quantitatively measured whether tool use affected insurance coverage, care utilization, and pediatric care quality.
Results:
Overall use of the insurance assistance tool was low. The most commonly used tool, the Tracking and Documentation Form, was applied to 15% of targeted patients; it was also used, unexpectedly, among children with no clinic visit and adult patients (often when their child was being assisted). A key factor underlying these results was the effect of concurrent policy initiatives. Patients for whom the tool was used had higher rates of continuous insurance coverage during follow-up (62% vs 59%), and were 76% more likely to gain insurance and 29% less likely to lose insurance coverage compared with those for whom the tools were not used. Among patients with an uninsured visit, those for whom the tools were used were 83% more likely to have a return visit and 54% less likely to be uninsured at all return visits. Intervention patients had significantly higher rates of several recommended care quality indicators, including well-child visits, human papillomavirus vaccination, and up-to-date childhood and adolescent immunization status.
Limitations:
The intervention was not randomly allocated. Despite statistical regression adjustment, unmeasured differences among participating clinics may have affected results. The follow-up period may have been too short to adequately assess tool uptake and impact. Concurrent policy initiatives also may have affected tool use.
Conclusions:
Tool use was low but had significant impact. This pragmatic trial, the first to evaluate EHR-based health insurance enrollment support tools, suggests such tools can increase insurance enrollment, prevent coverage loss among CHC patients, and promote delivery of recommended pediatric care services. Future efforts should engage diverse stakeholders in collaboratively designing tools that can be adapted to changing initiatives and clinic environments and can support enrolling family units.
Background
Patients benefit when they obtain and maintain continuous health insurance coverage, through increased access to health care and reduced unmet need.1-4 Children with continuous insurance have an increased likelihood of receiving recommended preventive care and a decreased likelihood of preventable emergency department and hospital visits.5-10 In the United States, children's health insurance coverage rates have steadily increased since expansions in the Children's Health Insurance Program (CHIP),11 but important gaps remain. Many low-income children who are eligible for Medicaid do not obtain coverage, or have interrupted coverage;12-15 1 study found 27.7% of children insured by Medicaid or CHIP lost coverage within 12 months of obtaining it.15 As 1 example of how this impacts children, 2 years after passage of the CHIP Reauthorization Act, 21% of pediatric patients in a large community health center (CHC) network were uninsured when they presented for a visit, and 30% of these children remained uninsured at subsequent visits over a 2-year period.16
Research shows that multilevel outreach efforts can increase health insurance enrollment.17-20 Care coordinators, case managers, and enrollment specialists can help improve coverage continuity,20,21 and the US Health Resources and Services Administration (HRSA) provided funding for CHCs to increase their ability to provide insurance enrollment assistance using such resources.22 However, most prior efforts to enroll and retain eligible children in public coverage were implemented outside health care settings or did not utilize electronic health records (EHRs) or other health information technology tools.23,24 Thus, although new technologies have enhanced health care providers' ability to provide insurance enrollment support to patients,25-28 no past efforts took advantage of this potential. Little is known about how CHCs can systematically help keep children publicly insured using health information technology.
Given these changes, and the fact that many low-income children receive care at CHCs,29 we conducted the Innovative Methods for Parents And Clinics to Create Tools for Kids' Care (IMPACCT Kids' Care) Study.30 In this pragmatic trial, the study team created EHR-based tools to support clinics in providing children's health insurance enrollment assistance. The team designed the tools to help CHC nonclinical staff track patients' coverage status and eligibility, and track insurance support efforts provided by the clinics. We implemented the tools in 4 intervention CHCs and measured their impact on several important outcomes. Study design and tool development processes were previously described.30-32
An adaptation of Andersen and Aday's behavioral model of health care utilization guided this study.33-35 That model, an integrated conceptual framework that uses a systems perspective to address the complexity of access to medical care, is relevant to understanding and intervening in factors that contribute to children's access to and use of insurance coverage (see Figure 1). According to the model, use of health care services is influenced by 3 elements: predisposing factors, enabling/hindering factors, and need. Predisposing factors are such characteristics as age, gender, race, and health beliefs. Enabling/hindering factors are the presence or absence of resources available for health care services, such as family income, health insurance coverage, and community attributes. Need refers to both perceived and actual need for health care services. We used this model to inform how we understand the interplay between predisposing factors, enabling/hindering factors, and need for care, and their impact on children's insurance status, insurance stability, and receipt of recommended health care services.
In brief, this pragmatic trial addressed the following 3 main questions:
- Question 1 (Q1): What was the rate of adoption of health insurance assistance tools in intervention CHCs and what explained the observed adoption patterns?
- Question 2 (Q2): Compared with pediatric patients for whom the tools were not used, did subsequent insurance coverage and health care use change for patients with tool use?
- Question 3 (Q3): Compared with pediatric patients for whom the tools were not used, did patients with tool use have higher rates of recommended pediatric care?
Patient/Stakeholder Engagement
Based on the established framework for identifying key groups of stakeholders for clinical effectiveness research,36 we identified a diverse group of relevant stakeholders that included policymakers, payers, providers, clinic leadership, administrators, and staff, as well as patient advocates and patient families. We engaged the stakeholders in all phases of the project: developing the proposal, adapting study methods, understanding the context of obtaining and maintaining public health insurance for children, and designing and refining the tool. The stakeholder engagement process for this project is described in detail elsewhere32 and summarized here.
The idea of developing EHR tools to help clinics identify patients whose insurance had expired or was nearing expiration originated from a CHC staff member. We focused this stakeholder idea on children's health insurance coverage and developed the study concept, design, and approach in collaboration with CHC leaders, clinicians, patient representatives, and policymakers from our Patient Engagement Panel and Practice-Based Research Network. Involvement in study development led to formally appointing a patient coinvestigator who was actively involved throughout the study and participated in dissemination activities, including conference presentations and manuscript publications.
To develop our intervention, we employed a user-centered design approach. “User-centered design” broadly describes processes employed to ensure that end-users influence computer software design.37 Sequential strategies involved in user-centered design include initial interviews to collect background data about needs and expectations; subsequent interviews to collect data about sequence of work; focus groups to discuss issues and requirements; on-site observation to collect information about the environment; walk-throughs and simulations to evaluate prototypes; usability testing; and follow-up interviews about user satisfaction.38 Using these strategies, we actively engaged stakeholders and solicited their feedback and insights about the development and use of study tools. We applied user-centered design to ensure that study tools were appropriate, relevant, usable, and patient and family centered.
During the preimplementation and tool development phases of the study,32,39 we conducted qualitative interviews with clinic staff and families of pediatric patients (n = 44) to understand their perspectives on obtaining, keeping, and managing health insurance for their children. Our interviews revealed that families think health insurance is very important, but they find the process of applying for insurance to be onerous and confusing. They trust their CHCs and clinic staff to help them manage the application process and to track information about their applications. Clinic staff were enthusiastic about supporting families with the insurance enrollment and retention process. We also held a focused advisory group meeting with policymakers from the State of Oregon, patient representatives, community leaders, and members of the research team (n = 13) to exchange ideas and information about the project, state health insurance exchange, and Affordable Care Act. Based on what we learned from these activities, we made a stakeholder-driven decision to develop our tools for clinic staff as the primary users.
Our tool development team then created a set of tool prototypes that were shared with stakeholders from the 3 participating clinic systems (CHC leaders, staff, and families; n = 14) at a half-day retreat. The research team summarized findings from the staff and family interviews, presented possible tool options, and solicited feedback on the prototypes. Next, during 2 on-site clinic visits, the research team engaged key clinic staff (n = 6) in Think-Aloud exercises to further refine the prototype tools.40 In this user-centered design process, key users “thought out loud” while performing specified tasks and reviewing the tool prototypes. We completed an iterative series of 5 cycles of review and refinement with CHC staff and patient stakeholder involvement. We also engaged state policy stakeholders at key points during this process. A health literacy consultant reviewed final prototypes, and we incorporated suggestions for improving readability and usability into the final tool design.
When the prototypes were finalized, we built and activated the tools. We conducted beta-testing across the 4 intervention sites with insurance eligibility specialists, schedulers, and front desk staff (n = 16). After final modifications, we made the tools available to all staff at intervention sites. Six weeks after the tools were implemented, we conducted usability trials with end-user clinic staff (n = 24). Based on these trials, we redesigned some of the tools for ease of use and modified some functions to enhance clarity. A year after tool implementation, we conducted follow-up visits at 3 of the intervention clinics (n = 12) to observe insurance assistance processes and use of the tools and to assess perceptions of the tools.
As the project neared completion, we held a final stakeholder event with patients, clinicians, staff, and administrators from the study clinics as well as policymakers, patient representatives, and staff from other interested clinics (n = 28). We provided a project status update, discussed preliminary findings, and sought stakeholder input on ways to increase use of the insurance support tools as well as future directions for research.
Methods
Overview
The study period was the first date of tool implementation (June 1, 2014) through 18 months postimplementation (November 30, 2015). We addressed Q1—rate of and factors associated with tool adoption—using a convergent, mixed methods design, wherein we collected qualitative and quantitative data before and after tool implementation, and converged them to explain tool use patterns.41 We addressed Q2 and Q3—associations between tool use and changes in insurance coverage, use, and preventive care receipt—using a retrospective cohort design that involved quantitative data only.
Study Setting
We conducted this study at 8 (4 intervention and 4 control) CHCs serving low-income, ethnically diverse populations, including many publicly insured and uninsured pediatric patients, who would potentially benefit from the tools. CHCs were members of OCHIN, Inc, a 501(c)(3) that provides health information technology to >400 member CHCs in 19 states.42,43 OCHIN member clinics share a single instance of the Epic EHR, which enabled us to build and implement the tools described here on a single EHR platform across all intervention sites and to obtain EHR data from all study sites.
Sample
Four OCHIN member clinics volunteered to be intervention sites. We selected 4 comparison sites from a pool of 38 qualifying nonintervention clinics, using a propensity score technique44 that selected comparison sites that were the closest possible matches to the intervention sites based on patient population demographics and date of EHR implementation. Because of the small clinic sample size, we were ultimately able to match on only 3 factors: ratio of children to adults, percentage Hispanic ethnicity, and length of EHR experience.
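The matching step can be illustrated with a minimal sketch. Note that this is a simplified stand-in for the propensity score technique cited above: it greedily matches each intervention clinic to its nearest available comparison clinic on the 3 standardized covariates (ratio of children to adults, percentage Hispanic ethnicity, years of EHR experience) rather than fitting a propensity model, and all clinic names and covariate values are hypothetical.

```python
# Simplified sketch: match each intervention clinic to its nearest
# available comparison clinic on the 3 covariates the study used.
# Clinic names and values below are invented for illustration only.

def standardize(rows):
    """Z-score each covariate column so no single scale dominates."""
    cols = list(zip(*rows))
    means = [sum(c) / len(c) for c in cols]
    sds = [(sum((x - m) ** 2 for x in c) / len(c)) ** 0.5 or 1.0
           for c, m in zip(cols, means)]
    return [[(x - m) / s for x, m, s in zip(r, means, sds)] for r in rows]

def match_controls(intervention, pool):
    """Greedy 1:1 nearest-neighbor matching without replacement."""
    all_rows = standardize([v for _, v in intervention] + [v for _, v in pool])
    iv, pv = all_rows[:len(intervention)], all_rows[len(intervention):]
    available = list(range(len(pool)))
    matches = {}
    for i, (name, _) in enumerate(intervention):
        best = min(available,
                   key=lambda j: sum((a - b) ** 2 for a, b in zip(iv[i], pv[j])))
        available.remove(best)
        matches[name] = pool[best][0]
    return matches

# (child:adult ratio, proportion Hispanic, years of EHR experience)
intervention = [("CHC-A", [0.45, 0.60, 8.0]), ("CHC-B", [0.30, 0.25, 5.0])]
pool = [("CHC-X", [0.44, 0.58, 7.5]), ("CHC-Y", [0.10, 0.05, 2.0]),
        ("CHC-Z", [0.31, 0.24, 5.5])]
print(match_controls(intervention, pool))  # → {'CHC-A': 'CHC-X', 'CHC-B': 'CHC-Z'}
```

A full propensity score approach would instead model the probability of being an intervention clinic from the covariates and match on that single score, which matters more as the number of covariates grows.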
Intervention
We developed a suite of EHR tools via a user-centered design process, described in more detail above as well as in our publications.31,32,45 Briefly, through direct workflow observation at CHCs and interviews with CHC staff, we identified the following deficiencies in the process of providing health insurance assistance to patients: (1) lack of information on patients' insurance end dates, (2) lack of automated systems for identifying patients in need of assistance, and (3) lack of a standardized system for tracking assistance provided to patients. We then employed low-fidelity wireframe prototyping, beta-testing, and usability testing to design health insurance tools to address these gaps.
We created the tools for use specifically by those CHC staff who assisted patients with health insurance enrollment (eg, outreach and enrollment assistants/eligibility specialists; front desk [registration/scheduling] staff).45 Staff in these roles at both intervention and control sites had instituted insurance review and support workflows to ensure that as many of their patients as possible were insured. They regularly checked insurance status and identified patients who might need help with health insurance applications. In addition to insurance verification at the time of scheduling, patients could come to the attention of enrollment assistants/eligibility specialists in a number of other ways: being referred from the front desk at check-in, being contacted through active outreach by clinic staff, by word of mouth from the community, or by seeking assistance on their own. For patients who required assistance, enrollment staff could complete, submit, and track both new and renewal applications on their behalf.45 Enrollment staff maintained expertise in insurance eligibility and enrollment rules and procedures, the range of qualified health plan options and insurance affordability programs, and the needs of underserved and vulnerable populations. The majority of enrollment staff at all sites were bilingual in English and Spanish.
The suite of tools was embedded into the OCHIN Epic EHR Registration module and consisted of the following:
- Tracking and Documentation Form for Insurance Applications (Tracking Form; Figure 2) provided electronic functionality for documenting assistance provided to patients who applied for public health insurance. Specifically, we designed the Tracking Form to record relevant insurance application data, including fields to capture the status of the application (submitted, pending, approved, and denied), the date the application was submitted, the insurance type, the enrollment assistant who opened the form, and a free text field for entering notes about missing information or action steps for the next appointment. The form also included a field for entering a follow-up date to support the workflows for checking the status of an application.
- Reporting/Roster Tools (Roster Tool; Figures 3 through 6) enabled clinic staff to create a daily report listing patients who have a Tracking Form and (1) for whom insurance assistance has already been initiated and who are in need of follow-up (IMPACCT Follow-up Report), or (2) who have an upcoming clinical appointment and might need insurance enrollment support (IMPACCT Upcoming Appointments Report). Access to the Tracking Form for each patient included on the report was readily available from these reports, facilitating updates to the form. The IMPACCT Roster Tools can be accessed by searching the library of reports available within the electronic health record. An IMPACCT Follow-up Report can be selected from the library of available reports and is ready to run; running it produces a roster of patients for whom insurance assistance has already been initiated and follow-up is needed, used primarily by insurance assistants/eligibility specialists. An IMPACCT Upcoming Appointments Report can likewise be selected from the library of available reports and run.
- Insurance Coverage Information (pop-up warnings, coverage verification option, appointment information insurance data, Department Appointments Report): To meet clinics' need to identify patients in need of health insurance application assistance, we made health insurance information visible in a few different locations in the EHR Registration module:
- Pop-up Alerts (Figure 7): Notifications that popped up when clinic staff closed the registration page for, scheduled an appointment for, or checked in a patient whose health insurance was soon to expire or who appeared eligible for, but not enrolled in, public health insurance
- Coverage Verification Option (Figure 8): A new dropdown menu option under Coverage Verification that allows staff to indicate that public health insurance coverage may soon be expiring for a patient
- Appointment Information Insurance Data (Figure 9): New fields in the Appointment Information section that display health insurance data (Medicaid Redetermination Date, Medicaid Application Status, Insurance Assistant Appointment)
- IMPACCT DAR (Figures 10 and 11): A special Department Appointments Report (DAR) created to highlight insurance expiration data (Medicaid Redetermination Date, Medicaid Application Status, Insurance Specialist Appointment Date). Users could add any of these data elements to an existing, customized DAR.
The IMPACCT DAR, designed for Front Office staff use, creates a roster of patients for scheduling/check-in, along with relevant health insurance information.
Front office staff can add new IMPACCT data elements (eg, insurance expiration date, next appointment with insurance assistant) to appointment information by selecting and running the IMPACCT DAR from the Available Settings menu under Scheduling/Department Appointments.
We designed all study tools to be accessed and used by CHC staff; individual patients for whom the tools were used were unaware of their use and thus were passive participants in the study. The tools were available for use with any patient (ie, not limited to children or Medicaid enrollees); staff initiated tool use based on clinic-specific workflows.
Implementation
We built study tools on OCHIN's single EHR platform and distributed them centrally from OCHIN into the study sites' EHR. Figure 12 illustrates the timeline of the iterative process undertaken for tool development and implementation. We initially rolled the tools out on June 1, 2014. Over the course of the next year, CHC staff were periodically engaged in activities related to tool implementation and evaluation, including on-site training, usability testing, and postimplementation process evaluation. We made updates to the tools based on feedback from these activities. Initially, in response to usability testing, we redesigned some of the tools for ease of use and modified language on several functions to enhance clarity, such as expanding drop-down list options and adding data fields. Next, when we discovered that some of the insurance coverage data provided by the State of Oregon were of uncertain accuracy, we obtained additional data elements from the state to more precisely inform the algorithm that calculated the insurance end dates embedded in the IMPACCT tools. We worked with analysts from the State of Oregon throughout the study to iteratively improve our process of obtaining regular data feeds. Following implementation of the improved insurance end date algorithm, we learned that another barrier to wider adoption of the tools was the lack of Tracking Form reporting capability. We met with CHC stakeholders to elaborate reporting specifications and developed new reporting functionality accordingly. We implemented the final iteration of the tools on April 2, 2015.
Training
Working with the training team at OCHIN, which specializes in CHC training, we developed training materials and conducted on-site trainings with relevant CHC staff. Insurance assistants/eligibility specialists and front office workers, as well as their supervisors, managers, and CHC leadership, participated in trainings for the initial tool implementation and the 2 subsequent updates. The trainings provided a brief overview of the study and an in-depth, hands-on tutorial on using the tools. We distributed both electronic and paper copies of a comprehensive tool guide. Trainings were interactive, with the study team demonstrating tool use on a big screen in front of the group and participants following along on their own laptops. The last portion of the trainings was reserved for CHC leadership-led discussions on the workflows associated with the use of new tools. At the end of each training, we distributed contact information for the members of the research team in case participants had follow-up questions.
Analyses—Q1 Tool Uptake
Quantitative
Measurement was limited to the Tracking Form, the only tool for which usage data could be extracted from the EHR and readily quantified. To evaluate how frequently it was used at the study CHCs, we used a retrospective cohort design to measure its use in the 18 months after the tools were implemented at intervention clinics. We evaluated use of the other tools (reports, pop-up alerts, and coverage verification options) using qualitative data.
We extracted EHR data to assess Tracking Form use (when the tool was opened and edited within a patient chart) and to collect patient demographics and visit data. We assessed patients' insurance coverage using OCHIN EHR data (coverage status at a visit) and Oregon Medicaid enrollment data, which specified coverage start and end dates. Patients covered by Oregon Medicaid retain the same Medicaid identification number over time; our 2 quantitative data sources were linked using this identification number via established methods.46-48 Medicaid data also include a case number that identifies households, which enabled us to measure how long each patient's household was “established” at a clinic. Because Oregon requires all enrollees to recertify their Medicaid enrollment annually, we designed our study with an 18-month intervention period to ensure that all Medicaid patients would have the opportunity to use enrollment assistance at least once during the study.
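As a toy illustration of this linkage step, one can join EHR visit records to Medicaid enrollment spells on the shared Medicaid identification number and flag whether each visit fell within a coverage spell. All identifiers and dates below are invented; the actual linkage used the established methods cited above.

```python
# Hypothetical sketch of linking EHR visits to Medicaid enrollment spells
# (start/end dates) via the shared Medicaid identification number.
from datetime import date

ehr_visits = [
    {"medicaid_id": "M001", "visit": date(2014, 7, 10)},
    {"medicaid_id": "M001", "visit": date(2015, 2, 3)},
    {"medicaid_id": "M002", "visit": date(2014, 9, 1)},
]
# Enrollment spells keyed by Medicaid ID: list of (coverage_start, coverage_end)
medicaid_spells = {
    "M001": [(date(2014, 1, 1), date(2014, 12, 31))],
    "M002": [(date(2014, 8, 15), date(2015, 8, 14))],
}

def covered_at_visit(visit_row):
    """True if the visit date falls inside any enrollment spell for that ID."""
    spells = medicaid_spells.get(visit_row["medicaid_id"], [])
    return any(start <= visit_row["visit"] <= end for start, end in spells)

linked = [{**v, "insured": covered_at_visit(v)} for v in ehr_visits]
for row in linked:
    print(row["medicaid_id"], row["visit"], row["insured"])
```

In this toy data, the second M001 visit falls after that patient's spell ended, illustrating how the linked data expose coverage gaps between visits.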
We retrospectively identified 3 parallel pediatric (aged 0-19 at time of visit) cohorts for analyses. Study cohorts included all pediatric patients with at least 1 clinical encounter in the study period, regardless of insurance status. “Intervention patients” (n = 2240) were those with ≥1 clinical visit at an intervention clinic, and for whom the Tracking Form was used. Because tool use was determined by CHC staff, patients did not self-select into the intervention group; all patients at intervention clinics had the opportunity for the tools to be used on them. “Within-clinic comparison patients” (n = 12 784) were those for whom the tool was not used, despite having ≥1 clinical visit at an intervention clinic during the study period. We identified the within-clinic comparison group to control for differences in health outcomes owing to changes in other clinic procedures, staffing, and workflows that may have occurred during the study period. “Control clinic patients” (n = 12 227) were patients with ≥1 clinical visit at matched control sites that did not have access to the tools. Because of the insurance review and support workflows in place at both intervention and control sites, control site patients would have had equal opportunity for the tools to be used on them had the tools been available in their clinics. Clinic staff at the intervention sites also used the tools for 2 unanticipated groups outside the study cohorts: children with no clinical visit (n = 969) and adult patients (n = 3207; see Figure 13).
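The 3-way cohort-assignment rule above can be expressed as a small sketch. Clinic identifiers are placeholders, and the sketch simplifies by treating every non-intervention clinic in the data as a matched control site.

```python
# Hypothetical sketch of the 3-way cohort assignment described above.
INTERVENTION_CLINICS = {"A", "B", "C", "D"}  # placeholder clinic IDs

def assign_cohort(clinic_id, tracking_form_used, age_at_visit):
    """Classify a patient with >=1 clinical visit into a study cohort."""
    if not 0 <= age_at_visit <= 19:
        return None  # adults fell outside the pediatric study cohorts
    if clinic_id in INTERVENTION_CLINICS:
        return "intervention" if tracking_form_used else "within-clinic comparison"
    # Simplification: treats all other clinics as matched control sites
    return "control clinic"

print(assign_cohort("A", True, 10))   # → intervention
print(assign_cohort("A", False, 10))  # → within-clinic comparison
print(assign_cohort("Z", False, 10))  # → control clinic
print(assign_cohort("A", True, 35))   # → None (adult, outside study cohorts)
```

Note that the unanticipated use groups (children with no clinic visit, adults) fall outside this rule, which is why they were tallied separately from the 3 study cohorts.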
We calculated the number of patients for whom the Tracking Form was used (numerator; intervention patients) over the number eligible for tool use (denominator; all eligible patients in the intervention CHCs aged 0-19 with ≥1 clinical visit). We selected the total patient population as our denominator, given that any patient had the opportunity to be a “tool use patient.” Using chi-square tests, we compared demographic, care use, and coverage characteristics of intervention patients vs the 2 comparison groups.
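These calculations are straightforward to sketch. The adoption rate below uses the cohort sizes reported above; the 2x2 chi-square example uses invented counts and omits the continuity correction and p-value lookup that standard statistical software would supply.

```python
# Adoption rate: Tracking Form patients over all eligible pediatric patients.
def adoption_rate(tool_used, eligible):
    return tool_used / eligible

# 2240 intervention patients among 2240 + 12784 eligible children
rate = adoption_rate(2240, 2240 + 12784)
print(f"{rate:.1%}")  # → 14.9%, consistent with the ~15% reported

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 table [[a, b], [c, d]],
    without Yates continuity correction."""
    (a, b), (c, d) = table
    n = a + b + c + d
    row_totals = [a + b, c + d]
    col_totals = [a + c, b + d]
    stat = 0.0
    for i, obs_row in enumerate(table):
        for j, obs in enumerate(obs_row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (obs - expected) ** 2 / expected
    return stat

# Invented counts, eg, a characteristic present vs absent in 2 groups
print(chi_square_2x2([[30, 70], [70, 30]]))  # → 32.0
```

In practice a library routine such as `scipy.stats.chi2_contingency` would also return degrees of freedom and a p-value; the hand computation above only shows where the statistic comes from.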
Qualitative
The rationale for employing qualitative methods in this study was 2-fold: (1) Qualitative data were collected to inform tool design based on the end-users' needs and perspectives, to ensure the tools' relevance and usefulness to CHCs; and (2) qualitative data provided an in-depth explanation and enhanced understanding of the quantitative findings.
During the preimplementation period (July-September 2013), we conducted observations and interviews at 2 intervention and 2 comparison CHCs, purposively selected from different organizations to assess baseline insurance eligibility support processes and to inform tool development.31 A research team spent 2 to 3 days at each clinic, observing insurance application assistance processes, conducting semi-structured interviews with 5 to 9 individuals per site, and creating detailed fieldnotes. During visits, we asked clinic staff members (clinic leaders, clinicians, and workers; n = 26) about their site's approach to enrollment assistance, how their role fit into the process, and about their experiences assisting patients and families with the insurance applications. Staff members who participated in these interviews were compensated for their time with $5 gift cards. In addition, we interviewed adult family members of at least 1 pediatric patient (n = 18) about their experiences getting assistance with public health insurance enrollment. We recruited family members for these interviews via referrals from clinic staff or by approaching families in the clinic waiting rooms before or after their child's visit. Family members were compensated with $35 gift cards.
Approximately 1 year after tool implementation (May 2015), we returned to the same 2 intervention clinics visited during the preimplementation period and visited an additional intervention clinic that was selected because the enrollment/eligibility specialist at that location was a key participant in the tool development process. Field researchers spent 1 day at each of the clinics and observed insurance assistance processes as well as tool use (eg, which tools were used, how they were used and incorporated into workflow, who used them, which features of each tool were used). Field researchers conducted semi-structured interviews with intended tool users only (eg, front desk staff, schedulers, eligibility specialists/insurance assistants, clinic administrators; n = 11), as these interviews assessed users' experience with and perception of the tools. Interview questions focused on understanding exactly how the tools were used and identifying facilitators and barriers to tool use. Participants were compensated for their time with $5 gift cards. Because tools were targeted exclusively for clinic staff use, we did not interview family members in the postintervention assessment. Although patients enrolling for insurance through the clinic were no doubt aware that the staff assisting them were gathering data from and entering data into a computer, they were not specifically told about the tools nor asked about their perception of them.
Audio-recorded interviews were professionally transcribed. Transcripts and fieldnotes were deidentified and entered into Atlas.ti (Version 7.0, Atlas.ti Scientific Software Development GmbH, Berlin, Germany) for analysis. We used a grounded theory approach49 to analyze the data. Grounded theory is a systematic methodology for both collecting and analyzing data. Instead of developing theory a priori—before collecting and analyzing data—theory is grounded in and emerges directly from the data. Iteratively collecting and analyzing data are critical features of this method. In the first phase, we met as a group to review all data for 1 clinic. Our group discussion informed how we named and tagged the text, and how we defined emerging themes to create a code book. We repeated this process for each clinic, continuing to meet as a group to debate, discuss, and make sense of the data. Then we compared emerging themes across the clinics, at which point we identified several factors that influenced tool use, such as clinics' ability to access consistent coverage end dates and concurrent initiative documentation requirements.
Mixed Methods
We used a convergent, mixed methods design,41,50,51 a type of research design in which qualitative and quantitative data are collected in parallel, analyzed separately, and then merged. We selected this method because it is especially useful for corroborating findings and expanding and explaining quantitative data through the collection of open-ended qualitative data. We collected quantitative and qualitative data concurrently before and after tool implementation. We initially analyzed quantitative and qualitative data separately. Then, during a mixed methods interpretation phase, we converged the data. For example, the quantitative data showed low tool use rates; these data also showed that the tools were used beyond the intended study population. The qualitative data showed contributing factors to tool usage and helped explain tool use and uptake patterns.
Analyses—Q2 Tool Impact on Coverage and Health Care Visits
We structured analyses to address our second study question to test the following 3 hypotheses: (1) Tool use would be associated with higher odds of gaining/maintaining Medicaid insurance coverage; (2) among children with an uninsured visit, tool use would be associated with higher odds of returning for a clinical visit, because coverage would remove an access barrier; and (3) among children with an uninsured visit who returned for a second visit, tool use would be associated with lower odds of being uninsured at subsequent visits. These analyses are premised on the assumption that uninsured children would benefit most from the tools. To test these hypotheses, we conducted the following analyses within relevant subsets of our 3 cohorts (intervention patients, within-clinic comparison patients, and control clinic patients):
- Among patients who could be linked to Medicaid data (those with a Medicaid identification number in the EHR), we assessed (1) the odds of gaining coverage and (2) the odds of losing coverage during the study period. The assessment period was June 1, 2014, to September 30, 2015. Medicaid enrollment data used to evaluate this outcome were available only through September 30, 2015, so these analyses ended at that date.
- Among those who had an uninsured visit within 6 months of the tools' implementation date, we calculated the odds of returning for a second visit. The assessment period was December 1, 2013, to November 30, 2015. We selected this outcome based on the recommendation that most children should have at least 1 wellness visit per year.52
- Among patients in the second group above who did return for a visit, we calculated (1) the odds of being uninsured and (2) the odds of being insured by Medicaid at return visit(s) within the assessment period of December 1, 2013, to November 30, 2015.
In these analyses, our assessment period extended to 6 months before tool implementation, to include patients who would have been identified as uninsured or about to lose coverage, if the tool were used for panel-based outreach. We calculated odds ratios using adjusted generalized estimating equation (GEE) models to account for within-clinic clustering and adjusted for demographic and utilization covariates.
Analyses—Q3 Tool Impact on Receipt of Recommended Care
We structured analyses to address our third study question to test the hypothesis that tool use would be associated with higher rates of recommended care receipt in the 18 months following tool implementation, compared with within-clinic comparison patients and control clinic patients.
We assessed receipt of recommended care using the Child Core Set of health care quality measures set out by the Children's Health Insurance Program Reauthorization Act (CHIPRA). We extracted data from the EHR and used CHIPRA technical specifications published in March 2015,53 the most current version available during our study's assessment period. The following indicators were deemed most appropriate to our study population and yielded sufficient sample size for analysis: well-child visits in the first 15 months of life; well-child visits in the third, fourth, fifth, and sixth years of life; adolescent well-care visit; body mass index (BMI) assessment for children and adolescents; human papillomavirus (HPV) vaccine for female adolescents; chlamydia screening in women; childhood immunization status; and immunizations for adolescents. We assessed all measures among eligible subpopulations over the 18-month period after study tools were implemented (June 1, 2014, to November 30, 2015). Using chi-square statistics, we tested between-group differences in the percentage of eligible children meeting each measure. We also estimated adjusted odds ratios for meeting each measure, comparing the intervention group with within-clinic comparison patients for whom the tools were not used, and with control clinic patients.
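A between-group comparison of this kind reduces to a standard chi-square test on a 2 × 2 table of patients meeting versus not meeting a measure. The sketch below is purely illustrative: the counts are invented (loosely echoing the magnitude of the well-child visit difference reported in the Results) and are not study data.

```python
from scipy.stats import chi2_contingency

# Hypothetical counts for one quality measure.
# Rows: tool-used group vs. within-clinic comparison group.
# Columns: met the measure vs. did not meet the measure.
table = [
    [126, 74],   # tool-used group: 126 of 200 met the measure (63%)
    [78, 122],   # comparison group: 78 of 200 met the measure (39%)
]

# chi2_contingency applies Yates' continuity correction for 2x2 tables.
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_value:.4g}")
```

With counts of this size the difference is highly significant; in the study, analogous tests were run separately for each CHIPRA indicator and each pairwise group comparison.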
For all quantitative analyses, we estimated odds ratios using adjusted GEE models to account for within-clinic clustering; we set statistical significance at α = .05. We assumed a compound symmetry correlation structure and applied a robust sandwich variance estimator to account for possible model misspecification.54 Because our propensity score matching technique was insufficient to balance on many differences in the cohort populations, we adjusted all models for a range of patient covariates: gender, age, race/ethnicity, language, Federal Poverty Level (FPL), number of encounters, new/established patient status, household members on case, and length of time household established with clinic. We also analyzed the 3 cohorts as described above using an intent-to-treat (ITT) approach, in which all children with ≥1 clinical visit at an intervention clinic, regardless of tool usage, were compared with the group of control clinic patients (Appendix). We recognized that ITT analyses might not be valid in the presence of low uptake rates, which provided little information about the tools' efficacy and made the analyses more susceptible to type II error.55 We conducted all analyses using SAS software, version 9.4 (SAS Institute, Inc). The Oregon Health & Science University IRB approved the study.
Results
Patient populations differed somewhat between intervention and control clinics: Intervention clinic patients were more commonly Hispanic and Spanish-speaking, had lower household incomes, were more likely to be a new patient at their first encounter, and were less likely to have a Medicaid identification number (Table 1 and Appendix, Table A.1). The subset of patients for whom the tools were used within intervention sites further differed from the overall population of intervention clinic patients (described below).
Q1: Tool Uptake
Tool Use Rates
Overall, use of the suite of tools we developed to support insurance activities was low. Over the course of the study period, the Tracking Form was used for approximately 15% of all pediatric patients with a clinical visit at an intervention site (Figure 13 and Table 1). Although we could not track the other tools quantitatively, qualitative assessments suggest that the Pop-up Alerts and Roster Tool were also used for a small subset of patients. Additionally, we observed that Tracking Form use changed over time (Figure 14): An immediate spike in use rates occurred when the tool went live in June 2014, then utilization dipped for several months. After we conducted additional site visits and trainings that fall, starting in November 2014, tool use increased gradually but steadily throughout the remainder of the follow-up period, though it remained low overall.
Qualitative data offer several explanations for this tool use pattern. A key factor was the effect of concurrent initiatives. Most notably, clinic staff were initially very enthusiastic about the tools' potential to alert them when a patient's insurance was due to expire. However, because of the implementation of Affordable Care Act insurance expansion in Oregon, the state received an overwhelming influx of Medicaid applications just before the study period began; in response, it extended many coverage expiration dates to prevent coverage loss owing to longer processing times for renewal paperwork.
Because of everything with OHA [the Oregon Health Authority] being so crazy, renewal dates are all over the place. A lot of people … are not renewing and they're just being extended. [The state is] extending expiration dates so that … people don't lose coverage, even though they were supposed to lose coverage, like, back December 31st. But they're extending them because there's so many people applying, and they're so short staffed and, you know, so they just keep extending, extending, extending so that they have time to … renew medical benefits. [Clinic 5, Eligibility Specialist]
Despite this increase in insurance application activity, staff stopped using the tools when they realized that the coverage end dates shown in the tools did not coincide with the state's extended coverage end dates:
We were excited because [the tools were] going to tell us when it's time to renew. We're going to be able to research all these patients and call them…. But then it didn't really work. And then the front [desk staff] would schedule appointments because [the tool] would say it's time to renew. But then when they would come [in] I would call and [the state] would say, no, it's not time. So that was kind of a bummer. We thought it would work, and … help not only the patient but, you know, us too. [Clinic 5, Eligibility Specialist]
We think that the insurance is supposed to be expired by January. Then we realize that it's been extended 3 more months. When we check [in] 3 more months, the next day there's another 3 more months. So my staff are like, “How can I do any follow-up?” … There is not a real and exactly accurate redetermination date for us to support our patients. [Clinic 4, Eligibility Specialist Supervisor]
In addition, during the study period, new health insurance outreach initiatives required CHCs to use specific processes to track insurance application assistance. In 2013, HRSA announced a new funding opportunity for CHCs to extend insurance assistance to all community members, whether patients or not.22 The CHCs participating in the study received this HRSA community outreach grant funding, which changed their workflows and reporting requirements. Although some overlap occurred between the data fields in the study tools and the HRSA data tracking requirements, HRSA required the collection of additional information that was not included in the Tracking Form, including but not limited to number of family members being assisted or included on each health insurance application. The intervention CHCs also participated in an alternative payment program that required staff to document every type of patient interaction, including insurance assistance, using specified reporting methods.
The study tools had been designed based on prior workflows and were not easily adapted to the workflows necessary to meet the requirements for these new initiatives, so clinics developed alternative tracking mechanisms to document the newly required data. As a result, CHC staff were tracking HRSA data and insurance-related interactions in clinic-developed spreadsheets and not using the study tools, which would have required time-consuming and duplicative data entry.
In addition, staff did not receive clear direction from management about using the various recording systems available to them. One staff member said she did not use the tools because she had not been instructed by a manager to do so; another said she had been instructed to use only the clinic's spreadsheet, not the tools under study:
I'm just doing what we were told to do … to keep track on the [spreadsheet] insurance tracking list. And that's how I go and put all the information. [Clinic 4, Eligibility Specialist]
These concurrent initiatives not only created confusion about which tools to use for which purposes, but also increased the eligibility specialists' workloads. As a result, the eligibility specialists had little time to conduct proactive outreach activities using the Roster Tool to identify patients whose insurance was nearing expiration.
Respondent: It's been really hard for me to get into that list. I know I should be working on it. But there is … no way unless we blocked certain time from our schedule to get into the list and see which ones need to re-apply.
Interviewer: What are the barriers?
Respondent: The workload on working with appointments every hour. And sometimes we have families of 7 or 8 people. And, even though I do the application online, it's fairly time consuming … because sometimes people don't have all the information with them. So unless we block the schedule … there is not enough time to do the follow-up list. [Clinic 1, Eligibility Specialist]
Tool Use by Patient Characteristics
Although tools were intended for use with all pediatric patients, actual tool use appeared to be driven, in part, by patient characteristics. Children for whom the Tracking Form was used differed from those for whom it was not used, most notably by ethnicity: 95% were Hispanic, compared with 65% of within-clinic comparison patients; and 88% used Spanish as their primary language, compared with 51% (P < .001 for all differences; Table 1). Qualitative results concur that patients' ethnicity and language drove the use of the insurance assistance tools. Eligibility specialists reported that Spanish-speaking patients often requested application assistance and often referred other families that needed insurance assistance.
Some [Spanish-speaking patients] don't want to apply just because they don't think they qualify. [Clinic 4, Front Desk Worker]
Since we've been here for a long time, the word of mouth with the Hispanic families has been the word of mouth, basically. They know where we are and they just come. [Clinic 3, Eligibility Specialist]
I can be like a liaison between the client [and the state]. And especially if they're Spanish speaking, they tend to be more intimidated to call their case worker to go into the office. [Clinic 2, Eligibility Specialist]
In addition, intervention patients were older, had lower household incomes, were more likely to be established clinic patients at their first study visit, and had more clinical encounters in the study period compared with within-clinic comparison patients (P < .001 for all comparisons; Table 1). Among patients with a Medicaid ID, intervention patients were also more likely to have another household member on their Medicaid case (P < .001; Table 2).
Tool Use Beyond the Intended Study Population
Although the tools were designed for use with all pediatric patients seen at the study clinics, we observed tool use in 2 groups not included in our main study population: (1) children who were not established patients and (2) adults. Children who were not established clinic patients (had no clinical visits during the study period; n = 969) were older, less commonly Hispanic or Spanish speaking, and less likely to have a Medicaid ID than intervention patients (P < .001 for all; Table 3). Among adults for whom the tool was used, most were female (65%), under age 50 (77%), with household incomes ≤138% of FPL (88%), Hispanic (86%), and Spanish speaking (80%). Of these adults, 23% were assisted with insurance but had no medical visit during the study period. Almost a third of adults with a Medicaid ID did not share a case number with other household members, indicating that the Tracking Form was also used for “single” adults, not just family members of children already being assisted.
Our qualitative work helps explain these findings. The eligibility specialists often aided entire families with their health insurance; when helping a child apply, the specialist screened the rest of the household and assisted all eligible family members at the same appointment.
The eligibility specialist pulls a paper [application] and begins to fill it out with the family…. She writes in the names and info about all the family members. [Clinic 1, Fieldnotes]
Interviewer: How long does it take you to complete an application with the family?
Respondent: It depends. If it's just 1 person, it takes about 45 minutes. But if it's a family, a big family, 5 or 6, it takes me a little bit more than an hour. [Clinic 4, Eligibility Specialist]
Because EHRs are designed to be used at the patient level (ie, users work in only 1 patient chart at a time, and the Tracking Form was linked to an individual patient's chart), the study tools did not allow eligibility workers to document assistance provided to a family unit without conducting multiple data entries.
Q2: Tool Impact on Coverage and Health Care Visits
Tool Use Was Associated With Higher Odds of Continuous Medicaid Coverage
Fewer intervention patients had Medicaid coverage for <50% of the postimplementation period (7%) than either the within-clinic comparison patients (13%) or the control clinic patients (11%). Pediatric patients for whom the Tracking Form was used had higher odds of gaining Medicaid after being uninsured, relative to comparable patients at the same clinics for whom the tool was not used (adjusted odds ratio [aOR], 1.76; 95% CI, 1.60-1.93) and control clinic patients (aOR, 2.28; 95% CI, 1.91-2.72). Patients for whom the tool was used also had lower odds of losing coverage than within-clinic comparison patients (aOR, 0.71; 95% CI, 0.53-0.94) or control clinic patients (aOR, 0.55; 95% CI, 0.45-0.67; Table 2). These relationships were consistent in ITT analyses (Appendix, Table A.2).
Tool Use Was Associated With Higher Odds of an Uninsured Child Returning for a Subsequent Clinic Visit
Among the subset of patients with ≥1 uninsured visit within 6 months of tool implementation, the adjusted odds of returning for ≥1 follow-up visit was significantly higher in the intervention group than in the within-clinic comparison group (aOR, 1.83; 95% CI, 1.39-2.40) and also higher than in the control clinic group, although the difference was not statistically significant (aOR, 1.64; 95% CI, 0.92-2.91; Table 4).
Tool Use Was Associated With Lower Odds of Being Uninsured at Return Visits
Among uninsured patients who did return for at least 1 clinic visit during the follow-up period, patients with tool use had significantly lower odds of being uninsured at all return visits (aOR, 0.46; 95% CI, 0.24-0.88), compared with patients within the same clinics without tool use. Additionally, patients with tool use had double the odds of being insured by Medicaid at a follow-up visit compared with within-clinic comparison patients without tool use (aOR, 2.11; 95% CI, 0.91-4.89); this difference was not statistically significant, possibly because the comparison was underpowered (Table 4). When compared with the control clinic group, these relationships were reversed and did not reach statistical significance. Because of a lack of variability within subgroups, we were unable to produce adjusted odds ratios for this comparison; we report unadjusted odds ratios instead (see Table 4 footnotes). These associations yielded mixed results in ITT analyses (Appendix, Table A.3).
Q3: Tool Impact on Receipt of Recommended Care
Tool Use Was Associated With Higher Rates of Receipt of Quality Recommended Care
In the 18 months after tool implementation, intervention patients for whom the Tracking Form was used had higher rates of most assessed CHIPRA indicators compared with both within-clinic comparison patients and control clinic patients (Table 5). For example, 63% of eligible intervention patients had the recommended number of well-child visits in the first 15 months of life, compared with 39% of within-clinic comparison patients for whom the tools were not used (P < .001) and 53% of control clinic patients (P = .02). Female adolescents for whom the tool was used were more likely to complete HPV vaccination by their 13th birthday (54%) vs 36% for within-clinic comparison patients and 38% for control clinic patients (P < .001 for both). Intervention patients were more likely to be up-to-date on most immunization indicators, compared with both comparison groups. The only measures for which intervention patients did not perform significantly better than either comparison group were BMI assessment and chlamydia screening among sexually active women. Many of these observed differences remained significant when adjusted for covariates and clinic clustering (see Table 5). Results of ITT analyses were mixed (Appendix, Table A.4).
Discussion
This study achieved its goal of partnering with stakeholders in a user-centered design process to build EHR-based insurance assistance tools for CHCs.30-32 During the first 18 months after tool implementation, tool use rates were low. However, results suggest that CHC staff saw a benefit to using the tools for certain subpopulations (eg, Hispanic, Spanish speaking) and for individuals beyond the study population (eg, adults). Further, pediatric patients for whom the Tracking Form tool was used had significantly higher odds of gaining Medicaid after being uninsured, and significantly lower odds of losing Medicaid coverage, than those for whom it was not used. Among patients with an uninsured visit in the study period, tool use was associated with higher odds of returning for a follow-up visit and lower odds of being uninsured at follow-up visits. Using the CHIPRA quality of care measures as indicators, children for whom the Tracking Form tool was used had significantly higher rates of several quality measures than comparison groups in the 18 months after tool implementation. These findings are consistent with a statewide analysis showing that gaining Medicaid is associated with increased use of pediatric primary and preventive care services.56
Real-world factors are common in pragmatic trials but can be difficult to anticipate. We conducted our study during a historic time in health care reform, and we could not anticipate the new programs and conditions that appeared in a rapidly evolving landscape. For example, Medicaid expansion resulted in a record number of new applicants, and Oregon was a national leader in this expansion.57 While this could have been the perfect environment in which to implement a suite of insurance application tools, several factors greatly limited adoption. First, this surge in Medicaid applications increased processing time, and Oregon responded by extending coverage end dates for reapplicants. As a result, the end dates built into our tools were no longer accurate, which affected tool usage rates and users' perceptions of the tools' usefulness. Second, Medicaid expansion increased workloads for CHC staff, who were then too busy helping new applicants to have time to use the tools for proactive outreach. Third, an HRSA grant program that incentivized the CHCs to use a different documentation system was implemented,22 requiring data tracking documentation incompatible with our tools.
A strength of this study is its mixed methods design; qualitative data showed how external factors affected study outcomes. Future efforts might engage a broader set of stakeholders in tool development to better predict changing landscapes that might affect adoption, and tools should be designed for easy and quick adaptation to unforeseen circumstances.
Another notable finding from this study was that, although designed for use among pediatric patients, the tools were used among adults as well. About two-thirds of the time, these adults had a child who was also assisted, and qualitative data concurred that eligibility specialists often assisted entire families with enrollment. Existing EHR data structures do not allow for reliable identification of family units;46 rather, they treat each patient as a single unit, in part because linking multiple patient records together in an EHR-based tool could lead to unintended disclosure of Protected Health Information across the records, thus violating the Health Insurance Portability and Accountability Act (HIPAA) Privacy Regulation. As a result, documenting families as a group in the Tracking Form tool required users to enter information separately for each family member—a time-consuming process of multiple data entry. Future tools should support tracking and documenting on family units, and EHR designs should accommodate linking family members, or at the very least linking children with their parents/caregivers, through mechanisms compliant with the HIPAA Privacy Regulation.
Our experience with stakeholder engagement yielded important lessons for future efforts to develop and implement EHR-based, automated tools to help CHCs provide insurance enrollment support. Our iterative tool development process successfully engaged a variety of stakeholders in the design of the tools and the conduct of the study.32 Our stakeholders were instrumental in helping us identify and understand the gaps in the insurance support workflows in the CHCs and how health information technology could be used to bridge those gaps. In addition, they were key informants in defining what functionality was specifically needed to support clinics' health insurance efforts. Furthermore, our stakeholders helped keep us abreast of changes in the policy and health care landscape relevant to CHCs and the tools we developed, and provided input to help shape future directions for these tools. A growing body of evidence from a variety of disciplines, such as user-centered design and community-engaged research, suggests that such engagement enhances the acceptance and uptake of new technologies and tools.58-60
Implications
This study contributed new knowledge about the effectiveness of innovative health insurance enrollment systems and EHR-based data tools in supporting CHCs as they assist families with the process of obtaining and retaining children's health insurance coverage and deliver high-quality, evidence-based pediatric care. Based on the many promising findings from this first study, we believe it is important to continue pursuing this line of inquiry, which has significant future implications for improving patients' health.
We have already embarked on 2 exciting next steps. First, we are working with the leadership of the study intervention clinics to use our tools and findings in crafting effective strategies for meeting their 2016-2017 organizational goal of “No Gaps In Coverage” for all of their Medicaid-eligible patients. Second, findings from this study directly informed the design and implementation of a hybrid effectiveness-implementation trial, the Community-based HIT Tools for Cancer Screening and Health Insurance Promotion (CATCH-UP) study, funded by the National Cancer Institute (NCI R01 CA181452, DeVoe PI). We have shared the final algorithms and functionalities of the IMPACCT health insurance enrollment assistance tools, training materials, pre- and postimplementation qualitative findings, and best practices with the CATCH-UP study. The 5-year CATCH-UP study is building on lessons learned, stakeholder engagement, and findings from IMPACCT by developing a next generation of health insurance enrollment tools and spreading the updated suite of tools to 23 additional OCHIN clinic sites. The CATCH-UP health insurance tools include updated and expanded tracking and reporting functionality, including new data fields to support HRSA outreach and enrollment reporting. In addition, the new tool structure captures the number of individuals in a family unit, without identifying who those family members are. CATCH-UP will formally study implementation support strategies to identify key facilitators for increasing the tools' use as well as their effectiveness with adult patients. Upon completion of the CATCH-UP study, and based on the learnings from both IMPACCT and CATCH-UP (if the updated version of the health insurance enrollment tools prove effective), our ultimate goal is widespread dissemination of the tools, not only throughout the OCHIN clinic network (>450 clinic sites) but nationwide.
The potential long-term impact of our work is to maximize health insurance enrollment, promote insurance coverage stability, improve health outcomes, enhance the efficiency of health care system performance, reduce health care disparities for vulnerable populations, and ultimately improve the health of at-risk populations.
Limitations
To accommodate CHC stakeholder requests early in the study, the intervention was not randomly allocated to the 4 CHCs initially recruited into the study—all 4 received the intervention. Instead, we used propensity score techniques to identify the 4 CHCs that were closest to the intervention clinics based on patient panel characteristics and date of EHR implementation. However, significant differences remained between intervention and control sites, probably because of the small number of nonintervention CHCs available for matching. We attempted to address this imbalance through additional statistical regression adjustment. Although this approach is not as closely controlled as a more traditional randomized trial, it was chosen to suit this pragmatic, real-world evaluation. This approach was also more ethically appropriate for research in this setting because it enabled us to avoid recruiting CHCs with the possibility of being randomized into a control arm.
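The clinic-matching step described above can be sketched as propensity-score matching: fit a model for the probability of being an intervention clinic given clinic characteristics, then match each intervention clinic to the control candidate with the nearest score. The sketch below uses simulated clinic features; the feature set, counts, and matching-with-replacement choice are all illustrative assumptions, not the study's actual procedure or data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

# Simulate 20 candidate clinics with 3 standardized characteristics
# (stand-ins for panel demographics and EHR implementation date).
rng = np.random.default_rng(1)
X = rng.normal(size=(20, 3))
treated = np.zeros(20, dtype=int)
treated[:4] = 1  # the first 4 clinics received the intervention

# Propensity score: modeled probability of being an intervention clinic.
ps = LogisticRegression().fit(X, treated).predict_proba(X)[:, 1]

# Match each intervention clinic to the control clinic with the
# nearest propensity score (here, with replacement for simplicity).
ctrl_idx = np.where(treated == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[ctrl_idx].reshape(-1, 1))
_, match = nn.kneighbors(ps[treated == 1].reshape(-1, 1))
matched_controls = ctrl_idx[match.ravel()]
print("matched control clinics:", matched_controls)
```

With only a handful of control candidates, as in the study, residual covariate imbalance after matching is likely, which is why the analyses additionally adjusted for patient covariates in regression models.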
Unmeasured differences between the intervention and control CHCs may have affected results. As with any study, unobserved differences may have existed between study groups that affected insurance enrollment, which in turn may have affected the study results. Through our qualitative work, we learned that eligibility specialists, the primary users of the Tracking Form, had similar responsibilities at both the intervention and control sites, but we were unable to compare their work quantitatively.
We focused on the impact of the tools on the children for whom the tools were used. The addition of a within-clinic comparison cohort—patients seen in CHCs with access to the tools but for whom the tools were not used—provided another comparison group for quantitative analyses and accounted for potential clinic-level effects. Additionally, we adjusted analyses for patient- and clinic-level confounders, but model-based adjustment may have inadequately controlled for differences between study groups. We also assessed clinic-level effects in a secondary ITT analysis, which, as expected, showed few significant results. This, too, reflects the nature of pragmatic studies, in which tool use is an optional rather than a required part of the intervention, as it would be in a more controlled study. Furthermore, the tools were used on a small number of patients, so we did not assess clinic-level changes in uninsured rates.
Because EHRs are not structurally designed to link records for families, we could not directly assess whether tool use was affected by the number of individuals being assisted in a given family. We could use Medicaid enrollment data to assess family size, but only among household members who ever received Medicaid. Our follow-up period of 18 months may not have been long enough to assess uptake and impact. In addition, it was technically feasible to quantify only Tracking Form use; although we evaluated use of the other tools through qualitative data collection, we could not assess whether their use affected the outcomes of interest here. Given the relatively low amount of tool use, as well as the other limitations noted above, inferences drawn from this study should be considered tentative.
Conclusions
The IMPACCT Kids Care study was the first to evaluate the feasibility and impact of developing EHR-based tools to help primary care clinics provide insurance enrollment support to pediatric patients. Our results suggest EHR-based tools could increase insurance enrollment, prevent coverage loss, and promote receipt of recommended pediatric care services. Our encouraging findings provide important lessons to improve such tools and increase their future adoption. First, when used to proactively prevent lapses in coverage for insured patients, the design and implementation of EHR-based insurance assistance tools require close collaboration with payers (eg, Medicaid/CHIP) as well as an infrastructure that can generate accurate data on insurance coverage end dates61 and can be easily adapted to a changing insurance landscape. Second, a family-based tool (in the EHR or elsewhere) is needed, as EHRs are designed to be used at the patient level, while clinic staff assist the whole family. A family-based tool would facilitate the implementation of health insurance enrollment support into most clinic workflows.
References
- 1.
- Hoffman C, Paradise J. Health insurance and access to health care in the United States. Ann N Y Acad Sci. 2008;1136:149-160. [PubMed: 17954671]
- 2.
- Newacheck PW, Stoddard JJ, Hughes DC, Pearl M. Health insurance and access to primary care for children. N Engl J Med. 1998;338:513-519. [PubMed: 9468469]
- 3.
- Courtemanche CJ, Zapata D. Does universal coverage improve health? The Massachusetts experience. J Policy Anal Manage. 2014;33(1):36-69. [PubMed: 24358528]
- 4.
- Schoen C, DesRoches C. Uninsured and unstably insured: the importance of continuous insurance coverage. Health Serv Res. 2000;35(1 Pt 2):187-206. [PMC free article: PMC1089095] [PubMed: 10778809]
- 5.
- Olson LM, Tang SF, Newacheck PW. Children in the United States with discontinuous health insurance coverage. N Engl J Med. 2005;353(4):382-391. [PubMed: 16049210]
- 6.
- Cassedy A, Fairbrother G, Newacheck PW. The impact of insurance instability on children's access, utilization, and satisfaction with health care. Ambul Pediatr. 2008;8(5):321-328. [PubMed: 18922506]
- 7.
- Cummings JR, Lavarreda SA, Rice T, Brown ER. The effects of varying periods of uninsurance on children's access to health care. Pediatrics. 2009;123(3):e411-e418. [PubMed: 19254977]
- 8.
- DeVoe JE, Ray M, Krois L, Carlson MJ. Uncertain health insurance coverage and unmet children's health care needs. Fam Med. 2010;42(2):121-132. [PMC free article: PMC4918751] [PubMed: 20135570]
- 9.
- Federico SG, Steiner JF, Beaty B, Crane L, Kempe A. Disruptions in insurance coverage: patterns and relationship to health care access, unmet need, and utilization before enrollment in the State Children's Health Insurance Program. Pediatrics. 2007;120(4):e1009-e1016. [PubMed: 17908722]
- 10.
- Sudano JJ, Baker DW. Intermittent lack of health insurance coverage and use of preventive services. Am J Public Health. 2003;93(1):130-137. [PMC free article: PMC1447707] [PubMed: 12511402]
- 11.
- The Henry J. Kaiser Family Foundation. Children's health coverage: Medicaid, CHIP and the ACA. Published 2014. Accessed May 4, 2016. http://kff.org/health-reform/issue-brief/childrens-health-coverage-medicaid-chip-and-the-aca/
- 12.
- Fairbrother GL, Emerson HP, Partridge L. How stable is Medicaid coverage for children? Health Aff (Millwood). 2007;26(2):520-528. [PubMed: 17339682]
- 13.
- Short PF, Graefe DR, Swartz K, Uberoi N. New estimates of gaps and transitions in health insurance. Med Care Res Rev. 2012;69(6):721-736. [PMC free article: PMC4135711] [PubMed: 22833452]
- 14.
- Aiken KD, Freed GL, Davis MM. When insurance status is not static: insurance transitions of low-income children and implications for health and health care. Ambul Pediatr. 2004;4(3):237-243. [PubMed: 15153059]
- 15.
- Sommers BD. Why millions of children eligible for Medicaid and SCHIP are uninsured: poor retention versus poor take-up. Health Aff (Millwood). 2007;26(5):w560-w567. [PubMed: 17656394]
- 16.
- Hatch B, Angier H, Marino M, et al. Using electronic health records to conduct children's health insurance surveillance. Pediatrics. 2013;132(6):e1584-e1591. [PMC free article: PMC4918749] [PubMed: 24249814]
- 17.
- Cousineau MR, Stevens GD, Farias A. Measuring the impact of outreach and enrollment strategies for public health insurance in California. Health Serv Res. 2011;46(1 Pt 2):319-335. [PMC free article: PMC3037785] [PubMed: 21054378]
- 18.
- Meng Q, Yuan B, Jia L, Wang J, Garner P. Outreach strategies for expanding health insurance coverage in children [Review]. Cochrane Database Syst Rev. 2010;8:CD008194. [PubMed: 20687096]
- 19.
- Sandberg SF, Erikson C, Owen R, et al. Hennepin Health: a safety-net accountable care organization for the expanded Medicaid population. Health Aff (Millwood). 2014;33(11):1975-1984. [PubMed: 25367993]
- 20.
- The Henry J. Kaiser Family Foundation. Profiles of Medicaid outreach and enrollment strategies: helping families maintain coverage in Michigan. Published 2013. Accessed February 15, 2017. https://kaiserfamilyfoundation.files.wordpress.com/2013/05/8441-profiles-of-medicaid-outreach-and-enrollment-strategies1.pdf
- 21.
- Flores G, Abreu M, Chaisson CE, et al. A randomized, controlled trial of the effectiveness of community-based case management in insuring uninsured Latino children. Pediatrics. 2005;116(6):1433-1441. [PubMed: 16322168]
- 22.
- HRSA Health Center Program. Health center outreach and enrollment technical assistance. 2015.
- 23.
- Flores G, Walker C, Lin H, et al. A successful program for training parent mentors to provide assistance with obtaining health insurance for uninsured children. Acad Pediatr. 2015;15(3):275-281. [PMC free article: PMC4409443] [PubMed: 25447369]
- 24.
- InsureKidsNow.gov. Outreach tool library. Accessed February 15, 2017. https://www.insurekidsnow.gov/library/index.html
- 25.
- Bates DW, Bitton A. The future of health information technology in the patient-centered medical home. Health Aff (Millwood). 2010;29(4):614-621. [PubMed: 20368590]
- 26.
- Blumenthal D. Performance improvement in health care—seizing the moment. N Engl J Med. 2012;366(21):1953-1955. [PubMed: 22533534]
- 27.
- Blumenthal D. Implementation of the federal health information technology initiative. N Engl J Med. 2011;365(25):2426-2431. [PubMed: 22187990]
- 28.
- Buntin MB, Burke MF, Hoaglin MC, Blumenthal D. The benefits of health information technology: a review of the recent literature shows predominantly positive results. Health Aff (Millwood). 2011;30(3):464-471. [PubMed: 21383365]
- 29.
- The Henry J. Kaiser Family Foundation. A profile of community health center patients: implications for policy. Published 2013. Accessed March 30, 2016. http://kff.org/medicaid/issue-brief/a-profile-of-community-health-center-patients-implications-for-policy/
- 30.
- Angier H, Marino M, Sumic A, et al. Innovative Methods for Parents And Clinics to Create Tools for Kids' Care (IMPACCT Kids' Care) study protocol. Contemp Clin Trials. 2015;44:159-163. [PMC free article: PMC4757508] [PubMed: 26291916]
- 31.
- DeVoe J, Angier H, Likumahuwa S, et al. Use of qualitative methods and user-centered design to develop customized health information technology tools within federally qualified health centers to keep children insured. J Ambul Care Manage. 2014;37(2):148-154. [PubMed: 24594562]
- 32.
- Likumahuwa-Ackman S, Angier H, Sumic A, et al. IMPACCT Kids' Care: a real-world example of stakeholder involvement in comparative effectiveness research. J Comp Eff Res. 2015;4(4):351-357. [PMC free article: PMC4538706] [PubMed: 26274796]
- 33.
- Aday LA, Andersen R. A framework for the study of access to medical care. Health Serv Res. 1974;9(3):208-220. [PMC free article: PMC1071804] [PubMed: 4436074]
- 34.
- Phillips KA, Morrison KR, Andersen R, Aday LA. Understanding the context of healthcare utilization: assessing environmental and provider-related variables in the behavioral model of utilization. Health Serv Res. 1998;33(3 Pt 1):571-596. [PMC free article: PMC1070277] [PubMed: 9685123]
- 35.
- Andersen RM. Revisiting the behavioral model and access to medical care: does it matter? J Health Soc Behav. 1995;36(1):1-10. [PubMed: 7738325]
- 36.
- Concannon TW, Meissner P, Grunbaum JA, et al. A new taxonomy for stakeholder engagement in patient-centered outcomes research. J Gen Intern Med. 2012;27(8):985-991. [PMC free article: PMC3403141] [PubMed: 22528615]
- 37.
- Abras C, Maloney-Krichmar D, Preece J. User-centered design. In: Bainbridge W, ed. Berkshire Encyclopedia of Human-Computer Interaction. Berkshire Publishing Group, LLC; 2004.
- 38.
- Preece J, Rogers Y, Sharp H. Interaction Design: Beyond Human-Computer Interaction. 4th ed. John Wiley & Sons, Inc; 2015.
- 39.
- Gold R, Burdick T, Angier H, et al. Improve synergy between health information exchange and electronic health records to increase rates of continuously insured patients. EGEMS (Wash DC). 2015;3(1):1158. [PMC free article: PMC4562735] [PubMed: 26355818]
- 40.
- Lewis C, Rieman J. Task-centered user interface design: a practical introduction. Published 1994. Accessed February 16, 2017. http://hcibib.org/tcuid/
- 41.
- Fetters MD, Curry LA, Creswell JW. Achieving integration in mixed methods designs—principles and practices. Health Serv Res. 2013;48(6 Pt 2):2134-2156. [PMC free article: PMC4097839] [PubMed: 24279835]
- 42.
- DeVoe JE, Gold R, Spofford M, et al. Developing a network of community health centers with a common electronic health record: description of the Safety Net West Practice-based Research Network (SNW-PBRN). J Am Board Fam Med. 2011;24(5):597-604. [PMC free article: PMC3525325] [PubMed: 21900444]
- 43.
- DeVoe JE, Sears A. The OCHIN community information network: bringing together community health centers, information technology, and data to support a patient-centered medical village. J Am Board Fam Med. 2013;26(3):271-278. [PMC free article: PMC3883432] [PubMed: 23657695]
- 44.
- Rosenbaum PR, Rubin DB. Constructing a control group using multivariate matched sampling methods that incorporate the propensity score. Am Stat. 1985;39(1):33-38.
- 45.
- Hall J, Harding R, DeVoe JE, et al. Designing health information technology tools to prevent gaps in public health insurance. J Innov Health Inform. 2017;24(2):900. [PubMed: 28749314]
- 46.
- Angier H, Gold R, Crawford C, et al. Linkage methods for connecting children with parents in electronic health record and state public health insurance data. Matern Child Health J. 2014;18(9):2025-2033. [PMC free article: PMC4926760] [PubMed: 24562505]
- 47.
- Angier H, Gold R, Gallia C, et al. Variation in outcomes of quality measurement by data source. Pediatrics. 2014;133(6):e1676-e1682. [PMC free article: PMC4918742] [PubMed: 24864178]
- 48.
- Gold R, Angier H, Mangione-Smith R, et al. Feasibility of evaluating the CHIPRA care quality measures in electronic health record data. Pediatrics. 2012;130(1):139-149. [PMC free article: PMC3382922] [PubMed: 22711724]
- 49.
- Strauss A, Corbin J. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory. 2nd ed. Sage Publications, Inc; 1998.
- 50.
- Creswell JW, Plano Clark VL. Designing and Conducting Mixed Methods Research. Sage Publications; 2011.
- 51.
- Creswell JW, Plano Clark VL, Gutmann ML, Hanson WE. Advanced mixed methods research designs. In: Tashakkori A, Teddlie C, eds. Handbook of Mixed Methods in Social and Behavioral Research. Sage Publications; 2003:209–240.
- 52.
- Committee on Practice and Ambulatory Medicine and Bright Futures Periodicity Schedule Workgroup. 2016 recommendations for preventive pediatric health care. Pediatrics. 2016;137(1). [PubMed: 26324870]
- 53.
- Centers for Medicare & Medicaid Services. Core Set of Children's Health Care Quality Measures for Medicaid and CHIP (Child Core Set): Technical Specifications and Resource Manual for Federal Fiscal Year 2015 Reporting. 2015.
- 54.
- Overall JE, Tonidandel S. Robustness of generalized estimating equation (GEE) tests of significance against misspecification of the error structure model. Biom J. 2004;46(2):203-213.
- 55.
- Gupta SK. Intention-to-treat concept: a review. Perspect Clin Res. 2011;2(3):109-112. [PMC free article: PMC3159210] [PubMed: 21897887]
- 56.
- Bailey SR, Marino M, Hoopes M, et al. Healthcare utilization after a Children's Health Insurance Program expansion in Oregon. Matern Child Health J. 2016;20(5):946-954. [PMC free article: PMC4826791] [PubMed: 26987861]
- 57.
- The Henry J. Kaiser Family Foundation. How is the ACA impacting Medicaid enrollment? Published 2014. Accessed May 4, 2016. http://kff.org/medicaid/issue-brief/how-is-the-aca-impacting-medicaid-enrollment/
- 58.
- Shah SG, Robinson I. Benefits of and barriers to involving users in medical device technology development and evaluation. Int J Technol Assess Health Care. 2007;23(1):131-137. [PubMed: 17234027]
- 59.
- Ahmad R, Kyratsis Y, Holmes A. When the user is not the chooser: learning from stakeholder involvement in technology adoption decisions in infection control. J Hosp Infect. 2012;81(3):163-168. [PubMed: 22633278]
- 60.
- Angier H, Wiggins N, Gregg J, Gold R, DeVoe J. Increasing the relevance of research to underserved communities: lessons learned from a retreat to engage community health workers with researchers. J Health Care Poor Underserved. 2013;24(2):840-849. [PMC free article: PMC4926764] [PubMed: 23728049]
- 61.
- DeVoe JE, Tillotson CJ, Lesko SE, Wallace LS, Angier H. The case for synergy between a usual source of care and health insurance coverage. J Gen Intern Med. 2011;26(9):1059-1066. [PMC free article: PMC3157522] [PubMed: 21409476]
Acknowledgments
This study would not have been possible without the tireless efforts of our tool developer, Duane Ellington, and tool development project manager, Marla Dearing—we are deeply grateful for their contributions. We also thank all the clinics that participated in this research.
Research reported in this report was [partially] funded through a Patient-Centered Outcomes Research Institute® (PCORI®) Award (308). Further information available at: https://www.pcori.org/research-results/2012/using-electronic-health-records-community-health-centers-help-children-get-health-insurance-and-access-health-care
Appendix
Appendix: Intent-to-Treat (ITT) Analyses
Suggested citation:
DeVoe J, Gold R, Nelson C, et al. (2018). Innovative Methods for Parents and Clinics to Create Tools (IMPACCT) for Kids' Care. Patient-Centered Outcomes Research Institute (PCORI). https://doi.org/10.25302/6.2018.CER.308
Disclaimer
The [views, statements, opinions] presented in this report are solely the responsibility of the author(s) and do not necessarily represent the views of the Patient-Centered Outcomes Research Institute® (PCORI®), its Board of Governors or Methodology Committee.