Overview

This section provides guidance on qualitative data collection using RE-AIM. We posit that RE-AIM can be used temporally (before, during, and after implementation), in different settings (clinic, community, corporate), with multiple levels of stakeholders, and for a variety of targeted audiences and outcomes.

What’s included?

Two key peer-reviewed papers on qualitative RE-AIM application.

Key Considerations for Qualitative Data with RE-AIM

Qualitative interview guide templates, by stakeholder level and over time

Example qualitative data collection instruments from a number of specific studies

Key Considerations for Qualitative Data with RE-AIM

Table 1. Examples of applying RE-AIM dimension(s) in different settings across different phases of projects. For each project stage and planning step, examples are given for clinical, community, and corporate settings, along with overall considerations.

BEFORE IMPLEMENTATION

Consider the project impact on all RE-AIM dimensions and prioritize the focus for planning and evaluation
Example: Stakeholders' interest in intervention reach and representativeness within the setting.
Clinical: Measure: Identify potentially eligible patients through the electronic medical record. Considerations: Sensitivity analyses may be needed to determine sample size, because coding inconsistencies can influence the numerator or denominator and all data may not be available for the desired study. Prioritization: Implementation factors should be prioritized and carefully considered, as they play a key role in the program's success and ongoing sustainability; organizations with multiple sites/locations may require local 'buy in'.
Community: Measure: Estimate and compare eligible participants to demographics using Behavioral Risk Factor Surveillance System (BRFSS) or Census data. Considerations: The reach proportion may seem extremely small when county-level data are used to determine the denominator; reach and representativeness within each delivery site, and comparisons across sites, may help identify for whom the intervention is working (or not) (see the illustrative calculation after Table 1). Prioritization: Although reach is an important dimension to consider, in this example the team prioritizes the effect on the behavioral outcome.
Corporate: Measure: Identify potentially eligible participants from customers who signed up for the intervention via a wellness card. Considerations: Gain 'buy in' from corporate leadership and use existing corporate infrastructure to identify participants. Prioritization: Because the anticipated outcomes of evidence-based programs are known, delivery at multiple sites places additional emphasis on training and fidelity monitoring (to ensure outcomes are achieved).
Overall: Attempt to keep the target population as large and diverse or representative as possible for greater public health impact. Consider ways to enhance recruitment of those most vulnerable and most at risk. Use a team-based approach to decide which dimension is a priority for the work, and allocate resources accordingly.

Determine how each dimension will be included in the project: describe, assess, and/or intervene
Example: Decision made to intervene to improve adoption, describe effect, and assess implementation fidelity.
Clinical: Intervene: A healthcare organization is implementing a new protocol for nursing rounds; some clinics receive an additional intervention to improve adoption of the protocol. Describe: Measure the effect of the new rounding protocol (i.e., did it achieve the outcome of interest). Assess: The degree to which the new nurse rounding protocol was delivered consistently over time and across clinics.
Community: Intervene: Improve adoption of a diabetes prevention intervention by YMCA centers. Describe: Rates of diabetes reduction or other proximal outcomes (weight loss, physical activity improvements). Assess: The degree to which the diabetes prevention program was delivered consistently across YMCA sites.
Corporate: Intervene: Improve adoption of a wellness program at a local grocery store within a national chain. Describe: Outcomes, including unintended negative consequences of the wellness program. Assess: The degree to which the wellness program was delivered consistently across grocery stores in that chain.
Overall: Avoid the publication bias of solely reporting the effect of an intervention on the desired outcome/behavior change without describing or assessing other RE-AIM dimensions. Consider a hybrid design when intervening on or assessing both the clinical/behavioral intervention and the implementation strategy.

Develop data collection and reporting procedures and timelines for selected RE-AIM dimensions
Clinical: Consider the metrics of interest and how data will be transferred. Consider whether HIPAA compliance or a business associate agreement/data use agreement (BAA/DUA) is needed. Determine the appropriate timeline for observing outcomes (e.g., a full year of observation may be needed to see change in clinical outcomes).
Community: Pragmatically consider what is feasible to collect based on the intended purpose of the intervention. Consider who, in what community organization, has the time and skills necessary to deliver a program. Weigh the pros and cons associated with subjective versus objective measures, primary versus secondary data, and self-reported data from participants versus administrative measures.
Corporate: Consider the messages important for key stakeholders and the data that will drive such messages. Determine the time and resources needed to obtain such measures and the formats/modalities for disseminating findings to leadership and consumers.
Overall: Consider 'balancing metrics' and unintended outcomes, as well as assessing and reducing potential health inequities.

Engage all project staff and partners in processes to ensure transparency, equity, compliance with regulations, and support (ongoing throughout the project)
Example: Determine appropriate stakeholders and where, when, how, and why they will be engaged.
Clinical: Consider the structure of the clinical healthcare organization and potential stakeholders, including nurses, nurse assistants, physicians, patients/family, and administrators. Consider that it may not be appropriate to engage patients for an electronic medical record update.
Community: Bring together stakeholders from diverse sectors (e.g., government, academia, faith-based, aging) to allow each to vocalize their 'pain points' and definitions of success. Form a comprehensive set of variables based on stakeholder priorities and use those elements to measure outcomes relevant to each stakeholder. Consider the time course of putative effects.
Corporate: Engaging multiple employee types (leadership, different divisions/roles) in conversations about new initiatives brings a sense of ownership, which can bolster initial and ongoing support. Including multiple employee perspectives in the planning phase helps identify the logistics of implementation and anticipated outcomes, which will increase initial adoption and the potential for longer-term maintenance.
Overall: Diverse perspectives allow all parties to provide feedback about processes and procedures so that a coordinated approach can be devised and executed with fidelity. Construct a logic model to understand content, activities, and short- and longer-term impact.

Plan for sustainability and generalizability from the outset
Clinical: Consider how intervention and assessment components can be implemented in settings with different histories, resources, and workflows. Plan to communicate results with stakeholders providing guidance, and align reporting of information with the data needed for decision-making about sustainability.
Community: Develop a coalition or advisory board to be engaged throughout the process, including those not directly involved in the project, to identify the information and resources needed to increase the likelihood of sustainability.
Corporate: Include staff with clinical expertise to be engaged throughout the process, including those not directly involved in the project.
Overall: Design for feasibility, success, and dissemination in a way that addresses each RE-AIM dimension. Design the intervention to be broadly applied within and across settings.

DURING IMPLEMENTATION / ITERATIVE ASSESSMENT AND ADJUSTMENT

Monitor data periodically and at key points for each dimension (emphasis on priority dimensions)
Clinical: Have brief (perhaps 'automated'), ongoing data collection. Use rapid, pragmatic assessments to identify reasons for initial results.
Community: Conduct training for program delivery staff about data collection procedures, including data completion and quality checks. Routinely export available data from administrative records and secondary sources to track real-time changes.
Corporate: Have brief 'automated' ongoing data collection from routine company records. When supplementary outcome measures are used, conduct training for program delivery staff about data collection procedures, including data completion and quality checks. Routinely export available data from administrative records and secondary sources to track real-time changes.
Overall: Use pragmatic, timely, and low-resource data collection for ongoing decision making and engagement in the plan-do-study-act (PDSA) cycle over time and dimensions.

Track implementation and costs as well as fidelity to core components if those are priority dimensions
Clinical: Discuss and implement low-burden cost assessments (interviews, tracking, observations) at key time points.
Community: Develop systems for fidelity monitoring (observation) and adherence to the delivery protocol. Programs that breach fidelity are subject to additional unplanned costs (e.g., cost per participant increases if workshops are not filled to capacity).
Corporate: Track implementation and variability across sites. Routinely compare outcomes across a random sample of sites as a way of identifying unanticipated fluctuations and potential protocol deviations.
Overall: Real-time issues can be addressed more rapidly. This avoids type 3 error (concluding that an intervention did not work when delivery was perhaps not consistent with evidence-based components).

Perform ongoing assessments of project evolution and adaptations
Clinical: Probe adaptations to address each RE-AIM dimension. Track implementation and impact over time and across settings and staff.
Community: Routinely export available data from administrative records and secondary sources to track real-time progress. Regularly debrief with program deliverers and organizational partners to identify (and adapt to address) unforeseen challenges.
Corporate: Track implementation and impact over time and across settings and staff. Collect stories and 'positive deviance' examples to inspire other settings.
Overall: Capture real-world adaptations by systematically collecting data on how, why, when, and by whom changes are being implemented in the field.

Reconsider the intervention impact on (and priorities for) all RE-AIM dimensions
Clinical: Use both quantitative and qualitative assessments. In applied cases, use 'good enough' methods and ballpark estimates when 'gold standard' methods are not feasible.
Community: Assess whether the number of participants reached will enable meaningful outcomes to be observed, and adjust recruitment/delivery accordingly. Discuss project progress with program deliverers, partnering organizations, and other key stakeholders regularly to ensure transparency and identify changes in priorities for the project.
Corporate: Assess program impact on the 'bottom line' and estimated return on investment. Discuss project progress with program deliverers, different locations, and other key stakeholders regularly to ensure transparency and identify changes in priorities for the project.
Overall: Continued discussion with stakeholders ensures that the appropriate impact is being achieved. Continue to consider which dimensions to intervene on, describe, or assess, particularly for long-term intervention work.

Decide if adaptations are needed to address problems with outcomes on one or more RE-AIM dimensions
Clinical: Pilot and then implement the intervention or implementation strategy adaptations needed to improve performance, and track their impact.
Community: Assess the appropriateness of participants engaged in the intervention to determine if appropriate and equitable outcomes are observed. Depending on what is seen, there may be implications for refining participant recruitment and retention procedures.
Corporate: Test different intervention or implementation strategy adaptations needed to improve performance, and track their impact. Track innovations.
Overall: Prioritize adaptations and test their impact across dimensions (see Figure 1 of Harden et al., 2018, cited below).

AFTER IMPLEMENTATION / SUMMATIVE

Evaluate the impact on all relevant RE-AIM dimensions
Clinical: Consider subgroup as well as overall effects. Consider overall impact on quality of life and patient-centered outcomes. Include balancing measures.
Community: Begin with priority dimensions and 'low-hanging fruit'. Reach and implementation measures may be easily assessed, whereas adoption and maintenance may require more in-depth processes to identify.
Corporate: Consider subgroup effects in addition to overall outcomes. Based on findings, target the intervention to streamline resources and impact.
Overall: Return to the RE-AIM plan and summarize accordingly. If conducting a retrospective RE-AIM evaluation, use existing tools to ensure consideration of the concepts and elements within each dimension.

Calculate costs and cost-effectiveness for each RE-AIM dimension
Clinical: Report costs from the perspective of multiple stakeholders: adopting settings, the clinical team, and patients. Estimate replication costs in different settings or under different conditions.
Community: Consider the benefits of cost and cost-effectiveness in terms of expanding the initiative geographically versus scaling up in your local area (or both). Costs may differ for new initiatives relative to those that are ongoing.
Corporate: Summarize return on investment and expected rate of return. Consider how cost-saving procedures can be employed in future roll-outs.
Overall: Communication and evaluation of costs contribute to the generalizability of the intervention.

Determine why and how observed RE-AIM results occurred
Clinical: Consider using mixed methods to blend objective data (the 'what') and impressionistic data (the 'why and how') to gain a more comprehensive understanding of the context of intervention successes and challenges.
Community: Share findings with stakeholders within and external to organizations to contextualize and interpret findings. Multiple perspectives will drive decisions about impact, needed adaptations, and grand-scale dissemination (if appropriate).
Corporate: Collect stories and reports about keys to success and share these at meetings, on company websites, etc.
Overall: Contribute to the understanding of the mechanisms that achieved the effect for multiple populations, settings, and staff.

Disseminate findings for accountability,
future projects, and policy change
Base statistical findings on clinically significant
findings valued by clinicians. Costs may be appropriate for leadership and
health plans.
In community settings, general findings about
improvements seen among participants and testimonials may be appropriate for
community residents and partnering organizations.
In corporate settings, metrics related to
productivity and staff absenteeism may be most appropriate for leadership to
assess cost-benefits of employee-level interventions. Staff outcomes and
program feedback may be indicative of overall employee engagement.
Determine the most appropriate format to distribute
findings and which messages are most meaningful for that audience.
Plan for replication in other settings based on results
Clinical: Summarize lessons learned and provide guides for implementation and adaptation for different types of settings.
Community: Consider reporting venues and organizations with which to share results (e.g., community-based organizations, governmental agencies).
Corporate: Consider issues of scalability and how to efficiently implement successful programs company-wide (with appropriate adaptations).
Overall: Develop implementation and adaptation guides for future applications and new settings.

Citation: Harden SM, Smith ML, Ory MG, Smith-Ray RL, Estabrooks PA, Glasgow RE. (2018). RE-AIM in clinical, community, and corporate settings: Perspectives, strategies, and recommendations to enhance public health impact. Frontiers in Public Health – Public Health Education and Promotion. https://doi.org/10.3389/fpubh.2018.00071
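
To make the reach considerations in Table 1 concrete, here is a minimal worked example, with hypothetical numbers, of how the choice of denominator changes the reach proportion. In RE-AIM, reach is the proportion (and representativeness) of eligible individuals who participate.

\[
\text{Reach} = \frac{\text{number of participants}}{\text{number of potentially eligible individuals}}
\]

\[
\text{Reach}_{\mathrm{EMR}} = \frac{150}{2{,}000} = 7.5\%
\qquad
\text{Reach}_{\mathrm{county}} = \frac{150}{50{,}000} = 0.3\%
\]

The same 150 hypothetical participants look very different depending on whether the denominator comes from a clinic roster (e.g., an electronic medical record query) or from county-level census data, which is why Table 1 recommends examining reach and representativeness within each delivery site.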

RE-AIM in Cooperative Extension

For each RE-AIM dimension, suggested planning questions (including a health equity question) and Extension examples are listed below.

Reach
Suggested planning questions: Who is the target audience for the program? Health equity: How will program access be supported and participation obstacles removed?
Extension examples: Define the priority audience or subgroups who would benefit most from exposure to the program. Target the program to those who need it rather than those who want it. Develop strategies to specifically recruit those who are most underserved or at risk and enable their participation. Consider the time the program is offered and how participants will have transportation to attend. Plan how you advertise, promote, and locate the program to reach these participants. Engage community partners who serve the audience to help recruit and include the most underserved within your target audience.

Effectiveness
Suggested planning questions: What key changes or outcomes do you expect to see? How will you collect data to measure these outcomes? Health equity: How will the intervention be delivered to those most in need?
Extension examples: Determine the individual- or environmental-level changes you are targeting. Consider data collection that is realistic for those who will deliver the program: for individual or interpersonal level programs, food frequency questionnaires, behavior logs, or physical activity trackers could be used; for environmental changes, meeting minutes, grant activity, readiness assessments, and asset mapping may be used. Consider using multiple delivery channels for accessing your program (e.g., direct, internet, and/or local media-delivered programs). Ensure that materials are culturally appropriate and designed for diverse literacy levels.

Adoption
Suggested planning questions: Who will deliver the program? How many of these delivery agents will use the program? Health equity: How will you enhance participation in low-resource settings?
Extension examples: Determine who is responsible for training, technical assistance, and support; for example, state-level specialists may train Extension educators/agents, or Extension educators/agents may train and assist volunteers or school staff members. Determine how you will capture and track adoption rates, the representativeness of the staff and settings who deliver the program, and what resources are available in which settings to make the work feasible and sustainable. Include delivery agents throughout the planning process to improve buy-in. Choose a feasible program that places low demands on staff and resources.

Implementation
Suggested planning questions: How will the initiative be delivered, including adjustments and adaptations? What costs (including time and burden, not just money) need to be considered? Health equity: How will you document adaptations to the original program?
Extension examples: Determine how you will measure fidelity. If implementation checklists are used, consider the degree to which they will feel supportive or punitive; use checklists as a way for Extension practitioners and volunteers to improve their performance rather than to judge whether or not they are delivering with fidelity. Consider whether implementation costs are feasible for the organization, including costs of recruiting or tailoring materials, training delivery personnel, and start-up costs (e.g., equipment and incentives) versus continuing costs (e.g., educator/agent time, training new staff). Decide if it is appropriate to run a cost-effectiveness analysis to determine the cost of achieving the program outcomes (an illustrative calculation follows this table). Implementation checklists should also capture population- or systems-specific adaptations that may improve the fit of the intervention rather than deviate from its initial protocol (Chambers & Norton, 2016).

Maintenance (individual level)
Suggested planning questions: How likely is your initiative to produce lasting effects for individual participants? Health equity: How will you assess long-term results?
Extension examples: Consider the duration and evidence base of your program to determine whether long-term change is likely. Direct resources toward implementing programs with high population reach and evidence of long-term behavior change rather than single classes or informational seminars. Engage participants in deciding how you will stay in touch to track outcomes after the program ends; for example, participants may want a follow-up event six months post-program or to keep in touch through newsletters, a website, or social media. Consider equitable and inclusive access to the resources needed for participants to sustain program results, such as social media or a website.

Maintenance (organization level)
Suggested planning questions: Can the organization sustain the initiative over time, and are there plans to leave resources or trained staff in place? Health equity: How will you prepare delivery settings and systems to sustain the program?
Extension examples: Consider what your state system values and supports, including program capacity and resources, and the support provided by managers (e.g., directors and district directors), multi-sector stakeholders and partners, community members, and volunteers. Assess access and sustainability barriers and facilitators within and among delivery settings. Provide tools and resources to enable long-term program monitoring and adaptation.

Citation: Balis LE, John DH, Harden SM. (2019). Beyond Evaluation: Using the RE-AIM Framework for Program Planning in Extension. Journal of Extension. 57(2). 2TOT1.
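
As a rough reference for the cost questions raised in Table 1 and in the Extension planning questions above, the quantities most often reported are cost per participant, a cost-effectiveness ratio, and return on investment (ROI). The figures below are hypothetical and illustrate the arithmetic only.

\[
\text{Cost per participant} = \frac{\text{total delivery cost}}{\text{number of participants}} = \frac{\$6{,}000}{120} = \$50
\]

\[
\text{Cost-effectiveness} = \frac{\text{total delivery cost}}{\text{units of outcome achieved}}
\qquad
\text{ROI} = \frac{\text{benefits} - \text{costs}}{\text{costs}}
\]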


RE-AIM Workgroup Templates

The National Working Group on RE-AIM Planning and Evaluation Framework would like to share templates of focus group and one-on-one interview guides. These are by no means comprehensive of all styles and types of qualitative data collection; rather, they are templates for you to adapt based on your research and stakeholder priorities.


Example Qualitative Data Collection Tools

Some of our colleagues have graciously shared their qualitative data collection tools, and we are in the process of compiling this information for upload. These tools are the intellectual property of the respective research teams but are available for your perusal. Contact information was up to date at the time of posting. Please email Samantha Harden if there are issues.

Each entry below lists the topic area, resources available, scope of the project (timeframe, funds), and contact information.

Topic area: Implementing alcohol screening, brief intervention, and referral
Resources available: (1) Key informant interview guide for nurse managers; (2) CFIR interview guide (with RE-AIM domains indicated) for alcohol screening (behavioral intervention) and implementation strategy; (3) Breathewell Study planning team interview guide, with items covering the proposal, planning, design, implementation, adoption, and maintenance phases.
Contact: Diane K. King, PhD, Director, Center for Behavioral Health Research and Services, Institute of Social and Economic Research, University of Alaska Anchorage, 907-786-1638

Topic area: DECIDE-LVAD ("I Decide")
Resources available: An interview guide based on the Diffusion of Innovation Theory that includes many RE-AIM constructs; patient and staff pre-intervention ("baseline"), during-intervention ("follow-up"), and maintenance ("post-intervention") interview guides for DECIDE-LVAD.
Contact: Daniel Matlock, PhD ([email protected]), and Jocelyn Thompson, MA, I DECIDE:LVAD, Adult and Child Consortium for Health Outcomes Research and Delivery Science (ACCORDS), University of Colorado School of Medicine, Mail Stop F443, 13199 E. Montview Blvd, Suite 210, Aurora, CO 80045, (p) 530-906-1081, (e) [email protected], https://patientdecisionaid.org/lvad/

Topic area: Weight management intervention for endometrial cancer survivors
Resources available: Focus group pre-mortem script (Zoom version) with participants
Scope of project: Medical student project; $2,000 maximum funding for participant compensation
Contact: Shannon Armbruster, MD, MPH, Carilion Clinic Gynecologic Oncology, (p) (540) 581-0275, (e) [email protected]; Samantha Harden, PhD, Dept. of Obstetrics/Gynecology, Virginia Tech Carilion School of Medicine, (p) (540) 231-9960, (e) [email protected]

Topic area: Older Ghanaian adults' perceptions of physical activity focus group script
Resources available: Pre-implementation focus group script with participants
Scope of project: Balis L, Sowatey G, Ansong-Gyimah K, Ofori E, Harden SM. Older Ghanaian adults' perceptions of physical activity: an exploratory, qualitative study. University of Wyoming Center for Global Studies Faculty International Research Grant, 2017-2018.
Contact: Laura Balis, PhD, Assistant Professor and Health Specialist, University of Arkansas System Division of Agriculture Cooperative Extension Service, 2301 South University Avenue, Little Rock, AR 72204, (p) 501-671-2099, (e) [email protected]
