Unintended consequences: a qualitative study exploring the impact of collecting implementation process data with phone interviews on implementation activities

Inga Gruß, Arwen Bunce, James Davis, Rachel Gold

Abstract

Background: Qualitative data are crucial for capturing implementation processes, and thus necessary for understanding implementation trial outcomes. Typical methods for capturing such data include observations, focus groups, and interviews. Yet little consideration has been given to how such methods create interactions between researchers and study participants, which may affect participants' engagement, and thus implementation activities and study outcomes. In the context of a clinical trial, we assessed whether and how ongoing telephone check-ins to collect data about implementation activities impacted the quality of collected data, and participants' engagement in study activities.

Methods: Researchers conducted regular phone check-ins with clinic staff serving as implementers in an implementation study. Approximately 1 year into this trial, 19 of these study implementers were queried about the impact of these calls on study engagement and implementation activities. The two researchers who collected implementation process data through phone check-ins with the study implementers were also interviewed about their perceptions of the impact of the check-ins.

Results: Study implementers' assessments of the check-ins' impact fell into three categories: (1) the check-ins had no effect on implementation activities, (2) the check-ins served as a reminder about study participation (without a clear impact on implementation activities), and (3) the check-ins caused changes in implementation activities. The researchers similarly perceived that the phone check-ins served as reminders and encouraged some implementers' engagement in implementation activities; their ongoing nature also created personal connections with study implementers that may have affected implementation activities. For some study implementers, anticipating the check-in calls also improved their ability to recount implementation activities and positively affected the quality of the data collected.

Conclusion: These results illustrate the potential impact of qualitative data collection on implementation activities during implementation science trials. Mitigating such effects may prove challenging, but acknowledging these consequences, or even embracing them (perhaps by designing data collection methods as implementation strategies), could enhance scientific rigor. This work is presented to stimulate debate about the complexities involved in capturing data on implementation processes using common qualitative data collection methods.

Trial registration: ClinicalTrials.gov, NCT02325531. Registered 15 December 2014.

Keywords: Capturing implementation processes; Data collection methods; Implementation studies; Measurement reactivity; Qualitative research; Researcher effects; Study design.

Conflict of interest statement

The authors declare that they have no competing interests.


