Fidelity monitoring in complex interventions: a case study of the WISE intervention

Taren Swindle, James P Selig, Julie M Rutledge, Leanne Whiteside-Mansell, Geoff Curran

Abstract

Background: Researchers face many decisions in developing a measurement tool and protocol for monitoring fidelity to complex interventions. The current study uses data evaluating a nutrition education intervention, Together, We Inspire Smart Eating (WISE), in a preschool setting to explore issues of source, timing, and frequency of fidelity monitoring.

Methods: The overall study from which these data are drawn was a pre/post design with an implementation-focused process evaluation. Between 2013 and 2016, researchers monitored fidelity to evidence-based components of the WISE intervention in 49 classrooms in two Southern states. Data collectors obtained direct assessment of fidelity on a monthly basis in study classrooms. Research staff requested that educators provide indirect assessment on a weekly basis. We used mean comparisons (t-tests), correlations (Pearson's r), and scatterplots to compare the direct and indirect assessments.
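The comparison described above pairs each classroom's directly observed fidelity score with the educator's self-reported (indirect) score. A minimal sketch of that analysis, using scipy's paired t-test and Pearson correlation on simulated, hypothetical ratings (the values below are illustrative, not study data):

```python
# Sketch of the paired comparison of direct (observer) and indirect
# (educator self-report) fidelity scores. All data here are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical fidelity scores on a 0-1 scale for the same classrooms.
direct = rng.uniform(0.6, 1.0, size=30)
indirect = np.clip(direct + rng.normal(0.05, 0.15, size=30), 0.0, 1.0)

# Paired t-test: do the two sources differ on average?
t_stat, t_p = stats.ttest_rel(direct, indirect)

# Pearson's r: do the two sources rank classrooms similarly?
r, r_p = stats.pearsonr(direct, indirect)

print(f"paired t = {t_stat:.2f} (p = {t_p:.3f}), r = {r:.2f} (p = {r_p:.3f})")
```

Note that the t-test and the correlation answer different questions: means can agree closely even when the two sources order classrooms very differently, which is exactly the pattern the study reports.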

Results: No mean comparisons were statistically different. Correlations of direct and indirect assessments of the same component for the same month ranged between -0.51 (p = 0.01) and 0.54 (p = 0.001). Scatterplots illustrate that negative correlations can be driven by individuals who are over-reporting (i.e., self-report bias) and that near-zero correlations approximate the ideal situation (i.e., both raters identify high fidelity).
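The counterintuitive point that a near-zero correlation can reflect the ideal case follows from range restriction: when both raters score nearly every classroom at high fidelity, little variance remains to correlate. A small simulation (hypothetical data, not the study's) makes this concrete:

```python
# Simulated illustration: both sources rate fidelity near ceiling with
# small independent noise, so the correlation hovers near zero even
# though both agree that fidelity is high.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

observer = np.clip(rng.normal(0.95, 0.03, size=40), 0.0, 1.0)
self_report = np.clip(rng.normal(0.95, 0.03, size=40), 0.0, 1.0)

r, p = stats.pearsonr(observer, self_report)
print(f"mean observer = {observer.mean():.2f}, "
      f"mean self-report = {self_report.mean():.2f}, r = {r:.2f}")
```

Both means sit near 1.0 while r stays small, because independent measurement noise dominates the restricted range of scores.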

Conclusion: Our findings illustrate that, on average, direct observations and self-reports may seem consistent despite weak correlations and individual cases of extreme over-reporting by those implementing the intervention. Both the nature of the component being monitored and the timing within the context of the intervention are important factors to consider when selecting the type and frequency of fidelity assessment.

Trial registration: NCT03075085, registered 20 February 2017. This registration corresponds to the funding that supported the writing of this manuscript, not the data collection. The original study was not a trial, and its data were collected without registration; however, the data reported here provided foundational preliminary data for the trial.

Keywords: Behavioral interventions; Fidelity; Implementation science; Nutrition; Obesity prevention.

Conflict of interest statement

This protocol was approved by the UAMS Institutional Review Board (IRB 134665). We conducted this study in accordance with all applicable government regulations and University of Arkansas for Medical Sciences research policies and procedures. Consent was collected from all participating educators.

Dr. Leanne Whiteside-Mansell, Dr. Taren Swindle, and UAMS have a financial interest in the technology (WISE) discussed in this publication. These financial interests have been reviewed and approved in accordance with the UAMS conflict of interest policies.

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Figures

Fig. 1 Means Across Units for Direct and Indirect Assessment of Hands On
Fig. 2 Means Across Units for Direct and Indirect Assessment of Use of Mascot
Fig. 3 Scatterplot of direct and indirect assessments for use of mascot, unit 1
Fig. 4 Scatterplot of direct and indirect assessments for use of mascot, unit 8
Fig. 5 Scatterplot of direct and indirect assessments for child participation, unit 1
Fig. 6 Scatterplot of direct and indirect assessments for child participation, unit 8


Source: PubMed
