Measuring and Improving Evidence-Based Patient Care Using a Web-Based Gamified Approach in Primary Care (QualityIQ): Randomized Controlled Trial

Trever Burgon, Linda Casebeer, Holly Aasen, Czarlota Valdenor, Diana Tamondong-Lachica, Enrico de Belen, David Paculdo, John Peabody

Abstract

Background: Unwarranted variability in clinical practice is a challenging problem in health care today, leading to poor outcomes for patients and low-value care for providers, payers, and patients alike.

Objective: In this study, we introduced a novel tool, QualityIQ, and determined the extent to which it helps primary care physicians to align care decisions with the latest best practices included in the Merit-Based Incentive Payment System (MIPS).

Methods: We developed the fully automated QualityIQ patient simulation platform with real-time evidence-based feedback and gamified peer benchmarking. Each case included workup, diagnosis, and management questions with explicit evidence-based scoring criteria. We recruited practicing primary care physicians across the United States via the web and conducted a cross-sectional study of clinical decisions among this national sample, randomizing participants to continuing medical education (CME) and non-CME study arms. Physicians "cared" for 8 weekly cases covering typical primary care scenarios. We measured participation rates, changes in quality scores (including MIPS scores), self-reported practice change, and physician satisfaction with the tool. The primary outcomes were evidence-based care scores within each case, adherence to MIPS measures, and variation in clinical decision-making among primary care providers caring for the same patient.
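
The abstract does not describe the scoring implementation; the following is only a minimal sketch of how explicit, criterion-based evidence scores for a case might be computed. The Criterion structure, domain labels, and example items are hypothetical illustrations, not the QualityIQ engine.

    # Hypothetical sketch of criterion-based case scoring (assumed structure,
    # not the actual QualityIQ implementation): each simulated case carries
    # explicit evidence-based criteria across workup, diagnosis, and management,
    # and a participant's score is the share of criteria their decisions meet.
    from dataclasses import dataclass

    @dataclass
    class Criterion:
        domain: str        # "workup", "diagnosis", or "management"
        description: str   # the evidence-based action expected in this case
        met: bool          # whether the participant's decisions satisfied it

    def case_score(criteria: list[Criterion]) -> float:
        """Return the percentage of evidence-based criteria met for one case."""
        if not criteria:
            return 0.0
        return 100.0 * sum(c.met for c in criteria) / len(criteria)

    # Illustrative (made-up) diabetes case:
    example = [
        Criterion("workup", "Order HbA1c", True),
        Criterion("workup", "Refer for dilated eye examination", False),
        Criterion("management", "Start guideline-indicated statin", True),
    ]
    print(f"{case_score(example):.1f}%")  # -> 66.7%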

Results: We found strong, scalable engagement with the tool: 75% of participants (61 non-CME and 59 CME) completed at least 6 of the 8 cases. We saw significant improvement in evidence-based clinical decisions across multiple conditions, such as diabetes (+8.3%, P<.001) and osteoarthritis (+7.6%, P=.003), and with MIPS-related quality measures, such as diabetes eye examinations (+22%, P<.001), depression screening (+11%, P<.001), and asthma medications (+33%, P<.001). Although CME availability did not increase enrollment in the study, participants offered CME credits were more likely to complete at least 6 of the 8 cases.
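
The abstract reports percentage-point improvements with P values but does not name the statistical test used. Below is a hedged sketch of one way a before-versus-after change in adherence to a binary quality measure could be tested (a two-proportion z-test via statsmodels); the counts are invented for illustration and are not study data.

    # Hypothetical illustration only: tests whether adherence to a binary
    # MIPS-style measure (e.g., diabetes eye examination ordered) differs
    # between baseline and later case rounds. Counts below are made up.
    from statsmodels.stats.proportion import proportions_ztest

    met_baseline, n_baseline = 55, 120    # adherent cases / total at baseline
    met_followup, n_followup = 81, 120    # adherent cases / total in later rounds

    z_stat, p_value = proportions_ztest(
        count=[met_followup, met_baseline],
        nobs=[n_followup, n_baseline],
    )
    improvement = 100 * (met_followup / n_followup - met_baseline / n_baseline)
    print(f"{improvement:+.1f} percentage points, P = {p_value:.4f}")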

Conclusions: Although CME availability did not prove to be important, the short, clinically detailed case simulations with real-time feedback and gamified peer benchmarking led to significant improvements in evidence-based care decisions among participating physicians.

Trial registration: ClinicalTrials.gov NCT03800901; https://ichgcp.net/clinical-trials-registry/NCT03800901.

Keywords: MIPS; care standardization; case simulation; continuing education; decision-support; feedback; gamification; medical education; outcome; physician engagement; quality improvement; serious game; simulation; value-based care.

Conflict of interest statement

Conflicts of Interest: QURE, LLC owns the intellectual property used to prepare the cases and collect the data. JP is the owner of QURE, LLC. TB, CV, DTL, EDB, and DP are employees of QURE Healthcare.

©Trever Burgon, Linda Casebeer, Holly Aasen, Czarlota Valdenor, Diana Tamondong-Lachica, Enrico de Belen, David Paculdo, John Peabody. Originally published in the Journal of Medical Internet Research (https://www.jmir.org), 23.12.2021.

