Effects of an evidence service on health-system policy makers' use of research evidence: a protocol for a randomised controlled trial

John N Lavis, Michael G Wilson, Jeremy M Grimshaw, R Brian Haynes, Steven Hanna, Parminder Raina, Russell Gruen, Mathieu Ouimet

Abstract

Background: Health-system policy makers need timely access to synthesized research evidence to inform the policy-making process. No efforts to address this need have been evaluated using an experimental quantitative design. We developed an evidence service that draws inputs from Health Systems Evidence, a database of policy-relevant systematic reviews. The reviews have been (a) categorized by topic and type of review; (b) coded by the last year in which searches for studies were conducted and by the countries in which included studies were conducted; (c) rated for quality; and (d) linked to available user-friendly summaries, scientific abstracts, and full-text reports. Our goal is to evaluate whether a "full-serve" evidence service increases the use of synthesized research evidence by policy analysts and advisors in the Ontario Ministry of Health and Long-Term Care (MOHLTC) compared with a "self-serve" evidence service.
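
The record structure implied by points (a) through (d) can be pictured as one entry per review. The Python sketch below is purely illustrative: the class name, field names, and the AMSTAR-style rating out of 11 are assumptions for clarity, not the actual schema of Health Systems Evidence.

from dataclasses import dataclass
from typing import List, Optional

@dataclass
class ReviewRecord:
    """One hypothetical Health Systems Evidence entry, mirroring points (a)-(d) above."""
    title: str
    topics: List[str]                    # (a) topic categories
    review_type: str                     # (a) type of review
    last_search_year: int                # (b) last year searches were conducted
    study_countries: List[str]           # (b) countries of the included studies
    quality_rating: Optional[int]        # (c) e.g. an AMSTAR-style score out of 11
    summary_url: Optional[str] = None    # (d) user-friendly summary, if available
    abstract_url: Optional[str] = None   # (d) scientific abstract, if available
    full_text_url: Optional[str] = None  # (d) full-text report, if available

# Illustrative entry (all values invented for the example).
example = ReviewRecord(
    title="Strategies for supporting primary-care teams",
    topics=["delivery arrangements"],
    review_type="systematic review",
    last_search_year=2009,
    study_countries=["Canada", "United Kingdom"],
    quality_rating=8,
)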

Methods/design: We will conduct a two-arm randomized controlled trial (RCT), along with a follow-up qualitative process study to explore the findings in greater depth. For the RCT, all policy analysts and policy advisors (n = 168) in a single division of the MOHLTC will be invited to participate. Using a stratified randomized design, participants will be randomized to receive either the "full-serve" evidence service (database access, monthly e-mail alerts, and full-text article availability) or the "self-serve" evidence service (database access only). The trial duration will be ten months (a two-month baseline period, a six-month intervention period, and a two-month cross-over period). The primary outcome will be the mean number of site visits per month per user between baseline and the end of the intervention period. The secondary outcome will be participants' intention to use research evidence. For the qualitative study, 15 participants from each trial arm (n = 30) will be purposively sampled. One-on-one semi-structured telephone interviews will explore participants' views about and experiences with the evidence service they received, how helpful it was in their work, why it was or was not helpful, which aspects were most and least helpful and why, and recommendations for next steps.
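
As a rough illustration of the allocation step, the following Python sketch randomizes a roster to the two arms within strata. The stratum labels ("analyst" vs. "advisor"), the roster composition, and the even split are assumptions for illustration only; they do not describe the trial's actual randomization procedure.

import random

def stratified_allocation(participants, seed=2011):
    """Randomly assign each participant to 'full-serve' or 'self-serve' within strata."""
    rng = random.Random(seed)
    # Group participant ids by stratum (e.g., policy analyst vs. policy advisor).
    strata = {}
    for person in participants:
        strata.setdefault(person["stratum"], []).append(person["id"])
    allocation = {}
    # Shuffle within each stratum and split as evenly as possible between the arms.
    for ids in strata.values():
        rng.shuffle(ids)
        half = len(ids) // 2
        allocation.update({pid: "full-serve" for pid in ids[:half]})
        allocation.update({pid: "self-serve" for pid in ids[half:]})
    return allocation

# Hypothetical roster of 168 eligible staff split across two job-role strata.
roster = [{"id": i, "stratum": "analyst" if i < 100 else "advisor"} for i in range(168)]
arms = stratified_allocation(roster)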

Discussion: To our knowledge, this will be the first RCT to evaluate the effects of an evidence service specifically designed to support health-system policy makers in finding and using research evidence.

Trial registration: ClinicalTrials.gov: NCT01307228.

Figures

Figure 1. Linkages among the intervention, contextual developments, and theory of planned behaviour constructs.
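
Because the secondary outcome (intention to use research evidence) is grounded in theory of planned behaviour constructs, the minimal Python sketch below shows how such a construct score is commonly summarized, as the mean of a participant's Likert-scale responses. The three-item structure and the 7-point scale are assumptions for illustration and do not describe the trial's actual measurement instrument.

from statistics import mean

def intention_score(item_responses):
    """Summarize intention as the mean of a participant's Likert-item responses."""
    return mean(item_responses)

# Hypothetical participant answering three 7-point intention items.
print(intention_score([6, 5, 7]))  # -> 6.0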


Source: PubMed
