The development of QUADAS: a tool for the quality assessment of studies of diagnostic accuracy included in systematic reviews

Penny Whiting, Anne W S Rutjes, Johannes B Reitsma, Patrick M M Bossuyt, Jos Kleijnen

Abstract

Background: In the era of evidence based medicine, with systematic reviews as its cornerstone, adequate quality assessment tools should be available. There is currently no systematically developed and evaluated tool for assessing the quality of diagnostic accuracy studies. The aim of this project was to combine empirical evidence and expert opinion in a formal consensus method to develop a tool for use in systematic reviews to assess the quality of primary studies of diagnostic accuracy.

Methods: We conducted a Delphi procedure to develop the quality assessment tool by refining an initial list of items. Members of the Delphi panel were experts in the area of diagnostic research. The results of three previously conducted reviews of the diagnostic literature were used to generate a list of potential items for inclusion in the tool and to provide an evidence base upon which to develop the tool.
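
As a rough illustration of how the ratings from a single Delphi round might be tallied, the Python sketch below counts, for each candidate item, the proportion of panel members voting to include it and retains items that reach a threshold. The rating categories, the 75% threshold, and the example data are assumptions for illustration only, not the consensus rule used in this project.

    # Illustrative sketch of tallying one Delphi round (not the authors' actual rule).
    from collections import Counter

    def summarise_round(ratings, threshold=0.75):
        """ratings maps each candidate item to a list of panel votes:
        'include', 'exclude' or 'unsure' (assumed categories)."""
        retained = []
        for item, votes in ratings.items():
            include_rate = Counter(votes)["include"] / len(votes)
            if include_rate >= threshold:
                retained.append(item)
        return retained

    # Hypothetical votes from a nine-member panel for two candidate items.
    example = {
        "patient spectrum": ["include"] * 8 + ["unsure"],
        "disease progression bias": ["include"] * 5 + ["exclude"] * 4,
    }
    print(summarise_round(example))  # -> ['patient spectrum']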

Results: A total of nine experts in the field of diagnostics took part in the Delphi procedure. The procedure consisted of four rounds, after which agreement was reached on the items to be included in the tool, which we have called QUADAS. The initial list of 28 items was reduced to 14 items in the final tool. The items cover patient spectrum, reference standard, disease progression bias, verification bias, review bias, clinical review bias, incorporation bias, test execution, study withdrawals, and indeterminate results. The QUADAS tool is presented together with guidelines for scoring each of the items included in the tool.
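
The sketch below shows one way a reviewer might record QUADAS-style judgements for the studies in a review and tabulate them per item. The item labels are shorthand for the topic areas listed above, not the exact wording of the 14 published items, and the yes/no/unclear response format is an assumption based on the scoring guidelines mentioned in the abstract; no summary score is computed.

    # Illustrative tabulation of per-item quality judgements across studies.
    # Labels are shorthand for the topic areas named in the abstract, not the
    # exact wording of the 14 QUADAS items.
    ITEMS = [
        "patient spectrum",
        "reference standard",
        "disease progression bias",
        "verification bias",
        "review bias",
        "clinical review bias",
        "incorporation bias",
        "test execution",
        "study withdrawals",
        "indeterminate results",
    ]

    def tabulate(assessments):
        """assessments maps study id -> {item: 'yes' | 'no' | 'unclear'}."""
        table = {}
        for item in ITEMS:
            answers = [study.get(item, "unclear") for study in assessments.values()]
            table[item] = {a: answers.count(a) for a in ("yes", "no", "unclear")}
        return table

    # Hypothetical assessments for two studies in a review.
    studies = {
        "Study A": {"patient spectrum": "yes", "verification bias": "no"},
        "Study B": {"patient spectrum": "unclear", "verification bias": "yes"},
    }
    for item, counts in tabulate(studies).items():
        print(item, counts)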

Conclusions: This project has produced an evidence based quality assessment tool to be used in systematic reviews of diagnostic accuracy studies. Further work to determine the usability and validity of the tool continues.

Figures

Figure 1. Flow chart of the tool development process.


Source: PubMed
