Agreement is poor among current criteria used to define response to cardiac resynchronization therapy

Brandon K Fornwalt, William W Sprague, Patrick BeDell, Jonathan D Suever, Bart Gerritse, John D Merlino, Derek A Fyfe, Angel R León, John N Oshinski

Abstract

Background: Numerous criteria believed to define a positive response to cardiac resynchronization therapy have been used in the literature. No study has investigated agreement among these response criteria. We hypothesized that the agreement among the various response criteria would be poor.

Methods and results: A literature search was conducted with the keywords "cardiac resynchronization" and "response." The 50 publications with the most citations were reviewed. After the exclusion of editorials and reviews, 17 different primary response criteria were identified from 26 relevant articles. The agreement among 15 of these 17 response criteria was assessed in 426 patients from the Predictors of Response to Cardiac Resynchronization Therapy (PROSPECT) study with Cohen's κ coefficient (2 response criteria were not calculable from PROSPECT data). The overall response rate ranged from 32% to 91% for the 15 response criteria. Ninety-nine percent of patients showed a positive response according to at least 1 of the 15 criteria, whereas 94% were classified as a nonresponder by at least 1 criterion. κ values were calculated for all 105 possible comparisons among the 15 response criteria and classified into standard ranges: poor agreement (κ ≤ 0.4), moderate agreement (0.4 < κ < 0.75), and strong agreement (κ ≥ 0.75). Seventy-five percent of the comparisons showed poor agreement, 21% showed moderate agreement, and only 4% showed strong agreement.
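For readers who want to see the mechanics of the pairwise-agreement analysis, the sketch below computes Cohen's κ for pairs of binary responder classifications and buckets the result into the study's agreement ranges. It is a minimal illustration, not the authors' code: the responder matrix here is random stand-in data (the real 426 × 15 matrix comes from PROSPECT), and the helper names are hypothetical.

```python
import numpy as np
from itertools import combinations

def cohens_kappa(a, b):
    """Cohen's kappa for two binary (0 = nonresponder, 1 = responder) vectors."""
    a, b = np.asarray(a), np.asarray(b)
    po = np.mean(a == b)                              # observed agreement
    pe = (np.mean(a) * np.mean(b)                     # chance agreement: both "responder"
          + (1 - np.mean(a)) * (1 - np.mean(b)))      # ... or both "nonresponder"
    return 1.0 if pe == 1 else (po - pe) / (1 - pe)

def agreement_class(kappa):
    """Bucket kappa into the ranges used in the study."""
    if kappa >= 0.75:
        return "strong"
    return "moderate" if kappa > 0.4 else "poor"

# Illustrative stand-in for the 426-patient x 15-criteria responder matrix.
rng = np.random.default_rng(0)
responses = rng.integers(0, 2, size=(426, 15))

kappas = [cohens_kappa(responses[:, i], responses[:, j])
          for i, j in combinations(range(15), 2)]     # all 105 pairwise comparisons

print(len(kappas))                                    # 105
print(np.mean(responses.any(axis=1)))                 # responder by >= 1 criterion
print(np.mean((responses == 0).any(axis=1)))          # nonresponder by >= 1 criterion
```

With 15 criteria there are 15 × 14 / 2 = 105 unordered pairs, matching the 105 comparisons reported; the last two lines mirror the "positive by at least 1 criterion" and "nonresponder by at least 1 criterion" summary statistics.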

Conclusions: The 26 most-cited publications on predicting response to cardiac resynchronization therapy define response using 17 different criteria. Agreement between different methods to define response to cardiac resynchronization therapy is poor 75% of the time and strong only 4% of the time, which severely limits the ability to generalize results over multiple studies.

Conflict of interest statement

The authors have one conflict of interest to disclose: Bart Gerritse is an employee of Medtronic, Inc and owns company stock valued at >$10,000.

Figures

Figure 1. Flow chart showing the process by which response criteria were identified from the literature.
Figure 2. Agreement amongst the 15 response criteria was poor. The κ axis is color-coded according to the following ranges: green = strong agreement (κ ≥ 0.75), yellow = moderate agreement (0.4 < κ < 0.75).

Figure 3. Agreement amongst the 15 response criteria was classified as poor for 75% of the 105 possible comparisons. κ values are color-coded according to the following ranges: green = strong agreement (κ ≥ 0.75), yellow = moderate agreement (0.4 < κ < 0.75).

Figure 4. Agreement amongst the response criteria was poor 75% of the time and strong only 4% of the time. κ values are color-coded according to the following ranges: green = strong agreement (κ ≥ 0.75), yellow = moderate agreement (0.4 < κ < 0.75).

Source: PubMed
