Affiliations 

  • 1 Department of Medicine, Duke University Medical Center, Box 2805, Durham, NC, 27710, USA
  • 2 Cardiology Department, CHU Nancy-Brabois, Nancy, France
  • 3 Charles Darwin University, Darwin, NT, Australia
  • 4 Division of Cardiology, Wayne State University School of Medicine, Detroit, MI, USA
  • 5 Department of Cardiology, Maria Vittoria Hospital, Turin, Italy
  • 6 Duke Clinical Research Institute, Durham, NC, USA
  • 7 Department of Cardiology, Hospital Clinic, University of Barcelona, Barcelona, Spain
  • 10 Department of Medicine, UT-Southwestern Medical Center, Dallas, TX, USA
  • 9 Department of Cardiology, Baylor, Scott & White Healthcare, Round Rock, TX, USA
  • 10 Department of Medicine, Sahlgrenska University Hospital, Gothenburg, Sweden
  • 11 Internal Medicine, New York University Medical Center, New York, USA
  • 12 Department of Cardiology, University of Malaya Medical Centre, Kuala Lumpur, Malaysia
  • 13 Faculté de Médecine de Marseille, Marseille, France
  • 14 Department of Cardiology, Gentofte and Herlev University Hospital, Copenhagen, Denmark
  • 15 Department of Medicine, Duke University Medical Center, Box 2805, Durham, NC, 27710, USA. annalisa.crowley@duke.edu
Int J Cardiovasc Imaging, 2016 Jul;32(7):1041-51.
PMID: 27100526 DOI: 10.1007/s10554-016-0873-5

Abstract

Echocardiography is essential for the diagnosis and management of infective endocarditis (IE). However, the reproducibility of the echocardiographic assessment of variables relevant to IE is unknown. The objectives of this study were: (1) to define the reproducibility of IE echocardiographic variables and (2) to describe a methodology for assessing quality in an observational cohort containing site-interpreted data. IE reproducibility was assessed on a subset of echocardiograms from subjects enrolled in the International Collaboration on Endocarditis registry. Specific echocardiographic case report forms were used. Intra-observer agreement was assessed from six site readers on ten randomly selected echocardiograms. Inter-observer agreement between sites and an echocardiography core laboratory was assessed on a separate random sample of 110 echocardiograms. Agreement was determined using intraclass correlation (ICC), coverage probability (CP), and limits of agreement for continuous variables, and kappa statistics (κweighted) and CP for categorical variables. Intra-observer agreement for LVEF was excellent [ICC = 0.93 ± 0.1, and all pairwise differences for LVEF (CP) were within 10 %]. For IE categorical echocardiographic variables, intra-observer agreement was best for aortic abscess (κweighted = 1.0, CP = 1.0 for all readers). The highest inter-observer agreement for IE categorical echocardiographic variables was obtained for vegetation location (κweighted = 0.95; 95 % CI 0.92-0.99), and the lowest agreement was found for vegetation mobility (κweighted = 0.69; 95 % CI 0.62-0.86). Moderate to excellent intra- and inter-observer agreement was observed for echocardiographic variables in the diagnostic assessment of IE. A pragmatic approach for determining echocardiographic data reproducibility in a large, multicentre, site-interpreted observational cohort is feasible.
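The two agreement measures named in the abstract can be sketched in code: weighted kappa for categorical readings (e.g., vegetation mobility grades) and coverage probability for continuous readings (e.g., the fraction of paired LVEF measurements agreeing within 10 %). This is a minimal illustrative sketch, not the study's analysis code; the function names and example readings are hypothetical.

```python
import numpy as np

def weighted_kappa(r1, r2, categories, weights="linear"):
    """Weighted kappa between two raters over ordered categories.

    Disagreement weights are |i-j|/(k-1) for linear weighting or its
    square for quadratic weighting, so identical readings cost 0 and
    maximally distant readings cost 1.
    """
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    obs = np.zeros((k, k))
    for a, b in zip(r1, r2):
        obs[idx[a], idx[b]] += 1
    obs /= obs.sum()  # observed joint proportions
    d = np.abs(np.arange(k)[:, None] - np.arange(k)[None, :])
    w = d / (k - 1) if weights == "linear" else (d / (k - 1)) ** 2
    exp = np.outer(obs.sum(axis=1), obs.sum(axis=0))  # chance-expected proportions
    return 1.0 - (w * obs).sum() / (w * exp).sum()

def coverage_probability(x1, x2, delta):
    """Fraction of paired readings whose absolute difference is within delta."""
    diffs = np.abs(np.asarray(x1, float) - np.asarray(x2, float))
    return float(np.mean(diffs <= delta))

# Hypothetical example: paired LVEF readings (%) from two reads of the same studies
site_lvef = [55, 60, 40, 65, 50]
core_lvef = [53, 58, 45, 64, 52]
cp_lvef = coverage_probability(site_lvef, core_lvef, delta=10)  # CP at a 10-point tolerance

# Hypothetical example: vegetation mobility graded on an ordinal scale
grades = ["none", "low", "moderate", "severe"]
site_read = ["low", "moderate", "severe", "none", "moderate"]
core_read = ["low", "moderate", "moderate", "none", "severe"]
kw = weighted_kappa(site_read, core_read, grades)
```

Linear weighting penalizes a one-grade disagreement less than a two-grade disagreement, which matches the intent of κweighted for ordinal echocardiographic variables; CP simply reports how often two reads fall within a clinically chosen tolerance.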

* Title and MeSH Headings from MEDLINE®/PubMed®, a database of the U.S. National Library of Medicine.