Effect of long-detection interval vs standard-detection interval for implantable cardioverter-defibrillators on antitachycardia pacing and shock delivery: The ADVANCE III randomized clinical trial

Maurizio Gasparini, Alessandro Proclemer, Catherine Klersy, Axel Kloppe, Maurizio Lunati, José Bautista Martínez Ferrer, Ahmad Hersi, Marcin Gulaj, Maurits C E F Wijffels, Elisabetta Santi, Laura Manotta, Angel Arenal

Research output: Contribution to journal › Article › peer-review

Abstract

Importance: Using more intervals to detect ventricular tachyarrhythmias has been associated with reducing unnecessary implantable cardioverter-defibrillator (ICD) therapies.

Objective: To determine whether using 30 of 40 intervals to detect ventricular arrhythmias (VT) (long detection) during spontaneous fast VT episodes reduces antitachycardia pacing (ATP) and shock delivery more than 18 of 24 intervals (standard detection).

Design, Setting, and Participants: Randomized, single-blind, parallel-group trial that enrolled 1902 primary and secondary prevention patients (mean [SD] age, 65 [11] years; 84% men; 75% primary prevention ICD) with ischemic and nonischemic etiology undergoing first ICD implant at 1 of 94 international centers (March 2008-December 2010).

Interventions: Patients were randomized 1:1 to programming with long-detection (n=948) or standard-detection (n=954) intervals.

Main Outcomes and Measures: Total number of ATPs and shocks delivered for all episodes (primary outcomes) and inappropriate shocks, mortality, and syncope rate (secondary outcomes).

Results: During a median follow-up of 12 months (interquartile range, 11-13), the long-detection group had 346 delivered therapies (42 therapies per 100 person-years [95% CI, 38-47]) vs 557 in the standard-detection group (67 therapies per 100 person-years [95% CI, 62-73]; incidence rate ratio [IRR], 0.63 [95% CI, 0.51-0.78]; P <.001). The long- vs the standard-detection group experienced 23 ATPs per 100 person-years (95% CI, 20-27) vs 37 ATPs per 100 person-years (95% CI, 33-41; IRR, 0.58 [95% CI, 0.47-0.72]; P <.001); 19 shocks per 100 person-years (95% CI, 16-22) vs 30 shocks per 100 person-years (95% CI, 26-34; IRR, 0.77 [95% CI, 0.59-1.01]; P =.06), with a significant difference in the probability of therapy occurrence (P <.001); and a reduction in first occurrence of inappropriate shock (5.1 per 100 patient-years [95% CI, 3.7-6.9] vs 11.6 [95% CI, 9.4-14.1]; IRR, 0.55 [95% CI, 0.36-0.85]; P =.008). Mortality (5.5 [95% CI, 4.0-7.2] vs 6.3 [95% CI, 4.8-8.2] per 100 patient-years; HR, 0.87; P =.50) and arrhythmic syncope rates (3.1 [95% CI, 2.6-4.6] vs 1.9 [95% CI, 1.1-3.1] per 100 patient-years; IRR, 1.60 [95% CI, 0.76-3.41]; P =.22) did not differ significantly between groups.

Conclusions and Relevance: Among patients receiving an ICD, the use of a long- vs standard-detection interval resulted in a lower rate of ATP, shocks, and inappropriate shocks. This programming strategy may be an appropriate alternative.

Trial Registration: clinicaltrials.gov Identifier: NCT00617175
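For readers less familiar with the rate statistics quoted above, the sketch below shows how an incidence rate ratio relates the two arms' therapy rates. It is purely illustrative and not part of the trial report: the person-year denominators are approximations back-calculated from the published counts and rates, since the exact follow-up totals are not given in this record.

```python
# Illustrative sketch (not from the paper): how the reported therapy rates and
# incidence rate ratio (IRR) relate to the raw counts quoted in the abstract.
# The person-year denominators are approximations inferred from the published
# rates (count / rate per 100 person-years), not figures reported by the trial.

long_therapies = 346          # total therapies, long-detection group
standard_therapies = 557      # total therapies, standard-detection group

long_rate_per_100py = 42      # reported rate, long-detection group
standard_rate_per_100py = 67  # reported rate, standard-detection group

# Approximate exposure time implied by the reported rates (person-years).
long_py = 100 * long_therapies / long_rate_per_100py              # ~824 person-years
standard_py = 100 * standard_therapies / standard_rate_per_100py  # ~831 person-years

# IRR: therapy rate in the long-detection arm relative to standard detection.
irr = (long_therapies / long_py) / (standard_therapies / standard_py)
print(f"IRR ~= {irr:.2f}")  # ~0.63, consistent with the reported IRR of 0.63 (95% CI, 0.51-0.78)
```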

Original language: English
Pages (from-to): 1903-1911
Number of pages: 9
Journal: Journal of the American Medical Association
Volume: 309
Issue number: 18
DOIs
Publication status: Published - 2013

ASJC Scopus subject areas

  • Medicine (all)
