Trends in Hearing (Mar 2018)
Interaural Time-Difference Discrimination as a Measure of Place of Stimulation for Cochlear-Implant Users With Single-Sided Deafness
Abstract
Current clinical practice in programming a cochlear implant (CI) for individuals with single-sided deafness (SSD) is to maximize the transmission of speech information via the implant, with the implicit assumption that this will also result in improved spatial-hearing abilities. However, binaural sensitivity is reduced by interaural place-of-stimulation mismatch, a likely occurrence with a standard CI frequency-to-electrode allocation table (FAT). As a step toward reducing interaural mismatch, this study investigated whether a test of interaural-time-difference (ITD) discrimination could be used to estimate the acoustic frequency yielding the best place match for a given CI electrode. ITD-discrimination performance was measured by presenting 300-ms bursts of 100-pulses-per-second electrical pulse trains to a single CI electrode and band-limited acoustic pulse trains with variable carrier frequencies to the contralateral acoustic ear. Listeners discriminated between two reference intervals (four bursts each with constant ITD) and a moving target interval (four bursts with variable ITD). For 17 of the 26 electrodes tested across eight listeners, the function relating ITD-discrimination performance to carrier frequency had a discernible peak at which listeners achieved 70% to 100% performance. On average, this peak occurred 1.15 octaves above the center frequency assigned by the CI manufacturer's default FAT. ITD discrimination therefore shows promise as a method of estimating the cochlear place of stimulation for a given electrode, providing information that could be used to optimize the FAT for SSD-CI listeners.
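The reported 1.15-octave offset can be translated into a frequency ratio, since an offset of n octaves corresponds to multiplying frequency by 2^n. The sketch below illustrates this arithmetic; the 1000-Hz default center frequency is purely hypothetical and is not taken from the study, which reports only the mean octave offset.

```python
def octaves_to_ratio(octaves: float) -> float:
    """An offset of n octaves corresponds to a frequency ratio of 2**n."""
    return 2.0 ** octaves

def place_matched_frequency(default_hz: float, offset_octaves: float = 1.15) -> float:
    """Estimate the acoustic frequency best matching an electrode's cochlear
    place, given its default FAT center frequency and the mean 1.15-octave
    offset reported in the abstract (illustrative arithmetic only)."""
    return default_hz * octaves_to_ratio(offset_octaves)

# Hypothetical example: an electrode with a 1000-Hz default FAT center
# frequency would be place-matched roughly an octave and a bit higher.
print(round(place_matched_frequency(1000.0)))  # ~2219 Hz
```

This is only a unit-conversion sketch: the study estimated the offset per electrode from the peak of the ITD-discrimination function, not from a fixed formula.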