Frequency Importance Functions in Simulated Electric Acoustic Stimulation


Abstract

OBJECTIVES: In electric acoustic stimulation (EAS), the combined use of a cochlear implant (CI) and a hearing aid (HA) within the same ear, effective speech perception depends on the integration of low-frequency acoustic input and high-frequency electric input. This spectral integration can be influenced by spatial overlap between the most apical CI electrode contact and functional acoustic hearing regions. Despite the prevalence of such overlap, its impact on spectral integration and speech perception remains unclear. This study derived frequency importance functions (FIFs) for simulated EAS hearing across three spatial-based frequency maps (spatial overlap, spatial meet, and spatial gap) in quiet and noise, and compared them to FIFs from simulated bimodal hearing (a CI in one ear, an HA in the opposite ear). These FIFs provide insight into how different frequency regions influence speech perception in EAS and inform strategies for optimizing frequency mapping. Comparisons with bimodal hearing also highlight distinct spectral processing patterns between the two hearing technologies.

DESIGN: Acoustic hearing was simulated using low-pass filtering with a 500 Hz cutoff frequency. Electric hearing was simulated using a six-channel sinewave vocoder with three sets of matched input and output frequency ranges, representing three different insertion depths (spatial overlap, spatial meet, and spatial gap) relative to the 500 Hz acoustic cutoff frequency. Spectral holes were introduced only in the electric portion of the speech spectrum by setting the amplitude of specific frequency channels to zero; the acoustic portion remained unaffected.

RESULTS: The spatial gap map yielded the highest sentence perception scores, followed by the spatial meet and spatial overlap maps. For the spatial overlap and spatial meet maps, the highest FIF weights fell in the upper frequency channels in both quiet and noise, with more weight in noise than in quiet. In contrast, the FIF for the spatial gap map was distributed more evenly across the frequency channels in both quiet and noisy conditions. FIF shapes for EAS and bimodal hearing were similar in quiet but diverged significantly in noise: for the spatial overlap and spatial meet maps, EAS relied more on the higher-frequency channels than bimodal hearing did, while the spatial gap map showed the opposite pattern.

CONCLUSIONS: Simulated EAS hearing with a spatial gap map demonstrated a more balanced use of frequency information, suggesting a more effective combination and utilization of acoustic and electric cues. In contrast, the spatial overlap and spatial meet maps relied more heavily on high-frequency information, indicating less effective utilization of combined lower- and higher-frequency information. Comparisons with simulated bimodal hearing suggest that differences in spectral processing between EAS and bimodal hearing are more pronounced in noise than in quiet. Because this study used acoustic simulations, these findings should be interpreted with caution when generalized to actual EAS users.
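The DESIGN section can be illustrated with a minimal sketch of such an EAS simulation: an ideal 500 Hz low-pass "acoustic" portion combined with a six-channel sinewave-vocoded "electric" portion, where a spectral hole is produced by zeroing a channel's output. All specifics here (FFT-based filtering, logarithmically spaced channel edges, rectified band signals as crude envelopes, and the 500–8000 Hz electric range) are illustrative assumptions, not the study's exact processing parameters.

```python
import numpy as np

def simulate_eas(signal, fs, cutoff=500.0, n_channels=6,
                 elec_lo=500.0, elec_hi=8000.0, holes=()):
    """Toy EAS simulation: low-pass acoustic portion plus a
    sinewave-vocoded electric portion. Parameters are illustrative."""
    n = len(signal)
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(n, 1.0 / fs)

    # Acoustic portion: ideal (brick-wall) low-pass below the cutoff.
    acoustic = np.fft.irfft(spec * (freqs <= cutoff), n)

    # Electric portion: log-spaced analysis bands, each resynthesized
    # as a sine carrier at the band centre modulated by a crude envelope.
    edges = np.geomspace(elec_lo, elec_hi, n_channels + 1)
    t = np.arange(n) / fs
    electric = np.zeros(n)
    for ch in range(n_channels):
        if ch in holes:
            # Spectral hole: this channel's amplitude is set to zero.
            continue
        band = (freqs >= edges[ch]) & (freqs < edges[ch + 1])
        env = np.abs(np.fft.irfft(spec * band, n))    # rectified band signal
        fc = np.sqrt(edges[ch] * edges[ch + 1])       # geometric band centre
        electric += env * np.sin(2 * np.pi * fc * t)

    return acoustic + electric
```

Zeroing every channel (`holes=range(6)`) leaves only the low-pass acoustic portion, which is the degenerate case against which channel weights are probed; the actual study derived FIF weights by measuring how sentence scores dropped as individual electric channels were removed.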
