Article title

Multimodal Perceptual Training for Improving Spatial Auditory Performance in Blind and Sighted Listeners

Języki publikacji
EN
Abstracts
EN
The use of individualised Head-Related Transfer Functions (HRTFs) is a fundamental prerequisite for an accurate rendering of 3D spatialised sounds in virtual auditory environments. HRTFs are transfer functions that define the acoustical basis of the perception of a sound source in space and are frequently used in virtual auditory displays to simulate free-field listening conditions. However, they depend on the anatomical characteristics of the human body and vary significantly among individuals, so the same HRTF dataset applied to all users of a given system will not offer the same level of auditory performance. This paper presents an alternative approach to the use of non-individualised HRTFs, based on procedural learning, training, and adaptation to altered auditory cues. We tested the sound localisation performance of nine sighted and visually impaired people before and after a series of training sessions based on perceptual (auditory, visual, and haptic) feedback. The results demonstrated that the subjects significantly improved their spatial hearing under altered listening conditions (such as the presentation of 3D binaural sounds synthesised from non-individualised HRTFs), the improvement being reflected in higher localisation accuracy and a lower rate of front-back confusion errors.
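For illustration only (this is not the implementation described in the paper), the sketch below shows the standard way such 3D binaural stimuli are synthesised: a mono signal is convolved with left- and right-ear head-related impulse responses taken from a generic, non-individualised set such as the MIT KEMAR database listed as reference 31, and a small helper flags front-back confusion errors between a target and a response azimuth. The file names, the 30° example azimuth, and the hemifield convention are assumptions made for the example, not details taken from the paper.

# Minimal sketch (Python, NumPy/SciPy), assuming mono WAV files for the stimulus
# and for a left/right HRIR pair; all file names below are hypothetical.
import numpy as np
from scipy.io import wavfile
from scipy.signal import fftconvolve

def binaural_render(mono, hrir_left, hrir_right):
    # Convolving the source with each ear's HRIR places it at that HRIR's direction.
    left = fftconvolve(mono, hrir_left)
    right = fftconvolve(mono, hrir_right)
    stereo = np.stack([left, right], axis=-1)
    return stereo / np.max(np.abs(stereo))      # normalise to avoid clipping

def is_front_back_confusion(target_az, response_az):
    # Azimuths in degrees, 0 = straight ahead, positive = to the listener's right.
    # Simplified rule: a confusion is a response in the wrong front/back hemifield.
    in_front = lambda az: abs(((az + 180.0) % 360.0) - 180.0) < 90.0
    return in_front(target_az) != in_front(response_az)

if __name__ == "__main__":
    fs, mono = wavfile.read("stimulus.wav")            # hypothetical mono test sound
    _, hrir_l = wavfile.read("kemar_az030_left.wav")   # e.g. HRIRs from the MIT KEMAR set (ref. 31)
    _, hrir_r = wavfile.read("kemar_az030_right.wav")
    out = binaural_render(mono.astype(float), hrir_l.astype(float), hrir_r.astype(float))
    wavfile.write("stimulus_az030.wav", fs, out.astype(np.float32))
    print(is_front_back_confusion(30, 150))            # True: 30° is in front, 150° behind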
Pages
491–502
Physical description
Bibliography: 31 items; figures, tables, charts, photographs.
Authors
author
  • Faculty of Automatic Control and Computers, University POLITEHNICA of Bucharest, Splaiul Independentei, 313, 060042, Bucharest, Romania
  • Faculty of Automatic Control and Computers, University POLITEHNICA of Bucharest, Splaiul Independentei, 313, 060042, Bucharest, Romania
  • Faculty of Automatic Control and Computers, University POLITEHNICA of Bucharest, Splaiul Independentei, 313, 060042, Bucharest, Romania
References
  • 1. Ahissar M. (2001), Perceptual training: A tool for both modifying the brain and exploring it, PNAS, 98, 21.
  • 2. Bălan O., Moldoveanu A., Moldoveanu F., Morar A. (2014), Experiments on training the human localization abilities, Proceedings of the 10th International Scientific Conference eLearning and Software for Education, Bucharest, April 24–25, 2014.
  • 3. Blum A., Katz B.F.G., Warusfel O. (2004), Eliciting adaptation to non-individual HRTF spectral cues with multi-modal training, Proceedings of the Joint Meeting of the German and the French Acoustical Societies (CFA/DAGA ’04), Strasbourg, France.
  • 4. Bogusz E., Koprowska H., Skrodzka E. (2012), Investigation of Performance in Selected Psychoacoustic Tasks by Visually Impaired Children and Teenagers, Acta Physica Polonica A, 121, 1A, A19–A23.
  • 5. Bogusz E., Koprowska H., Skrodzka E. (2012), Performance in pitch memory task by visually handicapped children and youths, Archives of Acoustics, 37, 4, 549–553.
  • 6. Bogusz-Witczak E., Skrodzka E., Furmann A., Hojan E., Przybek K. (2015), Results of Auditory Training for Blind and Visually Handicapped Children and Adolescents, Acta Physica Polonica A, 127, 117–119.
  • 7. Csapó A., Wersényi Gy. (2013), Overview of auditory representations in human-machine interfaces, ACM Computing Surveys (CSUR), 46, 2.
  • 8. Dellepiane M., Pietroni N., Tsingos N., Asselot M., Scopigno R. (2008), Reconstructing head models from photographs for individualized 3D-audio processing, Comput. Graph. Forum, 27, 7, 1719–1727.
  • 9. Doucet M.E., Guillemot J.P., Lassonde M., Gagné J.P., Leclerc C., Lepore F. (2005), Blind subjects process auditory spectral cues more efficiently than sighted individuals, Exp. Brain Res., 160, 2, 194–202.
  • 10. Fritz J.B., Elhilali M., David S.V., Shamma S.A. (2007), Does attention play a role in dynamic receptive field adaptation to changing acoustic salience in A1?, Hear Res., 229, 1–2, 186–203.
  • 11. Furmann A., Skrodzka E., Giżewski P., Nowotny Ł. (2013), Effect of sound reproduction method on performance in sound source localization tasks by visually impaired and normal sighted subjects, Acta Physica Polonica A, 123, 6, 988–994.
  • 12. Honda A., Shibata H., Gyoba J., Saitou K., Iwaya Y., Suzuki Y. (2007), Transfer effects on sound localization performances from playing a virtual three-dimensional auditory game, Appl. Acoustics, 68, 8, 885–896.
  • 13. Honda A., Shibata H., Hidaka S., Gyoba J., Iwaya Y., Suzuki Y. (2013), Effects of head movement and proprioceptive feedback in training of sound localization, i-Perception, 4, 4, 253–264.
  • 14. Katz B.F.G., Picinali L. (2011), Spatial Audio Applied to Research with the Blind, [in:] Advances in Sound Localization, Strumillo P. [Ed.], ISBN: 978-953-307-224-1, InTech, Available at: http://www.intechopen.com/books/advances-in-sound-localization/spatial-audio-applied-to-research-with-the-blind.
  • 15. King A.J., Dahmen J.C., Keating P., Leach N.D., Nodal F.R., Bajo V.M. (2011), Neural circuits underlying adaptation and learning in the perception of auditory space, Neurosci. Biobehav. Rev., 35, 10, 2129–2139.
  • 16. Lessard N., Paré M., Lepore F., Lassonde M. (1998), Early-blind human subjects localize sound sources better than sighted subjects, Nature, 395, 278–280.
  • 17. Majdak P., Goupell M.J., Laback B. (2010), 3-D Localization of Virtual Sound Sources: Effects of Visual Environment, Pointing Method, and Training, Atten Percept Psychophys, 72, 2, 454–469.
  • 18. Mendonça C. (2014), A review on auditory space adaptations to altered head-related cues, Front. Neurosci., 8, 219, doi: 10.3389/fnins.2014.00219.
  • 19. Meshram A., Mehra R., Yang H., Dunn E., Frahm J.M., Manocha D. (2014), P-HRTF: Efficient personalized HRTF computation for high-fidelity spatial sound, International Symposium on Mixed and Augmented Reality.
  • 20. Moldoveanu A., Bălan O. (2014), Training System for Improving Spatial Sound Localization, Proceedings of the 10th International Scientific Conference eLearning and Software for Education, Bucharest, April 24–25, 2014.
  • 21. Ohuchi M., Iwaya Y., Suzuki Y., Munekata T. (2006), A comparative study of sound localization acuity of congenital blind and sighted people, Acoustical Science and Technology, 27, 290–293.
  • 22. Pedersen J.A., Jorgensen T. (2005), Localization Performance of Real and Virtual Sound Sources, Proceedings of the North Atlantic Treaty Organization.
  • 23. Parseihian G., Katz B.F.G. (2012), Rapid Head-Related Transfer Function adaptation using a virtual auditory environment, J. Acoust. Soc. Am., 131, 4, 2948–2957, doi: 10.1121/1.3687448.
  • 24. Röder B., Teder-Sälejärvi W., Sterr A., Rösler F., Hillyard S.A., Neville H.J. (1999), Improved auditory spatial tuning in blind humans, Nature, 400, 6740, 162–166.
  • 25. Shinn-Cunningham B.G., Streeter T., Gyss J.-F. (2001), Perceptual plasticity in spatial auditory displays, Proceedings of the International Conference on Auditory Display, Espoo, Finland, July 29 – August 1, 2001.
  • 26. Strelnikov K., Rosito M., Barone P. (2011), Effect of Audiovisual Training on Monaural Spatial Hearing in Horizontal Plane, PLoS ONE, 6, 3.
  • 27. Wenzel E.M. (2001), Effect of increasing system latency on localization of virtual sounds with short and long duration, Proceedings of the International Conference on Auditory Display, Espoo, Finland, July 29 – August 1, 2001, 185–190.
  • 28. Wersényi Gy. (2012), Virtual Localization by Blind Persons, JAES, 60, 7/8, 568–579.
  • 29. Zahorik P., Bangayan P., Sundareswaran V., Wang K., Tam C. (2006), Perceptual recalibration in human sound localization: Learning to remediate front-back reversals, J. Acoust. Soc. Am., 120, 343–359, doi: 10.1121/1.2208429.
  • 30. Zwiers M.P., Van Opstal A.J., Cruysberg J.R. (2001), Two-dimensional sound-localization behavior of early-blind humans, Exp. Brain Res., 140, 2, 206–222.
  • 31. MIT HRTF Database. Available at: http://sound.media.mit.edu/resources/KEMAR.html
YADDA identifier
bwmeta1.element.baztech-ef2d2d79-b8a1-4a4b-b42d-5488cbd982f1