Article title

Sonification: Review of Auditory Display Solutions in Electronic Travel Aids for the Blind

Content
Identifiers
Title variants
Publication languages
EN
Abstracts
EN
Sonification is defined as the presentation of information by means of non-speech audio. In assistive technologies for the blind, sonification is most often used in electronic travel aids (ETAs) – devices that support independent mobility through obstacle detection or assist in orientation and navigation. This review proposes an original classification of the sonification schemes implemented in the most widely known ETAs, covering both commercially available devices and those at various stages of research, organized by the input used, the level of signal processing applied, and the sonification method. Additionally, the sonification approach developed in the Naviton project is presented. The prototype combines stereovision scene reconstruction, obstacle and surface segmentation, and spatial HRTF-filtered audio with discrete musical sounds; it was successfully tested in a pilot study in which blind volunteers were able to localize and navigate around obstacles in a controlled environment.
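The abstract outlines a typical ETA sonification pipeline: reconstruct the 3D scene, segment obstacles and surfaces, then render each obstacle as a spatialized musical sound. As a rough illustration of the final rendering stage only – not the Naviton implementation, which uses personalized HRTF filtering – the Python sketch below maps an obstacle's distance to a discrete musical pitch and its azimuth to constant-power stereo panning as a crude stand-in for HRTF spatialization; every numeric choice (note range, distance limits, tone length) is an assumption made for this example.

    # Illustrative sketch only; NOT the Naviton implementation.
    # Maps obstacle distance -> discrete musical pitch and
    # azimuth -> stereo pan (a crude substitute for HRTF filtering).
    import wave
    import numpy as np

    FS = 44100  # sampling rate in Hz (assumed)

    def obstacle_to_tone(distance_m, azimuth_deg, duration_s=0.3):
        """Render one detected obstacle as a short stereo tone.

        distance_m  : obstacle distance in metres (nearer -> higher pitch)
        azimuth_deg : horizontal angle, -90 (left) to +90 (right)
        """
        # Quantize distance to a discrete MIDI note (48..72), echoing the
        # "discrete musical sounds" idea; range limits are assumptions.
        d = float(np.clip(distance_m, 0.5, 5.0))
        midi = 72 - round((d - 0.5) / 4.5 * 24)      # close = high note
        freq = 440.0 * 2.0 ** ((midi - 69) / 12.0)   # MIDI -> Hz

        t = np.arange(int(FS * duration_s)) / FS
        tone = np.sin(2 * np.pi * freq * t) * np.hanning(t.size)

        # Constant-power panning from azimuth.
        pan = float(np.clip(azimuth_deg, -90.0, 90.0)) / 90.0  # -1..1
        theta = (pan + 1.0) * np.pi / 4.0                      # 0..pi/2
        return np.stack([tone * np.cos(theta),                 # left
                         tone * np.sin(theta)], axis=1)        # right

    if __name__ == "__main__":
        # Two hypothetical obstacles: one near-left, one far-right.
        scene = np.concatenate([obstacle_to_tone(1.0, -40.0),
                                obstacle_to_tone(4.0, 60.0)])
        pcm = (scene * 0.8 * 32767).astype(np.int16)
        with wave.open("scene.wav", "wb") as f:
            f.setnchannels(2)
            f.setsampwidth(2)
            f.setframerate(FS)
            f.writeframes(pcm.tobytes())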
Year
Pages
401–414
Physical description
Bibliography: 51 items, figures, tables, charts, photographs
Authors
author
  • Institute of Electronics, Lodz University of Technology, Wolczanska 211/215, 90-924 Łódź, Poland
  • Institute of Electronics, Lodz University of Technology, Wolczanska 211/215, 90-924 Łódź, Poland
Bibliography
  • 1. Balakrishnan G., Sainarayanan G., Nagarajan R., Sazali Y. (2006), A stereo image processing system for visually impaired, International Journal of Signal Processing, 2, 3, 136–145.
  • 2. Bregman A. (1990), Auditory scene analysis: the perceptual organization of sound, The MIT Press.
  • 3. Bujacz M., Strumillo P. (2008), Synthesizing a 3D auditory scene for use in an electronic travel aid for the blind, Signal Processing Symposium, Proceedings of SPIE, Vol. 6937, pp. 693737-1–693737-8, Jachranka, Poland.
  • 4. Bujacz M., Skulimowski P., Wroblewski G., Wojciechowski A., Strumillo P. (2009), A Proposed Method for Sonification of 3D Environments Using Scene Segmentation and Personalized Spatial Audio, Conference and Workshop on Assistive Technology for People with Vision and Hearing Impairments CVHI2009, pp. 1–6, Wroclaw, Poland.
  • 5. Bujacz M., Skulimowski P., Strumillo P. (2012), Naviton – a prototype mobility aid for auditory presentation of 3D scenes, Journal of Audio Engineering Society, 60, 9, 696–708.
  • 6. Capp M., Picton P. (2000), The optophone: an electronic blind aid, Engineering Science and Education Journal, 9, 3, 137–143.
  • 7. Csapo A., Wersenyi G. (2013), Overview of auditory representations in human-machine interfaces, ACM Computing Surveys, 46, 2, 19:1–19:23.
  • 8. Csapo A., Wersenyi G., Nagy H., Stockman T. (2015), A survey of assistive technologies and applications for blind users on mobile platforms – a review and foundation for research, Journal of Multimodal User Interfaces, 9, 3, 11 pages.
  • 9. Dakopoulos D., Bourbakis N.G. (2010), Wearable obstacle avoidance electronic travel aids for blind: a survey, IEEE Transactions on Systems Man and Cybernetics – Part C: Applications and Reviews, 40, 1, 25–35.
  • 10. Dewhurst D. (2010), Creating and Accessing Audio-Tactile Images with “HFVE” Vision Substitution Software, [in:] Proc. of the 3rd Interactive Sonification Workshop, pp. 101–104, KTH, Stockholm.
  • 11. Dobrucki A., Plaskota P., Pruchnicki P., Pec M., Bujacz M., Strumillo P. (2010), Measurement system for personalized head-related transfer functions and its verification by virtual source localization trials with visually impaired and sighted individuals, Journal of Audio Engineering Society, 58, 9, 724–738.
  • 12. Edwards A. D. N. (2011), Auditory Display in Assistive Technology, [in:] The Sonification Handbook, Hermann T., Hunt A., Neuhoff J. G. [Eds.], Logos Publishing House, Berlin, pp. 431–453.
  • 13. Elli G. V., Benetti S., Collignon O. (2014), Is There a Future for Sensory Substitution Outside Academic Laboratories?, Multisensory Research, 27, 271–291.
  • 14. Fajarnes G. P., Dunai L., Praderas V. S., Dunai I. (2010), CASBLiP – a new cognitive object detection and orientation system for impaired people, 4th International Conference on Cognitive Systems, Zurich, Switzerland.
  • 15. Farcy R., Bellik Y. (2002), Locomotion assistance for the blind, [in:] Universal Access and Assistive Technology, Keates S., Langdon P., Clarkson P., Robinson P. [Eds.], pp. 277–284, Springer.
  • 16. Farmer L. (1978), Mobility Devices, [in:] Foundation of Orientation and Mobility, American Foundation for the Blind Inc. NY.
  • 17. Fernández Tomás M., Peris-Fajarnés G., Dunai L., Redondo J. (2007), Convolution application in environment sonification for blind people, VIII Jornadas de Matemática Aplicada, UPV.
  • 18. Fontana F., Fusiello A., Gobbi M., Murino V., Rocchesso D., Sartor L., Panuccio A. (2002), A Cross-Modal Electronic Travel Aid Device, [in:] Human Computer Interaction with Mobile Devices, Lecture Notes in Computer Science, Vol. 2411, pp. 393–397.
  • 19. Gomez Valencia J.D. (2014), A computer-vision based sensory substitution device for the visually impaired (See ColOr), PhD thesis, University of Geneva.
  • 20. González-Mora J., Rodríguez-Hernández A., Rodríguez-Ramos L., Díaz-Saco L., Sosa N. (1999), Development of a new space perception system for blind people, based on the creation of a virtual acoustic space, Engineering Applications of Bio–Inspired Artificial Neural Networks, pp. 321–330, Springer Berlin/Heidelberg.
  • 21. Hermann T., Hunt A., Neuhoff J. G. [Eds.], (2011), The Sonification Handbook, Logos Publishing House, Berlin.
  • 22. Hersh M., Johnson M. [Eds.], (2008), Assistive technology for visually impaired and blind people, Springer, London.
  • 23. Heyes D. (1984), The Sonic Pathfinder: a new electronic travel aid, Journal of Visual Impairment and Blindness, 77, 200–202.
  • 24. I-Cane Mobilo (2015), www.i-cane.org (accessed May 2015).
  • 25. Jie X., Xiaochi W., Zhigang F. (2010), Research and Implementation of Blind Sidewalk Detection in Portable ETA System, International Forum on Information Technology and Applications, pp. 431–434.
  • 26. Kay L. (1964), An ultrasonic sensing probe as a mobility aid for the blind, Ultrasonics, 2, 2, 53–56.
  • 27. Kay L. (1974), A sonar aid to enhance spatial perception of the blind: engineering design and evaluation, Radio and Electronic Engineering, 44, 605–627.
  • 28. Kramer G. [Ed.], (1994), Auditory Display: Sonification, Audification, and Auditory Interfaces, Santa Fe Institute Studies in the Sciences of Complexity, Vol. XVIII, Addison-Wesley, Reading, MA.
  • 29. Levy-Tzedek S., Hanassy S., Abboud S., Maidenbaum S., Amedi A. (2012), Fast, accurate reaching movements with a visual-to-auditory sensory substitution device, Restorative Neurology and Neuroscience, 30, 313–323.
  • 30. Lokki T., Savioja L., Vaananen R., Huopaniemi J., Takala T. (2003), Creating interactive virtual auditory environments, IEEE Computer Graphics and Applications, 22, 4, 49–57.
  • 31. Loomis J. M. (1992), Distal attribution and presence, Forum Spotlight on: The Concept of Telepresence, 1, 113–119.
  • 32. Maidenbaum S., Abboud S., Amedi A. (2014), Sensory substitution: Closing the gap between basic research and widespread practical visual rehabilitation, Neuroscience and Biobehavioral Reviews, 14, 3–15.
  • 33. Maidenbaum S., Hanassy S., Abboud S., Buchs G., Chebat D. R., Levy-Tzedek S., Amedi A. (2014), The “EyeCane”, a new electronic travel aid for the blind: technology, behavior & swift learning, Restorative Neurology and Neuroscience, 32, 813–824.
  • 34. Malvern B., Nazir A. (1973), An improved laser cane for the blind, [in:] Developments in Laser Technology II, SPIE Proceedings 0041.
  • 35. McGookin D. K., Brewster S. A. (2004), Understanding concurrent earcons: applying auditory scene analysis principles to concurrent earcon recognition, ACM Transactions on Applied Perception, 1, 2, 130–155.
  • 36. Merabet L., Battelli L., Obretenova S., Maguire S., Meijer P., Pascual-Leone A. (2009), Functional recruitment of visual cortex for sound encoded object identification in the blind, Neuroreport, 20, 2, 132–138.
  • 37. Meijer P. (1992), An experimental system for auditory image representations, IEEE Transactions on Biomedical Engineering, 39, 112–121.
  • 38. Milios E., Kapralos B., Kopinska A., Stergiopoulos S. (2003), Sonification of range information for 3-D space perception, IEEE Transactions on Neural Systems and Rehabilitation Engineering, 11, 4, 416–421.
  • 39. Miniguide mobility aid (2015), www.gdp-research.com.au (accessed May 2015).
  • 40. Moldoveanu A., Balan O., Moldoveanu F. (2014), Training system for improving sound localization, 10th International Conference “eLearning and Software for Education”, 24–25 April, pp. 32, Bucharest, Romania.
  • 41. Orlowski R. (1976), Ultrasonic echo reinforcement for the blind, Ph.D. thesis, University of Nottingham.
  • 42. Ostrowski B., Strumillo P., Pelczynski P., Danych R. (2011), A wearable stereovision unit in an electronic travel–aid system for the visually impaired, Image Processing and Communications Challenges 2, Advances in Intelligent and Soft Computing, 102, 191–198.
  • 43. Pec M., Bujacz M., Strumillo P., Materka A. (2008), Individual HRTF measurements for accurate obstacle sonification in an electronic travel aid for the blind, Proceedings International Conference on Signals and Electronic Systems, pp. 235–238, Cracow, Poland.
  • 44. Sainarayanan G., Nagarajan R., Yaacob S. (2007), Fuzzy image processing scheme for autonomous navigation of human blind, Applied Soft Computing, 7, 1, 257–264.
  • 45. Shoval S., Borenstein J., Koren Y. (1998), Auditory guidance with the navbelt – a computerized travel aid for the blind, IEEE Transactions on Systems, Man, and Cybernetics, 28, 3, 459–467.
  • 46. Skulimowski P., Bujacz M., Strumillo P. (2009), Detection and parameter estimation of objects in a 3D scene image, [in:] Image Processing & Communications Challenges, 1, pp. 308–316, Academy Publ. House EXIT, Warsaw.
  • 47. Strumillo P., Szajerman D., Pelczynski P., Materka A. (2009), Implementation of stereo matching algorithms on graphics processing units, Image Processing and Communications Challenges, pp. 286–293, Bydgoszcz, Poland.
  • 48. Szczypinski P., Pelczynski P., Szajerman D., Strumillo P. (2010), Implementation of computer vision algorithms in DirectShow technology, Image Processing and Communications Challenges 2, Advances in Intelligent and Soft Computing, 84, 31–38.
  • 49. vOICe (2015), www.seeingwithsound.com (accessed May 2015).
  • 50. Visell Y. (2009), Tactile sensory substitution: models for enaction in HCI, Interacting with Computers, 21, 1–2, 38–53.
  • 51. Zhigang F., Ting L. (2010), Audification-based electronic travel aid system, IEEE International Conference on Computer Design and Applications (ICCDA 2010), pp. 137–141, Qinhuangdao, China.
Notes
Prepared with funds of the Ministry of Science and Higher Education (MNiSW) under agreement No. 812/P-DUN/2016 for activities popularizing science.
Document type
Bibliography
YADDA identifier
bwmeta1.element.baztech-686d9cff-0780-4e5a-bf6e-488d8610cadc