Enhancing Accessibility of Visual Information via Sound: Metaphoric Association versus Rule-Based Mapping

PP. 410-418
DOI: 10.4236/psych.2012.35058

ABSTRACT

The goal of this study was to develop and test methods for enhancing the accessibility of visual information through conversion to sound. In three experiments, normally sighted and visually impaired participants learned to associate sounds with referent visual stimuli. The conversion included an experience-based method that made use of the natural sounds of objects and a rule-based method that produced an appropriate “auditory graph” via a precise function. Learning was easier with the first method, but an appreciable transfer of learning was observed only with the second method. By rendering visual input highly accessible, these methods have the potential to improve activities of daily living.
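The paper does not reproduce its mapping function in this abstract. Purely as an illustration, a rule-based conversion of the kind auditory graphs typically use maps each data value linearly onto a pitch range, so that higher values yield higher tones; the function names, frequency range, and tone duration below are assumptions, not the authors' actual parameters:

```python
import math

def value_to_pitch(value, vmin, vmax, f_low=220.0, f_high=880.0):
    """Linearly map a data value onto a frequency range in Hz.

    Higher values produce higher pitch, a common auditory-graph convention.
    """
    frac = (value - vmin) / (vmax - vmin)
    return f_low + frac * (f_high - f_low)

def sonify(series, sample_rate=8000, tone_ms=150):
    """Render a data series as consecutive sine tones (raw float samples)."""
    vmin, vmax = min(series), max(series)
    samples_per_tone = int(sample_rate * tone_ms / 1000)
    samples = []
    for v in series:
        freq = value_to_pitch(v, vmin, vmax)
        samples.extend(
            math.sin(2 * math.pi * freq * t / sample_rate)
            for t in range(samples_per_tone)
        )
    return samples

# An ascending series produces a rising pitch contour:
# 0 -> 220 Hz, 5 -> 550 Hz, 10 -> 880 Hz.
contour = [value_to_pitch(v, 0, 10) for v in (0, 5, 10)]
```

Under such a scheme, the "precise function" is what enables transfer of learning: once a listener internalizes the value-to-pitch rule, any new data series becomes interpretable without item-by-item training.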

Conflicts of Interest

The authors declare no conflicts of interest.

Cite this paper

Shenkar, O. & Algom, D. (2012). Enhancing Accessibility of Visual Information via Sound: Metaphoric Association versus Rule-Based Mapping. Psychology, 3, 410-418. doi: 10.4236/psych.2012.35058.



Copyright © 2019 by authors and Scientific Research Publishing Inc.

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.