
3.4: References


    Ashmead, D. H., Leroy, D., & Odom, R. D. (1990). Perception of the relative distances of nearby sound sources. Perception & Psychophysics, 47(4), 326–331.

    Avbelj, V. (2012). Auditory display of biomedical signals through a sonic representation: ECG and EEG sonification. In MIPRO, 2012 Proceedings of the 35th International Convention (pp. 474–475). IEEE.

    Axon, L., Alahmadi, B., Nurse, J. R., Goldsmith, M., & Creese, S. (2018). Sonification in security operations centres: what do security practitioners think? Analyst, 7, 3.

    Baecker, R. M., & Buxton, W. A. (1987). Human-computer interaction: a multidisciplinary approach. Morgan Kaufmann Publishers Inc.

    Baier, G., Hermann, T., & Stephani, U. (2007). Event-based sonification of EEG rhythms in real time. Clinical Neurophysiology, 118(6), 1377–1386.

    Ballora, M. (2014). Sonification, Science and Popular Music: In search of the ‘wow.’ Organised Sound, 19(1), 30–40. https://doi.org/10.1017/S1355771813000381

    Ballora, M., Giacobe, N. A., & Hall, D. L. (2011). Songs of cyberspace: an update on sonifications of network traffic to support situational awareness. In Multisensor, Multisource Information Fusion: Architectures, Algorithms, and Applications 2011 (Vol. 8064, p. 80640P). International Society for Optics and Photonics.

    Barrass, S., & Kramer, G. (1999). Using sonification. Multimedia Systems, 7(1), 23–31.

    Batavia, M., Gianutsos, J. G., Vaccaro, A., & Gold, J. T. (2001). A do-it-yourself membrane-activated auditory feedback device for weight bearing and gait training: a case report. Archives of Physical Medicine and Rehabilitation, 82(4), 541–545.

    Baudry, L., Leroy, D., Thouvarecq, R., & Chollet, D. (2006). Auditory concurrent feedback benefits on the circle performed in gymnastics. Journal of Sports Sciences, 24(2), 149–156.

    Bazilinskyy, P., Petermeijer, S. M., Petrovych, V., Dodou, D., & De Winter, J. C. F. (2015). Take-over requests in highly automated driving: A crowdsourcing survey on auditory, vibrotactile, and visual displays. Unpublished.

    Bazilinskyy, P., van Haarlem, W., Quraishi, H., Berssenbrugge, C., Binda, J., & de Winter, J. (2016). Sonifying the location of an object: A comparison of three methods. IFAC-PapersOnLine, 49(19), 531–536.

    Boyd, J., & Godbout, A. (2010). Corrective Sonic Feedback for Speed Skating: A Case Study. Georgia Institute of Technology.

    Brock, M., & Kristensson, P. O. (2013). Supporting blind navigation using depth sensing and sonification. In Proceedings of the 2013 ACM conference on Pervasive and ubiquitous computing adjunct publication (pp. 255–258). ACM.

    Brown, L. M., & Brewster, S. A. (2003). Drawing by ear: Interpreting sonified line graphs. Georgia Institute of Technology.

    Brown, M. L., Newsome, S. L., & Glinert, E. P. (1989). An experiment into the use of auditory cues to reduce visual workload. In ACM SIGCHI Bulletin (Vol. 20, pp. 339–346). ACM.

    Cabrera, D., Ferguson, S., & Laing, G. (2005). Development of auditory alerts for air traffic control consoles. In Audio Engineering Society Convention 119. Audio Engineering Society.

    Chollet, D., Madani, M., & Micallef, J. P. (1992). Effects of two types of biomechanical bio-feedback on crawl performance. Biomechanics and Medicine in Swimming, Swimming Science VI, 48, 53.

    Chollet, D., Micallef, J. P., & Rabischong, P. (1988). Biomechanical signals for external biofeedback to improve swimming techniques. Swimming Science V. Champaign, IL: Human Kinetics Books, 389–396.

    Dayé, C., & de Campo, A. (2006). Sounds sequential: sonification in the social sciences. Interdisciplinary Science Reviews, 31(4), 349–364. https://doi.org/10.1179/030801806X143286

    De Campo, A., Hoeldrich, R., Eckel, G., & Wallisch, A. (2007). New sonification tools for EEG data screening and monitoring. Georgia Institute of Technology.

    Debashi, M., & Vickers, P. (2018). Sonification of network traffic flow for monitoring and situational awareness. PLOS ONE, 13(4), e0195948. https://doi.org/10.1371/journal.pone.0195948

    Diaz Merced, W. L. (2013). Sound for the exploration of space physics data (PhD). University of Glasgow. Retrieved from http://encore.lib.gla.ac.uk/iii/encore/record/C__Rb3090263

    Diaz-Merced, W. L., Candey, R. M., Brickhouse, N., Schneps, M., Mannone, J. C., Brewster, S., & Kolenberg, K. (2011). Sonification of Astronomical Data. Proceedings of the International Astronomical Union, 7(S285), 133–136. https://doi.org/10.1017/S1743921312000440

    Diaz-Merced, W. L. (2017). We too may find new planets. In AASTCS5 Radio Exploration of Planetary Habitability, Proceedings of the conference 7–12 May, 2017 in Palm Springs, CA. Published in Bulletin of the American Astronomical Society, 49(3), id. 202.01.

    Driver, J. (2001). A selective review of selective attention research from the past century. British Journal of Psychology, 92(1), 53–78.

    Driver, J., & Spence, C. (1998). Crossmodal attention. Current Opinion in Neurobiology, 8(2), 245–253. https://doi.org/10.1016/S0959-4388(98)80147-5

    Dubus, G., & Bresin, R. (2015). Exploration and evaluation of a system for interactive sonification of elite rowing. Sports Engineering, 18(1), 29–41. https://doi.org/10.1007/s12283-014-0164-0

    Dyer, J. F., Stapleton, P., & Rodger, M. (2017). Mapping Sonification for Perception and Action in Motor Skill Learning. Frontiers in Neuroscience, 11. https://doi.org/10.3389/fnins.2017.00463

    Edworthy, J., Hellier, E., Aldrich, K., & Loxley, S. (2004). Designing trend-monitoring sounds for helicopters: methodological issues and an application. Journal of Experimental Psychology: Applied, 10(4), 203.

    Eriksson, M., & Bresin, R. (2010). Improving running mechanics by use of interactive sonification. Proceedings of ISon, 95–98.

    Escera, C., Alho, K., Winkler, I., & Näätänen, R. (1998). Neural mechanisms of involuntary attention to acoustic novelty and change. Journal of Cognitive Neuroscience, 10(5), 590–604.

    Fitch, W. T., & Kramer, G. (1994). Sonifying the body electric: Superiority of an auditory over a visual display in a complex, multivariate system. In Santa Fe Institute Studies in the Sciences of Complexity (Vol. 18, pp. 307–307). Addison-Wesley Publishing Co.

    Flowers, J. H. (2005). Thirteen years of reflection on auditory graphing: Promises, pitfalls, and potential new directions. Georgia Institute of Technology.

    Foner, L. N. (1999). Artificial synesthesia via sonification: A wearable augmented sensory system. Mobile Networks and Applications, 4(1), 75–81.

    Frysinger, S. P. (2005). A brief history of auditory data representation to the 1980s. Georgia Institute of Technology.

    Garcia, A., Peres, S. C., Ritchey, P., Kortum, P., & Stallmann, K. (2011). Auditory Progress Bars: Estimations of Time Remaining. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 55(1), 1338–1341. https://doi.org/10.1177/1071181311551278

    George, S. S., Crawford, D., Reubold, T., & Giorgi, E. (2017). Making Climate Data Sing: Using Music-like Sonifications to Convey a Key Climate Record. Bulletin of the American Meteorological Society, 98(1), 23–27.

    Gfeller, K., Woodworth, G., Robin, D. A., Witt, S., & Knutson, J. F. (1997). Perception of rhythmic and sequential pitch patterns by normally hearing adults and adult cochlear implant users. Ear and Hearing, 18(3), 252–260.

    Gilfix, M., & Couch, A. L. (2000). Peep (The Network Auralizer): Monitoring Your Network with Sound. In LISA (pp. 109–117).

    Gionfrida, L., & Roginska, A. (2017). A Novel Sonification Approach to Support the Diagnosis of Alzheimer’s Dementia. Frontiers in Neurology, 8, 647. https://doi.org/10.3389/fneur.2017.00647

    Grond, F., & Hermann, T. (2014a). Interactive Sonification for Data Exploration: How listening modes and display purposes define design guidelines. Organised Sound, 19(1), 41–51.

    Grond, F., & Hermann, T. (2014b). Interactive Sonification for Data Exploration: How listening modes and display purposes define design guidelines. Organised Sound, 19(1), 41–51. https://doi.org/10.1017/S1355771813000393

    Hermann, T., Hunt, A., & Neuhoff, J. G. (2011). The sonification handbook. Logos Verlag Berlin.

    Jamson, A. H., Lai, F. C., & Carsten, O. M. (2008). Potential benefits of an adaptive forward collision warning system. Transportation Research Part C: Emerging Technologies, 16(4), 471–484.

    Kather, J. N., Hermann, T., Bukschat, Y., Kramer, T., Schad, L. R., & Zöllner, F. G. (2017). Polyphonic sonification of electrocardiography signals for diagnosis of cardiac pathologies. Scientific Reports, 7, 44549. https://doi.org/10.1038/srep44549

    Kirby, R. (2009). Development of a real-time performance measurement and feedback system for alpine skiers. Sports Technology, 2(1–2), 43–52.

    Kish, D. (2009, April 11). Seeing with sound: What is it like to “see” the world using sonar? Daniel Kish, who lost his sight in infancy, reveals all. New Scientist.

    Knoll, G. F. (2010). Radiation detection and measurement. John Wiley & Sons.

    Kortum, P., Peres, S. C., Knott, B. A., & Bushey, R. (2005). The Effect of Auditory Progress Bars on Consumer’s Estimation of Telephone wait Time. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 49(4), 628–632. https://doi.org/10.1177/154193120504900406

    Kramer, G. (2000). Auditory display: sonification, audification and auditory interfaces. Addison-Wesley Longman Publishing Co., Inc.

    Larsen, P. E. (2016). More of an art than a science: Using microbial DNA sequences to compose music. Journal of Microbiology & Biology Education, 17(1), 129.

    Loeb, R. G., & Fitch, W. T. (2002). A laboratory evaluation of an auditory display designed to enhance intraoperative monitoring. Anesthesia & Analgesia, 94(2), 362–368.

    Loui, P., Koplin-Green, M., Frick, M., & Massone, M. (2014). Rapidly Learned Identification of Epileptic Seizures from Sonified EEG. Frontiers in Human Neuroscience, 8. https://doi.org/10.3389/fnhum.2014.00820

    Lunn, P., & Hunt, A. (2011). Listening to the invisible: Sonification as a tool for astronomical discovery.

    Mezrich, J. J., Frysinger, S., & Slivjanovski, R. (1984). Dynamic representation of multivariate time series data. Journal of the American Statistical Association, 79(385), 34–40.

    Näätänen, R., Paavilainen, P., Rinne, T., & Alho, K. (2007). The mismatch negativity (MMN) in basic research of central auditory processing: A review. Clinical Neurophysiology, 118(12), 2544–2590. https://doi.org/10.1016/j.clinph.2007.04.026

    Nagarajan, R., Yaacob, S., & Sainarayanan, G. (2003). Role of object identification in sonification system for visually impaired. In TENCON 2003. Conference on Convergent Technologies for the Asia-Pacific Region (Vol. 2, pp. 735–739). IEEE.

    Nees, M. A., & Walker, B. N. (2009). Auditory Interfaces and Sonification.

    Oscari, F., Secoli, R., Avanzini, F., Rosati, G., & Reinkensmeyer, D. J. (2012). Substituting auditory for visual feedback to adapt to altered dynamic and kinematic environments during reaching. Experimental Brain Research, 221(1), 33–41.

    Parkinson, A., & Tanaka, A. (2013). Making Data Sing: Embodied Approaches to Sonification. In Sound, Music, and Motion (pp. 151–160). Springer, Cham. https://doi.org/10.1007/978-3-319-12976-1_9

    Parvizi, J., Gururangan, K., Razavi, B., & Chafe, C. (2018). Detecting silent seizures by their sound. Epilepsia, 59(4), 877–884.

    Paterson, E., Sanderson, P. M., Paterson, N. A. B., & Loeb, R. G. (2017). Effectiveness of enhanced pulse oximetry sonifications for conveying oxygen saturation ranges: a laboratory comparison of five auditory displays. British Journal of Anaesthesia, 119(6), 1224–1230. https://doi.org/10.1093/bja/aex343

    Pereverzev, S. V., Loshak, A., Backhaus, S., Davis, J. C., & Packard, R. E. (1997). Quantum oscillations between two weakly coupled reservoirs of superfluid 3He. Nature, 388(6641), 449.

    Petrofsky, J. (2001). The use of electromyogram biofeedback to reduce Trendelenburg gait. European Journal of Applied Physiology, 85(5), 491–495.

    Pollack, I., & Ficks, L. (1954). Information of elementary multidimensional auditory displays. The Journal of the Acoustical Society of America, 26(2), 155–158.

    Qi, L., Martin, M. V., Kapralos, B., Green, M., & García-Ruiz, M. (2007). Toward sound-assisted intrusion detection systems. In OTM Confederated International Conferences “On the Move to Meaningful Internet Systems” (pp. 1634–1645). Springer.

    Quinn, M. (2001). Research set to music: The climate symphony and other sonifications of ice core, radar, DNA, seismic and solar wind data. Georgia Institute of Technology.

    Quinn, M. (2012). “Walk on the Sun”: an interactive image sonification exhibit. AI & Society, 27(2), 303–305.

    Riskowski, J. L., Mikesky, A. E., Bahamonde, R. E., & Burr, D. B. (2009). Design and validation of a knee brace with feedback to reduce the rate of loading. Journal of Biomechanical Engineering, 131(8), 084503.

    Sanderson, P. (2006). The multimodal world of medical monitoring displays. Applied Ergonomics, 37(4), 501–512.

    Schaffert, N., Mattes, K., & Effenberg, A. O. (2009). A sound design for the purposes of movement optimisation in elite sport (using the example of rowing). Georgia Institute of Technology.

    Schmitz, G., & Bock, O. (2014). A Comparison of Sensorimotor Adaptation in the Visual and in the Auditory Modality. PLoS ONE, 9(9), e107834.

    Seagull, F. J., Wickens, C. D., & Loeb, R. G. (2001). When is less more? Attention and workload in auditory, visual, and redundant patient-monitoring conditions. In Proceedings of the Human Factors and Ergonomics Society Annual Meeting (Vol. 45, pp. 1395–1399). SAGE Publications Sage CA: Los Angeles, CA.

    Sigrist, R., Rauter, G., Riener, R., & Wolf, P. (2013). Augmented visual, auditory, haptic, and multimodal feedback in motor learning: A review. Psychonomic Bulletin & Review, 20(1), 21–53. https://doi.org/10.3758/s13423-012-0333-8

    Sigrist, R., Schellenberg, J., Rauter, G., Broggi, S., Riener, R., & Wolf, P. (2011). Visual and auditory augmented concurrent feedback in a complex motor task. Presence: Teleoperators and Virtual Environments, 20(1), 15–32.

    Speeth, S. D. (1961). Seismometer sounds. The Journal of the Acoustical Society of America, 33(7), 909–916.

    Stanton, J. (2015). Sensing big data: Multimodal information interfaces for exploration of large data sets. In Big Data at Work (pp. 172–192). Routledge.

    Stockman, T., Nickerson, L. V., & Hind, G. (2005). Auditory graphs: A summary of current experience and towards a research agenda. Georgia Institute of Technology.

    Sturm, B. L. (2005). Pulse of an Ocean: Sonification of Ocean Buoy Data. Leonardo, 38(2), 143–149.

    Targett, S., & Fernstrom, M. (2003). Audio games: Fun for all? All for fun! Georgia Institute of Technology.

    Tervaniemi, M., & Brattico, E. (2004). From sounds to music towards understanding the neurocognition of musical sound perception. Journal of Consciousness Studies, 11(3–4), 9–27.

    Underwood, S. M. (2009). Effects of augmented real-time auditory feedback on top-level precision shooting performance.

    Väljamäe, A., Steffert, T., Holland, S., Marimon, X., Benitez, R., Mealla, S., … Jordà, S. (2013). A review of real-time EEG sonification research (pp. 85–93). Presented at the International Conference on Auditory Display 2013 (ICAD 2013), Lodz, Poland. Retrieved from http://icad2013.com/index.php

    Vickers, P., Laing, C., Debashi, M., & Fairfax, T. (2014). Sonification Aesthetics and Listening for Network Situational Awareness. arXiv:1409.5282 [cs]. https://doi.org/10.13140/2.1.4225.6648

    Vickers, P., Laing, C., & Fairfax, T. (2017). Sonification of a network’s self-organized criticality for real-time situational awareness. Displays, 47, 12–24. https://doi.org/10.1016/j.displa.2016.05.002

    Walker, B. N., & Kramer, G. (2004). Ecological psychoacoustics and auditory displays: Hearing, grouping, and meaning making. Ecological Psychoacoustics, 150–175.

    Walker, B. N., & Mauney, L. M. (2010). Universal design of auditory graphs: A comparison of sonification mappings for visually impaired and sighted listeners. ACM Transactions on Accessible Computing (TACCESS), 2(3), 12.

    Watson, M., & Sanderson, P. (2004). Sonification supports eyes-free respiratory monitoring and task time-sharing. Human Factors, 46(3), 497–517.

    Watson, T., & Lip, G. Y. H. (2006). Blood pressure measurement in atrial fibrillation: goodbye mercury? Journal of Human Hypertension, 20(9), 638.

    Wickens, C. D. (2002). Multiple resources and performance prediction. Theoretical Issues in Ergonomics Science, 3(2), 159–177.

    Wickens, C. D., & Liu, Y. (1988). Codes and modalities in multiple resources: A success and a qualification. Human Factors, 30(5), 599–616.

    Wickens, C. D., Parasuraman, R., & Davies, D. R. (1984). Varieties of attention.

    Winberg, F., & Hellstrom, S. O. (2001). Qualitative aspects of auditory direct manipulation: A case study of the Towers of Hanoi. Georgia Institute of Technology.

    Yamamoto, G., Shiraki, K., Takahata, M., Sakane, Y., & Takebayashi, Y. (2004). Multimodal knowledge for designing new sound environments. In The International Conference on Human Computer Interaction with Mobile Devices and Services.

    Yeung, E. S. (1980). Pattern recognition by audio representation of multivariate analytical data. Analytical Chemistry, 52(7), 1120–1123.


    This page titled 3.4: References is shared under a not declared license and was authored, remixed, and/or curated by Matthew J. C. Crump via source content that was edited to the style and standards of the LibreTexts platform; a detailed edit history is available upon request.