An Overview of Auditory Displays and Sonification

Auditory Displays

Auditory displays are systems in which a human user makes sense of data by listening, for instance data under analysis or data that represent states of an information-processing system. An auditory display comprises the information pre-processing system (A), the task, the techniques for data processing and computation (B), the sonification engine (C), the sound signal amplification and effectors such as speakers or headphones (D), and the anticipated user with his/her listening abilities and situational context (background sounds etc.), as depicted in the following figure.

[Figure: components of an auditory display system]

Sonification

Sonification is the use of sound, mainly non-speech audio signals, for representing or displaying data. Similar to scientific visualization, sonification aims at enabling human listeners to make use of their highly developed perceptual skills (in this case listening skills) for making sense of the data. More specifically, sonification refers to a technique that creates a sound signal in a systematic, well-defined way, with the data under examination as an essential ingredient of the sound computation. See also the definition of sonification.
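
The simplest instance of this idea is to render a data series directly as a waveform. The following minimal sketch (in Python, assuming numpy is available; audify is an illustrative name chosen for this example, not a standard function) writes a one-dimensional data series to a WAV file so that the data values themselves become the audio samples:

    import numpy as np
    import wave

    def audify(data, filename="audification.wav", fs=44100):
        # Treat the (normalized) data series itself as the audio waveform.
        x = np.asarray(data, dtype=float)
        x = x - x.mean()                      # remove DC offset
        peak = np.max(np.abs(x))
        if peak > 0:
            x = x / peak                      # normalize to [-1, 1]
        samples = (x * 32767).astype(np.int16)
        with wave.open(filename, "wb") as w:
            w.setnchannels(1)                 # mono
            w.setsampwidth(2)                 # 16-bit samples
            w.setframerate(fs)
            w.writeframes(samples.tobytes())

    # One second of synthetic measurement data, rendered directly as sound.
    audify(np.random.randn(44100))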

Sonification Techniques

Sonification techniques range from a very direct connection between data and sound (Audification), via a mapping-based connection (Parameter Mapping Sonification), to more indirect types of linkage (e.g. Model-Based Sonification). There are also sonification techniques for representing very specific information such as messages (alarm conditions, completion acknowledgements, orientation in a menu hierarchy etc.); Earcons and Auditory Icons are mainly used for such auditory display tasks. See also the sonification techniques page.

[Figure: sonification techniques, arranged hierarchically]
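
As an illustrative sketch of Parameter Mapping Sonification (the function name and mapping ranges below are arbitrary choices for this example, not a standard), the following Python code maps each data value to the pitch of a short sine tone, using an exponential frequency mapping so that equal data steps yield equal pitch steps:

    import numpy as np
    import wave

    def parameter_mapping_sonification(data, filename="pmson.wav", fs=44100,
                                       tone_dur=0.1, fmin=200.0, fmax=2000.0):
        x = np.asarray(data, dtype=float)
        lo, hi = x.min(), x.max()
        norm = (x - lo) / (hi - lo) if hi > lo else np.zeros_like(x)
        freqs = fmin * (fmax / fmin) ** norm       # exponential pitch mapping
        t = np.arange(int(fs * tone_dur)) / fs
        env = np.hanning(t.size)                   # fade each tone in/out to avoid clicks
        signal = np.concatenate([env * np.sin(2 * np.pi * f * t) for f in freqs])
        samples = (signal * 0.8 * 32767).astype(np.int16)
        with wave.open(filename, "wb") as w:
            w.setnchannels(1)
            w.setsampwidth(2)
            w.setframerate(fs)
            w.writeframes(samples.tobytes())

    # A sine-shaped data series becomes a rising and falling melody.
    parameter_mapping_sonification(np.sin(np.linspace(0, 4 * np.pi, 40)))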

Motivation

The motivation for using non-speech sound in human-computer interaction is manifold:

  • Sound represents frequency responses in an instant (as timbral characteristics)
  • Sound represents changes over time, naturally
  • Sound allows microstructure to be perceived
  • Sound rapidly portrays large amounts of data
  • Sound alerts listeners to events outside their current visual focus
  • Sound holistically brings together many channels of information

These different perceptual characteristics make sound an ideal complement to visually displayed information.

Applications

Particularly in contexts where the visual sense is already highly loaded, it makes sense to consider sound as a modality for a better distribution of information. For some users the visual sense is unavailable or fully occupied (such as visually impaired people, or surgeons in an operating theatre), so that sonification offers a unique opportunity to display information. A particularly promising application field, however, is the exploratory analysis of high-dimensional and complex data, as addressed in the fields of data mining and multivariate statistics. The better we can bring the human user into contact with complex data, the more likely he/she will gain insight (insound?) into the data. Combining visual, auditory and tactile modalities leads to multimodal data exploration techniques, sometimes referred to in general as data perceptualization. The sonification application page gives an overview.

Interactive Sonification

Interactive Sonification puts a particular focus on systems where the user is tightly integrated into a closed-loop sonification system. The idea is that interaction is as important for understanding and using auditory displays efficiently as multiple views are for understanding, for instance, the 3D geometry of objects in the world. Interaction binds the user's actions to acoustic reactions, and thus creates modes of fast and seamless navigation and exploration, and ultimately the experience of flow while performing an activity. We are eager to understand how interaction in sonification can support the utility, effectiveness, acceptance and perceived aesthetics of auditory displays. ISon, the Interactive Sonification Workshop series launched in 2004 by Thomas Hermann and Andy Hunt in Bielefeld, Germany, summarizes the progress in this research area. More information is available at www.interactive-sonification.org.

Closed-Loop Auditory Systems

Auditory displays that close the loop between the human listener and the system may take different forms, which mostly correspond to different tasks. The following figure shows how sonification can be integrated into a closed-loop system.

[Figure: closed-loop systems]

In all cases, the sonification module generates sound depending on the data and interactions. However, the loop can be closed to different degrees:

  • Monitoring: if there is no immediate action, the system might be called a monitoring system; it either monitors real-time data (stemming from the world) or data played back from a database (e.g. in real time).
  • Interactive Sonification: if the user's actions change parameters of the sonification technique, or interact with the sonification module, we obtain an interactive sonification in the most direct sense: the user can directly control how data are represented as sound. This might serve to make a specific structure in the sound more salient, i.e. to optimize the sonification according to some objective. Excitatory interfaces, as typically obtained when using Model-Based Sonification, allow the user to excite a sonification model by putting energy into the system (e.g. hitting or shaking the data-driven dynamic system). Such interactions cause auditory responses that depend on the underlying data and thus represent information about the data. These and similar immediate interactions with a sonification system are what is meant by interactive sonification.
  • Navigation occurs when the user's interaction selects the data used for sonification as the activity that closes the loop. This is basically also a sort of interactive sonification, but more precisely one that navigates the data.
  • Auditory Biofeedback: if the human activity (measured via any available sensor) is itself the data subject to sonification, the sonification is an auditory representation of the human's activity, which is classically referred to as biofeedback. There are different applications, including technique and tactics training in sports (as conducted by the author), AcouMotion, which for instance allows novel sports games for the visually impaired, Brain-Computer Interfaces (BCI), and more; a minimal sketch of such a loop follows after this list.
  • Human Activity in the World: if the action affects a system in the world that in turn causes data to change which are the input of a sonification system, the user's intention is typically not the sonification itself but an anticipated change in the world. In this sense, the sonification may be a helpful by-product of the interaction that supports human action. A real-world example is putting a cup on a table: the impact sound is a by-product of the interaction and serves as confirmation that the action unit is complete. An example from experimental control is adjusting a laser beam via control screws on an adjustable mirror; the action refers to the world, yet in consequence the laser beam might move on a detector, and a sonification of these data might change (this example is elaborated further in the Master's thesis of Arne Wulf on auditory closed-loop systems, supervised by me). The sonification is not the original intention of the user's action, but it helps to better perform activity in the world.
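
As a rough sketch of such an auditory biofeedback loop (assuming the third-party sounddevice library for real-time audio output; read_sensor() is a hypothetical stand-in for a real motion or physiological sensor), the following Python code continuously maps a sensor value in [0, 1] to the pitch of a sine tone, so the user hears an immediate acoustic reaction to his/her own activity:

    import time
    import numpy as np
    import sounddevice as sd

    fs = 44100       # audio sample rate
    phase = 0.0      # running oscillator phase, kept for click-free pitch changes

    def read_sensor():
        # Hypothetical sensor stand-in: replace with a real motion or
        # physiological sensor delivering values in [0, 1].
        return 0.5 + 0.5 * np.sin(2 * np.pi * 0.2 * time.time())

    def callback(outdata, frames, time_info, status):
        global phase
        value = np.clip(read_sensor(), 0.0, 1.0)
        freq = 200.0 + 600.0 * value             # map activity to 200..800 Hz
        inc = 2 * np.pi * freq / fs
        phases = phase + inc * np.arange(frames)
        outdata[:, 0] = 0.2 * np.sin(phases)     # quiet sine tone
        phase = (phases[-1] + inc) % (2 * np.pi)

    # Run the closed loop: sensor -> sonification -> listener, for ten seconds.
    with sd.OutputStream(samplerate=fs, channels=1, callback=callback):
        sd.sleep(10_000)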