Thomas Hermann | Research Overview
My research fields are Sonification, Data Mining, and Human-Computer Interfaces. Sonification is the use of sound for the presentation of data. Data Mining is the field in which I apply sonification techniques in order to exploit our highly developed perceptual capabilities for discovering structures and hidden regularities in high-dimensional data. Human-Computer Interfaces play a central role in enabling interaction with the data under analysis, and my interests focus on multimodal, highly interactive data displays for real-time manipulation of – and navigation in – complex data.
In particular, I focus on the development of new techniques for rendering auditory presentations of high-dimensional data that convey useful, task-related information through sound. The approach of Model-Based Sonification (MBS), developed in my Ph.D. thesis, provides a framework for generic connections between data spaces and acoustic processes and puts the focus on interaction with a sonification system.
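To give a flavour of the MBS idea, the following toy sketch (my own illustrative code, not the original implementation; all function names and parameter choices are hypothetical) treats each data point as a damped oscillator excited by a virtual shock wave spreading out from an interaction point, loosely in the spirit of the data sonogram model:

```python
import numpy as np

def data_sonogram(data, excitation_point, sr=44100, dur=1.0):
    """Toy Model-Based Sonification sketch (hypothetical, simplified):
    every data point acts as a damped oscillator that starts ringing when
    a virtual shock wave from `excitation_point` reaches it; the output
    is the superposition of all oscillator responses."""
    t = np.linspace(0.0, dur, int(sr * dur), endpoint=False)
    out = np.zeros_like(t)
    for x in data:
        dist = np.linalg.norm(x - excitation_point)   # distance in data space
        onset = dist * 0.1                            # shock wave arrives later at distant points
        freq = 200.0 + 50.0 * dist                    # hypothetical pitch mapping from position
        env = np.where(t >= onset, np.exp(-3.0 * (t - onset)), 0.0)  # decay after onset
        out += env * np.sin(2 * np.pi * freq * (t - onset))
    return out / max(np.abs(out).max(), 1e-9)         # normalize to [-1, 1]

# Usage: "pluck" a small 2-D data cluster at the origin
rng = np.random.default_rng(0)
signal = data_sonogram(rng.normal(size=(20, 2)), np.zeros(2))
```

Dense regions of the data then ring as clusters of near-simultaneous, similarly pitched partials, so cluster structure becomes audible texture.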
Human hands allow particularly complex, multi-dimensional control, and their use for manipulating multimodal data renderings offers enormous potential. My approach is to develop both computer-vision-based interfaces that enable gestural control and tangible computing interfaces that exploit human skills in physical interaction.
Understanding and improving the interface between information spaces (such as those in data mining) and human perceptual spaces demands a broad interdisciplinary view. Here I rely on my strong background in music, physics, and computer science, and on the competences in interdisciplinary dialogue and research that I developed within the interdisciplinary graduate program Task-oriented Communication (between linguistics and computer science) and in international, interdisciplinary cooperations.
I am driven by the auspicious vision that scientific sonification, visualization, tactile displays and tangible or gestural interactions can be combined in novel synergistic ways to create multimodal experiences that provide more than the sum of isolated displays.
Please see the research section for more details.
Research Projects Selection
The following list provides a selection of projects that I initiated and conducted in recent years. More details, including sound examples and the cooperating researchers involved, are given on the corresponding project pages in [research].
To the development of a theory of auditory display, I contributed:
- Model-Based Sonification – a framework for the generation of generic interfaces between high-dimensional data sets and sounding objects
- MCMC Sonification for exploring high-dimensional distributions
- Sonification models – novel techniques for exploratory data analysis, ranging from Principal Curve Sonification through Particle Trajectory Sonification to Growing Neural Gas Sonification
- Gestural Navigation in high-dimensional spaces using Self-Organizing Maps (SOM) and the concept of the Aura.
- Psychophysical Experiments for assessing the use of sonification.
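As an illustration of the MCMC sonification idea listed above, here is a toy sketch (a hypothetical mapping of my own, not the published method): a one-dimensional Metropolis sampler whose steps become discrete sound events, so that modes of the distribution are heard as regions of sustained pitch:

```python
import math
import random

def mcmc_sonification_events(log_density, x0, n_steps=200, step=0.5, seed=1):
    """Hypothetical MCMC sonification sketch: run a 1-D Metropolis sampler
    and emit one (pitch, loudness) event per step; accepted moves sound
    louder than rejected proposals."""
    rng = random.Random(seed)
    x, events = x0, []
    for _ in range(n_steps):
        prop = x + rng.gauss(0.0, step)                    # random-walk proposal
        if math.log(rng.random() + 1e-12) < log_density(prop) - log_density(x):
            x = prop                                       # Metropolis accept
            loudness = 0.9
        else:
            loudness = 0.3                                 # rejected: quieter event
        pitch = 60 + round(4 * x)                          # hypothetical value-to-pitch mapping
        events.append((pitch, loudness))
    return events

# Usage: "listen" to a standard normal; pitches concentrate near 60
events = mcmc_sonification_events(lambda x: -0.5 * x * x, 0.0)
```

The event stream can be passed to any synthesis backend; the point is that the listener can follow mixing behaviour and mode structure over time without plotting the chain.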
As Applications of sonification, I initiated, supervised, or cooperated on the following selected projects:
- Sonification of Stock Market Data: real-time and time-compressed sonifications for guiding attention to market behaviour.
- EEG Sonification: tools for the exploratory analysis of data from psycholinguistic experiments. My current research focuses on techniques for discriminating pathologies via the sonification of epileptic episodes.
- Sonification of Therapeutic Verbatim Protocols: psychotherapy sessions have been sonified to allow highly compressed browsing of session protocols for detecting key moments in the course of therapy sessions.
- Exploration of biomedical multi-channel fluorescence microscopy images using sonification
- Broadcasting Auditory Weather Forecasts: daily rendered sonifications were integrated into the programme of a Bielefeld radio station for half a year.
- Using Sonification to support Tactics Analysis in Sports Games
- Sonification for augmenting experimental techniques in nanotechnology, using the example of Atomic Force Microscopy-based force spectroscopy.
- Blindminton – an Auditory Sports Game for Blind People
- AcouMotion – A system for acoustic motion control with applications in Sports Science, Physiotherapy and Cognitive Research.
Additionally, I was closely involved in the development of technical infrastructure and laboratory facilities:
- Tangible Desk (tDesk): a tangible human-computer interface that allows physical manipulation and mixed-reality applications, with applications ranging from edutainment to exploratory data analysis.
- A Malleable User Interface (MUI): an interface that enables deformation-based manipulation for interacting with high-dimensional data on top of the tDesk.
- An audio-haptic ball interface for sonification: an interface ball equipped with sensors and actuators, serving as a novel instrument for exciting sonification models.
- Ambient Light Objects: OSC-controlled luminescent objects for ambient data display.
- Interaction Lab (iLab): our iLab is an intelligent room equipped with the components listed above, plus an 8-channel spatial audio system, several cameras, and additional sensors. The aim is to study multimodal interactions such as the coordinated use of real-time rendered auditory, tactile, and visual displays.