
Auditory Display for Improving Free-hand Gesture Interaction

David Black, Bastian Ganze, Julian Hettig, Christian Hansen

Experimental task: the participant has just selected a sphere (green)

Free-hand gesture recognition technologies allow touchless interaction with a range of applications. However, touchless interaction concepts usually provide only primary, visual feedback on a screen. The lack of secondary, tactile feedback when interacting via free-hand gestures, such as the sensation of pressing a key or clicking a mouse, is one reason that such techniques have not been adopted as a standard means of input.

This work explores the use of auditory display to improve free-hand gestures. Gestures using a Leap Motion controller were augmented with auditory icons and continuous, model-based sonification. Three concepts were generated and evaluated using a sphere-selection task and a video-frame selection task. The user experience of the participants was evaluated using NASA TLX and QUESI questionnaires. Results show that the combination of auditory and visual display outperforms both purely auditory and purely visual displays in terms of subjective workload and performance measures.
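A continuous sonification of this kind can be sketched as a mapping from hand-to-target distance to tone frequency, with a discrete auditory icon on selection. The following Python sketch is purely illustrative; the parameter ranges, the exponential pitch mapping, and the icon name are assumptions, not the study's actual design.

```python
# Illustrative sketch of continuous sonification plus auditory icons for a
# selection task. All ranges and names are assumptions for demonstration.

def distance_to_pitch(distance_mm, d_max=300.0, f_min=220.0, f_max=880.0):
    """Map hand-to-target distance to a tone frequency (closer -> higher).

    Uses exponential interpolation so that equal steps in distance are
    heard as equal pitch intervals.
    """
    d = max(0.0, min(distance_mm, d_max))
    t = 1.0 - d / d_max  # 1.0 at the target, 0.0 at or beyond d_max
    return f_min * (f_max / f_min) ** t

def selection_icon(just_selected):
    """Return the auditory icon (hypothetical name) to play on selection."""
    return "confirm_click" if just_selected else None
```

In a real system the returned frequency would drive a continuously running oscillator, updated at the tracker's frame rate, while the icon is triggered once per state change.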


Towards Uncertainty-Aware Auditory Display for Surgical Navigation


Auditory method to find the target (green): movements along the y axis are conveyed by comparing two alternating tones, while movements along the x axis are encoded as stereo position.

In image-guided navigated interventions, information is often placed on screens to aid the operator in coordinating the placement of a tool with a pre-interventional plan. However, such methods require the operator to repeatedly shift the view away from the patient. Auditory display for navigation (e.g., during ablation needle placement, path following, or volume resection) offers a suitable means of conveying numerical information using sound, permitting the operator to keep attention on the patient. This numerical navigation information, however, is uncertain due to various sources of potential error, including soft-tissue motion estimation and tool-tracking hardware. We enhance an existing auditory display approach for soft-tissue navigation to account for this uncertainty: the operator should not only hear navigation cues towards relevant structures, but also receive information on the quality and reliability of the navigation parameters. The consideration of uncertainty in the operating room is, in general, an exciting area of exploration in the usability of new navigation systems, and conveying it through audio reduces reliance on traditional visual displays.
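The encoding described in the caption above, together with an uncertainty-dependent cue, can be sketched as three small mappings: lateral offset to stereo pan, vertical offset to a pair of alternating tones, and estimated tracking error to a cue-reliability gain. This is a hedged illustration only; the value ranges, the cents-per-millimetre scaling, and the gain law are assumptions, not the authors' implementation.

```python
# Illustrative sketch (not the authors' implementation) of an
# uncertainty-aware auditory navigation encoding. All ranges are assumptions.

def pan_from_x(x_mm, x_max=50.0):
    """Map lateral tool offset to stereo pan in [-1 (left), +1 (right)]."""
    return max(-1.0, min(x_mm / x_max, 1.0))

def alternating_tones(y_mm, f_ref=440.0, cents_per_mm=10.0):
    """Return (reference, comparison) frequencies for the y axis.

    The comparison tone deviates from the fixed reference as the tool
    moves off target; alternating the two tones lets the operator hear
    the direction and size of the offset.
    """
    f_cmp = f_ref * 2.0 ** (y_mm * cents_per_mm / 1200.0)
    return f_ref, f_cmp

def uncertainty_gain(sigma_mm, sigma_max=5.0):
    """Attenuate the cue as the estimated error sigma grows, so the
    operator hears how reliable the guidance currently is (1 = fully
    reliable, 0 = uncertainty at or beyond sigma_max)."""
    return max(0.0, 1.0 - min(sigma_mm, sigma_max) / sigma_max)
```

At the target, the pan sits at center and both tones match; as tracking uncertainty grows, the gain fades the cue rather than presenting unreliable guidance at full strength.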