In this project, we make a case for leveraging the unique affordances of the human ear for eyes-free, mobile interaction. We present EarPut, a novel interface concept that instruments the ear as an interactive surface for touch-based interaction, together with a prototypical hardware implementation. The central idea behind EarPut is to go beyond prior work by unobtrusively augmenting a variety of accessories worn behind the ear, such as headsets or glasses. Results from a controlled experiment with 27 participants provide empirical evidence that people can target salient regions on their ear effectively and precisely. Moreover, we contribute a first, systematically derived interaction design space for ear-based interaction and a set of exemplary applications.
- Roman Lissermann, Jochen Huber, Aristotelis Hadjakos, Suranga Nanayakkara, and Max Mühlhäuser. "EarPut: Augmenting Ear-worn Devices for Ear-based Interaction". In OzCHI ’14: Proceedings of the 26th Australian Computer-Human Interaction Conference, ACM, 2014. [PDF]