Proximity-based Hand Input for Mobile Interaction

Technological advances in (depth) sensors and mobile projectors have led to the emergence of a new class of interfaces that extend interaction to the surface of our body. These so-called on-body interfaces allow ubiquitous and mobile interaction with digital content by sensing input on and projecting graphical output onto the skin. The hand and forearm receive particular attention because they are often unclothed and socially acceptable to touch. These advantages have resulted in a large body of research on body-based projected, augmented, or imaginary interfaces.

In most of these systems, the user's non-dominant hand acts as a two-dimensional interactive surface on which the dominant hand interacts with the content through (multi-)touch gestures. While useful and practical, this binds the interaction space to the two-dimensional surface of the hand. Moreover, this style of interaction requires both hands and therefore hardly supports situations where users are encumbered. We believe that the large number of degrees of freedom offered by our hands and arms can support one-handed interaction styles based on proximity: we can rotate our hands, move them toward or away from our body, or hold them at a specific position.

We extend the input space of prior on-body user interfaces by focusing on the degree of freedom offered by the elbow joint, i.e., flexion by moving the hand toward the body and extension by moving it away. We propose using this proximity dimension as an additional input modality for one-handed mobile interaction. The interaction space along the user's line of sight can be divided into multiple parallel planes, each corresponding to a layer of visual content. The user can move their hand to browse through successive layers. Beyond palm-projected interfaces, our approach can also serve as an additional input dimension for devices with small input spaces for touch interaction, such as wearables or head-mounted displays. For such devices, our approach expands the interaction space and enables direct manipulation.
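The core of this layer model is a mapping from the sensed hand-to-body distance to a discrete layer index. The sketch below illustrates one plausible such mapping; the parameter names and the near/far bounds, layer count, and centimeter units are illustrative assumptions, not values specified in the text:

```python
def layer_for_distance(distance_cm, near_cm=20.0, far_cm=60.0, num_layers=4):
    """Map a hand-to-body distance to a discrete layer index.

    Assumption: a depth sensor reports the distance between the hand
    and the body in centimeters. Distances outside [near_cm, far_cm]
    clamp to the nearest (first or last) layer.
    """
    if distance_cm <= near_cm:
        return 0
    if distance_cm >= far_cm:
        return num_layers - 1
    # Linearly divide the reachable range into equally thick layers.
    fraction = (distance_cm - near_cm) / (far_cm - near_cm)
    return min(int(fraction * num_layers), num_layers - 1)
```

In practice, a small hysteresis band around layer boundaries would likely be needed so that sensor noise near a boundary does not cause the selected layer to flicker.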


  • Florian Müller
  • Sebastian Günther
  • Niloofar Dezfuli
  • Max Mühlhäuser