UC San Diego
Enabling eyes-free interaction with tactile messages based on human experience
- Author(s): Li, Kevin Ansia
As computing moves toward mobile devices, new challenges emerge for Human-Computer Interaction. Although mobile phones have traditionally had visual interfaces, there is a growing number of scenarios in which users need to interact with their devices but cannot look at them due to situational factors. Furthermore, the visual and auditory senses are already overloaded by traditional user interface design. Haptic feedback is a promising alternative channel for information delivery. Research in this domain typically takes an information-theoretic approach, seeking to increase the bandwidth of information transfer through the skin; this often results in complex tactile patterns that are difficult to learn. Given the proliferation of mobile devices and the shortcomings of the visual and auditory channels, there is tremendous opportunity for a tactile communication medium. This dissertation breaks away from the traditional approach to haptic research, focusing instead on how human experience can be used to generate tactile messages that carry pre-learned meaning. We examine how three different types of stimuli can be mapped to the tactile space: music, human touch, and speech. Together, these projects act as a proof of concept, demonstrating how the approach can be applied to a variety of stimuli.