Weigel, M., Mehta, V. and Steimle, J., 2014, April. More than touch: understanding how people use skin as an input surface for mobile computing. In Proceedings of the SIGCHI Conference on Human Factors in Computing Systems (pp. 179-188). ACM.
This project contributes results from an empirical study of on-skin input, an emerging technique for controlling mobile devices. Skin is fundamentally different from off-body touch surfaces, opening up a new and largely unexplored interaction space. We investigate characteristics of the various skin-specific input modalities, analyze what kinds of gestures are performed on skin, and study which input locations are preferred. Our main findings show that (1) users intuitively leverage the properties of skin for a wide range of more expressive commands than on conventional touch surfaces; (2) established multi-touch gestures can be transferred to on-skin input; (3) physically uncomfortable modalities are deliberately used for irreversible commands and expressing negative emotions; and (4) the forearm and the hand are the most preferred locations on the upper limb for on-skin input. We detail users’ mental models and contribute a first consolidated set of on-skin gestures. Our findings provide guidance for developers of future sensors as well as for designers of future applications of on-skin input.
User elicitation study
An empirical study of how people use skin as an input surface for mobile and wearable computing was conducted.
- Opportunistic sampling method
- 22 participants (11 female); 8 different nationalities; 4 different professions
- 3 tasks to be performed on upper limb
- Questionnaire + Likert scale
- Video recorded interview sessions (1 hour each) & Note-taking
- No sensing hardware used
- Think-aloud protocol
- Qualitative method: grounded theory
- Underlying patterns and associated mental models discovered
- Agreement scores and Cohen's kappa scores used to quantify the findings
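The last bullet mentions two standard measures from elicitation studies. As a minimal illustrative sketch (not the paper's actual analysis code), the following assumes the agreement score follows the usual gesture-elicitation convention (sum of squared proportions of identical proposals) and computes Cohen's kappa for two hypothetical coders' gesture labels:

```python
from collections import Counter

def agreement_score(labels):
    """Elicitation-style agreement score: sum of squared proportions
    of each distinct gesture proposal for one referent."""
    n = len(labels)
    return sum((count / n) ** 2 for count in Counter(labels).values())

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' categorical labels on the same items."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Chance agreement, from each rater's marginal label frequencies.
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical labels assigned by two coders to the same ten gesture clips.
a = ["tap", "tap", "grab", "pull", "tap", "grab", "tap", "pull", "tap", "grab"]
b = ["tap", "tap", "grab", "tap",  "tap", "grab", "pull", "pull", "tap", "grab"]
print(round(agreement_score(a), 3))   # 0.38
print(round(cohens_kappa(a, b), 3))   # 0.677
```

Kappa corrects the raw observed agreement (0.8 here) for the agreement expected by chance, which is why it is commonly reported alongside agreement scores to show that the coding of elicited gestures is reliable.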