Technical Paper: Designing gestures for (multi-touch) screens
For several months I worked on a technical paper about designing gestures for screen-based environments. It is finally finished, and you can read it. Here is the abstract:
This paper analyses gesture design for pointing devices in screen-based environments. By exploring design patterns, the analysis investigated the gesture design of five categories of end-user products: desktop operating systems, mobile operating systems, third-party software, small software products, and common hardware products. The paper begins by defining what a gesture is and describing the various kinds of gestures. The analysis then merges the gesture design results with the basic commands for pointing devices. This approach shows which gestures are used most often, and in which contexts. The results give interaction designers and software engineers a guide for implementing gestures in their own products. Furthermore, the paper proposes solutions for gesture documentation, and a conceptual framework for complicated gestures. The last section takes an industrial design perspective on pointing devices as an input channel, discussing the evolution of interface design from a hardware-driven to a software-driven approach.
Unfortunately, I fell ill with a long-term disease. That is why writing this paper took me so long, and also why the data from the analysis dates from January 2010. In my opinion, however, the results of my analysis are still valid. For more up-to-date data, please check the Touch Gesture Reference from LukeW.
I am very grateful for the support of my teachers, friends, and fellow students. Big thanks to Mahir M. Yavuz and Mathias Stäbler for their feedback on the content, to Vesela Mihaylova for great Adobe Illustrator and graphic design support, and to Tim Devine for transforming over 30 pages of my bad English into readable form and marking some unclear points in my paper. Dudes, thank you so much!