JCJC - SIMI 2 - Science informatique et applications

Interacting with Multi-Dimensional Gestures – MDGest



Increase the number of gestures on small touchscreens

1. Establish guidelines for the design of multidimensional gestures in a range of applications.
2. Design and evaluate gesture widgets, a novel form of user-interface objects that assist users in discovering, learning and executing multidimensional gestures.
3. Facilitate their easy development and deployment.
4. Illustrate their use in real applications and validate them in real contexts of use.

1. Lab studies to observe the cognitive and motor abilities of users as they discover, learn and execute multi-dimensional gestures (with or without feedback)
2. Show how these gestures can be integrated into graphical user interfaces by developing illustrative prototypes and tools for their development

Six new projects:
- Dwell-and-Spring: gesture widget + development toolkit
- ThumbRock: event for discrete and continuous selection on small touchscreens
- SidePress: prototype for capturing pressure gestures
- TilTouch: algorithm + gesture widget to perform gestures combining finger and wrist movements
- Paper+Smartphone: technique for assisting recognition of gestures performed with the dominant hand by using the non-dominant hand
- Power-up button: prototype for capturing gestures performed on and around a button attached to an input device

Putting more expressive power in users' hands instead of relying on graphical widgets that consume screen space at the expense of the document of interest.

1. Caroline Appert, Olivier Chapuis and Emmanuel Pietriga. Dwell-and-Spring: Undo for Direct Manipulation. In CHI '12: Proceedings of the 30th International Conference on Human Factors in Computing Systems, ACM, pages 1957-1966, 2012.
2. Theophanis Tsandilas. Interpreting Strokes on Paper with a Mobile Assistant. In UIST '12: Proceedings of the 25th Annual ACM Symposium on User Interface Software and Technology, ACM, pages 299-308, 2012.
3. David Bonnet, Caroline Appert and Michel Beaudouin-Lafon. Extending the Vocabulary of Touch Events with ThumbRock. 8 pages. To appear in GI '13: Proceedings of the 39th Graphics Interface Conference.

Interacting with gestures has become increasingly relevant to practical product design in recent years, particularly since the introduction of small touch-screen devices. First, tracing gestures is much easier and more natural with a direct pointing device, such as a finger or a stylus on a horizontal surface, than with a mouse on a vertical screen. Second, the standard point-and-click interaction paradigm quickly reaches its limits on small screens, which can only display a small number of graphical targets. Moreover, targets have to be large enough that users can point at them with a finger without ambiguity.

The vocabulary of gestures used on such devices is usually limited. For example, the Apple iPhone provides simple gestures such as sliding a finger, but interaction still relies heavily on navigation through long lists and deep menu hierarchies. Several research projects in Human-Computer Interaction have tried to extend the use of gestures in a range of applications. However, most efforts have focused on solutions for expert users, disregarding the cognitive complexity associated with discovering, memorizing, and activating gestures. Furthermore, in most existing work on gesture-based interaction, gestures are identified solely by their two-dimensional shape. Unfortunately, this approach does not scale, leading to complex gesture vocabularies that are hard to learn and hard to execute.

In this project, we will explore vocabularies based on new gesture characteristics (dimensions), such as speed (local or global), drawing direction and orientation, or distinctive drawing patterns. We employ the term "multidimensional gestures" to characterize gestures whose semantics are defined in terms of more than one dimension. In order to explore this new concept and promote its use in modern interactive applications, we will carry out the following steps:
(1) Establish guidelines for the design of multidimensional gestures in a range of applications.
(2) Design and evaluate gesture widgets, a novel form of user-interface objects that assist users in discovering, learning and executing multidimensional gestures.
(3) Facilitate their easy development and deployment.
(4) Illustrate their use in real applications and validate them in real contexts of use.
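To make the concept concrete, the following sketch computes two hypothetical dimensions (overall speed and direction) from a sequence of touch samples and uses them together to disambiguate a gesture's meaning. The function names, thresholds, and command labels are illustrative assumptions, not part of the project; the dimensions explored by MDGest (local speed, drawing patterns, wrist tilt, pressure) are richer than this example.

```python
import math

def gesture_dimensions(samples):
    """Derive simple dimensions from (x, y, t) touch samples.

    Illustrative sketch only: real multidimensional gestures may use
    local speed profiles, tilt, pressure, or distinctive drawing patterns.
    """
    (x0, y0, t0), (x1, y1, t1) = samples[0], samples[-1]
    # Path length: sum of distances between consecutive samples.
    length = sum(math.hypot(bx - ax, by - ay)
                 for (ax, ay, _), (bx, by, _) in zip(samples, samples[1:]))
    duration = t1 - t0
    return {
        "length": length,
        "speed": length / duration if duration > 0 else 0.0,
        # Overall direction in degrees; screen y grows downward,
        # so 0 = rightward and 90 = upward.
        "direction": math.degrees(math.atan2(y0 - y1, x1 - x0)) % 360,
    }

def interpret(samples, speed_threshold=500.0):
    """Combine two dimensions: the same 2-D shape yields two commands."""
    d = gesture_dimensions(samples)
    if 45 <= d["direction"] < 135:  # roughly upward stroke
        return "scroll-fast" if d["speed"] > speed_threshold else "scroll"
    return "ignore"
```

Here an upward stroke maps to two different commands depending on its speed, which is precisely what a shape-only recognizer cannot express.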

Throughout this project, we plan to produce publications in major HCI conferences and journals, create and distribute open-source software, and communicate our results to the general public through videos and public demonstrations.

Project coordination

Caroline APPERT (UNIVERSITE DE PARIS XI [PARIS-SUD]) – appert@lri.fr

The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility as to its contents.



ANR grant: 87,618 euros
Beginning and duration of the scientific project: - 36 months
