Today, smartphones with touchscreens and sensors are the predominant and fastest-growing class of consumer computing devices. However, because these devices are used in diverse situations and have unique capabilities and form factors, they raise new user interface challenges; at the same time, they offer great opportunities for impactful HCI research.
In this talk, I will focus on gesture-based interaction, a class of interaction behaviors, enabled by touchscreens and built-in sensors, that sets mobile interaction apart from traditional graphical user interfaces.
I will first talk about gesture shortcuts in the context of Gesture Search [1], a tool that lets users quickly access applications and data on the phone by simply drawing a few gestures (http://www.google.com/mobile/gesture-search). Gesture Search flattens the phone's UI hierarchy by removing the need to navigate through the interface. It has been publicly released and is invoked hundreds of thousands of times per day by a large user population.
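To make the interaction concrete, here is a minimal sketch of how such a gesture-shortcut loop can work: a stroke recognizer proposes character hypotheses for each drawn gesture, and the growing query incrementally narrows an index of on-device items. All names, the data model, and the single-hypothesis simplification below are hypothetical illustrations, not Gesture Search's actual implementation.

from dataclasses import dataclass, field

@dataclass
class Item:
    name: str  # e.g., an app or contact name
    kind: str  # "app", "contact", ...

@dataclass
class GestureSearch:
    index: list[Item] = field(default_factory=list)
    query: str = ""  # characters recognized so far

    def add_stroke(self, hypotheses: list[tuple[str, float]]) -> list[Item]:
        # hypotheses: (character, confidence) pairs from a stroke
        # recognizer. A real system would carry multiple hypotheses
        # per stroke; this sketch keeps only the best one.
        best_char, _ = max(hypotheses, key=lambda h: h[1])
        self.query += best_char.lower()
        return self.matches()

    def matches(self) -> list[Item]:
        # Rank prefix matches ahead of substring matches.
        q = self.query
        prefix = [i for i in self.index if i.name.lower().startswith(q)]
        inside = [i for i in self.index
                  if q in i.name.lower() and i not in prefix]
        return prefix + inside

items = [Item("Maps", "app"), Item("Mary", "contact"), Item("Music", "app")]
gs = GestureSearch(index=items)
print([i.name for i in gs.add_stroke([("M", 0.9), ("N", 0.1)])])  # all three
print([i.name for i in gs.add_stroke([("a", 0.8), ("u", 0.2)])])  # Maps, Mary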
I will then cover several related projects that furthered our investigation into gesture shortcuts, including using gestures for target acquisition [3], crowdsourcing-based gesture recognition [5], and our early exploration of motion gestures [4, 6, 7].
Finally, I will turn to multi-touch gestures for direct manipulation of an interface, the dominant class of gesture-based interaction on today's commercial devices. Multi-touch gestures are intuitive and efficient to use but can be difficult to implement. I will discuss tools that allow developers to more easily create multi-touch interaction behaviors by demonstration [2]; a sketch of the problem follows.
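As a rough illustration of why these behaviors are hard to hand-code, below is the kind of touch-event state machine developers typically write manually, and that a demonstration-based tool like Gesture Coder can instead generate. The event format, state names, and threshold are assumptions for illustration, not Gesture Coder's output or any platform's API.

import math

TAP_SLOP = 10.0  # max movement (px) for a touch to still count as a tap

class MultiTouchRecognizer:
    def __init__(self):
        self.down = {}       # pointer id -> (x, y) at touch-down
        self.state = "idle"  # idle -> possible_tap -> {tap, drag, pinch}

    def on_event(self, action, pid, x, y):
        if action == "down":
            self.down[pid] = (x, y)
            # With two fingers down, only a two-finger gesture
            # (here, pinch) can match from this point on.
            self.state = "pinch" if len(self.down) == 2 else "possible_tap"
        elif action == "move" and self.state == "possible_tap":
            x0, y0 = self.down[pid]
            if math.hypot(x - x0, y - y0) > TAP_SLOP:
                self.state = "drag"  # moved too far to be a tap
        elif action == "up":
            if self.state == "possible_tap":
                self.state = "tap"
            self.down.pop(pid, None)
        return self.state

r = MultiTouchRecognizer()
for ev in [("down", 0, 5, 5), ("move", 0, 40, 5), ("up", 0, 40, 5)]:
    print(r.on_event(*ev))  # possible_tap, drag, drag

Even this toy version must track per-pointer state and ordering constraints; adding more gestures multiplies the states and transitions, which is exactly the bookkeeping that demonstration-based tools take off the developer's hands.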
Together, these projects investigate various aspects of gesture-based interaction on mobile devices and help open a new dimension for mobile interaction.
[1] Yang Li. Gesture Search: a tool for fast mobile data access. UIST 2010.
[2] Hao Lü and Yang Li. Gesture Coder: a tool for programming multi-touch gestures by demonstration. CHI 2012.
[3] Hao Lü and Yang Li. Gesture Avatar: a technique for operating mobile user interfaces using gestures. CHI 2011.
[4] Matei Negulescu, Jaime Ruiz, Yang Li, and Edward Lank. Tap, swipe, or move: attentional demands for distracted smartphone input. AVI 2012.
[5] Tom Y. Ouyang and Yang Li. Bootstrapping personal gesture shortcuts with the wisdom of the crowd and handwriting recognition. CHI 2012.
[6] Jaime Ruiz, Yang Li, and Edward Lank. User-defined motion gestures for mobile interaction. CHI 2011.
[7] Jaime Ruiz and Yang Li. DoubleFlip: a motion gesture delimiter for mobile interaction. UIST 2010.