Think Wii, Kinect. Sort of in between (many gestures are wrist-level, so the device doubles as a pointing device), and the effect is different. E.g., the system needs to have an affector (made that word up): either a cursor, or an element on the screen such as a character in a game, that is clearly under the direct control of the input mechanism... etc. Easy to write an interaction pattern, I think.

Also "touchless gestures," which can be very similar to on-screen gestures but don't make contact...

Review how the on-screen keyboard on the Wii works, especially the virtual cursor and highlight functions.

Problem

Solution

Variations

Pointing (Wii, and Kinect also...)

Sending (two devices sending a contact, etc.)

Sensing (games mostly now, like "you are doing the yoga posture wrong"... but eventually can be used much like contextual sensing of kinesthetic gestures, and ubicomp devices can sense and respond in tangential ways...)

Interaction Details

Presentation Details

Antipatterns

Be aware of the problem of pilot-induced oscillation (PIO) and take steps to detect and alleviate it. PIO arises when the frequency of the interactive system's feedback loop matches the frequency of the user's corrective responses: the user overcorrects, the system responds late, and each correction amplifies the next. Similar behaviors can arise in other computing environments, but there they tend to produce isolated, single-point errors. In aircraft, the movement of the vehicle (especially in the vertical plane) can allow the oscillation to build, possibly to the point of loss of control. The key facet of many remote gesturing systems -- an input maneuvered in three dimensions, in free space -- can lead to the same issue. These problems generally cannot be detected during design, so they must be tested for. Alleviate them by reducing control lag, or by reducing the gain of the input system. Be aware that what counts as an unsuitable response varies by context: games will generally accept much "twitchier" controls than pointing for text entry on a virtual keyboard.
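The gain-reduction fix above can be sketched in software. A minimal illustration, assuming a pointer driven by per-frame input deltas along one axis; the class name, window size, and thresholds are all invented for illustration, not taken from any real toolkit: count rapid direction reversals in the recent input (a crude proxy for oscillation) and temporarily lower the gain while they persist.

```python
from collections import deque

class OscillationDamper:
    """Illustrative sketch: damp pilot-induced oscillation by cutting
    input gain when the recent deltas reverse direction too often.
    All thresholds here are made-up examples, not tuned values."""

    def __init__(self, window=10, reversal_threshold=6,
                 normal_gain=1.0, damped_gain=0.4):
        self.deltas = deque(maxlen=window)      # recent input deltas
        self.reversal_threshold = reversal_threshold
        self.normal_gain = normal_gain
        self.damped_gain = damped_gain

    def _reversals(self):
        # Count sign changes between consecutive non-zero deltas.
        signs = [d > 0 for d in self.deltas if d != 0]
        return sum(1 for a, b in zip(signs, signs[1:]) if a != b)

    def filter(self, delta):
        """Return the delta to apply to the cursor, with reduced
        gain while the input looks oscillatory."""
        self.deltas.append(delta)
        gain = (self.damped_gain
                if self._reversals() >= self.reversal_threshold
                else self.normal_gain)
        return delta * gain
```

A smooth, one-directional motion passes through at full gain; an input that flips sign every frame gets scaled down until it settles. Note this only reduces gain; reducing control lag is a separate, usually harder, engineering problem.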

Examples

PIO Article: http://dtrs.dfrc.nasa.gov/archive/00001004/01/210389v2.pdf