Kinesthetic gestures "Whole body motions, whole-device motions (e.g. face down=sleep), and even sensing such as speed, gait... - NOT TOO FUTURISTIC! http://gestureworks.com/features/open-source-gestures/

Roll handset: Raising to face, Lowering from face, Tapping two devices together, Motioning handset towards RFID, Whipping: A quick down-up motion Consider things like Wii, which turn kinestetic gestures into on-screen gestures, and other methods of remoting, and tying multiple inputs in odd ways."

Shake shaking handset rapidly on multiple axis. Used to refresh and reload displayed information or to scatter data.

Roll Rolling device onto it's opposite side will change it's current state. i.e. face down=sleep.

Tapping Tapping two handsets together can be used for file and information sharing

Whipping Rapid back and forth action used to throw or cast information i,e. fishing reels used in games

Raising consider more...antipattern?

Lowering consider more...antipattern?

Problem

Mobile devices should sense and react to the user's natural movements, position and proximity, and behave in ways the user expects without requiring explicit input.

Solution

Kinesthetics is the ability to detect movement of the body; while usually applied in a self-aware sense, here it refers to the mobile device using sensors to detect and react to proximity, action and orientation.

The most common sensor is the accelerometer which, as the name says, measures acceleration along a single axis. As used in modern mobile devices, accelerometers are very compact, integrated microelectromechanical systems (MEMS), usually with detectors for all three axes mounted on a single frame.
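
As a rough illustration only (not part of the original pattern text), the sketch below reads all three accelerometer axes on Android. The SensorManager calls are standard, but the class name, sampling rate and comments are assumptions made here for clarity.

{{{
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Illustrative sketch: read the three accelerometer axes on a modern handset.
// The reported values include gravity, so a device at rest shows roughly
// 9.81 m/s^2 along whichever axis currently points toward the ground.
class AccelerometerReader(context: Context) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private val accelerometer: Sensor? =
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)

    fun start() {
        accelerometer?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_UI)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val x = event.values[0]   // left/right
        val y = event.values[1]   // top/bottom
        val z = event.values[2]   // through the screen
        // React to x, y and z here: all three detectors report through one event.
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) { /* not needed here */ }
}
}}}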

In some circles the term "accelerometer" is starting to conflate the hardware with the behavior it enables, much as "GPS" has become shorthand for location. Other sensors can and should be used to detect kinesthetic gestures, including cameras, proximity sensors, magnetometers (compasses), and close-range radios such as NFC, RFID and Bluetooth. All of these sensors should be used in coordination with one another.
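
As one hedged example of using sensors in coordination, the sketch below combines the proximity sensor with the accelerometer to guess that the handset has been raised to the user's face. The class name, thresholds and the "upright plus covered" heuristic are illustrative assumptions, not something the pattern prescribes.

{{{
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.abs

// Illustrative sketch: coordinate two sensors to infer "raised to face".
class RaiseToFaceDetector(
    context: Context,
    private val onRaisedToFace: () -> Unit
) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private var covered = false       // proximity sensor: something is very close
    private var heldUpright = false   // accelerometer: gravity mostly along the y axis
    private var reported = false

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            // Most proximity sensors report a distance below their maximum range
            // when covered; many simply report 0.
            Sensor.TYPE_PROXIMITY ->
                covered = event.values[0] < event.sensor.maximumRange
            Sensor.TYPE_ACCELEROMETER ->
                heldUpright = abs(event.values[1]) > 7.0f
        }
        val raised = covered && heldUpright
        if (raised && !reported) {
            reported = true
            onRaisedToFace()   // e.g. disable the touch screen during a call
        } else if (!raised) {
            reported = false
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}
}}}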

Kinesthetic gesturing is largely a matter of detecting incidental or natural movements and reacting in appropriate or expected ways. Unlike on-screen gestures, which form a language and can easily become abstracted, kinesthetic gestures are about context. Is the device moving? In what manner? In relation to what? How close is it to the user, or to other devices?

Mobile designs should consider what various types of movement, proximity and orientation mean, and make the device behave appropriately.

Specific subsets of device movement such as Orientation have specialized behaviors and are covered separately. Location is also covered separately. Subsidiary senses of position, such as those used in augmented reality and at close range (e.g. within-building location), may overlap with this pattern but are not yet established enough for patterns to emerge.

The use of a secondary device for communicating gesture, such as a game controller, is covered under the Remote Gestures pattern.

Variations

Only some of the variations of this pattern are explicitly kinesthetic and sense user movement; others primarily detect position, such as proximity to another device:

  • Device Orientation - Relative orientation to the user, or absolute orientation relative to the ground. For example, when the device is rotated screen-down and no other movement is detected, it may be set to a "meeting mode" in which all ringers are silenced (see the sketch below).

  • Device Gesturing - Deliberate gestures such as shaking, whipping or tapping two devices together; see Interaction Details below.

  • User Movement - Incidental movement captured for context, such as detecting whether the user is walking, running or sitting.

  • User Proximity - Reacting to how close the device is to the user; for example, locking the screen as the handset is raised toward the head.

  • Other Device Proximity - Approaching another handset or an NFC reader; note that you don't have to...

More speculative variations may emerge in the future: proximity to ambient signals (for example, audio detection by car radios, or RFID and card readers), the orientation of the detectors on the handset itself, and cultural differences in how such gestures are understood.
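
A minimal sketch of the Device Orientation variation above, assuming Android's SensorManager: it switches into a silent "meeting mode" when the handset is lying screen-down. The class name, callbacks and threshold are hypothetical, and a real implementation would also confirm that no other movement is being detected before silencing anything.

{{{
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager

// Illustrative sketch: screen-down and still means "meeting mode", ringers silenced.
class FaceDownSilencer(
    context: Context,
    private val onEnterMeetingMode: () -> Unit,
    private val onExitMeetingMode: () -> Unit
) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private var silenced = false

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        // With gravity included, the z axis reads close to -9.81 m/s^2
        // when the screen faces the ground.
        val faceDown = event.values[2] < -8.5f
        if (faceDown && !silenced) {
            silenced = true
            onEnterMeetingMode()   // e.g. silence all ringers
        } else if (!faceDown && silenced) {
            silenced = false
            onExitMeetingMode()
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}
}}}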

Interaction Details

Almost all kinesthetic gestures are combination gestures, relying on more than one input or condition; see the Device Orientation variation above. Typical deliberate gestures include:

  • Shake - Shaking the handset rapidly on multiple axes. Used to refresh or reload the displayed information, or to scatter data.

  • Roll - Rolling the device onto its opposite side to change its current state, e.g. face down = sleep.

  • Tap - Tapping two handsets together, or motioning the handset toward an RFID or NFC reader, for file and information sharing.

  • Whip - A rapid back-and-forth (down-up) action used to throw or cast information, e.g. the fishing-reel cast used in games.

  • Raise and Lower - Raising the handset to the face or lowering it away; whether these are useful gestures or antipatterns deserves more consideration.

Whole-body motions and sensing of speed or gait are also within reach (see http://gestureworks.com/features/open-source-gestures/ for one open-source gesture library). Devices such as the Wii remote turn kinesthetic gestures into on-screen gestures; these and other remoting methods are covered under the Remote Gestures pattern.

Kinesthetic gestures are not as ubiquitous as On-Screen Gestures, so they may be unexpected to some users. Settings should be provided to disable (or enable) specific behaviors.
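
The sketch below is a hedged example of the Shake gesture described above, gated by a user setting so the behavior can be turned off entirely. The preference key "shake_to_refresh", the thresholds and the callback are assumptions for illustration.

{{{
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Illustrative sketch: detect rapid shaking on multiple axes, but only if the
// user has left the (hypothetical) "shake_to_refresh" setting enabled.
class ShakeToRefresh(
    context: Context,
    private val onShake: () -> Unit
) : SensorEventListener {

    private val prefs = context.getSharedPreferences("gestures", Context.MODE_PRIVATE)
    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private var lastShakeMs = 0L

    fun start() {
        // Kinesthetic gestures may be unexpected, so honor the user's choice.
        if (!prefs.getBoolean("shake_to_refresh", true)) return
        sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        val x = event.values[0]
        val y = event.values[1]
        val z = event.values[2]
        // Normalize by one g: a stationary device reads about 1.0.
        val gForce = sqrt(x * x + y * y + z * z) / SensorManager.GRAVITY_EARTH
        val now = System.currentTimeMillis()
        if (gForce > 2.5f && now - lastShakeMs > 1000) {   // thresholds are illustrative
            lastShakeMs = now
            onShake()   // e.g. refresh or reload the displayed information
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) {}
}
}}}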

Presentation Details

When reacting to a proximate other device, present the relationship on screen when possible. When the other device's position is known, display related items on the side of the screen adjacent to that device.
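
As an illustration of reacting to a proximate other device, the sketch below uses Android's NFC reader mode to notice when a tag or another device comes within range, at which point the UI could present the relationship on screen. The class and callback names are assumptions, and a real app would also declare the NFC permission in its manifest.

{{{
import android.app.Activity
import android.nfc.NfcAdapter
import android.nfc.Tag

// Illustrative sketch: watch for a nearby NFC tag or device so the screen can
// show the relationship (the callback would update the display).
class NearbyTagWatcher(
    private val activity: Activity,
    private val onTagNearby: (Tag) -> Unit
) {
    private val nfcAdapter: NfcAdapter? = NfcAdapter.getDefaultAdapter(activity)

    fun start() {
        nfcAdapter?.enableReaderMode(
            activity,
            NfcAdapter.ReaderCallback { tag -> onTagNearby(tag) },
            NfcAdapter.FLAG_READER_NFC_A or NfcAdapter.FLAG_READER_NFC_B,
            null   // no extra reader options
        )
    }

    fun stop() {
        nfcAdapter?.disableReaderMode(activity)
    }
}
}}}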

Antipatterns

Examples

Kinesthetic Gestures (last edited 2014-02-24 19:31:29 by shoobe01)