Kinesthetic Gestures
Problem
Mobile devices should react to the natural movements, orientation and proximity of their users, responding to behavior without requiring explicit input.
Solution
Kinesthetics is the ability to detect movement of the body; while usually applied in a self-aware sense, here it refers to the mobile device using sensors to detect and react to proximity, action and orientation.
The most common sensor is the accelerometer, which, as the name says, measures acceleration along a single axis. As utilized in modern mobile devices, accelerometers are very compact, integrated micro-electro-mechanical systems (MEMS), usually with detectors for all three axes mounted to a single frame.
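As a minimal sketch of how three-axis readings might be consumed (plain Kotlin; the Sample type, GRAVITY constant and tolerance are illustrative assumptions, not any platform's API), the following computes the magnitude of the acceleration vector and checks whether the device is roughly at rest under gravity:

import kotlin.math.abs
import kotlin.math.sqrt

// Hypothetical three-axis accelerometer reading, in m/s^2.
data class Sample(val x: Double, val y: Double, val z: Double) {
    // Magnitude of the acceleration vector across all three axes.
    val magnitude: Double get() = sqrt(x * x + y * y + z * z)
}

const val GRAVITY = 9.81         // m/s^2
const val REST_TOLERANCE = 0.5   // assumed band for "no movement"

// A device at rest reads about 1 g in total, regardless of orientation.
fun isAtRest(s: Sample): Boolean = abs(s.magnitude - GRAVITY) < REST_TOLERANCE

fun main() {
    println(isAtRest(Sample(0.1, -0.2, 9.78))) // true: lying still
    println(isAtRest(Sample(3.0, 1.5, 12.0)))  // false: being moved
}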
In certain circles, the term "accelerometer" is beginning to conflate hardware with behavior, much as "GPS" has come to stand in for location. Other sensors can and should be used to detect Kinesthetic gestures, including cameras, proximity sensors, magnetometers (compasses), audio, and close-range radios such as NFC, RFID and Bluetooth. All these sensors should be used in coordination with each other.
Kinesthetic gesturing is largely a matter of detecting incidental or natural movements and reacting in an appropriate or expected manner. Unlike on-screen gestures -- which form a language and can become abstracted very easily -- kinesthetic gesturing is about context. Is the device moving? In what manner? In relation to what? How close is it to the user, or to other devices?
Design of mobile devices should consider what various types of movement, proximity and orientation mean, and behave appropriately.
Specific subsets of device movement such as Orientation have specialized behaviors and are covered separately. Location is also covered separately. Subsidiary senses of position, such as those used in augmented reality and at close range (e.g. within-building location), may have some overlap, but are not yet established enough for patterns to emerge.
The use of a secondary device for communicating gesture, such as a game controller, is covered under the Remote Gestures pattern.
Variations
Only some of the variations of this pattern are explicitly kinesthetic and sense user movement. Others primarily detect position, such as proximity to another device.
Device Orientation - Relative orientation to the user, or absolute orientation relative to the ground; either incidental or natural gestures. For example, when rotated screen down, and no other movement is detected, the device may be set to "meeting mode" and all ringers are silenced; as the screen is the "face" of the device, face-down means it is hidden or silenced. (A detection sketch appears at the end of this section.)
Device Gesturing - Deliberate gestures, moving the device through space in a specific manner. These are usually related to natural gestures, but may require some arbitrary mapping to behaviors, and so may require learning. An example is the relatively common behavior of shaking to clear, reset or delete items. (See the sketch at the end of this section.)
User Movement - Incidental movement of the user is detected and processed to divine the user's current behavioral context. For example, gait can be detected whether the device is in the hand or in a pocket; the difference between walking, running, and sometimes even being restless vs. working (e.g. at a desk) can be detected. Vibration generally cannot be detected by accelerometers, so riding in vehicles must be detected by location sensors. Other user movements include swinging the arm, such as when moving a handset towards the user's head.
User Proximity - Proximity detectors, whether cameras, acoustical or infra-red, can be used to detect how close the device is to a passive object, such as the user's head. A common example is for touch-centric devices to lock the screen as the handset approaches the user's ear, so that talking on the handset does not cause accidental input. Moving away from the user is also detected, so the handset unlocks without direct user input.
Other Device Proximity - Non-passive devices, such as other mobiles or NFC card readers, can be detected by the use of close-range radio transmitters. As the device approaches, it can open the appropriate application -- sharing or banking -- without user interaction. The user will generally have to complete the action, to confirm that it was not accidental, spurious or malicious.
Proximity to signals or other environmental conditions more than a few inches away is not widely used as yet. When radio, audio or visual cues become commonly available, they are likely to be used in a contextually related manner.
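As an illustration of the Device Orientation and Device Gesturing variations above, the sketch below shows one plausible shape for face-down and shake detection (plain Kotlin; the axis convention, thresholds and window size are assumptions, and the Sample type is redefined so the sketch stands alone):

import kotlin.math.abs

// Redefined here so this sketch stands alone; see the earlier sketch.
data class Sample(val x: Double, val y: Double, val z: Double)

const val GRAVITY = 9.81

// Device Orientation: face-down and still. Lying flat and screen-down,
// gravity registers almost entirely on the z axis, with a negative sign
// (axis convention assumed; real devices vary).
fun isFaceDownAndStill(s: Sample): Boolean =
    s.z < -(GRAVITY - 1.0) && abs(s.x) < 1.0 && abs(s.y) < 1.0

// Device Gesturing: a crude shake detector. Counts high-acceleration
// direction reversals along one axis within a window of recent samples.
fun looksLikeShake(window: List<Sample>, threshold: Double = 15.0): Boolean {
    var reversals = 0
    for (i in 1 until window.size) {
        val prev = window[i - 1].x
        val cur = window[i].x
        if (abs(cur) > threshold && prev * cur < 0) reversals++
    }
    return reversals >= 3 // several rapid reversals suggest a deliberate shake
}

fun main() {
    println(isFaceDownAndStill(Sample(0.2, 0.1, -9.7))) // true: "meeting mode"
    val shake = listOf(
        Sample(18.0, 0.0, 9.8), Sample(-17.0, 0.0, 9.8),
        Sample(16.0, 0.0, 9.8), Sample(-19.0, 0.0, 9.8)
    )
    println(looksLikeShake(shake)) // true
}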
Interaction Details
The core of designing interactions for Kinesthetic Gestures is in determining what gestures and sensors are to be used: not just a single gesture, but which of them, in combination, can most accurately and reliably detect the expected condition. For example, it is a good idea to lock the keypad from input when the handset is placed to the ear during a call. Any one sensor alone would make this unreliable or unusable, but by combining accelerometers (the device moves towards the head) with proximity sensors or cameras (at the appropriate point in the movement, an object of head size approaches), it can be made extremely reliable.
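A minimal sketch of that fusion rule, assuming hypothetical boolean inputs already derived from each sensor (not any real platform's API), might look like:

// Hypothetical fused inputs; in practice each comes from a platform sensor API.
data class SensorState(
    val risingTowardHead: Boolean, // accelerometer: arc toward the head detected
    val objectVeryNear: Boolean,   // proximity sensor/camera: head-sized object close
    val inCall: Boolean            // telephony state
)

// Lock only when all signals agree, so no single noisy sensor triggers it.
fun shouldLock(s: SensorState): Boolean =
    s.inCall && s.risingTowardHead && s.objectVeryNear

// Unlock as soon as proximity clears: the gesture is reversed immediately.
fun shouldUnlock(s: SensorState): Boolean = !s.objectVeryNear

fun main() {
    val atEar = SensorState(risingTowardHead = true, objectVeryNear = true, inCall = true)
    println(shouldLock(atEar))                                // true
    println(shouldUnlock(atEar.copy(objectVeryNear = false))) // true
}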
As a general rule, gestures should be reversible. Removing the phone from the ear must also be sensed, reliably and quickly enough that the device unlocks immediately, so there is no delay in its use.
Some indication of the gesture must be presented visually. Very often this is not explicit (such as an icon or overlay), but is a mode switch at a higher level. When a device is locked due to being placed face-down, or near the ear, the screen is also blanked. Aside from saving power when there is no way to read the screen anyway, this signals that input is impossible in the event that the device does not unlock immediately.
Non-gestural methods must be available to remove or reverse state-changing gestures. For example, if sensors do not unlock the device in the examples above, the conventional unlock method (e.g. power/lock button) must be enabled. Generally, this means that Kinesthetic Gestures should simply be shortcuts to existing features.
Almost all kinesthetic gestures are combination gestures; see Device Orientation in the Variations above. Combinations can also include sensors outside this pattern, such as GPS: when you walk to a car, the accelerometer detects walking, then GPS determines you are moving at vehicle speeds. Such combinations are not momentary but unfold over time, so the device should monitor its sensors continuously -- though this raises privacy concerns that must be considered.
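One way to picture this over-time combination is as a small state machine fed by periodic observations; the sketch below is purely illustrative (the states, speed threshold and observation fields are assumptions):

enum class MobilityState { STATIONARY, WALKING, IN_VEHICLE }

// Hypothetical per-interval inputs derived from the accelerometer and GPS.
data class Observation(val stepDetected: Boolean, val speedMps: Double)

// Map each observation to a mobility context.
fun nextState(obs: Observation): MobilityState = when {
    obs.speedMps > 6.0 -> MobilityState.IN_VEHICLE // faster than walking or running
    obs.stepDetected   -> MobilityState.WALKING
    else               -> MobilityState.STATIONARY
}

fun main() {
    val day = listOf(
        Observation(stepDetected = true, speedMps = 1.4),  // accelerometer: walking
        Observation(stepDetected = false, speedMps = 0.0), // pausing at the car
        Observation(stepDetected = false, speedMps = 15.0) // GPS: vehicle speed
    )
    day.forEach { println(nextState(it)) } // WALKING, STATIONARY, IN_VEHICLE
}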
Kinesthetic gestures are not as ubiquitous as On-Screen Gestures, so may be unexpected to some users. Settings should be provided to disable (or enable) certain behaviors.
Presentation Details
When reacting to a proximate other device, present the relationship on screen when possible. When the position of the adjacent device is known, display related items on the side of the screen nearest to that device.
Learning must also be supported: use help documentation and walkthroughs, and explain gesture behaviors in settings menus so that users who wish to customize them can discover them.
Antipatterns