Differences between revisions 1 and 24 (spanning 23 versions)
Revision 1 as of 2011-02-21 10:47:39
Size: 716
Editor: shoobe01
Comment:
Revision 24 as of 2011-07-26 12:01:58
Size: 4809
Editor: eberkman
Comment:
The varying ways people prefer to interact with their devices are highly dependent upon their natural tendencies, comfort levels, and the context of use. As designers and developers, we need to understand these influences and offer user interfaces that appeal to these needs.
User preferences may range from inputting data with physical keys to natural handwriting or other gestural behaviors. Some users may prefer to receive information with an eyes-off approach, relying on haptics or audible notifications.
This section, Input & Output, will discuss in detail the different mobile methods and controls users can interact with to access and receive information.

The types of input and output that will be discussed here are subdivided into the following chapters:
 * Chapter 10, [[Text & Character Input]]
 * Chapter 11, [[General Interactive Controls]]
 * Chapter 12, [[Input & Selection]]
 * Chapter 13, [[Audio & Vibration]]
 * Chapter 14, [[Screens, Lights & Sensors]]

== Types of Input & Output ==
=== Text & Character Input ===
Whether sending an email or SMS, searching, or filling out forms, users require ways to input both text and characters. Input methods include keyboards and keypads, hardware keys, touch screens, and pen-based writing. Regardless of the method, each must allow rapid input while reducing input errors and providing methods of correction. This chapter will explain research-based frameworks, tactical examples, and descriptive mobile patterns to use for text and character input.

This chapter will discuss the following patterns:
 * [[Keyboards & Keypads]]
 * [[Pen Input]]
 * [[Mode Switches]]
 * [[Input Method Indicator]]
 * [[Autocomplete & Prediction]]
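
The core of an autocomplete pattern such as the one listed above can be illustrated with a minimal, platform-neutral sketch. The word list, the ranking by length, and the `limit` parameter are all illustrative assumptions; real predictive keyboards rank candidates by usage frequency and context instead.

```python
def autocomplete(prefix, vocabulary, limit=3):
    """Return up to `limit` vocabulary words starting with `prefix`."""
    prefix = prefix.lower()
    matches = [w for w in vocabulary if w.lower().startswith(prefix)]
    # Rank shorter completions first as a stand-in for frequency ranking.
    return sorted(matches, key=len)[:limit]

words = ["send", "sending", "sent", "search", "select"]
print(autocomplete("se", words))  # ['send', 'sent', 'search']
```

Offering only a few ranked candidates, rather than every match, keeps the suggestion strip scannable on a small screen.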

=== General Interactive Controls ===
Functions on the device, and in the interface, are operated through a series of controls. These may be keys arrayed around the periphery of the device, or gestural behaviors. Users must be able to find, understand, and easily learn these control types. This chapter will explain research-based frameworks, tactical examples, and descriptive mobile patterns to use for general interactive controls.

This chapter will discuss the following patterns:

 * [[Directional Entry]]
 * [[Press-and-hold]]
 * [[Focus & Cursors]]
 * [[Other Hardware Keys]]
 * [[Accesskeys]]
 * [[Dialer]]
 * [[On-screen Gestures]]
 * [[Kinesthetic Gestures]]
 * [[Remote Gestures]]
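
As a rough illustration of how an on-screen gesture recognizer separates a swipe from a tap, the sketch below classifies a stroke by its start and end touch points. The 30-pixel threshold and the four-direction model are simplifying assumptions, not values taken from any platform toolkit.

```python
def classify_swipe(start, end, min_distance=30):
    """Classify a touch stroke as a swipe direction, or None for a tap."""
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if max(abs(dx), abs(dy)) < min_distance:
        return None  # too short to count as a swipe
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"  # screen y grows downward

print(classify_swipe((10, 100), (200, 110)))  # right
```

A minimum distance of this kind is what lets the same surface serve both taps and swipes without misfires.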

=== Input & Selection ===
Users require methods to enter and remove text and other character-based information without restriction. Often they are filling out forms or selecting information from lists, and at any time they may need to make quick, easy changes to clear the contents of these fields or entire forms. This chapter will explain research-based frameworks, tactical examples, and descriptive mobile patterns to use for input and selection.

This chapter will discuss the following patterns:

 * [[Input Areas]]
 * [[Form Selections]]
 * [[Spinners & Tapes]]
 * [[Clear Entry]]
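
The selection behavior behind a spinner can be sketched in a few lines: the value wraps around the option list so the user can scroll past either end. This is a platform-neutral illustration; the month list is just example data.

```python
def spin(options, index, steps):
    """Advance a spinner by `steps` positions, wrapping around the list."""
    return options[(index + steps) % len(options)]

months = ["Jan", "Feb", "Mar", "Apr"]
print(spin(months, 3, 1))  # wraps from Apr back to Jan
```

Wrapping matters on mobile because it halves the worst-case distance to any value when scrolling a long list such as minutes or years.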

=== Audio & Vibration ===
Our mobile devices are not always in plain sight. They may be across the room, or buried deep in our pockets. When important notifications occur, users need to be alerted. Using audio and vibration as notifiers and forms of feedback can be very effective. This chapter will explain research-based frameworks, tactical examples, and descriptive mobile patterns to use for audio and vibration.

This chapter will discuss the following patterns:

 * [[Tones]]
 * [[Voice Input]]
 * [[Voice Readback]]
 * [[Voice Notifications]]
 * [[Haptic Output]]
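
One way to think about these patterns together is as a channel-selection decision: given the device state, which outputs should a notification use? The sketch below is an assumed model, not any platform's API; the mode names and channel lists are illustrative.

```python
def notification_outputs(ringer_mode, urgent=False):
    """Pick output channels for a notification given the device state."""
    if ringer_mode == "silent":
        # Even urgent alerts stay visual-only when the user chose silence.
        return ["led"]
    if ringer_mode == "vibrate":
        return ["vibrate", "led"]
    outputs = ["tone", "led"]
    if urgent:
        outputs.insert(1, "vibrate")  # reinforce the tone haptically
    return outputs

print(notification_outputs("normal", urgent=True))  # ['tone', 'vibrate', 'led']
```

The key design point is respecting the user's stated mode: urgency may add channels within a mode, but should not override silence.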

=== Screens, Lights & Sensors ===
Mobile devices today are equipped with a range of technologies meant to improve our interactive experiences. They may use advanced display technology to improve viewability while offering better battery life, and incorporate location-based services integrated with other applications. This chapter will explain research-based frameworks, tactical examples, and descriptive mobile patterns to use for screens, lights, and sensors.

This chapter will discuss the following patterns:

 * [[LED]]
 * [[Display Brightness Controls]]
 * [[Orientation]]
 * [[Location]]
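
A concrete example of a sensor driving the display is automatic brightness: an ambient-light reading is mapped to a backlight level. The logarithmic mapping and the 10,000-lux ceiling below are assumptions for illustration (human brightness perception is roughly logarithmic, and 10,000 lux is on the order of full daylight), not constants from any device.

```python
import math

def auto_brightness(ambient_lux, floor=0.1):
    """Map an ambient-light reading (lux) to a display brightness in 0..1."""
    # Log scaling gives smoother perceived steps than a linear ramp.
    level = math.log10(max(ambient_lux, 1)) / 4.0  # 10,000 lux -> 1.0
    return min(1.0, max(floor, level))

print(auto_brightness(100))  # 0.5
```

A nonzero floor keeps the screen readable in darkness instead of dimming it to black.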

== Helpful knowledge for this section ==
Before you dive into each pattern chapter, we would like to provide some extra background in these section introductions. This background draws on multi-disciplinary areas of human factors, engineering, psychology, art, or whatever else we feel is relevant.

This section will provide background knowledge for you in the following areas:

 * General Interactive Touch Guidelines.
 * Understanding Brightness, Luminance, and Contrast.
 * How our hearing works.
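
A small piece of arithmetic underlies the touch guidelines above: a touch target is specified as a physical size, then converted to pixels for a particular display density. The 9 mm target and 326 ppi density below are illustrative values (roughly a fingertip-sized target, often cited in the 9-10 mm range), not fixed requirements.

```python
def touch_target_px(target_mm, pixels_per_inch):
    """Convert a physical touch-target size in millimetres to pixels."""
    return round(target_mm / 25.4 * pixels_per_inch)

print(touch_target_px(9, 326))  # ~116 px on a 326 ppi display
```

Working in physical units first is what keeps targets finger-sized across displays of very different densities.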

Input and Output (last edited 2011-12-13 17:03:09 by shoobe01)