Helpful Knowledge for This Section
Before you dive into each pattern chapter, we like to provide some extra knowledge in these section introductions. This extra knowledge comes from whichever multi-disciplinary areas we feel are relevant: human factors, engineering, psychology, art, and so on. This section provides background knowledge in the following areas:
General touch interaction guidelines
Understanding brightness, luminance, and contrast
How our hearing works
General Touch Interaction Guidelines
The minimum area for touch activation, to address the general population, is a square 3/8 inch (10 mm) on each side. When possible, use larger target areas, and make important targets larger than others. There is no distinct preference for vertical or horizontal finger touch areas; all touch can be assumed to be a circle, though the actual input item may be shaped as needed to fit the space or to express a preconceived notion (e.g., a button). Children who can use devices unassisted have smaller fingers, but their reduced precision and poor control of pressure mean the same minimum touch target size applies.
[Figure: Minimum area for touch activation. Do not rely on pixel sizes; pixel sizes vary based on device and are not a consistent unit of measure.]
Targets
The visual target is not always the same as the touch area; however, the touch area may never be smaller than the visual target. When practical (i.e., there is no adjacent interactive item), the touch area should be notably larger than the visual target, filling the "gutter" or white space between objects. Some dead space should often be provided so edge contact does not result in improper input. In the example, the orange dotted line is the touch area. It is notably larger than the visual target, so a missed touch (as shown) still functions as expected.
[Figure: Visual target compared to the touch area. The touch area should never be smaller than the visual target.]
Touch Area and the Centroid of Contact
The point activated by a touch on capacitive touch devices is the centroid of the touched area, the area where the user's finger is flat against the screen. The centroid is the center of the contact area: its coordinates are the average (arithmetic mean) of the coordinates of all the points of the shape. It may be sensed directly (the highest change in local capacitance, for projected-capacitive screens) or calculated (the center of the obscured area, for beam sensors). Due to parallax, the user will typically perceive a larger area to be touched, though advanced users may become aware of the centroid phenomenon and come to expect it.
[Figure: The centroid area compared to the area touched. Due to screen parallax, we typically perceive a larger area available to touch.]
Bezels, Edges, and Size Cheats
Buttons at the edges of screens with flat bezels may take advantage of the bezel to use smaller target sizes. The user may place their finger so that part of the touch falls on the bezel, off the sensing area of the screen. This effectively reduces the size of the finger and allows smaller input areas. The target may be reduced only to about 60% of normal (no smaller than 0.225 in, or 6 mm), and only in the dimension with the edge condition. In practice, this is most useful for giving high-priority items a large effective target without increasing the apparent or on-screen size of the target or touch area.
[Figure: By using the space provided on the screen bezel, the actual target size can be slightly reduced.]
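The centroid calculation described above (the arithmetic mean of all contact-point coordinates) and the minimum-size rules can be sketched in a few lines. This is an illustrative sketch only, not code from any real touch-screen driver or toolkit; the function names and the rectangular-target check are assumptions made for the example.

```python
# Illustrative sketch: the centroid of a touch contact is the arithmetic
# mean of the contact points, and targets are checked against the 10 mm
# minimum (6 mm only in the dimension that borders a flat bezel edge).
# All names here are hypothetical, invented for this example.

def centroid(points):
    """Return (mean x, mean y) for a touched region given as (x, y) points."""
    n = len(points)
    return (sum(x for x, _ in points) / n,
            sum(y for _, y in points) / n)

MIN_TARGET_MM = 10.0        # 3/8 in square, general population
MIN_EDGE_TARGET_MM = 6.0    # about 60% of normal, bezel edge only

def target_is_large_enough(width_mm, height_mm, at_bezel_edge=False):
    """Check a rectangular target; the reduced minimum applies only in
    the dimension with the edge condition, so the other dimension must
    still meet the full minimum."""
    if at_bezel_edge:
        return (min(width_mm, height_mm) >= MIN_EDGE_TARGET_MM
                and max(width_mm, height_mm) >= MIN_TARGET_MM)
    return width_mm >= MIN_TARGET_MM and height_mm >= MIN_TARGET_MM

print(centroid([(0, 0), (2, 0), (2, 2), (0, 2)]))  # (1.0, 1.0)
```

For instance, a 6 mm by 10 mm button passes only when one of its edges sits against the bezel, consistent with the size cheat described above.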
The ways people prefer to interact with their devices depend heavily on their natural tendencies, comfort levels, and context of use. As designers and developers, we need to understand these influences and offer user interfaces that meet these needs.
Preferences vary widely: some users prefer to input data with physical keys, natural handwriting, or other gestural behaviors, while others prefer to receive information eyes-off, relying on haptics or audible notifications.
This section, Input & Output, will discuss in detail the different mobile methods and controls users can use to access and receive information.
The types of input and output that will be discussed here are subdivided into the following chapters:
Chapter 9, Text and Character Input
Chapter 10, General Interactive Controls
Chapter 11, Input and Selection
Chapter 12, Audio and Vibration
Chapter 13, Screens, Lights and Sensors
Types of Input & Output
Text & Character Input
Whether they are sending an email or SMS, searching, or filling out forms, users require ways to input text and characters. Methods include keyboards and keypads, hardware keys, touch screens, and pen-based writing. Regardless of the method, each must allow rapid input while reducing input errors and providing means of correction. This chapter will explain research-based frameworks, tactical examples, and descriptive mobile patterns to use for text and character input, including Keyboards & Keypads, Pen Input, Mode Switches, Input Method Indicator, and Autocomplete & Prediction.
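As a small illustration of how prediction can speed entry and reduce errors on constrained keyboards, here is a minimal, hypothetical sketch of prefix-based word suggestion. The word list and function name are invented for the example; real predictive engines also weigh word frequency and context.

```python
# Hypothetical sketch of word prediction: suggest dictionary words
# matching the prefix typed so far, so the user can pick a completion
# instead of typing (and possibly mistyping) the whole word.

WORDS = ["mobile", "mode", "model", "modem", "input", "interface"]

def suggest(prefix, limit=3):
    """Return up to `limit` dictionary words starting with `prefix`."""
    return [w for w in WORDS if w.startswith(prefix)][:limit]

print(suggest("mod"))  # ['mode', 'model', 'modem']
```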
General Interactive Controls
Functions on the device, and in the interface, are operated through a variety of controls. These may be keys arrayed around the periphery of the device, or gestural behaviors. Users must be able to find, understand, and easily learn these control types. This chapter will explain research-based frameworks, tactical examples, and descriptive mobile patterns to use for general interactive controls, including Directional Entry, Press-and-hold, Focus & Cursors, Other Hardware Keys, Accesskeys, Dialer, On-screen Gestures, Kinesthetic Gestures, and Remote Gestures.
Input & Selection
Users require methods to enter and remove text and other character-based information without restriction. Much of the time they are filling out forms or selecting information from lists, and at any moment they may need to make quick, easy changes, clearing contents from individual fields or entire forms. This chapter will explain research-based frameworks, tactical examples, and descriptive mobile patterns to use for input and selection, including Input Areas, Form Selections, Spinners & Tapes, and Clear Entry.
Audio & Vibration
Our mobile devices are not always in plain sight. They may be across the room, or buried deep in our pockets. When important notifications occur, users still need to be alerted, and audio and vibration can be very effective as notifiers and forms of feedback. This chapter will explain research-based frameworks, tactical examples, and descriptive mobile patterns to use for audio and vibration, including Tones, Voice Input, Voice Readback, Voice Notifications, and Haptic Output.
Screens, Lights & Sensors
Mobile devices today are equipped with a range of technologies meant to improve our interactive experiences, from advanced display technology that improves viewability while offering better battery life, to location-based services integrated with other applications. This chapter will explain research-based frameworks, tactical examples, and descriptive mobile patterns to use for screens, lights, and sensors, including LED, Display Brightness Controls, Orientation, and Location.
Getting Started
You now have a general sense of the types of input and output discussed in this part. The following chapters will provide specific information on theory and tactics, and illustrate appropriate design patterns you can apply to specific situations in the mobile space.