Publication number: US 2007/0152983 A1
Publication type: Application
Application number: US 11/591,752
Publication date: Jul 5, 2007
Filing date: Nov 1, 2006
Priority date: Dec 30, 2005
Also published as: DE202006020451U1, EP1971909A1, US20140192001, US20150212608, WO2007078477A1
Inventors: Chris McKillop, Andrew Grignon, Bas Ording
Original Assignee: Apple Computer, Inc.
Touch pad with symbols based on mode
US 20070152983 A1
Abstract
A multifunctional handheld device capable of operating in different modes includes a single input arrangement that provides inputs for each mode of the multifunctional handheld device. The single input arrangement includes at least an input pad that provides signals when touched or pressed. The input pad may for example be a touch pad. The input pad is divided into one or more input areas that change in accordance with the current mode of the multifunctional handheld device. The multifunctional handheld device also includes a display device that presents graphical elements to indicate the configuration of the input areas at the input pad. Each mode of the multifunctional handheld device provides a different configuration of input areas and graphical elements associated therewith.
Images (13)
Claims(41)
1. A multifunctional handheld device capable of operating in different modes, the multifunctional handheld device comprising:
a single input arrangement that provides inputs for each mode of the multifunctional handheld device, the single input arrangement including at least an input pad that provides signals when touched or pressed, the input pad being divided into one or more input areas that change in accordance with the current mode of the multifunctional handheld device; and
a display mechanism that presents graphical elements to indicate the configuration of the input areas at the input pad, each mode of the multifunctional handheld device providing a different configuration of input areas and graphical elements associated therewith.
2. The multifunctional handheld device as recited in claim 1 wherein the display mechanism further provides visual feedback indicative of which input areas are being touched or pressed.
3. The multifunctional handheld device as recited in claim 1 wherein the display mechanism is integrated with the input pad.
4. The multifunctional handheld device as recited in claim 3 wherein the display mechanism is hidden from view underneath a top surface of the input pad, and wherein the graphical elements are presented at the top surface of the input pad.
5. The multifunctional handheld device as recited in claim 3 further comprising a second display device that provides graphical information associated with each mode of the multifunctional handheld device.
6. The multifunctional handheld device as recited in claim 3 wherein the display mechanism is a graphics generator that includes one or more light sources for generating light, and one or more graphics layers having features for creating symbols from the generated light.
7. The multifunctional handheld device as recited in claim 3 wherein the display mechanism is not a liquid crystal display (LCD).
8. The multifunctional handheld device as recited in claim 1 wherein the display mechanism is distinct from the input pad.
9. The multifunctional handheld device as recited in claim 1 wherein the input pad is circular, the circular input pad is divided into a different set of angularly segmented input areas for each mode of the multifunctional handheld device, and the graphical elements are displayed in a circular fashion representative of the angularly segmented input areas for each mode of the multifunctional handheld device.
10. The multifunctional handheld device as recited in claim 1 wherein the multifunctional handheld device operates in at least a phone mode and a media player mode.
11. The multifunctional handheld device as recited in claim 1 wherein the input pad is a touch pad that provides one or more touch signals when touched, a clickable pad that provides one or more button signals when pressed, or a clickable touch pad that provides one or more button signals when pressed and one or more touch signals when touched.
12. The multifunctional handheld device as recited in claim 10 wherein the input arrangement further comprises a clickable button that is integrated with the input pad, the clickable button providing a button signal when pressed.
13. A touch pad that displays graphical elements to indicate input areas of the touch pad, each input area representing a different functionality, the input areas and graphical elements changing in accordance with different input modes.
14. The touch pad as recited in claim 13 comprising:
a touch sensing layer capable of being divided into one or more input areas, the layout and functionality of the input areas being based on a current input mode; and
a graphics generator integrated with the touch sensing layer, the graphics generator presenting graphical elements at each of the input areas, the graphical elements indicating the location and functionality of the input areas.
15. The touch pad as recited in claim 14 wherein the touch sensing layer is optically transmissive, and wherein the graphics generator is disposed below the optically transmissive touch sensing layer.
16. The touch pad as recited in claim 14 wherein the touch sensing layer is disposed below the graphics generator.
17. The touch pad as recited in claim 14 wherein the graphics generator comprises:
one or more light sources for generating light; and
one or more graphics layers having features for creating symbols from the generated light.
18. The touch pad as recited in claim 17 wherein the features are masking elements.
19. The touch pad as recited in claim 17 wherein the features are light excitable elements.
20. The touch pad as recited in claim 13 further comprising a cover disposed over the touch sensing layer and graphics generator, the cover acting as a light diffuser and hiding the under layers from view.
21. The touch pad as recited in claim 13 further comprising a light panel for producing visual effects separately or together with the graphics generator.
22. The touch pad as recited in claim 21 wherein the light panel is capable of highlighting the graphical elements generated via the graphics generator.
23. A touch pad, comprising:
a touch sensing layer;
a first set of symbols that only illuminate with a first light;
a second set of symbols that only illuminate with a second light; and
a light system capable of generating the first and second light.
24. The touch pad as recited in claim 23 wherein the symbols are embodied as light excitable elements configured to absorb and reemit the generated light, the light excitable elements associated with the first set of symbols being sensitive to a first wavelength of light, the light excitable elements associated with the second set of symbols being sensitive to a second wavelength of light, the light system being configured to generate light having the first and second wavelengths.
25. The touch pad as recited in claim 24 wherein the light system includes a first light source capable of generating light of a first wavelength, and a second light source capable of generating light of a second wavelength.
26. The touch pad as recited in claim 24 wherein the light excitable elements are formed from a photoluminescence material.
27. A circular touch pad, comprising:
a circular light diffusing cover;
a circular transparent touch sensing layer disposed below the light diffusing cover;
a circular organic light emitting device (OLED) disposed below the transparent touch sensing layer; and
a printed circuit board disposed below the organic light emitting device (OLED), the printed circuit board carrying a controller that is operatively coupled to the transparent touch sensing layer and the organic light emitting device, the controller receiving touch data from the transparent touch sensing layer and instructing the organic light emitting device (OLED) how to present graphical information.
28. The touch pad as recited in claim 27 wherein the graphical information is based in part on the touch data.
29. The touch pad as recited in claim 27 wherein the graphical information is based in part on a mode.
30. The touch pad as recited in claim 27 wherein the touch pad includes a central integrated button such that the light diffusing cover, transparent touch sensing layer, and organic light emitting device (OLED) are circularly annular to provide space for the central integrated button.
31. A method of operating a multifunctional hand held electronic device having a touch surface, comprising:
displaying symbols in a circular fashion, each symbol representing a different input to be made in the hand held electronic device;
mapping individual symbols being displayed to individual regions of the touch surface;
detecting a touch on the touch surface;
determining the region of the touch surface being touched;
highlighting only the symbol associated with the region of the touch surface being touched;
detecting a selection event; and
implementing the input associated with the symbol being highlighted when the selection event is initiated.
32. The method as recited in claim 31 wherein the touch surface is a circular touch pad, and wherein the symbols are displayed in a circular fashion about the circular touch pad.
33. The method as recited in claim 31 wherein the multifunctional handheld device includes a display, and wherein the symbols are displayed in a circular fashion about the display.
34. The method as recited in claim 31 further comprising:
determining a mode of the multifunctional handheld device;
displaying symbols in a circular fashion in accordance with the mode.
35. The method as recited in claim 34 wherein a first set of symbols are provided for a first mode of the multifunctional handheld device, and a second set of symbols are provided for a second mode of the multifunctional handheld device.
36. The method as recited in claim 35 wherein the first mode is a phone mode, and wherein the first set of symbols are symbols associated with phone operations, and wherein the second mode is a media player mode, and wherein the second set of symbols are symbols associated with media player operations.
37. A method of operating a handheld electronic device having a touch device, the method comprising:
designating input regions within a touch surface of the touch device, each input region representing a different location within the touch surface;
assigning symbols to the input regions, the symbols characterizing the functionality of the input regions; and
displaying the symbols associated with the input regions, the location of the symbols indicating the location of the input area within the touch surface.
38. The method as recited in claim 37 wherein the symbols are displayed on a display of the handheld electronic device.
39. The method as recited in claim 37 wherein the symbols are displayed on the touch surface.
40. The method as recited in claim 37 wherein the touch surface is circular, the input regions and symbols are placed at angular locations, the angular locations of the symbols matching the angular locations of their corresponding input region.
41. The method as recited in claim 37 wherein the input regions are designated and the symbols are assigned and displayed based on the mode of the handheld electronic device, each mode having a different set of input regions and symbols associated therewith.
Description
    CROSS REFERENCE TO RELATED APPLICATIONS
  • [0001]
    This application claims priority to Provisional Patent Application No. 60/755,656, entitled “TOUCH PAD WITH FEEDBACK”, filed Dec. 30, 2005, which is herein incorporated by reference.
  • [0002]
    This application is related to the following applications, all of which are herein incorporated by reference:
  • [0003]
    U.S. patent application Ser. No. 10/188,185, titled “TOUCH PAD FOR HANDHELD DEVICE”, filed Jul. 1, 2002;
  • [0004]
    U.S. patent application Ser. No. 10/722,948, titled “TOUCH PAD FOR HANDHELD DEVICE”, filed Nov. 25, 2003;
  • [0005]
    U.S. patent application Ser. No. 10/643,256, titled “MOVABLE TOUCH PAD WITH ADDED FUNCTIONALITY”, filed Aug. 18, 2003;
  • [0006]
    U.S. patent application Ser. No. 11/057,050, titled “DISPLAY ACTUATOR”, filed Feb. 11, 2005;
  • [0007]
    U.S. patent application Ser. No. 10/840,862, titled “MULTIPOINT TOUCH SCREEN”, filed May 6, 2004;
  • [0008]
    U.S. Patent Application No. 60/658,777, titled “MULTIFUNCTIONAL HAND HELD DEVICE”, filed Mar. 4, 2005;
  • [0009]
    U.S. patent application Ser. No. 11/115,539, titled “HANDHELD ELECTRONIC DEVICE WITH MULTIPLE TOUCH SENSING DEVICES”, filed Apr. 26, 2005;
  • [0010]
    U.S. patent application Ser. No. 11/394,493, titled “ILLUMINATED TOUCHPAD”, filed Mar. 31, 2006;
  • [0011]
    U.S. patent application Ser. No. 11/483,008, titled “CAPACITANCE SENSING ELECTRODE WITH INTEGRATED I/O MECHANISM”, filed Jul. 6, 2006; and
  • [0012]
    U.S. patent application Ser. No. 11/482,286, titled “MUTUAL CAPACITANCE TOUCH SENSING DEVICE”, filed Jul. 6, 2006.
  • BACKGROUND OF THE INVENTION
  • [0013]
    1. Field of the Invention
  • [0014]
    The present invention relates generally to touch pads that provide visual feedback. More particularly, the present invention relates to touch pads with symbols that adapt based on mode.
  • [0015]
    2. Description of the Related Art
  • [0016]
    There exist today many styles of input devices for performing operations in a computer system. The operations generally correspond to moving a cursor and/or making selections on a display screen. By way of example, the input devices may include buttons or keys, mice, trackballs, touch pads, joy sticks, touch screens and the like.
  • [0017]
    Touch pads, in particular, are becoming increasingly popular because of their ease and versatility of operation as well as their declining price. Touch pads allow a user to make selections and move a cursor by simply touching an input surface with a finger or stylus. In general, the touch pad recognizes the touch and its position on the input surface, and the computer system interprets the touch and thereafter performs an action based on the touch event.
  • [0018]
    Touch pads typically include an opaque touch panel, a controller and a software driver. The touch panel registers touch events and sends these signals to the controller. The controller processes these signals and sends the data to the computer system. The software driver translates the touch events into computer events.
  • [0019]
    Although touch pads work well, improvements to their form, feel, and functionality are desired. By way of example, it may be desirable to provide visual stimuli at the touch pad so that a user can better operate the touch pad. For example, the visual stimuli may be used (among other things) to alert a user when the touch pad is registering a touch, alert a user where the touch is occurring on the touch pad, provide feedback related to the touch event, and/or indicate the state of the touch pad.
  • SUMMARY OF THE INVENTION
  • [0020]
    The invention relates, in one embodiment, to a multifunctional handheld device capable of operating in different modes. The multifunctional handheld device includes a single input arrangement that provides inputs for each mode of the multifunctional handheld device. The single input arrangement includes at least an input pad that provides signals when touched or pressed. The input pad can be divided into one or more input areas that change in accordance with the current mode of the multifunctional handheld device. The multifunctional handheld device also includes a display mechanism that presents graphical elements to indicate the configuration of the input areas at the input pad. Each mode of the multifunctional handheld device provides a different configuration of input areas and graphical elements associated therewith.
  • [0021]
    The invention relates, in another embodiment, to a multifunctional handheld computing device capable of operating in different modes. The multifunctional computing device includes a touch device having a touch surface (e.g., touch pad). The multifunctional computing device also includes a means for presenting input identifiers that indicate the locations of the touch surface designated for actuating inputs associated with the input identifiers. The multifunctional computing device further includes a means for indicating which input area is ready for actuation.
  • [0022]
    The invention relates, in another embodiment, to a touch pad that displays graphical elements to indicate input areas of the touch pad. Each input area represents a different functionality. The input areas and graphical elements change in accordance with different input modes.
  • [0023]
    The invention relates, in another embodiment, to a touch pad. The touch pad includes a touch sensing layer. The touch pad also includes a first set of symbols that only illuminate with a first light. The touch pad further includes a second set of symbols that only illuminate with a second light. The touch pad additionally includes a light system capable of generating the first and second light.
  • [0024]
    The invention relates, in another embodiment, to a circular touch pad. The circular touch pad includes a circular light diffusing cover. The circular touch pad also includes a circular transparent touch sensing layer disposed below the light diffusing cover. The circular touch pad further includes a circular organic light emitting device (OLED) disposed below the transparent touch sensing layer. The circular touch pad additionally includes a printed circuit board disposed below the organic light emitting device (OLED). The printed circuit board carries a controller that is operatively coupled to the transparent touch sensing layer and the organic light emitting device. The controller receives touch data from the transparent touch sensing layer and instructs the organic light emitting device (OLED) how to present graphical information.
  • [0025]
    The invention relates, in another embodiment, to a method of operating a multifunctional hand held electronic device having a touch surface. The method includes displaying symbols in a circular fashion. Each symbol represents a different input to be made in the hand held electronic device. The method also includes mapping individual symbols being displayed to individual regions of the touch surface. The method further includes detecting a touch on the touch surface. The method additionally includes determining the region of the touch surface being touched. Moreover, the method includes highlighting only the symbol associated with the region of the touch surface being touched. The method also includes detecting a selection event and implementing the input associated with the symbol being highlighted when the selection event is initiated.
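As an illustration only (the patent claims a method, not any particular code), the steps above can be sketched as a small touch-handling routine. The function names, and the convention that region 0 starts at 12 o'clock with indices increasing clockwise, are assumptions for this sketch.

```python
import math

# Illustrative sketch of the claimed method: each displayed symbol is mapped
# to an angular region of a circular touch surface; a touch highlights only
# the symbol for its region, and a selection event (e.g. a press) implements
# the highlighted input.

def region_for_touch(x, y, num_regions):
    """Map a touch point (pad center at the origin) to a region index:
    region 0 starts at 12 o'clock, indices increase clockwise."""
    angle = math.degrees(math.atan2(x, y)) % 360.0
    return int(angle // (360.0 / num_regions))

def handle_touch(symbols, x, y, selection_event):
    """Return (highlighted_symbol, implemented_symbol_or_None)."""
    region = region_for_touch(x, y, len(symbols))
    highlighted = symbols[region]            # highlight only this symbol
    implemented = highlighted if selection_event else None
    return highlighted, implemented
```

With four media player segments, for example, a touch at the right edge of the pad falls in the second clockwise segment, and its symbol is highlighted until a selection event implements the input.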
  • [0026]
    The invention relates, in another embodiment, to a method of operating a handheld electronic device having a touch device. The method includes designating input regions within a touch surface of the touch device. Each input region represents a different location within the touch surface. The method also includes assigning symbols to the input regions. The symbols characterize the functionality of the input regions. The method further includes displaying the symbols associated with the input regions, the location of the symbols indicating the location of the input area within the touch surface.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0027]
    The invention may best be understood by reference to the following description taken in conjunction with the accompanying drawings in which:
  • [0028]
    FIG. 1 is a simplified diagram of a multifunctional hand held device, in accordance with one embodiment of the present invention.
  • [0029]
    FIG. 2 is a diagram of a phone mode user interface, in accordance with one embodiment of the present invention.
  • [0030]
    FIG. 3 is a diagram of a phone mode user interface, in accordance with one embodiment of the present invention.
  • [0031]
    FIG. 4 is a perspective diagram of a multifunctional handheld device, in accordance with one embodiment of the present invention.
  • [0032]
    FIG. 5 is a method of operating a multifunctional device having a plurality of modes, in accordance with one embodiment of the present invention.
  • [0033]
    FIG. 6 is a method of configuring a UI of a hand held device, in accordance with one embodiment of the present invention.
  • [0034]
    FIG. 7 is a method of activating a UI as for example at start up or when a mode is changed, in accordance with one embodiment of the present invention.
  • [0035]
    FIGS. 8A-8E illustrate one example of a handheld device with a keyless phone system, in accordance with one embodiment of the present invention.
  • [0036]
    FIGS. 9A-9E illustrate one example of a handheld device with a keyless phone system, in accordance with one embodiment of the present invention.
  • [0037]
    FIG. 10 is a simplified diagram of a touch pad, in accordance with one embodiment of the present invention.
  • [0038]
    FIG. 11 is a simplified diagram of a touch pad, in accordance with one embodiment of the present invention.
  • [0039]
    FIG. 12 is a diagram of a graphics generator, in accordance with one embodiment of the present invention.
  • [0040]
    FIG. 13 is a diagram of a graphics generator, in accordance with one embodiment of the present invention.
  • [0041]
    FIG. 14 is a diagram of a graphics generator, in accordance with one embodiment of the present invention.
  • [0042]
    FIG. 15 is a diagram of a graphics generator, in accordance with one embodiment of the present invention.
  • [0043]
    FIG. 16 is a diagram of a graphics generator, in accordance with one embodiment of the present invention.
  • [0044]
    FIG. 17 is a diagram of a graphics generator, in accordance with one embodiment of the present invention.
  • [0045]
    FIG. 18 is a diagram of a graphics generator, in accordance with one embodiment of the present invention.
  • [0046]
    FIG. 19 is a diagram of a graphics generator, in accordance with one embodiment of the present invention.
  • [0047]
    FIG. 20 is a diagram of a graphics generator, in accordance with one embodiment of the present invention.
  • [0048]
    FIG. 21 is a diagram of a graphics generator including a light panel, in accordance with one embodiment of the present invention.
  • [0049]
    FIG. 22 is a diagram of a graphics generator including a light panel, in accordance with one embodiment of the present invention.
  • [0050]
    FIG. 23 is a diagram of a graphics generator including a light panel, in accordance with one embodiment of the present invention.
  • [0051]
    FIG. 24 is a graphical layer which can be used in a phone mode, in accordance with one embodiment of the present invention.
  • [0052]
    FIG. 25 is a graphical layer which can be used in a phone mode, in accordance with one embodiment of the present invention.
  • [0053]
    FIG. 26 is a graphical layer which can be used in a phone mode, in accordance with one embodiment of the present invention.
  • [0054]
    FIG. 27 is a graphical layer which can be used in a music player mode, in accordance with one embodiment of the present invention.
  • [0055]
    FIG. 28 is a graphical layer which can be used in a music player mode, in accordance with one embodiment of the present invention.
  • [0056]
    FIG. 29 is a variation of the graphical layers given above, in accordance with one embodiment of the present invention.
  • [0057]
    FIG. 30 is a diagram of a touch pad assembly, in accordance with one embodiment of the present invention.
  • [0058]
    FIG. 31 is a diagram of a touch pad assembly, in accordance with one embodiment of the present invention.
  • [0059]
    FIG. 32 is a diagram of a touch pad assembly, in accordance with one embodiment of the present invention.
  • [0060]
    FIG. 33 is a diagram of a touch pad assembly, in accordance with one embodiment of the present invention.
  • [0061]
    FIG. 34 is a diagram of a touch pad assembly, in accordance with one embodiment of the present invention.
  • [0062]
    FIG. 35 is an exploded perspective diagram of a touch pad, in accordance with one embodiment of the present invention.
  • DETAILED DESCRIPTION OF THE INVENTION
  • [0063]
    Recently, the functionality of individual hand held devices has been converging into single hand held devices with multiple functions. For example, music player functionality has been added to cell phones and PDAs. While combining devices has advantages, it also creates design challenges. For one, each of these devices requires a different set of input devices, and thus it becomes a non-trivial problem to create an input layout that can support multifunctional devices (especially when the input devices are at fixed locations). Examples of multifunctional devices may be found in U.S. Provisional Patent Application No. 60/658,777, which is herein incorporated by reference.
  • [0064]
    The invention pertains to a user interface for controlling an electronic device, particularly a multifunctional electronic device that is capable of operating in multiple modes as for example a phone mode for communications and a media player mode for playing audio files, video files, and the like.
  • [0065]
    In accordance with one aspect of the invention, the user interface includes a configurable input region for navigating, making selections and initiating commands with respect to the electronic device. The input region is configured to adjust its input areas based on mode so that the inputs being provided match the current mode of the electronic device. The input region may be widely varied and may include a touch or proximity sensing area that generates signals for one or more of the operations mentioned above when an object is positioned over a sensing surface. The sensing area is typically mapped according to mode of the electronic device.
  • [0066]
    In accordance with another aspect of the invention, the user interface also includes a display mechanism for presenting input identifiers that indicate particular locations of the input region capable of actuating inputs associated with the input identifiers. Generally speaking, the display mechanism is utilized in order to replace fixed printed graphics or indicia on or near the input region and to allow the graphical information to change or adjust in accordance with a current input mode (e.g., the graphics or indicia can be reconfigured on the fly). As such, a single input region can be utilized for multiple modes of the electronic device. The display mechanism may also be used to provide feedback associated with inputting. For example, it may be used to indicate which input area is ready for actuation (e.g., highlight).
  • [0067]
    In one embodiment, the display mechanism is configured to present graphical information proximate the input region so that it can be seen when inputs are being performed at the input region. For example, the display mechanism may be located above, below or next to the input region. In another embodiment, the display mechanism is configured to present graphical information at the input region. For example, the display mechanism may be integrated with a sensing surface of the input region. In either case, the graphics or indicia typically follow or are mapped to the desired input layout of the input region. For example, the adjustable graphics or indicia are located at the same positions as their counterpart input areas of the input region. As such, physical fixed graphics and indicia can be removed from the input region without impairing its use (e.g., the user knows how to input based on the layout of the presented graphics and indicia).
  • [0068]
    Embodiments of the invention are discussed below with reference to FIGS. 1-35. However, those skilled in the art will readily appreciate that the detailed description given herein with respect to these figures is for explanatory purposes as the invention extends beyond these limited embodiments.
  • [0069]
    FIG. 1 is a simplified diagram of a multifunctional hand held device 10, in accordance with one embodiment of the present invention. The multifunctional device is capable of operating in different modes including, for example, a phone mode and a media player mode (e.g., audio, video, etc.). By way of example, in the phone mode, the handheld device operates like a phone. For example, a user is able to dial a phone number, receive and send phone calls, etc. In the media player mode, the handheld device operates like a media player. For example, a user is able to traverse through lists of songs or videos, select and play a song or video from those lists, etc.
  • [0070]
    In accordance with one embodiment, the multifunctional device 10 includes a single user interface 12, which is used to control the operations for each mode of the device. That is, the same UI 12 is used for multiple modes of the device 10. The user interface 12 generally includes a display region 14 and an input region 16. The location of these regions may be widely varied. In one embodiment, the display region and input region are disposed at the front surface of the multifunctional device for easy access and viewing while the device is being held in the user's hand.
  • [0071]
    The display region 14 allows the handheld electronic device 10 to interact with the user, for example, by displaying a graphical user interface (GUI) associated with each mode. The GUI provides an easy to use interface between a user of the handheld device and the operating system or applications running thereon. Generally speaking, the GUI represents programs, files and various selectable options with graphical images. The GUI can additionally or alternatively display information, such as non interactive text and graphics, for the user of the handheld electronic device. The display may also be used to display images or play video.
  • [0072]
    The input region 16 allows a user to interact with the hand held electronic device 10. For example, it allows a user to navigate, make selections and initiate commands into the handheld electronic device 10. In most cases, the input region 16 may be simplified so as not to clutter and confuse the user interface. For example, the input region 16 may not be complexly arranged and may include a limited number of individualized input mechanisms. In one implementation, the input region is a single integrated unit for performing a majority if not all of the inputting of the handheld electronic device (e.g., operates each mode).
  • [0073]
    In most cases, the input region 16 provides signals when touched and/or pressed. The signals generated at the input region 16 are configured to provide one or more control functions for controlling various applications associated with the hand held device 10. For example, the control functions may be used to move an object on the display, to make selections or issue commands associated with operating the various modes of the handheld device 10.
  • [0074]
    The shape of the input region 16 may be widely varied. By way of example, and not by way of limitation, the input region 16 may have a substantially rectangular, triangular, circular, square, oval, plus and/or pill shape (among others). In the illustrated embodiment, the input region has a circular shape. It is generally believed that circular input regions are easier to manipulate when operating handheld devices.
  • [0075]
    In accordance with one embodiment, the input region 16 is configurable based on mode. In this embodiment, the input region 16 is divided into one or more input areas 22 that change in accordance with the current mode of the handheld device 10. For example, each mode may divide the input region 16 into different input areas 22 and assign different functions thereto (e.g., each input area has a different task associated therewith based on mode).
  • [0076]
    The layout including shape and position of the input areas 22 within the input region 16 may be widely varied. The layout typically depends on the needs of each mode and the shape of the input region 16. By way of example, and not by way of limitation, the input areas 22 may have a substantially rectangular, triangular, circular, square, oval, plus, L, and/or pill shape (among others).
  • [0077]
    In the illustrated embodiment, the circular input region 16 is divided into angularly segmented input areas 22, with each segmented area 22 representing a different function. By way of example, in the case of a phone mode, the circular input region 16 may be divided into angularly segmented input areas 22 with each segmented area 22 representing a different key. For example, the input region 16 may include twelve individual areas 22 associated with 0-9, * and #. In the case of a media player mode, the circular input region may be divided into angularly segmented areas 22 with each segmented area 22 representing a different playback control. By way of example, the input region 16 may include four individual input areas 22 associated with standard music playback, including menu, play/pause, seek forward and seek back. The input region 16 may further include a central input area 22 contained within the outer segmented input areas 22.
  • [0078]
    It should be appreciated that the invention is not limited to circular shapes and layouts thereof. For example, a substantially rectangular input region with substantially rectangular, square or L-shaped input areas may be used. Furthermore, the circular input region may be divided into radially segmented input areas, either solely or in addition to the angularly segmented input areas.
  • [0079]
    The UI configuration described above can be widely varied. In one embodiment, the UI is embodied with a fullscreen display and a touch screen disposed over all or a portion of the fullscreen display. In this embodiment, the display region and input region are graphical elements displayed within the fullscreen display. The touch screen, which covers at least the graphical elements of the input region, provides the means for inputting when a user places his or her finger over the input region (e.g., a virtual input region). This arrangement may be further embodied as a display actuator that includes a movable component for initiating button signals. Examples of touch screens may be found in U.S. patent application Ser. No. 10/840,862, which is herein incorporated by reference.
  • [0080]
    In another embodiment, the UI is embodied with a physical display and a physical input pad that can be divided into various input areas based on mode. In this embodiment, the display defines the display region, and the input pad defines at least a portion if not all of the input region. The input pad may for example be a touch device that provides one or more touch signals when touched, a clickable or actuatable pad that provides one or more button signals when moved (e.g., pressed), or a clickable or actuatable touch device that provides one or more button signals when moved (e.g., pressed) and one or more touch signals when touched.
  • [0081]
    One or more touch or clickable buttons that provide button signals when touched or pressed may also be provided with the input pad. Although distinct from the input pad, the buttons may be integrated with the input pad. Furthermore, the buttons may be disposed outside the input pad, surround the input pad, and/or be disposed within the input pad. In one example, the central input area is embodied as a single central clickable button that is integrated with and disposed in the middle of the input pad. The buttons may also have assignable functions based on mode.
  • [0082]
    In one implementation, the input pad is a touch pad built into the housing of the handheld device. A touch pad is a touch sensing device with an extended, continuous touch sensing surface. The touch pad may be rigid or fixed, or it may be a movable actuator that provides button or clicking actions (e.g., a clickable or actuatable touch pad). Examples of touch pads may be found in U.S. patent application Ser. Nos. 10/188,182, 10/722,948, 10/643,256 and 11/483,008, which are herein incorporated by reference.
  • [0083]
    In another implementation, the input pad is not a touch pad but rather a touch sensitive portion of a housing of the handheld device. A touch sensitive housing is a housing that includes touch sensing components integrated therewith (rather than a touch pad built into the housing). Examples of touch sensitive housings may be found in U.S. patent application Ser. No. 11/115,539, which is herein incorporated by reference.
  • [0084]
    In another implementation, the input pad is a movable or clickable actuator that is built into the housing of the handheld device. The movable or clickable actuator typically moves to a plurality of different positions to create button signals. This arrangement may be referred to as a navigation pad. Each position can be assigned a different function based on mode.
  • [0085]
    In any of the embodiments described above, the display may be selected from flat panel devices, although this is not a requirement and other types of displays may be utilized. Flat panel devices typically provide a planar platform that is suitable for handheld devices. By way of example, the display may correspond to a liquid crystal display (LCD) such as a character LCD that is capable of presenting text and symbols, or a graphical LCD that is capable of presenting images, video and graphical user interfaces (GUI). Alternatively, the display may correspond to a display based on organic light emitting diodes (OLED), or a display based on electronic inks.
  • [0086]
    Because the input region 16 is used over multiple platforms (e.g., modes), the device further includes a means for displaying or presenting information indicative of how the input region 16 is to be used or set up in each mode as well as to provide feedback when inputs are made. The information may be in the form of symbols including for example icons and/or characters such as letters and numbers.
  • [0087]
    In one embodiment, the display region 14 is used to present this information. In this embodiment, the display region 14 displays graphical elements that indicate functions which can be implemented with the input region. The graphical elements may be in the form of symbols including, for example, icons and/or characters such as letters and numbers. In most cases, the graphical elements are laid out similarly to the various areas of the input region 16 so that a user knows the meaning of the input areas. That is, each graphical element is arranged in the same position as its counterpart input area in the input region 16. By way of example, in the case of a circular touch sensing area 18 that is divided into angularly segmented regions, the graphical elements may be arranged in a circular manner with each circularly positioned graphical element located at the angular position of the angularly segmented region it represents. Furthermore, if a button area 20 is disposed in the center of the touch sensing area 18, then an additional graphical element representative of the button area 20 may be displayed in the center of the circularly oriented graphical elements. Generally speaking, there is a one-to-one relationship between the graphical elements and the regions they represent.
  • [0088]
    Using a phone mode as an example, and referring to FIG. 2, the display 14 may present a circularly oriented number layout including for example 0-9, * and # positioned in a clocklike manner (e.g., 0 at 12 o'clock, 1 at 1 o'clock, and so on through 9 at 9 o'clock, with * at 10 o'clock and # at 11 o'clock). Furthermore, the input region 16, particularly the touch sensing area 18, may be segmented into twelve regions 22, each of which corresponds to the similarly positioned number in the circularly oriented number layout. As such, the user knows what region to press for what number by looking at the display 14 and touching the appropriate area of the touch sensing area 18.
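    The clock-style mapping described above can be sketched in code. The following Python is an illustrative sketch only, not part of the disclosure; the function name, coordinate convention (screen coordinates with y increasing downward) and center argument are assumptions. It converts an absolute touch position on the circular touch sensing area into the character of the angular segment beneath it.

```python
import math

# The twelve characters of the clocklike layout: '0' at 12 o'clock, then
# '1' through '9' advancing clockwise, with '*' at 10 and '#' at 11 o'clock.
CHARACTERS = ['0', '1', '2', '3', '4', '5', '6', '7', '8', '9', '*', '#']

def touch_to_character(x, y, cx, cy):
    """Hypothetical mapping of a touch at (x, y) to a character.

    (cx, cy) is the center of the circular touch sensing area.  Screen
    coordinates are assumed, so y grows downward and 12 o'clock is -y.
    """
    # atan2(dx, -dy) yields the clockwise angle measured from 12 o'clock.
    angle = math.degrees(math.atan2(x - cx, cy - y)) % 360.0
    # Each key occupies a 30-degree wedge centered on its clock position.
    segment = int((angle + 15.0) // 30.0) % 12
    return CHARACTERS[segment]
```

    For example, a touch straight above the center falls in the 12 o'clock wedge and maps to '0', while a touch directly to the right of the center falls in the 3 o'clock wedge and maps to '3'.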
  • [0089]
    In another embodiment, the input region 16 is used to present this information. In this embodiment, the input region 16 displays graphical elements that indicate functions which can be implemented with each region 22. As above, the graphical elements may be in the form of symbols including, for example, icons and/or characters such as letters and numbers. In most cases, the graphical elements are positioned within or at the appropriate input region 22 so that a user knows the meaning of the input region 22. By way of example, in the case of a circular touch sensing area that is divided into angularly segmented regions 22, individual graphical elements may be positioned in the angularly segmented region they represent. Furthermore, if a button area 20 is disposed in the center of the touch sensing area 18, then an additional graphical element representative of the button area 20 may be displayed in the center of the button area. Generally speaking, there is a one-to-one relationship between the graphical elements and the regions they represent.
  • [0090]
    Using a phone mode as an example, and referring to FIG. 3, the input region 16 may present a circularly oriented number layout including for example 0-9, * and # positioned in a clocklike manner (e.g., 0 at 12 o'clock, 1 at 1 o'clock, and so on through 9 at 9 o'clock, with * at 10 o'clock and # at 11 o'clock). Furthermore, the input region 16, particularly the touch sensing area 18, may be segmented into twelve regions 22, each of which corresponds to the similarly positioned character in the circularly oriented number layout. As such, the user knows what region to press for what number by looking at the input region 16 and touching the appropriate area of the touch sensing area 18.
  • [0091]
    FIG. 4 is a perspective diagram of a multifunctional handheld device 40, in accordance with one embodiment of the present invention. The multifunctional handheld device 40 is capable of being operated in various modes including, for example, a phone mode and a media player mode. By way of example, the multifunctional handheld device 40 may be a media player with added phone functionality. For example, the media player may be an iPod manufactured by Apple Computer, Inc. of Cupertino, Calif., with additional components for operating the media player as a phone.
  • [0092]
    The multifunctional handheld device 40 includes a display 42 and further a configurable input arrangement 44 consisting of a clickable, circular touch pad 46 and a central clickable button 48. The display 42 and configurable input arrangement 44 are used for substantially all modes of the multifunctional device 40. The display 42 presents graphical elements including a graphical user interface for each mode of the device 40. The configurable input arrangement 44 provides inputs for each mode of the device 40. Particularly, the touch pad 46 provides position signals when touched, and one or more button signals when pressed. The button 48 also provides a button signal when pressed. The signals generated by the various devices can be used to drive the modes in different ways.
  • [0093]
    In accordance with one embodiment, the configurable input arrangement changes its inputting including layout and functionality based on the current mode of the device. When a phone mode is active, for example, the configurable input arrangement is configured for phone inputting. By way of example, the touch pad may be divided into angular input areas that represent the keys of a phone. When a media player mode is active, on the other hand, the configurable input arrangement is configured for navigating and playing media. By way of example, the touch pad may be divided into angular input areas that represent various playback controls (e.g., menu, next, previous, and play/pause). In addition, the central clickable button may be used for making selections in both modes.
  • [0094]
    In accordance with another embodiment, the handheld device also includes a means for presenting input identifiers that indicate the locations and functionality of input areas of the touch pad. In one implementation, the input identifiers are presented on the display. Additionally or alternatively, the input identifiers may be presented at the surface of the touch pad and possibly the clickable button. Additionally or alternatively, the input identifiers may be presented at the surface of the housing around the touch pad. In all of these cases, the input identifiers are positioned in a circular fashion similarly to the input areas. The handheld device may further include a means for indicating which input area is ready for actuation/selection (e.g., highlighting). This indication may also be provided by the display, a touch pad surface and/or a housing surface around the touch pad.
  • [0095]
    As mentioned before, it is generally believed that circular input devices are easier to manipulate when operating handheld devices. This is especially true for circular touch pads as shown in FIG. 4. For example, one advantage of a circular touch pad is that the touch sensing area can be continuously actuated by a simple swirling motion of a finger, i.e., the finger can be rotated through 360 degrees of rotation without stopping. Another advantage of a circular touch pad is that the user can rotate his or her finger tangentially from all sides, thus giving it a greater range of finger positions. For example, a left handed user may choose to use one portion of the touch sensing area while a right handed user may choose to use another portion of the touch sensing area. Yet another advantage of a circular touch pad is that it allows an intuitive way to navigate a display screen. For example, in the case of scrolling, the user can manipulate his or her finger side to side for horizontal scrolling and backwards and forwards for vertical scrolling.
  • [0096]
    FIG. 5 is a method 100 of operating a multifunctional device having a plurality of modes, in accordance with one embodiment of the present invention. The method 100 begins at block 102 where a command is received to initiate a mode of the multifunctional device. The command can be generated by the device itself, as for example at start up, or by the user when the user desires to change from one mode to another. In the case of start up, either the mode that was current at shut down, a default mode, or a user preference start up mode is initiated. In the case of a user command, the mode selected by the user is initiated.
  • [0097]
    Once a command is received, the method 100 proceeds to block 104 where the UI is configured in accordance with the new mode. For example, the current UI associated with the current mode is deactivated and the new UI associated with the new mode is activated. By way of example, switching from a phone mode to a music player mode may include removing the input symbol layout associated with the phone mode from the display and/or the touch pad and presenting a new input symbol layout associated with the music player on the display and/or the touch pad. Activation may further include reassigning the regions of the touch pad and the functionality associated therewith.
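    By way of illustration, block 104 can be sketched as a table-driven reconfiguration. The Python below is a hypothetical sketch, not part of the disclosure; the mode names, the layout table and the `InputArrangement` class are all assumptions. Switching modes replaces the set of input areas and the functions assigned to them.

```python
# Hypothetical per-mode layouts: each mode carries its own symbol layout,
# and switching modes re-divides the touch pad into that mode's input areas.
MODE_LAYOUTS = {
    'phone': ['0', '1', '2', '3', '4', '5',
              '6', '7', '8', '9', '*', '#'],      # twelve angular keys
    'media_player': ['menu', 'seek_forward',
                     'play_pause', 'seek_back'],  # four playback controls
}

class InputArrangement:
    """Sketch of a configurable input arrangement shared by all modes."""

    def __init__(self, mode='phone'):
        self.set_mode(mode)

    def set_mode(self, mode):
        # Deactivate the current layout and activate the new one; at this
        # point the symbols on the display and/or touch pad would be swapped.
        self.mode = mode
        self.regions = MODE_LAYOUTS[mode]

    def function_for_region(self, index):
        # Each angular region carries the function assigned by the mode.
        return self.regions[index]
```

    For example, in the phone mode region 2 would map to the key '2', while after `set_mode('media_player')` the same arrangement maps region 0 to the menu control.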
  • [0098]
    FIG. 6 is a method 110 of configuring a UI of a hand held device, in accordance with one embodiment of the present invention. The method 110 includes block 112 where different functions are assigned to different regions of a touch pad based on the mode. In block 114, symbols associated with the different regions of the touch pad based on the new mode are presented. The symbols generally provide meaning to the regions. The symbols may for example be presented on the display and/or the touch pad. When presented on the display, the symbols are typically arranged similarly to the corresponding regions of the touch pad. When presented on the touch pad, the symbols are typically positioned at their corresponding region of the touch pad. The symbols may be presented with a transition effect such as fading in/out. In some cases, the new symbols fade in as the old symbols fade out. In block 116, the touch pad waits for touch events to be performed thereon.
  • [0099]
    FIG. 7 is a method 120 of activating a UI as for example at start up or when a mode is changed. The method 120 begins at block 122 where symbols are presented. The symbols may be presented on a display and/or a touch pad. The symbols are typically tied to different regions or locations of the touch pad.
  • [0100]
    Thereafter, in block 124, a determination is made as to whether a touch is detected at the touch pad. If a touch is not detected, the method 120 waits for a touch or possibly a new mode command.
  • [0101]
    If a touch is detected, the method 120 proceeds to block 126 where the absolute touch position associated with the touch is read from the touch pad. For example, the coordinates of the touch relative to the touch surface may be ascertained.
  • [0102]
    Thereafter, in block 128 the symbol associated with the touch position is highlighted. For example, the touch position may be mapped to a touch region, and the symbol associated with the touch region is highlighted.
  • [0103]
    Thereafter, in block 130, a determination is made as to whether or not a selection event has been performed. The determination may be based on the amount of pressure that is applied on the touch pad, i.e., whether or not the touch pad has been pressed (rather than just touched). This can be accomplished by analyzing the area of the touch (if the area of the touch increases then a press is being made). This can also be accomplished with actuators (sensors, switches) that sense pressure at the touch pad surface. In one implementation, the touch pad is a clickable touch pad that moves relative to a housing in order to provide a clicking action. When clicked, one or more tact switches are activated. An activated switch indicates a press and therefore a selection event.
  • [0104]
    Thereafter, in block 132, the input or function associated with the region where the symbol is highlighted when the selection event occurs is implemented. This may include referring to a table that maps a particular entry and symbol to a particular touch region, and thereafter entering and presenting the entry.
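    The flow of blocks 122-132 can be summarized in a short sketch. The Python below is illustrative only; the event-tuple format and the `region_at`/`region_map` callbacks are assumptions, not part of the disclosure. Touches highlight the region under the finger, and a press (the selection event of block 130) enters the highlighted region's entry.

```python
def process_events(events, region_at, region_map):
    """Hypothetical sketch of the FIG. 7 flow.

    events:     sequence of ('touch', x, y) or ('press',) tuples.
    region_at:  maps absolute touch coordinates to a region index (block 126).
    region_map: maps a region index to its assigned entry (block 132 table).
    Returns the list of entries made by selection events.
    """
    entered = []
    highlighted = None
    for event in events:
        if event[0] == 'touch':
            # Blocks 126-128: read the absolute position and highlight the
            # symbol of the region currently under the finger.
            highlighted = region_at(event[1], event[2])
        elif event[0] == 'press' and highlighted is not None:
            # Blocks 130-132: a press (rather than a mere touch) is a
            # selection event; implement the highlighted region's entry.
            entered.append(region_map[highlighted])
    return entered
```

    A press with no preceding touch is ignored, matching the method's requirement that a symbol be highlighted before a selection event is implemented.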
  • [0105]
    FIGS. 8A-8E illustrate one example of a handheld device with a keyless phone system. The handheld device 150 includes a display 152 and a circular touch wheel 154 with a button 156 disposed in the center. As shown in FIG. 8A, when placed in a phone mode, the display 152 is configured to present a virtual wheel 158 with phone characters 160, such as numbers, * and #, placed at locations around the wheel 158. The locations of the characters 160 correspond to regions of the touch wheel 154 that may be touched in order to enter the character.
  • [0106]
    As shown in FIG. 8B, when a touch is detected at the touch wheel 154, the character 160 assigned to the region of the touch wheel 154 where the touch is detected is highlighted. For example, if the user touches the touch wheel at 2 o'clock, the character 2 is highlighted. In the illustrated embodiment, the character 160 is highlighted with a circle member 162. In one implementation, the circle member 162 is black, and when highlighted by the black circle member 162 the character 160 is turned white. In another implementation, the circle member 162 is a semi-transparent overlay.
  • [0107]
    As shown in FIG. 8C, when the finger is moved to a new region of the touch wheel 154, a new character 160 is highlighted based on the new location of the touch. In cases where the finger stays in contact with the touch pad (sliding across), each character 160 between the starting position and the ending position is consecutively highlighted as the finger is moved over the various regions. The user therefore knows what region of the touch pad they are positioned on. In cases where the finger is picked up and moved to a new location, only the new touch location is highlighted.
  • [0108]
    As shown in FIG. 8D, when a finger performs a selection event, as for example by clicking or tapping the touch wheel, the highlighted character 160 is entered into the system and presented on the display 152 along with the virtual wheel 158. For example, a portion of the display may be dedicated to number entries (e.g., above or below the virtual wheel). Utilizing the steps shown in FIGS. 8B-8D, any number of characters can be entered and presented on the display.
  • [0109]
    As shown in FIG. 8E, once the desired group of numbers/characters 168 have been entered, a send command may be performed. For example, the center button 156 can be activated in order to generate a send command. The send command informs the handheld device 150 to call/transmit the group of numbers that were entered.
  • [0110]
    In some cases, the display may further present letters associated with the numbers. These may follow the same circular pattern discussed above, with the letters being displayed around the inner periphery and the numbers being displayed around the outer periphery. Alternatively, the display may include a letter region that displays the letters when the numbers are highlighted. This region may, for example, be found underneath the virtual wheel.
  • [0111]
    FIGS. 9A-9E illustrate one example of a handheld device with a keyless phone system. The handheld device 150 includes a display 152 and a circular touch wheel 154 with a button 156 disposed in the center. As shown in FIG. 9A, when placed in a phone mode, the touch wheel 154 is configured to present phone characters 160 such as numbers, * and # at different angular locations around the wheel 154.
  • [0112]
    As shown in FIG. 9B, when a touch is detected at the touch wheel 154, the character 160 assigned to the region of the touch wheel 154 where the touch is detected is highlighted. In one implementation, the entire segmented region is highlighted. In another implementation, the segmented region is surrounded by a highlight line. Furthermore, the highlighted character 160 is presented on the display 152 in the area of the display 152 dedicated to number entries.
  • [0113]
    As shown in FIG. 9C, when the finger is moved to a new region of the touch wheel, a new character 160 is highlighted and presented on the display 152. In cases where the finger stays in contact with the touch wheel (sliding across), each consecutive character 160 between the starting position and the ending position is highlighted and presented on the display 152 as the finger is moved over the various regions. The user therefore knows what region of the touch wheel 154 they are positioned on.
  • [0114]
    As shown in FIG. 9D, when a finger performs a selection event as for example by clicking or tapping the touch wheel, the highlighted character 160 is entered into the system. Utilizing the steps shown in FIGS. 9B-9D, any number of characters can be entered and presented on the display 152.
  • [0115]
    As shown in FIG. 9E, once the desired group of numbers/characters has been entered, a send command may be performed. For example, the center button can be activated in order to generate a send command. The send command informs the handheld device 150 to transmit the number that was entered.
  • [0116]
    FIG. 10 is a simplified diagram of a touch pad 200, in accordance with one embodiment of the present invention. In this embodiment, the touch pad 200 includes an optically transmissive touch sensing device 202 disposed over a graphics generator 204. Both the touch sensing device 202 and the graphics generator 204 communicate with a controller 206 that monitors touch inputs of the touch sensing device 202 and that directs the graphics generator 204 to generate graphics in a controlled manner.
  • [0117]
    The touch sensing device 202 may be widely varied. The touch sensing device 202 may for example be selected from any of those used for touch screens. An example of a touch screen that may be used can be found in U.S. patent application Ser. No. 10/840,862, which is herein incorporated by reference.
  • [0118]
    The graphics generator 204 may also be widely varied. In one embodiment, the graphics generator 204 includes one or more light sources 208 for generating light (visible and/or non visible) and one or more graphics layers 210 having features 212 for creating symbols such as characters from the generated light. The light sources 208 may be placed at a variety of locations depending on the configuration of the graphics layers 210. By way of example, the light sources 208 may be placed below, above and/or to the side of the graphics layers 210. Furthermore, light carriers such as light pipes and light distribution panels may be used to help distribute the light to the graphics layer 210. By way of example, a light distribution panel may help distribute light from side firing light sources 208 to the entire graphics layer 210. The light distribution panel can be disposed above, below and even in between various graphics layers.
  • [0119]
    The features 212, on the other hand, are typically configured in the desired symbol shape. The features 212 may include masking elements (e.g., openings in the layer) and/or light excitable elements (photo sensitive portions of the layer). In the case of masking elements, when a light source is turned on, light is emitted through the masking elements thereby making one or more symbols appear at the surface. In the case of light excitable elements, when a light source is turned on, the light is absorbed by the light excitable elements and reemitted thereby making one or more symbols appear at the surface. In most cases, the light excitable elements are configured to absorb non visible light and reemit visible light. In some cases, the light excitable elements may even be sensitive to a certain wavelength range (only absorb certain wavelengths of light). As such, different sets of features can be activated with different wavelength ranges. This is very beneficial when designing a touch pad to serve multiple modes of a hand held electronic device.
  • [0120]
    The touch pad 200 can also include a cover 216 for protecting the various layers. In some cases, the cover 216 may also act as a light diffuser for normalizing the intensity of light, and helping hide the various layers from view. By way of example, the cover may act as a canvas for the graphics generator (i.e., place where illuminated symbols are projected).
  • [0121]
    The touch pad 200 may further include a light panel 218 for producing other visual effects, either separately or together with the graphics generator 204. In one embodiment, the light panel 218 may be used to highlight the features 212 generated via the graphics generator 204. The light panel 218 may be placed above or below the graphics generator 204 (depending on the optical properties of the graphics generator).
  • [0122]
    Alternatively or additionally, the graphics generator 204 may be embodied as an OLED.
  • [0123]
    FIG. 11 is a simplified diagram of a touch pad 220, in accordance with one embodiment of the present invention. In this embodiment, the touch pad 220 includes an opaque or alternatively an optically transmissive touch sensing device 222 disposed below a graphics generator 224. Both the touch sensing device 222 and the graphics generator 224 communicate with a controller 226 that monitors touch inputs of the touch sensing device 222 and that directs the graphics generator 224 to generate graphics in a controlled manner.
  • [0124]
    The touch sensing device 222 may be widely varied. The touch sensing device 222 may for example be selected from any of those used for touch pads or touch screens. Examples of touch pads that may be used can be found in U.S. patent application Ser. Nos. 10/188,182, 10/722,948, 10/643,256 and 11/483,008, all of which are herein incorporated by reference.
  • [0125]
    The graphics generator 224 may also be widely varied. Unlike the graphics generator discussed in FIG. 10, the graphics generator herein needs to allow touch sensing to occur therethrough. For example, it may be formed from a dielectric material so that touch sensing can occur without impediment (e.g., capacitively). In all other aspects it can be configured similarly to the graphics generator described above. For example, the graphics generator includes light sources and a graphics layer consisting of masking elements and/or light excitable elements.
  • [0126]
    Furthermore, like the touch pad described above, this touch pad can also include a cover for protecting the various layers and a light panel for producing other visual effects.
  • [0127]
    FIG. 12 is a diagram of a graphics generator 240, in accordance with one embodiment of the present invention. The graphics generator 240 includes an opaque masking layer 242 and a light system 244. The masking layer 242 includes a plurality of openings 246 shaped as symbols. During operation, the light system 244 emits light below the masking layer 242. Light that intersects the masking layer 242 is blocked while light that intersects the openings 246 travels through the openings 246 to the other side thereby forming illuminated symbols.
  • [0128]
    In order to produce symbol layouts for different modes, the masking layer 242 may include different sets of openings 246A and 246B with each set having a dedicated light system 244A and 244B dedicated thereto. When the device is in a mode A, the light system 244A emits light below the masking layer 242, and more particularly directly behind the openings 246A such that illuminated symbols associated with mode A are formed. When the device is in mode B, the light system 244B emits light below the masking layer, 242 and more particularly directly behind the openings 246B such that illuminated symbols associated with mode B are formed.
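    The mode switching of FIG. 12 amounts to energizing only the light system dedicated to the active symbol set. The following Python sketch is hypothetical; the identifiers and the `set_light_state` hardware hook are assumptions introduced for illustration.

```python
# Hypothetical mapping from device mode to the dedicated light system that
# sits behind that mode's set of openings in the masking layer 242.
LIGHT_SYSTEMS = {'mode_a': '244A', 'mode_b': '244B'}

def select_light_system(mode, set_light_state):
    """Energize the light system dedicated to `mode`; turn the rest off.

    set_light_state(system_id, on) is an assumed hardware hook that drives
    the light source directly behind one set of symbol-shaped openings, so
    only the active mode's symbols appear illuminated at the surface.
    """
    active = LIGHT_SYSTEMS[mode]
    for system_id in LIGHT_SYSTEMS.values():
        set_light_state(system_id, system_id == active)
    return active
```

    The same table-driven approach would apply to the wavelength-selective embodiment of FIGS. 13-20, with each entry selecting a light system of a different emission wavelength instead of a different position.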
  • [0129]
    FIGS. 13-20 are diagrams of graphics generators 250, in accordance with several embodiments of the present invention. The graphics generators 250 include one or more light systems 252, one or more light distribution panels 254, and one or more graphics layers 256 with light excitable elements 258 shaped as symbols. The light system 252 is configured to generate light; the light distribution panel 254, which is formed from an optically transmissive (e.g., transparent) material, is configured to distribute the light to the graphics layers 256 with light excitable elements 258; and the light excitable elements 258 are configured to absorb and reemit the generated light. The light system 252 may be placed at various locations relative to the light excitable elements 258. For example, it may be placed above, below and/or to the side. Furthermore, the light excitable elements 258 may be placed on the front of, on the back of, or within the light distribution panel 254.
  • [0130]
    As shown in FIG. 13, the light excitable elements 258 are placed on the front of the light distribution panel 254.
  • [0131]
    As shown in FIG. 14, the light excitable elements 258 are placed on the back of the light distribution panel 254.
  • [0132]
    As shown in FIG. 15, the light excitable elements 258 are placed on both the front and the back of the light distribution panel 254.
  • [0133]
    Alternatively or additionally, as shown in FIG. 16, a portion of the light excitable elements 258 may be placed on a first light distribution panel 254A, and a second portion may be placed on a second light distribution panel 254B.
  • [0134]
    Alternatively or additionally, as shown in FIGS. 17 and 18, the light excitable elements 258 may be placed on a separate carrier 255 disposed above or below the light distribution panel 254.
  • [0135]
    Alternatively or additionally, as shown in FIG. 19, a first portion of the light excitable elements 258 may be placed on a smaller diameter light distribution panel 254, and a second portion of the light excitable elements 258 may be placed on a larger diameter light distribution panel 254.
  • [0136]
    In one embodiment, which can be applied to the various embodiments described above, during operation, the light system 252 emits non visible light into the light distribution panel 254, and the light distribution panel 254 transmits the non visible light to the light excitable elements 258. The light excitable elements 258 then absorb the non visible light directed thereon and reemit it as visible light, thereby forming illuminated symbols.
  • [0137]
    In order to produce symbol layouts for different modes, the graphics layer 256 with light excitable elements 258 shaped as symbols may include different sets of light excitable elements 258A and 258B, with each set having a dedicated light system 252A or 252B. In this embodiment, each set of light excitable elements 258 is excited with a different wavelength of light. When the device is in mode A, the light system 252A emits a first wavelength of light into the light distribution panel 254, thereby exciting the first set of light excitable elements 258A and not exciting the second set of light excitable elements 258B. When the device is in mode B, the light system 252B emits a second wavelength of light into the light distribution panel 254, thereby exciting the second set of light excitable elements 258B and not exciting the first set of light excitable elements 258A. When excited, the first set of light excitable elements 258A creates illuminated symbols associated with mode A, and the second set of light excitable elements 258B creates illuminated symbols associated with mode B.
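The wavelength-selective excitation above can be modeled with a small sketch. The specific wavelength values are assumptions for illustration; the patent does not name particular values.

```python
# Assumed excitation wavelengths (nm); illustrative only.
WAVELENGTH_A_NM = 365  # excites set 258A only
WAVELENGTH_B_NM = 395  # excites set 258B only

ELEMENT_SETS = {"258A": WAVELENGTH_A_NM, "258B": WAVELENGTH_B_NM}


def excited_sets(emitted_nm):
    # A set of light excitable elements reemits visible light only when
    # the emitted (non visible) wavelength matches its excitation
    # wavelength; the other set remains dark.
    return [name for name, nm in ELEMENT_SETS.items() if nm == emitted_nm]
```

Driving light system 252A at the first wavelength illuminates only set 258A, and vice versa, which is how one graphics layer yields two independent symbol layouts.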
  • [0138]
    FIG. 20 is a diagram of a graphics generator 270, in accordance with another embodiment of the present invention. This embodiment of the graphics generator 270 combines the masking layer of FIG. 12 with the light excitable elements of the embodiments of FIGS. 13-18. That is, the light excitable elements 258 are placed in front of, within or behind the openings 246 of the masking layer 242. As such, when non visible (or visible) light is directed towards or through the openings 246, the light excitable elements 258 are excited (absorb and reemit), thereby forming illuminated symbols.
  • [0139]
    In all of the embodiments described above, the configuration of the light systems 244 and 252 may be widely varied. For example, they may be embodied as LEDs, light panels, etc. Furthermore, the light excitable elements 258 may be formed from any photoluminescent (PL) material. The material selected may depend on whether the graphics layer 256 is disposed above or below a touch sensing device. For example, in cases where it is disposed above a capacitive touch sensing layer, the PL material needs to be formed from a dielectric material.
  • [0140]
    The PL material may be widely varied. Generally, a PL material is classified as a material that radiates visible light after being energized. In the embodiments described herein, the PL material is energized with visible or non visible light. By way of example, the PL material may contain phosphors that are energized with ultraviolet light of various wavelengths. The UV light may be produced by LEDs, which offer many advantages including small size, low power consumption and long operating life.
  • [0141]
    In order to highlight the various symbols they produce, the graphics generators may include highlight features and symbol features on the same graphics layer. In this embodiment, each symbol includes its own highlight feature. Further, the symbol features typically operate with the same light system, while each highlight feature typically operates with its own dedicated light system. During operation, all the symbol features are turned on when a mode is activated, and then, when a touch is detected over a particular symbol, the highlight feature associated with that symbol is turned on. This is typically accomplished with a controller.
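The controller behavior described above can be sketched as follows. The names are invented for the example; this is a hedged model of the described logic, not the patent's controller.

```python
class HighlightController:
    """Turns on all symbol features when a mode is activated, and the
    dedicated highlight feature for whichever symbol is touched."""

    def __init__(self, symbols):
        self.symbols = list(symbols)
        self.symbols_lit = False                      # shared symbol light system
        self.highlight = {s: False for s in symbols}  # one highlight per symbol

    def activate_mode(self):
        # All symbol features are driven by the same light system.
        self.symbols_lit = True
        self.highlight = {s: False for s in self.symbols}

    def touch(self, symbol):
        # Drive only the touched symbol's dedicated highlight light system.
        for s in self.symbols:
            self.highlight[s] = (s == symbol)
```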
  • [0142]
    Additionally or alternatively, the graphics generators may include dedicated graphics layers, one or more for the symbol features and one or more for the highlight features.
  • [0143]
    Additionally or alternatively, the graphics generators may include light panels for highlighting the symbol features. The light panel can be disposed above, below or in between the graphics layers. The light panels are configured to distribute light in a segmented manner. For example, the light panel can be configured with separately controlled light regions, each of which corresponds to a particular symbol feature. During operation, all the symbol features are turned on when a mode is activated, and then when a touch is detected over a particular symbol, the light region associated with that symbol is turned on. This is typically accomplished with a controller. FIGS. 21-23 show three simplified examples of this embodiment. In FIG. 21, a light panel 280 is disposed above a graphics generator 282. In FIG. 22, the light panel 280 is disposed below the graphics generator 282. In FIG. 23, the light panel 280 is disposed between two graphics generators 282. Although only these examples are shown, it should be appreciated that any number of configurations can be used to produce the desired effect. Furthermore, it should be pointed out that the light panel can be used for other visual effects (e.g., not limited to highlighting).
  • [0144]
    FIGS. 24-29 show several top views of graphical layers that can be used, in accordance with several embodiments of the present invention. In each of these embodiments, the touch pad in which the graphical layer is used has an annular and circular configuration. The area in the middle may for example be used for button inputting while the annular area may for example be used for touch inputting. Furthermore, in each of these embodiments, the graphical layer includes various symbols formed from masking elements and/or light excitable elements.
  • [0145]
    FIG. 24 is a graphical layer 300 which can be used in a phone mode. The graphical layer 300 includes the numbers and other characters 302 needed for phone inputting, as for example 0-9, * and #. Each number is positioned in an angular manner around the touch pad: 0 is at 12 o'clock, 1 is at 1 o'clock, 2 is at 2 o'clock, 3 is at 3 o'clock, 4 is at 4 o'clock, 5 is at 5 o'clock, 6 is at 6 o'clock, 7 is at 7 o'clock, 8 is at 8 o'clock, 9 is at 9 o'clock, * is at 10 o'clock, and # is at 11 o'clock. In one embodiment, all the numbers and other characters are formed from a light excitable material with the same light sensitivity such that they can be turned on with a single light source. In another embodiment, all the numbers and other characters are formed from light excitable materials with different light sensitivities such that they can be individually controlled.
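The clock-position layout above amounts to spacing the twelve phone characters every 30 degrees around the pad. A minimal sketch (names invented for illustration):

```python
# The twelve phone characters, clockwise from 12 o'clock.
PHONE_CHARS = ["0", "1", "2", "3", "4", "5", "6", "7", "8", "9", "*", "#"]


def clock_angle_deg(char):
    # "0" sits at 12 o'clock (0 degrees); each subsequent character is
    # one clock position (30 degrees) further clockwise.
    return PHONE_CHARS.index(char) * 30
```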
  • [0146]
    FIG. 25 is a variation of the embodiment shown in FIG. 24. In this embodiment, the graphics layer 300 further includes the letters 303 that go along with the numbers 302 of the phone. The numbers are placed along the outer periphery while the letters are placed at the inner periphery. In one embodiment, both the numbers and letters are formed from a light excitable material with the same light sensitivity such that they can all be turned on with a single light source. In another embodiment, the set of numbers is formed from a first light excitable material (same light sensitivity) and the set of letters is formed from a second light excitable material (same light sensitivity) whose light sensitivity differs from that of the first light excitable material such that they can be individually controlled.
  • [0147]
    FIG. 26 is a variation of the embodiment shown in FIG. 24. In this embodiment, the graphics layer 300 further includes highlighting bars 304 that go along with the numbers 302 of the phone (and/or letters). The individual highlighting bars 304 surround each of the numbers and other characters 302. The numbers and other characters 302 are formed from a first light excitable material with the same light sensitivity, and each of the highlight bars 304 is formed from a light excitable material with a light sensitivity that differs from those of the other highlight bars and of the numbers and other characters 302. In this manner, the highlight bars 304 can be individually controlled.
  • [0148]
    FIG. 27 is a graphical layer 310 which can be used in a music player mode. The graphical layer 310 includes the symbols 312 needed for navigating a music player, as for example menu, <<, >> and play/pause. Each symbol is positioned in an angular manner around the graphics layer 310: menu is at 12 o'clock, >> is at 3 o'clock, play/pause is at 6 o'clock, and << is at 9 o'clock. In one embodiment, all the symbols are formed from a light excitable material with the same light sensitivity such that they can be turned on with a single light source. In another embodiment, all the symbols are formed from light excitable materials with different light sensitivities such that they can be individually controlled.
  • [0149]
    FIG. 28 is a variation of the embodiment shown in FIG. 27. In this embodiment, the graphics layer 310 further includes highlighting bars 316 that go along with the symbols 312. The individual highlighting bars 316 surround each of the symbols 312. The symbols 312 are formed from a first light excitable material with the same light sensitivity, and each of the highlight bars 316 is formed from a light excitable material with a light sensitivity that differs from those of the other highlight bars and of the symbols 312. In this manner, the highlight bars 316 can be individually controlled.
  • [0150]
    FIG. 29 is a variation of all the examples given above. In this embodiment, the graphics layer 340 includes the phone numbers 302, formed from light sensitive materials with the same light sensitivity, and the music player symbols 312, formed from light sensitive materials with the same light sensitivity but different from the light sensitivity of the materials of the phone numbers 302. Furthermore, the graphics layer 340 includes highlight bars 304 for each of the phone numbers 302, and highlight bars 316 for each of the music player symbols 312. Each of the highlight bars 304 and 316 is formed from a light sensitive material with a light sensitivity differing from those of the other highlight bars as well as from the light sensitivities of the phone numbers and music player symbols 302 and 312.
  • [0151]
    It should be appreciated that the examples given above are by way of example and not by way of limitation. For example, graphics layers may include features associated with other modes including for example modes associated with PDA, calendaring, GPS, remote control, video, game, etc. Furthermore, the features of the graphics layers are not limited to a single graphics layer and may be applied to multiple graphical layers depending on the needs of each touch pad.
  • [0152]
    Although the touch pad can take a variety of forms using the techniques mentioned above, one particular implementation will now be described in conjunction with FIGS. 30-34.
  • [0153]
    FIGS. 30-34 are diagrams of a touch pad assembly 350, in accordance with one embodiment of the present invention. The touch pad assembly 350 includes a frame 352 and a circular touch pad 354 assembled within the frame 352. The frame 352 may be a separate component, or it may be integrated with or part of a housing of a handheld device. The circular touch pad 354 includes various layers including a cover 356, a light panel 358, a graphics panel 360, an electrode layer 362 and a printed circuit board (PCB) 364. The electrode layer 362 is positioned on the PCB 364. The graphics panel 360 is disposed above the electrode layer 362. The light panel 358 is disposed above the graphics panel 360. And the cover 356 is disposed above the light panel 358. The touch pad 354 further includes a button 366 disposed at the center of the touch pad 354. As such, the various layers are annular in shape.
  • [0154]
    The electrode layer 362 includes a plurality of spatially separated electrodes configured to detect changes in capacitance at an upper surface of the touch pad 354. Each of the electrodes is operatively coupled to a controller 368 located on the backside of the printed circuit board 364. During operation, the controller 368 monitors the changes in capacitance and generates signals based on these changes.
  • [0155]
    In one embodiment, various regions of the electrode layer 362 are mapped to various functions (e.g., button functions) depending on the mode of a device. During operation, if the capacitance of the electrodes mapped to a region changes significantly, then the function associated with the region is implemented. The mapping may be widely varied. By way of example, in a phone mode, the electrode layer 362 may be mapped in such a way as to simulate the keys associated with a phone. In a music player mode, the electrode layer 362 may be mapped in such a way as to simulate the buttons associated with a music player.
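The per-mode remapping of electrode regions can be sketched as a lookup keyed by mode. The region indices, threshold, and data layout here are illustrative assumptions only; the patent does not define a concrete representation.

```python
# Hypothetical region-to-function maps; indices count regions clockwise.
MODE_MAPS = {
    "phone": {i: str(i) for i in range(10)},  # region i -> phone key (abbreviated)
    "music": {0: "menu", 3: ">>", 6: "play/pause", 9: "<<"},
}


def function_for_region(mode, region, capacitance_delta, threshold=10):
    # A region's function is implemented only when the capacitance of its
    # electrodes changes significantly (i.e., exceeds the threshold).
    if capacitance_delta < threshold:
        return None
    return MODE_MAPS[mode].get(region)
```

The same electrode hardware thus simulates phone keys in one mode and music player buttons in another simply by swapping the active map.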
  • [0156]
    The graphics panel 360 is configured to generate symbols that visually indicate the meaning of the various regions when in a particular mode. The graphics panel 360 includes a light distribution panel 370 disposed over the electrode layer 362. The light distribution panel 370 is configured to redirect the light made incident thereon to light activated symbols 372. The light distribution panel 370 is also configured to serve as a dielectric layer that covers the electrode layer 362 in order to help form the capacitance sensing circuit of the touch pad 354. The light distribution panel 370 may include any number of light activated symbols 372.
  • [0157]
    In the illustrated embodiment, the light distribution panel 370 includes a first set of symbols 372A associated with a first mode and a second set of symbols 372B associated with a second mode. The symbols in each of the sets 372 are angularly dispersed around the light distribution panel 370 in a uniform and equal manner. The first set 372A is disposed around the outer periphery and the second set 372B is disposed around the inner periphery. Furthermore, the first set of symbols 372A is formed from a light sensitive material sensitive to a first wavelength of light and the second set of symbols 372B is formed from a light sensitive material sensitive to a second wavelength of light. Although the sets 372 may be widely varied, in the illustrated embodiment, the first set 372A is associated with a phone mode and the second set 372B is associated with a music player mode. As such, the first set 372A includes 0-9, * and # while the second set 372B includes menu, >>, play/pause, and <<.
  • [0158]
    It should be noted that the graphics panel is not limited to only two sets and other sets may be provided. The number of sets is typically determined by the number of modes offered by the device in which the touch pad is placed.
  • [0159]
    The graphics panel 360 also includes separate light emitting diode(s) 374A and 374B dedicated to each set of symbols 372. The light emitting diodes 374 are positioned next to the light distribution panel 370 so that light generated therefrom can be directed into the light distribution panel 370 and ultimately to the light activated symbols 372. The light emitting diodes 374 may for example be placed in the center area provided by the annular shape. The light emitting diodes 374 are configured to generate non visible light, such as ultraviolet or infrared light, in the wavelength needed to drive the set of symbols associated therewith. In the illustrated embodiment, the first light emitting diode(s) 374A are configured to generate non visible light having the first wavelength, and the second light emitting diode(s) 374B are configured to generate non visible light having the second wavelength. As shown, the LEDs 374 are attached to the printed circuit board 364 and operatively coupled to the controller 368 located on the backside of the printed circuit board 364. During operation, the controller 368 selectively adjusts the intensity of each of the LEDs 374 to illuminate the symbols 372 in a controlled manner. By way of example, in a first mode, the first LED 374A may be turned on and the second LED 374B turned off. And in a second mode, the second LED 374B may be turned on and the first LED 374A turned off.
  • [0160]
    Although only a single graphics panel 360 is shown, it should be appreciated that this is not a limitation and that additional graphics panels may be used. For example, one or more graphics panels may be further positioned underneath the first graphics panel described above.
  • [0161]
    Referring now to the light panel 358, the light panel 358 is configured to generate light for highlighting the light activated symbols 372 that are being touched. The light panel 358 includes a light distribution panel 380 disposed over the graphics panel 360 and one or more side mounted light emitting diodes 382 disposed around the periphery of the light distribution panel 380. The side mounted light emitting diodes 382 are each configured to direct light into a different portion of the light distribution panel 380. Alternatively, a light pipe may be used to direct light from an LED located away from the light distribution panel. The light distribution panel 380 is configured to redirect the light made incident thereon via the light emitting diodes 382 to an upper surface of the light distribution panel 380, thereby illuminating the touch pad surface. The light distribution panel 380 is also configured to serve as a dielectric layer that covers the electrode layer 362 in order to help form the capacitance sensing circuit of the touch pad.
  • [0162]
    As shown, the LEDs 382 are attached to the printed circuit board 364 and operatively coupled to the controller 368 located on the backside of the printed circuit board 364. During operation, the controller 368 selectively adjusts the intensity of each of the LEDs to illuminate portions of or all of the light distribution panel 380 in a controlled manner.
  • [0163]
    The light distribution panel 380 can be widely varied. In the illustrated embodiment, the light distribution panel 380 typically includes a portion that extends below the inner surface of the frame. This portion provides a light receiving area at the sides of the light distribution panel 380 for receiving light emitted by the side mounted LEDs 382. Furthermore, the light distribution panel 380, which can be formed from a single layer or multiple layers, is typically formed from translucent or semi-translucent dielectric materials including for example plastic materials such as polycarbonate, acrylic or ABS plastic. It should be appreciated, however, that these materials are not a limitation and that any optically transmittable dielectric material may be used (the same materials can be used for the graphics panel).
  • [0164]
    Further, the light distribution panel 380 is broken up into a plurality of distinct nodes 384, each of which includes its own dedicated light emitting diode 382 for individual illumination thereof. During operation, when light is released by a light emitting diode 382, the light is made incident on the side of the light distribution panel 380 at the node 384. The node 384 redirects and transmits the light from its side to an upper surface of the node 384. In order to prevent light bleeding between adjacent nodes 384, each node 384 may be optically separated from its neighbors by a reflecting or masking region disposed therebetween.
  • [0165]
    Each of the nodes 384 may be formed from a solid piece of material or it may be formed from a combination of elements. In one embodiment, each of the nodes 384 is formed from a translucent or semi-translucent plastic insert that when combined with the other inserts forms the light distribution panel 380. In another embodiment, each of the nodes is formed from a bundle of fiber optic strands.
  • [0166]
    The configuration of the nodes 384, including layout, shape and size, may be widely varied. Because the touch pad 354 is circular in the illustrated embodiment, the nodes 384 are embodied as distinct angular segments (e.g., pie shaped). Furthermore, the number of nodes 384 is typically based on the symbol set 372 with the largest number of symbols. For example, in the illustrated embodiment, this would be twelve, one for each symbol of the phone mode. In one configuration, in order to highlight a phone number, the node corresponding to the phone number (disposed directly above) is illuminated, and in order to highlight a music symbol, multiple nodes corresponding to the music symbol are illuminated (in the example provided, three nodes would be illuminated for each music symbol).
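The symbol-to-node mapping above can be sketched as follows, assuming twelve pie-shaped nodes and the four music symbols centered at 12, 3, 6 and 9 o'clock. The centered three-node grouping for music symbols is an assumption for illustration.

```python
NUM_NODES = 12  # one node per symbol of the largest set (phone mode)


def nodes_for_symbol(mode, index):
    # index counts symbols clockwise from 12 o'clock within the mode's set.
    if mode == "phone":
        return [index]  # a phone character lights the single node above it
    if mode == "music":
        # Each of the four music symbols spans a quarter of the pad, so the
        # node at its position and its two neighbors are illuminated.
        center = index * 3  # menu=0, >>=3, play/pause=6, <<=9
        return [(center - 1) % NUM_NODES, center, (center + 1) % NUM_NODES]
    raise ValueError("unknown mode: " + mode)
```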
  • [0167]
    In one embodiment, all the LEDs 382 are powered at the same time to produce a fully illuminated touch pad 354. This may be analogous to backlighting. In another embodiment, the LEDs 382 are powered in accordance with the capacitance changes measured by each of the electrodes. For example, the node 384 above the detected region may be illuminated while the segments above the undetected regions may be turned off. This provides an indication to the user as to their exact location on the touch surface, i.e., which symbol and thus which function will be implemented. In yet another embodiment, selected segments may be illuminated to encourage a user to place their finger in a particular area of the touch pad.
  • [0168]
    Although only a single light panel 358 is shown, it should be appreciated that this is not a limitation and that additional light panels may be used. For example, one or more light panels may be further positioned underneath the first light panel described above. In one embodiment, each light panel in a group of light panels is configured to distribute a different color. For example, three light panels including a red, green and blue light panel may be used. Using this arrangement, different colored segments may be produced. By controlling their intensity, almost any color can be produced (mixed) at the touch surface. In another embodiment, each light panel in the group of light panels may have a different orientation. For example, the angularly segmented nodes of the light distribution panel may be rotated relative to the other light panels so that they are placed at different positions about an axis (e.g., partially overlapping and angularly offset). Using this arrangement, leading and trailing illumination can be produced.
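The color mixing described above, with stacked red, green and blue light panels driven at independently controlled intensities, can be sketched in a few lines. The [0.0, 1.0] intensity scale and 8-bit output are illustrative assumptions.

```python
def mix_color(red, green, blue):
    # Each argument is a drive intensity in [0.0, 1.0] for one colored
    # light panel over the same node; the result is the mixed color at
    # the touch surface as an 8-bit-per-channel tuple.
    for v in (red, green, blue):
        if not 0.0 <= v <= 1.0:
            raise ValueError("intensity out of range")
    return (round(red * 255), round(green * 255), round(blue * 255))
```

Controlling the three intensities independently per segment is what lets the arrangement produce almost any color at the touch surface.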
  • [0169]
    In most cases, some component of the touch pad 354 includes light diffusing elements to diffuse the light produced therefrom in order to normalize the light intensity, to produce a characteristic glow, and/or to hide the physical parts of the touch pad 354 located underneath the input surface. By way of example, the component may be the light distribution panel 380 of the light panel or the cover 356 disposed thereover. The light diffusing elements may be provided on an inner or outer surface of the component, or they may be embedded inside the component. In one embodiment, the light diffusing element is an additive disposed inside the light distribution panel. In another embodiment, the light diffusing element is a layer, coating and/or texture that is applied to the inner, side or outer surfaces of the panel.
  • [0170]
    In the illustrated embodiment, the light diffusing element is disposed in the cover 356. The cover 356 may for example be a label adhered to the top surface of the light distribution panel 380. The cover label may be formed from transparent or semitransparent dielectric materials such as Mylar or polycarbonate, or any other dielectric material that is thin, optically transmittable and includes some sort of light diffusing means.
  • [0171]
    Referring to the button 366, both the light distribution panels 370 and 380 as well as the electrode layer 362 have an annular shape that creates a void at the center of the touch pad 354 for placement of the button 366. The button 366 includes a translucent button cap 390 that is movably trapped between the cover 356 and a spring loaded switch 392. The switch 392 is mounted to the printed circuit board 364 and operatively coupled to the controller 368. When the button cap 390 is pressed, it moves against the actuator of the spring loaded switch 392, thereby generating a button event that is read by the controller 368. The button cap 390 may be illuminated with an LED 394 to indicate when a signal has been read by the controller 368. Furthermore, the button cap 390 may include a graphical layer 396 with one or more symbols that are driven by dedicated light emitting diodes 398A and 398B, similar to the graphics panel 360 described above. In the illustrated embodiment, the graphical layer 396 includes a first symbol 399A associated with a first mode (e.g., phone) and a second symbol 399B associated with a second mode (e.g., music notes).
  • [0172]
    In accordance with one embodiment, the functionality of a button (or buttons) is also incorporated directly into the touch pad 354 such that the touch pad 354 acts like a button along with its touch sensing capabilities. That is, the touch pad 354 forms a platform that can be clicked relative to the frame 352 in order to activate one or more actuators such as switches.
  • [0173]
    To elaborate, the touch pad 354 is capable of moving relative to the frame 352 so as to create a clicking action at various regions of the touch pad 354. The clicking actions are generally arranged to actuate one or more movement indicators 402 contained inside the frame 352. That is, a portion of the touch pad 354 moving from a first position (e.g., upright) to a second position (e.g., depressed) is caused to actuate a movement indicator 402. The movement indicators 402 are configured to sense movements of the touch pad 354 during the clicking action and to send signals corresponding to the movements to the host device. By way of example, the movement indicators 402 may be switches, sensors and/or the like.
  • [0174]
    Because the touch pad 354 is used for different modes that require different inputs, the largest set of inputs is typically used as the base for determining the number of movement indicators 402. This may be done for signal purposes (although not a requirement) and/or for stability reasons (provide the same feel to each zone). In the illustrated embodiment, the touch pad 354 includes a movement indicator 402 for each of the regions required for a phone mode. That is, there is a movement indicator 402 disposed beneath each of the phone numbers and characters.
  • [0175]
    The movements of the touch pad 354 may be provided by various rotations, pivots, translations, flexes and the like. In one embodiment, the touch pad 354 is configured to gimbal relative to the frame 352 so as to generate clicking actions for each of the button zones. By gimbal, it is generally meant that the touch pad is able to float in space relative to the frame while still being constrained thereto. The gimbal may allow the touch pad 354 to move in single or multiple degrees of freedom (DOF) relative to the housing. For example, movements in the x, y and/or z directions and/or rotations about the x, y, and/or z axes (θx θy θz).
  • [0176]
    The movement indicators 402 may be widely varied; however, in this embodiment they take the form of mechanical switches. The mechanical switches are typically disposed between the circuit board 364 and the frame 352. The mechanical switches may be attached to the frame 352 or to the printed circuit board 364. A stiffening plate may be provided to stiffen the circuit board. In the illustrated embodiment, the mechanical switches are attached to the backside of the circuit board 364 and operatively coupled to the controller, thus forming an integrated unit. They are generally attached in a location that places them beneath the appropriate button zone (e.g., beneath each of the phone numbers or characters). As shown, the mechanical switches include actuators that are spring biased so that they extend away from the circuit board 364. As such, the mechanical switches act as legs for supporting the touch pad 354 in its upright position within the frame 352 (i.e., the actuators rest on the frame). By way of example, the mechanical switches may correspond to tact switches and more particularly, enclosed SMT dome switches (dome switches packaged for SMT).
  • [0177]
    Moving along, the integrated unit of the touch pad 354 and switches 402 is restrained within a space provided in the frame 352. The integrated unit is capable of moving within the space while still being prevented from moving entirely out of the space via the walls of the frame 352. The shape of the space generally coincides with the shape of the integrated unit. As such, the unit is substantially restrained along the X and Y axes via a side wall of the frame, and along the Z axis and rotationally about the X and Y axes via a top wall and a bottom wall of the frame. A small gap may be provided between the side walls and the platform to allow the touch pad 354 to move to its four positions without obstruction (e.g., a slight amount of play). In some cases, the circuit board may include tabs that extend along the X and Y axes so as to prevent rotation about the Z axis. Furthermore, the top wall includes an opening for providing access to the touch sensitive surface of the touch pad 354. The spring force provided by the mechanical switches 402 places the touch pad 354 into mating engagement with the top wall of the frame 352 (e.g., upright position) and the gimbal substantially eliminates gaps and cracks found therebetween.
  • [0178]
    FIG. 35 is an exploded perspective diagram of a touch pad 420, in accordance with one embodiment of the present invention. The touch pad 420 may be a stationary fixed touch pad, or it may be integrated into a clickable touch pad. The touch pad 420 includes various layers including a light diffusing cover 422, a transparent touch sensing layer 424, an organic light emitting device (OLED) 426, and a printed circuit board 428. The light diffusing cover 422 is disposed over the touch sensing layer 424, the touch sensing layer 424 is disposed over the OLED 426, and the OLED 426 is disposed over the printed circuit board 428. The touch sensing layer 424 and OLED 426 are operatively coupled to a controller 430 located on the printed circuit board 428. The controller receives data from the touch sensing layer and instructs the OLED how to present graphical information. The graphical information may be based on the touch data. The touch sensing layer 424 may include its own carrier, or it may be applied to the bottom surface of the cover 422 and/or the top surface of the OLED 426. In the illustrated embodiment, the touch pad 420 is circular. Furthermore, the circular touch pad 420 may include a button, and therefore it may further include an annular OLED 426, an annular touch sensing layer 424, and an annular cover 422 to provide space for the button.
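The controller's touch-to-display loop for the OLED variant can be sketched in one function. The data shapes are invented for illustration; the patent only states that the graphical information may be based on the touch data.

```python
def update_display(mode_symbols, touched=None):
    # Returns the per-symbol drawing state the controller would send to
    # the OLED: every symbol of the current mode is shown, with the one
    # under the touch (if any) highlighted.
    return [{"symbol": s, "highlight": s == touched} for s in mode_symbols]
```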
  • [0179]
    While this invention has been described in terms of several preferred embodiments, there are alterations, permutations, and equivalents which fall within the scope of this invention. For example, although the invention was primarily directed at a circular touch pad, it should be appreciated that this is not a limitation and that the principles disclosed herein may be equally applied to other shaped touch pads. It should also be noted that there are many alternative ways of implementing the methods and apparatuses of the present invention. For example, with regard to light based touch pads, the light sources may be integrated with touch sensing nodes as described in U.S. patent application Ser. No. 11/483,008, which is herein incorporated by reference. It is therefore intended that the following appended claims be interpreted as including all such alterations, permutations, and equivalents as fall within the true spirit and scope of the present invention.
US20130346636 *Aug 23, 2013Dec 26, 2013Microsoft CorporationInterchangeable Surface Input Device Mapping
US20140327651 *May 6, 2014Nov 6, 2014Cirque CorporationIndicator of entering a secure pasword on a touch sensor
DE102008009954A1 * | Feb 20, 2008 | Jan 15, 2009 | LG Electronics Inc. | Mobile terminal with a touch input device (Mobiles Endgerät mit einer Berührungseingabeeinrichtung)
DE102008009954B4 * | Feb 20, 2008 | Mar 20, 2014 | LG Electronics Inc. | Mobile terminal with a touch input device (Mobiles Endgerät mit einer Berührungseingabeeinrichtung)
EP2071430A1 * | Dec 12, 2008 | Jun 17, 2009 | NTT DoCoMo, Inc. | Information processing device
EP2079006A1 * | Nov 11, 2008 | Jul 15, 2009 | Acer Incorporated | Method and touchpad interface device using light for displaying level
EP2461238A2 * | Nov 30, 2011 | Jun 6, 2012 | LG Electronics Inc. | Input device and image display apparatus including the same
EP2461238A3 * | Nov 30, 2011 | Mar 19, 2014 | LG Electronics Inc. | Input device and image display apparatus including the same
EP2461239A2 * | Nov 30, 2011 | Jun 6, 2012 | LG Electronics Inc. | Input device and image display apparatus including the same
WO2009013026A1 * | Jan 21, 2008 | Jan 29, 2009 | Sony Ericsson Mobile Communications AB | A user control interface
WO2009023439A3 * | Jul 31, 2008 | Mar 3, 2011 | Motorola, Inc. | Electronic device with morphing user interface
WO2009039040A2 * | Sep 12, 2008 | Mar 26, 2009 | Viaflo Corporation | Electronic pipettor assembly
WO2009039040A3 * | Sep 12, 2008 | May 14, 2009 | Richard Cote | Electronic pipettor assembly
WO2009089465A2 * | Jan 9, 2009 | Jul 16, 2009 | Apple Inc. | Dynamic input graphic display
WO2009089465A3 * | Jan 9, 2009 | Dec 23, 2009 | Apple Inc. | Dynamic input graphic display
WO2009117685A2 * | Mar 20, 2009 | Sep 24, 2009 | Spy Rock, LLC | Dynamic visual feature coordination in an electronic hand held device
WO2009130533A2 * | Oct 16, 2008 | Oct 29, 2009 | Sony Ericsson Mobile Communications AB | Smart glass touch display input device
WO2009130533A3 * | Oct 16, 2008 | Feb 25, 2010 | Sony Ericsson Mobile Communications AB | Smart glass touch display input device
WO2011079097A1 * | Dec 21, 2010 | Jun 30, 2011 | Universal Electronics Inc. | System and method for multi-mode command input
WO2012125988A3 * | Mar 17, 2011 | Mar 13, 2014 | Laubach Kevin | Input device enhanced interface
WO2013064757A1 * | Oct 30, 2012 | May 10, 2013 | Valeo Systemes Thermiques | Control and display module for a motor vehicle
WO2013188796A2 * | Jun 14, 2013 | Dec 19, 2013 | Google Inc. | Using touch pad to remote control home electronics like TV
WO2013188796A3 * | Jun 14, 2013 | Apr 10, 2014 | Google Inc. | Using touch pad to remote control home electronics like TV
WO2015003008A3 * | Jul 1, 2013 | Jun 4, 2015 | Skydrop, LLC | Expansion stacked component irrigation controller
Classifications
U.S. Classification: 345/173
International Classification: G06F3/041
Cooperative Classification: G06F3/04886, G06F3/0416, G06F2203/0338, G06F3/03547, G06F3/0485, G06F3/041
European Classification: G06F3/0354P, G06F3/0488T, G06F3/041T
Legal Events
Date | Code | Event | Description
Nov 1, 2006 | AS | Assignment
Owner name: APPLE COMPUTER, INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MCKILLOP, CHRIS;GRIGNON, ANDREW;ORDING, BAS;REEL/FRAME:018499/0818;SIGNING DATES FROM 20061012 TO 20061016
May 21, 2008 | AS | Assignment
Owner name: APPLE INC., CALIFORNIA
Free format text: CHANGE OF NAME;ASSIGNOR:APPLE COMPUTER, INC.;REEL/FRAME:020979/0667
Effective date: 20070109