|Publication number||US20050134578 A1|
|Application number||US 11/025,216|
|Publication date||Jun 23, 2005|
|Filing date||Dec 29, 2004|
|Priority date||Jul 13, 2001|
|Also published as||US7877705, US8997020, US20080016467, US20080016468, US20110126135, WO2005067511A2, WO2005067511A3|
|Inventors||Christopher Chambers, Wayne Scott, Alex Louie, Cheryl Scott, Allen Yuh, Cesar Alvarado, Paul Arling|
|Original Assignee||Universal Electronics Inc.|
This application claims the benefit of U.S. Provisional Application Ser. No. 60/534,608 filed on Jan. 6, 2004.
This application claims the benefit of and is a continuation-in-part of U.S. Published Patent Applications Nos. 2003/0103088 filed on Nov. 6, 2002, and 2003/0117427 filed on Nov. 4, 2002.
This application also claims the benefit of and is a continuation-in-part of U.S. Published Patent Application No. 2002/0143805 filed Jul. 13, 2001.
All of these prior applications are incorporated herein by reference in their entirety.
This invention relates generally to user interfaces for electronic devices. Exemplary devices include personal digital assistants (“PDAs”), Web Tablets, touch screen remote controls, mobile phones, lap-top computers, and the like.
In accordance with the description that follows, a system and method is provided for enabling enhanced user interaction, information display, and interface selection for electronic devices having graphic display and touch screen capabilities. An understanding of the objects, advantages, features, properties and relationships of the invention will be obtained from the following detailed description and accompanying drawings which set forth illustrative embodiments and which are indicative of the various ways in which the principles of the invention may be employed.
For a better understanding of the various aspects of the invention, reference may be had to preferred embodiments shown in the attached drawings in which:
A universal remote control application and associated interfacing methods are provided for executing on a portable electronic device 10. By way of example, representative platforms for the device 10 include, but are not limited to, devices such as remote controls, personal computers, lap-top computers, Smart Displays, Web Tablets and/or PDAs manufactured by HP/Compaq (such as the iPAQ brand PDA), Palm, Visor, Sony, etc., mobile phones (e.g., Microsoft based Smart Phones, Palm OS and/or Symbian OS based phones), personal gaming systems (e.g., Nintendo GameBoy, Nokia N-Gage), etc. Thus, a preferred underlying platform includes a processor coupled to a memory system comprising a combination of ROM memory, non-volatile read/write memory, and RAM memory (a memory system); a key matrix in the form of physical buttons; an internal clock and timer; a transmission circuit; a power supply; a touch screen display to provide visible feedback to and accept input from a user (e.g., via virtual buttons or keys); and I/O circuitry for allowing the device to exchange communications with an external computer, such as a server and/or client, and with consumer appliances. Additional input/output circuitry, such as IR transmitter(s) and/or receiver(s), a microphone, a loudspeaker or earphone jack, a barcode or RFID reader, etc., may also be provided.
To control the operation of the device 10, the memory system includes stored programming instructions that are intended to be executed by the processor. In this manner, the processor may be programmed to control the various electronic components within the device 10, e.g., to monitor power, to cause the transmission of signals, etc. Within the memory system, the ROM portion of memory is preferably used to store fixed programming and data that remains unchanged for the life of the product. The non-volatile read/write memory, which may be FLASH, EEPROM, battery-backed up RAM, “Smart Card,” memory stick, or the like, is preferably provided to store user entered setup data and parameters, downloaded data, etc., as necessary. RAM memory may be used by the processor for working storage as well as to hold data items which, by virtue of being backed up or duplicated on an external computer (for example, a client device) are not required to survive loss of battery power. While the described memory system comprises all three classes of memory, it will be appreciated that, in general, the memory system can be comprised of any type of computer-readable media, such as ROM, RAM, SRAM, FLASH, EEPROM, or the like alone or in various combinations. Preferably, however, at least part of the memory system should be non-volatile or battery backed such that basic setup parameters and operating features will survive loss of battery power. In addition, such memories may take the form of a chip, a hard disk, a magnetic disk, and/or an optical disk without limitation. For commanding the operation of appliances of different makes, models, and types, the memory system may also include a command code library. The command code library is comprised of a plurality of command codes that may be transmitted from the device 10 under the direction of the stored programming instructions for the purpose of controlling the operation of an appliance. 
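Though the specification does not prescribe any particular data layout, the command code library described above might be organized as a simple table keyed by appliance type and make. The sketch below is purely illustrative; every brand name, function name, and code value in it is invented for the example and is not part of the disclosure:

```python
# Hypothetical command code library keyed by (appliance type, make).
# All names and code values here are invented for illustration only.
COMMAND_CODE_LIBRARY = {
    ("TV", "BrandA"): {"power": 0x10EF, "volume_up": 0x20DF},
    ("DVD", "BrandB"): {"power": 0x817E, "play": 0x827D},
}

def lookup_command_code(appliance_type, make, function):
    """Return the stored code for the requested function, or None if the
    appliance or function has not been configured."""
    codes = COMMAND_CODE_LIBRARY.get((appliance_type, make), {})
    return codes.get(function)
```

A real library would also carry timing and modulation parameters so that the transmission circuit can format each code for the identified appliance, as described below.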
The memory system may also include instructions which the processor uses in connection with the transmission circuit to cause the command codes to be transmitted in a format recognized by an identified appliance. While an exemplary transmission circuit preferably utilizes infrared transmissions, it will be appreciated that other forms of wired or wireless transmissions, such as radio frequency, may also be used.
To identify appliances by type and make (and sometimes model) such that application(s) of the device 10 are adapted to cause the transmission of command codes in the format appropriate for such identified appliances, information may be entered into the device 10. Since methods for setting up an application to cause the transmissions of commands to control the operation of specific appliances are well-known, they will not be described in greater detail herein. Nevertheless, for additional details pertaining to remote control setup, the reader may turn to U.S. Pat. Nos. 6,225,938, 6,157,319, 4,623,887, 5,872,562, 5,614,906, 4,959,810, 4,774,511, and 4,703,359 which are incorporated herein by reference in their entirety.
To cause the device 10 to perform an action, the device 10 is adapted to be responsive to events, such as a sensed user interaction with one or more keys on the key matrix, a sensed user interaction with the touch screen display, a sensed user voice/sound input, a sensed gesture, a sensed movement, or other non-traditional command input methods, or a sensed signal from an external source such as a remote computer. In response to an event, appropriate instructions within the memory system are executed. For example, when a hard or soft command key associated with a remote control application is activated on the device 10, the device 10 may read the command code corresponding to the activated command key from the memory system and transmit the command code to an appliance in a format recognizable by the appliance. It will be appreciated that the instructions within the memory system can be used not only to cause the transmission of command codes to appliances but also to perform local operations. While not limiting, local operations that may be performed by the device that are related to the remote control functionality include favorite channel setup, macro button setup, command function key relocation, etc. Examples of such local operations can be found in U.S. Pat. Nos. 5,481,256, 5,959,751, 6,014,092, which are incorporated herein by reference in their entirety.
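The event handling just described, in which a sensed key activation either triggers a local operation or causes a command code to be read from memory and transmitted, can be sketched as a small dispatcher. The key names and the shape of the tables are assumptions made for illustration, not part of the specification:

```python
def handle_key_event(key, local_ops, code_table, transmit):
    """Dispatch an input event: run a bound local operation if present,
    otherwise look up the key's command code and transmit it.
    Returns the local result, the transmitted code, or None."""
    if key in local_ops:
        return local_ops[key]()          # local operation, e.g. macro setup
    code = code_table.get(key)
    if code is not None:
        transmit(code)                   # send to the appliance
    return code
```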
As discussed, the underlying platform of the device 10 preferably comprises a general purpose processor system which is controllable by stored programming instructions, i.e., software. The software may include routines, programs, objects, components, and/or data structures that perform particular tasks that can be viewed as an operating system together with one or more applications. The operating system, such as the “Windows CE” brand operating system or the like, provides an underlying set of management and control functions, device drivers, and the like which are utilized by the various applications to offer the user functions such as a calendar, address book, spreadsheet, notepad, Internet browsing, etc., as well as control of appliances. Thus, it is to be understood that applications in addition to or complementary with the remote-control-like application can also be supported by the device 10 and, as such, in terms of the internal software architecture, the remote-control-like application may be but one of several possible applications which may co-exist within the device 10.
In terms of providing operating system functionality, it should also be understood that the demarcation between the device 10 and a host/client computer, described in greater detail hereinafter, may vary considerably from product to product. For example, at one extreme the device 10 may be nothing more than a slave display and input device in wireless communication with a computer that performs all computational functions. At the other extreme, the device 10 may be a fully-functional computer system in its own right complete with local mass storage. It is also to be appreciated that a hardware platform similar to that described above may be used in conjunction with a scaled-down operating system to provide remote control functionality only, i.e., as a standalone application. In all cases, however, the principles expressed herein remain the same.
To provide a means by which a user can interact with the device 10, the device 10 is preferably provided with software that implements a graphical user interface. The graphical user interface software may also provide access to additional software, such as a browser application, that is used to display information that may be received from an external computer. Such a graphical user interface system is described in pending U.S. application Ser. Nos. 09/905,396, 60/334,774, and 60/344,020 all of which are incorporated herein by reference in their entirety. Though described in the below embodiments in conjunction with remote control software applications, it will be understood and appreciated that the various user interface and interaction based features described herein may be used in conjunction with any software program or application and are thus not specifically limited to applications directed to control of consumer appliances.
Compact Status Indicator
For maximizing available display area on a user interface and for simplifying the process of indicating content status and/or browsing through a large content set, the device 10 utilizes a compact status indicator interface 11. In particular, the compact status indicator interface 11 is designed to overcome both size and functionality restraints of portable electronic devices, and present a full featured “scroll-bar” like interface and status indication to a user for browsing through a content set (i.e., a set of text or other displayed data referencing files, photos, music, videos, program guide information, etc., that cannot be conveniently displayed within the available display area of a device). In particular, presently to navigate within large data or content sets, the user must use a traditional scroll-bar type interface such as shown in
Looking now to
Compact status indicator 11 may also indicate the relative location of the then displayed content set on a display via the position of portion 112. For example, in 11 a the content set as then displayed in content display area 12 is substantially one-quarter of the way through the amount of available content to be displayed, in 11 b the content set as then displayed in content display area 12 is substantially one-half of the way through the amount of available content to be displayed, and in 11 c the content set as then displayed in content display area 12 is substantially three-quarters of the way through the amount of available content to be displayed. In the event that portion 112 is not implemented in a specific embodiment and/or has been turned off by the user, it will be appreciated that a similar indication may be achieved by the relative position of the boundary between portions 110 and 114.
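The relative-position indication described above reduces to computing the fraction of the content set already traversed and, when the indicator is drawn as a pie graph, converting that fraction to an angular sweep. A minimal sketch, in which the function names and the 360-degree mapping are illustrative assumptions:

```python
def viewed_fraction(last_visible_index, total_items):
    """Approximate fraction of the content set already traversed, e.g.
    0.25, 0.5, or 0.75 for the three indicator states described above."""
    if total_items == 0:
        return 0.0
    return min(1.0, last_visible_index / total_items)

def indicator_sweep_degrees(last_visible_index, total_items):
    """Angular sweep of the 'viewed' portion when the compact status
    indicator is rendered as a pie graph."""
    return 360.0 * viewed_fraction(last_visible_index, total_items)
```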
The inventive aspects of the present compact status indicator will be understood and appreciated by the foregoing description and associated drawings; however, since in one exemplary method compact status indicator 11 is configured and operated to replace traditional scroll-bar 11′, reference to corresponding portions may be seen in
It will be understood and appreciated that the size, placement on a particular user interface or electronic device, shading, coloring, and other “look and feel” elements of the compact status indicator of the current invention may vary widely without departing from the spirit and scope of the current invention. Additionally, the particular methods and techniques for generating and allowing interaction with the compact status indicator will be apparent from the descriptions herein, as well as being well within the routine skill of a programmer skilled in the art. For instance, when implemented in conjunction with a remote control application, the compact status indicator of the current invention can be configured to provide status of displayed content sets, to allow interaction with such content sets, and to accomplish other desired user interactions with home appliances using the control methods described above. Thus, the compact status indicator may also be operable in a given device mode to not only indicate a relative position within an appliance state but also to provide interactive control of that state, such as volume control (wherein the pie-shaped indicator has a range from no volume to maximum volume), channel control/surfing (wherein the pie-shaped indicator has a range from the lowest available channel to the highest available channel), etc. In connection with operating the device 10 in this manner, interaction with the indicator may transmit appropriate commands such as volume up/down, channel up/down, etc. In certain circumstances, an absolute value may also be transmitted—such as a specific channel number if the device 10 is provided with a correspondence between a position of the indicator 112 within the pie-graph and a channel number within the range of channel numbers available.
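The absolute-value transmission mentioned above, deriving a specific channel number from the indicator position, amounts to a linear mapping from the indicator's range onto the available channel range. A hedged sketch, assuming the indicator position is normalized to 0.0 through 1.0:

```python
def channel_for_position(fraction, lowest_channel, highest_channel):
    """Map a relative indicator position (0.0 through 1.0) onto an
    absolute channel number within the available channel range."""
    span = highest_channel - lowest_channel
    return lowest_channel + round(fraction * span)
```

The same mapping, with different endpoints, would serve for an absolute volume level.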
Virtual Scroll Wheel
For further maximizing available display area on a user interface and for simplifying the process of browsing through a large content set, a device 10 may utilize a virtual scroll wheel interface 14. In particular, the virtual scroll wheel interface 14 is designed to overcome both size and functionality restraints of a small electronic device, and present a full featured “mechanical scroll-wheel” or “mechanical jog-dial” like interface to a user for browsing through a large content set (i.e., a set of text or other displayed data referencing files, photos, music, videos, etc. that cannot be conveniently displayed within the available display area of a device) using a finger, stylus, or other user interactivity element. In particular, presently to navigate within large data or content sets, the user must use a traditional scroll-bar type interface such as shown in
Looking now to
Other finger and stylus (and generally any user interactivity element) interactions are possible given the described method, for instance the stylus may be first caused to touch virtual scroll wheel 14 at a substantially middle location, whereby a movement dragging the stylus downward within virtual scroll wheel 14 causes increasingly accelerated upward scrolling of the content in content display area 12, and subsequent upward dragging of the stylus (generally without allowing the stylus to leave the surface of display 1) causes a gradual slowing, and then reversal of direction in the scrolling of content. Likewise, for interaction by a user's finger in a manner similar to a scroll wheel on a mouse, virtual scroll wheel 14 may be configured (through programming on device 10) to respond to successive swipes or drags (e.g., touches and releases) of a user's thumb in the direction of scroll bar orientation (as determined by programming in or available to device 10). For instance, downward vertical swipes or drags of a user's thumb may cause upward scrolling of the content in content display area 12, while upward vertical swipes or drags of a user's thumb may cause downward scrolling of the content in content display area 12. It will be understood and appreciated that distances and directions traveled within virtual scroll wheel 14 for scrolling and acceleration purposes, and the particular method and technique of monitoring and calculating stylus screen touch, drag, pause, and off points are the subject of design choices that may be dictated by, among other factors, device platform, operating system, programming language, etc., and are all well within the routine skill of a programmer skilled in the art. By way of example, with reference to
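One simple way to realize the drag behavior described in this section, where displacement from the initial touch point sets the scrolling speed and dragging back through that point reverses direction, is a proportional mapping. The gain constant and sign convention below are arbitrary illustrative choices (screen y grows downward, so a downward drag yields a negative, i.e. upward, content velocity):

```python
def scroll_velocity(start_y, current_y, gain=0.5):
    """Signed scroll velocity from drag displacement within the virtual
    scroll wheel. Dragging down (y increases) scrolls content upward;
    larger displacement scrolls faster; dragging back reverses."""
    displacement = current_y - start_y
    return -gain * displacement  # negative = content scrolls up
```

Dragging further from the initial touch point accelerates scrolling; returning toward it slows and then reverses the scroll, matching the stylus interaction described above.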
Scalable User Interface
For further maximizing available display area on a user interface and for providing quick transition between a number of available interface states or parameters, or a continuum of interface states available to a user, a device 10 may utilize a scalable user interface 200. In particular, the scalable user interface 200 is designed to overcome both size and functionality restraints of portable electronic devices, and present an easily modifiable user interface consisting of two or more interface states to a user for operating and interacting with an application or program on device 10. At present, to modify the particular level of detail, resolution, or other user definable graphic user interface characteristics, the user must perform a number of tedious steps and/or operations on the device or software application to effect the desired modifications. While methods such as skins, accessibility settings, and configurable toolbars have been used to allow a user to customize various aspects of a particular user interface, such methods are laborious as indicated above, and lack the immediacy of customization required for certain applications. As will be appreciated, for certain applications such as remote control user interfaces for electronic devices, the lack of immediately customizable user interfaces is problematic given that one main purpose of remote control applications is to save users time and effort when interacting with their home appliances and media content. As such, current user interface customization methods are lacking for multiple user interfaces, applications having multiple interface screens and elements, and for quickly switching between desired application interface states and parameters. The scalable user interface of the current invention overcomes these limitations while presenting an easily customizable, full function interface to a user.
Looking now to
In order to effect modification of interface 200 to interface 202 or 204, the user need not access any menus, settings pages, or the like according to the current invention. Rather, programming on or accessible to device 10 causes various interactions of an interactivity element with a predefined area of the screen to cause interface 200 to be reconfigured as interface 202 or 204. For example, as illustrated in
While not to be taken as limiting, the exemplary method described above involves interacting with a top portion of the display in order to modify one parameter (e.g., resolution, the enlarging or minimizing of interface elements) of interface 200. As shown in
It will be understood and appreciated that the actual shape, graphical elements, and other “look and feel” characteristics of interface states, the type and nature of modifiable interface parameters, as well as location and direction of defined display areas for interface customization purposes, and the particular method and technique of monitoring and calculating stylus screen touch, drag, pause, and off points are the subject of design choices that may be dictated by, among other factors, device platform, operating system, programming language, etc., and are all well within the routine skill of a programmer skilled in the art. As such, many methods of implementing and using the scalable user interface of the current invention are possible without departing from the spirit and scope of the current invention.
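As a hedged sketch of the direct-manipulation customization this section describes, a drag within a predefined strip of the display might drive a single interface parameter, here a scale factor for interface elements. The strip geometry, bounds, and function name are illustrative assumptions only:

```python
def interface_scale(drag_x, strip_width, min_scale=0.5, max_scale=2.0):
    """Map a horizontal drag position within a predefined display strip
    to an interface scale factor, clamped to [min_scale, max_scale].
    No menus or settings pages are involved: the drag itself is the
    customization gesture."""
    t = max(0.0, min(1.0, drag_x / strip_width))
    return min_scale + t * (max_scale - min_scale)
```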
Virtual Gesture Pad
For providing an enhanced user interface to a user such that user movements and gestures on a touch screen display enable control and operation of a software application, the device 10 utilizes a virtual gesture pad interface 300. In particular, the virtual gesture pad interface 300 is designed to overcome both size and functionality restraints of portable electronic devices, and present an easily controllable interface such that predefined movements of a stylus, finger, cursor, or other user interactivity element may control operation of a software application, optimally without full attention of the user (i.e., single handed operation and/or operation while not viewing the display of device 10). For example, U.S. Pat. No. 6,396,523 describes a remote control device having a touch pad element adapted to receive gesture based input for initiation of various remote control functions. In this manner, gesture input is converted to command functions and control of the associated home entertainment equipment is effected.
While the system described in U.S. Pat. No. 6,396,523 performs adequately when implemented on a larger scale device having both physical buttons and touch pad elements to facilitate interaction with information presented on a separate display device such as a television screen, what is needed is a virtual gesture pad that functions to provide advanced control based features quickly and interchangeably with traditional soft button based controls for electronic devices having integrated touch sensitive screen displays. As will be appreciated, for certain applications such as touch screen based remote control user interfaces for electronic devices, the lack of easily accessible and intuitive gesture based user interfaces is problematic given that users often desire to operate the remote control application using a single hand or finger, and/or without directly viewing the display screen. As such, current gesture based methods for control and operation of software applications are lacking for integrated touch screen devices. The virtual gesture pad of the current invention overcomes these limitations while presenting an intuitive, interchangeable, gesture based interface to a user.
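A minimal sketch of the gesture-to-command conversion described above: classify a single stroke by its dominant axis, then translate the recognized gesture into a remote control function. The distance threshold and the command bindings are illustrative assumptions, not part of the specification:

```python
def classify_gesture(dx, dy, threshold=20):
    """Classify a single stroke by its net displacement (screen y grows
    downward). Returns 'tap', 'left', 'right', 'up', or 'down'."""
    if abs(dx) < threshold and abs(dy) < threshold:
        return "tap"
    if abs(dx) >= abs(dy):
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"

# Hypothetical gesture-to-function bindings for a remote control mode.
GESTURE_COMMANDS = {
    "up": "volume_up", "down": "volume_down",
    "left": "channel_down", "right": "channel_up", "tap": "select",
}
```

Because each stroke is judged only by its net displacement, a pad of this kind can be operated with a single thumb and without looking at the screen, as the text contemplates.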
Looking now to
It will be understood and appreciated by those skilled in the art that various configurations of virtual gesture pads 302 and 304, including different shapes, sizes, and locations of the various gesture targets and the particular method and technique of monitoring and calculating finger or stylus screen touch, drag, pause, and off points are the subject of design choices that may be dictated by, among other factors, device platform, operating system, programming language, etc., and are all well within the routine skill of a programmer skilled in the art. For instance, using methods similar to those illustrated and described in reference to
Graphical Search Query Interface
For maximizing available display area on a user interface and for simplifying the process of indicating to a user the results of a search query and allowing the user to interact with the search query results, a device 10 may utilize a graphical search query interface 400 as shown in
Looking now to
Given the above disclosure and associated figures, it will be appreciated by those skilled in the art that the described results sets constitute Venn diagrams as applied to Boolean search methods; however, it is one object of the present invention to present these results sets to a user in a dynamic and interactive fashion. As shown in
It will be understood and appreciated that the size, placement on a particular user interface or electronic device, shading, coloring, and other “look and feel” elements of the graphical search query interface of the current invention may vary widely without departing from the spirit and scope of the current invention. Additionally, the particular methods and techniques for generating and allowing interaction with the graphical search query interface will be apparent from the descriptions herein, as well as being well within the routine skill of a programmer skilled in the art. For instance, various algorithms for causing search terms or results sets to be modified can be implemented in conjunction with the graphical search query interface of the current invention to accomplish desired user experience goals for a specific device or software product without departing from the spirit and scope of the present invention.
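The Venn-style presentation of Boolean results noted above can be sketched with ordinary set operations: for a two-term query, the matches split into each term's exclusive region plus the shared intersection that the overlapping circles would display. The term names and region labels below are illustrative:

```python
def query_result_regions(term_matches):
    """Split per-term match sets into the regions a two-circle Venn
    display would show: items unique to each term, plus the shared
    intersection of all terms."""
    common = set.intersection(*(set(m) for m in term_matches.values()))
    regions = {term: set(m) - common for term, m in term_matches.items()}
    regions["intersection"] = common
    return regions
```

A dynamic interface could recompute these regions as the user drags terms together or apart, resizing each region's graphical area to reflect its item count.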
Graphical Indication of Device Mode and State
Device 10 and associated application software may be configured to operate in networked control environments wherein some or all of the home appliances and components are configured to be network enabled and interoperable, and which include one or more media server devices and media rendering devices (e.g., as defined and described in the UPnP and/or HAVi specifications which can be found at http://www.upnp.org and http://www.havi.org respectively and which are incorporated herein by reference in their entirety). Turning now to
It will be appreciated that in the case of servers offering multiple content types, switching between content types to be listed in area 12 may be accomplished by touching one of the sub-icons 506, 508, 510 with a stylus, finger, cursor, etc. Such interaction may occur directly within the area 500 illustrated, or alternatively an initial touch may cause an enlarged semi-transparent version of this area to overlay the display in a manner similar to that described earlier in connection with
In another aspect of the current invention, area 500 may provide information regarding a rendering device currently selected rather than a content server. In this case, icon 504 may be representative of a renderer such as a stereo amplifier, TV set, Smart Display, Linksys Digital Media Adapter, HP Digital Media Renderer, or other hardware or software based media renderer, and sub-icons 506, 508, 510 may indicate which types of content that rendering device is capable of processing. In a manner similar to that described above, one of the sub-icons may be highlighted to indicate the currently selected, or currently playing media type. Additionally, in order to indicate to a user that the rendering device represented by icon 504 is one of a plurality of available rendering devices for use, an alphanumeric indicator 505 may be associated with icon 504. Alphanumeric indicator 505 may represent the number of rendering devices available, or may represent the order of the particular rendering device being represented by icon 504 from a number of available rendering devices. Alphanumeric indicator 505 may be placed anywhere on or near icon 504 such that a user may associate indicator 505 with icon 504, and indicator 505 may be configured to respond to touch by a stylus, finger, cursor, or other interactivity element in order to switch to a different rendering device.
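Touching indicator 505 to switch to a different rendering device amounts to cycling through the list of available renderers. A trivial sketch under that assumption, with the wrap-around behavior as an illustrative design choice:

```python
def next_renderer(renderers, current_index):
    """Advance to the next available rendering device, wrapping around
    to the first renderer after the last one."""
    return (current_index + 1) % len(renderers)
```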
For situations where a selected renderer does not include certain media rendering functions, the icons 506, 508, or 510 may include an additional availability indicator 512 (i.e., an “X” placed next to the icon representing the unavailable media function as shown in
Similarly, as shown in
It will be appreciated that these concepts may be extended across multiple areas of the display surface in order to simultaneously present the status of both a server and a rendering device. For example, the space shown occupied by the setup “wrench” icon 520 could in certain circumstances be replaced by a display element similar to area 500, allowing simultaneous presentation of the capabilities of both a server and a rendering device using similar visual elements. In this instance it will be understood that in addition to the touch inputs previously described, switching and routing of media streams between servers and rendering devices may also be accomplished by dragging a stylus, finger, cursor, etc., from one of the sub-icons of a server representation into a renderer representation, and vice-versa. It will be further understood and appreciated that the size, placement on a particular user interface or electronic device, shading, coloring, and other “look and feel” elements of the described icons and indicators of the current invention may vary widely without departing from the spirit and scope of the current invention.
The system and process of the present invention has been described above via the use of illustrative graphical user interface elements and designs. It is understood that unless otherwise stated to the contrary herein, the functions and methods by which these are generated and rendered may be integrated in a single physical device or a software module in a software product, or may be implemented in separate physical devices or software modules, without departing from the scope and spirit of the present invention.
It is to be appreciated that detailed discussion of the actual implementation of each graphical display element and user interface method is not necessary for an enabling understanding of the invention. The actual implementation is well within the routine skill of a programmer and system engineer, given the disclosure herein of the system attributes, functionality, and inter-relationship of the various elements in the system. A person skilled in the art, applying ordinary skill can practice the present invention without undue experimentation.
While the invention has been described with respect to various illustrative examples, it will be apparent to those skilled in the art that various modifications and improvements may be made without departing from the scope and spirit of the invention. Accordingly, it is to be understood that the invention is not to be limited by these specifically illustrated examples.
All of the cited references are incorporated herein by reference in their entirety.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4623887 *||May 15, 1984||Nov 18, 1986||General Electric Company||Reconfigurable remote control|
|US4703359 *||Nov 20, 1985||Oct 27, 1987||Nap Consumer Electronics Corp.||Universal remote control unit with model identification capability|
|US4774511 *||May 30, 1985||Sep 27, 1988||Nap Consumer Electronics Corp.||Universal remote control unit|
|US4959810 *||Dec 2, 1987||Sep 25, 1990||Universal Electronics, Inc.||Universal remote control device|
|US5481256 *||Nov 29, 1993||Jan 2, 1996||Universal Electronics Inc.||Direct entry remote control with channel scan|
|US5592604 *||Aug 31, 1994||Jan 7, 1997||International Business Machines Corporation||Method and system for indicating boundaries of connected data subsets|
|US5614906 *||Apr 23, 1996||Mar 25, 1997||Universal Electronics Inc.||Method for selecting a remote control command set|
|US5872562 *||May 24, 1993||Feb 16, 1999||U.S. Philips Corporation||Universal remote control transmitter with simplified device identification|
|US5959751 *||Jun 13, 1997||Sep 28, 1999||Universal Electronics Inc.||Universal remote control device|
|US6014092 *||Dec 11, 1992||Jan 11, 2000||Universal Electronics Inc.||Key mover|
|US6157319 *||Jul 23, 1998||Dec 5, 2000||Universal Electronics Inc.||Universal remote control system with device activated setup|
|US6225938 *||Jan 14, 1999||May 1, 2001||Universal Electronics Inc.||Universal remote control system with bar code setup|
|US6396523 *||Mar 14, 2000||May 28, 2002||Interlink Electronics, Inc.||Home entertainment device remote control|
|US6867764 *||Mar 22, 2001||Mar 15, 2005||Sony Corporation||Data entry user interface|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7552402||Jun 22, 2006||Jun 23, 2009||Microsoft Corporation||Interface orientation using shadows|
|US7570259 *||Jun 1, 2004||Aug 4, 2009||Intel Corporation||System to manage display power consumption|
|US7596761||Jul 24, 2006||Sep 29, 2009||Apple Inc.||Application user interface with navigation bar showing current and prior application contexts|
|US7600201 *||Apr 7, 2004||Oct 6, 2009||Sony Corporation||Methods and apparatuses for viewing choices and making selections|
|US7612786||Feb 10, 2006||Nov 3, 2009||Microsoft Corporation||Variable orientation input mode|
|US7613696||Feb 24, 2006||Nov 3, 2009||International Business Machines Corporation||Configuring search results using a layout editor|
|US7653883 *||Sep 30, 2005||Jan 26, 2010||Apple Inc.||Proximity detector in handheld device|
|US7692637 *||Apr 26, 2005||Apr 6, 2010||Nokia Corporation||User input device for electronic device|
|US7707516 *||May 26, 2006||Apr 27, 2010||Google Inc.||Embedded navigation interface|
|US7770126 *||Feb 10, 2006||Aug 3, 2010||Microsoft Corporation||Assisting user interface element use|
|US7844914||Sep 16, 2005||Nov 30, 2010||Apple Inc.||Activating virtual keys of a touch-screen virtual keyboard|
|US7881656||Dec 26, 2007||Feb 1, 2011||Sandisk Corporation||Audio visual player apparatus and system and method of content distribution using the same|
|US7889095||Oct 3, 2008||Feb 15, 2011||Logitech Europe S.A.||Method and apparatus for uploading and downloading remote control codes|
|US7895533 *||Aug 6, 2007||Feb 22, 2011||Apple Inc.||Interactive image thumbnails|
|US7934156 *||Sep 5, 2007||Apr 26, 2011||Apple Inc.||Deletion gestures on a portable multifunction device|
|US7940250||Sep 4, 2007||May 10, 2011||Apple Inc.||Web-clip widgets on a portable multifunction device|
|US7944370||Nov 3, 2005||May 17, 2011||Logitech Europe S.A.||Configuration method for a remote control via model number entry for a controlled device|
|US7956847||Jun 13, 2007||Jun 7, 2011||Apple Inc.||Gestures for controlling, manipulating, and editing of media files using touch sensitive devices|
|US8001613||Jun 23, 2006||Aug 16, 2011||Microsoft Corporation||Security using physical objects|
|US8091045||Jun 28, 2007||Jan 3, 2012||Apple Inc.||System and method for managing lists|
|US8108491||Apr 30, 2009||Jan 31, 2012||International Business Machines Corporation||Method and system for control of access to global computer networks|
|US8120590 *||Oct 6, 2010||Feb 21, 2012||Lg Electronics Inc.||Mobile communication terminal and method of selecting menu and item|
|US8122384 *||Sep 18, 2007||Feb 21, 2012||Palo Alto Research Center Incorporated||Method and apparatus for selecting an object within a user interface by performing a gesture|
|US8127246 *||Oct 1, 2007||Feb 28, 2012||Apple Inc.||Varying user interface element based on movement|
|US8130205||Jan 4, 2008||Mar 6, 2012||Apple Inc.||Portable electronic device, method, and graphical user interface for displaying electronic lists and documents|
|US8139059||Mar 31, 2006||Mar 20, 2012||Microsoft Corporation||Object illumination in a virtual environment|
|US8146019 *||Feb 6, 2008||Mar 27, 2012||Samsung Electronics Co., Ltd.||Method and terminal for playing and displaying music|
|US8160495||Apr 17, 2012||Sandisk Technologies Inc.||Wireless portable device for sharing digital content items|
|US8190994 *||Oct 25, 2007||May 29, 2012||Nokia Corporation||System and method for listening to audio content|
|US8205157||Sep 30, 2008||Jun 19, 2012||Apple Inc.||Methods and graphical user interfaces for conducting searches on a portable multifunction device|
|US8223134||Mar 5, 2012||Jul 17, 2012||Apple Inc.||Portable electronic device, method, and graphical user interface for displaying electronic lists and documents|
|US8279192||Jan 9, 2012||Oct 2, 2012||Lg Electronics Inc.||Mobile communication terminal and method of selecting menu and item|
|US8341524 *||Sep 11, 2006||Dec 25, 2012||Apple Inc.||Portable electronic device with local search capabilities|
|US8368665||Jul 12, 2012||Feb 5, 2013||Apple Inc.||Portable electronic device, method, and graphical user interface for displaying electronic lists and documents|
|US8381135 *||Sep 30, 2005||Feb 19, 2013||Apple Inc.||Proximity detector in handheld device|
|US8386965 *||Jan 15, 2010||Feb 26, 2013||Apple Inc.||Techniques and systems for enhancing touch screen device accessibility through virtual containers and virtually enlarged boundaries|
|US8407199 *||Apr 28, 2011||Mar 26, 2013||International Business Machines Corporation||Graphic query suggestion apparatus and program product|
|US8407623||Jun 25, 2009||Mar 26, 2013||Apple Inc.||Playback control using a touch interface|
|US8452600||Aug 18, 2010||May 28, 2013||Apple Inc.||Assisted reader|
|US8453057 *||Dec 22, 2008||May 28, 2013||Verizon Patent And Licensing Inc.||Stage interaction for mobile device|
|US8469810 *||Oct 19, 2005||Jun 25, 2013||Nintendo Co., Ltd.||Storage medium having game program stored thereon and game apparatus|
|US8477112||Aug 31, 2012||Jul 2, 2013||Lg Electronics Inc.||Mobile communication terminal and method of selecting menu and item|
|US8493344||Sep 23, 2009||Jul 23, 2013||Apple Inc.||Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface|
|US8504947||Apr 25, 2011||Aug 6, 2013||Apple Inc.||Deletion gestures on a portable multifunction device|
|US8555091||Oct 27, 2010||Oct 8, 2013||Intel Corporation||Dynamic power state determination of a graphics processing unit|
|US8555182 *||Jun 7, 2006||Oct 8, 2013||Microsoft Corporation||Interface for managing search term importance relationships|
|US8564544 *||Sep 5, 2007||Oct 22, 2013||Apple Inc.||Touch screen device, method, and graphical user interface for customizing display of content category icons|
|US8566720||May 29, 2012||Oct 22, 2013||Nokia Corporation||System and method for listening to audio content|
|US8570279||Jun 27, 2008||Oct 29, 2013||Apple Inc.||Touch screen device, method, and graphical user interface for inserting a character from an alternate keyboard|
|US8589374||Sep 24, 2009||Nov 19, 2013||Apple Inc.||Multifunction device with integrated search and application selection|
|US8589823||Aug 25, 2009||Nov 19, 2013||Apple Inc.||Application user interface with navigation bar showing current and prior application contexts|
|US8621378 *||Sep 24, 2009||Dec 31, 2013||Fujitsu Limited||Mobile terminal device and display control method|
|US8635547 *||Jan 7, 2010||Jan 21, 2014||Sony Corporation||Display device and display method|
|US8635910 *||Jan 23, 2012||Jan 28, 2014||International Business Machines Corporation||Accelerometer module for use with a touch sensitive device|
|US8656296||Jan 22, 2013||Feb 18, 2014||Google Inc.||Selection of characters in a string of characters|
|US8656315 *||Sep 30, 2011||Feb 18, 2014||Google Inc.||Moving a graphical selector|
|US8665075||Oct 26, 2009||Mar 4, 2014||At&T Intellectual Property I, L.P.||Gesture-initiated remote control programming|
|US8681106||Sep 23, 2009||Mar 25, 2014||Apple Inc.||Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface|
|US8686962||Jun 7, 2011||Apr 1, 2014||Apple Inc.||Gestures for controlling, manipulating, and editing of media files using touch sensitive devices|
|US8689132 *||Dec 31, 2007||Apr 1, 2014||Apple Inc.||Portable electronic device, method, and graphical user interface for displaying electronic documents and lists|
|US8700739||Mar 10, 2008||Apr 15, 2014||Sandisk Technologies Inc.||Device for automatically receiving new digital content from a network|
|US8706712 *||Mar 7, 2012||Apr 22, 2014||International Business Machines Corporation||Graphic query suggestion display method|
|US8707195||Jun 7, 2010||Apr 22, 2014||Apple Inc.||Devices, methods, and graphical user interfaces for accessibility via a touch-sensitive surface|
|US8751971||Aug 30, 2011||Jun 10, 2014||Apple Inc.||Devices, methods, and graphical user interfaces for providing accessibility using a touch-sensitive surface|
|US8782546 *||Jan 27, 2014||Jul 15, 2014||Supercell Oy||System, method and graphical user interface for controlling a game|
|US8814683||Jan 22, 2013||Aug 26, 2014||Wms Gaming Inc.||Gaming system and methods adapted to utilize recorded player gestures|
|US8826190||May 27, 2011||Sep 2, 2014||Google Inc.||Moving a graphical selector|
|US8839154||Dec 31, 2008||Sep 16, 2014||Nokia Corporation||Enhanced zooming functionality|
|US8884882||Jan 31, 2008||Nov 11, 2014||Pentax Ricoh Imaging Company, Ltd.||Mobile equipment with display function|
|US8930834||Mar 20, 2006||Jan 6, 2015||Microsoft Corporation||Variable orientation user interface|
|US9009612||Sep 23, 2009||Apr 14, 2015||Apple Inc.||Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface|
|US9032294||Oct 18, 2013||May 12, 2015||Nokia Corporation||System and method for listening to audio content|
|US9066199||Jun 27, 2008||Jun 23, 2015||Apple Inc.||Location-aware mobile device|
|US9081781||Aug 22, 2011||Jul 14, 2015||Sandisk Technologies Inc.||Wireless portable device for creating and wirelessly transmitting digital audio and/or video|
|US9092519||Jun 28, 2010||Jul 28, 2015||Sandisk Technologies Inc.||Method and system for updating a list of content stored on a user-operated device|
|US9109904||Jan 25, 2008||Aug 18, 2015||Apple Inc.||Integration of map services and user applications in a mobile device|
|US20010033244 *||Mar 12, 2001||Oct 25, 2001||Harris Glen Mclean||Remote control multimedia content listing system|
|US20050030196 *||Jun 16, 2004||Feb 10, 2005||Harris Glen Mclean||State-based remote control system|
|US20050229116 *||Apr 7, 2004||Oct 13, 2005||Endler Sean C||Methods and apparatuses for viewing choices and making selections|
|US20050289360 *||Jun 1, 2004||Dec 29, 2005||Rajesh Banginwar||System to manage display power consumption|
|US20060026535 *||Jan 18, 2005||Feb 2, 2006||Apple Computer Inc.||Mode-based graphical user interfaces for touch sensitive input devices|
|US20070236475 *||Apr 4, 2007||Oct 11, 2007||Synaptics Incorporated||Graphical scroll wheel|
|US20080122796 *||Sep 5, 2007||May 29, 2008||Jobs Steven P||Touch Screen Device, Method, and Graphical User Interface for Determining Commands by Applying Heuristics|
|US20080168349 *||Dec 31, 2007||Jul 10, 2008||Lamiraux Henri C||Portable Electronic Device, Method, and Graphical User Interface for Displaying Electronic Documents and Lists|
|US20090125848 *||Nov 14, 2007||May 14, 2009||Susann Marie Keohane||Touch surface-sensitive edit system|
|US20090158149 *||Aug 6, 2008||Jun 18, 2009||Samsung Electronics Co., Ltd.||Menu control system and method|
|US20090158203 *||Dec 14, 2007||Jun 18, 2009||Apple Inc.||Scrolling displayed objects using a 3D remote controller in a media system|
|US20100026530 *||Mar 23, 2007||Feb 4, 2010||Jae Kyung Lee||Method of generating key code in coordinate recognition device and apparatus using the same|
|US20100053458 *||Apr 30, 2009||Mar 4, 2010||International Business Machines Corporation||Method and System for Network Enabled Remote Controls Using Physical Motion Detection Remote control Devices|
|US20100107116 *||Oct 27, 2008||Apr 29, 2010||Nokia Corporation||Input on touch user interfaces|
|US20100174987 *||Jul 8, 2010||Samsung Electronics Co., Ltd.||Method and apparatus for navigation between objects in an electronic apparatus|
|US20100180222 *||Jul 15, 2010||Sony Corporation||Display device and display method|
|US20110179388 *||Jan 15, 2010||Jul 21, 2011||Apple Inc.||Techniques And Systems For Enhancing Touch Screen Device Accessibility Through Virtual Containers And Virtually Enlarged Boundaries|
|US20120050008 *||Aug 31, 2010||Mar 1, 2012||Plantronics, Inc.||Methods and Systems For Secure Pass-Set Entry|
|US20120058825 *||Aug 31, 2011||Mar 8, 2012||Konami Digital Entertainment Co., Ltd.||Game apparatus, game control method, and information recording medium|
|US20120098850 *||Apr 26, 2012||Tomoya Narita||Information processing apparatus, information processing method and program|
|US20120113053 *||May 10, 2012||International Business Machines Corporation||Accelerometer Module for Use With A Touch Sensitive Device|
|US20120174025 *||Jul 5, 2012||Zumobi, Inc.||Single-Handed Approach for Navigation of Application Tiles Using Panning and Zooming|
|US20120278355 *||Nov 1, 2012||International Business Machines Corporation||Graphic Query Suggestion Display Method|
|US20120278369 *||Apr 28, 2011||Nov 1, 2012||International Business Machines Corporation||Graphic Query Suggestion Apparatus and Program Product|
|US20120306794 *||Jun 28, 2012||Dec 6, 2012||Apple Inc.||Method and apparatus for implementing multiple push buttons in a user input device|
|US20130036388 *||Sep 30, 2011||Feb 7, 2013||Google Inc.||Moving a graphical selector|
|US20130085847 *||Oct 7, 2011||Apr 4, 2013||Matthew G. Dyor||Persistent gesturelets|
|US20130085855 *||Oct 28, 2011||Apr 4, 2013||Matthew G. Dyor||Gesture based navigation system|
|US20130227464 *||Feb 22, 2013||Aug 29, 2013||Samsung Electronics Co., Ltd.||Screen change method of touch screen portable terminal and apparatus therefor|
|US20140013225 *||Mar 15, 2013||Jan 9, 2014||Pegatron Corporation||Digital media controller and method for controlling a digital media system|
|US20150177971 *||Aug 26, 2014||Jun 25, 2015||Han Uk JEONG||Electronic device and a method for controlling the same|
|EP1942403A2||Jan 3, 2008||Jul 9, 2008||Samsung Electronics Co., Ltd.||Data scrolling apparatus and method for mobile terminal|
|EP1993028A1||Feb 6, 2008||Nov 19, 2008||High Tech Computer Corp.||Method and device for handling large input mechanisms in touch screens|
|EP2068235A2 *||Nov 19, 2008||Jun 10, 2009||Sony Corporation||Input device, display device, input method, display method, and program|
|EP2103116A1 *||Dec 24, 2007||Sep 23, 2009||Microsoft Corporation||Media selection|
|WO2007037808A1 *||Aug 11, 2006||Apr 5, 2007||Apple Computer||Virtual input device placement on a touch screen user interface|
|WO2007082037A2 *||Jan 10, 2007||Jul 19, 2007||Cirque Corp||Touchpad control of character actions in a virtual environment using gestures|
|WO2008030878A2 *||Sep 5, 2007||Mar 13, 2008||Apple Inc||Web-clip widgets on a portable multifunction device|
|WO2008085725A1||Dec 24, 2007||Jul 17, 2008||Microsoft Corp||Media selection|
|WO2010119309A1 *||Oct 16, 2009||Oct 21, 2010||Sony Ericsson Mobile Communications Ab||Variable rate scrolling|
|U.S. Classification||345/184, 345/173, 345/157|
|Cooperative Classification||G06F2203/04807, G06F3/0485, G06F3/04883|
|European Classification||G06F3/0488G, G06F3/0485|
|Mar 7, 2005||AS||Assignment|
Owner name: UNIVERSAL ELECTRONICS INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CHAMBERS, CHRISTOPHER;SCOTT, WAYNE;LOUIE, ALEX;AND OTHERS;REEL/FRAME:016331/0485;SIGNING DATES FROM 20050107 TO 20050202