CA2533298A1 - Manipulating an on-screen object using zones surrounding the object - Google Patents
- Publication number
- CA2533298A1
- Authority
- CA
- Canada
- Prior art keywords
- input
- menu
- zones
- zone
- command
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/038—Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/02—Input arrangements using manually operated switches, e.g. using keyboards or dials
- G06F3/023—Arrangements for converting discrete items of information into a coded form, e.g. arrangements for interpreting keyboard generated codes as alphanumeric codes, operand codes or instruction codes
- G06F3/0231—Cordless keyboards
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0381—Multimodal input, i.e. interface arrangements enabling the user to issue commands by simultaneous use of input devices of different nature, e.g. voice plus gesture on digitizer
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/038—Indexing scheme relating to G06F3/038
- G06F2203/0384—Wireless input, i.e. hardware and software details of wireless interface arrangements for pointing devices
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/048—Indexing scheme relating to G06F3/048
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
Abstract
A user interface for manipulating objects of various types in a consistent manner. Each on-screen object is surrounded by a control region including a number of zones for performing various control operations on the object.
Multiple input modes are available for interacting with the zones, allowing object manipulation commands to be initiated in several different ways, such as via stroke input, pressing a mouse button, double-clicking, menu selection, voice input, and the like. The user interface is operable using any of several different types of input devices.
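The abstract describes a single set of per-zone commands reachable through several input modes. A minimal sketch of that idea, with all names, commands, and the table structure being illustrative assumptions rather than anything specified in the patent:

```python
# Hypothetical sketch: one command table per zone, reachable through
# several input modes (click, double-click, stroke, voice). The zone
# names and command strings are invented for illustration.
ZONE_COMMANDS = {
    "top-left": {"click": "rotate", "stroke-up": "grow", "voice:rotate": "rotate"},
    "bottom-right": {"double-click": "delete", "stroke-down": "shrink"},
}

def dispatch(zone: str, event: str):
    """Return the object-manipulation command bound to this event in
    this zone, or None if the event is not bound there."""
    return ZONE_COMMANDS.get(zone, {}).get(event)

# The same command can be reached via different modes:
assert dispatch("top-left", "click") == "rotate"
assert dispatch("top-left", "voice:rotate") == "rotate"
assert dispatch("bottom-right", "stroke-down") == "shrink"
assert dispatch("top-left", "unknown") is None
```

The point of the single table is that adding a new input mode (e.g. a remote controller) only adds new event keys; the underlying commands stay unchanged.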
Claims (60)
1. A user interface for a device including a display, for manipulating an object displayed on the display, the device executing program instructions for providing the user interface, the user interface comprising:
a displayed representation of the object; and a control region surrounding the displayed representation of the object and comprising a plurality of zones for accepting object manipulation commands via an input device and via at least two modes of user input.
2. The user interface of claim 1, further comprising an input device for accepting user input in the zones.
3. The user interface of claim 2, wherein the input device comprises at least one selected from the group consisting of:
a tablet for detecting a stylus position;
a mouse;
a touchpad;
a pointing device;
a touch-sensitive screen;
a keyboard;
a microphone for accepting voice input; and a remote controller.
4. The user interface of claim 1, wherein the input device comprises a keyboard including keys corresponding to the zones.
5. The user interface of claim 1, wherein the input device comprises a keyboard, and wherein standard keys on the keyboard are selectively assigned to zones.
6. The user interface of claim 1, wherein the input device comprises a keyboard including additional keys corresponding to the zones.
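Claims 4 through 6 bind keyboard keys to zones. One natural (purely hypothetical, not drawn from the patent) assignment is the numeric keypad, whose physical 3×3 layout mirrors the zone matrix of claim 9:

```python
# Hypothetical mapping of numpad keys to (row, col) zone cells.
# Numpad rows are inverted relative to screen rows: 7-8-9 sit on top
# of the keypad, so they map to screen row 0.
NUMPAD_TO_ZONE = {
    "7": (0, 0), "8": (0, 1), "9": (0, 2),
    "4": (1, 0), "5": (1, 1), "6": (1, 2),
    "1": (2, 0), "2": (2, 1), "3": (2, 2),
}

assert NUMPAD_TO_ZONE["5"] == (1, 1)  # center cell: the object itself
assert NUMPAD_TO_ZONE["7"] == (0, 0)  # top-left zone
assert NUMPAD_TO_ZONE["3"] == (2, 2)  # bottom-right zone
```

A mapping like this would satisfy claim 5 (standard keys selectively assigned to zones); claim 6 instead contemplates dedicated extra keys.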
7. The user interface of claim 1, wherein the zones are arranged in a grid.
8. The user interface of claim 1, wherein the zones are arranged in a matrix comprising rows of cells, and wherein the object representation is located within a cell of the matrix.
9. The user interface of claim 1, wherein the zones are arranged in a matrix comprising three rows of three cells each, and wherein the object representation is located in the center cell of the center row.
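Claim 9 places the object in the center cell of a 3×3 matrix of zones. A hedged sketch of hit-testing a cursor position against such a layout (the function name, argument names, and region dimensions are assumptions, not from the patent):

```python
def zone_at(x: float, y: float, left: float, top: float,
            width: float, height: float):
    """Map a point inside the control region to a (row, col) cell of a
    3x3 matrix; (1, 1) is the center cell holding the object itself.
    Points on or past an edge are clamped to the nearest cell."""
    col = min(2, max(0, int((x - left) / (width / 3))))
    row = min(2, max(0, int((y - top) / (height / 3))))
    return row, col

# Control region 300x300 at the origin; the object occupies the center cell.
assert zone_at(150, 150, 0, 0, 300, 300) == (1, 1)  # center: the object
assert zone_at(10, 10, 0, 0, 300, 300) == (0, 0)    # top-left zone
assert zone_at(290, 150, 0, 0, 300, 300) == (1, 2)  # middle-right zone
```

Each of the eight surrounding cells can then carry its own command table, while the center cell is reserved for the object representation.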
10. The user interface of claim 1, wherein the user input modes comprise at least two selected from the group consisting of:
an activation command;
an activation command concurrent with a modifier key;
voice input;
keyboard input;
remote controller input;
mouse input;
stroke input; and menu command selection.
11. The user interface of claim 1, further comprising:
a menu activatable by performing a menu activation command for a zone, the menu comprising commands, wherein the menu is displayed in proximity to the zone upon activation.
12. The user interface of claim 11, wherein at least one of the menu commands is also directly activatable by at least one of stroking, pressing a button, or double-clicking within the zone.
13. The user interface of claim 11, wherein performing the menu activation command comprises positioning an on-screen cursor within the zone and pressing a button.
14. The user interface of claim 11, wherein performing the menu activation command comprises issuing a voice command.
15. The user interface of claim 11, wherein the menu includes, for at least one command, an icon indicating a stroke direction for directly activating the command.
16. The user interface of claim 11, wherein a stroke command for a zone is activatable by positioning an on-screen cursor within the zone and stroking the cursor.
17. A computer-implemented method for manipulating an object, comprising:
displaying a representation of the object;
displaying a control region surrounding the object and comprising a plurality of zones for accepting object manipulation commands on the object via at least two modes of user input;
receiving user input in one of the zones; and responsive to the user input, changing a characteristic of the object.
18. The method of claim 17, wherein each mode of user input comprises one selected from the group consisting of:
stylus position input;
mouse input;
touchpad input;
pointing device input;
touch-sensitive screen input;
keyboard input;
voice input; and remote controller input.
19. The method of claim 17, wherein one mode of user input comprises receiving keyboard input from a keyboard including keys corresponding to the zones.
20. The method of claim 17, wherein one mode of user input comprises receiving keyboard input from a keyboard having standard keys on the keyboard selectively assigned to zones.
21. The method of claim 17, wherein one mode of user input comprises receiving keyboard input from a keyboard including additional keys corresponding to the zones.
22. The method of claim 17, wherein the zones are arranged in a grid.
23. The method of claim 17, wherein the zones are arranged in a matrix comprising rows of cells, and wherein the object representation is located within a cell of the matrix.
24. The method of claim 17, wherein the zones are arranged in a matrix comprising three rows of three cells each, and wherein the object representation is located in the center cell of the center row.
25. The method of claim 17, further comprising:
responsive to a menu activation command, displaying a menu for a zone, the menu comprising commands, wherein the menu is displayed in proximity to the zone upon activation.
26. The method of claim 25, wherein at least one of the menu commands is also directly activatable by at least one of stroking, pressing a button, or double-clicking within the zone.
27. The method of claim 25, wherein the menu activation command comprises positioning an on-screen cursor within the zone and pressing a button.
28. The method of claim 25, wherein the menu activation command comprises a voice command.
29. The method of claim 25, wherein the menu includes, for at least one command, an icon indicating a stroke direction for directly activating the command.
30. The method of claim 25, wherein the menu indicates a double-click command for direct activation of each directly activatable command.
31. The method of claim 25, wherein a stroke command for a zone is activatable by positioning an on-screen cursor within the zone and stroking the cursor.
32. The method of claim 25, wherein a double-click command for a zone is activatable by positioning an on-screen cursor within the zone and double-clicking.
33. In a user interface including a plurality of stroke commands for a zone, a computer-implemented method for manipulating an object, comprising:
responsive to a stroke along a first axis of a zone proximate the object, changing a characteristic of the object by a first increment; and responsive to a stroke along a second axis of the zone, changing the characteristic of the object by a second increment different from the first increment.
34. The method of claim 33, wherein the second increment is of smaller magnitude than the first increment.
35. The method of claim 33, wherein the second axis is perpendicular to the first axis.
36. The method of claim 35, wherein one axis is vertical, and the other axis is horizontal.
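Claims 33 through 36 adjust a characteristic by a large increment for strokes along one axis and a smaller increment along the perpendicular axis. A sketch of that coarse/fine behavior (function name and increment sizes are assumptions for illustration):

```python
def apply_stroke(value: float, dx: float, dy: float,
                 coarse: float = 10.0, fine: float = 1.0) -> float:
    """Adjust a characteristic per a stroke: predominantly horizontal
    strokes change it by a coarse increment, predominantly vertical
    strokes by a fine one. Per claims 34-36, the two axes are
    perpendicular and the second increment is smaller."""
    if abs(dx) >= abs(dy):                        # mostly horizontal
        return value + (coarse if dx > 0 else -coarse)
    return value + (fine if dy > 0 else -fine)    # mostly vertical

assert apply_stroke(50.0, dx=12, dy=2) == 60.0   # stroke right: +coarse
assert apply_stroke(50.0, dx=-12, dy=2) == 40.0  # stroke left: -coarse
assert apply_stroke(50.0, dx=1, dy=-8) == 49.0   # stroke up: -fine
```

This gives a single zone two granularities of control over the same characteristic, e.g. coarse and fine adjustment of a date or a playing position from claim 37's list.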
37. The method of claim 33, wherein the characteristic of the object is one selected from the group consisting of:
a start position;
an end position;
a duration;
a size;
a length;
a date;
a time;
a numeric value;
a width;
a height;
an image cropping specification;
a thickness;
a decimal place location;
playing speed;
playing position;
a leading character;
a terminating character;
a location;
an alignment;
a rotation;
a font;
a style;
a capitalization;
a color;
an opacity;
a brightness; and a relative volume.
38. The method of claim 33, further comprising:
responsive to the user input comprising a menu activation command:
displaying a menu comprising commands;
accepting a second user input selecting a command from the menu;
and responsive to the menu command, changing a characteristic of the object.
responsive to the user input comprising a menu activation command:
displaying a menu comprising commands;
accepting a second user input selecting a command from the menu;
and responsive to the menu command, changing a characteristic of the object.
39. A computer program product for manipulating an object, comprising:
a computer-readable medium; and computer program code, encoded on the medium, for:
displaying a representation of the object;
displaying a control region surrounding the object and comprising a plurality of zones for accepting object manipulation commands on the object via at least two modes of user input;
receiving user input in one of the zones; and responsive to the user input, changing a characteristic of the object.
40. The computer program product of claim 39, wherein each mode of user input comprises one selected from the group consisting of:
stylus position input;
mouse input;
touchpad input;
pointing device input;
touch-sensitive screen input;
keyboard input;
voice input; and remote controller input.
41. The computer program product of claim 39, wherein one mode of user input comprises receiving keyboard input from a keyboard including keys corresponding to the zones.
42. The computer program product of claim 39, further comprising computer program code for:
responsive to a menu activation command, displaying a menu for a zone, the menu comprising commands, wherein the menu is displayed in proximity to the zone upon activation.
43. The computer program product of claim 42, wherein at least one of the menu commands is also directly activatable by at least one of stroking, pressing a button, or double-clicking within the zone.
44. The computer program product of claim 42, wherein the menu includes, for at least one command, an icon indicating a stroke direction for directly activating the command.
45. In a user interface including a plurality of stroke commands for a zone, a computer-implemented computer program product for manipulating an object, comprising:
a computer-readable medium; and computer program code, encoded on the medium, for:
responsive to a stroke along a first axis of a zone proximate the object, changing a characteristic of the object by a first increment; and responsive to a stroke along a second axis of the zone, changing the characteristic of the object by a second increment different from the first increment.
46. The computer program product of claim 45, wherein the characteristic of the object is one selected from the group consisting of:
a start position;
an end position;
a duration;
a size;
a length;
a date;
a time;
a numeric value;
a width;
a height;
an image cropping specification;
a thickness;
a decimal place location;
playing speed;
playing position;
a leading character;
a terminating character;
a location;
an alignment;
a rotation;
a font;
a style;
a capitalization;
a color;
an opacity;
a brightness; and a relative volume.
47. The computer program product of claim 45, further comprising:
responsive to the user input comprising a menu activation command:
displaying a menu comprising commands;
accepting a second user input selecting a command from the menu;
and responsive to the menu command, changing a characteristic of the object.
48. A system for manipulating an object displayed on a display, comprising:
a display, for displaying a representation of the object and for displaying a control region surrounding the displayed representation of the ob-ject and comprising a plurality of zones for accepting object manipu-lation commands via an input device and via at least two modes of user input;
an input device for accepting user input in the zones; and a processor, coupled to the display and to the input device, for executing an object manipulation command in response to the user input.
49. The system of claim 48, wherein the input device comprises at least one selected from the group consisting of:
a tablet for detecting a stylus position;
a mouse;
a touchpad;
a pointing device;
a touch-sensitive screen;
a keyboard;
a microphone for accepting voice input; and a remote controller.
50. The system of claim 48, wherein the input device comprises a keyboard including keys corresponding to the zones.
51. The system of claim 48, wherein the input device comprises a keyboard, and wherein standard keys on the keyboard are selectively assigned to zones.
52. The system of claim 48, wherein the input device comprises a keyboard including additional keys corresponding to the zones.
53. The system of claim 48, wherein the zones are arranged in a grid.
54. The system of claim 48, wherein the zones are arranged in a matrix comprising rows of cells, and wherein the object representation is located within a cell of the matrix.
55. The system of claim 48, wherein the zones are arranged in a matrix comprising three rows of three cells each, and wherein the object representation is located in the center cell of the center row.
56. The system of claim 48, wherein the user input modes comprise at least two selected from the group consisting of:
an activation command;
an activation command concurrent with a modifier key;
voice input;
keyboard input;
remote controller input;
mouse input;
stroke input; and menu command selection.
57. The system of claim 48, wherein, responsive to the input device receiving a menu activation command for a zone, the display further displays, in proximity to the zone upon activation, a menu comprising commands.
58. The system of claim 57, wherein at least one of the menu commands is also directly activatable by at least one of stroking, pressing a button, or double-clicking within the zone.
59. The system of claim 57, wherein the menu includes, for at least one command, an icon indicating a stroke direction for directly activating the command.
60. The system of claim 57, wherein a stroke command for a zone is activatable by positioning an on-screen cursor within the zone and stroking the cursor.
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US10/629,129 | 2003-07-28 | ||
US10/629,129 US7164410B2 (en) | 2003-07-28 | 2003-07-28 | Manipulating an on-screen object using zones surrounding the object |
PCT/US2004/023510 WO2005013052A2 (en) | 2003-07-28 | 2004-07-20 | Manipulating an on-screen object using zones surrounding the object |
Publications (2)
Publication Number | Publication Date |
---|---|
CA2533298A1 true CA2533298A1 (en) | 2005-02-10 |
CA2533298C CA2533298C (en) | 2011-07-12 |
Family
ID=34103546
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA2533298A Expired - Fee Related CA2533298C (en) | 2003-07-28 | 2004-07-20 | Manipulating an on-screen object using zones surrounding the object |
Country Status (5)
Country | Link |
---|---|
US (2) | US7164410B2 (en) |
EP (1) | EP1652049A4 (en) |
JP (1) | JP4370326B2 (en) |
CA (1) | CA2533298C (en) |
WO (1) | WO2005013052A2 (en) |
Families Citing this family (171)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
GB0121150D0 (en) * | 2001-08-31 | 2001-10-24 | Mitel Knowledge Corp | Menu presentation system |
JP3488874B2 (en) * | 2001-10-31 | 2004-01-19 | シャープ株式会社 | Editing apparatus, editing method, editing program, and computer-readable recording medium recording editing program |
US7721197B2 (en) * | 2004-08-12 | 2010-05-18 | Microsoft Corporation | System and method of displaying content on small screen computing devices |
EP1821183A4 (en) * | 2004-10-05 | 2011-01-26 | Nikon Corp | Electronic device |
US7818672B2 (en) * | 2004-12-30 | 2010-10-19 | Microsoft Corporation | Floating action buttons |
US7720887B2 (en) * | 2004-12-30 | 2010-05-18 | Microsoft Corporation | Database navigation |
US7730067B2 (en) * | 2004-12-30 | 2010-06-01 | Microsoft Corporation | Database interaction |
US20060206815A1 (en) * | 2005-03-08 | 2006-09-14 | Pathiyal Krishna K | Handheld electronic device having improved word correction, and associated method |
US20080010055A1 (en) * | 2005-03-08 | 2008-01-10 | Pathiyal Krishna K | Handheld Electronic Device and Associated Method Employing a Multiple-Axis Input Device and Providing a Prior Variant List When Employing a Disambiguation Routine and Reinitiating a Text Entry Session on a Word |
JP4397347B2 (en) * | 2005-04-26 | 2010-01-13 | アルプス電気株式会社 | Input device |
US7509593B2 (en) * | 2005-05-12 | 2009-03-24 | Microsoft Corporation | Mouse sound volume control |
KR100643306B1 (en) * | 2005-06-13 | 2006-11-10 | 삼성전자주식회사 | Apparatus and method for supporting user interface enables selecting menus which has same position or direction of remote control's selection position |
US8365093B2 (en) * | 2005-06-30 | 2013-01-29 | Nokia Corporation | Apparatus, method and computer program product enabling storage of state of temporary display screen magnification view |
US20070067798A1 (en) * | 2005-08-17 | 2007-03-22 | Hillcrest Laboratories, Inc. | Hover-buttons for user interfaces |
US7600195B2 (en) * | 2005-11-22 | 2009-10-06 | International Business Machines Corporation | Selecting a menu option from a multiplicity of menu options which are automatically sequenced |
US7509588B2 (en) | 2005-12-30 | 2009-03-24 | Apple Inc. | Portable electronic device with interface reconfiguration mode |
US8250486B2 (en) * | 2006-01-19 | 2012-08-21 | International Business Machines Corporation | Computer controlled user interactive display interface for accessing graphic tools with a minimum of display pointer movement |
KR101327581B1 (en) * | 2006-05-24 | 2013-11-12 | 엘지전자 주식회사 | Apparatus and Operating method of touch screen |
US20090213086A1 (en) * | 2006-04-19 | 2009-08-27 | Ji Suk Chae | Touch screen device and operating method thereof |
TWI328185B (en) * | 2006-04-19 | 2010-08-01 | Lg Electronics Inc | Touch screen device for potable terminal and method of displaying and selecting menus thereon |
KR101269375B1 (en) * | 2006-05-24 | 2013-05-29 | 엘지전자 주식회사 | Touch screen apparatus and Imige displaying method of touch screen |
KR20070113022A (en) * | 2006-05-24 | 2007-11-28 | 엘지전자 주식회사 | Apparatus and operating method of touch screen responds to user input |
KR20070113025A (en) * | 2006-05-24 | 2007-11-28 | 엘지전자 주식회사 | Apparatus and operating method of touch screen |
KR20070113018A (en) * | 2006-05-24 | 2007-11-28 | 엘지전자 주식회사 | Apparatus and operating method of touch screen |
US9274807B2 (en) | 2006-04-20 | 2016-03-01 | Qualcomm Incorporated | Selective hibernation of activities in an electronic device |
US20090278806A1 (en) * | 2008-05-06 | 2009-11-12 | Matias Gonzalo Duarte | Extended touch-sensitive control area for electronic device |
US8296684B2 (en) | 2008-05-23 | 2012-10-23 | Hewlett-Packard Development Company, L.P. | Navigating among activities in a computing device |
US8683362B2 (en) | 2008-05-23 | 2014-03-25 | Qualcomm Incorporated | Card metaphor for activities in a computing device |
DE102006021399B4 (en) * | 2006-05-08 | 2008-08-28 | Combots Product Gmbh & Co. Kg | Method and device for providing a selection menu associated with a displayed symbol |
DE102006021400B4 (en) * | 2006-05-08 | 2008-08-21 | Combots Product Gmbh & Co. Kg | Method and device for providing a selection menu associated with a displayed symbol |
TW200805131A (en) * | 2006-05-24 | 2008-01-16 | Lg Electronics Inc | Touch screen device and method of selecting files thereon |
US8261967B1 (en) | 2006-07-19 | 2012-09-11 | Leapfrog Enterprises, Inc. | Techniques for interactively coupling electronic content with printed media |
US8106856B2 (en) | 2006-09-06 | 2012-01-31 | Apple Inc. | Portable electronic device for photo management |
US7934156B2 (en) * | 2006-09-06 | 2011-04-26 | Apple Inc. | Deletion gestures on a portable multifunction device |
US10313505B2 (en) | 2006-09-06 | 2019-06-04 | Apple Inc. | Portable multifunction device, method, and graphical user interface for configuring and displaying widgets |
KR101241907B1 (en) * | 2006-09-29 | 2013-03-11 | 엘지전자 주식회사 | Remote controller and Method for generation of key code on remote controller thereof |
US20080098315A1 (en) * | 2006-10-18 | 2008-04-24 | Dao-Liang Chou | Executing an operation associated with a region proximate a graphic element on a surface |
US8519964B2 (en) | 2007-01-07 | 2013-08-27 | Apple Inc. | Portable multifunction device, method, and graphical user interface supporting user navigations of graphical objects on a touch screen display |
US8091045B2 (en) * | 2007-01-07 | 2012-01-03 | Apple Inc. | System and method for managing lists |
KR100832260B1 (en) * | 2007-02-03 | 2008-05-28 | 엘지전자 주식회사 | Mobile communication terminal and controlling method for the same |
EP1959238B1 (en) * | 2007-02-13 | 2018-05-23 | Harman Becker Automotive Systems GmbH | Method for inputting a destination in a navigation unit and nagivation system therefor |
US8436815B2 (en) * | 2007-05-25 | 2013-05-07 | Microsoft Corporation | Selective enabling of multi-input controls |
US8130417B2 (en) * | 2007-06-14 | 2012-03-06 | Samsung Electronics Co., Ltd | Image processing apparatus and image processing method |
US9772751B2 (en) | 2007-06-29 | 2017-09-26 | Apple Inc. | Using gestures to slide between user interfaces |
US9619143B2 (en) * | 2008-01-06 | 2017-04-11 | Apple Inc. | Device, method, and graphical user interface for viewing application launch icons |
US8619038B2 (en) | 2007-09-04 | 2013-12-31 | Apple Inc. | Editing interface |
US11126321B2 (en) | 2007-09-04 | 2021-09-21 | Apple Inc. | Application menu user interface |
JP4835553B2 (en) * | 2007-09-05 | 2011-12-14 | 富士ゼロックス株式会社 | Information processing apparatus and program |
US9317176B2 (en) * | 2007-09-24 | 2016-04-19 | Adobe Systems Incorporated | Rendering of content in a defined region of a graphical user interface |
KR100864749B1 (en) * | 2007-11-22 | 2008-10-22 | 김연수 | Characters input method |
DE102007047933B3 (en) * | 2007-12-20 | 2009-02-26 | Vistec Semiconductor Systems Gmbh | Semiconductor wafer surface e.g. front side or rear side, inspecting method for detecting defects on wafer surface, involves processing parameter or type of image receiving for area fixed on surface with detection sensitivity of area |
KR101456570B1 (en) * | 2007-12-21 | 2014-10-31 | 엘지전자 주식회사 | Mobile terminal having digital equalizer and controlling method using the same |
US20090187840A1 (en) * | 2008-01-17 | 2009-07-23 | Vahid Moosavi | Side-bar menu and menu on a display screen of a handheld electronic device |
EP2081110A1 (en) | 2008-01-17 | 2009-07-22 | Research In Motion Limited | Side-bar menu and menu on a display screen of a handheld electronic device |
US7941765B2 (en) * | 2008-01-23 | 2011-05-10 | Wacom Co., Ltd | System and method of controlling variables using a radial control menu |
US8797271B2 (en) * | 2008-02-27 | 2014-08-05 | Microsoft Corporation | Input aggregation for a multi-touch device |
US20090242106A1 (en) * | 2008-03-07 | 2009-10-01 | Kupferman Michael E | Pre-operative surgical site marking with a temporary customizable tattoo |
US9274681B2 (en) * | 2008-03-26 | 2016-03-01 | Lg Electronics Inc. | Terminal and method of controlling the same |
US8159469B2 (en) * | 2008-05-06 | 2012-04-17 | Hewlett-Packard Development Company, L.P. | User interface for initiating activities in an electronic device |
US8296670B2 (en) * | 2008-05-19 | 2012-10-23 | Microsoft Corporation | Accessing a menu utilizing a drag-operation |
US8156445B2 (en) * | 2008-06-20 | 2012-04-10 | Microsoft Corporation | Controlled interaction with heterogeneous data |
US7996422B2 (en) | 2008-07-22 | 2011-08-09 | At&T Intellectual Property L.L.P. | System and method for adaptive media playback based on destination |
US8990848B2 (en) | 2008-07-22 | 2015-03-24 | At&T Intellectual Property I, L.P. | System and method for temporally adaptive media playback |
CN101655764A (en) * | 2008-08-19 | 2010-02-24 | 深圳富泰宏精密工业有限公司 | System and method for simplifying interface operation |
US8924892B2 (en) * | 2008-08-22 | 2014-12-30 | Fuji Xerox Co., Ltd. | Multiple selection on devices with many gestures |
JP4636141B2 (en) * | 2008-08-28 | 2011-02-23 | ソニー株式会社 | Information processing apparatus and method, and program |
US20100066764A1 (en) * | 2008-09-18 | 2010-03-18 | Microsoft Corporation | Selective character magnification on touch screen devices |
US8584031B2 (en) | 2008-11-19 | 2013-11-12 | Apple Inc. | Portable touch screen device, method, and graphical user interface for using emoji characters |
US8334845B2 (en) * | 2008-11-24 | 2012-12-18 | Firstrade Securities, Inc | Thwarting screen logging of keypad in a web-based form |
US8601385B2 (en) * | 2008-11-25 | 2013-12-03 | General Electric Company | Zero pixel travel systems and methods of use |
KR101510758B1 (en) * | 2008-12-05 | 2015-04-10 | 삼성전자 주식회사 | Display apparatus and user interface display method thereof |
KR20100069842A (en) * | 2008-12-17 | 2010-06-25 | 삼성전자주식회사 | Electronic apparatus implementing user interface and method thereof |
KR101593598B1 (en) * | 2009-04-03 | 2016-02-12 | 삼성전자주식회사 | Method for activating function of portable terminal using user gesture in portable terminal |
US8418082B2 (en) * | 2009-05-01 | 2013-04-09 | Apple Inc. | Cross-track edit indicators and edit selections |
US8627207B2 (en) * | 2009-05-01 | 2014-01-07 | Apple Inc. | Presenting an editing tool in a composite display area |
US8681106B2 (en) | 2009-06-07 | 2014-03-25 | Apple Inc. | Devices, methods, and graphical user interfaces for accessibility using a touch-sensitive surface |
KR20100134948A (en) * | 2009-06-16 | 2010-12-24 | 삼성전자주식회사 | Method for displaying menu list in touch screen based device |
US9035887B1 (en) * | 2009-07-10 | 2015-05-19 | Lexcycle, Inc | Interactive user interface |
CA2768214A1 (en) * | 2009-07-15 | 2011-01-20 | Google Inc. | Commands directed at displayed text |
JP5326912B2 (en) * | 2009-07-31 | 2013-10-30 | ブラザー工業株式会社 | Printing device, composite image data generation device, and composite image data generation program |
US20110035700A1 (en) * | 2009-08-05 | 2011-02-10 | Brian Meaney | Multi-Operation User Interface Tool |
EP2480957B1 (en) | 2009-09-22 | 2017-08-09 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US9310907B2 (en) | 2009-09-25 | 2016-04-12 | Apple Inc. | Device, method, and graphical user interface for manipulating user interface objects |
US8421762B2 (en) * | 2009-09-25 | 2013-04-16 | Apple Inc. | Device, method, and graphical user interface for manipulation of user interface objects with activation regions |
US8438500B2 (en) | 2009-09-25 | 2013-05-07 | Apple Inc. | Device, method, and graphical user interface for manipulation of user interface objects with activation regions |
US8799775B2 (en) * | 2009-09-25 | 2014-08-05 | Apple Inc. | Device, method, and graphical user interface for displaying emphasis animations for an electronic document in a presentation mode |
US8416205B2 (en) * | 2009-09-25 | 2013-04-09 | Apple Inc. | Device, method, and graphical user interface for manipulation of user interface objects with activation regions |
US20110096135A1 (en) * | 2009-10-23 | 2011-04-28 | Microsoft Corporation | Automatic labeling of a video session |
US8601394B2 (en) * | 2009-11-06 | 2013-12-03 | Bose Corporation | Graphical user interface user customization |
US20110113368A1 (en) * | 2009-11-06 | 2011-05-12 | Santiago Carvajal | Audio/Visual Device Graphical User Interface |
US20110157027A1 (en) * | 2009-12-30 | 2011-06-30 | Nokia Corporation | Method and Apparatus for Performing an Operation on a User Interface Object |
US8793611B2 (en) * | 2010-01-06 | 2014-07-29 | Apple Inc. | Device, method, and graphical user interface for manipulating selectable user interface objects |
US8786559B2 (en) * | 2010-01-06 | 2014-07-22 | Apple Inc. | Device, method, and graphical user interface for manipulating tables using multi-contact gestures |
US8698762B2 (en) | 2010-01-06 | 2014-04-15 | Apple Inc. | Device, method, and graphical user interface for navigating and displaying content in context |
US8386965B2 (en) * | 2010-01-15 | 2013-02-26 | Apple Inc. | Techniques and systems for enhancing touch screen device accessibility through virtual containers and virtually enlarged boundaries |
GB2477528B (en) * | 2010-02-04 | 2014-01-15 | Imagination Tech Ltd | Touch sensitive screen for scrolling through sets of data |
JP5704825B2 (en) * | 2010-03-08 | 2015-04-22 | キヤノン株式会社 | Information processing apparatus, control method thereof, and program |
WO2011113057A1 (en) * | 2010-03-12 | 2011-09-15 | Nuance Communications, Inc. | Multimodal text input system, such as for use with touch screens on mobile phones |
US10788976B2 (en) | 2010-04-07 | 2020-09-29 | Apple Inc. | Device, method, and graphical user interface for managing folders with multiple pages |
US8423911B2 (en) | 2010-04-07 | 2013-04-16 | Apple Inc. | Device, method, and graphical user interface for managing folders |
US9285988B2 (en) | 2010-04-20 | 2016-03-15 | Blackberry Limited | Portable electronic device having touch-sensitive display with variable repeat rate |
US20110265009A1 (en) * | 2010-04-27 | 2011-10-27 | Microsoft Corporation | Terminal services view toolbox |
KR20110121888A (en) * | 2010-05-03 | 2011-11-09 | 삼성전자주식회사 | Apparatus and method for determining the pop-up menu in portable terminal |
JP5743047B2 (en) * | 2010-05-17 | 2015-07-01 | セイコーエプソン株式会社 | Display device and display method |
US8707195B2 (en) | 2010-06-07 | 2014-04-22 | Apple Inc. | Devices, methods, and graphical user interfaces for accessibility via a touch-sensitive surface |
KR101667586B1 (en) * | 2010-07-12 | 2016-10-19 | 엘지전자 주식회사 | Mobile terminal and method for controlling the same |
US8773370B2 (en) | 2010-07-13 | 2014-07-08 | Apple Inc. | Table editing systems with gesture-based insertion and deletion of columns and rows |
US9870141B2 (en) | 2010-11-19 | 2018-01-16 | Microsoft Technology Licensing, Llc | Gesture recognition |
US9146674B2 (en) * | 2010-11-23 | 2015-09-29 | Sectra Ab | GUI controls with movable touch-control objects for alternate interactions |
JP5630703B2 (en) * | 2010-12-08 | 2014-11-26 | クボタシステム開発株式会社 | Command menu management module |
US9047590B2 (en) * | 2011-01-25 | 2015-06-02 | Bank Of America Corporation | Single identifiable entry point for accessing contact information via a computer network |
US9271027B2 (en) * | 2011-01-30 | 2016-02-23 | Lg Electronics Inc. | Image display apparatus and method for operating the same |
US9116616B2 (en) * | 2011-02-10 | 2015-08-25 | Blackberry Limited | Portable electronic device and method of controlling same |
US20120216113A1 (en) * | 2011-02-18 | 2012-08-23 | Google Inc. | Touch gestures for text-entry operations |
US8972267B2 (en) * | 2011-04-07 | 2015-03-03 | Sony Corporation | Controlling audio video display device (AVDD) tuning using channel name |
US9513799B2 (en) | 2011-06-05 | 2016-12-06 | Apple Inc. | Devices, methods, and graphical user interfaces for providing control of a touch-based user interface absent physical touch capabilities |
JP5360140B2 (en) * | 2011-06-17 | 2013-12-04 | コニカミノルタ株式会社 | Information browsing apparatus, control program, and control method |
US9983786B2 (en) | 2011-06-21 | 2018-05-29 | Google Technology Holdings LLC | Electronic device with gesture-based task management |
CN102890610B (en) * | 2011-07-18 | 2017-10-17 | 中兴通讯股份有限公司 | The method of terminal processes document with touch-screen and the terminal with touch-screen |
US10318146B2 (en) * | 2011-09-12 | 2019-06-11 | Microsoft Technology Licensing, Llc | Control area for a touch screen |
US9612670B2 (en) | 2011-09-12 | 2017-04-04 | Microsoft Technology Licensing, Llc | Explicit touch selection and cursor placement |
US8988467B2 (en) | 2011-10-13 | 2015-03-24 | Microsoft Technology Licensing, Llc | Touchscreen selection visual feedback |
KR101589104B1 (en) * | 2011-11-11 | 2016-01-27 | 퀄컴 인코포레이티드 | Providing keyboard shortcuts mapped to a keyboard |
US9116611B2 (en) | 2011-12-29 | 2015-08-25 | Apple Inc. | Devices, methods, and graphical user interfaces for providing multitouch inputs and hardware-based features using a single touch input |
JP5137150B1 (en) | 2012-02-23 | 2013-02-06 | 株式会社ワコム | Handwritten information input device and portable electronic device provided with handwritten information input device |
US8935606B2 (en) | 2012-02-29 | 2015-01-13 | Ebay Inc. | Systems and methods for providing a user interface with grid view |
US8881269B2 (en) | 2012-03-31 | 2014-11-04 | Apple Inc. | Device, method, and graphical user interface for integrating recognition of handwriting gestures with a screen reader |
CN103505240B (en) * | 2012-06-29 | 2018-05-22 | 通用电气公司 | Supersonic imaging apparatus and the device and method for adjust automatically user interface layout |
JP6013051B2 (en) * | 2012-07-02 | 2016-10-25 | 東芝メディカルシステムズ株式会社 | Ultrasonic diagnostic apparatus and operation support method thereof |
US20150186038A1 (en) * | 2012-07-12 | 2015-07-02 | Deying Guo | Terminal and terminal control method |
US20140033093A1 (en) * | 2012-07-25 | 2014-01-30 | Microsoft Corporation | Manipulating tables with touch gestures |
CN103576847B (en) * | 2012-08-09 | 2016-03-30 | 腾讯科技(深圳)有限公司 | Obtain the method and apparatus of account information |
JP5784566B2 (en) * | 2012-09-28 | 2015-09-24 | 京セラドキュメントソリューションズ株式会社 | Operating device and operating method |
US9589538B2 (en) | 2012-10-17 | 2017-03-07 | Perceptive Pixel, Inc. | Controlling virtual objects |
US10368836B2 (en) * | 2012-12-26 | 2019-08-06 | Volcano Corporation | Gesture-based interface for a multi-modality medical imaging system |
JP5761216B2 (en) * | 2013-01-22 | 2015-08-12 | カシオ計算機株式会社 | Information processing apparatus, information processing method, and program |
EP2781998A1 (en) * | 2013-03-20 | 2014-09-24 | Advanced Digital Broadcast S.A. | A method and a system for generating a graphical user interface menu |
US20140304607A1 (en) * | 2013-04-05 | 2014-10-09 | Michael Lyle Eglington | Monitoring System |
JP6303314B2 (en) * | 2013-07-31 | 2018-04-04 | ブラザー工業株式会社 | Program and information processing apparatus |
TWI493433B (en) * | 2013-08-28 | 2015-07-21 | Acer Inc | Covered image projecting method and portable electronic apparatus using the same |
KR102162836B1 (en) | 2013-08-30 | 2020-10-07 | 삼성전자주식회사 | Apparatas and method for supplying content according to field attribute |
KR102165818B1 (en) | 2013-09-10 | 2020-10-14 | 삼성전자주식회사 | Method, apparatus and recovering medium for controlling user interface using a input image |
EP3063608B1 (en) | 2013-10-30 | 2020-02-12 | Apple Inc. | Displaying relevant user interface objects |
US9507520B2 (en) * | 2013-12-16 | 2016-11-29 | Microsoft Technology Licensing, Llc | Touch-based reorganization of page element |
US9836203B2 (en) * | 2014-07-25 | 2017-12-05 | Axiom One Ltd. | Grid-based visual design environment |
US10671275B2 (en) | 2014-09-04 | 2020-06-02 | Apple Inc. | User interfaces for improving single-handed operation of devices |
US9961239B2 (en) | 2015-06-07 | 2018-05-01 | Apple Inc. | Touch accommodation options |
US10345991B2 (en) | 2015-06-16 | 2019-07-09 | International Business Machines Corporation | Adjusting appearance of icons in an electronic device |
JP6544073B2 (en) * | 2015-06-22 | 2019-07-17 | セイコーエプソン株式会社 | Image display system and image display method |
US10025452B2 (en) * | 2015-09-14 | 2018-07-17 | Adobe Systems Incorporated | Physics-based cell layout redesign |
CN105741630B (en) * | 2016-02-03 | 2018-11-13 | 李毅鸥 | A kind of system and method for making demonstration document that there is Interactive function |
US10402470B2 (en) * | 2016-02-12 | 2019-09-03 | Microsoft Technology Licensing, Llc | Effecting multi-step operations in an application in response to direct manipulation of a selected object |
USD886116S1 (en) | 2016-04-14 | 2020-06-02 | Markup Llc | Display screen portion with graphical user interface |
US10637986B2 (en) | 2016-06-10 | 2020-04-28 | Apple Inc. | Displaying and updating a set of application views |
DK201670595A1 (en) | 2016-06-11 | 2018-01-22 | Apple Inc | Configuring context-specific user interfaces |
US20170357644A1 (en) | 2016-06-12 | 2017-12-14 | Apple Inc. | Notable moments in a collection of digital assets |
US11816325B2 (en) | 2016-06-12 | 2023-11-14 | Apple Inc. | Application shortcuts for carplay |
AU2017100670C4 (en) | 2016-06-12 | 2019-11-21 | Apple Inc. | User interfaces for retrieving contextually relevant media content |
DK201670609A1 (en) | 2016-06-12 | 2018-01-02 | Apple Inc | User interfaces for retrieving contextually relevant media content |
DK180171B1 (en) | 2018-05-07 | 2020-07-14 | Apple Inc | USER INTERFACES FOR SHARING CONTEXTUALLY RELEVANT MEDIA CONTENT |
US11243996B2 (en) | 2018-05-07 | 2022-02-08 | Apple Inc. | Digital asset search user interface |
US11086935B2 (en) | 2018-05-07 | 2021-08-10 | Apple Inc. | Smart updates from historical database changes |
US10846343B2 (en) | 2018-09-11 | 2020-11-24 | Apple Inc. | Techniques for disambiguating clustered location identifiers |
US10803135B2 (en) | 2018-09-11 | 2020-10-13 | Apple Inc. | Techniques for disambiguating clustered occurrence identifiers |
US11675476B2 (en) | 2019-05-05 | 2023-06-13 | Apple Inc. | User interfaces for widgets |
US11131967B2 (en) | 2019-05-06 | 2021-09-28 | Apple Inc. | Clock faces for an electronic device |
DK201970535A1 (en) | 2019-05-06 | 2020-12-21 | Apple Inc | Media browsing user interface with intelligently selected representative media items |
DK202070612A1 (en) | 2020-02-14 | 2021-10-26 | Apple Inc | User interfaces for workout content |
USD974370S1 (en) | 2020-04-03 | 2023-01-03 | Markup Llc | Display screen portion with graphical user interface |
US11194471B1 (en) | 2021-01-28 | 2021-12-07 | Honda Motor Co., Ltd. | Apparatus and method for display control based on touch interface |
US11768591B2 (en) * | 2021-04-23 | 2023-09-26 | Lucid Software, Inc | Dynamic graphical containers |
Family Cites Families (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4202037A (en) * | 1977-04-22 | 1980-05-06 | Der Loos Hendrik Van | Computer microscope apparatus and method for superimposing an electronically-produced image from the computer memory upon the image in the microscope's field of view |
JPH02222021A (en) | 1989-02-23 | 1990-09-04 | Sanyo Electric Co Ltd | Icon selecting device |
JPH03246614A (en) | 1990-02-23 | 1991-11-05 | Seiko Epson Corp | Menu selection system |
AU9015891A (en) * | 1990-11-30 | 1992-06-25 | Cambridge Animation Systems Limited | Animation |
JPH0823803B2 (en) | 1991-10-31 | 1996-03-06 | インターナショナル・ビジネス・マシーンズ・コーポレイション | Method for accessing system behavior, icon display device and data processing system |
JP3246614B2 (en) | 1992-04-08 | 2002-01-15 | 日立金属株式会社 | Glass forming steel |
JPH0764749A (en) | 1993-08-25 | 1995-03-10 | Fujitsu Ltd | Command execution processor |
JP3546337B2 (en) | 1993-12-21 | 2004-07-28 | ゼロックス コーポレイション | User interface device for computing system and method of using graphic keyboard |
US5500935A (en) | 1993-12-30 | 1996-03-19 | Xerox Corporation | Apparatus and method for translating graphic objects and commands with direct touch input in a touch based input system |
US5543818A (en) * | 1994-05-13 | 1996-08-06 | Sony Corporation | Method and apparatus for entering text using an input device having a small number of keys |
US20020126161A1 (en) * | 1994-07-05 | 2002-09-12 | Hitachi, Ltd. | Information processing system |
WO1996009579A1 (en) | 1994-09-22 | 1996-03-28 | Izak Van Cruyningen | Popup menus with directional gestures |
JPH08115192A (en) | 1994-10-17 | 1996-05-07 | Hitachi Ltd | Data editing processing method and system |
US5689667A (en) * | 1995-06-06 | 1997-11-18 | Silicon Graphics, Inc. | Methods and system of controlling menus with radial and linear portions |
US5778404A (en) | 1995-08-07 | 1998-07-07 | Apple Computer, Inc. | String inserter for pen-based computer systems and method for providing same |
US5754176A (en) | 1995-10-02 | 1998-05-19 | Ast Research, Inc. | Pop-up help system for a computer graphical user interface |
US6057844A (en) * | 1997-04-28 | 2000-05-02 | Adobe Systems Incorporated | Drag operation gesture controller |
US6177941B1 (en) * | 1997-08-25 | 2001-01-23 | International Business Machine Corporation | Representative mapping between toolbars and menu bar pulldowns |
US6414700B1 (en) * | 1998-07-21 | 2002-07-02 | Silicon Graphics, Inc. | System for accessing a large number of menu items using a zoned menu bar |
US6424335B1 (en) * | 1998-09-02 | 2002-07-23 | Fujitsu Limited | Notebook computer with detachable infrared multi-mode input device |
US6727830B2 (en) | 1999-01-05 | 2004-04-27 | Microsoft Corporation | Time based hardware button for application launch |
US7030863B2 (en) * | 2000-05-26 | 2006-04-18 | America Online, Incorporated | Virtual keyboard system with automatic correction |
US6664991B1 (en) | 2000-01-06 | 2003-12-16 | Microsoft Corporation | Method and apparatus for providing context menus on a pen-based device |
US6822664B2 (en) | 2000-10-11 | 2004-11-23 | Microsoft Corporation | Browser navigation for devices with a limited input system |
US6904570B2 (en) | 2001-06-07 | 2005-06-07 | Synaptics, Inc. | Method and apparatus for controlling a display of data on a display screen |
CA2429660A1 (en) * | 2002-05-31 | 2003-11-30 | Regelous, Stephen Noel | Field control method and system |
US7082211B2 (en) * | 2002-05-31 | 2006-07-25 | Eastman Kodak Company | Method and system for enhancing portrait images |
AU2003297172A1 (en) * | 2002-12-16 | 2004-07-22 | Microsoft Corporation | Systems and methods for interfacing with computer devices |
US7554530B2 (en) * | 2002-12-23 | 2009-06-30 | Nokia Corporation | Touch screen user interface featuring stroke-based object selection and functional object activation |
- 2003
  - 2003-07-28 US US10/629,129 patent/US7164410B2/en active Active
- 2004
  - 2004-07-20 EP EP04778836A patent/EP1652049A4/en not_active Withdrawn
  - 2004-07-20 WO PCT/US2004/023510 patent/WO2005013052A2/en active Application Filing
  - 2004-07-20 JP JP2006521914A patent/JP4370326B2/en not_active Expired - Fee Related
  - 2004-07-20 CA CA2533298A patent/CA2533298C/en not_active Expired - Fee Related
- 2006
  - 2006-11-29 US US11/564,798 patent/US8286101B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
US7164410B2 (en) | 2007-01-16 |
CA2533298C (en) | 2011-07-12 |
JP2007505371A (en) | 2007-03-08 |
US20070101292A1 (en) | 2007-05-03 |
WO2005013052A3 (en) | 2006-11-23 |
WO2005013052A2 (en) | 2005-02-10 |
JP4370326B2 (en) | 2009-11-25 |
EP1652049A2 (en) | 2006-05-03 |
EP1652049A4 (en) | 2011-05-04 |
US8286101B2 (en) | 2012-10-09 |
US20050024322A1 (en) | 2005-02-03 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CA2533298A1 (en) | Manipulating an on-screen object using zones surrounding the object | |
CA2533296C (en) | Common on-screen zone for menu activation and stroke input | |
US8259077B2 (en) | Electronic device for inputting user command 3-dimensionally and method for employing the same | |
US9213477B2 (en) | Apparatus and method for touch screen user interface for handheld electric devices part II | |
US5805167A (en) | Popup menus with directional gestures | |
US7760187B2 (en) | Visual expander | |
KR101589104B1 (en) | Providing keyboard shortcuts mapped to a keyboard | |
CN101595445B (en) | Human-computer interaction device, electronic installation and man-machine interaction method | |
EP2105823A1 (en) | Human computer interaction device, electronic device and human computer interaction method | |
EP0422577A2 (en) | Method and apparatus for displaying simulated keyboards on touch-sensitive displays | |
EP1920408A2 (en) | Input device having multifunctional keys | |
CN101006493A (en) | Virtual keypad input device | |
US11194415B2 (en) | Method and apparatus for indirect force aware touch control with variable impedance touch sensor arrays | |
US11656718B2 (en) | Method and apparatus for variable impedance touch sensor array force aware interaction in large surface devices | |
JPH09244836A (en) | Display controller | |
US20090262072A1 (en) | Cursor control system and method thereof | |
KR20090096377A (en) | Data Input Device | |
JPH01142818A (en) | Screen control system | |
KR100762944B1 (en) | Editor for screen keyboard on display device and editing method therefor | |
JPH06230888A (en) | Mouse input system in touch panel | |
JPH10198505A (en) | Personal computer device | |
JPS63311519A (en) | Coordinate input method | |
WO2013078621A1 (en) | Touch screen input method for electronic device, and electronic device | |
KR101448559B1 (en) | Remote controller for smart-TV of hybrid-type character input based on virtual keyboard | |
JP2014135708A (en) | Character input method and device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| EEER | Examination request | |
| MKLA | Lapsed | Effective date: 20190722 |