|Publication number||US5189403 A|
|Application number||US 07/649,711|
|Publication date||Feb 23, 1993|
|Filing date||Feb 1, 1991|
|Priority date||Sep 26, 1989|
|Also published as||EP0497502A2, EP0497502A3, US6107996|
|Inventors||Patrick J. Franz, David H. Straayer|
|Original Assignee||Home Row, Inc.|
This application is a continuation-in-part of copending application Ser. No. 07/412,680 filed Sep. 26, 1989, U.S. Pat. No. 5,124,689.
The QWERTY keyboard is entrenched as the preferred device for typing alpha-numeric data into a computer. Various apparatus and methods are known for pointing operations, such as selecting text, on a CRT display. Devices for that purpose include the mouse, joystick, step keys and text keys. Keyboards are known which include a joystick, joydisk or other pointing device mounted on the keyboard. However, a separate mouse has been shown to be the most effective pointing device with respect to operating time, error rate and the like. This accounts for its widespread acceptance in the computer industry.
Use of a conventional keyboard and a discrete mouse for entering typing and pointing information, respectively, into a computer requires physical and mental disruptions that significantly reduce a user's productivity. For example, typing on a keyboard and pointing with a mouse require the user frequently to move the hands back and forth between the keyboard and the mouse. One research paper reports that it takes about 0.36 seconds to move a hand from the keyboard onto a mouse and additional time to adjust the grasp for operating the mouse buttons. Time to return to the keyboard must be considered as well. The data suggest that the total time spent moving to and from the mouse is greater than one second per occurrence. See S. K. Card, et al., "Evaluation of Mouse, Rate-Controlled Isometric Joystick, Step Keys, and Text Keys for Text Selection on a CRT" (Xerox Palo Alto Research Center), published in Ergonomics, Vol. 21, No. 8, 601-613 (1978).
These movements between keyboard and mouse are disruptive physically because of the distances and side to side arm motions typically required. These movements are disruptive mentally because of the time the movements take, because of the dramatic shift in physical activity (typing on a keyboard is very unlike pointing with a mouse), and because of many mental and physical steps required to perform the movements. Each additional step requires physical and mental effort and is an opportunity for error.
U.S. Pat. No. 4,712,101 describes a cursor positioning device which may be positioned on a keyboard below the space bar. Such a device has been announced under the trade name Isopoint. The Isopoint device includes a finger or thumb actuated roller coupled to a rotary-shaft encoder for indicating position change in a first direction. The roller rests in a sliding cradle which drives a second encoder. Such a device is awkward to operate, especially for diagonal cursor movement, because of the required combination of rolling and sliding actions. Additionally, the sliding cradle has fixed end points which impose discontinuities in its operation.
Another pointing device which can be embedded into a conventional keyboard for cursor control is the OmniPoint™ cursor controller, announced by Osiris Technologies, Inc., Nevada City, Calif. OmniPoint essentially includes a miniature joystick and associated interface electronics. The joystick may be mounted in a keyboard adjacent the standard array of keys. Its use of course requires moving the hand away from the usual typing position.
Both the Isopoint and Omnipoint devices include an embedded switch so that the user can press the device downward (into the keyboard) to emulate a mouse button "click" or dragging operation. Such devices therefore can emulate at best only a single button mouse.
U.S. Pat. No. 4,680,577 to Straayer, et al. shows a multipurpose keyswitch disposed within a keyboard array for controlling cursor movement on a CRT display and for character entry. One of the standard alpha keys is replaced with the multipurpose keyswitch. An additional keyswitch is suggested for activating the cursor positioning capabilities of the multipurpose keyswitch. The '577 patent does not disclose a practical way to implement and use such a system. Additionally, that patent does not address how to input any pointing event information, for example mouse button actions ("clicking"), so that system cannot substitute for a mouse. What is needed is a practical method of allowing a user to type and point without moving the hands away from the usual typing position and without dramatically changing the physical activity.
Prior art methods of pointing and typing assumed these disruptions and incorporated them as limitations in computers, keyboards, pointing devices, and software. Typing and pointing were perceived as distinct and irreconcilable activities.
As a result, there is considerable duplication of hardware and software. Because pointing has been perceived as very different, mentally and physically, from typing, the hardware and software are duplicated, reinforcing the separation. The buttons on a mouse, for example, are duplications, because there are plenty of keys on the keyboard that generally are not being used while the mouse is being used. What is needed is a pointing device system that eliminates the restrictions and the duplications of present systems, while maintaining the same software interface to application programs for complete compatibility.
Many known pointing devices also have physical limitations on the number of buttons they can have. These limitations severely limit the range of actions the user can take while pointing. One commercial product that attempts to add additional buttons, called the PowerMouse 100, integrates a two-button mouse with forty programmable keys, resulting in a large, cumbersome device that is difficult to use. What is needed is to allow the user to keep the hands on the keyboard while pointing, and simultaneously allow the user to input other pointing related data or "pointing events," presently input by mouse buttons, but again without moving the hands from the usual typing position.
Another problem overlooked in the prior art is that of visually locating the cursor image on a display screen. Each time a user stops typing and grasps a mouse to point, the user has to search the display screen visually for the cursor, or actuate the pointing device in such a way that the moving cursor will be noticed. These methods are time consuming and may be disruptive. If the cursor image is mostly off screen, it can be extremely difficult to locate; in some cases, just one pixel is showing. With some software, moving the cursor onto certain areas of the screen indicates the desire for a particular action to be taken. If the user does not wish this action to be taken, the user must perform additional steps to cancel the implied indication. What is needed is for the user to immediately detect the location of the cursor on the screen at the time pointing begins.
Known methods of repositioning a cursor in response to signals from a pointing device do not compensate for the many different modes and resolutions of available display systems. Two programs that configure the display system differently may exhibit different apparent cursor repositioning responses to the same pointing device signals. What is needed is a method of cursor repositioning that compensates for changes in the display system so that the same action performed by the user with the pointing device will produce a similar result independent of changes in the display system.
Adjusting cursor speed is another clumsy and time consuming action. Known methods of changing cursor speed introduce discontinuities that seriously disrupt the user's work flow. As a result, many people simply tolerate an inappropriate and therefore inefficient cursor speed because it is too cumbersome and too burdensome to change it.
For example, a typical mouse requires that the user first stop the application program in progress, and then execute a special mouse speed changing program, where the speed is specified by typing in a number. The resulting cursor speed cannot be observed until the application program is restarted. The user may find another adjustment is necessary and have to repeat the process.
Other known methods involve pressing keyboard keys and mouse buttons simultaneously, but these schemes are not continuous and interfere with the operation of certain application programs. What is needed is to afford a user fast, interactive ways to control cursor speed without significant interruption of work in progress.
It is an object of the present invention to allow a computer user to type and to point without removing the hands from the home row (asdf-jkl;) of the keyboard, and without dramatically changing the physical activity.
Another object of the invention is to reduce the restrictions in operation and the duplications of hardware and software found in present computer systems, while maintaining a standard software interface to existing programs for compatibility.
Another object of the invention is to allow the user to keep the hands on the keyboard while pointing, simultaneously allowing the user to conveniently input pointing related data or "pointing events," again without moving the hands from the usual typing position.
A further object of the invention is to enable a computer user to immediately detect the location of the cursor on the display screen whenever a pointing operation begins.
Yet another object is to compensate for variations in display systems when repositioning a display cursor so that a given action performed by the user with the pointing device will produce a similar result independent of variations in display systems.
According to the present invention, a single, integrated keyboard system is provided for inputting all typing and pointing information to a computer without moving the hands from the usual typing position. One of the keyboard keys, called the pointing key, has sensors coupled to it to acquire pointing direction data from lateral displacement of the pointing key.
The new system has a typing mode of operation and a pointing mode. In the typing mode, key codes corresponding to actuated keyswitches are forwarded to the operating system in the usual fashion. In the pointing mode, operation of the entire keyboard is changed. All of the keys become available for new functions.
One or more of the keys is assigned as a pointing event key, for inputting information which would come from mouse buttons in a segregated system. Other keys may be assigned various meanings for modifying operation of the system, such as cursor speed control, macros, etc.
Cursor movement is implemented in the pointing mode by reading sensor data from the pointing key, mapping the data to form cursor displacement data, and scaling the displacement data according to a speed index. Keys may be assigned for changing the speed index, and therefore the apparent cursor speed, at any time. This allows speed control interactively and without leaving an application program.
The cursor mapping may be done by algebraic manipulation of the sensor data according to a predetermined tracking algorithm, or by lookup tables. The new system also allows detecting the display mode, for example character or graphics mode, and altering cursor speed accordingly to optimize control.
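The mapping-and-scaling pipeline described above might be sketched as follows. The function name, the signed square-law transfer curve, and the "speed index 4 = nominal speed" convention are illustrative assumptions, not values taken from the patent; they stand in for the "predetermined tracking algorithm" or lookup tables.

```python
# Hypothetical sketch of the cursor mapping and scaling step: sensor
# deltas pass through a nonlinear tracking function, then are scaled by
# a user-adjustable speed index. All constants are illustrative.

def map_sensor_to_displacement(dx_raw, dy_raw, speed_index=4):
    """Map raw direction-sensor deltas to cursor displacement."""
    def track(v):
        # Nonlinear tracking: small forces move the cursor slowly, larger
        # forces accelerate it (a signed square law stands in for the
        # patent's tracking algorithm or lookup table).
        return (1 if v >= 0 else -1) * (v * v)

    scale = speed_index / 4.0   # assumed: index 4 = nominal 1:1 speed
    return track(dx_raw) * scale, track(dy_raw) * scale
```

Because the speed index is a single number consulted on every scan, a key bound to raising or lowering it changes the apparent cursor speed immediately, without leaving the application program.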
According to another aspect of the invention, a user's intent to change from typing to pointing operations (and vice versa) can be inferred from the user's actions at the integrated keyboard. In response to a pointing key press during typing mode, the system enters an intermediate "wait and see" mode, and defers processing the pointing key press to determine whether the user intends to begin pointing.
Subsequent keyboard actions are timed and tested to make that determination. A conclusion is reached, and the system either switches to pointing or resumes the normal typing mode, within one-half second. This process is unobtrusive, if not transparent, to the user. In use, to switch to pointing mode, the user need merely begin pointing. To switch back, the user simply stops pointing and starts typing.
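The "wait and see" arbitration can be sketched as a small state machine. The half-second budget comes from the text; the specific decision rule used here (did direction data arrive before the timeout?) and the event names are hypothetical simplifications of the multi-state timing scheme detailed later in the patent.

```python
# Illustrative mode arbitration. Assumption: a deferred pointing key
# press is resolved to "pointing" if direction data arrives before the
# timeout, otherwise released as an ordinary character.

TYPING, WAIT_AND_SEE, POINTING = "typing", "wait_and_see", "pointing"

class ModeArbiter:
    def __init__(self, timeout=0.5):   # decision reached within 0.5 s
        self.mode = TYPING
        self.timeout = timeout
        self.deferred_press_time = None

    def on_pointing_key_press(self, t):
        if self.mode == TYPING:
            self.mode = WAIT_AND_SEE   # defer the key code; no 'j' yet
            self.deferred_press_time = t

    def on_direction_data(self, t):
        if self.mode == WAIT_AND_SEE and t - self.deferred_press_time < self.timeout:
            self.mode = POINTING       # user is pointing; discard the 'j'

    def on_timer(self, t):
        if self.mode == WAIT_AND_SEE and t - self.deferred_press_time >= self.timeout:
            self.mode = TYPING         # release the deferred 'j' as a character
```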
The foregoing and other objects, features and advantages of the invention will become more readily apparent from the following detailed description of a preferred embodiment which proceeds with reference to the drawings.
FIG. 1 is an illustration of a prior art computer system that includes a keyboard and a mouse pointing device.
FIG. 2 is a high level block diagram of a prior art computer keyboard.
FIG. 3 is a prior art computer keyboard processing flow chart.
FIG. 4 is a prior art high level system block diagram of the software elements involved in processing keyboard data.
FIG. 5 is a control flow diagram for a prior art computer keyboard interrupt handler system.
FIG. 6 is a high level system block diagram of prior art software elements involved in processing pointing device data.
FIG. 7 is a system block diagram of a prior art computer system of the type illustrated in FIG. 1.
FIG. 8 is a user flow chart illustrating the control flow of a computer user's decisions and actions while using a prior art computer system of the type illustrated in FIG. 1.
FIG. 9 is a high level block diagram of the hardware elements of an integrated keyboard and pointing device system according to the present invention.
FIG. 10 is a flow chart showing the high level processing control flow for the integrated computer keyboard and pointing device system of FIG. 9.
FIG. 11 is a system block diagram of a computer system that includes an integrated keyboard and pointing device of the type illustrated in FIG. 9.
FIG. 12 is a flow chart of integrated interface software for implementing the integrated keyboard and pointing device of FIGS. 9-11.
FIG. 13 is a user flow chart illustrating control flow of a computer user's decisions and actions while using a computer system of the type shown in FIGS. 9-12.
FIG. 14 is a user flow chart illustrating control flow of a computer user's decisions and actions while using an integrated pointing device speed control according to the present invention.
FIG. 15 is an enlarged, perspective view of a space bar on a conventional keyboard modified to include a thumb switch in the front face of the space bar.
FIG. 16 is an enlarged, perspective view of a pair of thumbswitches positioned in front of the space bar on a keyboard.
FIG. 17 shows a time line that includes a series of six discrete states used for timing keyboard events.
FIG. 18 is a flowchart for a timer interrupt used for transitions to a following state after conclusion of predetermined time intervals.
FIG. 19 shows a modified portion of a keyboard interrupt flowchart.
FIG. 20 is a flowchart of a method for handling a keyboard interrupt during timing state 0.
FIG. 21 is a flowchart of a method for handling a keyboard interrupt during timing state 1.
FIG. 22 is a flowchart of a method for handling a keyboard interrupt during timing state 2.
FIG. 23 is a flowchart of a method for handling a keyboard interrupt during timing state 3.
FIG. 24 is a flowchart of a method for handling a keyboard interrupt during timing state 4.
FIG. 25 is a flowchart of a method for handling a keyboard interrupt during the pointing mode of operation.
The present invention is best understood with reference first to the prior art. FIG. 1 illustrates a prior art computer system. The prior art system includes a computer 28, a keyboard 24 for inputting character data to the computer, a pointing device 26 for inputting graphic data to the computer, and a display terminal 29 for display of data output by the computer. The "arrow" 25 points to the row of keys containing the home row keys of the keyboard 24. The keyboard is coupled to the computer by a communications link 36. The pointing device is coupled to the computer by a pointing device communications link 23. The pointing device 26 may be, for example, a mouse that includes one or more mouse buttons 27. The pointing device generally is used to position and to move the cursor, as more fully explained below.
FIG. 2 is a block diagram of the keyboard 24 employed in the system of FIG. 1. The keyboard 24 includes an array 32 of keyswitches, of which keyswitch 30 is an example. The keyswitches in array 32 are arranged in rows and columns to facilitate scanning the keyboard. A microprocessor system 34 includes hardware for electrically interfacing the keyswitch array 32 to a microprocessor within the system 34.
In operation, microprocessor system 34 scans the rows and columns of array 32 to detect closures and openings of keyswitches. Upon detection of a closure or an opening of a keyswitch, the microprocessor system 34 transmits a code unique to the closed or opened keyswitch via a keyboard communications link 36 to the computer (28 in FIG. 1).
Referring now to FIG. 3, a high level processing control flow diagram for the keyboard of FIG. 2 is depicted to more specifically illustrate the creation and transmission of keyboard data to the computer. In FIG. 3, an initialize step 42 tests microprocessor system (34 in FIG. 2) and initializes the keyboard software state.
After initialization, a keyboard scan loop 56 begins by scanning the keyboard 44. Scan Keyboard 44 sequentially scans the rows and columns of the keyswitch array to detect changes in the state of the keyswitches. After Scan Keyboard 44 is completed, the results are checked in block 46. Control passes to an Auto Repeat block 52 if there have been no changes in the keyswitches. If there has been a change, control passes to Determine Key block 54 to identify the affected keyswitch.
Determine Key block 54 analyzes the changes to determine what binary key codes, if any, should be transmitted to the computer. Send Key Code, block 60, transfers the key codes. In the typical case of a single keyswitch press and release, one unique key code corresponding to the keyswitch will be sent when the key is pressed and another code when the keyswitch is released.
The software represented by Auto Repeat 52 determines if it is time to automatically repeat sending the key code if the keyswitch remains pressed. After the key codes, if any, have been sent, control returns via loop 56 to begin the next keyboard scan.
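The scan loop of FIG. 3 might be sketched as below. The names and the set-based representation of the keyswitch matrix are illustrative assumptions; a real keyboard microprocessor would read row/column lines directly, and Auto Repeat is shown only as a comment.

```python
# Minimal sketch of the FIG. 3 scan loop. scan() supplies the set of
# currently closed keyswitches; make/break codes are sent on closures
# and openings, respectively. Names are hypothetical.

def keyboard_scan_loop(scan, send_key_code, make_code, break_code, passes=1):
    prev = scan()
    for _ in range(passes):
        cur = scan()
        if cur == prev:
            pass                        # no change: Auto Repeat would run here
        else:
            for key in cur - prev:      # newly closed keyswitches
                send_key_code(make_code(key))
            for key in prev - cur:      # newly opened keyswitches
                send_key_code(break_code(key))
        prev = cur
```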
Referring to FIG. 4, a block diagram of a typical prior art keyboard software interface is depicted. In this system, a computer keyboard 74 generates binary key codes when the operator presses keyswitches as described above. Binary key codes are transmitted to the computer via a keyboard communications link 36. A keyboard interrupt handler 72 reads the key codes, processes them, and places corresponding character codes into a keyboard queue 70. Generally, the keyboard interrupt handler 72 is simple, fast interrupt level software.
An application program 62 is coupled to operating system keyboard services 66 via an interface 64. In response to requests by the application program, the operating system removes character codes from the keyboard queue 70, via another interface, operating system keyboard queue interface 68, and sends the character codes to the application program 62. Operating system keyboard services 66 may examine the character codes to check for operating system requests such as application program termination. Interfaces 64 and 68 are usually bi-directional to enable the application program 62 to "put back" character codes as needed.
Operation of the keyboard interrupt handler 72 is illustrated in greater detail in FIG. 5. Referring to FIG. 5, a Read Hardware block 76 reads the key codes from the keyboard interface hardware. Each such key code is checked to see if it represents any of the Shift, Caps Lock, Control and ALT keyswitches. These special cases are tested in blocks 78, 82, 86 and 90, respectively. For each test, if the test is negative, control is passed to the next test in the sequence shown. If all of the special case tests are negative, control passes to Map to Char block 94 for mapping the keycode. If any of the tests are positive, control is passed to update a corresponding one of status blocks 80, 84, 88 and 92, indicating whether the special key was pressed or released.
Map to Char block 94 uses a look-up table or tables to map the key code to a character code. The mapping includes the current states of the shift, caps lock, control, and ALT keys. This operation is a simple pass-through mapping. The resulting character code, if any, is passed on to Enqueue block 96. Generally, if the key code represents a keyswitch release, it is not passed on to Enqueue software 96. Enqueue 96 inserts the resulting character code into a keyboard queue (70 in FIG. 4). After a status update, or Enqueue operation, as the case may be, control returns to the interrupted program in the usual way, Return block 98.
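The mapping step of the FIG. 5 handler might look as follows. The key codes and tables here are hypothetical stand-ins; a real handler would index lookup tables in ROM by key code and modifier state.

```python
# Sketch of Map to Char (block 94). Key codes 0x24 and 0x1E are
# illustrative, not actual scan codes from any particular keyboard.

NORMAL = {0x24: "j", 0x1E: "a"}        # key code -> character code
SHIFTED = {0x24: "J", 0x1E: "A"}

def map_to_char(key_code, is_release, shift, caps_lock):
    if is_release:
        return None                     # key releases are not enqueued
    upper = shift != caps_lock          # shift and caps lock cancel out
    table = SHIFTED if upper else NORMAL
    return table.get(key_code)          # None if no printable mapping
```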
FIG. 6 is a block diagram illustrating the operation of typical prior art pointing device software. Here, an application program 62 communicates with operating system (OS) pointing device services software 104. The application program passes pointing device control information, such as initial position, cursor shape, cursor visibility, etc., and receives pointing device state information such as position and button presses.
The OS services 104 in turn communicates with pointing device driver interface software 108 in accordance with interface specifications 106. Driver interface software 108 converts and formats the particular pointing device's data to conform to the interface 106. Pointing device state data, including information such as cursor shape, visibility, location and pointing device button status, are maintained in a set of memory locations called state data 114.
The pointing device driver interface software 108 communicates with state data 114 via an interface 112 and with low level pointing device software 118 via an interface 110. The low level software 118 also communicates with state data 114 through an interface 116.
In operation, the low level pointing device software 118 reads data from pointing device hardware 120 via the pointing device communications link 23. Software 118 modifies state data 114 and notifies the driver interface 108 as needed. The low level software 118 may also include software to move the cursor on the display terminal (29 in FIG. 1).
FIG. 7 is a software block diagram of a complete computer, keyboard and pointing device system such as that illustrated in FIG. 1. It essentially consists of a combination of keyboard software (see FIG. 4) and pointing device software (see FIG. 6). Accordingly, the details of operation need not be repeated here.
It is noteworthy that, from the lowest to highest levels, the data and control paths between the application program 62 and the keyboard 74 are separate and distinct from the data and control paths between the application program and the pointing device 120. This separation of the keyboard and pointing device interfaces reflects the segregation of keyboard and pointing device operations in the user's mind. Useful improvements in the state of this art require careful consideration of a computer user's thoughts and actions, described next.
FIG. 8 is a "user flow chart" for using a typical computer system of the type illustrated in FIGS. 1-7. The figure diagrams the control flow of a computer user's decisions and actions while using such a system. Referring to FIG. 8, a primary loop 140 begins with a decision whether to point or type, represented by block 122. If the decision (mentally) is to type, a test 134 is performed to see if the user is typing already. Test 134 must be performed to make sure the hands and the rest of the user's body are positioned for typing. If not, the user moves the hands to a typing position (block 136), generally on the home row of the keyboard, to begin typing. Once the hands and body are properly positioned, the user presses keyboard keyswitches to enter alpha-numeric characters 138, usually repeatedly.
If the decision is made to point, the user tests (mentally) to see if he or she is pointing already, 124. This must be done to make sure the hands and the rest of the user's body are positioned for operating the pointing device. If pointing already, the user has merely to continue. If not, the user must move their hands to the pointing device 126, and then find the cursor on the display screen, 128.
The step of moving a hand to the pointing device includes whatever is required to begin operating the pointing device if the user's hands have been typing on the keyboard. Generally, this includes moving a hand off of the home row of the keyboard to where the pointing device is located, grasping the pointing device, positioning the fingers to operate the pointing device event buttons, and repositioning the arm into a comfortable position.
Finding the cursor on the display screen 128 may be as simple as remembering where the cursor was last seen, or coincidentally having it in the field of view. Quite often, it involves operating the pointing device and looking for a moving object on the display screen, thereby wasting time and effort.
Once a hand is on the pointing device and the cursor has been located, the pointing device may be operated to perform two types of operations--"pointing" and "pointing events". Pointing refers to moving the mouse 130 to reposition a cursor. Pointing events are indicated by pressing pointing device buttons 132, which may include pressing, holding, or releasing selected pointing device buttons. Pointing events are used to select text, pick a menu item, and many other functions. Typical pointing device operations may involve repeated pointing and pointing events. When there is a break in activities, or when a change in activities is required, the user effectively returns to the point or type decision 122 via the primary loop 140.
FIG. 9 is a simplified block diagram of an integrated keyboard and pointing device ("integrated keyboard") according to the present invention. The new integrated keyboard includes an array 232 of keyswitches, of which keyswitch 230 is an example. Direction sensors 238 are embedded in the array 232 to detect pointing information. Preferably, the direction sensors are force sensors coupled to one of the keyswitches in the array to form a multi-purpose keyswitch called the pointing key. For example, where the direction sensors are coupled to the J key, the user can input pointing information by pressing on the J key in the desired direction. An example of such a multi-purpose keyswitch is disclosed in U.S. Pat. No. 4,680,577, assigned to Home Row, Inc., incorporated herein by this reference. The J key is preferred as the pointing key because it is located beneath the user's right index finger when the user's hands are positioned on the home row of the keyboard. How the integrated system distinguishes between a user inputting direction information versus merely entering the letter J is described below.
A smaller and less expensive embodiment would include four force-sensitive resistors (FSRs) coupled to the J key, one FSR for sensing force in each of the four directions in a plane. Analog to Digital conversion circuitry 240 includes electronic circuitry for transforming the FSR resistances (analog information) into representative digital signals. Analog to digital conversion is known and therefore will not be described in detail.
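With one FSR per direction, the digitized readings of each opposing pair can be combined into a signed axis value. The subtraction scheme and the deadband threshold below are assumptions consistent with the four-sensor arrangement, not details given in the patent.

```python
# Hedged sketch: combine four digitized FSR readings (up/down/left/right)
# into signed x/y direction values. Deadband value is illustrative.

def fsr_to_direction(up, down, left, right, deadband=5):
    """Each argument is an 8-bit A/D reading (0-255) of one FSR."""
    dx = right - left                  # opposing sensors cancel at rest
    dy = down - up
    if abs(dx) < deadband:
        dx = 0                         # suppress sensor noise near zero
    if abs(dy) < deadband:
        dy = 0
    return dx, dy
```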
Analog to Digital circuitry 240 is coupled to the microprocessor system 234 via a path 242. Microprocessor system 234 includes hardware for electrically interfacing the array 232 and A/D circuitry 240 to a microprocessor (not shown) within the system 234.
The integrated keyboard and pointing device has at least two distinct modes of operation: a typing mode for entering alpha-numeric character data, and a pointing mode for entering pointing information, such as would be input in the prior art by a separate pointing device such as a mouse. Pointing information includes pointing direction information and pointing event information.
In general, the microprocessor system 234 scans the keyswitch array 232 to detect closures and openings of keyswitches. Upon detection of a closure or an opening of a keyswitch, system 234 transmits a code unique to the actuated keyswitch over a keyboard communications link 236. System 234 also determines whether the sensors 238 must be scanned by examining the sequence of keyswitch closures and openings, or by receiving a message from the computer system. If so, a sensor scan is performed to acquire pointing direction information from the sensors 238, and the pointing direction information also is transmitted over link 236. A separate physical connection to the computer for sending pointing direction information is not required.
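Sending both key codes and pointing direction data over the single link implies some framing that lets the host tell the record types apart. The one-byte prefix and offset-128 delta packing below are purely hypothetical; the patent requires only that no separate physical connection is needed.

```python
# Sketch of multiplexing key codes and pointing data over the single
# keyboard communications link. Framing is an illustrative assumption.

KEY_RECORD, POINT_RECORD = 0x00, 0x01

def encode_key(code):
    return bytes([KEY_RECORD, code])

def encode_pointing(dx, dy):
    # Signed deltas in -128..127 packed as offset-128 bytes.
    return bytes([POINT_RECORD, dx + 128, dy + 128])

def decode(record):
    if record[0] == KEY_RECORD:
        return ("key", record[1])
    return ("point", record[1] - 128, record[2] - 128)
```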
FIG. 10 is a flow chart of operation of the integrated keyboard and pointing device of FIG. 9. Referring to FIG. 10, an initialization step 42 tests the microprocessor system 234 (FIG. 9) and initializes the keyboard software state. Next, keyboard scan loop 56 begins with scanning the keyboard to determine the current states of the keyswitches.
After scanning 44 is completed, the results are checked in step 46 to determine if any keyswitch state changes have occurred. If there have been no changes, control passes to an auto repeat step 52. Auto repeat step 52 functions as described above with regard to FIG. 3. If there have been changes, control passes to determine which keyswitch state changed, step 54. Step 46 does not indicate a changed condition until multiple keyboard scanning passes have been made for software keyswitch debounce.
Determine Key step 54 analyzes the changes to determine what binary key codes, if any, should be transmitted to the computer, step 60. In the typical case of a keyswitch press and release, one unique key code corresponding to the keyswitch will be sent when the keyswitch is pressed and another when the keyswitch is released.
After the keyswitch changes, if any, have been processed, a decision 144 determines if the keyboard operating mode must be changed between typing and pointing modes. This check for mode change may consist of examining keyswitch presses and releases, checking for commands from the computer via the communications link or checking for activation by the user of other sensors on the keyboard, such as thumbswitches (described below).
FIG. 16 illustrates a pair of thumbswitches 218 positioned below the space bar 214 of a keyboard 212 for indicating an operating mode change. The thumbswitches 218 are operated by sliding them in the plane of the keyboard generally towards the space bar. The thumbswitches are conveniently placed and operated at such an angle as to allow the natural movement of the thumbs towards the other fingers to provide the actuating effect.
The mode of the integrated keyboard and pointing device may be changed from typing to pointing by operating a thumbswitch. The effects on the pointing mode are determined by the next keyboard key pressed. For example, pointing mode may be entered by operating a thumbswitch, and when the J key is pressed to operate the pointing device, the thumbswitch may be released. When the J key is released, the keyboard returns to typing mode. Thus, the thumbswitch only indicates a possible entry into pointing mode, and not necessarily the exit from pointing mode. The thumbswitches 218 allow fast, natural operation without moving the hands away from the home row of the keyboard.
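The thumbswitch rule just described (thumbswitch arms pointing mode, the J key press enters it, and the J key release, not the thumbswitch release, exits it) can be encoded as a tiny state machine. The event names are hypothetical.

```python
# Illustrative thumbswitch mode logic for the example in the text.

class ThumbswitchMode:
    def __init__(self):
        self.armed = False             # thumbswitch currently held
        self.pointing = False          # current keyboard mode

    def event(self, name):
        if name == "thumb_down":
            self.armed = True          # possible entry into pointing mode
        elif name == "j_down" and self.armed:
            self.pointing = True       # pointing mode entered
        elif name == "thumb_up":
            self.armed = False         # does NOT exit pointing by itself
        elif name == "j_up":
            self.pointing = False      # J release returns to typing mode
```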
An appropriate mode change is made if indicated. Next, a test for pointing mode 146 is performed to determine if pointing device processing must be done. If the current mode is typing, control returns via loop 56 to again scan the keyboard 44.
If the current mode is pointing, the system next reads the direction sensors to acquire pointing direction data, and processes that data, step 148. Sensor data processing may include amplification, filtering, and nonlinear transformations. The sensor data processing that occurs at this point is simple and fixed in nature. Additional processing in the computer system remains to be done. When the sensor data processing is completed, the processed sensor data is sent to the computer 150. In an alternative embodiment, raw sensor data could be sent to the computer for processing, either in real-time or in packets. After the processed sensor data is sent, control passes over loop 56 to again scan the keyboard, and the foregoing process is repeated.
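The simple, fixed sensor processing of step 148 might be sketched as follows. The gain, filter constant, and deadband threshold here are invented for illustration; the specification gives no numeric values.

```python
# One step of fixed sensor processing: amplify, low-pass filter, then apply
# a nonlinear transformation (a deadband) before sending to the computer.
GAIN = 4.0       # assumed amplification factor
ALPHA = 0.5      # assumed exponential-smoothing constant (simple filtering)
DEADBAND = 2.0   # assumed threshold: ignore tiny forces from a resting finger

def process_sample(raw, prev_filtered):
    """Return (value_to_send, new_filtered_state) for one raw sensor sample."""
    amplified = GAIN * raw
    filtered = ALPHA * amplified + (1 - ALPHA) * prev_filtered
    if abs(filtered) < DEADBAND:     # nonlinear transformation: deadband
        return 0.0, filtered
    return filtered, filtered
```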
FIG. 11 is a block diagram of a computer system that includes the integrated keyboard of FIGS. 9-10. At the application program (62) and operating system (66, 104) levels, this figure is similar to FIG. 7. At the lowest interface level, however, all user input is via common "integrated interface software" 162. This novel structure enables additional control to be performed and new features to be implemented, as further described below. By intercepting keyboard and pointing device data at the lowest levels, new operations may be provided which are entirely transparent to the application program. Thus, the application program "thinks" the system includes a mouse.
The computer system of FIG. 11 operates as follows. The integrated keyboard 166 sends binary encoded key code and pointing device sensor data over a communications link 236 to integrated interface software 162. The integrated interface software 162 includes a special interrupt handler for separating the data stream from the integrated keyboard into keyboard and pointing device data streams. Integrated interface software 162 also includes key code-to-character code mapping software and additional pointing device data processing software, including software to move the cursor on the display screen.
A key code processing portion of the integrated interface software 162 operates in a manner similar to keyboard interrupt handler 72, with additional software to detect and honor keyboard/pointing device mode change requests. Software 162 inserts character codes into the keyboard queue 70. Operating system keyboard services 66 reads the queue 70 via operating system keyboard queue operations 68 upon request by the application program 62. The application program is linked to the OS keyboard services via an interface 64.
A set of memory locations called integrated state data 158 includes everything contained in the pointing device state data 114 (FIG. 6), as well as additional state data to keep track of operating modes (for example typing and pointing) and new features (for example variable cursor tracking controls, described below). Interface 156 and particularly interface 160 likewise have additional controls and features as compared to the corresponding interfaces 112 and 116 found in a typical pointing device (FIG. 6), as required in view of the new integrated interface software described below.
Driver interface block 152 provides a standard pointing device interface as required by OS (Operating System) pointing services 104 and the application program 62. Both the interface between the application program and the operating system, and the interface 106 between the operating system and the driver interface 152 are similar to a typical pointing device system so that operation of the integrated system is transparent to the OS and to the application program.
Integrated interface software 162 communicates with driver interface 152 as shown by path 154.
FIG. 12 is a control flow diagram for the integrated interface software 162. It includes all the processing performed by a typical keyboard interrupt handler (see FIG. 5), indicated by dashed line 99. Additionally, integrated interface software 162 includes software to detect and honor keyboard/pointing device mode change requests, convert pointing device sensor data into changes in cursor position, emulate pointing device buttons using keyboard keyswitches, modify pointing device operations using keyboard keyswitches as controls, and provide integrated typing and pointing macro capabilities from a pointing device using keyboard keyswitches (188). Each of these features is described in turn below, in the order of control flow shown in FIG. 12.
A Read Hardware block 76 reads binary key code/pointing device data provided over the communications link 236 (FIG. 11). Once the data has been read, Check Mode block 167 determines the present mode (for example, typing or pointing) of the integrated keyboard/pointing device. Control transfers to Change Mode? 169 if the present mode is typing, and to Change Mode? 170 if the present mode is pointing. The typing/pointing mode encompasses all other keyboard states (shift, caps lock, control, alt), so that typing/pointing mode becomes a new mode "on top of" all these conventional keyboard modes, not merely in addition to them.
Change Mode? 169 determines if the acquired key code data indicates that the user wishes to change the mode of the integrated keyboard from typing to pointing. This determination may include checking for a unique predetermined code, or a more complex test based on multiple previous codes and other keyboard modes (shift, caps lock, control, alt) as well. If the test determines that the integrated keyboard is to be changed from typing to pointing mode, the requested change is performed in Change Mode 168. Change Mode 168 includes setting flags in state data 158 (FIG. 11), changing the visual appearance of the cursor and acknowledging the mode change to the user. After the mode change is completed, nothing more needs to be done until further data is received, so control is passed to return from interrupt 98.
If Change Mode? 169 determines that no change of mode is necessary, the key code is treated as it would be treated by a typical keyboard interrupt handler. Thus, tests for Shift Key 78, Caps Lock Key 82, Control Key 86, and Alt Key 90 are performed. The status of each of these special keys 80, 84, 88 and 92, respectively, are updated as required. Map Key Code to Character Code 94, Enqueue Character Code 96, and Return From Interrupt 98 all operate as in the typical keyboard interrupt handler.
If Check Mode 167 determines that the integrated keyboard device is in pointing mode, a test 170 is performed to determine if the user wishes to begin typing again. This test may include examining a single data item, or a more complicated test based on conventional keyboard modes (shift, caps lock, control, alt) and previous binary key code/pointing device data items. If Change Mode? 170 determines the mode is to be changed to typing mode, the requested change is performed by Change Mode block 172.
Change Mode 172 may include setting flags in state data 158, changing the visual appearance of the cursor and acknowledging the mode change to the user. After the requested mode change is effected, nothing more needs to be done until further data is received, so control is passed to Return 98.
If Change Mode? 170 determines that the mode is to remain pointing, the next test performed on the data is to test for cursor positioning data, labeled Pointing Data? block 174. This test determines whether the data is pointing device positioning data or key code data. This test may include checking flags and counters in state data 158 as well as testing bits within the data itself.
If Pointing Data? 174 determines that the data is cursor positioning data, Update (cursor) Position block 176 is performed to move the cursor on the computer display screen, update the cursor position in the state data 158, and notify the driver interface 152 (FIG. 11) of the changes. Update Position 176 may include cursor tracking algorithms to convert the cursor positioning data into changes in cursor position. Upon completion, control is transferred to return from interrupt 98.
Cursor movement is effected as follows. The pointing data are read (76) and the cursor repositioned periodically, for example, as driven by a keyboard clock interrupt operating at approximately 18-20 Hz. The "cursor speed" apparent to the user actually is proportional to the magnitude of changes in cursor position each time the cursor is repositioned, and the frequency of repositioning. A change in the clock speed will of course result in a proportional change in apparent cursor speed, so the keyboard clock rate of a system should be taken into account in cursor tracking.
Accordingly, to effect high apparent cursor speed, a given input force is mapped to a relatively large change in cursor position. To "slow" the cursor, changes in position are scaled down.
The update position or "tracking" software 176, in addition to responding to pointing data from the keyboard, can also consider implicit speed controls implied by the state of the pointing event keys as reflected in the state data.
The tracking software also takes into account the explicit speed modification settings, controlled by modification software 184, the display resolution, as determined by the standard means for the particular computer and display system, and the display mode (character cells or graphics pixels) as determined by the standard means for the particular computer and display system.
One operative example of implementing the cursor tracking software uses multiple predetermined look-up tables. For instance, a set of graphics mode look-up tables take into account everything except the implicit pointing event key speed controls and the explicit speed modification settings controlled by modify operation 184. Two sets of two graphics mode tables would be provided: a horizontal and a vertical table set for use when the pointing event keys are not pressed, and another horizontal and vertical set for when the pointing event keys are pressed.
The partially processed data from the communications link 236 is used as an index into the appropriate table. The result of this table look-up operation is a cursor displacement, which may be multiplied by a scaling factor determined by the explicit cursor speed controls in modify operation 184. After this multiplication, the displacement is used as a change in the horizontal or vertical component of the cursor position. The new cursor position is then calculated by adding the displacement to the present position. After both the horizontal and vertical coordinates have been recomputed, the cursor image is erased from the present position and redrawn at the new position. Analogous table sets may be provided for character cell display modes.
An advantage of using different tables depending on the state of the pointing event keys is that, with most application software, the pointing event keys in the released position indicate that nothing important is happening. In this situation, the user generally desires to move the cursor as quickly as possible, to reduce the cursor repositioning time.
When a pointing event key is pressed, the user is indicating something specific and generally desires to move the cursor in a more controlled manner. Speed is important when the event buttons are not being pressed, and control is important when an event button is being pressed. In practice, it has been found that an approximately quadratic force-to-apparent cursor speed mapping table works best for fast cursor positioning, and that an approximately linear mapping table works best for more controlled cursor positioning.
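The two-table tracking scheme just described can be sketched as follows. The table contents, index range, and scaling factor are placeholders; actual tables would be tuned per display mode and system, as the text explains.

```python
# Table-driven cursor tracking: the processed sensor data indexes a look-up
# table, and the result is scaled by the explicit speed setting.
# Index = processed sensor magnitude (0..7); value = cursor displacement.
FAST_TABLE = [0, 1, 2, 4, 8, 16, 32, 64]  # roughly quadratic: fast positioning
SLOW_TABLE = [0, 1, 2, 3, 4, 5, 6, 7]     # roughly linear: controlled positioning

def cursor_displacement(sensor_index, event_key_down, speed_scale=1):
    # Use the controlled table while a pointing event key is held,
    # the fast table otherwise.
    table = SLOW_TABLE if event_key_down else FAST_TABLE
    # Scale by the explicit cursor speed setting (cf. modify operation 184).
    return table[sensor_index] * speed_scale

def reposition(pos, x_index, y_index, event_key_down, speed_scale=1):
    # Add the looked-up displacements to the present cursor position.
    x, y = pos
    return (x + cursor_displacement(x_index, event_key_down, speed_scale),
            y + cursor_displacement(y_index, event_key_down, speed_scale))
```

Analogous tables would be provided for character cell display modes, per the text.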
"Pointing events" are input by a user of pointing-related information, usually other than cursor positioning or directional information. For example, in the prior art, pointing events include pressing and releasing mouse buttons, also referred to as "clicking" the mouse buttons. Pointing events may serve to select an item associated with the current position of the cursor. Or, a pointing event may signal a request to move previously selected text to the current location of the cursor.
According to the present invention, one or more of the keyswitches in the integrated keyboard are designated for emulating pointing events. These are called pointing event keys. When the integrated keyboard is in the pointing mode, actuation of one of the pointing event keys is interpreted as a pointing event. This allows a user to input pointing events without removing their hands from the keyboard. Preferably, the D and F keys, or other keys on the home row of the keyboard, are so designated for emulating pointing events, so that all typing and pointing operations can be conducted from the normal typing position. This feature yields substantial time savings and ergonomic advantages to a user of the integrated keyboard. Pointing event emulation is implemented in the preferred embodiment as follows.
Referring again to FIG. 12, if test 174 determines that the binary key code/pointing device data is not pointing device data, the data is then known to be key code data. In that case, control passes to test 178, to test the data for pointing device button emulation.
If the data represents any of the keyboard keyswitches designated to emulate pointing device buttons, the key code data is passed on to block 180 where the key code data is converted to changes in pointing device button status and stored in state data 158 (FIG. 11). This information may be used in various ways. For example, it may be helpful in some applications to temporarily freeze cursor motion for a predetermined period of time in response to a pointing event, to minimize the effects of inadvertent cursor movement during the pointing event.
Software 180 thus emulates the use of pointing device event buttons such as mouse buttons. Because nearly any or all of the keyswitches in the array may be designated to emulate pointing device buttons, the number of keyswitches so designated is limited only by the number of keyswitches on the keyboard. This feature provides for more flexibility in pointing operations as compared to a two or three button stand alone pointing device. When the emulation actions 180 have been completed, control is passed to block 184, where implicit changes to the operation of the pointing device are performed, such as changing the cursor tracking algorithm depending on the state of the emulated pointing device buttons.
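The conversion in block 180 of event-key codes into emulated button changes might look like the following sketch. The D/F-to-button assignment follows the example in the text; the data structures and function name are invented for illustration.

```python
# Pointing event key emulation: designated home-row keys are translated
# into emulated pointing device button state changes in the state data.
EVENT_KEY_TO_BUTTON = {"D": "left", "F": "right"}  # illustrative assignment

def emulate_button(state_data, key, pressed):
    """Translate an event-key press/release into an emulated button change.
    Returns True if the key was a designated pointing event key."""
    button = EVENT_KEY_TO_BUTTON.get(key)
    if button is None:
        return False                         # not a designated event key
    state_data["buttons"][button] = pressed  # record change in state data 158
    return True
```

Because the mapping is just a table, nearly any keyswitch may be so designated, which is the flexibility advantage noted above.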
The processes corresponding to FIG. 12, blocks 167, 168, 169, 170 and 172, illustrate a system in which mode change is effected explicitly by predetermined keyboard events defined for that purpose.
The integrated keyboard and pointing device system can be arranged to provide for implicit mode change, i.e., to switch from the typing mode to the pointing mode of operation in response to user actions at the keyboard other than actions explicitly directed to a mode change as described above. A user's intent to change mode can be inferred from the user's actions at the keyboard. For example, a user intending to begin pointing might depress the pointing key and hold it down for a period of time longer than is usually encountered in typing. During typing mode, these actions can be used to infer that a change to the pointing mode is desired.
Whether or not a mode change is appropriate, however, will depend upon the user's actions after the pointing key press, and the timing of those actions. Therefore, responsive to a pointing key press in typing mode, the system goes into an intermediate or "wait and see" mode to subsequently determine whether to finally switch to pointing mode or resume the normal typing mode. This decision-making process, illustrated in FIGS. 17-25, is implemented so as to be substantially transparent to the user.
We have found that the distributions of time intervals between keyboard events (usually key presses) for typing versus pointing are distinctly different. Graphically, these distributions would resemble a bimodal probability curve. When typing, the amount of time a key is held down is typically less than 100 milliseconds. When pointing, however, the amount of time the key is held down is typically not less than 500 milliseconds (mostly due to the fact that the user's tracking response time is around 400 milliseconds).
Also, when typing, it is common for a second key to be pressed or released soon after any given key is pressed, for example within about 200 milliseconds. When pointing, however, at least about 300 milliseconds elapses before a pointing event button is pressed. At least that much time is necessary for repositioning the cursor, even over a short distance.
To provide for implicit mode change, the system processes a pointing key press detected in typing mode as follows. First, the pointing key press is intercepted in that, initially, it is not processed as a usual typing event. (Typically, that would be mapping the key press and enqueuing it.) Rather, the system begins monitoring the time interval elapsed since the pointing key press, and monitors the keyboard to detect a second keyboard event. After a sufficiently long time passes without detecting a second keyboard event, intent to point is inferred, and the system switches to the pointing mode of operation. This is a safe conclusion after about 400 milliseconds.
Conversely, if a second keyboard event is detected very soon after the pointing key press, within approximately 100 milliseconds, it is concluded that the user intends to remain in the normal typing mode. Accordingly, the system processes first the pointing key press and then the second keyboard event as normal typing events. The "wait and see" delay of a few hundred milliseconds is not disruptive to the user.
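The timing inference just described reduces to a small decision function. This is a minimal sketch using the approximate 100 ms and 400 ms thresholds from the text; the names are illustrative.

```python
# Implicit mode-change inference from inter-event timing.
TYPE_THRESHOLD_MS = 100    # second event this soon => user is typing
POINT_THRESHOLD_MS = 400   # this long with no second event => user is pointing

def infer_intent(elapsed_ms, second_event_seen):
    if second_event_seen and elapsed_ms <= TYPE_THRESHOLD_MS:
        return "typing"                  # process both events as normal typing
    if not second_event_seen and elapsed_ms >= POINT_THRESHOLD_MS:
        return "pointing"                # safe to switch to pointing mode
    return "wait"                        # 100-400 ms window: keep watching
```

The "wait" result corresponds to the intermediate "wait and see" states elaborated in FIGS. 17-25.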
Implementation of the methods just described and of methods of determining whether to change mode when a second keyboard event arrives within the 100-400 millisecond "wait and see" window are described next.
The "J" key will be referred to herein interchangeably with the "pointing key" as any key on the keyboard may be coupled to force sensors to serve as a pointing key.
When the "J" key is pressed, the chording keys' (SHIFT, CONTROL and ALT) status are checked. If any of these are being held down, it is assumed that the user wishes to type, and the "J" key pressed is passed on normally. If none of the chording keys are pressed, it cannot yet be determined whether the "J" was pressed with the intention of typing or pointing.
The fact that the "J" key was pressed is stored internally, and a timer is started The timer may be implemented in a variety of ways, for example, using a hardware timer in the keyboard controller, or a software loop, or some interrupt signal from a host processor. Preferably, in response to the "J" key press, the keyboard is instructed to start sending pointing sensor data packets. Receipt of the sensor data packets is used as a timer interrupt to clock the timer The packets are read in and parsed, but are not used to cause the cursor to move, at least initially.
FIG. 17 shows a timeline for implementing an implicit mode change feature. As shown, there are six states numbered 0 through 5. Each state extends for an interval of approximately 100 milliseconds after the conclusion of the next preceding state. These times are approximate and may be varied to meet individual needs. State 0 is the normal typing mode. Responsive to a pointing key press, the state advances to State 1 and the timer interrupt is enabled to begin monitoring the time elapsed since the pointing key press.
FIGS. 18-25 are flowcharts of the implicit mode change methodology. In all of these flowcharts, the following conventions apply. Control starts at the top of the flowchart and flows generally toward the bottom. Decisions are indicated by a question mark (?) in a process or step description. Control flows to the right from an affirmative decision.
FIG. 18 is a flowchart for the timer interrupt. This process is not executed unless the timer is enabled (started). First, a counter is incremented. Next, the current state is checked. For each state that has a following state, the value of the count is checked to see if the current time period (State) has elapsed. If the current period is concluded, the state transitions to the next state. Additionally, special actions are performed in some cases further described below.
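The FIG. 18 timer interrupt can be sketched as follows. The tick rate per state is an assumption; the specification gives only approximate 100 ms state durations.

```python
# Timer interrupt sketch: each tick increments a counter and, when the
# current ~100 ms period has elapsed, advances the timing state.
TICKS_PER_STATE = 2   # e.g., two ~50 ms sensor packets per ~100 ms state
FINAL_STATE = 5

def timer_tick(state, count):
    """One timer interrupt: increment the counter; advance the state when
    the current period has elapsed. Returns (state, count)."""
    count += 1
    if state < FINAL_STATE and count >= TICKS_PER_STATE:
        return state + 1, 0   # current period concluded: next state
    return state, count
```

The special per-state actions mentioned in the text (saving the cursor position on entering State 2, for example) would be performed at each state transition.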
FIG. 19 shows a new or modified portion of the keyboard interrupt routine. It is assumed here that any handshaking protocol and any sensor data packets have already been handled. It is also assumed that during the intermediate or "wait and see" states, pointing key presses detected after the first pointing key press (results of auto-repeat) are ignored. As shown in the flowchart, in response to a keyboard interrupt, control passes to a respective one of FIGS. 20-25 that corresponds to the present timing state. For example, in response to a first keyboard interrupt during State 0, control passes to a process represented by the flowchart of FIG. 20.
Referring now to FIG. 20, the system checks to determine if the keyboard event is a pointing key press. If the first keyboard event is a pointing key press, the system tests the states of the chording keys. If any of the chording keys is also pressed, the pointing key press is processed normally and the system continues in the typing mode. By the phrase "processed normally", it is meant that the pointing key press is further processed in the same fashion as any keyboard event in the normal typing mode. For example, a keycode corresponding to the pointing key press may be enqueued for transmission to an application program. The particulars of such a transmission are known.
The user's intent to type is inferred from use of the chording keys in combination with the pointing key. This detects many known typing events and allows the auto repeat function to work on the pointing key if the user so desires. For example, to repeat lower case Js, the user can invoke CAPS LOCK, press and hold SHIFT, and press and hold "J". Auto repeat will function and the system remains in typing mode.
Referring now to the right side of FIG. 20, if the first keyboard event was a pointing key press and none of the chording keys was also pressed, the state is set to state 1, a "wait and see" state. The system begins to monitor the time elapsed since the pointing key press by enabling the timer. The system also starts to acquire sensor scan data from the pointing key.
State 1 is a "wait and see" state that corresponds to FIG. 21. If a second keyboard event occurs in this state, it is inferred that the user intends to continue typing. Accordingly, referring to FIG. 21, the system enqueues the pointing key press, resets to state 0, stops the timer, stops sensor data acquisition and processes the new (second) keyboard event normally so that the system resumes the normal typing mode of operation.
Three events are continuously monitored, namely, the "J" key going back up (pointing key release), other keyboard events, and the timer. Referring to FIG. 18, if the system is in state 1, and the elapsed time exceeds approximately 100 milliseconds, the state advances to State 2; the present cursor position is saved (stored); and then the cursor is allowed to begin moving.
During state 2, if the timer exceeds a second predetermined time interval, approximately 200 milliseconds, the state advances to state 3 (see FIG. 18). On the other hand, if a second keyboard event is detected during state 2, the control passes to code represented by the flowchart of FIG. 22.
As shown in FIG. 22, in this case, the system resumes the normal typing mode. Specifically, resuming the typing mode may include creating an indication of the pointing key press for further processing as a typing event; resetting to state 0; stopping the timer; stopping the sensor data acquisition; restoring the cursor to the initial position; and processing the second keyboard event normally.
As indicated, if no keyboard event occurs during timing state 2, the system advances to state 3, and continues to monitor the keyboard to detect the next keyboard event. Accordingly, if the elapsed time since the pointing key press is greater than the second time interval, and a second keyboard event has not been detected, the system is in state 3.
In this state, in response to detecting a second keyboard event, control passes to code represented by the flowchart of FIG. 23. Here, the system tests to determine whether the second keyboard event is a pointing key release. If the second keyboard event is not a pointing key release, the system determines whether the second keyboard event is a pointing event. If the second keyboard event is a pointing event, the pointing event is enqueued and the system switches to the pointing mode of operation.
To place this in context, state 3 corresponds to the interval from approximately 200 to 300 milliseconds after the pointing key press. During that interval, the likelihood is that the user intends to begin pointing, though this is not yet certain. But if the next keyboard event is a pointing event, such as pressing one of the keys designated as a pointing event key, it is then concluded that pointing mode is desired. The assumption here is that the likelihood of a user wanting to type a "J" followed by an "F", "D", or "S" is lower than the likelihood that the user wants a pointing event. The decision is conditioned such that a typing event is assumed below 200 milliseconds and a pointing event after that. We have found that if a key is held down longer than approximately 300 milliseconds, it is very unlikely that another key would be pressed unless the user intended to point.
Referring again to FIG. 23, if the second keyboard event is not a pointing event, or if the second keyboard event is a pointing key release, then the system resumes the typing mode of operation. The specific steps taken to resume the typing mode of operation are the same as those mentioned above with regard to FIG. 22.
If the elapsed time is greater than the third time interval (300 msec) and a second keyboard event has not been detected, the state is advanced to state 4, as shown in FIG. 18. If a second keyboard event is detected during state 4, control is directed, according to the keyboard interrupt flowchart of FIG. 19, to code that corresponds to the flowchart of FIG. 24.
Referring to FIG. 24, the system tests to determine whether the second keyboard event is a pointing key release. If the second keyboard event is not a pointing key release, indicated on the left side of FIG. 24, the second keyboard event is processed as a pointing macro and the state is advanced to state 5, i.e., the system switches to the pointing mode of operation. In other words, referring to the timeline of FIG. 17, if the pointing key has been held down for more than 300 milliseconds (state 4), and a keyboard event other than a pointing key release is detected, it is assumed to be a pointing macro and the system is switched to a pointing mode of operation. On the other hand, if the event is a pointing key release, corresponding to the right side of FIG. 24, the system resumes the typing mode of operation, in the same manner described above.
Referring back again to FIG. 18, if the elapsed time since the pointing key press is greater than a fourth predetermined interval, about 400 milliseconds, pointing mode is assumed. Thus, the system switches to the pointing mode of operation without explicit instruction from the user to do so. This corresponds to timing state 5. In timing state 5, a keyboard interrupt is handled according to the flowchart of FIG. 25. Referring to FIG. 25, in response to detecting a keyboard event in this state, the system tests to determine whether the event is a pointing key release. If not, the new keyboard event is processed as a pointing macro. Alternatively, if the keyboard event is a pointing key release, as shown on the right side of FIG. 25, the system switches back to the typing mode of operation. Specifically, state is reset to state 0, the timer is stopped, and sensor scan data acquisition is stopped. The system thus resumes the typing mode of operation as described above.
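The per-state keyboard-event handling of FIGS. 20-25 can be condensed into a single dispatch function. This is a sketch; the event and action names are invented labels, but the branch structure follows the text.

```python
def handle_event(state, event, chording_down=False):
    """Return (new_state, action) for a keyboard event in timing state 0-5."""
    if state == 0:                              # FIG. 20: normal typing mode
        if event == "pointing_key_press" and not chording_down:
            return 1, "start_timer"             # enter "wait and see"
        return 0, "type_normally"
    if state in (1, 2):                         # FIGS. 21-22: second event soon
        return 0, "enqueue_J_then_type"         # => user is typing; resume
    if state == 3:                              # FIG. 23: ~200-300 ms window
        if event == "pointing_event_key":
            return 5, "enqueue_pointing_event"  # => switch to pointing mode
        return 0, "enqueue_J_then_type"         # release or other key => typing
    # FIGS. 24-25: states 4 and 5
    if event == "pointing_key_release":
        return 0, "resume_typing"               # J released => back to typing
    return 5, "pointing_macro"                  # other event => pointing macro
```

The "resume typing" and "switch to pointing" actions would also perform the housekeeping described above: stopping or starting the timer and sensor data acquisition, restoring or moving the cursor, and updating state data 158.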
The foregoing mode change principles are applicable to systems other than conventional computers with integrated keyboards. For example, any electronic apparatus may be equipped with a multi-function keyswitch. The apparatus may have a first operating mode, analogous to the typing mode of a computer keyboard system, in which the multi-purpose keyswitch has a binary function (press and release). The apparatus may be configured, using the above described methods, to be switched into a second mode for acquiring pointing or pointing direction information from the multi-purpose keyswitch. This change may be effected implicitly as described.
Referring back to FIG. 12, explicit modification of the integrated system, i.e., changes in operation requested by a user, is described next. If test 178 determines that the key code information does not represent any of the pointing event keyswitches, control is passed to a test 182 to determine whether the key code represents a keyswitch assigned to modify operation of the pointing device or other aspects of the integrated keyboard system. Such keyswitches are called modify operation keyswitches, and can perform various tasks further described below. If the key code represents a modify operation keyswitch, control is passed to modification software 184. Modification software 184 uses the key code information to modify operation of the system as requested.
In a system of the type described, such modifications may include a wide variety of changes to the behavior of the system. The effects of some of these changes are known in the prior art, but known ways to accomplish them are quite different and far less efficient. Other modifications are applicable only to the new, integrated keyboard system.
To illustrate, it is known to change the apparent cursor speed in a mouse system by first exiting the application program, running a special program to alter cursor speed, and then re-entering the desired application program. In the integrated system, a single stroke of the keyswitch assigned the cursor speed-up (or down) function accomplishes the same result. Obviously, the change is made more quickly and without significant interruption of the user's work. Importantly, the speed control is interactive in that the user immediately sees the result of the speed adjustment. Further adjustment may be made, immediately, if deemed necessary.
Another illustrative modification is changing the condition to exit from pointing mode. In other words, entering a pointing-lock mode, in which the user has to do more than release the pointing key to resume typing mode. For instance, pressing the space bar while in pointing mode may be arranged to enter a pointing-lock mode, so the system does not exit pointing mode, even if the J key is released. The system may return to typing mode only when the space bar is pressed.
Yet another use of modify operation keys is to lock/unlock or change axes of the pointing device. In the past, these changes have been made by special additional mouse buttons, or by running special software at the application level. Making these selections with modification keys is much faster and less disruptive.
Another example of a system modification that may be implemented with modify operation keys is changing the cursor tracking algorithm. For example, various predefined tracking algorithms may be provided, each of which is optimized for a particular application, e.g. word processing, CAD, etc. A single keystroke selects the best tracking algorithm for cursor control. After the indicated modify operation is completed, control is passed to Return 98.
Yet another modification is to toggle the integrated keyboard between right- and left-handed operation. For example, in right-handed mode on a QWERTY keyboard, the J key is the pointing device and the D and F keys are conveniently used as pointing event keys. Upon switching to left-handed mode, the F key is the pointing device and the J and K keys are pointing event keys. This feature requires two multipurpose keyswitches (disposed at F and J), but otherwise is easily implemented in the software by simply changing the interpretation of the affected keys for pointing mode. Thus, the test for pointing event keys would look for J and K while in left-handed mode, but look for D and F for the same purpose while in right-handed mode. These examples merely begin to illustrate the functions available with an integrated keyboard system. Virtually all of them may be implemented at the lower software levels, namely the interrupt and driver levels, so they are transparent to the application software and the operating system.
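The handedness toggle just described reduces to a keymap lookup, so the rest of the pointing-mode software is unchanged. A minimal sketch, using the key assignments from the text (function and table names are illustrative):

```python
# Illustrative sketch of right-/left-handed key interpretation.
# Key roles are taken from the text; the structure is an assumption.

KEYMAPS = {
    "right": {"pointer": "J", "events": ("D", "F")},
    "left":  {"pointer": "F", "events": ("J", "K")},
}

def classify_key(key, handedness):
    """Classify a keyswitch for pointing mode under the active keymap."""
    m = KEYMAPS[handedness]
    if key == m["pointer"]:
        return "pointing-device"
    if key in m["events"]:
        return "pointing-event"
    return "other"
```

Switching handedness then requires no change to the event-handling code itself, only a different table entry, which is why the feature is described as easily implemented in software.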
Another illustrative use of modify operation keys is for macros. Macros, used as abbreviations for segments of text and/or strings of program commands, are known. In the integrated system, macros can include a mix of cursor motions, pointing events and keyboard strokes. For example, a macro can be set up to move the cursor, pull down a menu and pick an item. This feature can easily be programmed in the integrated system environment described herein.
If the test for pointing device modification (182) determines the key code does not represent any other modification request, control is passed to a test for macro expansion 188. Macro Expand 188 determines if the key code is one that is currently a macro command in need of expansion. If so, the macro is expanded and control is passed to Change Mode? 170 where the expanded data is operated on.
If Macro Expand 188 determines the key code is not one currently designated for macro expansion, control is passed to a test for typing emulation, "Alpha Emulate" block 186. Alpha Emulate 186 determines if the key code is currently being mapped back to a typing operation; if so, control is passed to the test for the shift key, "Shift?" 78, and key code processing continues as if the integrated device mode were typing instead of pointing. This feature effectively and automatically switches the device to typing mode, temporarily, in response to the user entering typing data; the user can resume pointing activity without explicitly changing mode. If test 186 determines the key code is not currently being mapped back to a typing operation, control is passed to Return 98 and the key code is effectively ignored.
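The chain of tests described in the last two paragraphs (pointing-device modification 182, Macro Expand 188, Alpha Emulate 186, falling through to Return 98) can be sketched as a simple dispatch function. The tables below are hypothetical placeholders, not the patent's actual key assignments:

```python
# Hypothetical sketch of the pointing-mode dispatch chain (blocks 182,
# 188, 186 in the text). All table contents are illustrative.

MODIFY_KEYS = {"R": "speed-up", "V": "speed-down"}
MACROS = {"M1": ["move-cursor", "pull-down-menu", "pick-item"]}
ALPHA_EMULATE = {"Q", "W", "E"}   # keys mapped back to typing operations

def handle_key(code):
    if code in MODIFY_KEYS:               # test 182: modify operation
        return ("modify", MODIFY_KEYS[code])
    if code in MACROS:                    # test 188: expand the macro
        return ("macro", MACROS[code])
    if code in ALPHA_EMULATE:             # test 186: process as typing
        return ("type", code)
    return ("ignore", None)               # fall through to Return 98
```

In the real system the macro branch would feed the expanded data back through the mode tests (Change Mode? 170) rather than returning it, but the ordering of the tests is the point of the sketch.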
Details of implementing the foregoing software will be apparent to those skilled in the art in view of this disclosure. The various features indicated as performed in the computer, as distinguished from in the integrated keyboard, preferably are implemented in software so that they can be employed on various computer systems without modification of the computer hardware.
FIG. 13 depicts a user flowchart for an integrated keyboard according to the present invention. The figure illustrates the control flow of the computer user's decisions and actions while using the integrated system. Once the user has begun a task requiring both typing and pointing activities, the user must repeatedly make a mental decision to point or type 122. If the decision is to type, a test to see if the user is typing already 134 must be performed to make sure the integrated device is in typing mode. If not, a change to typing mode 192 must be performed before typing can begin. This mode change is a simple action that can be performed with the hands in the typing position on the keyboard. Once the mode is typing, the user proceeds to type 138.
If the decision 122 is made to point, a test to see if pointing already 124 must be performed to make sure the integrated device is already in pointing mode. If so, the user has merely to continue. If not, the user must change the keyboard to pointing mode 190 and notice the cursor 194.
The switch to pointing mode 190 can be performed with the hands in the typing position on the keyboard. The cursor is easy to find because, in response to the mode change, the system causes the cursor image on the display screen to flash, for example by intermittently enlarging the cursor image. Noticing the change in cursor appearance is particularly easy because the peripheral vision of the human eye is extremely well adapted to noticing changes in brightness and movement. Thus, there is no need to remember where the cursor was last seen, to have it coincidentally in the field of view, or to actuate the pointing device to detect cursor motion.
Once the integrated device is set to pointing mode, the pointing device may be operated 130 and useful actions performed. Pointing events are indicated by pressing, holding, or releasing keyboard keys 196. This aspect of the integrated system allows the user to indicate pointing events while the hands remain in the typing position. Typical pointing device operations may involve repeatedly moving the cursor and pressing keyswitches.
When there is a break in activities, or when a change in activities is required, the user effectively returns to the decision to point or type 122 via loop 140.
FIG. 14 is a user flow chart for changing the cursor speed using the integrated keyboard. It is assumed, of course, that the system is in the pointing mode. The user starts out by deciding if the cursor speed is acceptable 198 or if it should be altered. If the present speed is too slow, the user can increase it by pressing a predetermined keyboard keyswitch (for example, the R key) 204. If the cursor is deemed too fast, decelerating the cursor is likewise as simple as pressing a predetermined keyboard keyswitch (for example, the V key) 206. After the appropriate keyswitch is pressed, the user moves the cursor 208 to determine the effect of the change, and again decides if the speed is acceptable 198. If the speed is still not correct, the foregoing steps are repeated.
Once the cursor speed is correct, control transfers to Done 210 where any speed modification keyswitches that were being pressed are released. This process may be continuous: the user has only to press a rate control keyswitch, move the cursor until the speed reaches the desired rate, and release the rate control keyswitch. The user thus obtains instant feedback of cursor speed changes without leaving the application program and, indeed, without even moving a hand away from the typing position.
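The continuous rate control described above amounts to ramping a tracking gain on each cursor update while a rate key is held. A minimal sketch, using the R/V key assignments from the text; the gain units, step size, and limits are illustrative assumptions:

```python
# Illustrative sketch of the interactive cursor speed control.
# Gain is an integer multiplier here; real units are an assumption.

def adjust_speed(gain, key_held, step=1, lo=1, hi=10):
    """Ramp the cursor gain while the R (faster) or V (slower) key is held."""
    if key_held == "R":
        return min(hi, gain + step)       # speed up, clamped at the maximum
    if key_held == "V":
        return max(lo, gain - step)       # slow down, clamped at the minimum
    return gain                           # no rate key held: gain unchanged
```

Calling this once per cursor update while the keyswitch is held produces exactly the behavior described: the speed drifts toward the desired rate, the user sees the effect immediately, and releasing the key freezes the gain.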
FIG. 15 illustrates modification of the space bar on a conventional keyboard in accordance with the present invention so that it distinguishes being pressed toward the other keys in the keyboard from being pressed down into the keyboard. Space bar 214 is shown in the conventional position below and in front of example keyboard keys 212. Mode change bar 216 is placed in the front face of space bar 214 for ready operation by the user's thumbs. When the hands are in the typing position, the natural grasping action of the thumbs easily brings them to push mode change bar 216 further into the front of space bar 214, activating a mode change keyswitch mounted beneath the space bar.
Having illustrated and described the principles of our invention in a preferred embodiment thereof, it should be readily apparent to those skilled in the art that the invention can be modified in arrangement and detail without departing from such principles. In particular, but without limitation, allocation of functions between hardware and software is subject to wide variation depending upon numerous design considerations for any particular application. The principles disclosed herein can be implemented in many different combinations of hardware and software, as a matter of design choice, without departing from the principles of the invention. We claim all modifications coming within the spirit and scope of the accompanying claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4201489 *||Jan 12, 1978||May 6, 1980||Createchnic Patent AG||Keyboard actuatable with the aid of the fingers of at least one hand|
|US4680577 *||Apr 18, 1986||Jul 14, 1987||Tektronix, Inc.||Multipurpose cursor control keyswitch|
|US4712101 *||Dec 4, 1984||Dec 8, 1987||Cheetah Control, Inc.||Control mechanism for electronic apparatus|
|US4736191 *||Aug 2, 1985||Apr 5, 1988||Karl E. Matzke||Touch activated control method and apparatus|
|US4786895 *||Aug 2, 1985||Nov 22, 1988||Xeltron, S. A.||Control panel|
|US4924433 *||Jul 13, 1987||May 8, 1990||Brother Kogyo Kabushiki Kaisha||Word processor with attribute functions|
|US4937778 *||Apr 14, 1986||Jun 26, 1990||Wolf Chris L||System for selectively modifying codes generated by a touch type keyboard upon detecting of predetermined sequence of make codes and break codes|
|US4974183 *||Apr 5, 1989||Nov 27, 1990||Miller Wendell E||Computer keyboard with thumb-actuated edit keys|
|US5007008 *||Aug 17, 1990||Apr 9, 1991||Hewlett-Packard Company||Method and apparatus for selecting key action|
|US5041819 *||Oct 18, 1989||Aug 20, 1991||Brother Kogyo Kabushiki Kaisha||Data processing device|
|EP0365305A2 *||Oct 18, 1989||Apr 25, 1990||Brother Kogyo Kabushiki Kaisha||Data processing device|
|JPS62279419A *||Title not available|
|WO1981002272A1 *||Feb 5, 1981||Aug 20, 1981||I Litterick||Keyboards and methods of operating keyboards|
|2||Borland International, Inc., Turbo Pascal version 3.0 reference manual, 1985, pp. 22-25.|
|4||IBM Corp., 700 IBM Technical Disclosure Bulletin, Oct. 31, 1988, vol. 31, No. 5, pp. 276-277.|
|5||IBM Technical Disclosure Bulletin, "Compact Computer Keyboard", Mar. 1985, vol. 27, No. 10A, pp. 3640-3642.|
|8||Osiris Technologies, Inc., OmniPoint™ Embedded Cursor Control, Nov. 1988; S. Card et al., "Evaluation of Mouse, Rate-Controlled Isometric Joystick, Step Keys, and Text Keys for Text Selection on a CRT", Ergonomics, 1978, vol. 21, No. 8, 601-613.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US5317695 *||Apr 3, 1992||May 31, 1994||International Business Machines Corporation||Method and system for permitting communication between a data processing system and input pointing devices of multiple types|
|US5404458 *||Feb 24, 1994||Apr 4, 1995||International Business Machines Corporation||Recognizing the cessation of motion of a pointing device on a display by comparing a group of signals to an anchor point|
|US5454110 *||Aug 26, 1994||Sep 26, 1995||International Business Machines Corporation||Techniques for supporting operating systems for portable computers|
|US5485614 *||Jul 21, 1994||Jan 16, 1996||Dell Usa, L.P.||Computer with pointing device mapped into keyboard|
|US5574891 *||Sep 27, 1994||Nov 12, 1996||Acer Peripherals, Inc.||Method for managing the input codes from keyboard and pointing device|
|US5596348 *||Jan 26, 1995||Jan 21, 1997||Mitsubishi Denki Kabushiki Kaisha||Input apparatus|
|US5608895 *||Nov 30, 1994||Mar 4, 1997||Samsung Electronics Co., Ltd.||Method for providing mouse functionality using either an internal or external mouse input device|
|US5627566 *||Jun 9, 1992||May 6, 1997||Litschel; Dietmar||Keyboard|
|US5646647 *||Nov 14, 1994||Jul 8, 1997||International Business Machines Corporation||Automatic parking of cursor in a graphical environment|
|US5675361 *||Aug 23, 1995||Oct 7, 1997||Santilli; Donald S.||Computer keyboard pointing device|
|US5699082 *||Jun 7, 1995||Dec 16, 1997||International Business Machines Corporation||Enhanced program access in a graphical user interface|
|US5786805 *||Dec 27, 1996||Jul 28, 1998||Barry; Edwin Franklin||Method and apparatus for improving object selection on a computer display by providing cursor control with a sticky property|
|US6107996 *||Nov 24, 1992||Aug 22, 2000||Incontrol Solutions, Inc.||Integrated keyboard and pointing device system with automatic mode change|
|US6288709 *||Jul 20, 1998||Sep 11, 2001||Alphagrip, Inc.||Hand held data entry system|
|US6323846||Jan 25, 1999||Nov 27, 2001||University Of Delaware||Method and apparatus for integrating manual input|
|US6469694||Apr 12, 2000||Oct 22, 2002||Peter J. Mikan||Mouse emulation keyboard system|
|US6512511||Jun 20, 2001||Jan 28, 2003||Alphagrip, Inc.||Hand grippable combined keyboard and game controller system|
|US6545666 *||May 30, 2000||Apr 8, 2003||Agilent Technologies, Inc.||Devices, systems and methods for positioning cursor on display device|
|US6760013||Jan 4, 2001||Jul 6, 2004||Alphagrip, Inc.||Hand held gaming and data entry system|
|US6795055 *||May 30, 2000||Sep 21, 2004||Agilent Technologies, Inc.||Devices, systems and methods for facilitating positioning of cursor on display device|
|US6869011 *||May 2, 2002||Mar 22, 2005||Texas Instruments Incorporated||Apparatus and method for controlling an electrical switch array|
|US6888536||Jul 31, 2001||May 3, 2005||The University Of Delaware||Method and apparatus for integrating manual input|
|US6970158 *||May 1, 2003||Nov 29, 2005||Emerson Harry E||Computer keyboard providing an alert when typing in Caps Lock mode|
|US7091954 *||Jul 17, 2002||Aug 15, 2006||Kazuho Iesaka||Computer keyboard and cursor control system and method with keyboard map switching|
|US7145551 *||Feb 17, 1999||Dec 5, 2006||Microsoft Corporation||Two-handed computer input device with orientation sensor|
|US7154480 *||Apr 30, 2002||Dec 26, 2006||Kazuho Iesaka||Computer keyboard and cursor control system with keyboard map switching system|
|US7339580||Dec 17, 2004||Mar 4, 2008||Apple Inc.||Method and apparatus for integrating manual input|
|US7511702||May 9, 2006||Mar 31, 2009||Apple Inc.||Force and location sensitive display|
|US7538760||Mar 30, 2006||May 26, 2009||Apple Inc.||Force imaging input device and system|
|US7614008||Sep 16, 2005||Nov 3, 2009||Apple Inc.||Operation of a computer with touch screen interface|
|US7619618||Jul 3, 2006||Nov 17, 2009||Apple Inc.||Identifying contacts on a touch surface|
|US7653883||Sep 30, 2005||Jan 26, 2010||Apple Inc.||Proximity detector in handheld device|
|US7656393||Jun 23, 2006||Feb 2, 2010||Apple Inc.||Electronic device having display and surrounding touch sensitive bezel for user interface and control|
|US7656394||Jul 3, 2006||Feb 2, 2010||Apple Inc.||User interface gestures|
|US7663607||May 6, 2004||Feb 16, 2010||Apple Inc.||Multipoint touchscreen|
|US7705830||Feb 10, 2006||Apr 27, 2010||Apple Inc.||System and method for packing multitouch gestures onto a hand|
|US7764274||Jul 3, 2006||Jul 27, 2010||Apple Inc.||Capacitive sensing arrangement|
|US7782307||Nov 14, 2006||Aug 24, 2010||Apple Inc.||Maintaining activity after contact liftoff or touchdown|
|US7812828||Feb 22, 2007||Oct 12, 2010||Apple Inc.||Ellipse fitting for multi-touch surfaces|
|US7813998 *||Mar 22, 2007||Oct 12, 2010||Trading Technologies International, Inc.||System and method for selectively displaying market information related to a plurality of tradeable objects|
|US7844914||Sep 16, 2005||Nov 30, 2010||Apple Inc.||Activating virtual keys of a touch-screen virtual keyboard|
|US7920131||Aug 28, 2009||Apr 5, 2011||Apple Inc.||Keystroke tactility arrangement on a smooth touch surface|
|US7932897||Aug 15, 2005||Apr 26, 2011||Apple Inc.||Method of increasing the spatial resolution of touch sensitive devices|
|US7978181||Apr 25, 2006||Jul 12, 2011||Apple Inc.||Keystroke tactility arrangement on a smooth touch surface|
|US8041630||Sep 3, 2010||Oct 18, 2011||Trading Technologies International, Inc.||System and method for selectively displaying market information related to a plurality of tradeable objects|
|US8115745||Dec 19, 2008||Feb 14, 2012||Tactile Displays, Llc||Apparatus and method for interactive display with tactile feedback|
|US8125463||Nov 7, 2008||Feb 28, 2012||Apple Inc.||Multipoint touchscreen|
|US8131631||Mar 4, 2011||Mar 6, 2012||Trading Technologies International, Inc.||System and method for selectively displaying market information related to a plurality of tradeable objects|
|US8217908||Jun 19, 2008||Jul 10, 2012||Tactile Displays, Llc||Apparatus and method for interactive display with tactile feedback|
|US8239784||Jan 18, 2005||Aug 7, 2012||Apple Inc.||Mode-based graphical user interfaces for touch sensitive input devices|
|US8279180||May 2, 2006||Oct 2, 2012||Apple Inc.||Multipoint touch surface controller|
|US8314775||Jul 3, 2006||Nov 20, 2012||Apple Inc.||Multi-touch touch surface|
|US8321332||Jan 6, 2012||Nov 27, 2012||Trading Technologies International Inc.|
|US8330727||Nov 14, 2006||Dec 11, 2012||Apple Inc.||Generating control signals from multiple contacts|
|US8334846||Nov 14, 2006||Dec 18, 2012||Apple Inc.||Multi-touch contact tracking using predicted paths|
|US8381135||Sep 30, 2005||Feb 19, 2013||Apple Inc.||Proximity detector in handheld device|
|US8384675||Jul 3, 2006||Feb 26, 2013||Apple Inc.||User interface gestures|
|US8416209||Jan 6, 2012||Apr 9, 2013||Apple Inc.||Multipoint touchscreen|
|US8432371||Jun 29, 2012||Apr 30, 2013||Apple Inc.||Touch screen liquid crystal display|
|US8441453||Jun 5, 2009||May 14, 2013||Apple Inc.||Contact tracking and identification module for touch sensing|
|US8451244||Apr 11, 2011||May 28, 2013||Apple Inc.||Segmented Vcom|
|US8466880||Dec 22, 2008||Jun 18, 2013||Apple Inc.||Multi-touch contact motion extraction|
|US8466881||Apr 10, 2009||Jun 18, 2013||Apple Inc.||Contact tracking and identification module for touch sensing|
|US8466883||May 1, 2009||Jun 18, 2013||Apple Inc.||Identifying contacts on a touch surface|
|US8479122||Jul 30, 2004||Jul 2, 2013||Apple Inc.||Gestures for touch sensitive input devices|
|US8482533||Jun 5, 2009||Jul 9, 2013||Apple Inc.||Contact tracking and identification module for touch sensing|
|US8493330||Jan 3, 2007||Jul 23, 2013||Apple Inc.||Individual channel phase delay scheme|
|US8514183||Nov 14, 2006||Aug 20, 2013||Apple Inc.||Degree of freedom extraction from multiple contacts|
|US8552989||Jun 8, 2007||Oct 8, 2013||Apple Inc.||Integrated display and touch screen|
|US8554664||Sep 12, 2012||Oct 8, 2013||Trading Technologies International, Inc.|
|US8562437 *||May 2, 2012||Oct 22, 2013||Sony Corporation||Keyboard equipped with functions of operation buttons and an analog stick provided in a game controller|
|US8576177||Jul 30, 2007||Nov 5, 2013||Apple Inc.||Typing with a touch sensor|
|US8581870||Dec 6, 2011||Nov 12, 2013||Apple Inc.||Touch-sensitive button with two levels|
|US8593426||Feb 1, 2013||Nov 26, 2013||Apple Inc.||Identifying contacts on a touch surface|
|US8605051||Dec 17, 2012||Dec 10, 2013||Apple Inc.||Multipoint touchscreen|
|US8612856||Feb 13, 2013||Dec 17, 2013||Apple Inc.||Proximity detector in handheld device|
|US8629840||Jul 30, 2007||Jan 14, 2014||Apple Inc.||Touch sensing architecture|
|US8633898||Jul 30, 2007||Jan 21, 2014||Apple Inc.||Sensor arrangement for use with a touch sensor that identifies hand parts|
|US8654083||Jun 8, 2007||Feb 18, 2014||Apple Inc.||Touch screen liquid crystal display|
|US8654524||Aug 17, 2009||Feb 18, 2014||Apple Inc.||Housing as an I/O device|
|US8665228||Apr 13, 2010||Mar 4, 2014||Tactile Displays, Llc||Energy efficient interactive display with energy regenerative keyboard|
|US8665240||May 15, 2013||Mar 4, 2014||Apple Inc.||Degree of freedom extraction from multiple contacts|
|US8674943||Nov 14, 2006||Mar 18, 2014||Apple Inc.||Multi-touch hand position offset computation|
|US8698755||Jul 30, 2007||Apr 15, 2014||Apple Inc.||Touch sensor contact information|
|US8730177||Jul 30, 2007||May 20, 2014||Apple Inc.||Contact tracking and identification module for touch sensing|
|US8730192||Aug 7, 2012||May 20, 2014||Apple Inc.||Contact tracking and identification module for touch sensing|
|US8736555||Jul 30, 2007||May 27, 2014||Apple Inc.||Touch sensing through hand dissection|
|US8743300||Sep 30, 2011||Jun 3, 2014||Apple Inc.||Integrated touch screens|
|US8760430 *||Jul 25, 2013||Jun 24, 2014||Kabushiki Kaisha Toshiba||Electronic apparatus, input control program, and input control method|
|US8804056||Dec 22, 2010||Aug 12, 2014||Apple Inc.||Integrated touch screens|
|US8816984||Aug 27, 2012||Aug 26, 2014||Apple Inc.||Multipoint touch surface controller|
|US8866752||Apr 10, 2009||Oct 21, 2014||Apple Inc.||Contact tracking and identification module for touch sensing|
|US8872785||Nov 6, 2013||Oct 28, 2014||Apple Inc.||Multipoint touchscreen|
|US8902175||Apr 10, 2009||Dec 2, 2014||Apple Inc.||Contact tracking and identification module for touch sensing|
|US8924283||Aug 13, 2013||Dec 30, 2014||Trading Technologies International, Inc.|
|US8928618||Jun 18, 2014||Jan 6, 2015||Apple Inc.||Multipoint touchscreen|
|US8933905||Oct 9, 2013||Jan 13, 2015||Apple Inc.||Touch-sensitive button with two levels|
|US8982087||Jun 18, 2014||Mar 17, 2015||Apple Inc.||Multipoint touchscreen|
|US9001068||Jan 24, 2014||Apr 7, 2015||Apple Inc.||Touch sensor contact information|
|US9025090||Aug 11, 2014||May 5, 2015||Apple Inc.||Integrated touch screens|
|US9035907||Nov 21, 2013||May 19, 2015||Apple Inc.||Multipoint touchscreen|
|US9041652||Sep 14, 2011||May 26, 2015||Apple Inc.||Fusion keyboard|
|US9047009||Jun 17, 2009||Jun 2, 2015||Apple Inc.||Electronic device having display and surrounding touch sensitive bezel for user interface and control|
|US9069404||May 22, 2009||Jun 30, 2015||Apple Inc.||Force imaging input device and system|
|US9098142||Nov 25, 2013||Aug 4, 2015||Apple Inc.||Sensor arrangement for use with a touch sensor that identifies hand parts|
|US9128611||Feb 23, 2010||Sep 8, 2015||Tactile Displays, Llc||Apparatus and method for interactive display with tactile feedback|
|US9146414||Mar 23, 2015||Sep 29, 2015||Apple Inc.||Integrated touch screens|
|US9239673||Sep 11, 2012||Jan 19, 2016||Apple Inc.||Gesturing with a multipoint sensing device|
|US9239677||Apr 4, 2007||Jan 19, 2016||Apple Inc.||Operation of a computer with touch screen interface|
|US9244561||Feb 6, 2014||Jan 26, 2016||Apple Inc.||Touch screen liquid crystal display|
|US9262029||Aug 20, 2014||Feb 16, 2016||Apple Inc.||Multipoint touch surface controller|
|US9268429||Oct 7, 2013||Feb 23, 2016||Apple Inc.||Integrated display and touch screen|
|US9274611||May 16, 2014||Mar 1, 2016||Kabushiki Kaisha Toshiba||Electronic apparatus, input control program, and input control method|
|US9292111||Jan 31, 2007||Mar 22, 2016||Apple Inc.||Gesturing with a multipoint sensing device|
|US9298310||Sep 3, 2014||Mar 29, 2016||Apple Inc.||Touch sensor contact information|
|US9329717||Jul 30, 2007||May 3, 2016||Apple Inc.||Touch sensing with mobile sensors|
|US9342180||Jun 5, 2009||May 17, 2016||Apple Inc.||Contact tracking and identification module for touch sensing|
|US9348452||Apr 10, 2009||May 24, 2016||Apple Inc.||Writing using a touch sensor|
|US9348458||Jan 31, 2005||May 24, 2016||Apple Inc.||Gestures for touch sensitive input devices|
|US9383855||Jun 13, 2008||Jul 5, 2016||Apple Inc.||Identifying contacts on a touch surface|
|US9400581||Dec 9, 2014||Jul 26, 2016||Apple Inc.||Touch-sensitive button with two levels|
|US9448658||Jul 30, 2007||Sep 20, 2016||Apple Inc.||Resting contacts|
|US9454239||Sep 14, 2011||Sep 27, 2016||Apple Inc.||Enabling touch events on a touch sensitive mechanical keyboard|
|US9454277||Mar 26, 2015||Sep 27, 2016||Apple Inc.||Multipoint touchscreen|
|US9513705||Aug 3, 2010||Dec 6, 2016||Tactile Displays, Llc||Interactive display with tactile feedback|
|US9535546 *||Jun 11, 2014||Jan 3, 2017||Samsung Electronics Co., Ltd||Cover device having input unit and portable terminal having the cover device|
|US9547394||Aug 27, 2009||Jan 17, 2017||Apple Inc.||Multipoint touch surface controller|
|US9552100||Apr 8, 2016||Jan 24, 2017||Apple Inc.||Touch sensing with mobile sensors|
|US9557846||Oct 2, 2013||Jan 31, 2017||Corning Incorporated||Pressure-sensing touch system utilizing optical and capacitive systems|
|US9575610||Dec 30, 2015||Feb 21, 2017||Apple Inc.||Touch screen liquid crystal display|
|US9589302||Jul 24, 2014||Mar 7, 2017||Trading Technologies International, Inc.|
|US9600037||Feb 14, 2014||Mar 21, 2017||Apple Inc.||Housing as an I/O device|
|US9606668||Aug 1, 2012||Mar 28, 2017||Apple Inc.||Mode-based graphical user interfaces for touch sensitive input devices|
|US9626032||Jul 30, 2007||Apr 18, 2017||Apple Inc.||Sensor arrangement for use with a touch sensor|
|US9710095||Jun 13, 2007||Jul 18, 2017||Apple Inc.||Touch screen stack-ups|
|US9727193 *||Aug 27, 2015||Aug 8, 2017||Apple Inc.||Integrated touch screens|
|US20020190946 *||Dec 21, 2000||Dec 19, 2002||Ram Metzger||Pointing method|
|US20030201971 *||Apr 30, 2002||Oct 30, 2003||Kazuho Iesaka||Computer keyboard and cursor control system with keyboard map switching system|
|US20030201982 *||Jul 17, 2002||Oct 30, 2003||Kazuho Iesaka||Computer keyboard and cursor control system and method with keyboard map switching|
|US20030206155 *||May 2, 2002||Nov 6, 2003||Mitchell Michael Lane||Apparatus and method for controlling an electrical switch array|
|US20030206157 *||May 1, 2003||Nov 6, 2003||Emerson Harry E.||Computer keyboard providing an alert when typing in CAPS LOCK mode|
|US20050104867 *||Dec 17, 2004||May 19, 2005||University Of Delaware||Method and apparatus for integrating manual input|
|US20050225557 *||Jun 7, 2005||Oct 13, 2005||Satyaki Koneru||Method and apparatus for reading texture data from a cache|
|US20060007129 *||Dec 13, 2004||Jan 12, 2006||Research In Motion Limited||Scroll wheel with character input|
|US20060026535 *||Jan 18, 2005||Feb 2, 2006||Apple Computer Inc.||Mode-based graphical user interfaces for touch sensitive input devices|
|US20060026536 *||Jan 31, 2005||Feb 2, 2006||Apple Computer, Inc.||Gestures for touch sensitive input devices|
|US20060053387 *||Sep 16, 2005||Mar 9, 2006||Apple Computer, Inc.||Operation of a computer with touch screen interface|
|US20060085757 *||Sep 16, 2005||Apr 20, 2006||Apple Computer, Inc.||Activating virtual keys of a touch-screen virtual keyboard|
|US20060097991 *||May 6, 2004||May 11, 2006||Apple Computer, Inc.||Multipoint touchscreen|
|US20060125803 *||Feb 10, 2006||Jun 15, 2006||Wayne Westerman||System and method for packing multitouch gestures onto a hand|
|US20060197753 *||Mar 3, 2006||Sep 7, 2006||Hotelling Steven P||Multi-functional hand-held device|
|US20060238518 *||Jul 3, 2006||Oct 26, 2006||Fingerworks, Inc.||Touch surface|
|US20060238519 *||Jul 3, 2006||Oct 26, 2006||Fingerworks, Inc.||User interface gestures|
|US20060238521 *||Jul 3, 2006||Oct 26, 2006||Fingerworks, Inc.||Identifying contacts on a touch surface|
|US20060238522 *||Jul 3, 2006||Oct 26, 2006||Fingerworks, Inc.||Identifying contacts on a touch surface|
|US20070037657 *||Aug 15, 2005||Feb 15, 2007||Thomas Steven G||Multiple-speed automatic transmission|
|US20070070051 *||Nov 14, 2006||Mar 29, 2007||Fingerworks, Inc.||Multi-touch contact motion extraction|
|US20070070052 *||Nov 14, 2006||Mar 29, 2007||Fingerworks, Inc.||Multi-touch contact motion extraction|
|US20070078919 *||Nov 14, 2006||Apr 5, 2007||Fingerworks, Inc.||Multi-touch hand position offset computation|
|US20070081726 *||Nov 14, 2006||Apr 12, 2007||Fingerworks, Inc.||Multi-touch contact tracking algorithm|
|US20070139395 *||Feb 22, 2007||Jun 21, 2007||Fingerworks, Inc.||Ellipse Fitting for Multi-Touch Surfaces|
|US20070171210 *||Apr 4, 2007||Jul 26, 2007||Imran Chaudhri||Virtual input device placement on a touch screen user interface|
|US20070174788 *||Apr 4, 2007||Jul 26, 2007||Bas Ording||Operation of a computer with touch screen interface|
|US20070229464 *||Mar 30, 2006||Oct 4, 2007||Apple Computer, Inc.||Force Imaging Input Device and System|
|US20070236466 *||May 9, 2006||Oct 11, 2007||Apple Computer, Inc.||Force and Location Sensitive Display|
|US20070247429 *||Apr 25, 2006||Oct 25, 2007||Apple Computer, Inc.||Keystroke tactility arrangement on a smooth touch surface|
|US20070257890 *||May 2, 2006||Nov 8, 2007||Apple Computer, Inc.||Multipoint touch surface controller|
|US20070268273 *||Jul 30, 2007||Nov 22, 2007||Apple Inc.||Sensor arrangement for use with a touch sensor that identifies hand parts|
|US20070268274 *||Jul 30, 2007||Nov 22, 2007||Apple Inc.||Touch sensing with mobile sensors|
|US20070268275 *||Jul 30, 2007||Nov 22, 2007||Apple Inc.||Touch sensing with a compliant conductor|
|US20080036743 *||Jan 31, 2007||Feb 14, 2008||Apple Computer, Inc.||Gesturing with a multipoint sensing device|
|US20080041639 *||Jul 30, 2007||Feb 21, 2008||Apple Inc.||Contact tracking and identification module for touch sensing|
|US20080042986 *||Jul 30, 2007||Feb 21, 2008||Apple Inc.||Touch sensing architecture|
|US20080042987 *||Jul 30, 2007||Feb 21, 2008||Apple Inc.||Touch sensing through hand dissection|
|US20080042988 *||Jul 30, 2007||Feb 21, 2008||Apple Inc.||Writing using a touch sensor|
|US20080042989 *||Jul 30, 2007||Feb 21, 2008||Apple Inc.||Typing with a touch sensor|
|US20080062139 *||Jun 8, 2007||Mar 13, 2008||Apple Inc.||Touch screen liquid crystal display|
|US20080088602 *||Dec 28, 2007||Apr 17, 2008||Apple Inc.||Multi-functional hand-held device|
|US20080128182 *||Jul 30, 2007||Jun 5, 2008||Apple Inc.||Sensor arrangement for use with a touch sensor|
|US20080157867 *||Jan 3, 2007||Jul 3, 2008||Apple Inc.||Individual channel phase delay scheme|
|US20080211775 *||May 9, 2008||Sep 4, 2008||Apple Inc.||Gestures for touch sensitive input devices|
|US20080211783 *||May 9, 2008||Sep 4, 2008||Apple Inc.||Gestures for touch sensitive input devices|
|US20080211784 *||May 9, 2008||Sep 4, 2008||Apple Inc.||Gestures for touch sensitive input devices|
|US20080211785 *||May 9, 2008||Sep 4, 2008||Apple Inc.||Gestures for touch sensitive input devices|
|US20080231610 *||May 9, 2008||Sep 25, 2008||Apple Inc.||Gestures for touch sensitive input devices|
|US20090096758 *||Nov 7, 2008||Apr 16, 2009||Steve Hotelling||Multipoint touchscreen|
|US20090109173 *||Oct 28, 2007||Apr 30, 2009||Liang Fu||Multi-function computer pointing device|
|US20090244031 *||Apr 10, 2009||Oct 1, 2009||Wayne Westerman||Contact tracking and identification module for touch sensing|
|US20090244032 *||Jun 5, 2009||Oct 1, 2009||Wayne Westerman||Contact Tracking and Identification Module for Touch Sensing|
|US20090244033 *||Jun 5, 2009||Oct 1, 2009||Wayne Westerman||Contact tracking and identification module for touch sensing|
|US20090249236 *||Jun 5, 2009||Oct 1, 2009||Wayne Westerman||Contact tracking and identification module for touch sensing|
|US20090251439 *||Apr 10, 2009||Oct 8, 2009||Wayne Westerman||Contact tracking and identification module for touch sensing|
|US20090315850 *||Aug 27, 2009||Dec 24, 2009||Steven Porter Hotelling||Multipoint Touch Surface Controller|
|US20100026631 *||Oct 13, 2009||Feb 4, 2010||Research In Motion Limited||Scroll wheel with character input|
|US20100148995 *||Dec 12, 2008||Jun 17, 2010||John Greer Elias||Touch Sensitive Mechanical Keyboard|
|US20100149092 *||May 1, 2009||Jun 17, 2010||Wayne Westerman||Identifying contacts on a touch surface|
|US20100149099 *||Dec 12, 2008||Jun 17, 2010||John Greer Elias||Motion sensitive mechanical keyboard|
|US20100149134 *||Apr 10, 2009||Jun 17, 2010||Wayne Westerman||Writing using a touch sensor|
|US20100332378 *||Sep 3, 2010||Dec 30, 2010||Trading Technologies International, Inc.|
|US20110022976 *||Oct 1, 2010||Jan 27, 2011||Cadexterity, Inc.||dynamic user interface system|
|US20110153489 *||Mar 4, 2011||Jun 23, 2011||Trading Technologies International, Inc.||System and Method for Selectively Displaying Market Information Related to a Plurality of Tradeable Objects|
|US20110187677 *||Apr 11, 2011||Aug 4, 2011||Steve Porter Hotelling||Segmented vcom|
|US20110209085 *||Apr 29, 2011||Aug 25, 2011||Apple Inc.||Mode activated scrolling|
|US20110234498 *||Aug 3, 2010||Sep 29, 2011||Gray R O'neal||Interactive display with tactile feedback|
|US20120289336 *||May 2, 2012||Nov 15, 2012||Sony Computer Entertainment Inc.||Keyboard|
|US20130307805 *||Jul 25, 2013||Nov 21, 2013||Kabushiki Kaisha Toshiba||Electronic apparatus, input control program, and input control method|
|US20130335323 *||May 9, 2013||Dec 19, 2013||Pixart Imaging Inc.||Cursor control device and system|
|US20140362044 *||Jun 11, 2014||Dec 11, 2014||Samsung Electronics Co., Ltd.||Cover device having input unit and portable terminal having the cover device|
|US20150370378 *||Aug 27, 2015||Dec 24, 2015||Apple Inc.||Integrated touch screens|
|USRE40153||May 27, 2005||Mar 18, 2008||Apple Inc.||Multi-touch system and method for emulating modifier keys via fingertip chords|
|USRE40993||Jan 13, 2006||Nov 24, 2009||Apple Inc.||System and method for recognizing touch typing under limited tactile feedback conditions|
|EP2256605A2||Jan 25, 1999||Dec 1, 2010||Apple Inc.||Method and apparatus for integrating manual input|
|EP2256606A2||Jan 25, 1999||Dec 1, 2010||Apple Inc.||Method and apparatus for integrating manual input|
|EP2256607A2||Jan 25, 1999||Dec 1, 2010||Apple Inc.||Method and apparatus for integrating manual input|
|WO2004025449A2 *||Sep 8, 2003||Mar 25, 2004||Koninklijke Philips Electronics N.V.||Method for inputting character and position information|
|WO2004025449A3 *||Sep 8, 2003||Aug 12, 2004||Koninkl Philips Electronics Nv||Method for inputting character and position information|
|U.S. Classification||345/172, 400/485, 345/168|
|International Classification||G09G5/08, H03M11/14, G06F1/16, G06F3/038, G06F3/02, G05G9/047, G06F3/023|
|Cooperative Classification||G06F3/0213, G06F3/023, G06F2203/04801, G06F3/0489, G06F3/038, G05G9/047, H01H2221/012|
|European Classification||G06F3/0489, G05G9/047, G06F3/038, G06F3/023, G06F3/02A3P|
|Mar 25, 1991||AS||Assignment|
Owner name: HOME ROW, INC., A CORP. OF OR, OREGON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST.;ASSIGNORS:FRANZ, PATRICK J.;STRAAYER, DAVID H.;REEL/FRAME:005655/0689
Effective date: 19910211
|May 9, 1995||CC||Certificate of correction|
|Aug 12, 1996||FPAY||Fee payment|
Year of fee payment: 4
|Oct 11, 1996||AS||Assignment|
Owner name: INCONTROL SOLUTIONS, INC., OREGON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HOME ROW, INC.;REEL/FRAME:008167/0670
Effective date: 19960123
|Aug 22, 2000||FPAY||Fee payment|
Year of fee payment: 8
|Aug 17, 2004||FPAY||Fee payment|
Year of fee payment: 12
|Nov 4, 2005||AS||Assignment|
Owner name: PRIMAX ELECTRONICS, LTD., TAIWAN
Free format text: TECHNOLOGY TRANSFER AGREEMENT;ASSIGNOR:INCONTROL SOLUTIONS, INC.;REEL/FRAME:016722/0688
Effective date: 20030404
|Jul 12, 2006||AS||Assignment|
Owner name: TRANSPACIFIC PLASMA, LLC, TAIWAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:PRIMAX ELECTRONICS LTD.;REEL/FRAME:018047/0778
Effective date: 20060626
|Jan 23, 2007||AS||Assignment|
Owner name: PRIMAX ELECTRONICS LTD., TAIWAN
Free format text: LICENSE;ASSIGNORS:TRANSPACIFIC IP LTD.;TRANSPACIFIC PLASMA LLC;REEL/FRAME:018787/0358
Effective date: 20060404