Publication number: USRE42183 E1
Publication type: Grant
Application number: US 09/393,998
Publication date: Mar 1, 2011
Filing date: Sep 8, 1999
Priority date: Nov 22, 1994
Also published as: US5666138
Inventor: Craig Culver
Original Assignee: Immersion Corporation
Interface control
US RE42183 E1
Abstract
An improved interface control advantageously may be operated by one hand. A thumbpiece is slidably disposed within a longitudinal arm member which moves in an arcuate path. Placing his or her thumb in the thumbpiece, a user controls the horizontal positioning of a cursor by moving the arm member along the arcuate path. Vertical positioning of the cursor is controlled by sliding the thumbpiece along the length of the arm member. Trigger functions are implemented by exerting a downward force on the thumbpiece. Since the downward force used to implement the trigger function is orthogonal to motions used to control positioning of the cursor irrespective of the particular positions of the arm member and thumbpiece, the disclosed interface control prevents a user from inadvertently altering the positioning of the cursor during implementation of the trigger function. The arm member and sliding thumbpiece emulate the natural pivoting and curling/extending motions of the thumb, thereby resulting in a precise, easy to use, and ergonomically superior interface control.
Images (8)
Claims (38)
1. An apparatus for generating at least two control signals, said apparatus comprising:
a support;
an arm member disposed within said support, said arm member being moveable in an arcuate path within said support;
a contact member slidably mounted on said arm member, said contact member being slidable along said arm member;
a first sensor coupled to said arm member for sensing movement of said arm member along said arcuate path; and
a second sensor coupled to said contact member for sensing linear movement of said contact member along said arm member.
2. The apparatus of claim 1 further comprising:
first guide means disposed along said arm member; and
second guide means disposed on said contact member, said second guide means matingly coupled with said first guide means so as to facilitate the movement of said contact member along said arm member.
3. The apparatus of claim 2 wherein said first guide means comprise first and second grooves horizontally disposed in first and second side walls, respectively, of said arm member, said second guide means comprising first and second flanges disposed on first and second side walls of said contact member, respectively.
4. The apparatus of claim 1 wherein said arm member is pivotally coupled to said support at a pivot point, said arm member moving in said arcuate path as said arm member pivots about said pivot point.
5. The apparatus of claim 1 further comprising:
a track means disposed within said support, said track means defining said arcuate path; and
one or more guide elements disposed on an outer surface of said arm member, said one or more guide elements being matingly coupled to said track means for facilitating the movement of said arm member in said arcuate path.
6. The apparatus of claim 1 wherein said first sensor detects the magnitude and direction of arcuate movement of said arm member along said arcuate path and generates a first control signal in response thereto.
7. The apparatus of claim 1 wherein said second sensor detects the magnitude and direction of linear movement of said contact member and generates a second control signal in response thereto.
8. The apparatus of claim 1 further comprising a third sensor coupled to said arm member, said third sensor detecting a downward pressure on said contact member and in response thereto generating a signal for implementing a first predetermined function.
9. An apparatus for generating at least two control signals, said apparatus comprising:
a support;
an arm member having a first end portion pivotally coupled to said housing at a point, said arm member being rotatable about said point;
a contact member slidably mounted on said arm member, said contact member being slidable along said arm member;
a first sensor coupled to said arm member for sensing arcuate movement of said arm member; and
a second sensor coupled to said contact member for sensing linear movement of said contact member along said arm member.
10. The apparatus of claim 9 wherein said support is of a size such that said apparatus is capable of being operated with one hand of a person.
11. The apparatus of claim 9 wherein said first sensor generates a first control signal representing the angle and direction of rotation of said arm member, said first control signal being capable of being used to alter the position of a cursor or pointer in a first direction on a display screen.
12. The apparatus of claim 9 wherein said second sensor generates a second control signal representing the magnitude and direction of linear movement of said contact member, said second control signal being capable of being used to alter the position of a cursor or pointer in a second direction on a display screen.
13. The apparatus of claim 9 further comprising a third sensor disposed on said arm member, said third sensor detecting a downward pressure on said contact member.
14. The apparatus of claim 13 wherein said third sensor generates a third control signal indicative of said downward pressure on said contact member, said third control signal implementing a first predetermined function.
15. The apparatus of claim 9 further comprising a first actuator, wherein in response to a first feedback signal said first actuator restrains said rotation of said arm member.
16. The apparatus of claim 15 wherein said first actuator is contained within said first sensor.
17. The apparatus of claim 15 further comprising a second actuator, wherein in response to a second feedback signal said second actuator restrains the linear movement of said arm member.
18. The apparatus of claim 17 wherein said second actuator is contained within said second sensor.
19. The apparatus of claim 9 wherein said contact member has a concave upper surface so that a human thumb may matingly situate on said contact member.
20. The apparatus of claim 9 wherein said arm member has disposed therein one or more first guide means, said contact member having disposed thereon one or more second guide means, each of said first guide means being matingly coupled with an associated one of said second guide means for facilitating sliding of said contact member along said arm member.
21. The apparatus of claim 20 wherein said first guide means comprise first and second grooves horizontally disposed in first and second side walls, respectively, of said arm member, said second guide means comprising first and second flanges disposed on first and second side walls of said contact member, respectively.
22. The apparatus of claim 9 wherein said housing has a bottom surface, said apparatus further comprising:
one or more cavities formed in said bottom surface of said housing;
one or more switches, each of said switches being associated with and coupled to an associated one of said cavities.
23. The apparatus of claim 22 wherein said switches comprise pressure sensitive devices, the actuation of which generates one or more fourth control signals for implementing one or more second predetermined functions.
24. An interface control device, comprising:
a support housing configured to be held by one hand of a user;
a user manipulatable member engageable and moveable by a single thumb of said user;
at least one sensor coupled to said user manipulatable member and operative to sense movement in a first and second degree of freedom and to generate at least one sensor signal associated with said movement;
said user manipulatable member moveable in a third degree of freedom and configured to cause a trigger signal to be sent to an application on a computer, said third degree of freedom approximately orthogonal to said first and second degrees of freedom; and
at least one actuator coupled to said interface control device, said actuator operative to provide a feedback force to said user,
wherein said interface control device is configured to be operated by said one hand of a user, and wherein said user manipulatable member is coupled to an arm member having rotary motion about a pivot point to provide motion in one of said first or second degrees of freedom, wherein said actuator is coupled to said arm member to output forces about said pivot point.
25. The interface control device as recited in claim 24 wherein said rotary motion of said arm member is limited to an arcuate path of less than ninety degrees.
26. The interface control device as recited in claim 24 further comprising a second actuator, and wherein said first actuator is grounded to said housing and wherein said second actuator is carried by said arm member.
27. The interface control device as recited in claim 24 wherein said user manipulatable member is a sliding contact member which can be moved in a linear degree of freedom approximately perpendicular to an axis of rotation of said arm member and in substantially the same plane as said rotary motion, thereby providing said motion in one of said first or second degrees of freedom.
28. An interface control device, comprising:
a support housing configured to be held by one hand of a user;
a user manipulatable member engageable and moveable by a single thumb of said user;
at least one sensor coupled to said user manipulatable member and operative to sense movement in a first and second degree of freedom and to generate at least one sensor signal associated with said movement;
said user manipulatable member moveable in a third degree of freedom and configured to cause a trigger signal to be sent to an application on a computer, said third degree of freedom approximately orthogonal to said first and second degrees of freedom; and
at least one actuator coupled to said interface control device, said actuator operative to provide a feedback force to said user,
wherein said interface control device is configured to be operated by said one hand of a user, and wherein a centering spring bias on said user manipulatable member may be electrically actuated by a signal received from said computer, allowing said interface control device to have a centering mode and a non-centering mode, selected by said computer.
29. An interface control device in communication with a computer for providing positioning signals to said computer for manipulating an image in a computer environment displayed on a screen by said computer, said device comprising:
a handheld support housing configured to be held by one hand of a user;
a user manipulatable member coupled to said housing and engageable and moveable by a single thumb of said user in two degrees of freedom relative to said housing, and configured with a contact surface configured to be contacted by said thumb;
at least one sensor coupled to said user manipulatable member and operative to sense movement of said user manipulatable member in said two degrees of freedom, said sensor operative to provide positioning signals;
at least one actuator coupled to said interface control device, wherein said actuator is operative to provide a feedback force to said user, and wherein said at least one actuator includes a first brake providing a drag in a first of said two degrees of freedom, and a second computer controlled brake coupled to said user manipulatable member and providing a drag in a second one of said degrees of freedom of said user manipulatable member; and
a thumb trigger sensor operative to detect a trigger command from said user and to cause a trigger signal to be sent to said computer, said trigger command including a pressing motion by said thumb causing said user manipulatable member to move in a trigger degree of freedom different from said two degrees of freedom, wherein said user manipulatable member is configured to allow said user to control said movement in said two degrees of freedom and perform said trigger command simultaneously using said single thumb on said contact surface, and
wherein said interface control device is configured to be operated by said one hand of a user, and wherein said user manipulatable member is coupled to an arm member having rotary motion about a pivot point and is a sliding member which can be moved along at least a portion of said arm member in a linear degree of freedom, and wherein said second brake outputs forces in said linear degree of freedom, wherein said first brake is coupled to said arm member to output forces about said pivot point.
30. An interface control device in communication with a computer for providing positioning signals to said computer for positioning an image displayed on a screen, said device comprising:
a support housing configured to be held by one hand of the user;
a user manipulatable member coupled to said housing and engageable and moveable by a digit of said user in two degrees of freedom relative to said housing while said housing is held by said hand of said user, wherein at least one of said degrees of freedom is a rotary degree of freedom about an axis of rotation;
a spring return mechanism coupled to said user manipulatable member to provide a centering bias on said user manipulatable member toward a center position of said rotary degree of freedom when said user manipulatable member has been moved from said center position, wherein said spring return mechanism is electrically actuated by an external signal received from said computer, allowing said spring return mechanism to be selectively applied in a centering mode and allowing said spring return mechanism to have no effect in a non-centering mode;
at least one sensor coupled to said user manipulatable member and sensing movement of said user manipulatable member in said two degrees of freedom, said sensor providing positioning signals which control said positioning of said image on said screen;
at least one actuator coupled to said user manipulatable member; and
a trigger sensor for detecting a trigger command from said user, said trigger command including a pressing motion causing said user manipulatable member to move in a trigger degree of freedom different from said two degrees of freedom.
31. The interface control device as recited in claim 30 wherein said external signal is controlled by a video game running on said computer.
32. The interface control device as recited in claim 30 wherein said spring return mechanism is coupled to a pivotable arm member providing said rotary degree of freedom, and further comprising a centering spring coupled to said user manipulatable member to provide a centering bias in another of said two degrees of freedom.
33. An interface control feedback device in communication with a computer for providing positioning signals to said computer for manipulating an image in a computer environment displayed on a screen by said computer, said device comprising:
a support housing configured to be held by one hand of a user;
a sliding contact member engageable and moveable by a thumb of said user in two degrees of freedom relative to said support housing while said support housing is held by said hand of said user, one of said two degrees of freedom being a linear degree of freedom, wherein said movement in said two degrees of freedom positions said image in two screen dimensions on said display device;
an arm member coupled to said sliding contact member, said arm member operative to rotationally move about a pivot point to provide motion in one of said two degrees of freedom, wherein said linear degree of freedom is approximately perpendicular to an axis of rotation of said arm member and is in substantially the same plane as said rotary motion;
at least one sensor coupled to said user manipulatable member and operative to sense movement of said sliding contact member in said two degrees of freedom, said sensor operative to provide positioning signals which control said positioning of said image on said display device;
at least one actuator coupled to said arm member to output forces about said pivot point, wherein said forces facilitate the selection of options or icons displayed on said display device based on feedback signals generated by an application running on said computer; and
a trigger sensor for detecting a trigger command from said user, said trigger command including moving said sliding contact member approximately orthogonally to said two degrees of freedom,
wherein said interface control device is configured to be operated by said one hand of a user.
34. The interface control device as recited in claim 33 further comprising a second actuator to output forces on said sliding contact member in said linear degree of freedom, and wherein said first actuator is grounded to said housing and wherein said second actuator is carried by said arm member.
35. The interface control device as recited in claim 33 wherein said image is a cursor controlled to move in two dimensions of said screen, wherein said cursor can be used to select an icon, wherein said trigger command selects said icon when said cursor is positioned over said icon.
36. The interface control device as recited in claim 33 wherein said image is a video game character provided in a video game environment.
37. An interface control feedback device in communication with a computer for providing positioning signals to said computer for positioning an image displayed on a display device, said device comprising:
a support housing configured to be held by one hand of a user;
a user manipulatable member engageable and moveable by a thumb of said user in two degrees of freedom relative to said support housing while said support housing is held by said hand of said user, wherein said movement in said two degrees of freedom positions said image in two screen dimensions on said display device;
at least one sensor coupled to said user manipulatable member and operative to sense movement of said user manipulatable member in said two degrees of freedom, said sensor operative to provide positioning signals which control said positioning of said image on said display device;
at least one actuator coupled to said user manipulatable member, wherein said actuator provides a force in at least one of said degrees of freedom of said user manipulatable member, wherein said force facilitates the selection of options or icons displayed on said display device based on feedback signals generated by an application running on said computer, wherein a centering spring bias on said user manipulatable member is electrically actuated by a signal received from said computer in a centering mode, allowing said force feedback device to have said centering mode and a non-centering mode selected by said computer; and
a trigger sensor for detecting a trigger command from said user, said trigger command including moving said user manipulatable member approximately orthogonally to said two degrees of freedom,
wherein said interface control device is configured to be operated by said one hand of a user.
38. A device comprising:
a housing configured to be held in one hand such that no additional support is needed to operate the device;
a user manipulatable member coupled to said housing and configured to be manipulated by a single digit of a user in two degrees of freedom, wherein said user manipulatable member comprises an arm member operable to rotate in said rotary degree of freedom, and a sliding contact member operable to move in a linear degree of freedom approximately perpendicular to an axis of rotation of said arm member;
a sensor coupled to said user manipulatable member and operative to sense movement of said user manipulatable member in said two degrees of freedom, wherein one of said two degrees of motion comprises a rotary degree of freedom;
an actuator operative to provide a feedback force to said user;
a trigger operative to move in a degree of freedom different from said two degrees of freedom and configured to be actuated by said single digit simultaneously with said user manipulatable object,
wherein said device is configured to be operated by said one hand of a user.
Description
FIELD OF THE INVENTION

The present invention relates to an interface control. More particularly, the present invention relates to an interface control device which allows a user to control the operation of computer applications, machinery, and video games.

BACKGROUND OF THE INVENTION

Joystick controls have been employed in a wide variety of applications, including computer software, industrial machinery, and multimedia interfaces to control the positioning of an object displayed on a screen, such as a cursor or pointer. A typical prior art joystick includes a gimballed stick pivotally coupled to a flat base portion. Angle sensors coupled to the gimballed stick generate position control signals in response to a user pivoting the gimballed stick relative to the base portion. These control signals are used to manipulate the position of the cursor. A depressible switch coupled to the top of the stick is used to generate a trigger control signal for implementing various functions, such as selecting items from a pull down menu or causing a character in a video game to jump.

The structure of these gimballed joystick controllers makes them somewhat difficult to operate. Rotating the arm and wrist to control positioning functions (i.e., pivoting the stick) while pressing downward with the thumb or finger to manipulate trigger functions requires a fair amount of practice and coordination. Further, requiring a user to simultaneously combine these motions may lead to an inadvertent change in the positioning of a cursor while implementing a trigger function. For instance, in a point-and-shoot operation, where a user first positions the cursor onto a target on the display screen and then activates the trigger function, depressing the trigger switch with the thumb or finger often results in slight movements of the arm and/or wrist, thereby causing the cursor to slip off the target. This phenomenon is commonly referred to as cursor creep.

The conventional joystick controller described above has the further disadvantage of undesirably requiring the use of two hands, i.e., one hand to hold the base of the controller and the other hand to operate the controller's stick. The only manner in which these controllers may be operated with one hand is to place the controller on a table or other flat surface.

Other joystick controllers have been developed in response to the above-mentioned problems. One such controller includes a pivoting, handgrip-shaped stick having one or more squeezable trigger switches built into a side portion of the handgrip. The positioning of an image is controlled by pivoting the handgrip, while the trigger functions are controlled by squeezing the trigger switches with the fingers. Although in such a design the positioning controls are somewhat isolated from the trigger function controls (i.e., squeezing the trigger switch with the index finger is not likely to cause as much of an inadvertent change in position as would depressing a trigger switch on the top of the stick with the thumb), cursor creep is nevertheless a problem. Further, such a controller requires the use of two hands or, alternatively, a tabletop support.

Some have attempted to develop a one-handed controller by simply reducing the size of conventional joystick controllers. These controllers fit within a user's hand, where the thumb, resting atop the stick, controls the positioning function. The trigger function is controlled by squeezing a trigger switch located on the side of the controller's stick.

These miniaturized versions of conventional joystick controllers are for the most part clumsy and ineffective. Merely reducing the size of a controller designed for two-handed operation so as to be operated by one hand severely limits the precision with which a user may control a cursor. Further, these miniaturized controllers are ineffective in isolating trigger controls from positioning controls. Indeed, squeezing a trigger switch with, for example, the index finger typically causes the controller stick to move forward, thereby resulting in undesirable vertical cursor creep.

This undesirable interaction between positioning and trigger controls of miniature joystick controllers, coupled with users' complaints of inferior ergonomics, has led others to revert to the more primitive two-handed video game controller shown in FIG. 1. Controller 1 has four keys 2a-2d clustered together in a first portion of controller 1 and three keys 4a-4c grouped together in a second portion of controller 1. Keys 2a-2d control the positioning of a displayed object (such as the hero of the video game) by generating digital positioning signals in response to a user depressing one or more of keys 2a-2d. Keys 4a-4c control various trigger functions (i.e., start-stop, jump, shoot, for example). The controller shown in FIG. 1, although virtually eliminating inadvertent interaction between positioning and trigger controls, nonetheless requires the use of two hands.

Thus, there is a need for a joystick controller which may be operated in one hand. There is also a need for a controller having improved precision and ergonomics. Such a device should also isolate positioning and trigger controls, thereby eliminating cursor creep and other inadvertent position control signals produced during activation of trigger functions.

SUMMARY OF THE INVENTION

In accordance with the present invention, an interface control is disclosed which offers users superior performance and ergonomics. In the preferred embodiment, a thumbpiece is slidably disposed within a longitudinal arm member having a first end pivotally coupled to a fixed point. The arm member may pivot about the fixed point such that a second end portion of the arm member follows an arcuate path having as its center the fixed point. The thumbpiece slides back and forth along the longitudinal axis of the arm member. A first sensor coupled to the arm member in a region proximate to the fixed point senses the angular position of the arm member. A second sensor coupled to the thumbpiece senses the linear movement of the thumbpiece relative to and longitudinally along the arm member. A third sensor coupled to the thumbpiece senses a downward force exerted upon the thumbpiece.

The interface control may, in one embodiment, comfortably rest in the palm of a user's hand. Positioning the fingers along the underside of the interface control, a user places the thumb in the thumbpiece. The user controls the horizontal positioning of, for example, a cursor by causing the arm member to pivot either to the right or to the left about the fixed point. This motion is detected by the first sensor, which in response thereto causes the cursor to move either right or left, respectively, on a display screen. The vertical positioning of the cursor is controlled by sliding the thumbpiece along the length of the arm member. The second sensor detects this linear movement of the thumbpiece and, in response thereto, causes the cursor to move up and down on the display screen. A user may implement trigger functions by exerting a downward force on the thumbpiece. This pressure is detected by the third sensor which, in turn, causes some predetermined function to be implemented on the display screen.
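The three-sensor control mapping described above can be illustrated with a short sketch. The patent does not specify any software interface, so all function names, gain values, and the trigger threshold below are purely hypothetical, chosen only to show how arm rotation, thumbpiece slide, and downward pressure might translate into independent cursor and trigger outputs:

```python
# Illustrative sketch (not from the patent): arm pivot angle -> horizontal
# cursor motion, thumbpiece slide -> vertical motion, downward pressure ->
# trigger. All names, gains, and thresholds are assumptions.

TRIGGER_THRESHOLD = 0.5  # normalized pressure above which the trigger fires

def map_controls(arm_angle, slide_pos, pressure):
    """Translate the three sensor readings into cursor deltas and a trigger flag.

    arm_angle: signed angle of the arm member about the pivot (degrees)
    slide_pos: signed position of the thumbpiece along the arm (mm from center)
    pressure:  normalized downward force on the thumbpiece (0.0 to 1.0)
    """
    dx = arm_angle * 2.0   # horizontal gain (counts per degree, illustrative)
    dy = slide_pos * 1.5   # vertical gain (counts per mm, illustrative)
    trigger = pressure > TRIGGER_THRESHOLD
    # The pressure reading never feeds into dx or dy, mirroring the
    # orthogonal isolation of the trigger from the positioning controls.
    return dx, dy, trigger
```

Note that because the trigger input is computed from a separate, orthogonal sensor reading, pressing down cannot perturb the cursor position, which is the isolation property the patent emphasizes.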

In another embodiment, the longitudinal arm member is disposed within a track defining an arcuate path rather than being coupled to a fixed pivot point. In this embodiment, positioning and trigger functions are controlled in the same manner as described in the preferred embodiment. By moving the arm member along the arcuate path as defined by the track, the thumbpiece follows an arcuate path having as its center a virtual pivot point. A sensor coupled to the arm member senses the arcuate movement of the arm member relative to the interface control and in response thereto generates a horizontal positioning control signal. The vertical positioning of the cursor and trigger functions are implemented as described above in connection with the preferred embodiment.
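In this track-guided embodiment, the arm member's travel along the arc can be related to an equivalent rotation about the virtual pivot point by the standard arc-length relation (angle = arc length / radius). The helper below is a hypothetical illustration of that geometry, not part of the disclosed device:

```python
import math

def arc_to_angle_degrees(arc_length, radius):
    """For a track-guided arm: recover the equivalent angle of rotation
    about the virtual pivot from the distance traveled along the arcuate
    track, using angle (radians) = arc_length / radius.

    arc_length and radius must be in the same units; radius > 0.
    """
    return math.degrees(arc_length / radius)
```

For example, traveling a quarter of a full circle's circumference along a track of radius r corresponds to a 90-degree rotation about the virtual pivot.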

Embodiments of the present invention isolate the trigger function from the positioning controls. The downward force used to implement a trigger function will always be orthogonal to those motions of the thumb which are used to control the positioning of the cursor, regardless of the positions of the arm member or thumbpiece. This orthogonal relationship prevents a user from inadvertently altering the positioning of the cursor when trying to implement a trigger function.

Embodiments in accordance with the present invention also achieve superior ergonomics. The arm member is preferably of a length approximately equal to that of an adult thumb where different length arm members can be provided for different size hands. Together the arm member, which rotates to form an arcuate path, and the thumbpiece, which slides along the length of the arm member, emulate the natural pivoting and curling/extending motions of the thumb, respectively. The result is a comfortable, precise, and easy to use interface control.

In another embodiment, additional trigger switches are provided within cavities formed in the underside of the interface control. The fingertips of the user's hand, each comfortably nestled within an associated cavity, control the operation of the additional trigger switches, which may be used to implement numerous other functions.

Embodiments of the present invention are usable as an interface between a user and a machine where the machine carries out some predetermined function in response to commands issued by the user. In one embodiment, for instance, the user may control the mechanical operation of construction equipment. In another embodiment, the user may control moveable elements on a display screen, such as a cursor in a software application or an object in a video game.

This invention will be more fully understood in view of the following description taken together with the drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a prior art four-key cluster joystick controller;

FIG. 2 is a plan view of one embodiment in accordance with the present invention;

FIGS. 3A and 3B are side and end views of the embodiment of FIG. 2, respectively;

FIG. 4 is a perspective view of a portion of the embodiment of FIG. 2;

FIG. 5 shows one embodiment in accordance with the present invention resting in the palm of a user's hand;

FIG. 6 shows another embodiment in accordance with the present invention;

FIG. 7 is an end view of another embodiment in accordance with the present invention;

FIG. 8 is a plan view of yet another embodiment in accordance with the present invention;

FIG. 9 shows the embodiment of FIG. 2 used in conjunction with the prior art controller of FIG. 1; and

FIGS. 10, 11 and 12 are plan views of three other embodiments in accordance with the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

In accordance with the present invention, an interface control is disclosed which allows a user to simultaneously control both a trigger function and the positioning of a cursor through a single contact surface using only the thumb. In the preferred embodiment shown in FIGS. 2, 3A, 3B, 4 and 5, a hand-held interface control 10 is provided in support 12 which includes base plate 14 and side wall 16 having first rounded end portion 16a, second rounded end portion 16b, and side portions 16c, 16d connecting rounded end portions 16a, 16b. Top plate 18 (FIGS. 3A and 3B) is provided above side wall 16 and encloses a portion of support 12. Longitudinal arm member 20 (FIGS. 2, 3A, 3B, 4, 5) is disposed in and pivotally connected to support 12 (FIG. 2) at point 22. Arm member 20 may be connected to support 12 by any suitable means, such as bolt 23 (FIGS. 2, 3A, 3B, 4, 5), or an appropriate bearing surface which allows arm member 20 to pivot about point 22 as indicated by arrows 25a, 25b (FIG. 2). As will be explained below, the resistance provided between (1) arm member 20 and base plate 14 and (2) thumbpiece 28 and arm member 20 may be adjusted to a level suitable to the needs of a particular user or application.

First sensor 26 (FIGS. 2, 3A, 3B) is coupled to arm member 20 and senses the angular rotation of arm member 20 about point 22. First sensor 26 is preferably a rotary resistive potentiometer, although other appropriate sensing structures, such as capacitive sensors, can also be used. First sensor 26 generates first positioning signals indicative of the direction and magnitude of arm member 20's rotation about point 22.

Referring to FIGS. 3A and 4, arm member 20 has cavity 30 formed at one end thereof. Cavity 30 is bounded on either side by side walls 20a, 20b which have formed therein grooves 34a, 34b (groove 34b, which is formed within side wall 20b, is not shown in FIG. 4).

Thumbpiece 28 (FIGS. 2, 3A, and 4) is slidably mounted within cavity 30 of arm member 20 and may slide along the length of arm member 20, as indicated by arrows 32a, 32b (FIG. 2). Outwardly protruding flanges 36a, 36b (FIG. 4) provided on the sides of thumbpiece 28 matingly couple with grooves 34a, 34b formed in side walls 20a, 20b, respectively, of arm member 20 to facilitate the sliding of thumbpiece 28 along the length of arm member 20. Curved contact surface 38, which is provided on a top surface of thumbpiece 28, engages the thumb of a user operating hand-held interface control 10 (FIGS. 2, 3A, and 3B).

Shaft 40 (FIGS. 2, 3A) has a first end fixably mounted to a bottom surface of thumbpiece 28 and a second end slidably disposed within second sensor 42. Second sensor 42, which is disposed within arm member 20, detects the linear movement of shaft 40 (and thus the linear movement of thumbpiece 28) relative to sensor 42 and generates second positioning signals indicative of the position of thumbpiece 28 relative to arm member 20.

Second sensor 42 is preferably a linear resistive potentiometer. If a rotary potentiometer is used as second sensor 42, thumbpiece 28 may be coupled to the potentiometer by a conventional rack and pinion gear. It is to be understood, however, that any other sensing device capable of detecting either motion or pressure may be used as first sensor 26 and second sensor 42 in interface control 10. Although first sensor 26 and second sensor 42 preferably produce analog positioning signals, interface control 10 may also employ sensors which produce digital positioning signals.

A third sensor 44 (FIG. 3A) is secured to a bottom surface of arm member 20 so that third sensor 44 moves with arm member 20 about point 22 (FIG. 2). A bottom surface of third sensor 44 is in contact with and slides along (when arm member 20 pivots about point 22) a top surface of base plate 14 (FIG. 3A). When a user exerts a downward pressure upon thumbpiece 28, third sensor 44 engages base plate 14 and in response thereto generates a trigger control signal. Third sensor 44 may be any conventional pressure sensitive device which converts a pressure into an electric signal. In one embodiment, third sensor 44 is a microswitch of well known design.

A user cradles hand-held interface control 10 in the palm of his or her hand, positioning the fingers along the underside of base plate 14 and resting the thumb on curved contact surface 38 of thumbpiece 28, as shown in FIG. 5. Using only the thumb, the user may control the horizontal and vertical positioning of, for instance, a cursor on a display screen, as well as implement various related functions (such as selecting options from a pull-down menu).

The horizontal positioning of the cursor is manipulated by pivoting arm member 20 about point 22, whereby arm member 20 traces an arcuate path as indicated by arrows 25a, 25b (FIG. 2). First sensor 26, in response to the arcuate motion of arm member 20, generates a first positioning signal, indicative of the angular position of arm member 20, which controls the horizontal positioning of a cursor on the display screen. For instance, to move the cursor to the right on the display screen, the thumb (resting in thumbpiece 28) is used to move arm member 20 in an arcuate path to the right, as indicated by arrow 25a.

Preferably, arm member 20 is of a length (measured between portions 16a and 16b of side wall 16) approximately equal to that of the user's thumb so that when a user places the thumb on contact surface 38 of thumbpiece 28, the user's large thumb joint is immediately above point 22. Arm member 20 may be made in varying lengths so as to accommodate different size hands. As discussed above, arm member 20, pivoting about point 22, travels in an arcuate path between sidewalls 16a, 16b of interface control 10. This arcuate path emulates the natural motion of the thumb as it pivots about the large thumb joint, thereby resulting in a natural and ergonomic relationship between the thumb and arm member 20. Hence, interface control 10 uses the natural arcuate motion of a user's thumb to effect linear changes in the horizontal positioning of a cursor on the display screen.

The vertical positioning of the cursor is controlled by sliding thumbpiece 28 along the length of arm member 20 as indicated by arrows 32a, 32b (FIG. 2). Second sensor 42 detects the linear movement of thumbpiece 28 and, in response thereto, generates second positioning signals which control the vertical positioning of the cursor. For instance, to move the cursor up or down on a display screen, the user simply uses his or her thumb to move thumbpiece 28 away from or towards point 22. This movement is easily achieved by extending or curling the thumb. Note that by positioning the fingers on the bottom side of base plate 14, the combined movement of the fingers assists the thumb in the curling and extending motions.

Trigger functions are implemented by exerting a downward pressure (using the thumb) upon thumbpiece 28. Third sensor 44 senses this downward pressure and, in response thereto, generates a trigger control signal. Depending upon the particular application with which interface control 10 is being used, this trigger control signal may implement a variety of functions. For instance, in computer software or interactive systems, this trigger control may select items from a menu. When used with a video game, for example, this trigger control may cause a character to jump.

The downward pressure exerted by the thumb to implement a trigger control signal is always orthogonal to the thumb motions used to control horizontal and vertical positioning, regardless of the angular position of arm member 20 or the linear position of thumbpiece 28. This orthogonal relationship eliminates undesirable interaction between the trigger control and the positioning controls and thus prevents cursor creep. A user, when exerting a downward pressure on thumbpiece 28, is not likely to inadvertently pivot or extend/curl the thumb (which would cause a change in the positioning of the cursor). Thus, by isolating the trigger and positioning controls, as described above, interface control 10 provides advantages over conventional joystick controllers which suffer from cursor creep problems.
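Purely by way of illustration (no such software is part of the disclosed apparatus), the mapping of the three sensor signals described above onto cursor commands might be sketched as follows. All function names, signal ranges, and scaling constants are assumptions chosen for the sketch:

```python
# Hypothetical host-side mapping of the three sensor signals (arm-member
# angle, thumbpiece slide, trigger press) to a cursor position.  Ranges
# and screen dimensions are illustrative assumptions.

def update_cursor(angle, slide, trigger_pressed, screen_w=640, screen_h=480,
                  angle_range=(-30.0, 30.0), slide_range=(0.0, 40.0)):
    """Map arm-member angle (degrees) and thumbpiece slide (mm) to an
    absolute cursor position, and report the trigger state."""
    lo, hi = angle_range
    x = int((angle - lo) / (hi - lo) * (screen_w - 1))   # arcuate motion -> horizontal
    lo, hi = slide_range
    y = int((hi - slide) / (hi - lo) * (screen_h - 1))   # slide away from pivot -> up
    # Because the trigger press is orthogonal to both positioning motions,
    # reading it here perturbs neither x nor y (no "cursor creep").
    return x, y, trigger_pressed
```

With the arm centered and the thumbpiece at mid-travel, the cursor sits near the middle of the screen regardless of whether the trigger is depressed, reflecting the isolation of the trigger from the positioning controls.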

FIG. 6 shows another embodiment of the present invention. Interface control 50, the operation of which is identical to that of interface control 10, includes all the components of interface control 10 plus additional features described below. Interface control 50 includes two groups of cavities, 52a, 52b, 52c and 54a, 54b, 54c, formed in the bottom surface of base plate 14. When a user cradles interface control 50 in the left hand, the tips of the three fingers closest to the thumb may comfortably rest within associated cavities 52a-52c. Each of cavities 52a, 52b, and 52c has an associated switch 56a, 56b, 56c, respectively, disposed therein so that each of the user's fingers may activate one of associated switches 56a, 56b, 56c while holding interface control 50. Switches 56a, 56b, 56c, when depressed by the user's finger tips, generate second, third, and fourth trigger control signals, respectively, for implementing additional predetermined functions.

In a similar manner, a right-handed user may place one of the three fingers closest to the thumb (of the right hand) in each of cavities 54a, 54b, and 54c to operate associated switches 58a, 58b, 58c, which, like switches 56a, 56b, 56c, generate second, third, and fourth trigger control signals, respectively, for implementing various predetermined functions.

If additional control signals are desired for implementing even more predetermined functions, control circuitry may be added so that the simultaneous depression of two or more of switches 56a, 56b, 56c (or 58a, 58b, 58c for right-handed users) generates these additional control signals. For instance, simultaneously depressing switches 56a and 56b (for left-hand operation) or 58a and 58b (for right-hand operation) may generate a fifth trigger control signal.
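The chord-decoding behavior of such control circuitry could, for illustration only, be modeled in software as a lookup from the set of currently depressed switches to a control signal. The table entries below merely restate the combinations suggested above; the names are hypothetical:

```python
# Illustrative decoding of the finger switches: each single press yields
# its own trigger signal, while a defined simultaneous depression yields
# an additional signal.  Signal names are assumptions.

CHORDS = {
    frozenset(["56a"]): "trigger2",
    frozenset(["56b"]): "trigger3",
    frozenset(["56c"]): "trigger4",
    frozenset(["56a", "56b"]): "trigger5",   # simultaneous depression
}

def decode_switches(pressed):
    """Return the control signal for the set of currently pressed
    switches, or None if the combination is undefined."""
    return CHORDS.get(frozenset(pressed))
```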

FIG. 7 is an end view of another embodiment in accordance with the present invention. Interface control 60, which operates in a manner identical to interface control 10 (FIGS. 2, 3A, 3B, 4, and 5), includes a plurality of first detents 61 formed on the top surface of base plate 14. One second detent 62 is provided on the bottom surface of arm member 20 such that as arm member 20 pivots about point 22 (not shown), second detent 62 moves between pairs of the plurality of first detents 61. Arm member 20 may preferably come to rest only at those positions where second detent 62 is positioned between a pair of first detents 61, thereby providing arm member 20 with a discrete number of click stops. These click stops may add increased stability and control to the positioning controls of interface control 60.

The embodiments in accordance with the present invention are advantageous over conventional joystick controllers. Indeed, interface controls 10 and 50 are suitable for one-handed operation, thereby leaving the user's other hand free to perform other tasks. This one-handed operation also eliminates the need for a flat surface, as required when using a mouse or operating conventional joystick interface controls with one hand.

Using the thumb to control positioning functions contributes to the superior ergonomics of interface controls 10 and 50. As mentioned above, conventional joystick controllers require various combinations of wrist and arm movements to control the positioning of a cursor and additional finger or thumb motions to control trigger functions. Such unnatural combinations of motions necessitate training and practice in order for a user to become proficient. This is especially true as the size of such a control is reduced. Unlike conventional controllers, embodiments in accordance with the present invention (1) isolate the horizontal and vertical positioning functions from each other and (2) isolate the positioning functions from the trigger functions while simultaneously allowing both functions to be controlled by a single user contact point. The result is an interface control which allows users to issue vertical and horizontal positioning commands in concert for smooth and precise motions along both axes (i.e., a diagonal motion), as well as to implement trigger functions without inadvertently altering those positioning commands. Further, the interface controls described herein allow users to control both positioning and trigger control functions with simple, intuitive thumb motions.

When a user traces his or her thumb across the tips of the fingers, every joint in his or her hand moves in concert to facilitate the thumb's motion. This opposed position of a user's thumb relative to his or her fingers and palm is utilized by interface controls 10 and 50 to achieve a comfortable and natural interface between the user and a machine (such as a computer). Indeed, by positioning a user's fingers along the bottom surface of base plate 14 and his or her thumb in thumbpiece 28, interface controls 10 and 50 operate in a manner consistent with the thumb's natural motions. By taking advantage of the thumb's full arcuate motion about the large thumb joint and the thumb's excellent linear motion, interface controls in accordance with the present invention allow a user to quickly and easily position a cursor or pointer on a display screen. The superior ergonomics of the above interface controls afford users a high degree of precision and efficiency without the extensive practice and training required of conventional joystick controllers. This accuracy and ease of use makes the present interface controls especially well suited for CAD or any other computer illustration systems.

The frictional coupling between elements of the preferred embodiments may be manipulated to adjust the “feel” of interface controls 10 and 50. For instance, contact surface 38 (FIGS. 4 and 8) of thumbpiece 28 may be shaped with respect to the top surfaces of sidewalls 20a, 20b so that the sides of a user's thumb are in frictional contact with sidewalls 20a, 20b. This frictional contact may increase the stability with which thumbpiece 28 slides along arm member 20, thereby increasing the accuracy of vertical positioning of a cursor.

In a similar manner, the frictional coupling between third sensor 44 (FIG. 3A) and the top surface of base plate 14 of support 12 may be adjusted to increase the accuracy of the horizontal positioning control of interface controls 10 and 50. For example, a strip of Teflon material (not shown) may be provided between third sensor 44 and base plate 14 to achieve a desirable “silky” feel when a user pivots arm member 20 about point 22 (FIG. 2). The Teflon causes drag to progressively increase as thumbpiece 28 is depressed, without any significant increase in static friction. This resultant increase in drag contributes to an increased stability in performing drag-select operations in which the trigger switch is depressed while the cursor is moved from a first position to a second position (as in highlighting text in word processing programs).

FIG. 8 shows another embodiment in accordance with the present invention. Interface control 65 includes arm member 29 slidably disposed on conventional curved guide tracks 64a, 64b within support 67. Tracks 64a, 64b define an arcuate path having as its center virtual pivot point 66. Arm member 29 slides along this arcuate path as indicated by arrows 25a, 25b as if arm member 29 were pivoting about virtual pivot point 66. Disposing arm member 29 within tracks 64a, 64b in this manner eliminates the need for arm member 29 to be coupled to a fixed pivot point, as is arm member 20 of interface control 10 (FIGS. 2, 3A, 3B, 4, and 5), and therefore allows for interface control 65 to be of a significantly smaller size. Thumbpiece 28 is slidably disposed within arm member 29 and slides along the length of arm member 29 as indicated by arrows 32a, 32b.

The positioning and trigger functions of interface control 65 are controlled in a manner identical to those of interface control 10 as described above (see FIGS. 2, 3A, 3B, 4, and 5) and will thus not be described here. Interface control 65 possesses all of the advantages discussed above with respect to interface control 10, including allowing users to control the operation of applications with simple and intuitive motions that closely emulate the natural motions of the human thumb, and isolating the positioning controls (1) from each other and (2) from the trigger controls as described above. Likewise, interface control 65 may also be provided with the friction coupling and feedback features described above.

The embodiments described above may be used in virtually any application which requires an interface control between a user and a machine. Embodiments in accordance with the present invention may be used to control the operation of a construction crane or boom. Interface controls 10 and 50 are ideal for replacing the mouse or trackball in computer software applications such as word processing, databases, and spreadsheets. For instance, interface control 50 of FIG. 6 (see also FIGS. 2, 3A, 3B, 4, and 5) is well suited for use with video games. As described above, thumbpiece 28 may be used to control the positioning of a character in the video game. By depressing thumbpiece 28 so as to activate third sensor 44, the user may implement various predetermined functions, such as starting/stopping the game and selecting game options. Switch 56a (58a for right-handed users) may, for instance, cause the character to jump. Switch 56b (58b) may cause the character to fire a bullet, and so on.

Embodiments of the present invention may also be incorporated into conventional two-handed video game controllers (see FIG. 1) to provide a superior video game interface control. For example, positioning control keys 2a, 2b, 2c, and 2d (FIG. 1) may be replaced by interface control 10, as illustrated in FIG. 9. Two-handed video game controller 90 has disposed within a first portion thereof a portion of interface control 10 of FIGS. 2, 3A, 3B, and 4. For purposes of clarity, not all of the components of interface control 10 are labelled. Arm member 20 and thumbpiece 28 control the positioning of objects (i.e., characters of a video game) displayed on a screen in the same manner as described previously with reference to FIGS. 2, 3A, 3B, and 4, while keys 4a, 4b, and 4c implement various trigger functions.

The above described interface controls may be mounted in virtually any enclosure, including (but not limited to) control panels, automobile dashboards, steering wheels, or handgrips of other interface controls. For instance, in one such embodiment, base plate 14 (FIG. 2) may be disposed within the handgrip portion of a floor-mounted lever arm control, e.g., a transmission selector in a vehicle, to provide users with a superior means to control such things as the vehicle's navigation system or communications with the vehicle's on-board computer system.

In another embodiment, interface controls in accordance with the present invention may be disposed within a control panel such as the dashboard of an automobile, boat, or even an airplane to provide control over certain operations. For example, interface control 10 may be mounted in the control panel of construction equipment to control the operation of a boom or crane. A control panel-mounted interface control 10 could also be used to manually control, for instance, the processing operations of an industrial application or the positioning and firing of lasers in medical applications. These embodiments, like those discussed above, are advantageous since multiple control functions (1) are disposed on a single contact surface and (2) are isolated from one another.

Some of the ergonomic advantages discussed herein may be compromised in order to provide a user interface control capable of controlling positioning in three, rather than two, directions. In one such embodiment in accordance with the present invention, various elements of interface control 10 may be incorporated into the handgrip of a conventional full-size joystick to provide three-dimensional positioning control as well as trigger functions.

FIG. 10 shows interface control 70 including gimballed stick 72 having formed at one end an inclined, elongated upper portion 74. Formed within top surface 76 of upper portion 74 is secondary interface control 80 which includes all the features of and operates in a similar manner to interface control 10 (FIGS. 2, 3A, 3B, and 4). Secondary interface control 80 preferably has thumbpiece 28 fixably mounted within arm member 20 so that thumbpiece 28 may not slide along arm member 20, thereby eliminating the need for second sensor 42 as well as grooves 34a, 34b and flanges 36a, 36b (FIG. 4). The other end of stick 72 (FIG. 10) is pivotally mounted to a base portion (not shown) having sensors which generate first and second positioning signals in response to stick 72 pivoting with respect to the base portion, as discussed above in reference to conventional joystick controllers.

A user curls the four fingers of his or her hand around stick 72 and places the thumb in thumbpiece 28 (FIG. 10). The user controls the horizontal and vertical positioning of, for instance, a cursor displayed on a CRT in a conventional manner as described above, i.e., by pivoting stick 72 about the base portion. The user controls the depth positioning of the cursor with the thumb by pivoting arm member 20 about pivot point 22 (see FIG. 2). Trigger functions are activated by pressing downward on thumbpiece 28 (as discussed in reference to interface control 10).

Various forms of feedback may be added to the above described embodiments to provide a user with additional information about the particular application he or she is controlling, as described below in reference to FIGS. 11 and 12. For instance, arm member 20 of interface control 10 (FIGS. 2, 3A, 3B, and 4) may be fitted with a first actuator that in response to a first feedback signal prevents arm member 20 from further pivoting in one or both directions or, in the alternative, alters the frictional contact between arm member 20 and base plate 14 so as to alter the ease with which arm member 20 pivots.

Referring to FIG. 11, interface control 80, the operation of which is identical to that of interface control 10, includes all of the components of interface control 10 plus additional features described below. For purposes of clarity, not all of the components of interface control 80 common with those of interface control 10 are shown. Arm member 20 of interface control 80 has coupled thereto electromagnetic coil 82 which, in turn, is wound around a conventional ferrous core (not shown). Shaft 84 extends along arm member 20 and has a first end matingly coupled to surface 86 of sidewall 16b. A second end of shaft 84 extends through coil 82 and is coupled to iron armature 88. Armature 88 is preferably positioned as close to coil 82 as possible. When a first feedback current is provided to coil 82, the resultant magnetic field produced by coil 82 attracts armature 88 towards coil 82, thereby causing the first end of shaft 84 to shift towards and press against surface 86 of sidewall 16b. The resultant increase in frictional coupling between arm member 20 and sidewall 16b resists any pivoting movement of arm member 20 about point 22. In other words, coil 82, shaft 84, and armature 88 act as a magnetically activated brake. Varying levels of feedback current will result in proportionally varying levels of drag. This brake may be implemented to simulate detents, stops, or other forms of reflective feedback.
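The proportional relationship between feedback current and braking drag described above can be illustrated, for explanatory purposes only, by the following sketch; the current and drag limits are assumed values, not part of the disclosure:

```python
# Illustrative model of the magnetically activated brake: the feedback
# current supplied to coil 82 produces a proportional drag on arm member
# 20, clamped at the coil's maximum rating.  Constants are assumptions.

def brake_drag(feedback_current, max_current=0.5, max_drag=1.0):
    """Normalized drag (0..1) as a linear function of coil current (A)."""
    i = min(max(feedback_current, 0.0), max_current)  # clamp to valid range
    return max_drag * i / max_current
```

Varying the feedback current between zero and the maximum thus sweeps the drag proportionally, which is what allows the brake to simulate detents and stops of differing "stiffness."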

In a similar manner, a second actuator may be provided that in response to a second feedback signal inhibits the movement of thumbpiece 28 along arm member 20, as illustrated in FIG. 12. Arm member 20 has provided therein a sliding bar 90 having a first end coupled to thumbpiece 28. Iron core 92 is coupled to arm member 20 and is positioned in a region proximate to a second end of bar 90. Magnetic coil 94 is wound around ferrous core 92. A second feedback current provided to coil 94 will induce a magnetic attraction between bar 90 and core 92, thereby resulting in an increased frictional coupling between bar 90 and core 92. This increased frictional coupling resists the sliding motion of thumbpiece 28 along arm member 20. Note that in some embodiments shaft 40 (FIG. 3A) (not shown in FIG. 12 for simplicity) may also serve the same function as bar 90 in addition to being part of the longitudinal sensing structure of arm member 20.

The embodiments described above and illustrated in FIGS. 11 and 12 would, for instance, be especially well suited for use with applications in which it is desirable to preclude a user from selecting certain options or moving a cursor into certain areas. In a video game application, for instance, the game's character may be precluded from entering a restricted area of the displayed image. The video game may issue feedback signals as discussed above to preclude the user from causing the character to move into the restricted areas. Thus, the feedback signals, by restraining or even preventing (1) arm member 20 from pivoting about point 22 and/or (2) thumbpiece 28 from sliding along arm member 20, directly inform the user that he or she can no longer move in that direction. In a similar manner, an additional actuator may be contained within third sensor 44 (see FIG. 3A) to preclude activation of trigger functions at certain predetermined character positions. Unlike conventional interface control feedback systems which use flashing lights or sounds to warn users of an improper selection or movement, the direct force-reflecting feedback described above, by preventing the user from effecting certain positioning commands, provides a realistic feel to video games and other applications.

In other applications, interface control 80 (FIG. 11) may be used to facilitate the selection of options or icons. Feedback signals generated by the application may simulate a detent, by increasing the frictional coupling between arm member 20 and sidewall 16b and between thumbpiece 28 and arm member 20 as described above with reference to FIGS. 11 and 12, when the cursor or pointer is positioned near or overlaps certain icons displayed on the screen. This simulated detent varies the amount of force the user must exert to effect further positioning changes in certain directions, i.e., the detent may make it either easier or harder for the user to cause the cursor to pass across the icon. In this manner, the user can “feel” when he or she has reached a particular icon (or any other specific screen location). This simulated detent may be deactivated when, for instance, the icon has been selected or when the cursor has passed over the icon.
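The application-side logic for such a simulated detent might, purely as an illustrative sketch, look like the following; the icon rectangles, drag levels, and function name are all hypothetical:

```python
# Illustrative detent logic: command an increased friction level when the
# cursor overlaps an icon that has not yet been selected, and release it
# otherwise.  Drag levels and icon geometry are assumptions.

def detent_feedback(cursor, icons, selected, drag_on=0.8, drag_off=0.0):
    """Return the friction level to command for the current cursor
    position.  `icons` maps names to (x0, y0, x1, y1) rectangles;
    `selected` is the set of already-selected icon names."""
    x, y = cursor
    for name, (x0, y0, x1, y1) in icons.items():
        if x0 <= x <= x1 and y0 <= y <= y1 and name not in selected:
            return drag_on   # user "feels" the icon under the cursor
    return drag_off          # free-running elsewhere or after selection
```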

The actuators discussed above may comprise a solenoid, a servomotor, or any other suitable device known in the art which generates a force in response to electric signals. The actuators may also employ shape-memory alloys, piezo ceramics, or electro-rheological compounds. Further, motor-type actuators may be employed to augment or restrain motion.

In other embodiments, the actuators discussed above may be used to activate and deactivate electrically controlled detents so as to provide tactile click stops in the pivoting motion of arm member 20 (FIG. 2) about point 22 or in the linear motion of thumbpiece 28 along arm member 20. These detents may be logically correlated with specific targets or options on a display screen such that once a particular option is selected, its corresponding detent is electrically deactivated. Adaptive feedback of this type can be very effective in making the above-described controls more intuitive.

Embodiments of the present invention may also be equipped with a spring return mechanism. With reference to interface control 10 (FIG. 2), a centering spring may be coupled to arm member 20 which, by exerting a restoring force on arm member 20, returns arm member 20 to its center position whenever arm member 20 has deviated therefrom. A manually controlled latch may also be provided which engages the centering spring with and disengages the centering spring from arm member 20 so as to turn the centering mechanism on and off. Such a centering mechanism is useful in applications requiring proportional control (as in a conventional joystick) rather than absolute control (as with a mouse). The centering spring may also be electrically actuated by an external signal from the interfaced device (e.g., computer, video game, and so on). Inclusion of such an electrically actuated spring allows the interfaced device to switch the controller between two modes of operation (spring centering and non-centering), as the particular application may require. In a similar manner, an additional centering spring may be coupled to thumbpiece 28 to provide proportional control in the vertical direction.
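The distinction between the two modes of operation can be illustrated, under assumed gains and with hypothetical names, by how a host might interpret the same arm deflection in each mode:

```python
# Illustrative interpretation of arm-member deflection in the two modes
# enabled by the latchable centering spring: spring engaged -> rate
# (proportional, joystick-like) control; disengaged -> absolute
# (mouse-like) control.  Gains are assumptions.

def cursor_x(deflection, prev_x, spring_engaged, gain=5.0, scale=10.0):
    """Return the new horizontal cursor position for a given deflection."""
    if spring_engaged:               # deflection sets cursor velocity
        return prev_x + gain * deflection
    return scale * deflection        # deflection sets cursor position
```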

While particular embodiments of the present invention have been shown and described, it will be obvious to those skilled in the art that changes and modifications may be made without departing from this invention in its broader aspects and, therefore, the appended claims are to encompass within their scope all such changes and modifications as fall within the true spirit and scope of this invention.

US5184319Feb 2, 1990Feb 2, 1993Kramer James FForce feedback and textures simulating interface device
US5185561Jul 23, 1991Feb 9, 1993Digital Equipment CorporationTorque motor as a tactile feedback device in a computer system
US5186629Aug 22, 1991Feb 16, 1993International Business Machines CorporationVirtual graphics display capable of presenting icons and windows to the blind computer user and method
US5186695Oct 26, 1990Feb 16, 1993Loredan Biomedical, Inc.Apparatus for controlled exercise and diagnosis of human performance
US5193963Oct 31, 1990Mar 16, 1993The United States Of America As Represented By The Administrator Of The National Aeronautics And Space AdministrationForce reflecting hand controller
US5202961 *Jun 8, 1990Apr 13, 1993Apple Computer, Inc.Sequential information controller
US5203563Mar 21, 1991Apr 20, 1993Atari Games CorporationFor generating vibrations in a steering wheel
US5212473Feb 21, 1991May 18, 1993Typeright Keyboard Corp.Membrane keyboard and method of using same
US5220260Oct 24, 1991Jun 15, 1993Lex Computer And Management CorporationActuator having electronically controllable tactile responsiveness
US5223776Dec 31, 1990Jun 29, 1993Honeywell Inc.Six-degree virtual pivot controller
US5235868Oct 2, 1991Aug 17, 1993Culver Craig FMechanism for generating control signals
US5237327Nov 14, 1991Aug 17, 1993Sony CorporationWireless, hand-held
US5240417Mar 14, 1991Aug 31, 1993Atari Games CorporationSystem and method for bicycle riding simulation
US5271290Apr 14, 1993Dec 21, 1993United Kingdom Atomic Energy AuthorityActuator assembly
US5275174Jul 16, 1992Jan 4, 1994Cook Jonathan ARepetitive strain injury assessment
US5280276Jul 10, 1992Jan 18, 1994Quickshot (Bvi) Ltd.Combination mouse/trackball input device
US5284330Jun 18, 1992Feb 8, 1994Lord CorporationMagnetorheological fluid devices
US5293158 *May 5, 1992Mar 8, 1994Alps Electric Co., Ltd.X-Y direction input device
US5296846Oct 5, 1992Mar 22, 1994National Biomedical Research FoundationThree-dimensional cursor control device
US5296871Jul 27, 1992Mar 22, 1994Paley W BradfordThree-dimensional mouse with tactile feedback
US5299810Jun 23, 1992Apr 5, 1994Atari Games CorporationVehicle simulator including cross-network feedback
US5309140Nov 26, 1991May 3, 1994The United States Of America As Represented By The Secretary Of The NavyFeedback system for remotely operated vehicles
US5313229 *Feb 5, 1993May 17, 1994Gilligan Federico GMouse and method for concurrent cursor position and scrolling control
US5313230Jul 24, 1992May 17, 1994Apple Computer, Inc.Three degree of freedom graphic object controller
US5317336Jun 23, 1992May 31, 1994Hall Kenneth JMouse yoke assembly for interfacing with a computer
US5329289 *Apr 22, 1992Jul 12, 1994Sharp Kabushiki KaishaData processor with rotatable display
US5334027Feb 25, 1991Aug 2, 1994Terry WherlockBig game fish training and exercise device and method
US5379663 *Nov 22, 1993Jan 10, 1995Mitsui Engineering & Shipbuilding Co., Ltd.Multi-axial joy stick device
US5381080Feb 18, 1993Jan 10, 1995Vdo Adolf Schindling AgControl device
US5396266Jun 8, 1993Mar 7, 1995Technical Research Associates, Inc.Kinesthetic feedback apparatus and method
US5405152Jun 8, 1993Apr 11, 1995The Walt Disney CompanyMethod and apparatus for an interactive video game with physical feedback
US5414337Jun 11, 1993May 9, 1995Lex Computer And Management CorporationActuator having electronically controllable tactile responsiveness
US5450079Sep 7, 1994Sep 12, 1995International Business Machines CorporationMultimodal remote control device having electrically alterable keypad designations
US5457479Jun 13, 1994Oct 10, 1995Primax Electronics Ltd.Apparatus having dual modes for controlling cursor on display screen
US5466213Jan 6, 1994Nov 14, 1995Massachusetts Institute Of TechnologyInteractive robotic therapist
US5473344Jan 6, 1994Dec 5, 1995Microsoft Corporation3-D cursor positioning device
US5474082 *Jan 6, 1993Dec 12, 1995Junker; AndrewBrain-body actuated system
US5477237Jul 14, 1994Dec 19, 1995Dell Usa, L.P.Positioning device reporting X, Y and yaw motion
US5491477Sep 13, 1993Feb 13, 1996Apple Computer, Inc.Anti-rotation mechanism for direct manipulation position input controller for computer
US5492312Apr 17, 1995Feb 20, 1996Lord CorporationMulti-degree of freedom magnetorheological devices and system for using same
US5506605Jan 26, 1994Apr 9, 1996Paley; W. BradfordThree-dimensional mouse with tactile feedback
US5512919Mar 30, 1993Apr 30, 1996Pioneer Electronic CorporationThree-dimensional coordinates input apparatus
US5513100Jun 10, 1993Apr 30, 1996The University Of British ColumbiaVelocity controller with force feedback stiffness control
US5530455 *Aug 10, 1994Jun 25, 1996Mouse Systems CorporationRoller mouse for implementing scrolling in windows applications
US5543821Jul 20, 1994Aug 6, 1996Logitech, Inc.For attaching to a computer
US5547382Apr 10, 1995Aug 20, 1996Honda Giken Kogyo Kabushiki KaishaRiding simulation system for motorcycles
US5550563 *Dec 23, 1992Aug 27, 1996Taligent, Inc.Interaction framework system
US5576727 *Jun 5, 1995Nov 19, 1996Immersion Human Interface CorporationElectromechanical human-computer interface with force feedback
US5583407Dec 27, 1994Dec 10, 1996Konami Co., Ltd.Manipulating device having three degree freedom
US5587937Apr 25, 1995Dec 24, 1996Massachusetts Institute Of TechnologyForce reflecting haptic interface
US5589828 *Mar 5, 1992Dec 31, 1996Armstrong; Brad A.6 Degrees of freedom controller with capability of tactile feedback
US5623582Jul 14, 1994Apr 22, 1997Immersion Human Interface CorporationComputer interface or control input device for laparoscopic surgical instrument and other elongated mechanical objects
US5627531Sep 30, 1994May 6, 1997Ohmeda Inc.Multi-function menu selection device
US5629594Oct 16, 1995May 13, 1997Cybernet Systems CorporationForce feedback system
US5631669Mar 29, 1995May 20, 1997Stobbs; Gregory A.Pointing device with integral microphone
US5642469Nov 3, 1994Jun 24, 1997University Of WashingtonDirect-drive manipulator for pen-based force display
US5643087 *Jul 29, 1994Jul 1, 1997Microsoft CorporationInput device including digital force feedback apparatus
US5666473Oct 8, 1992Sep 9, 1997Science & Technology Corporation & UnmTactile computer aided sculpting device
US5691747Dec 19, 1994Nov 25, 1997Seiko Epson CorporationFor determining a static position on a viewing screen of a computing device
US5781172 *Jun 16, 1997Jul 14, 1998U.S. Philips CorporationData input device for use with a data processing apparatus and a data processing apparatus provided with such a device
US6281883 *Sep 8, 1994Aug 28, 2001Voice Domain Technologies, LlcData entry device
JPS5820012A * Title not available
JPS59160229A * Title not available
Non-Patent Citations
Reference
1"Cyberman Technical Specification," Logitech Cyberman SWIFT Supplement, Apr. 5, 1994.
2Adachi, Yoshitaka et al., "Sensory Evaluation of Virtual Haptic Push-Buttons," Technical Research Center, Suzuki Motor Corporation.
3Adelstein, "A Virtual Environment System For The Study of Human Arm Tremor," Ph.D Dissertation, Dept. of Mechanical Engineering, MIT, Jun. 1989.
4Adelstein, "Design and Implementation of a Force Reflecting Manipulandum for Manual Control research," DSC-vol. 42, Advances in Robotics, Edited by H. Kazerooni, pp. 1-12, 1992.
6Adelstein, Bernard D. et al., "A High Performance Two Degree-of-Freedom Kinesthetic Interface," Massachusetts Institute of Technology 1992, pp. 108-112.
7Adelstein, Bernard D. et al., "Design and Implementation of a Force Reflecting Manipulandum for Manual Control Research," 1992, pp. 1-24.
8Akamatsu, M. et al., "Multimodal Mouse: A Mouse-Type Device with Tactile and Force Display," Presence, vol. 3, No. 1, 1994, pp. 73-80.
9Atkinston, William D. et al, "Computing with Feeling," Comput. & Graphics, vol. 2, No. 2-E, pp. 97-103.
10Aukstakalnis et al., "Silicon Mirage: The Art and Science of Virtual Reality," ISBN 0-938151-82-7, pp. 129-180, 1992.
11Baigrie, "Electric Control Loading-A Low Cost, High Performance Alternative," Proceedings, pp. 247-254, Nov. 6-8, 1990.
12Batter, James J. et al., "Grope-1: A Computer Display to the Sense of Feel," pp. TA-4-188-TA-4-192.
13Bejczy et al., "Kinesthetic Coupling Between Operator and Remote Manipulator," International Computer Technology Conference, The American Society of Mechanical Engineers, San Francisco, CA, Aug. 12-15, 1980.
14Bejczy et al., "A Laboratory Breadboard System For Dual-Arm Teleoperation," SOAR '89 Workshop, JSC, Houston, TX, Jul. 25-27, 1989.
15Bejczy, "Generalization of Bilateral Force-Reflecting Control of Manipulators," Proceedings of Fourth CISM-IFToMM, Sep. 8-12, 1981.
16Bejczy, "Sensors, Controls, and Man-Machine Interface for Advanced Teleoperation," Science, vol. 208, No. 4450, pp. 1327-1335, 1980.
17Bejczy, et al., "Universal Computer Control System (UCCS) For Space Telerobots," CH2413-3/87/0000/0318501.00 1987 IEEE, 1987.
18Bostrom, M. et al., "Design of An Interactive Lumbar Puncture Simulator With Tactile Feedback," IEEE 0-7803-1363, 1993, pp. 280-286.
19Brooks et al., "Hand Controllers for Teleoperation-A State-of-the-Art Technology Survey and Evaluation," JPL Publication 85-11; NASA-CR-175890; N85-28559, pp. 1-84, Mar. 11, 1985.
20Brooks, F. et al., "Project GROPE-Haptic Displays for Scientific Visualization," Computer Graphics, vol. 24, No. 4, 1990, pp. 177-185.
22Burdea et al., "Distributed Virtual Force Feedback, Lecture Notes for Workshop on Force Display in Virtual Environments and its Application to Robotic Teleoperation," 1993 IEEE International Conference on Robotics and Automation, pp. 25-44, May 2, 1993.
23Caldwell et al., "Enhanced Tactile Feedback (Tele-Taction) Using a Multi-Functional Sensory System," 1050-4729/93, pp. 955-960, 1993.
24Eberhardt et al., "Including Dynamic Haptic Perception by The Hand: System Description and Some Results," DSC-vol. 55-1, Dynamic Systems and Control: vol. 1, ASME 1994.
25Eberhardt et al., "OMAR-A Haptic display for speech perception by deaf and deaf-blind individuals," IEEE Virtual Reality Annual International Symposium, Seattle, WA, Sep. 18-22, 1993.
26Ellis, R.E. et al., "Design and Evaluation of a High-Performance Prototype Planar Haptic Interface," ASME Dec. 3, 1993, DSC-vol. 49, pp. 55-64.
28Gobel et al., "Tactile Feedback Applied to Computer Mice," International Journal of Human-Computer Interaction, vol. 7, No. 1, pp. 1-24, 1995.
29Gotow et al., "Controlled Impedance Test Apparatus for Studying Human Interpretation of Kinesthetic Feedback," WA11-11:00, pp. 332-337, 1989.
30Gotow, J.K., et al., "Perception of Mechanical Properties at the Man-Machine Interface," IEEE 1987, pp. 688-689.
31Hannaford, B. et al., "Force Feedback Cursor Control," NASA Tech Brief, vol. 13, No. 11, Item #21, 1989, pp. i, 1-4.
32Hayward, V. et al., "Design and Multi-Objective Optimization of a Linkage for a Haptic Interface," Advances in Robot Kinematics and Computational Geometry, Kluwer Academic Publishers, 1994, pp. 359-368.
33Hirota, Koichi et al., "Development of Surface Display," IEEE 0-7803-1363-1, 1993, pp. 256-262.
34Howe, "A Force-Reflecting Teleoperated Hand System for the Study of Tactile Sensing in Precision Manipulation," Proceedings of the 1992 IEEE International Conference on Robotics and Automation, Nice, France, May 1992.
35Howe, Robert D., "Task Performance with a Dextrous Teleoperated Hand System," Proceedings of SPIE, Nov. 1992, vol. 1833, pp. 1-9.
36IBM Technical Disclosure Bulletin, "Mouse Ball-Actuating Device With Force and Tactile Feedback," vol. 32, No. 9B, Feb. 1990.
37Iwata, "Pen-based Haptic Virtual Environment," 0-7803-1363-1/93 IEEE, pp. 287-292, 1993.
38Iwata, Hiroo, "Artificial Reality with Force-feedback: Development of Desktop Virtual Space with Compact Master Manipulator," Computer Graphics, vol. 24, No. 4, 1990, pp. 165-170.
39Iwata, Hiroo, "Pen-based Haptic Virtual Environment," IEEE, 0-7803-1363-1, pp. 287-292.
40Jacobsen et al., "High Performance, Dextrous Telerobotic Manipulator With Force Reflection," Intervention/ROV '91 Conference & Exposition, Hollywood, Florida, May 21-23, 1991.
41Jones et al., "A Perceptual analysis of stiffness," ISSN 0014-4819 Springer International (Springer-Verlag); Experimental Brain Research, vol. 79, No. 1, pp. 150-156, 1990.
42Kaczmarek et al., "Tactile Displays," Virtual Environment Technologies, 1995.
43Kelley, A. J. et al., "MagicMouse: Tactile and Kinesthetic Feedback in the Human-Computer Interface using an Electromagnetically Actuated Input/Output Device," Dept. of Elec. Eng., Univ. of Brit. Columbia, 1993, pp. 1-27.
44Kilpatrick, P., "The Use of a Kinesthetic Supplement in an Interactive Graphics System," Univ. of N. Carolina, 1976, pp. 1-175.
45Kontarinis et al., "Display of High-Frequency Tactile Information to Teleoperators," Telemanipulator Technology and Space Telerobotics, Won S. Kim, Editor, Proc. SPIE vol. 2057, pp. 40-50, Sep. 7-9, 1993.
46Marcus, "Touch Feedback in Surgery," Proceedings of Virtual Reality and Medicine The Cutting Edge, Sep. 8-11, 1994.
47McAffee, "Teleoperator Subsystem/Telerobot Demonstrator: Force Reflecting Hand Controller Equipment Manual," JPL D-5172, pp. 1-50, A1-A36, B1-B5, C1-C36, Jan. 1988.
48Millman, P. et al., "Design of a Four Degree-of-Freedom Force-Reflecting Manipulandum with a Specified Force/Torque Workspace," IEEE CH2969-4, 1991, pp. 1488-1492.
49Minsky, "Computational Haptics: The Sandpaper System for Synthesizing Texture for a Force-Feedback Display," Ph. D. Dissertation, MIT, Jun. 1995.
50Minsky, Margaret et al., "Feeling and Seeing: Issues in Force Display," ACM 1990, pp. 235-242.
51Ouhyoung et al., "The Development of A Low-Cost Force Feedback Joystick and Its Use in the Virtual Reality Environment," Proceedings of the Third Pacific Conference on Computer Graphics and Applications, Pacific Graphics '95, Seoul, Korea, Aug. 21-24, 1995.
52Ouh-Young, "A Low-Cost Force Feedback Joystick and Its Use in PC Video Games," IEEE Transactions on Consumer Electronics, vol. 41, No. 3, Aug. 1995.
53Ouh-Young, "Force Display in Molecular Docking," Order No. 9034744, pp. 1-369, 1990.
54Ouh-young, M. et al., "Creating an Illusion of Feel: Control Issues in Force Display," Computer Science Dept., Univ. of N. Carolina, 1989, pp. 1-14.
55Patrick et al., "Design and Testing of A Non-reactive, Fingertip, Tactile Display for Interaction with Remote Environments," Cooperative Intelligent Robotics in Space, Rui J. deFigueiredo et al., Editor, Proc. SPIE vol. 1987, pp. 215-222, 1990.
56Pimentel et al., "Virtual Reality: through the new looking glass," 2nd Edition; McGraw-Hill, ISBN 0-07-050167-X, pp. 41-202, 1994.
57Rabinowitz et al., "Multidimensional tactile displays: Identification of vibratory intensity, frequency, and contactor area," Journal of The Acoustical Society of America, vol. 82, No. 4, Oct. 1987.
58Ramstein, Christophe, "The Pantograph: A Large Workspace Haptic Device for a Multimodal Human-Computer Interaction," Computer-Human Interaction, CHI 1994.
59Rosenberg, Louis B., "Virtual Haptic Overlays Enhance Performance in Telepresence Tasks," SPIE 1994.
60Rosenberg, Louis B., Perceptual Design of A Virtual Rigid Surface Contact, Center for Design Research, Stanford University, Armstrong Laboratory, AL/CF-TR-1995-0029, Apr. 1993.
61Russo, "Controlling Dissipative Magnetic Particle Brakes in Force Reflective Devices," DSC-vol. 42, Advances in Robotics, pp. 63-70, ASME 1992.
62Russo, "The Design and Implementation of a Three Degree of Freedom Force Output Joystick," MIT Libraries Archives Aug. 14, 1990, pp. 1-131, May 1990.
63Russo, Massimo Andrea, "The Design and Implementation of a Three Degree-of-Freedom Force Output Joystick," Department of Mechanical Engineering, May 11, 1990, pp. 9-40 & 96 & 97.
64Scannell, "Taking a Joystick Ride," Computer Currents, Boston Edition, vol. 9, No. 11, Nov. 1994.
65Schmult, Brian et al., "Application Areas for a Force-Feedback Joystick," ASME 1993, DSC-vol. 49, pp. 47-54.
67Shimoga, "Finger Force and Touch Feedback Issues in Dexterous Telemanipulation," Proceedings of Fourth Annual Conference on Intelligent Robotic Systems for Space Exploration, Rensselaer Polytechnic Institute, Sep. 30-Oct. 1, 1992.
68Snow et al., "Model-X Force-Reflecting-Hand-Controller," NT Control No. MPO-17851; JPL Case No. 5348, pp. 1-4, Jun. 15, 1989.
69Stanley et al., "Computer Simulation of Interacting Dynamic Mechanical Systems Using Distributed Memory Parallel Processors," DSC-vol. 42, Advances in Robotics, pp. 55-61, ASME 1992.
70Su, S. Augustine et al., "The Virtual Panel Architecture: A 3D Gesture Framework," IEEE 1993.
71Tadros, "Control System Design for a Three Degree of Freedom Virtual Environment Simulator Using Motor/Brake Pair Actuators", MIT Archive © Massachusetts Institute of Technology, pp. 1-88, Feb. 1990.
72Terry et al., "Tactile Feedback In A Computer Mouse," Proceedings of Fourteenth Annual Northeast Bioengineering Conference, University of New Hampshire, Mar. 10-11, 1988.
73United States Patent and Trademark Office, Office Action, U.S. Appl. No. 09/875,458, mailed Jun. 28, 2006.
74United States Patent and Trademark Office, Office Action, U.S. Appl. No. 09/875,458, mailed Feb. 25, 2004.
75United States Patent and Trademark Office, Office Action, U.S. Appl. No. 09/875,458, mailed Jan. 1, 2006.
76United States Patent and Trademark Office, Office Action, U.S. Appl. No. 09/875,458, mailed Jan. 13, 2003.
77United States Patent and Trademark Office, Office Action, U.S. Appl. No. 09/875,458, mailed Jul. 29, 2003.
78United States Patent and Trademark Office, Office Action, U.S. Appl. No. 09/875,458, mailed Jun. 1, 2005.
79United States Patent and Trademark Office, Office Action, U.S. Appl. No. 09/875,458, mailed Sep. 29, 2004.
80United States Patent and Trademark Office, Office Action, U.S. Appl. No. 09/875,458, May 30, 2002.
81United States Patent and Trademark Office, Office Action, U.S. Appl. No. 11/594,668, mailed Mar. 26, 2010.
82United States Patent and Trademark Office, Office Action, U.S. Appl. No. 11/594,668, mailed Oct. 20, 2009.
83Wiker, Steven F. et al., "Development of Tactile Mice for Blind Access to Computers: Importance of Stimulation Locus, Object Size, and Vibrotactile Display Resolution," Proceedings of the Human Factors Society 35th Annual Meeting 1991, pp. 708-712.
84Winey III, C., "Computer Simulated Visual and Tactile Feedback as an Aid to Manipulator and Vehicle Control," Mass. Inst. of Tech., 1981, pp. 1-79.
Referenced by
Citing PatentFiling datePublication dateApplicantTitle
US8708823 *Nov 3, 2011Apr 29, 2014Haitao ZhouGame controller with locating device having guider tracks and displacement sensor with touch sensor switch
US20130029763 *Nov 3, 2011Jan 31, 2013Haitao ZhouGame controller
Classifications
U.S. Classification345/161, 345/184, 345/157
International ClassificationG09G5/00, G06F3/033
Cooperative ClassificationG06F3/0338, G06F3/0362
European ClassificationG06F3/0362, G06F3/0338
Legal Events
DateCodeEventDescription
Feb 12, 2002ASAssignment
Effective date: 19991102
Free format text: MERGER;ASSIGNOR:IMMERSION CORPORATION (CALIFORNIA CORPORATION);REEL/FRAME:012607/0368
Owner name: IMMERSION CORPORATION (DELAWARE CORPORATION), CALI