
Publication number: US 5288078 A
Publication type: Grant
Application number: US 07/914,640
Publication date: Feb 22, 1994
Filing date: Jul 16, 1992
Priority date: Oct 14, 1988
Fee status: Paid
Also published as: US 5521616
Inventors: David G. Capper, Stan Axelrod
Original Assignee: David G. Capper
External Links: USPTO, USPTO Assignment, Espacenet
Control interface apparatus
US 5288078 A
Abstract
This invention comprises a control interface between a machine and a physical object. The invention includes an infrared transmitter for transmitting a first infrared signal to an object. Upon striking the object, the infrared signal is reflected, forming a reflected infrared signal. An infrared receiver receives the reflected signal from the object, and the reflected signal is transformed into a second signal, representative of a distance between the object and the receiver, which may be either an analogue type or a yes/no threshold type. The second signal is coupled to the machine. The apparatus is a cordless, touch-free controller interface for use with a machine. The present invention is ideally suited to controlling cursor position for use with computers and also with video games.
Images (9)
Claims (13)
What is claimed is:
1. A method by which a person may play a video game, the method comprising:
transmitting infrared radiation from a radiation transmitter to create a detection field;
providing a radiation receiver which has a fixed position relative to the radiation transmitter and this position being one where the receiver can detect reflections of the infrared radiation;
moving at least a part of the person to a position within the detection field thereby causing infrared radiation to reflect off of the part of the person within the detection field toward the receiver;
receiving radiation reflected from the part of the person within the detection field by the radiation receiver;
determining the strength of the radiation reflected from the part of the person within the detection field;
triggering an action in the game when the strength of the reflected radiation is above a predetermined threshold.
2. The method of claim 1 further comprising the step of depicting the action on a visual display.
3. A video game apparatus which allows a user to play a video game without touching the apparatus, comprising:
an infrared radiation transmitter for outwardly transmitting radiation to create a detection field into which the user may put at least a part or extension of the user's body;
a radiation receiver which has a fixed position relative to the radiation transmitter for receiving a reflection of the outwardly transmitted radiation that creates a detection field from the part or extension of the user within the detection field, and for producing a signal corresponding to the strength of the received reflection; and
a processor, operably associated with the radiation receiver, for triggering at least one predetermined action in the game when the signal corresponding to the strength of the reflected radiation is above a predetermined threshold.
4. The apparatus of claim 3 where the infrared radiation transmitter is an infrared light emitting diode.
5. The apparatus of claim 3 where the radiation receiver comprises a photo-transistor.
6. The apparatus of claim 3 where the signal is an analog signal.
7. The apparatus of claim 3 where the signal is a digital signal.
8. The apparatus of claim 3 where the radiation receiver may produce additional signals corresponding to changes in the strength of the received reflection.
9. The apparatus of claim 8 where the additional signals trigger additional predetermined actions in the game.
10. The apparatus of claim 3 further comprising:
a second infrared radiation transmitter for outwardly transmitting radiation to create a second detection field into which the user also may put at least a part or extension of the user's body; and
a second radiation receiver, operably associated with the second radiation transmitter, for receiving a reflection of the outwardly transmitted radiation that creates the second detection field from the part or extension of the user that is within the second detection field, and for producing a second signal corresponding to the strength of the received reflection;
where the processor may trigger at least one additional predetermined action in the game corresponding to the second signal.
11. The apparatus of claim 10 where the second radiation receiver has a fixed position relative to the second radiation transmitter.
12. The apparatus of claim 11 where the radiation transmitter and radiation receiver are positioned in a generally linear array and where the second radiation transmitter and second radiation receiver are also positioned in a generally linear array.
13. The apparatus of claim 3 where the radiation transmitter and the radiation receiver are positioned in a generally linear array.
Description

This is a continuation of application Ser. No. 07/258,157 filed Oct. 14, 1988, now abandoned.

A portion of the disclosure of this patent document, an Appendix, contains material which is subject to copyright protection. The copyright owner has no objection to the facsimile reproduction by anyone of the patent document or the patent disclosure, as it appears in the Patent and Trademark Office patent file or records, but otherwise reserves all copyright rights whatsoever. The material is protected under U.S. Copyright Law by the notice "Copyright Axelrod 1988."

FIELD OF THE INVENTION

This invention relates to the field of control interfaces for computers or video games. More particularly, this invention relates to determining the location of an object, such as a human hand, using infrared radiation transmission and reception. The apparatus disclosed and claimed is capable of operating selectively in such a way that location determination can be used to generate not only "threshold" (yes/no) type control signals, but also "analogue" (non-yes/no) type control signals.

BACKGROUND OF THE INVENTION

Controlling the position of a cursor is important when using certain types of machines such as computers or video games. Cursors have been controlled by keyboard keystrokes. Improved ergonomic interfaces include the joystick, mouse and trackball. Using these devices, a computer operator has a better "feel" for adjusting cursor position relative to control operation. However, each of these devices requires the operator to move a physical object which is hard wired to the machine. Such a practice can be cumbersome or inefficient, requiring the operator to retrieve and deposit the object for each use.

There are other non-keyboard devices which must be moved for controlling a cursor. These devices include means for measuring the Doppler shift, combining infrared and ultrasonic transceivers, transmitting light from a light pen at the display screen, or affixing a radio transmitter or receiver to the head of the user. Such methods and devices are shown in Baker et al., U.S. Pat. No. 4,578,764; King et al., U.S. Pat. No. 4,565,999; Davison, U.S. Pat. No. 4,682,159; Herrington et al., U.S. Pat. No. 4,645,648; and Mueller et al., U.S. Pat. No. 4,550,250.

Lefkowitz, in U.S. Pat. No. 4,524,348, discloses a cordless control interface between a physical object such as a part of the human body, and a machine. Movement of the physical object in a defined field is sensed, and signals corresponding to such movement are received, detected, amplified and produced as an input signal to the machine to move an element of the machine in the same direction as, and in an amount proportional to, movement of the object. In one embodiment, the machine is a video game system and the element is a display signal.

The Lefkowitz apparatus comprises planar antennas, such as sheets of metal, each coupled to a detunable oscillator. If a physical object, such as a human hand, is placed into the field of the oscillator, the body capacitance of the hand is communicated to a tuned circuit as added capacitance, altering the frequency of the active oscillator. Such alteration takes the form of a lowering of the operating frequency. Accordingly, the position of the hand is sensed. By moving the hand, the capacitance changes.

The sensed object is electrically coupled as a capacitance into the circuit of Lefkowitz through one or more antennas. The position of an electrically inert object, which has no ability to affect the capacitance of the system, cannot be detected and located. Therefore, electrically inert objects cannot be used to control a cursor using Lefkowitz. In certain applications, such as video or computer games, a player may wish to wield an object, such as a sword, baseball bat or the like, to enhance the realism of play. Where such objects are electrically inert for safety or other reasons, the object cannot be sensed.

The oscillations are typically in the radio frequency range, which makes these devices expensive to manufacture. Further, radio frequency devices must adhere to governmental restrictions.

An ideal cordless cursor control device would merely sense the position of any physical object, for example, the operator's hand, without requiring the operator to move a hard-wired object or to use expensive radio frequency oscillators. Such a device would be extremely easy for the user to operate. Further, such a device would greatly simplify and enhance the playing of video games.

SUMMARY OF THE INVENTION

This invention comprises a control interface between a machine and a physical object. The invention includes an infrared transmitter for transmitting a first infrared signal to an object. Upon striking the object, the infrared signal is reflected forming a reflected infrared signal. An infrared receiver receives the reflected signal from the object and the reflected signal is transformed into a second signal representative of a distance between the object and the receiver. The second signal, which may be either an analogue type, or a yes/no threshold type, is coupled to the machine. The apparatus is a cordless touch-free controller interface for use with a machine. The present invention is ideally suited for controlling cursor position for use with computers, video games or other display devices.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 shows a perspective view of the preferred embodiment of the present invention.

FIG. 2 shows a detailed view of a portion of the embodiment of FIG. 1.

FIG. 3 shows a schematic diagram of the embodiment of the invention of FIG. 1.

FIG. 4 shows a perspective diagram of an alternate embodiment of the present invention.

FIG. 5A is a schematic diagram of the transmitter of the preferred embodiment.

FIG. 5B is a schematic diagram of the receiver of the preferred embodiment.

FIG. 5C is a block diagram of the preferred embodiment.

FIG. 6A is a schematic diagram of the transmitter of an alternate embodiment of the present invention.

FIG. 6B is a schematic diagram of the receiver of an alternate embodiment.

FIG. 6C is a block diagram of a second alternate embodiment of the receiver circuit.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT

FIG. 1 shows a first embodiment of the present invention. A computer device 10 having a display screen 12 is shown. A first infrared transmitter 14 is associated with a first infrared receiver 16 forming an infrared transceiver 18. Similarly, a second infrared transmitter 20 is associated with a second infrared receiver 22 forming an infrared transceiver 24. Each transceiver 18 and 24 is coupled to a control circuit 26. The control circuit 26 is coupled to the computer 10 to control the cursor 28 on the screen 12.

In FIG. 1, the hand of the operator 30 is shown to move first to the left and then in an upward direction. If the computer 10 is operating in a conventional display mode, the cursor will traverse a path on the screen 12 mimicking (illustrated at 32) the path traversed by the movement of the operator's hand.

FIG. 2 shows a more detailed view of the infrared transmitter 14, the infrared receiver 16 and the operator's hand 30. In the preferred embodiment, the transmitter 14 is an infrared light emitting diode (LED) and the receiver 16 is a phototransistor. The receiver 16 could also be a photodiode. In addition, FIG. 2 shows a representation of the transmitted infrared radiation 34 and the reflected radiation 36. The infrared radiation 34 is transmitted from the infrared transmitter 14. The radiation is transmitted in all directions. For certain applications, the transmitted radiation might be collimated using lenses. Some portion of the transmitted radiation 34 will strike the operator's hand 30. That portion of the radiation striking the operator's hand will be reflected, also in all directions.

The strength of received radiation diminishes with distance from the source. Accordingly, the amount of radiation received by the infrared receiver 16 depends upon the distance that the radiation travels from the infrared transmitter 14 to the operator's hand plus the distance from the operator's hand to the infrared receiver 16, i.e., the path of the infrared signal from the transmitter to the receiver. Certain applications may require a radiation shield between the transmitter and receiver to prevent false readings.
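The relationship between received strength and path length can be sketched in code. This is a minimal illustration only: the patent does not specify a falloff model, so an inverse-square law and a calibration constant k are assumed here for the sake of the example.

```python
import math

def path_length_from_strength(strength, k=1.0):
    """Estimate the total transmitter-to-hand-to-receiver path length
    from received signal strength, assuming (as an illustration) an
    inverse-square falloff: strength = k / distance**2.  The constant
    k is a hypothetical calibration value; a real device would
    determine it by measurement against known distances."""
    if strength <= 0:
        raise ValueError("no reflection detected")
    return math.sqrt(k / strength)
```

A stronger reflection thus maps to a shorter path, which is the monotonic relationship the circuit relies upon.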

The system of FIG. 1 shows transceiver 18 and transceiver 24 mounted along a single line in a planar surface. The distance of the operator's hand from each transceiver represents a location on the computer screen 12. A third transceiver pair could be mounted in the plane of the other two transceivers but not on the same line in three dimensional space. In the alternative, the transceivers could be mounted in planar surfaces oriented perpendicular to one another. In such an embodiment the cursor would be controlled by the distance of the object from each planar surface.

FIG. 3 shows two transceivers T1 and T2 mounted in the same plane. To determine the distance of the hand 30 from a plane defined by the transceivers T1 and T2 in an ideal system, the following equations are used:

dn = (An^2 - Bn^2 + D^2) / (2D)

Xn = (An^2 - dn^2)^(1/2)

An is the distance of the hand from the first transceiver T1 measured as a function of the strength of the signal received by the first transceiver.

Bn is the distance of the hand from the second transceiver T2 measured as a function of the strength of the signal received by the second transceiver.

Xn is the distance of the hand from the plane defined by the transceivers T1 and T2.

dn is the distance from transceiver T1 and the projection of the location of the hand on the plane defined by the transceivers T1 and T2.

D is the distance between transceivers T1 and T2.

EXAMPLE 1

d1 = 1 ≈ ((3.16)^2 - (6.71)^2 + 49) / 14

x1 = 3 ≈ ((3.16)^2 - (1)^2)^(1/2)

EXAMPLE 2

d2 = 4 ≈ ((6.4)^2 - (5.83)^2 + 49) / 14

x2 = 5 ≈ ((6.4)^2 - (4)^2)^(1/2)

EXAMPLE 3

d3 = -3 ≈ ((6.71)^2 - (11.66)^2 + 49) / 14

x3 = 6 ≈ ((6.71)^2 - (-3)^2)^(1/2)

Using the Pythagorean theorem, these equations show that both the lateral distance along, and the distance from, the surface in which both transceivers are mounted can be measured using these two transceivers, T1 and T2. Referring to FIG. 3, the distance D between the two transceivers is a constant. The two distances An and Bn of a given point from the two transceivers are measured using the relative strength of the signals received at each of the transceivers. Accordingly, there are two equations and two unknowns, Xn and dn. Using a third such transceiver mounted off the line connecting T1 and T2, one can determine position in three dimensions for applications requiring such information, such as mechanical CAD design or certain video games.
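The two-equation, two-unknown solution above can be sketched as a short routine. The function name is illustrative; the formulas follow directly from the Pythagorean relations, and the examples use D = 7 as in the patent's Examples 1 through 3.

```python
import math

def locate_hand(a_n, b_n, d):
    """Locate the hand from its distances a_n and b_n to the two
    transceivers T1 and T2, which are a distance d apart.  Returns
    (d_n, x_n): d_n is the lateral offset from T1 along the T1-T2
    line and x_n is the perpendicular distance from that line, per
    the Pythagorean relations given above."""
    d_n = (a_n**2 - b_n**2 + d**2) / (2 * d)
    x_n = math.sqrt(a_n**2 - d_n**2)
    return d_n, x_n
```

Feeding in the measured distances from Example 1, locate_hand(3.16, 6.71, 7), recovers a lateral offset near 1 and a perpendicular distance near 3, matching the worked figures.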

No system will be ideal. An operator's hand located within the volume of space sensed by the present invention will be represented on the display screen at a particular location. As the operator's hand is moved and positioned in a new location the relative analogue change in position will be displayed on the screen. More precise position identification can be obtained through the use of precision components, infrared lenses and circuitry.

In some applications, the infrared signals from a first transceiver, after striking the operator's hand and thereby forming a reflected signal, may inadvertently strike a second, unintended transceiver, forming a false indication of location. Three possible ways to avoid such a false reading include: 1) having each transceiver operate within a unique segment of the infrared portion of the electromagnetic spectrum, 2) collimating the signals sent and received with optical lenses, and 3) alternately pulsing each transceiver for a predetermined length of time.

For example, in the embodiment of FIG. 1, each transceiver could be turned on for 100 milliseconds and then turned off for the same length of time. The other transceiver would be turned on when the first transceiver is turned off and vice versa. For three dimensional applications, each transceiver could operate for one-third of the duty cycle. Because infrared radiation travels at the speed of light, approximately 186,000 miles per second, only very short operating cycles are needed to pinpoint the location of the operator's hand with reasonable accuracy. In this way, inexpensive electronic components can be utilized while still maintaining high accuracy of cursor control.
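The alternate-pulsing scheme above can be sketched as a round-robin polling loop. The transceiver objects and their enable, disable and read_reflection methods are hypothetical stand-ins for the hardware; only one unit is active in any time slot, so one unit's reflection cannot be mistaken for another's.

```python
import itertools
import time

def poll_transceivers(transceivers, slot_ms=100):
    """Round-robin polling generator: enable exactly one transceiver
    per time slot, read its reflected-signal strength, then disable
    it before moving on.  `transceivers` is any sequence of objects
    exposing hypothetical enable(), disable() and read_reflection()
    methods; slot_ms defaults to the 100 ms on/off cycle described
    above.  Yields (transceiver, strength) pairs indefinitely."""
    for t in itertools.cycle(transceivers):
        t.enable()
        time.sleep(slot_ms / 1000.0)     # let the pulse settle
        strength = t.read_reflection()
        t.disable()                      # only one unit active at a time
        yield t, strength
```

With two transceivers the generator alternates A, B, A, B; with three, each unit naturally receives one-third of the duty cycle, as described above.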

In some applications, it may be desirable for a cursor control device to sense the location of more than one object. In the preferred embodiment of the present invention, the cursor control device may be utilized with a Nintendo video game device. Nintendo is a trademark of Nintendo of America, Inc. If, for example, the Mike Tyson Punch Out game is used, it may be necessary to sense each of the "boxer's" hands separately. In FIG. 4, the player 40 is positioned in order to see the display 42. The display 42 is controlled in the usual manner by the video game device 44, which in some circumstances may be a personal computer. The display 42 shows, among other things, a caricature of the player as a boxer and an opponent in a boxing ring.

In the Nintendo boxing game, some means must be used to identify a left punch and a right punch, as well as to distinguish between blows to the face and body jabs. Nintendo sells a controller which requires the player to press buttons or combinations of buttons to signify each such punch. When utilizing the present invention, the location of each hand can be uniquely determined by having a screen 46 divide two playing areas, each having a transceiver 48. The location of each hand is separately registered by the appropriate transceiver.

When utilizing the present invention with this game, the control circuitry can be set to register a punch after a particular threshold of reflected signal is received. This signifies that the player's hand is at least as close to the transceiver as some predefined limit. In the event that the player's hand is farther from the transceiver than is necessary to achieve the appropriate threshold, no punch is indicated on the game screen. When the player's hand approaches the transceiver sufficiently closely that the threshold is crossed, the display screen will then indicate a punch.
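The threshold behaviour described above can be sketched as an edge-triggered detector. The class name and threshold value are illustrative assumptions; the key point is that a punch registers at the moment the reflected strength crosses the threshold, not continuously while the hand is held close.

```python
class PunchTrigger:
    """Edge-triggered yes/no threshold control, sketching the punch
    behaviour described above: a punch is registered only when the
    reflected-signal strength crosses the (hypothetical) threshold
    from below, i.e. when the hand first comes at least as close as
    the predefined limit."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.above = False          # was the last sample above threshold?

    def update(self, strength):
        """Feed one reflected-strength sample; return True only on a
        rising crossing of the threshold."""
        was_above = self.above
        self.above = strength >= self.threshold
        return self.above and not was_above
```

Feeding successive samples into update() yields True exactly once per approach of the hand, so holding the hand in place does not repeat the punch.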

The game might be played without a dividing screen should the transceivers be able to differentiate between the player's left and right hands. This might be achieved, for example, by making the transceivers directional. The infrared radiation may be focused with lenses for both the transmitter and the receiver to collimate the signal. Each transceiver would then only transmit and receive infrared radiation within a sufficiently narrow field of view. This would avoid the problem of a left punch being misinterpreted as a right punch. The player must take care to avoid moving his or her left hand into the right punch field of view. Accordingly, the apparatus can differentiate between the player's two hands.

FIG. 5A shows a transmitter of the preferred embodiment. The transmitter contains a one-of-eight multiplexer 100 having three select control lines TA, TB, and TC and an inhibit line TI. The inhibit line disables the multiplexer as necessary. The circuit for each of the eight outputs is preferably the same as each other output circuit; only three of these eight output circuits are shown to avoid unnecessarily complicating the circuit schematic diagram. The control lines are preferably controlled by the CPU shown in FIG. 5C.

The emitter of an input NPN transistor Q2 is coupled to an input port 102. The collector of transistor Q2 is coupled to a positive voltage V and the base of transistor Q2 to a current limiting bias resistor R1. The bias resistor is coupled to any convenient 1 kHz square-wave source.

Each of the outputs of the multiplexer 100 is coupled to an appropriate output circuit. Each of the eight output circuits is the same as each other output circuit. The first output of the multiplexer is coupled to the base of an appropriate NPN transistor Q1. The emitter of the transistor Q1 is coupled to ground and the collector to the anode of an infrared light emitting diode LED1. All of the LEDs 1 through 8 are coupled to the positive voltage supply V through a second current limiting bias resistor R2.

The select channels A, B and C singly select one of the eight outputs of the multiplexer 100 to be active. The input transistor Q2 and the selected output transistor Q1 operate together as a Darlington pair to drive the selected light emitting diode LED1.

FIG. 5B shows the receiver circuit of the preferred embodiment. The infrared light received at the receiver strikes an appropriate one of the eight phototransistors Q3. The circuit for each of the eight inputs is preferably the same as each other input circuit; only two of these eight input circuits are shown to avoid unnecessarily complicating the circuit schematic diagram. Three control lines to the multiplexer circuit, RA, RB and RC, select the appropriate one of the eight input circuits. The control lines are preferably controlled by the CPU shown in FIG. 5C. An inhibit line RI is also supplied to inactivate the receiver multiplexer if necessary.

The emitter of the phototransistor Q3 is coupled to a negative voltage supply -V2. The collector of the phototransistor Q3 is coupled to a positive voltage supply V1 through a current limiting bias resistor R3. The collector of the phototransistor Q3 is also coupled to one of the eight inputs of a one of eight multiplexer 104 through a high pass filter formed by a capacitor C1 and a resistor R4 to reduce the low frequency hum and decouple dc offset caused by variations in phototransistor gains. The resistor R4 is coupled between the input of the multiplexer and ground.

The output of the multiplexer 104 is coupled to a resistor R5 which sets the reference point for the gain stages. The resistor R5 is also coupled to ground. The output of the multiplexer is also coupled to a first terminal of capacitor C2. The second terminal of the capacitor C2 is coupled to the positive input of a first operational amplifier A1 and to a first terminal of a resistor R6. The second terminal of the resistor R6 is coupled to ground. The capacitor C2 and the resistor R6 operate as a high pass filter for the positive input of the operational amplifier A1.

The negative input of the operational amplifier A1 is coupled to ground through the series connection of a resistor R7 and a capacitor C3. The negative input of the operational amplifier A1 is also coupled to its output through a resistor R8. The output of the operational amplifier A1 is coupled to the positive input of the second operational amplifier A2.

The negative input of the second operational amplifier A2 is coupled to a resistor R9. The other terminal of the resistor R9 is coupled to the sliding terminal of a potentiometer R10. A first fixed terminal of the potentiometer R10 is coupled to ground and the second fixed terminal of the potentiometer is coupled to the negative supply voltage -V2. Accordingly, the appropriate potential can be applied to the negative input of the second operational amplifier A2 through the adjustable voltage divider network of the potentiometer.

The negative input of the second operational amplifier A2 is also coupled to its output through a resistor R11. The output of the second operational amplifier is coupled to an analog to digital converter through a resistor R12. The terminal of the resistor R12 which is not coupled to the second operational amplifier is coupled to the anode of a diode D1. The cathode of the diode D1 is coupled to the circuit ground.

FIG. 5C shows a block diagram of the circuit of the preferred embodiment of the present invention. The transmitter and receiver sections are representational only and it should be understood that the transmitter and receiver are those shown in FIGS. 5A and 5B, respectively. Circuits having similar function and using different components can be designed and still fall within the spirit and scope of the present invention.

The channel select lines A, B and C and the inhibit line for the transmitter and receiver are coupled together and driven by a CPU 106. In the preferred embodiment the CPU is an Apple II. The CPU 106 can also be used to generate the 1 kHz square-wave signal used to drive the selected infrared light emitting diode LED1. The output of the receiver op amp A2 is coupled to an analog to digital converter 108 (A/D converter). The A/D converter 108 forms an eight bit binary representation of the strength of the infrared signal received by the receiver circuit. That signal is supplied to the CPU 106 through pins Pa0 through Pa7. The CPU can control the A/D converter through the conversion control line as shown.

The CPU operates on these eight bits and supplies the appropriate information to the output interface 110 in order to control the display (not shown). The CPU 106 can control the output interface through the handshake line as shown.

The attached Appendix contains a copyrighted computer program. This computer program can be used by an Apple II to control the circuit of the preferred embodiment to interface with a Nintendo video game system.

FIG. 6A shows a circuit schematic of the transmitter of an alternate embodiment of the present invention. A clocking circuit 50 operates to drive an infrared LED 52 which has its negative terminal grounded. The LED 52 is loaded by resistor 54. In this embodiment the clock circuit 50 is an LM555 having eight contact pins P1 through P8. The transmitter circuit also has a power supply Vcc and a circuit ground. Pins P8 and P4 are coupled to Vcc. Pin P1 is coupled to ground. Pin P7 is coupled to pins P8 and P6 through resistors 56 and 57, respectively. Pin P1 is coupled to pins P2 and P5 through capacitors 58 and 59, respectively. Pin P3 is coupled to the load resistor 54 which in turn is coupled to the positive terminal of the LED 52.

FIG. 6B shows a circuit schematic of the receiver of an alternate embodiment of the present invention. A reflected infrared signal 58 impinges on a phototransistor 60. The transistor is coupled to and biased by power supplies Vcc, coupled to the collector, and Vee, coupled to the emitter. In certain circumstances it may be desirable to replace the phototransistor 60 with a photodiode. The phototransistor 60 may be loaded by a resistor 68.

The signal is ac coupled to the amplifier circuit 62 by the capacitor 70. The ac coupling can eliminate dc shift which can occur from ambient light such as sunlight or artificial room light. In this way only the signal resulting from the reflected signal will be amplified.

The signal developed by the phototransistor 60 is amplified in the amplifier circuit 62. The amplifier circuit includes a feedback resistor 72 and an operational amplifier 74 (op amp). The feedback resistor 72 may be a variable resistor. The resistor 72 is coupled between the output and the negative input of the op amp 74. The coupling capacitor 70 is coupled between the collector of the phototransistor 60 and the negative input of the op amp 74.

The signal is then filtered in a high pass filter 64 which eliminates power line noise, hum and other interference. In this embodiment, the filter includes two identical stages. The amplified signal from the amplifier circuit is applied to a filter input capacitor 76. A second filter capacitor 78 is coupled between the filter input capacitor 76 and the positive input of an op amp 80. A feedback resistor 82 is coupled between the output of the op amp 80 and the node coupling the filter input capacitor 76 and the second filter capacitor 78. A biasing resistor 84 is coupled between the positive input of the op amp 80 and ground. The negative input and the output of the op amp 80 are coupled together. A second similar filter may be used, as shown, to further remove unwanted line noise.

In some applications it may be desired to amplify the signal after filtering out the noise. The second amplifier circuit 66 has an input resistor 86 coupled to the negative input of the op amp 88. A feedback resistor 90 is coupled between the output and the negative input of the op amp 88. The feedback resistor 90 may be variable to adjust the gain of the amplifier circuit 66. The positive input of the op amp 88 is coupled to ground.

The amplified signal is then applied to a comparator circuit 92 to determine the strength of the received signal. The output of the comparator 92 may be applied to a computer to be analyzed in controlling the cursor. An analog to digital circuit may be substituted for the comparator. Two or three of these circuits can be used to pinpoint a physical object in two or three dimensional space, respectively, for cursor control representing these dimensions.

In the alternative, the signal received by the phototransistor 60 can be directly applied to an analog to digital converter 94 as shown in FIG. 6C. The output of the analog to digital converter 94 is applied to a processor 96 which can digitally remove extraneous spurious signals and operate on the desired signal as necessary for the particular application. The processor 96 can be any commercially available processor such as a personal computer, video game or microprocessor chip.
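The digital removal of extraneous spurious signals mentioned above could take many forms. As one illustrative assumption, a running-mean baseline subtraction removes the slowly varying ambient-light component from the raw A/D samples, which is the digital counterpart of the ac coupling in FIG. 6B.

```python
from collections import deque

def remove_dc(samples, window=16):
    """Illustrative digital clean-up of raw A/D samples: subtract a
    running mean over the last `window` samples, removing the slowly
    varying baseline (ambient light, dc offset) while keeping the
    pulsed reflection signal.  The window size is a hypothetical
    tuning choice, not a value given in the patent."""
    history = deque(maxlen=window)
    out = []
    for s in samples:
        history.append(s)
        baseline = sum(history) / len(history)  # running mean
        out.append(s - baseline)
    return out
```

A constant input (steady ambient light) is driven to zero, while a sudden reflection pulse survives the subtraction.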

The present invention has been described relative to specific embodiments. The system described herein clearly has a great deal of operational versatility. It can be used to effect not only a threshold (yes/no) type response (control signal), but also a gradient, analogue type response (control signal). Put another way, with the responded-to object producing reflected infrared above a certain level, or above different selected, specific levels, a related response is threshold "triggerable". In addition, responsive activity can "analogue-track" with the real-time, actual, reflected infrared level. Accordingly, response activity can range from a simple, single-mode characteristic to different, more complex, multiple-mode characteristics.

It will thus be clear to persons of ordinary skill in the art that the present invention may be utilized in a variety of applications. Modifications which become apparent to persons skilled in the art after reading this patent are deemed within the scope of the present invention.

Patent Citations
Cited Patent | Filing date | Publication date | Applicant | Title
US 3886361 * | Oct 19, 1973 | May 27, 1975 | Ericsson Telefon Ab L M | Device for contactless positional indication
US 4111421 * | Dec 9, 1976 | Sep 5, 1978 | The Magnavox Company | Optical linked remote control video game apparatus
US 4137651 * | May 19, 1978 | Feb 6, 1979 | The United States Of America As Represented By The Secretary Of The Army | Moving target practice firing simulator
US 4210329 * | Nov 23, 1977 | Jul 1, 1980 | Loewe-Opta GmbH | Videogame with mechanically disjoint target detector
US 4521772 * | Jan 13, 1983 | Jun 4, 1985 | Xerox Corporation | Cursor control device
US 4521870 * | Oct 18, 1984 | Jun 4, 1985 | Ampex Corporation | Audio/video system having touch responsive function display screen
US 4524348 * | Sep 26, 1983 | Jun 18, 1985 | Lefkowitz, Leonard R. | Control interface
US 4545583 * | Dec 23, 1982 | Oct 8, 1985 | Showdown Electronics, Inc. | Electronic gun and target apparatus and method
US 4550250 * | Nov 14, 1983 | Oct 29, 1985 | Hei, Inc. | Cordless digital graphics input device
US 4565999 * | Apr 1, 1983 | Jan 21, 1986 | Prime Computer, Inc. | For a data terminal
US 4695953 * | Apr 14, 1986 | Sep 22, 1987 | Blair, Preston E. | TV animation interactively controlled by the viewer
US 4713545 * | Feb 18, 1986 | Dec 15, 1987 | Besam AB | Device for detecting objects, particularly such in motion
US 4771344 * | Nov 13, 1986 | Sep 13, 1988 | James Fallacaro | System for enhancing audio and/or visual presentation
US 4791416 * | Jul 12, 1985 | Dec 13, 1988 | Zenith Electronics Corporation | Touch control system for controllable apparatus
US 4796019 * | Feb 19, 1987 | Jan 3, 1989 | RCA Licensing Corporation | Remote control system
US 4799687 * | Feb 18, 1987 | Jan 24, 1989 | Davis, Dennis W. | Projected image tag game
US 4910464 * | Jul 7, 1988 | Mar 20, 1990 | Formula Systems Limited | Proximity detector
US 4924216 * | Feb 12, 1988 | May 8, 1990 | Acemore International Ltd. | Joystick controller apparatus
Referenced by
Citing Patent | Filing date | Publication date | Applicant | Title
US5423554 *Sep 24, 1993Jun 13, 1995Metamedia Ventures, Inc.Virtual reality game method and apparatus
US5435557 *Aug 23, 1994Jul 25, 1995Coffey; Timothy M.Video-game screen divider
US5453758 *Jul 29, 1993Sep 26, 1995Sony CorporationInput apparatus
US5521616 *Feb 18, 1994May 28, 1996Capper; David G.Control interface apparatus
US5616078 *Dec 27, 1994Apr 1, 1997Konami Co., Ltd.Motion-controlled video entertainment system
US5623587 *Jun 12, 1995Apr 22, 1997Kideo Productions, Inc.Method and apparatus for producing an electronic image
US5649861 *Aug 25, 1994Jul 22, 1997Sega Enterprises, Ltd.Game device for displaying game input operations on the display
US5650608 *Dec 19, 1994Jul 22, 1997Tv Interactive Data CorporationMethod and apparatus for generating ratiometric control signals
US5686940 *Dec 23, 1994Nov 11, 1997Rohm Co., Ltd.Display apparatus
US5727188 *Jan 19, 1996Mar 10, 1998Hayes; Charles L.Flight-control simulator for computer games
US5764164 *Feb 7, 1997Jun 9, 1998Reality Quest Corp.Ergonomic hand-attachable controller
US5796354 *Feb 7, 1997Aug 18, 1998Reality Quest Corp.Hand-attachable controller with direction sensing
US5818037 *Apr 9, 1996Oct 6, 1998Tv Interactive Data CorporationController using a flexible element to vary light transferred to a photosensitive element
US5847694 *Dec 19, 1994Dec 8, 1998Tv Interactive Data CorporationApparatus for generating a signal indicative of the position of a movable element in the apparatus
US5898421 *May 7, 1996Apr 27, 1999Gyration, Inc.Gyroscopic pointer and method
US5899809 *Jun 5, 1997May 4, 1999Landa Cosio; Nicolas ArriojaDigital body interface apparatus
US5913727 *Jun 13, 1997Jun 22, 1999Ahdoot; NedInteractive movement and contact simulation game
US5973313 *Nov 19, 1996Oct 26, 1999Tv Interactive Data CorporationMethod and apparatus for generating ratiometric control signals
US5974262 *Aug 15, 1997Oct 26, 1999Fuller Research CorporationSystem for generating output based on involuntary and voluntary user input without providing output information to induce user to alter involuntary input
US6079709 *Jun 11, 1999Jun 27, 2000Ethridge; MichaelScreen segment, viewing isolation apparatus
US6130663 *Jul 31, 1997Oct 10, 2000Null; Nathan D.Touchless input method and apparatus
US6183365 *May 23, 1997Feb 6, 2001Casio Computer Co., Ltd.Movement measuring device, electronic game machine including movement measuring device, and method of playing game machine
US6229526 *Dec 18, 1997May 8, 2001International Business Machines CorporationMethod and system for simultaneous operation of multiple handheld IR control devices in a data processing system
US6308565Oct 15, 1998Oct 30, 2001Impulse Technology Ltd.System and method for tracking and assessing movement skills in multidimensional space
US6419580 *Nov 5, 1999Jul 16, 2002Kabushuki Kaisha Sega EnterprisesPlayer object displayed in second configuration even if cursor moves outside prescribed area
US6430997Sep 5, 2000Aug 13, 2002Trazer Technologies, Inc.System and method for tracking and assessing movement skills in multidimensional space
US6524186 *May 28, 1999Feb 25, 2003Sony Computer Entertainment, Inc.Game input means to replicate how object is handled by character
US6585593Sep 28, 1999Jul 1, 2003Sega Enterprises, Ltd.Game device for displaying game input operations on the display
US6592455 *Sep 28, 1999Jul 15, 2003Sega Enterprises, Ltd.Game device for displaying game input operations on the display
US6607443Oct 28, 1998Aug 19, 2003Kabushiki Kaisha Sega EnterprisesGame device
US6638160 *Jun 21, 2001Oct 28, 2003Konami CorporationGame system allowing calibration of timing evaluation of a player operation and storage medium to be used for the same
US6642917 *Nov 13, 2000Nov 4, 2003Namco, Ltd.Sign perception system, game system, and computer-readable recording medium having game program recorded thereon
US6703999 *Nov 13, 2000Mar 9, 2004Toyota Jidosha Kabushiki KaishaSystem for computer user interface
US6765726Jul 17, 2002Jul 20, 2004Impluse Technology Ltd.System and method for tracking and assessing movement skills in multidimensional space
US6876496Jul 9, 2004Apr 5, 2005Impulse Technology Ltd.System and method for tracking and assessing movement skills in multidimensional space
US6955603Jan 29, 2002Oct 18, 2005Jeffway Jr Robert WInteractive gaming device capable of perceiving user movement
US7001272 *Mar 27, 2002Feb 21, 2006Konami CorporationVideo game device, video game method, video game program, and video game system
US7038855Apr 5, 2005May 2, 2006Impulse Technology Ltd.System and method for tracking and assessing movement skills in multidimensional space
US7070500 *Sep 5, 2000Jul 4, 2006Konami CorporationMusical player-motion sensing game system
US7128651Jun 9, 2003Oct 31, 2006Kabushiki Kaisha Sega EnterprisesCard game for displaying images based on sound recognition
US7137861Jul 16, 2003Nov 21, 2006Carr Sandra LInteractive three-dimensional multimedia I/O device for a computer
US7255351Sep 20, 2004Aug 14, 2007Shuffle Master, Inc.Interactive simulated blackjack game with side bet apparatus and in method
US7292151Jul 22, 2005Nov 6, 2007Kevin FergusonHuman movement measurement system
US7309065Sep 14, 2004Dec 18, 2007Shuffle Master, Inc.Interactive simulated baccarat side bet apparatus and method
US7333089 *Jan 6, 1999Feb 19, 2008Matthew Davis GardComputer interface device
US7335105 *Aug 20, 2002Feb 26, 2008Ssd Company LimitedSoccer game apparatus
US7359121May 1, 2006Apr 15, 2008Impulse Technology Ltd.System and method for tracking and assessing movement skills in multidimensional space
US7367563Sep 10, 2004May 6, 2008Shuffle Master, Inc.Interactive simulated stud poker apparatus and method
US7492268Nov 6, 2007Feb 17, 2009Motiva LlcHuman movement measurement system
US7594853 *Jun 12, 2006Sep 29, 2009Canon Kabushiki KaishaControl apparatus and method for games and others
US7632187 *Sep 27, 2004Dec 15, 2009Hasbro, Inc.Device and method for an electronic tag game
US7661676Jan 26, 2004Feb 16, 2010Shuffle Master, IncorporatedCard shuffler with reading capability integrated into multiplayer automated gaming table
US7791808Apr 10, 2008Sep 7, 2010Impulse Technology Ltd.System and method for tracking and assessing movement skills in multidimensional space
US7815507 *Jun 18, 2004Oct 19, 2010IgtGame machine user interface using a non-contact eye motion recognition device
US7846028May 18, 2006Dec 7, 2010Shoot The Moon Products Ii, LlcLazer tag advanced
US7859551Feb 25, 2002Dec 28, 2010Bulman Richard LObject customization and presentation system
US7864168May 10, 2006Jan 4, 2011Impulse Technology Ltd.Virtual reality movement system
US7874918 *Nov 3, 2006Jan 25, 2011Mattel Inc.Game unit with motion and orientation sensing controller
US7952483Feb 16, 2009May 31, 2011Motiva LlcHuman movement measurement system
US8159354Apr 28, 2011Apr 17, 2012Motiva LlcHuman movement measurement system
US8213680Mar 19, 2010Jul 3, 2012Microsoft CorporationProxy training data for human body tracking
US8253746May 1, 2009Aug 28, 2012Microsoft CorporationDetermine intended motions
US8264536Aug 25, 2009Sep 11, 2012Microsoft CorporationDepth-sensitive imaging via polarization-state mapping
US8265341Jan 25, 2010Sep 11, 2012Microsoft CorporationVoice-body identity correlation
US8267781Jan 30, 2009Sep 18, 2012Microsoft CorporationVisual target tracking
US8272958Jan 26, 2004Sep 25, 2012Shuffle Master, Inc.Automated multiplayer game table with unique image feed of dealer
US8279418Mar 17, 2010Oct 2, 2012Microsoft CorporationRaster scanning for depth detection
US8284847May 3, 2010Oct 9, 2012Microsoft CorporationDetecting motion for a multifunction sensor device
US8294767Jan 30, 2009Oct 23, 2012Microsoft CorporationBody scan
US8295546Oct 21, 2009Oct 23, 2012Microsoft CorporationPose tracking pipeline
US8296151Jun 18, 2010Oct 23, 2012Microsoft CorporationCompound gesture-speech commands
US8320619Jun 15, 2009Nov 27, 2012Microsoft CorporationSystems and methods for tracking a model
US8320621Dec 21, 2009Nov 27, 2012Microsoft CorporationDepth projector system with integrated VCSEL array
US8325909Jun 25, 2008Dec 4, 2012Microsoft CorporationAcoustic echo suppression
US8325984Jun 9, 2011Dec 4, 2012Microsoft CorporationSystems and methods for tracking a model
US8328613Nov 15, 2010Dec 11, 2012Hasbro, Inc.Game tower
US8330134Sep 14, 2009Dec 11, 2012Microsoft CorporationOptical fault monitoring
US8330822Jun 9, 2010Dec 11, 2012Microsoft CorporationThermally-tuned depth camera light source
US8340432Jun 16, 2009Dec 25, 2012Microsoft CorporationSystems and methods for detecting a tilt angle from a depth image
US8351651Apr 26, 2010Jan 8, 2013Microsoft CorporationHand-location post-process refinement in a tracking system
US8351652Feb 2, 2012Jan 8, 2013Microsoft CorporationSystems and methods for tracking a model
US8363212Apr 2, 2012Jan 29, 2013Microsoft CorporationSystem architecture design for time-of-flight system having reduced differential pixel size, and time-of-flight systems so designed
US8374423Mar 2, 2012Feb 12, 2013Microsoft CorporationMotion detection using depth images
US8379101May 29, 2009Feb 19, 2013Microsoft CorporationEnvironment and/or target segmentation
US8379919Apr 29, 2010Feb 19, 2013Microsoft CorporationMultiple centroid condensation of probability distribution clouds
US8381108Jun 21, 2010Feb 19, 2013Microsoft CorporationNatural user input for driving interactive stories
US8385557Jun 19, 2008Feb 26, 2013Microsoft CorporationMultichannel acoustic echo reduction
US8385596Dec 21, 2010Feb 26, 2013Microsoft CorporationFirst person shooter control with virtual skeleton
US8390680Jul 9, 2009Mar 5, 2013Microsoft CorporationVisual representation expression based on player expression
US8401225Jan 31, 2011Mar 19, 2013Microsoft CorporationMoving object segmentation using depth images
US8401242Jan 31, 2011Mar 19, 2013Microsoft CorporationReal-time camera tracking using depth maps
US8408706Dec 13, 2010Apr 2, 2013Microsoft Corporation3D gaze tracker
US8411948Mar 5, 2010Apr 2, 2013Microsoft CorporationUp-sampling binary images for segmentation
US8416187Jun 22, 2010Apr 9, 2013Microsoft CorporationItem navigation using motion-capture data
US8418085May 29, 2009Apr 9, 2013Microsoft CorporationGesture coach
US8422769Mar 5, 2010Apr 16, 2013Microsoft CorporationImage segmentation using reduced foreground training data
US8427325Mar 23, 2012Apr 23, 2013Motiva LlcHuman movement measurement system
US8428340Sep 21, 2009Apr 23, 2013Microsoft CorporationScreen space plane identification
US8437506Sep 7, 2010May 7, 2013Microsoft CorporationSystem for fast, probabilistic skeletal tracking
US8448056Dec 17, 2010May 21, 2013Microsoft CorporationValidation analysis of human target
US8448094Mar 25, 2009May 21, 2013Microsoft CorporationMapping a natural input device to a legacy system
US8451278Aug 3, 2012May 28, 2013Microsoft CorporationDetermine intended motions
US8452051Dec 18, 2012May 28, 2013Microsoft CorporationHand-location post-process refinement in a tracking system
US8452087Sep 30, 2009May 28, 2013Microsoft CorporationImage selection techniques
US8456419Apr 18, 2008Jun 4, 2013Microsoft CorporationDetermining a position of a pointing device
US8457353May 18, 2010Jun 4, 2013Microsoft CorporationGestures and gesture modifiers for manipulating a user-interface
US8460103Jul 6, 2007Jun 11, 2013IgtGesture controlled casino gaming system
US8467574Oct 28, 2010Jun 18, 2013Microsoft CorporationBody scan
US8475252May 30, 2007Jul 2, 2013Shfl Entertainment, Inc.Multi-player games with individual player decks
US8483436Nov 4, 2011Jul 9, 2013Microsoft CorporationSystems and methods for tracking a model
US8485903 *Dec 10, 2010Jul 16, 2013Kico Sound LlcElectronic gaming device with feedback
US8487871Jun 1, 2009Jul 16, 2013Microsoft CorporationVirtual desktop coordinate transformation
US8487938Feb 23, 2009Jul 16, 2013Microsoft CorporationStandard Gestures
US8488888Dec 28, 2010Jul 16, 2013Microsoft CorporationClassification of posture states
US8497838Feb 16, 2011Jul 30, 2013Microsoft CorporationPush actuation of interface controls
US8498481May 7, 2010Jul 30, 2013Microsoft CorporationImage segmentation using star-convexity constraints
US8499257Feb 9, 2010Jul 30, 2013Microsoft CorporationHandles interactions for human-computer interface
US8503086Aug 16, 2010Aug 6, 2013Impulse Technology Ltd.System and method for tracking and assessing movement skills in multidimensional space
US8503494Apr 5, 2011Aug 6, 2013Microsoft CorporationThermal management system
US8503766Dec 13, 2012Aug 6, 2013Microsoft CorporationSystems and methods for detecting a tilt angle from a depth image
US8508919Sep 14, 2009Aug 13, 2013Microsoft CorporationSeparation of electrical and optical components
US8509479Jun 16, 2009Aug 13, 2013Microsoft CorporationVirtual object
US8509545Nov 29, 2011Aug 13, 2013Microsoft CorporationForeground subject detection
US8514269Mar 26, 2010Aug 20, 2013Microsoft CorporationDe-aliasing depth images
US8523667Mar 29, 2010Sep 3, 2013Microsoft CorporationParental control settings based on body dimensions
US8526734Jun 1, 2011Sep 3, 2013Microsoft CorporationThree-dimensional background removal for vision system
US8542252May 29, 2009Sep 24, 2013Microsoft CorporationTarget digitization, extraction, and tracking
US8542910Feb 2, 2012Sep 24, 2013Microsoft CorporationHuman tracking system
US8548270Oct 4, 2010Oct 1, 2013Microsoft CorporationTime-of-flight depth imaging
US8553934Dec 8, 2010Oct 8, 2013Microsoft CorporationOrienting the position of a sensor
US8553939Feb 29, 2012Oct 8, 2013Microsoft CorporationPose tracking pipeline
US8558873Jun 16, 2010Oct 15, 2013Microsoft CorporationUse of wavefront coding to create a depth image
US8564534Oct 7, 2009Oct 22, 2013Microsoft CorporationHuman tracking system
US8565476Dec 7, 2009Oct 22, 2013Microsoft CorporationVisual target tracking
US8565477Dec 7, 2009Oct 22, 2013Microsoft CorporationVisual target tracking
US8565485Sep 13, 2012Oct 22, 2013Microsoft CorporationPose tracking pipeline
US8571263Mar 17, 2011Oct 29, 2013Microsoft CorporationPredicting joint positions
US8574050Nov 2, 2006Nov 5, 2013Mattel, Inc.Game unit with dual joystick controllers
US8577084Dec 7, 2009Nov 5, 2013Microsoft CorporationVisual target tracking
US8577085Dec 7, 2009Nov 5, 2013Microsoft CorporationVisual target tracking
US8578302Jun 6, 2011Nov 5, 2013Microsoft CorporationPredictive determination
US8587583Jan 31, 2011Nov 19, 2013Microsoft CorporationThree-dimensional environment reconstruction
US8587773Dec 13, 2012Nov 19, 2013Microsoft CorporationSystem architecture design for time-of-flight system having reduced differential pixel size, and time-of-flight systems so designed
US8588465Dec 7, 2009Nov 19, 2013Microsoft CorporationVisual target tracking
US8588517Jan 15, 2013Nov 19, 2013Microsoft CorporationMotion detection using depth images
US8592739Nov 2, 2010Nov 26, 2013Microsoft CorporationDetection of configuration changes of an optical element in an illumination system
US8597142Sep 13, 2011Dec 3, 2013Microsoft CorporationDynamic camera based practice mode
US8605763Mar 31, 2010Dec 10, 2013Microsoft CorporationTemperature measurement and control for laser and light-emitting diodes
US8610665Apr 26, 2013Dec 17, 2013Microsoft CorporationPose tracking pipeline
US8611607Feb 19, 2013Dec 17, 2013Microsoft CorporationMultiple centroid condensation of probability distribution clouds
US8613666Aug 31, 2010Dec 24, 2013Microsoft CorporationUser selection and navigation based on looped motions
US8618405Dec 9, 2010Dec 31, 2013Microsoft Corp.Free-space gesture musical instrument digital interface (MIDI) controller
US8619122Feb 2, 2010Dec 31, 2013Microsoft CorporationDepth camera compatibility
US8620113Apr 25, 2011Dec 31, 2013Microsoft CorporationLaser diode modes
US8625837Jun 16, 2009Jan 7, 2014Microsoft CorporationProtocol and format for communicating an image from a camera to a computing environment
US8629976Feb 4, 2011Jan 14, 2014Microsoft CorporationMethods and systems for hierarchical de-aliasing time-of-flight (TOF) systems
US8630457Dec 15, 2011Jan 14, 2014Microsoft CorporationProblem states for pose tracking pipeline
US8631355Jan 8, 2010Jan 14, 2014Microsoft CorporationAssigning gesture dictionaries
US8633890Feb 16, 2010Jan 21, 2014Microsoft CorporationGesture detection based on joint skipping
US8635637Dec 2, 2011Jan 21, 2014Microsoft CorporationUser interface presenting an animated avatar performing a media reaction
US8638985Mar 3, 2011Jan 28, 2014Microsoft CorporationHuman body pose estimation
US8644609Mar 19, 2013Feb 4, 2014Microsoft CorporationUp-sampling binary images for segmentation
US8649554May 29, 2009Feb 11, 2014Microsoft CorporationMethod to control perspective for a camera-controlled computer
US8655069Mar 5, 2010Feb 18, 2014Microsoft CorporationUpdating image segmentation following user input
US8659658Feb 9, 2010Feb 25, 2014Microsoft CorporationPhysical interaction zone for gesture-based user interfaces
US8660303Dec 20, 2010Feb 25, 2014Microsoft CorporationDetection of body and props
US8660310Dec 13, 2012Feb 25, 2014Microsoft CorporationSystems and methods for tracking a model
US8667519Nov 12, 2010Mar 4, 2014Microsoft CorporationAutomatic passive and anonymous feedback system
US8668584Sep 14, 2012Mar 11, 2014IgtVirtual input system
US8670029Jun 16, 2010Mar 11, 2014Microsoft CorporationDepth camera illuminator with superluminescent light-emitting diode
US8675981Jun 11, 2010Mar 18, 2014Microsoft CorporationMulti-modal gender recognition including depth data
US8676581Jan 22, 2010Mar 18, 2014Microsoft CorporationSpeech recognition analysis via identification information
US8681255Sep 28, 2010Mar 25, 2014Microsoft CorporationIntegrated low power depth camera and projection device
US8681321Dec 31, 2009Mar 25, 2014Microsoft International Holdings B.V.Gated 3D camera
US8682028Dec 7, 2009Mar 25, 2014Microsoft CorporationVisual target tracking
US8684839Jul 6, 2007Apr 1, 2014IgtControl of wager-based game using gesture recognition
US8687044Feb 2, 2010Apr 1, 2014Microsoft CorporationDepth camera compatibility
US8693724May 28, 2010Apr 8, 2014Microsoft CorporationMethod and system implementing user-centric gesture control
US8702507Sep 20, 2011Apr 22, 2014Microsoft CorporationManual and camera-based avatar control
US8707216Feb 26, 2009Apr 22, 2014Microsoft CorporationControlling objects via gesturing
US8717469Feb 3, 2010May 6, 2014Microsoft CorporationFast gating photosurface
US8723118Oct 1, 2009May 13, 2014Microsoft CorporationImager for constructing color and depth images
US8724887Feb 3, 2011May 13, 2014Microsoft CorporationEnvironmental modifications to mitigate environmental factors
US8724906Nov 18, 2011May 13, 2014Microsoft CorporationComputing pose and/or shape of modifiable entities
US8744121May 29, 2009Jun 3, 2014Microsoft CorporationDevice for identifying and tracking multiple humans over time
US8745541Dec 1, 2003Jun 3, 2014Microsoft CorporationArchitecture for controlling a computer using hand gestures
US8749557Jun 11, 2010Jun 10, 2014Microsoft CorporationInteracting with user interface via avatar
US8751215Jun 4, 2010Jun 10, 2014Microsoft CorporationMachine based sign language interpreter
US8760395May 31, 2011Jun 24, 2014Microsoft CorporationGesture recognition techniques
US8760571Sep 21, 2009Jun 24, 2014Microsoft CorporationAlignment of lens and image sensor
US8762894Feb 10, 2012Jun 24, 2014Microsoft CorporationManaging virtual ports
US8773355Mar 16, 2009Jul 8, 2014Microsoft CorporationAdaptive cursor sizing
US8775916May 17, 2013Jul 8, 2014Microsoft CorporationValidation analysis of human target
US8777748 *Jul 16, 2013Jul 15, 2014Kico Sound LlcElectronic gaming device with feedback
US8781156Sep 10, 2012Jul 15, 2014Microsoft CorporationVoice-body identity correlation
US8782567Nov 4, 2011Jul 15, 2014Microsoft CorporationGesture recognizer system architecture
US8786730Aug 18, 2011Jul 22, 2014Microsoft CorporationImage exposure using exclusion regions
US8787658Mar 19, 2013Jul 22, 2014Microsoft CorporationImage segmentation using reduced foreground training data
US8788973May 23, 2011Jul 22, 2014Microsoft CorporationThree-dimensional gesture controlled avatar configuration interface
US8803800Dec 2, 2011Aug 12, 2014Microsoft CorporationUser interface control based on head orientation
US8803888Jun 2, 2010Aug 12, 2014Microsoft CorporationRecognition system for sharing information
US8803952Dec 20, 2010Aug 12, 2014Microsoft CorporationPlural detector time-of-flight depth mapping
US8811938Dec 16, 2011Aug 19, 2014Microsoft CorporationProviding a user interface experience based on inferred vehicle state
US20090280901 *May 9, 2008Nov 12, 2009Dell Products, LpGame controller device and methods thereof
US20110086709 *Dec 10, 2010Apr 14, 2011Kico Sound LlcElectronic sword game with input and feedback
US20120157198 *Dec 21, 2010Jun 21, 2012Microsoft CorporationDriving simulator control with virtual skeleton
USRE41520Oct 12, 2000Aug 17, 2010Thomson LicensingGyroscopic pointer and method
EP1524015A1 *Jul 9, 2003Apr 20, 2005SSD Company LimitedBoxing game system
WO1995035135A1 *Jun 16, 1995Dec 28, 1995Sports Sciences IncSensing spatial movement
WO1996005766A1 *Aug 22, 1995Feb 29, 1996Assist Advanced Tech LtdA user controlled combination video game and exercise system
WO1996019821A1 *Dec 12, 1995Jun 27, 1996Tv Interactive Data CorpMethod and apparatus for generating ratiometric control signals
Classifications
U.S. Classification463/39, 345/156, 463/8
International ClassificationG06F3/03, G06F3/033, G06K11/06, G06F3/038, G06F3/041, G06F3/048, G06F3/042, G06F3/00, A63F13/06, G06F3/01
Cooperative ClassificationA63F13/06, G06F3/0421, G06F3/011, A63F2300/1031, A63F2300/1012, A63F2300/8029, A63F2300/1062
European ClassificationA63F13/06, G06F3/01B, G06F3/042B
Legal Events
Date | Code | Event | Description
Jul 20, 2005 | FPAY | Fee payment | Year of fee payment: 12
Aug 20, 2001 | FPAY | Fee payment | Year of fee payment: 8
Feb 23, 1998 | SULP | Surcharge for late payment |
Feb 23, 1998 | FPAY | Fee payment | Year of fee payment: 4
Sep 30, 1997 | REMI | Maintenance fee reminder mailed |