|Publication number||US4702475 A|
|Application number||US 06/890,716|
|Publication date||Oct 27, 1987|
|Filing date||Jul 25, 1986|
|Priority date||Aug 16, 1985|
|Also published as||EP0213533A2, EP0213533A3|
|Inventors||Rick A. Elstein, Svein Faret, John J. Gazzo|
|Original Assignee||Innovating Training Products, Inc.|
This patent application is a continuation-in-part application of patent application Ser. No. 766,913, filed Aug. 16, 1985 for Apparatus For Accelerated Reaction Training.
1. Field of the Invention
The present invention relates generally to a Sports Technique And Reaction Training (START) system which is a highly sophisticated training system with programming capabilities designed particularly for improving, progressing, and testing the development pattern of skilled motor functions (engrams) in sports, rehabilitation, and health and fitness. In the field of rehabilitation in particular, the subject invention should prove valuable and have particular utility in providing measured objective evidence of recovery from an injury. This is particularly useful in professional sports in gauging the ability of an injured player to perform under competitive situations, and also has utility in legal situations involving compensation, for example, in cases involving an injured employee or worker.
In the fields of sports, rehabilitation, health and fitness, a person frequently performs particular motor movements to achieve a specific purpose, such as for example the motor movements performed during execution of a backhand stroke in tennis. It is primarily in the sensory and sensory association areas that the athlete experiences the effects of such motor movements and records "memories" of the different patterns of motor movements, which are called sensory engrams of the motor movements. When the athlete wishes to perform a specific act, he presumably calls forth one of these engrams, and then sets the motor system of the brain into action to reproduce the sensory pattern that is engrained in the engram.
Even a highly skilled motor activity can be performed the very first time if it is performed extremely slowly, slowly enough for sensory feedback to guide the movements through each step. However, to be really useful, many skilled motor activities must be performed rapidly. This can be achieved by successive performance of the skilled activity at game speed using the START system of the present invention until finally an engram of the skilled activity is engrained in the motor system as well as in the sensory system. This motor engram causes a precise set of muscles to perform a specific sequence of movements required for the skilled activity.
Most types of inter partes competitive athletic performance involve predetermined patterns of sequenced muscle performance, usually in response to an act of an opponent, and the proficiency level of such performance is usually dependent, at least in large part, upon the reaction time required to initiate a predetermined pattern of sequenced muscle performance in response to an opponent's act and the rapidity with which such predetermined pattern is carried out. A corollary of the foregoing is the physical conditioning of the various muscles and other interrelated body components involved in each such predetermined pattern of muscle performance to minimize, if not substantially avoid, injury in the performance thereof.
2. Discussion of the Prior Art
The following U.S. patents are considered somewhat pertinent to the present invention as disclosing concepts related in some respects to the subject START system. However, none of the cited prior art discloses a system having the versatile attributes of the sports technique and reaction training system as disclosed herein.
Goldfarb et al. U.S. Pat. No. 3,933,354 discloses a martial arts amusement device having a picture, such as a display of a combatant, which is adapted to be struck by a participant, and a series of lights mounted behind the picture, preferably each located at a different key attack or defensive position on the body of the combatant. The display detects when the picture is struck in the vicinity of a light, and is responsive to the detection for illuminating one of the lights and for controlling which light in the series is next illuminated when the picture is hit. In order to demonstrate high performance or win against an opponent, the participant must rapidly extinguish each light in the series by touching or hitting the picture at the illuminated light. The lights are illuminated in a pseudo-random order which the participant cannot anticipate, and therefore his relaxation, coordination, balance and speed are tested much the same as they would be in combat in determining the quality of his performance.
Hurley U.S. Pat. No. 4,027,875 discloses a reaction training device which includes a pair of spaced apart, electrically connected stands, each being provided with electrical switch boxes. Each of the switch boxes is provided with an external plunger, with the plunger being connected to electrical circuitry and acting as a switch. A timer is connected to the electrical circuitry, such that the time required for a person to activate the timer by touching the plunger on one switch box and stop the timer by touching the plunger on the other switch box is recorded.
Groff U.S. Pat. No. 4,493,655 discloses a radio controlled teaching system in which a portable, self-powered, radio-controlled teaching device is provided for each student of a classroom, such that the teacher maintains a high level of student alertness by remaining in radio contact with each and every student during selected periods of the classroom day. A teaching device electronically transmits teacher-selected data to each student which, in turn, requires individual student responses to the data without the necessity of wired connections between the teacher and students. The teaching device is used to instantly and extemporaneously test the students in the class on a selected subject area.
Bigelow et al. U.S. Pat. No. 4,534,557 discloses a reaction time and applied force feedback training system for sports which includes at least one sports training device, and a stimulus indicator located near and associated with the sports training device. The stimulus indicator generates a plurality of ready signals at random time intervals, and a sensor in the sports training device is receptive of a force applied to the sports training device for generating an electrical signal having a magnitude proportional to the magnitude of the applied force. A control unit controls the emanation of the ready signals, and determines and displays the reaction time from emanation of the ready signal to sensing the applied force, along with the magnitude of the applied force.
In summary, none of the aforementioned prior art provides an integrated system for technique and accelerated reaction training having the general applicability and versatility of the subject invention with its many significant attributes as described in greater detail hereinbelow.
Accordingly, it is a primary object of the present invention to provide a training system which will enhance and improve the reflex capabilities of amateur and professional athletes with a unique training program that advances the state of the art in athletic training.
The START system of the present invention trains an individual in actual game situations using the identical movements that are necessary, and at the same speed required by the sport. By training the actual movements necessary for the sport, the specificity of training is tremendously improved in the following areas: quicker reaction to outside stimulus and response with proper technique; aerobic-anaerobic fitness; strength; power; agility; balance and endurance. The specificity of training is very high because the athlete is motivated to perform at maximum levels on each movement by competing against an audible feedback signal at the end of a measured period of time; completing the movement within the measured time period is analogous to a victory over an opponent.
The present invention may be briefly described as an improved method and apparatus for improving predetermined patterns of sequenced muscle performance, and in reducing the reaction time for the initiation thereof. In its broader aspects, the subject method includes the provision of a plurality of individually available external stimuli in the form of a cyclically repetitive sequence of available action signals, each of which requires a particular pattern of sequenced muscle performance in response thereto, in association with what normally appears to the participant to be a random energization of a single stimulus or action signal from the available plurality thereof. However, in some applications of the present invention, such as in physical therapy and rehabilitation, the order of energization of the external stimuli is repetitive and is known to the person undertaking the program. In its narrower aspects, the subject invention includes effecting the apparent random energization of particular stimuli signals by the act or sensed position of the performer and the provision of a performance rating signal indicative of the nature of the participant's time and/or spatial response to the stimulus.
In accordance with a preferred commercial embodiment which has been designed, the subject invention provides a system for technique and accelerated reaction training of a person by a training program in which an array of lights is positioned visibly in front of the person, with each light signifying a different particular movement pattern to be executed by the person in a given amount of time. A control system selectively energizes one light of the array at a time, signifying a particular movement pattern to be executed, in a sequence of lighting unknown to the person undertaking the training program. In this program, the sequence of lighting appears to be random, such that the person waits for an unknown light to be energized, must then react within a measured time period with the particular movement pattern associated with that light, and then waits for the next unknown light to be energized, reacting in the same manner. Moreover, the control system is programmable to enter a different individual time period of response for each different light, and then times each individual time period of response. Additionally, an audible feedback is supplied to the person by an acoustic transducer which is activated by the control system at the end of each individual time period of response to audibly signal, as by a beep, the end thereof, such that the person in the program works to complete the particular movement pattern to be executed prior to hearing the audible signal or beep.
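The control loop just described can be sketched in outline. The names `run_drill`, `respond`, and `beep` are hypothetical, and the light, timer, and acoustic-transducer hardware are stood in for by plain callables; only the sequencing and the pass/fail comparison are captured here.

```python
def run_drill(program, respond, beep):
    """Sketch of the control loop described above (hypothetical API).

    program: list of (light_index, allotted_s) pairs -- the sequence of
             lights and the individual response period for each.
    respond: callable(light_index) -> measured reaction time in seconds
             (stands in for energizing the light and timing the trainee).
    beep:    callable() fired at the end of each response period -- the
             audible signal the trainee races against.
    Returns a log of (light_index, actual_s, within_period) records.
    """
    log = []
    for light, allotted in program:
        actual = respond(light)   # trainee executes the movement pattern
        beep()                    # end-of-period audible feedback
        log.append((light, actual, actual <= allotted))
    return log
```

In a real unit `respond` would drive the lamp and wait on the hardware timer; the sketch only models the bookkeeping.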
In a preferred embodiment, the array of lights comprises an array of six lights arranged in top and bottom horizontal rows of three lights, with the top and bottom rows being aligned vertically with respect to each other. The array of lights can represent movements through 360°: forward, lateral, and backward movements as they pertain to upper and lower body movements. Moreover, the START system is preferably constructed and provided in a portable carrying case, wherein the array of lights is mounted in the top portion of the carrying case, and the control system therefor is located in the bottom portion.
A preferred embodiment of the present invention has been developed wherein the control system is a microprocessor programmed and operated control system. In this embodiment, the microprocessor is coupled to an address bus, a control bus, and a data bus, and each of the array of lights, as well as additional controlled features, is coupled to and controlled by the microprocessor by signals issued on the address bus, the control bus, and the data bus.
The training program is stored in an external memory mounted in a cartridge which is insertable into a port in the bottom portion of the carrying case. The cartridge has stored in memory a sequence of lighting of the particular lights in the array, along with different individual time periods of response for each light, and the pause duration time period between the end of one individual time period of response and the beginning of the next individual time period of response, such that different training programs can be used in the system merely by changing program cartridges. Moreover, each cartridge preferably contains several different training programs stored in memory with different sequences of lights and different individual time periods of response. For instance, a cartridge can have stored in memory at least a beginner training program, an intermediate training program, and an advanced training program.
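One plausible in-memory layout for such a cartridge is shown below. The patent specifies the stored fields (light sequence, per-light response periods, pause duration) but not their encoding; the program names and numeric values here are purely illustrative.

```python
# Hypothetical cartridge layout: each named program stores the light
# sequence, a per-light response period, and the pause between the end of
# one response period and the next cue.
CARTRIDGE = {
    "beginner": {
        "sequence": [0, 3, 1, 4, 2, 5],
        "response_s": {0: 2.0, 1: 2.0, 2: 2.2, 3: 2.2, 4: 2.5, 5: 2.5},
        "pause_s": 2.0,
    },
    "advanced": {
        "sequence": [5, 0, 4, 1, 3, 2, 5, 1],
        "response_s": {0: 1.2, 1: 1.2, 2: 1.4, 3: 1.4, 4: 1.6, 5: 1.6},
        "pause_s": 1.0,
    },
}

def drill_steps(cartridge, name):
    """Expand a stored program into (light, response_s, pause_s) steps."""
    prog = cartridge[name]
    return [(light, prog["response_s"][light], prog["pause_s"])
            for light in prog["sequence"]]
```

Swapping cartridges then amounts to nothing more than replacing the `CARTRIDGE` table the control loop reads from.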
Advantageously, a cartridge can be programmed with a weakness drill program wherein at least one particular light in the array of lights is energized more frequently than other lights, with that particular light signifying a weakness movement pattern to be executed by the person, such that the program works on strengthening a particular weakness movement pattern. The system is also preferably programmed to provide a warm-up program which is run prior to the training program and a cool-down program which is run after the training program.
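A weakness drill can be generated by biasing the cue selection toward the weak light. The patent only says the weak light is energized more frequently; the integer-weight scheme below is an assumption, as are the function and parameter names.

```python
import random

def weakness_sequence(num_lights, weak_light, weight, length, rng):
    """Build a cue sequence in which weak_light is `weight` times as
    likely to be chosen as any other light (weighting scheme assumed)."""
    pool = [i for i in range(num_lights)
            for _ in range(weight if i == weak_light else 1)]
    return [rng.choice(pool) for _ in range(length)]
```

With `weight=5` on a six-light array, the weak light accounts for half the cues on average while the drill still appears unpredictable to the trainee.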
Moreover, in a preferred embodiment the microprocessor operated control system is programmable by a keypad entry array of keys in the bottom portion of the carrying case, which includes a keypad entry display for displaying the entries being made into the system. In this system, the individual time periods of response for each light stored in memory are changeable and reprogrammable by operation of the keypad entry array, particularly to suit the development and training of the person undertaking the training program. Advantageously, a percentage faster key is provided on the keypad entry array to actuate a routine to change the time periods of response in the program to make them a given percentage of time faster, and a percentage slower key is also provided to actuate a routine to change the time periods of response in the program to make them a given percentage of time slower.
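The percentage-faster and percentage-slower keys amount to scaling every stored response period by a uniform factor. A minimal sketch, assuming the percentage is applied uniformly to all lights (the function name is hypothetical):

```python
def scale_response_times(response_s, percent, faster=True):
    """Mimic the percentage-faster / percentage-slower keys: shrink or
    stretch every stored per-light response period by `percent`."""
    factor = (100 - percent) / 100 if faster else (100 + percent) / 100
    return {light: round(t * factor, 3) for light, t in response_s.items()}
```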
In a preferred embodiment, at least one transducer is coupled to the control system which is activated by the person at the end of the particular movement pattern being executed, and the control system measures the actual period of time taken by the person to activate the transducer, and stores each measured time period of actual response in memory. Moreover, preferably a separate pressure touch pad transducer is provided for each light to be energized in the training program, and the control system measures the actual period of time taken by the person to touch each pressure pad, and stores each measured time period of actual response in memory.
One advantageous feature of the present invention is the ability to obtain a print out from the computer memory of the performance of the person in the program. The print out can include the individual measured response times, averages thereof, plotted curves thereof, and additional displays of the response data stored in memory.
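A printout of averages and extremes over the stored measured response times could be produced along the following lines; the report format is illustrative, not taken from the patent.

```python
def performance_printout(measured):
    """measured: {light_index: [actual response times in seconds]}.
    Returns printable summary lines over the stored response data."""
    lines = []
    for light in sorted(measured):
        times = measured[light]
        avg = sum(times) / len(times)
        lines.append(f"light {light}: n={len(times)}  avg={avg:.3f}s  "
                     f"best={min(times):.3f}s  worst={max(times):.3f}s")
    return lines
```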
A preferred embodiment of the subject invention also incorporates therein voice synthesizer circuits for instructing the person on correct operation of the system, and also during the training program.
The present invention also provides a training mat which has been developed particularly for use in conjunction with the START system, particularly for rehabilitation programs and in the measurement of timed responses. The training mat has on the upper surface thereof marked areas of position and marked areas of response. The training mat is generally rectangular in shape, and the marked areas of response are arranged in a pattern around the periphery thereof, with the marked areas of position being marked integrally with the marked areas of response. In this design, the pressure touch pads can be positioned at different marked areas of response on the mat or constructed integrally therein, such that a person orients himself with respect to a marked area of position, and then reacts to input stimulus signals to execute particular movement patterns, at the end of which the person touches a marked area of response on the training mat. Moreover, in a preferred embodiment the training mat preferably has a generally square shape, and the marked areas of response include a plurality of contiguous square areas positioned around the periphery thereof. Each side of the training mat is preferably between four and ten feet in length, most preferably six feet, and includes six square areas of response arranged contiguously along the length thereof. A central square area is thereby delineated on the central area of the training mat inside the square marked areas of response, and is adapted to receive one of several different central mat sections to be selectively placed centrally on the training mat.
Among the advantages of the subject invention is the provision of an improved method for accelerated reaction training to improve predetermined patterns of sequenced muscle performance and the reaction times therefor that can be utilized in diverse environments within the broad field of physical bionics, such as, for example, in basic aerobic and anaerobic training exercises, and in the obtaining of enhanced reaction time performances, and also in specific athletic training for enhancement of performance in sports such as tennis, football, basketball, hockey, baseball and the like.
Another advantage of the subject invention is the enhancement of performance and results obtainable in a physical therapy program designed particularly for athletes desirous of returning to competitive activity following an injury or other physical disablement, as well as for enhanced general physical conditioning. Still other advantages of the practice of the subject invention are the development of improved cardio-vascular fitness, improved reaction times, improved balance, agility and speed, as well as an enhanced resistance to injury in the performance of athletic functions, and enhanced recovery from injury resulting from athletic or related physical endeavors.
The foregoing objects and advantages of the present invention for a sports technique and training system may be more readily understood by one skilled in the art, with reference being had to the following detailed description of several preferred embodiments thereof, taken in conjunction with the accompanying drawings wherein like elements are designated by identical reference numerals throughout the several views, and in which:
FIG. 1 is a schematic perspective view illustrating the employment of the methods of the subject invention in the training of tennis players;
FIG. 2 is a schematic circuit diagram for the stimuli battery depicted in FIG. 1;
FIG. 3 is an elevational view of a stimuli battery for providing a visual indication of a desired type of movement by a subject;
FIG. 4 is a schematic perspective view illustrating the employment of the programs of the present invention in the training of more advanced tennis players;
FIG. 5 is a side elevational view of a photosensor assembly;
FIG. 6 is a side elevational view of a light source for use with the photosensors of FIG. 5;
FIG. 7 is a schematic circuit diagram for a stimuli battery of the type illustrated in FIG. 3;
FIGS. 8 and 9 illustrate a preferred commercial embodiment of the present invention designed as a portable unit the size of a small carrying case, with FIG. 8 illustrating a display panel of six high intensity lamps mounted on the inside of the top portion of the portable case, and FIG. 9 illustrating the control keypad and control display panel mounted on the inside of the bottom portion of the portable case;
FIG. 10 is a plan view of a preferred embodiment of an exercise mat developed for use in association with the START system;
FIG. 11 is a block diagram of the major components of a preferred embodiment of a microprocessor controlled START system;
FIGS. 12 through 35 are logic flow diagrams illustrating the primary logic flow steps of the program for the microprocessor, in which:
FIGS. 12 through 16 illustrate the programming steps involved in the initialization of the unit after it is initially turned on;
FIG. 17 illustrates the programming sequence of the main operational running loop which allows an operator to select a drill and set up the parameters governing the operation thereof, and the middle of FIG. 17 refers to the four state routines of the system, the three more complicated of which are illustrated in FIGS. 25 through 27, and the right side of FIG. 17 refers to thirty-one different routines, the more complicated of which are illustrated in FIGS. 28 through 35;
FIG. 18 illustrates handling of the interrupt and background routines which are performed every 0.01 seconds;
FIGS. 19 through 24 illustrate the interrelated logic flow diagrams of the interrupt and background routines performed every 0.01 seconds; in which
FIG. 19 illustrates the logic flow diagram of the input and output subroutine which keeps track of all inputs and outputs of the system;
FIGS. 20 and 21 are logic flow diagrams of the timing functions and counters of the processor;
FIG. 22 is a logic flow diagram of the LED display drive and keyboard matrix scanner operations;
FIGS. 23 and 24 illustrate the logic flow diagrams of the key detection and debouncing routines;
FIGS. 25 through 27 illustrate the logic flow diagrams of the three state routines of the system, including the numeric display routine of FIG. 25, the modify display routine of FIG. 26, and the drill running routine of FIG. 27, which state routines are illustrated in the central portion of the main operational loop of FIG. 17; and
FIGS. 28 through 35 illustrate the logic flow diagrams of the more complicated of the thirty-one routines shown on the right portion of the main operational loop of FIG. 17, including the start routine of FIG. 28, the program routine of FIG. 29, the beginner routine of FIG. 30, the number of routine of FIG. 31, the modify routine of FIG. 32, the duration routine of FIG. 33, the cancel warm-up routine of FIG. 34, and the enter routine of FIG. 35.
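The key-detection and debouncing routines of FIGS. 23 and 24 run on the 0.01-second interrupt tick described above. A common realization of debouncing, sketched here under the assumption that a key must read closed on several consecutive ticks before it registers (the threshold value is an assumption):

```python
class Debouncer:
    """One key's debounce state, sampled on each 0.01 s interrupt tick.
    The key registers as pressed only after `ticks` consecutive closed
    samples, and releases as soon as an open sample is seen."""
    def __init__(self, ticks=3):
        self.ticks = ticks
        self.count = 0
        self.pressed = False

    def sample(self, raw_closed):
        """Feed one raw switch reading; return the debounced state."""
        self.count = self.count + 1 if raw_closed else 0
        if self.count >= self.ticks:
            self.pressed = True
        elif self.count == 0:
            self.pressed = False
        return self.pressed
```

With a 0.01 s tick, a threshold of 3 means contact chatter shorter than about 30 ms never reaches the keypad-entry logic.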
Most competitive athletic performances against an opponent, such as for example in tennis, football, soccer, basketball, hockey and baseball, involve a specific repertoire of a relatively few basic patterns of movement, the rapidity of initiation and performance of which are significant factors in an athlete's competitive effectiveness. Each such pattern of movement normally involves a predetermined pattern of sequenced muscle performance to attain the desired result. For example, it has been observed that successful tennis players have developed a specific repertoire of movement patterns, each comprised of a few basic and very rapid movements and shots which place the player and the ball precisely where they can be most competitively effective. It has been observed further that the basic movement patterns are remarkably similar among the top successful tennis players. Similar movement patterns are also ascertainable for particular participants in other competitive sports endeavors. Instances where pronounced patterns of movement are readily ascertainable include football players, particularly defensive backs; goalies and defensemen in hockey; basketball players; and baseball players, where good fielders have always been recognized as those who "get a good jump on the ball".
The methods hereinafter described are generally directed to accelerated reaction training, and in particular to the training of athletes to adapt and become increasingly proficient in such basic movement patterns through the utilization of randomly generated stimuli signals coupled with movement pattern responsive indicia to provide immediate positive or negative reinforcement for properly or improperly executed movements or patterns thereof.
FIG. 1 is illustrative of the practice of the present invention in enhancing the performance of an athlete in a basic side to side movement pattern such as is commonly employed in tennis. Such side to side movement involves a predetermined pattern of sequenced muscle performance. In order to enhance both a player's reaction time and the rapidity of performance, there is provided a stimuli battery, generally designated 10, positioned on the court center line and in view of the player 12. The stimuli battery 10 contains three lamps 14, 16 and 18 mounted in horizontal array on a support 20. As shown in FIG. 2, the lamps 14, 16 and 18 are adapted to be sequentially and repetitively individually energized by a continuously operating cyclic switch 22 included in the energizing circuits therefor. However, such lamps will remain in an unlit condition due to the presence of a normally open and remotely operable switch 24 in the power circuit.
In the practice of the present invention, an athlete 30 positions himself on the baseline 32 in generally straddle relationship with the center line 34. In a simple version thereof, the athlete 30 may initiate the drill by manual operation of a trigger transmitter of the type conventionally employed to trigger garage door opening devices. A receiver element 40 is associated with the switch 24 and, upon receipt of a signal from the trigger transmitter, operates to close the switch 24. Upon such remotely initiated closure of the switch 24, the power circuit is completed and the particular lamp whose energizing circuit is then closed or is the next to be closed by the operation of the cyclically operable switch 22 will light. As will now be apparent, however, activation by the trigger transmitter by the player 30 will result in a purely random selection of one particular lamp to be lit, thus precluding conscious or subconscious anticipation of a movement direction by the player.
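The randomness here comes entirely from sampling a fast free-running cycle at a human-scale instant: whichever lamp circuit the cyclic switch has reached when the trigger closes is the one lit. A sketch of that selection, where the function name, `cycle_hz`, and the time-since-power-up argument are all assumptions:

```python
def selected_lamp(cycle_hz, num_lamps, trigger_time_s):
    """Which lamp lights when the remote switch closes: the cyclic switch
    has advanced cycle_hz * trigger_time_s steps since power-up, so the
    lamp is the current step modulo the number of lamps.  Because the
    stepping is fast relative to human timing, the selection is
    effectively random to the athlete."""
    steps = int(cycle_hz * trigger_time_s)
    return steps % num_lamps
```

For the three-lamp battery of FIG. 1, even millisecond-level variation in when the trigger is pressed moves the result across all three lamps.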
In the above described example, the athlete 30 initiates the drill by activation of the transmitter trigger. The stimuli battery 10 responds immediately to the trigger signal by illuminating a randomly selected one of the plurality of lights 14, 16 or 18. The outermost lights, for example 14 and 18, correspond to different movement pattern directions, for example, movement pattern to the left and movement pattern to the right. There is preplaced in each such direction a mark 42 and 44 upon a ground surface located a finite distance from the centerline starting position 34. When, for example, light 18 illuminates, the athlete 30 moves through a predetermined pattern of movement to mark 44 and upon there arriving, immediately reverses direction and returns to the starting position. If desired, the lamp energizing circuits may be designed to maintain lamp illumination for a predetermined but selectable period of time within which the particular movement pattern should be completed.
As will now be apparent, use of the transmitter trigger by the athlete 30, although providing for random light selection, permits the athlete to train at his own pace. On the other hand, the transmitter trigger could also be held by an instructor, who can then control the pace of the drill as well as observe, and correct where necessary, the movement patterns being employed by the player during the drill. Repetitive drills in accord with the foregoing will improve both the athlete's reaction time and rapidity of performance by the particular movement pattern through enhanced sequenced muscle performance and, in addition, will function to condition the muscles involved therein.
If desired, the transmitter trigger may be dispensed with and the stimuli battery 10 actuated by a photosensor unit 46. Such photosensor unit 46 may be placed behind the baseline 32 coaxially with the centerline 34. In this instance, the athlete 30 initiates the drill by physical interposition in the path of the photocell sensor beam. Operation is as described hereinabove except that the system automatically recycles each time the athlete 30 returns to the base line starting position.
Referring now to FIG. 4, there is illustratively provided a preferred multipurpose stimuli battery, generally designated 110, in the form of a plurality of lamps 112, 114, 116, 118, 120 and 122 mounted in a generally rectangular array on a support structure 124 above a base 126. Included within the base 126 is a power supply 128 connectable to any convenient source of electricity, not shown, through a line plug 130. Also included within the base 126 is a normally open and remotely operable switch 132 disposed intermediate the power supply 128 and a continuously operating cyclic switch 134 which sequentially completes individual energizing circuits for the lamps 112, 114, 116, 118, 120 and 122. In the operation of the described unit, the continuously operating cyclic switch 134 selectively and sequentially completes the energizing circuits for the lamps. However, such lamps will remain in an unlit condition due to the presence of the normally open and remotely operable switch 132. Activation of the switch 132 may be effected, for example, by a manually operable trigger transmitter 136, such as a transmitter of the type conventionally employed to trigger garage door opening devices or by a photocell response or the like. Upon such remotely initiated operation of the switch 132, a power circuit is completed between the power supply 128 and the particular lamp whose energizing circuit is either then closed or is the next to be closed by the operation of the cyclically operable switch 134. As will be apparent, activation of the trigger transmitter 136 results in a purely random selection of one particular lamp to be lit, dependent upon the status of the cyclic switch 134 at the time of transmitter activation.
As will now be apparent, the stimuli battery illustrated in FIG. 4 can provide a plurality of randomly selected action signals. For example, and assuming the user is facing the battery 110, ignition of lamp 116 can initiate a predetermined movement pattern to the right as indicated by the arrow 116a, FIG. 3. Similarly, selective ignition of lamps 118 and 122 can be employed to initiate diagonal movement patterns, while selective ignition of lamps 114 and 120 can be employed to initiate backward and forward movement patterns respectively. As will now also be apparent, elevation or jumping patterns could also be initiated by single or combinational lamp energization.
FIG. 4 illustrates another and more complicated tennis drill employing the stimuli battery shown in FIG. 3 and described above. In this drill, the stimuli battery means 110 comprises the previously described six lights 112, 114, 116, 118, 120 and 122, again placed within view of the athlete on the far side of the court. Stimuli battery means 110 is here electronically coupled to a plurality of photosensor means 220, 222, 224, 226, and 228, and to an electronic clock 232. The athlete 30 can initiate the drill by serving the ball and moving netward through the zone of focus 229 of a first photosensor means 220, with the zone of focus 229 being proximate to and substantially parallel to the usual location of the tennis court service line 293 along the central segment thereof. The stimuli battery 110 responds to the movement of the athlete through the second zone of focus 234 by selecting and illuminating one light of the available plurality thereof. In this embodiment lamps 118 and 122 would direct movement toward additional focus zones 236 and 238, respectively. Each light corresponds to one of a plurality of additional zones of focus, i.e., light 120 for moving forward, light 114 for moving back, etc. Each of such additional zones of focus 236, 238, and 239 is located in a different direction from each other with respect to the second zone 234. The athlete responds to the stimuli battery 110, for example, the illumination of lamp 118, by moving rapidly towards and through the zone corresponding to the illuminated light, for example 238. When the athlete moves through the zone, for example 238, his motion causes the digital clock to stop and display the time elapsed from his motion through the first zone.
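The elapsed-time measurement in this drill reduces to the interval between two photosensor crossings. A sketch of the clock's computation, with the event-list representation and function name as assumptions:

```python
def elapsed_between_zones(crossings, start_zone, finish_zone):
    """crossings: chronological (timestamp_s, zone_id) photosensor events.
    Returns the interval the electronic clock would display: from the
    first crossing of start_zone to the next crossing of finish_zone."""
    start_t = next(t for t, z in crossings if z == start_zone)
    finish_t = next(t for t, z in crossings
                    if z == finish_zone and t > start_t)
    return finish_t - start_t
```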
FIG. 5 is a side elevation of a photosensor assembly 240 such as is used in the drills described above. It includes a photosensor 241, a support means 242, and a tripod base 244. Photosensor means 241 is a conventional photocell with appropriate means to provide a signal in response to a change in marginal light thereon. Connector 246 electrically connects photosensor means 241 to a remotely located control unit, not shown.
FIG. 6 shows a light source designed to provide illumination for photosensor 241 of FIG. 5 in marginal light conditions. This light source, generally designated 247, comprises a lamp 248, a support 250, a tripod base 252, and a power cord 254 leading to a power source, not shown.
FIG. 7 schematically depicts an electrical control circuit for use with the stimuli battery means 110 of the type shown in FIG. 3. As shown, a signal from a trigger transmitter 136 is received by a receiver 137 and transmitted to a cyclic switch 134. The cyclic switch 134 can be in the form of a cyclic generator providing six discrete output signals at a frequency of approximately 10 kHz. The cyclic switch 134 is connected through lines 140 to individual one shot trigger circuits 142, 144, 146, 148, 150 and 152, each of which is adapted to provide an output signal of predetermined duration when triggered by a signal from the cyclic switch 134. The output signals are utilized to effect ignition of the lamps 112, 114, 116, 118, 120 and 122, respectively. Each of the one shot trigger circuits includes means, such as the illustrated adjustable resistor, to provide for user control of the time duration of the output signals from the one shot triggers, and hence the duration of lamp ignition. The termination of the output signal from the one shot trigger circuits is utilized to activate an audio signal, indicating that the period during which a predetermined movement pattern should have been completed has expired. Desirably the circuit also includes means such as logic circuit 156 to provide for user controlled disablement of particular lamps in accord with the nature of the movement patterns being utilized for training.
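The cyclic switch and one-shot trigger arrangement described above can be sketched in software as a minimal functional analogue; the durations, names, and the use of a random choice as a stand-in for the cyclic selection are assumptions for the sketch, not a description of the actual circuit:

```python
import random

LAMPS = [112, 114, 116, 118, 120, 122]

class OneShot:
    """Software analogue of one one-shot trigger circuit: ignites its lamp
    for an adjustable duration, then reports expiry (the audio cue)."""
    def __init__(self, lamp, duration):
        self.lamp = lamp
        self.duration = duration  # adjustable, like the variable resistor
        self.off_at = None

    def fire(self, now):
        self.off_at = now + self.duration
        return self.lamp

    def expired(self, now):
        # True once the movement window has closed: sound the tone.
        return self.off_at is not None and now >= self.off_at

def cyclic_select(one_shots, disabled=(), rng=random.Random()):
    """Stand-in for the cyclic switch plus logic circuit 156: pick one
    one-shot among those whose lamps have not been disabled."""
    enabled = [s for s in one_shots if s.lamp not in disabled]
    return rng.choice(enabled)
```

Disabling lamps through the `disabled` argument mirrors the user-controlled disablement provided by logic circuit 156.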
A preferred commercial embodiment of the present invention has been designed to have general applicability to many training programs in different sports, or in rehabilitation and general health and fitness. The preferred embodiment is designed as a portable unit which unfolds, similar to a traveling case, into an upper section 300, FIG. 8, having a top display panel, which may or may not be separable from the bottom section 302, FIG. 9, of the unit with appropriate electrical connections thereto. The unit is microprocessor controlled and programmable, as described in greater detail hereinbelow. The top display panel provides a loudspeaker 306 and an array of six (6) high intensity lamps 304 that are strobed on/off in a pre-programmed sequence as dictated by the program number indicated in the documentation and selected via a numeric data entry keypad. The time that each lamp is illuminated, as well as the pause time between lamp strobes, is also a pre-programmed parameter set for the selected program number, but these parameters can be changed and reprogrammed as described in greater detail hereinbelow.
The control system, which is microprocessor controlled and programmable, is mounted in the bottom section 302, FIG. 9, along with a control and programming keypad 308 of control keys, three (alternative embodiments might incorporate four or more) LED seven segment digit displays 310, an external ROM (XROM) memory cartridge port 312, a microprocessor expansion port 314, a volume control 316, an external speaker (horn) switch 318, a remote advance unit and pocket therefor 320, a battery charger unit and pocket therefor 322, an XROM cartridge storage pocket 324 wherein several XROM program cartridges can be stored, and a screwdriver 326 for assistance in servicing the unit, such as in changing fuses or bulbs.
The keypad 308 allows the user to vary the on/off times as well as the pause times in any selected program drill for any individual or multiple numbers of lamps by simply entering the desired times. This feature allows the user to custom tailor each pre-programmed training drill to the individual talents/progress of the person in training.
The design of the unit accommodates the development environment as well as the end user environment. The development environment is enhanced by allowing the system training program developers to set the various sequences of drills as well as default timing periods that are used to generate the final programs that are contained in response training drill cartridges. The user environment allows the selection of these program sequences via the keypad, and allows for selective alteration and reprogramming of the default lamp/pause timing periods by the user.
The base system is equipped with the basic response training programs in an external ROM (XROM) memory cartridge plugged into port 312, and is also designed with an expansion port 314 that allows the user to plug in subsequently developed program and/or feature enhancements as offered by the manufacturer. These subsequent programs and/or feature enhancements will be available in cartridge type devices that will simply plug into the expansion port 314.
Some of the programs and/or feature enhancements that can be made available through the expansion port include the following:
1. Drill sequence cartridges-cartridges that contain pre-programmed drill sequences specifically designed for a particular sport, function within a sport, weakness correction, rehabilitation exercise, etc. For example, individual cartridges may be offered that provide specific movements to improve a weakness in a particular type of commonly required movement for a sport, such as a deep baseline backhand in tennis, etc.
2. Timing measurement and plotting-a slave microprocessor controlled device may be added via the plugin expansion port. Pressure sensitive mats, photoelectric beams, motion detection sensors, etc., measure the actual time that an athlete takes to perform the required movement. These reaction times are stored for subsequent retrieval, computer analysis, charting, etc. to enhance and/or revise a training program based upon the available performance analysis.
3. Voice enhanced coaching-voice synthesis, in addition to the basic voice synthesis that is part of the base system, can be added via the expansion port to provide prompting, tutoring, coaching, etc. to the user during the execution of the drill sequences. For example, if a common mistake during the performance of a particular movement is the incomplete turning of the hips to properly prepare for a tennis backhand, the START system could remind the user (much the same way as a personal coach would) to perform the movement using the correct technique. This feature would be implemented via the voice synthesis module, under program control.
The manufacturer developed sequences, as well as the applications software, are stored in volatile memory, allowing them to be over-written in the operation of the microprocessor.
All user interaction with the system is by the keypad/display module illustrated in detail in FIG. 9. The elements of the unit, which are primarily elements of this module, and their major functions are as follows.
1. Numeric display 310-this is a three or four digit display that indicates the numeric entries as entered by the control keys on the keypad, including:
(a) The selected preprogrammed drill sequence number (00-99) that is presently being run by the unit.
(b) The drill duration time, which includes the warm-up, exercise, and cool-down times.
(c) The timing associated with the lamp strobe-on time, or the lamp strobe-off (pause) time. The pause time is a global parameter that is valid for all pauses, and is not individually selectable per lamp.
2. START/STOP-This key alternately initiates and terminates the automatic pre-programmed or user modified drill sequence.
3. LAMP-This key allows the user to select the lamp or lamps whose strobe time is to be modified via the TIMER key and the numeric data entry keys, or via the 5% faster/5% slower keys; the lamp(s) selected for timing modification are indicated by the numeric display.
4. PROG (program)-This key allows the user to select the pre-programmed sequence in the XROM that is to be entered via the numeric data entry keys. Each XROM cartridge contains approximately thirty separate sequence drills in memory.
5. PAUSE-This key allows the user to set the global pause time (the off time of each lamp in a sequence).
6. TIMER-This key, when used in the proper sequence with the lamp select (LAMP) key, allows the user to alter the on (strobe) time of the lamp(s) selected for modification; when used with the DUR key, it allows the selection of duration time; and when used with the PAUSE key, it allows selection of the global pause time. The times are entered via the numeric data entry keypad. The least significant digit provides resolution to 1/100th of a second.
8. ENTER-This key is used subsequent to any numeric entry to confirm the entry into the microprocessor.
9. CLEAR-This key is used to erase any numeric data entry (prior to entry) and/or to edit an erroneous selection.
10. Lamp Field-The lamp array provides six (6) high intensity lamps 304 that will blink as indicated by the program drill selected for training.
11. Audio Output-The volume control 316 controls an internally located speech/sound synthesis system including an amplifier, a speaker 306, a speech synthesis processor, and speech/sound PROM containing digitally encoded speech/sound data, with the circuit chips being connected together in a standard fashion as is well known and developed in the voice synthesizer arts to provide the following functions.
(a) Generation of a tone in synchronism with the off (pause) time of each sequenced lamp, thereby providing the user with instant audible feedback to determine if the particular movement was performed within the program-allotted time. It has been observed that an additional benefit of the tone feedback is the stimulation of game situation reactions. The user, responding to positive feedback and reinforcement, is challenged by the system in much the same way as in an actual game situation.
(b) Speech synthesized prompting of the user to indicate, for example:
(1) System status, diagnostic failures;
(2) Operator error in selecting or entering the parameters for setting up or running a drill sequence;
(3) Next expected key entry;
(4) Notification of the start or completion time of various program segments that comprise a complete drill.
12. 5%F. (5% faster)-This key causes either all of the lamps in a sequence, the selected lamp(s), or the pause timer to run at a five (5) percent faster rate. Multiple operations of this key will increment the timing reduction by 5% for each key operation.
13. 5%S (5% slower)-The same as above (#12) except that the sequence will run slower.
14. DUR (duration)-This key allows the user to specify the time duration of the particular training program drill selected by the user.
15. MOD (modify)-This key is used in conjunction with several other keys to alert the system that the user wishes to modify certain parameters of the training program.
16. FO (BEG) (beginner)-This is a function key which initially sets the selected training program from the XROM memory to the beginner level.
17. F1 (INT) (intermediate)-This is a function key which initially sets the selected training program to the intermediate level.
18. F2 (ADV) (advanced)-This is a function key which initially sets the selected training program to the advanced level.
19. All LAMPS-This key allows the user to specify all lamps for timing modification, as opposed to individual lamps via the LAMP key.
20. CANCEL WARM UP-This key allows the user to cancel the warm up period for timing modification/entry.
21. POWER ON-This switch applies power to the circuitry of the unit, after which the processor then maintains control over power to the system.
22. POWER OFF-This switch terminates power to the unit, and is a separate switch because of the processor control over the power.
23. REMOTE-This switch allows the user to step the selected program via the wireless remote advance coaches module or a wire connected foot switch.
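By way of illustration, the timing modifications made through the keypad (LAMP/TIMER numeric entry and the 5%F key) can be modelled as edits to a set of timer registers. The register layout and default values below are assumptions for the sketch, not values specified by the system:

```python
class DrillTimers:
    """Sketch of the RAM-based timer registers the keypad edits.
    Times are held in hundredths of a second (the display resolution)."""
    def __init__(self, lamp_on=None, pause=100, duration=30000):
        # six lamps, each with its own strobe-on time; pause is global
        self.lamp_on = lamp_on or {i: 150 for i in range(6)}
        self.pause = pause
        self.duration = duration

    def set_lamp_time(self, lamps, hundredths):
        # LAMP (or ALL LAMPS) + TIMER + numeric entry + ENTER
        for i in lamps:
            self.lamp_on[i] = hundredths

    def faster_5pct(self, lamps):
        # 5%F key: each press shortens the selected times by a further 5%
        for i in lamps:
            self.lamp_on[i] = round(self.lamp_on[i] * 0.95)
```

Repeated presses of `faster_5pct` compound, matching the description of multiple operations of the 5%F key incrementing the timing reduction by 5% per key operation.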
The START system provides the following basic features in an external ROM (XROM) module plugged into port 312:
1. Seven random lamp sequences that can be selected as pre-programmed sequence drill numbers 01-10. The number of lamps used in each sequence will correspond to the sequence number, with the exception of 07; e.g., Seq. #02 will use two lamps that will flash in a random pattern, while the 07 drill number will be an alternate five lamp pattern.
2. Forty-four or more preprogrammed sequences that are selected by entering the numbers via the numeric keypad. The program drills correspond to those designated in the training documentation and will run from 11 to 50.
3. A preprogrammed time period (approx. 15 secs.) that delays the start of any user selected drill until the timer has expired, thereby affording the user the opportunity to position him/herself prior to the start of the drill.
4. A preprogrammed warm-up and cool-down sequence that precedes and follows, respectively, each selected sequence. As noted above, the warm-up period is cancellable by the user. The warm-up and cool-down durations are automatically set by the system in direct relationship to the drill duration (DUR) time set for the particular selected program.
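The relationship between the DUR setting and the automatically derived warm-up/cool-down periods can be sketched as follows; the 10% ratios are assumptions made purely for illustration (the actual ratios reside in the device's data tables):

```python
# Hypothetical ratios; the real values are stored in the firmware's
# data tables of warm-up/cool-down duration ratios.
WARMUP_RATIO, COOLDOWN_RATIO = 0.10, 0.10

def segment_times(dur_seconds, cancel_warm_up=False):
    """Warm-up and cool-down scale in direct relationship to the
    selected DUR time; CANCEL WARM UP zeroes the warm-up segment."""
    warm = 0 if cancel_warm_up else dur_seconds * WARMUP_RATIO
    cool = dur_seconds * COOLDOWN_RATIO
    return warm, dur_seconds, cool
```

A ten-minute drill would thus be bracketed, under these assumed ratios, by one-minute warm-up and cool-down segments unless the warm-up is cancelled.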
FIG. 10 is a plan view of a preferred embodiment of an exercise mat 340 developed for use in association with the START system, particularly for rehabilitation programs and in the measurement of timed responses. The training mat has the upper surface thereof marked with areas of position 342 and areas of response 344. The training mat is generally rectangular in shape, and is preferably square, and the marked areas of response 344 are arranged in a pattern around the periphery thereof, with the marked areas of position 342 being marked integrally therein. In this design, touch pads 345 can be positioned beneath different marked areas of response on the mat, or can be integrally constructed therein, such that a person orients himself with respect to a marked area of position, and then reacts to input stimuli signals to execute particular movement patterns, at the end of which the person touches a marked area of response on the training mat. Moreover, in a preferred embodiment each side of the training mat is preferably between four and ten feet in length, most preferably six feet, and includes a minimum of four, a maximum of sixteen, and in one preferred embodiment six square areas of response 344 arranged contiguously along the length thereof. A central square area 346 is thereby delineated on the central area of the training mat inside the square marked areas of response, and one exemplary central mat section is illustrated in phantom in the drawing.
FIG. 11 is a block diagram of the major components of a preferred embodiment of a microprocessor controlled START system. Referring thereto, the START system includes the following major functional elements: a power supply 350, a microprocessor 352 with address 354, control 356, and data 358 busses, a remote advance and coaches module 360, lamp drivers 362 and lamps 364, speech synthesis chips including a processor chip 366 and a speech PROM chip 368, a keyboard 308 and LED digit displays 310, an external ROM cartridge 370 and an expansion port 372, decoder/latches 374 and bus interfaces 376.
The microprocessor contains both PROM memory that provides the program execution instructions as well as certain data constants, and RAM memory that contains variables, registers, etc. that enable various processing steps and modifications.
The various system devices (lamps, speech processor, keyboard and displays, etc.) are peripherals to the microprocessor, whose selection is controlled by the microprocessor address bus and control bus. Each peripheral has its own unique address, stored as permanent data in the microprocessor memory. The control bus maintains read (RD) and write (WR) functions, which are used by the microprocessor to transfer data from and to a peripheral device, respectively. The data bus 358 is a bidirectional bus which contains, under program control, the data that is read from or written to a selected peripheral device.
To enable a particular function to be energized, the microprocessor determines the address of the device, and configures the address bus, which includes placing the proper address thereon, to perform the device selection. The data that is to be placed on the data bus is provided by the microprocessor for a write function and by a peripheral for a read function. A read or write strobe then causes the data to be accepted by the appropriate device (microprocessor or peripheral). In this manner, a number of bits equal to the data bus size (8) is transferred between the microprocessor and the peripheral.
Some devices require all eight (8) bits of data (e.g. speech synthesis phrase selection), while some require less than eight (8) bits (e.g. lamps require one bit for on/off).
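A toy model of this addressed read/write transfer, with a simple dictionary standing in for the decoder/latch hardware, might look like the following (all names are illustrative, not part of the patented design):

```python
class Bus:
    """Toy model of the address/data/control bus protocol described
    above: the address selects a peripheral, WR writes 8 bits to its
    latch, and RD reads 8 bits back from it."""
    def __init__(self):
        self.peripherals = {}  # address -> current 8-bit latch value

    def attach(self, address):
        self.peripherals[address] = 0

    def write(self, address, data):
        # address bus + data bus + WR strobe
        assert 0 <= data <= 0xFF, "data bus is 8 bits wide"
        self.peripherals[address] = data

    def read(self, address):
        # address bus + RD strobe; the peripheral drives the data bus
        return self.peripherals[address]
```

A one-bit device such as a lamp would simply ignore all but the least significant bit of the written byte.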
The microprocessor, via the stored program control logic as described hereinbelow, determines the functions to be performed, the timing requirements, the processing required, etc.
When the microprocessor program determines that a lamp is to be turned on for a specific period of time, it determines the address of the particular lamp required, configures the address bus 354, places the appropriate data on the data bus 358, and issues a write command. The data is then latched in the decoder latch 374, which turns on the lamp driver 362 and lamp 364. The microprocessor then performs the timing function required to accurately time the lamp on state. When the time expires, the microprocessor re-addresses the lamp, but now configures different data on the data bus, which causes the lamp driver/lamp to enter the opposite, off, state.
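The lamp on/off sequence just described - write "on" data, time the on state, then re-address the lamp with "off" data - can be sketched as follows; the decoder address map is hypothetical:

```python
LAMP_ADDR = {i: 0x40 + i for i in range(6)}  # assumed decoder address map

class LampBank:
    def __init__(self):
        self.latch = {a: 0 for a in LAMP_ADDR.values()}

    def bus_write(self, address, data):
        self.latch[address] = data  # decoder/latch holds the lamp state

def strobe_lamp(bank, lamp, on_time, now, events):
    """Turn a lamp on and record when the 'off' write is due,
    mirroring the timing function the microprocessor performs."""
    addr = LAMP_ADDR[lamp]
    bank.bus_write(addr, 1)                  # write 'on' data, WR strobe
    events.append((now + on_time, addr, 0))  # scheduled 'off' write

def run_due_events(bank, now, events):
    """When a lamp's on-time expires, re-address it with 'off' data."""
    for due, addr, data in [e for e in events if e[0] <= now]:
        bank.bus_write(addr, data)
        events.remove((due, addr, data))
```

The event list stands in for the microprocessor's internal timing of the lamp-on state.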
When the microprocessor program determines that the speech processor is to output a tone, a word, or a phrase, it determines the location in memory of the word(s) required, configures the address bus 354 to select the speech processor, places the word location on the data bus 358, and then issues a write command. The speech processor 366 receives and stores the selected word(s) location, and interacts with the speech memory PROM 368 to provide an analog output that represents the speech data. The PROM 368 contains the Linear Predictive Coded (LPC) speech data as well as the frequency and the amplitude data required for each speech output. The filter and amplifier section of the circuit provides a frequency response over the audio spectrum that produces a quality voice synthesis over the loudspeaker 306 and possibly over a remote speaker (HORN).
In one designed embodiment the speech synthesis technology utilized well known designs incorporating the National Semiconductor MM54104 DIGITALKER speech synthesis processor and Intel Corporation 2764 EPROMs for speech memory storage.
The displays 310 are common cathode seven segment LED displays that are driven by a decoder driver. The decoder driver takes a BCD input, and provides an appropriate output configuration to translate this input to the proper segment drives to display the required character. These outputs apply a high current drive to all necessary segments, and the circuit is completed (and displays lit) by pulling the common cathode to ground.
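The BCD-to-seven-segment translation performed by the decoder driver can be illustrated with a simple lookup table; the gfedcba bit encoding below is the conventional one and is assumed here, not taken from the patent:

```python
# Segment drive patterns for a common-cathode display, bit order gfedcba
# (standard encoding; an assumption for this sketch).
SEG = {0: 0x3F, 1: 0x06, 2: 0x5B, 3: 0x4F, 4: 0x66,
       5: 0x6D, 6: 0x7D, 7: 0x07, 8: 0x7F, 9: 0x6F}

def decode_bcd(bcd):
    """BCD nibble in, segment drive pattern out; the digit lights only
    when its common cathode is additionally pulled to ground."""
    if not 0 <= bcd <= 9:
        raise ValueError("not a BCD digit")
    return SEG[bcd]
```

For example, the digit 8 drives all seven segments, so its pattern has all seven low-order bits set.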
The keyboard is an XY matrix, which allows a particular crosspoint to be made when that position on the matrix is depressed by the operator.
The microprocessor combines the energizing of the displays with the scanning of the keyboard for operator input. The displays and keyboard are constantly scanned by the microprocessor to provide a power saving multiplexing of the displays and a continuing scanning of the keyboard for operator input.
The common cathode of the display is provided with the same address as the X (row) location of the keyboard matrix. Therefore, energizing a display digit also results in energizing the corresponding X (row) of the keyboard.
For any particular scan, the microprocessor determines the address of the display to be energized (which is the same X (row) on the keyboard), and determines the data to be written on that display. The common display decoder driver latch address is determined, the address placed on the address bus 354, and the data to be displayed is placed on the data bus 358. A write (WR) strobe is then issued which causes this data to be written and stored in the latch. To energize the LED displays (complete the circuit), the microprocessor determines which digit display is to be energized, places that address on the address bus, places the data to be written on the data bus, and issues a write strobe. This causes the selected common cathode to be energized and latched, as well as the scan input to the selected X (row) of the keyboard.
To determine if a key has been depressed, the microprocessor reads the column (Y) output of the keyboard via the bus interface and places this on the address bus 354. This is decoded and the column data selected for application to the bidirectional data bus 358. The microprocessor 352 then issues a read (RD) command which causes this data to be stored in a bus memory location. Analysis of this bit pattern allows the microprocessor to determine if a keyboard crosspoint was made, corresponding to an operator selection. This scanning operation is performed at a sufficiently high rate to detect normal keystrokes as well as to provide a multiplexed output that is bright and appears nonflickering to the human eye.
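The combined row-drive/column-read scan can be sketched as follows; the 4x6 matrix size is an assumption made for illustration only:

```python
def scan_keyboard(pressed, rows=4, cols=6):
    """One full scan: drive each row in turn (the same signal that
    energizes a display digit) and read the column return lines;
    report the first crosspoint found.
    `pressed` is a set of (row, col) contacts currently closed."""
    for row in range(rows):
        # energizing this row also energizes the matching display digit
        col_bits = [(row, c) in pressed for c in range(cols)]
        for c, hit in enumerate(col_bits):
            if hit:
                return (row, c)
    return None
```

Knowing which row was being driven when a column line went active is what lets the scan resolve the key to a unique crosspoint.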
The external ROM (XROM) contains the preprogrammed drill sequence data used to run an operator selected drill. This design approach provides great flexibility in setting up drills while using the resources of the microprocessor controlled peripheral devices. The XROM is programmed with data, in sequence, that allows the microprocessor to perform the following tasks:
(1) select a lamp;
(2) select a speech synthesizer word/phrase;
(3) select a tone output.
The XROM also contains default timing data for the following which is used in the exercise program when the operator does not select and enter alternative times:
(1) lamp-on time; and
(2) pause time.
It can be readily seen that by properly encoding the XROM data, the microprocessor can execute numerous types of drill sequences which can combine the above mentioned parameters. It can also be observed that the use of plug-in cartridge XROMS allows a variety of sequence drills to be developed, equipped and executed with little if any programming by the user. A variety of plug-in cartridges can be developed for specific sports, weakness drills, rehabilitation programs, etc.
When the microprocessor 352 determines that the user has selected the START/END key, and is thereby requesting the initiation of a drill sequence, it obtains the address of the present step to be executed in the XROM, and places this address on the system address bus 354. The XROM is then activated, and places the selected data on the data bus 358. The microprocessor 352 then issues a read command, which causes this data to be stored in the microprocessor register for interpretation and processing. The XROM storage formats are fixed, so that if a lamp-on command is read from the XROM, the microprocessor knows that the next sequential address contains the lamp-on operation time.
The microprocessor continues the execution of the XROM instructed drill sequence until the drill operation time has expired, or until the user stops the drill manually. It should be noted that each drill sequence is comprised of a finite number of steps (locations) in the XROM memory. The microprocessor continually cycles through the steps to perform the drill. However, to achieve a truly random nature for a drill, the microprocessor does not always start each sequence at the initial step (location), but rather starts at some randomly indexed location, as explained further hereinbelow with reference to FIG. 18.
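The wrap-around execution of a fixed step list from a randomly indexed starting location can be sketched as follows; the opcode layout and names are hypothetical, not the actual XROM storage format:

```python
import random

# Assumed opcode layout: (command, operand) pairs stored sequentially,
# echoing the select-lamp / select-word / select-tone tasks above.
LAMP_ON, SPEAK, TONE = 0, 1, 2

def run_drill(xrom, steps_to_run, rng=random.Random()):
    """Cycle through a drill's fixed steps, starting at a random index
    so that repeated runs of the same drill are not predictable."""
    n = len(xrom)
    start = rng.randrange(n)
    executed = []
    for k in range(steps_to_run):
        cmd, operand = xrom[(start + k) % n]  # wrap around the sequence
        executed.append((cmd, operand))
    return executed
```

Because the index wraps modulo the sequence length, the drill executes the same finite set of steps regardless of the starting point.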
The START system preferably is controlled and run by a single chip microprocessor, and in one embodiment the particular microprocessor used was the P8749H type chip from the Intel Corporation which contains an 8-bit Central Processing Unit, 2K×8 EPROM Program Memory, 128×8 RAM Data Memory, 27 I/O lines, and an 8-bit Timer/Event Counter. Details of the architecture and use of this chip are described in detail in numerous publications by the manufacturer, including a manual entitled INTEL MCS-48 FAMILY OF SINGLE CHIP MICROCOMPUTERS USER'S MANUAL.
Referring to FIGS. 12 through 33, the logic flow charts illustrated therein reveal the major steps of the program, which is stored in the microprocessor non-volatile memory, for controlling the operation of the processor. A program listing of the instructions for the control of the particular instrument being described herein is attached to this patent application as an EXHIBIT and forms a part thereof.
The resident firmware that controls the operation of the unit can, for the purposes of explanation, be divided into four major categories. These are: the foreground task, the background task, the utility subroutines, and the data tables. It should be noted that although the word "task" is intermixed throughout this firmware description with the word "program", no true task-structure mechanism (i.e., task switching/scheduling) has been implemented.
The foreground task has as its responsibilities, hardware and software initialization, start-up device diagnostics, user interaction (including input error checking and feedback), drill selection and modification, drill execution, and overall device state control (e.g. running/paused/idle). This portion of the program performs its duties by interacting with the free-running background task to interface with the hardware environment, by tracking all time-dependent functions, and by calling upon the various subroutines that exist to carry out their predetermined assignments.
The functions of these subroutines include: reseeding of the pseudo-random drill index, fetching and executing selected drill data from the external ROM (XROM), general purpose multiplication by ten, binary to decimal conversion, speech processor invocation, computation of "warm-up" and "cool-down" times, user preparation prompting, crosspage jump execution, service (SVC) request flag manipulation (both setting and checking for completion), and local/remote mode determination. As these routines are called solely by the foreground program, they can be thought of as an extension thereof which has been demarcated for the purpose of saving Program Memory as well as to allow for their independent development/testing.
The background task, which is functionally described in greater detail hereinbelow, has as its responsibilities, event timer control, I/O execution/timing control, LED display refreshing, and keyboard scanning and debouncing.
The data tables, which are located on a special "page" of Program Memory to maximize look-up speed and efficiency, supply synthesized speech address and script information, keyboard matrix translation information, present-to-next state transition data, and warm-up/cool-down duration ratios.
In operation, the foreground program is activated upon power-up, at which time it initializes (FIGS. 12 through 16) both hardware and software environments to a known condition. A diagnostic test of the device (LED display, XROM interface, clock circuitry, speech synthesizer and associated filters/amplifier/speaker) is then performed. Any detected failure causes the user to be notified and the device to be powered off, barring further unpredictable operation. If all is operating properly, the program enters a loop awaiting either the expiration of a watchdog timer that serves to preserve battery power if the device is left unattended, or the inputting of drill selection/modification commands by the user via the front panel mounted keyboard. Once a selected drill is running, the foreground task retrieves the drill steps from the XROM, formulates the necessary SVC requests, and passes them to the background task for execution.
At a frequency of 1 kHz, an interrupt is generated by the timer/counter circuitry causing suspension of the foreground program and activation of the background program to check for outstanding or in-progress I/O requests, event timer expiration, keyboard entry, and updating of the LED displays. Coordination of the two programs is achieved through the use of the service (SVC) request flags and shared buffers.
The detection of any event (an expired timer, keystroke, etc.) by the background task results in the examination of the current machine state by the foreground program and the subsequent table-driven change to the next appropriate state. Referring to FIG. 17, the four possible machine states are 0 IDLE, 1 ENTRY, 2 MODIFY, and 3 DRILL, which together with the three drill state definitions of WARM-UP, NORMAL, and COOL-DOWN and the five entry mode classifications of PROGRAM, MODIFY, DURATION, LAMP and TIMER serve to keep the foreground program informed at all times of the ongoing activity as well as the correct next-state progression.
This entire process is repeated for each step of the active drill. In addition, the EXECUTE subroutine will not, if Remote Operation has been selected, return to the caller until detection of a Remote Advance signal from the wireless transmitter/receiver pair.
Modification of the drill duration, lamp (either individually or all) on-time duration or inter-lamp pause duration on either an absolute (as entered via the numeric keypad) or percentage (+/- 5%) basis is handled by the foreground task by the manipulation of RAM-based timer registers.
Referring to FIG. 18, the interrupt clock is managed by two routines: the clock initialization and the interrupt handler. The initialization code sets the clock interrupt interval and starts the clock. This function is performed only upon power-up/restart. The clock interrupt routine is called each time an interrupt is generated by the real-time clock. The interrupt handler immediately (after context switching from foreground to background) reinitializes the clock to allow for the generation of the next clock pulse. The interrupt handler then passes control to the background program via a call to the SYSTEM subroutine.
Referring to FIGS. 19 and 20, once activated by the interrupt handler, the background program starts its time management duties by checking the SVC control word for an outstanding 30 second multiple timing request (e.g. drill warm-up duration timer). If found, an additional check is made to determine if this is an initial or a subsequent request. In the case of the former, the associated first pass flag is cleared in the SVC control word, and the 0.01, 1.0, and 30 second cascaded timers are initialized. In the case of the latter, the 0.01, 1.0, and 30 second prescalers are updated (in modulo-N manner) and a check is made for overall timer expiration. If detected, the associated request flag is cleared in the SVC control word, signalling to the foreground program that the event timer has expired and appropriate action should be taken.
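The cascaded 0.01/1.0/30 second prescaler arrangement, updated modulo-N on each 1 kHz tick, can be sketched as follows (the class and field names are illustrative):

```python
class CascadedTimer:
    """0.01 s / 1.0 s / 30 s prescaler cascade fed by the 1 kHz tick.
    Each stage advances only when the stage below it rolls over."""
    def __init__(self, thirty_sec_units):
        self.ms = 0           # 0.01 s prescaler: rolls over every 10 ticks
        self.hundredths = 0   # 1.0 s prescaler: rolls over every 100 counts
        self.seconds = 0      # 30 s prescaler: rolls over every 30 counts
        self.remaining = thirty_sec_units
        self.expired = False

    def tick(self):  # called once per 1 ms interrupt
        self.ms = (self.ms + 1) % 10
        if self.ms:
            return
        self.hundredths = (self.hundredths + 1) % 100
        if self.hundredths:
            return
        self.seconds = (self.seconds + 1) % 30
        if self.seconds:
            return
        self.remaining -= 1
        if self.remaining == 0:
            self.expired = True  # signal: clear the SVC request flag
```

A one-unit (30 second) request therefore expires after exactly 30,000 interrupt ticks.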
Referring to FIGS. 19 and 21, the background program then assesses what (if any) I/O control is required by checking the SVC control word for an outstanding pause, beep, or lamp request. If one (they are mutually exclusive) is found, an additional check is made to determine if this is an initial or a subsequent request. In the case of the former, the associated first pass flag is cleared in the SVC control word and the 0.01 second I/O prescaler is initialized. A further test is made to determine if the request was for a pause which, although treated in an identical manner to a beep or lamp request up to this point, requires no actual hardware manipulation and frees the background task to perform its display and keyboard scanning functions. A beep or lamp request would instead cause the background task to interface to the appropriate decoders to turn the requested device on, skipping the display/keyboard scanning function in this pass. In the case of the latter (subsequent request), the 0.01 second I/O prescaler is updated and checked for expiration. If not yet expired, no further I/O control is performed, and the background program continues with its display/keyboard duties. Upon expiration, the associated request flag is cleared in the SVC control word as a signal to the foreground program that the I/O is completed. In addition, if the request was for a beep or lamp, the background program simultaneously interfaces to the appropriate decoders to turn off the requested device. In any case (pause/beep/lamp), the background task advances to the display/keyboard scanning function.
Referring to FIG. 22, the algorithm for driving the display uses a block of internal RAM as display registers, with one byte corresponding to each character of the display. The rapid modifications to the display are made under the control of the microprocessor. At each periodic interval the CPU quickly turns off the display segment driver, disables the character currently being displayed, and enables the next character. This sequence is performed fast enough to ensure that the display characters seem to be on constantly, with no appearance of flashing or flickering. A global hardware flag is employed as a "blank all digits" controller, while individual digits may be blanked by the writing of a special control code into the corresponding display register.
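The one-digit-per-pass multiplexing can be sketched as below. The array name, the value of the blank control code, and the latch variable are all assumptions for illustration; the patent specifies only that one RAM byte backs each character and that a special code blanks a digit.

```c
#include <stdint.h>

#define NDIGITS 6
#define BLANK_CODE 0xFF  /* hypothetical "blank this digit" control code */

static uint8_t display_ram[NDIGITS]; /* one byte per display character */
static uint8_t segment_bus;          /* models the segment driver latch */
static int active_digit;             /* which digit cathode is enabled  */

/* One refresh pass: turn the segment driver off, disable the current
 * digit, advance to the next, and drive its pattern -- unless its
 * register holds the blank code, in which case segments stay off. */
void display_refresh_step(void) {
    segment_bus = 0;                         /* segments off first */
    active_digit = (active_digit + 1) % NDIGITS;
    uint8_t data = display_ram[active_digit];
    if (data != BLANK_CODE)
        segment_bus = data;                  /* enable next character */
}
```

Called once per millisecond, all six digits are refreshed every 6 ms, far faster than the eye can resolve, so the display appears continuously lit.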
Referring to FIG. 22, as each character of the display is turned on, the same signal is used to enable one row of the keyboard matrix. Any keys in that row which are being pressed at the time will pass the signal on to one of several return lines, one corresponding to each column of the matrix. By reading the state of these return lines and knowing which row is enabled, the program determines which (if any) keys are down. The scanning algorithm employed requires that a key be held down for some number of complete display scans before being acknowledged. Since the device has been designed for "one finger" operation, two-key rollover/N-key lockout has been implemented. When a debounced key has been detected, its encoded position in the matrix is placed into RAM location "KEYIN". Thereafter the foreground program need only read this shared location repeatedly to determine when a key has been pressed. The foreground program then frees the buffer by writing therein a special release code.
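The shared "KEYIN" buffer between background scanner and foreground reader can be sketched as follows. The release code value, the row/column encoding, and the function names are hypothetical; only the buffer-and-release-code protocol comes from the text (debouncing is modeled separately).

```c
#include <stdint.h>
#include <stdbool.h>

#define KEY_RELEASE 0xFF  /* hypothetical "buffer free" release code */

static uint8_t keyin = KEY_RELEASE;  /* shared RAM location "KEYIN" */
static int pressed_row = -1, pressed_col = -1;  /* the one key down */

/* Background side: scan the row whose display digit is currently
 * enabled. If its key is down, post the encoded matrix position to
 * KEYIN -- but only when the buffer is free, modeling N-key lockout.
 * Returns true when a key in this row is down. */
bool scan_row(int row) {
    if (row != pressed_row) return false;
    if (keyin == KEY_RELEASE)
        keyin = (uint8_t)((row << 4) | pressed_col);
    return true;
}

/* Foreground side: poll KEYIN, consume the key, then free the buffer
 * by writing the release code back. Returns -1 when no key waits. */
int read_key(void) {
    if (keyin == KEY_RELEASE) return -1;
    int k = keyin;
    keyin = KEY_RELEASE;
    return k;
}
```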
Referring to FIG. 12, the hardware initialization as set forth in the top block is performed automatically upon power-up reset. The system components in the second block are then initialized. The third block represents a pause of 500 milliseconds. The last block on FIG. 12 and the top of FIG. 13 represents a routine to light each of the six lamps in turn for 50 milliseconds. After that, the LED displays are initialized to display a 9, and the speech synthesizer simultaneously voices "nine" for 0.5 seconds. The lower section of FIG. 13 represents a routine wherein that same function is repeated for 8, 7, etc. until the digit 0 is reached.
Referring to FIG. 14, the LED displays are then disabled, and the byte at a given set location in the XROM cartridge is read out, which byte should correspond to a test byte pattern. If so, the location in XROM is incremented for a second test byte pattern. If both test patterns match, the logic flow continues to FIG. 15. If either of the test patterns does not match, a speech subroutine is called to vocalize "error", and the system power is shut off.
Referring to FIG. 15, the top blocks therein represent a routine for proceeding through fourteen sequential XROM test instructions, after which the remote input is checked to determine if remote control is indicated. If local control is indicated by the switch on the control panel, the blink counter is set to 10, and if remote control is indicated, the blink counter is set to 11.
The routine at the top of FIG. 16 causes a blinking of the LED displays for 250 milliseconds and the successive decrementing of the blink counter to 0. At that time, the speech synthesizer is invoked to voice "START is ready", and the diagnostics are now completed. The system is then prepared for operation by initializing all flags and starting the idle counter, which is a power-saving counter to shut the system off after 10 minutes if no input commands, such as pressing the START key, are received.
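The power-saving idle counter can be sketched minimally. The one-second granularity and the function names here are assumptions; the patent states only a 10 minute inactivity limit reset by input commands.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical idle counter: reset on every key entry, requests
 * power-off after 10 minutes of inactivity (counted in 1 s steps). */
#define IDLE_LIMIT_S (10 * 60)

static uint16_t idle_s;

void idle_reset(void) { idle_s = 0; }

/* Called once per second; returns true when power should be cut. */
bool idle_tick(void) { return ++idle_s >= IDLE_LIMIT_S; }
```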
The system then enters the main program loop of FIG. 17, which allows an operator to select a particular drill and set up all selected parameters of the drill, after which the operator presses the START key. The top of FIG. 17 represents the speech synthesizer being invoked to enable a key "click" to be heard after each entry, and the idle counter is reset after each entry.
The right portion of FIG. 17 represents 32 different routines corresponding to the possible keystrokes, the more complicated of which routines are illustrated in FIGS. 28 through 35. The middle left of FIG. 17 represents four state routines of the system, the 1, 2 and 3 states of which are illustrated in FIGS. 25, 26 and 27. The 0 state routine is an idle state, during which the idle counter is running. The 1 state routine, FIG. 25, is a numeric state routine in which a selected numeric mode is displayed in accordance with each key entry. The 2 state routine, FIG. 26, is a time modify display routine, and the 3 state routine FIG. 27, is a drill running routine. After completing one of the four state routines, the routine of FIG. 17 is repeated.
FIG. 18 is a high level overview of the background tasks, and represents the background clock interrupt routine which serves as the entry and exit mechanism to the background tasks. Upon receipt of the real-time clock interrupt (every millisecond) the present state of the system is stored in memory for later restoration by selecting alternating sets of registers. The clock is reloaded with the necessary divisor for subsequent interrupt generation, and a call is made to the "system" subroutine to perform all timekeeping functions, keyboard scanning, LED refreshing and any outstanding I/O.
Upon return from the "system" subroutine, the clock interrupt routine re-seeds the pseudo-random number generator for use as the starting drill index into the XROM, effectively giving the drill program its random nature.
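The patent does not specify the generator, so the sketch below uses a simple 8-bit linear congruential step as a stand-in; the constants and function names are arbitrary assumptions. Re-stepping the seed on every millisecond interrupt means the value sampled at the moment the user presses "start" is effectively unpredictable.

```c
#include <stdint.h>

/* Hypothetical stand-in PRNG: an 8-bit linear congruential step,
 * advanced once per clock interrupt (constants chosen arbitrarily). */
static uint8_t rnd_seed = 1;

uint8_t rnd_step(void) {
    rnd_seed = (uint8_t)(rnd_seed * 37 + 11);
    return rnd_seed;
}

/* Map the free-running seed onto a starting step index in [0, nsteps),
 * as used when computing the random starting point of a drill. */
uint8_t drill_start_index(uint8_t nsteps) {
    return (uint8_t)(rnd_seed % nsteps);
}
```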
The state of the system is then restored to the same state as prior to executing the clock interrupt routine, and the program then returns from the background tasks of FIG. 18, to the main loop of FIG. 17.
FIGS. 19 through 24 represent background tasks which are performed approximately once every millisecond, and the logic flow diagrams of FIGS. 19 through 24 are all interconnected as shown throughout those Figures, such that the actual operation of the logic flow is dependent entirely on the state of the overall system.
Referring to FIG. 19, if a timer is on, the system proceeds to the timing routine of FIG. 20, and then returns back to FIG. 19 on input B3 to the same logic point in FIG. 19 as when no timer is on. The routine then checks if any pause, beep or lamp has been requested, and if not, proceeds to the keyboard scanning function and LED display refresh routine of FIG. 22. If a request was present, a check is made as to whether this is a first request, and if not, it proceeds to the Input/Output (I/O) pass routine of FIG. 21. If the request is a first request, a first pass flag of the requested I/O is cleared so that subsequent passes merely decrement the associated timer until time expires. If the I/O request was for a pause, the routine proceeds to the keyboard scanning and LED refresh routine of FIG. 22, and if not, the data bus is configured to activate the lamp or beep as requested, and the routine then exits from the background task routine.
FIG. 20 represents the logic flow diagram for a 0.01 second counter, a 1.0 second counter, and a 30 second counter. The microprocessor described herein is an eight bit machine, and accordingly contiguous bytes are utilized to obtain the necessary timing resolution. In this routine, if this is a first pass for the timing request, the first pass flag is cleared and the 0.01 sec., 1.0 sec., and 30 sec. prescalers are initialized. The prescalers are then incremented as shown in this routine, which is fairly standard in the art.
FIG. 21 represents an I/O pass routine for generally checking the state of the light timers, and more particularly for resetting the I/O prescalers, clearing the I/O request flags, and configuring the data bus to turn off a lamp or beep as requested; it too is a straightforward routine.
FIG. 22 represents the LED display refresh and keyboard matrix scanner, which are interdependent as described hereinabove. In this routine, the n digit display data is initially obtained, and the inhibit display flag is then checked. If it is set (i.e. inhibit requested), the digit segment display data is replaced by a special "null data" code which forces the LED decoder driver to turn all segments off on the selected digit. If not set, the address bus, control bus and data bus are configured to drive the LED digit cathode and keyboard row, and then to read and interpret the output from that row of the keyboard. If a key was depressed, the key row and column are encoded and a scan flag is set as an indicator that the debounce counter should be reinitialized upon exit from the background task.
The routine then proceeds to the key detect and debouncing routine of FIGS. 23 and 24, entering on either input G3 or E3 as shown, depending upon whether the same key had been previously detected as being pressed. The key detecting and debouncing routine of FIGS. 23 and 24 is fairly standard in the art, and accordingly is not described in detail herein. At the end of the routine of FIG. 24, the background routines of FIGS. 19 through 24 are exited. As noted hereinabove, these background routines are repeated every 0.001 seconds.
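Since the patent leaves the debouncing detail to the art, the sketch below shows one conventional form consistent with the text: a key must be seen down for some number of complete display scans, and any change restarts the count. The scan count and names are assumptions.

```c
#include <stdint.h>

#define DEBOUNCE_SCANS 3  /* hypothetical: scans a key must stay down */

static uint8_t db_count;
static int db_key = -1;

/* Called once per complete display scan with the raw key seen
 * (-1 for none). Returns the key code once it has been down for
 * DEBOUNCE_SCANS consecutive scans, else -1. A changed or released
 * key restarts the count, filtering out contact bounce. */
int debounce(int raw_key) {
    if (raw_key < 0 || raw_key != db_key) {  /* new key or released */
        db_key = raw_key;
        db_count = 0;
        return -1;
    }
    if (++db_count == DEBOUNCE_SCANS) return db_key;
    return -1;
}
```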
FIGS. 25, 26 and 27 represent the 01 numeric display routine, the 02 modify display routine, and the 03 drill running state routine of FIG. 17. In the 01 numeric display routine, the number to be displayed is converted into binary-coded decimal digits, which are then decoded to drive the LED displays. In the 02 modify display routine, the modify byte at the modify index is multiplied by five, and the resultant number is converted into binary-coded decimal digits which are then decoded to drive the LED displays. In the 03 drill running state routine, the status of a run flag is checked; if it is not set to run, the routine exits. In review, each XROM cartridge contains a number of drills, each of which consists of a number of sequential commands running to an end point. At the end, a new random command (FIG. 18) is selected, so the drill starts at some random state in the middle thereof and then proceeds to the end, after which a new random command is entered, and so on, until the expiration of the drill time period.
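The digit conversion feeding the LED decoder/drivers can be sketched as below; the three-digit split and function names are illustrative assumptions, but the multiply-by-five step for the modify display comes directly from the 02 routine description.

```c
#include <stdint.h>

/* Hypothetical sketch of splitting a value into three decimal digits
 * for the LED decoder/drivers (hundreds, tens, units). */
void to_digits(uint16_t value, uint8_t out[3]) {
    out[0] = (uint8_t)(value / 100 % 10);  /* hundreds */
    out[1] = (uint8_t)(value / 10 % 10);   /* tens     */
    out[2] = (uint8_t)(value % 10);        /* units    */
}

/* 02 modify display: the stored modify byte is multiplied by five
 * before being split into digits for display. */
void modify_display(uint8_t modify_byte, uint8_t out[3]) {
    to_digits((uint16_t)(modify_byte * 5), out);
}
```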
Referring to FIGS. 28 to 38, which represent the processing of the corresponding keystrokes, an example will serve to illustrate how the user's requests to select, modify, run, pause, and stop a drill are satisfied.
Upon system initialization (FIGS. 12-16) the following default parameters exist: mode=idle, run flag=running, drill state=warm up, skill level=beginner, drill duration=1 minute, and drill #=1. The user presses the "advanced" key which is detected, debounced, and passed to the foreground program main loop (FIG. 17) by the background task (FIGS. 19-24). A key-jump table "KEYJTB" causes program execution to resume at "ADV" which merely changes the skill level to "advanced" (=2). It can readily be seen that all of the skill level modifiers--beginner/intermediate/advanced--cause similar re-assignments of the skill level flag "skill", which serves to change the XROM index at run time.
The user then decides to forfeit the warm-up period and does so by pressing the "cancel warm-up" key, causing the main loop (FIG. 17) to direct the program to cancel the warm-up (FIG. 29, case #19). A test is then made for the valid modes, idle or drill, which permit the cancellation of the warm-up drill by changing the drill state from "warm-up" to "normal".
Next, the user decides to select drill #4 from the XROM, which he does by first depressing the "program" key, forcing an exit from the main program loop to the "prog" routine. A test is then made for the valid current mode of "idle", which permits the "prog" routine to prepare for subsequent entry of the drill # as follows. The minimum and maximum drill # limits are set, the program mode is changed from "idle" to "entry", the entry type flag is set to "program", and the temporary digit entry number is set to 0. The user then enters the digit 4 from the keyboard, causing execution to resume at the numeric processor "four", which like its counterparts "zero . . . nine", changes the temporary digit entry number and tests for the valid mode of "entry". Numeric entries of more than one digit would simply cause the previous entry to be adjusted through multiplication by ten and the result added to the entered digit. In this manner a maximum of three digits may be processed, with a digit counter incremented upon receipt of each digit, and the background task displaying the running total (in the example "004") via the routine in FIG. 22.
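The multiply-by-ten accumulation described above can be sketched as below. The struct and function names are illustrative assumptions; the times-ten accumulation, three-digit limit, and min/max check on "enter" come from the text.

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical sketch of the numeric entry accumulator: each new
 * digit multiplies the running total by ten and adds itself, up to
 * the three-digit maximum. */
typedef struct { uint16_t value; uint8_t ndigits; } Entry;

void entry_clear(Entry *e) { e->value = 0; e->ndigits = 0; }

/* Returns false once the three-digit maximum is reached. */
bool entry_digit(Entry *e, uint8_t digit) {
    if (e->ndigits >= 3) return false;
    e->value = (uint16_t)(e->value * 10 + digit);
    e->ndigits++;
    return true;
}

/* The "enter" key then range-checks the value against the limits set
 * when entry mode was armed (e.g. the valid drill numbers). */
bool entry_accept(const Entry *e, uint16_t min, uint16_t max) {
    return e->value >= min && e->value <= max;
}
```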
The user must then terminate his numeric entry by depressing the "enter" key, forcing the main loop to pass control to the "enter" program. A test is made for the valid "entry" mode, which if satisfied causes an additional limit check of the entered value as per the minimum and maximum numbers mentioned above. Finally, the "enter" program decides which field (drill/lamp/duration/timer) is to be replaced with the entered value based on the flag previously set to "program". The mode is then reset to "idle", and the LED inhibit flag is set before the main program loop is re-entered. Note that at any time prior to pressing the "enter" key the user can delete the current numeric entry by pressing the "clear" key, which invokes the "clear" routine to reset the temporary digit entry number to zero.
Next the user decides he would like to extend the "on time" of all the lamps in the selected drill by 10%. This is done by first pressing the "modify" key, causing the main loop to transfer control to the "modify" routine. This routine checks that the current mode is "idle" and changes it to "modify". Depressing the "all lamps" key transfers control to the all lamps routine, which points the modify index to the "all lamps" field. It can be seen that the time/pause/lamp modifier keys work in a similar manner, manipulating the modify index appropriately. The 10% adjustment can then be made by successive depressions of the "+5%" key. A test is made for the valid "modify" mode and, if passed, the "all lamps" field pointed to by the modify index is incremented twice for later adjustment of the lamp-on times. The "-5%" mechanism is identical, except that it successively decrements the addressed field.
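The ±5% mechanism can be sketched as a signed step count per modifiable field, applied at run time. The field list, names, and the arithmetic form of the adjustment are assumptions; the text specifies only that each keypress increments or decrements the field addressed by the modify index.

```c
#include <stdint.h>

/* Hypothetical sketch of the +5%/-5% adjustment: the modify index
 * addresses one field, and each keypress bumps its step count by
 * one 5% step. The count is applied to base times at run time. */
enum { F_ALL_LAMPS, F_TIME, F_PAUSE, NFIELDS };

static int8_t modify_steps[NFIELDS];  /* signed 5% steps per field */

void modify_plus5(int field)  { modify_steps[field]++; }
void modify_minus5(int field) { modify_steps[field]--; }

/* Apply the accumulated percentage to a base lamp-on time (in ms). */
uint16_t adjusted_ms(int field, uint16_t base_ms) {
    return (uint16_t)(base_ms
                      + (int32_t)base_ms * 5 * modify_steps[field] / 100);
}
```

Two presses of "+5%" on the "all lamps" field thus lengthen every lamp-on time by 10%, matching the example in the text.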
Continuing our hypothetical example, the user then decides to start the selected drill (#4) by pressing the "start/stop" key, causing the main loop to branch to the "start" routine. Here a test is made to see if the mode is already set to "drill", in which case the request would have been interpreted as "stop" and the mode changed to "idle". Since it is not, the "start" routine computes the XROM drill pointers based upon drill # and skill level and adjusts the starting step index based upon the random number seed. The mode is then changed to "drill" and the run/pause flag is set to "run". The system commands contained in the XROM are then executed to allow for introductory speech, instructions, etc. and the user is given an opportunity to position him/herself by virtue of an audible countdown followed by the words "ready, set, go". The selected drill is now executed, step by step, as shown in FIG. 27. The user may elect to temporarily suspend the drill by pressing the "pause" key, invoking the "pause" routine which causes the run flag to be toggled from "run" to "pause" (and subsequently back to "run"), informing the drill running routine of FIG. 27 to forego execution of the next drill step. The drill then continues running in this manner until stopped by the user as mentioned above, or upon expiration of the timer as shown in FIG. 17.
While several embodiments and variations of the present invention for a system for technique and accelerated reaction training are described in detail herein, it should be apparent that the disclosure and teachings of the present invention will suggest many alternative designs to those skilled in the art.
|US8592739||Nov 2, 2010||Nov 26, 2013||Microsoft Corporation||Detection of configuration changes of an optical element in an illumination system|
|US8597142||Sep 13, 2011||Dec 3, 2013||Microsoft Corporation||Dynamic camera based practice mode|
|US8605763||Mar 31, 2010||Dec 10, 2013||Microsoft Corporation||Temperature measurement and control for laser and light-emitting diodes|
|US8610665||Apr 26, 2013||Dec 17, 2013||Microsoft Corporation||Pose tracking pipeline|
|US8611607||Feb 19, 2013||Dec 17, 2013||Microsoft Corporation||Multiple centroid condensation of probability distribution clouds|
|US8613666||Aug 31, 2010||Dec 24, 2013||Microsoft Corporation||User selection and navigation based on looped motions|
|US8618405||Dec 9, 2010||Dec 31, 2013||Microsoft Corp.||Free-space gesture musical instrument digital interface (MIDI) controller|
|US8619122||Feb 2, 2010||Dec 31, 2013||Microsoft Corporation||Depth camera compatibility|
|US8620113||Apr 25, 2011||Dec 31, 2013||Microsoft Corporation||Laser diode modes|
|US8622843||Nov 28, 2012||Jan 7, 2014||Ralph C. Fuccillo||Methods and system for improving a user's reaction time and accuracy in propelling an object|
|US8625837||Jun 16, 2009||Jan 7, 2014||Microsoft Corporation||Protocol and format for communicating an image from a camera to a computing environment|
|US8629976||Feb 4, 2011||Jan 14, 2014||Microsoft Corporation||Methods and systems for hierarchical de-aliasing time-of-flight (TOF) systems|
|US8630457||Dec 15, 2011||Jan 14, 2014||Microsoft Corporation||Problem states for pose tracking pipeline|
|US8631355||Jan 8, 2010||Jan 14, 2014||Microsoft Corporation||Assigning gesture dictionaries|
|US8633890||Feb 16, 2010||Jan 21, 2014||Microsoft Corporation||Gesture detection based on joint skipping|
|US8635637||Dec 2, 2011||Jan 21, 2014||Microsoft Corporation||User interface presenting an animated avatar performing a media reaction|
|US8638985||Mar 3, 2011||Jan 28, 2014||Microsoft Corporation||Human body pose estimation|
|US8644609||Mar 19, 2013||Feb 4, 2014||Microsoft Corporation||Up-sampling binary images for segmentation|
|US8649554||May 29, 2009||Feb 11, 2014||Microsoft Corporation||Method to control perspective for a camera-controlled computer|
|US8655069||Mar 5, 2010||Feb 18, 2014||Microsoft Corporation||Updating image segmentation following user input|
|US8659658||Feb 9, 2010||Feb 25, 2014||Microsoft Corporation||Physical interaction zone for gesture-based user interfaces|
|US8660303||Dec 20, 2010||Feb 25, 2014||Microsoft Corporation||Detection of body and props|
|US8660310||Dec 13, 2012||Feb 25, 2014||Microsoft Corporation||Systems and methods for tracking a model|
|US8667519||Nov 12, 2010||Mar 4, 2014||Microsoft Corporation||Automatic passive and anonymous feedback system|
|US8670029||Jun 16, 2010||Mar 11, 2014||Microsoft Corporation||Depth camera illuminator with superluminescent light-emitting diode|
|US8675981||Jun 11, 2010||Mar 18, 2014||Microsoft Corporation||Multi-modal gender recognition including depth data|
|US8676581||Jan 22, 2010||Mar 18, 2014||Microsoft Corporation||Speech recognition analysis via identification information|
|US8681255||Sep 28, 2010||Mar 25, 2014||Microsoft Corporation||Integrated low power depth camera and projection device|
|US8681321||Dec 31, 2009||Mar 25, 2014||Microsoft International Holdings B.V.||Gated 3D camera|
|US8682028||Dec 7, 2009||Mar 25, 2014||Microsoft Corporation||Visual target tracking|
|US8687044||Feb 2, 2010||Apr 1, 2014||Microsoft Corporation||Depth camera compatibility|
|US8690735||Jul 15, 2011||Apr 8, 2014||Icon Health & Fitness, Inc.||Systems for interaction with exercise device|
|US8693724||May 28, 2010||Apr 8, 2014||Microsoft Corporation||Method and system implementing user-centric gesture control|
|US8702507||Sep 20, 2011||Apr 22, 2014||Microsoft Corporation||Manual and camera-based avatar control|
|US8707216||Feb 26, 2009||Apr 22, 2014||Microsoft Corporation||Controlling objects via gesturing|
|US8717469||Feb 3, 2010||May 6, 2014||Microsoft Corporation||Fast gating photosurface|
|US8723118||Oct 1, 2009||May 13, 2014||Microsoft Corporation||Imager for constructing color and depth images|
|US8724887||Feb 3, 2011||May 13, 2014||Microsoft Corporation||Environmental modifications to mitigate environmental factors|
|US8724906||Nov 18, 2011||May 13, 2014||Microsoft Corporation||Computing pose and/or shape of modifiable entities|
|US8727784||Jan 10, 2011||May 20, 2014||Jeffrey D. Wolf||Sports board drill training apparatus and method therefore|
|US8744121||May 29, 2009||Jun 3, 2014||Microsoft Corporation||Device for identifying and tracking multiple humans over time|
|US8745541||Dec 1, 2003||Jun 3, 2014||Microsoft Corporation||Architecture for controlling a computer using hand gestures|
|US8749557||Jun 11, 2010||Jun 10, 2014||Microsoft Corporation||Interacting with user interface via avatar|
|US8751215||Jun 4, 2010||Jun 10, 2014||Microsoft Corporation||Machine based sign language interpreter|
|US8758201||Jul 3, 2012||Jun 24, 2014||Icon Health & Fitness, Inc.||Portable physical activity sensing system|
|US8760395||May 31, 2011||Jun 24, 2014||Microsoft Corporation||Gesture recognition techniques|
|US8760571||Sep 21, 2009||Jun 24, 2014||Microsoft Corporation||Alignment of lens and image sensor|
|US8762894||Feb 10, 2012||Jun 24, 2014||Microsoft Corporation||Managing virtual ports|
|US8773355||Mar 16, 2009||Jul 8, 2014||Microsoft Corporation||Adaptive cursor sizing|
|US8775916||May 17, 2013||Jul 8, 2014||Microsoft Corporation||Validation analysis of human target|
|US8781156||Sep 10, 2012||Jul 15, 2014||Microsoft Corporation||Voice-body identity correlation|
|US8782567||Nov 4, 2011||Jul 15, 2014||Microsoft Corporation||Gesture recognizer system architecture|
|US8784270||Sep 7, 2010||Jul 22, 2014||Icon Ip, Inc.||Portable physical activity sensing system|
|US8786730||Aug 18, 2011||Jul 22, 2014||Microsoft Corporation||Image exposure using exclusion regions|
|US8787658||Mar 19, 2013||Jul 22, 2014||Microsoft Corporation||Image segmentation using reduced foreground training data|
|US8788973||May 23, 2011||Jul 22, 2014||Microsoft Corporation||Three-dimensional gesture controlled avatar configuration interface|
|US8803800||Dec 2, 2011||Aug 12, 2014||Microsoft Corporation||User interface control based on head orientation|
|US8803888||Jun 2, 2010||Aug 12, 2014||Microsoft Corporation||Recognition system for sharing information|
|US8803952||Dec 20, 2010||Aug 12, 2014||Microsoft Corporation||Plural detector time-of-flight depth mapping|
|US8811938||Dec 16, 2011||Aug 19, 2014||Microsoft Corporation||Providing a user interface experience based on inferred vehicle state|
|US8818002||Jul 21, 2011||Aug 26, 2014||Microsoft Corp.||Robust adaptive beamforming with enhanced noise suppression|
|US8824749||Apr 5, 2011||Sep 2, 2014||Microsoft Corporation||Biometric recognition|
|US8843857||Nov 19, 2009||Sep 23, 2014||Microsoft Corporation||Distance scalable no touch computing|
|US8854426||Nov 7, 2011||Oct 7, 2014||Microsoft Corporation||Time-of-flight camera with guided light|
|US8856691||May 29, 2009||Oct 7, 2014||Microsoft Corporation||Gesture tool|
|US8860663||Nov 22, 2013||Oct 14, 2014||Microsoft Corporation||Pose tracking pipeline|
|US8861091||Aug 6, 2013||Oct 14, 2014||Impulse Technology Ltd.||System and method for tracking and assessing movement skills in multidimensional space|
|US8861839||Sep 23, 2013||Oct 14, 2014||Microsoft Corporation||Human tracking system|
|US8864581||Jan 29, 2010||Oct 21, 2014||Microsoft Corporation||Visual based identity tracking|
|US8866889||Nov 3, 2010||Oct 21, 2014||Microsoft Corporation||In-home depth camera calibration|
|US8867820||Oct 7, 2009||Oct 21, 2014||Microsoft Corporation||Systems and methods for removing a background of an image|
|US8869072||Aug 2, 2011||Oct 21, 2014||Microsoft Corporation||Gesture recognizer system architecture|
|US8879831||Dec 15, 2011||Nov 4, 2014||Microsoft Corporation||Using high-level attributes to guide image processing|
|US8882310||Dec 10, 2012||Nov 11, 2014||Microsoft Corporation||Laser die light source module with low inductance|
|US8884968||Dec 15, 2010||Nov 11, 2014||Microsoft Corporation||Modeling an object from image data|
|US8885890||May 7, 2010||Nov 11, 2014||Microsoft Corporation||Depth map confidence filtering|
|US8888331||May 9, 2011||Nov 18, 2014||Microsoft Corporation||Low inductance light source module|
|US8891067||Jan 31, 2011||Nov 18, 2014||Microsoft Corporation||Multiple synchronized optical sources for time-of-flight range finding systems|
|US8891827||Nov 15, 2012||Nov 18, 2014||Microsoft Corporation||Systems and methods for tracking a model|
|US8892495||Jan 8, 2013||Nov 18, 2014||Blanding Hovenweep, Llc||Adaptive pattern recognition based controller apparatus and method and human-interface therefore|
|US8896721||Jan 11, 2013||Nov 25, 2014||Microsoft Corporation||Environment and/or target segmentation|
|US8897491||Oct 19, 2011||Nov 25, 2014||Microsoft Corporation||System for finger recognition and tracking|
|US8897493||Jan 4, 2013||Nov 25, 2014||Microsoft Corporation||Body scan|
|US8897495||May 8, 2013||Nov 25, 2014||Microsoft Corporation||Systems and methods for tracking a model|
|US8898687||Apr 4, 2012||Nov 25, 2014||Microsoft Corporation||Controlling a media program based on a media reaction|
|US8908091||Jun 11, 2014||Dec 9, 2014||Microsoft Corporation||Alignment of lens and image sensor|
|US8917240||Jun 28, 2013||Dec 23, 2014||Microsoft Corporation||Virtual desktop coordinate transformation|
|US8920241||Dec 15, 2010||Dec 30, 2014||Microsoft Corporation||Gesture controlled persistent handles for interface guides|
|US8926431||Mar 2, 2012||Jan 6, 2015||Microsoft Corporation||Visual based identity tracking|
|US8928579||Feb 22, 2010||Jan 6, 2015||Andrew David Wilson||Interacting with an omni-directionally projected display|
|US8929612||Nov 18, 2011||Jan 6, 2015||Microsoft Corporation||System for recognizing an open or closed hand|
|US8929668||Jun 28, 2013||Jan 6, 2015||Microsoft Corporation||Foreground subject detection|
|US8933884||Jan 15, 2010||Jan 13, 2015||Microsoft Corporation||Tracking groups of users in motion capture system|
|US8942428||May 29, 2009||Jan 27, 2015||Microsoft Corporation||Isolate extraneous motions|
|US8942917||Feb 14, 2011||Jan 27, 2015||Microsoft Corporation||Change invariant scene recognition by an agent|
|US8953844||May 6, 2013||Feb 10, 2015||Microsoft Technology Licensing, Llc||System for fast, probabilistic skeletal tracking|
|US8959541||May 29, 2012||Feb 17, 2015||Microsoft Technology Licensing, Llc||Determining a future portion of a currently presented media program|
|US8963829||Nov 11, 2009||Feb 24, 2015||Microsoft Corporation||Methods and systems for determining and tracking extremities of a target|
|US8968091||Mar 2, 2012||Mar 3, 2015||Microsoft Technology Licensing, Llc||Scalable real-time motion recognition|
|US8970487||Oct 21, 2013||Mar 3, 2015||Microsoft Technology Licensing, Llc||Human tracking system|
|US8971612||Dec 15, 2011||Mar 3, 2015||Microsoft Corporation||Learning image processing tasks from scene reconstructions|
|US8976986||Sep 21, 2009||Mar 10, 2015||Microsoft Technology Licensing, Llc||Volume adjustment based on listener position|
|US8982151||Jun 14, 2010||Mar 17, 2015||Microsoft Technology Licensing, Llc||Independently processing planes of display data|
|US8983233||Aug 30, 2013||Mar 17, 2015||Microsoft Technology Licensing, Llc||Time-of-flight depth imaging|
|US8988432||Nov 5, 2009||Mar 24, 2015||Microsoft Technology Licensing, Llc||Systems and methods for processing an image for target tracking|
|US8988437||Mar 20, 2009||Mar 24, 2015||Microsoft Technology Licensing, Llc||Chaining animations|
|US8988508||Sep 24, 2010||Mar 24, 2015||Microsoft Technology Licensing, Llc||Wide angle field of view active illumination imaging system|
|US8994718||Dec 21, 2010||Mar 31, 2015||Microsoft Technology Licensing, Llc||Skeletal control of three-dimensional virtual world|
|US9001118||Aug 14, 2012||Apr 7, 2015||Microsoft Technology Licensing, Llc||Avatar construction using depth camera|
|US9007417||Jul 18, 2012||Apr 14, 2015||Microsoft Technology Licensing, Llc||Body scan|
|US9008355||Jun 4, 2010||Apr 14, 2015||Microsoft Technology Licensing, Llc||Automatic depth camera aiming|
|US9013489||Nov 16, 2011||Apr 21, 2015||Microsoft Technology Licensing, Llc||Generation of avatar reflecting player appearance|
|US9015638||May 1, 2009||Apr 21, 2015||Microsoft Technology Licensing, Llc||Binding users to a gesture based system and providing feedback to the users|
|US9019201||Jan 8, 2010||Apr 28, 2015||Microsoft Technology Licensing, Llc||Evolving universal gesture sets|
|US9028368||Jul 5, 2011||May 12, 2015||Icon Health & Fitness, Inc.||Systems, methods, and devices for simulating real world terrain on an exercise device|
|US9031103||Nov 5, 2013||May 12, 2015||Microsoft Technology Licensing, Llc||Temperature measurement and control for laser and light-emitting diodes|
|US9039528||Dec 1, 2011||May 26, 2015||Microsoft Technology Licensing, Llc||Visual target tracking|
|US9052382||Oct 18, 2013||Jun 9, 2015||Microsoft Technology Licensing, Llc||System architecture design for time-of-flight system having reduced differential pixel size, and time-of-flight systems so designed|
|US9052746||Feb 15, 2013||Jun 9, 2015||Microsoft Technology Licensing, Llc||User center-of-mass and mass distribution extraction using depth images|
|US9054764||Jul 20, 2011||Jun 9, 2015||Microsoft Technology Licensing, Llc||Sensor array beamformer post-processor|
|US9056254||Oct 6, 2014||Jun 16, 2015||Microsoft Technology Licensing, Llc||Time-of-flight camera with guided light|
|US9063001||Nov 2, 2012||Jun 23, 2015||Microsoft Technology Licensing, Llc||Optical fault monitoring|
|US9067136||Mar 10, 2011||Jun 30, 2015||Microsoft Technology Licensing, Llc||Push personalization of interface controls|
|US9069381||Mar 2, 2012||Jun 30, 2015||Microsoft Technology Licensing, Llc||Interacting with a computer based application|
|US9075434||Aug 20, 2010||Jul 7, 2015||Microsoft Technology Licensing, Llc||Translating user motion into multiple object responses|
|US9092657||Mar 13, 2013||Jul 28, 2015||Microsoft Technology Licensing, Llc||Depth image processing|
|US9098110||Aug 18, 2011||Aug 4, 2015||Microsoft Technology Licensing, Llc||Head rotation tracking from depth-based center of mass|
|US9098493||Apr 24, 2014||Aug 4, 2015||Microsoft Technology Licensing, Llc||Machine based sign language interpreter|
|US9098873||Apr 1, 2010||Aug 4, 2015||Microsoft Technology Licensing, Llc||Motion-based interactive shopping environment|
|US9100685||Dec 9, 2011||Aug 4, 2015||Microsoft Technology Licensing, Llc||Determining audience state or interest using passive sensor data|
|US9117281||Nov 2, 2011||Aug 25, 2015||Microsoft Corporation||Surface segmentation from RGB and depth images|
|US9123316||Dec 27, 2010||Sep 1, 2015||Microsoft Technology Licensing, Llc||Interactive content creation|
|US9135516||Mar 8, 2013||Sep 15, 2015||Microsoft Technology Licensing, Llc||User body angle, curvature and average extremity positions extraction using depth images|
|US9137463||May 12, 2011||Sep 15, 2015||Microsoft Technology Licensing, Llc||Adaptive high dynamic range camera|
|US9141193||Aug 31, 2009||Sep 22, 2015||Microsoft Technology Licensing, Llc||Techniques for using human gestures to control gesture unaware programs|
|US9147253||Jun 19, 2012||Sep 29, 2015||Microsoft Technology Licensing, Llc||Raster scanning for depth detection|
|US9154837||Dec 16, 2013||Oct 6, 2015||Microsoft Technology Licensing, Llc||User interface presenting an animated avatar performing a media reaction|
|US9159151||Jul 13, 2009||Oct 13, 2015||Microsoft Technology Licensing, Llc||Bringing a visual representation to life via learned input from the user|
|US9171264||Dec 15, 2010||Oct 27, 2015||Microsoft Technology Licensing, Llc||Parallel processing machine learning decision tree training|
|US9182814||Jun 26, 2009||Nov 10, 2015||Microsoft Technology Licensing, Llc||Systems and methods for estimating a non-visible or occluded body part|
|US9191570||Aug 5, 2013||Nov 17, 2015||Microsoft Technology Licensing, Llc||Systems and methods for detecting a tilt angle from a depth image|
|US9195305||Nov 8, 2012||Nov 24, 2015||Microsoft Technology Licensing, Llc||Recognizing user intent in motion capture system|
|US9208571||Mar 2, 2012||Dec 8, 2015||Microsoft Technology Licensing, Llc||Object digitization|
|US9210401||May 3, 2012||Dec 8, 2015||Microsoft Technology Licensing, Llc||Projected visual cues for guiding physical movement|
|US9215478||Nov 27, 2013||Dec 15, 2015||Microsoft Technology Licensing, Llc||Protocol and format for communicating an image from a camera to a computing environment|
|US9242171||Feb 23, 2013||Jan 26, 2016||Microsoft Technology Licensing, Llc||Real-time camera tracking using depth maps|
|US9244533||Dec 17, 2009||Jan 26, 2016||Microsoft Technology Licensing, Llc||Camera navigation for presentations|
|US9247238||Jan 31, 2011||Jan 26, 2016||Microsoft Technology Licensing, Llc||Reducing interference between multiple infra-red depth cameras|
|US9248358||Apr 10, 2012||Feb 2, 2016||Apexk Inc.||Interactive cognitive-multisensory interface apparatus and methods for assessing, profiling, training, and improving performance of athletes and other populations|
|US9251590||Jan 24, 2013||Feb 2, 2016||Microsoft Technology Licensing, Llc||Camera pose estimation for 3D reconstruction|
|US9256282||Mar 20, 2009||Feb 9, 2016||Microsoft Technology Licensing, Llc||Virtual object manipulation|
|US9259643||Sep 20, 2011||Feb 16, 2016||Microsoft Technology Licensing, Llc||Control of separate computer game elements|
|US9262673||May 24, 2013||Feb 16, 2016||Microsoft Technology Licensing, Llc||Human body pose estimation|
|US9264807||Jan 23, 2013||Feb 16, 2016||Microsoft Technology Licensing, Llc||Multichannel acoustic echo reduction|
|US9268404||Jan 8, 2010||Feb 23, 2016||Microsoft Technology Licensing, Llc||Application gesture interpretation|
|US9274606||Mar 14, 2013||Mar 1, 2016||Microsoft Technology Licensing, Llc||NUI video conference controls|
|US9274747||Feb 19, 2013||Mar 1, 2016||Microsoft Technology Licensing, Llc||Natural user input for driving interactive stories|
|US9278287||Oct 20, 2014||Mar 8, 2016||Microsoft Technology Licensing, Llc||Visual based identity tracking|
|US9280203||Aug 2, 2011||Mar 8, 2016||Microsoft Technology Licensing, Llc||Gesture recognizer system architecture|
|US9291449||Nov 25, 2013||Mar 22, 2016||Microsoft Technology Licensing, Llc||Detection of configuration changes among optical elements of illumination system|
|US9292083||May 29, 2014||Mar 22, 2016||Microsoft Technology Licensing, Llc||Interacting with user interface via avatar|
|US9298263||Oct 27, 2010||Mar 29, 2016||Microsoft Technology Licensing, Llc||Show body position|
|US9298287||Mar 31, 2011||Mar 29, 2016||Microsoft Technology Licensing, Llc||Combined activation for natural user interface systems|
|US9311560||Aug 12, 2015||Apr 12, 2016||Microsoft Technology Licensing, Llc||Extraction of user behavior from depth images|
|US9313376||Apr 1, 2009||Apr 12, 2016||Microsoft Technology Licensing, Llc||Dynamic depth power equalization|
|US9342139||Dec 19, 2011||May 17, 2016||Microsoft Technology Licensing, Llc||Pairing a computing device to a user|
|US9349040||Nov 19, 2010||May 24, 2016||Microsoft Technology Licensing, Llc||Bi-modal depth-image analysis|
|US9372544||May 16, 2014||Jun 21, 2016||Microsoft Technology Licensing, Llc||Gesture recognition techniques|
|US9377857||May 1, 2009||Jun 28, 2016||Microsoft Technology Licensing, Llc||Show body position|
|US9383823||May 29, 2009||Jul 5, 2016||Microsoft Technology Licensing, Llc||Combining gestures beyond skeletal|
|US9384329||Jun 11, 2010||Jul 5, 2016||Microsoft Technology Licensing, Llc||Caloric burn determination from body movement|
|US20020165048 *||Jun 26, 2002||Nov 7, 2002||Paul Parkinson||Simulated tennis ball trajectory & delivery system|
|US20030114256 *||Dec 18, 2001||Jun 19, 2003||Mathog David Ross||Method and device for introducing state changes into athletic activities|
|US20040193413 *||Dec 1, 2003||Sep 30, 2004||Wilson Andrew D.||Architecture for controlling a computer using hand gestures|
|US20040259689 *||Jun 18, 2003||Dec 23, 2004||Wilkins Larry C.||Exercise device having position verification feedback|
|US20050032581 *||Aug 4, 2003||Feb 10, 2005||Ervin Wagner||Sports skills training method and apparatus|
|US20050167907 *||Nov 26, 2004||Aug 4, 2005||Curkendall Leland D.||Method and apparatus for portable exercise system with electronic targets|
|US20050172943 *||Oct 27, 2004||Aug 11, 2005||Fungoman, Inc.||Programmable ball throwing apparatus|
|US20050179202 *||Apr 5, 2005||Aug 18, 2005||French Barry J.||System and method for tracking and assessing movement skills in multidimensional space|
|US20050245331 *||May 3, 2004||Nov 3, 2005||Renbarger Michael D||Method and system of enhancing a game|
|US20050288159 *||Jun 28, 2005||Dec 29, 2005||Tackett Joseph A||Exercise unit and system utilizing MIDI signals|
|US20060022833 *||Jul 22, 2005||Feb 2, 2006||Kevin Ferguson||Human movement measurement system|
|US20060118096 *||Jan 31, 2006||Jun 8, 2006||Fungoman, Inc.||Programmable ball throwing apparatus|
|US20060142126 *||Feb 17, 2006||Jun 29, 2006||Wilkins Larry C||Exercise device having position verification feedback|
|US20060142127 *||Feb 17, 2006||Jun 29, 2006||Wilkins Larry C||Exercise device having position verification feedback|
|US20060205566 *||May 8, 2006||Sep 14, 2006||Watterson Scott R||Systems for interaction with exercise device|
|US20060211462 *||May 1, 2006||Sep 21, 2006||French Barry J||System and method for tracking and assessing movement skills in multidimensional space|
|US20060236993 *||Dec 22, 2005||Oct 26, 2006||Fungoman, Inc.||Programmable ball throwing apparatus|
|US20060287025 *||May 10, 2006||Dec 21, 2006||French Barry J||Virtual reality movement system|
|US20070005540 *||Jan 6, 2006||Jan 4, 2007||Fadde Peter J||Interactive video training of perceptual decision-making|
|US20070032353 *||Oct 3, 2006||Feb 8, 2007||Scott & Wilkins Enterprises, Llc||Exercise device with a user-defined exercise mode|
|US20070254778 *||Apr 13, 2007||Nov 1, 2007||Ashby Darren C||Exercise apparatuses, components for exercise apparatuses and related methods|
|US20070265138 *||Dec 21, 2005||Nov 15, 2007||Ashby Darren C||Methods and systems for controlling an exercise apparatus using a portable data storage device|
|US20080061949 *||Nov 6, 2007||Mar 13, 2008||Kevin Ferguson||Human movement measurement system|
|US20080096698 *||Jun 27, 2007||Apr 24, 2008||Ramesh Balasubramanyan||Tennis serve ball machine cum training device II|
|US20080110115 *||Nov 12, 2007||May 15, 2008||French Barry J||Exercise facility and method|
|US20080119337 *||Oct 22, 2007||May 22, 2008||Wilkins Larry C||Exercise device with features for simultaneously working out the upper and lower body|
|US20080254918 *||Oct 12, 2007||Oct 16, 2008||Ervin Wagner||Sports skills training apparatus|
|US20080280704 *||May 11, 2007||Nov 13, 2008||Doug Noll, Llc||Basketball shooting training aid and method of use|
|US20080287225 *||Apr 8, 2008||Nov 20, 2008||Joseph Smull||Baseball batting instruction system and method|
|US20090046893 *||Apr 10, 2008||Feb 19, 2009||French Barry J||System and method for tracking and assessing movement skills in multidimensional space|
|US20090166684 *||Dec 29, 2008||Jul 2, 2009||3Dv Systems Ltd.||Photogate cmos pixel for 3d cameras having reduced intra-pixel cross talk|
|US20090197708 *||Feb 26, 2008||Aug 6, 2009||Fuccillo Ralph C||Methods and system for improving a user's reaction time and accuracy in propelling an object|
|US20090268945 *||Oct 29, 2009||Microsoft Corporation||Architecture for controlling a computer using hand gestures|
|US20090316923 *||Jun 19, 2008||Dec 24, 2009||Microsoft Corporation||Multichannel acoustic echo reduction|
|US20100146455 *||Feb 12, 2010||Jun 10, 2010||Microsoft Corporation||Architecture For Controlling A Computer Using Hand Gestures|
|US20100171813 *||Dec 31, 2009||Jul 8, 2010||Microsoft International Holdings B.V.||Gated 3d camera|
|US20100194762 *||Aug 5, 2010||Microsoft Corporation||Standard Gestures|
|US20100194872 *||Jan 30, 2009||Aug 5, 2010||Microsoft Corporation||Body scan|
|US20100195869 *||Aug 5, 2010||Microsoft Corporation||Visual target tracking|
|US20100197390 *||Aug 5, 2010||Microsoft Corporation||Pose tracking pipeline|
|US20100197391 *||Dec 7, 2009||Aug 5, 2010||Microsoft Corporation||Visual target tracking|
|US20100197392 *||Dec 7, 2009||Aug 5, 2010||Microsoft Corporation||Visual target tracking|
|US20100197393 *||Dec 7, 2009||Aug 5, 2010||Geiss Ryan M||Visual target tracking|
|US20100197395 *||Dec 7, 2009||Aug 5, 2010||Microsoft Corporation||Visual target tracking|
|US20100197399 *||Aug 5, 2010||Microsoft Corporation||Visual target tracking|
|US20100197400 *||Dec 7, 2009||Aug 5, 2010||Microsoft Corporation||Visual target tracking|
|US20100197469 *||Apr 8, 2010||Aug 5, 2010||Scott & Wilkins Enterprises, Llc||Exercise device with features for simultaneously working out the upper and lower body|
|US20100199228 *||Feb 23, 2009||Aug 5, 2010||Microsoft Corporation||Gesture Keyboarding|
|US20100222178 *||May 16, 2010||Sep 2, 2010||Michael J Shea||Exercise machine information system|
|US20100231512 *||Sep 16, 2010||Microsoft Corporation||Adaptive cursor sizing|
|US20100238182 *||Sep 23, 2010||Microsoft Corporation||Chaining animations|
|US20100252015 *||Apr 5, 2010||Oct 7, 2010||Fungoman, Inc.||Programmable ball throwing apparatus|
|US20100255449 *||Mar 31, 2010||Oct 7, 2010||Fadde Peter J||Interactive video training of perceptual decision-making|
|US20100277411 *||Jun 22, 2010||Nov 4, 2010||Microsoft Corporation||User tracking feedback|
|US20100277470 *||Jun 16, 2009||Nov 4, 2010||Microsoft Corporation||Systems And Methods For Applying Model Tracking To Motion Capture|
|US20100277489 *||Nov 4, 2010||Microsoft Corporation||Determine intended motions|
|US20100278393 *||Nov 4, 2010||Microsoft Corporation||Isolate extraneous motions|
|US20100278431 *||Jun 16, 2009||Nov 4, 2010||Microsoft Corporation||Systems And Methods For Detecting A Tilt Angle From A Depth Image|
|US20100281436 *||Nov 4, 2010||Microsoft Corporation||Binding users to a gesture based system and providing feedback to the users|
|US20100281438 *||Nov 4, 2010||Microsoft Corporation||Altering a view perspective within a display environment|
|US20100281439 *||Nov 4, 2010||Microsoft Corporation||Method to Control Perspective for a Camera-Controlled Computer|
|US20100295771 *||May 20, 2009||Nov 25, 2010||Microsoft Corporation||Control of display objects|
|US20100302138 *||May 29, 2009||Dec 2, 2010||Microsoft Corporation||Methods and systems for defining or modifying a visual representation|
|US20100302142 *||Dec 2, 2010||French Barry J||System and method for tracking and assessing movement skills in multidimensional space|
|US20100302145 *||Dec 2, 2010||Microsoft Corporation||Virtual desktop coordinate transformation|
|US20100302395 *||May 29, 2009||Dec 2, 2010||Microsoft Corporation||Environment And/Or Target Segmentation|
|US20100303289 *||Dec 2, 2010||Microsoft Corporation||Device for identifying and tracking multiple humans over time|
|US20100303290 *||Dec 2, 2010||Microsoft Corporation||Systems And Methods For Tracking A Model|
|US20100303291 *||Dec 2, 2010||Microsoft Corporation||Virtual Object|
|US20100304813 *||Dec 2, 2010||Microsoft Corporation||Protocol And Format For Communicating An Image From A Camera To A Computing Environment|
|US20100306712 *||May 29, 2009||Dec 2, 2010||Microsoft Corporation||Gesture Coach|
|US20100306713 *||May 29, 2009||Dec 2, 2010||Microsoft Corporation||Gesture Tool|
|US20100306714 *||Dec 2, 2010||Microsoft Corporation||Gesture Shortcuts|
|US20100306715 *||Dec 2, 2010||Microsoft Corporation||Gestures Beyond Skeletal|
|US20100306716 *||May 29, 2009||Dec 2, 2010||Microsoft Corporation||Extending standard gestures|
|US20110007079 *||Jan 13, 2011||Microsoft Corporation||Bringing a visual representation to life via learned input from the user|
|US20110007142 *||Jan 13, 2011||Microsoft Corporation||Visual representation expression based on player expression|
|US20110025689 *||Feb 3, 2011||Microsoft Corporation||Auto-Generating A Visual Representation|
|US20110050885 *||Aug 25, 2009||Mar 3, 2011||Microsoft Corporation||Depth-sensitive imaging via polarization-state mapping|
|US20110055846 *||Aug 31, 2009||Mar 3, 2011||Microsoft Corporation||Techniques for using human gestures to control gesture unaware programs|
|US20110062309 *||Sep 14, 2009||Mar 17, 2011||Microsoft Corporation||Optical fault monitoring|
|US20110064402 *||Sep 14, 2009||Mar 17, 2011||Microsoft Corporation||Separation of electrical and optical components|
|US20110069221 *||Mar 24, 2011||Microsoft Corporation||Alignment of lens and image sensor|
|US20110069841 *||Mar 24, 2011||Microsoft Corporation||Volume adjustment based on listener position|
|US20110069870 *||Sep 21, 2009||Mar 24, 2011||Microsoft Corporation||Screen space plane identification|
|US20110079714 *||Oct 1, 2009||Apr 7, 2011||Microsoft Corporation||Imager for constructing color and depth images|
|US20110083108 *||Oct 5, 2009||Apr 7, 2011||Microsoft Corporation||Providing user interface feedback regarding cursor position on a display screen|
|US20110085705 *||Dec 20, 2010||Apr 14, 2011||Microsoft Corporation||Detection of body and props|
|US20110093820 *||Oct 19, 2009||Apr 21, 2011||Microsoft Corporation||Gesture personalization and profile roaming|
|US20110099476 *||Apr 28, 2011||Microsoft Corporation||Decorating a display environment|
|US20110102438 *||May 5, 2011||Microsoft Corporation||Systems And Methods For Processing An Image For Target Tracking|
|US20110109617 *||Nov 12, 2009||May 12, 2011||Microsoft Corporation||Visualizing Depth|
|US20110151974 *||Jun 23, 2011||Microsoft Corporation||Gesture style recognition and reward|
|US20110154266 *||Dec 17, 2009||Jun 23, 2011||Microsoft Corporation||Camera navigation for presentations|
|US20110169726 *||Jan 8, 2010||Jul 14, 2011||Microsoft Corporation||Evolving universal gesture sets|
|US20110173204 *||Jan 8, 2010||Jul 14, 2011||Microsoft Corporation||Assigning gesture dictionaries|
|US20110173574 *||Jul 14, 2011||Microsoft Corporation||In application gesture interpretation|
|US20110175809 *||Jul 21, 2011||Microsoft Corporation||Tracking Groups Of Users In Motion Capture System|
|US20110182481 *||Jan 25, 2010||Jul 28, 2011||Microsoft Corporation||Voice-body identity correlation|
|US20110184735 *||Jan 22, 2010||Jul 28, 2011||Microsoft Corporation||Speech recognition analysis via identification information|
|US20110187819 *||Aug 4, 2011||Microsoft Corporation||Depth camera compatibility|
|US20110187820 *||Aug 4, 2011||Microsoft Corporation||Depth camera compatibility|
|US20110187826 *||Aug 4, 2011||Microsoft Corporation||Fast gating photosurface|
|US20110188027 *||Aug 4, 2011||Microsoft Corporation||Multiple synchronized optical sources for time-of-flight range finding systems|
|US20110188028 *||Aug 4, 2011||Microsoft Corporation||Methods and systems for hierarchical de-aliasing time-of-flight (tof) systems|
|US20110190055 *||Aug 4, 2011||Microsoft Corporation||Visual based identitiy tracking|
|US20110193939 *||Aug 11, 2011||Microsoft Corporation||Physical interaction zone for gesture-based user interfaces|
|US20110197161 *||Feb 9, 2010||Aug 11, 2011||Microsoft Corporation||Handles interactions for human-computer interface|
|US20110199291 *||Aug 18, 2011||Microsoft Corporation||Gesture detection based on joint skipping|
|US20110201428 *||Aug 18, 2011||Motiva Llc||Human movement measurement system|
|US20110205147 *||Aug 25, 2011||Microsoft Corporation||Interacting With An Omni-Directionally Projected Display|
|US20110210915 *||Sep 1, 2011||Microsoft Corporation||Human Body Pose Estimation|
|US20110216976 *||Mar 5, 2010||Sep 8, 2011||Microsoft Corporation||Updating Image Segmentation Following User Input|
|US20110217683 *||Jul 9, 2010||Sep 8, 2011||Olga Vlasenko||Methods and systems for using a visual signal as a concentration aid|
|US20110221755 *||Mar 12, 2010||Sep 15, 2011||Kevin Geisner||Bionic motion|
|US20110228251 *||Sep 22, 2011||Microsoft Corporation||Raster scanning for depth detection|
|US20110228976 *||Sep 22, 2011||Microsoft Corporation||Proxy training data for human body tracking|
|US20110234481 *||Sep 29, 2011||Sagi Katz||Enhancing presentations using depth sensing cameras|
|US20110234589 *||Sep 29, 2011||Microsoft Corporation||Systems and methods for tracking a model|
|US20110234756 *||Sep 29, 2011||Microsoft Corporation||De-aliasing depth images|
|US20110237324 *||Sep 29, 2011||Microsoft Corporation||Parental control settings based on body dimensions|
|US20120004055 *||Oct 22, 2010||Jan 5, 2012||Ramesh Balasubramanyan||Sensor based tennis serve training apparatus|
|US20120232360 *||Nov 17, 2010||Sep 13, 2012||Koninklijke Philips Electronics N.V.||Fitness test system|
|US20120276507 *||Apr 27, 2012||Nov 1, 2012||Dana Taylor||Athletic training device with lighted indicators|
|US20140288682 *||Jun 5, 2014||Sep 25, 2014||Nike, Inc.||Athletic Performance Monitoring Systems and Methods in a Team Sports Environment|
|WO1989003710A1 *||Oct 26, 1988||May 5, 1989||Innovative Training Products, Inc.||Start system batting unit|
|WO2011017324A1 *||Aug 3, 2010||Feb 10, 2011||Nike International Ltd.||A compact motion-simulating device|
|U.S. Classification||273/445, 482/84, 434/247, 482/902, 473/459, 273/454, 482/901|
|International Classification||A63B69/00, A63B69/38, A63B23/00, A63B26/00|
|Cooperative Classification||Y10S482/902, Y10S482/901, A63B2220/803, A63B69/0024, A63B69/0053, A63B71/0686, A63B71/0622, A63B2220/56, A63B2220/805, A63B2207/02, A63B2071/0625, A63B2024/0078, A63B2210/50, A63B69/38, A63B2220/62, A63B2225/50|
|Jul 25, 1986||AS||Assignment|
Owner name: INNOVATIVE TRAINING PRODUCTS, INC., 75 HASKETT DRI
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST.;ASSIGNORS:ELSTEIN, RICK A.;FARET, SVEIN;GAZZO, JOHN J.;REEL/FRAME:004585/0317;SIGNING DATES FROM 19860717 TO 19860723
Owner name: INNOVATIVE TRAINING PRODUCTS, INC.,NEW YORK
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:ELSTEIN, RICK A.;FARET, SVEIN;GAZZO, JOHN J.;SIGNING DATES FROM 19860717 TO 19860723;REEL/FRAME:004585/0317
|Jan 17, 1991||FPAY||Fee payment|
Year of fee payment: 4
|Apr 19, 1995||FPAY||Fee payment|
Year of fee payment: 8
|May 18, 1999||REMI||Maintenance fee reminder mailed|
|Oct 24, 1999||LAPS||Lapse for failure to pay maintenance fees|
|Jan 4, 2000||FP||Expired due to failure to pay maintenance fee|
Effective date: 19991027