|Publication number||US5690492 A|
|Application number||US 08/683,272|
|Publication date||Nov 25, 1997|
|Filing date||Jul 18, 1996|
|Priority date||Jul 18, 1996|
|Inventors||Gordon L. Herald|
|Original Assignee||The United States Of America As Represented By The Secretary Of The Army|
The invention described herein may be manufactured, used and licensed by or for the United States Government without payment to me of any royalty thereon.
The present invention relates generally to the field of training devices and their component features, and more specifically to devices that offer interactive simulated weapon system training by having images projected upon a screen.
In a man-in-the-loop, real-time target simulation, the participants (trainees) interact in a simulated or virtual reality environment, which can consist of a large theatrical type screen on which a fixed or time-changing visible video image is projected. The target image, such as an aircraft, is created for this application by a computer generated imagery system in which all the images are constructed from computer databases and the attributes of the image are known, including the screen coordinates of the objects projected onto the screen.
The complete computer generated image can consist of objects which depict the outside world including trees, hills, roads, buildings and other such objects. Moving and static objects defined as targets also appear in the overall image. For example, the target objects may consist of aircraft, helicopters, trucks, personnel, buildings and other military or industrial type targets. It is also possible that the image and targets could be symbolic in style for other applications. Generally, the computer generated image is projected on the imaging screen by video projectors.
In the above described simulation, the participants are provided with a targeting device, such as a simulated weapon (for example, a rifle or surface-to-air missile launcher) or other pointer device, which has a sighting system to allow the participant to aim the targeting device at a selected target. The participant activates a trigger to indicate to the computer system which target has been selected and the action the participant desires. The computer and its related image generation system then apply visible changes to the target to indicate a "hit," a "miss," or some other type of damage.
In currently available simulation systems, the simulated weapon or pointing device is generally instrumented with a system that determines the x, y, z position and the pitch, roll, and yaw attitude parameters of the device as referenced to a simulation database. These instrumentation systems use magnetic, acoustic, or pattern recognition methods to determine the position and attitude parameters, which can then be related to simulation image data in order to determine which target the participant is pointing at or aiming at. Other systems use video image analysis and interpretation to determine the possibility of a target and to determine which target is selected.
The prior art methods do not work well in some simulation environments, or they are too complex to use in a real-time, man-in-the-loop simulation environment. For example, the accuracy of magnetic systems is affected by nearby metal objects and by fields from equipment such as CRTs. Also, magnetic sensor systems cannot be mounted on metals that affect the magnetic fields of the system. Acoustical systems have limited range between the source and sensor and have accuracy limitations. Video image analysis and interpretation is computationally complex, time consuming, and not highly reliable due to target image quality problems and the large number of possible target aspects that must be considered.
It is therefore an object of the present invention to provide a device to accurately detect which of several target images projected on a large theatrical type screen is being aimed at during interactive weapon system simulation training.
A further object of the present invention is to provide a non-visible, light based method and apparatus for identifying which one of several target images visible on a large theatrical type image screen is being pointed at, without requiring magnetic sensors, acoustical sensors, or image analysis.
A still further object is to provide a means for target identification during imaged simulation training that does not interfere with the visible image.
Still other objects and advantages of the present invention will become readily apparent to those skilled in this art from the detailed description, wherein only the preferred embodiment of the present invention is shown and described, simply by way of illustration of the best mode contemplated of carrying out the present invention. As will be realized, the present invention is capable of other and different embodiments, and its several details are capable of modifications in various obvious respects, all without departing from the present invention. Accordingly, the drawings and descriptions are to be regarded as illustrative in nature, and not as restrictive.
These and other objects are achieved by a device that determines which target a participant has selected during simulated weapon training. The device uses a second projector, in addition to the computer generated image projector, to project a non-visible light onto the screen. The non-visible projector consists of a high intensity light source such as one or more strobe lights, infrared pass filters, a lens system, and a dynamically positioned aperture system which uses a liquid crystal display. The strobe lights are capable of producing an intense, short burst of light. A light controller triggers each strobe light in order to produce a pulsed light stream of a desired pattern. A serial light pulse pattern (analogous to binary "1" and "0" logic states) is generated in which a logic "1" is represented by a pulse of light and a logic "0" is represented by the absence of a light pulse. The pulsed light stream is analogous to an asynchronous computer communication data link control protocol and can consist of up to 8 data pulses, which provide up to 255 target identification numbers. The data pulses are framed by a start pulse at the beginning and a stop pulse at the end.
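The framing scheme described above can be sketched in Python. The start/stop framing and 8-pulse payload follow the text; the LSB-first bit order is an assumption added for illustration, since the patent does not specify one.

```python
def encode_target_id(target_id: int) -> list[int]:
    """Build the serial light-pulse frame for one target.

    Each list element is one pulse slot: 1 means a strobe pulse is
    fired, 0 means the pulse is omitted.  The frame is a start pulse,
    8 data pulses, and a stop pulse; the LSB-first bit order is an
    assumption, since the patent does not specify one.
    """
    if not 0 < target_id <= 255:
        raise ValueError("8 data pulses encode target IDs 1..255")
    data_bits = [(target_id >> bit) & 1 for bit in range(8)]
    return [1] + data_bits + [1]  # start pulse + data pulses + stop pulse
```

Target ID 5, for example, produces the frame `[1, 1, 0, 1, 0, 0, 0, 0, 0, 1]`: a start pulse, the bits of 5 least-significant first, and a stop pulse.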
The computer generated imagery projector and the non-visible light projector are adjusted to create a composite image, visible and non-visible, projected on the screen such that the non-visible image is overlaid on the visible image at all times. The non-visible image contains information related only to certain pre-defined targets. The participant is able to sense the visible portion of the image on the screen but not the non-visible image, which is reflected from the screen to an electronic IR sensor that senses only the non-visible image. This reflected non-visible light is converted to pulsed electrical signals by the sensor and output to an interface, where it is further processed to adjust the signal output levels to those required by the computer input port. Software in the computer determines which target is currently being pointed at by the participant's pointing device or simulated weapon.
FIG. 1 depicts a participant aiming a simulated rocket launcher at a simulated target projected on a large screen.
FIG. 2 is a block diagram of the present invention.
Referring now in detail to the drawings wherein like parts are designated by like reference numerals throughout, there is illustrated in FIG. 1 a participant 37 aiming a simulated weapon 3 at a simulated target 32 projected on a large theater-type screen 1 by computer generated imagery projector 2. Target 32 is shown as an aircraft, but can be any number of simulated threats, such as a missile, ship, tank, or, as shown, helicopter 33. Other terrain features are also being projected by projector 2 such as trees 34 and 35, and ground 36. Other features can be projected as desired by the particular simulation.
In addition to computer generated imagery projector 2, a non-visible light projector 4 is provided. Projector 2 and projector 4 are adjusted to create a composite target image: the visible portion, represented by target 32, is seen by participant 37, while the non-visible image 5 is not seen by participant 37 but is reflected off screen 1 to sensor 7 located on weapon 3. Non-visible image 5 is overlaid on target 32 by projector 4, is correlated to visible image 32 at all times, and contains information related only to defined targets. In other words, non-visible image 5 is overlaid only on targets 32 and 33, not on the non-target images 34, 35, and 36.
Operation and design of the present invention is best shown in FIG. 2. Within non-visible projector 4, light is transmitted by one or more strobe lights 31, controlled by strobe light controller 11, which is in turn controlled by computer 9 via data line 41. Strobe light systems require a short recovery time after a light burst is generated before they can be triggered again to provide a subsequent burst. For applications where the light pulse pattern requires a rapid pulse stream that cannot be generated by a single strobe light, a system of two or more strobe lights can be used as shown. When two or more strobe lights are used, strobe light controller 11 outputs triggers to the strobe lights in a sequential mode so that each light pulse is generated by a strobe light that is no longer in the recovery state. The light emitted by strobe lights 31 is received by a lens 13 which spreads the light over the surface of a dynamic light positioning aperture 24. Light positioning aperture 24 consists of a liquid crystal display (LCD) in which a small light transmitting area (aperture) is created by LCD aperture controller 10, which is controlled by computer 9 via data line 42. This aperture corresponds to the screen position of the target and is maintained in a transmitting state for the duration of the non-visible light pulses associated with that particular target. The remainder of LCD 24 is in the light absorbing state. Although an aperture is created for each target, only one aperture is open at any one time. A second lens 23 and an infrared (IR) pass filter 22, on the light output side of LCD 24, focus the pulsed IR light beam 25 on screen 1. For each target, a new aperture is created, and the position of the aperture on LCD 24 is determined by the coordinates of the target on screen 1. Each aperture is maintained for the duration of the non-visible light pulses 25 which identify the target.
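The sequential strobe triggering can be illustrated with a small scheduling sketch. The patent states only that the controller triggers a strobe that is no longer recovering; the lowest-index-first selection policy and the timing values in the example are assumptions.

```python
def pick_ready_strobe(last_fired: list[float], now: float,
                      recovery_s: float) -> int:
    """Return the index of a strobe light that has completed its
    recovery interval, mimicking the sequential triggering performed
    by strobe light controller 11.  last_fired[i] holds the time the
    i-th strobe was last triggered.  Raises if no strobe is ready,
    which would mean more strobes (or a slower pulse rate) are needed.
    """
    for i, t_fired in enumerate(last_fired):
        if now - t_fired >= recovery_s:
            return i
    raise RuntimeError("no strobe recovered; pulse rate too fast")
```

With two strobes whose recovery time exceeds the pulse spacing, the controller naturally alternates between them, since whichever strobe fired most recently is still recovering when the next pulse is due.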
The aperture position is updated at the video frame rate or a sub-frame rate of the computer generated imagery system (for a 30 Hz frame rate, each aperture position is maintained for 33 milliseconds). Apertures are created one after the other for each target. After the aperture for the last target has been completed, the aperture sequence is started again for all targets remaining in the visual field.
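The cyclic aperture sequencing can be sketched as follows; the `(target_id, (x, y))` tuple layout is a hypothetical representation of the target records the aperture controller would receive, not a structure the patent defines.

```python
from itertools import cycle, islice

FRAME_RATE_HZ = 30
HOLD_MS = 1000.0 / FRAME_RATE_HZ  # ~33 ms per aperture position at 30 Hz

def aperture_schedule(targets, n_slots):
    """First n_slots aperture assignments: one (target_id, (x, y),
    hold_ms) tuple per frame interval, cycling through all visible
    targets so each one is re-identified in turn after the last
    target's aperture completes."""
    return [(tid, pos, HOLD_MS)
            for tid, pos in islice(cycle(targets), n_slots)]
```

For two targets, the schedule simply alternates: target 1, target 2, target 1, and so on, holding each aperture open for one frame interval.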
Detection of a target is accomplished as follows. The pulsed IR light beam 25 is reflected off screen 1 as a reflected beam 26 to a reflected-IR sensor 7 located on weapon 3. Sensor 7 can be tubular, with a lens system at the front and a photo detector at the rear as shown. The tube size and lens system are designed to limit its field-of-view such that only the non-visible light reflected from a limited area of screen 1 is incident on the photo detector. This allows target discrimination and prevents interference from the reflected non-visible light of other nearby targets. Reflected beam 26, detected by the photo detector in sensor 7, has the pulse characteristics generated by the non-visible light projector 4 at each target position on screen 1. Reflected light 26 carries the same unique pulse pattern for each target that was originally generated by projector 4's light pulse stream, which provides a target identification number for each target. A stop filter, or an equivalent optical device, is placed at the output of the computer generated imagery projector 2 to pass the visible light which composes the image while blocking the non-visible light to which the photo detector is sensitive.
Sensor 7 has electronic circuitry to convert the serially pulsed light data to a serially pulsed electronic signal. An interface 8 receives the output of the IR sensor 7 and converts it to acceptable voltage levels and polarity for transmission to host computer 9 via data line 43. Target detection and recognition occurs upon decoding and processing of the serial electronic pulse stream by computer 9.
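A minimal sketch of the decoding step performed in computer 9, assuming the same 10-slot frame and LSB-first bit order used in the earlier encoding sketch (the patent specifies the start/stop framing but not the bit order):

```python
def decode_frame(frame: list[int]) -> int:
    """Recover the target identification number from a detected pulse
    frame: verify the start and stop pulses, then assemble the 8 data
    bits (LSB first, by assumption) into an integer."""
    if len(frame) != 10 or frame[0] != 1 or frame[-1] != 1:
        raise ValueError("malformed frame: bad length or framing pulses")
    return sum(bit << i for i, bit in enumerate(frame[1:-1]))
```

Because the start and stop pulses bracket every frame, the decoder can reject truncated or corrupted pulse streams before reporting a target ID, much as an asynchronous serial receiver discards frames with framing errors.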
It will be readily seen by one of ordinary skill in the art that the present invention fulfills all of the objects set forth above. After reading the foregoing specification, one of ordinary skill will be able to effect various changes, substitutions of equivalents and various other aspects of the present invention as broadly disclosed herein. It is therefore intended that the protection granted hereon be limited only by the definition contained in the appended claims and equivalents thereof.
Having thus shown and described what is at present considered to be the preferred embodiment of the present invention, it should be noted that the same has been made by way of illustration and not limitation. Accordingly, all modifications, alterations and changes coming within the spirit and scope of the present invention are herein meant to be included.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4290757 *||Jun 9, 1980||Sep 22, 1981||The United States Of America As Represented By The Secretary Of The Navy||Burst on target simulation device for training with rockets|
|US4336018 *||Dec 19, 1979||Jun 22, 1982||The United States Of America As Represented By The Secretary Of The Navy||Electro-optic infantry weapons trainer|
|US4496158 *||Dec 13, 1982||Jan 29, 1985||Sanders Associates, Inc.||Electro-optical sensor for color television games and training systems|
|US4824324 *||Apr 29, 1988||Apr 25, 1989||Koyo Seiko Co., Ltd.||Water pump|
|US5194008 *||Mar 26, 1992||Mar 16, 1993||Spartanics, Ltd.||Subliminal image modulation projection and detection system and method|
|US5215464 *||Nov 5, 1991||Jun 1, 1993||Marshall Albert H||Aggressor shoot-back simulation|
|US5321263 *||May 10, 1993||Jun 14, 1994||Simon Marketing, Inc.||Recording target|
|US5541746 *||Oct 10, 1995||Jul 30, 1996||Sanyo Electric Co., Ltd.||Light source device for use in liquid crystal projectors|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US6283862 *||Apr 3, 1998||Sep 4, 2001||Rosch Geschaftsfuhrungs Gmbh & Co.||Computer-controlled game system|
|US6530782 *||Mar 1, 2001||Mar 11, 2003||The United States Of America As Represented By The Secretary Of The Navy||Launcher training system|
|US6537153 *||Jul 19, 2001||Mar 25, 2003||Namco Ltd.||Game system, program and image generating method|
|US6569019 *||Jul 10, 2001||May 27, 2003||William Cochran||Weapon shaped virtual reality character controller|
|US6771349 *||Sep 30, 2002||Aug 3, 2004||David H. Sitrick||Anti-piracy protection system and methodology|
|US6811267 *||Jun 9, 2003||Nov 2, 2004||Hewlett-Packard Development Company, L.P.||Display system with nonvisible data projection|
|US7345265 *||Jul 15, 2005||Mar 18, 2008||Cubic Corporation||Enhancement of aimpoint in simulated training systems|
|US7687751||Oct 31, 2007||Mar 30, 2010||Cubic Corporation||Enhancement of aimpoint in simulated training systems|
|US8006311||Sep 18, 2008||Aug 23, 2011||Korishma Holdings, Llc||System and methodology for validating compliance of anti-piracy security and reporting thereupon|
|US8267690 *||Apr 4, 2006||Sep 18, 2012||Saab Ab||Simulating device|
|US20040061676 *||Sep 30, 2002||Apr 1, 2004||Sitrick David H.||Anti-piracy protection system and methodology|
|US20060073438 *||Jul 15, 2005||Apr 6, 2006||Cubic Corporation||Enhancement of aimpoint in simulated training systems|
|US20060228677 *||Apr 4, 2006||Oct 12, 2006||Saab Ab||Simulating device|
|US20070197290 *||Sep 17, 2004||Aug 23, 2007||Ssd Company Limited||Music Game Device, Music Game System, Operation Object, Music Game Program, And Music Game Method|
|US20070254266 *||May 1, 2006||Nov 1, 2007||George Galanis||Marksmanship training device|
|US20080212833 *||Oct 31, 2007||Sep 4, 2008||Cubic Corporation||Enhancement of aimpoint in simulated training systems|
|US20090021372 *||Sep 18, 2008||Jan 22, 2009||Sitrick David H||System and methodology for validating anti-piracy security compliance and reporting thereupon, for one to a plurality of movie theaters|
|US20090134332 *||Nov 27, 2007||May 28, 2009||Thompson Jason R||Infrared Encoded Objects and Controls for Display Systems|
|US20110053120 *||Aug 3, 2010||Mar 3, 2011||George Galanis||Marksmanship training device|
|DE112004002945B4 *||Sep 7, 2004||Oct 2, 2008||Hewlett-Packard Development Co., L.P., Houston||Projection machine|
|WO2006028459A1 *||Sep 7, 2004||Mar 16, 2006||Hewlett-Packard Development Company, L.P.||Display system with nonvisible data projection|
|WO2011028008A3 *||Aug 31, 2010||Jul 21, 2011||Nam-Woo Kim||Dynamic real direction shooting training system|
|U.S. Classification||434/20, 463/5, 463/51|
|Jul 18, 1996||AS||Assignment|
Owner name: ARMY, UNITED STATES OF AMERICA, AS REPRESENTED BY
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:HERALD, GORDON L.;REEL/FRAME:008643/0219
Effective date: 19960708
|Dec 1, 2000||FPAY||Fee payment|
Year of fee payment: 4
|Jun 15, 2005||REMI||Maintenance fee reminder mailed|
|Nov 25, 2005||LAPS||Lapse for failure to pay maintenance fees|
|Jan 24, 2006||FP||Expired due to failure to pay maintenance fee|
Effective date: 20051125