|Publication number||USH2099 H1|
|Application number||US 09/349,357|
|Publication date||Apr 6, 2004|
|Filing date||Jul 6, 1999|
|Priority date||Jul 6, 1999|
|Inventors||Bruce M. Heydlauff, Thomas F. Reese|
|Original Assignee||The United States Of America As Represented By The Secretary Of The Navy|
The invention described herein may be manufactured and used by or for the government of the United States of America for governmental purposes without the payment of any royalties thereon or therefor.
1. Field of the Invention
The present invention pertains generally to digital video injection systems and methods, and more particularly to a real time flight simulation system that allows matching of a complex scene image to weapons system optics to generate a realistic detailed response.
2. Brief Description of the Related Art
Many new weapons systems are currently being developed which employ advanced high resolution imaging systems. These imaging systems often operate at frequencies outside the visible light spectrum to enhance their ability to detect targets at night or during adverse weather conditions. Costs to develop these systems are rapidly increasing as advanced signal processing algorithms are employed to increase the probability of target detection and target recognition, with the ultimate goal of accurate target tracking and precise aim point selection for maximum probability of kill and minimum collateral damage. Live missile firings and captive-flight tests are expensive and often restricted to a limited number of available test sites, and unpredictable environmental conditions diminish the effectiveness of such testing for anticipating problems during actual combat operations.
Various weapon systems have been developed which employ advanced high resolution electro-optical/infrared (EO/IR) raster-scan image-based seeker systems. These image-based weapon systems typically utilize advanced signal processing algorithms to increase the probability of target detection and target recognition. Validation of such signal processing algorithms has traditionally been carried out through free flight, captive carry and static field tests of the image-based weapon systems, followed by lab analysis, modification of the algorithms, and then subsequent field tests followed by further analysis and modifications. This process is generally costly and time-intensive. Obtaining the correct target and weather conditions can add cost and time delay to this test/modify/re-test cycle. Further, the process is incapable of working in a simulated "virtual" environment.
By accurately generating a digital image, rendered for the correct frequency spectrum, and fed to the imaging signal processing electronics, algorithms may be more easily developed and tested at a significantly lower cost. Images may be geometrically altered depending on the desired look angle and range to the target. Atmospheric conditions may be altered to test their effects.
Prior technologies used to develop imaging weapons systems, none of which fully resolves the problems addressed by the present invention, include free flight test, captive carry flight test, static test, and CARCO Table dynamic simulation. Free flight, captive carry, and static testing all have the advantage of exercising the actual weapons system, but they are often not repeatable, rarely provide all the desired test parameters at one time, and may be very expensive. CARCO Table dynamic testing is probably the best developmental testing tool, but creating a dynamic, accurate, real-time image in the correct frequency band, which the seeker may actually see, is cost prohibitive and in most cases beyond the currently available state of the art.
Accordingly, there is a need for a real time detailed video injection system and method that generates correct, complex, real-time output for testing. The present invention addresses these needs.
The Digital Video Injection System, abbreviated as DVIS, of the present invention comprises a real time, weapons virtual reality system that facilitates the design, development, test, validation, and simulation of imaging systems, such as imaging weapons systems.
The present invention includes a digital video injection system comprising a digital image source constituted from a reference image, means for image processing capable of converting an input from the digital image source into a geometrically correct frequency rendered digital image, a scan converter capable of accepting the digital image, wherein the scan converter is capable of storing and converting the digital image for compatibility with an imaging system, a seeker-dynamics interface capable of simulating the pointing system of the imaging system, and, a digital system controller capable of solving motion import in real time to the imaging system. In preferred embodiments, the imaging system of the present invention also is part of the digital video injection system, and the imaging system comprises an imaging weapons system.
The present invention further includes a method for injecting digital video, comprising the steps of generating a digital image source constituted from a reference image, converting an input from the digital image source into a digital image, storing and rate converting the digital image for compatibility with an imaging system, controlling the imaging system sufficiently for target orientation, and, solving motion-rate change to the imaging system, wherein the imaging system engages in real-time simulated flight.
Additionally, the present invention includes a digital video input product provided by the process of generating a digital image source constituted from a reference image, converting an input from the digital image source into a digital image, storing and rate converting the digital image for compatibility with an imaging system, controlling the imaging system sufficiently for target orientation, and, solving motion-rate change to the imaging system, wherein the imaging system engages in real-time simulated flight.
FIG. 1 is a functional block diagram of a digital video injection system of the present invention; and,
FIG. 2 is an operational flowchart illustrating the digital signal injection modeling process of the present invention; and
FIG. 3 is an operational flowchart illustrating the digital signal injection modeling process of the present invention.
The present invention comprises a system and method for digital video injection of real time flight simulation that allows matching of a complex scene image to weapons system optics to generate a realistic detailed response. In addition to weapons systems, the present invention is applicable to other imaging systems, such as the space shuttle, auto-pilot devices and/or other systems that direct a guided object to a specific coordinate location. Operational aspects of the present invention are disclosed in U.S. patent application Ser. No. 09/267,912 under the title “REAL TIME DETAILED SCENE CONVOLVER” of Dennis Mckinney, Bruce Heydlauff, and John Charmer, filed Mar. 5, 1999, the disclosure of which is herein incorporated by reference.
FIG. 1 shows the Digital Video Injection System (DVIS) 10 that comprises a digital image source 100, a means for image processing 200, a scan converter 300, a seeker-dynamics interface 400, and a digital system controller 600 that correlates information to an imaging system 500. The digital video injection system 10 comprises a unique application of commercially available hardware and software, as well as custom electronic interfaces, capable of providing real time, frequency rendered, geometrically corrected, formatted images to the imaging system 500.
The digital image source 100 is constituted from a reference image. Reference images are obtained from a larger "library" of images that are organized in geocoordinate order for selection. The image source 100 may be any digital image file representing the area of interest; any image source capable of representing a geographical location may be converted to a digital image file and used. Examples include photographs, real sensor data, SPOT satellite images, and other like image sources that represent geographic locations.
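A library ordered by geocoordinate lends itself to binary-search selection of the reference image covering a requested location. The following sketch illustrates one way such a lookup might work; the tile names, latitude keys, and function names are illustrative assumptions, not details from the patent:

```python
import bisect

# Library of reference images keyed by the southernmost latitude each
# tile covers, kept in geocoordinate order (names are illustrative).
library = [
    (32.0, "tile_socal.img"),
    (36.0, "tile_central.img"),
    (40.0, "tile_norcal.img"),
]

def select_reference(lat):
    # Because the tiles are stored in geocoordinate order, a binary
    # search finds the tile whose coverage begins at or below the
    # requested latitude.
    keys = [k for k, _ in library]
    i = bisect.bisect_right(keys, lat) - 1
    return library[max(i, 0)][1]

select_reference(37.5)  # -> "tile_central.img"
```

A real library would index in two dimensions (latitude and longitude) and by image type, but the ordered-lookup principle is the same.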
The means for image processing 200 converts an input from the digital image source 100 into a geometrically correct frequency rendered digital image. The inputted digital image source 100 is converted into a visual digital image, which is then converted into a specified frequency band of a given imaging system 500 by correcting gray scales for the inputted digital image to the proper frequency spectrum. The means for image processing 200 also geometrically corrects the look angle and range of the digital image. The means for image processing 200 preferably comprises a Silicon Graphics ONYX Image Processing System 210 and associated software 220. The Silicon Graphics ONYX Digital Image Processing Computer 210 and the associated software 220 accept any image that may be converted to a digital image file. Once received by the Silicon Graphics ONYX Digital Image Processing Computer 210, the image is processed in real time to geometrically correct the image for look angle and range simulation as properly “seen” by the imaging system 500. The image is then rendered to the correct gray scales for the actual frequency spectrum of the imaging system 500, i.e., a visual frequency band photograph could be rendered to an infrared (IR), ultraviolet (UV), x-ray, or other equivalent image, with the selection of the equivalent image determinable by those skilled in the art.
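The gray-scale correction step described above can be sketched as a per-pixel remapping through a lookup table that stands in for the target band's radiance response. The quadratic response curve and the function names here are illustrative assumptions, not the ONYX system's actual rendering model:

```python
def render_to_band(visual_row, band_lut):
    # Remap each visual-band gray level (0-255) to its equivalent
    # level in the target frequency band via a lookup table.
    return [band_lut[p] for p in visual_row]

# Illustrative LUT: a squared response stands in for a real
# visual-to-IR radiance mapping derived from a sensor model.
ir_lut = [int(255 * (g / 255.0) ** 2) for g in range(256)]

ir_row = render_to_band([0, 128, 255], ir_lut)  # -> [0, 64, 255]
```

In practice the mapping would be derived from measured material emissivities and the seeker's spectral response rather than a fixed curve, and the geometric correction for look angle and range would be applied as a separate perspective transform.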
Software 220 of the means for image processing 200 includes four processes: image processing 222, three-dimensional modeling 224, infrared scene model generation 226, and real-time infrared scene traversal 228. Image processing 222 comprises the first step in generating the scene to be injected. Preferably, image processing 222 uses photogrammetric workstations developed by GDE of San Diego, California under the tradename GeoSet, and Autometrics under the tradename Griffon. The workstations use mathematical models to rectify and register images collected by satellites to earth coordinates. Other capabilities of the workstations include feature and terrain extraction, modeling, and manipulation. The data generated from these workstations are exported to the three-dimensional modeling process 224.
The three-dimensional modeling 224 process produces an accurate polygonal representation of the scene. Preferably, a three-dimensional modeler, developed by MultiGen-Paradigm of San Jose, California under the tradename MultiGen, is used. The MultiGen modeler imports data from the photogrammetric workstations to accurately position and render the significant features within the scene. Once the significant features are accurately modeled, an image analyst determines the material characteristics of the feature and assigns a material property to each polygon within the database. The three-dimensional model 224 is then exported for processing by the infrared scene generation 226 process.
The infrared scene generation 226 process converts the three-dimensional model 224 into the desired infrared spectrum. Preferably, an infrared database modeler, developed by Technology Service Corp. of Bloomington, Indiana under the tradename IRGen, is used. IRGen contains a thermal model to compute the radiance values for each polygon within the database. A gray scale value for each vertex of each polygon is computed to represent the radiance of each polygon as perceived by the seeker modeled in the sensor model. In addition, the atmospheric model computes transmittance and sky radiance, which is later integrated with the visibility/haze function of the real-time scene traversal process. The model is then exported in a MultiGen database format for processing by the real-time scene traversal 228 process.
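The reduction of a vertex's radiance to a gray level, with atmospheric effects applied, might look like the following sketch. The linear quantization, the radiance range, and the function name are illustrative assumptions, not IRGen's actual equations:

```python
def vertex_gray(radiance, transmittance, path_radiance,
                r_min=0.0, r_max=10.0):
    # Apparent radiance at the sensor: surface radiance attenuated by
    # the atmosphere, plus radiance added along the path (sky/haze).
    apparent = radiance * transmittance + path_radiance
    # Linearly quantize into the seeker's 0-255 gray range and clamp.
    level = (apparent - r_min) / (r_max - r_min) * 255.0
    return max(0, min(255, int(level)))

vertex_gray(8.0, 0.9, 0.5)  # apparent radiance 7.7 -> gray level 196
```

A real thermal model would first compute the surface radiance from material properties, sun load, and time of day before this quantization step.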
The real-time scene traversal 228 software reads the database and the atmospheric profile computed by IRGen. The position of the seeker within the scene and the viewing angles are computed by an external simulation. The position and angles are sent through a reflective shared memory network for processing by the real-time scene traversal 228 software. The software then generates the proper scene and injects it into the seeker processor of the imaging system 500.
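The per-frame cycle described above (read seeker state, render, inject) can be sketched as a simple loop. Here the reflective shared memory network is modeled as a plain dictionary and the renderer and seeker-processor interface are stand-in callables; all of these are illustrative assumptions:

```python
def traversal_loop(shared_state, render_scene, inject, frames=3):
    # Each cycle reads the seeker position and viewing angles posted
    # by the external simulation, renders the proper scene, and
    # injects it into the seeker processor.
    injected = []
    for _ in range(frames):
        pos = shared_state["position"]
        angles = shared_state["angles"]
        frame = render_scene(pos, angles)
        inject(frame)
        injected.append(frame)
    return injected

# Stand-ins for shared memory, the renderer, and the injection path.
shared = {"position": (0.0, 1000.0), "angles": (0.0, -14.0)}
frames = traversal_loop(shared,
                        render_scene=lambda p, a: ("scene", p, a),
                        inject=lambda f: None)
```

The real system runs this loop at the seeker's frame rate, with the external simulation updating the shared state asynchronously between reads.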
Once the image has been geometrically corrected and frequency rendered in the means for image processing, the image is outputted in a digital format to the scan converter 300.
The scan converter 300 accepts the digital image from the means for image processing 200. Once accepted, the scan converter 300 stores the digital image for organized retrieval, and converts the digital image for compatibility with a given imaging system 500. The digital image is stored in one of three dual-ported random access memory arrays. The scan converter retrieves the digital image at a rate required for real time operation of the imaging system 500. The scan converter 300 preferably comprises a custom designed electronic scan converter 300 that accepts the ONYX digital image, storing and rate converting the image to match the requirements of the imaging system 500. In the scan converter 300, the digital image 100 is selectively retrieved from dual ported random access memory arrays 310 in the format and at a rate required by the imaging system 500. The dual ported random access memory arrays 310 output the current scene image as the scene image for the next update is fed into the dual ported random access memory arrays 310.
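The three dual-ported memory arrays effectively form a rotating buffer set: the seeker reads the current scene out of one array while the next update is written into another. A minimal sketch of that rotation, with illustrative buffer sizes and class names (not the custom hardware's design):

```python
from collections import deque

class TripleBuffer:
    # Sketch of the dual-ported memory scheme: the current scene is
    # read out at the seeker's rate while the next update is written
    # into another array; the arrays rotate roles each frame.
    def __init__(self):
        self.buffers = deque([bytearray(4), bytearray(4), bytearray(4)])

    @property
    def read_buffer(self):
        return self.buffers[0]

    @property
    def write_buffer(self):
        return self.buffers[1]

    def swap(self):
        # Make the just-written scene the one the seeker reads next;
        # the third array becomes the new write target's spare.
        self.buffers.rotate(-1)

tb = TripleBuffer()
tb.write_buffer[:] = b"\x01\x02\x03\x04"
tb.swap()
tb.read_buffer  # now holds the newly written scene
```

In hardware, the dual-ported arrays allow the read and write to proceed simultaneously, so the seeker never sees a partially updated frame.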
The seeker-dynamics interface 400 simulates the pointing system of the imaging system 500 by providing control input and output signals to the imaging system 500. Preferably, the seeker dynamics interface 400 comprises a custom designed electronic seeker dynamics interface 400 that provides all the controls required by the imaging weapons systems 500, such as a simulated gimbal with rate controls and position feedback. This satisfies the expected control loop required by the imaging system 500.
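The simulated gimbal with rate controls and position feedback can be sketched as a rate command integrated against a position limit. The limit, step size, and class name below are illustrative assumptions, not the custom interface's parameters:

```python
class SimulatedGimbal:
    # Sketch of a rate-commanded gimbal with position feedback,
    # standing in for the seeker-dynamics interface.
    def __init__(self, limit_deg=60.0):
        self.angle = 0.0          # position feedback, degrees
        self.limit = limit_deg    # mechanical gimbal stop

    def step(self, rate_cmd_dps, dt):
        # Integrate the commanded slew rate over one control cycle,
        # respecting the gimbal limits, and return the position the
        # seeker's control loop expects as feedback.
        self.angle = max(-self.limit,
                         min(self.limit, self.angle + rate_cmd_dps * dt))
        return self.angle

g = SimulatedGimbal()
for _ in range(10):
    pos = g.step(rate_cmd_dps=20.0, dt=0.1)   # 20 deg/s for 1 s -> 20 deg
clamped = SimulatedGimbal(limit_deg=5.0).step(100.0, 1.0)  # hits the 5 deg stop
```

Because the seeker's control loop only observes the rate-command input and the position feedback, a purely electronic model like this can satisfy the loop without any moving hardware.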
The imaging system 500 of the present invention may be any imaging seeker 510, preferably an imaging seeker 510 for a weapons system. Imaging seekers 510 are standardized for given systems, and tested for the normal flight capabilities and performance of those systems, such as a Tomahawk missile, Harpoon missile, Space Shuttle navigational system, and/or other imaging systems 500 capable of being tested. Generally, the imaging seekers 510 comprise a sensor, a pointing system, input/output interfaces, and a signal processor. Imaging systems 500 may non-exclusively include surface-to-air, air-to-surface or air-to-air threat weapon signal processors, including a combination of tracker, counter-countermeasure and guidance circuitry, such as imaging infrared (IR), millimeter wave, laser detection and ranging (LADAR), synthetic aperture radar (SAR), and television. Sensor field of view, resolution, scan patterns, and sensitivity are modeled in real time. The target scene is injected into the seeker video processor to test the target acquisition, tracking, and man-in-the-loop characteristics of the imaging seeker in conjunction with the rest of the missile, data link, and aircraft systems.
The digital system controller 600 computes real time updates to the imaging system 500. The digital system controller 600 solves motion import in real time to the imaging system 500, i.e., real-time updated inputs or updates of motion, speed and orientation. The digital system controller 600 comprises a real time computer which solves the equations of motion and updates the imaging system 500, as well as the means for image processing 200, with current position and seeker look angle. With the update from the digital system controller 600, the means for image processing 200 may dynamically correct and render the digital image 100 for the simulated range and line of sight from the target to the imaging seeker 510. The real time digital system controller 600 is capable of solving the equations of motion and updating the imaging system 500, seeker dynamics 400, and the means for image processing 200 with the necessary controls, simulated position, and line of sight information. The digital system controller 600 preferably comprises an array of SPARC 1E, 2E, and 10, and PowerPC VME-based Single Board Computers.
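One update cycle of such a controller can be sketched as a kinematic integration followed by the range and line-of-sight computation handed to the image processor. This uses simple point-mass kinematics in two dimensions; the function name, state layout, and numbers are illustrative, and a real controller would solve full six-degree-of-freedom equations of motion:

```python
import math

def update_state(pos, vel, target, dt):
    # One real-time update cycle: integrate the equations of motion,
    # then derive the range and line-of-sight angle the image
    # processor needs to render the next frame.
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    dx, dy = target[0] - pos[0], target[1] - pos[1]
    rng = math.hypot(dx, dy)                 # slant range to target
    los = math.degrees(math.atan2(dy, dx))   # line-of-sight angle
    return pos, rng, los

pos, rng, los = update_state(pos=(0.0, 1000.0), vel=(200.0, -50.0),
                             target=(4000.0, 0.0), dt=0.1)
# After one 100 ms frame: pos = (20.0, 995.0)
```

Each cycle's `rng` and `los` values are what drive the geometric correction of the scene, so running this loop at frame rate is what makes the simulated flight appear continuous to the seeker.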
FIG. 2 is an operational flowchart illustrating the digital signal injection modeling process of the present invention in real time. As seen in FIG. 2 blocks A, B, C1, C2 and D, the Digital Signal Injection Software approach includes four processes. First, image processing uses mathematical models to rectify and register the reference images to earth coordinates and to perform feature/terrain extraction. This occurs in block A via functional blocks: MATRIX, SOCKETSET AND GRIFFON, DATAMASTER, EO IMAGE, IR IMAGE, SAR IMAGE, and PPDBS. Second, in block B, MULTIGEN, three-dimensional modeling produces an accurate polygonal representation of the scene, where each polygon may be assigned material characteristics relative to the desired frequency spectrum. Third, in blocks C1 and C2, scene generation converts the three-dimensional model of the scene to radiance values in the desired frequency spectrum by applying, via the functional blocks of C1, a thermal model, environmental model, sensor model, and atmospheric model to each polygon. This occurs in the blocks IRGEN C1 and NONCONVENTIONAL EXPLOITATION FACTORS DATA SYSTEM C2. The first three processes (including the thermal, environmental, and sensor models) are non-real-time events. In the fourth process, real-time scene traversal reads the database of the processed scene and, using the computed seeker position and viewing angles, generates the geometrically correct, frequency rendered scene in real time for injection into the seeker processor.
In operation, a digital image source constituted from a reference image is inputted into the digital video injection system 10, previously described. The digital video injection system 10 converts the digital image source into a formatted digital image, and stores that digital image. The digital video injection system 10 then rate converts the digital image for compatibility with a selected imaging system 500. Additionally, the digital video injection system 10 controls the selected imaging system 500 to provide target orientation. The digital video injection system 10 engages the selected imaging system 500 in real-time simulated flight by solving motion-rate change over an appropriate time of flight to the selected imaging system 500. This flight simulation is particularly useful for imaging systems 500 that comprise imaging weapons systems.
The actual flight simulation, or digital video input product, provides an imaging system 500, such as an imaging weapons system, with realistic flight test conditions and evaluation. The flight test conditions are sequenced in real-time scenarios. The flight simulation does not expend an actual test system in live firing. The flight simulation product also is particularly advantageous in that the actual imaging system 500 of an operational device, such as a guided missile, is tested for a given target over a given set of conditions. The components and software of the present invention are designed and tested to provide a real time, virtual image to the imaging weapons system to make it "think" it is actually flying against a real target. The present invention also requires no moving parts, such as a CARCO Table, to simulate motion, reducing the cost and complexity of the testing. As such, the DVIS 10 may be used to verify and test the imaging weapons system 500 in a laboratory environment at a significant reduction in cost compared to actual field or flight-testing.
The imaging system 500 is evaluated with a realistic threat analysis capability that mimics real weapon free-flight behavior in a highly realistic and credible simulation of a weapon system. Evaluations are done on criteria such as realistic weapon free-flight behavior generated through the use of real threat weapon signal processing electronics, real-time operations, fully detailed targets, countermeasures and backgrounds, dynamic behavior of all scene objects, realistic simulation of weapon end-game performance, and/or operations in simultaneous multiple spectral bands. Flight scenarios may be varied in such aspects as heading and altitude profile, time of year, time of day, cloud cover, haze, temperature, and/or any other environmental condition, either natural or man-made. Target modeling and evaluation may be integrated with command and control networks for comprehensive mission planning and execution functions. As such, the imaging system 500 may be evaluated in a laboratory environment with a limitless variation of attack scenarios combining complex flight paths, targets, and weather conditions.
The foregoing summary, description, and drawings of the present invention are not intended to be limiting, but are only exemplary of the inventive features which are defined in the claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4267562 *||Mar 9, 1979||May 12, 1981||The United States Of America As Represented By The Secretary Of The Army||Method of autonomous target acquisition|
|US4463380 *||Sep 25, 1981||Jul 31, 1984||Vought Corporation||Image processing system|
|US4855822 *||Jan 26, 1988||Aug 8, 1989||Honeywell, Inc.||Human engineered remote driving system|
|US5309522 *||Jun 30, 1992||May 3, 1994||Environmental Research Institute Of Michigan||Stereoscopic determination of terrain elevation|
|US5546943 *||Dec 9, 1994||Aug 20, 1996||Gould; Duncan K.||Stimulating a beneficial human response by using visualization of medical scan data to achieve psychoneuroimmunological virtual reality|
|US5649706 *||Sep 21, 1994||Jul 22, 1997||Treat, Jr.; Erwin C.||Simulator and practice method|
|US5719797 *||Dec 15, 1995||Feb 17, 1998||The United States Of America As Represented By The Secretary Of The Army||Simulator for smart munitions testing|
|US5914661 *||Jan 22, 1996||Jun 22, 1999||Raytheon Company||Helmet mounted, laser detection system|
|US6011581 *||Jan 20, 1995||Jan 4, 2000||Reveo, Inc.||Intelligent method and system for producing and displaying stereoscopically-multiplexed images of three-dimensional objects for use in realistic stereoscopic viewing thereof in interactive virtual reality display environments|
|US6100897 *||Dec 17, 1996||Aug 8, 2000||Art +Com Medientechnologie Und Gestaltung Gmbh||Method and device for pictorial representation of space-related data|
|US6157385 *||May 17, 1999||Dec 5, 2000||Oxaal; Ford||Method of and apparatus for performing perspective transformation of visible stimuli|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7425919 *||Feb 14, 2007||Sep 16, 2008||The United States Of America As Represented By The Secretary Of The Navy||Radar video data viewer|
|US8543990||May 14, 2008||Sep 24, 2013||Raytheon Company||Methods and apparatus for testing software with real-time source data from a projectile|
|US8897931 *||Aug 2, 2011||Nov 25, 2014||The Boeing Company||Flight interpreter for captive carry unmanned aircraft systems demonstration|
|US20080191931 *||Feb 14, 2007||Aug 14, 2008||Houlberg Christian L||Radar video data viewer|
|US20080288927 *||May 14, 2008||Nov 20, 2008||Raytheon Company||Methods and apparatus for testing software with real-time source data from a projectile|
|EP2156284A1 *||May 14, 2008||Feb 24, 2010||Raytheon Company||Methods and apparatus for testing software with real-time source data from a projectile|
|EP2156284A4 *||May 14, 2008||Sep 26, 2012||Raytheon Co||Methods and apparatus for testing software with real-time source data from a projectile|
|International Classification||H04N7/18, G06T1/00|
|Jul 2, 1999||AS||Assignment|
Owner name: UNITED STATES OF AMERICA AS REPRESENTED BY THE SEC
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:HEYDLAUFF, BRUCE;REESE, THOMAS F.;REEL/FRAME:010114/0077;SIGNING DATES FROM 19990527 TO 19990702