|Publication number||US20070040062 A1|
|Application number||US 11/420,313|
|Publication date||Feb 22, 2007|
|Filing date||May 25, 2006|
|Priority date||May 25, 2005|
|Also published as||US20090080700, WO2008060257A2, WO2008060257A3|
|Inventors||Daniel Lau, Michael Shaw|
|Original Assignee||Lau Daniel L, Shaw Michael F|
1. Field of the Invention
The present invention relates to infrared imaging methods and systems. More particularly, this invention relates to determining the track of a projectile using a thermal signature fingerprint of the projectile.
2. Description of Prior Art
Existing counter-sniper systems predominantly use a passive sensor, which measures naturally available energy emitted by the target, rather than an active sensor, which actively emits radiation and uses the back reflection to detect objects. The passive sensors can be further categorized as acoustic and thermal infrared, as well as hybrid sensors that fuse multiple sensing mechanisms. Acoustic sensors are usually microphone arrays that triangulate their recorded signals (e.g., the sound wave produced by the targeted object) to resolve the source location. The benefits of using acoustic sensors are that they provide omni-directional detection and are inexpensive to build. However, this technology is not completely appropriate for detecting subsonic projectiles, or for detecting supersonic projectiles that arrive at the target prior to the arrival of the acoustical energy generated by the firing of these projectiles. Moreover, muzzle-blast signals are often masked by background noise (e.g., sea noise, urban noise) and/or by signals that have similar propagation speeds.
Due to the disadvantages of acoustic sensing, thermal imaging technology has become an alternative option to scientists and engineers for counter-sniper targeting systems. For thermal imaging, hot spots in the image are used to detect the muzzle flash and/or the projectiles in flight. An example of thermal imaging is infrared radiation (IR) imaging, where infrared detectors are categorized as (1) thermal detectors that sense the changes of temperature of a sensing element heated by incoming IR radiation and (2) photon detectors that convert incoming photons directly into an electrical signal.
Even though IR imaging provides images that might represent the bullet discharge (muzzle flash) as well as the projectile in flight, the existing counter-sniper targeting systems that use this technology fail in many cases to locate a sniper. This is due to the fact that these systems rely on knowing the time of firing of the bullet in order to properly model the path of the bullet.
For instance, U.S. Pat. No. 5,596,509 to Karr teaches a counter-sniper system in which an infrared sensor 10 images a region 12 through which a bullet 14 travels.
The Karr patent suggests measuring the intensity of infrared radiation emitted from the bullet 14, and determining the path of the bullet 14 by measuring changes in the intensity of infrared radiation emitted from the bullet 14 as the bullet 14 travels through the region 12.
However, since bullets are relatively small, each pixel of the camera sensor “sees” the bullet as well as its background. Thus, the measured intensity of infrared radiation for each pixel of each bullet spot is a combination of the background radiation intensity and the bullet radiation intensity. Since the background radiation of the image can and will change from portion to portion of the image of the region, as well as from time to time depending on environmental conditions, the measured changes in intensity reflect both changes in the bullet intensity and changes in the background intensity. The Karr patent does not teach how to measure only the intensity of infrared radiation emitted from the bullet 14, as the sensor 10 measures infrared radiation that is a blended function of both the bullet 14 and the background. Because of the blended nature of the measured infrared radiation, simple frame differencing will not produce an accurate measure of the infrared radiation emitted by the bullet 14, and, therefore, cannot be used to accurately determine changes in the intensity of infrared radiation emitted from the bullet. Thus, the issue of determining the path of a bullet by measuring changes in the intensity of infrared radiation emitted from the bullet is left unresolved.
Therefore, there is a need for a system and method for determining the track of a projectile, such as a bullet, using the thermal signature of the projectile (the intensity of the infrared radiation emitted by the projectile independent of the background radiation), which allows determining the track of the projectile without knowing the time of firing of the bullet.
The present invention meets the aforementioned needs, and others. Exemplary embodiments of the invention provide a system and method for tracking projectiles by their thermal signatures. As used herein, the term “projectile” shall be understood to include bullets as well as artillery shells, missiles, and other objects that exhibit characteristics consistent with a bullet in flight. In one embodiment, a high speed infrared camera feeds images to a digital image processor and a command and control computer. Software identifies objects with characteristics consistent with a projectile in flight, and determines a projectile track solution, including the location from which the projectile was fired. Information on tracked projectiles is transmitted to other sensors or actuators by a variety of methods, including Local Area Networks (LANs), wireless LANs, Personal Digital Assistants (PDAs), and other similar devices. The system may be mounted on a variety of platforms, including stationary mounts, ground vehicles, aerial vehicles, watercraft, etc.
Generally described, the invention allows determining the track of a projectile using a thermal signature of the projectile. A system according to the invention includes an infrared sensor, a database component, and a processing component. The infrared sensor acquires sequential infrared image frames. The database component relates projectile thermal signature values, per pixel, to projectile tracks detectable by the sensor. The processing component is operatively connected to the database component and the infrared sensor for: identifying a set of frames containing spots with characteristics consistent with a projectile in flight; identifying at least one possible projectile track solution for the spots; determining a projectile thermal signature value for each pixel of each spot of the possible projectile track solution; and ascertaining whether the determined projectile thermal signature substantially matches an actual projectile thermal signature from said database component for a substantially similar projectile track.
According to an aspect of the invention, the processing component comprises a projectile detection element and a track determination element. The projectile detection element identifies the frames containing spots with characteristics consistent with a projectile in flight. The track determination element identifies a possible projectile track solution, determines a projectile thermal signature value for the pixels of the spots given the possible projectile track solution, and ascertains whether the determined signature matches an actual signature from the database.
According to another aspect of the invention, identifying a set of frames containing spots with characteristics consistent with a projectile in flight includes identifying a series of spots over several frames that: are in a substantially straight line; have substantially similar spacing; and have spacing indicating a relatively fast moving object. Further, identifying a set of frames containing “projectile spots” may also include searching frames before and after the set of frames for additional spots along the substantially straight line, and including any frames containing the additional spots in the set of frames.
According to another aspect of the invention, identifying a possible projectile track solution includes: determining a centroid position of each of the spots; determining the spacing of the spot centroid positions relative to each other; and identifying at least one possible solution for a projectile track that would produce a projectile track having matching spot centroid positions.
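The centroid and spacing computations described above can be sketched as follows. This is a minimal illustration, not the patent's implementation; the function names and the (x, y, brightness) pixel layout are assumptions.

```python
def spot_centroid(pixels):
    """Brightness-weighted centroid of one spot; `pixels` is a list of
    (x, y, brightness) tuples. Names are illustrative, not the patent's."""
    total = sum(b for _, _, b in pixels)
    x = sum(px * b for px, _, b in pixels) / total
    y = sum(py * b for _, py, b in pixels) / total
    return x, y


def centroid_spacings(centroids):
    """Euclidean distances between consecutive spot centroids, used to
    compare the spacing of the spot centroid positions relative to each
    other."""
    return [((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
            for (x1, y1), (x2, y2) in zip(centroids, centroids[1:])]
```

A candidate track solution would then be one whose predicted centroid positions reproduce both the measured centroids and their spacings.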
Determining a projectile thermal signature value for each pixel of each spot of the possible solution may include: determining a measured brightness value for each pixel of each spot; determining a background brightness value for each pixel of each spot; and determining a projectile thermal signature value for each pixel of each spot by applying a predetermined blending function for each pixel of each spot of the possible projectile track solution to the measured brightness values and the background brightness values. More specifically described, the predetermined blending function is a second-order Taylor Series expansion of the measured brightness value into the intensity of the infrared radiation attributable to the projectile and the intensity of the infrared radiation attributable to the background.
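Under the blending model described above, recovering the projectile-only brightness for one pixel is a one-line rearrangement. A minimal sketch, assuming per-pixel scalar values and a known alpha; the function name is an assumption, not the patent's:

```python
def projectile_signature(measured, background, alpha):
    """Recover the projectile-only brightness of one pixel from the
    blending model measured = alpha*projectile + (1 - alpha)*background."""
    if not 0.0 < alpha <= 1.0:
        raise ValueError("alpha must be in (0, 1]")
    return (measured - (1.0 - alpha) * background) / alpha
```

For example, a pixel measuring 60 against a background of 40 with alpha of 0.5 implies a projectile contribution of 80.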
According to yet another aspect of the invention, the system further has a graphical user interface component operatively connected to the processing component for presenting a final projectile track solution to a user.
The system may also have a visible light sensor positioned so as to have a field of view that overlaps a field of view of the infrared sensor. The visible light sensor would be operatively connected to the graphical user interface component, and the graphical user interface component would be further for overlaying an infrared image from the infrared sensor with a visible image from the visible light sensor for providing the user with a visible light context for the infrared image.
The system may still further have a position/direction component positioned adjacent to the infrared sensor. The position/direction component would be operatively connected to the processing component for providing the actual global position and direction of the infrared sensor to the processing component, so that the processing component can provide an actual global projectile track solution, including the actual global location of the point from which the projectile was fired.
The system may further have an active target designator unit operatively connected to the processing component for designating and tracking the projectile using the final projectile track solution.
Advantageously, the steps of the invention are efficiently and effectively performed on the processing component. Therefore, another aspect of the invention is a computer readable medium having computer executable instructions for performing a method for determining the track of a projectile using a thermal signature of the projectile, as described above.
Yet another aspect of the invention is a method for building a projectile thermal signature fingerprint record. The thermal signature fingerprint building method includes the steps of: (a) selecting an initial projectile track; (b) aiming the field of view of an infrared sensor at a portion of a path of travel of the projectile track; (c) repeatedly shooting projectiles in the projectile track in a first environmental condition; (d) recording infrared images of the projectiles of step (c); (e) repeatedly shooting projectiles in the projectile track in a second environmental condition that has a substantially different ambient temperature from the first environmental condition; (f) recording infrared images of the projectiles of step (e); (g) determining a projectile thermal signature value for each pixel corresponding to a position along the projectile track; (h) moving the infrared sensor to another portion of the path of travel of the projectile track and repeating steps (c) through (h) until the full path of travel of the projectile track is documented; and (i) selecting another projectile track and repeating steps (b) through (i) until blending function values and projectile thermal signature values are determined for observable solution tracks. The projectile thermal signature value for each pixel corresponding to a position along the projectile track is determined by: using a blending function to characterize the measured brightness value of each pixel as a blend of the infrared radiation attributable to the projectile and the infrared radiation attributable to the background; setting the average values of the radiation attributable to the projectile for each pixel of each set of images equal to one another; solving for the unknown values of the blending function for each pixel corresponding to a position along the projectile track; and solving for the projectile thermal signature value for each pixel corresponding to a position along the projectile track.
Again, the blending function may be a second-order Taylor Series expansion of the measured brightness value into the intensity of the infrared radiation attributable to the projectile and the intensity of the infrared radiation attributable to the background.
The preceding description is provided as a non-limiting summary of the invention only. A better understanding of the invention will be had by reference to the following detailed description, and to the appended drawings and claims.
However, at position B, which has a relatively small distance from the sensor 40, the sensor 40 detects a thermal spot 50 having an area of two pixels as the projectile “streaks” by the location of the sensor 40 over the integration time of the image. The thermal spot 50 also has a measured intensity that includes intensity from the projectile 44 b and intensity from any background radiation 48 b. The physical area of the projectile 44 b with respect to the area of the background 48 b is relatively large and the image represents movement or “streaking” of the projectile 44 b past the sensor 40 during the integration time of the image frame. Thus, the radiation from the projectile 44 b is relatively transparent or blurred with respect to the radiation from the background 48 b.
It should be noted that the sensor 40 most likely has a field of view that is much narrower than the entire region of the projectile track 41 and, most likely, has a sensitivity range and distance beyond which a projectile would be undetectable.
However, to the extent that a projectile is detectable within the field of view of the sensor 40, the relation of the thermal characteristics of the projectile with respect to the range and angle of the projectile from the sensor 40 creates a unique “thermal signature” of the projectile. Further, projectiles of a common caliber and composition have common thermal and aerodynamic characteristics. The unique thermal signature of the projectile will be consistent for projectiles of a common caliber and composition, and substantially independent of the environmental conditions.
In one embodiment, the measured brightness of each pixel that makes up a projectile spot is written as a second-order Taylor Series expansion as follows:

MeasuredBrightness_pixel = (alpha)ProjectileSpotBrightness_pixel + (1 - alpha)BackgroundBrightness_pixel (1)
The alpha term has a different value for every angle and range position within the detectable region of the projectile track, and ProjectileSpotBrightness_pixel is the unique thermal signature value of the projectile at the angle and range position for the associated alpha value. One of skill in the art will recognize that higher-order expansions would produce results with greater accuracy. However, it has been determined that the second-order Taylor Series expansion provided in equation (1) will produce results with adequate accuracy.
Alpha can be derived using the following process. A first batch of M shots is fired along the projectile track under a first environmental condition and imaged by the sensor 40. A second batch of M shots is then fired along the same projectile track under a second environmental condition having a substantially different ambient temperature, and is likewise imaged.
Noting that the unique thermal signature of the projectile, ProjectileSpotBrightness_pixel, will be consistent for projectiles of a common caliber and composition at a given angle and range position with respect to the sensor 40, and substantially independent of the environmental conditions, equation (1) can be rewritten as:

(alpha)ProjectileSpotBrightness_pixel = MeasuredBrightness_pixel - (1 - alpha)BackgroundBrightness_pixel (2)
Setting (alpha)ProjectileSpotBrightness_pixel equal for the pixels of images of the two batches of M shots, one can solve for alpha for each pixel location, d3, along the projectile path image, as follows:
(MeasuredBrightness_pixel - (1 - alpha)BackgroundBrightness_pixel)|first batch = (MeasuredBrightness_pixel - (1 - alpha)BackgroundBrightness_pixel)|second batch (3)
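Given the (averaged) measured and background brightness values for the same pixel from the two batches, equation (3) can be solved for alpha in closed form. A minimal sketch, assuming per-pixel scalars; the function and argument names are illustrative, not from the patent:

```python
def solve_alpha(m1, b1, m2, b2):
    """Solve equation (3) for alpha at one pixel location, given the
    measured (m) and background (b) brightness values from the two
    batches of M shots. Rearranging:
        m1 - (1 - a)*b1 = m2 - (1 - a)*b2  =>  a = 1 - (m1 - m2)/(b1 - b2)
    Requires the two backgrounds to differ, which is why the batches are
    fired at substantially different ambient temperatures."""
    if b1 == b2:
        raise ValueError("backgrounds must differ between the two batches")
    return 1.0 - (m1 - m2) / (b1 - b2)
```

As a check: with a true alpha of 0.25, a signature of 200, and backgrounds of 100 and 60, the measured values are 125 and 95, and the formula recovers 0.25.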
The sensor 40 is then rotated such that the left edge of its field of view lines up with the right edge of the previous field of view, and the process is repeated until the entire path of the projectile, from the firing location 42 until the projectile is beyond the sensor's detectable range, is covered.
Once the alpha values for each pixel location, d3, along the projectile path image are determined, the ProjectileSpotBrightness_pixel values for each batch can be determined. The ProjectileSpotBrightness_pixel values can then be averaged. Thus, the complete record will include the alpha values and ProjectileSpotBrightness_pixel values for each angle and range position (measured in terms of d1, d2 and d3) along the projectile path.
The process is then repeated for other possible d1 and d2 to build a data record of the characteristics of the projectile with respect to angle and range positions along detectable projectile tracks. To save time, only a discrete set of d1 and d2 values can be selected and the intermediate values interpolated. The resulting data record acts as a “thermal signature fingerprint” for projectiles having the caliber and composition of the subject projectiles.
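Interpolating the intermediate values between the discrete samples can be sketched as simple linear interpolation; this is a hedged illustration (the patent does not specify the interpolation method), and the names are assumptions:

```python
from bisect import bisect_left

def interpolate_fingerprint(d_samples, values, d):
    """Linearly interpolate a recorded fingerprint quantity (an alpha or
    ProjectileSpotBrightness value) between the discrete track-parameter
    samples actually measured; `d_samples` must be sorted ascending.
    Values outside the sampled range are clamped to the end samples."""
    i = bisect_left(d_samples, d)
    if i == 0:
        return values[0]
    if i == len(d_samples):
        return values[-1]
    lo, hi = d_samples[i - 1], d_samples[i]
    t = (d - lo) / (hi - lo)
    return values[i - 1] * (1.0 - t) + values[i] * t
```

The same interpolation would be applied independently along each of d1 and d2.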
Data records can then be developed for other projectile calibers and compositions, if desired, by following the same procedure.
In one embodiment, the infrared sensor 52 is an optical, focal-plane-array detector having a 3-5 micron IR filter and working in a snap-shot style recording mode. The sensor 52 also has a high-speed video output unit, such as RS-422, camera-link, gigabit Ethernet, or similar cable interface.
The projectile detection element 54 is preferably a combination of a high-speed, digital signal processor (DSP) and software running thereon for acquiring sequential infrared image frames from a sensor at a given position, and identifying a set of frames containing spots with characteristics consistent with a projectile in flight. The projectile detection element 54 then passes the set of frames along with projectile track structure data to the track determination element 56. The steps for identifying a set of frames containing spots with characteristics consistent with a projectile in flight will be described below.
The track determination element 56 is preferably a combination of a computer and software running thereon for receiving the set of frames and the projectile track structure data from the projectile detection element 54. The track determination element 56 then: identifies at least one possible projectile track solution for the spots; determines a projectile thermal signature value for each pixel of each spot of the possible projectile track solution; retrieves actual thermal signature values for a substantially similar projectile track solution from the database component; and compares the determined thermal signature values with the actual thermal signature values to determine the accuracy of the possible projectile track solution. If the accuracy is within an acceptable limit, i.e., a match, the possible projectile track solution is accepted as the actual projectile track solution. If the accuracy is not within an acceptable limit, another possible projectile track solution is identified and tested for accuracy. The steps for identifying possible projectile track solutions and determining a projectile thermal signature value for each pixel of each spot of the possible solutions will also be described below.
Once an actual projectile track solution is determined, the projectile track solution is presented to a user on the graphical user interface (GUI) component 60. The GUI 60 may be a tablet PC, a PDA, or any other interactive graphical interface. Advantageously, the visible light sensor 64, such as a video camera, can be selected and positioned so as to have a field of view that overlaps the field of view of the infrared sensor 52. In this manner, the infrared image and the visible image can be overlaid to provide the user with a visible light context for the infrared images.
Further, while the actual projectile track solution will provide the location 42 from which the projectile was fired as well as the projectile track 41 with respect to the location of the infrared sensor 52, the position/direction component 62 will provide the actual position and direction of the infrared sensor 52. This will allow global identification of the location 42 from which the projectile was fired and the projectile track 41, rather than just identification of the parameters with respect to the location of the infrared sensor 52. As shown, the position/direction component 62 may include a global positioning system (GPS) unit 70 and an electronic compass unit 72.
Further, the actual projectile track solution may be output to an active target designator unit 66, such as a Light Detection and Ranging (LIDAR) device, for designating and tracking the projectile.
The step of identifying a set of frames containing spots with characteristics consistent with a projectile in flight will now be described in more detail.
Alternatively, the filtering steps could include the following steps: A mean, variance, and standard deviation of the previous twenty difference video frames are calculated recursively. As a new frame is captured, the oldest frame is removed from the mean and the new frame is added, so the new mean is calculated without having to re-sum all 20 frames. The most recent difference frame is thresholded pixelwise using the standard deviation: any pixel value below a multiple of the standard deviation is set to zero. In this way, projectiles whose pixel values exceed the background standard deviation will be detected, but Gaussian noise, which will only rarely exceed a value of three times the standard deviation, will be filtered out. The thresholded image is segmented into blobs to isolate the projectile data. The resulting thresholded difference video will still contain some high frequency noise along with the projectile data. Noise data generally has a small blob size and can be eliminated by excluding blobs having an area less than a certain limit.
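The alternative filtering steps above can be sketched as follows. This is a hedged illustration assuming frames arrive as lists of lists of floats; the class name, function name, and parameters are invented for the example rather than taken from the patent:

```python
from collections import deque

class DifferenceFrameFilter:
    """Sliding-window threshold over difference frames. Per-pixel mean
    and variance over the last `window` frames are maintained
    recursively: the oldest frame's contribution is subtracted and the
    newest added, so the window is never re-summed."""

    def __init__(self, window=20, k=3.0):
        self.window, self.k = window, k
        self.frames = deque()
        self.sums = None      # per-pixel running sum
        self.sq_sums = None   # per-pixel running sum of squares

    def threshold(self, frame):
        """Zero out pixels below k standard deviations of the window,
        then fold the new frame into the running statistics."""
        n = len(self.frames)
        if n == 0:
            out = [row[:] for row in frame]  # no history yet: pass through
        else:
            out = []
            for y, row in enumerate(frame):
                new_row = []
                for x, v in enumerate(row):
                    mean = self.sums[y][x] / n
                    var = max(self.sq_sums[y][x] / n - mean * mean, 0.0)
                    new_row.append(v if v >= self.k * var ** 0.5 else 0.0)
                out.append(new_row)
        self._update(frame)
        return out

    def _update(self, frame):
        h, w = len(frame), len(frame[0])
        if self.sums is None:
            self.sums = [[0.0] * w for _ in range(h)]
            self.sq_sums = [[0.0] * w for _ in range(h)]
        if len(self.frames) == self.window:  # drop the oldest frame
            old = self.frames.popleft()
            for y in range(h):
                for x in range(w):
                    self.sums[y][x] -= old[y][x]
                    self.sq_sums[y][x] -= old[y][x] ** 2
        self.frames.append(frame)
        for y in range(h):
            for x in range(w):
                self.sums[y][x] += frame[y][x]
                self.sq_sums[y][x] += frame[y][x] ** 2


def filter_small_blobs(frame, min_area=2):
    """Segment the thresholded frame into 4-connected blobs and zero out
    blobs smaller than min_area pixels (noise rejection)."""
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    out = [row[:] for row in frame]
    for sy in range(h):
        for sx in range(w):
            if frame[sy][sx] and not seen[sy][sx]:
                stack, blob = [(sy, sx)], []
                seen[sy][sx] = True
                while stack:                     # flood fill one blob
                    y, x = stack.pop()
                    blob.append((y, x))
                    for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                        if 0 <= ny < h and 0 <= nx < w \
                                and frame[ny][nx] and not seen[ny][nx]:
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                if len(blob) < min_area:
                    for y, x in blob:
                        out[y][x] = 0.0
    return out
```

With a background of small alternating difference values, a moderate spike is suppressed while a large spike survives the 3-sigma threshold; isolated single-pixel blobs are then removed by the area filter.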
Once potential projectile spots are identified following the filtering process, they are analyzed to determine whether they have the characteristics of a projectile in flight. To classify spots as projectile spots, all combinations of three spots over three consecutive frames (one spot per frame) are examined. The centroids of spots comprising more than one pixel may be determined for the purpose of analyzing the spots.
If any of the determinations S204, S206, S208 is negative, then the spots do not have the characteristics of a projectile in flight, and the next step would be S210 obtaining the next combination of three spots over the three consecutive frames. The next combination of three spots would then be analyzed for the necessary criteria in steps S204, S206 and S208.
However, if the determinations S204, S206, S208 are affirmative, then the spots are classified as a projectile track, and a new projectile track structure record is created. The projectile track structure includes data such as: the frame numbers of the frames containing the spots, centroids of the spots, the trajectory angle between the best fit straight line connecting A, B, C and the horizontal axis, and the Y-intercept of the same best fit line.
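The three-spot tests and the resulting track structure record can be sketched as below. The numeric tolerances, class name, and field names are assumptions for illustration; the patent does not specify them, and the best-fit line is approximated here by the line through the first and last centroids:

```python
import math
from dataclasses import dataclass

@dataclass
class ProjectileTrackStructure:
    """Illustrative track structure record; field names are assumptions."""
    frame_numbers: list
    centroids: list
    trajectory_angle: float  # angle of best-fit line vs. horizontal axis
    y_intercept: float

def classify_three_spots(frame_numbers, centroids, max_bend=0.05,
                         spacing_tol=0.3, min_spacing=4.0):
    """Apply the three tests to one combination of spots A, B, C from
    consecutive frames: substantially straight line (S204), substantially
    similar spacing (S206), and spacing indicating a fast-moving object
    (S208). Returns a track structure on success, else None. Vertical
    tracks are skipped in this sketch."""
    (x1, y1), (x2, y2), (x3, y3) = centroids
    d_ab = math.hypot(x2 - x1, y2 - y1)
    d_bc = math.hypot(x3 - x2, y3 - y2)
    bend = abs(math.atan2(y3 - y2, x3 - x2) - math.atan2(y2 - y1, x2 - x1))
    if bend > max_bend:                                    # S204
        return None
    if abs(d_ab - d_bc) > spacing_tol * max(d_ab, d_bc):   # S206
        return None
    if min(d_ab, d_bc) < min_spacing:                      # S208
        return None
    if x3 == x1:
        return None
    slope = (y3 - y1) / (x3 - x1)  # line through A and C as a stand-in
    return ProjectileTrackStructure(
        frame_numbers=list(frame_numbers),
        centroids=list(centroids),
        trajectory_angle=math.atan2(y3 - y1, x3 - x1),
        y_intercept=y1 - slope * x1)
```

Three evenly spaced, nearly collinear spots yield a record; wildly uneven spacing yields None.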
The steps S200, S202, S204, S206, S208 and S210 must be performed in real-time, meaning that the projectile detection element 54 must process each frame before the next frame arrives from the sensor.
Assuming that A is not the first appearance of the projectile, there exists a projectile track structure generated during the previous frames. As such, prior to creating a new projectile track, the list of all tracks available from the previous frame is searched, and if the trajectory angle and Y-intercept of the best-fit line to A, B and C are equal or close to the angle and intercept of an existing track, then that projectile track structure data is updated with the new data for C.
The next frame may or may not contain an additional spot to add to the projectile track structure. If it does not contain an additional spot to add, then the projectile track structure may be classified as expired, and ready to be post processed.
Post processing includes searching frames before and after the frames containing the spots in the projectile track structure for additional spots along the best-fit line and at increments of the anticipated spacing.
For four consecutive spots, A, B, C, and D, the average distance ratio is:

AverageDistanceRatio = (|BC|/|AB| + |CD|/|BC|)/2
Given the average distance ratio, AverageDistanceRatio, the number of frames prior to a projectile's first sighting, as well as after its last sighting, when it may not have been detected because of adaptive thresholding or because its path may have been obscured from view, can be determined. To locate where the projectile would have appeared in the frame before A, travel a distance (|AB|/AverageDistanceRatio) from A, away from B, along the best-fit line.
Defining this new point as b, travel a distance (|bA|/AverageDistanceRatio) from b, away from A, to get point c, and so on, until the number of steps needed to move outside the camera's field of view is found. This number, minus 1, is the number of frames prior to A in which the projectile might be visible, although not detected.
For getting the number of frames after C, a distance (|BC|*AverageDistanceRatio) from C, away from B, must be traveled along the best-fit line. Repeating this process, the number of frames after C in which the projectile may be visible can be estimated.
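The backward extrapolation described above can be sketched as follows. This is an illustration under the stated assumptions (a decelerating projectile, so the ratio of each inter-spot distance to the previous one is below 1.0); the function and argument names are invented for the example:

```python
def average_distance_ratio(spacings):
    """Mean of successive inter-spot distance ratios; e.g., for spots
    A, B, C, D pass [|AB|, |BC|, |CD|] to get the average of |BC|/|AB|
    and |CD|/|BC| (below 1.0 for a decelerating projectile)."""
    ratios = [b / a for a, b in zip(spacings, spacings[1:])]
    return sum(ratios) / len(ratios)


def extrapolate_before_first_spot(a, b, ratio, max_steps):
    """Walk backward from the first detected spot A along the line
    through B and A, growing each step by 1/ratio (the projectile moved
    faster earlier in flight). Returns up to max_steps points before A."""
    (ax, ay), (bx, by) = a, b
    dx, dy = ax - bx, ay - by             # direction from B toward A
    dist = (dx * dx + dy * dy) ** 0.5
    ux, uy = dx / dist, dy / dist
    points, step, px, py = [], dist / ratio, ax, ay
    for _ in range(max_steps):
        px, py = px + ux * step, py + uy * step
        points.append((px, py))
        step /= ratio                     # each earlier gap is larger
    return points
```

In practice, the walk would stop once a point falls outside the camera's field of view; `max_steps` stands in for that check here.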
In order to complete the post processing of an expired projectile track, all video frames from the circular input frame buffer corresponding to the detected spots in the track, as well as all frames where the spot may have been visible prior to A and all frames where the spot may have been visible after the last spot C, are extracted. These extracted video frames, along with the projectile track structure record, are then moved off the projectile detection element 54 to the track determination element 56.
Turning now to determining the track of the projectile, including the location from which the projectile was fired, the track determination element 56 identifies possible projectile track solutions and determines a projectile thermal signature value for each pixel of each spot of each possible solution using the blending function of equation (1).
MeasuredBrightness_pixel and BackgroundBrightness_pixel are obtained from the actual spots from the set of frames, as described for obtaining these values for the projectile “thermal signature fingerprint” record.
After determining the ProjectileSpotBrightness_pixel (projectile thermal signature) values for the pixels of each spot, these values can be compared to the actual values from the projectile “thermal signature fingerprint” record. The final projectile track solution is the possible solution that minimizes the mean square error, over all of the pixels of each spot, between the determined ProjectileSpotBrightness_pixel values and the actual values from the record.
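The final selection step can be sketched as a minimum-mean-square-error search over the candidate solutions. The flat per-track list layout and names are assumptions for illustration, not the patent's data structures:

```python
def final_track_solution(candidate_signatures, fingerprint_signatures):
    """Choose the possible projectile track solution whose recovered
    per-pixel thermal signature values have the smallest mean square
    error against the stored fingerprint values. Both arguments map a
    candidate track id to a flat list of per-pixel values."""
    def mse(xs, ys):
        return sum((x - y) ** 2 for x, y in zip(xs, ys)) / len(xs)
    return min(candidate_signatures,
               key=lambda t: mse(candidate_signatures[t],
                                 fingerprint_signatures[t]))
```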
Advantageously, the system and methods disclosed herein are applicable to detecting and tracking projectiles, and have potential applications well beyond “sniper” detection. For instance, the thermal signature fingerprint of a projectile may be used to evaluate projectiles larger than bullets for verifying or creating ballistics range tables for creation of Surface Danger Zone templates, for gathering ballistic firing table data, and for gathering terminal ballistics data against specific targets. Other potential uses include: munitions arena testing, projectile flight characteristics development, terminal ballistics lethality data collection, operational suitability analysis, verifying lethality models in support of future combat system programs, and for safety and operational suitability testing.
Additional potential applications of the system and methods disclosed herein include: law enforcement (routine, special events (e.g. large spectator events), surveillance of high crime rate areas, convoy security for VIPs/diplomats); homeland security (border patrolling); airport security; government office security (embassy surveillance); and military applications (projectiles and munitions, stealth craft, aircraft and watercraft detection through clouds and fog, perimeter security, convoy security, Military Operations on Urban Terrain (MOUT) operations and environment, and counter-sniper/counter battery fires).
Thus, the improvements described herein provide a method and system for determining the track of a projectile using a thermal signature of the projectile. One of ordinary skill in the art will recognize that additional configurations are possible without departing from the teachings of the invention or the scope of the claims which follow. This detailed description, and particularly the specific details of the exemplary embodiments disclosed, is given primarily for completeness and no unnecessary limitations are to be imputed therefrom, for modifications will become obvious to those skilled in the art upon reading this disclosure and may be made without departing from the spirit or scope of the claimed invention.
|U.S. Classification||244/3.16, 244/3.15, 244/3.1|
|Cooperative Classification||G01S17/66, F41G3/147, G01S17/023|
|European Classification||G01S17/66, F41G3/14D, G01S17/02C|