Publication number: US 20030140775 A1
Publication type: Application
Application number: US 10/060,565
Publication date: Jul 31, 2003
Filing date: Jan 30, 2002
Priority date: Jan 30, 2002
Inventors: John Stewart
Original Assignee: Stewart John R.
External Links: USPTO, USPTO Assignment, Espacenet
Method and apparatus for sighting and targeting a controlled system from a common three-dimensional data set
US 20030140775 A1
Abstract
A method and apparatus for sighting and targeting a controlled system from a common three-dimensional data set are disclosed. The method includes sighting a position correlated to at least a subset of a three-dimensional data set representing a field of view; and targeting a controlled system to the position from the three-dimensional data set. The apparatus includes a program storage medium, a controller, and a controller interface. The program storage medium is capable of storing a three-dimensional data set representing a field of view. The controller is capable of generating a presentation of the three-dimensional data set. A position represented by at least a subset of the three-dimensional data can be sighted and the position can be targeted from the subset through the controller interface.
Images (8)
Claims (46)
What is claimed:
1. A method, comprising:
sighting a position correlated to at least a subset of a three-dimensional data set representing a field of view; and
targeting a controlled system to the position from the three-dimensional data set.
2. The method of claim 1, wherein the three-dimensional data comprises LADAR data.
3. The method of claim 1, further comprising at least one of:
acquiring the three-dimensional data;
processing the three-dimensional data;
displaying a representation of the three-dimensional data;
displaying a projected target point after the controlled system is targeted; and
taking an action responsive to targeting the position.
4. The method of claim 3, wherein acquiring the three-dimensional data includes:
transmitting a plurality of LADAR pulses; and
receiving the LADAR pulses after they are reflected.
5. The method of claim 3, wherein processing the three-dimensional data includes generating a three-dimensional image from the three-dimensional data.
6. The method of claim 5, wherein the three-dimensional image is the representation.
7. The method of claim 5, wherein generating the three-dimensional image includes:
pre-processing the three-dimensional data;
detecting a target represented by a subset of the three-dimensional data;
segmenting the subset from the remainder of the three-dimensional data;
extracting features of the target from the segmented data; and
classifying the segmented subset as including a particular kind of target based on the extracted features.
8. The method of claim 1, wherein sighting the position includes indicating a portion of a displayed image generated from the three-dimensional data.
9. The method of claim 8, wherein targeting the controlled system includes aiming a weapon system at the sighted position.
10. The method of claim 1, wherein targeting the controlled system includes aiming a weapon system at the sighted position.
11. An apparatus, comprising:
a program storage medium capable of storing a three-dimensional data set representing a field of view;
a controller capable of generating a presentation of the three-dimensional data set;
a controller interface through which a position represented by at least a subset of the three-dimensional data can be sighted and through which the position can be targeted from the subset.
12. The apparatus of claim 11, wherein the program storage medium comprises a magnetic program storage medium or an optical program storage medium.
13. The apparatus of claim 12, wherein the magnetic program storage medium comprises a floppy disk, a zip disk, or a hard disk.
14. The apparatus of claim 12, wherein the optical program storage medium comprises an optical disk.
15. The apparatus of claim 11, wherein the controller comprises a digital processor.
16. The apparatus of claim 15, wherein the digital processor is a microprocessor or a digital signal processor.
17. The apparatus of claim 11, wherein the controller interface includes a display.
18. The apparatus of claim 17, wherein the display is a helmet-mounted display or a rack-mounted display.
19. The apparatus of claim 17, wherein the display includes a touch screen.
20. The apparatus of claim 17, wherein the controller interface includes at least one peripheral input/output device.
21. A controlled system, comprising:
a data acquisition system capable of acquiring a three-dimensional data set representing a field of view;
a sighting and targeting subsystem, including:
a program storage medium capable of storing the three-dimensional data set;
a controller capable of generating a presentation of the three-dimensional data set; and
a controller interface through which a position represented by at least a subset of the three-dimensional data can be sighted and through which the position can be targeted from a presentation of the subset;
a control subsystem capable of implementing instructions from the sighting and targeting subsystem.
22. The controlled system of claim 21, wherein the data acquisition system includes a LADAR system.
23. The controlled system of claim 22, wherein the LADAR system comprises a direct diode LADAR system.
24. The controlled system of claim 21, wherein the control subsystem comprises a weapon pointing system.
25. A method, comprising:
acquiring a three-dimensional data set representing the content of a field of view;
generating a three-dimensional representation of the content from the three-dimensional data set;
displaying the three-dimensional representation;
sighting a position within the field of view from the three-dimensional representation; and
targeting the sighted position using the three-dimensional data set.
26. The method of claim 25, wherein acquiring the three-dimensional data set includes:
transmitting a plurality of light pulses; and
receiving a plurality of the transmitted light pulses upon their reflection by an object in the field of view.
27. The method of claim 26, further comprising:
extracting the three-dimensional data from the received light pulses; and
storing the received light pulses in a row-column format.
28. The method of claim 25, wherein generating the three-dimensional representation includes:
detecting a region of interest in the three-dimensional image;
segmenting a target in the region of interest from the three-dimensional image;
extracting features of the segmented target; and
classifying the target from the extracted features.
29. The method of claim 25, further comprising pre-processing the three-dimensional data.
30. The method of claim 25, further comprising transmitting the generated three-dimensional image to a remote location before displaying the three-dimensional image.
31. An apparatus, comprising:
means for sighting a position correlated to at least a subset of a three-dimensional data set representing a field of view; and
means for targeting a controlled system to the position from the three-dimensional data set.
32. The apparatus of claim 31, wherein the three-dimensional data comprises LADAR data.
33. The apparatus of claim 31, further comprising at least one of:
means for acquiring the three-dimensional data;
means for processing the three-dimensional data;
means for displaying a representation of the three-dimensional data;
means for displaying a projected target point after the controlled system is targeted; and
means for taking an action responsive to targeting the position.
34. The apparatus of claim 31, wherein targeting the controlled system includes aiming a weapon system at the sighted position.
35. An apparatus, comprising:
means for storing a three-dimensional data set representing a field of view;
means for generating a presentation of the three-dimensional data set;
means for sighting a position represented by at least a subset of the three-dimensional data and for targeting the position from the subset.
36. The apparatus of claim 35, wherein the storing means comprises a magnetic program storage medium or an optical program storage medium.
37. The apparatus of claim 35, wherein the generating means comprises a digital processor.
38. The apparatus of claim 35, wherein the sighting and targeting means includes a display.
39. The controlled system of claim 21, wherein the program storage medium comprises a magnetic program storage medium or an optical program storage medium.
40. The controlled system of claim 39, wherein the magnetic program storage medium comprises a floppy disk, a zip disk, or a hard disk.
41. The controlled system of claim 21, wherein the controller comprises a digital processor.
42. The controlled system of claim 21, wherein the controller interface includes a display.
43. The controlled system of claim 42, wherein the display includes a touch screen.
44. The method of claim 25, wherein sighting the position includes indicating a portion of a displayed image generated from the three-dimensional data.
45. The method of claim 25, wherein targeting the controlled system includes aiming a weapon system at the sighted position.
46. The method of claim 25, wherein targeting the controlled system includes aiming a weapon system at the sighted position.
Description
BACKGROUND OF THE INVENTION

[0001] 1. Field of the Invention

[0002] The present invention is directed to sighting and targeting systems and, more particularly, to a method and apparatus for targeting a controlled system from a common three-dimensional data set.

[0003] 2. Description of the Related Art

[0004] Many human endeavors involve “sighting” and “targeting.” Although many of these endeavors are civilian, these terms are most commonly associated with weapons systems in a military context. For instance, consider the automatic fire control system disclosed in U.S. Pat. No. 4,004,729, issued Jan. 25, 1977, to Lockheed Electronics Co., Inc. as assignee of the inventors Harris C. Rawicz, et al. (“Rawicz et al.”). A gunner uses an optical sight to sight a weapons system on a target. A digital processor then manipulates a variety of factors, e.g., the velocity of the target and the target's position, to determine a projected position for the target. The digital processor then targets the weapon to fire at the target at its projected position.

[0005] One important distinction between sighting and targeting in these types of systems is the nature of the data involved. Sighting is typically performed on two-dimensional data, whereas targeting is performed on three-dimensional data. Consider, again, the weapons control system in Rawicz et al. The gunner sights the target by placing the cross hairs of the optical sight on the target. Thus, the gunner sights in azimuth (i.e., where on the horizon) and elevation (i.e., how far above the horizon). However, targeting also takes into account the range (i.e., how far away) of the target. Range is important because projectile trajectories and environmental conditions (e.g., windage) affect the targeting.

[0006] Operating off two different sets of data can be disadvantageous for a number of reasons. Most of these reasons arise from the fact that two different data sets generally require two different acquisition systems. Returning to the weapons system of Rawicz et al., the digital processor determines the projected position using both the two-dimensional data (i.e., the current position indicated by the sighting) and the three-dimensional data (i.e., the velocity of the target). Thus, errors occurring in both the two-dimensional and the three-dimensional data acquisition infect the targeting. Two different acquisition systems also consume more physical space than does a single acquisition system, which can be a significant constraint in some demanding applications.

[0007] Some sighting and targeting techniques employ a single set of two-dimensional data. A foot soldier may sight and target a firearm using the same data, i.e., the two-dimensional data available from, e.g., an infrared scope. However, while this process enjoys some of the benefits available from using a common set of data for both sighting and targeting, it suffers inherently from the lack of three-dimensional data. For instance, the soldier must still manually adjust for range and windage because the data lacks information regarding the range. The process also exposes the soldier to enemy fire, which is generally considered undesirable.

[0008] The present invention is directed to resolving, or at least reducing, one or all of the problems mentioned above.

SUMMARY OF THE INVENTION

[0009] The invention, in its various aspects, is a method and apparatus for sighting and targeting a controlled system from a common three-dimensional data set. It includes a method, comprising sighting a position correlated to at least a subset of a three-dimensional data set representing a field of view; and targeting a controlled system to the position from the three-dimensional data set. It also includes an apparatus comprising a program storage medium, a controller, and a controller interface. The program storage medium is capable of storing a three-dimensional data set representing a field of view. The controller is capable of generating a presentation of the three-dimensional data set. A position represented by at least a subset of the three-dimensional data can be sighted and the position can be targeted from the subset through the controller interface.

BRIEF DESCRIPTION OF THE DRAWINGS

[0010] The invention may be understood by reference to the following description taken in conjunction with the accompanying drawings, in which like reference numerals identify like elements, and in which:

[0011] FIG. 1 conceptually illustrates a controlled system with which the present invention can be implemented to sight and target a position within a field of view;

[0012] FIG. 2 is a conceptual block diagram of selected portions of the controlled system first shown in FIG. 1;

[0013] FIG. 3 is a block diagram illustrating how selected portions of a controlled system, such as that shown in FIG. 2, may be implemented;

[0014] FIG. 4 illustrates one particular embodiment of the present invention employed in a military context;

[0015] FIG. 5 is a block diagram depicting a computing apparatus such as may be used in implementing the embodiment in FIG. 4;

[0016] FIG. 6 illustrates a second embodiment alternative to that in FIG. 4;

[0017] FIG. 7 illustrates one particular implementation of the scenario set forth in FIG. 4;

[0018] FIG. 8 is a block diagram of one particular implementation of the controlled system first illustrated in FIG. 4; and

[0019] FIG. 9 depicts the handling of three-dimensional data acquired in the scenario in FIG. 7.

[0020] While the invention is susceptible to various modifications and alternative forms, the drawings illustrate specific embodiments herein described in detail by way of example. It should be understood, however, that the description herein of specific embodiments is not intended to limit the invention to the particular forms disclosed, but on the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention as defined by the appended claims.

DETAILED DESCRIPTION OF THE INVENTION

[0021] The detailed description below illustrates exemplary embodiments of the invention. In the interest of clarity, not all features of an actual implementation are described in this specification. It will of course be appreciated that in the development of any such actual embodiment, numerous implementation-specific decisions must be made to achieve the developers' specific goals, such as compliance with system-related and business-related constraints, which will vary from one implementation to another. Moreover, it will be appreciated that such a development effort, even if complex and time-consuming, would be a routine undertaking for those of ordinary skill in the art having the benefit of this disclosure.

[0022] Turning now to the drawings, FIG. 1 conceptually illustrates a controlled system 100 with which the present invention can be implemented to sight and target a position 110 within a field of view 120. The field of view 120 comprises the volume defined by the propagation boundaries 122 and the plane 124 in accordance with conventional practice. The plane 124 represents the maximum range of the data acquisition mechanism (not shown in FIG. 1) used to gather three-dimensional data concerning the content of the field of view 120.

[0023] FIG. 2 is a block diagram of one implementation 200 of the controlled system 100 in FIG. 1. The controlled system 200 implements a fire control system in a military context using the present invention. The controlled system 200 includes a direct diode LADAR system 210, a weapon platform 220, a digital processor 230, and a helmet-mounted display 240. The direct diode LADAR system 210 transmits laser light pulses that reflect from the target 250 back to the direct diode LADAR system 210. The digital processor 230 extracts three-dimensional data from the reflected light received by the direct diode LADAR system 210, and processes it to present a three-dimensional image (not shown) on the helmet-mounted display 240. The digital processor 230 also receives position information from the weapon platform 220. A user then sights the controlled system at a position, e.g., the position 110 in FIG. 1, portrayed in the three-dimensional image. In the implementation of FIG. 2, the sighted position 110 is the position of the target 250. The digital processor 230 receives the sighting information from the helmet-mounted display 240, and targets the position by issuing pointing commands to the weapon platform 220. The targeting commands are formulated from the sighting information and the three-dimensional data extracted from the reflected light.
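The data flow just described can be sketched briefly. This is a minimal illustration, not the patent's implementation: it assumes the three-dimensional data set is stored as a range image indexed by scan row and column with a fixed angular step per pixel, and all function and parameter names are hypothetical.

```python
def target_from_sighting(range_image, sighted_pixel, az_per_col, el_per_row):
    """Form an (azimuth, elevation, range) pointing command from the pixel
    a user sighted in the displayed image. Because sighting and targeting
    share one three-dimensional data set, the range at the sighted pixel
    is read directly -- no second acquisition system is needed."""
    row, col = sighted_pixel
    azimuth = col * az_per_col    # radians right of boresight (toy model)
    elevation = row * el_per_row  # radians above boresight (toy model)
    rng = range_image[row][col]   # meters, from the same LADAR data set
    return azimuth, elevation, rng
```

A fielded system would also fold in platform pose, projectile ballistics, and environmental corrections before issuing pointing commands to the weapon platform 220.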

[0024] Note that the sighting and targeting in the controlled system 200 are performed from a common set of three-dimensional data. Thus, the sighting and targeting a priori account for the range to the target 250. Furthermore, because the target 250 is sighted from an image and the targeting is performed by the digital processor 230, the soldier is not necessarily exposed to enemy fire. Indeed, the ability to sight from the three-dimensional image also means that the soldier can be located remotely from the weapon platform 220 in some implementations. Such an arrangement would provide an additional degree of safety for the soldier. Another advantage of employing a common set of three-dimensional data is that an impact point projected from the issued targeting commands can be displayed to the user for, e.g., confirmation.

[0025] Note that the invention admits wide variation in implementation. Many of these variations may be realized in the narrow context of a military application. For instance, even in the context of a weapon system such as the controlled system 200 in FIG. 2, many different types of LADAR systems, displays, processors, and weapon platforms might be employed in various alternative embodiments. Many of these variations might also arise from the nature of the application, as the invention has many civilian applications as well. For instance, many civilian applications, such as hazardous waste disposal or remote surveillance, would benefit more from a remote display than from a helmet-mounted display. Some of these kinds of variations are explored below.

[0026] FIG. 3 is a block diagram illustrating how selected portions of a controlled system 300, such as the controlled system 200 in FIG. 2, may be implemented. The controlled system 300 comprises a data acquisition subsystem 305; an apparatus which, in the context of the controlled system 300, may be referred to as a sighting and targeting subsystem 310; and a control subsystem 320. Generally, the data acquisition subsystem 305 acquires a three-dimensional data set representing the field of view 120 (shown in FIG. 1) and its contents, including the position 110 (also shown in FIG. 1). The data acquisition subsystem 305 may include, e.g., the direct diode LADAR system 210 of FIG. 2. The data in the three-dimensional data set typically is measured in a spherical coordinate system (e.g., range, elevation, and azimuth), but other coordinate systems (e.g., Cartesian) may be employed. A user (not shown) then interacts with the sighting and targeting subsystem 310 to sight and target the position 110 as previously discussed relative to FIG. 2. The control subsystem 320 then implements instructions from the sighting and targeting subsystem 310 to implement the targeting.
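The coordinate conversion mentioned above can be illustrated with a short sketch. This is the standard spherical-to-Cartesian transform, not code from the patent; the axis convention (x along the sensor boresight) is an assumption.

```python
import math

def spherical_to_cartesian(rng, azimuth, elevation):
    """Convert one (range, azimuth, elevation) LADAR return into Cartesian
    coordinates, with x along the sensor boresight, y to the right, and
    z up. Angles are in radians, range in meters."""
    x = rng * math.cos(elevation) * math.cos(azimuth)
    y = rng * math.cos(elevation) * math.sin(azimuth)
    z = rng * math.sin(elevation)
    return x, y, z
```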

[0027] More particularly, the three-dimensional data set acquired by the data acquisition subsystem 305 is stored in a program storage medium 312. The program storage medium may be of any suitable type known to the art. For instance, it may be implemented in a magnetic medium (e.g., a floppy disk or a hard drive) or an optical medium (e.g., a compact disk read-only memory, or "CD-ROM") and may be read-only or random access. However, the program storage medium 312 will generally be implemented in a magnetic, random access medium. The three-dimensional data may be stored by encoding it in any suitable data structure on the program storage medium 312.

[0028] A controller 314 then operates on the three-dimensional data set to process it for a controller interface 316. The controller 314 may be any suitable data processing device, e.g., a digital signal processor ("DSP") or a microprocessor, such as the digital processor 230 in FIG. 2. The controller interface 316 will typically present the three-dimensional data set to the user by displaying it as a three-dimensional image. The controller 314 will therefore typically process the three-dimensional data into a video data set for display as a three-dimensional image. The controller 314 might also perform a variety of pre-processing activities associated with this type of processing. For instance, the controller 314 might fuse the three-dimensional data set with two-dimensional data regarding the field of view 120 (shown in FIG. 1) acquired in addition to the three-dimensional data set. Many techniques for this type of processing and pre-processing are known to the art, and any such technique suitable to the particular implementation may be employed.
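One simple form of the video conversion described above is to map each stored range to a display brightness. The sketch below is a minimal stand-in for that step, not the patent's processing: a real controller would use richer colormaps and might fuse two-dimensional intensity imagery as discussed.

```python
def range_to_brightness(range_image, max_range):
    """Map a range image to 8-bit brightness values for display: nearer
    returns render brighter, and pixels with no return render black."""
    out = []
    for row in range_image:
        out.append([
            0 if rng is None
            else int(255 * (1.0 - min(rng, max_range) / max_range))
            for rng in row
        ])
    return out
```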

[0029] Many details of the controller interface 316 will be implementation specific. As was mentioned, the controller interface 316 typically presents the three-dimensional data set as a three-dimensional image. The controller interface 316 will therefore typically include a video display (not shown in FIG. 3) of some kind that may be rack-mounted or part of a heads-up display (“HUD”), such as the helmet-mounted display 240 in FIG. 2. The controller interface 316 may also include one or more peripheral input/output (“I/O”) devices (also not shown in FIG. 3), such as a keyboard, a mouse, and a joystick. However, the video display could include, for instance, a touch screen such that the user can input directly without the aid of peripheral I/O devices.

[0030] Thus, in one aspect, the invention includes an apparatus that may be employed as a subsystem (e.g., the sighting and targeting subsystem 310) in a controlled system (e.g., the controlled system 300). This apparatus comprises a program storage medium (e.g., the program storage medium 312), a controller (e.g., the controller 314), and a controller interface (e.g., the controller interface 316). The program storage medium, when the apparatus is employed in accordance with the present invention, stores a three-dimensional data set representing a field of view (e.g., the field of view 120). The controller generates a presentation of the three-dimensional data set. A user then sights and targets a position in that field of view from a presentation of the three-dimensional data by indicating a subset of that presentation. This aspect further includes a method in which the user sights a position correlated to a subset of the three-dimensional data set representing the field of view and targets a controlled system to the position from the three-dimensional data set.

[0031] For a more concrete example, consider the scenario 400 presented in FIG. 4. The scenario 400 occurs in a military context, although the invention is not so limited. In the scenario 400, a vehicle 410 includes a weapon platform and is equipped and operated in accordance with the present invention. The occupant (not shown) of the vehicle 410 is interested in destroying, or at least incapacitating, the lead tank 420 in an advancing column within the field of view 430. The occupant employs the present invention to sight and target the weapons system (also not shown) on the position 440 of the lead tank 420. Once the weapons system is targeted, it can be fired.

[0032] The vehicle 410 is equipped with a laser radar ("LADAR") based data acquisition system (not shown) that paints the field of view 430 with a number of laser pulses (also not shown). The laser pulses propagate through the field of view 430 until they encounter an object (e.g., the tank 420, a tree, or the ground) and are reflected back to the vehicle 410. From the reflected pulses, the data acquisition system extracts a three-dimensional data set. Any suitable LADAR system known to the art may be employed. The vehicle 410 is also equipped with a rack-mounted computing apparatus 500, conceptually illustrated in FIG. 5. The computing apparatus 500 includes a processor 505 communicating with some storage 510 over a bus system 515. The storage 510 may include a hard disk and/or RAM and/or removable storage such as the floppy magnetic disk 515 and the optical disk 520. The storage 510 is encoded with a data structure 525 storing the three-dimensional data set acquired as discussed above. Thus, the storage 510 is one implementation of the program storage medium 312 (shown in FIG. 3).
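The range extraction underlying this acquisition step is straightforward to sketch: each pulse's round-trip time fixes the distance to the reflecting surface. This is the generic time-of-flight relation, not a detail taken from the patent.

```python
SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def pulse_range(round_trip_seconds):
    """Range to the reflecting surface from a LADAR pulse's round-trip
    time. The pulse travels out and back, so the one-way range is half
    the total light-travel distance."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0
```

A 2-microsecond round trip, for instance, corresponds to a target roughly 300 m away.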

[0033] The storage 510 is also encoded with an operating system 530 and some user interface software 535 that, in conjunction with the display 540, constitute a user interface 545. The user interface 545 is one implementation of the controller interface 316 in FIG. 3. As previously noted, the display 540 may be a touch screen allowing the user to input directly into the computing apparatus. However, the user interface 545 may include peripheral I/O devices such as the keyboard 550, the mouse 555, or the joystick 560, for use with other types of displays, e.g., a HUD. The processor 505 runs under the control of the operating system 530, which may be practically any operating system known to the art. The processor 505, under the control of the operating system 530, invokes the user interface software 535 on startup so that the operator can control the computing apparatus 500.

[0034] The storage 510 is also encoded with an application 565 invoked by the processor 505 under the control of the operating system 530 or by the user through the user interface 545. The application 565, when executed by the processor 505, performs any processing or pre-processing on the three-dimensional data stored in the data structure 525. The application 565 also displays the three-dimensional image 450 of the field of view 430 (shown in FIG. 4), or a portion thereof, on the display 540. The three-dimensional image 450 may be presented to an occupant of the vehicle 410 or to a remote user. The user then sights the weapons subsystem (not shown) by indicating a subset 460 of the three-dimensional image 450. The manner in which this designation occurs will be implementation specific, depending upon the manner in which I/O is to occur, e.g., by touching a touch screen or designating with a mouse or joystick.

[0035] The invention admits wide variation in many aspects. For instance, the invention is not limited to ground-based or stationary controlled systems, and the user may be local or remote relative to the controlled system. Consider, for instance, the scenario 600 shown in FIG. 6, in which the controlled system is implemented as a flying submunition 610 and the target is a moving ship 620. The flying submunition 610 transmits laser pulses 630 that are reflected from the ship 620. The three-dimensional data is extracted from the returned signals 640, and either the three-dimensional data or a three-dimensional image is transmitted to a remote user aboard, for instance, an aircraft 650. This information is transmitted by electromagnetic signals 660. Note that data pre-processing and processing usually occur where the three-dimensional image is generated, although this is not necessary to the practice of the invention. The three-dimensional image is then displayed to the remote user, who then makes the designation to sight the flying submunition 610 on the ship 620. The designation is transmitted by the electromagnetic signals 670 to the flying submunition 610. Upon receiving the designation, the flying submunition 610 then targets the ship 620 for destruction. Note that alternative scenarios might be ground-to-air or air-to-air scenarios.

[0036] FIG. 7 presents one implementation 700 of the scenario 400 shown in FIG. 4. The implementation 700 is modified from a data acquisition and target identification process first shown in:

[0037] U.S. Pat. No. 5,644,386, entitled “Visual Recognition System for LADAR Sensors,” issued Jul. 1, 1997, to Loral Vought Systems Corp. as assignee of the inventors Gary Kim Jenkins, et al.

[0038] This patent discloses a method by which targets are identified from three-dimensional images generated from three-dimensional data. The present invention employs a three-dimensional data set such as the one obtained by this prior art method to both sight and target the controlled system, i.e., a weapon platform in this scenario.

[0039] In general, a LADAR system scans a target scene to provide on-site a three-dimensional image (representation) of the target scene. This image is processed to detect and segment potential targets. The segmentations representing these potential targets are then further processed by feature extraction and classification processes to identify the target. The segmentations of targets of interest, completed prior to feature extraction and classification, are displayed locally, transmitted to a remote site for display, or both. Because only the segmented targets, rather than the entire scene, are transmitted, this process allows communications over data links of limited bandwidth. A position within this segmented target image may then be sighted from this segmented three-dimensional image. A weapon system can then be targeted using the same three-dimensional data set.
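The bandwidth saving described above comes from shipping only the segmented regions. A toy sketch, with hypothetical names and a simple rectangular-box cutout standing in for the patent's segmentation process:

```python
def segment_targets(range_image, detections):
    """Cut each detected region of interest out of the full scene so that
    only these small packets, rather than the entire scene, need cross a
    limited-bandwidth link. Each detection is a (row, col, height, width)
    box; real segmentation would trace target silhouettes instead."""
    packets = []
    for row0, col0, height, width in detections:
        packets.append([
            row[col0:col0 + width]
            for row in range_image[row0:row0 + height]
        ])
    return packets
```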

[0040] Referring now to FIG. 7, a system 710 is shown for producing, processing, displaying, and transmitting images of one or more targets in a target scene 712. The system 710 includes a vehicle 714 housing a weapon platform 715, a transmitter and sensor platform 716, and a processing center 717. The platform 716 includes a conventional LADAR system that generates and directs a laser beam to scan the target scene 712, including the targets. Reflected laser light is detected by the platform 716, and the processing center 717 processes the reflected light in a conventional manner into scan data representative of the target scene 712. A Global Positioning System ("GPS") transmitter 718 transmits a signal for providing accurate position data for the vehicle 714.

[0041] The system 710 processes three-dimensional LADAR images of the target scene 712 in the processing center 717, manipulates the resulting data into packets of information, each containing a segment of the scene, and transmits these packets over a communications link 720 to a remote site 722 for display on a display 724. The remote site 722 may be, for example, a combat platform or a command and control node that has access to data characterizing the local scene.

[0042]FIG. 8 depicts one embodiment of the sensor platform 716 and processing platform 717 housed on the vehicle 714 in FIG. 7. The sensor platform 716 supports a GPS receiver 830 disposed to receive a signal 833 from the GPS transmitter 718 in FIG. 7. A thermal imager 832 provides passive field of view search capability of the target scene 712 in FIG. 7. A LADAR sensor 834 generates a scan signal representative of an image of the target scene 712 in FIG. 7 by scanning a laser beam across the scene 712 and detecting reflections of the laser beam as it scans the scene 712. Although a number of different known LADAR arrangements may be employed, a suitable system is disclosed in:

[0043] U.S. Pat. No. 5,200,606, entitled “Laser Radar Scanning System”, filed Jul. 2, 1991, to LTV Missiles and Electronics Group as assignee of the inventors Nicholas J. Krasutsky et al.

[0044] One suitable implementation for the sensor 834 and associated pulse processing circuitry is disclosed in:

[0045] U.S. Pat. No. 5,243,553 entitled “Gate Array Pulse Capture Device” filed Jul. 2, 1991, to Loral Vought Systems Corp. as assignee of the inventor Stuart W. Flockencier.

[0046] The processing center 717 includes a digital processor 836 for processing the scan signal into three-dimensional LADAR images. A data manipulator 838 is provided for manipulating selected three-dimensional LADAR image data into packets of information which may be displayed by a local display terminal 840 at the vehicle 714 (shown in FIG. 7) for on-site display of the target 721. A data transmitter 842 is also provided for transmitting the packets of segmented information over the limited bandwidth communications link 720 in FIG. 7.

[0047] In operation, the vehicle 714 in FIG. 7 maneuvers into position to survey the target scene 712 in FIG. 7. The position of the vehicle 714 is read from the GPS receiver 830 housed on the sensor platform 716 in FIG. 7. The target scene 712 in FIG. 7 is scanned with the LADAR sensor 834 which is aligned with a compass providing a north reference. The LADAR sensor 834 collects data from the scanned target scene 712 in FIG. 7 and generates scan data representative of a three-dimensional image. The digital processor 836 of processing center 717 further processes the scan data. This processing generally involves initially representing detected signals as data elements in a spherical coordinate system, wherein each data element includes a range value and an intensity value that correspond to a point on the target scene 712 in FIG. 7.

[0048] The processor 836 then converts these data elements into a row-column format, where the row and column indices represent the two angles of the spherical coordinate system and each data element stores the range. In particular, the digital processor 836 initially processes the scan data into a three-dimensional LADAR image according to a spherical coordinate system whose origin coincides with the location of the LADAR sensor's detecting optics. This may be performed in accordance with known techniques.

[0049] The spherical coordinate system is convenient in generating the three-dimensional images since the angular position of a point in the target scene 712 in FIG. 7 may be measured with respect to axes that coincide with the axes of rotation of the LADAR sensor's detecting optics during scanning of the target scene 712 in FIG. 7. Moreover, the spherical coordinate system is conducive to storing the range of a point in the target scene 712 in FIG. 7, since this range corresponds to a radius from the LADAR sensor's detecting optics to the point. Each data element also includes an intensity value, representative of the intensity of the reflected light. Additionally, each data element includes an azimuth angle and an elevation angle. As indicated, this three-dimensional LADAR image is stored by the processor 836 in a row-column format for later use.
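
The patent describes this row-column storage only in prose. The following Python sketch (illustrative only, not the patent's implementation; the angular origin and step sizes az0, el0, d_az, d_el are assumed parameters of a hypothetical sensor) shows how spherical data elements can be binned so that row and column encode the two scan angles and the stored value is the range:

```python
def to_row_column(samples, az0, el0, d_az, d_el, rows, cols):
    """Bin spherical LADAR samples (azimuth, elevation, range, intensity)
    into row-column images; the row and column indices encode the two
    scan angles and each data element stores the range."""
    rng = [[0.0] * cols for _ in range(rows)]
    intens = [[0.0] * cols for _ in range(rows)]
    for az, el, r, i in samples:
        row = int((el - el0) / d_el)   # elevation angle -> row index
        col = int((az - az0) / d_az)   # azimuth angle  -> column index
        if 0 <= row < rows and 0 <= col < cols:
            rng[row][col] = r          # stored data element is the range
            intens[row][col] = i       # reflected-light intensity
    return rng, intens

# One sample at azimuth 0.02 rad, elevation 0.01 rad, range 150 m:
rng, _ = to_row_column([(0.02, 0.01, 150.0, 0.8)],
                       az0=0.0, el0=0.0, d_az=0.01, d_el=0.01,
                       rows=4, cols=4)
```

Unfilled cells default to a zero range, which is how the dropout pixels discussed below can arise.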

[0050]FIG. 9 illustrates the handling of the three-dimensional data set acquired as discussed immediately above. The LADAR three-dimensional data in row column format (at 950) is further processed by the digital processor 836 or, alternatively, by an off-site processor (not shown) such as a personal computer, a mini-computer, or other suitable computing device. This further processing generally involves preprocessing (at 952), detection (at 954), segmentation (at 956), feature extraction (at 958), and classification (at 960).

[0051] Generally, the preprocessing (at 952) is directed to minimizing noise effects, such as identifying so-called intensity dropouts in the converted three-dimensional image, where the range value of the LADAR image is set to zero. Noise introduced into the converted three-dimensional LADAR image by low signal-to-noise ratio (“SNR”) conditions is processed so that performance of the overall system 710 is not degraded. In this regard, the converted LADAR image signal is processed so that absolute range measurement distortion is minimized, edges are preserved, and texture steps (which result from actual structure in the objects being imaged) are preserved.
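
The patent does not specify a particular filter for dropout handling. As one hypothetical illustration, the Python sketch below fills zero-range dropout pixels with the median of their valid 3x3 neighbours, a simple choice that leaves valid ranges untouched so that edges and texture steps are preserved:

```python
def fill_dropouts(rng):
    """Replace intensity-dropout pixels (range == 0) with the median of
    their non-zero 3x3 neighbours; valid pixels are left as-is so edges
    and texture steps in the range image are preserved."""
    rows, cols = len(rng), len(rng[0])
    out = [row[:] for row in rng]
    for r in range(rows):
        for c in range(cols):
            if rng[r][c] != 0.0:
                continue  # valid range: keep untouched
            nbrs = sorted(
                rng[rr][cc]
                for rr in range(max(0, r - 1), min(rows, r + 2))
                for cc in range(max(0, c - 1), min(cols, c + 2))
                if rng[rr][cc] != 0.0)
            if nbrs:
                out[r][c] = nbrs[len(nbrs) // 2]  # median of valid neighbours
    return out

img = [[100.0, 100.0, 100.0],
       [100.0,   0.0, 100.0],   # centre pixel is a dropout
       [100.0, 100.0, 100.0]]
filled = fill_dropouts(img)
```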

[0052] In general, detection (at 954) identifies specific regions of interest in the three-dimensional LADAR image. The detection (at 954) uses range cluster scores as a measure to locate flat, vertical surfaces in an image. More specifically, a range cluster score is computed at each pixel to determine whether the pixel lies on a flat, vertical surface. The flatness of a particular surface is determined by counting how many pixels in a small region of interest fall within a given range of one another. The given range is defined by a threshold value that can be adjusted to vary performance. If a computed range cluster score exceeds a specified threshold value, the corresponding pixel is marked as a detection. If a corresponding group of pixels meets specified size criteria, the group of pixels is referred to as a region of interest. Regions of interest, for example those containing one or more targets, are determined and passed to a segmenter for further processing.
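
The range cluster score described above can be sketched as follows. This is an illustrative Python example under assumed parameters (window size, range tolerance, and score threshold are all hypothetical tuning values), not the patent's algorithm verbatim:

```python
def range_cluster_score(rng, r, c, window=2, tol=1.0):
    """Count pixels in a (2*window+1)^2 neighbourhood whose range lies
    within `tol` of the centre pixel; a high count suggests the pixel
    sits on a flat, vertical surface at roughly constant range."""
    rows, cols = len(rng), len(rng[0])
    centre = rng[r][c]
    score = 0
    for rr in range(max(0, r - window), min(rows, r + window + 1)):
        for cc in range(max(0, c - window), min(cols, c + window + 1)):
            if abs(rng[rr][cc] - centre) <= tol:
                score += 1
    return score

def detect(rng, threshold=20, **kwargs):
    """Mark every pixel whose range cluster score exceeds the threshold."""
    return [(r, c)
            for r in range(len(rng))
            for c in range(len(rng[0]))
            if range_cluster_score(rng, r, c, **kwargs) > threshold]

flat = [[50.0] * 5 for _ in range(5)]  # a flat patch at constant range
detections = detect(flat)
```

Raising the tolerance or lowering the threshold admits rougher surfaces; the patent notes this trade-off is adjustable to vary performance.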

[0053] Segmentation (at 956) determines, for each detection of a target, which pixels in a region of interest belong to the detected target and which belong to the detected target's background. Segmentation (at 956) identifies possible targets, for example, those whose connected pixels exceed a height threshold above the ground plane. More specifically, the segmentation (at 956) separates target pixels from adjacent ground pixels and the pixels of nearby objects, such as bushes and trees.
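
One simple way to realize the segmentation step (illustrative Python only; the patent does not prescribe this algorithm, and the height map, seed, and threshold are hypothetical) is to grow a connected region from a detected pixel, keeping only pixels above a ground-plane height threshold:

```python
from collections import deque

def segment(height, seed, min_height=1.0):
    """Grow a segment from a detected pixel, collecting 4-connected
    pixels whose height above the ground plane exceeds the threshold;
    ground pixels and low clutter (bushes, short objects) act as the
    background that stops the region growing."""
    rows, cols = len(height), len(height[0])
    seen, queue, target = set(), deque([seed]), []
    while queue:
        r, c = queue.popleft()
        if (r, c) in seen or not (0 <= r < rows and 0 <= c < cols):
            continue
        seen.add((r, c))
        if height[r][c] < min_height:
            continue  # background pixel: do not grow past it
        target.append((r, c))
        queue.extend([(r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)])
    return target

hmap = [[0.0, 0.0, 0.0, 0.0],
        [0.0, 2.5, 2.7, 0.0],
        [0.0, 2.6, 2.4, 0.0],
        [0.0, 0.0, 0.0, 0.0]]
pixels = segment(hmap, seed=(1, 1))  # the four tall pixels form one target
```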

[0054] Feature extraction (at 958) provides information about a segmentation (at 956) so that the target and its features in that segmentation can be classified. Features include, for example, orientation, length, width, height, radial features, turret features, and moments. The feature extraction (at 958) also typically compensates for errors resulting from segmentation (at 956) and other noise contamination. Feature extraction (at 958) generally determines a target's three-dimensional orientation and size. The feature extraction (at 958) also distinguishes between targets and false alarms and between different classes of targets.
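
As a sketch of how orientation and size features might be derived (the patent names the features but not a method; this example uses second moments of the planar footprint, which is an assumption, not the patent's technique):

```python
import math

def extract_features(points):
    """Estimate a segmented target's planar orientation from the second
    moments of its (x, y) footprint, then measure length and width along
    the principal axes and height from the z extents."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    zs = [p[2] for p in points]
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    theta = 0.5 * math.atan2(2 * sxy, sxx - syy)  # principal-axis yaw
    # Project the footprint onto the principal axes for length and width.
    u = [(x - mx) * math.cos(theta) + (y - my) * math.sin(theta)
         for x, y in zip(xs, ys)]
    v = [-(x - mx) * math.sin(theta) + (y - my) * math.cos(theta)
         for x, y in zip(xs, ys)]
    return {"orientation": theta,
            "length": max(u) - min(u),
            "width": max(v) - min(v),
            "height": max(zs) - min(zs)}

# Three points along the x-axis, 2 m tall at the middle:
feats = extract_features([(0.0, 0.0, 0.0), (6.0, 0.0, 0.0), (3.0, 0.0, 2.0)])
```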

[0055] Classification (at 960) determines which segmentations contain particular targets, usually in a two-stage process. First, features such as length, width, height, height variance, height skew, height kurtosis, and radial measures are used to discard non-target segmentations. The segmentations that survive this step are then matched against true target data stored in a target database. The data in the target database may include, for example, length, width, height, average height, hull height, and turret height. The classification (at 960) is performed using known methods for table look-ups and comparisons.
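
The two-stage classification can be sketched as below. The database entries, gate values, and distance metric are all hypothetical placeholders; the patent only says that look-ups and comparisons against stored true-target data are used:

```python
TARGET_DB = {  # hypothetical true-target dimensions (metres)
    "tank":  {"length": 7.0, "width": 3.5, "height": 2.4},
    "truck": {"length": 6.0, "width": 2.4, "height": 3.0},
}

def classify(features, max_length=12.0, max_height=5.0):
    """Stage 1: discard non-target segmentations with coarse size gates.
    Stage 2: match survivors to the closest target-database entry."""
    if features["length"] > max_length or features["height"] > max_height:
        return None  # stage 1: segmentation cannot contain a target
    def distance(entry):  # squared difference over shared dimensions
        return sum((features[k] - entry[k]) ** 2
                   for k in ("length", "width", "height"))
    return min(TARGET_DB, key=lambda name: distance(TARGET_DB[name]))

label = classify({"length": 6.9, "width": 3.4, "height": 2.5})
```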

[0056] Data obtained from the segmentation (at 956), the feature extraction (at 958), and the classification (at 960) is assembled into a packet of information (not shown). The packet may be rapidly and accurately transmitted to the remote site 722 in FIG. 7 and displayed in one of a variety of user-selectable formats. Typical formats include a three-view commonly used by armed forces to identify targets during combat, a north reference plan view, or a rotated perspective. These display options available to the operator, either local or remote, are based on the three-dimensional nature of the LADAR image. The results of the feature extraction (at 958) provide target information including orientation, length, width and height. The target image can be displayed from any perspective, independent of the sensor perspective, and the operator can select one of the several display formats that utilize the adjustable perspective.
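
To illustrate why transmitting only the segment suits a limited-bandwidth link, the hypothetical Python sketch below bundles the segmentation, features, and class label into a packet and compares its size against serializing the whole scene (the packet layout is an assumption; the patent does not define a wire format):

```python
import json

def build_packet(target_pixels, features, label):
    """Bundle only the segmented target and its derived data; the full
    scene stays local, keeping the packet small enough for a
    limited-bandwidth data link."""
    return json.dumps({
        "segment": target_pixels,   # pixels belonging to the target
        "features": features,       # orientation, length, width, height
        "class": label,
    })

full_scene = [[0.0] * 256 for _ in range(256)]  # entire LADAR range image
packet = build_packet([[1, 1], [1, 2]],
                      {"length": 6.9, "width": 3.4, "height": 2.5},
                      "tank")
# The packet is far smaller than the serialized full scene.
```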

[0057] Returning to FIG. 8, the data processed by the method 900 in FIG. 9 is transmitted as described over the communication link 720 to the remote site 722 and displayed on the remote display 724. Once the target image is displayed, the user can sight the weapon platform 715 on the target 721 (shown in FIG. 7) by indicating a point on the displayed image. Note that the sighting can also be performed in this manner on the local display 840, if desired. The sighted position is transmitted back to the processing center 717, whereupon the digital processor 836 issues pointing commands to the weapon platform 715. The control subsystem 844 then implements the pointing commands to complete the targeting.
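
Because the displayed image and the pointing commands come from the same three-dimensional data set, the sighted pixel can be mapped straight back to angles. The Python sketch below (illustrative; the angular grid parameters are the same hypothetical ones used to build the row-column image) inverts the row-column mapping into an azimuth/elevation command:

```python
def pointing_command(row, col, az0, el0, d_az, d_el):
    """Invert the row-column mapping: the pixel the operator sights on
    the display becomes an (azimuth, elevation) command for the weapon
    platform, using the same angular grid that built the image."""
    azimuth = az0 + (col + 0.5) * d_az    # centre of the sighted column
    elevation = el0 + (row + 0.5) * d_el  # centre of the sighted row
    return azimuth, elevation

az, el = pointing_command(row=1, col=2, az0=0.0, el0=0.0,
                          d_az=0.01, d_el=0.01)
```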

[0058] Although the invention is described above relative to military fire control systems, the invention is not so limited. The above-described invention makes it possible, in a number of military and civilian applications, to integrate sighting and targeting activities more accurately than the state of the art allows. For instance, robotic tools used in hazardous waste cleanup may be operated more safely and efficiently, since the sighting and targeting of the robotic tool (e.g., to a hazardous waste deposit needing cleanup) are performed from a common three-dimensional data set. Other civilian applications, such as law enforcement, robotics, fire fighting, crime deterrence, and border patrol functions, may also benefit from the application of the present invention. Thus, the present invention is not limited to applications found in a military context.

[0059] Some portions of the detailed descriptions herein are presented in terms of a software implemented process involving symbolic representations of operations on data bits within a memory in a computing apparatus. These descriptions and representations are the means used by those in the art to most effectively convey the substance of their work to others skilled in the art. These processes and operations require physical manipulations of physical quantities. Usually, though not necessarily, these quantities take the form of electrical, magnetic, or optical signals capable of being stored, transferred, combined, compared, and otherwise manipulated. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements, symbols, characters, terms, numbers, or the like.

[0060] It should be borne in mind, however, that all of these and similar terms are to be associated with the appropriate physical quantities and are merely convenient labels applied to those quantities. Unless specifically stated or otherwise apparent, throughout the present disclosure these descriptions refer to the actions and processes of an electronic device that manipulates and transforms data represented as physical (electronic, magnetic, or optical) quantities within the device's storage into other data similarly represented as physical quantities within that storage, or within transmission or display devices. Exemplary of the terms denoting such a description are, without limitation, “processing,” “computing,” “calculating,” “determining,” “displaying,” and the like.

[0061] The following references are hereby incorporated by reference for their teachings regarding techniques and principles associated with cited subjects:

[0062] data acquisition, pre-processing, processing, and display:

[0063] U.S. Pat. No. 5,644,386, entitled “Visual Recognition System for LADAR Sensors,” issued Jul. 1, 1997, to Loral Vought Systems Corp. as assignee of the inventors Gary Kim Jenkins, et al.;

[0064] U.S. Pat. No. 5,424,823, entitled “System for Identifying Flat Orthogonal Objects Using Reflected Energy Signals”, issued Jun. 13, 1995, to Loral Vought Systems Corporation as assignee of the inventors James L. Nettles, et al.;

[0065] for selected hardware useful in implementing the invention in certain embodiments, and in particular data acquisition:

[0066] U.S. Pat. No. 5,200,606, entitled “Laser Radar Scanning System”, filed Jul. 2, 1991, to LTV Missiles and Electronics Group as assignee of the inventors Nicholas J. Krasutsky et al.;

[0067] U.S. Pat. No. 5,243,553 entitled “Gate Array Pulse Capture Device” filed Jul. 2, 1991, to Loral Vought Systems Corp. as assignee of the inventor Stuart W. Flockencier.

[0068] Each of these patents is commonly assigned herewith.

[0069] This concludes the detailed description. The particular embodiments disclosed above are illustrative only, as the invention may be modified and practiced in different but equivalent manners apparent to those skilled in the art having the benefit of the teachings herein. Furthermore, no limitations are intended to the details of construction or design herein shown, other than as described in the claims below. It is therefore evident that the particular embodiments disclosed above may be altered or modified and all such variations are considered within the scope and spirit of the invention. Accordingly, the protection sought herein is as set forth in the claims below.

Referenced by
Citing patents (filing date; publication date; applicant; title):
- US7032495 * (Jan 24, 2003; Apr 25, 2006) Rheinmetall Landsysteme GmbH, “Combat vehicle having an observation system”
- US8050863 * (Mar 16, 2006; Nov 1, 2011) Gray & Company, Inc., “Navigation and control system for autonomous vehicles”
- US8245623 (Dec 7, 2010; Aug 21, 2012) Bae Systems Controls Inc., “Weapons system and targeting method”
- US8346480 (Sep 22, 2011; Jan 1, 2013) Gray & Company, Inc., “Navigation and control system for autonomous vehicles”
- US20130002525 * (Jun 29, 2011; Jan 3, 2013) Bobby Duane Foote, “System for locating a position of an object”
- DE102011106810 A1 * (Jul 7, 2011; Jan 10, 2013) Testo AG, “Handheld type thermal imaging camera has image analyzing and processing unit that is provided to perform image analyzing and image processing for selected picture area”
- WO2013144502 (Mar 27, 2013; Oct 3, 2013) Nexter Systems, “Method for acquiring the coordinates of a triggering point of a projectile and fire control implementing such a method”
Classifications
- U.S. Classification: 89/41.05
- International Classification: F41G3/06, F41G3/16, F41G3/22
- Cooperative Classification: F41G3/06, F41G3/165, F41G3/225
- European Classification: F41G3/06, F41G3/16B, F41G3/22B
Legal Events
- Jan 30, 2002 (AS: Assignment)
  Owner name: LOCKHEED MARTIN CORPORATION, TEXAS
  Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: STEWART, JOHN R.; REEL/FRAME: 012552/0697
  Effective date: 20020124