CA2250063A1 - Airborne imaging system - Google Patents
Airborne imaging system
- Publication number
- CA2250063A1
- Authority
- CA
- Canada
- Prior art keywords
- data
- sensor
- imu
- aircraft
- gps
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C11/00—Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
- G01C11/02—Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures
Abstract
A remote data collection system, which may be used in a vehicle such as an aircraft or a ground vehicle, includes a directional sensor, such as one or more cameras, for sensing a characteristic of interest and providing sensor data. The system further includes a global positioning system (GPS) receiver for providing GPS data representative of the position of the sensor, an inertial measurement unit (IMU) for providing IMU data representative of the attitude of the sensor, a processing unit and a storage unit. The processing unit determines geographic data referenced to the sensor data in response to the GPS data and the IMU data. The processing unit may utilize an error model to determine IMU errors which may be used in determining the geographic data with high accuracy. The sensor data and the geographic data are stored in the data storage unit for subsequent use. The system may include a stabilized platform on which the sensor and the IMU are mounted. The stabilized platform is rotated about at least one axis of rotation to control the sensing direction of the sensor as the vehicle moves.
Description
CA 02250063 1998-09-22
AIRBORNE IMAGING SYSTEM
Field of the Invention
This invention relates to vehicle-mounted sensing systems and, more particularly, to high resolution, low cost sensing systems which operate on a moving vehicle. The invention is particularly useful for airborne imaging, but is not limited to this use.
Background of the Invention
The remote sensing market originated with early satellites and goals of global monitoring of terrestrial activities. However, the cost of data and the response times for obtaining data have limited the broad applicability of imagery for use in day-to-day business operations.
Nevertheless, for relatively small scale users, such as farmers, city planners, utilities managers and forest managers, the computerized geographic information system is recognized as the information integration tool of the future. Geographic information systems include computer tools for locating geographic coordinates of points within images, for overlaying maps and images, and for making quantitative measurements, such as areas, distances and precise locations of objects, from the images. Applications as diverse as (1) a farmer selecting chemical application strategies based upon expected crop yield predictions, (2) a tax assessor directing a manual inspection of an observed dwelling addition, or (3) assessing timber yields and timber harvest costs from multiple property tracts are but a few of the numerous potential applications of imagery and geographic information systems.
The profit structure of the agricultural industry is heavily dominated by the use of chemicals (fertilizers, pesticides and herbicides), with a trend toward the use of more chemicals. However, since the EPA and world environmental organizations recognize long term hazards of unabated chemical treatments, chemical regulations are throttling the agricultural business. Of equal significance, as populations increase, more land is cleared for farming, and reduced food yields per acre lead to higher deforestation and even greater threats to the environment. The recognized answer to these global scale problems is metered usage of chemicals, such that the chemicals are used where they are of maximum benefit. Metered chemical distribution systems are in wide use. However, the data to define the spatial metering values are lacking. Multispectral imagery with sub-meter resolution and spatial registration is required. The imagery must be available at low cost and with short response times to requests for imagery.
A satellite system known as the SPOT satellite, sponsored by the government of France, is representative of current operational satellite capabilities. This system provides 10 meter resolution panchromatic imagery or 20 meter resolution imagery in the visible/near infrared bands. Geodetic registration is accurate to 15 meters in the U.S., where ground control points are plentiful and well surveyed. Experience has shown that the response time for imagery requests is usually no better than 10 days. The drawbacks to satellite imagery include the potential for cloud coverage (about ⅔ of the earth is cloud covered), high costs and the inflexibility of satellite imagery collection systems. Custom tailored resolutions, spectral bands and responsiveness cannot be provided by satellite systems.
Airborne imagery is commonly used by many small scale users. Benefits of airborne imagery collection include tailorable resolution, response time and data processing methodology.
Drawbacks include higher costs and the need to contract for a dedicated aircraft, flight crew and post-mission processing system.
A variety of airborne sensing and survey systems have been disclosed in the prior art. A survey system for obtaining geophysical data with aircraft using real time differential operation of the global positioning system is disclosed in U.S. Pat. No. 4,814,711 issued March 21, 1989 to Olsen et al. An airborne system using two color video cameras and an IR imager head mounted below the fuselage of an aircraft is disclosed in U.S. Pat. No. 5,166,789 issued November 24, 1992 to Myrick. Latitude and longitude information obtained from the global positioning system is recorded on each image frame. A system including a CCD camera and a global positioning system receiver for recording an image signal and position information on magnetic tape is disclosed in U.S. Pat. No. 5,267,042 issued November 30, 1993 to Tsuchiya et al. A technique for airborne imaging wherein multiple overlapping images are superposed by observing a stationary object that appears in adjacent images is disclosed in U.S. Pat. No. 5,247,356 issued September 21, 1993 to Ciampa. A technique for generating high resolution images from a CCD camera in an aircraft is disclosed in U.S. Pat. No. 5,251,037 issued October 5, 1993 to Busenberg. A technique for generating high resolution vidicon aerial images is disclosed in U.S. Pat. No. 5,270,756 issued December 14, 1993 to Busenberg. An airborne contour mapping system is disclosed in U.S. Pat. No. 3,191,170 issued June 22, 1965 to Lustig et al. A technique for remote sensing using inertial navigation systems and the global positioning system for georeferencing of remotely sensed data is described by K.P. Schwarz et al. in Photogrammetric Engineering & Remote Sensing, Vol. 59, No. 11, November 1993, pp. 1667-1674.
The examples above are characterized by components integrated tightly to the aircraft so that a dedicated aircraft is required. None of the prior art airborne imaging systems have been practical from the viewpoint of small scale users with respect to cost, resolution, flexibility and response time.
Summary of the Invention
According to the present invention, methods and apparatus for remote data collection are provided. The invention is used in a vehicle such as an aircraft or a ground vehicle. A remote data collection system in accordance with the invention comprises a directional sensor for sensing a characteristic of interest and providing sensor data representative of the characteristic of interest, a global positioning system (GPS) receiver for providing GPS data representative of position and velocity of the sensor, an inertial measurement unit (IMU) for providing IMU data representative of attitude rate and acceleration of the sensor, a processing unit and a data storage unit. The processing unit determines IMU errors in response to the IMU data and an IMU error model. The processing unit also determines geographic data referenced to the sensor data in response to the GPS data, the IMU data and the IMU errors. The sensor data and the geographic data are stored in the data storage unit for subsequent use. The sensor may comprise one or more cameras, and the sensor data may represent images.
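The georeferencing step just described combines GPS position with IMU attitude to tie ground coordinates to each image. A minimal sketch of that geometry is below; this is an illustrative flat-earth, small-angle approximation, and the function name and spherical-earth constant are assumptions, not the patent's error-model formulation.

```python
import math

def georeference_nadir(lat, lon, alt_agl, roll, pitch, yaw):
    """Project the camera boresight to an approximate ground coordinate.

    lat, lon are the GPS sensor position in degrees; alt_agl is
    altitude above ground in metres; roll, pitch, yaw are IMU
    attitude angles in radians. Flat-earth sketch: the boresight
    ground offset from the nadir point is alt * tan(angle) per axis.
    """
    EARTH_RADIUS = 6_371_000.0  # metres, spherical approximation

    # Boresight offsets in the aircraft body frame (metres)
    dx_body = alt_agl * math.tan(pitch)  # along-track
    dy_body = alt_agl * math.tan(roll)   # cross-track

    # Rotate body-frame offsets into north/east by heading (yaw)
    d_north = dx_body * math.cos(yaw) - dy_body * math.sin(yaw)
    d_east = dx_body * math.sin(yaw) + dy_body * math.cos(yaw)

    # Convert metre offsets to degrees of latitude/longitude
    dlat = math.degrees(d_north / EARTH_RADIUS)
    dlon = math.degrees(d_east / (EARTH_RADIUS * math.cos(math.radians(lat))))
    return lat + dlat, lon + dlon
```

With zero attitude the boresight is at nadir and the image centre georeferences to the GPS position itself; a nose-up pitch shifts the footprint forward along the heading.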
According to another aspect of the invention, a remote data collection system for use in a vehicle comprises a stabilized platform, a directional sensor rigidly mounted to the stabilized platform for sensing a characteristic of interest and providing sensor data representative of the characteristic of interest, an inertial measurement unit (IMU) rigidly mounted to the stabilized platform for providing IMU data representative of attitude of the sensor, a control unit responsive to the IMU data for rotating the stabilized platform about at least one axis of rotation with respect to the vehicle to control the sensing direction of the sensor as the vehicle moves, a global positioning system (GPS) receiver for providing GPS data representative of position of the sensor, a processing system responsive to the GPS data and the IMU data for determining geographic data referenced to the sensor data, and a data storage unit for storing the sensor data and the geographic data for subsequent use.
The control unit may comprise a gimbal, a support member rigidly mounted to the vehicle, a first motor connected between the support member and the gimbal for rotating the gimbal about a first axis relative to the support member, and a second motor connected between the gimbal and the stabilized platform for rotating the stabilized platform about a second axis relative to the gimbal. The direction of the sensor is stabilized with respect to the first and second axes as the vehicle moves. In a preferred embodiment, the vehicle comprises an aircraft, and the control unit stabilizes the sensing direction of the sensor in a vertical orientation with respect to pitch and roll of the aircraft.
All facets of the invention are optimized for low operational cost and rapid response. The sensor and the IMU are integrated into a readily detachable vehicle component (e.g. a cargo door) so that no modifications to the vehicle are required. The IMU used for sensor stabilization is also used for attitude determination and navigation, thus reducing cost. A mission planning/vehicle steering command system is included to allow one-man operation for cost reduction and operational ease of use.
Brief Description of the Drawings
For a better understanding of the present invention, reference is made to the accompanying drawings, which are incorporated herein by reference and in which:
FIG. 1 is a block diagram of an embodiment of an airborne imaging system in accordance with the present invention;
FIG. 2 is a pictorial diagram showing the components of the airborne imaging system of FIG. 1;
FIG. 3 is a pictorial diagram illustrating the airborne imaging system of FIG. 1 installed on an aircraft;
FIG. 4 is a top schematic view showing the stabilized platform assembly mounted on the door of an aircraft;
FIG. 5 is a top view showing the stabilized platform assembly in more detail;
FIG. 6 is an exploded view of the stabilized platform assembly;
FIG. 7 is a software flow diagram that illustrates operation of the airborne imaging system;
and FIG. 8 is a pictorial diagram that illustrates an example of an airborne survey mission.
Detailed Description
A block diagram of an embodiment of an imaging system in accordance with the invention is shown in FIG. 1. The imaging system includes an airborne data collection system 10 and a ground processing workstation 12, and makes use of a GPS ground station 14. The airborne data collection system 10 is typically mounted in an aircraft and is used for obtaining images of a prescribed survey area of the earth. For example, the imaging system may be used to obtain images of a prescribed agricultural area or forest area. As described in detail below, the ground processing workstation 12 is used to define a trajectory to be followed by the aircraft in order to obtain images of the survey area with complete coverage. The ground processing workstation 12 may also be used for post-mission processing of image data and for administrative functions.
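The trajectory definition described above amounts to laying out parallel flight lines whose spacing guarantees complete coverage of the survey area. A minimal sketch of that geometry follows; the function name and the 30% sidelap default are assumed illustrative values, not figures from the patent.

```python
import math

def flight_line_spacing(altitude_agl, fov_deg, sidelap=0.3):
    """Spacing (metres) between parallel flight lines for full coverage.

    altitude_agl is the flying height above ground in metres and
    fov_deg the camera's cross-track field of view in degrees. The
    ground swath width is 2 * h * tan(fov/2); adjacent lines are
    spaced so that images overlap sideways by the sidelap fraction.
    """
    swath = 2.0 * altitude_agl * math.tan(math.radians(fov_deg) / 2.0)
    return swath * (1.0 - sidelap)
```

For example, at 1000 m above ground with a 60° field of view, the swath is about 1155 m, so lines spaced roughly 808 m apart give 30% sideways overlap between adjacent passes.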
The airborne data collection system 10, in general, includes a directional sensor for generating sensor data, a global positioning system (GPS) receiver for providing GPS data representative of the position of the sensor, an inertial measurement unit (IMU) for providing IMU data representative of the attitude of the sensor, a processing unit responsive to the GPS data and the IMU data for providing geographic data referenced to the sensor data, and a data storage unit for storing the sensor data and the geographic data. The geographic data establishes the ground coordinates of the sensor data with high resolution. The airborne data collection system 10 may include a stabilized platform assembly for stabilizing the direction of the sensor during aircraft flight. The data collection system of the invention may be used in ground vehicles as well as in aircraft.
Referring again to FIG. 1, an embodiment of the airborne data collection system 10 includes cameras 20, 22 and 24. The cameras 20, 22 and 24 may, for example, be charge coupled device (CCD) cameras with filters having different spectral responses. The cameras 20, 22 and 24 supply image data and synchronization signals to a frame grabber 28. The frame grabber 28 supplies image data representative of individual images obtained by each of the cameras to a system computer 30. A disk storage unit 32 connected to the system computer 30 is used for storage of image data and geographic data. The airborne data collection system 10 further includes an inertial measurement unit 40 that provides IMU data to the system computer 30 through an IMU interface 42. The inertial measurement unit 40 is rigidly mechanically connected to cameras 20, 22 and 24 and typically senses acceleration and rotation rate with respect to three coordinate axes. A GPS receiver 46 receives positioning signals from GPS satellites through a GPS antenna 48. The GPS receiver 46 also receives positioning signals from the GPS ground station 14 through a data link antenna 50 and an RF modem 52. The GPS receiver 46 supplies GPS data and a GPS clock to the system computer 30. As known in the art, the GPS data accurately represents the position of GPS receiver 46 and therefore represents the position of cameras 20, 22 and 24. The system uses differential GPS for steering the vehicle to accuracies of 1-3 m. A kinematic GPS processing procedure is applied post-mission to allow determination of position accuracy to the 10 cm level.
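The differential GPS principle used here can be sketched as follows: the ground station at a surveyed location measures the error in its own GPS fix and broadcasts it over the RF modem, and the airborne receiver subtracts that common-mode error from its own fix. This is a simplified position-domain illustration with assumed names; real receivers apply corrections per-satellite to the pseudoranges rather than to computed positions.

```python
def apply_differential_correction(rover_fix, base_fix, base_truth):
    """Correct a rover GPS position with a ground-station observation.

    rover_fix is the airborne receiver's raw fix, base_fix the ground
    station's raw fix, and base_truth the station's surveyed position.
    All are (x, y, z) tuples in a common local metric frame. The
    common-mode error seen at the base is removed from the rover fix.
    """
    error = tuple(b - t for b, t in zip(base_fix, base_truth))
    return tuple(r - e for r, e in zip(rover_fix, error))
```

Because atmospheric and satellite-clock errors are nearly identical at the base station and a nearby aircraft, removing the base's observed error brings the rover fix from several metres to the 1-3 m level quoted above.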
When the airborne data collection system 10 includes a stabilized platform assembly, at least one stabilizing motor is provided. In the example of FIG. 1, a roll motor 60 stabilizes the cameras 20, 22 and 24 with respect to aircraft roll, and a pitch motor 62 stabilizes cameras 20, 22 and 24 with respect to aircraft pitch. Roll motor 60 is energized by system computer 30 through a motor amplifier 64, and pitch motor 62 is energized by system computer 30 through a motor amplifier 66. Each of the motors 60 and 62 includes an encoder which provides to system computer 30 a signal representative of motor angle with respect to a reference angle. Operation of the stabilized platform assembly is described in detail below. A power supply 70 receives aircraft power, typically 28 volts, and supplies operating power to the components of the airborne data collection system 10. A graphical display 72 is connected to system computer 30. As described below, the display 72 provides commands to the pilot when the aircraft deviates from a preplanned trajectory over the survey area.
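The stabilization loop implied above, in which the IMU senses the platform's attitude and the computer drives the roll and pitch motors through their amplifiers to null it, can be sketched per axis as a proportional-derivative control law. The gains, the command limit, and the function name are assumed illustrative values, not figures from the patent.

```python
def stabilization_step(platform_angle, platform_rate,
                       kp=20.0, kd=4.0, max_cmd=1.0):
    """One cycle of a single-axis platform-levelling loop (sketch).

    platform_angle (radians) and platform_rate (rad/s) come from the
    IMU mounted on the platform. The returned motor command, clipped
    to +/- max_cmd, drives the amplifier so as to null the angle,
    holding the cameras vertical as the aircraft rolls or pitches.
    """
    error = 0.0 - platform_angle           # setpoint: platform level
    cmd = kp * error - kd * platform_rate  # PD law; rate term damps
    return max(-max_cmd, min(max_cmd, cmd))
```

A positive tilt produces a negative (opposing) command, and the rate term damps the response so the platform settles rather than oscillating; the same law would run independently for the roll and pitch axes.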
FIG. 2 is a pictorial diagram that illustrates a preferred configuration of the airborne data collection system 10. A stabilized platform assembly includes cameras 20, 22 and 24, IMU 40, motors 60 and 62 and additional colllpol1enl~ described below. In a l"er~"ed embodiment, the stabilized platform assembly 80 is mounted to a cargo door 82 of an aircraft (not shown in FIG.
2). The airborne data collection system 10 further includes an electronics unit 84. The electronics unit 84 includes system computer 30, disk storage unit 32, GPS receiver 46, RF
modem 52, frame grabber 28, IMU interface 42, motor amplifiers 64 and 66, power supply 70 and display 72. The electronics unit 84 is interconnected to stabilized platform assembly 80 by a cable 86.
FIG. 3 is a pictorial diagram illustrating installation of the airborne data collection system of the present invention in an aircraft 90. As indicated above, stabilized platform assembly 80 is preferably mounted on cargo door 82. Electronics unit 84 is positioned in the cargo area of the aircraft 90. GPS antenna 48 may be mounted on the upper surface of the aircraft, and data link antenna 50 may be mounted on the underside of the aircraft. The pilot display 76 is positioned for convenient viewing by the pilot.
FIG. 4 is a pictorial diagram of the stabilized platform 80 mounted on cargo door 82. FIG. 5 shows the stabilized platform assembly in more detail, and FIG. 6 shows an exploded view of the stabilized platform assembly. Cameras 20, 22 and 24 are rigidly mounted to a stabilized platform 100. The stabilized platform 100 extends through an opening in cargo door 82 such that cameras 20, 22 and 24 are located externally of the aircraft. The opening in cargo door 82 provides sufficient clearance to permit movement of stabilized platform 100 with respect to the aircraft. In a preferred embodiment, the opening is a vertically-oriented slot to permit pitch and roll stabilization of the cameras. The cameras 20, 22 and 24 are protected by a cowling 102 that is open at the bottom. The IMU 40 is rigidly mounted to a portion of stabilized platform 100 within the aircraft. A support member 102 is mounted to an inside surface of cargo door 82, and roll motor 60 is secured to an inwardly-extending portion of support member 102. The shaft of roll motor 60 is connected to a gimbal 106. Pitch motor 62 is mounted to gimbal 106, and the shaft of pitch motor 62 is secured to stabilized platform 100. As shown in FIGS. 5 and 6,
cameras 20, 22 and 24 are mounted in a camera housing 120, which is attached to stabilized platform 100. Stabilized platform 100 is connected to gimbal 106 by a pivot pin 122 and is connected to the shaft of pitch motor 62 by a flange 124. Gimbal 106 is connected to the shaft of roll motor 60 by a flange 126.
When the roll motor 60 is energized, cameras 20, 22 and 24, stabilized platform 100, IMU
40, gimbal 106 and pitch motor 62 are rotated with respect to a roll axis 110. When the pitch motor 62 is energized, cameras 20, 22 and 24, stabilized platform 100 and IMU 40 are rotated with respect to a pitch axis 112. The nominal direction of flight of the aircraft is indicated in FIG. 4 by arrow 114.
In an example of the stabilized platform assembly 80, the roll motor 60 was a Hathaway type HT03802 brushless DC motor, and the pitch motor 62 was a Hathaway type HT02301 brushless DC motor. The motor amplifiers 64 and 66 were Hathaway type BLC048 motor amplifiers.
In operation, the IMU 40 senses changes in velocity and angle in three coordinate directions.
Since cameras 20, 22 and 24 and IMU 40 are rigidly connected to stabilized platform 100, velocity changes and angle changes sensed by IMU 40 represent velocity and angle changes of cameras 20, 22 and 24. IMU data representative of the velocity and angle changes is supplied to system computer 30. The system computer 30 uses the angle changes to determine deviations of the attitude of the cameras from a desired attitude. These deviations are used to generate error signals which are supplied through motor amplifiers 64 and 66 to roll motor 60 and pitch motor 62, respectively. The roll motor 60 rotates cameras 20, 22 and 24 with respect to roll axis 110, and pitch motor 62 rotates cameras 20, 22 and 24 with respect to pitch axis 112, so as to maintain a desired attitude. Typically, the cameras 20, 22 and 24 are maintained in a vertical attitude with respect to the earth's surface. However, other boresight directions may be utilized. Furthermore,
the cameras may be scanned, for example, with respect to the roll axis 110 to obtain images of a wider strip on each pass over the survey area.
The stabilized platform assembly 80 provides advantageous operation of the imaging system. Because the cameras 20, 22 and 24 are stabilized, typically in a vertical orientation, the spacing between adjacent aircraft passes over the survey area can be increased without risking loss of coverage between images in adjacent passes. This is possible because it is not necessary to account for inadvertent aircraft roll in determining the spacing between passes. By increasing the spacing between adjacent passes of the aircraft trajectory, the time and cost for completing a given survey is reduced. In an alternative approach, the cameras are scanned with respect to the roll axis at a rate relative to the aircraft speed which permits imaging of a wider strip than is possible with stationary cameras.
The stabilized platform assembly 80, shown in FIGS. 4-6 and described above, provides stabilization with respect to the pitch and roll axes of the aircraft. In another configuration, the stabilized platform assembly is simplified to provide stabilization with respect to the roll axis only. In this configuration, the shaft of roll motor 60 may be connected directly to stabilized platform 100 so as to rotate cameras 20, 22 and 24 with respect to roll axis 110. It is believed that stabilization of the cameras 20, 22 and 24 with respect to the yaw axis (perpendicular to axes 110 and 112) would not provide substantial benefits in the operation of the imaging system.
Mounting of the stabilized platform assembly 80 on the aircraft door provides significant practical advantages in operation of the imaging system. In general, it is desired to install the imaging system of the present invention in arbitrary aircraft. One alternative is to install the cameras in a hole cut in the floor of the aircraft. However, this requires a special modification to the aircraft and requires certification of the installation by the FAA. Such a hole is unlikely to be acceptable to many aircraft owners. Wing mounting of the camera assembly is undesirable for similar reasons. Mounting of the stabilized platform assembly on the cargo door provides an attractive solution. Cessna 172 aircraft, for example, have a cargo door that is easily removable.
Other small, four-passenger commercial aircraft have a similar cargo door which may be modified for installation of the stabilized platform assembly. A 6x6 inch hole is cut in the lower interior portion of the dual wall aluminum door structure. The support member 102, having a box structure, is used to carry torque from the pitch and roll motors. A torque transfer stiffener 130 (FIG. 6), 14 inches in length, is part of the support member 102 and transfers the roll motor torque in the vertical plane of the door. A vertically-oriented slot is cut in the door to allow the camera support portion of the stabilized platform 100 to pass through the door to the exterior of the aircraft. When the imaging system of the invention is to be used in an aircraft, the standard cargo door is removed and is replaced with a cargo door having a preinstalled stabilized platform assembly. The electronics unit 84 is placed in the cargo area of the aircraft.
The cameras used in the imaging system may include three compact monochrome CCD cameras. Such cameras are available from numerous suppliers. A preferred camera is the Sony XC-7500, which provides 640 x 480 pixel resolution in non-interlace (progressive scan) mode.
The cameras typically use a 16 mm lens with an f-stop of 2.8. Different filters can be utilized in the camera lens to provide different spectral responses. For example, red, green, blue and near infrared filters may be utilized to obtain different information regarding the survey area. A color image can be formed by using red, green and blue filters. The frame grabber may be a Mu-tech model M-1000 which allows access to up to 4 cameras simultaneously.
The imaging system has been described thus far with reference to a configuration utilizing three cameras. It will be understood that any number of cameras may be utilized. More generally, any sensor having a boresight direction for sensing may be utilized for data collection.
Thus, for example, the sensor may be a laser system, an atmospheric pollution sensor, a thermal camera, a radar system or any other suitable sensor.
The IMU may be a Honeywell H-1700 system, which has an accuracy characterized by 10° per hour gyro accuracy. While higher accuracy IMUs are available, the cost is also higher. In order to utilize a low-cost IMU with moderate accuracy, an error model of the IMU is utilized as described below.
The GPS receiver 46 may comprise an eight-channel Motorola Encore airborne unit, and the GPS ground station 14 may comprise an eight-channel Motorola Encore differential GPS base station. The Motorola Encore is a C/A code unit with capability for using differential corrections transmitted by the base station. The GPS receiver 46 is connected by a coaxial cable to GPS antenna 48, installed on the upper surface of the aircraft. A true kinematic GPS system is a preferred implementation to achieve accuracies of 10 cm or better.
The RF modem 52, which provides the differential GPS datalink to GPS ground station 14, may be a Pacific Crest RFM 96S radio modem, capable of two-way communication at 9600 baud using a carrier frequency of 460 MHz. This system provides approximately a 100 mile radius of coverage with a 15 watt transmitter and omnidirectional datalink antenna 50.
The system computer 30 may comprise an industry standard PCI single board computer, which utilizes a P5 150 MHz processor. I/O functions are handled by a model ATC40 carrier board available from Greenspring, which provides four Industry Pack (IP) board slots for tailoring the I/O functions performed by the board. One IP board is the IP-ADIO available from Greenspring, which provides analog-to-digital, digital-to-analog and discrete digital I/O functions. This IP board receives IMU data and the GPS clock and supplies motor control signals to the motor amplifiers 64 and 66. An IP servo board decodes the motor encoder signals received from roll motor 60 and pitch motor 62.
The disk storage unit 32 must have sufficient storage volume and a sufficient data transfer rate to store image data supplied by the frame grabber 28. Consider an airborne mission requiring one foot per pixel image resolution and an aircraft speed of 100 knots at one frame per second. The 640 x 480 pixel single camera image frame provides a 65% overlap between images (480 pixel dimension along direction of motion). At one byte per pixel for each of the three cameras, a data storage rate of 0.92 megabyte per image, or 0.92 megabyte per second, is required. A conservative 1.5 megabyte per second storage rate is used as the nominal image storage transfer rate specification.
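The overlap and storage-rate figures above can be verified with a short calculation. This is a sketch using only the numbers quoted in the text (1 ft/pixel, 640 x 480 frames, 100 knots, one frame per second, one byte per pixel, three cameras); the knot-to-ft/s constant is the standard conversion.

```python
# Check of the overlap and storage-rate figures quoted above.
KNOT_TO_FT_PER_S = 1.68781           # standard conversion, 1 knot in ft/s

frame_length_ft = 480 * 1.0           # along-track frame extent at 1 ft/pixel
advance_ft = 100 * KNOT_TO_FT_PER_S   # ground advance in one second at 100 knots
overlap = (frame_length_ft - advance_ft) / frame_length_ft

mb_per_s = 640 * 480 * 1 * 3 / 1e6    # three single-byte frames each second

print(f"overlap {overlap:.0%}, storage rate {mb_per_s:.2f} MB/s")
# overlap 65%, storage rate 0.92 MB/s
```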
A nominal time for an imaging survey mission may be 3 hours, with over 2 hours assumed for actual image collection. The remaining time is spent flying to and from the survey area and for turnarounds after completion of each swath. Two hours of imaging will generate a 7.2 gigabyte imagery file at a 1.0 megabyte per second storage rate. The GPS data and IMU data recorded during the mission (0.04 megabyte per second for 3 hours) contributes only 0.4 gigabyte of additional storage, for a total of 7.6 gigabytes. One example of a suitable disk unit is the Seagate Elite-9, having 9 gigabytes of storage and 11 milliseconds access time. The standard SCSI disk drive interface allows storage throughput up to 1.5 megabytes per second.
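The mission storage budget above works out as follows (all figures taken from the text):

```python
# Mission storage budget: 2 h of imagery at 1.0 MB/s plus GPS/IMU data
# at 0.04 MB/s for the full 3 h mission.
imagery_gb = 1.0 * 2 * 3600 / 1000    # 7.2 GB
nav_gb = 0.04 * 3 * 3600 / 1000       # ~0.43 GB, quoted as 0.4 GB
total_gb = imagery_gb + nav_gb
print(f"{imagery_gb:.1f} GB imagery + {nav_gb:.1f} GB nav = {total_gb:.1f} GB")
# 7.2 GB imagery + 0.4 GB nav = 7.6 GB
```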
The imaging system of the present invention utilizes direct digital photography and digital storage of spatially registered imagery. Other airborne video systems use a videotape system as the airborne image storage medium. This allows several hours of imagery to be captured at a 30 Hz image rate. However, videotape does not capture the full resolution or the full dynamic range of the CCD camera systems. All known airborne video systems offer videotape of the surveyed terrain, with frames tagged with GPS positions.
The system uses a display to provide steering cues to the pilot to maintain the appropriate flight line. An Accuphoto system provided by Genysis Comm. Inc. is used for this purpose. The Accuphoto system provides a software tool for use in planning the mission, resulting in a software file defining the mission profile in GPS coordinates. The onboard GPS receiver is then used to provide pilot cues via a simple LCD display indicating need for a left/right correction and the magnitude of the correction.
The configuration described above wherein stabilized platform assembly 80 is mounted on a cargo door of the aircraft provides a number of advantages in operation of the imaging system.
However, other configurations may be utilized within the scope of the present invention. For example, the stabilized platform assembly may be omitted from the imaging system. In this configuration, the cameras are rigidly mounted to the aircraft, and the IMU data is used to compensate the image data for aircraft pitch, roll and yaw. Furthermore, the cameras or other sensors are not necessarily mounted on the aircraft door. For example, the cameras may be mounted in a hole in the floor of the aircraft, in a pod beneath the aircraft or on one of the wings.
The ground processing workstation 12 performs survey mission planning and post-mission processing. The ground processing workstation 12 may be implemented using a PC-based graphical workstation and commercially available geographic information system (GIS) tools such as Arcview available from ESRI. Several mission planning functions are provided by the ground processing workstation. It allows the user to view a digital line graph (map) database, available from commercial sources, depicting the survey area of interest. The user selects the boundary points to define the survey area selected for mission coverage. An aircraft trajectory is computed from a takeoff point to the survey area with sufficient passes over the survey area to provide a high probability of coverage of the selected area at a specified resolution and aircraft flight time. Multiple missions are prescribed where required. The aircraft trajectory is displayed to the user superimposed over a digital map of the area of interest. The aircraft trajectory, defined by waypoints, and image collection start/stop times are stored on a floppy disk for entering into the airborne data collection system 10. System parameters such as camera setup parameters (frame resolution and angular field of view), aircraft parameters (endurance, velocity and turn radius) and mission descriptors (airport location and percent frame overlap) may be modified. An example of a mission trajectory is illustrated in FIG. 8.
Post-mission processing functions of the ground processing workstation 12 include extracting data from the disk storage unit 32 of the airborne data collection system 10, registration of the individual images onto a geodetic reference frame and combining the images into a contiguous imagery file stored in a standard GIS format. The post-processing functions use GIS tools that are similar to those used for mission planning. The mass storage media in the ground processing workstation 12 is compatible with the disk storage unit 32 in the airborne data collection system 10. Files in the disk storage unit 32 may be copied to the mass storage system in the ground processing workstation 12. The ground processing workstation accesses the stored files and registers each file individually into the ground plane. Multiple neighboring individual images from a single mission can be overlaid onto a common geodetic grid. All or a selected subset of the image frames may be mosaiced onto a geodetic reference grid. GIS tools may be used to scroll, zoom and perform measurements on the mosaiced imagery. Imagery operations may be performed individually or on weighted combinations of the images from the three cameras.
In one operating mode, the end user makes arrangements to lease an aircraft and performs mission planning. Using the ground processing workstation 12, the user lays out the survey area to be imaged by selecting boundary points on a digital map. As indicated above, the ground processing workstation considers the endurance and turning properties of the aircraft to be used, the base airport location and camera parameters. This allows the automatic design of a three-dimensional trajectory for the aircraft, with image collection points selected for ideal coverage of the survey area. The trajectory, or multiple mission trajectories, is presented to the user for approval. Higher resolution requirements require more passes and possibly additional missions.
A three-hour mission with two hours of the flight time collecting image data provides over 20,000 acres (32 square miles) of image coverage at a one foot image resolution. Following the trajectory design stage, the user obtains from the ground processing workstation a floppy disk that contains the digital specifications of the trajectory (X, Y, Z
position versus time) and the image collection points. The airborne data collection system is installed on the aircraft, and a checkout of all subsystems is performed automatically. Upon valid checkout, the aircraft is ready to begin the mission. The pilot display leads the pilot through the mission from takeoff to landing, although the pilot can exit and re-enter the trajectory waypoint files at any time, if desired. The display also provides the current status of the mission, for example, flight legs completed, time to next turn, loss of GPS lock, or the occurrence of any anomalies which might result in loss of data.
During image frame recording over the survey area, image data from the cameras 20, 22 and 24 is stored on the disk storage unit 32. The stabilized platform assembly maintains the cameras in a vertical orientation as described above. GPS data, representative of position of the cameras, and IMU data, representative of attitude of the cameras, is simultaneously stored on the disk storage unit 32. Each image frame has corresponding GPS data and IMU data, so that the image data may be spatially registered with high accuracy.
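To illustrate why per-frame GPS position and IMU attitude permit spatial registration, the sketch below projects a pixel onto the ground from the camera position and attitude. The function name, the flat-terrain assumption and the numbers are ours, not the patent's; the patent describes the registration only at the level of detail given above.

```python
import numpy as np

def pixel_to_ground(px, py, altitude_m, focal_px, R_cam_to_ned, cam_north_east):
    """Project a pixel (offsets from the image center, in pixels) to
    North/East ground coordinates, assuming flat terrain at ground level.
    R_cam_to_ned rotates camera-frame vectors into North-East-Down axes."""
    ray_cam = np.array([px, py, focal_px])   # boresight along +z (down)
    ray_ned = R_cam_to_ned @ ray_cam
    scale = altitude_m / ray_ned[2]          # intersect the ground plane
    return cam_north_east + scale * ray_ned[:2]

# Nadir-pointing camera (identity attitude) at 1000 m altitude: a pixel
# 100 px from center with a 1000 px focal length lands 100 m north of
# the point directly below the camera.
ground = pixel_to_ground(100.0, 0.0, 1000.0, 1000.0,
                         np.eye(3), np.array([0.0, 0.0]))
print(ground)  # approximately [100, 0]
```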
After the mission is completed, the airborne data collection system may be deinstalled from the aircraft. The electronics unit 84 can be connected to the ground processing workstation 12, so that the stored data may be transferred to the ground processing workstation. The individual image frames are transferred to the storage media in the ground processing workstation and are registered in the ground plane using the GPS data and IMU data stored with the images. The image data can immediately be registered because position coordinates for each image pixel are known. This allows preparation of a contiguous registered image of the survey area and review of this image on the workstation using standard GIS tools. Maps showing roads, cultural features, hydrology and the like can easily be overlaid on the image. The user can now use the image in any desired manner.
In the above scenario, the end user was responsible for all aspects of the survey mission. In other scenarios, an airborne imaging service may be established to support a higher volume operation. In this case, the service organization may offer mission planning support to users or may accept mission description disks from users who operate their own ground processing workstations. For example, an individual farmer or chemical supplier may request once per week imagery of his acreage to precisely time harvest and/or chemical application for maximum yield.
The software in the system computer 30 of the airborne data collection system 10 is required to perform the following functions. The IMU data is processed to provide a strapdown navigation solution propagating the position, velocity and attitude of the camera axes. The strapdown navigation solution is combined with the GPS velocity data to provide a transfer alignment resulting in the attitude of the camera boresight relative to North, East and down. The three CCD cameras are commanded to obtain imagery in a synchronous manner with the GPS
data and the IMU data. Precise GPS timing is used to synchronize all data collection functions.
A trajectory manager monitors the current aircraft position relative to the desired trajectory and provides commands to the pilot indicating the degree of error in horizontal and vertical planes.
The two-axis stabilization system uses the measured camera boresight attitude, IMU rotation rates and motor encoder values to control the camera axes to point in a commanded direction, nominally down. The image frame data, GPS data, IMU data, attitude solution and system health status are logged on the disk storage system.
A block diagram that illustrates the interrelationship of the software modules in the airborne data collection system is shown in FIG. 7. A strapdown navigation routine 200 propagates the position, velocity and attitude of the IMU coordinate axes forward in time at a data rate of 100 Hz using digital measurements of change in velocity and change in angle. The inputs to the routine 200 are the IMU data samples at a 100 Hz rate and initialization values for the IMU attitude. Additionally, attitude error values are input from the transfer alignment Kalman filter 202. The outputs are (1) latitude, longitude and altitude, (2) North, East, down components of velocity, and (3) attitude Euler angles (roll, pitch and yaw) relating the IMU axes to the North, East and down axes.
The transfer alignment Kalman filter 202 merges the GPS velocity measurement and the strapdown navigation routine output to produce an estimate of the error in the IMU axes attitude computation. Inputs include the GPS velocity measurements and the strapdown navigation solution synchronized in time. Additionally, the lever arm (in aircraft body axes) from the GPS antenna phase center to the IMU location is required. Finally, an IMU error model 204 representing the statistics of the IMU errors is also required. The transfer alignment process utilizes a Kalman filter formulation based upon the IMU error model. The outputs include the IMU attitude errors which are supplied to the strapdown navigation routine 200 as corrections. Errors in the IMU gyro and accelerometer instruments are logged to assess the IMU in-flight performance.
A sensor boresight stabilization module 208 processes IMU attitude rates, motor encoder values and IMU attitude data to control the pitch and roll motors so as to properly point the camera boresight to the desired direction. Inputs include motor encoder values, IMU rotation rates, strapdown navigation attitude, Euler angles and timing signals from the IMU interrupt. The stabilization module includes two linear control systems identical in structure but having different gains to accommodate the different inertias presented to the motors. Stabilization is performed at a 100 Hz data rate synchronized with the IMU interrupt. A conventional proportional integral derivative control is used. The proportional term comes from the pitch or roll attitude errors, and the derivative terms come from the IMU rate gyro measurements. Coordinate transformations must be applied to both the Euler angles and the rotation rates to account for the specific Euler angle set mechanized by the gimbal. The control gains are selected by a knowledge of the various control inertias, motor gain, and the amplifier gain and the selected bandwidth of the control loops. The pitch and roll bandwidths are selected at 10 Hz. The motor encoder data is not normally used within the control loop. However, this data is used to determine the orientation of the IMU axes relative to the aircraft for lever arm calculations and to determine the proximity of the gimbals to their mechanical limits.
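A minimal sketch of the control law just described, with the derivative term taken directly from the rate gyro rather than differenced from the error signal. The gain values here are hypothetical placeholders; in the system they are derived from the inertias, motor and amplifier gains and the selected 10 Hz bandwidth.

```python
# Illustrative PID loop per gimbal axis, run at the 100 Hz rate (dt = 0.01 s).
# Proportional and integral terms act on the attitude error; the derivative
# term is the IMU rate-gyro measurement, as described above.
class AxisPID:
    def __init__(self, kp, ki, kd, dt=0.01):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0

    def command(self, attitude_error_rad, gyro_rate_rad_s):
        self.integral += attitude_error_rad * self.dt
        return (self.kp * attitude_error_rad
                + self.ki * self.integral
                - self.kd * gyro_rate_rad_s)

# Two structurally identical loops with different (hypothetical) gains for
# the different inertias seen by the roll and pitch motors.
roll = AxisPID(kp=40.0, ki=5.0, kd=2.0)
pitch = AxisPID(kp=25.0, ki=3.0, kd=1.5)
print(roll.command(0.02, 0.1))  # approximately 0.601
```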
An image projection module 210 manages camera image frame collection and buffer storage. Inputs include frame time synchronization from the IMU 100 Hz interrupt, image memory addresses, kinematic GPS position and attitude angles from the strapdown navigation routine 200. The image synchronization is controlled by the IMU interrupts at 100 Hz with nominal frame rates of 1 to 2 Hz, i.e. 50 to 100 IMU samples between frame collection events. The frame collection commands use Mu-tech routines which provide frame triggering and synchronization of the three cameras. The outputs are memory mapped image frames for the three CCD cameras.
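The frame timing can be pictured as firing a camera trigger on every Nth IMU interrupt (a sketch; the function name is ours):

```python
# Camera triggers derived from the 100 Hz IMU interrupt: every Nth
# interrupt fires a frame, N = 100 for 1 Hz frames and N = 50 for 2 Hz,
# matching the 50 to 100 IMU samples between frames noted above.
def frame_trigger_ticks(frame_rate_hz, imu_rate_hz=100, duration_s=2):
    n = imu_rate_hz // frame_rate_hz      # IMU samples between frames
    return [t for t in range(imu_rate_hz * duration_s) if t % n == 0]

print(frame_trigger_ticks(2))  # [0, 50, 100, 150]
print(frame_trigger_ticks(1))  # [0, 100]
```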
The real-time software consists of three modules: 1) the strapdown navigation routine 200, 2) the transfer alignment routine implemented in Kalman filter 202, and 3) the gimbal command routine implemented in the sensor boresight stabilization module 208.
The strapdown navigation routine 200 consists of integration of 6-degree-of-freedom equations with a body-fixed coordinate system. The body coordinate system has the z-axis fixed to the camera boresight axis, the x-axis nominally pointed forward and the y-axis to the right of the motion. The internal coordinate system used for navigation is the Earth-Centered-Earth-Fixed (ECEF) system. Accelerations and rotation rates are integrated from the initial assumptions of attitude and using the GPS measured velocity components. The equations consider a WGS-84 datum for all computations for compliance with GPS. The strapdown computations are performed at a 100 Hz rate, which coincides with the availability of the IMU data.
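A highly simplified strapdown update illustrating the integration described above. This is our sketch, not the patent's implementation: a first-order attitude update from the IMU delta-angles, with gravity, Earth-rotation and Coriolis terms omitted for brevity; a real ECEF mechanization on the WGS-84 datum must include them.

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix so that skew(w) @ v equals np.cross(w, v)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def strapdown_step(C, v, p, dtheta, dv, dt=0.01):
    """One 100 Hz step. C: body-to-reference direction cosine matrix,
    v: velocity, p: position; dtheta, dv: IMU delta-angle and
    delta-velocity measured over the step."""
    C = C @ (np.eye(3) + skew(dtheta))   # first-order attitude update
    v = v + C @ dv                       # delta-v rotated into reference axes
    p = p + v * dt
    return C, v, p

# One step from rest with a small yaw increment and a forward delta-v:
# v[0] becomes 0.01 and p[0] becomes 1e-4.
C, v, p = strapdown_step(np.eye(3), np.zeros(3), np.zeros(3),
                         np.array([0.0, 0.0, 1e-3]), np.array([0.01, 0.0, 0.0]))
print(v[0], p[0])
```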
The transfer alignment routine implements a 22 state Kalman filter with stages of covariance propagation and state/covariance update at each measurement. The filter states include velocity errors, attitude errors, accelerometer biases, gyro biases, accelerometer scale factor errors and gyro scale factor errors, with each of these error terms containing x, y and z components (18 individual terms). An additional state is used to represent the time latency between the GPS and IMU measurement devices. Three additional states are used to represent the time-integral of the velocity (average velocity) over a 200 msec window prior to each GPS one pulse-per-second (1 PPS) time point. This velocity average is used to model exactly the functioning of the specific GPS receiver used in the preferred embodiment. The Kalman filter propagates elements of the covariance matrix and state between each 1 PPS GPS time and performs a covariance and state update at each GPS time. The resulting attitude errors and IMU instrument errors are fed back into the strapdown navigation routine 200 to act as a continuous source of calibration. This allows use of small, low-cost IMU devices which are currently being manufactured by several vendors.
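The 22-state layout can be written out explicitly. The index names below are ours (hypothetical); the grouping follows the text exactly.

```python
# 22-state transfer-alignment filter layout as described above:
# six 3-element error groups (18 terms), one GPS/IMU time-latency state,
# and three states for the 200 ms average velocity before each 1 PPS epoch.
STATES = {
    "velocity_error":       slice(0, 3),
    "attitude_error":       slice(3, 6),
    "accel_bias":           slice(6, 9),
    "gyro_bias":            slice(9, 12),
    "accel_scale_factor":   slice(12, 15),
    "gyro_scale_factor":    slice(15, 18),
    "gps_imu_time_latency": slice(18, 19),
    "avg_velocity_200ms":   slice(19, 22),
}
n_states = max(s.stop for s in STATES.values())
print(n_states)  # 22
```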
The pitch and roll motors are controlled by a conventional proportional, integral, derivative (PID) controller implemented in the gimbal command routine. The rate gyros from the IMU provide the necessary rate feedback, and the transfer alignment, coupled with the strapdown navigation routine, provides the position feedback. The three PID gains are derived from knowledge of the inertias and desired bandwidth of the closed loop system. This pointing system differs from conventional systems in that the IMU used for navigation is placed on the inner gimbal of the stabilized platform, which is also directly attached to the camera/sensor package. This is enabled by availability of small, low-cost IMU components, direct drive servo motors and camera sensors.
While there have been shown and described what are at present considered the preferred embodiments of the present invention, it will be obvious to those skilled in the art that various changes and modifications may be made therein without departing from the scope of the invention as defined by the appended claims.
modem 52, frame grabber 28, IMU interface 42, motor amplifiers 64 and 66, power supply 70 and display 72. The electronics unit 84 is interconnected to stabilized platform assembly 80 by a cable 86.
s F;IG. 3 is a pictorial diagram illustrating inct~ tion of the airborne data collection system of the present invention in an aircraft 90. As indicated above, stabilized platform assembly 80 is preferably mounted on cargo door 82. Electronics unit 84 is positioned in the cargo area of the aircraft 90. GPS antenna 48 may be mounted on the upper surface of the aircraft, and data link antenna 50 may be mounted on the underside of the aircraft. The pilot display 76 is positioned ,o for convenient viewing by the pilot.
FIG. 4 is a pictorial diagram of the stabilized platform 80 mounted on cargo door 82. FIG. 5 shows the stabilized platform assembly in more detail. and FIG. 6 shows an exploded view of the ~ ..
CA 022~0063 1998-09-22 WO97/35166 PCT~US97/04668 stabilized platform assembly. Cameras 20? 22 and 24 are rigidly mounted to a stabilized platform 100. The stabilized platform 100 extends through an opening in cargo door 82 such that cameras 20, 22 and 24 are located externally of the aircraft. The opening in cargo door 82 provides sufficient clearance to permit movement of stabili~ed platform l O0 with respect to the s aircraft. In a preferred embodiment, the opening is a vertically-oriented slot to permit pitch and roll stabilization of the cameras. The cameras 20, 22 and 24 are protected by a cowling 102 that is open at the bottom. The IMU 40 is rigidly mounted to a portion of stabilized platform 100 within the aircraft. A support member 102 is mounted to an inside surface of cargo door 82, and roll motor 60 is secured to an inwardly-extending portion of support member 102. The shaft of o roll motor 60 is connected to a gimbal 106. Pitch motor 62 is mounted to gimbal 106, and the shaft of pitch motor 62 is secured to stabilized platform 100. As shown in FIGS. 5 and 6~
cameras 20, 22 and 24 are mounted in a camera housing 120, which is attached to stabilized platform 100. Stabilized platform 100 is connected to gimbal 106 by a pivot pin 122 and is connected to the shaft of pitch motor 62 by a flange 124. Gimbal 106 is connected to the shaft of 5 roll motor 60 by a flange 126.
When the roll motor 60 is energized, cameras 20, 22 and 24, stabilized platform 100, IMU
40, gimbal 106 and pitch motor 62 are rotated with respect to a roll axis 110. When the pitch motor 62 is enel~ized, cameras 20, 22 and 24, stabilized platform 100 and IMU 40 are rotated with respect to a pitch axis 112. The nominal direction of flight of the aircraft is indicated in 20 FIG. 4 by arrow 114.
In an example of the stabilized platform assembly 80, the roll motor 60 was a Hathaway type HT03802 brushless DC motor, and the pitch motor 62 was a Hathaway type HT02301 brushless DC motor. The motor amplifiers 64 and 66 were Hathaway type BLC048 motor amplifiers.
~5 In operation, the IMU 40 senses changes in velocity and angle in three coordinate directions.
Since cameras 20, 22 and 24 and IMU 40 are rigidly connected to stabilized platform 100, velocity changes and angle changes sensed by IMU 40 represent velocity and angle changes of cameras 20, 22 and 24. IMU data representative of the velocity and angle changes is supplied to system computer 30. The system computer 30 uses the angle changes to determine deviations of 30 the attitude of the cameras from a desired attitude. These deviations are used to generate error si~nals which are supplied through motor amplifiers 64 and 66 to roll motor 60 and pitch motor 62~ respectively. The roll motor 60 rotates cameras 20, 22 and 24 with respect to roll axis 110~
and pitch motor 62 rotates cameras 20, 22 and 24 with respect to pitch axis 112, so as to maintain a desired attitude. Typically, the cameras 20, 22 and 24 are maintained in a vertical attitude with respect to the earth's surface. However, other boresight directions may be utilized. Furthermore, the cameras may be scanned, for example, with respect to the roll axis 110 to obtain images of a wider strip on each pass over the survey area.
The stabilized platform assembly 80 provides advantageous operation of the imaging system. Because the cameras 20, 22 and 24 are stabilized, typically in a vertical orientation, the spacing between adjacent aircraft passes over the survey area can be increased without risking loss of coverage between images in adjacent passes. This is possible because it is not necessary to account for inadvertent aircraft roll in determining the spacing between passes. By increasing the spacing between adjacent passes of the aircraft trajectory, the time and cost for completing a given survey is reduced. In an alternative approach, the cameras are scanned with respect to the roll axis at a rate relative to the aircraft speed which permits imaging of a wider strip than is possible with stationary cameras.
The stabilized platform assembly 80, shown in FIGS. 4-6 and described above, provides stabilization with respect to the pitch and roll axes of the aircraft. In another configuration, the stabilized platform assembly is simplified to provide stabilization with respect to the roll axis only. In this configuration, the shaft of roll motor 60 may be connected directly to stabilized platform 100 so as to rotate cameras 20, 22 and 24 with respect to roll axis 110. It is believed that stabilization of the cameras 20, 22 and 24 with respect to the yaw axis (perpendicular to axes 110 and 112) would not provide substantial benefits in the operation of the imaging system.
Mounting of the stabilized platform assembly 80 on the aircraft door provides significant practical advantages in operation of the imaging system. In general, it is desired to install the imaging system of the present invention in arbitrary aircraft. One alternative is to install the cameras in a hole cut in the floor of the aircraft. However, this requires a special modification to the aircraft and requires certification of the installation by the FAA. Such a hole is unlikely to be acceptable to many aircraft owners. Wing mounting of the camera assembly is undesirable for similar reasons. Mounting of the stabilized platform assembly on the cargo door provides an attractive solution. Cessna 172 aircraft, for example, have a cargo door that is easily removable.
Other small, four-passenger commercial aircraft have a similar cargo door which may be modified for installation of the stabilized platform assembly. A 6x6 inch hole is cut in the lower interior portion of the dual wall aluminum door structure. The support member 102, having a box structure, is used to carry torque from the pitch and roll motors. A torque transfer stiffener 130 (FIG. 6), 14 inches in length, is part of the support member 102 and transfers the roll motor torque in the vertical plane of the door. A vertically-oriented slot is cut in the door to allow the camera support portion of the stabilized platform 100 to pass through the door to the exterior of the aircraft. When the imaging system of the invention is to be used in an aircraft, the standard cargo door is removed and is replaced with a cargo door having a preinstalled stabilized platform assembly. The electronics unit 84 is placed in the cargo area of the aircraft.
The cameras used in the imaging system may include three compact monochrome CCD cameras. Such cameras are available from numerous suppliers. A preferred camera is the Sony XC-7500, which provides 640 x 480 pixel resolution in non-interlace (progressive scan) mode.
The cameras typically use a 16 mm lens with an f-stop of 2.8. Different filters can be utilized in the camera lens to provide different spectral responses. For example, red, green, blue and near infrared filters may be utilized to obtain different information regarding the survey area. A color image can be formed by using red, green and blue filters. The frame grabber may be a Mu-tech model M-1000 which allows access to up to 4 cameras simultaneously.
The imaging system has been described thus far with reference to a configuration utilizing three cameras. It will be understood that any number of cameras may be utilized. More generally, any sensor having a boresight direction for sensing may be utilized for data collection.
Thus, for example, the sensor may be a laser system, an atmospheric pollution sensor, a thermal camera, a radar system or any other suitable sensor.
The IMU may be a Honeywell H-1700 system, which has an accuracy characterized by 10° per hour gyro accuracy. While higher accuracy IMUs are available, the cost is also higher. In order to utilize a low-cost IMU with moderate accuracy, an error model of the IMU is utilized as described below.
The GPS receiver 46 may comprise an eight-channel Motorola Encore airborne unit, and the GPS ground station 14 may comprise an eight-channel Motorola Encore differential GPS base station. The Motorola Encore is a C/A code unit with capability for using differential corrections transmitted by the base station. The GPS receiver 46 is connected by a coaxial cable to GPS antenna 48, installed on the upper surface of the aircraft. A true kinematic GPS system is a preferred implementation to achieve accuracies of 10 cm or better.
The RF modem 52, which provides the differential GPS datalink to GPS ground station 14, may be a Pacific Crest RFM 96S radio modem capable of two-way communication at 9600 baud using a carrier frequency of 460 MHz. This system provides approximately a 100 mile radius of coverage with a 15 watt transmitter and omnidirectional datalink antenna 50.
The system computer 30 may comprise an industry standard model PCI single board computer, which utilizes a P5 150 MHz processor. I/O functions are handled by a model ATC40 carrier board available from Greenspring, which provides four Industry Pack (IP) board slots for tailoring the I/O functions performed by the board. One IP board is the IP-ADIO available from Greenspring, which provides analog-to-digital, digital-to-analog and discrete digital I/O functions. This IP board receives IMU data and the GPS clock and supplies motor control signals to the motor amplifiers 64 and 66. An IP servo board decodes the motor encoder signals received from roll motor 60 and pitch motor 62.
The disk storage unit 32 must have sufficient storage volume and a sufficient data transfer rate to store image data supplied by the frame grabber 28. Consider an airborne mission requiring one foot per pixel image resolution and an aircraft speed of 100 knots at one frame per second. The 640 x 480 pixel single camera image frame provides a 65% overlap between images (480 pixel dimension along direction of motion). At one byte per pixel for each of the three cameras, a data storage rate of 0.92 megabyte per image, or 0.92 megabyte per second, is required. A conservative 1.5 megabyte per second storage rate is used as the nominal image storage transfer rate specification.
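The overlap and data-rate figures quoted above can be checked with a few lines of arithmetic; the only value not stated in the text is the standard knot-to-ft/s conversion:

```python
# Check the 65% overlap and 0.92 MB/s figures for the stated mission profile.
KNOT_TO_FT_PER_S = 1.688                  # standard conversion, not from the text
ground_speed = 100 * KNOT_TO_FT_PER_S     # 100 knots in ft/s
frame_length = 480                        # 480 pixels along-track at 1 ft/pixel
advance_per_frame = ground_speed * 1.0    # ground distance per 1 s frame interval

overlap = (frame_length - advance_per_frame) / frame_length
print(f"along-track overlap: {overlap:.0%}")          # ~65%

bytes_per_second = 640 * 480 * 1 * 3      # 1 byte/pixel, three cameras, 1 frame/s
print(f"storage rate: {bytes_per_second / 1e6:.2f} MB/s")  # 0.92 MB/s
```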
A nominal time for an imaging survey mission may be 3 hours, with over 2 hours assumed for actual image collection. The remaining time is spent flying to and from the survey area and for turnarounds after completion of each swath. Two hours of imaging will generate a 7.2 gigabyte imagery file at a 1.0 megabyte per second storage rate. The GPS data and IMU data recorded during the mission (0.04 megabyte per second for 3 hours) contributes only 0.4 gigabyte of additional storage, for a total of 7.6 gigabytes. One example of a suitable disk unit is the Seagate Elite-9, having 9 gigabytes of storage and 11 milliseconds access time. The standard SCSI disk drive interface allows storage throughput up to 1.5 megabytes per second.
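The storage budget in this paragraph follows directly from the stated rates:

```python
# Storage budget for the nominal 3-hour mission described above.
imagery_gb = 2 * 3600 * 1.0 / 1000   # 2 h of imaging at 1.0 MB/s
nav_gb = 3 * 3600 * 0.04 / 1000      # GPS + IMU logging at 0.04 MB/s for 3 h
total_gb = imagery_gb + nav_gb
print(imagery_gb, round(nav_gb, 1), round(total_gb, 1))  # 7.2 0.4 7.6
```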
The imaging system of the present invention utilizes direct digital photography and digital storage of spatially registered imagery. Other airborne video systems use a videotape system as the airborne image storage medium. This allows several hours of imagery to be captured at a 30 Hz image rate. However, videotape does not capture the full resolution or the full dynamic range of the CCD camera systems. All known airborne video systems offer videotape of the surveyed terrain, with frames tagged with GPS positions.
The system uses a display to provide steering cues to the pilot to maintain the appropriate flight line. An Accuphoto system provided by Genysis Comm. Inc. is used for this purpose. The Accuphoto system provides a software tool for use in planning the mission, resulting in a software file defining the mission profile in GPS coordinates. The onboard GPS receiver is then used to provide pilot cues via a simple LCD display indicating need for a left/right correction and the magnitude of the correction.
The configuration described above wherein stabilized platform assembly 80 is mounted on a cargo door of the aircraft provides a number of advantages in operation of the imaging system.
However, other configurations may be utilized within the scope of the present invention. For example, the stabilized platform assembly may be omitted from the imaging system. In this configuration, the cameras are rigidly mounted to the aircraft, and the IMU data is used to compensate the image data for aircraft pitch, roll and yaw. Furthermore, the cameras or other sensors are not necessarily mounted on the aircraft door. For example, the cameras may be mounted in a hole in the floor of the aircraft, in a pod beneath the aircraft or on one of the wings.
The ground processing workstation 12 performs survey mission planning and post-mission processing. The ground processing workstation 12 may be implemented using a PC-based graphical workstation and commercially available geographic information system (GIS) tools such as Arcview available from ESRI. Several mission planning functions are provided by the ground processing workstation. It allows the user to view a digital line graph (map) database, available from commercial sources, depicting the survey area of interest. The user selects the boundary points to define the survey area selected for mission coverage. An aircraft trajectory is computed from a takeoff point to the survey area with sufficient passes over the survey area to provide a high probability of coverage of the selected area at a specified resolution and aircraft flight time. Multiple missions are prescribed where required. The aircraft trajectory is displayed to the user superimposed over a digital map of the area of interest. The aircraft trajectory, defined by waypoints, and image collection start/stop times are stored on a floppy disk for entering into the airborne data collection system 10. System parameters such as camera setup parameters (frame resolution and angular field of view), aircraft parameters (endurance, velocity and turn radius) and mission descriptors (airport location and percent frame overlap) may be modified. An example of a mission trajectory ~50 is illustrated in FIG. 8.
Post-mission processing functions of the ground processing workstation 12 include extracting data from the disk storage unit 32 of the airborne data collection system 10, registration of the individual images onto a geodetic reference frame and combining the images into a contiguous imagery file stored in a standard GIS format. The post-processing functions use GIS tools that are similar to those used for mission planning. The mass storage media in the ground processing workstation 12 is compatible with the disk storage unit 32 in the airborne data collection system 10. Files in the disk storage unit 32 may be copied to the mass storage system in the ground processing workstation 12. The ground processing workstation accesses the stored files and registers each file individually into the ground plane. Multiple neighboring individual images from a single mission can be overlaid onto a common geodetic grid. All or a selected subset of the image frames may be mosaiced onto a geodetic reference grid. GIS tools may be used to scroll, zoom and perform measurements on the mosaiced imagery. Imagery operations may be performed individually or on weighted combinations of the images from the three cameras.
In one operating mode, the end user makes arrangements to lease an aircraft and performs mission planning. Using the ground processing workstation 12, the user lays out the survey area to be imaged by selecting boundary points on a digital map. As indicated above, the ground processing workstation considers the endurance and turning properties of the aircraft to be used, the base airport location and camera parameters. This allows the automatic design of a three-dimensional trajectory for the aircraft, with image collection points selected for ideal coverage of the survey area. The trajectory, or multiple mission trajectories, is presented to the user for approval. Higher resolution requirements require more passes and possibly additional missions. A three-hour mission with two hours of the flight time collecting image data provides over 20,000 acres (32 square miles) of image coverage at a one foot image resolution.
Following the trajectory design stage, the user obtains from the ground processing workstation a floppy disk that contains the digital specifications of the trajectory (X, Y, Z
position versus time) and the image collection points. The airborne data collection system is installed on the aircraft, and a checkout of all subsystems is performed automatically. Upon valid checkout, the aircraft is ready to begin the mission. The pilot display leads the pilot through the mission from takeoff to landing, although the pilot can exit and re-enter the trajectory waypoint files at any time, if desired. The display also provides the current status of the mission, for example, flight legs completed, time to next turn, loss of GPS lock, or the occurrence of any anomalies which might result in loss of data.
During image frame recording over the survey area, image data from the cameras 20, 22 and 24 is stored on the disk storage unit 32. The stabilized platform assembly maintains the cameras in a vertical orientation as described above. GPS data, representative of position of the cameras, and IMU data, representative of attitude of the cameras, is simultaneously stored on the disk storage unit 32. Each image frame has corresponding GPS data and IMU data, so that the image data may be spatially registered with high accuracy.
After the mission is completed, the airborne data collection system may be deinstalled from the aircraft. The electronics unit 84 can be connected to the ground processing workstation 12, so that the stored data may be transferred to the ground processing workstation. The individual image frames are transferred to the storage media in the ground processing workstation and are registered in the ground plane using the GPS data and IMU data stored with the images. The image data can immediately be registered because position coordinates for each image pixel are known. This allows preparation of a contiguous registered image of the survey area and review of this image on the workstation using standard GIS tools. Maps showing roads, cultural features, hydrology and the like can easily be overlaid on the image. The user can now use the image in any desired manner.
In the above scenario, the end user was responsible for all aspects of the survey mission. In other scenarios, an airborne imaging service may be established to support a higher volume operation. In this case, the service organization may offer mission planning support to users or may accept mission description disks from users who operate their own ground processing workstations. For example, an individual farmer or chemical supplier may request once per week imagery of his acreage to precisely time harvest and/or chemical application for maximum yield.
The software in the system computer 30 of the airborne data collection system 10 is required to perform the following functions. The IMU data is processed to provide a strapdown navigation solution propagating the position, velocity and attitude of the camera axes. The strapdown navigation solution is combined with the GPS velocity data to provide a transfer alignment resulting in the attitude of the camera boresight relative to North, East and down. The three CCD cameras are commanded to obtain imagery in a synchronous manner with the GPS data and the IMU data. Precise GPS timing is used to synchronize all data collection functions.
A trajectory manager monitors the current aircraft position relative to the desired trajectory and provides commands to the pilot indicating the degree of error in horizontal and vertical planes.
The two-axis stabilization system uses the measured camera boresight attitude, IMU rotation rates and motor encoder values to control the camera axes to point in a commanded direction, nominally down. The image frame data, GPS data, IMU data, attitude solution and system health status are logged on the disk storage system.
A block diagram that illustrates the interrelationship of the software modules in the airborne data collection system is shown in FIG. 7. A strapdown navigation routine 200 propagates the position, velocity and attitude of the IMU coordinate axes forward in time at a data rate of 100 Hz using digital measurements of change in velocity and change in angle. The inputs to the routine 200 are the IMU data samples at a 100 Hz rate and initialization values for the IMU attitude. Additionally, attitude error values are input from the transfer alignment Kalman filter 202. The outputs are (1) latitude, longitude and altitude, (2) North, East, down components of velocity, and (3) attitude Euler angles (roll, pitch and yaw) relating the IMU axes to the North, East and down axes.
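A strapdown attitude update of this kind can be sketched as follows. This is an illustrative small-angle Euler-angle integration only; the function name and the use of Euler kinematics (rather than the quaternion or direction-cosine mechanization a production routine would use) are assumptions, not details from the patent:

```python
import numpy as np

DT = 0.01  # 100 Hz IMU data rate, as stated above

def propagate_attitude(euler, delta_angle):
    """Advance roll/pitch/yaw by one IMU sample using the Euler kinematic
    equations. `delta_angle` is the body-frame angle increment (rad) over DT.
    Singular near pitch = +/-90 deg; a real routine avoids this with quaternions."""
    roll, pitch, yaw = euler
    p, q, r = np.asarray(delta_angle) / DT  # body rates from angle increments
    roll_dot = p + np.tan(pitch) * (q * np.sin(roll) + r * np.cos(roll))
    pitch_dot = q * np.cos(roll) - r * np.sin(roll)
    yaw_dot = (q * np.sin(roll) + r * np.cos(roll)) / np.cos(pitch)
    return euler + DT * np.array([roll_dot, pitch_dot, yaw_dot])
```

At level attitude a pure roll increment simply accumulates into the roll angle, which gives a quick sanity check on the mechanization.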
The transfer alignment Kalman filter 202 merges the GPS velocity measurement and the strapdown navigation routine output to produce an estimate of the error in the IMU axes attitude computation. Inputs include the GPS velocity measurements and the strapdown navigation solution synchronized in time. Additionally, the lever arm (in aircraft body axes) from the GPS antenna phase center to the IMU location is required. Finally, an IMU error model 204 representing the statistics of the IMU errors is also required. The transfer alignment process utilizes a Kalman filter formulation based upon the IMU error model. The outputs include the IMU attitude errors which are supplied to the strapdown navigation routine 200 as corrections. Errors in the IMU gyro and accelerometer instruments are logged to address the IMU in-flight performance.
A sensor boresight stabilization module 208 processes IMU attitude rates, motor encoder values and IMU attitude data to control the pitch and roll motors so as to properly point the camera boresight to the desired direction. Inputs include motor encoder values, IMU rotation rates, strapdown navigation attitude, Euler angles and timing signals from the IMU interrupt. The stabilization module includes two linear control systems identical in structure but having different gains to accommodate the different inertias presented to the motors. Stabilization is performed at a 100 Hz data rate synchronized with the IMU interrupt. A conventional proportional integral derivative control is used. The proportional term comes from the pitch or roll attitude errors, and the derivative terms come from the IMU rate gyro measurements. Coordinate transformations must be applied to both the Euler angles and the rotation rates to account for the specific Euler angle set mechanized by the gimbal. The control gains are selected by a knowledge of the various control inertias, motor gain, and the amplifier gain and the selected bandwidth of the control loops. The pitch and roll bandwidths are selected at 10 Hz.
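The control law described above, with the proportional term from the attitude error and the derivative term taken directly from the rate gyro, can be sketched per axis as follows; the class name and gain values are illustrative, not from the patent:

```python
class AxisPid:
    """One axis of the boresight stabilization loop, run at the 100 Hz IMU rate.
    The derivative term is the measured gyro rate rather than a differenced
    error, matching the scheme described in the text."""
    def __init__(self, kp, ki, kd, dt=0.01):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0

    def step(self, attitude_error, gyro_rate):
        self.integral += attitude_error * self.dt
        return (self.kp * attitude_error
                + self.ki * self.integral
                - self.kd * gyro_rate)  # gyro rate supplies the damping term

# Example: a purely proportional command for a 0.05 rad roll error
roll_loop = AxisPid(kp=2.0, ki=0.0, kd=0.0)
print(roll_loop.step(0.05, gyro_rate=0.0))  # 0.1
```

In practice the two axes use this identical structure with different gains, reflecting the different inertias noted above.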
The motor encoder data is not normally used within the control loop. However, this data is used to determine the orientation of the IMU axes relative to the aircraft for lever arm calculations and to determine the proximity of the gimbals to their mechanical limits.
An image projection module 210 manages camera image frame collection and buffer storage. Inputs include frame time synchronization from the IMU 100 Hz interrupt, image memory addresses, kinematic GPS position and attitude angles from the strapdown navigation routine 200. The image synchronization is controlled by the IMU interrupts at 100 Hz with nominal frame rates of 1 to 2 Hz, i.e. 50 to 100 IMU samples between frame collection events. The frame collection commands use Mu-tech routines which provide frame triggering and synchronization of the three cameras. The outputs are memory mapped image frames for the three CCD cameras.
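The relationship between the 100 Hz IMU interrupt and the 1 to 2 Hz frame rates quoted above reduces to a simple division:

```python
IMU_RATE_HZ = 100
for frame_rate_hz in (1, 2):
    samples = IMU_RATE_HZ // frame_rate_hz  # IMU samples between frame triggers
    print(f"{frame_rate_hz} Hz frames -> {samples} IMU samples between frames")
```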
The real-time software consists of three modules: 1) the strapdown navigation routine 200, 2) the transfer alignment routine implemented in Kalman filter 202, and 3) the gimbal command routine implemented in the sensor boresight stabilization module 208.
The strapdown navigation routine 200 consists of integration of 6-degree-of-freedom equations with a body-fixed coordinate system. The body coordinate system has the z-axis fixed to the camera boresight axis, the x-axis nominally pointed forward and the y-axis to the right of the motion. The internal coordinate system used for navigation is the Earth-Centered-Earth-Fixed (ECEF) system. Accelerations and rotation rates are integrated from the initial assumptions of attitude and using the GPS measured velocity components. The equations consider a WGS-84 datum for all computations for compliance with GPS. The strapdown computations are performed at a 100 Hz rate, which coincides with the availability of the IMU data.
The transfer alignment routine implements a 22 state Kalman filter with stages of covariance propagation and state/covariance update at each measurement. The filter states include velocity errors, attitude errors, accelerometer biases, gyro biases, accelerometer scale factor errors, and gyro scale factor errors, with each of these error terms containing x, y, and z components (18 individual terms). An additional state is used to represent the time latency between the GPS and IMU measurement devices. Three additional states are used to represent the time-integral of the velocity (average velocity) over a 200 msec window prior to each GPS one pulse-per-second (1PPS) time point. This velocity average is used to model exactly the functioning of the specific GPS receiver used in the preferred embodiment. The Kalman filter propagates elements of the covariance matrix and state between each 1PPS GPS time and performs a covariance and state update at each GPS time. The resulting attitude errors and IMU
instrument errors are fed back into the strapdown navigation routine 200 to act as a continuous source of calibration. This allows use of small, low-cost IMU devices which are currently being manufactured by several vendors.
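The 22-state bookkeeping described above can be laid out explicitly; the group names are paraphrases of the text, and the ordering is an assumption:

```python
# State-vector layout for the 22-state transfer-alignment Kalman filter.
STATE_GROUPS = [
    ("velocity_error", 3),
    ("attitude_error", 3),
    ("accel_bias", 3),
    ("gyro_bias", 3),
    ("accel_scale_factor", 3),
    ("gyro_scale_factor", 3),   # six 3-vectors: the 18 individual error terms
    ("gps_imu_latency", 1),     # time latency between GPS and IMU devices
    ("avg_velocity", 3),        # 200 ms velocity average before each 1PPS epoch
]
n_states = sum(size for _, size in STATE_GROUPS)
print(n_states)  # 22
```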
The pitch and roll motors are controlled by a conventional proportional, integral, derivative (PID) controller implemented in the gimbal command routine. The rate gyros from the IMU provide the necessary rate feedback and the transfer alignment, coupled with the strapdown navigation routine, provides the position feedback. The three PID gains are derived from knowledge of the inertias and desired bandwidth of the closed loop system. This pointing system differs from conventional systems in that the IMU used for navigation is placed on the inner gimbal of the stabilized platform, which is also directly attached to the camera/sensor package. This is enabled by availability of small, low-cost IMU components, direct drive servo motors and camera sensors.
While there have been shown and described what are at present considered the preferred embodiments of the present invention, it will be obvious to those skilled in the art that various changes and modifications may be made therein without departing from the scope of the invention as defined by the appended claims.
Claims (25)
1. A remote data collection system for use in a vehicle as the vehicle moves, said system comprising:
a sensor for sensing a characteristic of interest and providing sensor data representative of the characteristic of interest, said sensor having a sensing direction;
a global positioning system (GPS) receiver for providing GPS data representative of position of said sensor under moving conditions;
an inertial measurement unit (IMU) for providing IMU data representative of attitude of said sensor;
a processing unit responsive to said IMU data and an IMU error model for determining IMU errors, and responsive to said GPS data, said IMU data and said IMU errors for determining geographic data referenced to said sensor data; and a data storage unit for storing said sensor data and said geographic data for subsequent use.
2. A sensing system as defined in claim 1 wherein said sensor comprises a camera and said sensor data represents an image.
3. A sensing system as defined in claim 1 wherein said sensor comprises a charge coupled device (CCD) camera and wherein said sensor data represents an image.
4. A sensing system as defined in claim 1 wherein said sensor comprises a plurality of CCD
cameras, each having a different spectral characteristic, and wherein said sensor data represents images in different spectral ranges.
5. A sensing system as defined in claim 1 wherein said vehicle is an aircraft and wherein said sensing system is used for airborne imaging of a predetermined region.
6. A sensing system as defined in claim 1 wherein said GPS receiver comprises a differential GPS receiver.
7. A sensing system as defined in claim 1 wherein said GPS receiver comprises a kinematic-capable GPS receiver.
8. A sensing system as defined in claim 1 further comprising a stabilized platform to which said sensor and said IMU are rigidly mounted and a control unit responsive to said IMU data for rotating said stabilized platform about at least one axis of rotation with respect to the vehicle to control the sensing direction of said sensor as the vehicle moves.
9. A sensing system as defined in claim 8 wherein said vehicle comprises an aircraft and wherein said control unit includes means for rotating said stabilized platform about pitch and roll axes with respect to the aircraft to maintain the sensing direction of said sensor substantially vertical during flight.
10. A sensing system as defined in claim 9 wherein said sensor, said IMU and said stabilized platform are affixed to a door of the aircraft.
11. A sensing system as defined in claim 10 wherein said sensor comprises a plurality of CCD cameras, each having a different spectral characteristic.
12. A sensing system as defined in claim 9 further comprising means responsive to a desired aircraft trajectory and said GPS data for indicating deviations of the aircraft from the desired aircraft trajectory.
13. A remote data collection system for use in a vehicle as the vehicle moves, said system comprising:
a stabilized platform;
a sensor rigidly mounted to said stabilized platform for sensing a characteristic of interest and providing sensor data representative of the characteristic of interest, said sensor having a sensing direction;
an inertial measurement unit (IMU) rigidly mounted to said stabilized platform for providing IMU data representative of attitude of said sensor;
a control unit responsive to said IMU data for rotating said stabilized platform about at least one axis of rotation with respect to the vehicle to control the sensing direction of said sensor as the vehicle moves;
a global positioning system (GPS) receiver for providing GPS data representative of position of said sensor;
a processing system responsive to said GPS data and said IMU data for determining geographic data referenced to said sensor data; and a data storage unit for storing said sensor data and said geographic data for subsequent use.
14. A sensing system as defined in claim 13 wherein said control unit comprises a gimbal, a support member rigidly mounted to the vehicle, a first motor connected between said support member and said gimbal for rotating said gimbal about a first axis relative to said support member and a second motor connected between said gimbal and said stabilized platform for rotating said stabilized platform about a second axis relative to said gimbal, wherein the sensing direction of said sensor is stabilized with respect to said first and second axes as the vehicle moves.
15. A sensing system as defined in claim 14 wherein said vehicle comprises an aircraft and wherein said control unit stabilizes the sensing direction of said sensor in a vertical orientation with respect to pitch and roll of the aircraft.
16. A sensing system as defined in claim 13 wherein said control unit comprises a support member rigidly mounted to the vehicle and a motor connected between said support member and said stabilized platform for rotating said stabilized platform about an axis of rotation relative to said support member, wherein the direction of said sensor is stabilized with respect to said axis of rotation as the vehicle moves.
17. A sensing system as defined in claim 16 wherein said vehicle comprises an aircraft and wherein said control unit stabilizes the sensing direction of said sensor in a vertical orientation with respect to roll of the aircraft.
18. An imaging system for use in an aircraft, said imaging system comprising:
a stabilized platform;
a camera system rigidly mounted to said stabilized platform for providing image data, said camera system having a boresight direction;
an inertial measurement unit (IMU) rigidly mounted to said stabilized platform for providing IMU data representative of attitude of said camera system;
a control unit responsive to said IMU data for rotating said stabilized platform about at least one axis of rotation with respect to the aircraft to control the boresight direction of said camera system during flight;
a global positioning system (GPS) receiver for providing GPS data representative of position of said camera system;
a processing unit responsive to said GPS data and said IMU data for determining geographic data referenced to said image data; and
a data storage unit for storing said image data and said geographic data for subsequent use.
19. An imaging system as defined in claim 18 wherein said camera system comprises a plurality of CCD cameras, each having a different spectral response.
20. An imaging system as defined in claim 18 wherein said aircraft includes a cargo door and wherein said camera system, said IMU and said stabilized platform are mounted to said cargo door.
21. An imaging system as defined in claim 18 further comprising means responsive to a desired aircraft trajectory and said GPS data for indicating deviations of the aircraft from the desired aircraft trajectory.
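The deviation indicator of claim 21 can be reduced to a cross-track distance: the signed horizontal offset of the GPS position from the desired straight flight line. A small sketch of that computation (function name and sign convention are my own, not the patent's):

```python
import math

def cross_track_deviation(gps_pos, line_start, line_end):
    """Signed horizontal distance from the aircraft's GPS position to the
    desired flight line; positive means left of the direction of flight."""
    dx, dy = line_end[0] - line_start[0], line_end[1] - line_start[1]
    px, py = gps_pos[0] - line_start[0], gps_pos[1] - line_start[1]
    # z-component of the 2D cross product, normalized by the track length
    return (dx * py - dy * px) / math.hypot(dx, dy)
```

For an eastward track from (0, 0) to (10, 0), an aircraft at (5, 2) is 2 units left of track; a cockpit display would steer the pilot back toward zero.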
22. An imaging system as defined in claim 18 wherein said GPS receiver comprises a differential GPS receiver.
23. An imaging system as defined in claim 18 wherein said GPS receiver comprises a kinematic GPS receiver.
24. An imaging system as defined in claim 18 wherein said control unit includes means for rotating said stabilized platform about pitch and roll axes with respect to the aircraft to maintain the boresight direction of said camera system substantially vertical during flight.
25. An imaging system as defined in claim 18 wherein said processing unit further comprises means responsive to said IMU data and an IMU error model for determining IMU errors and wherein said processing unit is responsive to said GPS data, said IMU data and said IMU errors for determining said geographic data with high accuracy.
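A typical instance of the error model in claim 25 is estimation of a slowly varying gyro bias by comparing IMU-integrated attitude against an external aiding reference (for example, GPS-derived attitude). The sketch below is a generic complementary-filter-style estimator under that assumption, not the patent's specific model; the function name and gains are illustrative:

```python
def estimate_gyro_bias(gyro_rates, ref_angles, dt, att_gain=0.1, bias_gain=0.5):
    """Estimate a constant gyro bias: integrate the bias-corrected rate,
    then feed back the disagreement with an external attitude reference."""
    bias = 0.0
    angle = ref_angles[0]
    for rate, ref in zip(gyro_rates, ref_angles):
        angle += (rate - bias) * dt   # propagate attitude with corrected rate
        err = angle - ref             # innovation against the aiding reference
        angle -= att_gain * err       # correct the attitude estimate
        bias += bias_gain * err       # attribute residual drift to the bias
    return bias
```

With the bias removed in this way, the attitude fed into the georeferencing step no longer drifts, which is how the claimed IMU-error determination supports "high accuracy" geographic data.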
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/621,107 US5894323A (en) | 1996-03-22 | 1996-03-22 | Airborne imaging system using global positioning system (GPS) and inertial measurement unit (IMU) data |
US08/621,107 | 1996-03-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
CA2250063A1 true CA2250063A1 (en) | 1997-09-25 |
Family
ID=24488754
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002250063A Abandoned CA2250063A1 (en) | 1996-03-22 | 1997-03-21 | Airborne imaging system |
Country Status (4)
Country | Link |
---|---|
US (1) | US5894323A (en) |
AU (1) | AU726815B2 (en) |
CA (1) | CA2250063A1 (en) |
WO (1) | WO1997035166A1 (en) |
Families Citing this family (178)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10361802B1 (en) | 1999-02-01 | 2019-07-23 | Blanding Hovenweep, Llc | Adaptive pattern recognition based control system and method |
US8352400B2 (en) | 1991-12-23 | 2013-01-08 | Hoffberg Steven M | Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore |
DE19714396A1 (en) * | 1997-04-08 | 1998-10-15 | Zeiss Carl Fa | Photogrammetric camera used in aircraft or satellite |
US6597818B2 (en) * | 1997-05-09 | 2003-07-22 | Sarnoff Corporation | Method and apparatus for performing geo-spatial registration of imagery |
WO1999007139A1 (en) * | 1997-07-30 | 1999-02-11 | Pinotage, L.L.C. | Imaging device |
JP3833786B2 (en) * | 1997-08-04 | 2006-10-18 | 富士重工業株式会社 | 3D self-position recognition device for moving objects |
DE19752559B4 (en) * | 1997-11-27 | 2004-01-22 | Honeywell Ag | Procedure for guiding aircraft on taxiways |
JPH11160084A (en) * | 1997-11-28 | 1999-06-18 | Mitsumi Electric Co Ltd | Navigation information receiver |
US6281970B1 (en) * | 1998-03-12 | 2001-08-28 | Synergistix Llc | Airborne IR fire surveillance system providing firespot geopositioning |
US6172470B1 (en) * | 1998-04-30 | 2001-01-09 | Trw Inc. | Large aperture precision gimbal drive module |
US6714240B1 (en) * | 1998-06-23 | 2004-03-30 | Boeing North American, Inc. | Optical sensor employing motion compensated integration-device and process |
GB2342242A (en) * | 1998-09-25 | 2000-04-05 | Environment Agency | Environmental data collection system |
US6023241A (en) * | 1998-11-13 | 2000-02-08 | Intel Corporation | Digital multimedia navigation player/recorder |
US6205400B1 (en) * | 1998-11-27 | 2001-03-20 | Ching-Fang Lin | Vehicle positioning and data integrating method and system thereof |
US7966078B2 (en) | 1999-02-01 | 2011-06-21 | Steven Hoffberg | Network media appliance system and method |
DE19950033B4 (en) * | 1999-10-16 | 2005-03-03 | Bayerische Motoren Werke Ag | Camera device for vehicles |
US6965397B1 (en) | 1999-11-22 | 2005-11-15 | Sportvision, Inc. | Measuring camera attitude |
US7143130B2 (en) * | 1999-12-09 | 2006-11-28 | Ching-Fang Lin | Portable multi-tracking method and system |
US6298286B1 (en) * | 1999-12-17 | 2001-10-02 | Rockwell Collins | Method of preventing potentially hazardously misleading attitude data |
DE10010366C2 (en) * | 2000-03-07 | 2003-08-07 | Ekos Entwicklung Und Konstrukt | Process for the digital recording and storage of aerial photographs under flight conditions |
US6535114B1 (en) * | 2000-03-22 | 2003-03-18 | Toyota Jidosha Kabushiki Kaisha | Method and apparatus for environment recognition |
US6281797B1 (en) | 2000-04-04 | 2001-08-28 | Marconi Data Systems Inc. | Method and apparatus for detecting a container proximate to a transportation vessel hold |
US6734796B2 (en) | 2000-04-04 | 2004-05-11 | Ian J. Forster | Self-check for a detector detecting the proximity of a transportation vessel |
US6373521B1 (en) * | 2000-07-19 | 2002-04-16 | Kevin D. Carter | Aircraft incident surveillance system |
GB2368219A (en) * | 2000-09-13 | 2002-04-24 | Roke Manor Research | Camera system with GPS |
US6421610B1 (en) * | 2000-09-15 | 2002-07-16 | Ernest A. Carroll | Method of preparing and disseminating digitized geospatial data |
US20020184348A1 (en) * | 2000-09-20 | 2002-12-05 | Lockheed Martin Corporation | Object oriented framework architecture for sensing and/or control environments |
EP1319203A2 (en) * | 2000-09-20 | 2003-06-18 | Lockheed Martin Corporation | Object oriented framework architecture for sensing and/or control environments |
US6622090B2 (en) * | 2000-09-26 | 2003-09-16 | American Gnc Corporation | Enhanced inertial measurement unit/global positioning system mapping and navigation process |
JP2002135758A (en) * | 2000-10-20 | 2002-05-10 | Yazaki Corp | On-vehicle transmitting system, receiving apparatus and transmitting apparatus for video data |
US7565008B2 (en) | 2000-11-06 | 2009-07-21 | Evryx Technologies, Inc. | Data capture and identification system and process |
US7680324B2 (en) | 2000-11-06 | 2010-03-16 | Evryx Technologies, Inc. | Use of image-derived information as search criteria for internet and other search engines |
US8224078B2 (en) | 2000-11-06 | 2012-07-17 | Nant Holdings Ip, Llc | Image capture and identification system and process |
US9310892B2 (en) | 2000-11-06 | 2016-04-12 | Nant Holdings Ip, Llc | Object information derived from object images |
US7899243B2 (en) | 2000-11-06 | 2011-03-01 | Evryx Technologies, Inc. | Image capture and identification system and process |
US20020067424A1 (en) * | 2000-12-01 | 2002-06-06 | Brunner Joseph F. | Environmentally sealed cameras for mounting externally on aircraft and systems for using the same |
US6424804B1 (en) * | 2000-12-27 | 2002-07-23 | Cessna Aircraft Company | Modular airborne flir support and extension structure |
US20040257441A1 (en) * | 2001-08-29 | 2004-12-23 | Geovantage, Inc. | Digital imaging system for airborne applications |
US20030048357A1 (en) * | 2001-08-29 | 2003-03-13 | Geovantage, Inc. | Digital imaging system for airborne applications |
FR2830097B1 (en) * | 2001-09-21 | 2004-02-20 | Univ Compiegne Tech | PROCESS FOR TAKING MOTION IMAGES |
AU2002328690B2 (en) * | 2001-10-11 | 2007-10-25 | Cgg Data Services Ag | Airborne geophysical measurements |
US6759979B2 (en) * | 2002-01-22 | 2004-07-06 | E-Businesscontrols Corp. | GPS-enhanced system and method for automatically capturing and co-registering virtual models of a site |
IL149934A (en) * | 2002-05-30 | 2007-05-15 | Rafael Advanced Defense Sys | Airborne reconnaissance system |
JP4181800B2 (en) * | 2002-06-20 | 2008-11-19 | Nec東芝スペースシステム株式会社 | Topographic measurement system, storage medium, and program using stereo image |
US6831599B2 (en) * | 2002-08-26 | 2004-12-14 | Honeywell International Inc. | Remote velocity sensor slaved to an integrated GPS/INS |
US7725258B2 (en) * | 2002-09-20 | 2010-05-25 | M7 Visual Intelligence, L.P. | Vehicle based data collection and processing system and imaging sensor system and methods thereof |
US7893957B2 (en) * | 2002-08-28 | 2011-02-22 | Visual Intelligence, LP | Retinal array compound camera system |
US8994822B2 (en) | 2002-08-28 | 2015-03-31 | Visual Intelligence Lp | Infrastructure mapping system and method |
US8483960B2 (en) | 2002-09-20 | 2013-07-09 | Visual Intelligence, LP | Self-calibrated, remote imaging and data processing system |
US7212938B2 (en) | 2002-09-17 | 2007-05-01 | M7 Visual Intelligence, Lp | Method of using a self-locking travel pattern to achieve calibration of remote sensors using conventionally collected data |
US6928194B2 (en) * | 2002-09-19 | 2005-08-09 | M7 Visual Intelligence, Lp | System for mosaicing digital ortho-images |
USRE49105E1 (en) | 2002-09-20 | 2022-06-14 | Vi Technologies, Llc | Self-calibrated, remote imaging and data processing system |
US7002551B2 (en) * | 2002-09-25 | 2006-02-21 | Hrl Laboratories, Llc | Optical see-through augmented reality modified-scale display |
US20070035562A1 (en) * | 2002-09-25 | 2007-02-15 | Azuma Ronald T | Method and apparatus for image enhancement |
US20040066391A1 (en) * | 2002-10-02 | 2004-04-08 | Mike Daily | Method and apparatus for static image enhancement |
US20040068758A1 (en) * | 2002-10-02 | 2004-04-08 | Mike Daily | Dynamic video annotation |
US7424133B2 (en) * | 2002-11-08 | 2008-09-09 | Pictometry International Corporation | Method and apparatus for capturing, geolocating and measuring oblique images |
US6975959B2 (en) * | 2002-12-03 | 2005-12-13 | Robert Bosch Gmbh | Orientation and navigation for a mobile device using inertial sensors |
WO2005017550A2 (en) * | 2002-12-13 | 2005-02-24 | Utah State University Research Foundation | A vehicle mounted system and method for capturing and processing physical data |
US7046259B2 (en) * | 2003-04-30 | 2006-05-16 | The Boeing Company | Method and system for presenting different views to passengers in a moving vehicle |
US7088310B2 (en) * | 2003-04-30 | 2006-08-08 | The Boeing Company | Method and system for presenting an image of an external view in a moving vehicle |
US7212921B2 (en) * | 2003-05-21 | 2007-05-01 | Honeywell International Inc. | System and method for multiplexing and transmitting DC power, IMU data and RF data on a single cable |
US7117086B2 (en) * | 2003-09-08 | 2006-10-03 | Honeywell International Inc. | GPS/IMU clock synchronization particularly for deep integration vector tracking loop |
JP4253239B2 (en) * | 2003-10-07 | 2009-04-08 | 富士重工業株式会社 | Navigation system using image recognition |
US7308342B2 (en) * | 2004-01-23 | 2007-12-11 | Rafael Armament Development Authority Ltd. | Airborne reconnaissance system |
US7065449B2 (en) * | 2004-03-05 | 2006-06-20 | Bell Geospace, Inc. | Method and system for evaluating geophysical survey data |
EP1759173A4 (en) * | 2004-06-02 | 2012-02-22 | Rockwell Collins Control Technologies Inc | Image-augmented inertial navigation system (iains) and method |
AU2005322595B2 (en) * | 2004-06-02 | 2010-04-22 | Rockwell Collins Control Technologies, Inc. | Systems and methods for controlling dynamic systems |
US7542850B2 (en) * | 2004-06-24 | 2009-06-02 | Bell Geospace, Inc. | Method and system for synchronizing geophysical survey data |
US7458264B2 (en) * | 2004-09-10 | 2008-12-02 | Honeywell International Inc. | Generalized inertial measurement error reduction through multiple axis rotation during flight |
US7274439B2 (en) * | 2004-09-10 | 2007-09-25 | Honeywell International Inc. | Precise, no-contact, position sensing using imaging |
US7289902B2 (en) * | 2004-09-10 | 2007-10-30 | Honeywell International Inc. | Three dimensional balance assembly |
US20060054660A1 (en) * | 2004-09-10 | 2006-03-16 | Honeywell International Inc. | Articulated gas bearing support pads |
US7617070B2 (en) * | 2004-09-10 | 2009-11-10 | Honeywell International Inc. | Absolute position determination of an object using pattern recognition |
US7340344B2 (en) * | 2004-09-10 | 2008-03-04 | Honeywell International Inc. | Spherical position monitoring system |
US7295947B2 (en) * | 2004-09-10 | 2007-11-13 | Honeywell International Inc. | Absolute position determination of an object using pattern recognition |
US7366613B2 (en) * | 2004-09-10 | 2008-04-29 | Honeywell International Inc. | RF wireless communication for deeply embedded aerospace systems |
US7698064B2 (en) * | 2004-09-10 | 2010-04-13 | Honeywell International Inc. | Gas supported inertial sensor system and method |
US7668655B2 (en) * | 2004-12-07 | 2010-02-23 | Honeywell International Inc. | Navigation component modeling system and method |
US7586514B1 (en) * | 2004-12-15 | 2009-09-08 | United States Of America As Represented By The Secretary Of The Navy | Compact remote tactical imagery relay system |
WO2006090368A1 (en) * | 2005-02-22 | 2006-08-31 | Israel Aerospace Industries Ltd. | A calibration method and system for position measurements |
US20060210169A1 (en) * | 2005-03-03 | 2006-09-21 | General Dynamics Advanced Information Systems, Inc. | Apparatus and method for simulated sensor imagery using fast geometric transformations |
US7260389B2 (en) * | 2005-07-07 | 2007-08-21 | The Boeing Company | Mobile platform distributed data load management system |
US8732233B2 (en) * | 2005-07-13 | 2014-05-20 | The Boeing Company | Integrating portable electronic devices with electronic flight bag systems installed in aircraft |
US7827400B2 (en) | 2005-07-28 | 2010-11-02 | The Boeing Company | Security certificate management |
US7788002B2 (en) * | 2005-08-08 | 2010-08-31 | The Boeing Company | Fault data management |
JP4307427B2 (en) * | 2005-08-31 | 2009-08-05 | 株式会社パスコ | Laser surveying apparatus and laser surveying method |
WO2007041690A2 (en) | 2005-10-04 | 2007-04-12 | Alexander Eugene J | Device for generating three dimensional surface models of moving objects |
US20070076096A1 (en) * | 2005-10-04 | 2007-04-05 | Alexander Eugene J | System and method for calibrating a set of imaging devices and calculating 3D coordinates of detected features in a laboratory coordinate system |
ES2369229T3 (en) * | 2005-10-07 | 2011-11-28 | Saab Ab | PROCEDURE AND APPLIANCE TO GENERATE A ROUTE. |
EP1960941A4 (en) | 2005-11-10 | 2012-12-26 | Motion Analysis Corp | Device and method for calibrating an imaging device for generating three-dimensional surface models of moving objects |
US9182228B2 (en) * | 2006-02-13 | 2015-11-10 | Sony Corporation | Multi-lens array system and method |
US20100245571A1 (en) * | 2006-04-24 | 2010-09-30 | Northrop Grumman Corporation | Global hawk image mosaic |
JP4938351B2 (en) * | 2006-05-16 | 2012-05-23 | トヨタ自動車株式会社 | Positioning information update device for vehicles |
US7873238B2 (en) | 2006-08-30 | 2011-01-18 | Pictometry International Corporation | Mosaic oblique images and methods of making and using same |
US7647176B2 (en) * | 2007-01-11 | 2010-01-12 | Honeywell International Inc. | Method and system for wireless power transfers through multiple ports |
US8593518B2 (en) * | 2007-02-01 | 2013-11-26 | Pictometry International Corp. | Computer system for continuous oblique panning |
US8520079B2 (en) * | 2007-02-15 | 2013-08-27 | Pictometry International Corp. | Event multiplexer for managing the capture of images |
US7463340B2 (en) * | 2007-03-28 | 2008-12-09 | Honeywell International Inc. | Ladar-based motion estimation for navigation |
US20080255736A1 (en) * | 2007-04-10 | 2008-10-16 | Helena Holding Company | Geo-referenced agricultural levees |
US8385672B2 (en) * | 2007-05-01 | 2013-02-26 | Pictometry International Corp. | System for detecting image abnormalities |
US9262818B2 (en) | 2007-05-01 | 2016-02-16 | Pictometry International Corp. | System for detecting image abnormalities |
US7762133B2 (en) * | 2007-07-17 | 2010-07-27 | Honeywell International Inc. | Inertial measurement unit with gas plenums |
US7425097B1 (en) | 2007-07-17 | 2008-09-16 | Honeywell International Inc. | Inertial measurement unit with wireless power transfer gap control |
US8024119B2 (en) * | 2007-08-14 | 2011-09-20 | Honeywell International Inc. | Systems and methods for gyrocompass alignment using dynamically calibrated sensor data and an iterated extended kalman filter within a navigation system |
US7671607B2 (en) * | 2007-09-06 | 2010-03-02 | Honeywell International Inc. | System and method for measuring air bearing gap distance |
US8965812B2 (en) * | 2007-10-09 | 2015-02-24 | Archer Daniels Midland Company | Evaluating commodity conditions using aerial image data |
US7991226B2 (en) | 2007-10-12 | 2011-08-02 | Pictometry International Corporation | System and process for color-balancing a series of oblique images |
US8531472B2 (en) | 2007-12-03 | 2013-09-10 | Pictometry International Corp. | Systems and methods for rapid three-dimensional modeling with real façade texture |
DE102007058943A1 (en) * | 2007-12-07 | 2009-06-10 | Emt Ingenieurgesellschaft Dipl.-Ing. Hartmut Euer Mbh | Multi-spectral video device for air-based surveillance and real time aerial photograph monitoring of uncrewed aircraft, has electronic mosaic/fusion device designed such that fusion device receives image signals of video scanner cameras |
WO2009105254A2 (en) * | 2008-02-20 | 2009-08-27 | Actioncam, Llc | Aerial camera system |
US20090245581A1 (en) * | 2008-03-31 | 2009-10-01 | Sean Dey | Airborne terrain acquisition and processing system with fluid detection |
US8497905B2 (en) * | 2008-04-11 | 2013-07-30 | nearmap australia pty ltd. | Systems and methods of capturing large area images in detail including cascaded cameras and/or calibration features |
US8675068B2 (en) * | 2008-04-11 | 2014-03-18 | Nearmap Australia Pty Ltd | Systems and methods of capturing large area images in detail including cascaded cameras and/or calibration features |
US8213706B2 (en) * | 2008-04-22 | 2012-07-03 | Honeywell International Inc. | Method and system for real-time visual odometry |
US9235334B2 (en) * | 2008-05-09 | 2016-01-12 | Genesis Industries, Llc | Managing landbases and machine operations performed thereon |
US8373127B2 (en) * | 2008-06-26 | 2013-02-12 | Lynntech, Inc. | Method of searching for a thermal target |
US8588547B2 (en) | 2008-08-05 | 2013-11-19 | Pictometry International Corp. | Cut-line steering methods for forming a mosaic image of a geographical area |
US8401222B2 (en) | 2009-05-22 | 2013-03-19 | Pictometry International Corp. | System and process for roof measurement using aerial imagery |
US8577518B2 (en) * | 2009-05-27 | 2013-11-05 | American Aerospace Advisors, Inc. | Airborne right of way autonomous imager |
PT104783B (en) | 2009-10-13 | 2014-08-27 | Univ Aveiro | HIGH PRECISION POSITIONING SYSTEM ADAPTED TO A TERRESTRIAL MOBILE PLATFORM |
US9330494B2 (en) | 2009-10-26 | 2016-05-03 | Pictometry International Corp. | Method for the automatic material classification and texture simulation for 3D models |
WO2011062525A1 (en) | 2009-11-20 | 2011-05-26 | Saab Ab | A method estimating absolute orientation of a vehicle |
WO2011089477A1 (en) * | 2010-01-25 | 2011-07-28 | Tarik Ozkul | Autonomous decision system for selecting target in observation satellites |
JP2011155361A (en) * | 2010-01-26 | 2011-08-11 | Sony Corp | Imaging apparatus, imaging control method, and program |
CA2796162A1 (en) * | 2010-04-13 | 2012-10-04 | Visual Intelligence, LP | Self-calibrated, remote imaging and data processing system |
FR2959633B1 (en) * | 2010-04-29 | 2012-08-31 | Airbus Operations Sas | METHOD FOR UPGRADING AN AIRCRAFT |
US8477190B2 (en) | 2010-07-07 | 2013-07-02 | Pictometry International Corp. | Real-time moving platform management system |
US8823732B2 (en) | 2010-12-17 | 2014-09-02 | Pictometry International Corp. | Systems and methods for processing images with edge detection and snap-to feature |
US8552905B2 (en) | 2011-02-25 | 2013-10-08 | Raytheon Company | Automated layout of beams |
EP2719163A4 (en) | 2011-06-10 | 2015-09-09 | Pictometry Int Corp | System and method for forming a video stream containing gis data in real-time |
WO2013020158A1 (en) * | 2011-08-10 | 2013-02-14 | John Lucas | Inspecting geographically spaced features |
US8687062B1 (en) | 2011-08-31 | 2014-04-01 | Google Inc. | Step-stare oblique aerial camera system |
US8430578B1 (en) * | 2011-11-18 | 2013-04-30 | Raytheon Company | Separation of main and secondary inertial measurements for improved line of sight error of an imaging vehicle's isolated detector assembly |
US8552350B2 (en) | 2012-01-15 | 2013-10-08 | Raytheon Company | Mitigation of drift effects in secondary inertial measurements of an isolated detector assembly |
US9183538B2 (en) | 2012-03-19 | 2015-11-10 | Pictometry International Corp. | Method and system for quick square roof reporting |
ES2394540B1 (en) * | 2012-07-26 | 2013-12-11 | Geonumerics, S.L. | PROCEDURE FOR THE ACQUISITION AND PROCESSING OF GEOGRAPHICAL INFORMATION OF A TRAJECT |
IL222221B (en) * | 2012-09-27 | 2019-03-31 | Rafael Advanced Defense Systems Ltd | Improved inertial navigation system and method |
CN103148803B (en) * | 2013-02-28 | 2015-12-02 | 中国地质大学(北京) | Small-sized three-dimensional laser scanning measurement system and method |
US9881163B2 (en) | 2013-03-12 | 2018-01-30 | Pictometry International Corp. | System and method for performing sensitive geo-spatial processing in non-sensitive operator environments |
US9244272B2 (en) | 2013-03-12 | 2016-01-26 | Pictometry International Corp. | Lidar system producing multiple scan paths and method of making and using same |
US9753950B2 (en) | 2013-03-15 | 2017-09-05 | Pictometry International Corp. | Virtual property reporting for automatic structure detection |
US9275080B2 (en) | 2013-03-15 | 2016-03-01 | Pictometry International Corp. | System and method for early access to captured images |
US9441974B2 (en) * | 2013-03-15 | 2016-09-13 | Novatel Inc. | System and method for calculating lever arm values photogrammetrically |
US9430846B2 (en) | 2013-04-19 | 2016-08-30 | Ge Aviation Systems Llc | Method of tracking objects using hyperspectral imagery |
US9182236B2 (en) | 2013-10-25 | 2015-11-10 | Novatel Inc. | System for post processing GNSS/INS measurement data and camera image data |
US9528834B2 (en) | 2013-11-01 | 2016-12-27 | Intelligent Technologies International, Inc. | Mapping techniques using probe vehicles |
US9751639B2 (en) * | 2013-12-02 | 2017-09-05 | Field Of View Llc | System to control camera triggering and visualize aerial imaging missions |
WO2015081383A1 (en) * | 2013-12-04 | 2015-06-11 | Spatial Information Systems Research Ltd | Method and apparatus for developing a flight path |
KR101429166B1 (en) * | 2013-12-27 | 2014-08-13 | 대한민국 | Imaging system on aerial vehicle |
MX2016008890A (en) | 2014-01-10 | 2017-01-16 | Pictometry Int Corp | Unmanned aircraft structure evaluation system and method. |
US9292913B2 (en) | 2014-01-31 | 2016-03-22 | Pictometry International Corp. | Augmented three dimensional point collection of vertical structures |
CA2938973A1 (en) | 2014-02-08 | 2015-08-13 | Pictometry International Corp. | Method and system for displaying room interiors on a floor plan |
CN103984193B (en) * | 2014-03-14 | 2020-10-16 | 广州虹天航空科技有限公司 | Photographing apparatus stabilizer and control method thereof |
US20150358522A1 (en) * | 2014-03-31 | 2015-12-10 | Goodrich Corporation | Stabilization Of Gyro Drift Compensation For Image Capture Device |
CN103994755B (en) * | 2014-05-29 | 2016-03-30 | 清华大学深圳研究生院 | A kind of space non-cooperative object pose measuring method based on model |
CN104062687B (en) * | 2014-06-12 | 2018-08-10 | 中国航空无线电电子研究所 | A kind of earth's magnetic field joint observation method and system of vacant lot one |
US9440750B2 (en) | 2014-06-20 | 2016-09-13 | nearmap australia pty ltd. | Wide-area aerial camera systems |
US9052571B1 (en) | 2014-06-20 | 2015-06-09 | nearmap australia pty ltd. | Wide-area aerial camera systems |
US9641736B2 (en) | 2014-06-20 | 2017-05-02 | nearmap australia pty ltd. | Wide-area aerial camera systems |
US9046759B1 (en) | 2014-06-20 | 2015-06-02 | nearmap australia pty ltd. | Compact multi-resolution aerial camera system |
US9185290B1 (en) | 2014-06-20 | 2015-11-10 | Nearmap Australia Pty Ltd | Wide-area aerial camera systems |
EP3169974A2 (en) * | 2014-07-18 | 2017-05-24 | Altec S.p.A. | Image and/or radio signals capturing platform |
CN106796276A (en) | 2014-10-08 | 2017-05-31 | 斯布克费舍创新私人有限公司 | Aerocamera system |
CN105493496B (en) | 2014-12-14 | 2019-01-18 | 深圳市大疆创新科技有限公司 | A kind of method for processing video frequency, device and picture system |
PL3257168T3 (en) * | 2015-02-09 | 2019-04-30 | European Space Agency Esa | Method for creating a constellation of electronic devices for providing optical or radio-frequency operations on a predetermined geographical area, and a system of such a constellation of electronic devices |
US10754922B2 (en) | 2015-04-23 | 2020-08-25 | Lockheed Martin Corporation | Method and apparatus for sensor fusion |
US9945828B1 (en) | 2015-10-23 | 2018-04-17 | Sentek Systems Llc | Airborne multispectral imaging system with integrated navigation sensors and automatic image stitching |
US10132933B2 (en) * | 2016-02-02 | 2018-11-20 | Qualcomm Incorporated | Alignment of visual inertial odometry and satellite positioning system reference frames |
AU2017221222B2 (en) | 2016-02-15 | 2022-04-21 | Pictometry International Corp. | Automated system and methodology for feature extraction |
US10671648B2 (en) | 2016-02-22 | 2020-06-02 | Eagle View Technologies, Inc. | Integrated centralized property database systems and methods |
US10438326B2 (en) | 2017-07-21 | 2019-10-08 | The Boeing Company | Recursive suppression of clutter in video imagery |
US10453187B2 (en) | 2017-07-21 | 2019-10-22 | The Boeing Company | Suppression of background clutter in video imagery |
FR3072475B1 (en) * | 2017-10-17 | 2019-11-01 | Thales | METHOD OF PROCESSING AN ERROR DURING THE EXECUTION OF A PREDETERMINED AVIONIC PROCEDURE, COMPUTER PROGRAM AND SYSTEM FOR DETECTION AND ALERT |
US10267889B1 (en) * | 2017-11-15 | 2019-04-23 | Avalex Technologies Corporation | Laser source location system |
AT520253A3 (en) * | 2018-07-16 | 2019-04-15 | Umweltdata G M B H | Selective harvesting method |
KR20200028648A (en) * | 2018-09-07 | 2020-03-17 | 삼성전자주식회사 | Method for adjusting an alignment model for sensors and an electronic device performing the method |
CN114935331B (en) * | 2022-05-27 | 2023-05-26 | 中国科学院西安光学精密机械研究所 | Aviation camera dynamic imaging ground test method |
Family Cites Families (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3191170A (en) * | 1963-01-07 | 1965-06-22 | Gen Instrument Corp | Contour mapping system |
US4589610A (en) * | 1983-11-08 | 1986-05-20 | Westinghouse Electric Corp. | Guided missile subsystem |
US4814711A (en) * | 1984-04-05 | 1989-03-21 | Deseret Research, Inc. | Survey system and method for real time collection and processing of geophysicals data using signals from a global positioning satellite network |
US4764781A (en) * | 1987-02-26 | 1988-08-16 | Grumman Aerospace Corporation | Universal translational and rotational film drive mechanism |
US5060175A (en) * | 1989-02-13 | 1991-10-22 | Hughes Aircraft Company | Measurement and control system for scanning sensors |
US5166789A (en) * | 1989-08-25 | 1992-11-24 | Space Island Products & Services, Inc. | Geographical surveying using cameras in combination with flight computers to obtain images with overlaid geographical coordinates |
JPH04250436A (en) * | 1991-01-11 | 1992-09-07 | Pioneer Electron Corp | Image pickup device |
US5247356A (en) * | 1992-02-14 | 1993-09-21 | Ciampa John A | Method and apparatus for mapping and measuring land |
US5270756A (en) * | 1992-02-18 | 1993-12-14 | Hughes Training, Inc. | Method and apparatus for generating high resolution vidicon camera images |
US5251037A (en) * | 1992-02-18 | 1993-10-05 | Hughes Training, Inc. | Method and apparatus for generating high resolution CCD camera images |
US5477459A (en) * | 1992-03-06 | 1995-12-19 | Clegg; Philip M. | Real time three-dimensional machine locating system |
NL9202019A (en) * | 1992-11-19 | 1994-06-16 | Gatso Special Prod Bv | Method, system and vehicle for making and analyzing multispectral images. |
US5438404A (en) * | 1992-12-16 | 1995-08-01 | Aai Corporation | Gyroscopic system for boresighting equipment by optically acquiring and transferring parallel and non-parallel lines |
US5503350A (en) * | 1993-10-28 | 1996-04-02 | Skysat Communications Network Corporation | Microwave-powered aircraft |
JP2807622B2 (en) * | 1993-12-13 | 1998-10-08 | 株式会社コア | Aircraft integrated photography system |
US5467271A (en) * | 1993-12-17 | 1995-11-14 | Trw, Inc. | Mapping and analysis system for precision farming applications |
US5519620A (en) * | 1994-02-18 | 1996-05-21 | Trimble Navigation Limited | Centimeter accurate global positioning system receiver for on-the-fly real-time kinematic measurement and control |
US5490075A (en) * | 1994-08-01 | 1996-02-06 | The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration | Global positioning system synchronized active light autonomous docking system |
US5557397A (en) * | 1994-09-21 | 1996-09-17 | Airborne Remote Mapping, Inc. | Aircraft-based topographical data collection and processing system |
- 1996
  - 1996-03-22 US US08/621,107 patent/US5894323A/en not_active Expired - Fee Related
- 1997
  - 1997-03-21 CA CA002250063A patent/CA2250063A1/en not_active Abandoned
  - 1997-03-21 AU AU23425/97A patent/AU726815B2/en not_active Ceased
  - 1997-03-21 WO PCT/US1997/004668 patent/WO1997035166A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
US5894323A (en) | 1999-04-13 |
AU2342597A (en) | 1997-10-10 |
WO1997035166A1 (en) | 1997-09-25 |
AU726815B2 (en) | 2000-11-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5894323A (en) | 1999-04-13 | Airborne imaging system using global positioning system (GPS) and inertial measurement unit (IMU) data |
Zhou | | Near real-time orthorectification and mosaic of small UAV video flow for time-critical event response |
US10996055B2 (en) | | Integrated aerial photogrammetry surveys |
Mian et al. | | Direct georeferencing on small unmanned aerial platforms for improved reliability and accuracy of mapping without the need for ground control points |
Rinaudo et al. | | Archaeological site monitoring: UAV photogrammetry can be an answer |
Grejner-Brzezinska | | Direct exterior orientation of airborne imagery with GPS/INS system: Performance analysis |
Hernandez-Lopez et al. | | An automatic approach to UAV flight planning and control for photogrammetric applications |
Raczynski | | Accuracy analysis of products obtained from UAV-borne photogrammetry influenced by various flight parameters |
Toth | | Sensor integration in airborne mapping |
Zhou | | Geo-referencing of video flow from small low-cost civilian UAV |
Elbahnasawy et al. | | Multi-sensor integration onboard a UAV-based mobile mapping system for agricultural management |
Grejner-Brzezinska | | Direct sensor orientation in airborne and land-based mapping applications |
Chen et al. | | Development and calibration of the airborne three-line scanner (TLS) imaging system |
Eisenbeiss | | Applications of photogrammetric processing using an autonomous model helicopter |
Cramer | | On the use of direct georeferencing in airborne photogrammetry |
Cannata et al. | | Autonomous video registration using sensor model parameter adjustments |
Skaloud et al. | | Mapping with MAV: experimental study on the contribution of absolute and relative aerial position control |
Mostafa et al. | | A fully digital system for airborne mapping |
Ladd et al. | | Rectification, georeferencing, and mosaicking of images acquired with remotely operated aerial platforms |
Kordić et al. | | Spatial data performance test of mid-cost UAS with direct georeferencing |
Conte et al. | | Evaluation of a light-weight LiDAR and a photogrammetric system for unmanned airborne mapping applications |
Elbahnasawy | | GNSS/INS-assisted multi-camera mobile mapping: System architecture, modeling, calibration, and enhanced navigation |
Mostafa et al. | | GPS/INS integrated navigation system in support of digital image georeferencing |
Ip et al. | | Fast orthophoto production using the digital sensor system |
Toth | | Direct platform orientation of multisensor data acquisition systems |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| FZDE | Dead | |