WO1997035166A1 - Airborne imaging system - Google Patents

Airborne imaging system Download PDF

Info

Publication number
WO1997035166A1
Authority
WO
WIPO (PCT)
Prior art keywords
data
sensor
imu
aircraft
gps
Prior art date
Application number
PCT/US1997/004668
Other languages
French (fr)
Inventor
James E. Kain
Charles Yates
Original Assignee
Tasc, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Tasc, Inc. filed Critical Tasc, Inc.
Priority to AU23425/97A priority Critical patent/AU726815B2/en
Publication of WO1997035166A1 publication Critical patent/WO1997035166A1/en

Links

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • G01C11/02Picture taking arrangements specially adapted for photogrammetry or photographic surveying, e.g. controlling overlapping of pictures

Definitions

  • This invention relates to vehicle-mounted sensing systems and, more particularly, to high resolution, low cost sensing systems which operate on a moving vehicle.
  • the invention is particularly useful for airborne imaging, but is not limited to this use.
  • the remote sensing market originated with early satellites and goals of global monitoring of terrestrial activities.
  • the cost of data and the response times for obtaining data has limited the broad applicability of imagery for use in day-to-day business operations.
  • the computerized geographic information system is recognized as the information integration tool of the future.
  • Geographic information systems include computer tools for locating geographic coordinates of points within images, for overlaying maps and images, and for making quantitative measurements, such as areas, distances and precise locations of objects, from the images.
  • the imagery must be available at low cost and with short response times to requests for imagery.
  • the drawbacks to satellite imagery include the potential for cloud coverage (a substantial fraction of the earth is cloud-covered at any given time), high costs and the inflexibility of satellite imagery collection systems. Custom-tailored resolutions, spectral bands and responsiveness cannot be provided by satellite systems.
  • Airborne imagery is commonly used by many small scale users. Benefits of airborne imagery collection include tailorable resolution, response time and data processing methodology. Drawbacks include higher costs and the need to contract for a dedicated aircraft, flight crew and post-mission processing system.
  • a variety of airborne sensing and survey systems have been disclosed in the prior art.
  • a survey system for obtaining geophysical data with aircraft using real time differential operation of the global positioning system is disclosed in U.S. Pat. No. 4,814,711 issued March 21, 1989 to Olsen et al.
  • An airborne system using two color video cameras and an IR imager head mounted below the fuselage of an aircraft is disclosed in U.S. Pat. No. 5,166,789 issued November 24, 1992 to Myrick.
  • Latitude and longitude information obtained from the global positioning system is recorded on each image frame.
  • a system including a CCD camera and a global positioning system receiver for recording an image signal and position information on magnetic tape is disclosed in U.S. Pat. No.
  • a remote data collection system in accordance with the invention comprises a directional sensor for sensing a characteristic of interest and providing sensor data representative of the characteristic of interest, a global positioning system (GPS) receiver for providing GPS data representative of position and velocity of the sensor, an inertial measurement unit (IMU) for providing IMU data representative of attitude rate and acceleration of the sensor, a processing unit and a data storage unit.
  • the processing unit determines IMU errors in response to the IMU data and an IMU error model.
  • the processing unit also determines geographic data referenced to the sensor data in response to the GPS data, the IMU data and the IMU errors.
  • the sensor data and the geographic data are stored in the data storage unit for subsequent use.
  • the sensor may comprise one or more cameras, and the sensor data may represent images.
  • the control unit may comprise a gimbal, a support member rigidly mounted to the vehicle, a first motor connected between the support member and the gimbal for rotating the gimbal about a first axis relative to the support member and a second motor connected between the gimbal and the stabilized platform for rotating the stabilized platform about a second axis relative to the gimbal.
  • the direction of the sensor is stabilized with respect to the first and second axes as the vehicle moves.
  • the vehicle comprises an aircraft, and the control unit stabilizes the sensing direction ofthe sensor in a vertical orientation with respect to pitch and roll of the aircraft.
  • All facets of the invention are optimized for low operational cost and rapid response.
  • the sensor and the IMU are integrated to a readily detachable vehicle component (e.g. cargo door) so that no modifications to the vehicle are required.
  • the IMU used for sensor stabilization is also used for attitude determination and navigation, thus reducing cost.
  • a mission planning/vehicle steering command system is included to allow one man operation for cost reduction and operational ease of use.
  • FIG. 1 is a block diagram of an embodiment of an airborne imaging system in accordance with the present invention;
  • FIG. 2 is a pictorial diagram showing the components of the airborne imaging system of FIG. 1;
  • FIG. 3 is a pictorial diagram illustrating the airborne imaging system of FIG. 1 installed on an aircraft;
  • FIG. 4 is a top schematic view showing the stabilized platform assembly mounted on the door of an aircraft;
  • FIG. 5 is a top view showing the stabilized platform assembly in more detail;
  • FIG. 6 is an exploded view of the stabilized platform assembly;
  • FIG. 7 is a software flow diagram that illustrates operation of the airborne imaging system.
  • FIG. 8 is a pictorial diagram that illustrates an example of an airborne survey mission.
  • A block diagram of an embodiment of an imaging system in accordance with the invention is shown in FIG. 1.
  • the imaging system includes an airborne data collection system 10 and a ground processing workstation 12, and makes use of a GPS ground station 14.
  • the airborne data collection system 10 is typically mounted in an aircraft and is used for obtaining images of a prescribed survey area of the earth.
  • the imaging system may be used to obtain images of a prescribed agricultural area or forest area.
  • the ground processing workstation 12 is used to define a trajectory to be followed by the aircraft in order to obtain images of the survey area with complete coverage.
  • the ground processing workstation 12 may also be used for post-mission processing of image data and for administrative functions.
  • the airborne data collection system 10, in general, includes a directional sensor for generating sensor data, a global positioning system (GPS) receiver for providing GPS data representative of the position of the sensor, an inertial measurement unit (IMU) for providing IMU data representative of the attitude of the sensor, a processing unit responsive to the GPS data and the IMU data for providing geographic data referenced to the sensor data, and a data storage unit for storing the sensor data and the geographic data.
  • the geographic data establishes the ground coordinates of the sensor data with high resolution.
  • the airborne data collection system 10 may include a stabilized platform assembly for stabilizing the direction of the sensor during aircraft flight.
  • the data collection system of the invention may be used in ground vehicles as well as in aircraft.
  • an embodiment of the airborne data collection system 10 includes cameras 20, 22 and 24.
  • the cameras 20, 22 and 24 may, for example, be charge coupled device (CCD) cameras with filters having different spectral responses.
  • the cameras 20, 22 and 24 supply image data and synchronization signals to a frame grabber 28.
  • the frame grabber 28 supplies image data representative of individual images obtained by each of the cameras to a system computer 30.
  • a disk storage unit 32 connected to the system computer 30 is used for storage of image data and geographic data.
  • the airborne data collection system 10 further includes an inertial measurement unit 40 that provides IMU data to the system computer 30 through an IMU interface 42.
  • the inertial measurement unit 40 is rigidly mechanically connected to cameras 20, 22 and 24 and typically senses acceleration and rotation rate with respect to three coordinate axes.
  • a GPS receiver 46 receives positioning signals from GPS satellites through a GPS antenna 48.
  • the GPS receiver 46 also receives positioning signals from the GPS ground station 14 through a data link antenna 50 and an RF modem 52.
  • the GPS receiver 46 supplies GPS data and a GPS clock to the system computer 30.
  • the GPS data accurately represents the position of GPS receiver 46 and therefore represents the position of cameras 20, 22 and 24.
  • the system uses differential GPS for steering the vehicle to accuracies of 1-3 m.
  • a kinematic GPS processing procedure is applied post-mission to allow determination of position accuracy to the 10 cm level.
  • a roll motor 60 stabilizes the cameras 20, 22 and 24 with respect to aircraft roll
  • a pitch motor 62 stabilizes cameras 20, 22 and 24 with respect to aircraft pitch
  • Roll motor 60 is energized by system computer 30 through a motor amplifier 64
  • pitch motor 62 is energized by system computer 30 through a motor amplifier 66.
  • Each of the motors 60 and 62 includes an encoder which provides to system computer 30 a signal representative of motor angle with respect to a reference angle. Operation of the stabilized platform assembly is described in detail below.
  • a power supply 70 receives aircraft power, typically 28 volts, and supplies operating power to the components of the airborne data collection system 10.
  • a graphical display 72 is connected to system computer 30. As described below, the display 72 provides commands to the pilot when the aircraft deviates from a preplanned trajectory over the survey area.
  • FIG. 2 is a pictorial diagram that illustrates a preferred configuration of the airborne data collection system 10.
  • a stabilized platform assembly includes cameras 20, 22 and 24, IMU 40, motors 60 and 62 and additional components described below.
  • the stabilized platform assembly 80 is mounted to a cargo door 82 of an aircraft (not shown in FIG. 2).
  • the airborne data collection system 10 further includes an electronics unit 84.
  • The electronics unit 84 includes system computer 30, disk storage unit 32, GPS receiver 46, RF modem 52, frame grabber 28, IMU interface 42, motor amplifiers 64 and 66, power supply 70 and display 72.
  • the electronics unit 84 is interconnected to stabilized platform assembly 80 by a cable 86.
  • FIG. 3 is a pictorial diagram illustrating installation of the airborne data collection system of the present invention in an aircraft 90.
  • stabilized platform assembly 80 is preferably mounted on cargo door 82.
  • Electronics unit 84 is positioned in the cargo area of the aircraft 90.
  • GPS antenna 48 may be mounted on the upper surface of the aircraft, and data link antenna 50 may be mounted on the underside of the aircraft.
  • the pilot display 76 is positioned for convenient viewing by the pilot.
  • FIG. 4 is a pictorial diagram of the stabilized platform 80 mounted on cargo door 82.
  • FIG. 5 shows the stabilized platform assembly in more detail
  • FIG. 6 shows an exploded view of the stabilized platform assembly.
  • Cameras 20, 22 and 24 are rigidly mounted to a stabilized platform 100.
  • the stabilized platform 100 extends through an opening in cargo door 82 such that cameras 20, 22 and 24 are located externally of the aircraft.
  • the opening in cargo door 82 provides sufficient clearance to permit movement of stabilized platform 100 with respect to the aircraft.
  • the opening is a vertically-oriented slot to permit pitch and roll stabilization of the cameras.
  • the cameras 20, 22 and 24 are protected by a cowling 102 that is open at the bottom.
  • the IMU 40 is rigidly mounted to a portion of stabilized platform 100 within the aircraft.
  • a support member 102 is mounted to an inside surface of cargo door 82, and roll motor 60 is secured to an inwardly-extending portion of support member 102.
  • the shaft of roll motor 60 is connected to a gimbal 106.
  • Pitch motor 62 is mounted to gimbal 106, and the shaft of pitch motor 62 is secured to stabilized platform 100.
  • cameras 20, 22 and 24 are mounted in a camera housing 120, which is attached to stabilized platform 100.
  • Stabilized platform 100 is connected to gimbal 106 by a pivot pin 122 and is connected to the shaft of pitch motor 62 by a flange 124.
  • Gimbal 106 is connected to the shaft of roll motor 60 by a flange 126.
  • the roll motor 60 was a Hathaway type HT03802 brushless DC motor
  • the pitch motor 62 was a Hathaway type HT02301 brushless DC motor
  • the motor amplifiers 64 and 66 were Hathaway type BLC048 motor amplifiers. In operation, the IMU 40 senses changes in velocity and angle in three coordinate directions.
  • Since cameras 20, 22 and 24 and IMU 40 are rigidly connected to stabilized platform 100, velocity changes and angle changes sensed by IMU 40 represent velocity and angle changes of cameras 20, 22 and 24.
  • IMU data representative of the velocity and angle changes is supplied to system computer 30.
  • the system computer 30 uses the angle changes to determine deviations of the attitude of the cameras from a desired attitude. These deviations are used to generate error signals which are supplied through motor amplifiers 64 and 66 to roll motor 60 and pitch motor 62, respectively.
  • the roll motor 60 rotates cameras 20, 22 and 24 with respect to roll axis 110.
  • pitch motor 62 rotates cameras 20, 22 and 24 with respect to pitch axis 112, so as to maintain a desired attitude.
  • in the preferred embodiment, the cameras 20, 22 and 24 are maintained in a vertical attitude with respect to the earth's surface.
  • other boresight directions may be utilized.
  • the cameras may be scanned, for example, with respect to the roll axis 110 to obtain images of a wider strip on each pass over the survey area.
  • the stabilized platform assembly 80 provides advantageous operation of the imaging system. Because the cameras 20, 22 and 24 are stabilized, typically in a vertical orientation, the spacing between adjacent aircraft passes over the survey area can be increased without risking loss of coverage between images in adjacent passes. This is possible because it is not necessary to account for inadvertent aircraft roll in determining the spacing between passes. By increasing the spacing between adjacent passes of the aircraft trajectory, the time and cost for completing a given survey is reduced. In an alternative approach, the cameras are scanned with respect to the roll axis at a rate relative to the aircraft speed which permits imaging of a wider strip than is possible with stationary cameras.
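The pass-spacing argument above can be made concrete with a short sketch. This is an illustrative flat-earth calculation, not taken from the patent: the altitude, field of view, overlap fraction and roll-uncertainty figures are assumed, and inadvertent roll is modeled as a worst-case sideways shift of the swath footprint.

```python
import math

def pass_spacing(altitude_m, fov_deg, overlap_frac, roll_uncertainty_deg=0.0):
    """Spacing between adjacent flight lines that still guarantees coverage.

    A stabilized (vertical) camera has roll_uncertainty_deg ~ 0, so the full
    swath is usable; an unstabilized camera must budget for inadvertent roll,
    which shifts the footprint sideways by roughly h * tan(roll).
    """
    half_fov = math.radians(fov_deg / 2.0)
    swath = 2.0 * altitude_m * math.tan(half_fov)   # ground footprint width
    margin = altitude_m * math.tan(math.radians(roll_uncertainty_deg))
    return (swath - 2.0 * margin) * (1.0 - overlap_frac)

# Stabilized vs. unstabilized spacing at an assumed 1000 m altitude,
# 30 degree field of view and 20% frame overlap.
stabilized = pass_spacing(1000.0, 30.0, 0.20)
unstabilized = pass_spacing(1000.0, 30.0, 0.20, roll_uncertainty_deg=5.0)
```

Wider spacing with the stabilized platform translates directly into fewer passes, and hence less flight time, for the same survey area.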
  • the stabilized platform assembly 80, shown in FIGS. 4-6 and described above, provides stabilization with respect to the pitch and roll axes of the aircraft.
  • in an alternative embodiment, the stabilized platform assembly is simplified to provide stabilization with respect to the roll axis only.
  • in that case, the shaft of roll motor 60 may be connected directly to stabilized platform 100 so as to rotate cameras 20, 22 and 24 with respect to roll axis 110. It is believed that stabilization of the cameras 20, 22 and 24 with respect to the yaw axis (perpendicular to axes 110 and 112) would not provide substantial benefits in the operation of the imaging system.
  • Mounting of the stabilized platform assembly 80 on the aircraft door provides significant practical advantages in operation of the imaging system. In general, it is desired to install the imaging system of the present invention in arbitrary aircraft. One alternative is to install the cameras in a hole cut in the floor of the aircraft. However, this requires a special modification to the aircraft and requires certification of the installation by the FAA. Such a hole is unlikely to be acceptable to many aircraft owners. Wing mounting of the camera assembly is undesirable for similar reasons. Mounting of the stabilized platform assembly on the cargo door provides an attractive solution. Cessna 172 aircraft, for example, have a cargo door that is easily removable. Other small, four-passenger commercial aircraft have a similar cargo door which may be modified for installation of the stabilized platform assembly.
  • a 6x6 inch hole is cut in the lower interior portion of the dual wall aluminum door structure.
  • the support member 102, having a box structure, is used to carry torque from the pitch and roll motors.
  • a torque transfer stiffener 130 (FIG. 6), 14 inches in length, is part of the support member 102 and transfers the roll motor torque in the vertical plane of the door.
  • a vertically-oriented slot is cut in the door to allow the camera support portion of the stabilized platform 100 to pass through the door to the exterior of the aircraft.
  • the cameras used in the imaging system may include three compact monochrome CCD cameras. Such cameras are available from numerous suppliers.
  • a preferred camera is the Sony XC-7500, which provides 640 x 480 pixel resolution in non-interlace (progressive scan) mode.
  • the cameras typically use a 16 mm lens with an f-stop of 2.8.
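Ground resolution follows from the lens focal length, the detector pixel pitch and the flying height. The sketch below uses a pinhole model and an assumed pixel pitch of about 10 microns (the text does not state the XC-7500's pixel size), so the resulting altitude is illustrative only.

```python
def ground_sample_distance(altitude_m, focal_length_m, pixel_pitch_m):
    """Ground footprint of a single pixel for a nadir camera (pinhole model)."""
    return altitude_m * pixel_pitch_m / focal_length_m

def altitude_for_gsd(gsd_m, focal_length_m, pixel_pitch_m):
    """Flying height that yields a desired ground sample distance."""
    return gsd_m * focal_length_m / pixel_pitch_m

# 16 mm lens, assumed 10 micron pixels, one-foot (0.3048 m) target resolution.
alt = altitude_for_gsd(0.3048, 0.016, 10e-6)
```

Under these assumptions a one-foot resolution corresponds to flying at roughly 490 m above the terrain.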
  • Different filters can be utilized in the camera lens to provide different spectral responses. For example, red, green, blue and near infrared filters may be utilized to obtain different information regarding the survey area.
  • a color image can be formed by using red, green and blue filters.
  • the frame grabber may be a Mu-tech model M-1000 which allows access to up to 4 cameras simultaneously.
  • the imaging system has been described thus far with reference to a configuration utilizing three cameras. It will be understood that any number of cameras may be utilized. More generally, any sensor having a boresight direction for sensing may be utilized for data collection. Thus, for example, the sensor may be a laser system, an atmospheric pollution sensor, a thermal camera, a radar system or any other suitable sensor.
  • the IMU may be a Honeywell H-1700 system, which has a gyro accuracy of 10° per hour. While higher-accuracy IMUs are available, the cost is also higher. In order to utilize a low-cost IMU with moderate accuracy, an error model of the IMU is utilized as described below.
  • the GPS receiver 46 may comprise an eight-channel Motorola Encore airborne unit, and the GPS ground station 14 may comprise an 8-channel Motorola Encore differential GPS base station.
  • the Motorola Encore is a C/A code unit with capability for using differential corrections transmitted by the base station.
  • the GPS receiver 46 is connected by a coaxial cable to GPS antenna 48, installed on the upper surface of the aircraft.
  • a true kinematic GPS system is a preferred implementation to achieve accuracies of 10 cm or better.
  • the RF modem 52, which provides the differential GPS datalink to GPS ground station 14, may be a Pacific Crest RFM 96S radio modem, capable of two-way communication at 9600 baud using a carrier frequency of 460 MHz. This system provides approximately a 100 mile radius of coverage with a 15 watt transmitter and omnidirectional datalink antenna 50.
  • the system computer 30 may comprise an industry standard model PCI single board computer, which utilizes a P5 150 MHz processor. I/O functions are handled by a model ATC40 carrier board available from Greenspring, which provides four Industry Pack (IP) board slots for tailoring the I/O functions performed by the board.
  • One IP board is the IP-ADIO available from Greenspring, which provides analog-to-digital, digital-to-analog and discrete digital I/O functions.
  • This IP board receives IMU data and the GPS clock and supplies motor control signals to the motor amplifiers 64 and 66.
  • An IP servo board decodes the motor encoder signals received from roll motor 60 and pitch motor 62.
  • the disk storage unit 32 must have sufficient storage volume and a sufficient data transfer rate to store image data supplied by the frame grabber 28.
  • a nominal time for an imaging survey mission may be 3 hours, with over 2 hours assumed for actual image collection. The remaining time is spent flying to and from the survey area and for turnarounds after completion of each swath. Two hours of imaging will generate a 7.2 gigabyte imagery file at a 1.0 megabyte per second storage rate.
  • the GPS data and IMU data recorded during the mission contributes only 0.4 gigabyte of additional storage, for a total of 7.6 gigabytes.
  • One example of a suitable disk unit is the Seagate Elite-9, having 9 gigabytes of storage and 11 milliseconds access time.
  • the standard SCSI disk drive interface allows storage throughput up to 1.5 megabytes per second.
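The storage figures quoted above follow from simple arithmetic, reproduced here as a sketch (the 1.0 MB/s rate and 0.4 GB navigation overhead are the values given in the text):

```python
def mission_storage_gb(imaging_hours, image_rate_mb_per_s, nav_overhead_gb):
    """Total disk volume needed for one survey mission."""
    image_gb = imaging_hours * 3600.0 * image_rate_mb_per_s / 1000.0
    return image_gb + nav_overhead_gb

# Two hours of imaging at 1.0 MB/s plus 0.4 GB of GPS/IMU data.
total_gb = mission_storage_gb(2.0, 1.0, 0.4)
```

The 7.6 GB total fits within the 9 GB Elite-9 drive, and the 1.0 MB/s collection rate stays under the 1.5 MB/s SCSI throughput.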
  • the imaging system of the present invention utilizes direct digital photography and digital storage of spatially registered imagery.
  • Other airborne video systems use a videotape system as the airborne image storage medium. This allows several hours of imagery to be captured at a 30 Hz image rate. However, videotape does not capture the full resolution or the full dynamic range of the CCD camera systems.
  • All known airborne video systems offer videotape of the surveyed terrain, with frames tagged with GPS positions.
  • the system uses a display to provide steering cues to the pilot to maintain the appropriate flight line.
  • An Accuphoto system provided by Genysis Comm. Inc. is used for this purpose.
  • the Accuphoto system provides a software tool for use in planning the mission, resulting in a software file defining the mission profile in GPS coordinates.
  • the onboard GPS receiver is then used to provide pilot cues via a simple LCD display indicating need for a left/right correction and the magnitude of the correction.
  • mounting the stabilized platform assembly 80 on a cargo door of the aircraft provides a number of advantages in operation of the imaging system.
  • the stabilized platform assembly may be omitted from the imaging system.
  • the cameras are rigidly mounted to the aircraft, and the IMU data is used to compensate the image data for aircraft pitch, roll and yaw.
  • the cameras or other sensors are not necessarily mounted on the aircraft door.
  • the cameras may be mounted in a hole in the floor of the aircraft, in a pod beneath the aircraft or on one of the wings.
  • the ground processing workstation 12 performs survey mission planning and post-mission processing.
  • the ground processing workstation 12 may be implemented using a PC-based graphical workstation and commercially available geographic information system (GIS) tools, such as ArcView available from ESRI.
  • Several mission planning functions are provided by the ground processing workstation. It allows the user to view a digital line graph (map) database, available from commercial sources, depicting the survey area of interest. The user selects the boundary points to define the survey area selected for mission coverage. An aircraft trajectory is computed from a takeoff point to the survey area with sufficient passes over the survey area to provide a high probability of coverage of the selected area at a specified resolution and aircraft flight time. Multiple missions are prescribed where required. The aircraft trajectory is displayed to the user superimposed over a digital map of the area of interest.
  • the aircraft trajectory, defined by waypoints, and the image collection start/stop times are stored on a floppy disk for entry into the airborne data collection system 10.
  • System parameters such as camera setup parameters (frame resolution and angular field of view), aircraft parameters (endurance, velocity and turn radius) and mission descriptors (airport location and percent frame overlap) may be modified.
  • An example of a mission trajectory 250 is illustrated in FIG. 8.
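A serpentine pass layout of the kind shown in FIG. 8 can be sketched as follows. This is a hypothetical illustration only; actual mission planning must also account for turn radius, climb profile and the image collection start/stop points.

```python
import math

def serpentine_waypoints(x0, y0, width, length, spacing):
    """Endpoints of back-and-forth flight lines covering a rectangular area.

    Alternate lines reverse direction so the aircraft turns around at the
    end of each swath; spacing is the distance between adjacent lines.
    """
    n_lines = max(1, math.ceil(width / spacing))
    waypoints = []
    for i in range(n_lines):
        x = x0 + i * spacing
        ends = (y0, y0 + length) if i % 2 == 0 else (y0 + length, y0)
        waypoints.append((x, ends[0]))
        waypoints.append((x, ends[1]))
    return waypoints

# A 1 km x 5 km area with 400 m pass spacing needs three flight lines.
wps = serpentine_waypoints(0.0, 0.0, 1000.0, 5000.0, 400.0)
```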
  • Post-mission processing functions of the ground processing workstation 12 include extracting data from the disk storage unit 32 of the airborne data collection system 10, registering the individual images onto a geodetic reference frame, and combining the images into a contiguous imagery file stored in a standard GIS format.
  • the post-processing functions use GIS tools that are similar to those used for mission planning.
  • the mass storage media in the ground processing workstation 12 is compatible with the disk storage unit 32 in the airborne data collection system 10. Files in the disk storage unit 32 may be copied to the mass storage system in the ground processing workstation 12.
  • the ground processing workstation accesses the stored files and registers each file individually into the ground plane. Multiple neighboring individual images from a single mission can be overlaid onto a common geodetic grid.
  • All or a selected subset of the image frames may be mosaiced onto a geodetic reference grid.
  • GIS tools may be used to scroll, zoom and perform measurements on the mosaiced imagery.
  • Imagery operations may be performed individually or on weighted combinations of the images from the three cameras.
  • the end user makes arrangements to lease an aircraft and performs mission planning.
  • using the ground processing workstation 12, the user lays out the survey area to be imaged by selecting boundary points on a digital map.
  • the ground processing workstation considers the endurance and turning properties of the aircraft to be used, the base airport location and camera parameters. This allows the automatic design of a three-dimensional trajectory for the aircraft, with image collection points selected for ideal coverage of the survey area.
  • the trajectory, or multiple mission trajectories, is presented to the user for approval. Higher resolution requirements require more passes and possibly additional missions.
  • a three-hour mission with two hours of the flight time collecting image data provides over 20,000 acres (32 square miles) of image coverage at a one foot image resolution.
  • the user obtains from the ground processing workstation a floppy disk that contains the digital specifications of the trajectory (X, Y, Z position versus time) and the image collection points.
  • the airborne data collection system is installed on the aircraft, and a checkout of all subsystems is performed automatically. Upon valid checkout, the aircraft is ready to begin the mission.
  • the pilot display leads the pilot through the mission from takeoff to landing, although the pilot can exit and re-enter the trajectory waypoint files at any time, if desired.
  • the display also provides the current status of the mission, for example, flight legs completed, time to next turn, loss of GPS lock, or the occurrence of any anomalies which might result in loss of data.
  • image data from the cameras 20, 22 and 24 is stored on the disk storage unit 32.
  • the stabilized platform assembly maintains the cameras in a vertical orientation as described above.
  • GPS data, representative of position of the cameras, and IMU data, representative of attitude of the cameras, is simultaneously stored on the disk storage unit 32.
  • Each image frame has corresponding GPS data and IMU data, so that the image data may be spatially registered with high accuracy.
  • the airborne data collection system may be deinstalled from the aircraft.
  • the electronics unit 84 can be connected to the ground processing workstation 12, so that the stored data may be transferred to the ground processing workstation.
  • the individual image frames are transferred to the storage media in the ground processing workstation and are registered in the ground plane using the GPS data and IMU data stored with the images.
  • the image data can immediately be registered because position coordinates for each image pixel are known. This allows preparation of a contiguous registered image of the survey area and review of this image on the workstation using standard GIS tools. Maps showing roads, cultural features, hydrology and the like can easily be overlaid on the image. The user can now use the image in any desired manner.
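Registering a pixel into the ground plane from the stored GPS position and IMU attitude can be sketched with a pinhole projection. This is a minimal flat-earth illustration, not the patent's actual registration algorithm; the Euler convention, focal length in pixels and the level-camera test case are assumptions made here for clarity.

```python
import math

def euler_to_dcm(roll, pitch, yaw):
    """Direction cosine matrix from camera axes to local North-East-Down,
    using the aerospace Z-Y-X (yaw, pitch, roll) rotation sequence."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp,     cp * sr,                cp * cr],
    ]

def pixel_to_ground(px, py, cx, cy, focal_px, cam_ned, roll, pitch, yaw):
    """Intersect the ray through pixel (px, py) with the ground plane (down = 0).

    cam_ned is the camera position (north, east, down); down is negative
    above the ground. A nadir-pointing camera has roll = pitch = 0.
    """
    # Ray in camera axes: +z points out of the lens toward the ground.
    ray_cam = [(px - cx) / focal_px, (py - cy) / focal_px, 1.0]
    C = euler_to_dcm(roll, pitch, yaw)
    ray_ned = [sum(C[i][j] * ray_cam[j] for j in range(3)) for i in range(3)]
    # Scale the ray so it descends from the camera altitude to down = 0.
    t = -cam_ned[2] / ray_ned[2]
    return (cam_ned[0] + t * ray_ned[0], cam_ned[1] + t * ray_ned[1])

# The center pixel of a level camera at 500 m maps directly below the aircraft.
n, e = pixel_to_ground(320, 240, 320, 240, 1600.0, (0.0, 0.0, -500.0), 0.0, 0.0, 0.0)
```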
  • an airborne imaging service may be established to support a higher volume operation.
  • the service organization may offer mission planning support to users or may accept mission description disks from users who operate their own ground processing workstations. For example, an individual farmer or chemical supplier may request once per week imagery of his acreage to precisely time harvest and/or chemical application for maximum yield.
  • the software in the system computer 30 of the airborne data collection system 10 is required to perform the following functions.
  • the IMU data is processed to provide a strapdown navigation solution propagating the position, velocity and attitude of the camera axes.
  • the strapdown navigation solution is combined with the GPS velocity data to provide a transfer alignment resulting in the attitude of the camera boresight relative to North, East and down.
  • the three CCD cameras are commanded to obtain imagery in a synchronous manner with the GPS data and the IMU data.
  • Precise GPS timing is used to synchronize all data collection functions.
  • a trajectory manager monitors the current aircraft position relative to the desired trajectory and provides commands to the pilot indicating the degree of error in horizontal and vertical planes.
  • the two-axis stabilization system uses the measured camera boresight attitude, IMU rotation rates and motor encoder values to control the camera axes to point in a commanded direction, nominally down.
  • the image frame data, GPS data, IMU data, attitude solution and system health status are logged on the disk storage system.
  • A strapdown navigation routine 200 propagates the position, velocity and attitude of the IMU coordinate axes forward in time at a data rate of 100 Hz using digital measurements of change in velocity and change in angle.
  • the inputs to the routine 200 are the IMU data samples at a 100 Hz rate and initialization values for the IMU attitude. Additionally, attitude error values are input from the transfer alignment Kalman filter 202.
  • the outputs are (1) latitude, longitude and altitude, (2) North, East, down components of velocity, and (3) attitude Euler angles (roll, pitch and yaw) relating the IMU axes to the North, East and down axes.
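The attitude portion of a strapdown propagation step can be sketched as a quaternion update driven by the IMU delta-angle samples. This is a textbook single-axis-exact update, not the patent's specific mechanization; position and velocity propagation and earth-rate corrections are omitted.

```python
import math

def quat_mult(q, r):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = r
    return (
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    )

def propagate_attitude(q, delta_angle):
    """Advance the body-to-NED attitude quaternion by one IMU delta-angle
    sample (radians), the core of a 100 Hz strapdown attitude update."""
    dx, dy, dz = delta_angle
    mag = math.sqrt(dx * dx + dy * dy + dz * dz)
    if mag < 1e-12:
        return q
    s = math.sin(mag / 2.0) / mag
    dq = (math.cos(mag / 2.0), dx * s, dy * s, dz * s)
    return quat_mult(q, dq)

# Integrate a 90-degree roll spread over 1 s of 100 Hz delta-angle samples.
q = (1.0, 0.0, 0.0, 0.0)
step = (math.pi / 2.0 / 100.0, 0.0, 0.0)
for _ in range(100):
    q = propagate_attitude(q, step)
```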
  • the transfer alignment Kalman filter 202 merges the GPS velocity measurement and the strapdown navigation routine output to produce an estimate ofthe error in the IMU axes attitude 15 computation.
  • Inputs include the GPS velocity measurements and the strapdown navigation solution synchronized in time.
  • the lever arm (in aircraft body axes) from the GPS antenna phase center to the IMU location is required.
  • an IMU error model 204 representing the statistics of the IMU errors is also required.
  • the transfer alignment process utilizes a Kalman filter formulation based upon the IMU error model.
  • the outputs include the IMU attitude errors, which are supplied to the strapdown navigation routine 200 as corrections. Errors in the IMU gyro and accelerometer instruments are logged to assess the IMU in-flight performance.
  • a sensor boresight stabilization module 208 processes IMU attitude rates, motor encoder values and IMU attitude data to control the pitch and roll motors so as to properly point the camera boresight to the desired direction.
  • Inputs include motor encoder values.
  • the stabilization module includes two linear control systems identical in structure but having different gains to accommodate the different inertias presented to the motors. Stabilization is performed at a 100 Hz data rate synchronized with the IMU interrupt.
  • a conventional proportional-integral-derivative (PID) control is used. The proportional term comes from the pitch or roll attitude errors, and the derivative terms come from the IMU rate gyro measurements.
  • Coordinate transformations must be applied to both the Euler angles and the rotation rates to account for the specific Euler angle set mechanized by the gimbal.
  • the control gains are selected by a knowledge of the various control inertias, motor gain, and the amplifier gain and the selected bandwidth of the control loops.
  • the pitch and roll bandwidths are selected at 10 Hz.
  • the motor encoder data is not normally used within the control loop. However, this data is used to determine the orientation of the IMU axes relative to the aircraft for lever arm calculations and to determine the proximity of the gimbals to their mechanical limits.
  • An image projection module 210 manages camera image frame collection and buffer storage.
  • Inputs include frame time synchronization from the IMU 100 Hz interrupt, image memory addresses, kinematic GPS position and attitude angles from the strapdown navigation routine 200.
  • the image synchronization is controlled by the IMU interrupts at 100 Hz with nominal frame rates of 1 to 2 Hz, i.e. 50 to 100 IMU samples between frame collection events.
  • the frame collection commands use Mu-tech routines which provide frame triggering and synchronization of the three cameras.
  • the outputs are memory mapped image frames for the three CCD cameras.
  • the real-time software consists of three modules: 1) the strapdown navigation routine 200, 2) the transfer alignment routine 202, and 3) the gimbal command routine.
  • the strapdown navigation routine 200 consists of integration of 6-degree-of-freedom equations with a body-fixed coordinate system.
  • the body coordinate system has the z-axis fixed to the camera boresight axis, the x-axis nominally pointed forward and the y-axis to the right of the motion.
  • the internal coordinate system used for navigation is the Earth-Centered-Earth-Fixed (ECEF) system. Accelerations and rotation rates are integrated from the initial assumptions of attitude and using the GPS measured velocity components.
  • the equations consider a WGS-84 datum for all computations for compliance with GPS.
  • the strapdown computations are performed at a 100 Hz rate, which coincides with the availability of the IMU data.
  • the transfer alignment routine implements a 22-state Kalman filter with stages of covariance propagation and state/covariance update at each measurement.
  • the filter states include velocity errors, attitude errors, accelerometer biases, gyro biases, accelerometer scale factor errors and gyro scale factor errors, with each of these error terms containing x, y and z components (18 individual terms).
  • An additional state is used to represent the time latency between the GPS and IMU measurement devices.
  • Three additional states are used to represent the time-integral of the velocity (average velocity) over a 200 msec window prior to each GPS one pulse-per-second (1PPS) time point. This velocity average is used to model exactly the functioning of the specific GPS receiver used in the preferred embodiment.
  • the Kalman filter propagates elements of the covariance matrix and state between each 1PPS GPS time and performs a covariance and state update at each GPS time.
  • the resulting attitude errors and IMU instrument errors are fed back into the strapdown navigation routine 200 to act as a continuous source of calibration. This allows use of small, low-cost IMU devices which are currently being manufactured by several vendors.
  • the pitch and roll motors are controlled by a conventional proportional, integral, derivative (PID) controller implemented in the gimbal command routine.
  • the rate gyros from the IMU provide the necessary rate feedback and the transfer alignment, coupled with the strapdown navigation routine, provides the position feedback.
  • the three PID gains are derived from knowledge of the inertias and desired bandwidth of the closed loop system.
  • This pointing system differs from conventional systems in that the IMU used for navigation is placed on the inner gimbal of the stabilized platform, which is also directly attached to the camera/sensor package. This is enabled by availability of small, low-cost IMU components, direct drive servo motors and camera sensors.
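The interaction between the strapdown navigation routine 200 and the transfer alignment described above can be illustrated with a highly simplified sketch. The Python fragment below is not the patent's implementation: it substitutes a first-order direction-cosine update for a full mechanization, a fixed-gain velocity-matching correction for the 22-state Kalman filter, and a flat-earth North-East-Down frame for ECEF/WGS-84; all names and gains are hypothetical.

```python
import numpy as np

DT = 0.01                            # 100 Hz IMU sample interval, as in the patent
G_NED = np.array([0.0, 0.0, 9.81])   # gravity in North-East-Down axes, m/s^2

def skew(w):
    """Skew-symmetric matrix so that skew(w) @ v == np.cross(w, v)."""
    x, y, z = w
    return np.array([[0.0, -z,   y],
                     [z,   0.0, -x],
                     [-y,  x,   0.0]])

class Strapdown:
    """Toy strapdown propagator for attitude (body-to-NED DCM) and velocity."""
    def __init__(self, C_bn=None, vel=None):
        self.C_bn = np.eye(3) if C_bn is None else C_bn
        self.vel = np.zeros(3) if vel is None else np.array(vel, float)

    def propagate(self, dtheta, dv):
        """Apply one IMU sample: delta-angle (rad) and delta-velocity (m/s), body axes."""
        self.C_bn = self.C_bn @ (np.eye(3) + skew(dtheta))  # first-order DCM update
        self.vel = self.vel + self.C_bn @ dv + G_NED * DT

def transfer_align(nav, gps_vel, gain=0.2):
    """Fixed-gain velocity matching: a crude stand-in for the 22-state filter.
    Feeds a fraction of the GPS-minus-inertial velocity residual back as a
    correction; the real filter would also estimate attitude errors,
    instrument biases, scale factors and GPS/IMU time latency."""
    residual = np.array(gps_vel, float) - nav.vel
    nav.vel = nav.vel + gain * residual
    return residual
```

In steady level flight the accelerometer delta-velocity cancels the gravity term, so the propagated velocity holds constant between the 1PPS correction points, mirroring the propagate/update cycle the text describes.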

Abstract

A remote data collection system, which may be used in a vehicle such as an aircraft or a ground vehicle, includes a directional sensor, such as one or more cameras, for sensing a characteristic of interest and providing sensor data. The system further includes a global positioning system (GPS) receiver for providing GPS data representative of the position of the sensor, an inertial measurement unit (IMU) for providing IMU data representative of the attitude of the sensor, a processing unit and a storage unit. The processing unit determines geographic data referenced to the sensor data in response to the GPS data and the IMU data. The processing unit may utilize an error model to determine IMU errors which may be used in determining the geographic data with high accuracy. The sensor data and the geographic data are stored in the data storage unit for subsequent use. The system may include a stabilized platform on which the sensor and the IMU are mounted. The stabilized platform is rotated about at least one axis of rotation to control the sensing direction of the sensor as the vehicle moves.

Description

AIRBORNE IMAGING SYSTEM
Field of the Invention
This invention relates to vehicle-mounted sensing systems and, more particularly, to high resolution, low cost sensing systems which operate on a moving vehicle. The invention is particularly useful for airborne imaging, but is not limited to this use.
Background of the Invention
The remote sensing market originated with early satellites and goals of global monitoring of terrestrial activities. However, the cost of data and the response times for obtaining data have limited the broad applicability of imagery for use in day-to-day business operations. Nevertheless, for relatively small scale users, such as farmers, city planners, utilities managers and forest managers, the computerized geographic information system is recognized as the information integration tool of the future. Geographic information systems include computer tools for locating geographic coordinates of points within images, for overlaying maps and images, and for making quantitative measurements, such as areas, distances and precise locations of objects, from the images. Applications as diverse as (1) a farmer selecting chemical application strategies based upon expected crop yield predictions, (2) a tax assessor directing a manual inspection of an observed dwelling addition, or (3) assessing timber yields and timber harvest costs from multiple property tracts are but a few of the numerous potential applications of imagery and geographic information systems.
The profit structure of the agricultural industry is heavily dominated by the use of chemicals (fertilizers, pesticides and herbicides), with a trend toward the use of more chemicals. However, since the EPA and world environmental organizations recognize long term hazards of unabated chemical treatments, chemical regulations are throttling the agricultural business. Of equal significance, as populations increase, more land is cleared for farming, and reduced food yields per acre lead to higher deforestation and even greater threats to the environment. The recognized answer to these global scale problems is metered usage of chemicals, such that the chemicals are used where they are of maximum benefit. Metered chemical distribution systems are in wide use. However, the data to define the spatial metering values are lacking. Multispectral imagery with sub-meter resolution and spatial registration is required. The imagery must be available at low cost and with short response times to requests for imagery. A satellite system known as the SPOT satellite, sponsored by the government of France, is representative of current operational satellite capabilities. This system provides 10 meter resolution panchromatic imagery or 20 meter resolution imagery in the visible/near infrared bands. Geodetic registration is accurate to 15 meters in the U.S., where ground control points are plentiful and well surveyed. Experience has shown that the response time for imagery requests is usually no better than 10 days. The drawbacks to satellite imagery include the potential for cloud coverage (% of the earth is cloud covered), high costs and the inflexibility of satellite imagery collection systems. Custom tailored resolutions, spectral bands and responsiveness cannot be provided by satellite systems. Airborne imagery is commonly used by many small scale users. Benefits of airborne imagery collection include tailorable resolution, response time and data processing methodology.
Drawbacks include higher costs and the need to contract for a dedicated aircraft, flight crew and post-mission processing system.
A variety of airborne sensing and survey systems have been disclosed in the prior art. A survey system for obtaining geophysical data with aircraft using real time differential operation of the global positioning system is disclosed in U.S. Pat. No. 4,814,711 issued March 21, 1989 to Olsen et al. An airborne system using two color video cameras and an IR imager head mounted below the fuselage of an aircraft is disclosed in U.S. Pat. No. 5,166,789 issued November 24, 1992 to Myrick. Latitude and longitude information obtained from the global positioning system is recorded on each image frame. A system including a CCD camera and a global positioning system receiver for recording an image signal and position information on magnetic tape is disclosed in U.S. Pat. No. 5,267,042 issued November 30, 1993 to Tsuchiya et al. A technique for airborne imaging wherein multiple overlapping images are superimposed by observing a stationary object that appears in adjacent images is disclosed in U.S. Pat. No. 5,247,356 issued September 21, 1993 to Ciampa. A technique for generating high resolution images from a CCD camera in an aircraft is disclosed in U.S. Pat. No. 5,251,037 issued October 5, 1993 to Busenberg. A technique for generating high resolution vidicon aerial images is disclosed in U.S. Pat. No. 5,270,756 issued December 14, 1993 to Busenberg. An airborne contour mapping system is disclosed in U.S. Pat. No. 3,191,170 issued June 22, 1965 to Lustig et al. A technique for remote sensing using inertial navigation systems and the global positioning system for georeferencing of remotely sensed data is described by K.P. Schwarz et al. in Photogrammetric Engineering & Remote Sensing, Vol. 59, No. 11, November 1993, pp. 1667-1674. The examples above are characterized by components integrated tightly to the aircraft so that a dedicated aircraft is required.
None of the prior art airborne imaging systems have been practical from the viewpoint of small scale users with respect to cost, resolution, flexibility and response time.
Summary of the Invention
According to the present invention, methods and apparatus for remote data collection are provided. The invention is used in a vehicle such as an aircraft or a ground vehicle. A remote data collection system in accordance with the invention comprises a directional sensor for sensing a characteristic of interest and providing sensor data representative of the characteristic of interest, a global positioning system (GPS) receiver for providing GPS data representative of position and velocity of the sensor, an inertial measurement unit (IMU) for providing IMU data representative of attitude rate and acceleration of the sensor, a processing unit and a data storage unit. The processing unit determines IMU errors in response to the IMU data and an IMU error model. The processing unit also determines geographic data referenced to the sensor data in response to the GPS data, the IMU data and the IMU errors. The sensor data and the geographic data are stored in the data storage unit for subsequent use. The sensor may comprise one or more cameras, and the sensor data may represent images.
According to another aspect of the invention, a remote data collection system for use in a vehicle comprises a stabilized platform, a directional sensor rigidly mounted to the stabilized platform for sensing a characteristic of interest and providing sensor data representative of the characteristic of interest, an inertial measurement unit (IMU) rigidly mounted to the stabilized platform for providing IMU data representative of attitude of the sensor, a control unit responsive to the IMU data for rotating the stabilized platform about at least one axis of rotation with respect to the vehicle to control the sensing direction of the sensor as the vehicle moves, a global positioning system (GPS) receiver for providing GPS data representative of position of the sensor, a processing system responsive to the GPS data and the IMU data for determining geographic data referenced to the sensor data, and a data storage unit for storing the sensor data and the geographic data for subsequent use. The control unit may comprise a gimbal, a support member rigidly mounted to the vehicle, a first motor connected between the support member and the gimbal for rotating the gimbal about a first axis relative to the support member and a second motor connected between the gimbal and the stabilized platform for rotating the stabilized platform about a second axis relative to the gimbal. The direction of the sensor is stabilized with respect to the first and second axes as the vehicle moves. In a preferred embodiment, the vehicle comprises an aircraft, and the control unit stabilizes the sensing direction of the sensor in a vertical orientation with respect to pitch and roll of the aircraft.
All facets of the invention are optimized for low operational cost and rapid response. The sensor and the IMU are integrated into a readily detachable vehicle component (e.g. cargo door) so that no modifications to the vehicle are required. The IMU used for sensor stabilization is also used for attitude determination and navigation, thus reducing cost. A mission planning/vehicle steering command system is included to allow one-man operation for cost reduction and operational ease of use.
Brief Description of the Drawings
For a better understanding of the present invention, reference is made to the accompanying drawings, which are incorporated herein by reference and in which:
FIG. 1 is a block diagram of an embodiment of an airborne imaging system in accordance with the present invention;
FIG. 2 is a pictorial diagram showing the components of the airborne imaging system of FIG. 1;
FIG. 3 is a pictorial diagram illustrating the airborne imaging system of FIG. 1 installed on an aircraft;
FIG. 4 is a top schematic view showing the stabilized platform assembly mounted on the door of an aircraft;
FIG. 5 is a top view showing the stabilized platform assembly in more detail;
FIG. 6 is an exploded view of the stabilized platform assembly;
FIG. 7 is a software flow diagram that illustrates operation of the airborne imaging system; and
FIG. 8 is a pictorial diagram that illustrates an example of an airborne survey mission.
Detailed Description
A block diagram of an embodiment of an imaging system in accordance with the invention is shown in FIG. 1. The imaging system includes an airborne data collection system 10 and a ground processing workstation 12, and makes use of a GPS ground station 14. The airborne data collection system 10 is typically mounted in an aircraft and is used for obtaining images of a prescribed survey area of the earth. For example, the imaging system may be used to obtain images of a prescribed agricultural area or forest area. As described in detail below, the ground processing workstation 12 is used to define a trajectory to be followed by the aircraft in order to obtain images of the survey area with complete coverage. The ground processing workstation 12 may also be used for post-mission processing of image data and for administrative functions.
The airborne data collection system 10, in general, includes a directional sensor for generating sensor data, a global positioning system (GPS) receiver for providing GPS data representative of the position of the sensor, an inertial measurement unit (IMU) for providing IMU data representative of the attitude of the sensor, a processing unit responsive to the GPS data and the IMU data for providing geographic data referenced to the sensor data, and a data storage unit for storing the sensor data and the geographic data. The geographic data establishes the ground coordinates of the sensor data with high resolution. The airborne data collection system 10 may include a stabilized platform assembly for stabilizing the direction of the sensor during aircraft flight. The data collection system of the invention may be used in ground vehicles as well as in aircraft.
Referring again to FIG. 1, an embodiment of the airborne data collection system 10 includes cameras 20, 22 and 24. The cameras 20, 22 and 24 may, for example, be charge coupled device (CCD) cameras with filters having different spectral responses. The cameras 20, 22 and 24 supply image data and synchronization signals to a frame grabber 28. The frame grabber 28 supplies image data representative of individual images obtained by each of the cameras to a system computer 30. A disk storage unit 32 connected to the system computer 30 is used for storage of image data and geographic data. The airborne data collection system 10 further includes an inertial measurement unit 40 that provides IMU data to the system computer 30 through an IMU interface 42. The inertial measurement unit 40 is rigidly mechanically connected to cameras 20, 22 and 24 and typically senses acceleration and rotation rate with respect to three coordinate axes. A GPS receiver 46 receives positioning signals from GPS satellites through a GPS antenna 48. The GPS receiver 46 also receives positioning signals from the GPS ground station 14 through a data link antenna 50 and an RF modem 52. The GPS receiver 46 supplies GPS data and a GPS clock to the system computer 30. As known in the art, the GPS data accurately represents the position of GPS receiver 46 and therefore represents the position of cameras 20, 22 and 24. The system uses differential GPS for steering the vehicle to accuracies of 1-3 m. A kinematic GPS processing procedure is applied post-mission to allow determination of position accuracy to the 10 cm level.
When the airborne data collection system 10 includes a stabilized platform assembly, at least one stabilizing motor is provided. In the example of FIG. 1, a roll motor 60 stabilizes the cameras 20, 22 and 24 with respect to aircraft roll, and a pitch motor 62 stabilizes cameras 20, 22 and 24 with respect to aircraft pitch. Roll motor 60 is energized by system computer 30 through a motor amplifier 64, and pitch motor 62 is energized by system computer 30 through a motor amplifier 66. Each of the motors 60 and 62 includes an encoder which provides to system computer 30 a signal representative of motor angle with respect to a reference angle. Operation of the stabilized platform assembly is described in detail below. A power supply 70 receives aircraft power, typically 28 volts, and supplies operating power to the components of the airborne data collection system 10. A graphical display 72 is connected to system computer 30. As described below, the display 72 provides commands to the pilot when the aircraft deviates from a preplanned trajectory over the survey area.
FIG. 2 is a pictorial diagram that illustrates a preferred configuration of the airborne data collection system 10. A stabilized platform assembly includes cameras 20, 22 and 24, IMU 40, motors 60 and 62 and additional components described below. In a preferred embodiment, the stabilized platform assembly 80 is mounted to a cargo door 82 of an aircraft (not shown in FIG. 2). The airborne data collection system 10 further includes an electronics unit 84. The electronics unit 84 includes system computer 30, disk storage unit 32, GPS receiver 46, RF modem 52, frame grabber 28, IMU interface 42, motor amplifiers 64 and 66, power supply 70 and display 72. The electronics unit 84 is interconnected to stabilized platform assembly 80 by a cable 86. FIG. 3 is a pictorial diagram illustrating installation of the airborne data collection system of the present invention in an aircraft 90. As indicated above, stabilized platform assembly 80 is preferably mounted on cargo door 82. Electronics unit 84 is positioned in the cargo area of the aircraft 90. GPS antenna 48 may be mounted on the upper surface of the aircraft, and data link antenna 50 may be mounted on the underside of the aircraft. The pilot display 76 is positioned for convenient viewing by the pilot.
FIG. 4 is a pictorial diagram of the stabilized platform 80 mounted on cargo door 82. FIG. 5 shows the stabilized platform assembly in more detail, and FIG. 6 shows an exploded view of the stabilized platform assembly. Cameras 20, 22 and 24 are rigidly mounted to a stabilized platform 100. The stabilized platform 100 extends through an opening in cargo door 82 such that cameras 20, 22 and 24 are located externally of the aircraft. The opening in cargo door 82 provides sufficient clearance to permit movement of stabilized platform 100 with respect to the aircraft. In a preferred embodiment, the opening is a vertically-oriented slot to permit pitch and roll stabilization of the cameras. The cameras 20, 22 and 24 are protected by a cowling 102 that is open at the bottom. The IMU 40 is rigidly mounted to a portion of stabilized platform 100 within the aircraft. A support member 102 is mounted to an inside surface of cargo door 82, and roll motor 60 is secured to an inwardly-extending portion of support member 102. The shaft of roll motor 60 is connected to a gimbal 106. Pitch motor 62 is mounted to gimbal 106, and the shaft of pitch motor 62 is secured to stabilized platform 100. As shown in FIGS. 5 and 6, cameras 20, 22 and 24 are mounted in a camera housing 120, which is attached to stabilized platform 100. Stabilized platform 100 is connected to gimbal 106 by a pivot pin 122 and is connected to the shaft of pitch motor 62 by a flange 124. Gimbal 106 is connected to the shaft of roll motor 60 by a flange 126.
When the roll motor 60 is energized, cameras 20, 22 and 24, stabilized platform 100, IMU 40, gimbal 106 and pitch motor 62 are rotated with respect to a roll axis 110. When the pitch motor 62 is energized, cameras 20, 22 and 24, stabilized platform 100 and IMU 40 are rotated with respect to a pitch axis 112. The nominal direction of flight of the aircraft is indicated in FIG. 4 by arrow 114.
In an example of the stabilized platform assembly 80, the roll motor 60 was a Hathaway type HT03802 brushless DC motor, and the pitch motor 62 was a Hathaway type HT02301 brushless DC motor. The motor amplifiers 64 and 66 were Hathaway type BLC048 motor amplifiers. In operation, the IMU 40 senses changes in velocity and angle in three coordinate directions.
Since cameras 20, 22 and 24 and IMU 40 are rigidly connected to stabilized platform 100, velocity changes and angle changes sensed by IMU 40 represent velocity and angle changes of cameras 20, 22 and 24. IMU data representative of the velocity and angle changes is supplied to system computer 30. The system computer 30 uses the angle changes to determine deviations of the attitude of the cameras from a desired attitude. These deviations are used to generate error signals which are supplied through motor amplifiers 64 and 66 to roll motor 60 and pitch motor 62, respectively. The roll motor 60 rotates cameras 20, 22 and 24 with respect to roll axis 110, and pitch motor 62 rotates cameras 20, 22 and 24 with respect to pitch axis 112, so as to maintain a desired attitude. Typically, the cameras 20, 22 and 24 are maintained in a vertical attitude with respect to the earth's surface. However, other boresight directions may be utilized. Furthermore, the cameras may be scanned, for example, with respect to the roll axis 110 to obtain images of a wider strip on each pass over the survey area.
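The control loop just described — attitude-error feedback driving the motors, with rate feedback taken directly from the IMU gyros — can be sketched in a few lines. The Python fragment below is an illustrative stand-in, not the patent's implementation; the inertia, the gain values and the roughly 2 Hz bandwidth used here (the patent selects 10 Hz) are hypothetical choices made so the toy simulation is well behaved.

```python
class AxisPID:
    """PID control for one gimbal axis. The proportional and integral terms
    act on the attitude error; the derivative term is the IMU rate-gyro
    measurement fed back directly, as the text describes."""
    def __init__(self, kp, ki, kd, dt=0.01):  # dt = 100 Hz loop, per the patent
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0

    def command(self, attitude_error, gyro_rate):
        self.integral += attitude_error * self.dt
        # Rate feedback from the gyro avoids differentiating a noisy,
        # quantized attitude signal.
        return (self.kp * attitude_error
                + self.ki * self.integral
                - self.kd * gyro_rate)

def simulate(pid, inertia=0.05, steps=1000):
    """Drive a pure-inertia axis from a 0.1 rad pointing error toward zero."""
    angle, rate = 0.1, 0.0
    for _ in range(steps):
        torque = pid.command(0.0 - angle, rate)  # commanded direction is zero
        rate += (torque / inertia) * pid.dt
        angle += rate * pid.dt
    return angle
```

The gains here (kp ≈ 7.9, kd ≈ 0.88 for a 0.05 kg·m² inertia) correspond to about a 2 Hz bandwidth with 0.7 damping; in the real system they would be derived, as the text states, from the measured inertias, the motor and amplifier gains, and the selected bandwidth.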
The stabilized platform assembly 80 provides advantageous operation of the imaging system. Because the cameras 20, 22 and 24 are stabilized, typically in a vertical orientation, the spacing between adjacent aircraft passes over the survey area can be increased without risking loss of coverage between images in adjacent passes. This is possible because it is not necessary to account for inadvertent aircraft roll in determining the spacing between passes. By increasing the spacing between adjacent passes of the aircraft trajectory, the time and cost for completing a given survey is reduced. In an alternative approach, the cameras are scanned with respect to the roll axis at a rate relative to the aircraft speed which permits imaging of a wider strip than is possible with stationary cameras. The stabilized platform assembly 80, shown in FIGS. 4-6 and described above, provides stabilization with respect to the pitch and roll axes of the aircraft. In another configuration, the stabilized platform assembly is simplified to provide stabilization with respect to the roll axis only. In this configuration, the shaft of roll motor 60 may be connected directly to stabilized platform 100 so as to rotate cameras 20, 22 and 24 with respect to roll axis 110. It is believed that stabilization of the cameras 20, 22 and 24 with respect to the yaw axis (perpendicular to axes 110 and 112) would not provide substantial benefits in the operation of the imaging system.
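The coverage argument above can be quantified with simple geometry. The sketch below assumes a hypothetical 2500 ft altitude and a ±5° uncompensated-roll budget — neither figure comes from the patent — together with a 640-pixel, one-foot-per-pixel cross-track swath consistent with the camera described in the text.

```python
import math

PIXELS_CROSS_TRACK = 640     # CCD horizontal resolution (from the text)
GSD_FT = 1.0                 # one foot per pixel ground sample distance
ALTITUDE_FT = 2500.0         # hypothetical flight altitude
ROLL_BUDGET_DEG = 5.0        # hypothetical uncompensated roll excursion

swath_ft = PIXELS_CROSS_TRACK * GSD_FT   # 640 ft imaged cross-track per pass

# Without stabilization, a +/-5 deg roll can displace the footprint sideways
# by about altitude * tan(roll) in either direction, so adjacent passes must
# be squeezed together by twice that margin to guarantee overlap.
shift_ft = ALTITUDE_FT * math.tan(math.radians(ROLL_BUDGET_DEG))
spacing_unstabilized_ft = swath_ft - 2.0 * shift_ft
spacing_stabilized_ft = swath_ft         # ignoring a small deliberate overlap

passes_ratio = spacing_stabilized_ft / spacing_unstabilized_ft
```

Under these assumed numbers the roll margin consumes over two thirds of the swath, so the unstabilized survey needs roughly three times as many passes — which is the time-and-cost saving the paragraph above claims for stabilization.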
Mounting of the stabilized platform assembly 80 on the aircraft door provides significant practical advantages in operation of the imaging system. In general, it is desired to install the imaging system of the present invention in arbitrary aircraft. One alternative is to install the cameras in a hole cut in the floor of the aircraft. However, this requires a special modification to the aircraft and requires certification of the installation by the FAA. Such a hole is unlikely to be acceptable to many aircraft owners. Wing mounting of the camera assembly is undesirable for similar reasons. Mounting of the stabilized platform assembly on the cargo door provides an attractive solution. Cessna 172 aircraft, for example, have a cargo door that is easily removable. Other small, four-passenger commercial aircraft have a similar cargo door which may be modified for installation of the stabilized platform assembly. A 6x6 inch hole is cut in the lower interior portion of the dual wall aluminum door structure. The support member 102, having a box structure, is used to carry torque from the pitch and roll motors. A torque transfer stiffener 130 (FIG. 6), 14 inches in length, is part of the support member 102 and transfers the roll motor torque in the vertical plane of the door. A vertically-oriented slot is cut in the door to allow the camera support portion of the stabilized platform 100 to pass through the door to the exterior of the aircraft. When the imaging system of the invention is to be used in an aircraft, the standard cargo door is removed and is replaced with a cargo door having a preinstalled stabilized platform assembly. The electronics unit 84 is placed in the cargo area of the aircraft.
The cameras used in the imaging system may include three compact monochrome CCD cameras. Such cameras are available from numerous suppliers. A preferred camera is the Sony XC-7500, which provides 640 x 480 pixel resolution in non-interlace (progressive scan) mode. The cameras typically use a 16 mm lens with an f-stop of 2.8. Different filters can be utilized in the camera lens to provide different spectral responses. For example, red, green, blue and near infrared filters may be utilized to obtain different information regarding the survey area. A color image can be formed by using red, green and blue filters. The frame grabber may be a Mu-tech model M-1000 which allows access to up to 4 cameras simultaneously.
The imaging system has been described thus far with reference to a configuration utilizing three cameras. It will be understood that any number of cameras may be utilized. More generally, any sensor having a boresight direction for sensing may be utilized for data collection. Thus, for example, the sensor may be a laser system, an atmospheric pollution sensor, a thermal camera, a radar system or any other suitable sensor.
The IMU may be a Honeywell H-1700 system, which is characterized by 10° per hour gyro accuracy. While higher accuracy IMUs are available, the cost is also higher. In order to utilize a low-cost IMU with moderate accuracy, an error model of the IMU is utilized as described below. The GPS receiver 46 may comprise an eight-channel Motorola Encore airborne unit, and the GPS ground station 14 may comprise an eight-channel Motorola Encore differential GPS base station. The Motorola Encore is a C/A code unit with capability for using differential corrections transmitted by the base station. The GPS receiver 46 is connected by a coaxial cable to GPS antenna 48, installed on the upper surface of the aircraft. A true kinematic GPS system is a preferred implementation to achieve accuracies of 10 cm or better.
The RF modem 52, which provides the differential GPS datalink to GPS ground station 14, may be a Pacific Crest RFM 96S radio modem, capable of two-way communication at 9600 baud using a carrier frequency of 460 MHz. This system provides approximately a 100 mile radius of coverage with a 15 watt transmitter and omnidirectional datalink antenna 50.
The system computer 30 may comprise an industry standard model PCI single board computer, which utilizes a P5 150 MHz processor. I/O functions are handled by a model ATC40 carrier board available from Greenspring, which provides four Industry Pack (IP) board slots for tailoring the I/O functions performed by the board. One IP board is the IP-ADIO available from Greenspring, which provides analog-to-digital, digital-to-analog and discrete digital I/O functions. This IP board receives IMU data and the GPS clock and supplies motor control signals to the motor amplifiers 64 and 66. An IP servo board decodes the motor encoder signals received from roll motor 60 and pitch motor 62.
The disk storage unit 32 must have sufficient storage volume and a sufficient data transfer rate to store image data supplied by the frame grabber 28. Consider an airborne mission requiring one foot per pixel image resolution and an aircraft speed of 100 knots at one frame per second. The 640 x 480 pixel single camera image frame provides a 65% overlap between images (480 pixel dimension along direction of motion). At one byte per pixel for each of the three cameras, a data storage rate of 0.92 megabyte per image, or 0.92 megabyte per second, is required. A conservative 1.5 megabyte per second storage rate is used as the nominal image storage transfer rate specification.
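The arithmetic behind these figures can be checked directly. The sketch below uses the example values from the text (640 x 480 frames, one byte per pixel, three cameras, one frame per second, 100 knots, one foot per pixel); the knots-to-feet conversion factor is standard:

```python
# Worked check of the image storage rate and along-track overlap example.
width_px, height_px = 640, 480
bytes_per_pixel = 1
num_cameras = 3

bytes_per_frame_event = width_px * height_px * bytes_per_pixel * num_cameras
mb_per_second = bytes_per_frame_event / 1e6          # one frame event per second

speed_ft_s = 100 * 1.68781                           # 100 knots in feet/second
# At 1 ft/pixel, the aircraft advances ~169 pixels between one-second frames,
# so consecutive 480-pixel (along-track) images overlap by about 65%.
overlap = 1.0 - speed_ft_s / height_px
```

This confirms the roughly 0.92 megabyte per second rate quoted above, comfortably inside the conservative 1.5 megabyte per second specification.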
A nominal time for an imaging survey mission may be 3 hours, with over 2 hours assumed for actual image collection. The remaining time is spent flying to and from the survey area and for turnarounds after completion of each swath. Two hours of imaging will generate a 7.2 gigabyte imagery file at a 1.0 megabyte per second storage rate. The GPS data and IMU data recorded during the mission (0.04 megabyte per second for 3 hours) contributes only 0.4 gigabyte of additional storage, for a total of 7.6 gigabytes. One example of a suitable disk unit is the Seagate Elite-9, having 9 gigabytes of storage and 11 milliseconds access time. The standard SCSI disk drive interface allows storage throughput up to 1.5 megabytes per second.
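These storage totals follow from simple rate-times-time products; a quick check, using the rounded 1.0 megabyte per second figure from the text:

```python
# Mission storage budget: 2 of 3 flight hours spent imaging.
imaging_seconds = 2 * 3600
mission_seconds = 3 * 3600
image_rate_mb_s = 1.0          # rounded nominal image storage rate
ancillary_rate_mb_s = 0.04     # combined GPS and IMU data rate

image_gb = imaging_seconds * image_rate_mb_s / 1000
ancillary_gb = mission_seconds * ancillary_rate_mb_s / 1000
total_gb = image_gb + ancillary_gb   # ~7.2 + ~0.4 = ~7.6 gigabytes
```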
The imaging system of the present invention utilizes direct digital photography and digital storage of spatially registered imagery. Other airborne video systems use a videotape system as the airborne image storage medium. This allows several hours of imagery to be captured at a 30 Hz image rate. However, videotape does not capture the full resolution or the full dynamic range of the CCD camera systems. All known airborne video systems offer videotape of the surveyed terrain, with frames tagged with GPS positions. The system uses a display to provide steering cues to the pilot to maintain the appropriate flight line. An Accuphoto system provided by Genysis Comm. Inc. is used for this purpose. The Accuphoto system provides a software tool for use in planning the mission, resulting in a software file defining the mission profile in GPS coordinates. The onboard GPS receiver is then used to provide pilot cues via a simple LCD display indicating the need for a left/right correction and the magnitude of the correction.
The configuration described above, wherein stabilized platform assembly 80 is mounted on a cargo door of the aircraft, provides a number of advantages in operation of the imaging system. However, other configurations may be utilized within the scope of the present invention. For example, the stabilized platform assembly may be omitted from the imaging system. In this configuration, the cameras are rigidly mounted to the aircraft, and the IMU data is used to compensate the image data for aircraft pitch, roll and yaw. Furthermore, the cameras or other sensors are not necessarily mounted on the aircraft door. For example, the cameras may be mounted in a hole in the floor of the aircraft, in a pod beneath the aircraft or on one of the wings.

The ground processing workstation 12 performs survey mission planning and post-mission processing. The ground processing workstation 12 may be implemented using a PC-based graphical workstation and commercially available geographic information system (GIS) tools such as Arcview, available from ESRI. Several mission planning functions are provided by the ground processing workstation. It allows the user to view a digital line graph (map) database, available from commercial sources, depicting the survey area of interest. The user selects the boundary points to define the survey area selected for mission coverage. An aircraft trajectory is computed from a takeoff point to the survey area with sufficient passes over the survey area to provide a high probability of coverage of the selected area at a specified resolution and aircraft flight time. Multiple missions are prescribed where required. The aircraft trajectory is displayed to the user superimposed over a digital map of the area of interest. The aircraft trajectory, defined by waypoints, and image collection start/stop times are stored on a floppy disk for entering into the airborne data collection system 10.
System parameters such as camera setup parameters (frame resolution and angular field of view), aircraft parameters (endurance, velocity and turn radius) and mission descriptors (airport location and percent frame overlap) may be modified. An example of a mission trajectory 250 is illustrated in FIG. 8.
Post-mission processing functions of the ground processing workstation 12 include extracting data from the disk storage unit 32 of the airborne data collection system 10, registration of the individual images onto a geodetic reference frame and combining the images into a contiguous imagery file stored in a standard GIS format. The post-processing functions use GIS tools that are similar to those used for mission planning. The mass storage media in the ground processing workstation 12 is compatible with the disk storage unit 32 in the airborne data collection system 10. Files in the disk storage unit 32 may be copied to the mass storage system in the ground processing workstation 12. The ground processing workstation accesses the stored files and registers each file individually into the ground plane. Multiple neighboring individual images from a single mission can be overlaid onto a common geodetic grid. All or a selected subset of the image frames may be mosaiced onto a geodetic reference grid. GIS tools may be used to scroll, zoom and perform measurements on the mosaiced imagery. Imagery operations may be performed individually or on weighted combinations of the images from the three cameras.
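The per-frame registration step can be pictured as intersecting each pixel's line of sight with the ground plane, using the stored GPS position and IMU attitude. The sketch below is a minimal flat-earth version, not the workstation's actual algorithm; the pinhole-camera focal length, principal point and the local north-east-down (NED) frame are illustrative assumptions:

```python
import numpy as np

def pixel_to_ground(cam_pos_ned, R_cam_to_ned, pixel, focal_px, principal=(320.0, 240.0)):
    """Project one image pixel onto a flat ground plane (down coordinate = 0).

    cam_pos_ned  -- camera position in local NED, e.g. [0, 0, -altitude]
    R_cam_to_ned -- rotation from camera axes (z along boresight) to NED,
                    built from the IMU attitude stored with the frame
    pixel        -- (column, row) coordinates of the pixel
    focal_px     -- hypothetical focal length expressed in pixels
    """
    # Line-of-sight ray in camera axes (image x -> camera x, image y -> camera y).
    ray_cam = np.array([pixel[0] - principal[0], pixel[1] - principal[1], focal_px])
    ray_ned = R_cam_to_ned @ ray_cam
    # Scale the ray so it reaches the ground plane (down coordinate 0).
    s = -cam_pos_ned[2] / ray_ned[2]
    return cam_pos_ned + s * ray_ned

# A nadir-pointing camera at 1000 ft: the principal point maps to the
# ground point directly below the camera.
ground = pixel_to_ground(np.array([0.0, 0.0, -1000.0]), np.eye(3), (320.0, 240.0), 1000.0)
```

Repeating this projection for the frame corners gives the ground footprint used when mosaicing neighboring images onto a common geodetic grid.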
In one operating mode, the end user makes arrangements to lease an aircraft and performs mission planning. Using the ground processing workstation 12, the user lays out the survey area to be imaged by selecting boundary points on a digital map. As indicated above, the ground processing workstation considers the endurance and turning properties of the aircraft to be used, the base airport location and camera parameters. This allows the automatic design of a three-dimensional trajectory for the aircraft, with image collection points selected for ideal coverage of the survey area. The trajectory, or multiple mission trajectories, is presented to the user for approval. Higher resolution requirements require more passes and possibly additional missions. A three-hour mission with two hours of the flight time collecting image data provides over 20,000 acres (32 square miles) of image coverage at a one foot image resolution.
Following the trajectory design stage, the user obtains from the ground processing workstation a floppy disk that contains the digital specifications of the trajectory (X, Y, Z position versus time) and the image collection points. The airborne data collection system is installed on the aircraft, and a checkout of all subsystems is performed automatically. Upon valid checkout, the aircraft is ready to begin the mission. The pilot display leads the pilot through the mission from takeoff to landing, although the pilot can exit and re-enter the trajectory waypoint files at any time, if desired. The display also provides the current status of the mission, for example, flight legs completed, time to next turn, loss of GPS lock, or the occurrence of any anomalies which might result in loss of data.
During image frame recording over the survey area, image data from the cameras 20, 22 and 24 is stored on the disk storage unit 32. The stabilized platform assembly maintains the cameras in a vertical orientation as described above. GPS data, representative of position of the cameras, and IMU data, representative of attitude of the cameras, is simultaneously stored on the disk storage unit 32. Each image frame has corresponding GPS data and IMU data, so that the image data may be spatially registered with high accuracy.
After the mission is completed, the airborne data collection system may be deinstalled from the aircraft. The electronics unit 84 can be connected to the ground processing workstation 12, so that the stored data may be transferred to the ground processing workstation. The individual image frames are transferred to the storage media in the ground processing workstation and are registered in the ground plane using the GPS data and IMU data stored with the images. The image data can immediately be registered because position coordinates for each image pixel are known. This allows preparation of a contiguous registered image of the survey area and review of this image on the workstation using standard GIS tools. Maps showing roads, cultural features, hydrology and the like can easily be overlaid on the image. The user can now use the image in any desired manner.
In the above scenario, the end user was responsible for all aspects of the survey mission. In other scenarios, an airborne imaging service may be established to support a higher volume operation. In this case, the service organization may offer mission planning support to users or may accept mission description disks from users who operate their own ground processing workstations. For example, an individual farmer or chemical supplier may request once per week imagery of his acreage to precisely time harvest and/or chemical application for maximum yield.
The software in the system computer 30 of the airborne data collection system 10 is required to perform the following functions. The IMU data is processed to provide a strapdown navigation solution propagating the position, velocity and attitude of the camera axes. The strapdown navigation solution is combined with the GPS velocity data to provide a transfer alignment resulting in the attitude of the camera boresight relative to North, East and down. The three CCD cameras are commanded to obtain imagery in a synchronous manner with the GPS data and the IMU data. Precise GPS timing is used to synchronize all data collection functions. A trajectory manager monitors the current aircraft position relative to the desired trajectory and provides commands to the pilot indicating the degree of error in horizontal and vertical planes. The two-axis stabilization system uses the measured camera boresight attitude, IMU rotation rates and motor encoder values to control the camera axes to point in a commanded direction, nominally down. The image frame data, GPS data, IMU data, attitude solution and system health status are logged on the disk storage system.
A block diagram that illustrates the interrelationship of the software modules in the airborne data collection system is shown in FIG. 7. A strapdown navigation routine 200 propagates the position, velocity and attitude of the IMU coordinate axes forward in time at a data rate of 100 Hz using digital measurements of change in velocity and change in angle. The inputs to the routine 200 are the IMU data samples at a 100 Hz rate and initialization values for the IMU attitude. Additionally, attitude error values are input from the transfer alignment Kalman filter 202. The outputs are (1) latitude, longitude and altitude, (2) North, East, down components of velocity, and (3) attitude Euler angles (roll, pitch and yaw) relating the IMU axes to the North, East and down axes.
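As a rough illustration of what the 100 Hz propagation involves, the attitude portion of a strapdown update can be sketched as a first-order direction-cosine-matrix step driven by the IMU delta-angle samples. This is a simplified sketch only; the routine described above also carries position and velocity and applies earth-rate and transport-rate corrections:

```python
import numpy as np

def skew(w):
    """Skew-symmetric matrix such that skew(a) @ b == cross(a, b)."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def propagate_attitude(C_body_to_nav, delta_theta):
    """One strapdown attitude step from an IMU delta-angle sample (radians).
    First-order DCM update followed by re-orthogonalization; a production
    mechanization would use a higher-order or quaternion formulation."""
    C = C_body_to_nav @ (np.eye(3) + skew(delta_theta))
    u, _, vt = np.linalg.svd(C)      # snap back to the nearest rotation matrix
    return u @ vt
```

Integrating 100 such 0.001-radian z-axis steps, for example, yields the rotation matrix for a 0.1-radian heading change.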
The transfer alignment Kalman filter 202 merges the GPS velocity measurement and the strapdown navigation routine output to produce an estimate of the error in the IMU axes attitude computation. Inputs include the GPS velocity measurements and the strapdown navigation solution synchronized in time. Additionally, the lever arm (in aircraft body axes) from the GPS antenna phase center to the IMU location is required. Finally, an IMU error model 204 representing the statistics of the IMU errors is also required. The transfer alignment process utilizes a Kalman filter formulation based upon the IMU error model. The outputs include the IMU attitude errors, which are supplied to the strapdown navigation routine 200 as corrections. Errors in the IMU gyro and accelerometer instruments are logged to assess the IMU in-flight performance.
A sensor boresight stabilization module 208 processes IMU attitude rates, motor encoder values and IMU attitude data to control the pitch and roll motors so as to properly point the camera boresight in the desired direction. Inputs include motor encoder values, IMU rotation rates, strapdown navigation attitude Euler angles and timing signals from the IMU interrupt. The stabilization module includes two linear control systems identical in structure but having different gains to accommodate the different inertias presented to the motors. Stabilization is performed at a 100 Hz data rate synchronized with the IMU interrupt. A conventional proportional integral derivative control is used. The proportional term comes from the pitch or roll attitude errors, and the derivative terms come from the IMU rate gyro measurements. Coordinate transformations must be applied to both the Euler angles and the rotation rates to account for the specific Euler angle set mechanized by the gimbal. The control gains are selected from knowledge of the various control inertias, the motor and amplifier gains, and the selected bandwidth of the control loops. The pitch and roll bandwidths are selected at 10 Hz. The motor encoder data is not normally used within the control loop. However, this data is used to determine the orientation of the IMU axes relative to the aircraft for lever arm calculations and to determine the proximity of the gimbals to their mechanical limits.
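A single axis of this loop can be sketched as follows. The gains are placeholders, not the system's values; the convention of drawing the derivative term directly from the rate gyro, rather than differentiating the attitude error, follows the description above:

```python
class AxisStabilizer:
    """One-axis PID gimbal loop run at the 100 Hz IMU rate (illustrative gains)."""

    def __init__(self, kp, ki, kd, dt=0.01):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0

    def step(self, attitude_error_rad, gyro_rate_rad_s):
        """Return a motor command for one 10 ms control cycle."""
        self.integral += attitude_error_rad * self.dt
        return (self.kp * attitude_error_rad       # proportional: attitude error
                + self.ki * self.integral          # integral: accumulated error
                - self.kd * gyro_rate_rad_s)       # derivative: rate gyro feedback
```

Using the gyro for the derivative term avoids numerically differentiating a noisy attitude estimate, which is why rate feedback from the IMU is attractive in loops like this.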
An image projection module 210 manages camera image frame collection and buffer storage. Inputs include frame time synchronization from the IMU 100 Hz interrupt, image memory addresses, kinematic GPS position and attitude angles from the strapdown navigation routine 200. The image synchronization is controlled by the IMU interrupts at 100 Hz with nominal frame rates of 1 to 2 Hz, i.e. 50 to 100 IMU samples between frame collection events. The frame collection commands use Mu-tech routines which provide frame triggering and synchronization ofthe three cameras. The outputs are memory mapped image frames for the three CCD cameras. The real-time software consists of three modules: 1) the strapdown navigation routine 200,
2) the transfer alignment routine implemented in Kalman filter 202, and 3) the gimbal command routine implemented in the sensor boresight stabilization module 208.
The strapdown navigation routine 200 consists of integration of six-degree-of-freedom equations with a body-fixed coordinate system. The body coordinate system has the z-axis fixed to the camera boresight axis, the x-axis nominally pointed forward and the y-axis to the right of the motion. The internal coordinate system used for navigation is the Earth-Centered-Earth-Fixed (ECEF) system. Accelerations and rotation rates are integrated from the initial attitude assumptions and the GPS-measured velocity components. The equations use the WGS-84 datum for all computations for compliance with GPS. The strapdown computations are performed at a 100 Hz rate, which coincides with the availability of the IMU data.
The transfer alignment routine implements a 22-state Kalman filter with stages of covariance propagation and state/covariance update at each measurement. The filter states include velocity errors, attitude errors, accelerometer biases, gyro biases, accelerometer scale factor errors and gyro scale factor errors, with each of these error terms containing x, y and z components (18 individual terms). An additional state is used to represent the time latency between the GPS and IMU measurement devices. Three additional states are used to represent the time-integral of the velocity (average velocity) over a 200 msec window prior to each GPS one pulse-per-second (1PPS) time point. This velocity average is used to model exactly the functioning of the specific GPS receiver used in the preferred embodiment. The Kalman filter propagates elements of the covariance matrix and state between each 1PPS GPS time and performs a covariance and state update at each GPS time. The resulting attitude errors and IMU instrument errors are fed back into the strapdown navigation routine 200 to act as a continuous source of calibration. This allows use of small, low-cost IMU devices which are currently being manufactured by several vendors.
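Stripped of the 22-state detail, each 1PPS epoch applies the standard Kalman measurement-update equations. The sketch below shows only that generic step, with the measurement z standing in for the GPS-minus-strapdown velocity difference; the state ordering and dimensions are arbitrary here:

```python
import numpy as np

def kalman_update(x, P, z, H, R):
    """Generic Kalman measurement update for state x with covariance P,
    given measurement z modeled as z = H x + noise of covariance R."""
    S = H @ P @ H.T + R                      # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)           # Kalman gain
    x_new = x + K @ (z - H @ x)              # corrected state estimate
    P_new = (np.eye(len(x)) - K @ H) @ P     # corrected covariance
    return x_new, P_new
```

In the filter described above, x_new would contain the attitude and instrument error estimates that are fed back into the strapdown navigation routine as corrections.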
The pitch and roll motors are controlled by a conventional proportional, integral, derivative (PID) controller implemented in the gimbal command routine. The rate gyros from the IMU provide the necessary rate feedback and the transfer alignment, coupled with the strapdown navigation routine, provides the position feedback. The three PID gains are derived from knowledge of the inertias and desired bandwidth of the closed loop system. This pointing system differs from conventional systems in that the IMU used for navigation is placed on the inner gimbal of the stabilized platform, which is also directly attached to the camera/sensor package. This is enabled by availability of small, low-cost IMU components, direct drive servo motors and camera sensors.
While there have been shown and described what are at present considered the preferred embodiments of the present invention, it will be obvious to those skilled in the art that various changes and modifications may be made therein without departing from the scope of the invention as defined by the appended claims.

Claims

What is claimed is:
1. A remote data collection system for use in a vehicle as the vehicle moves, said system comprising: a sensor for sensing a characteristic of interest and providing sensor data representative of the characteristic of interest, said sensor having a sensing direction; a global positioning system (GPS) receiver for providing GPS data representative of position of said sensor under moving conditions; an inertial measurement unit (IMU) for providing IMU data representative of attitude of said sensor; a processing unit responsive to said IMU data and an IMU error model for determining IMU errors, and responsive to said GPS data, said IMU data and said IMU errors for determining geographic data referenced to said sensor data; and a data storage unit for storing said sensor data and said geographic data for subsequent use.
2. A sensing system as defined in claim 1 wherein said sensor comprises a camera and said sensor data represents an image.
3. A sensing system as defined in claim 1 wherein said sensor comprises a charge coupled device (CCD) camera and wherein said sensor data represents an image.
4. A sensing system as defined in claim 1 wherein said sensor comprises a plurality of CCD cameras, each having a different spectral characteristic, and wherein said sensor data represents images in different spectral ranges.
5. A sensing system as defined in claim 1 wherein said vehicle is an aircraft and wherein said sensing system is used for airborne imaging of a predetermined region.
6. A sensing system as defined in claim 1 wherein said GPS receiver comprises a differential GPS receiver.
7. A sensing system as defined in claim 1 wherein said GPS receiver comprises a kinematic-capable GPS receiver.
8. A sensing system as defined in claim 1 further comprising a stabilized platform to which said sensor and said IMU are rigidly mounted and a control unit responsive to said IMU data for rotating said stabilized platform about at least one axis of rotation with respect to the vehicle to control the sensing direction of said sensor as the vehicle moves.
9. A sensing system as defined in claim 8 wherein said vehicle comprises an aircraft and wherein said control unit includes means for rotating said stabilized platform about pitch and roll axes with respect to the aircraft to maintain the sensing direction of said sensor substantially vertical during flight.
10. A sensing system as defined in claim 9 wherein said sensor, said IMU and said stabilized platform are affixed to a door of the aircraft.
11. A sensing system as defined in claim 10 wherein said sensor comprises a plurality of CCD cameras, each having a different spectral characteristic.
12. A sensing system as defined in claim 9 further comprising means responsive to a desired aircraft trajectory and said GPS data for indicating deviations of the aircraft from the desired aircraft trajectory.
13. A remote data collection system for use in a vehicle as the vehicle moves, said system comprising: a stabilized platform; a sensor rigidly mounted to said stabilized platform for sensing a characteristic of interest and providing sensor data representative of the characteristic of interest, said sensor having a sensing direction; an inertial measurement unit (IMU) rigidly mounted to said stabilized platform for providing IMU data representative of attitude of said sensor; a control unit responsive to said IMU data for rotating said stabilized platform about at least one axis of rotation with respect to the vehicle to control the sensing direction of said sensor as the vehicle moves; a global positioning system (GPS) receiver for providing GPS data representative of position of said sensor; a processing system responsive to said GPS data and said IMU data for determining geographic data referenced to said sensor data; and a data storage unit for storing said sensor data and said geographic data for subsequent use.
14. A sensing system as defined in claim 13 wherein said control unit comprises a gimbal, a support member rigidly mounted to the vehicle, a first motor connected between said support member and said gimbal for rotating said gimbal about a first axis relative to said support member and a second motor connected between said gimbal and said stabilized platform for rotating said stabilized platform about a second axis relative to said gimbal, wherein the sensing direction of said sensor is stabilized with respect to said first and second axes as the vehicle moves.
15. A sensing system as defined in claim 14 wherein said vehicle comprises an aircraft and wherein said control unit stabilizes the sensing direction of said sensor in a vertical orientation with respect to pitch and roll of the aircraft.
16. A sensing system as defined in claim 13 wherein said control unit comprises a support member rigidly mounted to the vehicle and a motor connected between said support member and said stabilized platform for rotating said stabilized platform about an axis of rotation relative to said support member, wherein the sensing direction of said sensor is stabilized with respect to said axis of rotation as the vehicle moves.
17. A sensing system as defined in claim 16 wherein said vehicle comprises an aircraft and wherein said control unit stabilizes the sensing direction of said sensor in a vertical orientation with respect to roll of the aircraft.
18. An imaging system for use in an aircraft, said imaging system comprising: a stabilized platform; a camera system rigidly mounted to said stabilized platform for providing image data, said camera system having a boresight direction; an inertial measurement unit (IMU) rigidly mounted to said stabilized platform for providing IMU data representative of attitude of said camera system; a control unit responsive to said IMU data for rotating said stabilized platform about at least one axis of rotation with respect to the aircraft to control the boresight direction of said camera system during flight; a global positioning system (GPS) receiver for providing GPS data representative of position of said camera system; a processing unit responsive to said GPS data and said IMU data for determining geographic data referenced to said image data; and a data storage unit for storing said image data and said geographic data for subsequent use.
19. An imaging system as defined in claim 18 wherein said camera system comprises a plurality of CCD cameras, each having a different spectral response.
20. An imaging system as defined in claim 18 wherein said aircraft includes a cargo door and wherein said camera system, said IMU and said stabilized platform are mounted to said cargo door.
21. An imaging system as defined in claim 18 further comprising means responsive to a desired aircraft trajectory and said GPS data for indicating deviations of the aircraft from the desired aircraft trajectory.
22. An imaging system as defined in claim 18 wherein said GPS receiver comprises a differential GPS receiver.
23. An imaging system as defined in claim 18 wherein said GPS receiver comprises a kinematic GPS receiver.
24. An imaging system as defined in claim 18 wherein said control unit includes means for rotating said stabilized platform about pitch and roll axes with respect to the aircraft to maintain the boresight direction of said camera system substantially vertical during flight.
25. An imaging system as defined in claim 18 wherein said processing unit further comprises means responsive to said IMU data and an IMU error model for determining IMU errors and wherein said processing unit is responsive to said GPS data, said IMU data and said IMU errors for determining said geographic data with high accuracy.
PCT/US1997/004668 1996-03-22 1997-03-21 Airborne imaging system WO1997035166A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU23425/97A AU726815B2 (en) 1996-03-22 1997-03-21 Airborne imaging system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US08/621,107 US5894323A (en) 1996-03-22 1996-03-22 Airborne imaging system using global positioning system (GPS) and inertial measurement unit (IMU) data
US08/621,107 1996-03-22

Publications (1)

Publication Number Publication Date
WO1997035166A1 true WO1997035166A1 (en) 1997-09-25

Family

ID=24488754

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1997/004668 WO1997035166A1 (en) 1996-03-22 1997-03-21 Airborne imaging system

Country Status (4)

Country Link
US (1) US5894323A (en)
AU (1) AU726815B2 (en)
CA (1) CA2250063A1 (en)
WO (1) WO1997035166A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0919787A1 (en) * 1997-11-28 1999-06-02 Mitsumi Electric Co., Ltd. Navigation data receiving device
GB2342242A (en) * 1998-09-25 2000-04-05 Environment Agency Environmental data collection system
DE10010366A1 (en) * 2000-03-07 2001-09-27 Ekos Entwicklung Und Konstrukt Digital recording method of aerial photographs, involves comparing flight data with specific data for selectively downloading aerial photograph data from memory to ground station
GB2368219A (en) * 2000-09-13 2002-04-24 Roke Manor Research Camera system with GPS
EP1930689A1 (en) * 2005-08-31 2008-06-11 PASCO Corporation Laser distance measurement device and laser distance measurement method
WO2007041696A3 (en) * 2005-10-04 2009-04-23 Eugene J Alexander System and method for calibrating a set of imaging devices and calculating 3d coordinates of detected features in a laboratory coordinate system
WO2011062525A1 (en) 2009-11-20 2011-05-26 Saab Ab A method estimating absolute orientation of a vehicle
US8223208B2 (en) 2005-11-10 2012-07-17 Motion Analysis Corporation Device and method for calibrating an imaging device for generating three dimensional surface models of moving objects
US8848035B2 (en) 2005-10-04 2014-09-30 Motion Analysis Corporation Device for generating three dimensional surface models of moving objects
WO2016009402A3 (en) * 2014-07-18 2016-04-21 Altec S.P.A. Image and/or radio signals capturing platform
US9430846B2 (en) 2013-04-19 2016-08-30 Ge Aviation Systems Llc Method of tracking objects using hyperspectral imagery
AT520253A3 (en) * 2018-07-16 2019-04-15 Umweltdata G M B H Selective harvesting method
US10358235B2 (en) 2008-04-11 2019-07-23 Nearmap Australia Pty Ltd Method and system for creating a photomap using a dual-resolution camera system
US10358234B2 (en) 2008-04-11 2019-07-23 Nearmap Australia Pty Ltd Systems and methods of capturing large area images in detail including cascaded cameras and/or calibration features

Families Citing this family (164)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10361802B1 (en) 1999-02-01 2019-07-23 Blanding Hovenweep, Llc Adaptive pattern recognition based control system and method
US8352400B2 (en) 1991-12-23 2013-01-08 Hoffberg Steven M Adaptive pattern recognition based controller apparatus and method and human-factored interface therefore
DE19714396A1 (en) * 1997-04-08 1998-10-15 Zeiss Carl Fa Photogrammetric camera used in aircraft or satellite
US6597818B2 (en) * 1997-05-09 2003-07-22 Sarnoff Corporation Method and apparatus for performing geo-spatial registration of imagery
WO1999007139A1 (en) * 1997-07-30 1999-02-11 Pinotage, L.L.C. Imaging device
JP3833786B2 (en) * 1997-08-04 2006-10-18 富士重工業株式会社 3D self-position recognition device for moving objects
DE19752559B4 (en) * 1997-11-27 2004-01-22 Honeywell Ag Procedure for guiding aircraft on taxiways
US6281970B1 (en) * 1998-03-12 2001-08-28 Synergistix Llc Airborne IR fire surveillance system providing firespot geopositioning
US6172470B1 (en) * 1998-04-30 2001-01-09 Trw Inc. Large aperture precision gimbal drive module
US6714240B1 (en) * 1998-06-23 2004-03-30 Boeing North American, Inc. Optical sensor employing motion compensated integration-device and process
US6023241A (en) * 1998-11-13 2000-02-08 Intel Corporation Digital multimedia navigation player/recorder
US6205400B1 (en) * 1998-11-27 2001-03-20 Ching-Fang Lin Vehicle positioning and data integrating method and system thereof
US7966078B2 (en) 1999-02-01 2011-06-21 Steven Hoffberg Network media appliance system and method
DE19950033B4 (en) * 1999-10-16 2005-03-03 Bayerische Motoren Werke Ag Camera device for vehicles
US6965397B1 (en) 1999-11-22 2005-11-15 Sportvision, Inc. Measuring camera attitude
US7143130B2 (en) * 1999-12-09 2006-11-28 Ching-Fang Lin Portable multi-tracking method and system
US6298286B1 (en) * 1999-12-17 2001-10-02 Rockwell Collins Method of preventing potentially hazardously misleading attitude data
US6535114B1 (en) * 2000-03-22 2003-03-18 Toyota Jidosha Kabushiki Kaisha Method and apparatus for environment recognition
US6281797B1 (en) 2000-04-04 2001-08-28 Marconi Data Systems Inc. Method and apparatus for detecting a container proximate to a transportation vessel hold
US6734796B2 (en) 2000-04-04 2004-05-11 Ian J. Forster Self-check for a detector detecting the proximity of a transportation vessel
US6373521B1 (en) * 2000-07-19 2002-04-16 Kevin D. Carter Aircraft incident surveillance system
US6421610B1 (en) * 2000-09-15 2002-07-16 Ernest A. Carroll Method of preparing and disseminating digitized geospatial data
US20020184348A1 (en) * 2000-09-20 2002-12-05 Lockheed Martin Corporation Object oriented framework architecture for sensing and/or control environments
EP1319203A2 (en) * 2000-09-20 2003-06-18 Lockheed Martin Corporation Object oriented framework architecture for sensing and/or control environments
US6622090B2 (en) * 2000-09-26 2003-09-16 American Gnc Corporation Enhanced inertial measurement unit/global positioning system mapping and navigation process
JP2002135758A (en) * 2000-10-20 2002-05-10 Yazaki Corp On-vehicle transmitting system, receiving apparatus and transmitting apparatus for video data
US7565008B2 (en) 2000-11-06 2009-07-21 Evryx Technologies, Inc. Data capture and identification system and process
US7680324B2 (en) 2000-11-06 2010-03-16 Evryx Technologies, Inc. Use of image-derived information as search criteria for internet and other search engines
US8224078B2 (en) 2000-11-06 2012-07-17 Nant Holdings Ip, Llc Image capture and identification system and process
US9310892B2 (en) 2000-11-06 2016-04-12 Nant Holdings Ip, Llc Object information derived from object images
US7899243B2 (en) 2000-11-06 2011-03-01 Evryx Technologies, Inc. Image capture and identification system and process
US20020067424A1 (en) * 2000-12-01 2002-06-06 Brunner Joseph F. Environmentally sealed cameras for mounting externally on aircraft and systems for using the same
US6424804B1 (en) * 2000-12-27 2002-07-23 Cessna Aircraft Company Modular airborne flir support and extension structure
US20040257441A1 (en) * 2001-08-29 2004-12-23 Geovantage, Inc. Digital imaging system for airborne applications
US20030048357A1 (en) * 2001-08-29 2003-03-13 Geovantage, Inc. Digital imaging system for airborne applications
FR2830097B1 (en) * 2001-09-21 2004-02-20 Univ Compiegne Tech Process for taking motion images
AU2002328690B2 (en) * 2001-10-11 2007-10-25 Cgg Data Services Ag Airborne geophysical measurements
US6759979B2 (en) * 2002-01-22 2004-07-06 E-Businesscontrols Corp. GPS-enhanced system and method for automatically capturing and co-registering virtual models of a site
IL149934A (en) * 2002-05-30 2007-05-15 Rafael Advanced Defense Sys Airborne reconnaissance system
JP4181800B2 (en) * 2002-06-20 2008-11-19 Nec東芝スペースシステム株式会社 Topographic measurement system, storage medium, and program using stereo image
US6831599B2 (en) * 2002-08-26 2004-12-14 Honeywell International Inc. Remote velocity sensor slaved to an integrated GPS/INS
US7725258B2 (en) * 2002-09-20 2010-05-25 M7 Visual Intelligence, L.P. Vehicle based data collection and processing system and imaging sensor system and methods thereof
US7893957B2 (en) * 2002-08-28 2011-02-22 Visual Intelligence, LP Retinal array compound camera system
US8994822B2 (en) 2002-08-28 2015-03-31 Visual Intelligence Lp Infrastructure mapping system and method
US8483960B2 (en) 2002-09-20 2013-07-09 Visual Intelligence, LP Self-calibrated, remote imaging and data processing system
US7212938B2 (en) 2002-09-17 2007-05-01 M7 Visual Intelligence, Lp Method of using a self-locking travel pattern to achieve calibration of remote sensors using conventionally collected data
US6928194B2 (en) * 2002-09-19 2005-08-09 M7 Visual Intelligence, Lp System for mosaicing digital ortho-images
USRE49105E1 (en) 2002-09-20 2022-06-14 Vi Technologies, Llc Self-calibrated, remote imaging and data processing system
US7002551B2 (en) * 2002-09-25 2006-02-21 Hrl Laboratories, Llc Optical see-through augmented reality modified-scale display
US20070035562A1 (en) * 2002-09-25 2007-02-15 Azuma Ronald T Method and apparatus for image enhancement
US20040066391A1 (en) * 2002-10-02 2004-04-08 Mike Daily Method and apparatus for static image enhancement
US20040068758A1 (en) * 2002-10-02 2004-04-08 Mike Daily Dynamic video annotation
US7424133B2 (en) * 2002-11-08 2008-09-09 Pictometry International Corporation Method and apparatus for capturing, geolocating and measuring oblique images
US6975959B2 (en) * 2002-12-03 2005-12-13 Robert Bosch Gmbh Orientation and navigation for a mobile device using inertial sensors
WO2005017550A2 (en) * 2002-12-13 2005-02-24 Utah State University Research Foundation A vehicle mounted system and method for capturing and processing physical data
US7046259B2 (en) * 2003-04-30 2006-05-16 The Boeing Company Method and system for presenting different views to passengers in a moving vehicle
US7088310B2 (en) * 2003-04-30 2006-08-08 The Boeing Company Method and system for presenting an image of an external view in a moving vehicle
US7212921B2 (en) * 2003-05-21 2007-05-01 Honeywell International Inc. System and method for multiplexing and transmitting DC power, IMU data and RF data on a single cable
US7117086B2 (en) * 2003-09-08 2006-10-03 Honeywell International Inc. GPS/IMU clock synchronization particularly for deep integration vector tracking loop
JP4253239B2 (en) * 2003-10-07 2009-04-08 富士重工業株式会社 Navigation system using image recognition
US7308342B2 (en) * 2004-01-23 2007-12-11 Rafael Armament Development Authority Ltd. Airborne reconnaissance system
US7065449B2 (en) * 2004-03-05 2006-06-20 Bell Geospace, Inc. Method and system for evaluating geophysical survey data
EP1759173A4 (en) * 2004-06-02 2012-02-22 Rockwell Collins Control Technologies Inc Image-augmented inertial navigation system (iains) and method
AU2005322595B2 (en) * 2004-06-02 2010-04-22 Rockwell Collins Control Technologies, Inc. Systems and methods for controlling dynamic systems
US7542850B2 (en) * 2004-06-24 2009-06-02 Bell Geospace, Inc. Method and system for synchronizing geophysical survey data
US7458264B2 (en) * 2004-09-10 2008-12-02 Honeywell International Inc. Generalized inertial measurement error reduction through multiple axis rotation during flight
US7274439B2 (en) * 2004-09-10 2007-09-25 Honeywell International Inc. Precise, no-contact, position sensing using imaging
US7289902B2 (en) * 2004-09-10 2007-10-30 Honeywell International Inc. Three dimensional balance assembly
US20060054660A1 (en) * 2004-09-10 2006-03-16 Honeywell International Inc. Articulated gas bearing support pads
US7617070B2 (en) * 2004-09-10 2009-11-10 Honeywell International Inc. Absolute position determination of an object using pattern recognition
US7340344B2 (en) * 2004-09-10 2008-03-04 Honeywell International Inc. Spherical position monitoring system
US7295947B2 (en) * 2004-09-10 2007-11-13 Honeywell International Inc. Absolute position determination of an object using pattern recognition
US7366613B2 (en) * 2004-09-10 2008-04-29 Honeywell International Inc. RF wireless communication for deeply embedded aerospace systems
US7698064B2 (en) * 2004-09-10 2010-04-13 Honeywell International Inc. Gas supported inertial sensor system and method
US7668655B2 (en) * 2004-12-07 2010-02-23 Honeywell International Inc. Navigation component modeling system and method
US7586514B1 (en) * 2004-12-15 2009-09-08 United States Of America As Represented By The Secretary Of The Navy Compact remote tactical imagery relay system
WO2006090368A1 (en) * 2005-02-22 2006-08-31 Israel Aerospace Industries Ltd. A calibration method and system for position measurements
US20060210169A1 (en) * 2005-03-03 2006-09-21 General Dynamics Advanced Information Systems, Inc. Apparatus and method for simulated sensor imagery using fast geometric transformations
US7260389B2 (en) * 2005-07-07 2007-08-21 The Boeing Company Mobile platform distributed data load management system
US8732233B2 (en) * 2005-07-13 2014-05-20 The Boeing Company Integrating portable electronic devices with electronic flight bag systems installed in aircraft
US7827400B2 (en) 2005-07-28 2010-11-02 The Boeing Company Security certificate management
US7788002B2 (en) * 2005-08-08 2010-08-31 The Boeing Company Fault data management
ES2369229T3 (en) * 2005-10-07 2011-11-28 Saab Ab Method and apparatus for generating a route
US9182228B2 (en) * 2006-02-13 2015-11-10 Sony Corporation Multi-lens array system and method
US20100245571A1 (en) * 2006-04-24 2010-09-30 Northrop Grumman Corporation Global hawk image mosaic
JP4938351B2 (en) * 2006-05-16 2012-05-23 トヨタ自動車株式会社 Positioning information update device for vehicles
US7873238B2 (en) 2006-08-30 2011-01-18 Pictometry International Corporation Mosaic oblique images and methods of making and using same
US7647176B2 (en) * 2007-01-11 2010-01-12 Honeywell International Inc. Method and system for wireless power transfers through multiple ports
US8593518B2 (en) * 2007-02-01 2013-11-26 Pictometry International Corp. Computer system for continuous oblique panning
US8520079B2 (en) * 2007-02-15 2013-08-27 Pictometry International Corp. Event multiplexer for managing the capture of images
US7463340B2 (en) * 2007-03-28 2008-12-09 Honeywell International Inc. Ladar-based motion estimation for navigation
US20080255736A1 (en) * 2007-04-10 2008-10-16 Helena Holding Company Geo-referenced agricultural levees
US8385672B2 (en) * 2007-05-01 2013-02-26 Pictometry International Corp. System for detecting image abnormalities
US9262818B2 (en) 2007-05-01 2016-02-16 Pictometry International Corp. System for detecting image abnormalities
US7762133B2 (en) * 2007-07-17 2010-07-27 Honeywell International Inc. Inertial measurement unit with gas plenums
US7425097B1 (en) 2007-07-17 2008-09-16 Honeywell International Inc. Inertial measurement unit with wireless power transfer gap control
US8024119B2 (en) * 2007-08-14 2011-09-20 Honeywell International Inc. Systems and methods for gyrocompass alignment using dynamically calibrated sensor data and an iterated extended kalman filter within a navigation system
US7671607B2 (en) * 2007-09-06 2010-03-02 Honeywell International Inc. System and method for measuring air bearing gap distance
US8965812B2 (en) * 2007-10-09 2015-02-24 Archer Daniels Midland Company Evaluating commodity conditions using aerial image data
US7991226B2 (en) 2007-10-12 2011-08-02 Pictometry International Corporation System and process for color-balancing a series of oblique images
US8531472B2 (en) 2007-12-03 2013-09-10 Pictometry International Corp. Systems and methods for rapid three-dimensional modeling with real façade texture
DE102007058943A1 (en) * 2007-12-07 2009-06-10 Emt Ingenieurgesellschaft Dipl.-Ing. Hartmut Euer Mbh Multi-spectral video device for air-based surveillance and real time aerial photograph monitoring of uncrewed aircraft, has electronic mosaic/fusion device designed such that fusion device receives image signals of video scanner cameras
WO2009105254A2 (en) * 2008-02-20 2009-08-27 Actioncam, Llc Aerial camera system
US20090245581A1 (en) * 2008-03-31 2009-10-01 Sean Dey Airborne terrain acquisition and processing system with fluid detection
US8213706B2 (en) * 2008-04-22 2012-07-03 Honeywell International Inc. Method and system for real-time visual odometry
US9235334B2 (en) * 2008-05-09 2016-01-12 Genesis Industries, Llc Managing landbases and machine operations performed thereon
US8373127B2 (en) * 2008-06-26 2013-02-12 Lynntech, Inc. Method of searching for a thermal target
US8588547B2 (en) 2008-08-05 2013-11-19 Pictometry International Corp. Cut-line steering methods for forming a mosaic image of a geographical area
US8401222B2 (en) 2009-05-22 2013-03-19 Pictometry International Corp. System and process for roof measurement using aerial imagery
US8577518B2 (en) * 2009-05-27 2013-11-05 American Aerospace Advisors, Inc. Airborne right of way autonomous imager
PT104783B (en) 2009-10-13 2014-08-27 Univ Aveiro High precision positioning system adapted to a terrestrial mobile platform
US9330494B2 (en) 2009-10-26 2016-05-03 Pictometry International Corp. Method for the automatic material classification and texture simulation for 3D models
WO2011089477A1 (en) * 2010-01-25 2011-07-28 Tarik Ozkul Autonomous decision system for selecting target in observation satellites
JP2011155361A (en) * 2010-01-26 2011-08-11 Sony Corp Imaging apparatus, imaging control method, and program
CA2796162A1 (en) * 2010-04-13 2012-10-04 Visual Intelligence, LP Self-calibrated, remote imaging and data processing system
FR2959633B1 (en) * 2010-04-29 2012-08-31 Airbus Operations Sas Method for upgrading an aircraft
US8477190B2 (en) 2010-07-07 2013-07-02 Pictometry International Corp. Real-time moving platform management system
US8823732B2 (en) 2010-12-17 2014-09-02 Pictometry International Corp. Systems and methods for processing images with edge detection and snap-to feature
US8552905B2 (en) 2011-02-25 2013-10-08 Raytheon Company Automated layout of beams
EP2719163A4 (en) 2011-06-10 2015-09-09 Pictometry Int Corp System and method for forming a video stream containing gis data in real-time
WO2013020158A1 (en) * 2011-08-10 2013-02-14 John Lucas Inspecting geographically spaced features
US8687062B1 (en) 2011-08-31 2014-04-01 Google Inc. Step-stare oblique aerial camera system
US8430578B1 (en) * 2011-11-18 2013-04-30 Raytheon Company Separation of main and secondary inertial measurements for improved line of sight error of an imaging vehicle's isolated detector assembly
US8552350B2 (en) 2012-01-15 2013-10-08 Raytheon Company Mitigation of drift effects in secondary inertial measurements of an isolated detector assembly
US9183538B2 (en) 2012-03-19 2015-11-10 Pictometry International Corp. Method and system for quick square roof reporting
ES2394540B1 (en) * 2012-07-26 2013-12-11 Geonumerics, S.L. Method for the acquisition and processing of geographical information of a route
IL222221B (en) * 2012-09-27 2019-03-31 Rafael Advanced Defense Systems Ltd Improved inertial navigation system and method
CN103148803B (en) * 2013-02-28 2015-12-02 中国地质大学(北京) Small-sized three-dimensional laser scanning measurement system and method
US9881163B2 (en) 2013-03-12 2018-01-30 Pictometry International Corp. System and method for performing sensitive geo-spatial processing in non-sensitive operator environments
US9244272B2 (en) 2013-03-12 2016-01-26 Pictometry International Corp. Lidar system producing multiple scan paths and method of making and using same
US9753950B2 (en) 2013-03-15 2017-09-05 Pictometry International Corp. Virtual property reporting for automatic structure detection
US9275080B2 (en) 2013-03-15 2016-03-01 Pictometry International Corp. System and method for early access to captured images
US9441974B2 (en) * 2013-03-15 2016-09-13 Novatel Inc. System and method for calculating lever arm values photogrammetrically
US9182236B2 (en) 2013-10-25 2015-11-10 Novatel Inc. System for post processing GNSS/INS measurement data and camera image data
US9528834B2 (en) 2013-11-01 2016-12-27 Intelligent Technologies International, Inc. Mapping techniques using probe vehicles
US9751639B2 (en) * 2013-12-02 2017-09-05 Field Of View Llc System to control camera triggering and visualize aerial imaging missions
WO2015081383A1 (en) * 2013-12-04 2015-06-11 Spatial Information Systems Research Ltd Method and apparatus for developing a flight path
KR101429166B1 (en) * 2013-12-27 2014-08-13 대한민국 Imaging system on aerial vehicle
MX2016008890A (en) 2014-01-10 2017-01-16 Pictometry Int Corp Unmanned aircraft structure evaluation system and method.
US9292913B2 (en) 2014-01-31 2016-03-22 Pictometry International Corp. Augmented three dimensional point collection of vertical structures
CA2938973A1 (en) 2014-02-08 2015-08-13 Pictometry International Corp. Method and system for displaying room interiors on a floor plan
CN103984193B (en) * 2014-03-14 2020-10-16 广州虹天航空科技有限公司 Photographing apparatus stabilizer and control method thereof
US20150358522A1 (en) * 2014-03-31 2015-12-10 Goodrich Corporation Stabilization Of Gyro Drift Compensation For Image Capture Device
CN103994755B (en) * 2014-05-29 2016-03-30 清华大学深圳研究生院 Model-based pose measurement method for non-cooperative space objects
CN104062687B (en) * 2014-06-12 2018-08-10 中国航空无线电电子研究所 Air-ground integrated geomagnetic field joint observation method and system
US9440750B2 (en) 2014-06-20 2016-09-13 nearmap australia pty ltd. Wide-area aerial camera systems
US9052571B1 (en) 2014-06-20 2015-06-09 nearmap australia pty ltd. Wide-area aerial camera systems
US9641736B2 (en) 2014-06-20 2017-05-02 nearmap australia pty ltd. Wide-area aerial camera systems
US9046759B1 (en) 2014-06-20 2015-06-02 nearmap australia pty ltd. Compact multi-resolution aerial camera system
US9185290B1 (en) 2014-06-20 2015-11-10 Nearmap Australia Pty Ltd Wide-area aerial camera systems
CN106796276A (en) 2014-10-08 2017-05-31 斯布克费舍创新私人有限公司 Aerial camera system
CN105493496B (en) 2014-12-14 2019-01-18 深圳市大疆创新科技有限公司 Video processing method and device, and imaging system
PL3257168T3 (en) * 2015-02-09 2019-04-30 European Space Agency Esa Method for creating a constellation of electronic devices for providing optical or radio-frequency operations on a predetermined geographical area, and a system of such a constellation of electronic devices
US10754922B2 (en) 2015-04-23 2020-08-25 Lockheed Martin Corporation Method and apparatus for sensor fusion
US9945828B1 (en) 2015-10-23 2018-04-17 Sentek Systems Llc Airborne multispectral imaging system with integrated navigation sensors and automatic image stitching
US10132933B2 (en) * 2016-02-02 2018-11-20 Qualcomm Incorporated Alignment of visual inertial odometry and satellite positioning system reference frames
AU2017221222B2 (en) 2016-02-15 2022-04-21 Pictometry International Corp. Automated system and methodology for feature extraction
US10671648B2 (en) 2016-02-22 2020-06-02 Eagle View Technologies, Inc. Integrated centralized property database systems and methods
US10438326B2 (en) 2017-07-21 2019-10-08 The Boeing Company Recursive suppression of clutter in video imagery
US10453187B2 (en) 2017-07-21 2019-10-22 The Boeing Company Suppression of background clutter in video imagery
FR3072475B1 (en) * 2017-10-17 2019-11-01 Thales Method of processing an error during the execution of a predetermined avionic procedure, computer program and system for detection and alert
US10267889B1 (en) * 2017-11-15 2019-04-23 Avalex Technologies Corporation Laser source location system
KR20200028648A (en) * 2018-09-07 2020-03-17 삼성전자주식회사 Method for adjusting an alignment model for sensors and an electronic device performing the method
CN114935331B (en) * 2022-05-27 2023-05-26 中国科学院西安光学精密机械研究所 Aviation camera dynamic imaging ground test method

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3191170A (en) * 1963-01-07 1965-06-22 Gen Instrument Corp Contour mapping system
US4589610A (en) * 1983-11-08 1986-05-20 Westinghouse Electric Corp. Guided missile subsystem
US4814711A (en) * 1984-04-05 1989-03-21 Deseret Research, Inc. Survey system and method for real time collection and processing of geophysicals data using signals from a global positioning satellite network
US4764781A (en) * 1987-02-26 1988-08-16 Grumman Aerospace Corporation Universal translational and rotational film drive mechanism
US5166789A (en) * 1989-08-25 1992-11-24 Space Island Products & Services, Inc. Geographical surveying using cameras in combination with flight computers to obtain images with overlaid geographical coordinates
JPH04250436A (en) * 1991-01-11 1992-09-07 Pioneer Electron Corp Image pickup device
US5247356A (en) * 1992-02-14 1993-09-21 Ciampa John A Method and apparatus for mapping and measuring land
US5270756A (en) * 1992-02-18 1993-12-14 Hughes Training, Inc. Method and apparatus for generating high resolution vidicon camera images
US5251037A (en) * 1992-02-18 1993-10-05 Hughes Training, Inc. Method and apparatus for generating high resolution CCD camera images
US5477459A (en) * 1992-03-06 1995-12-19 Clegg; Philip M. Real time three-dimensional machine locating system
US5438404A (en) * 1992-12-16 1995-08-01 Aai Corporation Gyroscopic system for boresighting equipment by optically acquiring and transferring parallel and non-parallel lines
US5503350A (en) * 1993-10-28 1996-04-02 Skysat Communications Network Corporation Microwave-powered aircraft
US5467271A (en) * 1993-12-17 1995-11-14 Trw, Inc. Mapping and analysis system for precision farming applications
US5519620A (en) * 1994-02-18 1996-05-21 Trimble Navigation Limited Centimeter accurate global positioning system receiver for on-the-fly real-time kinematic measurement and control
US5490075A (en) * 1994-08-01 1996-02-06 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Global positioning system synchronized active light autonomous docking system
US5557397A (en) * 1994-09-21 1996-09-17 Airborne Remote Mapping, Inc. Aircraft-based topographical data collection and processing system

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0383114A1 (en) * 1989-02-13 1990-08-22 Hughes Aircraft Company Measurement and control system for scanning sensors
EP0598454A1 (en) * 1992-11-19 1994-05-25 Gatso Special Products B.V. Method, system and vehicle for making and analyzing multispectral recorded images
WO1995016895A1 (en) * 1993-12-13 1995-06-22 Core Corp. Integrated photographing apparatus mounted on aircraft
EP0737845A1 (en) * 1993-12-13 1996-10-16 Core Corp. Integrated photographing apparatus mounted on aircraft

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
EKIN W H: "A new lightweight camera rig", PHOTOGRAMMETRIC RECORD, APRIL 1987, UK, vol. 12, no. 69, ISSN 0031-868X, pages 343 - 348, XP000675710 *
EKIN W H: "The development of an inexpensive retractable vertical camera rig for a light aircraft", PHOTOGRAMMETRIC RECORD, APRIL 1984, UK, vol. 11, no. 63, ISSN 0031-868X, pages 311 - 317, XP000675709 *

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0919787A1 (en) * 1997-11-28 1999-06-02 Mitsumi Electric Co., Ltd. Navigation data receiving device
GB2342242A (en) * 1998-09-25 2000-04-05 Environment Agency Environmental data collection system
DE10010366A1 (en) * 2000-03-07 2001-09-27 Ekos Entwicklung Und Konstrukt Digital recording method of aerial photographs, involves comparing flight data with specific data for selectively downloading aerial photograph data from memory to ground station
DE10010366C2 (en) * 2000-03-07 2003-08-07 Ekos Entwicklung Und Konstrukt Process for the digital recording and storage of aerial photographs under flight conditions
GB2368219A (en) * 2000-09-13 2002-04-24 Roke Manor Research Camera system with GPS
EP1930689A1 (en) * 2005-08-31 2008-06-11 PASCO Corporation Laser distance measurement device and laser distance measurement method
EP1930689A4 (en) * 2005-08-31 2010-09-29 Pasco Corp Laser distance measurement device and laser distance measurement method
WO2007041696A3 (en) * 2005-10-04 2009-04-23 Eugene J Alexander System and method for calibrating a set of imaging devices and calculating 3d coordinates of detected features in a laboratory coordinate system
US8848035B2 (en) 2005-10-04 2014-09-30 Motion Analysis Corporation Device for generating three dimensional surface models of moving objects
US8223208B2 (en) 2005-11-10 2012-07-17 Motion Analysis Corporation Device and method for calibrating an imaging device for generating three dimensional surface models of moving objects
US10358235B2 (en) 2008-04-11 2019-07-23 Nearmap Australia Pty Ltd Method and system for creating a photomap using a dual-resolution camera system
US10358234B2 (en) 2008-04-11 2019-07-23 Nearmap Australia Pty Ltd Systems and methods of capturing large area images in detail including cascaded cameras and/or calibration features
WO2011062525A1 (en) 2009-11-20 2011-05-26 Saab Ab A method estimating absolute orientation of a vehicle
US9476987B2 (en) 2009-11-20 2016-10-25 Saab Ab Method estimating absolute orientation of a vehicle
US9430846B2 (en) 2013-04-19 2016-08-30 Ge Aviation Systems Llc Method of tracking objects using hyperspectral imagery
WO2016009402A3 (en) * 2014-07-18 2016-04-21 Altec S.P.A. Image and/or radio signals capturing platform
US10436941B2 (en) 2014-07-18 2019-10-08 Altec S.P.A. Image and/or radio signals capturing platform
AT520253A3 (en) * 2018-07-16 2019-04-15 Umweltdata GmbH Selective harvesting method

Also Published As

Publication number Publication date
US5894323A (en) 1999-04-13
AU2342597A (en) 1997-10-10
AU726815B2 (en) 2000-11-23
CA2250063A1 (en) 1997-09-25

Similar Documents

Publication Publication Date Title
US5894323A (en) Airborne imaging system using global positioning system (GPS) and inertial measurement unit (IMU) data
US10996055B2 (en) Integrated aerial photogrammetry surveys
Rinaudo et al. Archaeological site monitoring: UAV photogrammetry can be an answer
Zhou Near real-time orthorectification and mosaic of small UAV video flow for time-critical event response
Xiang et al. Development of a low-cost agricultural remote sensing system based on an autonomous unmanned aerial vehicle (UAV)
Mian et al. Direct georeferencing on small unmanned aerial platforms for improved reliability and accuracy of mapping without the need for ground control points
Hernandez-Lopez et al. An automatic approach to UAV flight planning and control for photogrammetric applications
Gurtner et al. Investigation of fish-eye lenses for small-UAV aerial photography
US20120114229A1 (en) Orthorectification and mosaic of video flow
CA2796162A1 (en) Self-calibrated, remote imaging and data processing system
Raczynski Accuracy analysis of products obtained from UAV-borne photogrammetry influenced by various flight parameters
Toth Sensor integration in airborne mapping
US20060018642A1 (en) Mobile laser designated infrared multimedia mapping system
CN110296688A (en) Oblique aerial survey pod for reconnaissance based on passive geolocation technology
Zhou Geo-referencing of video flow from small low-cost civilian UAV
Grejner-Brzezinska Direct sensor orientation in airborne and land-based mapping applications
Cramer On the use of direct georeferencing in airborne photogrammetry
JP2007033258A (en) Method and device for observing object to be observed
Eisenbeiss Applications of photogrammetric processing using an autonomous model helicopter
Mostafa et al. A fully digital system for airborne mapping
Ladd et al. Rectification, georeferencing, and mosaicking of images acquired with remotely operated aerial platforms
US20200077028A1 (en) Method and aircraft for capturing aerial images and three-dimensional mapping of a geographical area
Mostafa et al. GPS/INS integrated navigation system in support of digital image georeferencing
Kremer et al. Operation of the UltraCamD together with CCNS4/Aerocontrol–First experiences and results
Miller et al. Precision 3-D modeling for autonomous helicopter flight

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AU BR CA CN JP RU

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
ENP Entry into the national phase

Ref document number: 2250063

Country of ref document: CA

Ref country code: CA

Ref document number: 2250063

Kind code of ref document: A

Format of ref document f/p: F

NENP Non-entry into the national phase

Ref country code: JP

Ref document number: 97533767

Format of ref document f/p: F

122 Ep: pct application non-entry in european phase