US8098893B2 - Moving object image tracking apparatus and method - Google Patents

Moving object image tracking apparatus and method

Info

Publication number
US8098893B2
US8098893B2
Authority
US
United States
Prior art keywords
moving object
rotation unit
angular velocity
zenith
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related, expires
Application number
US12/550,648
Other versions
US20100246886A1 (en)
Inventor
Hiroaki Nakamura
Tadashi Kuroiwa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toshiba Corp
Original Assignee
Toshiba Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toshiba Corp filed Critical Toshiba Corp
Assigned to KABUSHIKI KAISHA TOSHIBA reassignment KABUSHIKI KAISHA TOSHIBA ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KUROIWA, TADASHI, NAKAMURA, HIROAKI
Publication of US20100246886A1 publication Critical patent/US20100246886A1/en
Application granted granted Critical
Publication of US8098893B2 publication Critical patent/US8098893B2/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/20 Analysis of motion
    • G06T7/246 Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10016 Video; Image sequence
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30232 Surveillance

Definitions

  • In step S801, it is determined whether the moving object has entered the zenith correction range, using the gimbal angles detected by the angle sensors 114 and 124 and the mathematical expression (7). If the moving object is outside the zenith correction range, the angular velocity instruction value generated by the angular velocity instruction generating unit 150 is selected (step S814) and transferred to the driving control unit 160. In contrast, if the moving object has entered the correction range, it is determined whether it has passed through the zenith, based on whether the EL axis has rotated through 90° (step S802).
  • The moving object direction estimating unit 174 computes an estimated moving object direction value, based on the gimbal angles obtained from the angle sensors 114 and 124 and the tracking error obtained from the camera sensor 140 (step S806). If the angle of the EL axis does not exceed 90°, the corrected angle is computed as corrected angle A using the mathematical expression (8) (step S808); if the angle of the EL axis exceeds 90°, the corrected angle is computed as corrected angle B using the mathematical expression (11) (step S809).
  • If the corrected angle for the first gimbal 111 exceeds a threshold value, the angular velocity instruction value generated by the angular velocity instruction generating unit 150 is selected (steps S810 and S814). This limits the application of correction, even within the zenith correction range, when the rotational angle of the first gimbal 111 would be too large and an excessively high corrected angular velocity would have to be instructed. If the corrected angle for the first gimbal 111 is determined to be not higher than the threshold value, a corrected angular velocity instruction specifying the corrected angular velocity A is generated using the mathematical expression (9) (step S811).
  • If the moving object has passed through the zenith, a corrected angular velocity instruction specifying the corrected angular velocity B is generated using the mathematical expression (10) (step S804), while if the angle of the EL axis does not exceed 90°, a corrected angular velocity instruction specifying a corrected angular velocity C is generated using the mathematical expression (11) (step S805).
  • The angular velocity instruction selecting unit 172 selects one of the computed corrected angular velocity instructions and transfers it to the driving control unit 160. While the gimbal structure is being driven by the corrected angular velocity instruction, it is repeatedly determined whether the moving object has passed through the zenith, and the computation of the corrected angular velocity instruction is switched accordingly. Further, while zenith correction is being executed, if it is determined that the moving object has left the zenith correction range, the angular velocity instruction value generated by the angular velocity instruction generating unit 150 is selected, and it is then determined whether the moving object has re-entered the zenith correction range.
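The selection logic of FIG. 8 collapses to a few conditionals, as in the sketch below; the function and argument names are illustrative assumptions, and the rate arguments stand for the instruction values computed by the units described above.

```python
def select_az_instruction(in_correction_range, passed_zenith,
                          corrected_angle, angle_threshold,
                          tracking_rate, corrected_rate_a, corrected_rate_b):
    """Condensed sketch of the FIG. 8 flow for the AZ axis."""
    if not in_correction_range:                 # S801 -> S814
        return tracking_rate
    if passed_zenith:                           # S802 -> S804
        return corrected_rate_b                 # expression (10)
    if abs(corrected_angle) > angle_threshold:  # S810 -> S814
        return tracking_rate                    # limit excessive correction
    return corrected_rate_a                     # S811, expression (9)
```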
  • FIG. 9 shows the angular velocity instruction values imparted with time through the driving control unit 160 to the first and second gimbals 111 and 121 .
  • The first gimbal 111 corresponds to AZ, and the second gimbal 121 corresponds to EL.
  • When no correction is performed, an AZ-axis angular velocity instruction of an extremely high level is imparted near the zenith (near 2.2 s in the figure), and this instruction cannot be followed because of the limitations of the gimbal driving characteristics. FIG. 9 shows both the case where no correction control is performed and the case where it is performed.
  • FIG. 10 shows variations with time in the x- and y-components of a camera tracking error obtained when no correction is performed, and those obtained when correction is performed.
  • FIG. 10(A) shows that when an extremely large AZ-axis angular velocity instruction is generated, a large tracking error that falls outside the tracking error detection range occurs because of the limitations of the gimbal driving characteristics, whereby tracking becomes impossible.
  • FIG. 10(B) shows that the gimbal structure can reliably track the moving object within the tracking error detection range.
  • The remarkable feature of the case shown in FIG. 10(B) is that the tracking error reaches its maximum immediately before 2.2 s, when the moving object passes through the zenith, but still falls within the field of view of the camera. This means that stable tracking can be executed.
  • FIG. 11 shows the vector trajectory of the optical axis in the gimbal zenith coordinate system, obtained when the correction is performed.
  • the circle indicated by the broken line indicates the correction range.
  • The optical axis enters the correction range during upward tracking from below, whereupon zenith correction starts; the axis then momentarily orients toward the zenith at coordinates (0, 0), and finally returns to upward tracking from below with the occurrence of a tracking error suppressed.
  • The biaxial gimbal structure with no additional sensor has a movable range in which the second gimbal can be oriented from the front position to the back position, and hence can perform omnidirectional tracking.
  • In this gimbal structure, at and near the zenith an estimated moving object angular velocity is computed based on angle data, and driving is executed in accordance with a corrected angular velocity instruction that corresponds to the estimated angular velocity.
  • As a result, an appropriate angular velocity instruction for tracking a moving object can be generated, free from the gimbal lock caused near the zenith by an excessive angular velocity instruction, thereby improving the tracking performance.
  • In the second embodiment, the moving state of a moving object is determined using a moving object angular velocity estimated by a moving object angular velocity estimating unit 1201, and when tracking is started at a position extremely close to the zenith, angular velocity instructions are generated for the azimuth axis and the elevation axis so that the optical axis is oriented toward the zenith.
  • FIG. 12 is a block diagram collectively illustrating the AZ axis and the EL axis.
  • The moving object angular velocity estimating unit 1201 receives data on a two-dimensional tracking error from the tracking error detecting unit 173, data on angles from the first and second angle sensors 114 and 124, and data on angular velocities from the first and second angular velocity sensors 113 and 123, thereby determining the velocity of the moving object.
  • the moving object angular velocity estimating unit 1201 estimates the moving object velocity based on angle data indicating the angles of the gimbals, angular velocity data ( ⁇ 1 , ⁇ 2 ) indicating the angular velocities of the gimbals, and data ( ⁇ X, ⁇ Y) indicating a detected tracking error sent from the camera sensor 140 .
  • Near the zenith singular point, however, the velocity of the moving object cannot be estimated using the mathematical expression (13).
  • A corrected angular velocity generating unit 1202 generates a corrected angular velocity instruction based on the thus-determined estimated moving object direction value θest, the gimbal angles (θ1, θ2), and the gimbal angular velocities (ω1, ω2).
  • After it is determined that the moving object is in the zenith correction range (step S801), the traveling state of the moving object is determined based on the estimated moving object angular velocity ωest computed by the moving object angular velocity estimating unit 1201 (step S1301). If the estimated moving object angular velocity ωest is higher than the moving object angular velocity threshold value ωth, the moving object is determined to be still traveling, and the program proceeds to step S1303.
  • Otherwise, the gimbal structure is driven using an angular velocity instruction that is based on the tracking error and is set for tracking performed when the moving object halts.
  • sec ⁇ 2 has a very high value, and hence tracking to be performed when the moving object halts is performed using a sign function in place of a sec function (step S 1302 ).
  • sign ⁇ is a signum function for outputting 1 when ⁇ is higher than 0, outputting 0 when ⁇ is equal to 0, and outputting ⁇ 1 when ⁇ is lower than 0.
  • In step S1303, it is determined whether an ultimate zenith correction application condition related to a zenith singular point is satisfied.
  • the ultimate zenith correction application condition is defined by the following mathematical expression, based on whether tracking is started within the zenith correction range and whether the optical axis of the camera is within the range ⁇ c of the field of view.
  • the range ⁇ c of the field of view is smaller than the zenith correction range.
  • If the condition is not satisfied, the program proceeds to step S802, where the same zenith correction as in the first embodiment is executed.
  • If the condition is satisfied, the AZ axis is driven to reduce the tracking error, and the EL axis is driven to 90°, which indicates the zenith, regardless of the tracking error.
  • As a result, the EL axis is rotated past 90°, and in the subsequent zenith correction process the same processing as in the first embodiment is executed. Namely, in the zenith correction range, the angular velocity instruction selecting unit 1203 selects the thus-computed corrected angular velocity instruction and transfers it to the driving control unit 160.
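A minimal sketch of this start-near-zenith drive mode follows; the proportional EL drive toward 90° and the gain names are assumptions, since the text states only the goal of each axis.

```python
import math

def startup_drive_near_zenith(dX, theta2, Kaz=1.0, Kel=1.0):
    """Sketch: the AZ axis reduces the tracking error while the EL axis
    is driven toward 90 deg (the zenith) regardless of the error."""
    rate_az = -Kaz * dX                     # null the tracking error
    rate_el = Kel * (math.pi / 2 - theta2)  # steer EL toward the zenith
    return rate_az, rate_el
```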
  • Thus, in the second embodiment, the traveling state of the moving object is determined using the moving object angular velocity estimated by the moving object angular velocity estimating unit, and when tracking is started at a position extremely close to the zenith, respective angular velocity instructions are generated for controlling the azimuth (AZ) axis and the elevation (EL) axis to orient the optical axis toward the zenith.
  • In the third embodiment, the traveling state of a moving object is determined using the difference in moving object position estimated by the moving object direction estimating unit 174, and when tracking is started at a position extremely close to the zenith, respective angular velocity instructions are generated for controlling the azimuth (AZ) axis and the elevation (EL) axis to orient the optical axis toward the zenith.
  • The third embodiment differs from the second embodiment in the method of determining, at step S1301 of FIG. 13, whether the moving object is traveling.
  • FIG. 14 is a block diagram collectively illustrating the AZ and EL axes.
  • The condition for determining the traveling state of the moving object is given by the following mathematical expression, based on the difference (dx, dy) in moving object position estimated by the moving object direction estimating unit 174 and expressed by the aforementioned equations (5). Namely, whether the moving object is traveling can be determined from the position difference over a certain sample interval, which corresponds to the velocity of the moving object:

    $dx^2 + dy^2 \geq r^2$  (19)
  • If the expression (19) is not satisfied, the moving object is determined to be halted, whereas if the difference is greater than the threshold value, the moving object is determined to be traveling. After that, the same zenith correction as in the second embodiment is executed.
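In code form, the check is a single comparison; the helper below is an illustrative sketch of expression (19).

```python
def is_traveling(dx, dy, r):
    """Expression (19): the object is judged to be traveling when the
    k-sample position difference (dx, dy) is at least the threshold r."""
    return dx * dx + dy * dy >= r * r
```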
  • the above-described third embodiment can provide the same advantage as that of the second embodiment by determining whether the moving object is traveling, based on the difference (dx, dy) in moving object position.
  • The moving object image tracking apparatuses of the above-described embodiments effectively serve as a tracking camera system of an omnidirectional biaxial gimbal structure installed in a mobile apparatus that is provided with, for example, a TV camera, camera seeker or automatic surveying tool.
  • Since the second gimbal has a movable range extending from the front position to the back position, moving object velocity values are estimated from angle data at and near the zenith, and the gimbal structure is driven by a corrected angular velocity instruction, the biaxial gimbal structure can avoid, near the zenith, the gimbal lock due to excessive angular velocity instruction values, and an appropriate angular velocity instruction for tracking a moving object can be generated to improve the tracking characteristics.
  • the present invention is not limited to the above-described embodiments, but may be modified in various ways without departing from the scope.
  • Various inventions can be realized by appropriately combining the structural elements disclosed in the embodiments. For instance, some of the disclosed structural elements may be omitted, and structural elements of different embodiments may be combined as appropriate.

Abstract

An apparatus includes a first-computation unit computing first-angular-velocity-instruction values for driving first-and-second-rotation units to track a moving object, using a detected tracking error and detected angles, when the moving object exists in a first range separate from a zenith by at least a preset distance; a second-computation unit computing second-angular-velocity-instruction values for driving the first-and-second-rotation units to track the moving object and avoid a zenith-singular point, using the detected angles, the detected tracking error and an estimated traveling direction; and a control unit controlling the first-and-second-rotation units to eliminate differences between the first-angular-velocity-instruction values and the angular velocities when the moving object exists in the first range, and controlling the first-and-second-rotation units to eliminate differences between the second-angular-velocity-instruction values and the angular velocities when the moving object exists in a second range within the preset distance from the zenith.

Description

CROSS-REFERENCE TO RELATED APPLICATIONS
This application is based upon and claims the benefit of priority from prior Japanese Patent Application No. 2009-076700, filed Mar. 26, 2009, the entire contents of which are incorporated herein by reference.
BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to a moving object image tracking apparatus and method for enabling a target recognition sensor, such as a camera, to track a target that can move in every direction.
2. Description of the Related Art
In recent years, systems for tracking an object using, for example, an ITV camera, to realize continued monitoring or acquire detailed information have been produced on a commercial basis for protective service equipment employed in major facilities such as airports and manufacturing plants, lifeline facilities such as electric power plants and water networks, and traffic information support systems such as ITSs. These systems include not only ground equipment type systems, but also compact ones installed in vehicles, ships or airplanes and having a vibration-proof structure. In these systems, it has become important to enhance the whirling speed so that the system can quickly point at a plurality of targets and track them sequentially.
In the above moving object image tracking systems, the gimbal structure must have at least two axes to track a target that moves in every direction. In a biaxial gimbal, when the target passes through or near the zenith, the AZ axis of the gimbal must instantly rotate through 180°. In practice, however, this motion is hard to realize, resulting in a gimbal lock phenomenon in which continuous tracking is impossible. Accordingly, the biaxial gimbal structure cannot be oriented to the zenith, which makes it difficult to realize omnidirectional tracking.
In the conventional moving object image tracking systems, the degree of freedom is increased using a triaxial gimbal structure, and one operation is distributed to the AZ axis and the xEL axis to prevent the angular velocity of the gimbal from excessively increasing, whereby the movable range of the gimbal is not exceeded to avoid the gimbal lock phenomenon and enable continuous omnidirectional tracking (see, for example, JP-A 2006-106910 (KOKAI)).
The above-described triaxial gimbal structure is more complex than the biaxial gimbal structure, and is hard to reduce in size and cost since it requires a larger number of driving means, such as motors. Further, since a camera, for example, is mounted on the gimbal structure, the load inertia of the xEL axis is large, which may cause axial interference between the AZ axis and the xEL axis. This is a problem peculiar to the triaxial gimbal structure.
Further, to execute tracking near the zenith using the biaxial gimbal structure, motor performance that allows, for example, instantaneous 180° rotation is required; thus, excessive demands must be satisfied.
The present invention has been developed in light of the above, and provides a moving object image tracking apparatus that exhibits improved tracking performance without any additional sensor, and a method employed therein.
BRIEF SUMMARY OF THE INVENTION
According to an aspect of the invention, there is provided a moving object image tracking apparatus comprising: a first rotation unit configured to rotate about an azimuth axis vertically oriented and rotatably supported; a second rotation unit configured to rotate about an elevation axis rotatably supported and horizontally oriented, the elevation axis being perpendicular to the azimuth axis, the second rotation unit being horizontally rotatable from a front position at which the second rotation unit faces a front, to a back position at which the second rotation unit faces a back, via an angular position corresponding to a zenith, the second rotation unit having a movable range of at least 180°; a driving unit configured to drive the first rotation unit and the second rotation unit to rotate independent of each other; an acquisition unit supported by the second rotation unit and configured to acquire image data of a moving object by photography; a first detection unit configured to detect, in the image data, a tracking error indicating a deviation of the moving object from a center of a field of view of the acquisition unit; a second detection unit configured to detect angles indicating attitudes of the first rotation unit and the second rotation unit; a third detection unit configured to detect angular velocities of the first rotation unit and the second rotation unit; a first computation unit configured to compute first angular velocity instruction values for driving the first rotation unit and the second rotation unit to track the moving object, using the detected tracking error and the detected angles, when the moving object exists in a first range separate from the zenith by at least a preset distance; an estimation unit configured to estimate a traveling direction of the moving object using the detected angles and the detected tracking error, when the moving object exists in a second range within the preset distance from the zenith; a second computation unit configured to compute second angular velocity instruction values for driving the first rotation unit and the second rotation unit to track the moving object and avoid a zenith singular point, using the detected angles, the detected tracking error and the estimated traveling direction; and a control unit configured to control the driving unit to eliminate differences between the first angular velocity instruction values and the angular velocities when the moving object exists in the first range, and to control the driving unit to eliminate differences between the second angular velocity instruction values and the angular velocities when the moving object exists in the second range.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWING
FIG. 1 is a block diagram illustrating a moving object image tracking apparatus according to a first embodiment;
FIG. 2 is a perspective view illustrating the gimbal driving unit shown in FIG. 1;
FIG. 3 is a block diagram illustrating the correction control system shown in FIG. 1;
FIG. 4 is a schematic view illustrating the field of view of the camera sensor shown in FIG. 1, and moving object tracking;
FIG. 5 is a view illustrating the trajectory of a moving object, and that of the optical axis of an optical system;
FIG. 6 is a view illustrating the trajectories, correction range and tracking error, using a gimbal zenith coordinate system;
FIG. 7 is a view illustrating the trajectories, correction range and tracking error obtained after the zenith is reached, using the gimbal zenith coordinate system;
FIG. 8 is a view illustrating an operation example of the angular velocity instruction selection unit shown in FIG. 1;
FIG. 9 is a view illustrating angular velocity instruction values that depend on whether correction control is executed in the moving object image tracking apparatus of FIG. 1;
FIG. 10 is a view illustrating tracking errors that depend on whether correction control is executed in the moving object image tracking apparatus of FIG. 1;
FIG. 11 is a view illustrating the trajectory of an optical axis vector in the gimbal zenith coordinate system, obtained when the correction control of the moving object image tracking apparatus shown in FIG. 1 is used;
FIG. 12 is a block diagram illustrating a correction control system according to a second embodiment;
FIG. 13 is a flowchart illustrating an operation example of the angular velocity instruction selection unit shown in FIG. 12; and
FIG. 14 is a block diagram illustrating a correction control system according to a third embodiment.
DETAILED DESCRIPTION OF THE INVENTION
Moving object image tracking apparatuses and methods according to embodiments will be described in detail with reference to the accompanying drawings. In the embodiments, like reference numbers denote like elements, and duplicate descriptions will be avoided.
The moving object image tracking apparatuses of the embodiments are obtained by applying a control system for a moving object image tracking mechanism to an image tracking system.
First Embodiment
Referring first to FIG. 1, a moving object image tracking apparatus according to a first embodiment will be described.
The moving object image tracking apparatus of the first embodiment includes first and second gimbals 111 and 121, first and second driving units 112 and 122, first and second angular velocity sensors 113 and 123, first and second angle sensors 114 and 124, a camera sensor 140, a tracking error detecting unit 173, an angular velocity instruction generating unit 150, a moving object direction estimating unit 174, a corrected angular velocity instruction generating unit 171, an angular velocity instruction selecting unit 172, and a driving control unit 160. The driving control unit 160 includes first and second servo controllers 161 and 162.
The first gimbal 111 rotates about a first gimbal axis 110 as a rotatably supported vertical azimuth axis. The second gimbal 121 rotates about a second gimbal axis 120 that is perpendicular to the first gimbal axis 110 and serves as a rotatably supported elevation axis. The first and second driving units 112 and 122 rotate the first and second gimbals 111 and 121, respectively.
The first angular velocity sensor 113 detects the angular velocity of the first gimbal 111 that rotates about the first gimbal axis 110. The second angular velocity sensor 123 detects the angular velocity of the second gimbal 121 that rotates about the second gimbal axis 120.
The first angle sensor 114 detects the angle of rotation of the first gimbal 111 with respect to a gimbal fixing unit (not shown). The second angle sensor 124 detects the angle of rotation of the second gimbal 121 with respect to the first gimbal 111. The camera sensor 140 is supported by the second gimbal 121 and used to detect a moving object and produce image data thereof.
The tracking error detecting unit 173 performs image processing on image data obtained from the camera sensor 140 to detect a tracking error. In general, the tracking error detecting unit 173 executes binarization to obtain monochrome image data, extracts the characterizing point of the moving object to determine the position thereof in the field of view of the camera, and detects, as the detected tracking error, a two-dimensional displacement (ΔX, ΔY) from the center of the field of view. The time required for the above process including image processing is regarded as a sampling time for detecting a tracking error. The detected tracking error will be described later with reference to FIG. 4.
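As a concrete illustration of this detection step, the sketch below binarizes a grayscale frame and takes the centroid of the bright pixels as the characterizing point; the function name, the fixed threshold and the centroid choice are assumptions for illustration, since the text specifies only binarization, characterizing-point extraction and the displacement from the center of the field of view.

```python
import numpy as np

def detect_tracking_error(frame, threshold=128):
    """Sketch of the tracking error detection: binarize, find the
    characterizing point, return its displacement (dX, dY) from the
    center of the field of view (in pixels)."""
    mask = frame >= threshold          # binarization to monochrome data
    if not mask.any():
        return None                    # no object in the field of view
    ys, xs = np.nonzero(mask)          # bright-pixel coordinates
    cx, cy = xs.mean(), ys.mean()      # centroid as characterizing point
    h, w = frame.shape
    return cx - w / 2.0, cy - h / 2.0  # (ΔX, ΔY)
```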
The angular velocity instruction generating unit 150 generates angular velocity instruction values for driving the gimbals to track a moving object, based on the detected two-dimensional tracking error from the tracking error detecting unit 173, and the angles (θ1, θ2) of the two axes, which indicate the attitudes of the gimbals and are detected by the first and second angle sensors 114 and 124. This process will be described later in detail with reference to FIG. 3.
The moving object direction estimating unit 174 receives data on the detected two-dimensional tracking error from the tracking error detecting unit 173, and receives data on the detected angles from the first and second angle sensors 114 and 124, thereby acquiring data on the traveling direction of the moving object as an estimated moving object direction value (estimated angle).
The corrected angular velocity instruction generating unit 171 receives data on the estimated moving object direction value, the detected two-dimensional tracking error and the detected angles from the moving object direction estimating unit 174, the tracking error detecting unit 173 and the first and second angle sensors 114 and 124, respectively, and generates corrected angular velocity instruction values for driving the gimbals to track the moving object with the zenith avoided.
The angular velocity instruction selecting unit 172 receives the angular velocity instruction values, the corrected angular velocity instruction values, and the detected angles from the angular velocity instruction generating unit 150, the corrected angular velocity instruction generating unit 171 and the angle sensors 114 and 124, respectively, and outputs either the angular velocity instruction values received from the angular velocity instruction generating unit 150, or the corrected angular velocity instruction values received from the corrected angular velocity instruction generating unit 171, in accordance with the degree of closeness in the orientation of the optical axis to the zenith.
The driving control unit 160 computes control values that null the difference between each of the angular velocity instruction values for the first and second angular velocity sensors 113 and 123 and the corresponding angular velocity detected by those sensors. The first and second servo controllers 161 and 162 correspond to the first and second angular velocity sensors 113 and 123, respectively, and output control values to the first and second driving units 112 and 122, respectively.
Referring then to FIG. 2, the gimbal driving unit used in the first embodiment will be described.
The first gimbal axis of the gimbal driving unit is an azimuth axis (hereinafter referred to simply as the "AZ axis"), and the second gimbal axis is an elevation axis (hereinafter referred to simply as the "EL axis"). The moving object image tracking apparatus of FIG. 1 is a biaxial whirling apparatus having a biaxial structure in which the AZ axis and the EL axis intersect each other at a point.
Referring to FIG. 3, a description will be given of a correction control system incorporated in the moving object image tracking apparatus of the embodiment. FIG. 3 is a block diagram collectively illustrating the AZ axis and the EL axis.
The angular velocity instruction generating unit 150 generates angular velocity instruction values given by the following expression (A) and used for driving the gimbals to track a moving object, based on the detected two-dimensional tracking error obtained from the tracking error detecting unit 173, and the angles (θ1, θ2) of the two axes, which indicate the attitudes of the gimbals and are detected by the first and second angle sensors 114 and 124, respectively.
$\dot{\theta}_{r1},\ \dot{\theta}_{r2}$  (A)
The angular velocities set for the respective gimbal axes based on the two-dimensional image data (ΔX, ΔY) can be expressed by the following relational expression for computing an angular velocity instruction value from a detected tracking error and detected angles:
$$\begin{bmatrix} \dot{\theta}_{r1} \\ \dot{\theta}_{r2} \end{bmatrix} = K_c \begin{bmatrix} -\sec\theta_2 & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} \Delta X \\ \Delta Y \end{bmatrix} \qquad (1)$$
where Kc is a tracking gain, and sec θ is the secant of θ (a function that diverges to infinity as θ approaches 90°). Accordingly, at or near the zenith, an extremely high angular velocity instruction will be output to the first gimbal 111, which gives rise to the gimbal lock problem.
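In code, expression (1) is a two-line computation; the sketch below makes the divergence explicit. The function name and the radian convention are assumptions.

```python
import math

def angular_velocity_instruction(dX, dY, theta2, Kc=1.0):
    """Sketch of expression (1): gimbal-rate instructions from the
    tracking error (dX, dY) and the EL angle theta2 (radians). The
    -sec(theta2) factor is unbounded near theta2 = 90 deg, which is
    exactly the gimbal-lock problem discussed above."""
    sec2 = 1.0 / math.cos(theta2)  # sec θ2, blows up near the zenith
    rate_az = Kc * (-sec2) * dX    # AZ-axis instruction
    rate_el = Kc * dY              # EL-axis instruction
    return rate_az, rate_el
```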
In the correction control performed in the first embodiment, the moving object direction estimating unit 174 estimates the traveling direction of the moving object, based on angle data indicating the angles of the gimbals, and data indicating the detected tracking error and sent from the camera sensor 140. In the zenith correction range to which zenith correction is applied, the optical axis of the camera sensor determined by the angles of the gimbals is oriented toward the zenith, and the position of the moving object can be estimated from the optical axis and the tracking error. In the gimbal zenith coordinate system in which 0° and 90° are set as reference values for the first and second gimbals, respectively, the position component (Xj, Yj) of the moving object associated with the optical axis is expressed using the following transform from the spherical coordinate system to the Cartesian coordinate system:
$X_j[n] = \cos\theta_2[n] \cdot \cos(\theta_1[n] - \pi/2)$
$Y_j[n] = \cos\theta_2[n] \cdot \sin(\theta_1[n] - \pi/2)$  (2)
Whenever the zenith correction is applied, a tracking error occurs. Therefore, it is necessary to correct a position component based on the tracking error, using the position of the optical axis. The camera coordinate system secured to the camera is rotated by the first gimbal 111, and hence the position component (Xe, Ye) due to the tracking error is given by the following equation that expresses the inverse rotation transform of the angle of the first gimbal:
$$\begin{bmatrix} X_e[n] \\ Y_e[n] \end{bmatrix} = \begin{bmatrix} \cos(-\theta_1[n]) & -\sin(-\theta_1[n]) \\ \sin(-\theta_1[n]) & \cos(-\theta_1[n]) \end{bmatrix} \begin{bmatrix} \Delta X \\ \Delta Y \end{bmatrix} \qquad (3)$$
Accordingly, the position (X, Y) of the moving object in the gimbal zenith coordinate system is given by the following equations:
$X[n] = X_j[n] - X_e[n]$
$Y[n] = Y_j[n] + Y_e[n]$  (4)
Based on the thus-estimated position (X, Y) of the moving object, the difference (dx, dy) of a k-sampling interval between a sample (n−k) and a subsequent sample (n) is given by
$dx = X[n] - X[n-k]$
$dy = Y[n] - Y[n-k]$  (5)
Based on the thus-determined difference in moving object estimated position, the traveling direction (angle) θest of the moving object is given by
$\theta_{est} = \arctan(dy/dx)$  (6)
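The chain (2)-(6) can be sketched as follows. The helper names and the use of atan2 (rather than arctan(dy/dx), to preserve quadrant information) are implementation assumptions.

```python
import math

def moving_object_position(theta1, theta2, dX, dY):
    """Equations (2)-(4): estimated position of the moving object in
    the gimbal zenith coordinate system."""
    # (2): optical-axis component, spherical -> Cartesian
    Xj = math.cos(theta2) * math.cos(theta1 - math.pi / 2)
    Yj = math.cos(theta2) * math.sin(theta1 - math.pi / 2)
    # (3): tracking-error component, inverse rotation by the AZ angle
    Xe = math.cos(-theta1) * dX - math.sin(-theta1) * dY
    Ye = math.sin(-theta1) * dX + math.cos(-theta1) * dY
    # (4): combined position estimate
    return Xj - Xe, Yj + Ye

def estimate_traveling_angle(history, k=5):
    """Equations (5)-(6): traveling direction from the k-sampling
    difference of estimated positions; `history` must hold at least
    k + 1 (X, Y) samples."""
    dx = history[-1][0] - history[-1 - k][0]
    dy = history[-1][1] - history[-1 - k][1]
    return math.atan2(dy, dx)  # θest
```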
The corrected angular velocity instruction generating unit 171 generates a corrected angular velocity instruction, using the thus-estimated moving object traveling angle θest, as well as the gimbal angles (θ1, θ2) and the tracking error (ΔX, ΔY).
Within the correction range in which the corrected angular velocity instruction is applied, the optical axis is oriented toward the zenith. Therefore, the correction range is given by the following mathematical expression that is related to a correction range threshold angle θth and utilizes transformation of a spherical coordinate system based on the gimbal angles into a Cartesian coordinate system:
$|(\cos\theta_2 \cdot \cos\theta_1)^2 + (\cos\theta_2 \cdot \sin\theta_1)^2| < (\sin\theta_{th})^2$
Namely, $|(\cos\theta_2)^2| < (\sin\theta_{th})^2$  (7)
When the optical axis is within the correction range, correction driving for driving the optical axis to pass through the zenith is executed within an allowable tracking error range, in order to avoid gimbal lock in which an excessive angular velocity instruction is generated because of a tracking error near the zenith.
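Expression (7) reduces to a single comparison, as in this minimal sketch:

```python
import math

def in_zenith_correction_range(theta2, theta_th):
    """Expression (7): the optical axis is inside the correction range
    when (cos θ2)^2 < (sin θth)^2."""
    return math.cos(theta2) ** 2 < math.sin(theta_th) ** 2
```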
Assume that when the EL axis is at 0°, the camera faces the front and the AZ axis is at 0°. This position will hereinafter be referred to as “the front position.” Similarly, assume that when the EL axis is at 180°, the camera faces the back and the AZ axis is at 180°. This position will hereinafter be referred to as “the back position.” When the moving object is at the zenith, the traveling direction of the moving object is opposite to the orientation in which the front position is assumed. Accordingly, a corrected AZ-axis angle θneed is computed based on the angle θ1 of the AZ axis and the estimated moving object traveling angle θest, using the following equation:
θneedest−π/2−θ1  (8)
A corrected angular velocity instruction value provided for the first gimbal 111 from when the moving object enters the correction range, to when it reaches the zenith is determined, based on the corrected angle θneed and using the following equation:
$\dot{\theta}'_{r1} = K \cdot \theta_{need}$ (from when the correction range is entered to when the zenith is reached)  (9)
where K is a corrected angular velocity gain.
Namely, until the moving object passes through the zenith, the angular velocity of the first gimbal 111 is controlled so that the difference between the azimuth axis angle of the first gimbal 111 and the moving direction of the moving object will approach zero.
When the AZ axis is driven by the corrected angular velocity instruction, and the zenith, where the EL axis assumes 90°, is reached, a tracking error inevitably occurs in the X component in accordance with a deviation of the moving object from the zenith. Therefore, correction driving for compensating for the tracking error is executed after the moving object passes through the zenith.
Correction driving for compensating for the tracking error is executed in the same manner as that based on the aforementioned mathematical expression (1). However, since near the zenith, the sec function has an extremely high value, a corrected angular velocity instruction based on a gain Kp is generated for compensating for a tracking error in the zenith correction range, using the following equation:
$\dot{\theta}'_{r1} = K_c \cdot K_p \cdot \Delta X$ (from when the zenith is reached to when the correction range is left)  (10)
After the EL axis exceeds 90° by the zenith correction and the moving object leaves the correction range, if the moving object again enters the correction range, the mathematical expression (9) is used, and the corrected AZ-axis angle θneed needed in this case is given by
$\theta_{need} = \pi + \theta_{est} - \pi/2 - \theta_1$ (from when the correction range is re-entered after being left, to when the zenith is reached)  (11)
Accordingly, when the moving object again enters the correction range after the EL axis assumes 90° or more and the moving object leaves the correction range, a zenith corrected angular velocity instruction given by the following equation is generated for tracking error compensation:
$\dot{\theta}'_{r1} = K_c \cdot K_p \cdot \Delta X$ (from when the zenith is again reached after the correction range is left, to when the correction range is left)  (12)
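Gathered into one dispatching function, equations (8)-(12) might look as follows. The boolean flags and default gains are assumptions; the text defines the equations and their phases but not this packaging.

```python
import math

def corrected_az_rate(theta1, theta_est, dX, passed_zenith, reentered,
                      K=1.0, Kc=1.0, Kp=0.1):
    """Sketch of the corrected AZ-axis instruction, equations (8)-(12)."""
    if not passed_zenith:
        # (8), or (11) after re-entering the correction range
        theta_need = theta_est - math.pi / 2 - theta1
        if reentered:
            theta_need += math.pi
        return K * theta_need  # (9): drive until the zenith is reached
    # (10)/(12): after the zenith, compensate the X tracking error
    return Kc * Kp * dX
```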
The angular velocity instruction selecting unit 172 uses the thus-determined angular velocity instruction value and corrected angular velocity instruction value, and the angles detected by the angle sensors 114 and 124, to select an angular velocity instruction value for the position near the zenith. When the moving object is outside the correction range, the angular velocity instruction selecting unit 172 uses, as the selected angular velocity instruction value, the following angular velocity instruction value (B) based on data indicating the detected tracking error and sent from the tracking error detecting unit 173. In contrast, when the moving object is within the correction range, the angular velocity instruction selecting unit 172 uses the following angular velocity instruction value (C) based on the aforementioned estimated moving object direction value. Regarding the second gimbal 121, the following angular velocity instruction value (D) is always used since no excessive angular velocity instruction is generated.
$\dot{\theta}_{r1}$  (B)
$\dot{\theta}'_{r1}$  (C)
$\dot{\theta}_{r2}$  (D)
Thus, the driving control unit 160 computes control instruction values that reduce to zero the differences between the respective angular velocity instruction values generated by the angular velocity instruction selecting unit 172 and the angular velocities detected by the first and second angular velocity sensors 113 and 123. Based on the thus-computed instruction values, the gimbal driving unit is driven to track the moving object. The gimbal driving unit includes the first and second gimbals 111 and 121 and the first and second driving units 112 and 122.
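As a minimal sketch of this control step, assuming a simple proportional rate loop (the disclosure does not fix the controller structure), the computation performed by the driving control unit 160 for each gimbal axis can be expressed as follows; all names are illustrative.

def drive_command(omega_instruction, omega_detected, K_v):
    # The control instruction value drives to zero the difference between the
    # selected angular velocity instruction and the angular velocity detected
    # by the angular velocity sensor; K_v is an assumed rate-loop gain.
    return K_v * (omega_instruction - omega_detected)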
By driving the moving object image tracking apparatus as described above, an excessive angular velocity instruction can be avoided even near the zenith, and hence appropriate angular velocity instructions for executing an appropriate tracking operation can be generated even near the zenith.
Referring then to FIG. 4, a description will be given of the field of view of the camera sensor 140 and tracking of a moving object.
FIG. 4 is a schematic view illustrating the field of view of the camera sensor and moving object tracking. When the moving object falls within the field of view of the camera, a detected tracking error (ΔX, ΔY) is obtained as a deviation from the center of the field of view of the camera. In consideration of tracking delay, the detected tracking error must fall within the field of view of the camera. It is desirable that the detected tracking error be small. However, even if the detected tracking error is relatively large, the moving object can be tracked using the biaxial gimbal structure, as long as the error falls within the field of view of the camera.
Referring then to FIGS. 5, 6 and 7, a description will be given of a correction process executed when a moving object enters the zenith range, passes therein and leaves it.
FIG. 5 is a view illustrating the trajectory of the moving object, and that of the optical axis resulting from driving based on corrected angular velocities. Expressed three-dimensionally, the optical axis can be oriented by the biaxial gimbal structure in every direction on the hemisphere. Consideration will now be given to a typical example in which the moving object travels from the front of the tracking apparatus to the back along a trajectory on the hemisphere that deviates slightly from the great semicircle passing through the zenith. In FIG. 5, the range defined by the correction range threshold angle θth from the zenith is set as the correction range.
FIG. 6 is a view in which the correction range is enlarged two-dimensionally. FIGS. 6 and 7 show the trajectory of the moving object, the correction range and a tracking error in the gimbal zenith coordinate system. In this example, the moving object moves upwards. When the optical axis enters the correction range as a result of upwardly tracking the moving object using an angular velocity instruction based on a detected tracking error, the estimated moving object traveling angle θest becomes 90° in the gimbal zenith coordinate system. Accordingly, the angle of the AZ axis to be set until the zenith is reached is 0°, and the first gimbal 111 is driven by the corrected angular velocity instruction of the mathematical expression (9), based on the corrected angle of the mathematical expression (8). The second gimbal 121 is driven by an angular velocity instruction based on the tracking error detected by the tracking error detecting unit 173.
FIG. 7 is a view illustrating the positional relationship between the optical axis and the moving object, assumed when the optical axis reaches the zenith. At the zenith, at which the EL axis assumes 90°, the traveling direction of the moving object is opposite to the aforementioned orientation in which the front position is assumed, and an X-directional tracking error ΔX corresponding to a deviation of the optical axis from the zenith occurs. To compensate for the tracking error, the first gimbal 111 is driven by a corrected angular velocity instruction based on the mathematical expression (10). The second gimbal 121 is driven by an angular velocity instruction based on the tracking error ΔY that is detected by the tracking error detecting unit 173 from the image acquired from the camera sensor 140. As a result, the tracking error is compensated for. Further, when the moving object leaves the zenith correction range, the corrected angular velocity instruction is switched to the angular velocity instruction based on the detected tracking error, thereby enabling continuous tracking.
Referring then to FIG. 8, an operation example of the angular velocity instruction selecting unit 172 will be described.
Firstly, it is determined whether the moving object has entered the zenith correction range, using the gimbal angles detected by the angle sensors 114 and 124, and the mathematical expression (7) (step S801). If it is determined that the moving object is outside the zenith correction range, the angular velocity instruction value generated by the angular velocity instruction generating unit 150 is selected (step S814), and is transferred to the driving control unit 160. In contrast, if it is determined that the moving object has entered the correction range, it is determined whether the moving object has passed through the zenith, based on whether the EL axis has rotated through 90° (step S802). If the zenith is not yet reached, the moving object direction estimating unit 174 computes an estimated moving object direction value, based on the gimbal angles obtained from the angle sensors 114 and 124, and the tracking error obtained from the camera sensor 140 (step S806). If the angle of the EL axis does not exceed 90°, the corrected angle is computed as corrected angle A, using the mathematical expression (8) (step S808), while if the angle of the EL axis exceeds 90°, the corrected angle is computed as corrected angle B, using the mathematical expression (11) (step S809). If the thus-computed angle is greater than a corrected-angle threshold value, the angular velocity instruction value generated by the angular velocity instruction generating unit 150 is selected (steps S810 and S814). This limits the application of correction, even within the zenith correction range, when the rotational angle required of the first gimbal 111 is so large that an excessively high corrected angular velocity would have to be instructed. If the corrected angle for the first gimbal 111 is determined to be not greater than the threshold value, a corrected angular velocity instruction for instructing the corrected angular velocity A is generated using the mathematical expression (9) (step S811).
In contrast, after the moving object passes through the zenith, if the angle of the EL axis exceeds 90°, a corrected angular velocity instruction for instructing the corrected angular velocity B is generated using the mathematical expression (10) (step S804), while if the angle of the EL axis does not exceed 90°, a corrected angular velocity instruction for instructing a corrected angular velocity C is generated using the mathematical expression (11) (step S805).
Within the zenith correction range, the angular velocity instruction selecting unit 172 selects one of the computed corrected angular velocity instructions, and transfers it to the driving control unit 160. While the gimbal structure is being driven by the corrected angular velocity instruction, it is repeatedly determined whether the moving object has passed through the zenith, and the computation of the corrected angular velocity instruction is switched accordingly. Further, while zenith correction is being executed, if it is determined that the moving object has left the zenith correction range, the angular velocity instruction value generated by the angular velocity instruction generating unit 150 is selected, and it is then determined whether the moving object has again entered the zenith correction range.
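The selection flow of FIG. 8 for the first gimbal 111 (AZ axis) can be sketched in Python as follows. This is an illustration only: the Boolean inputs stand for the determinations of steps S801 and S802, the remaining arguments follow the symbols used in the text, and all names are assumptions.

import math

def select_az_instruction(in_correction_range, passed_zenith,
                          theta_1, theta_2, theta_est, delta_x,
                          K, K_c, K_p, theta_threshold,
                          tracking_instruction):
    if not in_correction_range:                 # step S801
        return tracking_instruction             # step S814
    if not passed_zenith:                       # step S802
        if theta_2 <= math.pi / 2:              # corrected angle A, expression (8), step S808
            theta_need = theta_est - math.pi / 2 - theta_1
        else:                                   # corrected angle B, expression (11), step S809
            theta_need = math.pi + theta_est - math.pi / 2 - theta_1
        if abs(theta_need) > theta_threshold:   # steps S810 and S814
            return tracking_instruction
        return K * theta_need                   # expression (9), step S811
    # After the zenith: compensate the X-directional tracking error,
    # expressions (10) and (12), steps S804 and S805.
    return K_c * K_p * delta_x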
Referring now to FIGS. 9, 10 and 11, a description will be given of variations in tracking error that occur depending upon whether correction control is executed by the moving object image tracking apparatus of the embodiment when the moving object travels from the front of the apparatus to the back along a trajectory slightly deviated from the zenith.
When the moving object starts traveling slowly and passes near the zenith, the closer it comes to the zenith, the more the gimbal structure must rotate the AZ axis horizontally, and at a higher rate, from the front position at which the camera sensor faces the front; because of the limitations of the gimbal driving characteristics, this degrades the moving object tracking performance of the gimbal structure.
FIG. 9 shows the angular velocity instruction values imparted with time through the driving control unit 160 to the first and second gimbals 111 and 121. The first gimbal 111 corresponds to AZ, and the second gimbal 121 corresponds to EL. In the case (shown in FIG. 9(A)) where no correction is performed, an AZ-axis angular velocity instruction of an extremely high level is imparted near the zenith (near 2.2 [s] in the figure). However, this instruction cannot be followed because of the limitations of the gimbal driving characteristics. In contrast, in the case (shown in FIG. 9(B)) where correction is performed, angular velocity instructions are imparted which enable the gimbal driving characteristics to sufficiently track the moving object. The remarkable feature of the case shown in FIG. 9(B) where correction is performed near the zenith is that when the moving object has entered the correction range, the AZ axis starts to be rotated to counter the traveling direction of the moving object (for example, to be rotated clockwise in FIG. 2), and is therefore decelerated, with the result that the X-directional error ΔX due to zenith correction can be compensated for after passing through the zenith. Further, in the case of FIG. 9(B), the EL axis is driven at the angular velocity corresponding to the traveling speed of the moving object, and has its angle varied from 0 to 180° when tracking the moving object from the front to the back.
FIG. 10 shows variations with time in the x- and y-components of the camera tracking error obtained when no correction is performed, and those obtained when correction is performed. In the case (shown in FIG. 10(A)) where no correction is performed, an extremely large AZ-axis angular velocity instruction is generated, and because of the limitations of the gimbal driving characteristics, a large tracking error occurs that falls outside the tracking error detection range, whereby tracking becomes impossible. In contrast, in the case (shown in FIG. 10(B)) where correction is performed, the gimbal structure can reliably track the moving object within the tracking error detection range. The remarkable feature of the case shown in FIG. 10(B) is that the tracking error reaches its maximum immediately before 2.2 [s], when the moving object passes through the zenith, but remains within the field of view of the camera. This means that stable tracking can be executed.
FIG. 11 shows the vector trajectory of the optical axis in the gimbal zenith coordinate system, obtained when the correction is performed. The circle indicated by the broken line indicates the correction range. As shown in FIG. 11, the optical axis enters the correction range while tracking upward from below, whereupon zenith correction starts; the optical axis is temporarily oriented toward the zenith at coordinates (0, 0), and then returns to upward tracking with the occurrence of a tracking error suppressed.
The biaxial gimbal structure with no additional sensor, according to the first embodiment described above, has a movable range in which the second gimbal can be oriented from the front position to the back position, and hence can perform omnidirectional tracking. In this gimbal structure, at and near the zenith, the traveling direction of the moving object is estimated based on angle data and the detected tracking error, and driving is executed in accordance with a corrected angular velocity instruction that corresponds to the estimated direction. As a result, an appropriate angular velocity instruction for tracking a moving object, free from the gimbal lock caused near the zenith by an excessive angular velocity instruction, can be generated, thereby improving the tracking performance.
Second Embodiment
In correction control performed in a moving object image tracking apparatus according to a second embodiment, the moving state of a moving object is determined using a moving object angular velocity estimated by a moving object angular velocity estimating unit 1201, and when tracking is started at a position extremely close to the zenith, angular velocity instructions are generated for the azimuth axis and the elevation axis so that the optical axis is oriented toward the zenith.
Referring to FIG. 12, a description will be given of the correction control system incorporated in the moving object image tracking apparatus of the second embodiment. FIG. 12 is a block diagram collectively illustrating the AZ axis and the EL axis.
The moving object angular velocity estimating unit 1201 receives data on a two-dimensional tracking error from the tracking error detecting unit 173, data on angles from the first and second angle sensors 114 and 124, and data on angular velocities from the first and second angular velocity sensors 113 and 123, thereby determining the velocity of the moving object.
In the correction control of the second embodiment, the moving object angular velocity estimating unit 1201 estimates the moving object velocity based on angle data indicating the angles of the gimbals, angular velocity data (ω1, ω2) indicating the angular velocities of the gimbals, and data (ΔX, ΔY) indicating a detected tracking error sent from the camera sensor 140.
When the zenith correction is not applied and the moving object is being tracked, the optical axis of the camera continuously follows the moving object. At this time, since the vector velocity of the optical axis is equal to the velocity of the moving object, the estimated moving object angular velocity ωest is given by
$\omega_{est} = \sqrt{(\omega_1 \cos\theta_2)^2 + \omega_2^2}$  (13)
In contrast, when the zenith correction is applied, an operation that tolerates a tracking error is executed in order to avoid a high angular velocity instruction for the first gimbal 111, and therefore the velocity of the moving object cannot be estimated using the mathematical expression (13). In the zenith correction range, however, the X-directional tracking error is not related to the traveling of the moving object, and the velocity of the moving object is equal to the angular velocity instruction for the second gimbal 121. Accordingly, the velocity of the moving object is given by the following equation based on the Y-directional tracking error and a tracking gain Kc:
$\omega_{est} = \sqrt{(K_c \Delta Y)^2}$  (14)
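Combining the two cases, the estimation can be sketched in Python as follows; the function name and arguments are assumptions, with zenith_correcting standing for whether the zenith correction is currently applied.

import math

def estimate_object_angular_velocity(omega_1, omega_2, theta_2,
                                     delta_y, K_c, zenith_correcting):
    if not zenith_correcting:
        # Expression (13): the vector velocity of the optical axis equals
        # the velocity of the tracked moving object.
        return math.sqrt((omega_1 * math.cos(theta_2)) ** 2 + omega_2 ** 2)
    # Expression (14): within the zenith correction range, only the
    # Y-directional tracking error reflects the traveling of the object.
    return math.sqrt((K_c * delta_y) ** 2)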
A corrected angular velocity generating unit 1202 generates a corrected angular velocity instruction based on the thus-determined estimated moving object angular velocity ωest, the gimbal angles (θ1, θ2), and the gimbal angular velocities (ω1, ω2).
Referring then to FIG. 13, an operation example of an angular velocity instruction selecting unit 1203 will be described.
Firstly, if it is determined that a moving object is in the zenith correction range (step S801), the traveling state of the moving object is determined based on the estimated moving object angular velocity ωest computed by the moving object angular velocity estimating unit 1201 (step S1301). If the estimated moving object angular velocity ωest is higher than the moving object angular velocity threshold value ωth, it is determined that the moving object is still traveling, and the program proceeds to step S1303. In contrast, if the estimated moving object angular velocity ωest is not higher than the moving object angular velocity threshold value ωth, the gimbal structure is driven using an angular velocity instruction that is based on a tracking error and is intended for tracking a halted moving object. However, near the zenith, sec θ2 assumes a very high value, and hence tracking of a halted moving object is performed using a sign function in place of the sec function (step S1302). The corrected angular velocity instruction generated for tracking a halted moving object is given by
$\dot{\theta}'_{r1} = K_c \cdot (-\mathrm{sign}(\theta_2 - \pi/2)) \cdot \Delta X$  (15)
where sign(θ) is a signum function that outputs 1 when θ is greater than 0, 0 when θ is equal to 0, and −1 when θ is less than 0.
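Expression (15) can be sketched in Python as follows; the names are assumptions.

import math

def sign(x):
    # Signum function: 1 for x > 0, 0 for x == 0, -1 for x < 0.
    return (x > 0) - (x < 0)

def halted_tracking_instruction(theta_2, delta_x, K_c):
    # Expression (15): near the zenith, the sec function is replaced by the
    # signum function when tracking a halted moving object.
    return K_c * (-sign(theta_2 - math.pi / 2)) * delta_x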
If it is determined that the moving object is traveling, it is determined whether an ultimate zenith correction application condition related to the zenith singular point is satisfied (step S1303). The ultimate zenith correction application condition is defined by the following mathematical expression, based on whether tracking is started within the zenith correction range and whether the optical axis of the camera is within the range θc of the field of view. The range θc of the field of view is smaller than the zenith correction range.
$|(\cos\theta_2 \cdot \cos\theta_1)^2 + (\cos\theta_2 \cdot \sin\theta_1)^2| < (\sin\theta_c)^2$  (16)
Namely, $|(\cos\theta_2)^2| < (\sin\theta_c)^2$.
If the ultimate zenith correction application condition expressed by the mathematical expression (16) is not satisfied, the program proceeds to step S802, where the same zenith correction as in the first embodiment is executed. In contrast, if the ultimate zenith correction application condition, i.e., the mathematical expression (16), is satisfied, a corrected angular velocity instruction D is computed using the following equations (step S1304):
$\dot{\theta}'_{r1} = K_c \cdot K_{p1} \cdot \Delta X$  (17)
$\dot{\theta}'_{r2} = K_{p2} \cdot (\pi/2 - \theta_2)$  (18)
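The test of expression (16) and the corrected angular velocity instruction D of expressions (17) and (18) can be sketched in Python as follows; the names are assumptions, and returning None stands for proceeding to the ordinary zenith correction of step S802.

import math

def ultimate_zenith_instruction(theta_1, theta_2, delta_x,
                                K_c, K_p1, K_p2, theta_c):
    # Expression (16); since cos^2 + sin^2 = 1, the left side reduces to
    # (cos theta_2)^2, as noted in the text.
    lhs = ((math.cos(theta_2) * math.cos(theta_1)) ** 2
           + (math.cos(theta_2) * math.sin(theta_1)) ** 2)
    if lhs >= math.sin(theta_c) ** 2:
        return None
    omega_az = K_c * K_p1 * delta_x             # expression (17): reduce the tracking error
    omega_el = K_p2 * (math.pi / 2 - theta_2)   # expression (18): drive the EL axis toward 90 degrees
    return omega_az, omega_el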
If moving object tracking is started in the ultimate zenith range including the zenith singular point, then in accordance with the corrected angular velocity instruction D, the AZ axis is driven to reduce the tracking error, and the EL axis is driven toward 90°, which corresponds to the zenith, regardless of the tracking error. As a result, the EL axis is rotated to pass 90°. Therefore, in the subsequent zenith correction process, the same processing as in the first embodiment is executed. Namely, in the zenith correction range, the angular velocity instruction selecting unit 1203 selects the thus-computed corrected angular velocity instruction, and transfers it to the driving control unit 160. While driving is being executed in accordance with the corrected angular velocity instruction, it is repeatedly determined whether the EL axis has passed 90°, and the computation of the corrected angular velocity instruction is switched accordingly. Further, in the zenith correction state, if it is determined that the optical axis is oriented outside the zenith correction range, the angular velocity instruction values generated by the angular velocity instruction generating unit 150 are selected (steps S801 and S814), and the program returns to the determination as to whether the optical axis is again oriented toward the zenith correction range.
In the above-described second embodiment, the traveling state of the moving object is determined using the moving object angular velocity estimated by the moving object angular velocity estimating unit, and when tracking is started at a position extremely close to the zenith, respective angular velocity instructions are generated for controlling the azimuth (AZ) axis and the elevation (EL) axis to orient the optical axis toward the zenith. As a result, gimbal lock near the zenith due to excessive angular velocity instructions can be avoided using appropriate angular velocity instructions, thereby improving the tracking performance.
Third Embodiment
In correction control executed by a moving object image tracking apparatus of a third embodiment, the traveling state of a moving object is determined using the difference in moving object position estimated by the moving object direction estimating unit 174, and when tracking is started at a position extremely close to the zenith, respective angular velocity instructions are generated for controlling the azimuth (AZ) axis and the elevation (EL) axis to orient the optical axis toward the zenith. The third embodiment differs from the second embodiment in the method of determining at step S1301 of FIG. 13 whether the moving object is traveling.
Referring to FIG. 14, a description will be given of a correction control system incorporated in the moving object image tracking apparatus of the third embodiment. FIG. 14 is a block diagram collectively illustrating the AZ and EL axes.
The condition for determining the traveling state of the moving object is given by the following mathematical expression, based on the difference (dx, dy) in moving object position estimated by the moving object direction estimating unit 174 and expressed by the aforementioned equation (5). Namely, whether the moving object is traveling can be determined from the position difference over a certain sample interval, which corresponds to the velocity of the moving object.
$dx^2 + dy^2 < r^2$  (19)
If the difference is smaller than the threshold value, the moving object is determined to be at a halt, whereas if the difference is greater than the threshold value, the moving object is determined to be traveling. After that, the same zenith correction as in the second embodiment is executed.
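The determination can be sketched in Python as follows; the names are assumptions, with r standing for the threshold radius.

def is_traveling(dx, dy, r):
    # Expression (19): the moving object is determined to be at a halt when
    # the position difference over one sample interval falls within radius r.
    return dx ** 2 + dy ** 2 >= r ** 2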
The above-described third embodiment can provide the same advantage as that of the second embodiment by determining whether the moving object is traveling, based on the difference (dx, dy) in moving object position.
The moving object image tracking apparatuses of the above-described embodiments effectively serve as a tracking camera system of an omnidirectional biaxial gimbal structure installed in a mobile apparatus that is provided with, for example, a TV camera, camera seeker or automatic surveying tool.
As described above, since the second gimbal has a movable range extending from the front position to the back position, moving object velocity values are estimated based on angle data at and near the zenith, and the gimbal structure is driven by a corrected angular velocity instruction, even the biaxial gimbal structure can avoid, near the zenith, gimbal lock due to excessive angular velocity instruction values, and an appropriate angular velocity instruction for tracking a moving object can be generated to improve the tracking characteristics.
The present invention is not limited to the above-described embodiments, but may be modified in various ways without departing from the scope. Various inventions can be realized by appropriately combining the structural elements disclosed in the embodiments. For instance, some of the disclosed structural elements may be deleted. Some structural elements of different embodiments may be combined appropriately.

Claims (7)

1. A moving object image tracking apparatus comprising:
a first rotation unit configured to rotate about an azimuth axis vertically oriented and rotatably supported;
a second rotation unit configured to rotate about an elevation axis rotatably supported and horizontally oriented, the elevation axis being perpendicular to the azimuth axis, the second rotation unit being horizontally rotatable from a front position at which the second rotation unit faces a front, to a back position at which the second rotation unit faces a back, via an angular position corresponding to a zenith, the second rotation unit having a movable range of at least 180°;
a driving unit configured to drive the first rotation unit and the second rotation unit to rotate independent of each other;
an acquisition unit supported by the second rotation unit and configured to acquire image data of a moving object by photography;
a first detection unit configured to detect, in the image data, a tracking error indicating a deviation of the moving object from a center of a field of view of the acquisition unit;
a second detection unit configured to detect angles indicating attitudes of the first rotation unit and the second rotation unit;
a third detection unit configured to detect angular velocities of the first rotation unit and the second rotation unit;
a first computation unit configured to compute first angular velocity instruction values for driving the first rotation unit and the second rotation unit to track the moving object, using the detected tracking error and the detected angles, when the moving object exists in a first range separate from the zenith by at least a preset distance;
an estimation unit configured to estimate a traveling direction of the moving object using the detected angles and the detected tracking error, when the moving object exists in a second range within the preset distance from the zenith;
a second computation unit configured to compute second angular velocity instruction values for driving the first rotation unit and the second rotation unit to track the moving object and avoid a zenith singular point, using the detected angles, the detected tracking error and the estimated traveling direction; and
a control unit configured to control the driving unit to eliminate differences between the first angular velocity instruction values and the angular velocities when the moving object exists in the first range, and to control the driving unit to eliminate differences between the second angular velocity instruction values and the angular velocities when the moving object exists in the second range.
2. The apparatus according to claim 1, wherein the estimation unit detects a position of the moving object in a zenith coordinate system, based on a positional component of an optical axis of the acquisition unit determined from the detected angles, and a positional component of the optical axis determined from the detected tracking error, and the estimation unit estimates a traveling direction of the moving object based on a difference between a currently detected position of the moving object and a position of the moving object detected a sampling interval before.
3. The apparatus according to claim 1, wherein until the moving object passes the zenith, the second computation unit controls a first angular velocity of the first rotation unit to cause a difference between an angle of the azimuth axis of the first rotation unit and the traveling direction to approach 0.
4. The apparatus according to claim 1, wherein after the moving object passes the zenith, the second computation unit controls a first angular velocity of the first rotation unit in accordance with the detected tracking error, to compensate for the detected tracking error.
5. The apparatus according to claim 1, further comprising a third computation unit configured to estimate an angular velocity of the moving object in accordance with the detected tracking error when the moving object exists in the second range,
and wherein if the second computation unit starts to track the moving object in the second range, and if the optical axis falls within a certain range of the field of view, the second computation unit controls a first angular velocity of the first rotation unit to compensate for the detected tracking error, and controls a second angular velocity of the second rotation unit to cause the second rotation unit to orient toward the zenith.
6. The apparatus according to claim 5, wherein when the moving object exists in the first range, the third computation unit estimates the angular velocity of the moving object based on the angles detected by the second detection unit and the angular velocities detected by the third detection unit.
7. A moving object image tracking method comprising:
preparing a first rotation unit configured to rotate about an azimuth axis vertically oriented and rotatably supported;
preparing a second rotation unit configured to rotate about an elevation axis rotatably supported and horizontally oriented, the elevation axis being perpendicular to the azimuth axis, the second rotation unit being horizontally rotatable from a front position at which the second rotation unit faces a front, to a back position at which the second rotation unit faces a back, via an angular position corresponding to a zenith, the second rotation unit having a movable range of at least 180°;
driving the first rotation unit and the second rotation unit to rotate independent of each other;
preparing an acquisition unit supported by the second rotation unit, and acquiring image data of a moving object by photography;
detecting, in the image data, a tracking error indicating a deviation of the moving object from a center of a field of view of the acquisition unit;
detecting angles indicating attitudes of the first rotation unit and the second rotation unit;
detecting angular velocities of the first rotation unit and the second rotation unit;
computing first angular velocity instruction values for driving the first rotation unit and the second rotation unit to track the moving object, using the detected tracking error and the detected angles, when the moving object exists in a first range separate from a zenith by at least a preset distance;
estimating a traveling direction of the moving object using the detected angles and the detected tracking error, when the moving object exists in a second range within the preset distance from the zenith;
computing second angular velocity instruction values for driving the first rotation unit and the second rotation unit to track the moving object and avoid a zenith singular point, using the detected angles, the detected tracking error and the estimated traveling direction; and
controlling the first rotation unit and the second rotation unit to eliminate differences between the first angular velocity instruction values and the angular velocities when the moving object exists in the first range, and controlling the first rotation unit and the second rotation unit to eliminate differences between the second angular velocity instruction values and the angular velocities when the moving object exists in the second range.
US12/550,648 2009-03-26 2009-08-31 Moving object image tracking apparatus and method Expired - Fee Related US8098893B2 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2009-076700 2009-03-26
JP2009076700A JP4810582B2 (en) 2009-03-26 2009-03-26 Mobile object image tracking apparatus and method

Publications (2)

Publication Number Publication Date
US20100246886A1 US20100246886A1 (en) 2010-09-30
US8098893B2 true US8098893B2 (en) 2012-01-17

Family

ID=42784298

Family Applications (1)

Application Number Title Priority Date Filing Date
US12/550,648 Expired - Fee Related US8098893B2 (en) 2009-03-26 2009-08-31 Moving object image tracking apparatus and method

Country Status (2)

Country Link
US (1) US8098893B2 (en)
JP (1) JP4810582B2 (en)


Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2929416B1 (en) * 2008-03-27 2010-11-05 Univ Paris 13 METHOD FOR DETERMINING A THREE-DIMENSIONAL REPRESENTATION OF AN OBJECT FROM A CUTTING IMAGE SEQUENCE, COMPUTER PROGRAM PRODUCT, CORRESPONDING OBJECT ANALYSIS METHOD, AND IMAGING SYSTEM
JP5197647B2 (en) * 2010-02-16 2013-05-15 株式会社東芝 Mobile image tracking device
US9615064B2 (en) * 2010-12-30 2017-04-04 Pelco, Inc. Tracking moving objects using a camera network
JP5836628B2 (en) * 2011-04-19 2015-12-24 キヤノン株式会社 Control system evaluation apparatus, evaluation method, and program
US9008359B2 (en) 2012-06-28 2015-04-14 International Business Machines Corporation Detection of static object on thoroughfare crossings
MX2015003533A (en) * 2012-09-20 2015-09-08 Nec Corp Antenna orientation adjustment assistance device and antenna device installation method.
EP2966530A4 (en) * 2013-03-07 2016-09-07 Nec Corp Space stabilizing device and space stabilization method
US9954277B2 (en) 2013-03-14 2018-04-24 Nec Corporation Antenna device and antenna device control method
EP3115079B1 (en) 2013-07-11 2019-03-20 Oticon Medical A/S Signal processor for a hearing device
KR101478172B1 (en) * 2014-07-09 2014-12-31 주식회사 다올넷 System for searching food trucks and users using location information
CN107065560A (en) * 2017-05-15 2017-08-18 北京环境特性研究所 A kind of two axle singular path photoelectric tracking control methods
JP7085954B2 (en) * 2018-09-25 2022-06-17 三菱電機株式会社 Antenna drive, antenna drive method, and antenna drive program
KR102195419B1 (en) * 2019-09-18 2020-12-28 (주)인텔리안테크놀로지스 Communication system


Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6022803A (en) * 1983-07-19 1985-02-05 Nec Corp Controller of satellite tracking antenna
JPH04207361A (en) * 1990-11-30 1992-07-29 Fujitsu Ltd Image tracking device
JP2003099127A (en) * 2001-09-21 2003-04-04 Toshiba Corp Light wave tracking device and method
JP3886842B2 (en) * 2002-04-25 2007-02-28 三菱電機株式会社 Robot control device
JP2008200763A (en) * 2007-02-16 2008-09-04 Hitachi Ltd Control device for manipulator for working

Patent Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05259722A (en) 1992-03-10 1993-10-08 Tokimec Inc Antenna directive device
US20010055063A1 (en) * 2000-05-26 2001-12-27 Honda Giken Kogyo Kabushiki Kaisha Position detection apparatus, position detection method and position detection program
US7489806B2 (en) * 2002-11-07 2009-02-10 Olympus Corporation Motion detection apparatus
JP2006106910A (en) 2004-09-30 2006-04-20 Toshiba Corp Gimbal device
US7965868B2 (en) * 2006-07-20 2011-06-21 Lawrence Livermore National Security, Llc System and method for bullet tracking and shooter localization
US8009918B2 (en) * 2007-07-08 2011-08-30 Universite De Liege Visual background extractor
US20090115850A1 (en) 2007-11-06 2009-05-07 Kabushiki Kaisha Toshiba Mobile object image tracking apparatus and method
JP2009141728A (en) 2007-12-07 2009-06-25 Furuno Electric Co Ltd Control method for reducing pointing error of antenna having two-axis gimbal structure, and control device with control method
US20090262197A1 (en) 2008-02-22 2009-10-22 Kabushiki Kaisha Toshiba Moving object image tracking apparatus and method

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
U.S. Appl. No. 12/549,448.

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8984757B2 (en) 2011-07-12 2015-03-24 Kabushiki Kaisha Toshiba Tracking apparatus
US9160907B2 (en) 2012-03-22 2015-10-13 Kabushiki Kaisha Toshiba Tracking apparatus
US20210390327A1 (en) * 2016-06-06 2021-12-16 Sz Dji Osmo Technology Co., Ltd. Carrier-assisted tracking
US11568626B2 (en) * 2016-06-06 2023-01-31 Sz Dji Osmo Technology Co., Ltd. Carrier-assisted tracking

Also Published As

Publication number Publication date
US20100246886A1 (en) 2010-09-30
JP2010231371A (en) 2010-10-14
JP4810582B2 (en) 2011-11-09

Similar Documents

Publication Publication Date Title
US8098893B2 (en) Moving object image tracking apparatus and method
US8174581B2 (en) Moving object image tracking apparatus and method
US8243142B2 (en) Mobile object image tracking apparatus and method
JP5010634B2 (en) Mobile image tracking device
JP5459678B2 (en) Mobile image tracking device
CN108733066B (en) Target tracking control method based on pod attitude feedback
US9160907B2 (en) Tracking apparatus
CN108919841A (en) A kind of compound heavy metal method and system of photoelectric follow-up
CN108281789B (en) Blind area tracking method and device of directional antenna and mobile tracking system
US20120002016A1 (en) Long-Distance Target Detection Camera System
US10642271B1 (en) Vehicle guidance camera with zoom lens
KR102307079B1 (en) System for detecting and tracking target of unmanned aerial vehicle
FR2982963A1 (en) GUIDE METHOD FOR TRACK CORRECTION OF AN AIRCRAFT
CN111366155A (en) Local scanning method based on airborne photoelectric system
CN108544491B (en) Mobile robot obstacle avoidance method comprehensively considering two factors of distance and direction
JP2009260564A (en) Mobile object image tracking apparatus
CN113156450B (en) Active rotation laser radar system on unmanned aerial vehicle and control method thereof
Kim et al. Targeted driving using visual tracking on Mars: From research to flight
CN109993768B (en) Aerial target spectrum mapping method for improving real-time performance and accuracy of servo tracking
JP5197647B2 (en) Mobile image tracking device
WO2019220740A1 (en) Autonomous movement device and autonomous movement system
KR102544345B1 (en) Tracking apparatus and method using the same
US9234723B2 (en) Method for automatically managing a homing device mounted on a projectile, in particular on a missile
KR102544347B1 (en) Tracking apparatus and tracking method using the same
Battistel et al. Inertially stabilized platforms using only two gyroscopic measures and sensitivity analysis to unmodeled motion

Legal Events

Date Code Title Description
AS Assignment

Owner name: KABUSHIKI KAISHA TOSHIBA, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAKAMURA, HIROAKI;KUROIWA, TADASHI;REEL/FRAME:023592/0434

Effective date: 20090901

ZAAA Notice of allowance and fees due

Free format text: ORIGINAL CODE: NOA

ZAAB Notice of allowance mailed

Free format text: ORIGINAL CODE: MN/=.

STCF Information on status: patent grant

Free format text: PATENTED CASE

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 4

MAFP Maintenance fee payment

Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

LAPS Lapse for failure to pay maintenance fees

Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20240117