US8098893B2 - Moving object image tracking apparatus and method - Google Patents
- Publication number
- US8098893B2 US8098893B2 US12/550,648 US55064809A US8098893B2 US 8098893 B2 US8098893 B2 US 8098893B2 US 55064809 A US55064809 A US 55064809A US 8098893 B2 US8098893 B2 US 8098893B2
- Authority
- US
- United States
- Prior art keywords
- moving object
- rotation unit
- angular velocity
- zenith
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related, expires
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/20—Analysis of motion
- G06T7/246—Analysis of motion using feature-based methods, e.g. the tracking of corners or segments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/10—Image acquisition modality
- G06T2207/10016—Video; Image sequence
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30232—Surveillance
Definitions
- the present invention relates to a moving object image tracking apparatus and method for enabling a target recognition sensor, such as a camera, to track a target that can move in every direction.
- a target recognition sensor such as a camera
- In the above moving object image tracking systems, the gimbal structure must have at least two axes to track a target that moves in every direction. In a biaxial gimbal, when the target passes the zenith or a point near it, the AZ axis of the gimbal must instantly rotate through 180°. In practice, however, this motion is hard to realize, resulting in a gimbal lock phenomenon in which continuous tracking is impossible. Accordingly, the biaxial gimbal structure cannot be oriented to the zenith, which makes it difficult to realize omnidirectional tracking.
- the degree of freedom is increased using a triaxial gimbal structure, and one operation is distributed to the AZ axis and the xEL axis to prevent the angular velocity of the gimbal from excessively increasing, whereby the movable range of the gimbal is not exceeded to avoid the gimbal lock phenomenon and enable continuous omnidirectional tracking (see, for example, JP-A 2006-106910 (KOKAI)).
- the above-described triaxial gimbal structure is more complex than the biaxial gimbal structure, and is hard to reduce in size and cost since it requires a larger number of driving means, such as motors. Further, since a camera, for example, is mounted on the gimbal structure, the load inertia of the xEL axis is large, which may cause axial interference between the AZ axis and the xEL axis. This is a problem peculiar to the triaxial gimbal structure.
- the present invention has been developed in light of the above, and provides a moving object image tracking apparatus that exhibits improved tracking performance without any additional sensor, and a method employed therein.
- a moving object image tracking apparatus comprising: a first rotation unit configured to rotate about an azimuth axis vertically oriented and rotatably supported; a second rotation unit configured to rotate about an elevation axis rotatably supported and horizontally oriented, the elevation axis being perpendicular to the azimuth axis, the second rotation unit being horizontally rotatable from a front position at which the second rotation unit faces a front, to a back position at which the second rotation unit faces a back, via an angular position corresponding to a zenith, the second rotation unit having a movable range of at least 180°; a driving unit configured to drive the first rotation unit and the second rotation unit to rotate independent of each other; an acquisition unit supported by the second rotation unit and configured to acquire image data of a moving object by photography; a first detection unit configured to detect, in the image data, a tracking error indicating a deviation of the moving object from a center of a field of view of the acquisition unit
- FIG. 1 is a block diagram illustrating a moving object image tracking apparatus according to a first embodiment
- FIG. 2 is a perspective view illustrating the gimbal driving unit shown in FIG. 1 ;
- FIG. 3 is a block diagram illustrating the correction control system shown in FIG. 1 ;
- FIG. 4 is a schematic view illustrating the field of view of the camera sensor shown in FIG. 1 , and moving object tracking;
- FIG. 5 is a view illustrating the trajectory of a moving object, and that of the optical axis of an optical system
- FIG. 6 is a view illustrating the trajectories, correction range and tracking error, using a gimbal zenith coordinate system
- FIG. 7 is a view illustrating the trajectories, correction range and tracking error obtained after the zenith is reached, using the gimbal zenith coordinate system;
- FIG. 8 is a view illustrating an operation example of the angular velocity instruction selection unit shown in FIG. 1 ;
- FIG. 9 is a view illustrating angular velocity instruction values that depend on whether correction control is executed in the moving object image tracking apparatus of FIG. 1 ;
- FIG. 10 is a view illustrating tracking errors that depend on whether correction control is executed in the moving object image tracking apparatus of FIG. 1 ;
- FIG. 11 is a view illustrating the trajectory of an optical axis vector in the gimbal zenith coordinate system, obtained when the correction control of the moving object image tracking apparatus shown in FIG. 1 is used;
- FIG. 12 is a block diagram illustrating a correction control system according to a second embodiment
- FIG. 13 is a flowchart illustrating an operation example of the angular velocity instruction selection unit shown in FIG. 12 ;
- FIG. 14 is a block diagram illustrating a correction control system according to a third embodiment.
- the moving object image tracking apparatuses of the embodiments are obtained by applying a control system for a moving object image tracking mechanism to an image tracking system.
- FIG. 1 a moving object image tracking apparatus according to a first embodiment will be described.
- the moving object image tracking apparatus of the first embodiment includes first and second gimbals 111 and 121 , first and second driving units 112 and 122 , first and second angular velocity sensors 113 and 123 , first and second angle sensors 114 and 124 , a camera sensor 140 , a tracking error detecting unit 173 , an angular velocity instruction generating unit 150 , a moving object direction estimating unit 174 , a corrected angular velocity instruction generating unit 171 , an angular velocity instruction selecting unit 172 , and a driving control unit 160 .
- the driving control unit 160 includes first and second servo controllers 161 and 162 .
- the first gimbal 111 rotates about a first gimbal axis 110 as a rotatably supported vertical azimuth axis.
- the second gimbal 121 rotates about a second gimbal axis 120 that is perpendicular to the first gimbal axis 110 and serves as a rotatably supported elevation axis.
- the first and second driving units 112 and 122 rotate the first and second gimbals 111 and 121 , respectively.
- the first angular velocity sensor 113 detects the angular velocity of the first gimbal 111 that rotates about the first gimbal axis 110 .
- the second angular velocity sensor 123 detects the angular velocity of the second gimbal 121 that rotates about the second gimbal axis 120 .
- the first angle sensor 114 detects the angle of rotation of the first gimbal 111 with respect to a gimbal fixing unit (not shown).
- the second angle sensor 124 detects the angle of rotation of the second gimbal 121 with respect to the first gimbal 111 .
- the camera sensor 140 is supported by the second gimbal 121 and used to detect a moving object and produce image data thereof.
- the tracking error detecting unit 173 performs image processing on image data obtained from the camera sensor 140 to detect a tracking error.
- the tracking error detecting unit 173 executes binarization to obtain monochrome image data, extracts the characterizing point of the moving object to determine the position thereof in the field of view of the camera, and detects, as the detected tracking error, a two-dimensional displacement (ΔX, ΔY) from the center of the field of view.
- the time required for the above process including image processing is regarded as a sampling time for detecting a tracking error.
- the detected tracking error will be described later with reference to FIG. 4 .
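The tracking-error detection described above can be illustrated with a short sketch. This is a minimal, hypothetical implementation assuming a grayscale frame and a simple intensity threshold; the patent does not detail its characterizing-point extraction, so the centroid-of-bright-pixels approach below is only an assumption.

```python
import numpy as np

def detect_tracking_error(frame, threshold=128):
    """Binarize a grayscale frame, take the centroid of the bright pixels as the
    moving object's characterizing point, and return its offset (dX, dY) in pixels
    from the center of the field of view.  Returns None if no object is found."""
    mask = frame >= threshold                 # binarization to monochrome data
    ys, xs = np.nonzero(mask)
    if xs.size == 0:
        return None                           # object not visible in the field of view
    cx, cy = xs.mean(), ys.mean()             # characterizing point (centroid)
    h, w = frame.shape
    return cx - w / 2.0, cy - h / 2.0         # two-dimensional tracking error (dX, dY)
```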
- the angular velocity instruction generating unit 150 generates angular velocity instruction values for driving the gimbals to track a moving object, based on the detected two-dimensional tracking error from the tracking error detecting unit 173 , and the angles (θ1, θ2) of the two axes, which indicate the attitudes of the gimbals and are detected by the first and second angle sensors 114 and 124 . This process will be described later in detail with reference to FIG. 3 .
- the moving object direction estimating unit 174 receives data on the detected two-dimensional tracking error from the tracking error detecting unit 173 , and receives data on the detected angles from the first and second angle sensors 114 and 124 , thereby acquiring data on the traveling direction of the moving object as an estimated moving object direction value (estimated angle).
- the corrected angular velocity instruction generating unit 171 receives data on the estimated moving object direction value, the detected two-dimensional tracking error and the detected angles from the moving object direction estimating unit 174 , the tracking error detecting unit 173 and the first and second angle sensors 114 and 124 , respectively, and generates corrected angular velocity instruction values for driving the gimbals to track the moving object with the zenith avoided.
- the angular velocity instruction selecting unit 172 receives the angular velocity instruction values, the corrected angular velocity instruction values, and the detected angles from the angular velocity instruction generating unit 150 , the corrected angular velocity instruction generating unit 171 and the angle sensors 114 and 124 , respectively, and outputs either the angular velocity instruction values received from the angular velocity instruction generating unit 150 , or the corrected angular velocity instruction values received from the corrected angular velocity instruction generating unit 171 , in accordance with the degree of closeness in the orientation of the optical axis to the zenith.
- the driving control unit 160 computes control values that drive to zero the difference between each angular velocity instruction value and the corresponding angular velocity detected by the first and second angular velocity sensors 113 and 123 .
- the first and second servo controllers 161 and 162 correspond to the first and second angular velocity sensors 113 and 123 , respectively, and output control values to the first and second driving units 112 and 122 , respectively.
- the first gimbal axis of the gimbal driving unit is an azimuth axis (hereinafter, referred to simply as the “AZ axis”), and the second gimbal axis is an elevation axis (hereinafter, referred to simply as the “EL axis”).
- the moving object image tracking apparatus of FIG. 1 is a biaxial whirling apparatus having a biaxial structure in which the AZ axis and the EL axis intersect each other at a point.
- FIG. 3 is a block diagram collectively illustrating the AZ axis and the EL axis.
- the angular velocity instruction generating unit 150 generates angular velocity instruction values given by the following expression (A) and used for driving the gimbals to track a moving object, based on the corrected two-dimensional tracking error obtained from the tracking error detecting unit 173 , and the angles (θ1, θ2) of the two axes, which indicate the attitudes of the gimbals and are detected by the first and second angle sensors 114 and 124 , respectively.
- the angular velocities set for the respective gimbal axes based on the two-dimensional image data (ΔX, ΔY) can be expressed by the following relational expression for computing an angular velocity instruction value from a detected tracking error and detected angles:
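The full form of expression (1) is not reproduced in this extract. As a rough, assumed stand-in (not the patent's exact formula), a proportional mapping of the image-plane error onto the two axes, with a sec θ2 factor on the azimuth axis (the term that the description later notes becomes extremely large near the zenith), could look like the following; the gains are illustrative placeholders.

```python
import math

def angular_velocity_instruction(dX, dY, theta2, Kc=1.0, Kp=0.05):
    """Hypothetical stand-in for expression (1): proportional rate commands that map
    the camera tracking error (dX, dY) onto the AZ and EL gimbal axes.  The AZ
    command carries a 1/cos(theta2) (= sec theta2) factor, which grows without
    bound as the optical axis approaches the zenith (theta2 -> 90 deg)."""
    w_r1 = Kc * Kp * dX / math.cos(theta2)    # AZ-axis instruction
    w_r2 = Kc * Kp * dY                       # EL-axis instruction
    return w_r1, w_r2
```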
- the moving object direction estimating unit 174 estimates the traveling direction of the moving object, based on angle data indicating the angles of the gimbals, and data indicating the detected tracking error and sent from the camera sensor 140 .
- the optical axis of the camera sensor determined by the angles of the gimbals is oriented toward the zenith, and the position of the moving object can be estimated from the optical axis and the tracking error.
- the position component (X i , Y i ) of the moving object associated with the optical axis is expressed using the transform, given by the following equations, of the coordinate system from the spherical coordinate system to the Cartesian coordinate system:
- X j [n]=cos θ2 [n]·cos(θ1 [n]−π/2)
- Y j [n]=cos θ2 [n]·sin(θ1 [n]−π/2) (2)
- θest=arctan(dy/dx) (6)
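Expressions (2), (4), (5) and (6) (listed in full in the Description below) can be combined into a small direction-estimation routine. The sketch below assumes the tracking-error terms (Xe, Ye) have already been converted into the gimbal-zenith Cartesian frame (the patent's expression (3), not reproduced in this extract), and uses atan2 instead of arctan(dy/dx) purely to avoid a division by zero.

```python
import math

def estimate_travel_angle(theta1, theta2, Xe, Ye, n, k):
    """Estimate the moving object's traveling angle theta_est (expression (6)) from
    gimbal-angle histories and tracking-error components at samples n and n-k.
    theta1, theta2, Xe, Ye are sequences indexed by sample number."""
    def position(i):
        # Expressions (2) and (4): project the optical-axis direction onto the
        # horizontal plane of the gimbal-zenith coordinate system and correct it
        # by the tracking-error components.
        Xj = math.cos(theta2[i]) * math.cos(theta1[i] - math.pi / 2)
        Yj = math.cos(theta2[i]) * math.sin(theta1[i] - math.pi / 2)
        return Xj - Xe[i], Yj + Ye[i]

    X_n, Y_n = position(n)
    X_p, Y_p = position(n - k)
    dx, dy = X_n - X_p, Y_n - Y_p             # expression (5)
    return math.atan2(dy, dx)                 # expression (6)
```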
- the corrected angular velocity instruction generating unit 171 generates a corrected angular velocity instruction, using the thus-estimated moving object traveling angle θest , as well as the gimbal angles (θ1, θ2) and the tracking error (ΔX, ΔY).
- the correction range is given by the following mathematical expression that is related to a correction range threshold angle θth and utilizes transformation of a spherical coordinate system based on the gimbal angles into a Cartesian coordinate system:
- correction driving for driving the optical axis to pass through the zenith is executed within an allowable tracking error range, in order to avoid gimbal lock in which an excessive angular velocity instruction is generated because of a tracking error near the zenith.
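Expression (7), reproduced in the Description below, reduces to a single comparison on the EL angle, so the correction-range test can be written directly; this is a straightforward reading of (7) rather than code taken from the patent.

```python
import math

def in_correction_range(theta2, theta_th):
    """Expression (7): the optical axis is inside the zenith correction range when
    the squared horizontal projection of the axis, (cos theta2)**2, is smaller than
    (sin theta_th)**2, i.e. the axis lies within theta_th of the zenith."""
    return math.cos(theta2) ** 2 < math.sin(theta_th) ** 2
```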
- K is a corrected angular velocity gain
- the angular velocity of the first gimbal 111 is controlled so that the difference between the azimuth axis angle of the first gimbal 111 and the moving direction of the moving object will approach zero.
- Correction driving for compensating for the tracking error is executed in the same manner as that based on the aforementioned mathematical expression (1).
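Expressions (8) to (10) can be read as two small rate laws; the sketch below follows them literally, with illustrative gain values (the patent only names K as the corrected angular velocity gain).

```python
import math

def corrected_az_rate_before_zenith(theta_est, theta1, K=2.0):
    """Expressions (8) and (9): before the zenith is reached, drive the AZ axis so
    that the difference between its angle and the estimated traveling direction of
    the moving object approaches zero (K is an illustrative gain)."""
    theta_need = theta_est - math.pi / 2 - theta1     # expression (8)
    return K * theta_need                             # expression (9)

def corrected_az_rate_after_zenith(dX, Kc=1.0, Kp=0.05):
    """Expression (10): after the zenith has been reached, compensate the
    X-directional tracking error directly (illustrative gains)."""
    return Kc * Kp * dX
```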
- the sec function has an extremely high value
- the angular velocity instruction selecting unit 172 uses the thus-determined angular velocity instruction value and corrected angular velocity instruction value, and the angles detected by the angle sensors 114 and 124 , to select an angular velocity instruction value for the position near the zenith.
- the angular velocity instruction selecting unit 172 uses, as the selected angular velocity instruction value, the following angular velocity instruction value (B) based on data indicating the detected tracking error and sent from the tracking error detecting unit 173 .
- the angular velocity instruction selecting unit 172 uses the following angular velocity instruction value (C) based on the aforementioned estimated moving object direction value.
- the driving control unit 160 computes control instruction values that reduce to zero the differences between the respective angular velocity instruction values, generated by the angular velocity instruction selecting unit 172 for the first and second angular velocity sensors 113 and 123 , and the angular velocities detected by the sensors 113 and 123 . Based on the thus-computed instruction values, the gimbal driving unit is driven to track the moving object.
- the gimbal driving unit includes the first and second gimbals 111 and 121 and the first and second driving units 112 and 122 .
- FIG. 4 is a schematic view illustrating the field of view of the camera sensor and moving object tracking.
- a detected tracking error (ΔX, ΔY) as a deviation from the center of the field of view of the camera is obtained.
- the detected tracking error must fall within the field of view of the camera. It is desirable that the detected tracking error be low.
- the moving object can be tracked using the biaxial gimbal structure, as long as the value falls within the field of view of the camera.
- FIG. 5 is a view illustrating the trajectory of the moving object, and that of the optical axis resulting from driving based on corrected angular velocities. If the optical axis is expressed three dimensionally, it can be oriented by the biaxial gimbal structure in every semispherical direction. Consideration will now be given to a typical example in which the moving object travels from the front of the tracking apparatus to the back along a trajectory on the hemisphere that deviates slightly from the great semicircle of the hemisphere passing through the zenith.
- the range defined by the correction range threshold angle θth from the zenith is set as the correction range.
- FIG. 6 is a view in which the correction range is enlarged two-dimensionally.
- FIGS. 6 and 7 show the trajectory of the moving object, the correction range and a tracking error in the gimbal zenith coordinate system.
- the moving object moves upwards.
- the optical axis enters the correction range as a result of upwardly tracking the moving object using an angular velocity instruction based on a detected tracking error
- the estimated moving object traveling angle θest becomes 90° in the gimbal zenith coordinate system.
- the angle of the AZ axis to be set until the zenith is reached is 0°
- the first gimbal 111 is driven by a corrected angular velocity instruction based on the mathematical expression (8).
- the second gimbal 121 is driven by an angular velocity instruction based on the tracking error detected by the tracking error detecting unit 173 .
- FIG. 7 is a view illustrating the positional relationship between the optical axis and the moving object, assumed when the optical axis reaches the zenith.
- the traveling direction of the moving object is opposite to the aforementioned orientation in which the front position is assumed, and an X-directional tracking error ΔX corresponding to a deviation of the optical axis from the zenith occurs.
- the first gimbal 111 is driven by a corrected angular velocity instruction based on the mathematical expression (9).
- the second gimbal 121 is driven by an angular velocity instruction based on the tracking error ΔY that is detected by the tracking error detecting unit 173 based on the image acquired from the camera sensor 140 .
- the tracking error is compensated for.
- the corrected angular velocity instruction is switched to the angular velocity instruction based on the detected tracking error, thereby enabling continuous tracking.
- First, it is determined whether the moving object has entered the zenith correction range, using the gimbal angles detected by the angle sensors 114 and 124 and the mathematical expression (7) (step S 801 ). If it is determined that the moving object is outside the zenith correction range, the angular velocity instruction value generated by the angular velocity instruction generating unit 150 is selected (step S 814 ) and is transferred to the driving control unit 160 . In contrast, if it is determined that the moving object has entered the correction range, it is determined whether the moving object has passed through the zenith, based on whether the EL axis has rotated through 90° (step S 802 ).
- the moving object direction estimating unit 174 computes an estimated moving object direction value, based on the gimbal angles obtained from the angle sensors 114 and 124 , and the tracking error obtained from the camera sensor 140 (step S 806 ). If the angle of the EL axis does not exceed 90°, the corrected angle is computed as corrected angle A, using the mathematical expression (8) (step S 808 ), while if the angle of the EL axis exceeds 90°, the corrected angle is computed as corrected angle B, using the mathematical expression (8) (step S 809 ).
- If the corrected angle for the first gimbal 111 exceeds a threshold value, the angular velocity instruction value generated by the angular velocity instruction generating unit 150 is selected (steps S 810 and S 814 ). This limits the application of correction, even within the zenith correction range, when the rotational angle of the first gimbal 111 is too large and a high corrected angular velocity would therefore have to be instructed. If the corrected angle for the first gimbal 111 is determined to be not higher than the threshold value, a corrected angular velocity instruction for instructing the corrected angular velocity A is generated using the mathematical expression (9) (step S 811 ).
- If the angle of the EL axis exceeds 90°, a corrected angular velocity instruction for instructing the corrected angular velocity B is generated using the mathematical expression (10) (step S 804 ), while if the angle of the EL axis does not exceed 90°, a corrected angular velocity instruction for instructing a corrected angular velocity C is generated using the mathematical expression (11) (step S 805 ).
- the angular velocity instruction selecting unit 172 selects one of the computed corrected angular velocity instructions, and transfers it to the driving control unit 160 . While the gimbal structure is being driven by the corrected angular velocity instruction, it is repeatedly determined whether the moving object has passed through the zenith, and the computation of the corrected angular velocity instruction is switched accordingly. Further, while zenith correction is being executed, if it is determined that the moving object has left the zenith correction range, the angular velocity instruction value generated by the angular velocity instruction generating unit 150 is selected, and then it is determined whether the moving object has again entered the zenith correction range.
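The selection flow of FIG. 8 can be summarized, in heavily simplified form, as follows. The sketch collapses the individual flowchart steps (S801 to S814) and the distinction between corrected angular velocities A, B and C into a single function, so it should be read as an outline of the branching rather than the patent's exact procedure.

```python
def select_instruction(in_correction_range, passed_zenith, theta_need,
                       theta_need_max, normal_instr, corrected_before,
                       corrected_after):
    """Simplified outline of the angular velocity instruction selecting unit 172.
    Outside the zenith correction range the ordinary tracking-error-based
    instruction is used; inside it, a corrected instruction is chosen depending on
    whether the zenith has already been passed, and correction is skipped when the
    required AZ correction angle exceeds a threshold."""
    if not in_correction_range:
        return normal_instr                   # outside the range: ordinary tracking
    if passed_zenith:
        return corrected_after                # after the zenith: error-based correction
    if abs(theta_need) > theta_need_max:
        return normal_instr                   # correction angle too large: fall back
    return corrected_before                   # before the zenith: direction-based correction
```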
- FIG. 9 shows the angular velocity instruction values imparted with time through the driving control unit 160 to the first and second gimbals 111 and 121 .
- the first gimbal 111 corresponds to AZ
- the second gimbal 121 corresponds to EL.
- an AZ-axis angular velocity instruction of an extremely high level is imparted near the zenith (near 2.2 [s] in the figure).
- this instruction cannot be followed because of the limitations of the gimbal driving characteristics.
- FIG. 9 shows the case (shown in FIG.
- FIG. 10 shows variations with time in the x- and y-components of a camera tracking error obtained when no correction is performed, and those obtained when correction is performed.
- FIG. 10(A) shows that when an extremely large AZ-axis angular velocity instruction is generated, a large tracking error that falls outside the tracking error detection range occurs because of the limitations of the gimbal driving characteristics, whereby tracking becomes impossible.
- FIG. 10(B) shows that the gimbal structure can reliably track the moving object within the tracking error detection range.
- the remarkable feature of the case shown in FIG. 10(B) is that the tracking error is maximum immediately before 2.2 [s], when the moving object passes through the zenith, yet remains within the field of view of the camera. This means that stable tracking can be executed.
- FIG. 11 shows the vector trajectory of the optical axis in the gimbal zenith coordinate system, obtained when the correction is performed.
- the circle indicated by the broken line indicates the correction range.
- the optical axis enters the correction range after upward tracking from below, whereupon zenith correction starts; the optical axis is then temporarily oriented toward the zenith at coordinates (0, 0), and thereafter returns to upward tracking from below while the occurrence of a tracking error is suppressed.
- the biaxial gimbal structure with no additional sensor has a movable range in which the second gimbal can be oriented from the front position to the back position, and hence can perform omnidirectional tracking.
- with this gimbal structure, at and near the zenith an estimated moving object angular velocity is computed based on angle data, and driving is executed in accordance with a corrected angular velocity instruction that corresponds to the estimated angular velocity.
- an appropriate angular velocity instruction for tracking a moving object, free from the gimbal lock caused near the zenith by an excessive angular velocity instruction, can thus be generated, thereby improving the tracking performance.
- the moving state of a moving object is determined using a moving object angular velocity estimated by a moving object angular velocity estimating unit 1201 , and when tracking is started at a position extremely close to the zenith, angular velocity instructions are generated for the azimuth axis and the elevation axis so that the optical axis is oriented toward the zenith.
- FIG. 12 is a block diagram collectively illustrating the AZ axis and the EL axis.
- the moving object angular velocity estimating unit 1201 receives data on a two-dimensional tracking error from the tracking error detecting unit 173 , receives data on angles from the first and second angle sensors 114 and 124 , and receives data on angular velocities from the first and second angular velocity sensors 113 and 123 , thereby determining the velocity of the moving object.
- the moving object angular velocity estimating unit 1201 estimates the moving object velocity based on angle data indicating the angles of the gimbals, angular velocity data (ω1, ω2) indicating the angular velocities of the gimbals, and data (ΔX, ΔY) indicating a detected tracking error sent from the camera sensor 140 .
- the velocity of the moving object cannot be estimated using the mathematical expression (13).
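Expressions (13) and (14) give the estimated moving object angular velocity. The sketch below follows them directly; the switch between the two is treated as an assumption, since the exact condition under which (13) cannot be used is not spelled out in this extract.

```python
import math

def estimate_object_angular_velocity(omega1, omega2, theta2, dY=0.0, Kc=1.0,
                                     use_tracking_error=False):
    """Expression (13): combine the AZ rate projected by cos(theta2) with the EL
    rate to estimate the moving object's angular velocity.  When the gimbal rates
    are not usable, expression (14) falls back to an estimate derived from the Y
    tracking error (the switching condition here is an assumption)."""
    if use_tracking_error:
        return math.sqrt((Kc * dY) ** 2)                              # expression (14)
    return math.sqrt((omega1 * math.cos(theta2)) ** 2 + omega2 ** 2)  # expression (13)
```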
- a corrected angular velocity generating unit 1202 generates a corrected angular velocity instruction based on the thus-determined estimated moving object angular velocity ωest , the estimated moving object direction value θest , the gimbal angles (θ1, θ2), and the gimbal angular velocities (ω1, ω2).
- when it is determined that a moving object is in the zenith correction range (step S 801 ), the traveling state of the moving object is determined based on the estimated moving object angular velocity ωest estimated by the moving object angular velocity estimating unit 1201 (step S 1301 ). If the estimated moving object angular velocity ωest is higher than the moving object angular velocity threshold value ωth , the moving object is determined to be traveling, and the program proceeds to step S 1303 .
- the gimbal structure is driven using an angular velocity instruction based on a tracking error and set for tracking performed when the moving object halts.
- sec θ2 has a very high value, and hence tracking to be performed when the moving object halts is performed using a signum function in place of the sec function (step S 1302 ).
- sign θ is a signum function that outputs 1 when θ is greater than 0, 0 when θ is equal to 0, and −1 when θ is less than 0.
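Expression (15) replaces the diverging sec θ2 factor with a signum of (θ2 − π/2); a direct reading of it, with an illustrative gain, is:

```python
import math

def halted_object_az_rate(dX, theta2, Kc=1.0):
    """Expression (15): when the moving object is judged to have halted near the
    zenith, the AZ axis is driven with a bounded rate whose sign is taken from
    (theta2 - pi/2) instead of the diverging sec(theta2) factor."""
    s = theta2 - math.pi / 2
    sign = (s > 0) - (s < 0)                  # signum: 1, 0 or -1
    return Kc * (-sign) * dX
```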
- at step S 1303 , it is determined whether an ultimate zenith correction application condition related to a zenith singular point is satisfied.
- the ultimate zenith correction application condition is defined by the following mathematical expression, based on whether tracking is started within the zenith correction range and whether the optical axis of the camera is within the range θc of the field of view.
- the range θc of the field of view is smaller than the zenith correction range.
- the program proceeds to step S 802 , where the same zenith correction as in the first embodiment is executed.
- the AZ axis is driven to reduce the tracking error, and the EL axis is driven to 90° that indicates the zenith, regardless of the tracking error.
- the EL axis is rotated to pass 90°. Therefore, in the subsequent zenith correction process, the same processing as in the first embodiment is executed. Namely, in the zenith correction range, the angular velocity instruction selecting unit 1203 selects the thus-computed corrected angular velocity instruction, and transfers it to the driving control unit 160 .
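The driving toward the zenith just described corresponds to expressions (17) and (18): the AZ axis works on the X tracking error while the EL axis is steered to 90° regardless of the error. A literal sketch, with illustrative gains, is:

```python
import math

def zenith_singularity_rates(dX, theta2, Kc=1.0, Kp1=0.05, Kp2=1.0):
    """Expressions (17) and (18): when tracking starts very close to the zenith,
    drive the AZ axis to reduce the X tracking error and drive the EL axis toward
    90 deg (the zenith) irrespective of the tracking error (illustrative gains)."""
    w_r1 = Kc * Kp1 * dX                      # expression (17): AZ reduces the error
    w_r2 = Kp2 * (math.pi / 2 - theta2)       # expression (18): EL is steered to 90 deg
    return w_r1, w_r2
```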
- the traveling state of the moving object is determined using the moving object angular velocity estimated by the moving object angular velocity estimating unit, and when tracking is started at a position extremely close to the zenith, respective angular velocity instructions are generated for controlling the azimuth (AZ) axis and the elevation (EL) axis to orient the optical axis toward the zenith.
- AZ azimuth
- EL elevation
- the traveling state of a moving object is determined using the difference in moving object position estimated by the moving object direction estimating unit 174 , and when tracking is started at a position extremely close to the zenith, respective angular velocity instructions are generated for controlling the azimuth (AZ) axis and the elevation (EL) axis to orient the optical axis toward the zenith.
- the third embodiment differs from the second embodiment in the method of determining at step S 1301 of FIG. 13 whether the moving object is traveling.
- FIG. 14 is a block diagram collectively illustrating the AZ and EL axes.
- the condition for determining the traveling state of the moving object is given by the following mathematical expression based on the difference (dx, dy) in moving object position estimated by the moving object direction estimating unit 174 and expressed by the aforementioned equations (5). Namely, whether the moving object is traveling can be determined based on a difference at a certain sample interval which corresponds to the velocity of the moving object. dx 2 +dy 2 ⁇ r 2 (19)
- if the difference is smaller than the threshold value, the moving object is determined to be at a halt, whereas if the difference is greater than the threshold value, the moving object is determined to be traveling. After that, the same zenith correction as in the second embodiment is executed.
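The traveling/halt decision of expression (19) is a threshold on the squared positional difference over the sampling interval; read together with the surrounding text, it can be sketched as:

```python
def is_traveling(dx, dy, r):
    """Expression (19) read with the accompanying text: the object is judged to be
    halted when the squared positional difference over the sample interval stays
    below r**2, and to be traveling when it is at least r**2."""
    return dx ** 2 + dy ** 2 >= r ** 2
```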
- the above-described third embodiment can provide the same advantage as that of the second embodiment by determining whether the moving object is traveling, based on the difference (dx, dy) in moving object position.
- the moving object image tracking apparatuses of the above-described embodiments effectively serve as a tracking camera system of an omnidirectional biaxial gimbal structure installed in a mobile apparatus that is provided with, for example, a TV camera, camera seeker or automatic surveying tool.
- since the second gimbal has a movable range extending from the front position to the back position, moving object velocity values are estimated from angle data at and near the zenith, and the gimbal structure is driven by a corrected angular velocity instruction, the biaxial gimbal structure can avoid the gimbal lock caused near the zenith by excessive angular velocity instruction values, and an appropriate angular velocity instruction for tracking a moving object can be generated, improving the tracking characteristics.
- the present invention is not limited to the above-described embodiments, but may be modified in various ways without departing from the scope.
- Various inventions can be realized by appropriately combining the structural elements disclosed in the embodiments. For instance, some of the disclosed structural elements may be deleted. Some structural elements of different embodiments may be combined appropriately.
Abstract
Description
{dot over (θ)}r1,{dot over (θ)}r2 (A)
X j [n]=cos θ2 [n]·cos(θ1 [n]−π/2)
Y j [n]=cos θ2 [n]·sin(θ1 [n]−π/2) (2)
X[n]=X j [n]−X e [n]
Y[n]=Y j [n]+Y e [n] (4)
dx=X[n]−X[n−k]
dy=Y[n]−Y[n−k] (5)
θest=arctan(dy/dx) (6)
|(cos θ2·cos θ1)2+(cos θ2·sin θ1)2|<(sin θth)2
Namely, |(cos θ2)2|<(sin θth)2 (7)
θneed=θest−π/2−θ1 (8)
{dot over (θ)}′r1 =K·θ need (from when the correction range is entered, to when the zenith is reached) (9)
{dot over (θ)}′r1 =K c ·K p ·ΔX (from when the zenith is reached, to when the correction range is left) (10)
θneed=π+θest−π/2−θ1 (from when the correction range is again entered after the correction range is left, to when the zenith is reached) (11)
{dot over (θ)}′r1 =K c ·K p ·ΔX (from when the zenith is again reached after the correction range is left, to when the correction range is left) (12)
{dot over (θ)}r1 (B)
{dot over (θ)}′r1 (C)
{dot over (θ)}r2 (D)
ωest=√{square root over ((ω1 cos θ2)2+(ω2)2)} (13)
ωest=√{square root over ((K c ΔY)2)} (14)
{dot over (θ)}′r1 =K c·(−sign(θ2−π/2))·ΔX (15)
|(cos θ2·cos θ1)2+(cos θ2·sin θ1)2|<(sin θc)2 (16)
Namely, |(cos θ2)2|<(sin θc)2
{dot over (θ)}′r1 =K c ·K p1 ·ΔX (17)
{dot over (θ)}′r2 =K p2·(π/2−θ2) (18)
dx 2 +dy 2 <r 2 (19)
Claims (7)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009-076700 | 2009-03-26 | ||
JP2009076700A JP4810582B2 (en) | 2009-03-26 | 2009-03-26 | Mobile object image tracking apparatus and method |
Publications (2)
Publication Number | Publication Date |
---|---|
US20100246886A1 US20100246886A1 (en) | 2010-09-30 |
US8098893B2 true US8098893B2 (en) | 2012-01-17 |
Family
ID=42784298
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/550,648 Expired - Fee Related US8098893B2 (en) | 2009-03-26 | 2009-08-31 | Moving object image tracking apparatus and method |
Country Status (2)
Country | Link |
---|---|
US (1) | US8098893B2 (en) |
JP (1) | JP4810582B2 (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8984757B2 (en) | 2011-07-12 | 2015-03-24 | Kabushiki Kaisha Toshiba | Tracking apparatus |
US9160907B2 (en) | 2012-03-22 | 2015-10-13 | Kabushiki Kaisha Toshiba | Tracking apparatus |
US20210390327A1 (en) * | 2016-06-06 | 2021-12-16 | Sz Dji Osmo Technology Co., Ltd. | Carrier-assisted tracking |
Families Citing this family (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2929416B1 (en) * | 2008-03-27 | 2010-11-05 | Univ Paris 13 | METHOD FOR DETERMINING A THREE-DIMENSIONAL REPRESENTATION OF AN OBJECT FROM A CUTTING IMAGE SEQUENCE, COMPUTER PROGRAM PRODUCT, CORRESPONDING OBJECT ANALYSIS METHOD, AND IMAGING SYSTEM |
JP5197647B2 (en) * | 2010-02-16 | 2013-05-15 | 株式会社東芝 | Mobile image tracking device |
US9615064B2 (en) * | 2010-12-30 | 2017-04-04 | Pelco, Inc. | Tracking moving objects using a camera network |
JP5836628B2 (en) * | 2011-04-19 | 2015-12-24 | キヤノン株式会社 | Control system evaluation apparatus, evaluation method, and program |
US9008359B2 (en) | 2012-06-28 | 2015-04-14 | International Business Machines Corporation | Detection of static object on thoroughfare crossings |
MX2015003533A (en) * | 2012-09-20 | 2015-09-08 | Nec Corp | Antenna orientation adjustment assistance device and antenna device installation method. |
EP2966530A4 (en) * | 2013-03-07 | 2016-09-07 | Nec Corp | Space stabilizing device and space stabilization method |
US9954277B2 (en) | 2013-03-14 | 2018-04-24 | Nec Corporation | Antenna device and antenna device control method |
EP3115079B1 (en) | 2013-07-11 | 2019-03-20 | Oticon Medical A/S | Signal processor for a hearing device |
KR101478172B1 (en) * | 2014-07-09 | 2014-12-31 | 주식회사 다올넷 | System for searching food trucks and users using location information |
CN107065560A (en) * | 2017-05-15 | 2017-08-18 | 北京环境特性研究所 | A kind of two axle singular path photoelectric tracking control methods |
JP7085954B2 (en) * | 2018-09-25 | 2022-06-17 | 三菱電機株式会社 | Antenna drive, antenna drive method, and antenna drive program |
KR102195419B1 (en) * | 2019-09-18 | 2020-12-28 | (주)인텔리안테크놀로지스 | Communication system |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05259722A (en) | 1992-03-10 | 1993-10-08 | Tokimec Inc | Antenna directive device |
US20010055063A1 (en) * | 2000-05-26 | 2001-12-27 | Honda Giken Kogyo Kabushiki Kaisha | Position detection apparatus, position detection method and position detection program |
JP2006106910A (en) | 2004-09-30 | 2006-04-20 | Toshiba Corp | Gimbal device |
US7489806B2 (en) * | 2002-11-07 | 2009-02-10 | Olympus Corporation | Motion detection apparatus |
US20090115850A1 (en) | 2007-11-06 | 2009-05-07 | Kabushiki Kaisha Toshiba | Mobile object image tracking apparatus and method |
JP2009141728A (en) | 2007-12-07 | 2009-06-25 | Furuno Electric Co Ltd | Control method for reducing pointing error of antenna having two-axis gimbal structure, and control device with control method |
US20090262197A1 (en) | 2008-02-22 | 2009-10-22 | Kabushiki Kaisha Toshiba | Moving object image tracking apparatus and method |
US7965868B2 (en) * | 2006-07-20 | 2011-06-21 | Lawrence Livermore National Security, Llc | System and method for bullet tracking and shooter localization |
US8009918B2 (en) * | 2007-07-08 | 2011-08-30 | Universite De Liege | Visual background extractor |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6022803A (en) * | 1983-07-19 | 1985-02-05 | Nec Corp | Controller of satellite tracking antenna |
JPH04207361A (en) * | 1990-11-30 | 1992-07-29 | Fujitsu Ltd | Image tracking device |
JP2003099127A (en) * | 2001-09-21 | 2003-04-04 | Toshiba Corp | Light wave tracking device and method |
JP3886842B2 (en) * | 2002-04-25 | 2007-02-28 | 三菱電機株式会社 | Robot control device |
JP2008200763A (en) * | 2007-02-16 | 2008-09-04 | Hitachi Ltd | Control device for manipulator for working |
-
2009
- 2009-03-26 JP JP2009076700A patent/JP4810582B2/en active Active
- 2009-08-31 US US12/550,648 patent/US8098893B2/en not_active Expired - Fee Related
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05259722A (en) | 1992-03-10 | 1993-10-08 | Tokimec Inc | Antenna directive device |
US20010055063A1 (en) * | 2000-05-26 | 2001-12-27 | Honda Giken Kogyo Kabushiki Kaisha | Position detection apparatus, position detection method and position detection program |
US7489806B2 (en) * | 2002-11-07 | 2009-02-10 | Olympus Corporation | Motion detection apparatus |
JP2006106910A (en) | 2004-09-30 | 2006-04-20 | Toshiba Corp | Gimbal device |
US7965868B2 (en) * | 2006-07-20 | 2011-06-21 | Lawrence Livermore National Security, Llc | System and method for bullet tracking and shooter localization |
US8009918B2 (en) * | 2007-07-08 | 2011-08-30 | Universite De Liege | Visual background extractor |
US20090115850A1 (en) | 2007-11-06 | 2009-05-07 | Kabushiki Kaisha Toshiba | Mobile object image tracking apparatus and method |
JP2009141728A (en) | 2007-12-07 | 2009-06-25 | Furuno Electric Co Ltd | Control method for reducing pointing error of antenna having two-axis gimbal structure, and control device with control method |
US20090262197A1 (en) | 2008-02-22 | 2009-10-22 | Kabushiki Kaisha Toshiba | Moving object image tracking apparatus and method |
Non-Patent Citations (1)
Title |
---|
U.S. Appl. No. 12/549,448. |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8984757B2 (en) | 2011-07-12 | 2015-03-24 | Kabushiki Kaisha Toshiba | Tracking apparatus |
US9160907B2 (en) | 2012-03-22 | 2015-10-13 | Kabushiki Kaisha Toshiba | Tracking apparatus |
US20210390327A1 (en) * | 2016-06-06 | 2021-12-16 | Sz Dji Osmo Technology Co., Ltd. | Carrier-assisted tracking |
US11568626B2 (en) * | 2016-06-06 | 2023-01-31 | Sz Dji Osmo Technology Co., Ltd. | Carrier-assisted tracking |
Also Published As
Publication number | Publication date |
---|---|
US20100246886A1 (en) | 2010-09-30 |
JP2010231371A (en) | 2010-10-14 |
JP4810582B2 (en) | 2011-11-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8098893B2 (en) | Moving object image tracking apparatus and method | |
US8174581B2 (en) | Moving object image tracking apparatus and method | |
US8243142B2 (en) | Mobile object image tracking apparatus and method | |
JP5010634B2 (en) | Mobile image tracking device | |
JP5459678B2 (en) | Mobile image tracking device | |
CN108733066B (en) | Target tracking control method based on pod attitude feedback | |
US9160907B2 (en) | Tracking apparatus | |
CN108919841A (en) | A kind of compound heavy metal method and system of photoelectric follow-up | |
CN108281789B (en) | Blind area tracking method and device of directional antenna and mobile tracking system | |
US20120002016A1 (en) | Long-Distance Target Detection Camera System | |
US10642271B1 (en) | Vehicle guidance camera with zoom lens | |
KR102307079B1 (en) | System for detecting and tracking target of unmanned aerial vehicle | |
FR2982963A1 (en) | GUIDE METHOD FOR TRACK CORRECTION OF AN AIRCRAFT | |
CN111366155A (en) | Local scanning method based on airborne photoelectric system | |
CN108544491B (en) | Mobile robot obstacle avoidance method comprehensively considering two factors of distance and direction | |
JP2009260564A (en) | Mobile object image tracking apparatus | |
CN113156450B (en) | Active rotation laser radar system on unmanned aerial vehicle and control method thereof | |
Kim et al. | Targeted driving using visual tracking on Mars: From research to flight | |
CN109993768B (en) | Aerial target spectrum mapping method for improving real-time performance and accuracy of servo tracking | |
JP5197647B2 (en) | Mobile image tracking device | |
WO2019220740A1 (en) | Autonomous movement device and autonomous movement system | |
KR102544345B1 (en) | Tracking apparatus and method using the same | |
US9234723B2 (en) | Method for automatically managing a homing device mounted on a projectile, in particular on a missile | |
KR102544347B1 (en) | Tracking apparatus and tracking method using the same | |
Battistel et al. | Inertially stabilized platforms using only two gyroscopic measures and sensitivity analysis to unmodeled motion |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: KABUSHIKI KAISHA TOSHIBA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:NAKAMURA, HIROAKI;KUROIWA, TADASHI;REEL/FRAME:023592/0434 Effective date: 20090901 |
|
ZAAA | Notice of allowance and fees due |
Free format text: ORIGINAL CODE: NOA |
|
ZAAB | Notice of allowance mailed |
Free format text: ORIGINAL CODE: MN/=. |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
MAFP | Maintenance fee payment |
Free format text: PAYMENT OF MAINTENANCE FEE, 8TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: M1552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Year of fee payment: 8 |
|
FEPP | Fee payment procedure |
Free format text: MAINTENANCE FEE REMINDER MAILED (ORIGINAL EVENT CODE: REM.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
LAPS | Lapse for failure to pay maintenance fees |
Free format text: PATENT EXPIRED FOR FAILURE TO PAY MAINTENANCE FEES (ORIGINAL EVENT CODE: EXP.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
STCH | Information on status: patent discontinuation |
Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362 |
|
FP | Lapsed due to failure to pay maintenance fee |
Effective date: 20240117 |