WO2006082967A1 - Imaging Apparatus - Google Patents
Imaging Apparatus
- Publication number
- WO2006082967A1 (PCT/JP2006/301998; JP2006301998W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- area
- unit
- feature point
- information
- subject
- Prior art date
Classifications
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B17/00—Details of cameras or camera bodies; Accessories therefor
- G03B17/02—Bodies
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B13/00—Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
- G03B13/32—Means for focusing
- G03B13/34—Power focusing
- G03B13/36—Autofocus systems
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/63—Control of cameras or camera modules by using electronic viewfinders
- H04N23/633—Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
- H04N23/635—Region indicators; Field of view indicators
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/673—Focus control based on electronic image sensor signals based on contrast or high frequency components of image signals, e.g. hill climbing method
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/67—Focus control based on electronic image sensor signals
- H04N23/675—Focus control based on electronic image sensor signals comprising setting of focusing regions
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/68—Control of cameras or camera modules for stable pick-up of the scene, e.g. compensating for camera body vibrations
- H04N23/682—Vibration or motion blur correction
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B2205/00—Adjustment of optical system relative to image or object surface other than for focusing
- G03B2205/0007—Movement of one or more optical elements for control of motion blur
Definitions
- the present invention relates to an imaging apparatus such as a digital still camera and a digital video camera, and more particularly to an imaging apparatus such as a digital still camera and a digital video camera having an autofocus function.
- in recent years, imaging devices such as digital still cameras and digital video cameras equipped with an image sensor such as a CCD or a CMOS have become explosively widespread.
- the mainstream of these imaging devices performs autofocus control by detecting the in-focus state based on an image signal of the subject and moving a focus lens group of the photographing optical system in the optical axis direction based on the detection result.
- Patent Document 1 describes an autofocus device that is applied to an imaging device and performs focus adjustment.
- the autofocus device divides an image signal of a subject into a plurality of focus areas, counts the number of skin color pixels included in each focus area, and specifies a focus area for focus adjustment.
- the conventional autofocus device described in Patent Document 1 assumes a person as a main subject.
- the autofocus device performs focus control based on the skin color pixels, so that the focus area follows the person and the person can always be accurately focused.
- Patent Document 1 Japanese Patent Application Laid-Open No. 2004-37733
- the conventional autofocus device described in Patent Document 1 assumes that focus tracking follows a person, and the area being followed is indicated with a marker or the like, or is selected and displayed from among a plurality of focus areas.
- however, the position of the marker displayed on the monitor screen before shooting, or the position of the selected focus area, shifts little by little due to shake of the device itself, so there is a problem that the display is difficult to see. This effect becomes particularly significant when imaging at high magnification.
- an object of the present invention is to provide an imaging apparatus that can perform focus adjustment on a moving subject and can prevent unnecessary fluctuation of the focus area to be displayed.
- the imaging apparatus of the present invention outputs an electrical image signal of a subject and includes: an imaging optical system that includes a focus lens group for performing focus adjustment and forms an optical image of the subject; an image sensor that captures the optical image of the subject and converts it into an electrical image signal; an image dividing unit that divides the image signal into a plurality of areas; a focus information calculation unit that calculates focus information of the imaging optical system in a first area group including at least one of the plurality of areas; a lens drive control unit that drives and controls the focus lens group in the optical axis direction based on the focus information; a feature point extraction unit that extracts feature points of the subject within a second area group including at least one of the plurality of areas and outputs feature point position information indicating the positions of the feature points; a low-pass filter that extracts low-frequency components of the temporal vibration frequency of the feature point position information and outputs them as extraction position information; and an area selection unit that calculates, based on the extraction position information, the position of the second area group to be displayed and outputs it as display position information.
- according to the present invention, it is possible to provide an imaging apparatus capable of performing focus adjustment on a moving subject and preventing unnecessary fluctuation of the focus area to be displayed.
- FIG. 1 is a block diagram of an imaging apparatus according to Embodiment 1 of the present invention.
- FIG. 2 is a diagram showing display positions of area frames displayed on the display unit 17.
- FIG. 3 is a schematic view of the back surface of the imaging device main body according to Embodiment 1 of the present invention.
- FIG. 4 is a flowchart showing the operation of the imaging apparatus in the reference color information setting process.
- FIG. 5 is an overview of the display unit 17 in which a subject is projected.
- FIG. 6 is an overview of the display unit 17 in which the subject and the area frame are projected in the first embodiment of the present invention.
- FIG. 7 is a calculation formula for hue and saturation information in the first embodiment.
- FIG. 8 is a chart of hue and saturation information in the first embodiment.
- FIG. 9 is a chart of hue and saturation information showing reference color information and reference vicinity region 1 in the first embodiment.
- FIG. 10 is a flowchart showing the operation of the imaging apparatus in the focus follow-up imaging process.
- FIG. 11A is an overview diagram of display unit 17 in which the subject and the AF area frame in Embodiment 1 are projected.
- FIG. 11B is an overview of the display unit 17 in which the subject and the AF area frame in Embodiment 1 are projected.
- FIG. 11C is an overview diagram of display unit 17 in which the subject and the AF area frame in Embodiment 1 are projected.
- FIG. 11D is an overview of the display unit 17 in which the subject and the AF area frame in Embodiment 1 are projected.
- FIG. 12 is a diagram schematically showing the movement of the unit areas B1a to B1d in FIGS. 11A to 11D.
- FIG. 13 is a diagram showing the coordinates of feature points calculated by a feature point position calculation unit.
- FIG. 14A is a diagram showing the coordinate relationship between the display area frame displayed on the display unit 17 and the feature points.
- FIG. 14B is a diagram showing the coordinate relationship between the display area frame displayed on the display unit 17 and the feature points.
- FIG. 15 is a block diagram showing a configuration of an imaging apparatus according to Embodiment 2 of the present invention.
- FIG. 16 is a block diagram showing a configuration of an imaging apparatus according to Embodiment 4 of the present invention.
- FIG. 17 is a block diagram showing a configuration of an imaging apparatus according to Embodiment 5 of the present invention.
- FIG. 18 is a flowchart showing an operation in focus follow-up imaging processing of the imaging apparatus according to the fifth embodiment.
- FIG. 19 is a block diagram showing a configuration of an imaging apparatus according to Embodiment 6 of the present invention.
- FIG. 20 is a flowchart showing operations in focus follow-up imaging processing of the imaging apparatus according to Embodiment 6.
- FIG. 21 is a diagram showing the coordinates of feature points calculated by a feature point position calculation unit.
- FIG. 22 is an overview of the display unit 17 on which an AF area frame is displayed in the sixth embodiment.
- FIG. 23A is an overview of the display unit 17 in which the subject and the area frame according to the seventh embodiment are projected.
- FIG. 23B is an overview of the display unit 17 in which the subject and the area frame according to the seventh embodiment are projected.
- FIG. 23C is an overview of the display unit 17 in which the subject and the area frame according to the seventh embodiment are projected.
- FIG. 23D is an overview of the display unit 17 in which the subject and the area frame according to the seventh embodiment are projected.
- FIG. 24 is a block diagram showing details of a low-pass filter 36 according to Embodiment 7.
- FIG. 25A is an input signal waveform of low-pass filter 36 according to Embodiment 7.
- FIG. 25B is an output signal waveform of low-pass filter 36 according to Embodiment 7.
- FIG. 26 is a graph showing the relationship between the cut-off frequency fc and the image blur evaluation point of the low-pass filter 36 according to the seventh embodiment.
- FIG. 1 is a block diagram of the imaging apparatus according to the first embodiment of the present invention.
- the imaging apparatus according to the first embodiment includes a lens barrel 11, a zoom lens system 12 and a focus lens 13 that constitute an imaging optical system, a CCD 14 that is an image sensor, an image processing unit 15, an image memory 16, a display unit 17, an operation unit 19, a lens driving unit 21, and a system controller 30.
- the lens barrel 11 holds a zoom lens system 12 inside.
- the zoom lens system 12 and the focus lens 13 are imaging optical systems that form an optical image of a subject so that the magnification can be changed.
- the imaging optical system includes, in order from the subject side, a zoom lens group 12a and a zoom lens group 12b that move along the optical axis during zooming, and a focus lens 13 that moves along the optical axis to adjust the focus state.
- the CCD 14 is an image sensor that captures an optical image formed by the zoom lens system 12 at a predetermined timing, converts it into an electrical image signal, and outputs it.
- the image processing unit 15 is a processing unit that performs predetermined image processing, such as white balance correction and gamma correction, on the image signal output from the CCD 14.
- the image memory 16 temporarily stores the image signal output from the image processing unit 15.
- the display unit 17 is typically a liquid crystal display. Based on a command from the system controller 30 described later, the image signal output from the CCD 14 or the image signal stored in the image memory 16 is input via the image processing unit 15, and this image signal is displayed to the user as a visible image.
- the image processing unit 15 can bidirectionally access a memory card 18 that can be attached and detached by the user.
- based on a command from the system controller 30 described later, the memory card 18 stores an image signal output from the CCD 14 or an image signal stored in the image memory 16, received via the image processing unit 15; the stored image signal is also output to the image memory 16 via the image processing unit 15 and temporarily stored there.
- the operation unit 19 is provided on the outside of the imaging apparatus main body, and is a button used by the user to set or operate the imaging apparatus main body. Although the operation unit 19 includes a plurality of operation buttons, details will be described later with reference to FIG.
- the lens driving unit 21 outputs a driving signal for driving the focus lens 13 in the optical axis direction (A direction or B direction) based on a command from a lens position control unit 33 of the system controller 30 described later.
- the lens driving unit 21 also has a function of driving the zoom lens system 12 in the optical axis direction when the user operates a zoom lever (not shown).
- the system controller 30 includes an image division unit 31, a focusing information calculation unit 32, a lens position control unit 33, a feature point extraction unit 34, a feature point position calculation unit 35, a low-pass filter 36, an AF area selection unit 37, and a unit area selection unit 40.
- the image dividing unit 31 performs a process of dividing the image signal output from the image memory 16 into a plurality of unit areas.
- the focus information calculation unit 32 calculates the defocus amount from the image signal divided into a plurality of unit areas by the image division unit 31, based on the contrast information of each unit area and the position information of the focus lens 13.
- the focusing information calculation unit 32 calculates the defocus amount in the first area group including at least one unit area.
- the first area group is composed of a minimum unit area group that performs a process of extracting a feature point of a subject to be described later and a process of calculating a defocus amount.
- the lens position control unit 33 generates a control signal for controlling the position of the focus lens 13 based on the defocus amount output from the focusing information calculation unit 32, and outputs the control signal to the lens driving unit 21. Further, the lens position control unit 33 outputs the position information used when the lens driving unit 21 drives the focus lens 13 to the focusing information calculation unit 32. The focusing information calculation unit 32 can thereby calculate the defocus amount from the position information of the focus lens 13 and the contrast information.
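The classification of this publication (H04N 23/673) and the description above point to contrast-based autofocus of the hill-climbing type. The patent text does not give the contrast formula, so the following Python sketch is illustrative only: `capture(p)` is an assumed helper returning a unit area's pixel rows at focus-lens position `p`, and the focus measure is one common choice (sum of squared horizontal pixel differences).

```python
def contrast(region):
    """Focus measure for one unit area: sum of squared differences
    between horizontally adjacent pixels (higher = sharper)."""
    return sum((row[i + 1] - row[i]) ** 2
               for row in region for i in range(len(row) - 1))

def hill_climb(capture, lens_positions):
    """Return the focus-lens position whose image has maximum contrast.
    `capture(p)` is an assumed helper that returns the unit-area pixel
    rows observed at lens position p."""
    return max(lens_positions, key=lambda p: contrast(capture(p)))
```

In practice the lens is stepped through positions and stopped at the contrast peak rather than scanned exhaustively, but the peak-seeking principle is the same.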
- the feature point extraction unit 34 extracts the feature points of each unit area from the image signal divided into a plurality of unit areas by the image division unit 31.
- the feature point extraction unit 34 calculates color information as the feature points of each unit area, and outputs the calculated color information to the feature point position calculation unit 35.
- the feature points are extracted from the second area group including at least one unit area.
- the area from which the feature points are extracted is determined by display position information, output from an AF area selection unit 37 described later, indicating the range of the AF area.
- based on the display position information indicating the range of the feature point setting area output from the AF area selection unit 37 described later, the feature point extraction unit 34 calculates color information as a feature point of each unit area included in the range of the feature point setting area in the image signal divided into unit areas, and outputs it to the feature point information setting unit 41.
- the feature point information setting unit 41 performs the feature point information setting process by calculating and storing, from the color information of each unit area output from the feature point extraction unit 34, the color information of the unit area selected by the user.
- the feature point information setting unit 41 includes a non-volatile memory, and can store the reference color information once stored even when the power of the imaging apparatus main body is turned off.
- the feature point information setting unit 41 reads out the stored color information and outputs the stored color information to the feature point position calculation unit 35 when performing the focus following imaging process.
- the feature point position calculation unit 35 compares the feature points of each unit area output from the feature point extraction unit 34 with the feature points output from the feature point information setting unit 41, and calculates a position that substantially matches. In the present embodiment, the feature point position calculation unit 35 compares the color information of each unit area with the color information output from the feature point information setting unit 41 to calculate the substantially matching position, and outputs the feature point position information obtained by this calculation to the low-pass filter 36 and the unit area selection unit 40.
- the feature point position information is information indicating coordinates, for example.
- the low-pass filter 36 removes the high-frequency components from the feature point position information output from the feature point position calculation unit 35, thereby extracting and outputting the low-frequency components of the temporal vibration frequency of the feature point position information. For example, the low-pass filter extracts the low-frequency components by accumulating feature point position information and outputting its average value, or by a moving average that outputs the average of the feature point position information obtained within a fixed period. The low-pass filter 36 outputs the extracted low-frequency component value to the AF area selection unit 37 as extraction position information.
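As a concrete illustration of the moving-average example just described, the following sketch averages the feature-point coordinates observed within a fixed window; the window length of 8 samples is an assumed value, not from the patent.

```python
from collections import deque

class MovingAverageLPF:
    """Moving-average low-pass filter for feature-point coordinates:
    outputs the average of the positions observed within a fixed
    window, suppressing the high-frequency jitter of the feature-point
    position caused by shake of the apparatus itself."""
    def __init__(self, window=8):
        self.xs = deque(maxlen=window)
        self.ys = deque(maxlen=window)

    def update(self, x, y):
        # Old samples fall out of the deque automatically (maxlen),
        # so the output follows slow subject motion but not jitter.
        self.xs.append(x)
        self.ys.append(y)
        return (sum(self.xs) / len(self.xs),
                sum(self.ys) / len(self.ys))
```

Feeding the filter one feature-point coordinate per frame yields the extraction position information that the AF area selection unit 37 uses to place the displayed AF area frame.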
- the AF area selection unit 37 generates display position information indicating the position of the AF area to be displayed on the display unit 17, based on the extraction position information output from the low-pass filter 36, and outputs the display position information to the feature point extraction unit 34 and the display unit 17.
- when the AF area is displayed for the first time in the focus tracking imaging process, the AF area selection unit 37 reads default display position information stored in advance in a memory (not shown) and outputs it to the feature point extraction unit 34 and the display unit 17.
- the unit area selection unit 40 selects a unit area existing at the position indicated by the feature point position information based on the feature point position information output from the feature point position calculation unit 35. For example, when the feature point position information is information indicating coordinates, the unit area selection unit 40 selects a unit area including the coordinates output from the feature point position calculation unit 35. Then, the unit area selection unit 40 causes the display unit 17 to display a unit area frame surrounding the selected unit area.
- FIG. 2 is a diagram showing an area frame displayed on the display unit 17.
- FIG. 2 shows an example in which the image signal displayed on the display unit 17 is divided into 18 parts in the horizontal direction (x direction) and 13 parts in the vertical direction (y direction).
- the image signal is divided into 18 ⁇ 13 unit areas, and the display unit 17 displays unit area frames that respectively surround the 18 ⁇ 13 unit areas.
- a unit area B0 represents one of the unit areas used for the subject feature point extraction process and the focus information calculation process described later, and is located at coordinates (10, 7) in this example.
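The coordinate convention above can be illustrated as follows; the pixel dimensions (640 × 480) are assumed for the example and are not stated in the patent.

```python
# Assumed image size; the patent only fixes the 18 x 13 grid of FIG. 2.
GRID_W, GRID_H = 18, 13

def unit_area_of(px, py, img_w=640, img_h=480):
    """Return the (x, y) coordinates of the unit area containing
    pixel (px, py), using the 18 x 13 division of the image signal."""
    ax = min(px * GRID_W // img_w, GRID_W - 1)
    ay = min(py * GRID_H // img_h, GRID_H - 1)
    return ax, ay
```

With these assumed dimensions, a pixel near (360, 270) falls in the unit area at coordinates (10, 7), matching the B0 example above.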
- the unit area selection unit 40 need not display all 18 × 13 unit area frames; it may display only the unit area frame surrounding the selected unit area. When all the area frames are displayed, the unit area frame lines can be made thinner or lighter, or drawn in color, to make the display unit 17 easier to view.
- an AF area frame surrounding an AF area AO formed of at least one unit area is displayed on the display unit 17.
- the AF area AO is an area where feature points are extracted in the focus tracking imaging process of the subject.
- FIG. 3 is a schematic view of the back surface of the imaging device main body according to Embodiment 1 of the present invention.
- the imaging apparatus according to the first embodiment includes an imaging apparatus main body 10, a display unit 17, an operation unit 19, and a finder 10a.
- the viewfinder 10a is an optical system that optically guides an image of a subject to the user's pupil.
- the display unit 17 is the liquid crystal display described above, and displays the captured image signal as a visible image to the user.
- the operation unit 19 includes a shutter button 19a, a cursor key 19b, an enter button 19c, and a menu button 19d.
- when pressed halfway by the user, the shutter button 19a starts the focus follow-up imaging process; when fully pressed, the captured image is stored in the memory card 18.
- the cursor key 19b is operated to select items and contents from the menus of the various operation modes displayed on the display unit 17.
- the enter button 19c is operated to confirm the content selected with the cursor key 19b.
- the menu button 19d is operated to display a menu of various general operation modes of the imaging apparatus main body.
- the items of the various operation mode menus include whether or not to start the process, described later, of setting feature point information of the captured image signal.
- when the user operates the menu button 19d to display the menu related to the start of the feature point information setting process on the display unit 17, the cursor key 19b accepts the user's selection of the content.
- the feature point information setting process is started by the feature point information setting unit 41.
- FIG. 4 is a flowchart showing the operation of the imaging apparatus in the feature point information setting process. The flowchart shown in FIG. 4 shows the operation flow of a program operating on the system controller 30.
- FIG. 5 is an overview of the display unit 17 in which the subject is projected.
- FIG. 5 shows an example in which the subject P2 is projected on the display unit 17.
- FIG. 6 is an overview of the display unit 17 in which the subject and the area frame are shown in the first embodiment of the present invention.
- FIG. 6 shows an example in which 18 ⁇ 13 unit area frames are displayed on the subject P2.
- in step S101, the image signal captured by the CCD 14 is output via the image processing unit 15, and a visible image is displayed on the display unit 17.
- the unit area selection unit 40 displays the unit area frame on the display unit 17. At this time, the image displayed on the display unit 17 is in a display state in which the visible image and the unit area frame are superimposed as shown in FIG.
- An image signal input from the image memory 16 to the image dividing unit 31 of the system controller 30 is divided for each unit area.
- in step S102, the process waits for an input as to whether or not the feature point setting area C1 has been selected.
- the feature point setting area C1 is used for setting feature points.
- the AF area selection unit 37 outputs display position information indicating the range of the feature point setting area C 1 to the display unit 17 and causes the display unit 17 to display a feature point setting area frame.
- a specific area (feature point setting area C1) is displayed with a thick frame to indicate that it can be selected.
- the user can move the bold frame area with the cursor keys 19b. For example, when the user moves the thick frame area and presses the enter button 19c, the feature point setting area C1 shown in FIG. 6 is selected, and the process proceeds to step S103.
- in step S103, the feature point extraction unit 34 calculates the color information of the divided image displayed in the selected feature point setting area C1, and the process proceeds to step S104.
- in step S104, the feature point information setting unit 41 stores the calculated color information and ends the feature point information setting process.
- FIG. 7 shows the calculation formulas for hue and saturation information in the first embodiment. The calculation principle of the hue and saturation information used by the feature point extraction unit 34 in step S103 described above is explained below.
- the image signal is separated into red (hereinafter referred to as "R"), green (hereinafter referred to as "G"), and blue (hereinafter referred to as "B") components.
- the calculation of hue and saturation information is executed by the feature point extraction unit 34.
- the feature point extraction unit 34 calculates the maximum values of R, G, and B for the divided image signal output from the image division unit 31 (hereinafter referred to as the divided image signal).
- the maximum value is V (Equation 1).
- the feature point extraction unit 34 obtains a minimum value for the divided image signal output from the image division unit 31, and subtracts the obtained minimum value from V to obtain d (Equation 2).
- the feature point extraction unit 34 obtains the saturation S from V and d (Equation 3).
- the predetermined processing is as follows: when the maximum value V equals R, the hue H is obtained by Equation 5; when V equals G, by Equation 6; and when V equals B, by Equation 7.
- when the hue H obtained in this way is negative, the feature point extraction unit 34 adds 360 to convert it to a positive value (Equation 8). In this manner, the feature point extraction unit 34 obtains the hue and saturation information of the divided image signal by calculation.
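The piecewise computation described above matches the standard hexcone (HSV) hue/saturation formulas, so a sketch of Equations 1 through 8 might look like the following; the exact scaling of saturation in Equation 3 is an assumption.

```python
def hue_saturation(r, g, b):
    """Hue (0-359) and saturation following the structure of
    Equations 1-8: V = max(R,G,B), d = V - min(R,G,B), S from V and d,
    and a piecewise hue chosen by which channel attains the maximum."""
    v = max(r, g, b)                 # Equation 1
    d = v - min(r, g, b)             # Equation 2
    s = 0.0 if v == 0 else d / v     # Equation 3 (assumed normalization)
    if d == 0:
        return 0.0, s                # achromatic: hue undefined, use 0
    if v == r:
        h = 60.0 * (g - b) / d           # Equation 5
    elif v == g:
        h = 60.0 * (b - r) / d + 120.0   # Equation 6
    else:
        h = 60.0 * (r - g) / d + 240.0   # Equation 7
    if h < 0:
        h += 360.0                   # Equation 8: make hue positive
    return h, s
```

Pure red then maps to hue 0, green to 120, and blue to 240, consistent with the chart of FIG. 8 in which hue corresponds to the rotation direction with values 0 to 359.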
- FIG. 8 is a chart of hue and saturation information in the first embodiment.
- hue H corresponds to the rotation direction and is represented by a value of 0 to 359 along the rotation direction.
- the feature point extraction unit 34 calculates the saturation and hue of the divided image signal.
- the reference color information including the calculated saturation and hue is output to the feature point information setting unit 41 and stored as feature point information. Next, the reference neighborhood region set around the reference color information will be explained.
- the reference color information calculated by the feature point extraction unit 34 is stored in the feature point information setting unit 41 and is referred to at any time as the reference for judging the color information of the subject being imaged.
- the captured color information may slightly change depending on factors such as illumination light and exposure time. Therefore, when comparing the reference color information with the color information of the subject to be imaged, it is desirable to determine the identity by giving the reference color information a certain tolerance. This certain allowable range of reference color information is called the reference neighborhood area.
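A membership test against the reference neighborhood region could be sketched as below; the tolerance values are illustrative only (the patent defines the region via the chart of FIG. 9), and hue is compared as an angle with wraparound.

```python
def in_reference_neighborhood(hue, sat, ref_hue, ref_sat,
                              hue_tol=20.0, sat_tol=0.15):
    """Check whether an observed (hue, sat) pair falls inside the
    reference neighborhood around the stored reference color
    information. Tolerances are assumed values, not from the patent."""
    # Hue is an angle (0-359), so measure the difference with wraparound.
    dh = abs(hue - ref_hue) % 360.0
    dh = min(dh, 360.0 - dh)
    return dh <= hue_tol and abs(sat - ref_sat) <= sat_tol
```

Giving the reference color this tolerance absorbs the small shifts in captured color caused by illumination light and exposure time, as noted above.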
- FIG. 9 is a chart of hue and saturation information showing the reference color information and the reference neighboring area 1 in the first embodiment.
- points plotted as reference color information correspond to the color information stored in the feature point information setting unit 41.
- although the allowable range of hue is set uniformly here, the present invention is not limited to this.
- by correcting the referenced color information range based on the hue information of the light source, the reference range can be defined accurately even when the imaging device is used in a dark place. For example, in the case of a red auxiliary light source using LEDs or the like, correction can be made by shifting H1 in the direction of 0.
- FIG. 10 is a flowchart showing the operation of the imaging apparatus in the focus tracking imaging process.
- the flowchart shown in FIG. 10 shows an operation flow of a program operating on the system controller 30.
- FIG. 11 is an overview of the display unit 17 in which the subject and the AF area frame in the first embodiment are projected.
- FIGS. 11A to 11D show a state in which 18 × 13 unit area frames and the AF area frame (area A1) are displayed over the subject P1.
- FIGS. 11A to 11D show how the subject P1 moves on the display unit 17 each time 1/30 [s] elapses.
- the unit area frame containing the feature points of the subject extracted by the feature point extraction unit 34 moves in the order of area B1a, area B1b, area B1c, and area B1d.
- when the shutter button 19a is half-pressed by the user, the focus follow-up imaging process is started.
- in step S201, the display unit 17 displays a visible image and the AF area frame. Specifically, the display unit 17 displays, as a visible image, the image signal captured by the CCD 14 and subjected to predetermined image processing by the image processing unit 15. The image signal is divided into 18 × 13 unit areas by the image dividing unit 31, and an AF area A1 is formed from 7 × 5 unit areas. The AF area frame surrounding the AF area A1 is displayed over the image signal. At this time, as shown in FIG. 11A, the display unit 17 is in a display state in which the visible image and the AF area frame are superimposed. Although FIG. 11A also shows the unit area frames, the unit area frames need not be displayed.
- In step S202, it is determined whether or not the center coordinates of the AF area A1 are outside a predetermined range of the display unit 17. If the center coordinates of the AF area A1 are outside the predetermined range, part of the AF area frame would be displayed outside the screen.
- The predetermined range is, for example, a range including coordinates near the center of the display unit 17; in FIGS. 11A to 11D, for example, it is the range enclosed by the coordinates (3, 2), (14, 2), (14, 10), and (3, 10).
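- The check of steps S202 and S203 can be sketched as follows (the coordinates come from the text; the function name is illustrative):

```python
# Predetermined range from the text: the rectangle enclosed by (3, 2) and (14, 10),
# and the default center (8, 6) of the 18 x 13 unit-area grid.
RANGE_MIN, RANGE_MAX = (3, 2), (14, 10)
DEFAULT_CENTER = (8, 6)

def next_af_center(center):
    """Return the AF-area center for the next display: reset it to the default
    when the current center falls outside the predetermined range (S202/S203)."""
    x, y = center
    inside = RANGE_MIN[0] <= x <= RANGE_MAX[0] and RANGE_MIN[1] <= y <= RANGE_MAX[1]
    return center if inside else DEFAULT_CENTER
```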
- In step S203, the center coordinates of the AF area A1 are reset to the default.
- Specifically, the center coordinates of the AF area A1 are moved to the coordinates (8, 6), which are the center coordinates of the display unit 17.
- Then, in step S201, the display unit 17 displays the visible image and the AF area frame again.
- If the center coordinates of the AF area A1 are within the predetermined range, the process proceeds to step S204.
- In step S204, the unit area selection unit 40 determines whether or not there is a feature point in the AF area A1. Specifically, based on the reference color information stored in the feature point information setting unit 41, the reference neighborhood range is calculated by the method described with reference to FIG. 9, and it is determined whether the color information of each area output from the feature point extraction unit 34 is within the reference neighborhood range. If there is a feature point, that is, if there is an area having color information close to the reference color information serving as the feature point information, the process proceeds to step S205. On the other hand, if there is no feature point, that is, if there is no area having color information close to the reference color information, the process proceeds to step S208.
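- The membership test of step S204 can be sketched as follows, assuming a hypothetical mapping from unit-area coordinates to hue values and a reference neighborhood range (the data layout is an assumption, not from the patent):

```python
def find_feature_areas(area_hues, ref_range):
    """Return the coordinates of unit areas whose hue lies within the
    reference neighborhood range (the step S204 test).
    area_hues maps (x, y) -> hue in degrees."""
    lo, hi = ref_range
    return [pos for pos, h in area_hues.items() if lo <= h <= hi]
```

An empty result corresponds to the "no feature point" branch that proceeds to step S208.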
- The image dividing unit 31 outputs the image signal divided into 18 × 13 unit areas to the feature point extraction unit 34 and the focus information calculation unit 32. Based on the display position information output from the AF area selection unit 37, the feature point extraction unit 34 calculates the color information serving as the feature point of each unit area included in the range of the AF area A1 in the image signal divided into unit areas.
- The area B1a extracted as a feature point is represented by the coordinates (5, 6).
- The area B1b extracted as a feature point is represented by the coordinates (10, 5).
- The area B1c extracted as a feature point is represented by the coordinates (8, 4).
- The area B1d extracted as a feature point is represented by the coordinates (11, 8).
- FIG. 12 is a diagram schematically showing the movement of the unit areas B1a to B1d in FIGS. 11A to 11D. As shown in FIG. 11A, FIG. 11B, FIG. 11C, and FIG. 11D, the unit area frame from which the feature points are extracted moves in the order shown in FIG. 12.
- The case where the display unit 17 is in the state illustrated in FIG. 11D will be described as an example.
- In step S205, a low-frequency component is extracted from the feature point position information.
- Specifically, the low-pass filter 36 calculates the average of the coordinates of the current feature point (area B1d) and the previous feature point (area B1c) among the selected unit areas (areas B1a to B1d), and outputs it as extracted position information.
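- This two-point averaging can be sketched as follows; applied to areas B1c (8, 4) and B1d (11, 8) it yields (9.5, 6.0):

```python
def low_pass(prev, curr):
    """Average the previous and current feature-point coordinates, as the
    low-pass filter 36 does for areas B1c -> B1d (a two-tap moving average)."""
    return ((prev[0] + curr[0]) / 2, (prev[1] + curr[1]) / 2)
```

The averaging suppresses frame-to-frame jitter in the feature-point position, which is what keeps the displayed AF area stable.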
- In step S206, the display position of the AF area A1 is selected based on the extracted position information.
- Specifically, the AF area selection unit 37 selects the display position of the AF area A1, outputs the display position information, and causes the display unit 17 to display the frame of the AF area A1.
- The defocus amount is calculated in the unit area (area B1d) selected by the unit area selection unit 40.
- Specifically, the focus information calculation unit 32 calculates the contrast from the image signal of each unit area selected by the unit area selection unit 40, and calculates the defocus amount to the position where the contrast reaches a peak. More specifically, this is done as follows.
- The lens position control unit 33 causes the lens driving unit 21 to drive the focus lens 13 in the A direction or the B direction, and sends the position information of the focus lens 13 to the focus information calculation unit 32.
- The defocus amount is then calculated from the current position and the position of the focus lens 13 at which the contrast value is highest, based on the position information of the focus lens 13 and the contrast information calculated from the image signal.
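- A minimal sketch of this contrast-peak search, assuming contrast values have already been sampled at several lens positions during the sweep (the data layout is an assumption):

```python
def defocus_amount(samples, current_pos):
    """Given (lens_position, contrast) samples gathered while the lens driving
    unit sweeps the focus lens, return the signed distance from the current
    position to the position of peak contrast (a simple hill-search sketch)."""
    peak_pos, _ = max(samples, key=lambda s: s[1])
    return peak_pos - current_pos
```

The sign of the result corresponds to driving the focus lens in the A direction or the B direction.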
- In step S207, focusing is performed on the selected unit area (for example, area B1d). Specifically, the defocus amount calculated by the focus information calculation unit 32 is sent to the lens position control unit 33, and the lens position control unit 33 drives the focus lens 13 based on this defocus amount to focus on the subject. Then, the process proceeds to step S209.
- If it is determined in step S204 that there is no feature point in the AF area (area A1), the focus information calculation unit 32, the lens position control unit 33, and the lens driving unit 21 perform focusing in step S208.
- Specifically, the focus information calculation unit 32 calculates the focus lens position information and the contrast signal generated from the image signal for all areas in the AF area.
- Then, the position of the focus lens 13 at which the contrast value is highest is calculated.
- The defocus amount is calculated from the position of the focus lens 13 having the highest contrast value and the current position.
- In step S209, the focus lens 13 is moved by the lens position control unit 33 and the lens driving unit 21 based on the defocus amount obtained by the focus information calculation unit 32 in step S207 or step S208.
- The operation of focusing on the selected area is thus performed, and the process proceeds to the next step S210.
- In step S208, the defocus amount of the closest subject area is selected from the defocus amounts obtained in all areas in area A1, and that area is focused in step S209.
- Alternatively, the defocus amount may be selected with priority given to the area near the center, and the area near the center may be focused in step S209.
- In step S210, it is determined whether or not the shutter button 19a has been fully pressed. If the shutter button 19a has been fully pressed, the process proceeds to the next step S211. When the shutter button 19a is released, all the above processes are performed again.
- In step S211, at the timing when the shutter button 19a is fully pressed, based on a command from the system controller 30, the image signal output from the image memory 16 or the image processing unit 15 is stored in the memory card, and the focus tracking imaging process ends.
- FIG. 13 is a diagram showing the coordinates of feature points calculated by the feature point position calculation unit 35.
- The graph shown in the upper part of FIG. 13 shows the change over time in the x direction of the unit area where the feature points exist.
- The vertical axis represents the x coordinate of the unit area where the feature point exists on the display unit 17, and the horizontal axis represents time t.
- The waveform Wx1 is a waveform showing the temporal change of the position in the x direction in the feature point position information output from the feature point position calculation unit 35.
- The waveform Wx2 is a waveform showing the temporal change of the position in the x direction in the extracted position information output from the low-pass filter 36. In this way, by extracting the low-frequency component of Wx1, it is possible to generate a waveform with less variation than Wx1.
- The graph shown in the lower part of FIG. 13 shows the temporal change in the y direction of the unit area where the feature points exist.
- The vertical axis represents the y coordinate of the unit area where the feature point exists on the display unit 17, and the horizontal axis represents time t.
- The waveform Wy1 is a waveform indicating the temporal change of the position in the y direction in the feature point position information output from the feature point position calculation unit 35.
- The waveform Wy2 is a waveform showing the temporal change of the position in the y direction in the extracted position information output from the low-pass filter 36. In this way, by extracting the low-frequency component of Wy1, it is possible to generate a waveform with less fluctuation than Wy1.
- In FIG. 13, the feature point position information of the unit area is plotted at intervals of Ts, the period at which the focus information calculation process and the feature point extraction process are performed.
- FIG. 14 is a diagram showing an overview of the display unit 17 on which the AF area frame and the unit area frame are displayed.
- As described with reference to FIGS. 11 and 13, the coordinates of the feature points of the subject change greatly in the order of area B1a, area B1b, area B1c, and area B1d.
- If only the unit area from which the feature point is extracted were displayed, the position of the displayed unit area would change every 1/30 [s], that is, every time the image displayed on the display unit is updated, and the screen would become very difficult to see.
- Therefore, the imaging apparatus does not display only the unit areas (areas B1a to B1d) from which the feature points of the subject are extracted, but displays an AF area covering one or more unit areas.
- Specifically, an AF area (area A1) of a size determined according to the low-frequency components of the x and y coordinates of the areas where the feature points exist, output from the low-pass filter 36 (here, a size of 7 × 5 unit areas), is set and displayed on the display unit 17. Therefore, as shown in FIG. 14A, the AF area (area A1) can be kept displayed in a stable position even when the position of the feature point of the subject changes in small increments in the order of areas B1a, B1b, B1c, and B1d.
- Further, even when the unit areas (areas B2a to B2d) from which the feature points of the subject are extracted move greatly, area A2 is set and displayed so as to include these areas in its range. Therefore, when the imaging device body is moved in the lower left direction while the AF area is displayed at the center of the display unit 17 (the state shown in FIG. 14A), the AF area displayed on the display unit 17 moves slowly in the upper right direction, from the state of FIG. 14A to the state of FIG. 14B. This makes it possible to follow the subject, clearly indicate that the subject is being tracked, and display the AF area in an easy-to-see manner.
- As described above, since the position of the displayed AF area does not change in small increments, the range in which the subject is captured can be displayed on the screen in an easy-to-see manner.
- As a result, an imaging device with high operability can be provided.
- In addition, since the control information is calculated only within the minimum necessary AF area, the computational load is reduced, and the performance of the imaging device can be improved. Furthermore, since the color information of the subject used as the feature point can be arbitrarily set by the user, the functionality of the imaging apparatus can be further improved.
- Further, when the center coordinates of the AF area are outside the predetermined range, the center coordinates of the AF area are reset to the default, and the AF area frame is moved to near the center of the screen.
- When the imaging device is a digital still camera or a digital video camera, in general, when the subject moves to the edge of the screen, the user changes the position of the imaging device so that the subject is projected near the center of the screen. Therefore, when the center coordinates of the AF area are outside the predetermined range, the display position of the AF area can be quickly moved to the vicinity of the center of the screen by resetting the center coordinates of the AF area to the default.
- In the present embodiment, the example in which the image dividing unit divides the image signal into 18 × 13 unit areas and the display unit displays 18 × 13 unit area frames has been described.
- However, the setting of the unit areas is arbitrary and may be set as appropriate.
- For example, several unit areas may be combined into one unit area. In that case, a plurality of unit areas may overlap each other.
- Also, the example in which the feature point information is calculated and stored in a 2 × 2 area frame has been described, but the setting of the size and position of the area frame is arbitrary.
- The imaging apparatus main body may be configured to store in advance some reference color information, such as skin color, as feature point information.
- In this case, the feature point information is stored in advance in a storage device such as a memory provided in the imaging device.
- When extracting the feature points, the feature point extraction unit 34 extracts them based on the feature point information stored in advance in the memory.
- In this case, the imaging apparatus does not need to include the feature point information setting unit 41.
- In the present embodiment, the example in which the frame surrounding the AF area used for focus tracking is displayed as the AF area frame has been described, but the area used for the focus tracking imaging process does not necessarily have to be the same as the displayed area.
- For example, the focus information calculation process and the feature point extraction process may also be performed in areas other than the AF area.
- However, the area of the displayed AF area frame is preferably larger than the area of the region for which the focus information is calculated.
- In the present embodiment, the display position of the AF area displayed first at the start of the focus tracking process, and the display position of the AF area when the center coordinates are outside the predetermined range, have been described with the default display position near the center of the screen, but the default position is not limited to this. For example, in a surveillance camera, the subject often appears at the edge of the screen; in such a case, the default display position of the AF area may be set to the edge of the screen.
- The imaging apparatus according to Embodiment 1 uses color information when extracting feature points.
- In contrast, the imaging apparatus according to the present embodiment is characterized by using luminance information when extracting feature points.
- FIG. 15 is a block diagram showing a configuration of the imaging apparatus according to Embodiment 2 of the present invention.
- Since the imaging apparatus according to Embodiment 2 has the same schematic configuration as the imaging apparatus according to Embodiment 1, components that function in the same manner as in FIG. 1 are given the same reference numerals, and detailed description thereof is omitted.
- The system controller 30a shown in FIG. 15 differs from the system controller 30 of the imaging apparatus according to Embodiment 1 shown in FIG. 1 in that the feature point extraction unit 34 and the feature point information setting unit 41 are omitted. Further, since the operations of the image dividing unit and the feature point position calculation unit differ from those in Embodiment 1, they are referred to as the image dividing unit 31a and the feature point position calculation unit 35a, respectively, to distinguish them from the image dividing unit 31 and the feature point position calculation unit 35 in Embodiment 1.
- The image dividing unit 31a outputs the image signal divided for each unit area to the focus information calculation unit 32 and the feature point position calculation unit 35a.
- The feature point position calculation unit 35a calculates the position where a feature point exists, based on information on the luminance of each unit area (hereinafter referred to as luminance information), from the image signal divided into a plurality of unit areas by the image dividing unit 31a.
- Specifically, the feature point position calculation unit 35a determines whether there is a luminance value, indicated by the luminance information, that changes with the passage of time.
- More specifically, the feature point position calculation unit 35a compares the luminance value of the image signal at a predetermined timing with the luminance value of the image signal at a timing after a certain time has elapsed from the predetermined timing, and if the difference between the luminance values is greater than a predetermined threshold, determines that the luminance value has changed.
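- This threshold comparison can be sketched as follows, assuming per-unit-area luminance values keyed by coordinates (a hypothetical data layout, not from the patent):

```python
def changed_areas(prev_luma, curr_luma, threshold):
    """Compare per-unit-area luminance between two timings and return the
    coordinates whose absolute difference exceeds the threshold; these are
    the positions judged to contain feature points."""
    return [pos for pos in curr_luma
            if abs(curr_luma[pos] - prev_luma[pos]) > threshold]
```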
- Then, the feature point position calculation unit 35a determines that the position where the luminance value has changed is the position where a feature point exists, and outputs the feature point position information obtained by the calculation to the low-pass filter 36 and the unit area selection unit 40.
- The operation of the imaging apparatus in the focus tracking imaging process is the same as that of the imaging apparatus according to Embodiment 1 except that luminance information is used when extracting the feature points, so FIG. 10 is referred to and the description is omitted.
- As described above, in the present embodiment, the position where the luminance value has changed is extracted as a feature point.
- However, the feature point extraction method using the luminance value is not limited to this; for example, a specific luminance value, or a luminance value equal to or greater than a certain value, may be set as a feature point in advance. In that case, the specific luminance value or the luminance value equal to or greater than the certain value is stored in the memory in advance.
- When extracting the feature points, the feature point position calculation unit reads the luminance value stored in the memory and performs the feature point extraction process. This is particularly effective when the imaging device is a fixed camera such as a surveillance camera and the background to be projected is substantially fixed.
- The imaging apparatus according to Embodiment 1 uses color information when extracting feature points.
- In contrast, the imaging apparatus according to the present embodiment is characterized by using motion vectors when extracting feature points.
- Since the configuration of the imaging apparatus according to the present embodiment is the same as that of the imaging apparatus according to Embodiment 2, FIG. 15 is referred to.
- The feature point position calculation unit 35a detects, based on the luminance values indicated by the luminance information of each unit area from the image signal divided into a plurality of unit areas by the image dividing unit 31a, the motion vectors of the subject (feature points) per predetermined time in the x and y directions, respectively.
- Then, the feature point position calculation unit 35a outputs the detected motion vectors as feature point information to the low-pass filter 36 and the unit area selection unit 40.
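- As a much-simplified sketch of such motion-vector detection (the patent does not specify the matching method; here the displacement of the peak-luminance unit area between two frames stands in for it):

```python
def motion_vector(prev_luma, curr_luma):
    """A minimal sketch: locate the peak-luminance unit area in each frame
    and return its displacement (dx, dy) per frame interval as the
    motion vector of the subject."""
    prev_pos = max(prev_luma, key=prev_luma.get)
    curr_pos = max(curr_luma, key=curr_luma.get)
    return (curr_pos[0] - prev_pos[0], curr_pos[1] - prev_pos[1])
```

A real implementation would use block matching or a similar correlation search rather than a single peak, but the output, a per-interval (dx, dy) pair, is the same shape of information.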
- The operation of the imaging apparatus in the focus tracking imaging process is the same as that of the imaging apparatus according to Embodiment 1 except that motion vectors are extracted as feature points, so FIG. 10 is referred to and the description is omitted.
- As described above, in the present embodiment, focus tracking can be performed using motion vectors.
- The imaging apparatus according to Embodiment 1 uses color information when extracting feature points.
- In contrast, the imaging apparatus according to the present embodiment is characterized by using edge information when extracting feature points.
- FIG. 16 is a block diagram showing a configuration of the imaging apparatus according to Embodiment 4 of the present invention.
- Since the imaging apparatus according to the present embodiment has the same schematic configuration as the imaging apparatus according to Embodiment 1, components that function in the same manner as in FIG. 1 are given the same reference numerals, and detailed description thereof is omitted.
- The system controller 30b shown in FIG. 16 differs from the system controller 30 of the imaging apparatus according to Embodiment 1 shown in FIG. 1 in that the feature point extraction unit 34 and the feature point information setting unit 41 are omitted. Further, since the operations of the feature point position calculation unit and the focus information calculation unit differ from those in Embodiment 1, they are referred to as the feature point position calculation unit 35b and the focus information calculation unit 32b, respectively, to distinguish them from the feature point position calculation unit 35 and the focus information calculation unit 32 in Embodiment 1.
- The focus information calculation unit 32b outputs the contrast information of each unit area to the feature point position calculation unit 35b.
- The feature point position calculation unit 35b calculates the position where a feature point exists based on the contrast information output from the focus information calculation unit 32b. Specifically, the feature point position calculation unit 35b detects the contrast difference between the background and the subject based on the contrast information, and generates edge information indicating the contour of the subject.
- As methods for generating edge information, there are, for example, a method of binarizing by comparing luminance values and a method of detecting edges using a differential filter. Any other method capable of generating edge information may also be used.
- The feature point position calculation unit 35b compares the edge information at a predetermined timing with the edge information at a timing after a predetermined time has elapsed from the predetermined timing, and extracts the changed edge positions as feature points. Then, the feature point position information of the feature points obtained by the calculation is output to the low-pass filter 36 and the unit area selection unit 40.
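- A minimal sketch of the binarization method named above, followed by extraction of the changed edge positions (the per-unit-area luminance layout is a hypothetical assumption):

```python
def edge_map(luma, threshold):
    """Binarize per-unit-area luminance by threshold comparison
    (one of the edge-generation methods named in the text)."""
    return {pos: luma[pos] >= threshold for pos in luma}

def changed_edges(prev_edges, curr_edges):
    """Return the positions where the binarized edge map changed between
    the two timings; these positions are extracted as feature points."""
    return [pos for pos in curr_edges if curr_edges[pos] != prev_edges[pos]]
```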
- The operation of the imaging apparatus in the focus tracking imaging process is the same as that of the imaging apparatus according to Embodiment 1 except that edge information is extracted as feature points, so FIG. 10 is referred to and the description is omitted.
- As described above, in the present embodiment, focus tracking is performed using edge information.
- The imaging apparatus according to Embodiment 1 starts the focus tracking imaging process when the user presses the shutter button 19a halfway.
- In contrast, the imaging apparatus according to the present embodiment is characterized in that the focus tracking imaging process is started when the shutter button is half-pressed and the focal length exceeds a predetermined value.
- FIG. 17 is a block diagram showing the configuration of the imaging apparatus according to Embodiment 5 of the present invention.
- Since the imaging apparatus according to Embodiment 5 has the same schematic configuration as the imaging apparatus according to Embodiment 1, components that function in the same manner as in FIG. 1 are given the same reference numerals, and detailed description thereof is omitted.
- The system controller 30c in FIG. 17 differs from the system controller of the imaging apparatus according to Embodiment 1 shown in FIG. 1 in that it further includes a focal length calculation unit 42. Further, since the operations of the feature point position calculation unit and the lens position control unit differ from those in Embodiment 1, they are referred to as the feature point position calculation unit 35c and the lens position control unit 33c, respectively, to distinguish them from the feature point position calculation unit 35 and the lens position control unit 33 in Embodiment 1.
- The lens position control unit 33c generates a control signal for controlling the position of the focus lens 13 based on the defocus amount output from the focus information calculation unit 32, and outputs it to the lens driving unit 21 and the focal length calculation unit 42.
- The focal length calculation unit 42 calculates the focal length based on the control signal output from the lens position control unit 33c. When the focal length becomes equal to or greater than a predetermined value, the focal length calculation unit 42 instructs the feature point position calculation unit 35c to start the focus tracking imaging process. When the feature point position calculation unit 35c receives this instruction from the focal length calculation unit 42, it starts the focus tracking imaging process.
- FIG. 18 is a flowchart showing the operation of the imaging apparatus according to Embodiment 5 in the focus tracking imaging process.
- The flowchart shown in FIG. 18 shows the operation flow of a program operating on the system controller 30c.
- As shown in FIG. 18, when the user presses the shutter button 19a halfway, the focus tracking imaging process is started.
- In step S301, the focal length calculation unit 42 calculates the focal length based on the control signal output from the lens position control unit 33c.
- In step S302, the focal length calculation unit 42 determines whether or not the calculated focal length is equal to or greater than a predetermined value. When the focal length is less than the predetermined value, the focal length calculation unit 42 ends the focus tracking imaging process.
- When the focal length is equal to or greater than the predetermined value, the process proceeds to step S201 in FIG. 10. Since the processing after step S201 is the same as in Embodiment 1, FIG. 10 is referred to and the description is omitted.
- As described above, in the present embodiment, the focus tracking imaging process can be started when the user presses the shutter button halfway and the focal length becomes equal to or greater than a certain value. This makes it possible to capture the subject and display the AF area on the screen in an easy-to-see manner even when the subject moves greatly, as when shooting at high magnification.
- The imaging apparatus according to the present embodiment is characterized by changing the size of the AF area according to the amount of movement of the subject.
- FIG. 19 is a block diagram showing a configuration of the imaging apparatus according to Embodiment 6 of the present invention.
- Since the imaging apparatus according to Embodiment 6 has the same schematic configuration as the imaging apparatus according to Embodiment 1, components that function in the same manner as in FIG. 1 are given the same reference numerals, and detailed description thereof is omitted.
- The system controller 30d shown in FIG. 19 differs from the system controller 30 of the imaging apparatus according to Embodiment 1 shown in FIG. 1 in that it further includes an area area changing unit 43. Further, since the operations of the feature point position calculation unit and the AF area selection unit differ from those in Embodiment 1, they are referred to as the feature point position calculation unit 35d and the AF area selection unit 37d, respectively, to distinguish them from the feature point position calculation unit 35 and the AF area selection unit 37 in Embodiment 1.
- The feature point position calculation unit 35d calculates and outputs the x-direction and y-direction coordinates of the feature points of the subject extracted by the feature point extraction unit 34.
- Whereas the feature point position calculation unit 35 in Embodiment 1 outputs the coordinate information to the low-pass filter 36 and the unit area selection unit 40, the feature point position calculation unit 35d outputs the coordinate information to the low-pass filter 36, the unit area selection unit 40, and the area area changing unit 43.
- The area area changing unit 43 calculates the area of the AF area based on the feature point position information output from the feature point position calculation unit 35d. Specifically, the area area changing unit 43 calculates the amplitude by performing envelope detection or taking the root mean square of the waveform of the feature point position information. Then, the area area changing unit 43 notifies the AF area selection unit 37d of the area of the AF area according to the change in amplitude. In the following, an example will be described in which envelope detection is performed and the amplitude is calculated based on the x-coordinate and y-coordinate information serving as the feature point position information.
- The AF area selection unit 37d calculates the display position and area of the AF area based on the area notified by the area area changing unit 43 and the extracted position information output from the low-pass filter 36. Then, the AF area selection unit 37d outputs the calculated display position and area of the AF area to the display unit 17 as display position information, and causes the AF area frame to be displayed.
- The operation of the imaging apparatus in the focus tracking imaging process differs from that of Embodiment 1 in the process of step S206 in the flowchart shown in FIG. 10.
- FIG. 20 is a flowchart showing the operation of the imaging apparatus according to Embodiment 6 in the focus tracking imaging process.
- The operation of the imaging apparatus according to the present embodiment will be described below with reference to FIGS. 10 and 20.
- In step S206 shown in FIG. 10, the AF area selection unit 37d selects the display position of the AF area.
- Next, in step S401 shown in FIG. 20, the area area changing unit 43 determines whether or not to change the area of the AF area. Specifically, the area area changing unit 43 performs envelope detection on the waveform of the feature point position information and determines whether or not the change in amplitude is equal to or greater than a predetermined value.
- When the area of the AF area is to be changed, that is, when the change in amplitude is equal to or greater than the predetermined value, the area area changing unit 43 notifies the AF area selection unit 37d of the area of the AF area.
- The AF area selection unit 37d calculates the display position and area of the AF area based on the notified area and the extracted position information, and outputs the calculated display position information and display area information to the display unit 17.
- The display unit 17 displays the AF area frame based on the display position information.
- Then, the process proceeds to step S207 in FIG. 10.
- On the other hand, when the area of the AF area is not changed, that is, when the change in amplitude is less than the predetermined value, the area area changing unit 43 does not notify the size of the AF area frame, and the process proceeds to step S207 in FIG. 10.
- FIG. 21 is a diagram showing the coordinates of feature points calculated by the feature point position calculation unit 35d.
- The graph shown in the upper part of FIG. 21 shows the change over time in the x direction of the unit area where the feature points exist.
- The vertical axis represents the x coordinate of the unit area where the feature point exists on the display unit 17, and the horizontal axis represents time t.
- The waveform Wx3 is a waveform indicating the temporal change of the position in the x direction in the feature point position information output from the feature point position calculation unit 35d.
- The waveform Wx4 is a waveform showing the temporal change of the position in the x direction in the extracted position information output from the low-pass filter 36.
- The graph shown in the lower part of FIG. 21 shows the temporal change in the y direction of the unit area where the feature points exist.
- The vertical axis represents the y coordinate of the unit area where the feature point exists on the display unit 17, and the horizontal axis represents time t.
- Waveform Wy3 is a waveform showing the temporal change of the position in the y direction in the feature point position information output from the feature point position calculation unit 35d.
- Waveform Wy4 is a waveform showing the temporal change of the position in the y direction in the extraction result output from the low-pass filter.
- Wx3, Wx4, Wy3, and Wy4 described here are the same as Wx1, Wx2, Wy1, and Wy2 shown in FIG. 13 described in the first embodiment, respectively.
- In the present embodiment, a predetermined threshold is set in advance, and envelope detection is performed separately for coordinates equal to or greater than the threshold and for coordinates less than the threshold.
- a waveform Wx5 represents a waveform obtained by performing envelope detection based on coordinates of the waveform Wx3 that are equal to or greater than a predetermined threshold.
- Waveform Wx6 represents a waveform obtained by performing envelope detection based on coordinates below the predetermined threshold among the coordinates of waveform Wx3. The difference between waveform Wx5 and waveform Wx6 is the amplitude in the X direction.
- the waveform Wy5 represents a waveform obtained by performing envelope detection based on coordinates of the waveform Wy3 that are equal to or greater than a predetermined threshold.
- Waveform Wy6 represents a waveform obtained by performing envelope detection based on coordinates below the predetermined threshold among the coordinates of waveform Wy3. The difference between waveform Wy5 and waveform Wy6 is the amplitude in the y direction.
- The area size changing unit 43 calculates the change in position of the feature point of the subject in the X direction and the y direction as the amplitude, and calculates the size (number of unit areas) of the AF area to be displayed.
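As a rough illustration of the processing above, the amplitude estimation from upper and lower envelopes can be sketched as follows. This is not the patented implementation: the hold-and-decay envelope follower, the decay factor, and the threshold value are all assumptions made for the sketch.

```python
# Sketch of the amplitude estimation in Embodiment 6 (illustrative only):
# coordinates of the unit area containing the feature point are split at a
# threshold, each side is envelope-detected with a simple "hold-and-decay"
# follower (an assumption; the text does not specify the detector), and the
# difference of the two envelopes gives the amplitude.

def envelope(samples, threshold, decay=0.9, upper=True):
    """Track the upper (>= threshold) or lower (< threshold) envelope."""
    env = []
    held = threshold
    for s in samples:
        if upper:
            if s >= threshold and s > held:
                held = s                                        # new peak: hold it
            else:
                held = threshold + (held - threshold) * decay   # decay toward threshold
        else:
            if s < threshold and s < held:
                held = s                                        # new trough: hold it
            else:
                held = threshold - (threshold - held) * decay   # decay toward threshold
        env.append(held)
    return env

def amplitude(samples, threshold):
    """Amplitude = largest gap between the upper (Wx5-like) and lower (Wx6-like) envelopes."""
    up = envelope(samples, threshold, upper=True)
    lo = envelope(samples, threshold, upper=False)
    return max(u - l for u, l in zip(up, lo))

# Feature-point x coordinates oscillating around an assumed threshold of 8:
xs = [7, 9, 7, 10, 6, 9, 7, 10, 6, 9]
amp = amplitude(xs, threshold=8)
print(amp)  # → about 3.8 with these settings
```

A unit such as the area size changing unit 43 could then compare `amp` against a predetermined value and widen the AF area when it is exceeded.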
- FIG. 22 is an overview of the display unit 17 on which the AF area frame in the sixth embodiment is displayed.
- a method for controlling the size of the AF area frame displayed on the display unit 17 will be described with reference to FIG. 21 and FIG.
- The AF area frame displayed on the display unit 17 is enlarged in the order of AF area A3a formed from 7 × 5 unit areas, AF area A3b formed from 9 × 7 unit areas, and AF area A3c formed from 10 × 8 unit areas.
- According to the present embodiment, the effects described in Embodiments 1 to 5 are achieved. In addition, because the size of the AF area is controlled according to the movement of the subject, the AF area size is adjusted even when the amount of camera shake of the imaging device body varies from person to person or increases at high magnification, and the display area of the AF area frame does not fluctuate in small increments following the movement of the subject on the screen. Furthermore, the calculation processing for focus tracking can be performed with a minimum load.
- In the present embodiment, as shown in FIG. 21, envelope detection processing is performed based on the coordinates of feature points that are equal to or greater than a predetermined threshold and on those that are less than the threshold. However, the present invention is not limited to this.
- For example, when the coordinate of a feature point is equal to or greater than the predetermined threshold and exceeds the held coordinate, the current coordinate may be taken as the output of envelope detection, and when the coordinate of a feature point is less than the predetermined threshold and falls below the held coordinate, the current coordinate may likewise be taken as the output of envelope detection; that is, so-called peak hold processing may be performed.
- When the amplitude changes greatly, the peak hold processing may be reset and performed again from the beginning.
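The peak-hold alternative described above can be sketched as follows. The concrete reset condition (a jump in the held peak-to-peak value beyond `reset_delta`) is an assumption, since the text only says the hold may be reset when the amplitude changes greatly.

```python
# Sketch of the peak-hold alternative to envelope detection (illustrative).
# Above the threshold, the largest coordinate seen so far is held; below it,
# the smallest. When the held peak-to-peak value jumps by more than
# `reset_delta` (an assumed concrete reset condition), both holds restart.

def peak_hold_amplitude(samples, threshold, reset_delta=6):
    hi, lo = threshold, threshold
    prev_pp = 0
    for s in samples:
        if s >= threshold:
            hi = max(hi, s)        # hold the highest coordinate seen so far
        else:
            lo = min(lo, s)        # hold the lowest coordinate seen so far
        pp = hi - lo               # current peak-to-peak amplitude
        if pp - prev_pp > reset_delta:   # amplitude changed greatly: reset
            hi, lo = threshold, threshold
            pp = 0
        prev_pp = pp
    return prev_pp

print(peak_hold_amplitude([7, 9, 6, 10, 7, 9], threshold=8))  # → 4
```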
- the operating principle of the low-pass filter 36 described in the first embodiment will be described in more detail.
- The low-pass filter 36 receives the feature point position information output from the feature point position calculation unit 35, and extracts and outputs the low-frequency component of the temporal vibration frequency of the feature point position information.
- the extracted low-frequency component values are output to the AF area selection unit 37 as AF area display position information.
- the imaging apparatus according to the present embodiment has the same schematic configuration as that of the imaging apparatus according to the first embodiment.
- the same reference numerals are assigned and detailed description is omitted.
- FIGS. 23A to 23D are overview diagrams of the display unit 17 in which the subject and the area frame are projected.
- FIG. 23 shows an example in which the image signal to be displayed on the display unit 17 is divided into 16 parts in the horizontal direction (X direction) and 12 parts in the y direction.
- the image signal is divided into 16 ⁇ 12 unit areas, and the display unit 17 displays unit area frames that respectively surround the 16 ⁇ 12 unit areas.
- the coordinates of each unit area shown in FIGS. 23A to 23D are expressed in units of 0 to 15 in the X direction and 0 to 11 in the y direction.
- each of the unit areas B4a to B4d is defined as an area of 20 [pixels] × 20 [pixels].
- FIGS. 23A to 23D show a state in which the position of the subject P1 displayed on the display unit 17 changes due to camera shake or movement of the subject, in the order of FIGS. 23A, 23B, 23C, and 23D. Accordingly, the unit area frame in which the feature point of the subject extracted by the feature point extraction unit 34 exists moves through the area B4a at coordinates (7, 5), the area B4b at coordinates (8, 6), the area B4c at coordinates (8, 5), and the area B4d at coordinates (7, 6).
- a dotted line frame indicates an AF area frame that moves by focus tracking of a conventional imaging apparatus.
- feature point position information in the X direction is output in the order of 7, 8, 8, and 7, and feature point position information in the y direction is output in the order of 5, 6, 5, 6.
- Since the unit area B4 is composed of 20 [pixels] × 20 [pixels], if the X-direction coordinate changes from 7 to 8 or from 8 to 7, an image blur of the subject of 20 [pixels] is generated in the X direction, and if the y-direction coordinate changes from 5 to 6 or from 6 to 5, an image blur of the subject of 20 [pixels] is generated in the y direction.
- With conventional imaging devices, the AF area frame therefore moves 20 [pixels] in the X or y direction every 1/30 second, so the position of the displayed AF area frame changes little by little, and the display on the screen becomes very difficult to see.
- solid line frames surrounding the areas A4a to d indicate AF area frames that move by the focus tracking of the imaging apparatus according to the present embodiment.
- The imaging apparatus according to the present embodiment includes the low-pass filter 36, which extracts low-frequency components from the fluctuation components of the temporal vibration frequency of the coordinates, and can thereby prevent the position of the AF area frame from changing little by little.
- FIG. 24 is a block diagram showing details of the low-pass filter 36 according to the present embodiment.
- FIG. 24 shows a configuration example in which the low-pass filter 36 is an IIR (Infinite Impulse Response) filter implemented as a digital circuit.
- the low-pass filter 36 includes a position information processing unit 360x and a position information processing unit 360y.
- the position information processing unit 360x includes coefficient blocks 361 and 363, a delay block 362, and an addition block 364, and extracts and outputs the low-frequency component of the temporal vibration frequency of the feature point position information in the x direction.
- the position information processing unit 360y includes coefficient blocks 365 and 367, a delay block 366, and an addition block 368, and extracts and outputs the low-frequency component of the temporal vibration frequency of the feature point position information in the y direction.
- the position information processing unit 360x and the position information processing unit 360y are different in the feature point position information to be processed, but the basic operation is the same, so the position information processing unit 360x will be described as a representative example.
- the feature point position information in the X direction output from the feature point position calculation unit 35 is input to the addition block 364 of the low-pass filter 36.
- The feature point position calculation unit 35 outputs the coordinates in the X direction of the unit area B4 shown in FIGS. 23A to 23D as the feature point position information in the X direction, and outputs the coordinates in the y direction of the unit area B4 as the feature point position information in the y direction.
- the feature point position information is updated and output every 1/30 second, for example.
- the addition block 364 adds the value output from the feature point position calculation unit 35 and the value output from the coefficient block 363.
- the coefficient block 361 processes the value added by the addition block 364 with a predetermined coefficient K1, and outputs it to the AF area selection unit 37.
- Delay block 362 delays the value added by addition block 364 by a predetermined time and outputs the result to coefficient block 363.
- the coefficient block 363 processes the value output from the delay block 362 with a predetermined coefficient K2, and outputs the result to the addition block 364.
- A low-pass filter can be configured by setting, as the coefficients, K1 expressed by the following formula (1) and K2 expressed by formula (2).
- K1 = 1 / {1 + 1 / ((1/30) × 2 × π × fc)} ... (1)
- K2 = 1 / {1 + (1/30) × 2 × π × fc} ... (2)
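A one-pole IIR low-pass filter with this block structure (addition block, delay block, and coefficients K1 and K2) can be sketched as follows. The coefficient expressions assume a 30 frames-per-second update rate and unity DC gain, consistent with the description; they are a reconstruction for illustration, not the patent's verbatim formulas.

```python
import math

FRAME_RATE = 30.0  # feature point position updated every 1/30 s

def coefficients(fc):
    """K1, K2 for a one-pole IIR low-pass as in FIG. 24 (unity DC gain assumed)."""
    w = 2.0 * math.pi * fc / FRAME_RATE   # normalized cutoff per frame
    k1 = 1.0 / (1.0 + 1.0 / w)            # feed-in coefficient (block 361)
    k2 = 1.0 / (1.0 + w)                  # feedback coefficient (block 363)
    return k1, k2

def low_pass(positions, fc):
    """Filter a sequence of feature-point coordinates along one axis."""
    k1, k2 = coefficients(fc)
    state = 0.0       # delay block 362
    out = []
    for x in positions:
        state = x + k2 * state    # addition block 364
        out.append(k1 * state)    # output toward the AF area selection unit 37
    return out

# A feature point jittering between unit-area coordinates 7 and 8 every frame:
jitter = [7, 8] * 60
smoothed = low_pass(jitter, fc=0.5)
# After settling, the filtered coordinate stays near 7.5 instead of jumping,
# so an AF area frame drawn from it no longer flickers between unit areas.
```

With fc around 0.5 [Hz], the filtered coordinate settles to a nearly constant value near 7.5, which is the behavior the embodiment relies on to keep the displayed AF area frame steady.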
- FIGS. 25A and 25B show input / output signal waveforms of the low-pass filter 36.
- FIG. 25A is a diagram illustrating a waveform of an input signal input to the low-pass filter 36
- FIG. 25B is a diagram illustrating a waveform of an output signal output from the low-pass filter 36.
- the vertical axis represents the number of pixels corresponding to the amount of displacement of the subject
- the horizontal axis represents the frame number.
- As signals input to the low-pass filter 36, there are two signals, one in the X direction and one in the y direction. In FIGS. 25A and 25B, the waveforms of the input signal and the output signal for only one of these signals are shown.
- FIG. 25A shows an example in which the position of the subject represented by the input signal is changed within a range of ⁇ 20 [pixels].
- Since the frame update period is 1/30 second, the time spanned by frame numbers 0 to 300 is 10 seconds.
- In the present embodiment, the cutoff frequency fc of the low-pass filter 36 was varied, and the degree of image blur of the subject imaged on the display unit 17 was evaluated. Specifically, 10 test subjects each evaluated the image blur of the subject displayed on a 2.5-inch QVGA monitor screen, judging whether the degree of image blur corresponded to "image blur is not a problem: 1 point", "cannot say either way: 0.5 points", or "image blur is a problem: 0 points".
- The evaluation was performed with the cutoff frequency fc of the low-pass filter set to 5 [Hz], 4 [Hz], 3 [Hz], 2 [Hz], 1 [Hz], 0.5 [Hz], 0.2 [Hz], and 0.1 [Hz].
- FIG. 26 is a graph showing the relationship between the cutoff frequency fc of the low-pass filter 36 and the image blur evaluation score given by the test subjects.
- the vertical axis represents the average value of the image blur evaluation scores given by the test subjects, and the horizontal axis represents the cutoff frequency fc.
- The frequency range in which the input signal exists can be defined as fs/2, where fs is the sampling frequency.
- The equivalent noise bandwidth of the first-order low-pass filter is fc × π/2, so relative to the average power of the input signal existing in the range up to fs/2, the power of the output signal that has passed through the low-pass filter 36 is statistically attenuated to (fc × π/2) / (fs/2).
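Reading the attenuation factor as the ratio of the filter's equivalent noise bandwidth (π/2 × fc for a first-order low-pass) to the input band fs/2 is an interpretation of the garbled source text. The sketch below compares that estimate with the exact white-noise power gain of the one-pole filter; they agree closely when fc is much smaller than fs.

```python
import math

FS = 30.0  # sampling (frame) frequency

def noise_gain(fc):
    """Exact ratio of output to input power for white noise through a
    one-pole IIR low-pass with unity DC gain (K1, K2 as in the text)."""
    w = 2.0 * math.pi * fc / FS
    k1 = w / (1.0 + w)            # same K1 as formula (1)
    k2 = 1.0 / (1.0 + w)          # same K2 as formula (2)
    return k1 * k1 / (1.0 - k2 * k2)   # sum of squared impulse response

def enb_estimate(fc):
    """Equivalent-noise-bandwidth estimate: (pi/2 * fc) / (fs/2)."""
    return (math.pi / 2.0 * fc) / (FS / 2.0)

for fc in (0.5, 1.0, 2.0):
    print(fc, round(noise_gain(fc), 4), round(enb_estimate(fc), 4))
```

The two columns track each other for small fc; the estimate degrades as fc approaches fs, where the continuous-time approximation no longer holds.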
- Using equations (3) and (4), for example, when the cutoff frequency fc is used as a parameter, fluctuations in the display position of the AF area frame can be reduced as follows.
- the low-pass filter 36 is constituted by a digital filter.
- The display condition of the AF area frame can be easily set and changed by changing the cutoff frequency fc of the low-pass filter and the block coefficients. It is therefore possible to easily set the AF area frame display conditions according to differences in image blur due to the size of the monitor screen and the size and type of the imaging apparatus main body.
- In the present embodiment, the AF area frame is displayed based on the position information output via the low-pass filter, which suppresses temporal fluctuation in the position information of the subject.
- The cutoff frequency fc may be set so that the user does not perceive small vibrations of the AF area frame; for example, it may be determined appropriately according to the number of pixels on the screen, the contrast, and the size of the subject.
- In the present embodiment, the monitor screen was described as an example of a 2.5-inch QVGA screen, on which the display can be made relatively easy to see if the image blur is suppressed to the extent of 4 [pixels] to 13 [pixels].
- For a VGA (640 [pixels] × 480 [pixels]) screen of the same size, the display can be made relatively easy to see if the image blur is suppressed to 8 [pixels] to 26 [pixels], in proportion to the increase in resolution in one direction.
- When the monitor screen has a different number of inches, the display can be made relatively easy to see if the image blur is suppressed to the extent of 6.7 [pixels] to 10.8 [pixels], in inverse proportion to the increase in the number of inches.
- The low-pass filter is not limited to an IIR filter, and may be configured as an FIR (Finite Impulse Response) filter, or as a second-order or higher-order digital filter.
- the low-pass filter may be an analog filter. In that case, the feature point position information of the subject may be extracted as an analog signal.
- the digital filter may be configured by programming a microcomputer or the like, or it may be configured by hardware.
- The imaging apparatus according to each embodiment is not limited to the specific configurations described and can be changed as appropriate.
- In each embodiment, the imaging apparatus was described as a digital still camera in which the user operates the shutter button to acquire a still image. However, it can also be applied to a digital video camera in which images continue to be acquired. In this case, focus tracking can be performed even if the subject moves.
- The imaging apparatus according to each embodiment may also be applied to a surveillance camera, an in-vehicle camera, or a web camera. In such cases, it may be difficult for the user to operate the shutter button directly, so the shutter button may be operated automatically at a predetermined timing, or may be operated remotely.
- In each embodiment, the imaging apparatus was described as an example in which a system controller is individually provided. However, the present invention may also be applied to an imaging system in which the control CPU of a personal computer or a mobile phone terminal is used instead.
- Furthermore, the respective constituent elements may be combined arbitrarily. Various combinations are conceivable, such as a system in which the photographing optical system and the image sensor are physically separated from the other structures, or a system in which the photographing optical system, the image sensor, and the image processing unit are physically separated from the other configurations.
- the present invention is suitable for an imaging apparatus such as a digital still camera and a digital video camera.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2007501666A JP4245185B2 (ja) | 2005-02-07 | 2006-02-06 | 撮像装置 |
US12/832,617 US20100277636A1 (en) | 2005-02-07 | 2010-07-08 | Imaging device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2005-030264 | 2005-02-07 | ||
JP2005030264 | 2005-02-07 | ||
JP2005-114992 | 2005-04-12 | ||
JP2005114992 | 2005-04-12 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/832,617 Division US20100277636A1 (en) | 2005-02-07 | 2010-07-08 | Imaging device |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2006082967A1 true WO2006082967A1 (ja) | 2006-08-10 |
Family
ID=36777344
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2006/301998 WO2006082967A1 (ja) | 2005-02-07 | 2006-02-06 | 撮像装置 |
Country Status (3)
Country | Link |
---|---|
US (2) | US7769285B2 (ja) |
JP (2) | JP4245185B2 (ja) |
WO (1) | WO2006082967A1 (ja) |
Cited By (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080002028A1 (en) * | 2006-06-30 | 2008-01-03 | Casio Computer Co., Ltd. | Imaging apparatus and computer readable recording medium |
JP2008288868A (ja) * | 2007-05-17 | 2008-11-27 | Casio Comput Co Ltd | 撮像装置及びプログラム |
JP2009037152A (ja) * | 2007-08-03 | 2009-02-19 | Canon Inc | 合焦制御装置及び方法 |
JP2010021597A (ja) * | 2008-07-08 | 2010-01-28 | Victor Co Of Japan Ltd | 撮像装置および撮像方法 |
EP2232331A2 (en) * | 2007-12-20 | 2010-09-29 | Thomson Licensing | Device for helping the capture of images |
JP2012093775A (ja) * | 2011-12-14 | 2012-05-17 | Nikon Corp | 焦点検出装置および撮像装置 |
US8200081B2 (en) | 2007-08-02 | 2012-06-12 | Canon Kabushiki Kaisha | Image capturing apparatus and method for controlling same |
CN101753834B (zh) * | 2008-11-28 | 2013-06-26 | 日立民用电子株式会社 | 信号处理装置 |
Families Citing this family (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2007010908A (ja) * | 2005-06-29 | 2007-01-18 | Canon Inc | 焦点調節方法及び装置 |
JP5045125B2 (ja) * | 2006-03-15 | 2012-10-10 | 株式会社ニコン | 被写体追尾装置および光学機器 |
JP5016909B2 (ja) * | 2006-12-15 | 2012-09-05 | キヤノン株式会社 | 撮像装置 |
WO2009050841A1 (ja) * | 2007-10-17 | 2009-04-23 | Nikon Corporation | 合焦測定装置、合焦測定方法およびプログラム |
JP5176483B2 (ja) | 2007-10-30 | 2013-04-03 | 株式会社ニコン | 画像認識装置、画像追尾装置および撮像装置 |
US8237807B2 (en) | 2008-07-24 | 2012-08-07 | Apple Inc. | Image capturing device with touch screen for adjusting camera settings |
JP2010097167A (ja) * | 2008-09-22 | 2010-04-30 | Fujinon Corp | オートフォーカス装置 |
JP5206494B2 (ja) * | 2009-02-27 | 2013-06-12 | 株式会社リコー | 撮像装置、画像表示装置と、撮像方法及び画像表示方法並びに合焦領域枠の位置補正方法 |
JP2011029905A (ja) * | 2009-07-24 | 2011-02-10 | Fujifilm Corp | 撮像装置、方法およびプログラム |
JP5810505B2 (ja) * | 2010-11-01 | 2015-11-11 | 株式会社ソシオネクスト | 撮像制御装置、撮像装置、及び撮像制御方法 |
US9077890B2 (en) * | 2011-02-24 | 2015-07-07 | Qualcomm Incorporated | Auto-focus tracking |
CN103733607B (zh) * | 2011-08-10 | 2015-08-26 | 富士胶片株式会社 | 用于检测运动物体的装置和方法 |
US9766701B2 (en) * | 2011-12-28 | 2017-09-19 | Intel Corporation | Display dimming in response to user |
CN103999145B (zh) * | 2011-12-28 | 2017-05-17 | 英特尔公司 | 响应于用户的显示器调光 |
US9870752B2 (en) | 2011-12-28 | 2018-01-16 | Intel Corporation | Display dimming in response to user |
US20130258167A1 (en) * | 2012-03-28 | 2013-10-03 | Qualcomm Incorporated | Method and apparatus for autofocusing an imaging device |
JP5949306B2 (ja) * | 2012-08-13 | 2016-07-06 | 株式会社ニコン | 画像処理装置、撮像装置、および画像処理プログラム |
JP5900273B2 (ja) * | 2012-10-03 | 2016-04-06 | 株式会社ソシオネクスト | 合焦評価値生成装置、合焦評価値生成方法、及び、合焦評価値生成プログラム |
JP6128928B2 (ja) * | 2013-04-17 | 2017-05-17 | キヤノン株式会社 | 撮像装置、その制御方法、および制御プログラム |
JP5842884B2 (ja) * | 2013-09-04 | 2016-01-13 | 株式会社ニコン | 追尾装置およびカメラ |
CN105446056B (zh) * | 2014-12-25 | 2018-08-10 | 北京展讯高科通信技术有限公司 | 自动对焦装置及方法 |
JP6544936B2 (ja) * | 2015-01-30 | 2019-07-17 | キヤノン株式会社 | 撮像装置の制御装置、撮像装置及びその制御方法 |
KR102493746B1 (ko) * | 2016-08-18 | 2023-02-02 | 삼성전자주식회사 | 이미지 신호 처리 방법, 이미지 신호 프로세서, 및 전자 장치 |
JP7249748B2 (ja) * | 2018-08-30 | 2023-03-31 | 株式会社ミツトヨ | 焦点距離可変レンズ装置および焦点距離可変レンズ制御方法 |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS60254108A (ja) * | 1984-05-31 | 1985-12-14 | Canon Inc | カメラにおける自動追尾装置 |
JPH01175373A (ja) * | 1987-12-28 | 1989-07-11 | Sony Corp | フォーカス制御回路 |
JPH03149512A (ja) * | 1989-11-07 | 1991-06-26 | Sony Corp | フォーカス制御回路 |
JPH04158322A (ja) * | 1990-10-23 | 1992-06-01 | Ricoh Co Ltd | 自動焦点調整装置 |
Family Cites Families (12)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2535845B2 (ja) * | 1986-10-08 | 1996-09-18 | キヤノン株式会社 | 自動合焦装置 |
JP3298072B2 (ja) * | 1992-07-10 | 2002-07-02 | ソニー株式会社 | ビデオカメラシステム |
US6476868B1 (en) * | 1994-04-11 | 2002-11-05 | Canon Kabushiki Kaisha | Image pickup apparatus provided with enlargement process means for enlarging image signals output from an image pickup device |
JP3450449B2 (ja) | 1994-07-18 | 2003-09-22 | キヤノン株式会社 | 撮像装置およびその撮像方法 |
EP1107166A3 (en) * | 1999-12-01 | 2008-08-06 | Matsushita Electric Industrial Co., Ltd. | Device and method for face image extraction, and recording medium having recorded program for the method |
JP4122865B2 (ja) | 2002-07-02 | 2008-07-23 | コニカミノルタオプト株式会社 | オートフォーカス装置 |
US20040234134A1 (en) * | 2003-05-19 | 2004-11-25 | Kabushiki Kaisha Toshiba | Image processing apparatus and image processing method |
JP4181923B2 (ja) | 2003-05-29 | 2008-11-19 | キヤノン株式会社 | 撮像装置および撮像装置の制御方法 |
JP4525089B2 (ja) * | 2004-01-27 | 2010-08-18 | フジノン株式会社 | オートフォーカスシステム |
KR100542365B1 (ko) * | 2004-05-07 | 2006-01-10 | 삼성전자주식회사 | 영상 화질 개선 장치 및 그 방법 |
EP1601189A2 (en) * | 2004-05-26 | 2005-11-30 | Fujinon Corporation | Autofocus system |
US7561790B2 (en) * | 2004-12-28 | 2009-07-14 | Fujinon Corporation | Auto focus system |
-
2006
- 2006-02-06 US US11/795,561 patent/US7769285B2/en active Active
- 2006-02-06 WO PCT/JP2006/301998 patent/WO2006082967A1/ja not_active Application Discontinuation
- 2006-02-06 JP JP2007501666A patent/JP4245185B2/ja active Active
-
2008
- 2008-07-30 JP JP2008196756A patent/JP2008312242A/ja active Pending
-
2010
- 2010-07-08 US US12/832,617 patent/US20100277636A1/en not_active Abandoned
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080002028A1 (en) * | 2006-06-30 | 2008-01-03 | Casio Computer Co., Ltd. | Imaging apparatus and computer readable recording medium |
US8284256B2 (en) * | 2006-06-30 | 2012-10-09 | Casio Computer Co., Ltd. | Imaging apparatus and computer readable recording medium |
JP2008288868A (ja) * | 2007-05-17 | 2008-11-27 | Casio Comput Co Ltd | 撮像装置及びプログラム |
US8363122B2 (en) | 2007-05-17 | 2013-01-29 | Casio Computer Co., Ltd. | Image taking apparatus execute shooting control depending on face location |
US8200081B2 (en) | 2007-08-02 | 2012-06-12 | Canon Kabushiki Kaisha | Image capturing apparatus and method for controlling same |
JP2009037152A (ja) * | 2007-08-03 | 2009-02-19 | Canon Inc | 合焦制御装置及び方法 |
EP2232331A2 (en) * | 2007-12-20 | 2010-09-29 | Thomson Licensing | Device for helping the capture of images |
EP2232331B1 (en) * | 2007-12-20 | 2022-02-09 | InterDigital CE Patent Holdings | Device for helping the capture of images |
JP2010021597A (ja) * | 2008-07-08 | 2010-01-28 | Victor Co Of Japan Ltd | 撮像装置および撮像方法 |
CN101753834B (zh) * | 2008-11-28 | 2013-06-26 | 日立民用电子株式会社 | 信号处理装置 |
JP2012093775A (ja) * | 2011-12-14 | 2012-05-17 | Nikon Corp | 焦点検出装置および撮像装置 |
Also Published As
Publication number | Publication date |
---|---|
US20100277636A1 (en) | 2010-11-04 |
JP2008312242A (ja) | 2008-12-25 |
US20080131109A1 (en) | 2008-06-05 |
JP4245185B2 (ja) | 2009-03-25 |
US7769285B2 (en) | 2010-08-03 |
JPWO2006082967A1 (ja) | 2008-06-26 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP4245185B2 (ja) | 撮像装置 | |
US9830947B2 (en) | Image-capturing device | |
US9992421B2 (en) | Image pickup apparatus having FA zoom function, method for controlling the apparatus, and recording medium | |
JP4929630B2 (ja) | 撮像装置、制御方法、およびプログラム | |
JP5381486B2 (ja) | 撮像装置および方法 | |
JP6494202B2 (ja) | 像振れ補正装置、その制御方法、および撮像装置 | |
JP4916513B2 (ja) | 撮像装置 | |
US10873696B2 (en) | Image capture apparatus and control method for the same | |
JP2005284155A (ja) | マニュアルフォーカス調整装置及び合焦アシスト・プログラム | |
JP5906427B2 (ja) | 撮像装置、画像処理装置 | |
JP6598028B2 (ja) | 撮像装置 | |
JP6932531B2 (ja) | 像ブレ補正装置、撮像装置および撮像装置の制御方法 | |
JP6824710B2 (ja) | ズーム制御装置およびズーム制御方法、撮像装置 | |
JP5903658B2 (ja) | 撮像装置 | |
JP2016050973A (ja) | 撮像装置及びその制御方法 | |
US20110032390A1 (en) | Digital photographing apparatus and moving picture capturing method performed by the same | |
US11743576B2 (en) | Image processing apparatus, image processing method, program, and imaging apparatus | |
JP2014062926A (ja) | オートフォーカスシステム | |
KR101373018B1 (ko) | 디지털 영상 처리기에서 저광량 시의 손떨림 방지 장치 및방법 | |
JP2008178031A (ja) | 撮像装置 | |
JP2008249941A (ja) | 画像振れ補正装置および撮像装置 | |
JP2008064847A (ja) | 像振れ補正制御装置、制御方法、撮像装置、および撮像方法 | |
JP2014215550A (ja) | 撮像装置及びその制御方法、プログラム、記憶媒体 | |
JP6104022B2 (ja) | 撮像装置、その制御方法及びプログラム | |
JP2018056650A (ja) | 光学装置、撮像装置および制御方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWE | Wipo information: entry into national phase |
Ref document number: 11795561 Country of ref document: US Ref document number: 2007501666 Country of ref document: JP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 200680004204.6 Country of ref document: CN |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 06713141 Country of ref document: EP Kind code of ref document: A1 |
|
WWW | Wipo information: withdrawn in national office |
Ref document number: 6713141 Country of ref document: EP |