WO2006085827A1 - Method and apparatus for forming a panoramic image - Google Patents

Method and apparatus for forming a panoramic image

Info

Publication number
WO2006085827A1
WO2006085827A1 (PCT/SG2006/000024)
Authority
WO
WIPO (PCT)
Prior art keywords
image
camera
display
photograph
blending area
Prior art date
Application number
PCT/SG2006/000024
Other languages
French (fr)
Inventor
Yuen Khim Liow
Siang Thia Goh
Guoran Liu
Original Assignee
Creative Technology Ltd
Priority date
Filing date
Publication date
Application filed by Creative Technology Ltd filed Critical Creative Technology Ltd
Priority to DE112006000358.5T priority Critical patent/DE112006000358B4/en
Priority to GB0715571A priority patent/GB2438335B/en
Publication of WO2006085827A1 publication Critical patent/WO2006085827A1/en

Classifications

    • G03B37/00 Panoramic or wide-screen photography; Photographing extended surfaces, e.g. for surveying; Photographing internal surfaces, e.g. of pipe
    • G03B37/04 Panoramic or wide-screen photography with cameras or projectors providing touching or overlapping fields of view
    • G03B17/02 Details of cameras or camera bodies; Accessories therefor; Bodies
    • G03B17/20 Signals indicating condition of a camera member or suitability of light, visible in viewfinder
    • G06T3/4038 Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • H04N23/60 Control of cameras or camera modules
    • H04N23/634 Warning indications, displayed in electronic viewfinders
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • H04N5/2624 Studio circuits for obtaining an image which is composed of whole input images, e.g. splitscreen

Definitions

  • the blending area 32 will normally be at a side of display 32, but it may be adjacent to a side, or even remote from a side. Depending on the direction of the panoramic series capture sequence, the blending area 32 could reside at any of the four sides of the LCD display: top, bottom, left side or right side.
  • the Red Green Blue ("RGB") value of the pixels for the first image within the blending area 32 will be reduced by a predetermined amount such as, for example, 40% to 60%, preferably 50%. Before the capturing of the next photograph the viewfinder and/or LCD display will show a preview of the image to be captured.
  • the RGB value of the pixels within the blending area for the image to be captured will also be reduced by the same amount (e.g. 50%).
  • the blending area 32 is thus made up of pixels from the previous and current images using additive blending with their reduced RGB values being added together.
  • the summed RGB values are further reduced by a preset amount such as, for example, 5 to 25%, preferably 10%.
  • the rest of the LCD display will present the current image without alteration.
  • RGB[LCD]_P = (RGB[image(previous)]_P × 0.5 + RGB[preview]_P × 0.5) × 0.9
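By way of a non-limitative sketch (in Python; the function name is illustrative and not part of the patent), the per-pixel additive blend above can be expressed as:

```python
def blend_pixel(prev_rgb, preview_rgb):
    """Additive blend for one pixel in the blending area.

    Each image's RGB values are halved, summed, then reduced by a further
    10%, matching RGB[LCD]_P = (prev * 0.5 + preview * 0.5) * 0.9.
    Illustrative helper; not taken from the patent text.
    """
    return tuple(
        round((p * 0.5 + q * 0.5) * 0.9) for p, q in zip(prev_rgb, preview_rgb)
    )
```

For example, blending a previous pixel (200, 100, 50) with a preview pixel (100, 100, 150) yields (135, 90, 90), i.e. 45% of each source.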
  • a 2" LCD may have 206,000 pixels.
  • the blending area will consist of 30,900 pixels.
  • the blending and matching processes apply only to the pixels within the blending area. Therefore, in stitch assist mode, both the on-screen blending and the match status calculation involve only 30,900 pixels.
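The pixel count above follows directly from the preferred 15% blending area; a quick check (illustrative values only):

```python
total_pixels = 206_000        # pixels on a 2" LCD, as stated above
blending_fraction = 0.15      # preferred 15% blending area
blending_pixels = round(total_pixels * blending_fraction)
print(blending_pixels)        # 30900
```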
  • a status indicator 34 is provided on the image 30 and the color of the status indicator 34 provides an indication of how well the previously captured image 30 and the current preview image 36 match within the blending area 32. For example, a red color may represent poor matching, a yellow color may represent partial matching, and a green color may represent perfect matching.
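A hypothetical mapping from the matching value M (a small value represents good matching) to the indicator colour could be sketched as follows; the patent does not specify numeric thresholds, so the values here are assumptions:

```python
def indicator_colour(m, good=10.0, perfect=2.0):
    """Map matching value M to a status colour (thresholds are assumed)."""
    if m <= perfect:
        return "green"   # perfect matching - safe to take the shot
    if m <= good:
        return "yellow"  # partial matching - can be improved
    return "red"         # poor matching - do not take the shot
```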
  • each pixel used in the matching process is given a weight value to represent its visual importance. It is preferable not to use equal weighting for all the pixels because certain pixels that make up boundary edges are visually more important. The following is how each pixel's weight value (W) and matching value (M) can be computed:
  • the subset of pixels in the blending area used in the matching is defined as MP.
  • M denotes the matching status used to evaluate the result of the matching.
  • a small value represents good matching. This value is used to determine the colour of the matching status indicator.
  • P(i, j) denotes the pixel at row i and column j in the blending area.
  • W(i, j) denotes the weight of pixel P(i, j), accumulated over neighbouring offsets (s, u, v):
  • W(i, j) = W(i, j) + abs(R[image(previous)]_P(i+s·u, j+s·v) − R[image(previous)]_P(i, j))
  • W_max = max(W(i, j), (i, j) ∈ MP)
  • W(i, j) = W(i, j) / W_max; // pixel P(i, j)'s normalised weight value W
  • M = 0; // M is the matching value
  • for ((i, j) ∈ MP): M = M + W(i, j) × (abs(R[image(preview)]_P(i, j) − R[image(previous)]_P(i, j)) + abs(G[image(preview)]_P(i, j) − G[image(previous)]_P(i, j)) + abs(B[image(preview)]_P(i, j) − B[image(previous)]_P(i, j))); M = M / nPixel
  • R[image(previous)]_P(i, j) denotes the Red value of the previous image at pixel P(i, j)
  • G[image(previous)]_P(i, j) denotes the Green value of the previous image at pixel P(i, j)
  • B[image(previous)]_P(i, j) denotes the Blue value of the previous image at pixel P(i, j)
  • R[image(preview)]_P(i, j) denotes the Red value of the preview image at pixel P(i, j)
  • G[image(preview)]_P(i, j) denotes the Green value of the preview image at pixel P(i, j)
  • B[image(preview)]_P(i, j) denotes the Blue value of the preview image at pixel P(i, j)
  • nPixel denotes the number of pixels in MP
  • Method 1 is faster than Method 2 because it does not need to calculate the weights of the preview image pixels. However, Method 2 is more accurate because it detects the boundary edges' information for matching. Therefore, even under different lighting conditions Method 2 may achieve a good result, whereas under different lighting conditions Method 1 may not.
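Assuming the matching value is the weighted sum of absolute RGB differences over MP, normalised by nPixel (as the definitions above suggest), the computation can be sketched as follows; the data-structure choice is illustrative:

```python
def matching_value(prev, preview, weights):
    """Weighted per-pixel RGB difference over the blending area.

    prev, preview: dicts mapping (i, j) -> (R, G, B) for pixels in MP.
    weights: dict mapping (i, j) -> normalised weight W(i, j).
    Returns M; a small value represents good matching.
    """
    m = 0.0
    for ij, w in weights.items():
        # Sum of absolute channel differences, scaled by the pixel's weight.
        m += w * sum(abs(a - b) for a, b in zip(preview[ij], prev[ij]))
    return m / len(weights)  # nPixel = number of pixels in MP
```

Identical previous and preview pixels give M = 0, i.e. a perfect match.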
  • the status indicator 34 is displayed and is red (824) in colour to indicate that the next photograph should not be taken.
  • the camera can then be moved for the next shot (803).
  • the image displayed is moved by the predetermined amount (e.g. 85%) and the blending area 32 is formed (804).
  • the RGB values of the pixels in the blending area 32 are also reduced by 50% (805).
  • step (802) what is displayed is as shown in Figure 4.
  • step (803) what is displayed is shown in Figure 5.
  • in the blended area 32 are two images of the flagpole - one from the portion of the first image that remains in the blending area 32, and one from the yet-to-be-taken second image that forms the majority of the displayed image.
  • the pixel matching process described above is then performed (806) only within the blending area 32.
  • queries 807, 809 and 811 are raised to determine the match status. If NO at 807 and 809 (808 and 810 respectively) and YES (812) at (811), the status indicator 34 is displayed as red (813) to indicate a poor or bad match.
  • the camera is then moved (814) until there is a visual matching of features in the blending area 32. The process then reverts back to (806).
  • the status indicator 34 is displayed as yellow (Figure 5) indicating a good, but not perfect, match.
  • the camera may be moved (825) to try to achieve an improved match, and the process reverts to (806). This is shown in Figure 7. At this stage the images of the flagpole are aligned. Therefore, in (807) the answer is YES (817) and the display indicator 34 is changed to green (818) and the next photograph is able to be taken (819).
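The loop formed by steps (806) to (819) can be sketched as follows; `match_status`, `take_photograph` and `move_camera` are hypothetical helpers, not part of the patent:

```python
def stitch_assist_loop(match_status, take_photograph, move_camera):
    """Repeat pixel matching until a perfect (green) match allows the shot.

    match_status() returns "red", "yellow" or "green" (hypothetical helper
    wrapping the blending-area pixel match, step 806).
    """
    while True:
        status = match_status()          # pixel matching in blending area (806)
        if status == "green":            # perfect match (818)
            return take_photograph()     # next photograph is taken (819)
        move_camera(status)              # move until features align (814/825)
```

With a red or yellow status the camera keeps being moved; only a green status releases the shot.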
  • there may be a shutter release lockout so that, when camera 10 is in the panoramic mode and the second photograph is to be taken, the second photograph cannot be taken if the status indicator 34 is red. This would also be relevant for subsequent photographs.
  • the order or sequence of the photographs taken may be different, and may be random. Normal stitching systems cannot cope with such an arrangement.
  • To enable the stitching to take place, the process of Figure 15 is followed.
  • One photograph is taken (1501 ) and its image is used as the first image (1502) for stitching to take place.
  • the blending area 32 is then formed (1503) and placed on the display, and the status indicator 34 displayed as red (1504).
  • the camera 10 is then moved (1505) and the pixel matching process is performed in the blending area (1506) as described above in relation to the first embodiment. If the first image is 1 in Table 1, the blending area 32 will be on the right side of the first image and on the left side of the second image.
  • the blending area 32 is moved to be on the correct side of the display (1507) so that blending can take place as described above (1508) in relation to the first embodiment.
  • the second photograph is then taken.
  • a query is raised (150) to determine if more photographs are required. If no (1511), the process ends. If yes (1512), the camera is moved (1514) and a pixel match attempted (1515). In doing so a common edge between a previous image and the image to be taken must be found (1516). If there is no common or overlapping edge with a previous photograph (1517), an error message is displayed (1518) and the process reverts to (1514).
  • the previous photograph may be any previous photograph taken as part of the image sequence to form the panoramic image; or may be limited to the immediately previous image, particularly if memory and/or processing power is limited.
  • the order of photographs may be in any sequence. Each subsequent photograph must have a common side with a photograph taken prior to it for stitching to take place. For example, if photograph 6 were the first photograph, only photographs 1, 5 and 7 could be used for the second photograph. If photograph 5 is the second photograph, any one of photographs 12, 4, 7 or 8 could be the third photograph as each has an edge in common with photograph 6 or photograph 5. However, photographs 3 and 9 could not be the third photograph as they have no common edge with photograph 6 or photograph 5. Therefore, the photographs can be taken in any order provided there is a common edge with a previous photograph.
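The common-edge requirement can be checked with simple four-neighbour adjacency on a grid of (row, column) positions; this is an illustrative sketch only, as the patent's Table 1 layout is not reproduced in this text:

```python
def shares_edge(pos_a, pos_b):
    """True if two grid positions (row, col) share a common edge."""
    (ra, ca), (rb, cb) = pos_a, pos_b
    return abs(ra - rb) + abs(ca - cb) == 1

def valid_next_shots(taken, candidates):
    """Candidates that share an edge with at least one photograph already taken."""
    return [c for c in candidates if any(shares_edge(c, t) for t in taken)]
```

For example, with only position (1, 1) taken, the horizontally adjacent (1, 2) qualifies as a next shot while the diagonal (2, 2) does not.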
  • the first photograph may be taken from a library or card of precedent photographs.
  • precedent photographs may have very high resolution and a high megapixel count, so a low-megapixel camera may be used to form a high-megapixel image.
  • This embodiment is illustrated in Figures 9 to 14 and is for creating images and stitching them together.
  • the purpose is to create a large, high-quality image (not just panoramic images) by using a low-megapixel digital camera to take a high-megapixel photograph.
  • a precedent card 90 is provided.
  • One photograph 92 from the precedent card is selected as a precedent photograph.
  • the precedent photograph is adjusted to a desired position on the LCD screen or viewfinder.
  • Part of the precedent photograph 92 in the LCD is the blending area 32 (Figure 10).
  • the preview image about to be taken is matched to the precedent image in the blending area 32 using the matching methods as in the first embodiment described above and as shown in Figures 10 to 14.
  • Figure 14 shows the first image.
  • the photograph stitch software is used to stitch them together on a computer to create a large and high-quality photograph.
  • the status indicator 34 may be accompanied by an audible indication such as, for example, a "beep" at a low repetition frequency corresponding to the red colour and a poor match; the "beep" at a middle repetition frequency corresponding to the yellow colour and a good match; and the "beep" at a high repetition frequency corresponding to the green colour and a perfect match.

Abstract

A method for forming a panoramic image using a camera. The method comprises taking a first photograph using the camera. On a display of the camera a blending area is formed. The blending area includes a part of a first image of the first photograph displayed on the display. The camera is moved before being prepared to take a second photograph. In the blending area only, a pixel matching process is performed for determining an alignment of the portion of the first image in the blending area and part of a second image of the second photograph in the blending area. A camera is also disclosed.

Description

METHOD AND APPARATUS FOR FORMING A PANORAMIC IMAGE
Field of the Invention
This invention relates to a method and apparatus for forming a panoramic image and refers particularly, though not exclusively, to such a method and apparatus that facilitates the formation of a panoramic image with no portion of the panoramic image missing.
Background of the Invention
There are three known methods for forming a panoramic image. In the first, the first frame is taken and displayed. The next frame is displayed live on the screen as the user moves the camera. The user moves the camera in such a way that there is a slight overlap in photograph information in the new frame to be taken.
This is tedious, and sometimes confusing, as the user is observing the edge and there is no clear indication that the position of the shot has sufficient overlap for subsequent stitching operations.
In the second, video clips can be analyzed and a panoramic photograph created from it. This becomes difficult when the size of the photographs (e.g. 4 to 8 megapixels) and the number of photographs becomes large. Video takes more storage than multiple images, and requires more processing.
The third and final known method is to automatically perform the stitching each time a photograph is taken. If there is insufficient or no overlap, the stitching process will fail and the user will be prompted to take another photograph. Again, processing takes power and time.
Summary of the Invention
In accordance with a first aspect there is provided a method for forming a panoramic image using a camera. The method comprises placing the camera into a panoramic image mode and taking a first photograph using the camera. On a display of the camera a blending area is formed. The blending area includes a part of a first image of the first photograph displayed on the display. The blending area is at a side of the display. The camera is moved before being prepared to take a second photograph. In the blending area only, a pixel matching process is performed for determining an alignment of the portion of the first image in the blending area and part of the second image of the second photograph in the blending area.
According to a second preferred aspect there is provided a computer useable medium comprising a computer program code that is configured to cause a processor in a camera to execute one or more functions to enable the performance of the above method.
According to a third preferred aspect there is provided a camera for taking a panoramic image. The camera comprises a body, a lens, a display, an image capturing device, a processor, a controller, and a memory. The display is for displaying images of photographs. The display comprises a blending area for determining an alignment of a portion of a first image in the blending area with a part of an image of a yet-to-be-taken second photograph.
For all aspects the blending area may be at a side of the display and may comprise a predetermined percentage of the first image, the predetermined percentage being in the range of 5% to 25%. The red, green and blue values of all pixels in the blending area may be reduced by a fixed amount, the fixed amount being in the range 40% to 60%. It is preferably 50%. The red, green and blue values of the pixels of the portion of the first image in the blending area and the red, green and blue values of the pixels of the part of the second image in the blending area may be summed for display on the display. After summing, pixels in the blending area may be further reduced by a set amount. The set amount may be 10%. In this way the red, green and blue values in the blending area are at 90%: 45% from the first image and 45% from the second image.
The blending area may extend for the full length of the side of the display, the side being selected from: top, bottom, left side, and right side. The side may be determined by a direction of the movement of the camera.
A preset percentage of the first image may be removed from the display to leave the predetermined percentage in the display, the preset percentage being in the range 75% to 95%. The preset percentage may be 85% and the predetermined percentage may be 15%.
All pixels in the blending area may be given a weight value representing visual importance. Pixels that form the boundary edges of the first and second images may be given higher weight values. A boundary edge may be a collection or group of pixels that defines a clear separation of visual contrast within an image; and may further be the outlines of objects as opposed to a differently contrasted background.
A match status indicator may be displayed on the display, the match status indicator being variable in consequence of the result of the pixel matching process.
A determination of the side on which the blending area is to be displayed may be made after the pixel matching process. A subsequent preview image of a yet-to-be-taken subsequent photograph may have the side in common with a previous photograph. The previous photograph may be an immediately preceding photograph.
According to a penultimate aspect there is provided a method of forming a panoramic image using a camera. The method comprises taking a first photograph and displaying at least a portion of an image of the first photograph on a display; moving the camera for a second photograph and displaying at least a part of a preview image of the second photograph on the display; conducting a pixel match for the portion and the part; and using a match status indicator to indicate a result of the pixel match.
The match status indicator may be variable in consequence of the pixel matching process, and may be of a first colour for a poor pixel match, a second colour for a good pixel match, and a third colour for a perfect pixel match; the first, second and third colours being different.
According to a final preferred aspect there is provided a computer useable medium comprising a computer program code that is configured to cause a processor in a camera to execute one or more functions to enable the performance of the above method.
Brief Description of the Drawings
In order that the invention may be fully understood and readily put into practical effect, there shall now be described by way of non-limitative example only a preferred embodiment of the present invention, the description being with reference to the accompanying illustrative drawings in which:
Figure 1 is a schematic view of a camera according to the preferred embodiment about to take a panoramic photograph; Figure 2 is a block diagram of the camera of Figure 1;
Figure 3 is an image of a first photograph for the panoramic photograph;
Figure 4 is an image prior to the taking of a second photograph for the panoramic photograph;
Figure 5 is an image corresponding to Figure 4 with the image closer to a match; Figure 6 is an image showing the weights photograph for the image of Figure 5;
Figure 7 is an image corresponding to Figures 4 and 5 at the completion of the match;
Figure 8 is a flow chart for the method of the first embodiment; Figure 9 is an example of reference images for a second embodiment; Figure 10 is an image of a selected reference image from the reference image of Figure 9;
Figure 11 is a weights photograph of the reference image of Figure 10;
Figure 12 is a weights photograph of the stitching portion of the reference image of Figure 10; Figure 13 is an image of the matching status;
Figure 14 is the first stitched image of the second embodiment; and Figure 15 is a flow chart for the method of the second embodiment.
Detailed Description of the Preferred Embodiments

Referring to Figures 1 and 2, there is shown a camera 10 about to take a panoramic photograph of a vista 12. The camera 10 may be a digital or film camera, a digital camera built into a mobile/cellular telephone or a personal digital assistant (PDA), or a web camera.
Although a simple form of digital still camera is shown, the present invention is further applicable to all forms of digital still cameras including single lens reflex cameras, and digital motion photograph cameras in still camera mode. The term "camera" is to be interpreted accordingly.
The camera 10 has an imaging system generally indicated as 12 and comprising a lens 14, viewfinder 16, shutter 18, built-in flash 20, shutter release 22, and other controls 24. Within the camera 10 is an image capturing device 36 such as, for example, a charge-coupled device; a processor 26 for processing the image data received in a known manner; memory 28 for storing each image as image data; and a controller 30 for controlling data sent for display on display 32. Processor 26 performs conventional digital photographic image processing such as, for example, compressing and formatting a captured photographic image. The imaging system 12, including the image capturing device 36, is able to take and capture photographic images of everyday scenes. The imaging system 12 may have a fixed or variable focus, zoom, and other functions found in digital still cameras.
When the camera 10 is set to a panoramic image or stitch assist mode to create a panoramic image, the first shot 30 as shown in Figure 3 may be taken. The settings for the first shot will be fixed for the rest of the shots that make up the final panoramic image. Before the taking of each successive shot, the previously captured image will be moved towards the right by a predetermined percentage of its width. The predetermined percentage may be in the range 75% to 95%, preferably 85%. The remaining portion is 5% to 25% of the image, and is preferably 15% of the image. The remaining portion of the previously captured image will be blended with the image of the second photograph in a blending area 32.
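As an arithmetic sketch of this shift, assuming a display width measured in pixels (the function name and the example width are illustrative, not taken from the patent):

```python
def blending_area_width(display_width_px: int, shift_fraction: float = 0.85) -> int:
    """Width in pixels of the blending area left behind when the previous
    image is shifted across the display by shift_fraction of its width.
    An 85% shift leaves a 15% overlap, as in the preferred embodiment."""
    if not 0.75 <= shift_fraction <= 0.95:
        raise ValueError("shift fraction expected in the range 75% to 95%")
    return round(display_width_px * (1.0 - shift_fraction))
```

For a 640-pixel-wide display, the default 85% shift leaves a 96-pixel-wide blending strip.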
The blending area 32 will normally be at a side of display 32, but it may be adjacent a side, or even remote from a side. Depending on the direction of the panoramic series capture sequence, the blending area 32 could reside at any of the four sides of the LCD display: top, bottom, left side or right side. As shown in Figure 3, the Red Green Blue ("RGB") value of the pixels for the first image within the blending area 32 will be reduced by a predetermined amount such as, for example, 40% to 60%, preferably 50%. Before the capturing of the next photograph, the viewfinder and/or LCD display will show a preview of the image to be captured. The RGB value of the pixels within the blending area for the image to be captured will also be reduced by the same amount (e.g. 50%). The blending area 32 is thus made up of pixels from the previous and current images using additive blending, with their reduced RGB values being added together. In order to differentiate the blending area from the rest of the LCD display, the summed RGB values are further reduced by a preset amount such as, for example, 5% to 25%, preferably 10%. The rest of the LCD display will present the current image without alteration.
Following is the display algorithm for the blending area:

    if (pixel P belongs to the blending area) then
        RGB[LCD]P = (RGB[image(previous)]P x 0.5 + RGB[preview]P x 0.5) x 0.9
        // multiplying by 0.9 makes the blending area 10 percent darker
    else
        RGB[LCD]P = RGB[preview]P
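The same rule can be restated as a small runnable sketch for a single pixel with 8-bit RGB channels (a paraphrase of the listing above, not production code; the truncation to integers is an added assumption):

```python
def display_pixel(previous_rgb, preview_rgb, in_blending_area):
    """Pixel value sent to the LCD: inside the blending area, an additive
    blend of the two half-weighted images darkened by 10%; elsewhere, the
    preview pixel unaltered."""
    if not in_blending_area:
        return preview_rgb
    return tuple(
        int((p * 0.5 + q * 0.5) * 0.9)  # halve each, sum, darken by 10%
        for p, q in zip(previous_rgb, preview_rgb)
    )
```

For example, blending a mid-grey (100, 100, 100) with a lighter grey (200, 200, 200) yields (135, 135, 135), i.e. the mean of 150 reduced by 10%.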
For example, a 2" LCD may have 206,000 pixels. In such a case the blending area will consist of 30,900 pixels. The blending and matching processes apply only to the pixels within the blending area. Therefore, in stitch assist mode only these 30,900 pixels are involved in the display blending and in calculating the match status.
A status indicator 34 is provided on the image 30 and the color of the status indicator 34 provides an indication of how well the previously captured image 30 and the current preview image 36 match within the blending area 32. For example, a red color may represent poor matching, a yellow color may represent partial matching, and a green color may represent perfect matching.
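A sketch of how the indicator colour might be derived from the matching value M computed later in this section (a smaller M means a better match). The numeric thresholds are purely illustrative assumptions; the patent specifies only the red/yellow/green meanings, not cut-off values:

```python
def match_colour(m, good_threshold=30.0, perfect_threshold=5.0):
    """Map a matching value M (smaller is better) to an indicator colour.
    Threshold values are illustrative assumptions, not from the patent."""
    if m <= perfect_threshold:
        return "green"   # perfect match - the next shot may be taken
    if m <= good_threshold:
        return "yellow"  # good but not perfect match
    return "red"         # poor match - keep moving the camera
```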
Depending on the accuracy desired, the number of pixels within the blending area used in determining the matching status could be varied. The highest accuracy is obtained when all the pixels within the blending area are considered during the match process. Each pixel used in the matching process is given a weight value to represent its visual importance. It is preferable not to use equal weighting for all the pixels because certain pixels that make up boundary edges are visually more important. The following is how each pixel's weight value (W) and matching value (M) can be computed:
The subset of pixels in the blending area used for matching is denoted MP. M denotes the matching status used to evaluate the result of the matching; a small value represents a good match. This value determines the colour of the matching status indicator.
Denote by P(i, j) the pixel in row i and column j of the blending area, and by W(i, j) the weight of pixel P(i, j).
To compute a pixel's weight value:

    s = 1; // the pixel step for calculating the weight
    for ((i,j) ∈ MP)
    {
        W(i,j) = 0;
        for (int u = -1; u < 2; u++)
            for (int v = -1; v < 2; v++)
                W(i,j) = W(i,j)
                    + abs(R[image(previous)]P(i+s*u, j+s*v) - R[image(previous)]P(i,j))
                    + abs(G[image(previous)]P(i+s*u, j+s*v) - G[image(previous)]P(i,j))
                    + abs(B[image(previous)]P(i+s*u, j+s*v) - B[image(previous)]P(i,j));
    }
    W_max = max(W(i,j), (i,j) ∈ MP)
    W(i,j) = W(i,j) / W_max; // pixel P(i,j)'s weight value W
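A runnable paraphrase of the weight computation, representing the previous image as a 2-D list of (R, G, B) tuples. The guard against a perfectly flat (zero-weight) region is an added assumption; the patent's listing divides by W_max unconditionally:

```python
def pixel_weights(prev, mp, s=1):
    """Weight W(i,j) for each pixel P(i,j) in the matching subset MP:
    the sum of absolute R, G, B differences between P(i,j) and its 3x3
    neighbourhood at step s in the previous image, normalised by the
    maximum weight over MP. `mp` lists (i, j) pairs whose neighbourhoods
    lie inside the image."""
    w = {}
    for (i, j) in mp:
        total = 0
        for u in (-1, 0, 1):
            for v in (-1, 0, 1):
                a = prev[i + s * u][j + s * v]
                b = prev[i][j]
                total += sum(abs(a[c] - b[c]) for c in range(3))
        w[(i, j)] = total
    w_max = max(w.values())
    # Added assumption: avoid dividing by zero on a featureless region.
    if w_max > 0:
        w = {k: v / w_max for k, v in w.items()}
    return w
```

A pixel sitting next to a strong edge receives weight 1.0 after normalisation, while a pixel in a uniform region receives weight 0.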
To compute the matching value:

Method 1:

    M = 0; // M is the matching value
    for ((i,j) ∈ MP)
        M = M + W(i,j) * ( abs(R[image(preview)]P(i,j) - R[image(previous)]P(i,j))
                         + abs(G[image(preview)]P(i,j) - G[image(previous)]P(i,j))
                         + abs(B[image(preview)]P(i,j) - B[image(previous)]P(i,j)) );
    M = M / nPixel;

where R[image(previous)]P(i,j), G[image(previous)]P(i,j) and B[image(previous)]P(i,j) mean the Red, Green and Blue values of the previous image at pixel P(i,j); R[image(preview)]P(i,j), G[image(preview)]P(i,j) and B[image(preview)]P(i,j) mean the Red, Green and Blue values of the preview image at pixel P(i,j); and nPixel means the number of pixels in MP.

Method 2:
    M = 0;
    for ((i,j) ∈ MP)
        M = M + abs(W[image(preview)]P(i,j) - W[image(previous)]P(i,j));
    M = M / nPixel;

where W[image(previous)]P(i,j) means the weight value of the previous image at pixel P(i,j); W[image(preview)]P(i,j) means the weight value of the preview image at pixel P(i,j); and nPixel means the number of pixels in MP. Method 1 is faster than method 2 because it does not need to calculate the weight values of the preview image's pixels. Method 2, however, is more accurate because it matches on the boundary-edge information. Therefore, method 2 may achieve a good result even under different lighting conditions, whereas method 1 may not.
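Both methods can be paraphrased in runnable form. Here `weights` is the W(i,j) mapping computed for the previous image, and `w_preview`/`w_previous` are per-image weight mappings; the function names are illustrative:

```python
def match_method_1(prev, preview, weights, mp):
    """Method 1: weighted mean absolute RGB difference between the preview
    and previous images over the matching subset MP (fast: needs weights
    for the previous image only)."""
    m = 0.0
    for (i, j) in mp:
        m += weights[(i, j)] * sum(
            abs(preview[i][j][c] - prev[i][j][c]) for c in range(3))
    return m / len(mp)


def match_method_2(w_preview, w_previous, mp):
    """Method 2: mean absolute difference of the two images' per-pixel
    weight values over MP (slower, but compares edge structure, so it is
    less sensitive to a lighting change between shots)."""
    return sum(abs(w_preview[k] - w_previous[k]) for k in mp) / len(mp)
```

Identical images give M = 0 under either method, matching the convention that a small M indicates a good match.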
The use of automated pixel-matching together with an on-screen blending area as visual guidance minimizes human error while allowing manual intervention when the automated pixel-matching does not provide a satisfactory result. For example, the pixel-matching algorithm may not be able to provide a good result if there are moving objects across successive shots: a moving car may be present in the previous shot but not in the current shot. In this situation, it would be impossible for the automated pixel-matching to provide a satisfactory result. However, the user will be able to use the visual guidance to provide a rough match between the images and ignore the result of the automated pixel-matching.
On the other hand, it is usually very difficult for the user to properly match the images when the objects in the scene are small and/or not distinct (e.g. forests, clouds). In such a situation, the automated pixel-matching will be able to provide a more accurate guide as to whether a photograph can be taken.
Referring now to Figures 3 to 8, when the camera is set to stitch assist mode (801) the status indicator 34 is displayed and is red (824) in colour to indicate that the next photograph should not be taken. When the first shot 30 is taken (802), the camera can then be moved for the next shot (803). The image displayed is moved by the predetermined amount (e.g. 85%) and the blending area 32 is formed (804). The RGB values of the pixels in the blending area 32 are also reduced by 50% (805). At step (802) what is displayed is as shown in Figure 4. As can be seen, the flagpole that was on the right in Figure 3 is on the left in Figure 4, within the blending area 32. When the camera is moved in step (803) what is displayed is shown in Figure 5. As can be seen, in the blending area 32 there are two images of the flagpole - one from the portion of the first image that remains in the blending area 32, and one from the yet-to-be-taken second image that forms the majority of the displayed image.
The pixel matching process described above is then performed (806), only within the blending area 32. For the purposes of the status indicator 34, queries 807, 809 and 811 are raised to determine the match status. If NO at 807 and 809 (808 and 810 respectively) and YES (812) at (811), the status indicator 34 is displayed as red (813) to indicate a poor or bad match. The camera is then moved (814) until there is a visual matching of features in the blending area 32. The process then reverts to (806).
If at (809) the answer is YES, the status indicator 34 is displayed as yellow (Figure 5), indicating a good but not perfect match. The camera may be moved (825) to try to achieve an improved match, and the process reverts to (806). This is shown in Figure 7, at which stage the images of the flagpole are aligned. Therefore, at (807) the answer is YES (817), the display indicator 34 is changed to green (818), and the next photograph is able to be taken (819).
If there are no more photographs to be taken to form the panoramic image (820, 812) the process ends (823). If there are more photographs (821 ), the process reverts to (803).
At (816), it may be possible to take the next photograph as there is a sufficiently good match for stitching to take place.
If desired, there may be included a shutter release lockout so that when camera 10 is in the panoramic mode and the second photograph is to be taken, if the status indicator 34 is red the second photograph cannot be taken. This would also be relevant for subsequent photographs. A different situation arises when the panoramic image is not formed from a single-line sequence of photographs in which the camera moves in one direction only such as, for example, left to right, right to left, top to bottom, or bottom to top. In those single-direction instances the blending area 32 is always in the one location within display 32: left side, right side, top side, or bottom side respectively. If the camera is to take a sequence of photographs to form a panoramic image and the camera is moved in different directions, stitching will need to take place on different sides of the images. For example, a panoramic image of a large mountain may require several photographs in a grid:
    1  2  3
    6  5  4
    7  8  9
Table 1
The order or sequence in which the photographs are taken may be different, and may be random. Normal stitching systems cannot cope with such an arrangement.
To enable the stitching to take place, the process of Figure 15 is followed. One photograph is taken (1501) and its image is used as the first image (1502) for stitching. The blending area 32 is then formed (1503) and placed on the display, and the status indicator 34 is displayed as red (1504). The camera 10 is then moved (1505) and the pixel matching process is performed in the blending area (1506), as described above in relation to the first embodiment. If the first image is 1 in Table 1, the blending area 32 will be on the right side of the first image and on the left side of the second image.
In consequence of the pixel match, the blending area 32 is moved to the correct side of the display (1507) so that blending can take place (1508) as described above in relation to the first embodiment. The second photograph is then taken. A query is raised (150) to determine if more photographs are required. If no (1511), the process ends. If yes (1512), the camera is then moved (1514) and a pixel match attempted (1515). In doing so, a common edge between a previous image and the image to be taken must be found (1516). If there is no common or overlapping edge with a previous photograph (1517), an error message is displayed (1518) and the process reverts to (1514). The previous photograph may be any previous photograph taken as part of the image sequence to form the panoramic image; or it may be limited to the immediately previous image, particularly if memory and/or processing power is limited.
Using the above Table 1 as an example, the photographs may be taken in any sequence, but each subsequent photograph must have a side in common with a photograph taken before it for stitching to take place. For example, if photograph 6 were the first photograph, only photographs 1, 5 and 7 could be used for the second photograph. If photograph 5 is the second photograph, any one of photographs 1, 2, 4, 7 or 8 could be the third photograph, as each has an edge in common with photograph 6 or photograph 5. However, photographs 3 and 9 could not be the third photograph as they have no common edge with photograph 6 or photograph 5. Therefore, the photographs can be taken in any order provided there is a common edge with a previous photograph.
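The common-edge rule can be checked mechanically. A small sketch assuming each grid cell is identified by (row, column) coordinates (the helper names are illustrative, not from the patent):

```python
def has_common_edge(pos_a, pos_b):
    """True when two grid positions share a side (not merely a corner)."""
    (r1, c1), (r2, c2) = pos_a, pos_b
    return abs(r1 - r2) + abs(c1 - c2) == 1  # Manhattan distance of 1


def can_take_next(candidate, taken):
    """A candidate shot is allowed if it shares an edge with at least one
    previously taken shot in the panoramic grid."""
    return any(has_common_edge(candidate, t) for t in taken)
```

In a 3 x 3 grid, for instance, the centre cell shares an edge with the four edge-adjacent cells but not with the four corner cells, which mirrors the example above.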
If at (1516) the answer is yes (1519), the blending area 32 is located on the correct side as described above (1520), the blending process is followed as described in relation to the first embodiment (1521), and the photograph is taken (1522). A query is raised (1523) to determine if more photographs are to be taken. If yes (1525), the process reverts to (1514); if no (1524), the process ends (1526).
The first photograph may be taken from a library or card of precedent photographs. As such precedent photographs may have very high resolution and a high megapixel count, a low megapixel camera may be able to be used to form a high megapixel image.
This embodiment is illustrated in Figures 9 to 14 and is for creating images and stitching them together. The purpose is to create a large, high quality image (not just a panoramic image) by using a low megapixel digital camera to take a high megapixel photograph. First, as shown in Figure 9, a precedent card 90 is provided. One photograph 92 from the precedent card is selected as a precedent photograph. The precedent photograph is adjusted to a desired position on the LCD screen or viewfinder. Part of the precedent photograph 92 in the LCD is the blending area 32 (Figure 10). The preview image about to be taken is matched to the precedent image in the blending area 32 using the matching methods of the first embodiment described above, as shown in Figures 10 to 14. Figure 14 shows the first image.
After a certain number of photographs have been taken, photograph stitching software is used to stitch them together on a computer to create a large, high quality photograph.
For all embodiments, the status indicator 34 may be accompanied by an audible indication such as, for example, a "beep" at a low repetition frequency corresponding to the red colour and a poor match; the "beep" at a middle repetition frequency corresponding to the yellow colour and a good match; and the "beep" at a high repetition frequency corresponding to the green colour and a perfect match.
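The audible mapping might be sketched as follows; the millisecond values are illustrative assumptions, since the patent specifies only the low/middle/high ordering of repetition frequencies:

```python
def beep_interval_ms(colour):
    """Interval between beeps for each indicator colour: the better the
    match, the higher the repetition frequency (i.e. the shorter the
    interval). The numeric values are illustrative, not from the patent."""
    intervals = {"red": 1000, "yellow": 500, "green": 200}
    return intervals[colour]
```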
Whilst there has been described in the foregoing description preferred embodiments of the present invention, it will be understood by those skilled in the technology that many variations or modifications in details of design or construction or operation may be made without departing from the present invention.

Claims

1. A method for forming a panoramic image using a camera, the method comprising: taking a first photograph using the camera and displaying a first image of the first photograph on a display of the camera; forming a blending area on the display, the blending area comprising a portion of the first image of the first photograph as displayed on the display; moving the camera before taking a second photograph and displaying on the display a preview image of a yet-to-be-taken second photograph; and in the blending area only, using a pixel matching process for determining an alignment of the portion of the first image and a part of the preview image.
2. A method as claimed in claim 1, wherein the blending area comprises a predetermined percentage of the first image, the predetermined percentage being in the range 5% to 25%.
3. A method as claimed in claim 1, wherein red, green and blue values of all pixels in the blending area are reduced by a fixed amount, the fixed amount being in the range 40% to 60%.
4. A method as claimed in claim 1, further comprising displaying on the display a match status indicator, the match status indicator being variable in consequence of the result of the pixel matching process.
5. A method as claimed in claim 1, wherein the blending area is adjacent a side of the display, the side being selected from the group consisting of: top, bottom, left side, and right side; the side being determined by a direction of the movement of the camera.
6. A method as claimed in claim 3, wherein the red, green and blue values of the pixels of the portion of the first image in the blending area, and the red, green and blue values of the pixels of the part of the preview image in the blending area, are summed for display on the display prior to being reduced; and after summing and reduction, the pixels in the blending area are further reduced by a set amount.
7. A method as claimed in claim 6, wherein the set amount is 10% and the fixed amount is 45%.
8. A method as claimed in claim 2, wherein a preset percentage of the first image is removed from the display to leave the predetermined percentage in the display, the preset percentage being in the range 75% to 95%.
9. A method as claimed in claim 8, wherein the preset percentage is 85% and the predetermined percentage is 15%.
10. A method as claimed in claim 5, wherein all pixels in the blending area are given a weight value representing visual importance, and pixels that form the boundary edges of the first and preview images are given higher weight values.
11. A method as claimed in claim 10, wherein a boundary edge is a collection of pixels that define a clear separation of visual contrast within an image, and includes outlines of objects as opposed to a differently contrasted background.
12. A method as claimed in claim 5, wherein a determination of the side is made after the pixel matching process.
13. A method as claimed in claim 11, wherein a preview image of a yet-to-be-taken subsequent photograph has the side in common with an image of a previous photograph.
14. A method as claimed in claim 12, wherein the previous photograph is an immediately preceding photograph.
15. A method of forming a panoramic image using a camera, the method comprising: taking a first photograph and displaying at least a portion of an image of the first photograph on a display; moving the camera for a second photograph and displaying at least a part of a preview image of the second photograph on the display; conducting a pixel match for the portion and the part; and using a match status indicator to indicate a result of the pixel match.
16. A method as claimed in claim 15, wherein the match status indicator is variable in consequence of the pixel matching process and is of a first colour for a poor pixel match, a second colour for a good pixel match, and a third colour for a perfect pixel match; the first, second and third colours being different.
17. A computer useable medium comprising computer program code that is configured to cause a processor in a camera to execute one or more functions to enable the performance of the method of claim 1.
18. A computer useable medium comprising computer program code that is configured to cause a processor in a camera to execute one or more functions to enable the performance of the method of claim 15.
19. A camera for taking a panoramic image, the camera comprising: (a) a body;
(b) a lens;
(c) a display;
(d) an image capturing device;
(e) a processor;
(f) a controller; and
(g) a memory;
wherein the display is for displaying images of photographs, and the processor is for forming on the display a blending area for determining an alignment of a portion of a first image in the blending area with a part of a preview image of a yet-to-be-taken second photograph.
20. A camera as claimed in claim 19, wherein the blending area comprises a predetermined percentage of the first image, the predetermined percentage being in the range 5% to 25%.
21. A camera as claimed in claim 19, wherein red, green and blue values of all pixels in the blending area are reduced by a fixed amount, the fixed amount being in the range 40% to 60%.
22. A camera as claimed in claim 19, the display further comprising a match status indicator, the match status indicator being variable in consequence of a result of a pixel matching process performed in the blending area.
23. A camera as claimed in claim 19, wherein the blending area is adjacent a side of the display, the side being selected from the group consisting of: top, bottom, left side, and right side; the side being determined by a direction of movement of the camera between the first and second photographs.
24. A camera as claimed in claim 21, wherein the red, green and blue values of the pixels of the portion of the first image, and the red, green and blue values of the pixels of the part of the preview image in the blending area, are summed for display on the display prior to being reduced; and after summing and reduction, pixels in the blending area are further reduced by a set amount.
25. A camera as claimed in claim 24, wherein the set amount is 10% and the fixed amount is 50%.
26. A camera as claimed in claim 20, wherein a preset percentage of the first image is removed from the display to leave the predetermined percentage in the display, the preset percentage being in the range 75% to 95%.
27. A camera as claimed in claim 26, wherein the preset percentage is 85% and the predetermined percentage is 15%.
28. A camera as claimed in claim 19, wherein all pixels in the blending area are given a weight value representing visual importance, pixels that form the sides of the first and preview images being given higher weight values.
PCT/SG2006/000024 2005-02-11 2006-02-10 Method and apparatus for forming a panoramic image WO2006085827A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
DE112006000358.5T DE112006000358B4 (en) 2005-02-11 2006-02-10 Method and device for creating a panoramic image
GB0715571A GB2438335B (en) 2005-02-11 2006-02-10 Method and apparatus for forming a panoramic image

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/056,597 2005-02-11
US11/056,597 US7646400B2 (en) 2005-02-11 2005-02-11 Method and apparatus for forming a panoramic image

Publications (1)

Publication Number Publication Date
WO2006085827A1 true WO2006085827A1 (en) 2006-08-17

Family

ID=36793321

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SG2006/000024 WO2006085827A1 (en) 2005-02-11 2006-02-10 Method and apparatus for forming a panoramic image

Country Status (6)

Country Link
US (1) US7646400B2 (en)
CN (1) CN1854887B (en)
DE (1) DE112006000358B4 (en)
GB (1) GB2438335B (en)
HK (1) HK1097052A1 (en)
WO (1) WO2006085827A1 (en)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1890481A1 (en) 2006-08-16 2008-02-20 Samsung Electronics Co., Ltd. Panorama photography method and apparatus capable of informing optimum photographing position
EP1903786A1 (en) * 2006-09-21 2008-03-26 Samsung Electronics Co., Ltd. Apparatus and method for photographing panoramic image
EP2336975A1 (en) * 2009-11-09 2011-06-22 Samsung Electronics Co., Ltd. Apparatus and method for image registration in portable terminal
WO2015034908A3 (en) * 2013-09-06 2015-04-23 Qualcomm Incorporated Interactive image composition
WO2015082572A3 (en) * 2013-12-03 2015-08-13 Dacuda Ag User feedback for real-time checking and improving quality of scanned image
GB2537221A (en) * 2015-04-09 2016-10-12 Airbus Ds Optronics Gmbh Method for representing a panoramic image and panoramic image representation apparatus
US10225428B2 (en) 2009-05-20 2019-03-05 Ml Netherlands C.V. Image processing for handheld scanner
US10298898B2 (en) 2013-08-31 2019-05-21 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US10410321B2 (en) 2014-01-07 2019-09-10 MN Netherlands C.V. Dynamic updating of a composite image
US10484561B2 (en) 2014-05-12 2019-11-19 Ml Netherlands C.V. Method and apparatus for scanning and printing a 3D object
US10708491B2 (en) 2014-01-07 2020-07-07 Ml Netherlands C.V. Adaptive camera control for reducing motion blur during real-time image capture

Families Citing this family (50)

Publication number Priority date Publication date Assignee Title
US7373017B2 (en) * 2005-10-04 2008-05-13 Sony Corporation System and method for capturing adjacent images by utilizing a panorama mode
TWI299463B (en) * 2005-04-13 2008-08-01 Via Tech Inc Method and device for dynamically displaying image by virtual plane coordinate conversion
US7424218B2 (en) * 2005-07-28 2008-09-09 Microsoft Corporation Real-time preview for panoramic images
CN101697572B (en) * 2005-09-09 2012-02-22 佳能株式会社 Image pickup apparatus
SE532236C2 (en) * 2006-07-19 2009-11-17 Scalado Ab Method in connection with taking digital pictures
KR100866230B1 (en) * 2007-04-12 2008-10-30 삼성전자주식회사 Method for photographing panorama picture
KR100866278B1 (en) * 2007-04-26 2008-10-31 주식회사 코아로직 Apparatus and method for making a panorama image and Computer readable medium stored thereon computer executable instruction for performing the method
US8009178B2 (en) * 2007-06-29 2011-08-30 Microsoft Corporation Augmenting images for panoramic display
US8717412B2 (en) * 2007-07-18 2014-05-06 Samsung Electronics Co., Ltd. Panoramic image production
US8068693B2 (en) * 2007-07-18 2011-11-29 Samsung Electronics Co., Ltd. Method for constructing a composite image
EP2018049B1 (en) 2007-07-18 2013-05-01 Samsung Electronics Co., Ltd. Method of assembling a panoramic image and camera therefor
JP4772009B2 (en) * 2007-08-07 2011-09-14 三洋電機株式会社 Digital camera
US20090122195A1 (en) * 2007-11-09 2009-05-14 Van Baar Jeroen System and Method for Combining Image Sequences
US10503376B2 (en) 2007-12-20 2019-12-10 Samsung Electronics Co., Ltd. Method and apparatus for adjusting an image and control guides displayed on a display
US8922518B2 (en) 2007-12-20 2014-12-30 Samsung Electronics Co., Ltd. Mobile terminal having touch screen and function controlling method of the same
KR20090066368A (en) * 2007-12-20 2009-06-24 삼성전자주식회사 Portable terminal having touch screen and method for performing function thereof
KR101467293B1 (en) * 2008-04-22 2014-12-02 삼성전자주식회사 A method for providing User Interface to display the menu related to the image to be photographed
US8160391B1 (en) * 2008-06-04 2012-04-17 Google Inc. Panoramic image fill
US8072504B2 (en) * 2008-08-27 2011-12-06 Micron Technology, Inc. Method and system for aiding user alignment for capturing partially overlapping digital images
US8509518B2 (en) * 2009-01-08 2013-08-13 Samsung Electronics Co., Ltd. Real-time image collage method and apparatus
CN101799620B (en) * 2009-02-10 2011-12-28 华晶科技股份有限公司 Method for automatically shooting panorama images for digital camera device
CN101964869B (en) * 2009-07-23 2012-08-22 华晶科技股份有限公司 Directed shooting method for panoramic picture
EP2483767B1 (en) 2009-10-01 2019-04-03 Nokia Technologies Oy Method relating to digital images
KR101630287B1 (en) * 2009-11-19 2016-06-14 삼성전자주식회사 Digital photographing apparatus and method for controlling the same
US8611654B2 (en) 2010-01-05 2013-12-17 Adobe Systems Incorporated Color saturation-modulated blending of exposure-bracketed images
US8606042B2 (en) * 2010-02-26 2013-12-10 Adobe Systems Incorporated Blending of exposure-bracketed images using weight distribution functions
JP5561019B2 (en) * 2010-08-23 2014-07-30 ソニー株式会社 Imaging apparatus, program, and imaging method
EP2603834B1 (en) 2010-09-20 2020-12-09 Nokia Technologies Oy Method for forming images
JP2012105145A (en) * 2010-11-11 2012-05-31 Canon Inc Image processing apparatus, image processing method, and program
US8526763B2 (en) * 2011-05-27 2013-09-03 Adobe Systems Incorporated Seamless image composition
US9628749B2 (en) * 2011-12-20 2017-04-18 International Business Machines Corporation Pre-setting the foreground view of a photograph via a camera
US20150124047A1 (en) * 2012-07-20 2015-05-07 Google Inc. Panoramic video acquisition guidance
US20140095349A1 (en) * 2012-09-14 2014-04-03 James L. Mabrey System and Method for Facilitating Social E-Commerce
US20140129370A1 (en) * 2012-09-14 2014-05-08 James L. Mabrey Chroma Key System and Method for Facilitating Social E-Commerce
US10070048B2 (en) * 2013-03-26 2018-09-04 Htc Corporation Panorama photographing method, panorama displaying method, and image capturing method
CN108549119A (en) 2013-07-04 2018-09-18 核心光电有限公司 Small-sized focal length lens external member
US9343043B2 (en) 2013-08-01 2016-05-17 Google Inc. Methods and apparatus for generating composite images
CN109246339B (en) 2013-08-01 2020-10-23 核心光电有限公司 Dual aperture digital camera for imaging an object or scene
US20150215532A1 (en) * 2014-01-24 2015-07-30 Amazon Technologies, Inc. Panoramic image capture
US20150271400A1 (en) * 2014-03-19 2015-09-24 Htc Corporation Handheld electronic device, panoramic image forming method and non-transitory machine readable medium thereof
US9392188B2 (en) 2014-08-10 2016-07-12 Corephotonics Ltd. Zoom dual-aperture camera with folded lens
WO2016203282A1 (en) 2015-06-18 2016-12-22 The Nielsen Company (Us), Llc Methods and apparatus to capture photographs using mobile devices
CN112672022B (en) 2015-08-13 2022-08-02 核心光电有限公司 Dual aperture zoom camera with video support and switching/non-switching dynamic control
US9940695B2 (en) * 2016-08-26 2018-04-10 Multimedia Image Solution Limited Method for ensuring perfect stitching of a subject's images in a real-site image stitching operation
JP6806919B2 (en) 2017-11-23 2021-01-06 コアフォトニクス リミテッド Compact bendable camera structure
US11268829B2 (en) 2018-04-23 2022-03-08 Corephotonics Ltd Optical-path folding-element with an extended two degree of freedom rotation range
US11635596B2 (en) 2018-08-22 2023-04-25 Corephotonics Ltd. Two-state zoom folded camera
CN110753217B (en) * 2019-10-28 2022-03-01 黑芝麻智能科技(上海)有限公司 Color balance method and device, vehicle-mounted equipment and storage medium
KR20230020585A (en) * 2020-05-17 2023-02-10 코어포토닉스 리미티드 Image stitching in the presence of a full field of view reference image
US11637977B2 (en) 2020-07-15 2023-04-25 Corephotonics Ltd. Image sensors and sensing methods to obtain time-of-flight and phase detection information

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6243103B1 (en) * 1996-05-28 2001-06-05 Canon Kabushiki Kaisha Panoramic image generation in digital photography
EP1431921A2 (en) * 2002-12-17 2004-06-23 Seiko Epson Corporation Image layout processing apparatus, method, and program
US6771304B1 (en) * 1999-12-31 2004-08-03 Stmicroelectronics, Inc. Perspective correction device for panoramic digital camera
JP2004312549A (en) * 2003-04-09 2004-11-04 Sharp Corp Panoramic image photographing apparatus and panoramic image photographing method
WO2005041564A1 (en) * 2003-10-28 2005-05-06 Koninklijke Philips Electronics N.V. Digital camera with panorama or mosaic functionality

Family Cites Families (23)

Publication number Priority date Publication date Assignee Title
US5138460A (en) * 1987-08-20 1992-08-11 Canon Kabushiki Kaisha Apparatus for forming composite images
JPH07115534A (en) * 1993-10-15 1995-05-02 Minolta Co Ltd Image reader
US6128416A (en) * 1993-09-10 2000-10-03 Olympus Optical Co., Ltd. Image composing technique for optimally composing a single image from a plurality of digital images
US6549681B1 (en) * 1995-09-26 2003-04-15 Canon Kabushiki Kaisha Image synthesization method
US5982951A (en) 1996-05-28 1999-11-09 Canon Kabushiki Kaisha Apparatus and method for combining a plurality of images
WO1998025402A1 (en) 1996-12-06 1998-06-11 Flashpoint Technology, Inc. A method and system for assisting in the manual capture of overlapping images for composite image generation in a digital camera
US5851377A (en) * 1997-03-10 1998-12-22 The Lubrizol Corporation Process of using acylated nitrogen compound petrochemical antifoulants
JP2004297821A (en) 1997-09-03 2004-10-21 Casio Comput Co Ltd Electronic still camera and image reproduction method
US6867801B1 (en) * 1997-09-03 2005-03-15 Casio Computer Co., Ltd. Electronic still camera having photographed image reproducing function
US6552744B2 (en) * 1997-09-26 2003-04-22 Roxio, Inc. Virtual reality camera
US6657667B1 (en) * 1997-11-25 2003-12-02 Flashpoint Technology, Inc. Method and apparatus for capturing a multidimensional array of overlapping images for composite image generation
JP3434756B2 (en) * 1999-12-07 2003-08-11 エヌイーシ−カスタムテクニカ株式会社 Fingerprint authentication method and device
US7064783B2 (en) * 1999-12-31 2006-06-20 Stmicroelectronics, Inc. Still picture format for subsequent picture stitching for forming a panoramic image
US6978051B2 (en) * 2000-03-06 2005-12-20 Sony Corporation System and method for capturing adjacent images by utilizing a panorama mode
US6930703B1 (en) * 2000-04-29 2005-08-16 Hewlett-Packard Development Company, L.P. Method and apparatus for automatically capturing a plurality of images during a pan
US6895126B2 (en) * 2000-10-06 2005-05-17 Enrico Di Bernardo System and method for creating, storing, and utilizing composite images of a geographic location
US7162102B2 (en) 2001-12-19 2007-01-09 Eastman Kodak Company Method and system for compositing images to produce a cropped image
JP3889650B2 (en) * 2002-03-28 2007-03-07 三洋電機株式会社 Image processing method, image processing apparatus, computer program, and recording medium
JP4048907B2 (en) * 2002-10-15 2008-02-20 セイコーエプソン株式会社 Panorama composition of multiple image data
US20040189849A1 (en) * 2003-03-31 2004-09-30 Hofer Gregory V. Panoramic sequence guide
CN2638156Y (en) * 2003-04-18 2004-09-01 清华大学 Fast bidimension 360 degree overall view imaging device based on coloured linear CCD
CN2632725Y (en) * 2003-07-02 2004-08-11 马堃 Portable camera apparatus with panoramic camera function
US7239805B2 (en) * 2005-02-01 2007-07-03 Microsoft Corporation Method and system for combining multiple exposure images having scene and camera motion

Non-Patent Citations (1)

Title
PATENT ABSTRACTS OF JAPAN *

Cited By (23)

Publication number Priority date Publication date Assignee Title
US8928731B2 (en) 2006-08-16 2015-01-06 Samsung Electronics Co., Ltd Panorama photography method and apparatus capable of informing optimum photographing position
EP1890481A1 (en) 2006-08-16 2008-02-20 Samsung Electronics Co., Ltd. Panorama photography method and apparatus capable of informing optimum photographing position
EP1903786A1 (en) * 2006-09-21 2008-03-26 Samsung Electronics Co., Ltd. Apparatus and method for photographing panoramic image
US10225428B2 (en) 2009-05-20 2019-03-05 Ml Netherlands C.V. Image processing for handheld scanner
EP2336975A1 (en) * 2009-11-09 2011-06-22 Samsung Electronics Co., Ltd. Apparatus and method for image registration in portable terminal
US10298898B2 (en) 2013-08-31 2019-05-21 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US11563926B2 (en) 2013-08-31 2023-01-24 Magic Leap, Inc. User feedback for real-time checking and improving quality of scanned image
US10841551B2 (en) 2013-08-31 2020-11-17 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
WO2015034908A3 (en) * 2013-09-06 2015-04-23 Qualcomm Incorporated Interactive image composition
US9185284B2 (en) 2013-09-06 2015-11-10 Qualcomm Incorporated Interactive image composition
WO2015082572A3 (en) * 2013-12-03 2015-08-13 Dacuda Ag User feedback for real-time checking and improving quality of scanned image
US10375279B2 (en) 2013-12-03 2019-08-06 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US10455128B2 (en) 2013-12-03 2019-10-22 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US10142522B2 (en) 2013-12-03 2018-11-27 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US11115565B2 (en) 2013-12-03 2021-09-07 Ml Netherlands C.V. User feedback for real-time checking and improving quality of scanned image
US11798130B2 (en) 2013-12-03 2023-10-24 Magic Leap, Inc. User feedback for real-time checking and improving quality of scanned image
US10410321B2 (en) 2014-01-07 2019-09-10 Ml Netherlands C.V. Dynamic updating of a composite image
US10708491B2 (en) 2014-01-07 2020-07-07 Ml Netherlands C.V. Adaptive camera control for reducing motion blur during real-time image capture
US11315217B2 (en) 2014-01-07 2022-04-26 Ml Netherlands C.V. Dynamic updating of a composite image
US11516383B2 (en) 2014-01-07 2022-11-29 Magic Leap, Inc. Adaptive camera control for reducing motion blur during real-time image capture
US10484561B2 (en) 2014-05-12 2019-11-19 Ml Netherlands C.V. Method and apparatus for scanning and printing a 3D object
US11245806B2 (en) 2014-05-12 2022-02-08 Ml Netherlands C.V. Method and apparatus for scanning and printing a 3D object
GB2537221A (en) * 2015-04-09 2016-10-12 Airbus Ds Optronics Gmbh Method for representing a panoramic image and panoramic image representation apparatus

Also Published As

Publication number Publication date
US7646400B2 (en) 2010-01-12
GB2438335A (en) 2007-11-21
US20060181619A1 (en) 2006-08-17
HK1097052A1 (en) 2007-06-15
CN1854887B (en) 2010-09-29
DE112006000358T5 (en) 2007-12-27
CN1854887A (en) 2006-11-01
GB2438335B (en) 2010-04-28
DE112006000358B4 (en) 2019-07-04
GB0715571D0 (en) 2007-09-19

Similar Documents

Publication Publication Date Title
US7646400B2 (en) Method and apparatus for forming a panoramic image
US7460782B2 (en) Picture composition guide
EP3042356B1 (en) Interactive image composition
US7639897B2 (en) Method and apparatus for composing a panoramic photograph
JP4135100B2 (en) Imaging device
US20160028955A1 (en) Camera and camera control method
US8767039B2 (en) Method and apparatus for shooting panorama
US20070223900A1 (en) Digital camera, composition correction device, and composition correction method
JP2003134378A (en) Camera including over-size imager and imaging method of image
US20120113298A1 (en) Image evaluation method, image capturing method and digital camera thereof for evaluating and capturing images according to composition of the images
WO2008075159A2 (en) Image stabilization using multi-exposure pattern
JP2004350130A (en) Digital camera
DE102012006493A1 (en) Camera implementation of selecting and stitching single frames for panorama shooting
JP2007174548A (en) Photographing device and program
CN109196852B (en) Shooting composition guiding method and device
CN110072058B (en) Image shooting device and method and terminal
Vazquez-Corral et al. Color stabilization along time and across shots of the same scene, for one or several cameras of unknown specifications
US20090040292A1 (en) Digital camera
CN103797782A (en) Image processing device and program
CN112991245A (en) Double-shot blurring processing method and device, electronic equipment and readable storage medium
CN110278366B (en) Panoramic image blurring method, terminal and computer readable storage medium
WO2008126577A1 (en) Imaging apparatus
US20080239086A1 (en) Digital camera, digital camera control process, and storage medium storing control program
JP5131367B2 (en) Imaging apparatus and program
EP2200275B1 (en) Method and apparatus of displaying portrait on a display

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
ENP Entry into the national phase
Ref document number: 0715571; Country of ref document: GB; Kind code of ref document: A
Free format text: PCT FILING DATE = 20060210

WWE Wipo information: entry into national phase
Ref document number: 0715571.6; Country of ref document: GB

WWE Wipo information: entry into national phase
Ref document number: 1120060003585; Country of ref document: DE

RET De translation (de og part 6b)
Ref document number: 112006000358; Country of ref document: DE; Date of ref document: 20071227; Kind code of ref document: P

122 Ep: pct application non-entry in european phase
Ref document number: 06717149; Country of ref document: EP; Kind code of ref document: A1