WO2006052701A2 - Digital camera having system for digital image composition and related method - Google Patents

Digital camera having system for digital image composition and related method

Info

Publication number
WO2006052701A2
Authority
WO
WIPO (PCT)
Prior art keywords
image
overlay
camera
digital
region
Prior art date
Application number
PCT/US2005/039894
Other languages
French (fr)
Other versions
WO2006052701A3 (en)
Inventor
Douglas J. Kelly
Original Assignee
Kelly Douglas J
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Kelly Douglas J filed Critical Kelly Douglas J
Priority to EP05821152A (EP1808013A2)
Priority to JP2007539349A (JP2008519505A)
Publication of WO2006052701A2
Publication of WO2006052701A3

Classifications

    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 5/00: Details of television systems
    • H04N 5/222: Studio circuitry; Studio devices; Studio equipment
    • H04N 5/262: Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects; Cameras specially adapted for the electronic generation of special effects
    • H04N 5/2621: Cameras specially adapted for the electronic generation of special effects during image pickup, e.g. digital cameras, camcorders, video cameras having integrated special effects capability
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/743: Bracketing, i.e. taking a series of images with varying exposure conditions
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/63: Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/633: Control of cameras or camera modules by using electronic viewfinders for displaying additional information relating to control or operation of the camera
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60: Control of cameras or camera modules
    • H04N 23/64: Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image
    • H: ELECTRICITY
    • H04: ELECTRIC COMMUNICATION TECHNIQUE
    • H04N: PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00: Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/70: Circuitry for compensating brightness variation in the scene
    • H04N 23/71: Circuitry for evaluating the brightness variation

Definitions

  • the present invention relates generally to photography and, more particularly, to composition of digital photographs.
  • photography has long been a popular medium of creative expression.
  • factors such as composition and exposure settings all contribute to creating an esthetic photograph.
  • composition is a particularly important consideration.
  • cameras typically include indicia such as cross-hairs, grid lines, or the like, to help the photographer in alignment.
  • the alignment indicia typically are etched on a screen of a viewfinder assembly.
  • the alignment indicia typically are presented as iconic images on a view-screen, commonly an LCD screen, atop the live image, thereby serving as reference in aligning the subject of the photograph.
  • some digital cameras include several indicia schemes, providing various configurations of indicia to aid the photographer in composing the photograph.
  • the various schemes typically are geared for a particular photographic composition such as a one-person portrait or a two-person portrait.
  • Some digital cameras can be prompted to depict the indicia on the resulting photograph, if desired.
  • Such photographs include a composite of the live image from the camera and the indicia.
  • some cameras provide overlays having the current date and time, serving as a time stamp for the photograph.
  • Some digital cameras have provided factory-installed overlays, simply for comical effect. For example, overlays have been provided that depict a contrived magazine cover having a blank spot for a person's head.
  • the overlay is depicted on the view screen of the camera.
  • the photographer aligns the camera such that the subject's head is positioned within the blank spot of the overlay, and then takes the picture.
  • a photograph is generated depicting the subject on the cover of a magazine.
  • a post processing method called compositing can also be used to refine improperly exposed images.
  • a photographer tries to capture an image with a large disparity between bright and dark regions it is common for the bright areas to overexpose to excessive lightness and/or for the dark areas to underexpose to excessive darkness.
  • the traditional solution to this problem is to put the camera on a tripod and shoot multiple images at a range of different shutter and/or aperture settings. Then in post processing, the images are composited together, which can be tedious and time consuming. Even utilizing this approach, it is possible to make exposure errors which might not be detected until the editing process. However, by that time, arranging to remake the photograph might be difficult or impossible.
  • Post processing can be complicated and prone to failure due to errors made at the time of exposure.
  • the invention provides a system for digital composition usable with a digital camera providing image overlays that enable the photographer to create and combine images in a unique manner.
  • the handheld digital camera includes a plurality of files stored in digital memory. Each file can be used as an overlay that has a user-assignable opacity level.
  • the overlay is depicted on a view-screen of the camera in conjunction with image data corresponding to the field-of-view of the camera such that the opacity of the overlay controls the clarity of the image data as presented on the view-screen.
  • the camera implements a user interface presented on the view-screen to enable creation of overlays and selection of overlays from the plurality of files.
  • a photographer can create and combine images "on location" in a unique manner.
  • the overlays can be used as an aid in composing a live image for digital capture, either as a constant or intermittent presence on the view-screen of the camera.
  • the camera can be configured to save two digital files to the digital memory upon taking a photograph, the first file comprising the image data corresponding to the field-of-view of the camera and the second file comprising a composite image of the overlay superimposed on the image data corresponding to the field-of-view of the camera.
  • the camera can present a variety of overlays, each having prescribed attributes, e.g., ranging in size, opacity, and functionality.
  • an overlay can be configured such that its assigned opacity level affects the entirety of a resulting image.
  • the user interface can be configured to modify attributes of the stored files. For example, modification of an overlay can be achieved by removing a first color from the overlay.
  • the user interface can further enable sizing and positioning of the overlay for use relative to the image data from the field-of-view of the camera.
  • the plurality of files includes at least one print overlay having two regions of differing opacity, including an open region and a shaded region, the shaded region of the overlay having an increased opacity relative to the open region, the open region having a prescribed aspect ratio corresponding to a photograph print size.
  • multiple print overlays would be available, wherein the open region of each print overlay has a distinct aspect ratio (e.g., 8x10 and 4x6).
  • the user interface enables automated creation of a line overlay from an image file via an edge-detect feature in which the line overlay is a line drawing of the image file.
  • the edge-detect feature analyzes the image file by identifying borders between regions of differing color and tone of a prescribed value, and defines, in the line overlay, a line of prescribed opacity and color corresponding to the identified border.
  • the user can set the prescribed value used for identifying the border between regions in the image file.
  • the user can set the opacity value and the color for the line overlay.
  • a user selects a stored image from digital memory.
  • the image is presented on a view-screen of the camera.
  • the user may assign an opacity level to this overlay via a user input device of the digital camera.
  • a user input device is used to designate a region of the stored image to be erased.
  • the resultant image is saved to digital memory for use as an overlay. When used, a portion of the live image corresponding to the erased region is unaffected by the overlay.
  • a method for image composition using a handheld digital camera comprises the steps of:
  • the digital camera having a processor assembly and a digital sensor assembly in communication with the processor, the sensor assembly having a prescribed sensing range beyond which a captured image will have over- or under-exposed regions;
  • the identifying step performed by the processor assembly of the camera;
  • the method further comprises, prior to the capturing multiple images step, analyzing an image taken as a single exposure for over- or under-exposed regions and, if found, prompting the user to initiate the capturing multiple images step.
  • the identifying and automated compositing steps further include: (a) selecting a second digital image of the multiple digital images having at least one properly exposed region corresponding in location to an over- or under-exposed region of the first digital image; (b) automated compositing of all regions from the second digital image corresponding in location to all over- or under-exposed regions of the first digital image with the remaining portions of the first digital image; and (c) repeating steps (a) and (b) with the resultant image from the prior step (b) and a third digital image of the multiple digital images.
  • the method further includes storing the multiple digital images that were captured in automated sequence for later use.
  • the method further includes automated deletion of the multiple digital images following completion of the composite image.
  • FIG. 1 is a rear view of a digital camera in accordance with the present invention, depicting a first overlay having a selected opacity level and presented on a view-screen of the camera.
  • FIG. 2 is a screen shot of the view-screen of the camera of FIG. 1, depicting an overlay menu of a user interface for selecting and setting an image overlay.
  • FIG. 3 is a simplified block diagram of the digital camera of FIG. 1, depicting the memory having a plurality of image files usable as overlays.
  • FIG. 4 is a rear view of the camera of FIG. 1, depicting an overlay against an alignment background.
  • FIG. 5 is a simplified view of the digital camera of FIG. 1 aligned to capture an image of a subject.
  • FIG. 6 is a representative view of the first image overlay superimposed atop the captured image from FIG. 5, forming a composite image incorporating both the first overlay and the captured image.
  • FIG. 7 is a rear view of the camera of FIG. 1, depicting, on the view-screen, the composite image of the first image overlay and the captured image.
  • FIG. 8 is a rear view of the camera of FIG. 1, depicting a second image overlay presented on the view-screen.
  • FIG. 9 is a rear view of the camera of FIG. 1, depicting the second image overlay of FIG. 8 reoriented by the photographer.
  • FIG. 10A is a screen shot of the view-screen of the camera of FIG. 1, depicting the composite image from FIG. 8 selected by the photographer for modification as an image overlay.
  • FIG. 10B is a screen shot similar to FIG. 10A, the composite image having a prescribed area in which pixels were "erased" by the photographer.
  • FIG. 10C is a screen shot similar to FIG. 10B, depicting a composite image combining the image overlay from FIG. 10B with a captured image having the subject's head aligned to reside in the area of "erased" pixels.
  • FIG. 11 is a screen shot of the view-screen of the camera of FIG. 1, depicting a scenic image captured by the camera.
  • FIG. 12 is a screen shot similar to FIG. 11, depicting an overlay having an open area sized to an aspect ratio of 4x6.
  • FIG. 13 is a screen shot similar to FIG. 11, depicting an overlay having an open area sized to an aspect ratio of 5x7.
  • FIG. 14 is a screen shot similar to FIG. 11, depicting an overlay having an open area sized to an aspect ratio of 8x10.
  • FIG. 15A is an exemplary scaled-tone image taken by the camera of FIG. 1.
  • FIG. 15B is an exemplary line overlay derived from the image of FIG. 15A, using the edge-detect feature of the camera of FIG. 1.
  • FIG. 15C is a refined line overlay derived from FIG. 15B, having extraneous marks removed using the edit feature of the camera of FIG. 1.
  • FIG. 16 is a simplified flow chart of an exemplary method for exposure bracketing implemented by the camera of FIG. 1.
  • FIG. 17A is a desired image, depicting a person standing in front of the corner of a building, having large disparity between bright and dark regions.
  • FIG. 17B is a simplified histogram chart of the image of FIG. 17A.
  • FIG. 18A is an image, similar to FIG. 17A, captured by the camera of FIG. 1 at a first exposure setting.
  • FIG. 18B is a simplified histogram chart of the image of FIG. 18A.
  • FIG. 19A is an image, similar to FIG. 17A, captured by the camera of FIG. 1 at a second exposure setting.
  • FIG. 19B is a simplified histogram chart of the image of FIG. 19A.
  • FIG. 20A is an image, similar to FIG. 17A, captured by the camera of FIG. 1 at a third exposure setting.
  • FIG. 20B is a simplified histogram chart of the image of FIG. 20A.
  • a digital camera 20 comprising an image system 21 having a unique combination of features that aid in generating creative, high-quality images.
  • the system includes a set of digital files stored in camera memory 40 (FIG. 3) that can be presented as overlays (e.g., first overlay 24).
  • the system can present a variety of overlays, each having prescribed attributes, e.g., ranging in size, opacity, and functionality, from iconic overlays to full-scale overlays having varied opacity.
  • the overlays can be used as an aid in composing a live image for digital capture, either as a constant or intermittent presence on the view-screen 22 of the camera.
  • the system can create a composite photograph in which a live image and the overlay are combined together.
  • the system further includes an auto-compositing feature that aids in creating images free of improperly exposed regions, which is discussed in detail further below.
  • a photographer can create and combine images "on location" in a unique manner, creating high-quality photographs.
  • a photographer can create or select an overlay having a desired combination of attributes via an overlay menu 34.
  • the menu provides a number of queries to guide the photographer through the process.
  • the photographer interacts with the menu using directional buttons 36 (FIG. 1) located to the left of the view-screen.
  • the directional buttons preferably are used to scroll through the menu, highlighting and selecting items, as desired.
  • Other embodiments are contemplated that allow the photographer to interact with the menu by various other means, e.g., touch-screen, stylus, joystick, and so on.
  • the " photographer can select from any image stored in memory 40 (FIG. 3) for use as an overlay.
  • the images are presented in a scrollable list 42 to the right of the query, allowing the photographer to select a desired image.
  • various other approaches can be taken to enable the photographer to create or select overlays or to access various other features in accordance with the invention, such as, use of file galleries depicted on the view-screen, presentation of a list of overlays, and so on.
  • the photographer can set the opacity level to a desired percentage from 0 to 100 percent, as desired.
  • the opacity level for the overlay controls the clarity through which the subject of the camera is viewed. For example, in a fully opaque image (100%) none of the underlying image is viewable.
  • the view-screen depicts both the live image received from the camera lens 26 and the overlay, superimposing the overlay atop the live image.
  • the photographer can align the camera (FIG. 5), composing a photograph to combine the overlay and the subject 28 in a desired manner.
  • a third query 46 allows the photographer to further customize the overlay. For example, by selecting the "full screen" option 48, the overlay can be sized to correspond to the entire viewable area, as depicted on the view-screen. By selecting the "size/orient" option 66, the photographer can adjust the size and orientation of the overlay to achieve a desired look. Also, "compare mode" presents the selected image overlay, whether opaque or translucent, by toggling between the selected image and the live image, i.e., rather than a constant depiction, the overlay is iteratively presented. This image toggling can take place automatically or be initiated manually, e.g., via a toggle button.
  • a fourth query 52 is provided. This query allows the photographer to activate different editing features to further modify the overlay, even down to the pixel level. For example, the photographer can erase portions of the overlay to allow corresponding portions of the "live" image to be unaffected by the overlay, an example of which is discussed below with reference to FIGS. 10A - 10C (i.e., the fifth example below).
  • the "enter" icon 54 is selected, and the overlay is presented for use.
  • the system enables a number of distinct overlays that provide unique features.
  • the photographer can create creative, high-quality images, without having to be unduly dependent on post processing of the image to obtain a desired look. Instead, much of the guesswork is eliminated; the photographer has a greater ability to generate a desired image, to include composition and compositing, while in the field.
  • exemplary overlays of the system are discussed below, the first being a small-scale, uniform opacity overlay.
  • a first overlay 24 depicting a "smiley face" is presented.
  • the first overlay is sized to affect just a portion of the overall image and is semi-transparent.
  • the first overlay is presented on the view-screen 22 of the camera superimposed atop the live image received from the camera's lens 26 (e.g., FIGS. 6 and 7).
  • the photographer can align the camera (FIG. 5), composing a photograph to combine the overlay and the subject 28 in a desired manner.
  • the photographer decided to offset the head of the subject with the "smiley face," to create a composition similar to an infinity sign.
  • the camera saves two digital files - the first file comprises an unaltered image 30 (FIG. 6) of the subject itself, and the second file is a composite image 32 (FIG. 7) of the overlay superimposed on the image of the subject. These files are then available for use as an overlay, if desired.
  • the "smiley" file 56 is selected and the opacity level is set at a prescribed percentage, e.g., 60 percent.
  • the "size/orient” option 66 the smiley icon 56 will be depicted against an alignment background 60 (see FIG. 4).
  • the alignment background includes a uniform field of black orthogonal gridlines on a gray background at a resolution corresponding to that of the digital sensor of the camera.
  • the smiley icon can be sized, positioned, and oriented, as desired.
  • FIG. 4 depicts the smiley icon against an alignment background 60.
  • in FIGS. 8 and 9 a second overlay 64 in the form of a dumbbell is depicted.
  • the overlay is opaque, i.e., the underlying image is not viewable through the overlay.
  • the dumbbell icon 64 is depicted in a default orientation in which it is oriented horizontally.
  • the image identified at the first query is presented on the view-screen 22 against the alignment background 60. In this example, therefore, the dumbbell image is presented. Then, using the directional buttons 36, the dumbbell can be re-oriented. Once completed, the photographer can use the new overlay having the angled dumbbell.
  • overlays formed of various different images can also be used, such as crosshairs, dots, circles, company logos, and so on.
  • the photographer can also select just a portion of an image for use as an overlay. For example, a flower can be "picked" out of a saved image and stored as an icon. The photographer can position one or more of the icons on the view screen. The user can choose to photograph against these icons "on-the-fly." The user can utilize the icons to align a composition and then save only live image data. The user may also choose to save any particular iconic arrangement as an overlay image for later use.
  • the positioning of icons can occur against a variety of backgrounds which may selectively include, for example, a live image, a stored image, a neutral background, or an alignment grid.
  • a third example is provided of a full-scale, semi-transparent overlay (third overlay 80).
  • the third overlay is generated from an image of an island village previously captured with the camera, and the photographer would like to mimic the composition of this image.
  • the photographer selects the desired image and assigns an opacity level of 25% to the entire overlay.
  • the third overlay is presented on the view-screen 22, allowing it to assist in the composition of the live image.
  • the photographer could either take the photograph or exit from the "overlay mode" and continue making other adjustments prior to taking the shot.
  • the camera can save two distinct types of images.
  • the first image type is the live image as recorded by the digital sensor, which would presumably share the compositional characteristics of the overlay.
  • the second image is a composite of the overlay and the live image as had been previewed in the view screen.
  • the second image type is analogous to a multiple exposure of a traditional film camera. This functionality simplifies the production of multiple exposures and frees the photographer to apply attention to the artistic concerns.
  • the photographer can use any image in camera memory or otherwise accessible via the system. Stored images can be randomly and repeatedly accessed as overlays.
  • a photographer could create a photograph of multiple exposures overlaid atop each other. For example, a single image could consist of ten images combined together, each image having an assigned opacity level.
  • the composite of the stored image and the live image can be previewed in real-time, allowing the photographer to make adjustments prior to depressing the shutter button, thereby minimizing the need for post-processing of the captured image.
  • the photographer could optionally store both the live image data and the composited image data.
  • the system 21 includes overlays that aid in composing images for prescribed aspect ratios to include standard aspect ratios as well as custom aspect ratios.
  • the photographer can compose the image within the parameters of the desired aspect ratio.
  • an overlay 82 is shown in FIG. 12 that is configured to aid in composing images for prints having an aspect ratio of 4x6, e.g., "4x6" prints.
  • the overall size of the overlay corresponds to the aspect ratio of the camera's default image configuration, as dictated by the camera's sensor, i.e., 3x4.
  • the overlay defines an open region 84 with an opacity level of 0% and an aspect ratio of 4x6, such that the corresponding portion of the live image can be depicted on the view-screen and saved to the resulting image file unhindered; a sketch of constructing such a print overlay appears after this list.
  • the overlay further includes shaded regions 86 provided for the remaining area of the overlay, having an increased opacity level, e.g., 60 percent.
  • the photographer can, therefore, compose the image with a particular print size in mind.
  • the camera can save several image files, including a first file simply depicting the scenic image without the overlay at the default aspect ratio, a second file depicting the scenic image and the overlay at the default aspect ratio, and a third file depicting the scenic image as defined by the open region of the overlay.
  • an overlay 90 depicted in FIG. 13, includes an open area 94 having an aspect ratio of 5x7 usable for composing 5x7 prints.
  • the overlay 92 depicted in FIG. 14, includes an open area 96 having an aspect ratio of 8x10, usable for composing 8x10 prints.
  • additional custom overlays having an open area of any desired aspect ratio can be configured using the overlay menu 34.
  • other visual means of designating areas graphically can be used, e.g., hatching, marching ants, borders, and so on.
  • the user can select whether the camera saves the entire image or performs an automatic crop to the previewed aspect ratio. Saving the entire image would result in a somewhat larger file size but would give identical results when printed on a printer whose default is to zoom in and crop. Saving the entire image also preserves the potential to recompose the image later by performing an alternate crop utilizing regions that would have been discarded during the automatic crop process.
  • in FIGS. 10A - 10C an example is provided detailing features of editing selected portions of an overlay. More particularly, the camera 20 allows the photographer to independently adjust the intensity of selected areas of an overlay, as desired.
  • a photographer can erase pixels from a stored image by selecting the "yes" option 67 at the fourth query. Once this option is selected, the image is presented on the view-screen 22, and the software feature for erasing pixels is activated. For purpose of illustration, this feature is sequentially depicted in FIGS. 1OA - 1OC.
  • in FIG. 10A an image 68 is presented on the view-screen 22. As indicated by a designator 70 in the lower corner, the image is presented with the feature of erasing pixels activated. This image was taken using the dumbbell overlay 62 (FIG. 8) to appear as though person "A" is lifting the dumbbell.
  • the photographer would like to create a photograph in which the head of person "B” replaces the head of person "A.”
  • the photographer can designate a region, or regions, of the image from which to erase pixels, e.g., region 72 (FIG. 10B).
  • the designated region identified the pixels associated with the head of person "A.”
  • the photographer selects the erase designator, erasing the identified pixels and returning to the overlay menu.
  • FIG. 10C depicts an image 74 created using this overlay, taken of person "B" in the manner discussed with reference to FIG. 2.
  • the camera saves two digital files - the first file consists simply of an image of person "B," and the second file is the composite image 74.
  • a photograph is taken of a baby in a bonnet.
  • the photographer selects the saved image for conversion into an overlay.
  • the photographer can adjust global opacity, regional opacity, or both.
  • the image of the baby in the bonnet is selected and presented on the view screen.
  • the baby's face is erased, and the result is saved as an overlay image.
  • the portion of the image corresponding to the baby's face can be set at a prescribed opacity level, e.g., 50 percent, allowing a corresponding portion of a live image (or another stored image) to be viewable.
  • the overlay is saved to the camera independently of the original image of the baby in the bonnet. Using this overlay, the photographer can now compose other faces into the bonnet.
  • the resulting images can be very humorous or even informative.
  • the camera 20 further includes a feature of overlay generation from an image file that enables a user to delete a predominant color.
  • the camera prompts the photographer to select a saved image that will be edited to create an image overlay with opacity characteristics.
  • the "overlay creation" mode includes an "automatic" setting.
  • a processor 41 determines the predominant single color in the image.
  • this feature can be activated via the overlay menu 34 by identifying the appropriate file at the first query 38 and selecting the "Color Subtract" option 69 for the fourth query 52. Then, the selected image file will be displayed on the view-screen 22.
  • the camera will automatically identify the predominant single color for removal and designate regions containing that color on the view screen with a graphical identification method such as "color flashing." Next, the photographer will be prompted to confirm whether the proper color was removed. The user is prompted to "subtract this region? Yes/no/cancel." If "yes" is selected, the predominant region is subtracted and the result may be stored as an image overlay. If "no" is selected, the region of secondary dominance is designated and the prompt is renewed. In this manner, the user can subtract large regions rapidly.
  • a custom designed grid becomes immediately available as an image overlay.
  • the user would be able to compose live images against this grid.
  • the overlay could also be composited in fine detail to the final image and used immediately for many purposes such as scientific measurement.
  • an image file in the camera's digital memory 40 could be modified in a similar manner.
  • a selected image could be modified to include a predominant single color, e.g., cyan, at desired locations throughout the image using the "modify" option 71 of the third query 46 of the overlay menu. This feature allows the photographer to interact with the image, adding colors, icons, and other indicia.
  • the user could manually designate a color to be removed with a pointer or eyedropper cursor, as are well known in the art.
  • blocks of a single color could be added to the image, covering portions of the image.
  • the designated color blocks can be removed.
  • a user could download a photograph of Mt. Rushmore from the internet.
  • the user could operate a simple paint program to apply a color to one or more of the presidents' faces, preferably using a color not otherwise present in the image. As described above, the user could rapidly transfer the image to the camera and create an entertaining overlay image.
  • the camera 20 further includes a feature for overlay generation initiated by selecting "edge detect” 73 from the fourth query 52 of the overlay menu 34.
  • This feature is configured to generate a line drawing based on an image file by detecting edges between regions of high contrast.
  • the photographer selects a saved image that will be edited to create an image overlay with opacity characteristics.
  • the "overlay creation" mode includes an "automatic” setting.
  • a processor 41 (FIG. 3) utilizes an "edge-detect" algorithm to determine the location of edges in the image field. Such edges are algorithmically identified by determining where regions are changing color or tone along a path.
  • edge-paths are typically displayed in a selectable or uncommon color.
  • the edges could be displayed against the original image or against a neutral background.
  • the system presents three slider bars that allow the user to vary the tolerance of the edge hardness that would be displayed as well as the opacity of the edges and the color of the edge-delineating pixels.
  • a further step would allow the user to manually delete extraneous pixels, and a final step would allow the "edge map" to be saved as an overlay.
  • FIG. 15 A depicts a photograph 75 of women's shoes.
  • consistency among several such photographs would be achieved simply by marking shoe positions on the floor and mounting the camera on a tripod.
  • if a floor mark is accidentally removed or the tripod is accidentally struck, re-establishment of the composition becomes problematic.
  • the problem of re-establishing the composition is increased when multiple photo sessions occur at different times or in different places. In any event, the traditional method may properly identify the shoe arrangement but it will often fail to identify the proper leg position.
  • a line overlay 77 is created that can serve as an aid in composition for further images.
  • a processor 41 (FIG. 3) utilizes an "edge-detect" algorithm to determine the location of edges in the image field. Such edges are algorithmically identified by determining where regions are changing color or tone along a path. Such edge-paths can be displayed in a selectable or uncommon color, for example. The edges can be displayed against the original image or against a neutral background, as depicted in image 79 of FIG. 15B.
  • the user could vary the tolerance of the edge hardness that would be displayed as well as the opacity of the edges and the color of the edge-delineating pixels.
  • a further step would allow the user to manually delete extraneous pixels as shown in FIG. 15C.
  • the system of the camera 20 includes a "compare mode" that toggles between the selected overlay and the live image without application of the overlay.
  • compare mode can be selected at the third query 46 of menu 34. In this mode, rather than a constant depiction, the overlay is iteratively presented. This iteration can take place automatically or initiated manually, e.g., via a toggle button 55.
  • the overlay is opaque.
  • the photographer can manually toggle between the stored image overlay and the live image in order to compare and make adjustments.
  • in automatic mode the camera will toggle at a predetermined or user-definable rate between the selected image and the live image.
  • the photographer can override the automatic toggling, e.g., via the toggle button.
  • once the live image is sufficiently composed, the photographer could either take the shot or exit from the "compare mode" and continue making other adjustments prior to making the exposure.
  • the camera 20 includes a sensor assembly 25 configured to capture a digital representation of the camera's field-of-view.
  • the captured image of the camera's field-of-view can be non-ideal, particularly when there is a large disparity between bright and dark regions. In these situations, the captured image can include over-exposed and under-exposed regions.
  • the auto-compositing feature aids in creating a composite image free of such improperly exposed regions.
  • FIG. 16 depicts a flowchart for implementing the auto-compositing feature of system 21.
  • the system can composite images taken at various exposure settings to generate a desired image having disparity between bright and dark regions beyond the maximum latitude of the sensor assembly 25 of the camera.
  • FIG. 17A depicts a desired image of a person standing in front of the corner of a building, having large disparity between bright and dark regions. Bright light streams onto the building wall depicted in the right side of the photograph. On the left side of the photograph the building wall is in dark shadows, indicated by the cross-hatched lines. At the center of the photograph the person is in moderate light, perhaps shade, indicated by the single-hatched lines.
  • FIG. 17B is a histogram that classifies the brightness of the image of FIG. 17A by area. Such histograms typically plot darker values on the left and brighter values on the right. By coincidence, the dark values at the left of the photograph correspond primarily to the hump near the left of the histogram. The light values at the right of the photograph correspond primarily to the hump near the right of the histogram. The moderate light on the person corresponds to the center hump. The arrow at the bottom of the histogram illustrates the tremendously wide latitude required to capture this image, beyond the maximum latitude of the sensor assembly 25 of the camera.
  • FIGS. 18A, 19A, and 20A depict resulting images captured by camera 20 at various exposure settings.
  • FIG. 18A represents the result of photographing the image with a non-ideal camera at a first exposure setting. The exposure has been set to capture the dark left side of the building. The remaining portions of the image are extremely overexposed. The camera's latitude at the first exposure setting is indicated by the arrow of FIG. 18B. Notably, this latitude is less than is required to capture the desired image of FIG. 17A.
  • in FIG. 19A the exposure has been set to capture the moderate light of the person but the details of the building are underexposed on the left and overexposed on the right.
  • in FIG. 20A the exposure has been set to capture the bright side of the building at the right of the photograph. The left side and the person are underexposed.
  • the auto-compositing feature is initiated by identifying a potentially bad image, step 200.
  • identification comprises an algorithm that identifies significant bad regions as are evident in FIGS. 18A, 19A, and 20A.
  • the camera prompts the user to make a new exposure-bracketed attempt at the shot or to exit back to normal camera function. If the user chooses to attempt the improved image, further guidance prompts may be presented as illustrated in step 204. Such guidance may also include a suggestion to use a tripod.
  • the user depresses the shutter, step 206.
  • in step 208, under the guidance of the processor 41 (FIG. 3), the camera exposes and saves multiple bracketed images, each at a distinct exposure setting.
  • the processor examines all the captured images and selects a first image using an algorithm that typically gives the best results (step 210). In this example, the image of FIG. 18A is selected.
  • the system can randomly select one of the images for evaluation.
  • the system can utilize an algorithm that would evaluate and compare characteristics of the bracketed set in order to establish the most appropriate first image with which to begin the compositing process. Such selection algorithms could evaluate with respect to the compositing algorithms in order to determine the combination that would provide the most pleasing final images.
  • the image is evaluated, and the process may branch to saving the image at step 220.
  • an overexposed region 213 is identified.
  • the overexposed region will be measured against comparable regions in a corresponding location in the other bracketed images.
  • the overexposed region 213 of FIG. 18A would be replaced by the region 215 of FIG. 19A depicting the image of the person.
  • the image is reevaluated by the processor to determine if additional compositing would improve it.
  • the system would still identify a large overexposed region on the right and, thus, would loop the processor back to step 214. Then, the system would composite the properly exposed right region 217 from FIG. 20A into the image.
  • no further improvement would be possible without a re-shoot.
  • a final evaluation at step 218 would determine if the final image might be improvable with a re-shoot. If so, the user would get a preview of the current result on the view screen and be prompted with the opportunity to re-shoot at step 222.
  • the camera would make additional adjustments of exposure and number of shots in order to achieve a higher likelihood of success. If the camera determines that a re-shoot would not improve the shot, or if the user elects to not re-shoot, the image is saved at step 220.
  • the compositing process would benefit from algorithms that can smooth the transition between the composited sections. These algorithms are widely known. It is possible that the quality of such composites may, in many cases, obviate the need for later editing. In any case, the auto-composited image would provide satisfactory assurance to the photographer that the bracketed exposures contained adequate image data for later compositing.
  • the present invention provides a digital camera comprising an image system having a unique combination of features that aid in generating creative, high-quality images.
  • the system presents a variety of overlays, each having prescribed attributes, e.g., ranging in size, opacity, and functionality, from iconic overlays to full-scale overlays having varied opacity.
  • the overlays can be used as an aid in composing a live image for digital capture, either as a constant or intermittent presence on the view-screen of the camera.
  • the camera can create a composite photograph in which a live image and the overlay are combined together.
  • the camera can include an auto-compositing feature that aids in creating images free of improperly exposed regions. Thus, a photographer can create and combine images "on location" in a unique manner.
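Referring back to the print overlays described in the list above (FIGS. 12 - 14), the following is a minimal sketch, in Python with NumPy, of how such an overlay could be constructed: a centered open window with the requested print aspect ratio is cut out of a uniformly shaded full-frame mask. The 60 percent shading and neutral gray fill echo the example values above; the function name, array layout, and frame size are illustrative assumptions rather than details taken from the patent.

```python
import numpy as np

def print_overlay(frame_h: int, frame_w: int, print_w: float, print_h: float,
                  shade_opacity_pct: float = 60.0) -> np.ndarray:
    """RGBA overlay whose open (alpha 0) window has the aspect ratio of a print size."""
    target = print_w / print_h                      # e.g. 6/4 for a landscape 4x6 print
    # largest centered window with the target aspect ratio that fits the frame
    if frame_w / frame_h > target:
        win_h, win_w = frame_h, int(round(frame_h * target))
    else:
        win_w, win_h = frame_w, int(round(frame_w / target))
    top, left = (frame_h - win_h) // 2, (frame_w - win_w) // 2

    overlay = np.zeros((frame_h, frame_w, 4), dtype=np.uint8)
    overlay[..., :3] = 128                                   # neutral gray shading
    overlay[..., 3] = int(255 * shade_opacity_pct / 100.0)   # shaded regions
    overlay[top:top + win_h, left:left + win_w, 3] = 0       # open region
    return overlay

# e.g. a landscape 4x6 framing aid for a 3:4 sensor preview (assumed 480x640 pixels)
mask = print_overlay(480, 640, print_w=6, print_h=4)
```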

Abstract

A digital camera is provided, comprising an image system having a unique combination of features that aid in generating creative, high-quality images. The system presents a variety of overlays, each having prescribed attributes, e.g., ranging in size, opacity, and functionality, from iconic overlays to full-scale overlays having varied opacity. The overlays can be used as an aid in composing a live image for digital capture, either as a constant or intermittent presence on the view-screen of the camera. In an exemplary embodiment, the camera can create a composite photograph in which a live image and the overlay are combined together. In an independent aspect, the camera can include an auto-compositing feature that aids in creating images free of improperly exposed regions. Thus, a photographer can create and combine images “on location” in a unique manner.

Description

DIGITAL CAMERA HAVING SYSTEM FOR DIGITAL IMAGE COMPOSITION AND
RELATED METHOD
BACKGROUND OF THE INVENTION
The present invention relates generally to photography and, more particularly, to composition of digital photographs.
Photography has long been a popular medium of creative expression. In traditional photography, factors such as composition and exposure settings all contribute to creating an esthetic photograph. Of these, composition is a particularly important consideration. To aid in composition of the subject, cameras typically include indicia such as cross-hairs, grid lines, or the like, to help the photographer in alignment. In traditional film cameras the alignment indicia typically are etched on a screen of a viewfinder assembly. In digital cameras, the alignment indicia typically are presented as iconic images on a view-screen, commonly an LCD screen, atop the live image, thereby serving as reference in aligning the subject of the photograph.
More recently, some digital cameras include several indicia schemes, providing various configurations of indicia to aid the photographer in composing the photograph. The various schemes typically are geared for a particular photographic composition such as a one-person portrait or a two-person portrait. Some digital cameras can be prompted to depict the indicia on the resulting photograph, if desired. Thus, such photographs include a composite of the live image from the camera and the indicia. For example, some cameras provide overlays having the current date and time, serving as a time stamp for the photograph. Some digital cameras have provided factory-installed overlays, simply for comical effect. For example, overlays have been provided that depict a contrived magazine cover having a blank spot for a person's head. In use, the overlay is depicted on the view screen of the camera. The photographer aligns the camera such that the subject's head is positioned within the blank spot of the overlay, and then takes the picture. As a result, a photograph is generated depicting the subject on the cover of a magazine.
Similarly, it can be desired to compose multiple images in a single photograph. For example, certain film cameras allow a photographer to expose a single frame of film multiple times. This "multi-exposure" mechanism allows a photographer to open the shutter multiple times without advancing the film between the exposures. Light from each of the exposures is recorded onto a single frame of film. However, the results are often dependent on the photographer precisely controlling many aspects of the composition or at least making an educated guess regarding alignment of the stored image with the live image in the viewfinder. Insofar as the photographer is unable to precisely recall the prior photograph, the quality of the resulting composite is left to chance. If an error is made on either exposure, both exposures are typically rendered useless.
In contrast, in digital photography, multiple-exposure photographs typically are created in post processing. For example, a photographer will capture separate digital images. Then, using a personal computer running software for digital image editing (e.g., Adobe® Photoshop available from Adobe Systems, Inc.), the photographer will composite the separate images into a single image. During this process, the photographer typically will need to register the images so that corresponding features within the images are properly aligned. Since each image was taken separately, there is a good bit of guesswork involved in composing each image. Thus, it is common that the contents of the images will not align precisely. During post processing, portions of each image typically must be cropped to conform the images, which can eliminate desired aspects of the image.
A post processing method called compositing can also be used to refine improperly exposed images. When a photographer tries to capture an image with a large disparity between bright and dark regions it is common for the bright areas to overexpose to excessive lightness and/or for the dark areas to underexpose to excessive darkness. The traditional solution to this problem is to put the camera on a tripod and shoot multiple images at a range of different shutter and/or aperture settings. Then in post processing, the images are composited together, which can be tedious and time consuming. Even utilizing this approach, it is possible to make exposure errors which might not be detected until the editing process. However, by that time, arranging to remake the photograph might be difficult or impossible. Thus, despite the aforementioned advancements in photography, a certain amount of good fortune is needed to achieve a desired esthetic in a photograph composed of multiple exposures. Post processing can be complicated and prone to failure due to errors made at the time of exposure.
It should, therefore, be appreciated that there remains a need for a system of digital image composition that allows users to create, modify, or personalize digital images from a digital camera to include image composition and image exposure without undue reliance on post processing. The present invention fulfills this need and others.
SUMMARY OF THE INVENTION
The invention provides a system for digital composition usable with a digital camera providing image overlays that enable the photographer to create and combine images in a unique manner. By way of example, the handheld digital camera includes a plurality of files stored in digital memory. Each file can be used as an overlay that has a user-assignable opacity level. The overlay is depicted on a view-screen of the camera in conjunction with image data corresponding to the field-of-view of the camera such that the opacity of the overlay controls the clarity of the image data as presented on the view-screen. The camera implements a user interface presented on the view-screen to enable creation of overlays and selection of overlays from the plurality of files. Thus, a photographer can create and combine images "on location" in a unique manner.
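The opacity behavior described above can be illustrated with a short sketch, assuming the live frame and the overlay are available to the firmware as RGB/RGBA pixel arrays and that the user-assigned opacity is a single value from 0 to 100. The function and variable names are illustrative only, and Python with NumPy is used purely for readability; this is not the patent's implementation.

```python
import numpy as np

def blend_overlay(live_rgb: np.ndarray, overlay_rgba: np.ndarray, opacity_pct: float) -> np.ndarray:
    """Composite an overlay onto the live view.

    live_rgb:     HxWx3 uint8 frame from the sensor.
    overlay_rgba: HxWx4 uint8 overlay; its alpha channel marks erased
                  (alpha == 0) versus drawn regions.
    opacity_pct:  user-assigned opacity, 0 (invisible) to 100 (fully opaque).
    """
    alpha = (overlay_rgba[..., 3:4] / 255.0) * (opacity_pct / 100.0)
    blended = overlay_rgba[..., :3] * alpha + live_rgb * (1.0 - alpha)
    return blended.astype(np.uint8)

# Toy usage: a gray live frame and a red overlay region shown at 60% opacity.
live = np.full((240, 320, 3), 128, dtype=np.uint8)
overlay = np.zeros((240, 320, 4), dtype=np.uint8)
overlay[60:180, 80:240] = (255, 0, 0, 255)        # drawn region of the overlay
view = blend_overlay(live, overlay, opacity_pct=60)
```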
Optionally, the overlays can be used as an aid in composing a live image for digital capture, either as a constant or intermittent presence on the view-screen of the camera. Also, the camera can be configured to save two digital files to the digital memory upon taking a photograph, the first file comprising the image data corresponding to the field-of-view of the camera and the second file comprising a composite image of the overlay superimposed on the image data corresponding to the field-of-view of the camera.
In an exemplary embodiment, the camera can present a variety of overlays, each having prescribed attributes, e.g., ranging in size, opacity, and functionality. For example, an overlay can be configured such that its assigned opacity level affects the entirety of a resulting image. The user interface can be configured to modify attributes of the stored files. For example, modification of an overlay can be achieved by removing a first color from the overlay. The user interface can further enable sizing and positioning of the overlay for use relative to the image data from the field-of-view of the camera.
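As a rough illustration of the color-removal modification mentioned above, the sketch below forces every overlay pixel that matches a chosen color to full transparency, so the live image shows through in those areas. The tolerance parameter is an assumed convenience and is not something the summary specifies.

```python
import numpy as np

def remove_color(overlay_rgba: np.ndarray, color: tuple, tolerance: int = 0) -> np.ndarray:
    """Return a copy of the overlay with pixels near `color` made fully transparent."""
    out = overlay_rgba.copy()
    diff = np.abs(out[..., :3].astype(int) - np.array(color, dtype=int))
    match = np.all(diff <= tolerance, axis=-1)
    out[match, 3] = 0            # alpha 0: the live image is unaffected here
    return out

# e.g. strip a cyan backdrop so only the foreground of a stored image overlays the live view
# cleaned = remove_color(overlay, color=(0, 255, 255), tolerance=10)
```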
In a detailed aspect of an exemplary embodiment, the plurality of files includes at least one print overlay having two regions of differing opacity, including an open region and a shaded region, the shaded region of the overlay having an increased opacity relative to the open region, the open region having a prescribed aspect ratio corresponding to a photograph print size. Optionally, multiple print overlays would be available, wherein the open region of each print overlay has a distinct aspect ratio (e.g., 8x10 and 4x6). In another detailed aspect of an exemplary embodiment, the user interface enables automated creation of a line overlay from an image file via an edge-detect feature in which the line overlay is a line drawing of the image file. The edge-detect feature analyzes the image file by identifying borders between regions of differing color and tone of a prescribed value, and defines, in the line overlay, a line of prescribed opacity and color corresponding to the identified border. Optionally, the user can set the prescribed value used for identifying the border between regions in the image file. Also, the user can set the opacity value and the color for the line overlay.
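A simplified sketch of that edge-detect overlay creation follows: borders are taken wherever the tonal difference between neighboring pixels exceeds a user-set threshold, and each detected border pixel is written into a new overlay with the user-chosen line color and opacity. The simple gradient test shown is only a stand-in for whatever edge detector the camera would actually employ, and the names and defaults are illustrative assumptions.

```python
import numpy as np

def line_overlay_from_image(rgb: np.ndarray, threshold: float = 30.0,
                            line_color=(255, 0, 255),
                            line_opacity_pct: float = 100.0) -> np.ndarray:
    """Build an RGBA line overlay from an image by thresholding tonal gradients."""
    tone = rgb.astype(float).mean(axis=-1)                 # grayscale "tone"
    gx = np.abs(np.diff(tone, axis=1, prepend=tone[:, :1]))
    gy = np.abs(np.diff(tone, axis=0, prepend=tone[:1, :]))
    edges = (gx + gy) > threshold                          # borders between differing regions

    overlay = np.zeros(rgb.shape[:2] + (4,), dtype=np.uint8)
    overlay[edges, :3] = line_color                        # user-chosen line color
    overlay[edges, 3] = int(255 * line_opacity_pct / 100.0)  # user-chosen line opacity
    return overlay
```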
In a preferred method of generating an overlay with a digital camera, a user selects a stored image from digital memory. The image is presented on a view-screen of the camera. The user may assign an opacity level to this overlay via a user input device of the digital camera. A user input device is used to designate a region of the stored image to be erased. The resultant image is saved to digital memory for use as an overlay. When used, a portion of the live image corresponding to the erased region is unaffected by the overlay.
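In code terms, the erase step of this method could amount to zeroing the alpha channel of the overlay inside the designated region, as sketched below. A rectangular region is used purely for illustration; the description contemplates the user designating arbitrary regions with the input device.

```python
import numpy as np

def erase_region(overlay_rgba: np.ndarray, top: int, left: int,
                 bottom: int, right: int) -> np.ndarray:
    """Erase a rectangular region: the live image is unaffected by the overlay there."""
    out = overlay_rgba.copy()
    out[top:bottom, left:right, 3] = 0   # alpha 0 marks the erased region
    return out
```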
In an independent aspect of the invention, a method for image composition using a handheld digital camera comprises the steps of:
capturing multiple digital images in automated sequence using a digital camera, each image captured at a distinct exposure setting, the digital camera having a processor assembly and a digital sensor assembly in communication with the processor, the sensor assembly having a prescribed sensing range beyond which a captured image will have over- or under-exposed regions;
analyzing a first digital image of the multiple digital images for over- or under-exposed regions;
identifying properly exposed regions in the remaining digital images of the multiple digital images corresponding in location to the over- or under-exposed regions of the first digital image, the identifying step performed by the processor assembly of the camera;
automated compositing of properly exposed regions identified in the first digital image and the remaining digital images of the multiple digital images, resulting in a composite image in which the identified regions replace the corresponding over- or under-exposed regions of the first digital image; and
storing the composite image in digital memory of the camera.
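A compact sketch of these steps is given below, assuming the bracketed frames have already been captured into a list of aligned uint8 arrays (tripod use is assumed, so no registration is shown) and approximating "over- or under-exposed" by near-clipped pixel values. The thresholds and helper names are illustrative assumptions rather than details of the claimed method.

```python
import numpy as np

LOW, HIGH = 8, 247   # illustrative clipping thresholds on a 0..255 scale

def bad_mask(img: np.ndarray) -> np.ndarray:
    """True where a pixel is under- or over-exposed in every channel."""
    tone = img.astype(int)
    return np.all(tone <= LOW, axis=-1) | np.all(tone >= HIGH, axis=-1)

def auto_composite(bracket: list) -> np.ndarray:
    """Analyze the first frame, then patch its bad regions with properly
    exposed pixels from the remaining bracketed frames."""
    result = bracket[0].copy()
    remaining = bad_mask(result)                 # over-/under-exposed regions
    for frame in bracket[1:]:
        fixable = remaining & ~bad_mask(frame)   # properly exposed in this frame
        result[fixable] = frame[fixable]         # replace the bad regions
        remaining &= ~fixable
        if not remaining.any():
            break
    return result                                # composite image to store in memory
```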
In an exemplary embodiment, the method further comprises, prior to the capturing multiple images step, analyzing an image taken as a single exposure for over- or under-exposed regions and, if found, prompting the user to initiate the capturing multiple images step.
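That pre-check could be as simple as measuring the fraction of clipped pixels in a trial exposure and prompting the user when it exceeds some limit, as in the sketch below; the 5 percent limit and the clipping thresholds are assumed placeholders.

```python
import numpy as np

def should_prompt_bracketing(img: np.ndarray, limit: float = 0.05) -> bool:
    """Suggest a bracketed capture when too much of a single trial shot is clipped."""
    tone = img.astype(int)
    clipped = np.all(tone <= 8, axis=-1) | np.all(tone >= 247, axis=-1)
    return clipped.mean() > limit
```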
In a detailed aspect of an exemplary embodiment, the identifying and automated compositing steps further include: (a) selecting a second digital image of the multiple digital images having at least one properly exposed region corresponding in location to an over- or under-exposed region of the first digital image; (b) automated compositing of all regions from the second digital image corresponding in location to all over- or under-exposed regions of the first digital image with the remaining portions of the first digital image; and (c) repeating steps (a) and (b) with the resultant image from the prior step (b) and a third digital image of the multiple digital images.
In another detailed aspect of an exemplary embodiment, the method further includes storing the multiple digital images that were captured in automated sequence for later use.
In yet another detailed aspect of an exemplary embodiment, the method further includes automated deletion of the multiple digital images following completion of the composite image.
For purposes of summarizing the invention and the advantages achieved over the prior art, certain advantages of the invention are described herein. Of course, it is to be understood that not necessarily all such advantages may be achieved in accordance with any particular embodiment of the invention. Thus, for example, those skilled in the art will recognize that the invention may be embodied or carried out in a manner that achieves or optimizes one advantage or group of advantages as taught herein without necessarily achieving other advantages as may be taught or suggested herein.
All of these embodiments are intended to be within the scope of the invention herein disclosed. These and other embodiments of the present invention will become readily apparent to those skilled in the art from the following detailed description of the exemplary embodiments having reference to the attached figures, the invention not being limited to any particular embodiment disclosed.
BRIEF DESCRIPTION OF THE DRAWINGS
Embodiments of the present invention will now be described, by way of example only, with reference to the following drawings in which:
FIG. 1 is a rear view of a digital camera in accordance with the present invention, depicting a first overlay having a selected opacity level and presented on a view-screen of the camera.
FIG. 2 is a screen shot of the view-screen of the camera of FIG. 1, depicting an overlay menu of a user interface for selecting and setting an image overlay.
FIG. 3 is a simplified block diagram of the digital camera of FIG. 1, depicting the memory having a plurality of image files usable as overlays.
FIG. 4 is a rear view of the camera of FIG. 1, depicting an overlay against an alignment background.
FIG. 5 is a simplified view of the digital camera of FIG. 1 aligned to capture an image of a subject.
FIG. 6 is a representative view of the first image overlay superimposed atop the captured image from FIG. 5, forming a composite image incorporating both the first overlay and the captured image.
FIG. 7 is a rear view of the camera of FIG. 1, depicting, on the view-screen, the composite image of the first image overlay and the captured image.
FIG. 8 is a rear view of the camera of FIG. 1, depicting a second image overlay presented on the view-screen.
FIG. 9 is a rear view of the camera of FIG. 1, depicting the second image overlay of FIG. 8 reoriented by the photographer.
FIG. 10A is a screen shot of the view-screen of the camera of FIG. 1, depicting the composite image from FIG. 8 selected by the photographer for modification as an image overlay.
FIG. 10B is a screen shot similar to FIG. 10A, the composite image having a prescribed area in which pixels were "erased" by the photographer.
FIG. 10C is a screen shot similar to FIG. 10B, depicting a composite image combining the image overlay from FIG. 10B with a captured image having the subject's head aligned to reside in the area of "erased" pixels.
FIG. 11 is a screen shot of the view-screen of the camera of FIG. 1, depicting a scenic image captured by the camera.
FIG. 12 is a screen shot similar to FIG. 11, depicting an overlay having an open area sized to an aspect ratio of 4x6.
FIG. 13 is a screen shot similar to FIG. 11, depicting an overlay having an open area sized to an aspect ratio of 5x7.
FIG. 14 is a screen shot similar to FIG. 11, depicting an overlay having an open area sized to an aspect ratio of 8x10.
FIG. 15A is an exemplary scaled-tone image taken by the camera of FIG. 1.
FIG. 15B is an exemplary line overlay derived from the image of FIG. 15A, using the edge-detect feature of the camera of FIG. 1.
FIG. 15C is a refined line overlay derived from FIG. 15B, having extraneous marks removed using the edit feature of the camera of FIG. 1.
FIG. 16 is a simplified flow chart of an exemplary method for exposure bracketing implemented by the camera of FIG. 1.
FIG. 17A is a desired image, depicting a person standing in front of the corner of a building, having a large disparity between bright and dark regions.
FIG. 17B is a simplified histogram chart of the image of FIG. 17A.
FIG. 18A is an image, similar to FIG. 17A, captured by the camera of FIG. 1 at a first exposure setting.
FIG. 18B is a simplified histogram chart of the image of FIG. 18A.
FIG. 19A is an image, similar to FIG. 17A, captured by the camera of FIG. 1 at a second exposure setting.
FIG. 19B is a simplified histogram chart of the image of FIG. 19A.
FIG. 20A is an image, similar to FIG. 17A, captured by the camera of FIG. 1 at a third exposure setting.
FIG. 20B is a simplified histogram chart of the image of FIG. 20A.
DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Referring now to the drawings, and particularly FIGS. 1 - 3, there is shown a digital camera 20 comprising an image system 21 having a unique combination of features that aid in generating creative, high-quality images. The system includes a set of digital files stored in camera memory 40 (FIG. 3) that can be presented as overlays (e.g., first overlay 24). The system can present a variety of overlays, each having prescribed attributes, e.g., ranging in size, opacity, and functionality, from iconic overlays to full-scale overlays having varied opacity. The overlays can be used as an aid in composing a live image for digital capture, either as a constant or intermittent presence on the view-screen 22 of the camera. Also, the system can create a composite photograph in which a live image and the overlay are combined together. The system further includes an auto-compositing feature that aids in creating images free of improperly exposed regions, which is discussed in detail further below. Thus, a photographer can create and combine images "on location" in a unique manner, creating high-quality photographs.
With reference to FIG. 2, a photographer can create or select an overlay having a desired combination of attributes via an overlay menu 34. The menu provides a number of queries to guide the photographer through the process. The photographer interacts with the menu using directional buttons 36 (FIG. 1) located to the left of the view-screen. The directional buttons preferably are used to scroll through the menu, highlighting and selecting items, as desired. Other embodiments are contemplated that allow the photographer to interact with the menu by various other means, e.g., touch-screen, stylus, joystick, and so on. At the first query 38, the photographer can select from any image stored in memory 40 (FIG. 3) for use as an overlay. The images are presented in a scrollable list 42 to the right of the query, allowing the photographer to select a desired image. In other embodiments, various other approaches can be taken to enable the photographer to create or select overlays or to access various other features in accordance with the invention, such as, use of file galleries depicted on the view-screen, presentation of a list of overlays, and so on.
At a second query 44 of the menu, the photographer can set the opacity level of the overlay to a desired percentage from 0 to 100 percent. The opacity level controls the clarity with which the subject of the camera is viewed through the overlay. For example, in a fully opaque overlay (100%), none of the underlying image is viewable. In use, the view-screen depicts both the live image received from the camera lens 26 and the overlay, superimposing the overlay atop the live image. The photographer can align the camera (FIG. 5), composing a photograph to combine the overlay and the subject 28 in a desired manner.
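By way of illustration only (not part of the disclosed embodiment), the per-pixel effect of the opacity setting can be sketched as a simple alpha blend; the array and function names below are hypothetical:

```python
import numpy as np

def blend_overlay(live, overlay, opacity_pct):
    """Superimpose an overlay atop a live image at the given opacity (0-100%).

    Both images are H x W x 3 uint8 arrays of the same shape. At 100% the
    overlay is fully opaque and hides the live image; at 0% it is invisible.
    """
    alpha = opacity_pct / 100.0
    blended = alpha * overlay.astype(np.float32) + (1.0 - alpha) * live.astype(np.float32)
    return np.clip(blended, 0, 255).astype(np.uint8)
```

A view-screen preview at, e.g., 60 percent opacity would amount to calling blend_overlay(live_frame, overlay_image, 60) on each refresh (hypothetical names).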
A third query 46 allows the photographer to further customize the overlay. For example, by selecting the "full screen" option 48, the overlay can be sized to correspond to the entire viewable area, as depicted on the view-screen. By selecting the "size/orient" option 66, the photographer can adjust the size and orientation of the overlay to achieve a desired look. Also, "compare mode" presents the selected image overlay, whether opaque or translucent, by toggling between the selected image and the live image, i.e., rather than a constant depiction, the overlay is iteratively presented. This image toggling can take place automatically or be initiated manually, e.g., via a toggle button.
A fourth query 52 allows the photographer to activate different editing features to further modify the overlay, even down to the pixel level. For example, the photographer can erase portions of the overlay to allow corresponding portions of the "live" image to be unaffected by the overlay, an example of which is discussed below with reference to FIGS. 10A - 10C (i.e., the fifth example below). Once an overlay is configured, the "enter" icon 54 is selected, and the overlay is presented for use.
In use, the system enables a number of distinct overlays that provide unique features. As a result, the photographer can produce creative, high-quality images without being unduly dependent on post-processing of the image to obtain a desired look. Instead, much of the guesswork is eliminated; the photographer has a greater ability to generate a desired image, including composition and compositing, while in the field. To illustrate, exemplary overlays of the system are discussed below.
1. Small-Scale, Uniform Opacity Overlay
In a first example, a first overlay 24 (FIG. 1) depicting a "smiley face" is presented. The first overlay is sized to affect just a portion of the overall image and is semi-transparent. In use, the first overlay is presented on the view-screen 22 of the camera superimposed atop the live image received from the camera's lens 26 (e.g., FIGS. 6 and 7). Thus, the photographer can align the camera (FIG. 5), composing a photograph to combine the overlay and the subject 28 in a desired manner. In this example, the photographer decided to offset the head of the subject with the "smiley face," to create a composition similar to an infinity sign. The camera saves two digital files - the first file comprises an unaltered image 30 (FIG. 6) of the subject itself, and the second file is a composite image 32 (FIG. 7) of the overlay superimposed on the image of the subject. These files are then available for use as an overlay, if desired.
To create the first overlay using the overlay menu 34 (FIG. 2), the "smiley" file 56 is selected and the opacity level is set at a prescribed percentage, e.g., 60 percent. By selecting the "size/orient" option 66, the smiley icon 56 will be depicted against an alignment background 60 (see FIG. 4). The alignment background includes a uniform field of black orthogonal gridlines on a gray background at a resolution corresponding to that of the digital sensor of the camera. Using the directional buttons 36, the smiley icon can be sized, positioned, and oriented, as desired. FIG. 4 depicts the smiley icon against the alignment background 60; arrows are depicted in the figure only to indicate movement of the smiley icon from an initial location in the center of the view-screen to a final location. Once finished, the overlay menu 34 will be displayed again. The fourth query 52 is left at its default response of "no." If satisfied, the photographer selects the "enter" icon 54, and the image overlay is ready for use. Of course, these selections can remain as defaults associated with the "smiley" file or can be saved in an additional overlay file.
2. Small-Scale, Opaque Overlay
With reference now to FIGS. 8 and 9, a second overlay 64 in the form of a dumbbell is depicted. The overlay is opaque, i.e., the underlying image is not viewable through the overlay. In FIG. 8, the dumbbell icon 64 is depicted in a default orientation in which it is oriented horizontally. FIG. 9, however, shows the dumbbell icon angled. This can be achieved using the "size/orient" option 66 of the overlay menu 34 (FIG. 2). As previously mentioned, when this option is selected from the overlay menu, the image identified at the first query is presented on the view-screen 22 against the alignment background 60. In this example, therefore, the dumbbell image is presented. Then, using the directional buttons 36, the dumbbell can be reoriented. Once completed, the photographer can use the new overlay having the angled dumbbell.
Using the system 21 (FIG. 3), overlays formed of various different images can also be used, such as crosshairs, dots, circles, company logos, and so on. Moreover, the photographer can also select just a portion of an image for use as an overlay. For example, a flower can be "picked" out of a saved image and stored as an icon. The photographer can position one or more of the icons on the view screen. The user can choose to photograph against these icons "on-the-fly." The user can utilize the icons to align a composition and then save only live image data. The user may also choose to save any particular iconic arrangement as an overlay image for later use. The positioning of icons can occur against a variety of backgrounds which may selectively include, for example, a live image, a stored image, a neutral background, or an alignment grid.
3. Full-Scale, Uniform Opacity Overlay
With reference to FIG. 11, a third example is provided of a full-scale, semi-transparent overlay (third overlay 80). In this example, the third overlay is generated from an image of an island village previously captured with the camera, and the photographer would like to mimic the composition of this image. To create this overlay using the overlay menu 34 (FIG. 2), the photographer selects the desired image and assigns an opacity level of 25% to the entire overlay. In use, the third overlay is presented on the view-screen 22, allowing it to assist in the composition of the live image. When the live image is sufficiently composed, the photographer could either take the photograph or exit from the "overlay mode" and continue making other adjustments prior to taking the shot.
Optionally, the camera can save two distinct types of images. The first image type is the live image as recorded by the digital sensor, which would presumably share the compositional characteristics of the overlay. The second image is a composite of the overlay and the live image as previewed in the view screen. The second image type is analogous to a multiple exposure of a traditional film camera. This functionality simplifies the production of multiple exposures and frees the photographer to focus attention on artistic concerns. Using this approach, the photographer can use any image in camera memory or otherwise accessible via the system. Stored images can be randomly and repeatedly accessed as overlays. Thus, a photographer could create a photograph of multiple exposures overlaid atop each other. For example, a single image could consist of ten images combined together, each image having an assigned opacity level.
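As an informal sketch of such stacked multiple exposures (assuming each stored layer carries its own opacity percentage; the helper below is hypothetical and not part of the disclosure):

```python
import numpy as np

def composite_layers(base, layers):
    """Blend a sequence of (image, opacity_pct) layers atop a base image.

    Layers are applied in order, so each later layer partially covers the
    result of all earlier blends, mimicking stacked multiple exposures.
    """
    result = base.astype(np.float32)
    for image, opacity_pct in layers:
        alpha = opacity_pct / 100.0
        result = alpha * image.astype(np.float32) + (1.0 - alpha) * result
    return np.clip(result, 0, 255).astype(np.uint8)
```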
The composite of the stored image and the live image can be previewed in real-time, allowing the photographer to make adjustments prior to depressing the shutter button, thereby minimizing the need for post-processing of the captured image. The photographer could optionally store both the live image data and the composited image data.
4. Full-Scale, Regional Opacity Overlay
With reference now to FIGS. 12 - 14, the system 21 includes overlays that aid in composing images for prescribed aspect ratios, including standard aspect ratios as well as custom aspect ratios. The photographer can compose the image within the parameters of the desired aspect ratio.
For example, an overlay 82 is shown in FIG. 12 that is configured to aid in composing images for prints having an aspect ratio of 4x6, e.g., "4x6" prints. The overall size of the overlay corresponds to the aspect ratio of the camera's default image configuration, as dictated by the camera's sensor, i.e., 3x4. However, the overlay defines an open region 84 with an opacity level of 0% and an aspect ratio of 4x6, such that the corresponding portion of the live image can be depicted on the view-screen and saved to the resulting image file unhindered. The overlay further includes shaded regions 86 covering the remaining area of the overlay, having an increased opacity level, e.g., 60 percent. The photographer can, therefore, compose the image with a particular print size in mind. When a photograph is taken, the camera can save several image files, including a first file simply depicting the scenic image without the overlay at the default aspect ratio, a second file depicting the scenic image and the overlay at the default aspect ratio, and a third file depicting the scenic image as defined by the open region of the overlay.
With reference now to FIGS. 13 and 14, an overlay 90, depicted in FIG. 13, includes an open area 94 having an aspect ratio of 5x7 usable for composing 5x7 prints. The overlay 92, depicted in FIG. 14, includes an open area 96 having an aspect ratio of 8x10, usable for composing 8x10 prints. Moreover, additional custom overlays having an open area of any desired aspect ratio can be configured using the overlay menu 34. In other embodiments, other visual means of designating areas graphically can be used, e.g., hatching, marching ants, borders, and so on.
In selected embodiments, the user can select whether the camera saves the entire image or performs an automatic crop to the previewed aspect ratio. Saving the entire image would result in a somewhat larger file size but would give identical results when printed on a printer that zooms in to the selected aspect ratio by default. Saving the entire image also preserves the potential to recompose the image later by performing an alternate crop utilizing regions that would have been discarded during the automatic crop process.
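For illustration, the geometry of such an open region can be sketched as follows (a hypothetical helper assuming a centered region expressed in pixels; it is not taken from the disclosure):

```python
def open_region(sensor_w, sensor_h, print_w, print_h):
    """Largest centered region of the sensor frame matching a print aspect ratio.

    Returns (left, top, width, height) in pixels. Pixels outside this region
    would be shaded in the overlay, or discarded by an automatic crop.
    """
    target = print_w / print_h
    if sensor_w / sensor_h > target:      # sensor frame is wider than the print
        height = sensor_h
        width = round(sensor_h * target)
    else:                                 # sensor frame is taller than the print
        width = sensor_w
        height = round(sensor_w / target)
    left = (sensor_w - width) // 2
    top = (sensor_h - height) // 2
    return left, top, width, height

# e.g., composing a landscape 4x6 print on a hypothetical 2272 x 1704 (4:3) sensor
print(open_region(2272, 1704, 6, 4))      # (0, 94, 2272, 1515)
```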
5. Editing Opacity of Selected Portions of Overlay
With reference to FIGS. 1OA - 1OC, an example is provided, detailing features of editing selected portions of an overlay. More particularly, the camera 20 allows the photographer to independently adjust the intensity of selected areas of an overlay, as desired.
Using the overlay menu 34, a photographer can erase pixels from a stored image by selecting the "yes" option 67 at the fourth query. Once this option is selected, the image is presented on the view-screen 22, and the software feature for erasing pixels is activated. For purpose of illustration, this feature is sequentially depicted in FIGS. 10A - 10C. In FIG. 10A, an image 68 is presented on the view-screen 22. As indicated by a designator 70 in the lower corner, the image is presented with the feature of erasing pixels activated. This image was taken using the dumbbell overlay 62 (FIG. 8) to appear as though person "A" is lifting the dumbbell. Assume, for example, the photographer would like to create a photograph in which the head of person "B" replaces the head of person "A." With the erasing feature activated and using the directional buttons 36 (FIG. 4), the photographer can designate a region, or regions, of the image from which to erase pixels, e.g., region 72 (FIG. 10B). In the present example, as shown in FIG. 10B, the designated region identified the pixels associated with the head of person "A." Once satisfied, the photographer selects the erase designator, erasing the identified pixels and returning to the overlay menu.
Once the remaining overlay queries are completed, the new overlay is presented for use. The original image of person "A" lifting the dumbbell remains in the camera's memory, unaltered. FIG. 10C depicts an image 74 created using this overlay, taken of person "B" in the manner discussed with reference to FIG. 2. The camera saves two digital files - the first file consists simply of an image of person "B," and the second file is the composite image 74. To further illustrate, at an event, a photograph is taken of a baby in a bonnet. The photographer selects the saved image for conversion into an overlay. Using the system, the photographer can adjust global opacity, regional opacity, or both. In the foregoing example, the image of the baby in the bonnet is selected and presented on the view screen. The baby's face is erased, and the result is saved as an overlay image. Alternatively, the portion of the image corresponding to the baby's face can be set at a prescribed opacity level, e.g., 50 percent, allowing a corresponding portion of a live image (or another stored image) to be viewable. The overlay is saved to the camera independently of the original image of the baby in the bonnet. Using this overlay, the photographer can now compose other faces into the bonnet. The resulting images can be very humorous or even informative.
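A minimal sketch of the region-erase step, assuming the overlay is held as an RGBA array whose alpha channel carries the per-pixel opacity (the representation and the function name are assumptions, not part of the disclosure):

```python
import numpy as np

def erase_region(overlay_rgba, top, left, height, width):
    """Set alpha to 0 in a rectangular region of an RGBA overlay (H x W x 4, uint8).

    When the overlay is later composited, pixels in the erased region leave
    the corresponding portion of the live image unaffected.
    """
    edited = overlay_rgba.copy()
    edited[top:top + height, left:left + width, 3] = 0
    return edited
```

Writing an intermediate value (e.g., 128) instead of 0 into the region's alpha channel would correspond to the 50 percent regional opacity mentioned above.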
6. Color Subtract Feature
With reference again to FIG. 2, the camera 20 further includes a feature of overlay generation from an image file that enables a user to delete a predominant color. The camera prompts the photographer to select a saved image that will be edited to create an image overlay with opacity characteristics. In this embodiment, the "overlay creation" mode includes an "automatic" setting. When the graphic file is selected and the automatic setting is chosen, a processor 41 (FIG. 3) determines the predominant single color in the image. In the exemplary embodiment, this feature can be activated via the overlay menu 34 by identifying the appropriate file at the first query 38 and selecting the "Color Subtract" option 69 for the fourth query 52. Then, the selected image file is displayed on the view-screen 22. The camera automatically identifies the predominant single color for removal and designates regions containing that color on the view screen with a graphical identification method such as "color flashing." Next, the photographer is prompted to confirm that the proper color has been identified: "subtract this region? Yes/no/cancel." If "yes" is selected, the predominant region is subtracted and the result may be stored as an image overlay. If "no" is selected, the region of secondary dominance is designated and the prompt is renewed. In this manner, the user can subtract large regions rapidly.
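A rough sketch of how a predominant color might be found and subtracted (coarse color-bin counting plus a tolerance test; the binning scheme, tolerance, and function names are assumptions, not the patented algorithm):

```python
import numpy as np

def predominant_color(image, bins=16):
    """Return the (r, g, b) center of the most common coarse color bin."""
    step = 256 // bins
    quantized = (image.reshape(-1, 3) // step).astype(np.int64)
    codes = (quantized[:, 0] * bins + quantized[:, 1]) * bins + quantized[:, 2]
    top = int(np.argmax(np.bincount(codes, minlength=bins ** 3)))
    r, g, b = top // (bins * bins), (top // bins) % bins, top % bins
    return (r * step + step // 2, g * step + step // 2, b * step + step // 2)

def subtract_color(image_rgba, color, tolerance=20):
    """Make pixels close to the given color fully transparent (alpha = 0)."""
    rgb = image_rgba[..., :3].astype(np.int16)
    near = np.all(np.abs(rgb - np.array(color, dtype=np.int16)) <= tolerance, axis=-1)
    out = image_rgba.copy()
    out[near, 3] = 0
    return out
```

Running subtract_color(img, predominant_color(img[..., :3])) on an RGBA copy of a stored image would leave only the non-dominant content opaque, as in the grid example that follows.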
To further illustrate, if the user subtracted the predominant gray background, a custom-designed grid becomes immediately available as an image overlay. The user would be able to compose live images against this grid. Because the grid was created at the same resolution as the digital sensor, the overlay could also be composited in fine detail to the final image and used immediately for many purposes, such as scientific measurement. Moreover, an image file in the camera's digital memory 40 could be modified in a similar manner. In one approach, a selected image could be modified to include a predominant single color, e.g., cyan, at desired locations throughout the image using the "modify" option 71 of the third query 46 of the overlay menu. This feature allows the photographer to interact with the image, adding colors, icons, and other indicia. The user could manually designate a color to be removed with a pointer or eyedropper cursor, as are well known in the art. Alternatively, blocks of a single color could be added to the image, covering portions of the image. Then, using the "Color Subtract" option 69, the designated color blocks can be removed. In another example, a user could download a photograph of Mt. Rushmore from the internet. On a personal computer, the user could operate a simple paint program to apply a color to one or more of the presidents' faces, preferably using a color not otherwise present in the image. As described above, the user could rapidly transfer the image to the camera and create an entertaining overlay image.
7. Edge Detect Feature
With reference to FIGS. 1, 2, and 15A - 15C, the camera 20 further includes a feature for overlay generation initiated by selecting "edge detect" 73 from the fourth query 52 of the overlay menu 34. This feature is configured to generate a line drawing based on an image file by detecting edges between regions of high contrast. To initiate, at the first query 38, the photographer selects a saved image that will be edited to create an image overlay with opacity characteristics. In this embodiment, the "overlay creation" mode includes an "automatic" setting. When the graphic file is selected and the automatic setting is chosen, a processor 41 (FIG. 3) utilizes an "edge-detect" algorithm to determine the location of edges in the image field. Such edges are algorithmically identified by determining where regions are changing color or tone along a path. Such edge-paths are typically displayed in a selectable or uncommon color. The edges could be displayed against the original image or against a neutral background. In the exemplary embodiment, the system presents three slider bars that allow the user to vary the tolerance of the edge hardness that is displayed, as well as the opacity of the edges and the color of the edge-delineating pixels. Optionally, a further step would allow the user to manually delete extraneous pixels, and a final step would allow the "edge map" to be saved as an overlay.
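A simplified sketch of such an edge-detect pass (plain finite-difference gradients and a hardness threshold; the specific algorithm, threshold, and names here are assumptions rather than the disclosed implementation):

```python
import numpy as np

def edge_overlay(gray, threshold=30, edge_color=(255, 0, 255), edge_opacity_pct=100):
    """Build an RGBA line overlay from a grayscale image (H x W, uint8).

    Edges are located where the local gradient magnitude exceeds `threshold`
    (the "edge hardness" tolerance); edge pixels are drawn in `edge_color`
    at the requested opacity, and everything else is fully transparent.
    """
    g = gray.astype(np.float32)
    dx = np.zeros_like(g)
    dy = np.zeros_like(g)
    dx[:, 1:-1] = g[:, 2:] - g[:, :-2]      # horizontal difference
    dy[1:-1, :] = g[2:, :] - g[:-2, :]      # vertical difference
    edges = np.hypot(dx, dy) > threshold

    overlay = np.zeros(gray.shape + (4,), dtype=np.uint8)
    overlay[edges, :3] = edge_color
    overlay[edges, 3] = round(255 * edge_opacity_pct / 100)
    return overlay
```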
The present example relates to photography for a catalog display. In catalogs, it is often desirable to minimize inconsistencies between photographs. Compositional consistency enhances the ability of the consumer to compare the merchandise. FIG. 15A depicts a photograph 75 of women's shoes. Traditionally, consistency among several such photographs would be achieved simply by marking shoe positions on the floor and mounting the camera on a tripod. However, if a floor mark was accidentally removed or a tripod was accidentally struck, re-establishment of the composition becomes problematic. The problem of re-establishing the composition is increased when multiple photo sessions occur at different times or in different places. In any event, the traditional method may properly identify the shoe arrangement but it will often fail to identify the proper leg position.
Using the "edge-detect" feature 73 (FIG. 2) in conjunction with the photograph 75 of FIG. 15 A, a line overlay 77 is created that can serve as to aid in composition for further images. When the photograph 75 is selected from digital memory and the automatic setting is chosen, a processor 41 (FIG. 3) utilizes an "edge-detect" algorithm to determine the location of edges in the image field. Such edges are algorithmically identified by determining where regions are changing color or tone along a path. Such edge-paths can be displayed in a selectable or uncommon color, for example. The edges can be displayed against the original image or against a neutral background, as depicted in image 79 of FIG. 15B. By means of multiple slider bars (not shown), the user could vary the tolerance of the edge hardness that would be displayed as well the opacity of the edges and the color of the edge-delineating pixels. A further step would allow the user to manually delete extraneous pixels as shown in FIG. 15C.
8. Compare Mode
With continued reference to FIGS. 1 - 2, the system of the camera 20 includes a "compare mode" that toggles between the selected overlay and the live image without application of the overlay. As mentioned above, compare mode can be selected at the third query 46 of the menu 34. In this mode, rather than a constant depiction, the overlay is iteratively presented. This iteration can take place automatically or be initiated manually, e.g., via a toggle button 55.
In the example presented, the overlay is opaque. Using the toggle button, the photographer can manually toggle between the stored image overlay and the live image in order to compare and make adjustments. In automatic mode, the camera will toggle at a predetermined or user-definable rate between the selected image and the live image. Optionally, the photographer can override the automatic toggling, e.g., via the toggle button. When the live image is sufficiently composed, the photographer could either take the shot or exit from the "compare mode" and continue making other adjustments prior to making the exposure.
9. Auto-Compositing Feature
With reference now to FIG. 16 through FIG. 2OB, an example is provided illustrating the auto-compositing feature of the system 21. The camera 20 includes a sensor assembly 25 configured to capture a digital representation of the camera's field-of-view. However, due to performance characteristics of the sensor assembly, i.e., exposure latitude, the captured image of the camera's field-of-view can be non-ideal, particularly when there is a large disparity between bright and dark regions. In these situations, the captured image can include over-exposed and under-exposed regions. The auto-compositing feature aids in creating a composite image free of such improperly exposed regions.
FIG. 16 depicts a flowchart for implementing the auto-compositing feature of system 21. In the exemplary embodiment, the system can composite images taken at various exposure settings to generate a desired image having disparity between bright and dark regions beyond the maximum latitude of the sensor assembly 25 of the camera. In this example, FIG. 17A depicts a desired image of a person standing in front of the corner of a building, having a large disparity between bright and dark regions. Bright light streams onto the building wall depicted in the right side of the photograph. On the left side of the photograph the building wall is in dark shadows, indicated by the cross-hatched lines. At the center of the photograph the person is in moderate light, perhaps shade, indicated by the single-hatched lines.
FIG. 17B is a histogram that classifies the brightness of the image of FIG. 17A by area. Such histograms typically plot darker values on the left and brighter values on the right. By coincidence, the dark values at the left of the photograph correspond primarily to the hump near the left of the histogram. The light values at the right of the photograph correspond primarily to the hump near the right of the histogram. The moderate light on the person corresponds to the center hump. The arrow at the bottom of the histogram illustrates the tremendously wide latitude required to capture this image, beyond the maximum latitude of the sensor assembly 25 of the camera.
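For illustration, the kind of brightness histogram and clipping check described above could be computed as follows (the Rec. 601 luminance weights and the clipping thresholds are assumptions, not taken from the disclosure):

```python
import numpy as np

def luminance(image):
    """Approximate per-pixel brightness of an H x W x 3 RGB image."""
    return 0.299 * image[..., 0] + 0.587 * image[..., 1] + 0.114 * image[..., 2]

def brightness_histogram(image, bins=64):
    """Histogram of brightness by pixel count, darkest values on the left."""
    counts, _ = np.histogram(luminance(image), bins=bins, range=(0, 255))
    return counts

def clipped_fractions(image, low=10, high=245):
    """Fractions of the frame that are nearly black or nearly white."""
    lum = luminance(image)
    return float((lum <= low).mean()), float((lum >= high).mean())
```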
FIGS. 18A, 19A, and 20A depict resulting images captured by camera 20, at various exposure settings. FIG. 18A represents the result of photographing the image with a non-ideal camera at a first exposure setting. The exposure has been set to capture the dark left side of the building. The remaining portions of the image are extremely overexposed. The camera's latitude at the first exposure setting is indicated by the arrow of FIG. 18B. Notably, this latitude is less than is required to capture the desired image of FIG. 17A. In FIG. 19A the exposure has been set to capture the moderate light of the person, but the details of the building are underexposed on the left and overexposed on the right. In FIG. 20A the exposure has been set to capture the bright side of the building at the right of the photograph. The left side and the person are underexposed.
Referring to the flowchart 100 (FIG. 16), the auto-compositing feature is initiated by identifying a potentially bad image, step 200. Such identification comprises an algorithm that identifies significant improperly exposed regions, as are evident in FIGS. 18A, 19A, and 20A. At step 202, the camera prompts the user to make a new exposure-bracketed attempt at the shot or to exit back to normal camera function. If the user chooses to attempt the improved image, further guidance prompts may be presented, as illustrated in step 204. Such guidance may include, for example, a suggestion to use a tripod. When ready, the user depresses the shutter, step 206.
In step 208, under the guidance of the processor 41 (FIG. 3), the camera exposes and saves multiple bracketed images, each at a distinct exposure setting. The processor examines all the captured images and selects a first image using an algorithm that typically gives the best results (step 210). In this example, the image of FIG. 18A is selected. In other embodiments, the system can randomly select one of the images for evaluation. In yet other embodiments, the system can utilize an algorithm that would evaluate and compare characteristics of the bracketed set in order to establish the most appropriate first image with which to begin the compositing process. Such selection algorithms could evaluate with respect to the compositing algorithms in order to determine the combination that would provide the most pleasing final images.
At step 212, the image is evaluated; if no improperly exposed regions are found, the process branches to saving the image at step 220. In the image of FIG. 18A, an overexposed region 213 is identified. In step 214, the overexposed region is measured against comparable regions at the corresponding location in the other bracketed images. In our example the overexposed region 213 of FIG. 18A would be replaced by the region 215 of FIG. 19A depicting the image of the person.
At step 216, the image is reevaluated by the processor to determine if additional compositing would improve it. In the present example, the system would still identify a large overexposed region on the right and, thus, would loop the processor back to step 214. Then, the system would composite the properly exposed right region 217 from FIG. 20A into the image. Returning to step 216, no further improvement would be possible without a re-shoot. A final evaluation at step 218 would determine if the final image might be improvable with a re-shoot. If so, the user would get a preview of the current result on the view screen and be prompted with the opportunity to re-shoot at step 222. If the user chose to re-shoot, the camera would make additional adjustments of exposure and number of shots in order to achieve a higher likelihood of success. If the camera determines that a re-shoot would not improve the shot, or if the user elects not to re-shoot, the image is saved at step 220. The compositing process would clearly benefit from algorithms that can smooth the transition between the composited sections; such algorithms are widely known. It is possible that the quality of such composites may, in many cases, obviate the need for later editing. In any case, the auto-composited image would provide satisfactory assurance to the photographer that the bracketed exposures contained adequate image data for later compositing.
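The overall compositing loop can be sketched informally as below. This per-pixel version, with assumed clipping thresholds and no seam smoothing, only illustrates the idea and is not the disclosed region-based algorithm:

```python
import numpy as np

def auto_composite(bracketed, low=10, high=245):
    """Merge a bracketed exposure series, replacing badly exposed pixels.

    Starts from the first frame; wherever it is nearly black or nearly white,
    the same pixels are taken from the next frame that exposed them properly.
    """
    def lum(img):
        return 0.299 * img[..., 0] + 0.587 * img[..., 1] + 0.114 * img[..., 2]

    result = bracketed[0].copy()
    for candidate in bracketed[1:]:
        bad = (lum(result) <= low) | (lum(result) >= high)       # over-/under-exposed
        ok = (lum(candidate) > low) & (lum(candidate) < high)    # properly exposed
        fixable = bad & ok
        result[fixable] = candidate[fixable]
    return result
```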
It should be appreciated from the foregoing that the present invention provides a digital camera comprising an image system having a unique combination of features that aid in generating creative, high-quality images. The system presents a variety of overlays, each having prescribed attributes, e.g., ranging in size, opacity, and functionality, from iconic overlays to full-scale overlays having varied opacity. The overlays can be used as an aid in composing a live image for digital capture, either as a constant or intermittent presence on the view-screen of the camera. In an exemplary embodiment, the camera can create a composite photograph in which a live image and the overlay are combined together. In an independent aspect, the camera can include an auto-compositing feature that aids in creating images free of improperly exposed regions. Thus, a photographer can create and combine images "on location" in a unique manner.
Although the invention has been disclosed in detail with reference only to the preferred embodiments, those skilled in the art will appreciate that various other embodiments can be provided without departing from the scope of the invention. Accordingly, the invention is defined only by the claims set forth below.

Claims

I CLAIM:
1. A handheld digital camera, comprising:
a processor;
a digital memory in communication with the processor;
a view-screen in communication with the processor and configured to present images from the field-of-view of the camera;
a plurality of files stored in the digital memory; and
a user interface executable by the processor, the interface configured to be presented on the view-screen to enable creation of overlays and selection of overlays from the plurality of files, the resulting overlay having a user-assignable opacity level and configured to be depicted on the view-screen in conjunction with image data corresponding to the field-of-view of the camera such that the opacity of the overlay affects the presentation of the image data on the view-screen.
2. A camera as defined in claim 1, wherein the camera can save two digital files to the digital memory upon taking a photograph, the first file comprising the image data corresponding to the field-of-view of the camera and the second file comprising a composite image of the overlay superimposed on the image data corresponding to the field-of-view of the camera.
3. A camera as defined in claim 1, wherein the user interface enables sizing and positioning of the overlay for use relative to the image data captured from the field-of-view of the camera.
4. A camera as defined in claim 1, wherein the plurality of files includes at least one file usable as an overlay sized to correspond to a default image size for the digital camera such that the assigned opacity level of the overlay affects the entirety of corresponding image data.
5. A camera as defined in claim 1, wherein the overlay can be presented on the view-screen as a constant or an intermittent presence.
6. A camera as defined in claim 1, wherein the user interface enables modification of an overlay selected from the plurality of files by designating the region of the selected overlay with a first color via the user input device, and removing the first color from the overlay.
7. A camera as defined in claim 1, wherein the user interface enables modification of an overlay selected by removing a first color from the overlay.
8. A camera as defined in claim 1 , wherein the plurality of files includes at least one print overlay having one or more regions of altered appearance, including an unaltered region and an altered region, the altered region of the overlay having an altered appearance relative to the unaltered region, the unaltered region having a prescribed aspect ratio corresponding to a photograph print size.
9. A camera as defined in claim 8, wherein the plurality of files includes multiple print overlays, wherein the open region of each print overlay has a distinct aspect ratio.
10. A camera as defined in claim 8, wherein the prescribed aspect ratio of the open region is 8x10.
11. A camera as defined in claim 1, wherein the user interface enables automated creation of a line overlay from an image file via an edge-detect feature in which the line overlay is a line drawing of the image of the image file.
12. A camera as defined in claim 11, wherein the edge-detect feature analyzes the image file by identifying borders between regions of differing color and tone of a prescribed value, and defines, in the line overlay, a line of prescribed opacity and color corresponding to the identified border.
13. A camera as defined in claim 11, wherein the user can set the prescribed value used for identifying the border between regions in the image file.
14. A camera as defined in claim 13, wherein the user can set the opacity value and the color for the line overlay.
15. A method of generating an overlay for use with a digital camera, comprising:
selecting a stored image from digital memory of a digital camera for use as an overlay atop a live image corresponding to the field-of-view of the camera;
assigning an opacity level to the overlay via a user input device of the digital camera;
presenting the image selected on a view-screen of the camera;
erasing a region of the stored image designated via the user input device such that, when used, a portion of the live image corresponding to the erased region is unaffected by the overlay; and
saving the resultant image to digital memory for use as an overlay.
16. A method as defined in claim 15, wherein the erasing step further comprises
designating the region with a first color via the user input device, and
removing the first color from the image.
17. A method as defined in claim 15, further comprising sequentially displaying (a) the image unaltered and (b) the image having the defined region erased, prior to the saving step.
18. A method as defined in claim 15, wherein the resultant image is sized to correspond to a default image size for the digital camera such that the assigned opacity level of the overlay affects the entirety of the live image.
19. A method as defined in claim 15, wherein the resultant image has two regions of differing opacity, including an open region and a shaded region, the shaded region of the overlay having one or more regions of altered appearance, including an unaltered region and an altered region, the altered region of the overlay having an altered appearance relative to the unaltered region, the unaltered region having a prescribed aspect ratio corresponding to a photograph print size.
20. A method of generating an overlay for use with a digital camera, comprising:
selecting an icon file from digital memory of a digital camera for use as an overlay atop a live image corresponding to the field-of-view of the camera;
presenting the selected icon on a view-screen of the camera against an alignment background;
positioning the icon with respect to the alignment background via a user input device of the digital camera such that, when used, a portion of the live image corresponding to the icon as positioned is affected by the overlay; and
saving the icon and associated positioning to digital memory for use as an overlay.
21. A method as defined in claim 20, further comprising the step of assigning an opacity level to the overlay via the user input device.
22. A method for image composition using a handheld digital camera, comprising:
capturing multiple digital images in automated sequence using a digital camera, each image captured at a distinct exposure setting, the digital camera having a processor assembly and a digital sensor assembly in communication with the processor, the sensor assembly having an exposure latitude beyond which a captured image will have over- or under-exposed regions;
analyzing a first digital image of the multiple digital images for over- or under-exposed regions;
identifying properly exposed regions in the remaining digital images of the multiple digital images corresponding in location to the over- or under-exposed regions of the first digital image, the identifying step performed by the processor assembly of the camera;
automated compositing of properly exposed regions identified in the first digital image and the remaining digital images of the multiple digital images, resulting in a composite image in which the identified regions replace the corresponding over- or under-exposed regions of the first digital image; and
storing the composite image in digital memory of the camera.
23. A method as defined in claim 22, further comprising, prior to the capturing multiple images step, analyzing an image taken as a single exposure for over- or under-exposed regions and, if found, prompting user to initiate the capturing multiple images step.
24. A method as defined in claim 22, further comprising storing the multiple digital images that were captured in automated sequence for later use.
25. A method as defined in claim 22, further comprising automated deletion of the multiple digital images following completion of the composite image.
26. A handheld digital camera configured to perform the method of claim 22.
27. A method as defined in claim 22, wherein the identifying and automated compositing steps further include (a) selecting a second digital image of the multiple digital images having at least one properly exposed region corresponding in location to an over- or under-exposed region of the first digital image;
(b) automated compositing of all regions from the second digital image corresponding in location to all over- or under-exposed regions of the first digital image with the remaining portions of the first digital image; and
(c) repeating steps (a) and (b) with the resultant image from the prior step (b) and a third digital image of the multiple digital images.
28. A method as defined in claim 27, further comprising, prior to the capturing multiple images step, analyzing an image taken as a single exposure for over- or under-exposed regions and, if found, prompting user to initiate the capturing multiple images step.
29. A handheld digital camera configured to perform the method of claim 27.
PCT/US2005/039894 2004-11-05 2005-11-02 Digital camera having system for digital image composition and related method WO2006052701A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP05821152A EP1808013A2 (en) 2004-11-05 2005-11-02 Digital camera having system for digital image composition and related method
JP2007539349A JP2008519505A (en) 2004-11-05 2005-11-02 Digital camera with digital image composition system and related method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US10/982,645 2004-11-05
US10/982,645 US7782384B2 (en) 2004-11-05 2004-11-05 Digital camera having system for digital image composition and related method

Publications (2)

Publication Number Publication Date
WO2006052701A2 true WO2006052701A2 (en) 2006-05-18
WO2006052701A3 WO2006052701A3 (en) 2006-11-16

Family

ID=35789248

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2005/039894 WO2006052701A2 (en) 2004-11-05 2005-11-02 Digital camera having system for digital image composition and related method

Country Status (5)

Country Link
US (1) US7782384B2 (en)
EP (1) EP1808013A2 (en)
JP (1) JP2008519505A (en)
CN (1) CN100579181C (en)
WO (1) WO2006052701A2 (en)

Families Citing this family (125)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100580188B1 (en) * 2004-01-28 2006-05-16 삼성전자주식회사 Method and apparatus for processing scanned image
US9124729B2 (en) 2005-01-31 2015-09-01 The Invention Science Fund I, Llc Shared image device synchronization or designation
US20070236505A1 (en) * 2005-01-31 2007-10-11 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Resampling of transformed shared image techniques
US8606383B2 (en) 2005-01-31 2013-12-10 The Invention Science Fund I, Llc Audio sharing
US20060174203A1 (en) 2005-01-31 2006-08-03 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Viewfinder for shared image device
US9489717B2 (en) 2005-01-31 2016-11-08 Invention Science Fund I, Llc Shared image device
US20060187228A1 (en) * 2005-01-31 2006-08-24 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Sharing including peripheral shared image device
US20060170956A1 (en) 2005-01-31 2006-08-03 Jung Edward K Shared image devices
US20060171603A1 (en) * 2005-01-31 2006-08-03 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Resampling of transformed shared image techniques
US9910341B2 (en) 2005-01-31 2018-03-06 The Invention Science Fund I, Llc Shared image device designation
US7920169B2 (en) 2005-01-31 2011-04-05 Invention Science Fund I, Llc Proximity of shared image devices
US9325781B2 (en) 2005-01-31 2016-04-26 Invention Science Fund I, Llc Audio sharing
US9082456B2 (en) 2005-01-31 2015-07-14 The Invention Science Fund I Llc Shared image device designation
US20060221197A1 (en) * 2005-03-30 2006-10-05 Jung Edward K Image transformation estimator of an imaging device
US7876357B2 (en) 2005-01-31 2011-01-25 The Invention Science Fund I, Llc Estimating shared image device operational capabilities or resources
US8902320B2 (en) 2005-01-31 2014-12-02 The Invention Science Fund I, Llc Shared image device synchronization or designation
US8749839B2 (en) * 2005-03-24 2014-06-10 Kofax, Inc. Systems and methods of processing scanned data
US9769354B2 (en) 2005-03-24 2017-09-19 Kofax, Inc. Systems and methods of processing scanned data
US7872675B2 (en) * 2005-06-02 2011-01-18 The Invention Science Fund I, Llc Saved-image management
US20070222865A1 (en) 2006-03-15 2007-09-27 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Enhanced video/still image correlation
US9621749B2 (en) 2005-06-02 2017-04-11 Invention Science Fund I, Llc Capturing selected image objects
US9191611B2 (en) 2005-06-02 2015-11-17 Invention Science Fund I, Llc Conditional alteration of a saved image
US9967424B2 (en) 2005-06-02 2018-05-08 Invention Science Fund I, Llc Data storage usage protocol
US7782365B2 (en) 2005-06-02 2010-08-24 Searete Llc Enhanced video/still image correlation
US9942511B2 (en) 2005-10-31 2018-04-10 Invention Science Fund I, Llc Preservation/degradation of video/audio aspects of a data stream
US8072501B2 (en) * 2005-10-31 2011-12-06 The Invention Science Fund I, Llc Preservation and/or degradation of a video/audio data stream
US9819490B2 (en) 2005-05-04 2017-11-14 Invention Science Fund I, Llc Regional proximity for shared image device(s)
US8233042B2 (en) * 2005-10-31 2012-07-31 The Invention Science Fund I, Llc Preservation and/or degradation of a video/audio data stream
US9093121B2 (en) 2006-02-28 2015-07-28 The Invention Science Fund I, Llc Data management of an audio data stream
US10003762B2 (en) 2005-04-26 2018-06-19 Invention Science Fund I, Llc Shared image devices
US9001215B2 (en) 2005-06-02 2015-04-07 The Invention Science Fund I, Llc Estimating shared image device operational capabilities or resources
US8681225B2 (en) 2005-06-02 2014-03-25 Royce A. Levien Storage access technique for captured data
US9451200B2 (en) 2005-06-02 2016-09-20 Invention Science Fund I, Llc Storage access technique for captured data
US8253821B2 (en) * 2005-10-31 2012-08-28 The Invention Science Fund I, Llc Degradation/preservation management of captured data
US9076208B2 (en) * 2006-02-28 2015-07-07 The Invention Science Fund I, Llc Imagery processing
US9167195B2 (en) * 2005-10-31 2015-10-20 Invention Science Fund I, Llc Preservation/degradation of video/audio aspects of a data stream
US8964054B2 (en) 2006-08-18 2015-02-24 The Invention Science Fund I, Llc Capturing selected image objects
US20070097090A1 (en) * 2005-10-31 2007-05-03 Battles Amy E Digital camera user interface
US20070120980A1 (en) * 2005-10-31 2007-05-31 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Preservation/degradation of video/audio aspects of a data stream
US20070203595A1 (en) * 2006-02-28 2007-08-30 Searete Llc, A Limited Liability Corporation Data management of an audio data stream
US20080001614A1 (en) * 2006-06-28 2008-01-03 Thorson Dean E Image Capture Device with Alignment Indicia
US7940955B2 (en) * 2006-07-26 2011-05-10 Delphi Technologies, Inc. Vision-based method of determining cargo status by boundary detection
USD609714S1 (en) 2007-03-22 2010-02-09 Fujifilm Corporation Electronic camera
US7697054B2 (en) * 2007-03-29 2010-04-13 Hewlett-Packard Development Company, L.P. Image Manipulator for a camera
EP2031867B1 (en) 2007-08-28 2015-07-08 SIR S.p.A. System and method of artificial viewing for robotised equipments
US8177441B2 (en) 2007-08-29 2012-05-15 Nintendo Co., Ltd. Imaging apparatus
JP4260215B1 (en) 2007-08-29 2009-04-30 任天堂株式会社 Imaging device
US8917985B2 (en) * 2007-08-29 2014-12-23 Nintendo Co., Ltd. Imaging apparatus
KR20090066368A (en) 2007-12-20 2009-06-24 삼성전자주식회사 Portable terminal having touch screen and method for performing function thereof
US8922518B2 (en) 2007-12-20 2014-12-30 Samsung Electronics Co., Ltd. Mobile terminal having touch screen and function controlling method of the same
US10503376B2 (en) 2007-12-20 2019-12-10 Samsung Electronics Co., Ltd. Method and apparatus for adjusting an image and control guides displayed on a display
US20090244301A1 (en) * 2008-04-01 2009-10-01 Border John N Controlling multiple-image capture
KR101467293B1 (en) * 2008-04-22 2014-12-02 삼성전자주식회사 A method for providing User Interface to display the menu related to the image to be photographed
JP4181211B1 (en) 2008-06-13 2008-11-12 任天堂株式会社 Information processing apparatus and startup program executed therein
US8130275B2 (en) 2008-06-13 2012-03-06 Nintendo Co., Ltd. Information-processing apparatus, and storage medium storing a photographing application launch program executed by information-processing apparatus
WO2010038296A1 (en) 2008-10-01 2010-04-08 任天堂株式会社 Information processing device, information processing system, boot program and storage medium storing same
JP5159588B2 (en) * 2008-12-05 2013-03-06 キヤノン株式会社 Image processing apparatus, image processing method, and computer program
KR20100070043A (en) * 2008-12-17 2010-06-25 삼성전자주식회사 Method for displaying scene recognition of digital image signal processing apparatus, medium for recording the method and digital image signal processing apparatus applying the method
US9767354B2 (en) 2009-02-10 2017-09-19 Kofax, Inc. Global geographic information retrieval, validation, and normalization
US9576272B2 (en) 2009-02-10 2017-02-21 Kofax, Inc. Systems, methods and computer program products for determining document validity
US10613704B2 (en) 2009-06-03 2020-04-07 Savant Systems, Llc Small screen virtual room-based user interface
US10775960B2 (en) * 2009-06-03 2020-09-15 Savant Systems, Inc. User generated virtual room-based user interface
KR101719842B1 (en) 2009-06-03 2017-03-24 사반트 시스템즈 엘엘씨 Virtual room-based light fixture and device control
EP2284800B1 (en) 2009-07-23 2018-09-05 Samsung Electronics Co., Ltd. Method and system for creating an image
JP5026484B2 (en) * 2009-09-17 2012-09-12 シャープ株式会社 Portable terminal device, image output device, captured image processing system, control method for portable terminal device, image output method, program, and recording medium
US9495697B2 (en) * 2009-12-10 2016-11-15 Ebay Inc. Systems and methods for facilitating electronic commerce over a network
JP5558852B2 (en) * 2010-01-28 2014-07-23 キヤノン株式会社 Information processing apparatus, control method thereof, and program
KR101720771B1 (en) * 2010-02-02 2017-03-28 삼성전자주식회사 Digital photographing apparatus, method for controlling the same, and recording medium storing program to execute the method
US8558913B2 (en) * 2010-02-08 2013-10-15 Apple Inc. Capture condition selection from brightness and motion
JP2011211493A (en) * 2010-03-30 2011-10-20 Sony Corp Imaging apparatus, display method, and program
US20120019686A1 (en) * 2010-07-23 2012-01-26 Casio Computer Co., Ltd. Image synthesizing device, image synthesizing method and computer readable medium
US20120038663A1 (en) * 2010-08-12 2012-02-16 Harald Gustafsson Composition of a Digital Image for Display on a Transparent Screen
JP5170226B2 (en) 2010-12-10 2013-03-27 カシオ計算機株式会社 Image processing apparatus, image processing method, and program
US10631712B2 (en) * 2011-02-10 2020-04-28 Karl Storz Imaging, Inc. Surgeon's aid for medical display
US11412998B2 (en) 2011-02-10 2022-08-16 Karl Storz Imaging, Inc. Multi-source medical display
US10674968B2 (en) * 2011-02-10 2020-06-09 Karl Storz Imaging, Inc. Adjustable overlay patterns for medical display
US9088727B2 (en) * 2011-04-06 2015-07-21 Pelco, Inc. Spatially-varying flicker detection
JP5762115B2 (en) * 2011-04-28 2015-08-12 キヤノン株式会社 Imaging apparatus and control method thereof
SE1150505A1 (en) 2011-05-31 2012-12-01 Mobile Imaging In Sweden Ab Method and apparatus for taking pictures
JP2013048318A (en) * 2011-08-29 2013-03-07 Nintendo Co Ltd Information processor, information processing program, information processing method, and information processing system
JP5846075B2 (en) * 2011-08-31 2016-01-20 辰巳電子工業株式会社 Image data providing device, photography game device, image data providing system, and image generation method
JP5895409B2 (en) 2011-09-14 2016-03-30 株式会社リコー Imaging device
US10146795B2 (en) 2012-01-12 2018-12-04 Kofax, Inc. Systems and methods for mobile image capture and processing
US8855375B2 (en) 2012-01-12 2014-10-07 Kofax, Inc. Systems and methods for mobile image capture and processing
US9167009B2 (en) * 2012-05-08 2015-10-20 International Business Machines Corporation Presenting data to electronic meeting participants
US20130317901A1 (en) * 2012-05-23 2013-11-28 Xiao Yong Wang Methods and Apparatuses for Displaying the 3D Image of a Product
CN102769775A (en) * 2012-06-12 2012-11-07 严幸华 System, server and method for providing overlay images
KR20130142310A (en) * 2012-06-19 2013-12-30 삼성전자주식회사 Method and apparatus for image change in electronic device
US9025066B2 (en) 2012-07-23 2015-05-05 Adobe Systems Incorporated Fill with camera ink
US11086196B2 (en) * 2012-08-31 2021-08-10 Audatex North America, Llc Photo guide for vehicle
JP6090679B2 (en) * 2013-02-14 2017-03-15 パナソニックIpマネジメント株式会社 Electronic mirror device
WO2014160426A1 (en) 2013-03-13 2014-10-02 Kofax, Inc. Classifying objects in digital images captured using mobile devices
US9355312B2 (en) 2013-03-13 2016-05-31 Kofax, Inc. Systems and methods for classifying objects in digital images captured using mobile devices
US9208536B2 (en) 2013-09-27 2015-12-08 Kofax, Inc. Systems and methods for three dimensional geometric reconstruction of captured image data
US10068120B2 (en) * 2013-03-15 2018-09-04 Apple Inc. High dynamic range fingerprint sensing
CN103312981A (en) * 2013-03-22 2013-09-18 中科创达软件股份有限公司 Synthetic multi-picture taking method and shooting device
US20140316841A1 (en) 2013-04-23 2014-10-23 Kofax, Inc. Location-based workflows and services
KR102183448B1 (en) * 2013-04-26 2020-11-26 삼성전자주식회사 User terminal device and display method thereof
DE202014011407U1 (en) 2013-05-03 2020-04-20 Kofax, Inc. Systems for recognizing and classifying objects in videos captured by mobile devices
CN103338299B (en) * 2013-06-06 2015-07-29 腾讯科技(深圳)有限公司 A kind of image processing method and device, terminal
CN104346157A (en) * 2013-08-06 2015-02-11 腾讯科技(深圳)有限公司 Picture processing method and device and terminal equipment
JP5761272B2 (en) * 2013-08-06 2015-08-12 カシオ計算機株式会社 Imaging apparatus, imaging method, and program
KR101523843B1 (en) * 2013-09-24 2015-05-29 한국 한의학 연구원 Apparatus and method for evaluating reproducibility of tongue diagnosis device
US9292175B2 (en) 2013-11-08 2016-03-22 Minted, Llc Vendor website GUI for marketing greeting cards
US9478054B1 (en) 2013-11-09 2016-10-25 Google Inc. Image overlay compositing
JP2016538783A (en) 2013-11-15 2016-12-08 コファックス, インコーポレイテッド System and method for generating a composite image of a long document using mobile video data
WO2015134939A2 (en) * 2014-03-07 2015-09-11 North Main Group, Inc. Providing a frame of reference for images
KR102105961B1 (en) * 2014-05-13 2020-05-28 엘지전자 주식회사 Mobile terminal and method for controlling the same
USD757807S1 (en) * 2014-05-30 2016-05-31 Microsoft Corporation Display screen with graphical user interface
JP2016046676A (en) * 2014-08-22 2016-04-04 株式会社リコー Imaging apparatus and imaging method
US9760788B2 (en) 2014-10-30 2017-09-12 Kofax, Inc. Mobile document detection and orientation based on reference object characteristics
JP6582466B2 (en) * 2015-03-17 2019-10-02 フリュー株式会社 Image processing apparatus, image processing method, and image processing program
CN104767939A (en) * 2015-04-03 2015-07-08 广州市久邦数码科技有限公司 Method and system for shape-framed photographing
US10242285B2 (en) 2015-07-20 2019-03-26 Kofax, Inc. Iterative recognition-guided thresholding and data extraction
JP2018535568A (en) * 2015-09-02 2018-11-29 サムロール エルエルシー Camera system and method for registering images and presenting a series of aligned images
CN105208288A (en) * 2015-10-21 2015-12-30 维沃移动通信有限公司 Photo taking method and mobile terminal
JP2017108309A (en) * 2015-12-10 2017-06-15 オリンパス株式会社 Imaging apparatus and imaging method
US9779296B1 (en) 2016-04-01 2017-10-03 Kofax, Inc. Content-based detection and three dimensional geometric reconstruction of objects in image and video data
WO2017213685A1 (en) * 2016-06-08 2017-12-14 Google Llc Generating a composite image from a physical item
KR20220156101A (en) * 2016-11-01 2022-11-24 스냅 인코포레이티드 Fast video capture and sensor adjustment
US11317028B2 (en) * 2017-01-06 2022-04-26 Appsure Inc. Capture and display device
US10803350B2 (en) 2017-11-30 2020-10-13 Kofax, Inc. Object detection and image cropping using a multi-detector approach
KR20220062031A (en) 2019-09-11 2022-05-13 사반트 시스템즈, 인크. 3D Virtual Room-Based User Interface for Home Automation Systems
US11722779B2 (en) 2021-06-22 2023-08-08 Snap Inc. Viewfinder ring flash
US11683592B2 (en) * 2021-06-30 2023-06-20 Snap Inc. Adaptive front flash view

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5309243A (en) * 1992-06-10 1994-05-03 Eastman Kodak Company Method and apparatus for extending the dynamic range of an electronic imaging system
EP0853426A2 (en) * 1997-01-10 1998-07-15 Casio Computer Co., Ltd. Image pickup apparatus and method for composing images
US5828793A (en) * 1996-05-06 1998-10-27 Massachusetts Institute Of Technology Method and apparatus for producing digital images having extended dynamic ranges
WO1999014941A2 (en) * 1997-09-17 1999-03-25 Flashpoint Technology, Inc. A method and system for translating stamp characteristics
US20020176011A1 (en) * 2001-05-22 2002-11-28 Fuji Photo Film Co., Ltd. On-screen device for subject of interest in portable electronic device, and method of controlling same
US6606117B1 (en) * 1997-09-15 2003-08-12 Canon Kabushiki Kaisha Content information gathering apparatus system and method
US20030169350A1 (en) * 2002-03-07 2003-09-11 Avi Wiezel Camera assisted method and apparatus for improving composition of photography
WO2003083773A2 (en) * 2002-03-27 2003-10-09 The Trustees Of Columbia University In The City Of New York Imaging method and system
US20040012702A1 (en) * 2002-07-08 2004-01-22 Casio Computer Co., Ltd. Camera apparatus, photographing method and a storage medium that records method of photographing
US20040207734A1 (en) * 1998-12-03 2004-10-21 Kazuhito Horiuchi Image processing apparatus for generating a wide dynamic range image
EP1526727A1 (en) * 2002-06-05 2005-04-27 Seiko Epson Corporation Digital camera and image processing device

Family Cites Families (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4837633A (en) * 1987-06-22 1989-06-06 Parra Jorge M Electronic boundary framing device and method
US5019854A (en) * 1988-09-16 1991-05-28 Minolta Camera Kabushiki Kaisha Display system for displaying information in the viewfinder of a camera
JPH0698232A (en) * 1992-09-10 1994-04-08 Canon Inc Image recognizing device and image pickup device
JP3037140B2 (en) * 1996-06-13 2000-04-24 日本電気オフィスシステム株式会社 Digital camera
US5913088A (en) * 1996-09-06 1999-06-15 Eastman Kodak Company Photographic system capable of creating and utilizing applets on photographic film
US5689742A (en) * 1996-10-11 1997-11-18 Eastman Kodak Company Full frame annotation system for camera
JP3690024B2 (en) * 1996-12-25 2005-08-31 カシオ計算機株式会社 Printing apparatus and captured image printing method using printing apparatus
US5897228A (en) * 1997-02-28 1999-04-27 Eastman Kodak Company Camera with low cost interchangeable pushbutton annotation
JPH1118083A (en) * 1997-06-25 1999-01-22 Sony Corp Digital signal coding method and system, signal recording medium and signal transmission method
US6532039B2 (en) * 1997-09-17 2003-03-11 Flashpoint Technology, Inc. Method and system for digital image stamping
US5873007A (en) * 1997-10-28 1999-02-16 Sony Corporation Picture composition guidance system
US6504575B1 (en) 1998-02-27 2003-01-07 Flashpoint Technology, Inc. Method and system for displaying overlay bars in a digital imaging device
US6486914B1 (en) * 1998-02-27 2002-11-26 Flashpoint Technology, Inc. Method and system for controlling user interaction in a digital imaging device using dynamic overlay bars
JPH11298764A (en) * 1998-04-14 1999-10-29 Fuji Photo Film Co Ltd Digital still camera with composite image display function
JP2001177764A (en) * 1999-12-17 2001-06-29 Canon Inc Image processing unit, image processing method and storage medium
US6456323B1 (en) * 1999-12-31 2002-09-24 Stmicroelectronics, Inc. Color correction estimation for panoramic digital camera
US6677981B1 (en) * 1999-12-31 2004-01-13 Stmicroelectronics, Inc. Motion play-back of still pictures comprising a panoramic view for simulating perspective
JP2002189164A (en) * 2000-12-21 2002-07-05 Minolta Co Ltd Optical system controller, optical system control method, and recording medium
JP2002279420A (en) * 2001-03-16 2002-09-27 Ricoh Co Ltd Image processing method and device, and recording medium storing a program for causing a computer to perform the processing of the device
US6539177B2 (en) * 2001-07-17 2003-03-25 Eastman Kodak Company Warning message camera and method
US7345774B2 (en) * 2001-10-26 2008-03-18 Hewlett-Packard Development Company, L.P. Apparatus and method for adapting image sensor aspect ratio to print aspect ratio in a digital image capture appliance
JP2003244727A (en) * 2002-02-13 2003-08-29 Pentax Corp Stereoscopic image pickup system
JP2003298905A (en) * 2002-04-02 2003-10-17 Mitsubishi Plastics Ind Ltd Digital camera
JP2004032076A (en) * 2002-06-21 2004-01-29 Canon Inc Imaging apparatus for printing system
JP2004134950A (en) * 2002-10-09 2004-04-30 Sony Corp Image compositing method and image compositing apparatus
US7085413B2 (en) * 2003-04-04 2006-08-01 Good News Enterprises Limited Image background detection and removal
US20050024517A1 (en) * 2003-07-29 2005-02-03 Xerox Corporation Digital camera image template guide apparatus and method thereof

Also Published As

Publication number Publication date
CN100579181C (en) 2010-01-06
US20060098112A1 (en) 2006-05-11
CN101053248A (en) 2007-10-10
US7782384B2 (en) 2010-08-24
EP1808013A2 (en) 2007-07-18
WO2006052701A3 (en) 2006-11-16
JP2008519505A (en) 2008-06-05

Similar Documents

Publication Publication Date Title
US7782384B2 (en) Digital camera having system for digital image composition and related method
US7034881B1 (en) Camera provided with touchscreen
US6539177B2 (en) Warning message camera and method
US6970199B2 (en) Digital camera using exposure information acquired from a scene
US6577821B2 (en) Camera having oversized imager and method
US6930718B2 (en) Revised recapture camera and method
US6526234B1 (en) Revision suggestion camera and method
US20070291338A1 (en) Photo editing menu systems for digital cameras
US20070296830A1 (en) Cameras, other imaging devices, and methods having non-uniform image remapping using a small data-set of distortion vectors
JP3777922B2 (en) Digital imaging apparatus, image processing system including the same, image processing apparatus, digital imaging method, and recording medium
JP2003116046A (en) Image correction camera and method therefor
US20010022860A1 (en) Image sensing device having image combining function and method for combining images in said image sensing device
JP2003299025A (en) User selectable image preprocessing system for digital camera
US20080088718A1 (en) Template Creator For Digital Cameras
JP3993457B2 (en) Digital camera
JP4406461B2 (en) Imaging device
JP2003333378A (en) Imaging apparatus, method for displaying luminance distribution diagram, and control program
US7889242B2 (en) Blemish repair tool for digital photographs in a camera
US20080123953A1 (en) Digital camera with histogram zoom
TW201108155A (en) Human face image processing method
McCollough Complete guide to high dynamic range digital photography
Gibson Exposure and Understanding the Histogram
JP2003333380A (en) Imaging apparatus, method for confirming photographed image, and program
JP2008103831A (en) Imaging apparatus
US7920168B2 (en) Systems and methods of customizing a color palette on a digital camera

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KM KN KP KR KZ LC LK LR LS LT LU LV LY MA MD MG MK MN MW MX MZ NA NG NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SM SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU LV MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DPE1 Request for preliminary examination filed after expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2005821152

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 200580037394.7

Country of ref document: CN

WWE Wipo information: entry into national phase

Ref document number: 2007539349

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

WWP Wipo information: published in national office

Ref document number: 2005821152

Country of ref document: EP