US20150043837A1 - Method for editing images in a mobile terminal - Google Patents


Info

Publication number
US20150043837A1
US 2015/0043837 A1 (application US 14/524,582)
Authority
US
United States
Prior art keywords
image
control section
added
effects
icon
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US14/524,582
Inventor
Seung-A Nam
Chung-Kyu Lee
Un-Kyong Lee
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Samsung Electronics Co Ltd
Original Assignee
Samsung Electronics Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from KR 10-2005-0040794 (granted as KR 100606076 B1)
Application filed by Samsung Electronics Co Ltd
Priority to US 14/524,582
Publication of US 2015/0043837 A1

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 11/00: 2D [Two Dimensional] image generation
    • G06T 11/60: Editing figures and text; Combining figures or text
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10004: Still image; Photographic image
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20092: Interactive image processing based on input by user

Definitions

  • the present invention relates to a method for editing images in a mobile terminal. More particularly, the present invention relates to a method for adding various image effects to an existing image, selectively deleting the added image effects and storing an image with image effects added.
  • Future mobile communication terminals will surpass current mobile phones in providing high-speed transmission of packet data and image data over voice channels.
  • Mobile terminals having functions to send or receive image data can store an image received from a base station or transmit any acquired image to the base station.
  • Mobile terminals with an embedded camera can take pictures and display them on a display section.
  • the camera may comprise a camera sensor such as a charge coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor.
  • the display section may comprise a Liquid Crystal Display (LCD). Due to the tendency toward compact-sized mobile terminals, cameras embedded in the mobile terminals are also becoming smaller.
  • the mobile terminals can display the photographed pictures as moving or still pictures or can send the photographed pictures to the base station.
  • image editing functions that allow users to edit an image on a mobile terminal have also been developed. Such functions include zooming-in or zooming-out of a picture stored in the mobile terminal or zooming-in or zooming-out of a composite of two or more pictures.
  • Another image editing function is the creation of various image effects over an existing image, which will enable a wider range of services on mobile terminals.
  • an object of the present invention is to provide a method for adding various image effects to an existing image in a mobile terminal.
  • Another object of the present invention is to provide a method for selectively deleting image effects added to an existing image in a mobile terminal.
  • Still another object of the present invention is to provide a method for storing image information of an image with various image effects added in a mobile terminal.
  • Still another object of the present invention is to provide a method for outputting and displaying an image corresponding to image information stored in a mobile terminal.
  • Still another object of the present invention is to provide a method for combining an existing image with preset image effects and displaying the combined image on a mobile terminal.
  • a method for adding image effects to an existing image in a mobile terminal comprises: displaying an image; selecting a frame and adding the selected frame to the image; selecting at least one icon and moving the selected icon to a desired location on the image; and inputting at least one text message and moving the input text message to a desired location on the image.
  • a method for adding image effects to an existing image in a mobile terminal comprises: displaying an image; when an addition of a frame is selected from an image editing menu, adding a selected frame to the image; when an addition of an icon is selected from the image editing menu, adding a selected icon to the image and moving the selected icon to a desired location on the image with the frame; and when addition of text is selected from the image editing menu, inputting at least one text message and moving the input text message to a desired location on the image with the frame and the icon.
  • a method for modifying or deleting image effects added to an existing image in a mobile terminal comprises: displaying an image; when a focus function is selected from an image editing menu, displaying all image effects added to the image; randomly selecting one or more of the displayed image effects and deleting each selected image effect; and randomly selecting one or more of the displayed image effects and modifying each selected image effect.
  • a method for deleting image effects added to an existing image in a mobile terminal comprises: displaying an image with image effects added; when deletion of an image effect is selected from an image editing menu, displaying the image effects added to the image; randomly selecting one or more of the displayed image effects to be deleted; when a frame is selected from the displayed image effects, deleting the selected frame and displaying the image without the frame; when an icon is selected from the displayed image effects, deleting the selected icon and displaying the image without the icon; and when text is selected from the displayed image effects, deleting the selected text and displaying the image without the text.
  • a method for modifying image effects added to an existing image in a mobile terminal comprises: displaying an image with image effects added; when modification of an image effect is selected from an image editing menu, displaying the image effects added to the image; randomly selecting one or more of the displayed image effects to be modified; when a frame is selected from the displayed image effects, replacing the selected frame with another frame; when an icon is selected from the displayed image effects, relocating the selected icon on the image; and when text is selected from the displayed image effects, changing the selected text to a different text message, color, size or location.
  • a method for adding image effects to an existing image and selectively modifying or deleting the added image effects in a mobile terminal comprises: displaying an image; selecting a frame and adding the selected frame to the image; selecting at least one icon and adding the selected icon to the image; inputting at least one text message and adding the input text message to the image; repeating the addition of the image effects; and randomly selecting one or more of the image effects added to the image and deleting or modifying each selected image effect.
  • a method for storing an image comprises extracting information about an image effect added to the image when a save image option is selected; and storing image information comprising the extracted image effect information and original image information.
  • a method for displaying an image on a mobile terminal comprises searching for image information of a selected image; extracting an original image and an image effect based on the detected image information; and displaying the extracted original image with the extracted image effect added.
  • FIG. 1 is a view illustrating the structure of a mobile terminal according to an embodiment of the present invention;
  • FIG. 2 is a flow chart illustrating a process of adding image effects to an existing image and selectively deleting the added image effects in a mobile terminal according to an embodiment of the present invention;
  • FIG. 3 is a flow chart illustrating the addition of a frame to an existing image as an image effect shown in FIG. 2;
  • FIGS. 4A to 4C are images illustrating the process shown in FIG. 3;
  • FIG. 5 is a flow chart illustrating the addition of an icon to an existing image as another image effect shown in FIG. 2;
  • FIGS. 6A to 6C are images illustrating the process shown in FIG. 5;
  • FIG. 7 is a flow chart illustrating the addition of text to an existing image as still another image effect shown in FIG. 2;
  • FIGS. 8A to 8C are images illustrating the process shown in FIG. 7;
  • FIGS. 9A to 9B are flow charts illustrating the deletion of an image effect in the process shown in FIG. 2;
  • FIGS. 10A to 10D are images illustrating the process shown in FIG. 9;
  • FIG. 11 is a flow chart illustrating a process of storing the image effects-added image shown in FIG. 2; and
  • FIGS. 12A and 12B are flow charts illustrating the output and display of an image stored in a mobile terminal according to an embodiment of the present invention.
  • FIG. 1 shows the structure of a mobile terminal equipped with a camera according to an embodiment of the present invention.
  • a radio frequency (RF) section 123 performs a wireless communication function.
  • the RF section 123 comprises a RF transmitter (not shown) for performing upward conversion and amplification of the frequency of a transmitted signal and an RF receiver (not shown) for amplifying a received signal with low noise and performing downward conversion of the frequency of the signal.
  • a modem 120 comprises a transmitter (not shown) for coding and modulating a transmitted signal and a receiver (not shown) for demodulating and decoding a received signal.
  • An audio processor 125 may comprise a codec which comprises a data codec for processing packet data and an audio codec for processing an audio signal such as a speech signal.
  • the audio processor 125 converts a digital audio signal output from the modem 120 into an analog signal through the audio codec and reproduces the analog signal. Also, the audio processor 125 converts an analog audio signal generated from a microphone into a digital audio signal and transmits the digital audio signal to the modem 120 .
  • the codec can be provided as an independent element or included in a control section 110 .
  • a memory 130 may comprise a program memory and a data memory.
  • the program memory comprises programs for controlling general operations of the mobile terminal and those for controlling the addition or selective deletion of image effects, and the storage and display of an image with image effects added according to an embodiment of the present invention.
  • the data memory temporarily stores data generated during implementation of the above programs.
  • the memory 130 may store various types of image effects and images with image effects added according to an embodiment of the present invention.
  • the memory 130 may store only image information of the images with image effects added according to an embodiment of the present invention.
  • the memory 130 may store only image effect information of the image effects added to existing images according to an embodiment of the present invention.
  • the image information comprises original image information and image effect information.
  • the original image information refers to a stored original image number.
  • the image effect information may be frame information, icon information and/or text information according to the types of image effects.
  • the frame information refers to a frame number representing the type of frame added to the original image.
  • the icon information comprises an icon number and an icon position value of the icon added to the original image.
  • the text information comprises a message, color, size and position value of the text added to the original image.
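  • As an illustration only, the image information described above could be modeled as in the following sketch; the Kotlin names ImageInfo, ImageEffect and Position are hypothetical and are not part of the patent disclosure.

```kotlin
// Hypothetical model of the stored image information: only identifiers and
// parameters are kept, never a second rendered copy of the picture.
data class Position(val x: Int, val y: Int)

sealed class ImageEffect {
    // Frame information: a frame number representing the type of frame added.
    data class Frame(val frameNumber: Int) : ImageEffect()

    // Icon information: an icon number and the icon's position on the image.
    data class Icon(val iconNumber: Int, val position: Position) : ImageEffect()

    // Text information: message, color, size and position of the added text.
    data class Text(
        val message: String,
        val color: Int,
        val size: Int,
        val position: Position
    ) : ImageEffect()
}

// Image information = original image information (a stored image number)
// plus the image effect information listed above.
data class ImageInfo(
    val originalImageNumber: Int,
    val effects: List<ImageEffect> = emptyList()
)
```

  • Keeping only these parameters, rather than a second rendered bitmap, is what allows many edited versions of an image to be stored with little additional memory, as the description later notes.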
  • a control section 110 controls the overall operations of the mobile terminal.
  • the control section 110 may comprise the modem 120 and the codec. Under the control of the control section 110 , a selected image effect can be added to an existing image according to an embodiment of the present invention.
  • an added image effect can be deleted from the image according to an embodiment of the present invention.
  • only image information of an image with image effects added can be stored according to an embodiment of the present invention.
  • only image effect information of image effects added to an existing image can be stored according to an embodiment of the present invention.
  • control section 110 controls the extraction of the original image and image effects corresponding to the image information and display of the original image with the image effects added.
  • under the control of the control section 110, various existing images can be combined and displayed with preset image effects according to an embodiment of the present invention.
  • a camera module 140 is used to photograph an object.
  • the camera module 140 comprises a camera sensor for converting a photographed optical signal into an electrical signal and a signal processor for converting an analog image signal photographed by the camera sensor into digital data.
  • the signal processor can be a digital signal processor (DSP).
  • the camera sensor and the signal processor can be either integrated into a single element or separated into independent elements.
  • An image processor 150 generates picture data for displaying an image signal output from the camera module 140 .
  • the image processor 150 processes image signals output from the camera module 140 in frames. Also, the image processor 150 adjusts the frame image data to conform to the size and features of a display section 160 and outputs the adjusted frame image data.
  • the image processor 150 comprising an image codec compresses the frame image data displayed on the display section 160 in a preset manner or restores the compressed frame image data to the original frame image data.
  • the image codec is selected from a variety of still or moving picture codecs, such as Joint Photographic Experts Group (JPEG) codec, Moving Picture Experts Group 4 (MPEG4) codec or Wavelet codec. If the image processor 150 has an on screen display (OSD) function, it can output OSD data according to the displayed picture size under the control of the control section 110 .
  • the display section 160 displays image data output from the image processor 150 or user data output from the control section 110 .
  • the display section 160 may comprise an LCD controller, a memory for storing image data and an LCD device.
  • when the LCD is a touch screen, it can serve as an input section.
  • the display section 160 can display an image with an image effect or effects added according to an embodiment of the present invention.
  • a key input section 127 is provided with keys for inputting numbers and characters and function keys for setting up various functions.
  • the key input section 127 may also include function keys for adding or deleting an image effect according to an embodiment of the present invention.
  • the control section 110 will detect the mode and will process the dialed information received through the modem 120 .
  • the control section 110 converts the dialed information into an RF signal through the RF section 123 and outputs the RF signal.
  • a reply signal generated from a recipient is detected by the RF section 123 and the modem 120 .
  • the audio processor 125 then forms a voice communication path so that the user can communicate with the recipient.
  • the control section 110 controls the audio processor 125 to generate a ringing signal.
  • When the user replies to the incoming call, the control section 110 detects the reply and controls the audio processor 125 to form a voice communication path so that the user can receive the incoming call. Although voice communications in the incoming or outgoing call mode have been described, the control section 110 can also perform data communications to receive or transmit packet data or image data. In a standby mode or a messaging mode, the control section 110 displays text data processed by the modem 120 on the display section 160.
  • The image editing function of the mobile terminal according to an embodiment of the present invention will be explained in detail with reference to FIGS. 2 to 10.
  • Although frames, icons and text will be described below as examples of image effects, any other types of image effects can be used according to an embodiment of the present invention.
  • FIG. 2 is a flow chart illustrating a process of adding image effects to an existing image and selectively deleting the added image effects in a mobile terminal according to an embodiment of the present invention.
  • FIG. 3 is a flow chart illustrating the addition of a frame to an existing image as an image effect shown in FIG. 2 .
  • FIGS. 4A to 4C are images illustrating the process shown in FIG. 3 .
  • FIG. 5 is a flow chart illustrating the addition of an icon to an existing image as another image effect shown in FIG. 2.
  • FIGS. 6A to 6C are images illustrating the process shown in FIG. 5 .
  • FIG. 7 is a flow chart illustrating the addition of text to an existing image as still another image effect shown in FIG. 2.
  • FIGS. 8A to 8C are images illustrating the process shown in FIG. 7 .
  • FIGS. 9A to 9B are flow charts illustrating the deletion of another image effect in the process of FIG. 2 .
  • FIGS. 10A to 10D are images illustrating the process shown in FIG. 9 .
  • FIG. 11 is a flow chart illustrating a process of storing the image effects-added image shown in FIG. 2 .
  • FIGS. 12A and 12B are flow charts illustrating the output and display of an image stored in a mobile terminal according to an embodiment of the present invention. Referring to FIG. 2, the mobile terminal displays an image selected by the user from among the stored images on the display section 160 at step 201.
  • control section 110 detects the key input at step 202 and proceeds to step 203 .
  • control section 110 detects the selection at step 204 and performs a function to add a selected frame to the displayed image at step 300 .
  • FIG. 4A shows the image selected and displayed at step 201 .
  • FIG. 4B shows various types of frames displayed at step 301 .
  • the control section 110 detects the selection at step 302 and proceeds to step 303 to display the original image with the selected frame added.
  • FIG. 4C shows the image with the selected frame added thereto.
  • the control section 110 detects the selection at step 304 and proceeds to step 305 .
  • the control section 110 detects if the user presses a frame change key, for example, a direction key, to apply another frame and displays the image with different frames added in turn.
  • the control section 110 detects the selection at step 306 and displays the image with the newly-selected frame at step 307 .
  • the control section 110 detects the selection at step 308 and proceeds with step 309 to display the original image without the frame.
  • control section 110 detects the selection at step 205 and performs a function to add the selected icon to the displayed original image at step 500 .
  • Step 500 for adding an icon will be explained in further detail with reference to FIGS. 5 and 6 .
  • the control section 110 detects the selection at step 205 and displays types of icons at step 501 .
  • FIG. 6A shows the original image selected and displayed at step 201 .
  • FIG. 6B shows various types of icons displayed at step 501 .
  • the control section 110 detects the selection at step 502 and proceeds to step 503 to display the original image with the selected icon added.
  • FIG. 6C shows the image with the selected icon added thereto.
  • the control section 110 detects the selection at step 504 and proceeds to step 505 .
  • the control section 110 detects each time the user presses an icon moving key, for example, a direction key, to change the location of the icon and displays the image with the icon added to different locations in turn.
  • the control section 110 detects the selection at step 506 and displays the image with the icon added to the selected location at step 507 .
  • the control section 110 detects the selection at step 508 and returns to step 501 .
  • the control section 110 detects the selection at step 509 and proceeds to step 510 to display the original image without the icon.
  • the icons that can be added to an image include photo image icons that are photo images in an icon format.
  • control section 110 detects the selection at step 206 and performs a function to add text to the displayed image at step 700 .
  • Step 700 for adding text to an image will be described in detail with reference to FIGS. 7 and 8 .
  • the control section 110 detects the selection at step 206 and displays a text input window at step 701 .
  • the user can input text into the text input window at step 702 .
  • FIG. 8A shows the image selected and displayed at step 201 .
  • FIG. 8B shows the text input window with a text message input at step 702 .
  • the control section 110 detects the key input at step 703 and proceeds to step 704 to display the types of text effects.
  • the control section 110 detects the selection at step 705 and proceeds with step 706 to display the original image with the text message inserted.
  • FIG. 8C shows the image with the text message inserted.
  • the control section 110 detects the selection at step 707 and proceeds to step 708 .
  • the user can replace the previous text message with a newly input message.
  • the control section 110 detects the selection at step 709 and proceeds to step 710 to enable the user to select a desired color for the input text.
  • the control section 110 detects the selection at step 711 and proceeds to step 712 to enable the user to adjust or change the size of the input text.
  • the control section 110 detects the selection at step 713 and proceeds to step 714 .
  • the control section 110 displays the text message placed at different locations on the image according to the user's pressing of a location key, for example, a direction key.
  • the control section 110 controls the display section 160 to display a help screen for adding text to an image.
  • the control section 110 detects the selection at step 715 and terminates the text adding process.
  • control section 110 detects the selection at step 716 and proceeds to step 717 to display the image without the text.
  • control section 110 detects the selection at step 207 and proceeds to step 900 to perform a focus function.
  • Step 900 for performing a focus function to modify or delete an image effect will be explained in further detail with reference to FIGS. 9A, 9B and 10.
  • the control section 110 detects the selection at step 207 and generates a focus window at step 901 .
  • the focus window generated at step 901 displays the original image and the image effects added to the original image at step 300, 500 or 700.
  • FIG. 10A shows the focus window generated at step 901 to display the image effects added to the original image.
  • the user can select any of the image effects displayed at step 901 to perform the focus function for modifying or deleting the selected image effect.
  • the control section 110 detects the selection at step 902 and prepares to perform the focus function on the selected frame.
  • the control section 110 detects the selection at step 903 and proceeds to step 904 to display the image without the frame.
  • FIG. 10B shows the image with the frame selected at step 904 deleted.
  • the focus function includes modification of an image effect added to the original image.
  • the control section 110 detects the selection at step 905 and proceeds to step 906 to display the image with different frames added in turn by the user.
  • Step 906 is substantially identical to step 305 shown in FIG. 3 .
  • the control section 110 detects the selection at step 907 and locates a cursor on the icon inserted in the image.
  • the control section 110 detects the selection at step 908 and proceeds with step 909 to display the image with the icon deleted.
  • FIG. 10C shows the image with the icon selected at step 909 deleted.
  • the control section 110 detects the selection at step 919 and returns to step 907 .
  • the control section 110 detects the selection at step 910 and proceeds to step 911 to move the location of the selected icon.
  • Step 911 is substantially identical to step 505 shown in FIG. 5 .
  • the control section 110 detects the selection at step 912 and locates a cursor on the text inserted in the image.
  • the control section 110 detects the selection at step 913 and proceeds to step 914 to display the image without the text.
  • FIG. 10D shows the image with the text selected at step 914 deleted.
  • the control section 110 detects the selection at step 915 and proceeds again with step 912 .
  • the control section 110 detects the selection at step 916 and proceeds to step 917 to change the input message, color, size or location of the text.
  • Step 917 is substantially identical to steps 707 to 714 shown in FIG. 7.
  • the user can selectively delete or modify any of the image effects added to the original image using the focus function in FIG. 9 , regardless of the order of the addition of the image effects.
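  • A minimal sketch of this focus function, reusing the hypothetical ImageInfo model sketched earlier; the helper names deleteEffect and modifyEffect are assumptions for illustration, not the patent's implementation.

```kotlin
// Hypothetical focus-function helpers: any added effect can be deleted or
// modified by its index, regardless of the order in which effects were added.
fun deleteEffect(info: ImageInfo, index: Int): ImageInfo =
    info.copy(effects = info.effects.filterIndexed { i, _ -> i != index })

fun modifyEffect(info: ImageInfo, index: Int, change: (ImageEffect) -> ImageEffect): ImageInfo =
    info.copy(effects = info.effects.mapIndexed { i, e -> if (i == index) change(e) else e })

// Example session, assuming a frame, an icon and a text effect were added in
// that order: swap the frame, move the icon, then delete the text.
fun focusExample(info: ImageInfo): ImageInfo {
    var edited = modifyEffect(info, 0) { ImageEffect.Frame(frameNumber = 7) }
    edited = modifyEffect(edited, 1) { e ->
        if (e is ImageEffect.Icon) e.copy(position = Position(x = 40, y = 80)) else e
    }
    return deleteEffect(edited, 2)
}
```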
  • control section 110 detects the selection and performs an undo function to cancel the previous edit and restore the image to the condition that existed before editing was done.
  • control section 110 detects the selection at step 208 and proceeds to step 210 to store the image with the image effects added by the user.
  • Step 1100 for performing a save function will be explained in further detail with reference to FIG. 11 .
  • the control section 110 detects the selection at step 208 and detects if any image effect is added to the image.
  • the control section 110 proceeds to step 1101 to determine the type of the image effect. If the image effect is an icon, the control section 110 will detect this image effect type at step 1102 and proceed to step 1103 to extract and store corresponding icon information.
  • the control section 110 extracts an icon number representing the type of icon and a position value of the icon added to the original image and stores the extracted information. Steps 1103 and 1104 are performed to extract information about at least one icon added to the original image.
  • the control section 110 will detect this image effect type at step 1105 and proceed to step 1106 to extract and store corresponding frame information. Specifically, at step 1106 , the control section 110 extracts and stores a frame number representing the type of frame added to the original image.
  • the control section 110 will detect this image effect type at step 1107 and proceed to step 1108 to extract and store corresponding text information. Specifically, at step 1108 , the control section 110 extracts and stores an input message, color, size and position value of the text added to the original image. Steps 1108 and 1109 are performed to extract information about at least one text added to the original image. Specifically, at step 1109 , a determination is made as to whether additional text is detected.
  • Upon extraction of the image effect information through steps 1101 to 1109, the control section 110 proceeds to step 1110 to store only image information comprising the extracted image effect information and the original image information.
  • the control section 110 can separately store the image effects added to the original image according to the user's selection. At this time, the image effects can be stored as image effect information.
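  • The save path of steps 1101 to 1110 could look roughly like the sketch below, again using the hypothetical types from earlier; EditSession and ImageInfoStore are invented stand-ins for the terminal's editing state and the memory 130.

```kotlin
// Hypothetical editing state: the original image number plus the effects the
// user has added so far (steps 300, 500 and 700).
data class EditSession(val originalImageNumber: Int, val addedEffects: List<ImageEffect>)

// Hypothetical stand-in for the memory 130.
interface ImageInfoStore {
    fun saveImageInfo(info: ImageInfo)
    fun saveEffectInfo(effects: List<ImageEffect>)  // image effect information stored on its own
}

// Save function: extract the effect information and store only image information
// (original image number plus effect parameters), as in step 1110, or store the
// effect information separately if the user selects that option.
fun save(session: EditSession, store: ImageInfoStore, effectsOnly: Boolean = false) {
    if (effectsOnly) {
        store.saveEffectInfo(session.addedEffects)
    } else {
        store.saveImageInfo(ImageInfo(session.originalImageNumber, session.addedEffects))
    }
}
```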
  • control section 110 detects the selection and controls the display section to display a help screen explaining how to add an image effect.
  • the control section 110 detects the selection at step 209 and terminates the image editing process.
  • The output and display of an image stored through the process of FIG. 11 will be explained in further detail with reference to FIGS. 12A and 12B.
  • control section 110 detects the selection and proceeds to step 1201 to display the types of images stored in the memory 130 .
  • control section 110 detects the selection at step 1202 and searches for stored information about the selected image. If the selected image is stored as image information, the control section 110 will display the image display types at step 1203 .
  • control section 110 detects the selection at step 1204 and proceeds to step 1205 to search for image information of the image selected at step 1202 .
  • control section 110 searches for original image information of the selected image and image effect information such as a frame, icon and/or text added to the original image.
  • control section 110 proceeds to step 1206 to extract the original image corresponding to the original image information from the memory 130 and a frame, at least one icon and at least one text corresponding to the image effect information from the memory 130 .
  • the control section 110 combines the extracted original image with the extracted image effects by adding the frame to the original image, placing the icon at a preset location of the original image and inserting the text in preset color and size in the preset location of the original image, and displays the combined image.
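  • A rough sketch of this reconstruction, assuming the hypothetical types above; the Canvas interface is only a placeholder for whatever drawing facility the image processor 150 and display section 160 actually expose. Passing withOriginal = false corresponds to the mode described below in which only the image effects are displayed, without the original image.

```kotlin
// Placeholder drawing surface standing in for the display section 160.
interface Canvas {
    fun drawOriginal(originalImageNumber: Int)
    fun drawFrame(frameNumber: Int)
    fun drawIcon(iconNumber: Int, position: Position)
    fun drawText(message: String, color: Int, size: Int, position: Position)
}

// Extract the original image and each stored effect, then display them combined
// (steps 1205-1206). With withOriginal = false only the effects are drawn, at
// their preset positions, without the original image.
fun display(info: ImageInfo, canvas: Canvas, withOriginal: Boolean = true) {
    if (withOriginal) canvas.drawOriginal(info.originalImageNumber)
    for (effect in info.effects) {
        when (effect) {
            is ImageEffect.Frame -> canvas.drawFrame(effect.frameNumber)
            is ImageEffect.Icon -> canvas.drawIcon(effect.iconNumber, effect.position)
            is ImageEffect.Text -> canvas.drawText(effect.message, effect.color, effect.size, effect.position)
        }
    }
}
```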
  • the control section 110 detects the selection at step 1207 and proceeds to step 1208 to display the types of images stored in the memory 130 .
  • the control section 110 detects the selection at step 1209 and proceeds to step 1210 to display the newly selected image with the image effects added.
  • the control section 110 displays the two or more selected images in turn at every predetermined time interval with the image effects added.
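  • The slide-show behaviour of steps 1207 to 1210 might then be sketched as follows, reusing the hypothetical display function and Canvas placeholder; the interval and cycle count are arbitrary example values.

```kotlin
// Show each selected image in turn at a fixed interval, re-applying the same
// stored effects to every image (steps 1207-1210).
fun slideshow(images: List<ImageInfo>, canvas: Canvas, intervalMillis: Long = 3000, cycles: Int = 1) {
    repeat(cycles) {
        for (info in images) {
            display(info, canvas)          // re-apply the stored effects to this image
            Thread.sleep(intervalMillis)   // wait before showing the next image
        }
    }
}
```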
  • control section 110 detects the selection at step 1211 and proceeds to step 1212 (FIG. 12B) to search for image effect information included in the image information of the image selected at step 1202.
  • control section 110 detects the selection and searches for image effect information corresponding to the selected image effect at step 1212 .
  • the control section 110 searches for image effect information such as a frame, icon and/or text added to the original image. After the search for image effect information, the control section 110 proceeds to step 1213 to extract a frame, at least one icon and at least one text corresponding to the image effect information from the memory 130 and display the extracted image effects.
  • the control section 110 displays the extracted icon at a location corresponding to the preset position value, without displaying the original image.
  • the control section 110 displays the extracted frame.
  • the control section 110 displays the extracted text in the preset color and size at a location corresponding to the preset position value.
  • control section 110 detects the selection at step 1214 and proceeds to step 1215 to display the types of images stored in the memory 130 .
  • the control section 110 detects the selection at step 1216 and proceeds to step 1217 to display the selected image with the image effects added.
  • the control section 110 displays the two or more selected images in turn at every predetermined time interval with the image effects added.
  • various image effects can be added to an existing image stored in a mobile terminal, thereby enabling various changes to the existing image.
  • the user can add an image effect to any desired location of the image. It is also possible to add multiple image effects to a single image. Using the focus function, the user can selectively delete or modify any of the added image effects.
  • the user can display the image edited with various image effects on the mobile terminal and store the image in an external device via a communication interface.
  • the user can also send the edited image to the base station so that the image can be transmitted to other subscribers. It is possible to store only image information of the images with various image effects added to effectively use the memory. It is also possible to apply different images to preset image effects, thereby eliminating the inconvenience in adding the same image effects to each existing image.
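  • As a closing illustration of that last point, the sketch below (hypothetical names again) pairs one preset effect list with several original image numbers, so the same decoration is reused across different images.

```kotlin
// Pair one preset effect list with several original images, so the user does
// not have to add the same effects to each stored image by hand.
fun applyPreset(originalImageNumbers: List<Int>, preset: List<ImageEffect>): List<ImageInfo> =
    originalImageNumbers.map { number -> ImageInfo(number, preset) }

// Usage: one frame, one icon and one greeting applied to three stored images.
fun presetExample(): List<ImageInfo> {
    val preset = listOf(
        ImageEffect.Frame(frameNumber = 3),
        ImageEffect.Icon(iconNumber = 12, position = Position(x = 10, y = 10)),
        ImageEffect.Text(message = "Happy New Year", color = 0xFF0000, size = 14, position = Position(x = 20, y = 120))
    )
    return applyPreset(listOf(1, 2, 3), preset)
}
```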

Abstract

A method for adding image effects to an existing image, selectively deleting the added image effects, or storing an image with various image effects added in a mobile terminal is provided. The method comprises: displaying an image; selecting a frame and adding the selected frame to the image; selecting at least one icon and adding the selected icon to the image; inputting at least one text message and adding the input text message to the image; repeating the addition of the image effects; and randomly selecting one or more of the image effects added to the image and deleting or modifying each selected image effect. There is also provided a method for storing an image, which comprises, when saving an image is selected, extracting information about an image effect added to the image; and storing image information comprising the extracted image effect information and original image information.

Description

    PRIORITY
  • This application is a Continuation Application of U.S. patent application Ser. No. 11/171,364, filed on Jul. 1, 2005, and claims the benefit under 35 U.S.C. 119(a) of applications entitled “Method for Editing Images in Mobile Terminal” filed with the Korean Intellectual Property Office on Jul. 2, 2004 and assigned Serial No. 10-2004-51563 and on May 16, 2005 and assigned Serial No. 10-2005-40794, the contents of each of which are incorporated herein by reference.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a method for editing images in a mobile terminal. More particularly, the present invention relates to a method for adding various image effects to an existing image, selectively deleting the added image effects and storing an image with image effects added.
  • 2. Description of the Related Art
  • Future mobile communication terminals will surpass current mobile phones in providing high-speed transmission of packet data and image data over voice channels.
  • Current mobile terminals having functions to send or receive image data can store an image received from a base station or transmit any acquired image to the base station. Mobile terminals with an embedded camera can take pictures and display them on a display section. The camera may comprise a camera sensor such as a charge coupled device (CCD) sensor or a complementary metal-oxide-semiconductor (CMOS) sensor. The display section may comprise a Liquid Crystal Display (LCD). Due to the tendency toward compact-sized mobile terminals, cameras embedded in the mobile terminals are also becoming smaller. The mobile terminals can display the photographed pictures as moving or still pictures or can send the photographed pictures to the base station.
  • There is a growing demand for image mail services on a mobile terminal. To meet this demand, image editing functions that allow users to edit an image on a mobile terminal have also been developed. Such functions include zooming-in or zooming-out of a picture stored in the mobile terminal or zooming-in or zooming-out of a composite of two or more pictures. Another image editing function is the creation of various image effects over an existing image, which will enable a wider range of services on mobile terminals.
  • SUMMARY OF THE INVENTION
  • Accordingly, the present invention has been made to solve the problems occurring in the prior art, and an object of the present invention is to provide a method for adding various image effects to an existing image in a mobile terminal.
  • Another object of the present invention is to provide a method for selectively deleting image effects added to an existing image in a mobile terminal.
  • Still another object of the present invention is to provide a method for storing image information of an image with various image effects added in a mobile terminal.
  • Still another object of the present invention is to provide a method for outputting and displaying an image corresponding to image information stored in a mobile terminal.
  • Still another object of the present invention is to provide a method for combining an existing image with preset image effects and displaying the combined image on a mobile terminal.
  • In order to accomplish the above objects of the present invention, there is provided a method for adding image effects to an existing image in a mobile terminal. The method comprises: displaying an image; selecting a frame and adding the selected frame to the image; selecting at least one icon and moving the selected icon to a desired location on the image; and inputting at least one text message and moving the input text message to a desired location on the image.
  • In accordance with another aspect of the present invention, there is provided a method for adding image effects to an existing image in a mobile terminal. The method comprises: displaying an image; when an addition of a frame is selected from an image editing menu, adding a selected frame to the image; when an addition of an icon is selected from the image editing menu, adding a selected icon to the image and moving the selected icon to a desired location on the image with the frame; and when addition of text is selected from the image editing menu, inputting at least one text message and moving the input text message to a desired location on the image with the frame and the icon.
  • In accordance with still another aspect of the present invention, there is provided a method for modifying or deleting image effects added to an existing image in a mobile terminal. The method comprises: displaying an image; when a focus function is selected from an image editing menu, displaying all image effects added to the image; randomly selecting one or more of the displayed image effects and deleting each selected image effect; and randomly selecting one or more of the displayed image effects and modifying each selected image effect.
  • In accordance with still another aspect of the present invention, there is provided a method for deleting image effects added to an existing image in a mobile terminal. The method comprises: displaying an image with image effects added; when deletion of an image effect is selected from an image editing menu, displaying the image effects added to the image; randomly selecting one or more of the displayed image effects to be deleted; when a frame is selected from the displayed image effects, deleting the selected frame and displaying the image without the frame; when an icon is selected from the displayed image effects, deleting the selected icon and displaying the image without the icon; and when text is selected from the displayed image effects, deleting the selected text and displaying the image without the text.
  • In accordance with still another aspect of the present invention, there is provided a method for modifying image effects added to an existing image in a mobile terminal. The method comprises: displaying an image with image effects added; when modification of an image effect is selected from an image editing menu, displaying the image effects added to the image; randomly selecting one or more of the displayed image effects to be modified; when a frame is selected from the displayed image effects, replacing the selected frame with another frame; when an icon is selected from the displayed image effects, relocating the selected icon on the image; and when text is selected from the displayed image effects, changing the selected text to a different text message, color, size or location.
  • In accordance with still another aspect of the present invention, there is provided a method for adding image effects to an existing image and selectively modifying or deleting the added image effects in a mobile terminal. The method comprises: displaying an image; selecting a frame and adding the selected frame to the image; selecting at least one icon and adding the selected icon to the image; inputting at least one text message and adding the input text message to the image; repeating the addition of the image effects; and randomly selecting one or more of the image effects added to the image and deleting or modifying each selected image effect.
  • In accordance with still another aspect of the present invention, there is provided a method for storing an image. The method comprises extracting information about an image effect added to the image when a save image option is selected; and storing image information comprising the extracted image effect information and original image information.
  • In accordance with still another aspect of the present invention, there is provided a method for displaying an image on a mobile terminal. The method comprises searching for image information of a selected image; extracting an original image and an image effect based on the detected image information; and displaying the extracted original image with the extracted image effect added.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The above and other objects, features and advantages of the present invention will be more apparent from the following detailed description taken in conjunction with the accompanying drawings, in which:
  • FIG. 1 is a view illustrating the structure of a mobile terminal according to an embodiment of the present invention;
  • FIG. 2 is a flow chart illustrating a process of adding image effects to an existing image and selectively deleting the added image effects in a mobile terminal according to an embodiment of the present invention;
  • FIG. 3 is a flow chart illustrating the addition of a frame to an existing image as an image effect shown in FIG. 2;
  • FIGS. 4A to 4C are images illustrating the process shown in FIG. 3;
  • FIG. 5 is a flow chart illustrating the addition of an icon to an existing image as another image effect shown in FIG. 2;
  • FIGS. 6A to 6C are images illustrating the process shown in FIG. 5;
  • FIG. 7 is a flow chart illustrating the addition of text to an existing image as still another image effect shown in FIG. 2;
  • FIGS. 8A to 8C are images illustrating the process shown in FIG. 7;
  • FIGS. 9A to 9B are flow charts illustrating the deletion of an image effect in the process shown in FIG. 2;
  • FIGS. 10A to 10D are images illustrating the process shown in FIG. 9;
  • FIG. 11 is a flow chart illustrating a process of storing the image effects-added image shown in FIG. 2; and
  • FIGS. 12A and 12B are flow charts illustrating the output and display of an image stored in a mobile terminal according to an embodiment of the present invention.
  • Throughout the drawings, the same or similar elements are denoted by the same reference numerals.
  • DETAILED DESCRIPTION OF EXEMPLARY EMBODIMENTS
  • Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings. Throughout the drawings, the same element is designated by the same reference numeral or character. Also, in the following description of the present invention, a detailed description of known functions and configurations incorporated herein will be omitted for conciseness.
  • FIG. 1 shows the structure of a mobile terminal equipped with a camera according to an embodiment of the present invention.
  • Referring to FIG. 1, a radio frequency (RF) section 123 performs a wireless communication function. The RF section 123 comprises a RF transmitter (not shown) for performing upward conversion and amplification of the frequency of a transmitted signal and an RF receiver (not shown) for amplifying a received signal with low noise and performing downward conversion of the frequency of the signal. A modem 120 comprises a transmitter (not shown) for coding and modulating a transmitted signal and a receiver (not shown) for demodulating and decoding a received signal. An audio processor 125 may comprise a codec which comprises a data codec for processing packet data and an audio codec for processing an audio signal such as a speech signal. The audio processor 125 converts a digital audio signal output from the modem 120 into an analog signal through the audio codec and reproduces the analog signal. Also, the audio processor 125 converts an analog audio signal generated from a microphone into a digital audio signal and transmits the digital audio signal to the modem 120. The codec can be provided as an independent element or included in a control section 110.
  • A memory 130 may comprise a program memory and a data memory. The program memory comprises programs for controlling general operations of the mobile terminal and those for controlling the addition or selective deletion of image effects, and the storage and display of an image with image effects added according to an embodiment of the present invention. The data memory temporarily stores data generated during implementation of the above programs. Also, the memory 130 may store various types of image effects and images with image effects added according to an embodiment of the present invention. The memory 130 may store only image information of the images with image effects added according to an embodiment of the present invention. In addition, the memory 130 may store only image effect information of the image effects added to existing images according to an embodiment of the present invention.
  • The image information comprises original image information and image effect information. The original image information refers to a stored original image number. The image effect information may be frame information, icon information and/or text information according to the types of image effects. The frame information refers to a frame number representing the type of frame added to the original image. The icon information comprises an icon number and an icon position value of the icon added to the original image. The text information comprises a message, color, size and position value of the text added to the original image. A control section 110 controls the overall operations of the mobile terminal. The control section 110 may comprise the modem 120 and the codec. Under the control of the control section 110, a selected image effect can be added to an existing image according to an embodiment of the present invention. Under the control of the control section 110, an added image effect can be deleted from the image according to an embodiment of the present invention. Under the control of the control section 110, only image information of an image with image effects added can be stored according to an embodiment of the present invention. Also, under the control of the control section 110, only image effect information of image effects added to an existing image can be stored according to an embodiment of the present invention.
  • When an image of which image information only is stored is selected, the control section 110 controls the extraction of the original image and image effects corresponding to the image information and display of the original image with the image effects added.
  • In addition, under the control of the control section 110, various existing images can be combined and displayed with preset image effects according to an embodiment of the present invention.
  • A camera module 140 is used to photograph an object. The camera module 140 comprises a camera sensor for converting a photographed optical signal into an electrical signal and a signal processor for converting an analog image signal photographed by the camera sensor into digital data. If the camera sensor is a charge coupled device (CCD) sensor, the signal processor can be a digital signal processor (DSP). The camera sensor and the signal processor can be either integrated into a single element or separated into independent elements.
  • An image processor 150 generates picture data for displaying an image signal output from the camera module 140. The image processor 150 processes image signals output from the camera module 140 in frames. Also, the image processor 150 adjusts the frame image data to conform to the size and features of a display section 160 and outputs the adjusted frame image data. The image processor 150 comprising an image codec compresses the frame image data displayed on the display section 160 in a preset manner or restores the compressed frame image data to the original frame image data. The image codec is selected from a variety of still or moving picture codecs, such as Joint Photographic Experts Group (JPEG) codec, Moving Picture Experts Group 4 (MPEG4) codec or Wavelet codec. If the image processor 150 has an on screen display (OSD) function, it can output OSD data according to the displayed picture size under the control of the control section 110.
  • The display section 160 displays image data output from the image processor 150 or user data output from the control section 110. When using an LCD, the display section 160 may comprise an LCD controller, a memory for storing image data and an LCD device. When the LCD is a touch screen, it can serve as an input section. The display section 160 can display an image with an image effect or effects added according to an embodiment of the present invention. A key input section 127 is provided with keys for inputting numbers and characters and function keys for setting up various functions. The key input section 127 may also include function keys for adding or deleting an image effect according to an embodiment of the present invention.
  • Referring to FIG. 1, if a user (caller) sets an outgoing call mode after dialing by using the keypad 127, the control section 110 will detect the mode and will process the dialed information received through the modem 120. The control section 110 converts the dialed information into an RF signal through the RF section 123 and outputs the RF signal. A reply signal generated from a recipient is detected by the RF section 123 and the modem 120. The audio processor 125 then forms a voice communication path so that the user can communicate with the recipient. When detecting an incoming call, the control section 110 controls the audio processor 125 to generate a ringing signal. When the user replies to the incoming call, the control section 110 detects the reply and controls the audio processor 125 to form a voice communication path so that the user can receive the incoming call. Although voice communications in the incoming or outgoing call mode have been described, the control section 110 can also perform data communications to receive or transmit packet data or image data. In a standby mode or a messaging mode, the control section 110 displays text data processed by the modem 120 on the display section 160.
  • Hereinafter, the image editing function of the mobile terminal according to an embodiment of the present invention will be explained in detail with reference to FIGS. 2 to 10. Although frames, icons and text will be described below as examples of image effects, any other types of image effects can be used according to an embodiment of the present invention.
  • FIG. 2 is a flow chart illustrating a process of adding image effects to an existing image and selectively deleting the added image effects in a mobile terminal according to an embodiment of the present invention. FIG. 3 is a flow chart illustrating the addition of a frame to an existing image as an image effect shown in FIG. 2. FIGS. 4A to 4C are images illustrating the process shown in FIG. 3. FIG. 5 is a flow chart illustrating the addition of an icon to an existing image as another image effect shown in FIG. 2. FIGS. 6A to 6C are images illustrating the process shown in FIG. 5. FIG. 7 is a flow chart illustrating the addition of text to an existing image as still another image effect shown in FIG. 2. FIGS. 8A to 8C are images illustrating the process shown in FIG. 7. FIGS. 9A to 9B are flow charts illustrating the deletion of another image effect in the process of FIG. 2. FIGS. 10A to 10D are images illustrating the process shown in FIG. 9. FIG. 11 is a flow chart illustrating a process of storing the image effects-added image shown in FIG. 2. FIGS. 12A and 12B are flow charts illustrating the output and display of an image stored in a mobile terminal according to an embodiment of the present invention.
  • Referring to FIG. 2, the mobile terminal displays an image selected by the user from among the stored images on the display section 160 at step 201. When the user presses a select key for selecting an image editing menu, the control section 110 detects the key input at step 202 and proceeds to step 203. When the user selects the addition of a frame from the image editing menu at step 203, the control section 110 detects the selection at step 204 and performs a function to add a selected frame to the displayed image at step 300.
  • The addition of a frame will be explained in further detail with reference to FIGS. 3 and 4. When the user selects the addition of a frame, the control section 110 detects the selection at step 204 and displays types of frames at step 301. FIG. 4A shows the image selected and displayed at step 201. FIG. 4B shows various types of frames displayed at step 301. When the user selects one of the frames displayed at step 301, the control section 110 detects the selection at step 302 and proceeds to step 303 to display the original image with the selected frame added. FIG. 4C shows the image with the selected frame added thereto. When the user selects a change of frame at step 303, the control section 110 detects the selection at step 304 and proceeds to step 305. At step 305, the control section 110 detects if the user presses a frame change key, for example, a direction key, to apply another frame and displays the image with different frames added in turn. When the user selects one of the sequentially displayed images with different frames, the control section 110 detects the selection at step 306 and displays the image with the newly-selected frame at step 307. When the user selects deletion of a frame added to the original image, the control section 110 detects the selection at step 308 and proceeds with step 309 to display the original image without the frame.
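  • The frame-change behaviour of steps 303 to 307 could be sketched as follows; FrameSelector and its direction-key handling are illustrative assumptions, not the patent's implementation.

```kotlin
// Hypothetical frame cycling for steps 303-307: each press of a direction key
// applies the next (or previous) frame from a non-empty list of frame numbers.
class FrameSelector(private val availableFrameNumbers: List<Int>) {
    private var index = 0
    val currentFrame: Int get() = availableFrameNumbers[index]

    fun onDirectionKey(forward: Boolean): Int {
        val delta = if (forward) 1 else availableFrameNumbers.size - 1
        index = (index + delta) % availableFrameNumbers.size
        return currentFrame  // the image is then redrawn with this frame added
    }
}
```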
  • When the user selects the addition of an icon as another image effect at step 203, the control section 110 detects the selection at step 205 and performs a function to add the selected icon to the displayed original image at step 500.
  • Step 500 for adding an icon will be explained in further detail with reference to FIGS. 5 and 6. When the user selects the addition of an icon, the control section 110 detects the selection at step 205 and displays types of icons at step 501. FIG. 6A shows the original image selected and displayed at step 201. FIG. 6B shows various types of icons displayed at step 501. When the user selects one of the icons displayed at step 501, the control section 110 detects the selection at step 502 and proceeds to step 503 to display the original image with the selected icon added. FIG. 6C shows the image with the selected icon added thereto. When the user selects a change in the location of the icon added at step 503, the control section 110 detects the selection at step 504 and proceeds to step 505. At step 505, the control section 110 detects each press of an icon moving key, for example, a direction key, and displays the image with the icon moved to the corresponding location. When the user selects a specific location, the control section 110 detects the selection at step 506 and displays the image with the icon added at the selected location at step 507. When the user selects the addition of another icon, the control section 110 detects the selection at step 508 and returns to step 501. When the user selects deletion of an icon added to the original image, the control section 110 detects the selection at step 509 and proceeds to step 510 to display the original image without the icon. The icons that can be added to an image include photo image icons, which are photo images in an icon format.
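  • The icon relocation of step 505 can be pictured roughly as below, assuming a direction key moves the icon by a fixed number of pixels within the image bounds; the data class, key names, screen size and step size are all assumptions made only for illustration.

```kotlin
// Hypothetical sketch of moving an added icon with a direction key (step 505); units are pixels.
data class IconEffect(val iconNumber: Int, var x: Int, var y: Int)

enum class DirectionKey { UP, DOWN, LEFT, RIGHT }

fun moveIcon(icon: IconEffect, key: DirectionKey, imageWidth: Int, imageHeight: Int, step: Int = 10) {
    when (key) {
        DirectionKey.UP    -> icon.y = (icon.y - step).coerceAtLeast(0)
        DirectionKey.DOWN  -> icon.y = (icon.y + step).coerceAtMost(imageHeight - 1)
        DirectionKey.LEFT  -> icon.x = (icon.x - step).coerceAtLeast(0)
        DirectionKey.RIGHT -> icon.x = (icon.x + step).coerceAtMost(imageWidth - 1)
    }
}

fun main() {
    val icon = IconEffect(iconNumber = 3, x = 40, y = 40)
    moveIcon(icon, DirectionKey.RIGHT, imageWidth = 176, imageHeight = 220)
    println(icon) // IconEffect(iconNumber=3, x=50, y=40)
}
```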
  • When the user selects the addition of text as another image effect at step 203, the control section 110 detects the selection at step 206 and performs a function to add text to the displayed image at step 700.
  • Step 700 for adding text to an image will be described in detail with reference to FIGS. 7 and 8. When the user selects the addition of text, the control section 110 detects the selection at step 206 and displays a text input window at step 701. The user can input text into the text input window at step 702. FIG. 8A shows the image selected and displayed at step 201. FIG. 8B shows the text input window with a text message input at step 702. When the user presses a select key for creating a text effect on the image, the control section 110 detects the key input at step 703 and proceeds to step 704 to display the types of text effects. When the user selects the insertion of the input text message, the control section 110 detects the selection at step 705 and proceeds to step 706 to display the original image with the text message inserted. FIG. 8C shows the image with the text message inserted. When the user selects a change to the inserted text message, the control section 110 detects the selection at step 707 and proceeds to step 708. At step 708, the user can replace the previous text message with a newly input message. When the user selects a change in the text color, the control section 110 detects the selection at step 709 and proceeds to step 710 to enable the user to select a desired color for the input text. When the user selects a change in the text size, the control section 110 detects the selection at step 711 and proceeds to step 712 to enable the user to adjust or change the size of the input text. When the user selects a change in the text location, the control section 110 detects the selection at step 713 and proceeds to step 714. At step 714, the control section 110 displays the text message placed at different locations on the image according to the user's pressing of a location key, for example, a direction key. When the user selects help, the control section 110 controls the display section 160 to display a help screen for adding text to an image. When the user selects termination of text addition, the control section 110 detects the selection at step 715 and terminates the text adding process.
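  • The text attributes changed at steps 708, 710, 712 and 714 can be sketched as a simple record, as below; the field names, default values and units are illustrative assumptions only and are not part of the disclosure.

```kotlin
// Hypothetical sketch of the text effect added at step 700; every field here is illustrative.
data class TextEffect(
    var message: String,
    var color: String = "black",
    var size: Int = 12,        // point size
    var x: Int = 0,
    var y: Int = 0
)

// Steps 708, 710, 712 and 714 each change one attribute of the inserted text.
fun changeMessage(t: TextEffect, newMessage: String) { t.message = newMessage }
fun changeColor(t: TextEffect, newColor: String) { t.color = newColor }
fun changeSize(t: TextEffect, newSize: Int) { t.size = newSize }
fun changeLocation(t: TextEffect, newX: Int, newY: Int) { t.x = newX; t.y = newY }

fun main() {
    val text = TextEffect(message = "Hello")
    changeColor(text, "red")
    changeSize(text, 16)
    changeLocation(text, 30, 200)
    println(text)
}
```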
  • When the user selects deletion of the text added to the image, the control section 110 detects the selection at step 716 and proceeds to step 717 to display the image without the text.
  • When the user selects modification or deletion of an image effect from the image editing menu displayed at step 203, the control section 110 detects the selection at step 207 and proceeds to step 900 to perform a focus function.
  • Step 900 for performing a focus function to modify or delete an image effect will be explained in further detail with reference to FIGS. 9A, 9B and 10. When the user selects the focus function, the control section 110 detects the selection at step 207 and generates a focus window at step 901. The focus window generated at step 901 displays the original image and the image effects added to the original image at step 300, 500 or 700. FIG. 10A shows the focus window generated at step 901 to display the image effects added to the original image. The user can select any of the image effects displayed at step 901 to perform the focus function for modifying or deleting the selected image effect.
  • When the user selects the frame from among the image effects displayed in the focus window at step 901, the control section 110 detects the selection at step 902 and prepares to perform the focus function on the selected frame. When the user selects deletion of the frame, the control section 110 detects the selection at step 903 and proceeds to step 904 to display the image without the frame. FIG. 10B shows the image after the selected frame is deleted at step 904. The focus function also includes modification of an image effect added to the original image. When the user selects modification of the above frame, the control section 110 detects the selection at step 905 and proceeds to step 906 to display the image with different frames added in turn by the user. Step 906 is substantially identical to step 305 shown in FIG. 3.
  • When the user selects the icon from among the image effects displayed in the focus window at step 901, the control section 110 detects the selection at step 907 and locates a cursor on the icon inserted in the image. When the user selects deletion of the icon, the control section 110 detects the selection at step 908 and proceeds to step 909 to display the image with the icon deleted. FIG. 10C shows the image after the selected icon is deleted at step 909. When the user selects another icon to be deleted from the image, the control section 110 detects the selection at step 919 and returns to step 907. When the user selects relocation of a selected icon, the control section 110 detects the selection at step 910 and proceeds to step 911 to move the location of the selected icon. Step 911 is substantially identical to step 505 shown in FIG. 5.
  • When the user selects the text from among the image effects displayed in the focus window at step 901, the control section 110 detects the selection at step 912 and locates a cursor on the text inserted in the image. When the user selects deletion of the text, the control section 110 detects the selection at step 913 and proceeds to step 914 to display the image without the text. FIG. 10D shows the image after the selected text is deleted at step 914. When the user selects another text message to be deleted from the image, the control section 110 detects the selection at step 915 and returns to step 912. When the user selects modification of any text, the control section 110 detects the selection at step 916 and proceeds to step 917 to change the input message, color, size or location of the text. Step 917 is substantially identical to steps 707 to 714 shown in FIG. 7. The user can selectively delete or modify any of the image effects added to the original image using the focus function of FIGS. 9A and 9B, regardless of the order in which the image effects were added.
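  • A minimal sketch of the focus function of FIGS. 9A and 9B is given below, assuming the added effects are kept in a list so that any one of them can be deleted or replaced regardless of the order in which it was added; the types and method names are hypothetical.

```kotlin
// Hypothetical sketch of the focus function (FIG. 9): any added effect can be selected
// and deleted or modified independently of the order in which the effects were added.
sealed class ImageEffect {
    data class Frame(val frameNumber: Int) : ImageEffect()
    data class Icon(val iconNumber: Int, val x: Int, val y: Int) : ImageEffect()
    data class Text(val message: String, val color: String, val size: Int, val x: Int, val y: Int) : ImageEffect()
}

class FocusWindow(private val effects: MutableList<ImageEffect>) {
    fun list(): List<ImageEffect> = effects.toList()

    // Deleting an effect removes only the selected frame, icon or text; the others remain.
    fun delete(effect: ImageEffect) { effects.remove(effect) }

    // Modifying an effect replaces the selected entry in place.
    fun modify(old: ImageEffect, new: ImageEffect) {
        val i = effects.indexOf(old)
        if (i >= 0) effects[i] = new
    }
}

fun main() {
    val window = FocusWindow(
        mutableListOf(
            ImageEffect.Frame(frameNumber = 2),
            ImageEffect.Icon(iconNumber = 5, x = 10, y = 10),
            ImageEffect.Text("Hi", "red", 14, 30, 200)
        )
    )
    window.delete(ImageEffect.Icon(iconNumber = 5, x = 10, y = 10)) // data class equality
    println(window.list()) // the frame and the text remain
}
```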
  • When the user selects undo from the image editing menu displayed at step 203, the control section 110 detects the selection and performs an undo function to cancel the previous edit and restore the image to the condition that existed before editing was done.
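  • One possible way to picture the undo function, assuming a snapshot of the effect list is taken before each edit, is sketched below; the snapshot approach and all names are assumptions, since the disclosure does not describe how undo is implemented.

```kotlin
// Hypothetical sketch of undo: each edit pushes a snapshot of the effect list,
// and undo restores the list to its state before the previous edit.
class EffectHistory {
    private val snapshots = ArrayDeque<List<String>>()

    fun beforeEdit(currentEffects: List<String>) {
        snapshots.addLast(currentEffects.toList())
    }

    fun undo(currentEffects: List<String>): List<String> =
        if (snapshots.isEmpty()) currentEffects else snapshots.removeLast()
}

fun main() {
    val history = EffectHistory()
    var effects = listOf("frame#2")
    history.beforeEdit(effects)
    effects = effects + "icon#5"     // an edit
    effects = history.undo(effects)  // back to the state before the icon was added
    println(effects)                 // [frame#2]
}
```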
  • When the user selects save from the image editing menu displayed at step 203, the control section 110 detects the selection at step 208 and proceeds to step 1100 to store the image with the image effects added by the user.
  • Step 1100 for performing a save function will be explained in further detail with reference to FIG. 11. When the user selects the save function for the image, the control section 110 detects the selection at step 208 and determines whether any image effect has been added to the image. Upon detecting an image effect added to the image, the control section 110 proceeds to step 1101 to determine the type of the image effect. If the image effect is an icon, the control section 110 will detect this image effect type at step 1102 and proceed to step 1103 to extract and store corresponding icon information. At step 1103, the control section 110 extracts an icon number representing the type of icon and a position value of the icon added to the original image and stores the extracted information. Steps 1103 and 1104 are performed to extract information about at least one icon added to the original image. Specifically, at step 1104, a determination is made as to whether an additional icon is detected.
  • If the image effect is a frame, the control section 110 will detect this image effect type at step 1105 and proceed to step 1106 to extract and store corresponding frame information. Specifically, at step 1106, the control section 110 extracts and stores a frame number representing the type of frame added to the original image.
  • If the image effect is text, the control section 110 will detect this image effect type at step 1107 and proceed to step 1108 to extract and store corresponding text information. Specifically, at step 1108, the control section 110 extracts and stores an input message, color, size and position value of the text added to the original image. Steps 1108 and 1109 are performed to extract information about at least one text added to the original image. Specifically, at step 1109, a determination is made as to whether additional text is detected.
  • Upon extraction of the image effect information through steps 1101 to 1109, the control section 110 proceeds to step 1110 to store only image information comprising the extracted image effect information and the original image information.
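  • The image information stored at step 1110 can be pictured as a record holding a reference to the original image together with the extracted frame, icon and text information of steps 1103 to 1109; the sketch below is illustrative only and its field names are assumptions.

```kotlin
// Hypothetical sketch of the information stored at step 1110: only a reference to the original
// image plus the extracted effect information is kept, rather than a new composite bitmap.
data class IconInfo(val iconNumber: Int, val x: Int, val y: Int)
data class TextInfo(val message: String, val color: String, val size: Int, val x: Int, val y: Int)

data class StoredImageInfo(
    val originalImageId: String,   // reference to the original image in memory
    val frameNumber: Int?,         // step 1106: frame type, if a frame was added
    val icons: List<IconInfo>,     // steps 1103-1104: one entry per added icon
    val texts: List<TextInfo>      // steps 1108-1109: one entry per added text message
)

fun main() {
    val info = StoredImageInfo(
        originalImageId = "photo_0012",
        frameNumber = 2,
        icons = listOf(IconInfo(iconNumber = 5, x = 10, y = 10)),
        texts = listOf(TextInfo("Hello", "red", 14, x = 30, y = 200))
    )
    println(info)
}
```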
  • The control section 110 can separately store the image effects added to the original image according to the user's selection. At this time, the image effects can be stored as image effect information.
  • When the user selects help from the image editing menu displayed at step 203, the control section 110 detects the selection and controls the display section to display a help screen explaining how to add an image effect. When the user selects an end option from the image editing menu displayed at step 203, the control section 110 detects the selection at step 209 and terminates the image editing process.
  • The output and display of an image stored through the process of FIG. 11 will be explained in further detail with reference to FIGS. 12A and 12B.
  • Referring to FIG. 12A, when the user selects viewing images stored in the mobile terminal, the control section 110 detects the selection and proceeds to step 1201 to display the types of images stored in the memory 130.
  • When the user selects one of the types of images displayed at step 1201, the control section 110 detects the selection at step 1202 and searches for stored information about the selected image. If the selected image is stored as image information, the control section 110 will display the image display types at step 1203.
  • When the user selects whole image display among the image display types, the control section 110 detects the selection at step 1204 and proceeds to step 1205 to search for image information of the image selected at step 1202. At step 1205, the control section 110 searches for original image information of the selected image and image effect information such as a frame, icon and/or text added to the original image.
  • After the image information search, the control section 110 proceeds to step 1206 to extract the original image corresponding to the original image information from the memory 130 and a frame, at least one icon and at least one text corresponding to the image effect information from the memory 130. At step 1206, the control section 110 combines the extracted original image with the extracted image effects by adding the frame to the original image, placing the icon at a preset location of the original image and inserting the text in preset color and size in the preset location of the original image, and displays the combined image.
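  • A rough sketch of the whole-image display of steps 1205 and 1206 is shown below: the stored effect information is looked up and re-applied to the original image. The actual rendering is replaced by printouts, and all names are hypothetical.

```kotlin
// Hypothetical sketch of steps 1205-1206: the stored effect information is looked up and
// re-applied to the original image for display; the "rendering" here is only a printout.
data class IconInfo(val iconNumber: Int, val x: Int, val y: Int)
data class TextInfo(val message: String, val color: String, val size: Int, val x: Int, val y: Int)
data class StoredImageInfo(
    val originalImageId: String,
    val frameNumber: Int?,
    val icons: List<IconInfo>,
    val texts: List<TextInfo>
)

fun displayWholeImage(info: StoredImageInfo) {
    println("Display original image '${info.originalImageId}'")
    info.frameNumber?.let { println("Add frame #$it around the image") }
    info.icons.forEach { println("Place icon #${it.iconNumber} at (${it.x}, ${it.y})") }
    info.texts.forEach { println("Insert '${it.message}' in ${it.color}, size ${it.size}, at (${it.x}, ${it.y})") }
}

fun main() {
    displayWholeImage(
        StoredImageInfo(
            "photo_0012", frameNumber = 2,
            icons = listOf(IconInfo(5, 10, 10)),
            texts = listOf(TextInfo("Hello", "red", 14, 30, 200))
        )
    )
}
```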
  • When the user selects a change of the original image during the display of the combined image, the control section 110 detects the selection at step 1207 and proceeds to step 1208 to display the types of images stored in the memory 130. When the user selects one of the displayed types of images, the control section 110 detects the selection at step 1209 and proceeds to step 1210 to display the newly selected image with the image effects added. When the user selects two or more images at step 1209, the control section 110 displays the two or more selected images in turn at every predetermined time interval with the image effects added.
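  • The display of two or more selected images in turn with the same effects can be pictured as a simple loop with a fixed delay, as in the sketch below; the interval value and the use of a blocking delay are assumptions for illustration only.

```kotlin
// Hypothetical sketch of step 1210 and the multi-image case: the same stored effects are applied
// to each newly selected image, shown in turn at a predetermined interval (simulated with a delay).
fun showWithEffects(imageId: String, effects: List<String>) {
    println("Display $imageId with effects ${effects.joinToString()}")
}

fun slideshow(imageIds: List<String>, effects: List<String>, intervalMillis: Long = 2000) {
    for (id in imageIds) {
        showWithEffects(id, effects)
        Thread.sleep(intervalMillis)   // predetermined time interval between images
    }
}

fun main() {
    slideshow(
        listOf("photo_0012", "photo_0034"),
        listOf("frame#2", "icon#5", "text:'Hello'"),
        intervalMillis = 500
    )
}
```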
  • When the user selects image effect display among the image display types, the control section 110 detects the selection at step 1211 and proceeds to step 1212 (FIG. 12B) to search for image effect information included in the image information of the image selected at step 1202. Alternatively, when the user selects one of the separately stored image effects, the control section 110 detects the selection and searches for image effect information corresponding to the selected image effect at step 1212.
  • Referring to FIG. 12B, at step 1212, the control section 110 searches for image effect information such as a frame, icon and/or text added to the original image. After the search for image effect information, the control section 110 proceeds to step 1213 to extract a frame, at least one icon and at least one text corresponding to the image effect information from the memory 130 and display the extracted image effects. At step 1213, the control section 110 displays the extracted icon at a location corresponding to the preset position value, without displaying the original image. The control section 110 displays the extracted frame. Also, the control section 110 displays the extracted text in the preset color and size at a location corresponding to the preset position value.
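  • The image-effect-only display of step 1213 can be sketched as below, where only the frame, icons and text are drawn at their preset positions and the original image is omitted; all names and coordinates are illustrative assumptions.

```kotlin
// Hypothetical sketch of step 1213: only the stored effects are drawn at their preset
// positions, without the original image underneath.
data class PlacedEffect(val description: String, val x: Int, val y: Int)

fun displayEffectsOnly(frameNumber: Int?, icons: List<PlacedEffect>, texts: List<PlacedEffect>) {
    frameNumber?.let { println("Draw frame #$it") }
    icons.forEach { println("Draw ${it.description} at (${it.x}, ${it.y})") }
    texts.forEach { println("Draw ${it.description} at (${it.x}, ${it.y})") }
}

fun main() {
    displayEffectsOnly(
        frameNumber = 2,
        icons = listOf(PlacedEffect("icon #5", 10, 10)),
        texts = listOf(PlacedEffect("text 'Hello' (red, 14 pt)", 30, 200))
    )
}
```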
  • When the user selects insertion of an image during the display of the image effects only, the control section 110 detects the selection at step 1214 and proceeds to step 1215 to display the types of images stored in the memory 130.
  • When the user selects one of the displayed types of images, the control section 110 detects the selection at step 1216 and proceeds to step 1217 to display the selected image with the image effects added. When the user selects two or more images, the control section 110 displays the two or more selected images in turn at every predetermined time interval with the image effects added.
  • In accordance with an embodiment of the present invention, various image effects can be added to an existing image stored in a mobile terminal, thereby enabling various changes to the existing image. The user can add an image effect to any desired location of the image. It is also possible to add multiple image effects to a single image. Using the focus function, the user can selectively delete or modify any of the added image effects. The user can display the image edited with various image effects on the mobile terminal and store the image in an external device via a communication interface. The user can also send the edited image to the base station so that the image can be transmitted to other subscribers. It is possible to store only the image information of an image with various image effects added, thereby making effective use of the memory. It is also possible to apply different images to preset image effects, thereby eliminating the inconvenience of adding the same image effects to each image individually.
  • Although embodiments of the present invention have been described for illustrative purposes, those skilled in the art will appreciate that various modifications, additions and substitutions are possible, without departing from the scope and spirit of the invention as disclosed in the accompanying claims, including the full scope of equivalents thereof.

Claims (1)

What is claimed is:
1. A method for adding image effects to an existing image in a mobile terminal, which comprises the steps of:
displaying an image;
selecting a frame and adding the selected frame to the image;
selecting at least one icon and adding the selected icon to the image; and
inputting at least one text message and adding the input text message to the image.
US14/524,582 2004-07-02 2014-10-27 Method for editing images in a mobile terminal Abandoned US20150043837A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US14/524,582 US20150043837A1 (en) 2004-07-02 2014-10-27 Method for editing images in a mobile terminal

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
KR20040051563 2004-07-02
KR10-2004-0051563 2004-07-02
KR10-2005-0040794 2005-05-16
KR1020050040794A KR100606076B1 (en) 2004-07-02 2005-05-16 Method for controlling image in wireless terminal
US11/171,364 US8872843B2 (en) 2004-07-02 2005-07-01 Method for editing images in a mobile terminal
US14/524,582 US20150043837A1 (en) 2004-07-02 2014-10-27 Method for editing images in a mobile terminal

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US11/171,364 Continuation US8872843B2 (en) 2004-07-02 2005-07-01 Method for editing images in a mobile terminal

Publications (1)

Publication Number Publication Date
US20150043837A1 true US20150043837A1 (en) 2015-02-12

Family

ID=34981178

Family Applications (2)

Application Number Title Priority Date Filing Date
US11/171,364 Expired - Fee Related US8872843B2 (en) 2004-07-02 2005-07-01 Method for editing images in a mobile terminal
US14/524,582 Abandoned US20150043837A1 (en) 2004-07-02 2014-10-27 Method for editing images in a mobile terminal

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US11/171,364 Expired - Fee Related US8872843B2 (en) 2004-07-02 2005-07-01 Method for editing images in a mobile terminal

Country Status (2)

Country Link
US (2) US8872843B2 (en)
EP (1) EP1612736B1 (en)

Families Citing this family (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8872843B2 (en) * 2004-07-02 2014-10-28 Samsung Electronics Co., Ltd. Method for editing images in a mobile terminal
KR100686019B1 (en) * 2005-08-26 2007-02-26 엘지전자 주식회사 Mobile communication terminal and method for sending images using the same
EP1771002B1 (en) * 2005-09-30 2017-12-27 LG Electronics Inc. Mobile video communication terminal
KR100736077B1 (en) * 2005-10-24 2007-07-06 삼성전자주식회사 Device and method for controlling text of data broadcasting
US8166418B2 (en) * 2006-05-26 2012-04-24 Zi Corporation Of Canada, Inc. Device and method of conveying meaning
JP4984975B2 (en) * 2007-03-02 2012-07-25 株式会社ニコン Camera and image processing program
KR100924689B1 (en) 2007-12-17 2009-11-03 한국전자통신연구원 Apparatus and method for transforming an image in a mobile device
JP4579316B2 (en) * 2008-06-30 2010-11-10 任天堂株式会社 IMAGING DEVICE, IMAGING SYSTEM, AND GAME DEVICE
US20120198386A1 (en) * 2011-01-31 2012-08-02 Nokia Corporation Causing display of thumbnail images
KR101501028B1 (en) * 2013-04-04 2015-03-12 박정환 Method and Apparatus for Generating and Editing a Detailed Image
US10824313B2 (en) 2013-04-04 2020-11-03 P.J. Factory Co., Ltd. Method and device for creating and editing object-inserted images
US20150172550A1 (en) * 2013-12-16 2015-06-18 Motorola Mobility Llc Display tiling for enhanced view modes
US20150195226A1 (en) * 2014-01-06 2015-07-09 Desiree Gina McDowell-White Interactive Picture Messaging System
US10481211B2 (en) * 2014-01-15 2019-11-19 Lat Enterprises, Inc. State-of-charge indicator
US10044888B2 (en) * 2014-08-06 2018-08-07 Kabushiki Kaisha Toshiba Image forming apparatus and control method thereof
US9652896B1 (en) 2015-10-30 2017-05-16 Snap Inc. Image based tracking in augmented reality systems
US9984499B1 (en) 2015-11-30 2018-05-29 Snap Inc. Image and point cloud based tracking and in augmented reality systems
US20170192651A1 (en) * 2015-12-30 2017-07-06 Facebook, Inc. Editing photos over an online social network
JP6762723B2 (en) * 2016-01-18 2020-09-30 キヤノン株式会社 Recording device, control method and program of recording device
CN107665087B (en) * 2016-07-28 2021-03-16 夏普株式会社 Image display device, image display method, and image display system
US10319149B1 (en) 2017-02-17 2019-06-11 Snap Inc. Augmented reality anamorphosis system
US10074381B1 (en) 2017-02-20 2018-09-11 Snap Inc. Augmented reality speech balloon system
US10387730B1 (en) * 2017-04-20 2019-08-20 Snap Inc. Augmented reality typography personalization system
US10740974B1 (en) 2017-09-15 2020-08-11 Snap Inc. Augmented reality system
US11250050B2 (en) 2018-03-01 2022-02-15 The Software Mackiev Company System for multi-tagging images
US20190272094A1 (en) * 2018-03-01 2019-09-05 Jack M. MINSKY System for multi-tagging images
US10997760B2 (en) 2018-08-31 2021-05-04 Snap Inc. Augmented reality anthropomorphization system

Family Cites Families (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4785420A (en) * 1986-04-09 1988-11-15 Joyce Communications Systems, Inc. Audio/telephone communication system for verbally handicapped
US5742779A (en) * 1991-11-14 1998-04-21 Tolfa Corporation Method of communication using sized icons, text, and audio
WO1997019429A1 (en) * 1995-11-20 1997-05-29 Motorola Inc. Displaying graphic messages in a radio receiver
CA2193764A1 (en) * 1995-12-25 1997-06-25 Yasuyuki Mochizuki Selective call receiver
US5880740A (en) * 1996-07-12 1999-03-09 Network Sound & Light, Inc. System for manipulating graphical composite image composed of elements selected by user from sequentially displayed members of stored image sets
US6665008B1 (en) * 1997-07-15 2003-12-16 Silverbrook Research Pty Ltd Artcard for the control of the operation of a camera device
JPH11239371A (en) * 1998-02-23 1999-08-31 Nec Corp Communications equipment
US6567983B1 (en) * 1998-04-10 2003-05-20 Fuji Photo Film Co., Ltd. Electronic album producing and viewing system and method
US6621938B1 (en) * 1998-09-18 2003-09-16 Fuji Photo Film Co., Ltd. Image capture apparatus and method
JP3720230B2 (en) * 2000-02-18 2005-11-24 シャープ株式会社 Expression data control system, expression data control apparatus constituting the same, and recording medium on which the program is recorded
JP4553441B2 (en) * 2000-03-10 2010-09-29 富士フイルム株式会社 Mobile communication terminal
JP2001306342A (en) * 2000-04-20 2001-11-02 Victor Co Of Japan Ltd Method, system for processing effect of digital data in network, effect server, user terminal, recording medium in which effect processing program of digital data is recorded and recording medium in which processing program in effect server and user terminal is recorded
US7016869B1 (en) * 2000-04-28 2006-03-21 Shutterfly, Inc. System and method of changing attributes of an image-based product
JP3784289B2 (en) 2000-09-12 2006-06-07 松下電器産業株式会社 Media editing method and apparatus
US20020055992A1 (en) * 2000-11-08 2002-05-09 Lavaflow, Llp Method of providing a screen saver on a cellular telephone
US20040133924A1 (en) * 2000-12-15 2004-07-08 Wilkins David C. Techniques for syncronizing any of a plurality of associated multimedia assets in a distributed system
US6993553B2 (en) * 2000-12-19 2006-01-31 Sony Corporation Data providing system, data providing apparatus and method, data acquisition system and method, and program storage medium
KR100416991B1 (en) * 2001-01-11 2004-02-05 삼성전자주식회사 Video terminal apparatus having displaying virtual background and implementing method thereof
JP4596206B2 (en) 2001-04-18 2010-12-08 ソニー株式会社 Mobile communication system, image editing method, external display device, and mobile terminal device
US6889062B2 (en) * 2001-10-04 2005-05-03 Nokia Corporation System and protocol for providing pictures in wireless communication messages
EP1449172B1 (en) 2001-11-28 2008-12-10 Nokia Corporation Method for generating graphic representation in a mobile terminal
US7372470B2 (en) * 2001-11-29 2008-05-13 Lg Electronics Inc. System and method for transmitting and displaying messages in a mobile terminal
JP2003230117A (en) * 2002-01-31 2003-08-15 Nec Commun Syst Ltd Transmission system, transmission device, transmission scheme and transmission method for dynamic image data
US20030160824A1 (en) * 2002-02-28 2003-08-28 Eastman Kodak Company Organizing and producing a display of images, labels and custom artwork on a receiver
WO2003075146A1 (en) * 2002-03-05 2003-09-12 Sony Ericsson Mobile Communications Japan, Inc. Image processing device, image processing program, and image processing method
KR100566242B1 (en) 2002-07-19 2006-03-29 삼성전자주식회사 Apparatus and method for editing image on display screen of a mobile communication terminal equipment
US7007064B2 (en) * 2002-08-02 2006-02-28 Motorola, Inc. Method and apparatus for obtaining and managing wirelessly communicated content
US6916446B1 (en) * 2002-09-18 2005-07-12 The United States Of America As Represented By The Secretary Of The Navy Bioreactor method, apparatus and product thereby
US20040085360A1 (en) * 2002-10-31 2004-05-06 Hallmark Interactive, Llc Icon-based graphical user interface for text messaging
ATE542366T1 (en) * 2002-12-10 2012-02-15 Sony Ericsson Mobile Comm Ab EFFECTS GENERATION FOR IMAGES
AU2003293771A1 (en) 2002-12-10 2004-06-30 Sony Ericsson Mobile Communications Ab Creating effects for images
US7113809B2 (en) * 2002-12-19 2006-09-26 Nokia Corporation Apparatus and a method for providing information to a user
US7634138B2 (en) 2002-12-20 2009-12-15 Eastman Kodak Company Method for generating an image of a detected subject
US8269793B2 (en) * 2003-02-18 2012-09-18 Serverside Group Limited Apparatus and method for manipulating images
JP4374610B2 (en) * 2003-04-18 2009-12-02 カシオ計算機株式会社 Imaging apparatus, image data storage method, and program
JP2005012764A (en) * 2003-05-22 2005-01-13 Casio Comput Co Ltd Data communication apparatus, image transmitting method, and image transmitting program
US20040250205A1 (en) * 2003-05-23 2004-12-09 Conning James K. On-line photo album with customizable pages
JP2005039785A (en) * 2003-06-30 2005-02-10 Seiko Epson Corp Image processing apparatus, method and program
JP4071726B2 (en) * 2004-02-25 2008-04-02 シャープ株式会社 Portable information device, character display method in portable information device, and program for realizing the method
US7730398B2 (en) * 2005-10-25 2010-06-01 Research In Motion Limited Image stitching for mobile electronic devices

Patent Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6525836B2 (en) * 1996-01-31 2003-02-25 Fuji Photo Film Co., Ltd. Apparatus for and method of synthesizing image
US6822756B1 (en) * 1996-07-29 2004-11-23 Eastman Kodak Company Method of combining two digital images
US6587596B1 (en) * 2000-04-28 2003-07-01 Shutterfly, Inc. System and method of cropping an image
US7522206B2 (en) * 2002-03-14 2009-04-21 Kyocera Corporation Photographed image display device and photographed image display method
US20040093432A1 (en) * 2002-11-07 2004-05-13 Eastman Kodak Company Method and system for conducting image processing from a mobile client device
US7330100B2 (en) * 2003-10-31 2008-02-12 Benq Corporation Mobile device and related method for displaying text message with background images
US7391445B2 (en) * 2004-03-31 2008-06-24 Magix Ag System and method of creating multilayered digital images in real time
US8872843B2 (en) * 2004-07-02 2014-10-28 Samsung Electronics Co., Ltd. Method for editing images in a mobile terminal
US20080247618A1 (en) * 2005-06-20 2008-10-09 Laine Andrew F Interactive diagnostic display system
US20090097723A1 (en) * 2007-10-15 2009-04-16 General Electric Company Method and system for visualizing registered images
US20150063778A1 (en) * 2013-09-04 2015-03-05 Samsung Electronics Co., Ltd. Method for processing an image and electronic device thereof

Also Published As

Publication number Publication date
EP1612736A2 (en) 2006-01-04
EP1612736A3 (en) 2011-03-09
US8872843B2 (en) 2014-10-28
US20060001758A1 (en) 2006-01-05
EP1612736B1 (en) 2018-09-05

Similar Documents

Publication Publication Date Title
US8872843B2 (en) Method for editing images in a mobile terminal
US9661265B2 (en) Method of conveying emotion in video telephone mode and wireless terminal implementing the same
US20070070181A1 (en) Method and apparatus for controlling image in wireless terminal
US20070075969A1 (en) Method for controlling display of image according to movement of mobile terminal
US20050153746A1 (en) Mobile terminal capable of editing images and image editing method using same
JP2002354436A (en) Video telephone apparatus
KR100606076B1 (en) Method for controlling image in wireless terminal
US7486821B2 (en) Method for recognizing characters in a portable terminal having an image input unit
US20060262142A1 (en) Method for displaying special effects in image data and a portable terminal implementing the same
US9477688B2 (en) Method for searching for a phone number in a wireless terminal
US7606432B2 (en) Apparatus and method for providing thumbnail image data on a mobile terminal
US20070044021A1 (en) Method for performing presentation in video telephone mode and wireless terminal implementing the same
KR100640501B1 (en) Method for displaying picture stored in mobile communication terminal
KR100438540B1 (en) Image transmitting/receiving method and system for mobile communication terminal equipment
US20050280731A1 (en) Apparatus and method for displaying images in a portable terminal comprising a camera and two display units
KR100617736B1 (en) Method for zooming of picture in wireless terminal
EP1708442A1 (en) Method for transmitting a message with an attachment comprising a photograph in a wireless terminal
KR100547769B1 (en) How to send a phone number during a call on your mobile device
US20060244858A1 (en) Method and system for changing image state in wireless terminal
US20050135780A1 (en) Apparatus and method for displaying moving picture in a portable terminal
KR100630078B1 (en) Method for reorganizing image data in the mobile terminal
KR100630184B1 (en) Method for calling using message in wireless terminal
KR20040029751A (en) Method of compounding background picture and taken picture in wireless telephone having camera
KR20050053887A (en) Method for searching data in wireless terminal
KR101023301B1 (en) Method for sending and editing mp3 file in wireless terminal

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION