US20130159895A1 - Method and system for interactive cosmetic enhancements interface - Google Patents

Method and system for interactive cosmetic enhancements interface

Info

Publication number
US20130159895A1
US20130159895A1 (application US 13/709,750; also published as US 2013/0159895 A1)
Authority
US
United States
Prior art keywords
product
beauty
computing device
user
display screen
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US13/709,750
Inventor
Parham Aarabi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Modiface Inc
Original Assignee
Modiface Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Modiface Inc
Priority to US 13/709,750
Assigned to MODIFACE INC. (assignment of assignors interest; see document for details). Assignor: AARABI, PARHAM
Publication of US20130159895A1
Priority claimed by US 16/141,245 (now US 10,956,009 B2)
Current legal status: Abandoned

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0484: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F 3/04845: Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, for image manipulation, e.g. dragging, rotation, expansion or change of colour
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06Q: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES; SYSTEMS OR METHODS SPECIALLY ADAPTED FOR ADMINISTRATIVE, COMMERCIAL, FINANCIAL, MANAGERIAL OR SUPERVISORY PURPOSES, NOT OTHERWISE PROVIDED FOR
    • G06Q 30/00: Commerce

Abstract

Provided is a method and system of providing a cosmetics enhancement interface. The method comprises showing, at the display screen of a computing device having a memory and a processor: a digital photograph including facial features; an interactive dialog portion reflecting a conversational input received and a subsequent response provided thereto from the computing device; and a product display portion; receiving an inquiry, as reflected in the interactive dialog portion, related to a cosmetic product for application onto a selected facial feature; receiving a selection of the cosmetic product based on a matching to the selected facial feature according to a predefined rule; displaying, at the product display portion, a product representation associated with the selected cosmetic product; receiving an update request; and updating the digital photograph showing a modification to the facial feature on the display screen by simulating application of the selected cosmetic product thereon.

Description

  • This Application claims priority to U.S. Provisional Patent Application No. 61/630,556 filed Dec. 15, 2011, and incorporates by reference the disclosure of said Application No. 61/630,556 in its entirety.
  • FIELD
  • The present disclosure relates generally to a system and method for interactive cosmetics enhancements interface, providing for a user to browse, virtually “try on”, find and purchase beauty products, among other aspects.
  • BACKGROUND
  • Digital detection of facial features in an accessed digital photograph, as well as modification of the photograph using various facial visual effects, is finding its way into a variety of Internet applications.
  • Trying on and buying beauty products can be a significant chore when it involves travelling to a store or boutique and then typically trying several beauty products or articles before finding a satisfactory fit in accordance with subjective tastes and prevailing trends. Personally travelling to the store or boutique, and taking the time to try several products there, can be a significant inconvenience to shoppers. A streamlined process that saves shoppers time and also gives them the benefit of expert opinions, matching beauty product recommendations to personal preferences and other unique situational context, may advantageously benefit the on-line Internet shopper, enabling them to “try on” any beauty article or product virtually, via simulation on a personal digital photograph, prior to making a purchase.
  • SUMMARY OF THE INVENTION
  • Provided is a method, executed in a computing device having a display screen and a processor, of providing a cosmetics enhancement interface comprising showing, at the display screen: a digital photograph including a plurality of facial features; an interactive dialog portion reflecting a conversational input received and a subsequent response provided thereto from the computing device; and a product display portion; receiving an inquiry, as reflected in the interactive dialog portion, related to a cosmetic product for application onto at least one facial feature of the plurality of facial features; receiving a selection of the cosmetic product based on a matching to the at least one facial feature according to a predefined rule; displaying, at the product display portion, a product representation associated with the selected cosmetic product; receiving an update request; and updating the digital photograph showing a modification to the at least one facial feature on the display screen by simulating application of the selected cosmetic product thereon.
  • In an embodiment the conversational input comprises an audible voice input.
  • In another embodiment the conversational input comprises a text input.
  • In one variation of the method, the at least one facial feature consists of one of a nose, at least one lip of a mouth, an eyebrow, an eyelid, a facial cheek, hair and an ear.
  • In another variation, the cosmetic product consists of one of lipstick, eye shadow, hair color, blush makeup, a cosmetic eye lash, and an article of jewellery.
  • In a further embodiment, the cosmetics enhancement interface comprises a purchasing tool portion to consummate an e-commerce transaction.
  • In yet another embodiment, the cosmetics enhancement interface further comprises a product advertisement section showing offers for purchase.
  • In one variation, the computing device consists of one of a desktop computer, a portable computer, a mobile wireless smartphone, and a kiosk.
  • The computing device is communicatively coupled to a beauty products database in another variation.
  • Further to the method provided, the predefined rule for the matching is at least partly based on an event type specified in the conversational input.
  • In an embodiment, the computing device further includes a memory storing a plurality of earlier conversational inputs of a pre-identified user. The earlier conversational inputs may be used to create a beauty preferences profile of the pre-identified user.
  • In another embodiment, the product representation includes a plurality of color options for selection of at least one option therefrom.
  • In yet another variation, the product representation includes a prioritized list of product options for selection of at least one product option therefrom. The product option selections may be used to create a beauty preferences profile of a pre-identified user.
  • In a further embodiment, the predefined rule for the matching is at least partly based on the beauty preferences profile of the pre-identified user.
  • In yet another embodiment, the matching may be performed in the processor of the computing device.
  • Yet further, the cosmetics enhancement interface further may comprise a video display portion.
  • In yet another embodiment, the cosmetics enhancement interface may further comprise a plurality of hyperlinks to access websites associated with respective cosmetic products.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments will now be described by way of example only, with reference to the following drawings in which:
  • FIG. 1 shows an interactive interface according to an embodiment of the invention;
  • FIG. 2 shows the interactive interface displaying an embodiment of the interactive conversations with a user;
  • FIG. 3 shows the interactive interface displaying an embodiment of the interactive conversations involving a situational context involving an event specified by user;
  • FIG. 4 shows the interactive interface displaying an embodiment of the interactive conversations involving a situational context involving a personal article specified by a user for a style-based matching;
  • FIG. 5 shows the interactive interface displaying an embodiment of the interactive conversations involving an instructional video related to a product recommendation; and
  • FIG. 6 shows the interactive interface displaying an embodiment of the interactive conversations involving a purchasing tool for entering into and consummating an ecommerce transaction related to a product recommendation.
  • DETAILED DESCRIPTION
  • This description pertains to a system for assisting a beauty consumer (i.e., a “user”) to browse, explore, virtually try on, and/or purchase beauty products, wherein the user starts by asking a question or making a request related to her beauty interests or needs. Then, based on the user's initial question or request, the system searches the Internet and its own internal database for relevant information about the product and displays the results to the user. The results could be in the form of a before-and-after photo, a representative image, a video, a set of links, a scrolling list of entries, or any other mechanism for displaying visual search results to the user. The user can then virtually try the beauty products on her digital photo, or ask more questions via an interactive conversation with the system. The user may then purchase the selected beauty products through an e-commerce website, mobile site, or application.
  • FIG. 1 shows an interactive interface according to an embodiment of the invention.
  • A system is designed for assisting a user to browse, explore, virtually try on, and find the best beauty products. FIG. 1 shows an example of the user interface for the system. The large menu at the top of FIG. 1 is the Virtual Makeover tool, where the effect of applying different makeup products is simulated. The white box at the bottom of FIG. 1 is the conversation tool provided to the user, which is intended to let the user have a conversation with the system about her beauty needs.
  • The virtual makeover tool allows the user to visualize any products or effects on her own digital photo. This process is completely automatic: the only interactions the user has with the virtual makeover system are providing it with a photo and selecting different beauty products or effects. As soon as the user selects a product or effect, her photo is automatically updated accordingly.
  • The conversational tool allows the user to have conversations with the system and get directions. It can work with either voice or text input from the user. In either case, the user's input sentences are detected and analyzed. The system then consults a large database of information on beauty products and their features, and provides the user with responses and guidelines, for example as sketched below.
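  • By way of illustration only, the following minimal sketch shows one way such an analysis-and-lookup step could be organized. The regular-expression parsing, the toy product records, and all names are assumptions for illustration rather than the actual implementation.

```python
import re

# Toy, illustrative product records; not taken from the patent.
PRODUCT_DB = [
    {"name": "Sky Shimmer", "category": "eyeshadow", "best_for_eyes": "blue"},
    {"name": "Copper Dust", "category": "eyeshadow", "best_for_eyes": "brown"},
    {"name": "Ruby Night",  "category": "lipstick"},
]

def analyze_request(text):
    """Very rough intent extraction: find a product category and an eye color."""
    normalized = text.lower().replace("eye shadow", "eyeshadow")
    category = next((c for c in ("eyeshadow", "lipstick", "blush") if c in normalized), None)
    eyes = re.search(r"(blue|green|brown|hazel)\s+eyes", normalized)
    return {"category": category, "eye_color": eyes.group(1) if eyes else None}

def respond(text):
    intent = analyze_request(text)
    hits = [p for p in PRODUCT_DB if p["category"] == intent["category"]]
    if intent["eye_color"]:
        # Put products tagged for the stated eye color first.
        hits.sort(key=lambda p: p.get("best_for_eyes") != intent["eye_color"])
    reply = "Okay. I can help you with that. I just sorted the palette for you."
    return reply, hits

print(respond("I have blue eyes and I am looking for a beautiful eye shadow for my eyes."))
```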
  • A user starts by providing a digital photo of herself to the system, as shown in FIG. 1. This step can be done in different ways depending on the system platform, which may be web, mobile, tablet, or kiosk. The photo is needed because the system visualizes the beauty products by automatically simulating the effect of applying the makeup products to the user's face.
  • The user then proceeds by starting a conversation with the system, related to her beauty interests or needs. For example, the user can start with the request:
  • I have blue eyes and I am looking for a beautiful eye shadow for my eyes.
  • Then, the system will respond back to the user with:
  • Okay. I can help you find an Eyeshadow. I just sorted the palette for you.
  • This is shown in FIG. 1. The virtual makeover tool processes the user's digital photo and finds the specific characteristics of the user's eyes. Then, based on those features and the eye color mentioned by the user, the system automatically determines which eye shadows best match the user's eyes. Finally, the system sorts the palette of eye shadows for the user and gives her the option to select and try any of them. As soon as the user selects one of the suggested eyeshadow colors, an eyeshadow of that color is immediately applied to her digital photo in the virtual makeover tool, as shown in FIG. 1. This way, the user can try as many colors as she wants and select the one she is interested in.
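  • One purely illustrative way to perform such color-based sorting is sketched below; the RGB distance metric, the "complement" offset, and the sample palette are assumptions, since the patent does not specify a matching formula.

```python
import math

def color_distance(c1, c2):
    """Euclidean distance between two RGB triples."""
    return math.dist(c1, c2)

def sort_palette(palette, detected_eye_rgb, target_offset=(40, 20, -10)):
    # A naive "complement" target derived from the detected eye color; a real
    # system would likely use a perceptual color space and expert-defined rules.
    target = tuple(min(255, max(0, c + o)) for c, o in zip(detected_eye_rgb, target_offset))
    return sorted(palette, key=lambda shade: color_distance(shade["rgb"], target))

palette = [
    {"name": "Slate Blue",  "rgb": (106, 90, 205)},
    {"name": "Warm Bronze", "rgb": (205, 127, 50)},
    {"name": "Soft Taupe",  "rgb": (139, 133, 137)},
]
blue_eyes = (70, 130, 180)  # e.g. an average iris color measured from the photo
print([s["name"] for s in sort_palette(palette, blue_eyes)])
```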
  • The system is not limited to requests and questions only. The user can also describe a situation to the system, and the system will guide her based on that situation. An example of this is shown in FIG. 2. In this example, the user has explained her situation to the system as:
  • I am going to a party tonight, and I was wondering what color is the best for my lipstick.
  • The system has then reacted with the response:
  • For a party, I would recommend darker and more vibrant colors. I have sorted the lipstick products for you.
  • The lipstick products are then sorted accordingly, as shown in FIG. 2. In this case, the system does not merely act as a simple automated chatting tool; it also has the ability to understand and analyze the exact beauty needs of the user and provide her with a professional recommendation via the interactive conversation tool. FIG. 2 further shows the interactive interface displaying an embodiment of the interactive conversations with a user.
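  • The following sketch illustrates one way a predefined rule keyed on the event type could re-rank products; the rule table and product attributes are invented for this example and are not taken from the patent.

```python
# Hypothetical mapping from event type to preferred product attributes.
EVENT_RULES = {
    "party":   {"tone": "dark", "saturation": "vibrant"},
    "wedding": {"tone": "soft", "saturation": "muted"},
    "office":  {"tone": "nude", "saturation": "muted"},
}

def sort_by_event(products, event_type):
    rule = EVENT_RULES.get(event_type, {})
    def score(product):
        # Products satisfying more of the rule's attributes rank earlier.
        return -sum(product.get(attr) == value for attr, value in rule.items())
    return sorted(products, key=score)

lipsticks = [
    {"name": "Petal Pink", "tone": "soft", "saturation": "muted"},
    {"name": "Ruby Night", "tone": "dark", "saturation": "vibrant"},
]
print([p["name"] for p in sort_by_event(lipsticks, "party")])  # Ruby Night first
```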
  • FIG. 3 shows the interactive interface displaying an embodiment of the interactive conversations involving a situational context involving an event specified by user.
  • In some situations, the system can provide a much better response if it knows more details about the user's situation. In such cases, the system asks the user for more information. An example of this is shown in FIG. 3. The user has described her situation as:
  • I am going to a wedding. Could you help me choose a good blush?
  • The system then asks the following question in order to have more information about the preferences of the user:
  • Sure, what is the color of your dress for the wedding?
  • The user's answer could be something like:
  • It is going to be purple.
  • Now the system knows the exact preferences of the user and can narrow down the list of beauty products for this specific user. So the system responds with:
  • Perfect, I have sorted the blush products that best match to your dress color for you.
  • And it sorts the colors of the blush products that best match the color of the user's dress, as in FIG. 4, which shows the interactive interface displaying an embodiment of the interactive conversations involving a situational context with a personal article specified by a user for style-based matching.
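  • A minimal sketch of this follow-up-question behaviour is shown below; the slot-filling approach and the hard-coded check are assumptions for illustration, not necessarily the patent's dialog engine.

```python
def handle_request(request, follow_up_answer=None):
    # If the matching rule needs a detail that is missing (here, the dress
    # color), ask a follow-up question before sorting products.
    needs_dress_color = "wedding" in request.lower() and "blush" in request.lower()
    if needs_dress_color and follow_up_answer is None:
        return "Sure, what is the color of your dress for the wedding?"
    dress_color = (follow_up_answer or "").lower()
    # A full system would now run a color-matching step like the palette
    # sorting sketched earlier, keyed on dress_color.
    return f"Perfect, I have sorted the blush products that best match your {dress_color} dress."

print(handle_request("I am going to a wedding. Could you help me choose a good blush?"))
print(handle_request("I am going to a wedding. Could you help me choose a good blush?", "purple"))
```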
  • FIG. 5 shows the interactive interface displaying an embodiment of the interactive conversations involving an instructional video related to a product recommendation. A component in the system allows browsing the web based on the user's question and presenting the results to the user. Therefore, information about beauty products available on the web is used by the system in addition to the system's own database of beauty products. An example of this component is shown in FIG. 5, where the user has asked the question:
  • How do I apply a smoky eye effect? Could you help me with that?
  • The system has then provided the response:
  • Sure, here is a video that might be helpful.
  • And it provides a video from the specific brand for the user to watch.
  • FIG. 6 shows the interactive interface displaying an embodiment of the interactive conversations involving a purchasing tool for entering into and consummating an e-commerce transaction related to a product recommendation.
  • The disclosure herein is not limited to the interactive conversations provided in the examples above; it includes all kinds of questions and conversations about the features, pricing, usage, and all other details of the beauty products.
  • A component in the interactive conversational system keeps track of the whole conversation as well as the products already selected by the user, so the details of the conversation and the makeup products the user has tried and selected are all stored. This information is considered when providing responses to the user in subsequent steps. Therefore, when the system is about to respond to the user's question, it not only uses its own large database of beauty products and their features, but also the history of the conversation it has had with the current user. For example, if the user has selected a green blush for her face and is now looking for a lipstick, the system can take this into account when recommending a lipstick color.
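  • One possible sketch of such a history component follows; the Session class, the product fields, and the pairing rule are illustrative assumptions only.

```python
class Session:
    """Keeps the transcript and the products tried or selected so far."""
    def __init__(self):
        self.transcript = []          # (speaker, text) pairs
        self.selected_products = []   # products already tried or selected

    def record(self, speaker, text):
        self.transcript.append((speaker, text))

    def select(self, product):
        self.selected_products.append(product)

def recommend_lipstick(session, lipsticks):
    # Toy rule: prefer lipsticks flagged as pairing with an already-selected blush.
    blushes = [p for p in session.selected_products if p["category"] == "blush"]
    if blushes:
        shade = blushes[-1]["shade_family"]
        lipsticks = sorted(lipsticks, key=lambda l: shade not in l.get("pairs_with", []))
    return lipsticks

session = Session()
session.select({"category": "blush", "name": "Emerald Glow", "shade_family": "green"})
print(recommend_lipstick(session, [
    {"name": "Classic Red", "pairs_with": ["pink"]},
    {"name": "Berry Plum",  "pairs_with": ["green", "purple"]},
]))  # Berry Plum is listed first
```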
  • A component of the system collects the overall results of the conversations with users. These results include the details of the conversations and the details of the products tried and selected by the users. The system then uses these overall results to update the large database of beauty products and their features that it draws on when communicating with users.
  • The updating part of the system also includes a component that keeps track of all the beauty products on the market. This component updates the system's database of beauty products according to the products currently available, so the system is always synchronized with the most up-to-date beauty products.
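  • As a rough, assumed sketch, the update component could reconcile aggregated conversation outcomes and a current market catalog against the product database as follows; the data shapes are invented for illustration.

```python
def update_product_db(db, conversation_results, market_catalog):
    # Fold in popularity signals gathered from past conversations.
    for result in conversation_results:
        for product_id in result["selected_product_ids"]:
            if product_id in db:
                db[product_id]["times_selected"] = db[product_id].get("times_selected", 0) + 1
    # Add newly launched products and drop discontinued ones so the database
    # stays synchronized with what is currently on the market.
    for product_id, info in market_catalog.items():
        db.setdefault(product_id, dict(info))
    for product_id in list(db):
        if product_id not in market_catalog:
            del db[product_id]
    return db
```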
  • The system collects all the beauty products selected by the user during a conversation and displays them on the interface as shown in FIG. 6. The conversation continues until the user is satisfied with the collection of products visualized on her photo. At this point, a component in the system makes it possible for the user to purchase them through an e-commerce web or mobile site.
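  • A small, assumed sketch of that purchasing hand-off is shown below: products collected during the conversation become a cart passed to an e-commerce site. The URL and field names are placeholders, not real endpoints.

```python
from urllib.parse import urlencode

def build_checkout_url(selected_products, base_url="https://shop.example.com/cart"):
    # Encode the selected product SKUs as query parameters for the e-commerce site.
    query = urlencode([("sku", p["sku"]) for p in selected_products])
    return f"{base_url}?{query}"

cart = [{"sku": "ES-001", "name": "Sky Shimmer"}, {"sku": "LP-014", "name": "Ruby Night"}]
print(build_checkout_url(cart))  # https://shop.example.com/cart?sku=ES-001&sku=LP-014
```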
  • The disclosure herein is not limited to facial makeup products only. Rather, it pertains to other kinds of beauty products, such as, but not limited to, hair color and dressing products. For all other kinds of beauty products, the user can have conversations with the system about the specific product and get directions for application or use.
  • Although the invention has been described with reference to specific exemplary embodiments in the disclosure herein, varying modifications thereof will be apparent to those skilled in the art without departing from the scope of the invention as defined by the appended claims.

Claims (20)

What is claimed is:
1. A method, executed in a computing device having a display screen and a processor, of providing a cosmetics enhancement interface comprising:
showing, at the display screen:
a digital photograph including a plurality of facial features;
an interactive dialog portion reflecting a conversational input received and a subsequent response provided thereto from the computing device; and
a product display portion;
receiving an inquiry, as reflected in the interactive dialog portion, related to a cosmetic product for application onto at least one facial feature of the plurality of facial features;
receiving a selection of the cosmetic product based on a matching to the at least one facial feature according to a predefined rule;
displaying, at the product display portion, a product representation associated with the selected cosmetic product;
receiving an update request; and
updating the digital photograph showing a modification to the at least one facial feature on the display screen by simulating application of the selected cosmetic product thereon.
2. The method of claim 1 wherein the conversational input comprises an audible voice input.
3. The method of claim 1 wherein the conversational input comprises a text input.
4. The method of claim 1 wherein the at least one facial feature consists of one of a nose, at least a lip color, an eyebrow, an eyelid, a facial cheek, a skin color, a skin texture, a hair color and an ear.
5. The method of claim 1 wherein the cosmetic product consists of one of lipstick, eye shadow, hair color, blush makeup, a cosmetic eye lash, and an article of jewellery.
6. The method of claim 1 wherein the cosmetics enhancement interface further comprises a purchasing tool portion to enter into and consummate an e-commerce transaction.
7. The method of claim 1 wherein the cosmetics enhancement interface further comprises a product advertisement section showing offers for purchase.
8. The method of claim 1 wherein the computing device consists of one of a desktop computer, a portable computer, a mobile wireless smartphone, and a kiosk.
9. The method of claim 1 wherein the computing device is communicatively coupled to a beauty products database.
10. The method of claim 1 wherein the predefined rule for the matching is at least partly based on an event type specified in the conversational input.
11. The method of claim 1 wherein the computing device further includes a memory storing a plurality of earlier conversational inputs of a pre-identified user.
12. The method of claim 11 wherein the plurality of earlier conversational inputs is used to create a beauty preferences profile of the pre-identified user.
13. The method of claim 1 wherein the product representation includes a plurality of color options for selection of at least one option therefrom.
14. The method of claim 1 wherein the product representation includes a prioritized list of product options for selection of at least one product option therefrom.
15. The method of claim 14 wherein a plurality of the product option selections is used to create a beauty preferences profile of a pre-identified user.
16. The method of claim 15 wherein the predefined rule for the matching is at least partly based on the beauty preferences profile of the pre-identified user.
17. The method of claim 15 wherein the matching is performed in the processor of the computing device.
18. The method of claim 1 wherein the cosmetics enhancement interface further comprises a video display portion.
19. The method of claim 1 wherein the cosmetics enhancement interface further comprises a plurality of hyperlinks to access websites associated with respective cosmetic products.
20. A method, executed in a computing device having a display screen and a processor, of providing a virtual beauty enhancement comprising:
showing, at the display screen:
a digital photograph including a plurality of facial features;
an interactive dialog portion reflecting a conversational input received and a subsequent response provided thereto from the computing device; and
a beauty effect display portion;
receiving an inquiry related to a selected beauty effect for application onto at least one facial feature of the plurality of facial features;
displaying, at the beauty effect display portion, a product representation associated with the selected beauty effect; and
updating the digital photograph showing a modification to the at least one facial feature on the display screen by simulating application thereon of the selected beauty effect.
US13/709,750 2011-12-15 2012-12-10 Method and system for interactive cosmetic enhancements interface Abandoned US20130159895A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US13/709,750 US20130159895A1 (en) 2011-12-15 2012-12-10 Method and system for interactive cosmetic enhancements interface
US16/141,245 US10956009B2 (en) 2011-12-15 2018-09-25 Method and system for interactive cosmetic enhancements interface

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201161630556P 2011-12-15 2011-12-15
US13/709,750 US20130159895A1 (en) 2011-12-15 2012-12-10 Method and system for interactive cosmetic enhancements interface

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US16/141,245 Continuation US10956009B2 (en) 2011-12-15 2018-09-25 Method and system for interactive cosmetic enhancements interface

Publications (1)

Publication Number Publication Date
US20130159895A1 (en) 2013-06-20

Family

ID=48611557

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/709,750 Abandoned US20130159895A1 (en) 2011-12-15 2012-12-10 Method and system for interactive cosmetic enhancements interface
US16/141,245 Active 2033-02-27 US10956009B2 (en) 2011-12-15 2018-09-25 Method and system for interactive cosmetic enhancements interface

Family Applications After (1)

Application Number Title Priority Date Filing Date
US16/141,245 Active 2033-02-27 US10956009B2 (en) 2011-12-15 2018-09-25 Method and system for interactive cosmetic enhancements interface

Country Status (1)

Country Link
US (2) US20130159895A1 (en)

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104408415A (en) * 2014-11-24 2015-03-11 苏州福丰科技有限公司 Detection method based on human ear complexion
CN104461298A (en) * 2015-01-19 2015-03-25 中国人民解放军第三〇七医院 Screen capturing method and screen capturing tool
US9460462B1 (en) * 2012-05-22 2016-10-04 Image Metrics Limited Monetization using video-based simulation of cosmetic products
CN106649465A (en) * 2016-09-26 2017-05-10 珠海格力电器股份有限公司 Recommendation and acquisition method and device of cosmetic information
CN107015745A (en) * 2017-05-19 2017-08-04 广东小天才科技有限公司 Screen operating method, device, terminal device and computer-readable recording medium
DE102016204983A1 (en) 2016-03-24 2017-09-28 Bayerische Motoren Werke Aktiengesellschaft Arrangement, means of transport and method for the cosmetic assistance of a user
CN107463936A (en) * 2016-06-02 2017-12-12 宗经投资股份有限公司 Automatic face makeup method
CN107463373A (en) * 2017-07-10 2017-12-12 北京小米移动软件有限公司 The management method and device of picture U.S. face method, good friend's face value
CN109407912A (en) * 2017-08-16 2019-03-01 丽宝大数据股份有限公司 Electronic device and its offer examination adornment information approach
WO2019220208A1 (en) * 2018-05-16 2019-11-21 Matthewman Richard John Systems and methods for providing a style recommendation
US10939742B2 (en) 2017-07-13 2021-03-09 Shiseido Company, Limited Systems and methods for virtual facial makeup removal and simulation, fast facial detection and landmark tracking, reduction in input video lag and shaking, and a method for recommending makeup
US10956009B2 (en) 2011-12-15 2021-03-23 L'oreal Method and system for interactive cosmetic enhancements interface
US11257142B2 (en) 2018-09-19 2022-02-22 Perfect Mobile Corp. Systems and methods for virtual application of cosmetic products based on facial identification and corresponding makeup information
US11587358B2 (en) 2020-03-26 2023-02-21 Panasonic Avionics Corporation Managing content on in-flight entertainment platforms

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI708183B (en) * 2019-03-29 2020-10-21 麗寶大數據股份有限公司 Personalized makeup information recommendation method
US10866716B2 (en) * 2019-04-04 2020-12-15 Wheesearch, Inc. System and method for providing highly personalized information regarding products and services
EP3965611A1 (en) * 2019-05-06 2022-03-16 CareOS Smart mirror system and methods of use thereof
FR3110737B1 (en) 2020-05-20 2023-06-02 Sagad Intelligent system allowing a skin test then a formulation and a manufacturing of custom-made cosmetics.
US20210409614A1 (en) * 2020-06-29 2021-12-30 Snap Inc. Generating augmented reality content based on user-selected product data

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6293284B1 (en) * 1999-07-07 2001-09-25 Division Of Conopco, Inc. Virtual makeover
US20030063794A1 (en) * 2001-10-01 2003-04-03 Gilles Rubinstenn Analysis using a three-dimensional facial image
US20050203724A1 (en) * 2000-06-27 2005-09-15 Rami Orpaz Make-up and fashion accessory display and marketing system and method
US20130300761A1 (en) * 2010-11-12 2013-11-14 Colormodules Inc. Method and system for color matching and color recommendation

Family Cites Families (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3912834B2 (en) * 1997-03-06 2007-05-09 有限会社開発顧問室 Face image correction method, makeup simulation method, makeup method, makeup support apparatus, and foundation transfer film
US5924426A (en) 1997-04-17 1999-07-20 Galazin; Norma Cosmetic personal color analysis method and kit using value scale, colors and charts
US6144938A (en) 1998-05-01 2000-11-07 Sun Microsystems, Inc. Voice user interface with personality
US6338349B1 (en) 1998-12-18 2002-01-15 L'oreal Method and system for providing customized color cosmetics
JP4396873B2 (en) 1999-10-01 2010-01-13 株式会社資生堂 How to choose lipstick or eye shadow
JP3444486B2 (en) 2000-01-26 2003-09-08 インターナショナル・ビジネス・マシーンズ・コーポレーション Automatic voice response system and method using voice recognition means
US7079158B2 (en) * 2000-08-31 2006-07-18 Beautyriot.Com, Inc. Virtual makeover system and method
GB2372864B (en) 2001-02-28 2005-09-07 Vox Generation Ltd Spoken language interface
US20030065552A1 (en) 2001-10-01 2003-04-03 Gilles Rubinstenn Interactive beauty analysis
US20030065526A1 (en) 2001-10-01 2003-04-03 Daniela Giacchetti Historical beauty record
US7437344B2 (en) * 2001-10-01 2008-10-14 L'oreal S.A. Use of artificial intelligence in providing beauty advice
US20040162724A1 (en) 2003-02-11 2004-08-19 Jeffrey Hill Management of conversations
WO2005122145A1 (en) 2004-06-08 2005-12-22 Metaphor Solutions, Inc. Speech recognition dialog management
US20060110417A1 (en) * 2004-11-24 2006-05-25 Lori Hamlin Beauty products and methods
US20060122834A1 (en) 2004-12-03 2006-06-08 Bennett Ian M Emotion detection device & method for use in distributed systems
US8185399B2 (en) 2005-01-05 2012-05-22 At&T Intellectual Property Ii, L.P. System and method of providing an automated data-collection in spoken dialog systems
US7930182B2 (en) 2005-03-15 2011-04-19 Nuance Communications, Inc. Computer-implemented tool for creation of speech application code and associated functional specification
US20060215824A1 (en) 2005-03-28 2006-09-28 David Mitby System and method for handling a voice prompted conversation
US20060217978A1 (en) 2005-03-28 2006-09-28 David Mitby System and method for handling information in a voice recognition automated conversation
US8204751B1 (en) 2006-03-03 2012-06-19 At&T Intellectual Property Ii, L.P. Relevance recognition for a human machine dialog system contextual question answering based on a normalization of the length of the user input
US8660319B2 (en) * 2006-05-05 2014-02-25 Parham Aarabi Method, system and computer program product for automatic and semi-automatic modification of digital images of faces
WO2007140609A1 (en) * 2006-06-06 2007-12-13 Moreideas Inc. Method and system for image and video analysis, enhancement and display for communication
TWI512693B (en) 2008-03-06 2015-12-11 Univ Nat Kaohsiung 1St Univ Sc Interactive conversation- learning system
US20090231356A1 (en) * 2008-03-17 2009-09-17 Photometria, Inc. Graphical user interface for selection of options from option groups and methods relating to same
GB2458388A (en) * 2008-03-21 2009-09-23 Dressbot Inc A collaborative online shopping environment, virtual mall, store, etc. in which payments may be shared, products recommended and users modelled.
US7962578B2 (en) 2008-05-21 2011-06-14 The Delfin Project, Inc. Management system for a conversational system
US20090327182A1 (en) * 2008-06-27 2009-12-31 International Business Machines Corporation Calendar based personalized recommendations
EP2538841A2 (en) 2010-02-26 2013-01-02 Myskin, Inc. Analytic methods of tissue evaluation
US20130042261A1 (en) 2011-08-10 2013-02-14 Bank Of America Electronic video media e-wallet application
US20130159895A1 (en) 2011-12-15 2013-06-20 Parham Aarabi Method and system for interactive cosmetic enhancements interface

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6293284B1 (en) * 1999-07-07 2001-09-25 Division Of Conopco, Inc. Virtual makeover
US20050203724A1 (en) * 2000-06-27 2005-09-15 Rami Orpaz Make-up and fashion accessory display and marketing system and method
US20030063794A1 (en) * 2001-10-01 2003-04-03 Gilles Rubinstenn Analysis using a three-dimensional facial image
US20130300761A1 (en) * 2010-11-12 2013-11-14 Colormodules Inc. Method and system for color matching and color recommendation

Cited By (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10956009B2 (en) 2011-12-15 2021-03-23 L'oreal Method and system for interactive cosmetic enhancements interface
US9460462B1 (en) * 2012-05-22 2016-10-04 Image Metrics Limited Monetization using video-based simulation of cosmetic products
CN104408415A (en) * 2014-11-24 2015-03-11 苏州福丰科技有限公司 Detection method based on human ear complexion
CN104461298A (en) * 2015-01-19 2015-03-25 中国人民解放军第三〇七医院 Screen capturing method and screen capturing tool
DE102016204983A1 (en) 2016-03-24 2017-09-28 Bayerische Motoren Werke Aktiengesellschaft Arrangement, means of transport and method for the cosmetic assistance of a user
CN107463936A (en) * 2016-06-02 2017-12-12 宗经投资股份有限公司 Automatic face makeup method
CN106649465A (en) * 2016-09-26 2017-05-10 珠海格力电器股份有限公司 Recommendation and acquisition method and device of cosmetic information
CN107015745A (en) * 2017-05-19 2017-08-04 广东小天才科技有限公司 Screen operating method, device, terminal device and computer-readable recording medium
CN107463373A (en) * 2017-07-10 2017-12-12 北京小米移动软件有限公司 The management method and device of picture U.S. face method, good friend's face value
US11000107B2 (en) 2017-07-13 2021-05-11 Shiseido Company, Limited Systems and methods for virtual facial makeup removal and simulation, fast facial detection and landmark tracking, reduction in input video lag and shaking, and method for recommending makeup
US10939742B2 (en) 2017-07-13 2021-03-09 Shiseido Company, Limited Systems and methods for virtual facial makeup removal and simulation, fast facial detection and landmark tracking, reduction in input video lag and shaking, and a method for recommending makeup
US11039675B2 (en) 2017-07-13 2021-06-22 Shiseido Company, Limited Systems and methods for virtual facial makeup removal and simulation, fast facial detection and landmark tracking, reduction in input video lag and shaking, and method for recommending makeup
US11344102B2 (en) 2017-07-13 2022-05-31 Shiseido Company, Limited Systems and methods for virtual facial makeup removal and simulation, fast facial detection and landmark tracking, reduction in input video lag and shaking, and a method for recommending makeup
CN109407912A (en) * 2017-08-16 2019-03-01 丽宝大数据股份有限公司 Electronic device and its offer examination adornment information approach
WO2019220208A1 (en) * 2018-05-16 2019-11-21 Matthewman Richard John Systems and methods for providing a style recommendation
US11257142B2 (en) 2018-09-19 2022-02-22 Perfect Mobile Corp. Systems and methods for virtual application of cosmetic products based on facial identification and corresponding makeup information
US11682067B2 (en) 2018-09-19 2023-06-20 Perfect Mobile Corp. Systems and methods for virtual application of cosmetic products based on facial identification and corresponding makeup information
US11587358B2 (en) 2020-03-26 2023-02-21 Panasonic Avionics Corporation Managing content on in-flight entertainment platforms

Also Published As

Publication number Publication date
US10956009B2 (en) 2021-03-23
US20190026013A1 (en) 2019-01-24

Similar Documents

Publication Publication Date Title
US10956009B2 (en) Method and system for interactive cosmetic enhancements interface
US11281366B2 (en) System and method for providing highly personalized information regarding products and services
US10559102B2 (en) Makeup simulation assistance apparatus, makeup simulation assistance method, and non-transitory computer-readable recording medium storing makeup simulation assistance program
Koetz Managing the customer experience: a beauty retailer deploys all tactics
US20080235047A1 (en) Method and system for ordering customized cosmetic contact lenses
EP3485762A1 (en) Makeup assistance device and makeup assistance method
US20130111337A1 (en) One-click makeover
US11861672B2 (en) Method, system, and non-transitory computer-readable medium for a digital personal care platform
US11776187B2 (en) Digital makeup artist
CN108376352A (en) The cosmetics of color with suitable user provide device and method
US11961169B2 (en) Digital makeup artist
US9449025B1 (en) Determining similarity using human generated data
JP2020160641A (en) Virtual person selection device, virtual person selection system and program
JP2005044195A (en) Makeup support system, method, server and program to be executed by computer
KR102367220B1 (en) Product order service providing method and system
JP2002221896A (en) Cosmetic simulation system
Lee et al. Effects of low-price cosmetics experience factors on satisfaction with test products
JP2023121081A (en) Information providing device, information providing method, and information providing program
KR20180092159A (en) Method and apparatus for searching make-up product
Gs et al. An Application of Online Branding Design with Customisation, Culture and Communities Strategy: A case studies on six online store providers
Murni The Impact of TikTok Technology Transformation in the Digitalization Era Using SmartPLS
CN116823402A (en) Virtual fitting processing method, system, equipment and storage medium
KR20230075953A (en) Method and device for providing nail art design
KR20230119940A (en) On-demand platform service method that provides the needs of consumers to suppliers
CN113888274A (en) Online shopping recommendation method and online shopping recommendation device

Legal Events

Date Code Title Description
AS Assignment

Owner name: MODIFACE INC., CANADA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:AARABI, PARHAM;REEL/FRAME:029438/0417

Effective date: 20121210

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION