US20010014173A1 - Color Region Based Recognition of Unidentified Objects - Google Patents

Color Region Based Recognition of Unidentified Objects


Publication number
US20010014173A1
Authority
US
United States
Prior art keywords
color
instructions
color region
color regions
storage medium
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/059,641
Other versions
US6393147B2 (en)
Inventor
Gunner D. Danneels
Ketan R. Sampat
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intel Corp
Original Assignee
Intel Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intel Corp
Priority to US09/059,641 (granted as US6393147B2)
Assigned to INTEL CORPORATION (assignment of assignors' interest; see document for details). Assignors: DANNEELS, GUNNER D.; SAMPAT, KETAN R.
Publication of US20010014173A1
Application granted
Publication of US6393147B2
Anticipated expiration
Legal status: Expired - Fee Related

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/40: Extraction of image or video features
    • G06V 10/56: Extraction of image or video features relating to colour

Definitions

  • FIG. 6 illustrates a hardware view of one embodiment of a computer system suitable for practicing the present invention, including the above described application.
  • computer system 600 includes processor 602, processor bus 606, high performance I/O bus 610 and standard I/O bus 620.
  • Processor bus 606 and high performance I/O bus 610 are bridged by host bridge 608, whereas I/O buses 610 and 620 are bridged by I/O bus bridge 612.
  • Coupled to processor bus 606 is cache 604.
  • Coupled to high performance I/O bus 610 are camera 611, system memory 614 and video memory 616, to which video display 618 is coupled.
  • Coupled to standard I/O bus 620 are disk drive 622, keyboard and pointing device 624, communication interface 626, and speakers 628.
  • disk drive 622 and system memory 614 are used to store permanent and working copies of the color region based object recognition tool of the present invention, as well as the color region characterizations of reference objects and the applications that use the color region based object recognition tool.
  • the permanent copies may be pre-loaded into disk drive 622 at the factory, loaded from distribution medium 632, or downloaded from a remote distribution source (not shown).
  • the constitutions of these elements are well known; any one of a number of implementations of these elements known in the art may be used to form computer system 600.

Abstract

A machine implemented method is disclosed. The method includes characterizing an object by color regions, and then identifying the object in accordance with at least the color region based characterization of the object. In one embodiment, the method further includes generating an output response, such as an audio response, in accordance with the identification result.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Invention [0001]
  • The present invention relates to the field of computer systems. In particular, the present invention relates to object recognition by computer systems. [0002]
  • 2. Background Information [0003]
  • As advances in microprocessors and other related technologies have continued to improve the price/performance of various electronic components in recent years, powerful multi-media personal computers (PCs) that were once within the exclusive realm of mainframe computers have become increasingly affordable to the average consumer. More and more homes and classrooms are now equipped with PCs for business, education, and/or entertainment purposes. [0004]
  • Numerous advances have also been made in the field of computer vision, i.e. the ability of computers to recognize people, objects, etc. However, perhaps because much of the original interest was motivated by security applications, the techniques known today are generally too computationally intensive (or unnecessarily elaborate) for use by classroom/home PCs for more casual applications such as education and/or entertainment. Thus, a less computationally intensive and yet sufficiently effective object recognition technique for casual applications is desired. [0005]
  • SUMMARY OF THE INVENTION
  • A machine implemented method is disclosed. The method includes characterizing an object by color regions, and then identifying the object in accordance with at least the color region based characterization of the object. [0006]
  • In one embodiment, the method further includes generating output responses, such as audio responses, in accordance with the identification result. [0007]
  • BRIEF DESCRIPTION OF DRAWINGS
  • The present invention will be described by way of exemplary embodiments, but not limitations, illustrated in the accompanying drawings in which like references denote similar elements, and in which: [0008]
  • FIG. 1 illustrates an overview of the present invention including the color region based object recognition tool of the present invention; [0009]
  • FIG. 2 is a flow chart illustrating one embodiment of the operational flow of the color based characterization portion of the object recognition tool; [0010]
  • FIG. 3 is a flow chart illustrating in further detail one embodiment of the step of subsetting an image of an object into color regions; [0011]
  • FIG. 4 is a flow chart illustrating one embodiment of the operational flow of the inference portion of the object recognition tool; [0012]
  • FIG. 5 illustrates an exemplary application of the present invention; and [0013]
  • FIG. 6 is a block diagram illustrating a hardware view of one embodiment of a computer suitable for use to practice the present invention. [0014]
  • DETAILED DESCRIPTION OF THE INVENTION
  • In the following description, various aspects of the present invention will be described. Those skilled in the art will also appreciate that the present invention may be practiced with only some or all aspects of the present invention. For purposes of explanation, specific numbers, materials and configurations are set forth in order to provide a thorough understanding of the present invention. However, it will also be apparent to one skilled in the art that the present invention may be practiced without the specific details. In other instances, well known features are omitted or simplified in order not to obscure the present invention. [0015]
  • Parts of the description will be presented in terms of operations performed by a computer system, using terms such as data, flags, bits, values, characters, strings, numbers and the like, consistent with the manner commonly employed by those skilled in the art to convey the substance of their work to others skilled in the art. As well understood by those skilled in the art, these quantities take the form of electrical, magnetic, or optical signals capable of being stored, transferred, combined, and otherwise manipulated through mechanical and electrical components of the computer system; and the term computer system includes general purpose as well as special purpose data processing machines, systems, and the like, that are standalone, adjunct or embedded. [0016]
  • Various operations will be described as multiple discrete steps in turn, in a manner that is most helpful in understanding the present invention; however, the order of description should not be construed to imply that these operations are necessarily order dependent, or that they must be performed in the order of their presentation. [0017]
  • Referring now to FIG. 1, wherein a block diagram illustrating one embodiment of the present invention is shown. As illustrated, object recognition tool 100 of the present invention includes color based characterization portion 102 and inference portion 104. As will be described in more detail below, color based characterization portion 102 characterizes an object, such as object 106, based on color regions of the object, and inference portion 104 in turn identifies the object in accordance with at least the color region based characterizations. In one application of the present invention, the identification result is provided to application 108, which in turn responds to the identification results. [0018]
  • Object 106 is intended to represent all physical items that are visually observable. These include, but are not limited to, physical items such as a sculpture, a painting, a desk, a table, a fork, a knife, a vase, a stuffed animal, a doll, a book, a page in a book, a flash card, and so forth. Application 108 is intended to represent a broad range of business, education and entertainment applications. The response to the identification result may be externalized, e.g. an audio response, internal only, e.g. changing certain state data, or both. [0019]
  • FIG. 2 illustrates one embodiment of the operational flow of color based characterization portion 102. As shown, for the illustrated embodiment, color based characterization portion 102 first generates digitized image data of object 106, e.g. in the form of a frame of video signals, step 202. In one embodiment, the digitized image data are generated as RGB pixel data. In an alternate embodiment, the digitized image data are generated as YUV pixel data instead. Next, color based characterization portion 102 transforms the pixel data from the RGB/YUV space to the HSI space, step 204. In one embodiment, like variant colors are also transformed into their primary colors to reduce the number of colors, e.g. like variants of the red colors (within a predetermined range of degrees in the H index) are all transformed into the red color. Then, color based characterization portion 102 subsets the image into regions in accordance with the pixels' transformed colors, step 206. [0020]
  • Step 202 may be performed using any one of a number of techniques known in the art, e.g. through a video camera and a capture card. Furthermore, the present invention may be practiced without performing step 204. That is, step 206 may be performed with the pixel data in RGB or YUV space, without transforming the pixel data into the HSI space or collapsing like variant colors into the primary colors. However, experience has shown that the HSI space appears to provide the most consistent results across a wide range of ambient conditions, and that collapsing like variant colors into the primary colors reduces the amount of processing without significantly sacrificing the ability to properly recognize an object. [0021]
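The patent does not give formulas for step 204 or for collapsing like variants. As a rough, non-authoritative sketch, the hue-based collapsing described above might look like the following, using Python's `colorsys` HSV conversion as a stand-in for the HSI space; the primary hue table, the `tolerance` parameter, and the function name are illustrative assumptions, not the patented method:

```python
import colorsys

# Illustrative primary hue anchors, in degrees (an assumption; the patent
# only says like variants within a range of H degrees collapse together).
PRIMARY_HUES = {"red": 0, "yellow": 60, "green": 120,
                "cyan": 180, "blue": 240, "magenta": 300}

def collapse_to_primary(r, g, b, tolerance=30):
    """Map an RGB pixel (0-255 channels) to the nearest primary color name
    if its hue lies within `tolerance` degrees of that primary's hue."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    if s < 0.2 or v < 0.1:
        return "other"          # near-gray / near-black pixels carry no hue
    hue_deg = h * 360.0
    for name, center in PRIMARY_HUES.items():
        # distance on the hue circle
        d = min(abs(hue_deg - center), 360.0 - abs(hue_deg - center))
        if d <= tolerance:
            return name
    return "other"
```

Working with a handful of primary labels rather than full-precision colors is what makes the later same-color comparisons (steps 310 and 314 of FIG. 3) cheap.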
  • FIG. 3 illustrates one embodiment of step 206 of FIG. 2. As shown, color based characterization portion 102 first selects the pixel at the lower left corner of a frame, step 302. Next, color based characterization portion 102 assigns the pixel to a new color region, step 304. Additionally, color based characterization portion 102 attributes the color of the pixel as the color of the new color region, attributes the coordinates of the pixel as the reference coordinates of the new color region, and initializes the size of the new color region to one pixel, also step 304. [0022]
  • Then, color based characterization portion 102 determines if there is at least another pixel to the right of the current pixel, step 306. If there is, color based characterization portion 102 selects the pixel immediately to the right, step 308. Upon selecting the pixel immediately to the right, color based characterization portion 102 determines if the selected pixel has the same color as the previous pixel, i.e. the pixel immediately to the left of the now selected pixel, step 310. If the determination is affirmative, color based characterization portion 102 assigns the selected pixel to the same color region as the pixel immediately to its left, and increments the size of the color region by one pixel, step 312; it then continues the process at step 306. On the other hand, if the determination is negative, color based characterization portion 102 determines if the selected pixel has the same color as the pixel immediately below it, step 314. If the determination is affirmative, color based characterization portion 102 assigns the pixel to the same color region as the pixel immediately below it, attributes the coordinates of the selected pixel as the reference coordinates of the color region instead, and increments the size of the color region by one pixel, step 316; it then continues the process at step 306. Otherwise, color based characterization portion 102 continues the process at step 304, i.e. assigning the selected pixel to a new color region, attributing the coordinates of the selected pixel as the reference coordinates of the new color region, and initializing the size of the new color region to one pixel. [0023]
  • Eventually, at step 306, color based characterization portion 102 determines there are no more pixels to the right. Color based characterization portion 102 then continues the process at step 318, wherein it determines if there are pixels above the last processed pixel. If the determination is affirmative, color based characterization portion 102 selects the left most pixel from the row of pixels immediately above, step 320. Upon doing so, color based characterization portion 102 continues the process at step 314 as described earlier. [0024]
  • Eventually, at step 318, color based characterization portion 102 determines there are no more pixels above either, i.e. all pixels of the entire frame have been processed. At such time, the process terminates. [0025]
  • In alternate embodiments, the pixels may be processed in orders other than the left to right and bottom to top manner described earlier. Furthermore, coordinates other than those of the top left pixel may be used as the reference coordinates of a color region, and other metrics may be employed to denote the size of a color region. [0026]
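The bottom-up, left-to-right scan of FIG. 3 can be sketched in a few lines. This is a non-authoritative illustration assuming the frame has already been reduced to per-pixel color labels (e.g. by the collapsing of step 204); the data layout and names are invented for the example:

```python
def subset_into_regions(frame):
    """Subset a frame into color regions per FIG. 3.

    `frame` is a 2D list of color labels, frame[y][x], with y=0 the bottom
    row.  Returns (regions, region_of): a dict mapping region id to its
    color, reference coordinates, and size, plus a grid of region ids.
    """
    height, width = len(frame), len(frame[0])
    region_of = [[None] * width for _ in range(height)]
    regions = {}
    next_id = 0
    for y in range(height):                # bottom to top (steps 318/320)
        for x in range(width):             # left to right (steps 306/308)
            color = frame[y][x]
            if x > 0 and frame[y][x - 1] == color:      # steps 310/312
                rid = region_of[y][x - 1]
            elif y > 0 and frame[y - 1][x] == color:    # steps 314/316
                rid = region_of[y - 1][x]
                regions[rid]["ref"] = (x, y)            # new reference coords
            else:                                       # step 304: new region
                rid = next_id
                next_id += 1
                regions[rid] = {"color": color, "ref": (x, y), "size": 0}
            region_of[y][x] = rid
            regions[rid]["size"] += 1      # size bookkeeping of steps 304/312/316
    return regions, region_of
```

Note that, as described, the single pass never merges two regions that first meet only in a later row (e.g. the two arms of a U shape), so such a shape yields more than one region of the same color.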
  • FIG. 4 illustrates one embodiment of the operational flow of inference portion 104. As shown, for the illustrated embodiment, inference portion 104 infers the identity of the characterized object by examining a number of color region characterized reference objects of known identities, one at a time. At step 402, a color region characterized reference object of known identity is selected. The color region characterized reference object is analyzed to determine if it contains at least the same number of color regions for each of the different colors as the characterized object whose identity is being determined, step 404. If it does not, the color region characterized reference object is rejected, step 412, and the process continues at step 414. For example, if the object whose identity is being determined is characterized as having two red color regions and one blue color region, then a color region characterized reference object must contain at least two red color regions and at least one blue color region, otherwise it is rejected. [0027]
  • Next, the color region characterized reference object is analyzed to determine if the color regions of each of the color groups having at least the same number of color regions have sizes that are at least as large as those of the color regions of the characterized object whose identity is being determined, step 406. If they do not, the color region characterized reference object is rejected, step 412, and the process continues at step 414. For example, if the object whose identity is being determined is characterized as having two red color regions of sizes s1 and s2 and one blue color region of size s3, then a color region characterized reference object must contain at least two red color regions with sizes that are at least s1 and s2, and at least one blue color region with a size that is at least s3, otherwise it is rejected. [0028]
  • Then, the color region characterized reference object is analyzed to determine if the color regions of interest have the same relative orientation to each other as the color regions of the characterized object whose identity is being determined, step 408. If the color region characterized reference object does not contain color regions with the same relative orientation to each other, it is rejected, step 412, and the process continues at step 414. In one embodiment, the color regions' relative orientation to each other is determined using the reference coordinates of the color regions. For example, if the object whose identity is being determined is characterized as having two red color regions of sizes s1 and s2 and one blue color region of size s3, occupying a substantially equilateral triangular orientation (in accordance with their reference coordinates), then a color region characterized reference object must contain at least two red color regions with sizes at least those of s1 and s2, and at least one blue color region with a size that is at least s3, also occupying a similar equilateral triangular orientation (i.e. within certain predetermined tolerance margins), otherwise it is rejected. In one embodiment, the predetermined tolerance margins are configurable at set up. In an alternate embodiment, the predetermined tolerance margins are user configurable during operation. [0029]
  • For the illustrated embodiment, if the color region characterized reference object is not rejected at step 408, the determination process terminates, and the identity of the reference object is considered to be the identity of the characterized object, step 410. At step 414, it is determined whether there are additional reference objects. If there are still additional reference objects available for analysis, the process continues at step 402; otherwise inference portion 104 reports that it is unable to determine the identity of the characterized object, step 416, and the process terminates. [0030]
  • The number of reference objects to be employed is application dependent. Obviously, the more reference objects that are employed, the less likely it is that [0031] inference portion 104 will be unable to identify an object. In one embodiment where the number of reference objects employed is relatively small, all reference objects are analyzed before a final inference is drawn. While experience has shown that, merely by employing the above described criteria, inference portion 104 is able to recognize objects with a reasonable level of accuracy for a large number of casual applications, those skilled in the art will appreciate that in alternate embodiments, additional criteria may be employed to reduce the likelihood of incorrect inference by inference portion 104.
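The overall inference loop of steps 402–416 might be sketched as follows; the `match_fn` predicate and the `(identity, regions)` pairing are hypothetical names introduced for illustration, not the patent's actual interfaces.

```python
def infer_identity(query_regions, reference_objects, match_fn):
    """Walk the list of color region characterized reference objects and
    return the identity of the first one consistent with the query
    (step 410), or None when every reference object is rejected
    (the 'unable to identify' outcome of step 416).

    reference_objects: iterable of (identity, regions) pairs.
    match_fn: predicate comparing two color region characterizations.
    """
    for identity, ref_regions in reference_objects:
        if match_fn(query_regions, ref_regions):
            return identity  # adopt the reference object's identity
    return None  # no consistent reference object remained
```

Because the loop stops at the first consistent reference object, ordering the reference list (e.g. most likely objects first) is one place where an application could tune accuracy against cost.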
  • FIG. 5 illustrates an exemplary application of the present invention. As illustrated, the application includes the use of multi-media computer [0032] 500 to read book 502 for a user. Multi-media computer 500 includes multi-media resources such as video camera 504, a video capture card (not visible), speakers 506, and an audio player (not shown). More importantly, multi-media computer 500 is equipped with the color region based object recognition tool of the present invention described earlier, as well as color region characterization and audio data for each page of book 502. During operation, the color region based object recognition tool of the present invention identifies the current page to which book 502 is open, based at least in part on the color region characterization of the page, using the video image generated by video camera 504 and the associated capture card, and the pre-stored reference color region characterization for each page of the book. In response, the audio player plays the audio data for the page, thereby reading the page for the user.
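The read-aloud flow might be sketched as follows, with page recognition and audio playback supplied as callables. The `book` mapping of page number to (reference regions, audio clip), `infer`, and `play_audio` are assumed names for illustration only.

```python
def read_open_page(frame_regions, book, infer, play_audio):
    """Identify which page of `book` the captured frame shows, then play
    that page's audio clip, 'reading' the page for the user.

    book: dict mapping page number -> (reference regions, audio clip).
    infer: function(query regions, [(page, regions), ...]) -> page or None.
    play_audio: function handed the audio clip for the recognized page.
    Returns the recognized page number, or None if recognition failed.
    """
    refs = [(page, regions) for page, (regions, _clip) in book.items()]
    page = infer(frame_regions, refs)
    if page is not None:
        play_audio(book[page][1])  # play the pre-stored clip for this page
    return page
```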
  • In one embodiment, multi-media computer [0033] 500 is provided with color region characterization and audio data for a number of books, and multi-media computer 500 is further provided with a user interface for the user to inform multi-media computer 500 of the identity of the book, thereby allowing the object recognition tool of the present invention to employ the appropriate reference color region characterizations. In an alternate embodiment, the reference color region characterizations are organized by book, and include color region characterizations for the covers of the books. The object recognition tool of the present invention is further extended to selectively employ a subset of the pre-stored reference color region characterizations based on the identification of the book, through color region characterization of the cover and comparison with the pre-stored reference color region characterizations for the covers.
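The cover-based narrowing in the alternate embodiment might be sketched as follows; the `library` layout (title to cover characterization plus per-page characterizations) and the `match` predicate are assumed for illustration.

```python
def select_book(cover_regions, library, match):
    """Identify the book from its cover so that only that book's page
    characterizations need to be searched afterwards.

    library: dict mapping title -> {"cover": regions, "pages": {...}}.
    match: predicate comparing two color region characterizations.
    Returns (title, that book's page characterizations), or (None, {})
    when no stored cover matches.
    """
    for title, book in library.items():
        if match(cover_regions, book["cover"]):
            return title, book["pages"]
    return None, {}
```

Restricting subsequent page recognition to the selected book's subset both speeds up matching and avoids confusing similar-looking pages across different books.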
  • FIG. 6 illustrates a hardware view of one embodiment of a computer system suitable for practicing the present invention, including the above described application. As shown, for the illustrated embodiment, [0034] computer system 600 includes processor 602, processor bus 606, high performance I/O bus 610 and standard I/O bus 620. Processor bus 606 and high performance I/O bus 610 are bridged by host bridge 608, whereas I/O buses 610 and 620 are bridged by I/O bus bridge 612. Coupled to processor bus 606 is cache 604. Coupled to high performance I/O bus 610 are camera 611, system memory 614 and video memory 616, to which video display 618 is coupled. Coupled to standard I/O bus 620 are disk drive 622, keyboard and pointing device 624, communication interface 626, and speakers 628.
  • These elements perform their conventional functions known in the art. In particular, [0035] disk drive 622 and system memory 614 are used to store permanent and working copies of the color region based object recognition tool of the present invention, as well as color region characterizations of reference objects and applications that use the color region based object recognition tool. The permanent copies may be pre-loaded into disk drive 622 in the factory, loaded from distribution medium 632, or downloaded from a remote distribution source (not shown). The constitutions of these elements are known; any one of a number of implementations of these elements known in the art may be used to form computer system 600.
  • In general, those skilled in the art will recognize that the present invention is not limited by the details described. In particular, the present invention is not limited to the exemplary application; instead, the present invention can be practiced with modifications and alterations within the spirit and scope of the appended claims. The description is thus to be regarded as illustrative of, instead of restrictive on, the present invention. [0036]
  • Thus, a method and apparatus for color region based object recognition and application has been described. [0037]

Claims (27)

What is claimed is:
1. A machine implemented method comprising:
(a) characterizing an object by color regions; and
(b) identifying the object in accordance with at least said color region based characterization of the object.
2. The method of
claim 1
wherein (a) comprises characterizing a plurality of color regions by their colors.
3. The method of
claim 1
wherein (a) comprises characterizing a plurality of color regions by their sizes.
4. The method of
claim 1
wherein (a) comprises characterizing a plurality of color regions by their relative positions to each other.
5. The method of
claim 1
wherein (a) comprises analyzing and associating pixels of a frame of video signals of the object to form the color regions.
6. The method of
claim 1
wherein (b) comprises assigning an identity of a selected one of a plurality of color characterized reference objects to the object, in accordance with at least said color region based characterizations.
7. The method of
claim 6
wherein the object and the selected reference object have consistent color region based characterizations.
8. The method of
claim 1
wherein said method further comprises
(c) generating output response in accordance with the result of (b).
9. The method of
claim 8
wherein (c) comprises rendering audio response.
10. A storage medium having stored therein a plurality of machine executable instructions, wherein when executed, the instructions characterize an object by color regions, and identify the object in accordance with at least said color region based characterization of the object.
11. The storage medium of
claim 10
wherein the instructions characterize a plurality of color regions by their colors.
12. The storage medium of
claim 10
wherein the instructions characterize a plurality of color regions by their sizes.
13. The storage medium of
claim 10
wherein the instructions characterize a plurality of color regions by their relative positions to each other.
14. The storage medium of
claim 10
wherein the instructions analyze and associate pixels of a frame of video signals of the object to form the color regions.
15. The storage medium of
claim 10
wherein the instructions assign an identity of a selected one of a plurality of color characterized reference objects to the object, in accordance with at least said color region based characterizations.
16. The storage medium of
claim 15
wherein the object and the selected reference object have consistent color region based characterizations.
17. The storage medium of
claim 10
wherein the instructions further generate output response in accordance with the identification result.
18. The storage medium of
claim 17
wherein the instructions render audio response.
19. An apparatus comprising:
(a) a storage medium having stored therein a plurality of machine executable instructions, when executed, the instructions characterize an object by color regions, and identify the object in accordance with at least said color region based characterization of the object; and
(b) an execution unit coupled to the storage medium to execute the instructions.
20. The apparatus of
claim 19
wherein the instructions characterize a plurality of color regions by their colors.
21. The apparatus of
claim 19
wherein the instructions characterize a plurality of color regions by their sizes.
22. The apparatus of
claim 19
wherein the instructions characterize a plurality of color regions by their relative positions to each other.
23. The apparatus of
claim 19
wherein the instructions analyze and associate pixels of a frame of video signals of the object to form the color regions.
24. The apparatus of
claim 19
wherein the instructions assign an identity of a selected one of a plurality of color characterized reference objects to the object, in accordance with at least said color region based characterizations.
25. The apparatus of
claim 24
wherein the object and the selected reference object have consistent color region based characterizations.
26. The apparatus of
claim 19
wherein the instructions further generate output response in accordance with the identification result.
27. The apparatus of
claim 26
wherein the instructions render audio response.
US09/059,641 1998-04-13 1998-04-13 Color region based recognition of unidentified objects Expired - Fee Related US6393147B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US09/059,641 US6393147B2 (en) 1998-04-13 1998-04-13 Color region based recognition of unidentified objects

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US09/059,641 US6393147B2 (en) 1998-04-13 1998-04-13 Color region based recognition of unidentified objects

Publications (2)

Publication Number Publication Date
US20010014173A1 true US20010014173A1 (en) 2001-08-16
US6393147B2 US6393147B2 (en) 2002-05-21

Family

ID=22024285

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/059,641 Expired - Fee Related US6393147B2 (en) 1998-04-13 1998-04-13 Color region based recognition of unidentified objects

Country Status (1)

Country Link
US (1) US6393147B2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1220182A2 (en) * 2000-12-25 2002-07-03 Matsushita Electric Industrial Co., Ltd. Image detection apparatus, program, and recording medium
US20100119331A1 (en) * 2008-11-11 2010-05-13 Xerox Corporation Automatic spine creation from book covers without spines

Families Citing this family (44)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7016532B2 (en) * 2000-11-06 2006-03-21 Evryx Technologies Image capture and identification system and process
US9310892B2 (en) 2000-11-06 2016-04-12 Nant Holdings Ip, Llc Object information derived from object images
US7680324B2 (en) 2000-11-06 2010-03-16 Evryx Technologies, Inc. Use of image-derived information as search criteria for internet and other search engines
US8224078B2 (en) 2000-11-06 2012-07-17 Nant Holdings Ip, Llc Image capture and identification system and process
US8218873B2 (en) * 2000-11-06 2012-07-10 Nant Holdings Ip, Llc Object information derived from object images
US7565008B2 (en) 2000-11-06 2009-07-21 Evryx Technologies, Inc. Data capture and identification system and process
US7899243B2 (en) * 2000-11-06 2011-03-01 Evryx Technologies, Inc. Image capture and identification system and process
US7392287B2 (en) * 2001-03-27 2008-06-24 Hemisphere Ii Investment Lp Method and apparatus for sharing information using a handheld device
US20070159522A1 (en) * 2004-02-20 2007-07-12 Harmut Neven Image-based contextual advertisement method and branded barcodes
US7751805B2 (en) * 2004-02-20 2010-07-06 Google Inc. Mobile image-based information retrieval system
US8421872B2 (en) * 2004-02-20 2013-04-16 Google Inc. Image base inquiry system for search engines for mobile telephones with integrated camera
WO2006085151A2 (en) * 2004-12-06 2006-08-17 Dspv, Ltd System and method of generic symbol recognition and user authentication using a communication device with imaging capabilities
US9137417B2 (en) 2005-03-24 2015-09-15 Kofax, Inc. Systems and methods for processing video data
US9769354B2 (en) 2005-03-24 2017-09-19 Kofax, Inc. Systems and methods of processing scanned data
US7917286B2 (en) 2005-12-16 2011-03-29 Google Inc. Database assisted OCR for street scenes and other images
US20080002855A1 (en) * 2006-07-03 2008-01-03 Barinder Singh Rai Recognizing An Unidentified Object Using Average Frame Color
US8244031B2 (en) 2007-04-13 2012-08-14 Kofax, Inc. System and method for identifying and classifying color regions from a digital image
US8156001B1 (en) 2007-12-28 2012-04-10 Google Inc. Facilitating bidding on images
US8315423B1 (en) 2007-12-28 2012-11-20 Google Inc. Providing information in an image-based information retrieval system
US9043828B1 (en) 2007-12-28 2015-05-26 Google Inc. Placing sponsored-content based on images in video content
CN101582941A (en) * 2008-05-13 2009-11-18 鸿富锦精密工业(深圳)有限公司 Communication terminal and number inquiry method
US8958605B2 (en) 2009-02-10 2015-02-17 Kofax, Inc. Systems, methods and computer program products for determining document validity
US9576272B2 (en) 2009-02-10 2017-02-21 Kofax, Inc. Systems, methods and computer program products for determining document validity
US8774516B2 (en) 2009-02-10 2014-07-08 Kofax, Inc. Systems, methods and computer program products for determining document validity
US9349046B2 (en) 2009-02-10 2016-05-24 Kofax, Inc. Smart optical input/output (I/O) extension for context-dependent workflows
US9767354B2 (en) 2009-02-10 2017-09-19 Kofax, Inc. Global geographic information retrieval, validation, and normalization
JP5984327B2 (en) * 2010-07-24 2016-09-06 キヤノン株式会社 Information processing method and apparatus, and program
US10146795B2 (en) 2012-01-12 2018-12-04 Kofax, Inc. Systems and methods for mobile image capture and processing
US9058580B1 (en) 2012-01-12 2015-06-16 Kofax, Inc. Systems and methods for identification document processing and business workflow integration
US8855375B2 (en) 2012-01-12 2014-10-07 Kofax, Inc. Systems and methods for mobile image capture and processing
US9483794B2 (en) 2012-01-12 2016-11-01 Kofax, Inc. Systems and methods for identification document processing and business workflow integration
US9058515B1 (en) 2012-01-12 2015-06-16 Kofax, Inc. Systems and methods for identification document processing and business workflow integration
US9355312B2 (en) 2013-03-13 2016-05-31 Kofax, Inc. Systems and methods for classifying objects in digital images captured using mobile devices
US9208536B2 (en) 2013-09-27 2015-12-08 Kofax, Inc. Systems and methods for three dimensional geometric reconstruction of captured image data
WO2014160426A1 (en) 2013-03-13 2014-10-02 Kofax, Inc. Classifying objects in digital images captured using mobile devices
US20140316841A1 (en) 2013-04-23 2014-10-23 Kofax, Inc. Location-based workflows and services
DE202014011407U1 (en) 2013-05-03 2020-04-20 Kofax, Inc. Systems for recognizing and classifying objects in videos captured by mobile devices
JP2016538783A (en) 2013-11-15 2016-12-08 コファックス, インコーポレイテッド System and method for generating a composite image of a long document using mobile video data
US9928532B2 (en) 2014-03-04 2018-03-27 Daniel Torres Image based search engine
WO2016004330A1 (en) 2014-07-03 2016-01-07 Oim Squared Inc. Interactive content generation
US9760788B2 (en) 2014-10-30 2017-09-12 Kofax, Inc. Mobile document detection and orientation based on reference object characteristics
US10242285B2 (en) 2015-07-20 2019-03-26 Kofax, Inc. Iterative recognition-guided thresholding and data extraction
US9779296B1 (en) 2016-04-01 2017-10-03 Kofax, Inc. Content-based detection and three dimensional geometric reconstruction of objects in image and video data
US10803350B2 (en) 2017-11-30 2020-10-13 Kofax, Inc. Object detection and image cropping using a multi-detector approach

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5062714A (en) 1990-02-12 1991-11-05 X-Rite, Incorporated Apparatus and method for pattern recognition
DE69127591T2 (en) * 1990-06-22 1998-01-22 Canon Kk Device and method for processing images
JP2873338B2 (en) 1991-09-17 1999-03-24 富士通株式会社 Moving object recognition device
US5579471A (en) * 1992-11-09 1996-11-26 International Business Machines Corporation Image query system and method
JP3234064B2 (en) * 1993-09-02 2001-12-04 キヤノン株式会社 Image retrieval method and apparatus
JP3591861B2 (en) * 1994-01-31 2004-11-24 キヤノン株式会社 Image processing method and apparatus
DE69428293T2 (en) * 1994-07-21 2002-06-27 Toshiba Kawasaki Kk IMAGE IDENTIFICATION DEVICE
US5835667A (en) * 1994-10-14 1998-11-10 Carnegie Mellon University Method and apparatus for creating a searchable digital video library and a system and method of using such a library
US5619708A (en) * 1994-10-25 1997-04-08 Korteam International, Inc. System and method for generating database input forms
JP2776295B2 (en) * 1994-10-27 1998-07-16 日本電気株式会社 Image index generation method and image index generation device
US6069696A (en) 1995-06-08 2000-05-30 Psc Scanning, Inc. Object recognition system and method
JP3545506B2 (en) 1995-08-21 2004-07-21 株式会社東芝 Specific color area extraction method and specific color area removal method
JPH09274660A (en) 1996-04-05 1997-10-21 Omron Corp Method, device for recognizing image, copy machine mounting the same and scanner
US5864630A (en) * 1996-11-20 1999-01-26 At&T Corp Multi-modal method for locating objects in images
JPH10222663A (en) 1997-01-31 1998-08-21 Yamaha Motor Co Ltd Picture recognition system and device therefor
US6016487A (en) * 1997-03-26 2000-01-18 National Research Council Of Canada Method of searching three-dimensional images
US6031529A (en) * 1997-04-11 2000-02-29 Avid Technology Inc. Graphics design software user interface
US6026411A (en) * 1997-11-06 2000-02-15 International Business Machines Corporation Method, apparatus, and computer program product for generating an image index and for internet searching and querying by image colors

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1220182A2 (en) * 2000-12-25 2002-07-03 Matsushita Electric Industrial Co., Ltd. Image detection apparatus, program, and recording medium
EP1220182A3 (en) * 2000-12-25 2005-08-17 Matsushita Electric Industrial Co., Ltd. Image detection apparatus, program, and recording medium
US20100119331A1 (en) * 2008-11-11 2010-05-13 Xerox Corporation Automatic spine creation from book covers without spines
US8131009B2 (en) * 2008-11-11 2012-03-06 Xerox Corporation Automatic spine creation from book covers without spines
US20120155701A1 (en) * 2008-11-11 2012-06-21 Xerox Corporation Automatic spine creation for book cover images that do not have spines
US8428302B2 (en) * 2008-11-11 2013-04-23 Xerox Corporation Automatic spine creation for book cover images that do not have spines

Also Published As

Publication number Publication date
US6393147B2 (en) 2002-05-21

Similar Documents

Publication Publication Date Title
US6393147B2 (en) Color region based recognition of unidentified objects
AU2008264197B2 (en) Image selection method
US20020172419A1 (en) Image enhancement using face detection
US7668371B2 (en) System and method for adaptively separating foreground from arbitrary background in presentations
US9348799B2 (en) Forming a master page for an electronic document
CN110221747B (en) Presentation method of e-book reading page, computing device and computer storage medium
US7424164B2 (en) Processing a detected eye of an image to provide visual enhancement
US20110013847A1 (en) Identifying picture areas based on gradient image analysis
CN107948730B (en) Method, device and equipment for generating video based on picture and storage medium
JP2002537604A (en) Document similarity search
Yamada et al. Comic image decomposition for reading comics on cellular phones
CN108769803A (en) Recognition methods, method of cutting out, system, equipment with frame video and medium
CN111385665A (en) Bullet screen information processing method, device, equipment and storage medium
CN109388725A (en) The method and device scanned for by video content
JP2002027190A (en) Method for linking scan document to video, information system and its product
CN107423407B (en) Teaching information recording method, device, terminal and computer readable storage medium
US6535652B2 (en) Image retrieval apparatus and method, and computer-readable memory therefor
CN113168674A (en) Automatic real-time high dynamic range content viewing system
US20040208388A1 (en) Processing a facial region of an image differently than the remaining portion of the image
CN112149570B (en) Multi-person living body detection method, device, electronic equipment and storage medium
CN111008295A (en) Page retrieval method and device, electronic equipment and storage medium
US6690826B2 (en) System and method for detecting text in mixed graphics data
CN114758054A (en) Light spot adding method, device, equipment and storage medium
JP3075851B2 (en) Map setting method
CN111079375A (en) Information sorting method and device, computer storage medium and terminal

Legal Events

Date Code Title Description
AS Assignment

Owner name: INTEL CORPORATION, CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DANNEELS, GUNNER D.;SAMPAT, KETAN R.;REEL/FRAME:009096/0683

Effective date: 19980413

FPAY Fee payment

Year of fee payment: 4

REMI Maintenance fee reminder mailed
LAPS Lapse for failure to pay maintenance fees
STCH Information on status: patent discontinuation

Free format text: PATENT EXPIRED DUE TO NONPAYMENT OF MAINTENANCE FEES UNDER 37 CFR 1.362

FP Lapsed due to failure to pay maintenance fee

Effective date: 20100521