Publication number: US20010014173 A1
Publication type: Application
Application number: US 09/059,641
Publication date: Aug 16, 2001
Filing date: Apr 13, 1998
Priority date: Apr 13, 1998
Also published as: US6393147
Inventors: Gunner D. Danneels, Kean R. Sampat
Original Assignee: Gunner D. Danneels, Kean R. Sampat
Color Region Based Recognition of Unidentified Objects
US 20010014173 A1
Abstract
A machine implemented method is disclosed. The method includes characterizing an object by color regions, and then identifying the object in accordance with at least the color region based characterization of the object. In one embodiment, the method further includes generating output response, such as audio response, in accordance with the identification result.
Claims(27)
What is claimed is:
1. A machine implemented method comprising:
(a) characterizing an object by color regions; and
(b) identifying the object in accordance with at least said color region based characterization of the object.
2. The method of claim 1, wherein (a) comprises characterizing a plurality of color regions by their colors.
3. The method of claim 1, wherein (a) comprises characterizing a plurality of color regions by their sizes.
4. The method of claim 1, wherein (a) comprises characterizing a plurality of color regions by their relative positions to each other.
5. The method of claim 1, wherein (a) comprises analyzing and associating pixels of a frame of video signals of the object to form the color regions.
6. The method of claim 1, wherein (b) comprises assigning an identity of a selected one of a plurality of color characterized reference objects to the object, in accordance with at least said color region based characterizations.
7. The method of claim 6, wherein the object and the selected reference object have consistent color region based characterizations.
8. The method of claim 1, wherein said method further comprises (c) generating output response in accordance with the result of (b).
9. The method of claim 8, wherein (c) comprises rendering audio response.
10. A storage medium having stored therein a plurality of machine executable instructions, wherein when executed, the instructions characterize an object by color regions, and identify the object in accordance with at least said color region based characterization of the object.
11. The storage medium of claim 10, wherein the instructions characterize a plurality of color regions by their colors.
12. The storage medium of claim 10, wherein the instructions characterize a plurality of color regions by their sizes.
13. The storage medium of claim 10, wherein the instructions characterize a plurality of color regions by their relative positions to each other.
14. The storage medium of claim 10, wherein the instructions analyze and associate pixels of a frame of video signals of the object to form the color regions.
15. The storage medium of claim 10, wherein the instructions assign an identity of a selected one of a plurality of color characterized reference objects to the object, in accordance with at least said color region based characterizations.
16. The storage medium of claim 15, wherein the object and the selected reference object have consistent color region based characterizations.
17. The storage medium of claim 10, wherein the instructions further generate output response in accordance with the identification result.
18. The storage medium of claim 17, wherein the instructions render audio response.
19. An apparatus comprising:
(a) a storage medium having stored therein a plurality of machine executable instructions, when executed, the instructions characterize an object by color regions, and identify the object in accordance with at least said color region based characterization of the object; and
(b) an execution unit coupled to the storage medium to execute the instructions.
20. The apparatus of claim 19, wherein the instructions characterize a plurality of color regions by their colors.
21. The apparatus of claim 19, wherein the instructions characterize a plurality of color regions by their sizes.
22. The apparatus of claim 19, wherein the instructions characterize a plurality of color regions by their relative positions to each other.
23. The apparatus of claim 19, wherein the instructions analyze and associate pixels of a frame of video signals of the object to form the color regions.
24. The apparatus of claim 19, wherein the instructions assign an identity of a selected one of a plurality of color characterized reference objects to the object, in accordance with at least said color region based characterizations.
25. The apparatus of claim 24, wherein the object and the selected reference object have consistent color region based characterizations.
26. The apparatus of claim 19, wherein the instructions further generate output response in accordance with the identification result.
27. The apparatus of claim 26, wherein the instructions render audio response.
Description
BACKGROUND OF THE INVENTION

[0001] 1. Field of the Invention

[0002] The present invention relates to the field of computer systems. In particular, the present invention relates to object recognition by computer systems.

[0003] 2. Background Information

[0004] As advances in microprocessor and other related technologies have continued to improve the price/performance of various electronic components in recent years, powerful multi-media personal computers (PCs) that were once within the exclusive realm of mainframe computers have become increasingly affordable to average consumers. More and more homes and classrooms are now equipped with PCs for business, education, and/or entertainment purposes.

[0005] Numerous advances have also been made in the field of computer vision, i.e. the ability of computers to recognize people, objects, etc. However, perhaps because much of the original interest was motivated by security applications, the techniques known today are generally too computationally intensive (or unnecessarily so) for use by classroom/home PCs for more casual applications such as education and/or entertainment. Thus, a less computationally intensive and yet sufficiently effective object recognition technique for casual applications is desired.

SUMMARY OF THE INVENTION

[0006] A machine implemented method is disclosed. The method includes characterizing an object by color regions, and then identifying the object in accordance with at least the color region based characterization of the object.

[0007] In one embodiment, the method further includes generating output responses, such as audio responses, in accordance with the identification result.

BRIEF DESCRIPTION OF DRAWINGS

[0008] The present invention will be described by way of exemplary embodiments, but not limitations, illustrated in the accompanying drawings in which like references denote similar elements, and in which:

[0009] FIG. 1 illustrates an overview of the present invention including the color region based object recognition tool of the present invention;

[0010] FIG. 2 is a flow chart illustrating one embodiment of the operational flow of the color based characterization portion of the object recognition tool;

[0011] FIG. 3 is a flow chart illustrating in further detail one embodiment of the step of subsetting an image of an object into color regions;

[0012] FIG. 4 is a flow chart illustrating one embodiment of the operational flow of the inference portion of the object recognition tool;

[0013] FIG. 5 illustrates an exemplary application of the present invention; and

[0014] FIG. 6 is a block diagram illustrating a hardware view of one embodiment of a computer suitable for use to practice the present invention.

DETAILED DESCRIPTION OF THE INVENTION

[0015] In the following description, various aspects of the present invention will be described. Those skilled in the art will also appreciate that the present invention may be practiced with only some or all aspects of the present invention. For purposes of explanation, specific numbers, materials and configurations are set forth in order to provide a thorough understanding of the present invention. However, it will also be apparent to one skilled in the art that the present invention may be practiced without the specific details. In other instances, well known features are omitted or simplified in order not to obscure the present invention.

[0016] Parts of the description will be presented in terms of operations performed by a computer system, using terms such as data, flags, bits, values, characters, strings, numbers and the like, consistent with the manner commonly employed by those skilled in the art to convey the substance of their work to others skilled in the art. As is well understood by those skilled in the art, these quantities take the form of electrical, magnetic, or optical signals capable of being stored, transferred, combined, and otherwise manipulated through mechanical and electrical components of the computer system; and the term computer system includes general purpose as well as special purpose data processing machines, systems, and the like, that are standalone, adjunct or embedded.

[0017] Various operations will be described as multiple discrete steps, in turn, in a manner that is most helpful in understanding the present invention; however, the order of description should not be construed to imply that these operations are necessarily order dependent, in particular dependent on the order of their presentation.

[0018] Referring now to FIG. 1, wherein a block diagram illustrating one embodiment of the present invention is shown. As illustrated, object recognition tool 100 of the present invention includes color based characterization portion 102 and inference portion 104. As will be described in more detail below, color based characterization portion 102 characterizes an object, such as object 106, based on color regions of the object, and inference portion 104 in turn identifies the object in accordance with at least the color region based characterizations. In one application of the present invention, the identification result is provided to application 108, which in turn responds to the identification result.

[0019] Object 106 is intended to represent all physical items that are visually observable. It includes, but is not limited to, physical items such as a sculpture, a painting, a desk, a table, a fork, a knife, a vase, a stuffed animal, a doll, a book, a page in a book, a flash card, and so forth. Application 108 is intended to represent a broad range of business, education and entertainment applications. The response to the identification result may be externalized, e.g. an audio response, internal only, e.g. changing certain state data, or both.

[0020] FIG. 2 illustrates one embodiment of the operational flow of color based characterization portion 102. As shown, for the illustrated embodiment, color based characterization portion 102 first generates digitized image data of object 106, e.g. in the form of a frame of video signals, step 202. In one embodiment, the digitized image data are generated as RGB pixel data. In an alternate embodiment, the digitized image data are generated as YUV pixel data instead. Next, color based characterization portion 102 transforms the pixel data from the RGB/YUV space to the HSI space, step 204. In one embodiment, like variant colors are also transformed into their primary colors to reduce the number of colors, e.g. like variants of the red colors (within a predetermined range of degrees in the H index) are all transformed into the red color. Then, color based characterization portion 102 subsets the image into regions in accordance with the pixels' transformed colors, step 206.

[0021] Step 202 may be performed using any one of a number of techniques known in the art, e.g. through a video camera and a capture card. Furthermore, the present invention may be practiced without performing step 204. That is, step 206 may be performed with the pixel data in RGB or YUV space, without transforming the pixel data into the HSI space or collapsing like variant colors into the primary colors. However, experience has shown that the HSI space appears to provide the most consistent results across a wide range of ambient conditions, and collapsing like variant colors into the primary colors reduces the amount of processing without significantly sacrificing the ability to properly recognize an object.
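The RGB-to-HSI transformation of step 204 and the collapsing of like variant colors into primaries can be sketched as follows. This is an illustrative sketch only: the patent specifies no formulas, so the standard HSI conversion is used, and the `collapse_hue` helper with its six hue buckets is a hypothetical implementation of the "predetermined range of degrees in the H index" idea.

```python
import math

def rgb_to_hsi(r, g, b):
    """Convert normalized RGB in [0, 1] to (H, S, I), hue in degrees."""
    i = (r + g + b) / 3.0
    mn = min(r, g, b)
    s = 0.0 if i == 0 else 1.0 - mn / i
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b))
    # Clamp guards against tiny floating-point excursions outside [-1, 1].
    h = 0.0 if den == 0 else math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
    if b > g:
        h = 360.0 - h
    return h, s, i

# Hypothetical primary-color buckets (hue centers in degrees).
PRIMARIES = [("red", 0.0), ("yellow", 60.0), ("green", 120.0),
             ("cyan", 180.0), ("blue", 240.0), ("magenta", 300.0)]

def collapse_hue(h):
    """Map a hue to the nearest primary (circular distance), reducing
    the number of distinct colors before region formation."""
    return min(PRIMARIES,
               key=lambda p: min(abs(h - p[1]), 360.0 - abs(h - p[1])))[0]
```

For example, pure red `(1, 0, 0)` maps to hue 0 and pure blue `(0, 0, 1)` to hue 240, and a near-red hue such as 350 degrees collapses to "red" because circular distance is used.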

[0022] FIG. 3 illustrates one embodiment of step 206 of FIG. 2. As shown, color based characterization portion 102 first selects a pixel at the lower left corner of a frame, step 302. Next, color based characterization portion 102 assigns the pixel to a new color region, step 304. Additionally, color based characterization portion 102 attributes the color of the pixel as the color of the new color region, as well as attributing the coordinates of the pixel as the reference coordinates of the new color region, and initializing the size of the new color region to one pixel, also step 304.

[0023] Then, color based characterization portion 102 determines if there is at least another pixel to the right of the current pixel, step 306. If there is at least another pixel to the right, color based characterization portion 102 selects the pixel immediately to the right, step 308. Upon selecting the pixel immediately to the right, color based characterization portion 102 determines if the selected pixel has the same color as the previous pixel, i.e. the pixel immediately to the left of the now selected pixel, step 310. If the determination is affirmative, color based characterization portion 102 assigns the selected pixel to the same color region of the pixel immediately to its left, and increments the size of the color region by one pixel, step 312; then continues the process at step 306. On the other hand, if the determination is negative, color based characterization portion 102 determines if the selected pixel has the same color as the pixel immediately below it, step 314. If the determination is affirmative, color based characterization portion 102 assigns the pixel to the same color region of the pixel immediately below it, attributes the coordinates of the selected pixel as the reference coordinates of the color region instead, and increments the size of the color region by one pixel, step 316; then continues the process at step 306. On the other hand, if the determination is negative, color based characterization portion 102 continues the process at step 304, i.e. assigning the selected pixel to a new color region, attributing the coordinates of the selected pixel as the reference coordinates of the new color region, and initializing the size of the new color region to one pixel.

[0024] Eventually, at step 306, color based characterization portion 102 determines there are no more pixels to the right. Color based characterization portion 102 then continues the process at step 318, wherein it determines if there are pixels above the last processed pixel. If the determination is affirmative, color based characterization portion 102 selects the leftmost pixel from the row of pixels immediately above, step 320. Upon doing so, color based characterization portion 102 continues the process at step 314 as described earlier.

[0025] Eventually, at step 318, color based characterization portion 102 determines there are no more pixels above either, i.e. all pixels of the entire frame have been processed. At that time, the process terminates.

[0026] In alternate embodiments, the pixels may be processed in orders other than the left to right and bottom to top manner described earlier. Furthermore, other coordinates beside the coordinates of the top left pixel may be used as the reference coordinates of a color region instead, as well as other metrics may be employed to denote the size of a color region.
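The FIG. 3 flow described above can be sketched as a single pass over the frame. This is an illustrative sketch only: the function name `subset_into_regions`, the dict-based region records, and the convention of row 0 being the bottom row of the frame are assumptions, not taken from the patent. Note it reproduces the flow's priorities exactly: a pixel joins its left neighbor's region first, then its lower neighbor's region (moving the reference coordinates up to the new pixel), and otherwise starts a new region.

```python
def subset_into_regions(frame):
    """One-pass color region formation following the FIG. 3 flow.

    `frame` is a list of rows of color labels, row 0 being the bottom
    row, scanned left to right, bottom to top. Returns a list of region
    records with the region's color, reference coordinates, and size.
    """
    regions = []   # region records, indexed by region id
    label = {}     # (row, col) -> region id
    for r, row in enumerate(frame):
        for c, color in enumerate(row):
            if c > 0 and frame[r][c - 1] == color:
                # Steps 310/312: same color as the pixel to the left.
                rid = label[(r, c - 1)]
                regions[rid]["size"] += 1
            elif r > 0 and frame[r - 1][c] == color:
                # Steps 314/316: same color as the pixel below; the
                # reference coordinates move to the current pixel.
                rid = label[(r - 1, c)]
                regions[rid]["size"] += 1
                regions[rid]["ref"] = (r, c)
            else:
                # Step 304: start a new color region of size one.
                rid = len(regions)
                regions.append({"color": color, "ref": (r, c), "size": 1})
            label[(r, c)] = rid
    return regions
```

On the small frame `[["R", "R", "B"], ["R", "B", "B"]]` (bottom row first), the pass yields three regions: one red region of three pixels whose reference coordinates end up at the top-left pixel, and two blue regions, since the left-neighbor check takes priority over the below-neighbor check.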

[0027] FIG. 4 illustrates one embodiment of the operational flow of inference portion 104. As shown, for the illustrated embodiment, inference portion 104 infers the identity of the characterized object by examining a number of color region characterized reference objects of known identities, one at a time. At step 402, a color region characterized reference object of known identity is selected. The color region characterized reference object is analyzed to determine if it contains at least the same number of color regions for each of the different colors as the characterized object whose identity is being determined, step 404. If it does not, the color region characterized reference object is rejected, step 412, and the process continues at step 414. For example, if the object whose identity is being determined is characterized as having two red color regions and one blue color region, then a color region characterized reference object must contain at least two red color regions and at least one blue color region; otherwise the color region characterized reference object is rejected.

[0028] Next, the color region characterized reference object is analyzed to determine if the color regions of each of the color groups having at least the same number of color regions have sizes that are at least as large as those of the color regions of the characterized object whose identity is being determined, step 406. If they do not, the color region characterized reference object is rejected, step 412, and the process continues at step 414. For example, if the object whose identity is being determined is characterized as having two red color regions of sizes s1 and s2 and one blue color region of size s3, then a color region characterized reference object must contain at least two red color regions with sizes that are at least s1 and s2, and at least one blue color region with a size that is at least s3; otherwise the color region characterized reference object is rejected.

[0029] Then, the color region characterized reference object is analyzed to determine if the color regions of interest have the same relative orientation to each other as the color regions of the characterized object whose identity is being determined, step 408. If the color region characterized reference object does not contain color regions with the same relative orientation to each other, it is rejected, step 412, and the process continues at step 414. In one embodiment, the color regions' relative orientation to each other is determined using the reference coordinates of the color regions. For example, if the object whose identity is being determined is characterized as having two red color regions of sizes s1 and s2 and one blue color region of size s3, occupying a substantially equilateral triangular orientation (in accordance with their reference coordinates), then a color region characterized reference object must contain at least two red color regions with sizes at least those of s1 and s2, and at least one blue color region with a size that is at least s3, also occupying a similar equilateral triangular orientation (i.e. within certain predetermined tolerance margins); otherwise the color region characterized reference object is rejected. In one embodiment, the predetermined tolerance margins are configurable at set up. In an alternate embodiment, the predetermined tolerance margins are user configurable during operation.

[0030] For the illustrated embodiment, if the color region characterized reference object is not rejected at step 408, the determination process terminates, and the identity of the reference object is considered to be the identity of the characterized object, step 410. At step 414, it is determined whether there are additional reference objects. If there are still additional reference objects available for analysis, the process continues at step 402; otherwise inference portion 104 reports that it is unable to determine the identity of the characterized object, step 416, and the process terminates.

[0031] The number of reference objects to be employed is application dependent. Obviously, the more reference objects employed, the less likely inference portion 104 will be unable to identify an object. In one embodiment where the number of reference objects employed is relatively small, all reference objects are analyzed before a final inference is drawn. While experience has shown that merely employing the above described criteria enables inference portion 104 to effectively recognize objects with a reasonable level of accuracy for a large number of casual applications, those skilled in the art will appreciate that in alternate embodiments, additional criteria may be employed to reduce the likelihood of incorrect inference by inference portion 104.
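The rejection tests of steps 404 and 406 and the candidate loop of steps 402/410/414/416 can be sketched as follows. This is an illustrative sketch only: the function names and region records are hypothetical, the largest-to-largest size pairing is one reasonable reading of the size criterion, and the step-408 relative-orientation test is omitted for brevity.

```python
from collections import defaultdict

def matches(obj_regions, ref_regions):
    """Steps 404-406: the reference must have at least as many regions
    of each color as the object, with sizes at least as large (pairing
    regions largest to largest within each color group)."""
    obj_by_color = defaultdict(list)
    ref_by_color = defaultdict(list)
    for reg in obj_regions:
        obj_by_color[reg["color"]].append(reg["size"])
    for reg in ref_regions:
        ref_by_color[reg["color"]].append(reg["size"])
    for color, sizes in obj_by_color.items():
        ref_sizes = sorted(ref_by_color[color], reverse=True)
        if len(ref_sizes) < len(sizes):
            return False                      # step 404 rejection
        for want, have in zip(sorted(sizes, reverse=True), ref_sizes):
            if have < want:
                return False                  # step 406 rejection
    return True

def identify(obj_regions, references):
    """Steps 402/410/414/416: return the identity of the first
    reference that is not rejected, or None if all are rejected."""
    for name, ref_regions in references:
        if matches(obj_regions, ref_regions):
            return name
    return None
```

Following the paragraph [0027] example, an object with two red regions and one blue region is rejected against a reference having only one red region, and accepted against a reference with matching color groups of sufficient sizes.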

[0032] FIG. 5 illustrates an exemplary application of the present invention. As illustrated, the application includes the use of multi-media computer 500 to read book 502 for a user. Multi-media computer 500 includes multi-media resources such as video camera 504, a video capture card (not visible), speakers 506, and an audio player (not shown). More importantly, multi-media computer 500 is equipped with the color region based object recognition tool of the present invention described earlier, and with color region characterization as well as audio data for each page of book 502. During operation, the color region based object recognition tool of the present invention identifies the current page book 502 is open to, based at least in part on the color region characterization of the page, using the video image generated by video camera 504 and the associated capture card, and the pre-stored reference color region characterizations for each page of the book. In response, the audio player plays the audio data for the page, thereby reading the page for the user.

[0033] In one embodiment, multi-media computer 500 is provided with color region characterization and audio data for a number of books, and multi-media computer 500 is further provided with a user interface for the user to inform multi-media computer 500 of the identity of the book, thereby allowing the object recognition tool of the present invention to employ the appropriate reference color region characterizations. In an alternate embodiment, the reference color region characterizations are organized by book, and include color region characterizations for the covers of the books. The object recognition tool of the present invention is further extended to include the selective employment of a subset of the pre-stored reference color region characterizations based on the identification of the book, through color region characterization of the cover and comparison with the pre-stored reference color region characterizations for the covers.

[0034] FIG. 6 illustrates a hardware view of one embodiment of a computer system suitable for practicing the present invention, including the above described application. As shown, for the illustrated embodiment, computer system 600 includes processor 602, processor bus 606, high performance I/O bus 610 and standard I/O bus 620. Processor bus 606 and high performance I/O bus 610 are bridged by host bridge 608, whereas I/O buses 610 and 620 are bridged by I/O bus bridge 612. Coupled to processor bus 606 is cache 604. Coupled to high performance I/O bus 610 are camera 611, system memory 614 and video memory 616, to which video display 618 is coupled. Coupled to standard I/O bus 620 are disk drive 622, keyboard and pointing device 624, communication interface 626, and speakers 628.

[0035] These elements perform their conventional functions known in the art. In particular, disk drive 622 and system memory 614 are used to store permanent and working copies of the color region based object recognition tool of the present invention, as well as the color region characterizations of reference objects and the applications that use the color region based object recognition tool. The permanent copies may be pre-loaded into disk drive 622 at the factory, loaded from distribution medium 632, or downloaded from a remote distribution source (not shown). The constitutions of these elements are known. Any one of a number of implementations of these elements known in the art may be used to form computer system 600.

[0036] In general, those skilled in the art will recognize that the present invention is not limited by the details described; in particular, the present invention is not limited to the exemplary application. Instead, the present invention can be practiced with modifications and alterations within the spirit and scope of the appended claims. The description is thus to be regarded as illustrative instead of restrictive of the present invention.

[0037] Thus, a method and apparatus for color region based object recognition and application has been described.

Referenced by
Citing Patent | Filing date | Publication date | Applicant | Title
US8131009 * | Nov 11, 2008 | Mar 6, 2012 | Xerox Corporation | Automatic spine creation from book covers without spines
US8428302 * | Feb 23, 2012 | Apr 23, 2013 | Xerox Corporation | Automatic spine creation for book cover images that do not have spines
US20100119331 * | Nov 11, 2008 | May 13, 2010 | Xerox Corporation | Automatic spine creation from book covers without spines
US20120155701 * | Feb 23, 2012 | Jun 21, 2012 | Xerox Corporation | Automatic spine creation for book cover images that do not have spines
EP1220182A2 * | Nov 22, 2001 | Jul 3, 2002 | Matsushita Electric Industrial Co., Ltd. | Image detection apparatus, program, and recording medium
Classifications
U.S. Classification: 382/165
International Classification: G06K9/46
Cooperative Classification: G06K9/4652
European Classification: G06K9/46C
Legal Events
Date | Code | Event | Description
Jul 13, 2010 | FP | Expired due to failure to pay maintenance fee | Effective date: 20100521
May 21, 2010 | LAPS | Lapse for failure to pay maintenance fees |
Dec 28, 2009 | REMI | Maintenance fee reminder mailed |
Nov 18, 2005 | FPAY | Fee payment | Year of fee payment: 4
Apr 13, 1998 | AS | Assignment | Owner name: INTEL CORPORATION, CALIFORNIA; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:DANNEELS, GUNNER D.;SAMPAT, KETAN R.;REEL/FRAME:009096/0683; Effective date: 19980413