Publication number: US 20070270074 A1
Publication type: Application
Application number: US 11/775,133
Publication date: Nov 22, 2007
Filing date: Jul 9, 2007
Priority date: Jan 18, 2005
Also published as: WO2006077868A1
Inventors: Yuichi Aochi, Wakana Yamana, Eiji Takuma, Yoshiyuki Endo, Fujio Nobata
Original Assignee: Aochi Yuichi, Wakana Yamana, Eiji Takuma, Yoshiyuki Endo, Fujio Nobata
Robot Toy
US 20070270074 A1
Abstract
A robot toy comprises a first light source emitting light of at least two colors simultaneously or individually, five or more second light sources arranged around the first light source, a memory for storing at least two types of emotion parameters constituting emotion, and a control means for increasing/decreasing the value of emotion parameters based on an external operation input. The control means controls the first and second light sources to emit light in an emission pattern corresponding to a combination of the values of emotion parameters, thus realizing an inexpensive robot toy capable of effectively representing emotion.
Images (17)
Claims(20)
1. A robot toy that artificially expresses an emotion by the flashing of a light source, said robot toy comprising:
a head part of the robot toy having a frontal surface;
a group of light sources disposed at the frontal surface of the head part of the robot toy and comprising one first light source and at least five second light sources disposed at virtually equal intervals around the first light source as the center;
a cover that covers the light sources and is formed of a semi-transparent material with which it is possible to recognize from the outside the light emitted from a light source when a light source has been turned on; and
an information processor for controlling the flashing of the light sources, wherein the information processor executes control in response to input from the outside to realize multiple light emission patterns, comprising a light emission pattern whereby the five or more second light sources turn on and off in succession in a clockwise or counterclockwise fashion.
2. A robot toy as recited in claim 1, comprising a memory for storing at least two types of emotion parameters that define emotions, wherein the information processor increases or decreases the values of the emotion parameters based on operation input from the outside and controls the emission of light from the first and/or second light sources by a light emission pattern corresponding to the increase or decrease in the emotion parameters.
3. A robot toy as recited in claim 2, wherein the information processor causes the first and/or second light sources to flash in response to the degree of the emotion as determined by the value of the emotion parameter at a cycle that becomes faster with any increase in this degree.
4. A robot toy as recited in claim 2, wherein the information processor causes the first and/or second light source to flash in a color corresponding to the type of emotion as determined by the combination of the values of the emotion parameters by light of two colors being simultaneously or individually emitted.
5. A robot toy as recited in claim 1, wherein the first light source is capable of emitting more colors of light than can be emitted by the second light sources individually.
6. A robot toy as recited in claim 2, wherein the information processor causes the first light source to emit light in a specific color that cannot be emitted by the second light sources when the type of emotion as determined by the combination of the values of the emotion parameters corresponds to a pre-established specific type.
7. A robot toy as recited in claim 2, comprising two or more operation input switches, and the information processor increases the values of the emotion parameters in accordance with the operation input of each of the operation input switches, and decreases the values of the emotion parameters when there has been no operation input for a specific time.
8. A robot toy comprising:
a torso part;
a head part disposed at the front of the torso part such that the head part can move in relation to the torso part;
a pair of front leg parts disposed at the front of the torso part;
a frontal surface part comprising:
a virtually flat, semi-transparent cover designed such that light can be emitted through the cover from beneath while the cover hides the area beneath the cover,
multiple light source parts disposed on the inside of said frontal surface part comprising one first light source part and at least five second light source parts disposed annularly at virtually equal intervals surrounding the first light source part, and disposed in such a way that the flashing lights do not mix with one another;
a switch that functions in response to operation by a user; and
an information processor for controlling the turning on and off of the multiple light source parts in response to signals from the switch, wherein the information processor is designed such that the turning on and off of the light source parts is controlled by light emission patterns comprising a light-emission pattern for the successive turning on and off of the multiple second light source parts in a clockwise or counterclockwise fashion.
9. The robot toy recited in claim 8, wherein the first light source part comprises at least two light sources having different emission colors, and the light emission patterns comprise a light-emission pattern wherein light is emitted by one of the first light source parts and a light-emission pattern wherein at least two light sources simultaneously emit light.
10. The robot toy recited in claim 8, wherein the frontal surface part of the head part is formed in an oblong right-angled parallelepiped shape having rounded sides.
11. The robot toy recited in claim 8, wherein the switches are disposed below the position of the multiple light source parts on the top face part and the frontal surface part of the head part.
12. The robot toy recited in claim 8, wherein a tail switch is further disposed at the back of the torso part.
13. The robot toy recited in claim 8, wherein the first light source part comprises three light sources having different emission colors, and the multiple second light source parts each comprise two light sources having different emission colors.
14. A musical toy in the shape of an animal, comprising:
a torso part;
a head part disposed at the front part of the torso part such that the head can move in relation to the torso part;
a pair of front legs disposed at the front of the torso part; and
a pair of hind legs disposed at the back of the torso part, the toy having an appearance and shape that imitate a sitting animal, wherein the head part comprises a frontal surface part, which is an oblong right-angled parallelepiped shape having rounded sides and is flat and wider than the torso part, and further houses a sound amplifier part and a speaker; and a display part is disposed at the frontal surface part, and this display part displays in accordance with the music output from the speaker.
15. The musical toy as recited in claim 14, comprising a memory for storing at least two types of emotion parameters that define emotions, wherein the information processor increases or decreases the values of the emotion parameters based on operation input from the outside and controls the emission of light from the first and/or second light sources by a light emission pattern corresponding to the increase or decrease in the emotion parameters.
16. The musical toy as recited in claim 14, comprising an information processor and a first and/or second light source.
17. The robot toy as recited in claim 16, wherein the information processor causes the first and/or second light sources to flash in response to the degree of the emotion as determined by the value of the emotion parameter at a cycle that becomes faster with any increase in this degree.
18. The robot toy as recited in claim 16, wherein the information processor causes the first and/or second light source to flash in a color corresponding to the type of emotion as determined by the combination of the values of the emotion parameters by light of two colors being simultaneously or individually emitted.
19. The robot toy as recited in claim 16, wherein the first light source is capable of emitting more colors of light than can be emitted by the second light sources individually.
20. The robot toy as recited in claim 16, wherein the information processor causes the first light source to emit light in a specific color that cannot be emitted by the second light sources when the type of emotion as determined by the combination of the values of the emotion parameters corresponds to a pre-established specific type.
Description
CROSS-REFERENCE TO RELATED APPLICATION

This application is a continuation of International Application PCT/JP2006/300617, filed on Jan. 18, 2006, pending at the time of filing of this continuation application, and claims priority from Japanese Patent Application JP 2005-010771, filed on Jan. 18, 2005, the contents of which are herein wholly incorporated by reference.

FIELD OF THE INVENTION

The present invention relates to a robot toy, and is particularly applicable to an inexpensive robot toy.

BACKGROUND OF THE INVENTION

Robot toys that generate emotions automatically based on an appeal from a user, such as the clapping of hands or patting, or on the surrounding environment, and express these generated emotions through the light emission patterns of light-emitting elements and with sounds such as music, have recently been proposed and marketed.

For instance, JP 3277500 discloses LEDs (light-emitting diodes) having a shape and emission color corresponding to an emotion, disposed in the head part of a robot toy, wherein the diodes are covered by a semi-transparent cover in such a way that they are visible from the outside only when they are emitting light, and these LEDs are flashed in response to the emotion of the robot toy in order to express the emotions of “joy” and “anger.” JP 3277500 also discloses a robot toy that expresses emotion wherein, in place of these above-described LEDs, multiple light-emitting elements having a shape and an emission color corresponding to the emotions of “joy” and “anger” are disposed in matrix form and flashed selectively.

However, when the emotions of the robot toy are expressed by the light emission pattern of light-emitting elements, it is possible to express emotions by a more diverse emission pattern when multiple light-emitting elements are disposed in matrix form as described above and are selectively flashed in response to these emotions than when LEDs having a shape and emission color corresponding to a variety of emotions including “joy” and “anger” are used.

Nevertheless, it is difficult to inexpensively produce a robot toy wherein light-emitting elements are disposed in matrix form, and skill is required to effectively express the emotions of a robot toy by the light emission pattern of light-emitting elements at a limited cost. Moreover, if the emotions of a robot toy can be effectively expressed by such an emission pattern of light-emitting elements, user interest in the robot toy, and hence its commercial value as a “toy,” would be improved.

SUMMARY OF THE INVENTION

The present invention is created in light of such problems, and an object thereof is to provide a robot toy that, although it is inexpensive in construction, is capable of effectively expressing emotions.

In order to solve the above-mentioned problems, a robot toy of the present invention comprises a first light source capable of emitting, simultaneously or separately, light of at least two colors; five or more second light sources disposed around the first light source; a memory for storing at least two emotion parameters that define emotions; and control means for increasing and decreasing the value of the emotion parameters based on operation input from the outside, wherein the control means cause the first and second light sources to emit light based on an emission pattern corresponding to the combination of the values of the emotion parameters.

By means of the present invention it is possible to express diverse emotions with few light sources and to effectively express emotions with an inexpensive construction.

BRIEF DESCRIPTION OF THE DRAWINGS

For the purpose of facilitating an understanding of the invention, the accompanying drawings and description illustrate a preferred embodiment thereof, from which the invention, its structure, construction and operation, and many related advantages may be readily understood and appreciated.

FIG. 1 is an oblique view showing the appearance of a robot toy of an embodiment of the present invention.

FIG. 2 is a block diagram showing the internal structure of a robot toy of an embodiment of the present invention.

FIG. 3 is a partial oblique view showing the specific structure of the head part.

FIG. 4 is an oblique view showing the structure of the drive mechanism part.

FIG. 5 is an oblique view for describing the pendulum gear.

FIG. 6 is a simplified plan view for describing the cam of the first cam gear.

FIG. 7 is an oblique view showing the structure of the rising and falling member.

FIG. 8 is an oblique view showing the structure of the ear drive member.

FIG. 9 is a simplified drawing for describing the movement of the head.

FIG. 10 is an oblique view showing the light emission status of the LEDs as seen from the outside.

FIG. 11 is a plan view showing the positional relationship between each LED in the LED display part.

FIG. 12 is a schematic drawing for describing the emotions of the robot toy.

FIG. 13 is a chart for describing the emotions of the robot toy.

FIG. 14 is a schematic drawing for describing the basic light emission patterns.

FIG. 15 is a flow chart showing the order of processing for generating and expressing emotions.

FIG. 16 is a flow chart showing the order of processing for modification of the first emotion parameter.

FIG. 17 is a flow chart showing the order of processing for modification of the second emotion parameter.

FIG. 18 is a flow chart showing the order of processing for modification of the third emotion parameter.

FIG. 19 is a flow chart showing the order of processing for expressing the first emotion.

FIG. 20 is a flow chart showing the order of processing for expressing the second emotion.

FIG. 21 is a flow chart showing the order of processing for expressing the third emotion.

FIG. 22 is a simplified drawing for describing another embodiment of the present invention.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS

In the drawings, 1 is the robot toy, 3 is the head part, 7 is the tail switch, 10 is the control part, 21 is the LED display part, 21A through 21G, 70A through 70D, 71A through 71E, 72A through 72F, and 73A through 73G are LEDs; 23 is the head switch, and 24 is the nose switch.

Embodiments of the present invention will now be described while referring to each of the drawings.

(1) Structure of the Robot Toy

FIG. 1 shows a robot toy 1 relating to the present embodiment. Robot toy 1 has an overall appearance and shape imitating a sitting animal, such as a dog. In essence, a head part 3 is linked, in such a way that it can freely turn, to a shaft extending from the frontal end of a torso part 2, and ear parts 4A and 4B are attached, in such a way that they can freely move, to the left and right top ends of head part 3. Moreover, paddle-shaped front leg parts 5A and 5B having rounded sides are linked, in such a way that they can be freely moved manually, to the frontal end parts of the left and right side surfaces of torso part 2, and hind leg parts 6A and 6B, which protrude less than the front leg parts, are formed as a single unit with torso part 2 at the rear end parts of the left and right side surfaces of torso part 2. Furthermore, a joystick-shaped tail switch 7 is attached, in such a way that it can freely move, near the rear end part of the top surface of torso part 2.

Overall, torso part 2 is a right-angled parallelepiped shape having rounded sides and, as shown in FIG. 2, houses on the inside a control part 10 responsible for motion control of the entire robot toy 1, a motor drive part 11, a substrate 13 on which a sound amplifier part 12 is formed, a microphone 14, a speaker 15, and other devices. Moreover, an earphone jack 16 is disposed on one hind leg part 6B of torso part 2.

Head part 3 is a right-angled parallelepiped shape having rounded sides that is flatter than torso part 2, and houses on the inside a light sensor 20, an LED display part 21 that has multiple LEDs, and other devices, as well as a drive mechanism part 34 (FIG. 3), described later, for driving head part 3 and each ear part 4A and 4B with a motor 22 as the power source. A head switch 23, which is a press switch, is disposed at the top end part of head part 3, and a nose switch 24, which is a press switch, is disposed at the position of the nose on head part 3. Overall, robot toy 1 has front leg parts 5A and 5B that are longer than hind leg parts 6A and 6B; therefore, one surface of head part 3 is inclined with respect to the mounting surface and faces up, and the display surface of LED display part 21 is disposed on this surface so that the display can be easily seen by the user.

Moreover, microphone 14 of torso part 2 is a directional microphone for ambient sounds and transmits obtained sound signals S1 to control part 10. Moreover, light sensor 20 of head part 3 detects ambient brightness and transmits the detection results to control part 10 as brightness detection signals S2. Furthermore, tail switch 7, head switch 23, and nose switch 24 detect physical appeals from the user, such as “hitting down” or “pressing,” and transmit the detection results to control part 10 as operation detection signals S3A through S3C. It should be noted that sound signals S4 from an outside musical toy are also applied to control part 10 via earphone jack 16.

Control part 10 is a microcomputer providing an information processor comprising a CPU (central processing unit) 10A, a memory 10B, an analog-to-digital conversion circuit 10C, and the like, and recognizes ambient conditions and appeals from the user based on each operation detection signal S3A through S3C of tail switch 7, head switch 23, and nose switch 24.

Control part 10 causes robot toy 1 to move in such a way that head part 3 is tilted and ear parts 4A and 4B are opened and closed by transmitting motor drive signals S4 to motor drive part 11, based on these recognition results and the program pre-stored in memory 10B, to actuate motor 22. Moreover, control part 10 outputs sound and music from speaker 15 by applying specific sound signals S5 to speaker 15 via sound amplifier part 12 as necessary, and flashes the LEDs of LED display part 21 in a specific light emission pattern by applying specific drive signals S6 to LED display part 21.

It should be noted that the specific structure of head part 3 is shown in FIG. 3. As is clear from FIG. 3, head part 3 of robot toy 1 is formed by a drive mechanism part 34, a cover 35 of this drive mechanism part 34, LED display part 21, a member 36 for preventing light from escaping, and a filter cover 37, layered in succession moving from the back to the front in the direction shown by arrow a, inside a case comprising a first half case 30 that forms the external shape of the back face of head part 3, a second half case 31 forming the external shape of the front face of head part 3, and first and second U-shaped case side members 32 that form the left and right side faces of head 3.

As is clear from FIG. 4, a worm gear that is not illustrated is attached to the output axle of motor 22 of drive mechanism part 34, and this worm gear engages with a gear 42 via a gear 40 and a gear 41 formed coaxially as one unit with this gear 40. Moreover, as shown in FIG. 5, a movable member 45 wherein weights 44 are attached to one end, is attached, in such a way that the member can freely turn, to an axle 43 to which gear 42 is also attached; and a pendulum gear 46 is attached, such that it can freely turn, to the other end of this movable member 45 and engages with a gear 47 formed coaxially and as one unit with gear 42. Moreover, first and second linking gears 48 and 49 are disposed at the left and right of pendulum gear 46 such that when movable member 45 turns around axle 43, it can engage with pendulum gear 46.

Thus, when motor 22 of drive mechanism part 34 is driven by normal rotation, this rotational force is transmitted to gear 47 via the row of gears from the worm gear attached to the output axle of motor 22 up to gear 42; and based on this rotational force, gear 47 turns movable member 45 as one unit with pendulum gear 46 in a clockwise direction in FIG. 4 and thereby enables pendulum gear 46 to engage with first linking gear 48, while when motor 22 is driven by reverse rotation, based on this rotational force, gear 47 turns movable member 45 in the counter-clockwise direction in FIG. 4 and thereby enables pendulum gear 46 to engage with second linking gear 49.

In this case, first linking gear 48 engages with a first cam gear 50, and a cam 50A of a specific shape is formed, as shown in FIG. 6, in the bottom surface of this first cam gear 50. Moreover, this cam 50A fits into an engagement hole 51A of a rising and falling member 51 (FIG. 7), which is disposed so that it can freely move up and down as shown by arrow b in FIG. 4, in such a way that rising and falling member 51 rises and falls with cam 50A when first cam gear 50 rotates.

As is clear from FIG. 7, first and second shaft bodies 53A and 53B are set at the top end of rising and falling member 51 such that they are positioned symmetrically to the right and left of a shaft body 52 (FIG. 4) set in first case half 30, and as shown in FIG. 4, an ear drive member 54 is attached to these first and second shaft bodies 53A and 53B.

Ear drive member 54 is formed by linking, via a barrel part 61 and as one unit in such a way that it can freely bend, the base parts of tweezer-shaped first and second spring parts 60A and 60B formed of an elastic material. Moreover, ear drive member 54 is attached to rising and falling member 51 in such a way that the corresponding first and second shaft bodies 53A and 53B of rising and falling member 51 fit into holes 62AX and 62BX of first and second engagement parts 62A and 62B disposed near the base parts of these first and second spring parts 60A and 60B, and barrel part 61 is attached to rising and falling member 51 in such a way that it engages with shaft body 52.

As a result, drive mechanism part 34 operates in such a way that when rising and falling member 51 rises and falls, first and second shaft bodies 53A and 53B of rising and falling member 51 move up and down relative to shaft body 52, and first and second spring parts 60A and 60B open and close as one unit with first and second engagement parts 62A and 62B of ear drive member 54.

Furthermore, the bottom end parts of the corresponding ear parts 4A and 4B fit into first and second spring parts 60A and 60B of ear drive member 54, and ear part 4A and ear part 4B are supported, near the bottom end, to the left and right, respectively, of the top end of first half case 30, and ear parts 4A and 4B open and close with the opening and closing motion of first and second spring parts 60A and 60B of ear drive member 54.

When motor 22 of drive mechanism 34 is driven by forward motion, pendulum gear 46 (FIG. 5) engages with first linking gear 48; the rotational force of motor 22 is transmitted to first cam gear 50; cam 50A of first cam gear 50 causes rising and falling member 51 to rise and fall based on this rotational force; and as a result, ear parts 4A and 4B open and close with the opening and closing of first and second spring parts 60A and 60B of ear drive member 54.

In contrast to this, second linking gear 49 engages with a second cam gear 63 and a cam 63A of a specific shape is formed as shown in FIG. 9(A) in the bottom surface of this second cam gear 63. Moreover, two parallel arm parts 65A and 65B of a two-pronged member 65 anchored to a shaft body 64, which is in turn anchored to torso part 2, extend to the bottom of second cam gear 63. Moreover, cam 63A of second cam gear 63 fits in between these two arm parts 65A and 65B of this two-pronged member 65.

When motor 22 of drive mechanism part 34 is driven by reverse motion, pendulum gear 46 (FIG. 5) engages with second linking gear 49; the rotational force of motor 22 is transmitted to second cam gear 63; cam 63A of second cam gear 63 presses against arm parts 65A and 65B of two-pronged member 65, as shown in FIGS. 9(B) and (C), based on this rotational force; and in response, head part 3 swings to the left and right about shaft body 64. It should be noted that, as long as the two directions of rotation are opposite, the forward motion and reverse motion of motor 22 are not limited to just clockwise and counterclockwise motion.

On the other hand, as shown in FIG. 3, LED display part 21 is formed by disposing seven LEDs 21A through 21G on a substrate 66 at a specific positional relationship. The positional relationship of the seven LEDs 21A through 21G and the emission colors thereof are described later.

Member 36 for preventing light from escaping is formed from, for instance, a black resin or rubber material that will not transmit light, and is attached tightly to substrate 66 of LED display part 21. A pressing part 24A and a contact part 24B of nose switch 24 are anchored to the bottom end of this member 36 for preventing light from escaping.

Moreover, a total of seven holes 36A through 36G corresponding to each LED 21A through 21G of LED display part 21 are made in this member 36 for preventing light from escaping, and when this member 36 for preventing light from escaping is attached to substrate 66 of LED display part 21, the corresponding LED 21A through 21G of LED display part 21 can be exposed to the filter cover 37 via the respective holes 36A through 36G.

Moreover, the thickness of member 36 for preventing light from escaping is the same as the size of the space between substrate 66 of LED display part 21 and filter cover 37, and the light that is reflected from each LED 21A through 21G of LED display part 21 can therefore be shined in the direction of filter cover 37 without mixing with the light emitted from each of the other LEDs 21A through 21G.

Furthermore, filter cover 37 is formed using a semi-transparent resin material, and second half case 31 is formed from a transparent resin material. As a result, when LEDs 21A through 21G of LED display part 21 are off, these LEDs 21A through 21G cannot be recognized (seen) from the outside, and when LEDs 21A through 21G are on, they can be recognized (the light emitted from LEDs 21A through 21G can be seen) from the outside. Moreover, filter cover 37 diffuses the light such that the entire shape formed by the corresponding hole 36A through 36G in member 36 for preventing light from escaping glows, rather than only a pin-point at the brightness of the individual LED 21A through 21G that has been turned on.

In this case, filter cover 37 is white. The light emitted from LEDs 21A through 21G that can be seen from the outside through this filter cover 37 does not mix and glows faintly inside the white cover.

(2) Expression of Emotions by Robot Toy

(2-1) Expression of Emotions by Robot Toy

Next, the expression of emotions by robot toy 1 will be described. A function for expressing emotion is loaded in control part 10 (FIG. 2) of this robot toy 1. By means of this function, the emotions of robot toy 1 are generated based on the type of appeal from the user, and each of LEDs 21A through 21G of LED display part 21 flashes in an emission pattern that corresponds to the type and degree of the emotion, in such a way that robot toy 1 expresses that emotion.
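Purely as an illustration (the patent discloses no program code), the claimed light emission pattern in which the peripheral light sources turn on and off in succession in a clockwise or counterclockwise fashion can be sketched as a sequence of LED indices; the function name and step count are assumptions for the sketch:

```python
def chase_sequence(n_leds: int = 6, steps: int = 12, clockwise: bool = True):
    """Return the index of the single peripheral LED lit at each step,
    cycling around the ring of n_leds in order (clockwise here simply
    means ascending index order; the physical direction is a convention)."""
    direction = 1 if clockwise else -1
    return [(direction * i) % n_leds for i in range(steps)]
```

A controller would turn on only the returned index at each time step, producing the rotating "chase" effect around the central LED.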

FIG. 11 shows the positional layout of each LED 21A through 21G of LED display part 21 used for the expression of emotion by robot toy 1. As is clear from FIG. 11, one LED 21A is disposed at a specific position on the center line in the axial direction of head part 3 in LED display part 21, and the remaining six LEDs 21B through 21G are disposed, at equal distances from LED 21A and at equal intervals from one another, in a concentric circle around LED 21A at the center. In short, the peripheral six LEDs 21B through 21G are disposed in a positional relationship such that each is at an apex of a regular hexagon with the central LED 21A at the center.
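The hexagonal layout described above can be sketched geometrically as follows; the radius value is an arbitrary assumption, since the patent gives no dimensions (a property of a regular hexagon is that the distance between adjacent vertices equals the radius):

```python
import math

def led_positions(radius: float = 1.0):
    """Return (x, y) coordinates for the central LED (at the origin)
    and the six peripheral LEDs placed at the vertices of a regular
    hexagon of the given radius, 60 degrees apart."""
    center = (0.0, 0.0)
    peripheral = [
        (radius * math.cos(math.radians(60 * k)),
         radius * math.sin(math.radians(60 * k)))
        for k in range(6)
    ]
    return center, peripheral
```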

An LED capable of simultaneously or separately emitting the three colors of green, red, and blue light is used as central LED 21A. Moreover, an LED capable of simultaneously or separately emitting the two colors of green and red light is used as each of the other peripheral LEDs 21B through 21G. Consequently, peripheral LEDs 21B through 21G can emit orange light by simultaneously emitting the two colors of green and red light.
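As a small illustration of the two-channel mixing just described, the perceived color of a peripheral LED can be modeled as below; the color names and boolean-channel representation are assumptions for the sketch, not part of the patent:

```python
def peripheral_color(green_on: bool, red_on: bool) -> str:
    """Model a two-color (green/red) LED: driving both channels at
    once is perceived as orange, one channel gives its own color,
    and neither channel leaves the LED off."""
    if green_on and red_on:
        return "orange"
    if green_on:
        return "green"
    if red_on:
        return "red"
    return "off"
```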

On the other hand, as shown in FIG. 12, the emotions of robot toy 1 are defined by two parameters (hereafter referred to as emotion parameters) that represent a “stimulation level” and an “affection level”. These two parameters are stored in memory 10B and are values within a range of “−8” to “8.”

Moreover, when the values of the emotion parameters of the “stimulation level” and the “affection level” are both 0 or within the positive range, the emotion corresponds to “joy” and “great fondness”; and when the value of the “affection level” emotion parameter is 0 or positive but the value of the “stimulation level” is within the negative range, the emotion corresponds to “normal” or “calm.”

Values of the emotion parameters for the “stimulation level” and “affection level” that are both within the negative range correspond to the emotions of “loneliness” and “depression,” while values of the emotion parameter of the “stimulation level” that are 0 or positive and values of the emotion parameter of the “affection level” that are negative correspond to the emotions of “anger” and “dislike.”

Furthermore, the degree of an emotion (intensity of an emotion) such as “joy” is expressed by the magnitude of each emotion parameter of the “stimulation level” and “affection level,” and if the absolute value of the emotion parameter is high, the degree of the emotion increases with this increase in the value.

Consequently, for instance, when the values of the emotion parameters of the “stimulation level” and the “affection level” at a certain time are both “8,” the values of these emotion parameters are both highly positive, and the emotions of robot toy 1 are “joy” and “great fondness.” Both of these emotion parameters are at their maximum values; therefore, the degree of the emotions of “joy” and “great fondness” is at a maximum.
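The quadrant mapping described above (FIGS. 12 and 13) can be sketched as a small classification function; the function name, the combined emotion labels, and the use of the larger parameter magnitude as the degree are illustrative assumptions:

```python
def classify_emotion(stimulation: int, affection: int):
    """Map the two emotion parameters (each in -8..8) to an emotion
    type and a degree, following the four quadrants of FIG. 12."""
    if not (-8 <= stimulation <= 8 and -8 <= affection <= 8):
        raise ValueError("emotion parameters must lie in -8..8")
    if affection >= 0:
        emotion = "joy/great fondness" if stimulation >= 0 else "normal/calm"
    else:
        emotion = "anger/dislike" if stimulation >= 0 else "loneliness/depression"
    # The degree (intensity) grows with the magnitude of the parameters;
    # taking the larger absolute value is an assumption of this sketch.
    degree = max(abs(stimulation), abs(affection))
    return emotion, degree
```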

In order to change the emotion, control part 10 increases or decreases the values of the “stimulation level” and “affection level” emotion parameters within a range of “−8” to “8,” based on operation detection signals S3A through S3C applied from tail switch 7, head switch 23, and nose switch 24, when the user appeals to the toy by “hitting down” or “pressing” tail switch 7, head switch 23, or nose switch 24, or when there has been no appeal for a specific time.

In this case, it is predetermined whether the value of each emotion parameter of the “stimulation level” and the “affection level” will be increased or decreased in accordance with the appeal from the user. For example, if the user presses nose switch 24, the value of each emotion parameter of the “stimulation level” and “affection level” will be increased by one, and if the user presses head switch 23, the value of the emotion parameter of the “stimulation level” will be reduced by one, while the value of the emotion parameter of the “affection level” will be increased by one.

Consequently, the user can change the emotions of robot toy 1 to “joy” and “great fondness,” and increase the degree of these emotions, by pressing nose switch 24 of robot toy 1, and the user can change the emotions of robot toy 1 to “normal” and “calm,” and increase the degree of these emotions, by pressing head switch 23.

Moreover, when the user swings tail switch 7 back and forth, control part 10 increases by one the value of the emotion parameter of the “stimulation level” and decreases by one the value of the emotion parameter of the “affection level,” each within a range of “−8” to “8.” Consequently, by swinging tail switch 7 of robot toy 1, the user can change the emotions of robot toy 1 to “anger” and “dislike,” and increase the degree of these emotions.

Furthermore, control part 10 reduces by one the values of the emotion parameters of the “stimulation level” and the “affection level” when a specific time (for instance, 30 seconds) has passed without the user appealing by “hitting down” or “pressing” tail switch 7, head switch 23, or nose switch 24. At this time, the emotions of robot toy 1 change toward “depression” and “loneliness,” and the degree of these emotions increases.
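The switch responses and idle decay described above can be sketched as a table of parameter deltas clamped to the −8 to 8 range; the dictionary keys and function names below are assumptions for illustration, not identifiers from the patent.

```python
# (stimulation delta, affection delta) per input, as described in the text.
DELTAS = {
    "nose_switch": (+1, +1),   # nose switch 24: both parameters rise
    "head_switch": (-1, +1),   # head switch 23: stimulation falls, affection rises
    "tail_switch": (+1, -1),   # tail switch 7: stimulation rises, affection falls
    "idle":        (-1, -1),   # no appeal for a specific time (e.g., 30 s)
}

def clamp(value: int, low: int = -8, high: int = 8) -> int:
    """Keep a parameter within the patent's -8 to 8 range."""
    return max(low, min(high, value))

def update(stimulation: int, affection: int, event: str) -> tuple:
    """Apply one event's deltas, clamping each parameter independently."""
    ds, da = DELTAS[event]
    return clamp(stimulation + ds), clamp(affection + da)
```

Clamping each parameter independently reproduces the edge behavior of procedures RT2 through RT4 below, where a parameter already at its limit stays put while the other still moves.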

When the values of the emotion parameters of the “stimulation level” and “affection level” change in this way, control part 10 determines the emotion of robot toy 1 and the degree thereof from the values of the two emotion parameters after the change, and causes LEDs 21A through 21G of LED display part 21 to flash with an emission pattern corresponding to that emotion and its degree.

Actually, when the values of the emotion parameters of the “stimulation level” and “affection level” have been changed as described above, control part 10 reads the value of each emotion parameter from memory 10B and, from the two values, determines whether the emotions are “joy” and “great fondness” (the values of both emotion parameters are 0 or within the positive range); “normal” and “calm” (the value of the emotion parameter of the “affection level” is 0 or positive, and the value of the emotion parameter of the “stimulation level” is negative); “depression” and “loneliness” (the values of both emotion parameters are negative); or “anger” and “dislike” (the value of the emotion parameter of the “stimulation level” is 0 or positive, and the value of the emotion parameter of the “affection level” is negative).

Control part 10 further differentiates between the values of the emotion parameters of the “stimulation level” and the “affection level” to determine the degree of the emotion of robot toy 1 at that time, and controls LED display part 21 based on these determination results in such a way that LEDs 21A through 21G flash with an emission pattern corresponding to the degree of the emotions of robot toy 1 at that time.

An example of the above-mentioned means is shown in FIG. 13. A pre-established emission pattern is assigned to each combination of the values of the emotion parameters of the “stimulation level” and the “affection level,” and a program that defines, for each of these emission patterns, the timing by which LEDs 21A through 21G of LED display part 21 flash is prestored in memory 10B (this program is referred to as the LED drive program hereafter).

Control part 10 determines the emotion of robot toy 1 and the degree thereof as described above, and then causes LEDs 21A through 21G of LED display part 21 to flash with the emission pattern corresponding to the emotion of robot toy 1 and the degree thereof at that time in accordance with this LED drive program.

As is clear from FIG. 13, the emission patterns of LEDs 21A through 21G of this LED display part 21 are based on an emission pattern wherein peripheral LEDs 21B through 21G are repeatedly turned on and off in succession, each relative to its adjacent LED, such that only peripheral LEDs 21B through 21G flash, one at a time, clockwise around central LED 21A. This emission pattern is referred to as the basic emission pattern hereafter.
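As a minimal sketch of this basic emission pattern, the loop below lights the six peripheral LEDs one at a time in clockwise order; `set_led` stands in for a hypothetical LED driver call and is not part of the patent.

```python
import time

def run_basic_pattern(set_led, cycle: float = 1.0, n_peripheral: int = 6) -> None:
    """One revolution of the basic emission pattern: each peripheral LED
    flashes in turn, clockwise, so that a full circuit around the central
    LED takes `cycle` seconds."""
    step = cycle / n_peripheral
    for i in range(n_peripheral):   # indices 0..5 stand for LEDs 21B..21G
        set_led(i, True)            # turn this LED on
        time.sleep(step)
        set_led(i, False)           # turn it off before the next one lights
```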

This basic emission pattern is used as the emission pattern for expressing the emotions other than “depression” and “loneliness.” When the emotions are “joy” and “great fondness” (the values of the emotion parameters of the “stimulation level” and “affection level” are both 0 or positive), LEDs 21B through 21G emit orange light; when the emotions are “normal” and “calm” (the value of the emotion parameter of the “stimulation level” is negative, and the value of the emotion parameter of the “affection level” is 0 or positive), LEDs 21B through 21G emit green light; and when the emotions are “anger” and “dislike” (the value of the emotion parameter of the “stimulation level” is 0 or positive, and the value of the emotion parameter of the “affection level” is negative), LEDs 21B through 21G emit red light. When the emotions of the robot toy are “depression” and “loneliness” (the values of the emotion parameters of the “stimulation level” and the “affection level” are both negative), peripheral LEDs 21B through 21G do not emit light and only central LED 21A flashes blue.

Furthermore, the emission patterns of LEDs 21A through 21G of LED display part 21 are set such that, when the emotions are “joy” and “great fondness,” “normal” and “calm,” or “anger” and “dislike,” the light rotates faster as the degree of the emotion increases; in essence, the flashing cycle of peripheral LEDs 21B through 21G becomes shorter when the degree of the emotion is high. When the emotions are “depression” and “loneliness,” the flashing cycle of central LED 21A likewise becomes shorter as the degree of the emotion increases.
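Combining the color assignments and the speed rule above gives the following sketch. The red color for “anger”/“dislike” and the intensity threshold of 4 are assumptions read from the per-procedure descriptions later in the text, and the return format is purely illustrative.

```python
def emission_pattern(stimulation: int, affection: int) -> tuple:
    """Return (color, flashing cycle in seconds, which LEDs flash) for the
    current emotion quadrant and its degree."""
    # Assumed degree rule: an absolute value above 4 halves the cycle time.
    cycle = 0.5 if max(abs(stimulation), abs(affection)) > 4 else 1.0
    if stimulation < 0 and affection < 0:
        return ("blue", cycle, "central LED only")         # depression / loneliness
    if stimulation >= 0 and affection >= 0:
        return ("orange", cycle, "peripheral, clockwise")  # joy / great fondness
    if affection >= 0:
        return ("green", cycle, "peripheral, clockwise")   # normal / calm
    return ("red", cycle, "peripheral, clockwise")         # anger / dislike
```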

The emotion of robot toy 1 can thus be visually recognized from the outside based on the emission colors of LEDs 21A through 21G of LED display part 21 at that time, and the degree of this emotion can be visually recognized from the outside based on the flashing speed of LEDs 21A through 21G at that time.

(2-2) Procedure for Generating and Processing Emotion

Control part 10 executes the process for generating emotions of robot toy 1 and processing to express the generated emotions (processing for controlling LED display part 21) based on the LED drive program in accordance with procedure RT1 for the generation and expression of emotions shown in FIG. 15.

In essence, control part 10 starts procedure RT1 for generating and expressing emotions at step SP1 when the power source of robot toy 1 is turned on, and in step SP2, the values of the emotion parameters of the “affection level” and “stimulation level” representing the emotions of robot toy 1 are set at an initial value of “0.”

Control part 10 then proceeds to step SP3 and determines whether or not head switch 23 (FIG. 2), nose switch 24 (FIG. 2), or tail switch 7 (FIG. 2) has been pressed or moved. When control part 10 obtains results to the contrary in step SP3, it proceeds to step SP4 and reads the count value of an internal timer, which is not illustrated. In step SP5, based on the count value read in step SP4, the control part determines whether or not a specific time (for instance, 30 seconds) has elapsed since the values of the emotion parameters of the “affection level” and “stimulation level” were set at the initial values in step SP2, or since head switch 23, nose switch 24, or tail switch 7 was last pressed or moved.

When control part 10 obtains results to the contrary in step SP5, it returns to step SP3 and then repeats steps SP3-SP4-SP5-SP3 until an affirmative result is obtained at step SP3 or step SP5.

Moreover, once affirmation is obtained at step SP3 by the user pressing head switch 23 of robot toy 1, control part 10 proceeds to step SP6 and changes the value of each emotion parameter of the “affection level” and “stimulation level” in accordance with first procedure RT2 for changing the emotion parameters shown in FIG. 16.

In essence, when control part 10 proceeds to step SP6, first procedure RT2 for changing the emotion parameters starts at step SP20, and then in step SP21, the control part determines whether or not the values of the emotion parameters of the “affection level” and the “stimulation level” have reached the maximum value (“8” in the present embodiment).

When control part 10 obtains results to the contrary in step SP21, control part 10 proceeds to step SP22 and increases by one the value of each emotion parameter of the “affection level” and the “stimulation level.” Control part 10 then proceeds to step SP25, completes first procedure RT2 for changing the emotion parameters, and proceeds to step SP13 of procedure RT1 for generating and expressing emotions (FIG. 15).

When control part 10 determines in step SP21 that the value of the emotion parameter of the “affection level” is at a maximum, it proceeds to step SP23 and increases by one only the value of the emotion parameter of the “stimulation level.” Control part 10 then proceeds to step SP25, completes first procedure RT2 for changing the emotion parameters, and proceeds to step SP13 of procedure RT1 for generating and expressing emotions.

Moreover, when control part 10 determines in step SP21 that the value of the emotion parameter of the “stimulation level” is at a maximum, it proceeds to step SP24 and increases by one only the value of the emotion parameter of the “affection level.” Control part 10 then proceeds to step SP25, completes first procedure RT2 for changing the emotion parameters, and then proceeds to step SP13 of procedure RT1 for generating and expressing emotions.
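The three branches of first procedure RT2 above can be written out directly; second and third procedures RT3 and RT4 mirror the same structure with different deltas. The function name is illustrative, and the case where both parameters are already at the maximum (not addressed in the text) is assumed to leave them unchanged.

```python
def rt2_head_switch(stimulation: int, affection: int, maximum: int = 8) -> tuple:
    """First procedure RT2: move both parameters up by one, but leave a
    parameter that has already reached the maximum unchanged."""
    if stimulation < maximum and affection < maximum:   # SP21 negative -> SP22
        return stimulation + 1, affection + 1
    if affection >= maximum and stimulation < maximum:  # affection at max -> SP23
        return stimulation + 1, affection
    if stimulation >= maximum and affection < maximum:  # stimulation at max -> SP24
        return stimulation, affection + 1
    return stimulation, affection                       # both at maximum (assumed)
```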

On the other hand, when control part 10 obtains affirmation at step SP3 of procedure RT1 for generating and expressing emotions as a result of the user pressing nose switch 24 of robot toy 1, the control part proceeds to step SP7 and changes the value of each emotion parameter of the “affection level” and “stimulation level” in accordance with second procedure RT3 for changing the emotion parameters shown in FIG. 17.

In essence, when control part 10 proceeds to step SP7, it starts second procedure RT3 for changing the emotion parameters at step SP30 and in step SP31 determines whether or not the value of the emotion parameter of the “affection level” is at a maximum and whether or not the value of the emotion parameter of the “stimulation level” is at a minimum (“−8” in the present embodiment).

When control part 10 obtains results to the contrary in step SP31, it proceeds to step SP32, and increases by one the value of the emotion parameter of the “affection level” and decreases by one the value of the emotion parameter of the “stimulation level.” Control part 10 then proceeds to step SP35, completes second procedure RT3 for changing the emotion parameters, and proceeds to step SP13 of procedure RT1 for generating and expressing emotions (FIG. 15).

On the other hand, when control part 10 determines in step SP31 that the value of the emotion parameter of the “affection level” is at a maximum, it proceeds to step SP33 and decreases by one only the value of the emotion parameter of the “stimulation level.” Control part 10 then proceeds to step SP35, completes second procedure RT3 for changing the emotion parameters, and then proceeds to step SP13 of procedure RT1 for generating and expressing emotions.

When control part 10 determines in step SP31 that the value of the emotion parameter of the “stimulation level” is at a minimum, it proceeds to step SP34 and increases by one only the value of the emotion parameter of the “affection level.” Control part 10 then proceeds to step SP35, completes second procedure RT3 for changing the emotion parameters, and proceeds to step SP13 of procedure RT1 for generating and expressing emotions.

On the other hand, when control part 10 obtains affirmation in step SP3 of procedure RT1 for generating and expressing emotions as a result of the user moving tail switch 7 of robot toy 1, it proceeds to step SP8 and changes the value of each emotion parameter of the “affection level” and “stimulation level” in accordance with third procedure RT4 for changing the emotion parameters shown in FIG. 18.

In essence, when control part 10 proceeds to step SP8, it begins third procedure RT4 for changing the emotion parameters in step SP40, and in step SP41 it determines whether the value of the emotion parameter of the “affection level” is at a minimum (“−8” in the present embodiment) and whether the value of the emotion parameter of the “stimulation level” is at a maximum.

When control part 10 obtains results to the contrary in step SP41, it proceeds to step SP42, decreases by one the value of the emotion parameter of the “affection level,” and increases by one the value of the emotion parameter of the “stimulation level.” Moreover, control part 10 then proceeds to step SP45, completes third procedure RT4 for changing the emotion parameters, and proceeds to step SP13 of procedure RT1 for generating and expressing emotions (FIG. 15).

When control part 10 determines in step SP41 that the value of the emotion parameter of the “affection level” is at a minimum, it proceeds to step SP43 and increases by one only the value of the emotion parameter of the “stimulation level.” Moreover, control part 10 then proceeds to step SP45, completes third procedure RT4 for changing the emotion parameters, and proceeds to step SP13 of procedure RT1 for generating and expressing emotions.

Moreover, when control part 10 determines in step SP41 that the value of the emotion parameter of the “stimulation level” is at a maximum, it proceeds to step SP44 and decreases by one only the value of the emotion parameter of the “affection level.” Control part 10 then proceeds to step SP45, completes third procedure RT4 for changing the emotion parameters, and then proceeds to step SP13 of procedure RT1 for generating and expressing emotions.

On the other hand, when control part 10 obtains affirmation in step SP5 of procedure RT1 for generating and expressing emotions, that is, when a specific time has passed since the emotion parameters of the “affection level” and “stimulation level” were set at the initial values or since head switch 23, nose switch 24, or tail switch 7 was last pressed or moved, it proceeds to step SP9 and determines whether the values of the emotion parameters of the “affection level” and “stimulation level” are at a minimum.

When control part 10 obtains results to the contrary in step SP9, it proceeds to step SP10 and decreases by one the values of each emotion parameter of the “affection level” and “stimulation level,” and then proceeds to step SP13.

In contrast to this, when control part 10 determines in step SP9 that the value of the emotion parameter of the “affection level” is at a minimum, it proceeds to step SP11, decreases by one only the value of the emotion parameter of the “stimulation level”, and then proceeds to step SP13.

When control part 10 determines in step SP9 that the value of the emotion parameter of the “stimulation level” is at a minimum, it proceeds to step SP12, decreases by one only the value of the emotion parameter of the “affection level,” and then proceeds to step SP13.

In steps SP13 through SP19, control part 10 expresses the emotions of robot toy 1 and the degree thereof by the emission pattern of LEDs 21A through 21G of LED display part 21, based on the values of each emotion parameter of the “affection level” and “stimulation level” that were updated in steps SP3 through SP12 in accordance with the presence and details of an appeal from the user.

In essence, when control part 10 proceeds to step SP13, it reads the changed values of the emotion parameters of the “affection level” and “stimulation level” from memory 10B (FIG. 2) and determines the values of these emotion parameters. Control part 10 proceeds to step SP14 when the emotion parameters of the “stimulation level” and “affection level” are both 0 or positive, and expresses the emotions (“joy” and “great fondness”) of robot toy 1 and the degree thereof by the emission pattern of LEDs 21A through 21G of LED display part 21 in accordance with first procedure RT5 for emotion expression in FIG. 19.

Actually, when control part 10 proceeds to step SP14, the first procedure RT5 for expressing emotions is started in step SP50, and then in step SP51, the control part determines the value of each emotion parameter of the “stimulation level” and “affection level.” When the value of each emotion parameter of the “stimulation level” and “affection level” is within a range of 0 to 4, control part 10 proceeds to step SP52 and causes each peripheral LED 21B through 21G of LED display part 21 to flash orange in succession such that one cycle of orange light is one second, while if the value of each emotion parameter of the “stimulation level” and “affection level” is outside this range, the control part proceeds to step SP53 and causes each peripheral LED 21B through 21G on LED display part 21 to flash orange in succession such that one cycle of orange light is 0.5 second.

Moreover, when control part 10 completes the processing in step SP52 or step SP53, it proceeds to step SP54, completes the first procedure RT5 for expressing emotions, and then returns to step SP3 of procedure RT1 for generating and expressing emotions.

Control part 10 proceeds to step SP15 when the emotion parameter of the “stimulation level” is 0 or positive and the emotion parameter of the “affection level” is negative as determined in step SP13 for the procedure RT1 for generating and expressing emotions, and expresses the emotions (“anger” and “dislike”) of robot toy 1, and the degree thereof by the emission pattern of LEDs 21A through 21G of LED display part 21 in accordance with the second procedure RT6 for emotion expression in FIG. 20.

In essence, when control part 10 proceeds to step SP15, the second procedure RT6 for expressing emotions is started in step SP60, and then in step SP61, the control part determines the value of each emotion parameter of the “stimulation level” and “affection level.” When the value of the emotion parameter of the “stimulation level” is within a range of 0 to 4 and the value of the emotion parameter of the “affection level” is within a range of −1 to −4, control part 10 proceeds to step SP62 and causes each peripheral LED 21B through 21G of LED display part 21 to flash red in succession such that one cycle of red light is one second, while if the value of either emotion parameter is outside these ranges, the control part proceeds to step SP63 and causes each peripheral LED 21B through 21G of LED display part 21 to flash red in succession such that one cycle of red light is 0.5 seconds.

Moreover, when control part 10 completes the processing in step SP62 or step SP63, it proceeds to step SP64, completes the second procedure RT6 for expressing emotions, and then returns to step SP3 of procedure RT1 for generating and expressing emotions.

Control part 10 proceeds to step SP16 when the emotion parameter of the “stimulation level” is negative and the emotion parameter of the “affection level” is 0 or positive as determined in step SP13 for procedure RT1 for generating and expressing emotions, and expresses the emotions (“normal” and “calm”) of robot toy 1, and the degree thereof by the emission pattern of LED 21A through 21G of LED display part 21 in accordance with the third procedure RT7 for emotion expression in FIG. 21.

In essence, when control part 10 proceeds to step SP16, third procedure RT7 for expressing emotions is started in step SP70, and then in step SP71, the control part determines the value of each emotion parameter of the “stimulation level” and “affection level.” When the value of the emotion parameter of the “stimulation level” is within a range of −1 to −4 and the value of the emotion parameter of the “affection level” is within a range of 0 to 4, control part 10 proceeds to step SP72 and causes each peripheral LED 21B through 21G of LED display part 21 to flash green in succession such that one cycle of green light is one second, while if the value of either emotion parameter is outside these ranges, the control part proceeds to step SP73 and causes each peripheral LED 21B through 21G of LED display part 21 to flash green in succession such that one cycle of green light is 0.5 seconds.

Moreover, when control part 10 completes the processing in step SP72 or step SP73, it proceeds to step SP74, completes the third procedure RT7 for expressing emotions, and then returns to step SP3 of procedure RT1 for generating and expressing emotions.

On the other hand, when the values of the emotion parameters of the “stimulation level” and “affection level” are both negative as determined in step SP13 of procedure RT1 for generating and expressing emotions, control part 10 proceeds to step SP17 and determines the value of each emotion parameter of the “stimulation level” and the “affection level.”

When the value of each emotion parameter of the “stimulation level” and “affection level” is within a range of −1 to −4, control part 10 proceeds to step SP18 and causes only central LED 21A of LED display part 21 to flash blue at a cycle of one second, while if the value of either emotion parameter is outside this range, the control part proceeds to step SP19 and causes only central LED 21A of LED display part 21 to flash blue at a cycle of 0.5 seconds. Moreover, when control part 10 completes step SP18 or step SP19, it returns to step SP3.

Thus, control part 10 causes LEDs 21A through 21G of LED display part 21 to flash with an emission pattern that corresponds to the emotions of robot toy 1, and the degree thereof.

(3) Effects

As previously described, by means of robot toy 1 of the present embodiment, of LEDs 21A through 21G of LED display part 21, only peripheral LEDs 21B through 21G flash, one at a time, in the clockwise direction to express the emotions of “joy” and “great fondness,” “normal” and “calm,” and “anger” and “dislike,” while only central LED 21A flashes blue to express the emotions of “depression” and “loneliness.”

By means of this expression method, it is possible, for instance, to express the emotions of robot toy 1 with more diverse emission patterns than the simple flashing of an LED of a predetermined shape, while at the same time, because LED display part 21 can be formed from fewer LEDs than an arrangement of multiple LEDs in matrix form, emotions can be expressed effectively by a robot toy 1 of inexpensive construction.

(4) Other Embodiments

By means of the above-mentioned embodiment, an LED capable of simultaneously or separately emitting three colors of light, green, red, and blue, was used as central LED 21A (first light source) of LED display part 21, and LEDs capable of simultaneously or separately emitting two colors of light, green and red, were used for the peripheral LEDs 21B through 21G (second light sources). However, central LED 21A and peripheral LEDs 21B through 21G can each be configured to emit other colors and other numbers of colors.

Moreover, by means of the above-mentioned embodiment, the six LEDs 21B through 21G are disposed at an equal distance from central LED 21A and at equal intervals from one another, but LEDs 21A through 21G can be disposed in other ways. For instance, three LEDs 70B through 70D are disposed around central LED 70A in FIG. 22 (A-1) and (A-2), four LEDs 71B through 71E are disposed around central LED 71A in FIG. 22 (B-1) and (B-2), and five LEDs 72B through 72F are disposed around central LED 72A in FIG. 22 (C-1) and (C-2). Moreover, it is possible to dispose six LEDs 73B through 73G around central LED 73A as in FIG. 22(D), and eight LEDs 74B through 74I can be disposed around central LED 74A as in FIG. 22(E). It is also possible to use more than eight LEDs. In each case, the same effect as in the above-mentioned embodiment can be realized.

Furthermore, by means of the above-described embodiment, there were two types of emotion parameters, a “stimulation level” and an “affection level,” but other numbers and types of emotion parameters can also be used. For instance, as many emotions as there are emission colors of LEDs 21A through 21G can be defined, and LEDs 21A through 21G can be caused to flash with emission patterns that match each emotion to an emission color.

By means of the above-described embodiment, either only central LED 21A of LED display part 21 flashes, or LEDs 21A through 21G flash with an emission pattern wherein only peripheral LEDs 21B through 21G flash clockwise around central LED 21A. However, other emission patterns can be used. For instance, the lights can be controlled in such a way that peripheral LEDs 21B through 21G are turned on and off in succession with respect to adjacent LEDs so that only peripheral LEDs 21B through 21G flash, one at a time, counterclockwise around central LED 21A, or so that peripheral LEDs 21B through 21G flash, several at a time, clockwise or counterclockwise around central LED 21A.

Furthermore, by means of the above-described embodiment, the values of the emotion parameters of the “stimulation level” and the “affection level” were changed within a range of −8 to 8 to simplify the discussion, but the values are not limited to this range and various ranges can be selected. In addition, the details of procedure RT1 for generating and expressing emotions can be changed as needed. For instance, in actual computer processing, the numbers within the range of −8 to 8 are handled as numbers within the range of 0 to 15 (0 to F in hexadecimal); therefore, rather than determining whether the above-described emotion parameters are positive or negative at step SP13 of procedure RT1 for generating and expressing emotions, the control part can determine whether or not the emotion parameters fall within the ranges 0 to 3, 4 to 7, 8 to B, or C to F, combining step SP13 with step SP14 (or steps SP15 through SP17), such that LEDs 21A through 21G are caused to emit light in colors and emission patterns matched to these ranges. In short, a variety of means can be used as long as LEDs 21A through 21G are caused to emit light with emission patterns that correspond to combinations of the values of the emotion parameters.
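As a sketch of this remark, a signed parameter can be offset into the 0 to F range and then bucketed into the four ranges named above. Note that −8 to 8 spans 17 values while 0 to F spans 16, so the offset below clamps the top value, an assumption not spelled out in the text; the function names are illustrative.

```python
def to_nibble(value: int) -> int:
    """Map a parameter in -8..8 onto 0..15 (0..F hex) by adding an offset
    of 8, clamping the one extra top value."""
    return min(15, value + 8)

def hex_bucket(value: int) -> str:
    """Classify a parameter into the four hexadecimal ranges named above."""
    return ("0-3", "4-7", "8-B", "C-F")[to_nibble(value) // 4]
```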

Classifications
U.S. Classification: 446/175
International Classification: A63H30/00
Cooperative Classification: A63H11/00, A63H33/22, G06N3/008, A63H2200/00
European Classification: A63H11/00, A63H33/22, G06N3/00L3
Legal Events
Jul 20, 2007 (AS): Assignment
Owner name: SEGA TOYS CO., LTD., JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNORS: AOCHI, YUICHI; YAMANA, WAKANA; TAKUMA, EIJI; AND OTHERS; REEL/FRAME: 019583/0627; SIGNING DATES FROM 20070711 TO 20070717