
Publication number: US 20110230266 A1
Publication type: Application
Application number: US 13/043,800
Publication date: Sep 22, 2011
Filing date: Mar 9, 2011
Priority date: Mar 16, 2010
Inventors: Takeshi Yamaguchi
Original Assignee: Konami Digital Entertainment Co., Ltd.
Game device, control method for a game device, and non-transitory information storage medium
Abstract
A position acquiring unit acquires, from a position information generating unit, three-dimensional position information relating to a position of a player, the position information generating unit generating the three-dimensional position information based on a photographed image acquired from a photographing unit for photographing the player and depth information relating to a distance between a measurement reference position of a depth measuring unit and the player. A determination unit determines whether or not the position of the player is contained in a determination subject space. A game processing execution unit executes game processing based on a result of the determination made by the determination unit. A determination subject space changing unit changes, in a case where it is determined that the position of the player is not contained in the determination subject space, a position of the determination subject space based on the position of the player.
Images (17)
Claims (6)
1. A game device, comprising:
position acquiring means for acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player;
determination means for determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space;
game processing execution means for executing game processing based on a result of the determination made by the determination means; and
determination subject space changing means for changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.
2. The game device according to claim 1, wherein the determination subject space changing means comprises:
means for determining whether or not a state in which the position of the player in the three-dimensional space is not contained in the determination subject space has continued for a reference period; and
means for changing the position of the determination subject space in a case where the state in which the position of the player in the three-dimensional space is not contained in the determination subject space has continued for the reference period.
3. The game device according to claim 1, further comprising display control means for causing display means to display a game screen containing a game character and a focused area having lightness thereof set higher than lightness of another area,
wherein the display control means comprises means for controlling a positional relation between a display position of the game character and a display position of the focused area based on a positional relation between the position of the player in the three-dimensional space and the determination subject space.
4. The game device according to claim 1, further comprising display control means for causing display means to display a game screen containing a first game character and a second game character,
wherein the display control means comprises means for controlling a positional relation between a display position of the first game character and a display position of the second game character based on a positional relation between the position of the player in the three-dimensional space and the determination subject space.
5. A control method for a game device, comprising:
a position acquiring step of acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player;
a determination step of determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space;
a game processing execution step of executing game processing based on a result of the determination made in the determination step; and
a determination subject space changing step of changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.
6. A non-transitory computer-readable information storage medium having a program recorded thereon, the program causing a computer to function as a game device comprising:
position acquiring means for acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player;
determination means for determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space;
game processing execution means for executing game processing based on a result of the determination made by the determination means; and
determination subject space changing means for changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.
Description
CROSS-REFERENCE TO RELATED APPLICATION

The present application claims priority from Japanese application JP2010-059465 filed on Mar. 16, 2010, the content of which is hereby incorporated by reference into this application.

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a game device, a control method for a game device, and a non-transitory information storage medium.

2. Description of the Related Art

There is known a game in which an image obtained by photographing a player with a camera is used. For example, JP 2005-287830 A describes the following technology. That is, an image obtained by photographing the player and a reference game image stored in advance are synthesized, and the synthesized image is displayed on a monitor, to thereby enable the player to understand a movement that the player should make in the game.

SUMMARY OF THE INVENTION

In recent years, studies have been made on a game in which, in addition to the image obtained by photographing the player, distance information acquired by using an infrared sensor (for example, distance between the player and the infrared sensor) is used. For example, based on the image obtained by photographing the player and the distance information, a determination can be made as to a position and a movement of the player.

In such a game, the player moves his or her body to play, and the player's standing position may therefore drift during play. As a result, there is a risk of the player hitting an obstacle in the surroundings. To address this, it is conceivable to narrow the photographing range of the camera so that the player does not leave a predetermined range. In that case, however, the player is more liable to step out of the photographing range, which may interfere with the player's gameplay.

The present invention has been made in view of the above-mentioned problems, and therefore has an object to provide a game device, a control method for a game device, and a non-transitory information storage medium, which are capable of dealing with displacement in position of a player during gameplay.

In order to solve the above-mentioned problems, a game device according to the present invention includes: position acquiring means for acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player; determination means for determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space; game processing execution means for executing game processing based on a result of the determination made by the determination means; and determination subject space changing means for changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.

Further, a control method for a game device according to the present invention includes: a position acquiring step of acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player; a determination step of determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space; a game processing execution step of executing game processing based on a result of the determination made in the determination step; and a determination subject space changing step of changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.

Further, a program according to the present invention causes a computer to function as a game device including: position acquiring means for acquiring, from position information generating means, three-dimensional position information relating to a position of a player in a three-dimensional space, the position information generating means generating the three-dimensional position information based on a photographed image acquired from photographing means for photographing the player and depth information relating to a distance between a measurement reference position of depth measuring means and the player; determination means for determining whether or not the position of the player in the three-dimensional space is contained in a determination subject space; game processing execution means for executing game processing based on a result of the determination made by the determination means; and determination subject space changing means for changing, in a case where it is determined that the position of the player in the three-dimensional space is not contained in the determination subject space, a position of the determination subject space based on the position of the player in the three-dimensional space.

Further, a non-transitory computer-readable information storage medium according to the present invention is a non-transitory computer-readable information storage medium having the above-mentioned program recorded thereon.

According to the present invention, it is possible to deal with the displacement in position of the player during gameplay.

Further, according to one aspect of the present invention, the determination subject space changing means includes: means for determining whether or not a state in which the position of the player in the three-dimensional space is not contained in the determination subject space has continued for a reference period; and means for changing the position of the determination subject space in a case where the state in which the position of the player in the three-dimensional space is not contained in the determination subject space has continued for the reference period.

Further, according to one aspect of the present invention, the game device further includes display control means for causing display means to display a game screen containing a game character and a focused area having lightness thereof set higher than lightness of another area, in which the display control means includes means for controlling a positional relation between a display position of the game character and a display position of the focused area based on a positional relation between the position of the player in the three-dimensional space and the determination subject space.

Further, according to one aspect of the present invention, the game device further includes display control means for causing display means to display a game screen containing a first game character and a second game character, in which the display control means includes means for controlling a positional relation between a display position of the first game character and a display position of the second game character based on a positional relation between the position of the player in the three-dimensional space and the determination subject space.

BRIEF DESCRIPTION OF THE DRAWINGS

In the accompanying drawings:

FIG. 1 is a diagram illustrating a positional relation among a position detecting device, a game device, and a player;

FIG. 2 is a diagram illustrating an example of a photographed image generated by a CCD camera;

FIG. 3 is a diagram for describing a method of measuring a depth of the player, which is performed by an infrared sensor;

FIG. 4 is a diagram illustrating an example of a depth image acquired by the infrared sensor;

FIG. 5 is a diagram illustrating an example of three-dimensional position information generated by the position detecting device;

FIG. 6 is a diagram illustrating a position of the player, which is identified by the three-dimensional position information;

FIG. 7 is a diagram illustrating a space to be photographed by the position detecting device;

FIG. 8 is a diagram illustrating an example of a game screen displayed by the game device;

FIG. 9 is a diagram illustrating, as an example, the game screen displayed by the game device in a case where the player has stepped out of a determination subject space;

FIG. 10 is a diagram illustrating the position detecting device and the player viewed from an Xw-Zw plane;

FIG. 11 is an example of the game screen displayed in the case where the player has stepped out of the determination subject space;

FIG. 12 is a diagram illustrating the position detecting device and the player viewed from an Xw-Yw plane;

FIG. 13 is an example of the game screen displayed in the case where the player has stepped out of the determination subject space;

FIG. 14 is a diagram illustrating a hardware configuration of the position detecting device;

FIG. 15 is a diagram illustrating a hardware configuration of the game device;

FIG. 16 is a functional block diagram illustrating a group of functions to be implemented on the game device;

FIG. 17 is a diagram illustrating an example of reference action information;

FIG. 18 is a diagram illustrating an example of action determination criterion information;

FIG. 19 is a flow chart illustrating an example of processing to be executed on the game device;

FIG. 20 is a diagram illustrating the determination subject space after change;

FIG. 21 is a diagram illustrating a case where a display position of an image contained in the game screen has been changed;

FIG. 22 is a diagram illustrating another example of the game screen;

FIG. 23 is a diagram illustrating an example of the game screen;

FIG. 24 is a diagram illustrating an example of the game screen; and

FIG. 25 is a diagram illustrating a case where the display position of an image contained in the game screen has been changed.

DETAILED DESCRIPTION OF THE INVENTION

First Embodiment

Hereinafter, detailed description is given of an example of an embodiment of the present invention with reference to the drawings.

A game device according to the embodiment of the present invention is implemented by, for example, a home-use game machine (stationary game machine), a portable game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer. In this specification, description is given of a case where the game device according to the embodiment of the present invention is implemented by a home-use game machine.

1-1. General Outline

FIG. 1 is a diagram illustrating a positional relation among a position detecting device 1, a game device 20, and a player 100. As illustrated in FIG. 1, the player 100 is positioned, for example, in front of the position detecting device 1. The position detecting device 1 and the game device 20 are connected to each other so as to be able to communicate data therebetween. Further, the player 100 plays a game in, for example, a living room where items of furniture F are placed.

The position detecting device 1 generates information relating to a position of the player 100 based on an image acquired by photographing the player 100 and information relating to a distance between the position detecting device 1 and the player 100. For example, the position detecting device 1 detects sets of three-dimensional coordinates corresponding to a plurality of parts (for example, head, shoulder, etc.) constituting the body of the player 100.

The game device 20 acquires the information relating to the position of the player 100 from the position detecting device 1. For example, the game device 20 acquires, from the position detecting device 1, three-dimensional coordinates that indicate the standing position of the player 100 in a three-dimensional space. The game device 20 controls the game based on changes in those three-dimensional coordinates.

A change in the three-dimensional coordinate associated with the player 100 corresponds to an action of the player 100. For example, in a case where the player 100 has performed an action of raising their right hand, sets of the three-dimensional coordinates corresponding to the right elbow and the right hand of the player 100 mainly change.

1-2. Operation of Position Detecting Device

Next, description is given of processing in which the position detecting device 1 generates the information relating to the position of the player 100 (three-dimensional position information). As illustrated in FIG. 1, the position detecting device 1 includes, for example, a CCD camera 2, an infrared sensor 3, and a microphone 4 including a plurality of microphones. In this embodiment, the three-dimensional position information of the player 100 is generated based on information acquired from the CCD camera 2 and the infrared sensor 3.

The CCD camera 2 is a publicly-known camera comprising a CCD image sensor. The CCD camera 2 photographs the player 100. For example, the CCD camera 2 generates a still image (for example, RGB digital image) by photographing the player 100 at predetermined time intervals (for example, every 1/60th of a second). Hereinafter, the still image generated by the CCD camera 2 is referred to as a photographed image. The photographed image contains an object located within a field of view of the CCD camera 2.

FIG. 2 is a diagram illustrating an example of the photographed image generated by the CCD camera 2. As illustrated in FIG. 2, the photographed image contains, for example, the player 100. It should be noted that in a case where the items of furniture F, the floor and the wall of the living room, and the like are contained within the field of view of the CCD camera 2, the photographed image contains those objects, which are omitted in FIG. 2 for simplicity of description.

In the photographed image, there are set an Xs-axis and a Ys-axis, which are orthogonal to each other. For example, the upper left corner of the photographed image is set as an origin point Os (0,0). Further, for example, the lower right corner of the photographed image is set as a coordinate Pmax (Xmax,Ymax). The position of each pixel in the photographed image is identified by a two-dimensional coordinate (Xs-Ys coordinate) that is assigned to each pixel.

The infrared sensor 3 is formed of, for example, an infrared emitting device and an infrared receiving device (for example, infrared diodes). The infrared sensor 3 detects reflected light obtained by emitting infrared light. The infrared sensor 3 measures the depth of a subject (for example, player 100) based on a detection result of the reflected light.

The depth of a subject is a distance between a measurement reference position (for example, position of the infrared receiving device of the infrared sensor 3) and the position of the subject. The measurement reference position is a position that serves as a reference in measuring the depth of the position of the player 100. The measurement reference position may be a predetermined position associated with the position of the position detecting device 1. The infrared sensor 3 measures the depth of the player 100 based, for example, on a time of flight (TOF), which is a time required for the infrared sensor 3 to receive reflected light after emitting infrared light.

FIG. 3 is a diagram for describing a method of measuring the depth of the player 100, which is performed by the infrared sensor 3. As illustrated in FIG. 3, the infrared sensor 3 emits pulsed infrared light at predetermined intervals. The infrared light emitted from the infrared sensor 3 spreads spherically with an emission position of the infrared sensor 3 at the center.

The infrared light emitted from the infrared sensor 3 strikes surfaces of, for example, the body of the player 100 and other objects (for example, the furniture F and walls) located in the living room. The infrared light that has struck those surfaces is reflected, and the reflected light is detected by the infrared receiving device of the infrared sensor 3. Specifically, the infrared sensor 3 detects reflected light having a phase shifted by 180° from that of the emitted infrared light.

For example, as illustrated in FIG. 3, in a case where the player 100 is holding out both hands, those held-out hands are closer to the infrared sensor 3 than the torso of the player 100. Specifically, the TOF of the infrared light reflected by both hands of the player 100 is shorter than the TOF of the infrared light reflected by the torso of the player 100.

The depth, that is, the distance between the measurement reference position and the player 100, is obtained by multiplying the time required for the infrared sensor 3 to detect the reflected light after emitting the infrared light (that is, the TOF) by the speed of light, and then dividing the result by two, because the light covers the distance twice: once on the way out and once on the way back. In this manner, the infrared sensor 3 can measure the depth of the player 100.
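As a rough illustration, the TOF-to-depth conversion described above can be sketched as follows (the pulse timing value is hypothetical, chosen only to make the arithmetic concrete):

```python
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def depth_from_tof(tof_seconds):
    # The pulse travels to the subject and back, so the one-way
    # distance is (TOF x speed of light) / 2.
    return tof_seconds * SPEED_OF_LIGHT / 2.0

# A reflection detected roughly 13.34 nanoseconds after emission
# corresponds to a subject about 2 m from the reference position.
```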

Further, the infrared sensor 3 can also detect an outline of a subject (player 100) by detecting depth differences acquired from the reflected infrared light.

Specifically, the fact that the infrared sensor 3 receives the reflected infrared light as described above means that an object is located at that place. If there is no other object located behind the object, the depth difference between the object and the surroundings of the object is large. Specifically, for example, the depth difference is large between a depth acquired by the infrared light reflected from the player 100 and a depth acquired by the infrared light reflected from the wall behind the player 100, and hence it is possible to detect the outline of the object by joining portions having the depth differences larger than a predetermined value.

It should be noted that the method of detecting the outline of an object is not limited to the above-mentioned example. Alternatively, for example, the outline may be detected based on the brightness of each pixel of the photographed image acquired by the CCD camera 2. In this case, it is equally possible to detect the outline of the object by, for example, joining portions having large brightness differences among the pixels.
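The depth-difference approach can be sketched minimally as follows, assuming the depth image is given as a 2-D list of per-pixel depth values in metres (the threshold is a hypothetical tuning parameter, not a value from the patent):

```python
def outline_pixels(depth, threshold):
    # Mark a pixel as part of an outline when its depth differs from a
    # right or lower neighbour by more than `threshold`, i.e. join the
    # portions having large depth differences.
    h, w = len(depth), len(depth[0])
    outline = [[False] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, 1), (1, 0)):
                ny, nx = y + dy, x + dx
                if ny < h and nx < w and abs(depth[y][x] - depth[ny][nx]) > threshold:
                    outline[y][x] = True
                    outline[ny][nx] = True
    return outline
```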

It should be noted that the light that has returned to the infrared sensor 3 may be subjected to predetermined filtering processing. Specifically, noise may be reduced by employing such a configuration that only reflected light corresponding to the infrared light emitted by the infrared sensor 3 is detected by a light detection sensor.

Information relating to the depth of the player 100 (depth information), which is detected as described above, is expressed as, for example, a depth image. In this embodiment, description is given by taking, as an example, a case where the depth information is expressed as a gray-scale depth image (for example, 256-level (8-bit) gray-scale image data).

FIG. 4 is a diagram illustrating an example of the depth image acquired by the infrared sensor 3. As illustrated in FIG. 4, for example, an object located close to the infrared sensor 3 is rendered bright (high brightness), and an object located far from the infrared sensor 3 is rendered dark (low brightness). For example, in a case where the depth image is expressed as 256-level (8-bit) gray-scale image data, the depth of the player 100 corresponds to the brightness (pixel value) of the depth image. Specifically, for example, for every 2-cm change in the depth of the player 100, the pixel value of the depth image changes by one level. This means that the infrared sensor 3 is capable of detecting the depth of the subject in units of 2 cm.
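The quantization just described can be sketched as a mapping from depth to an 8-bit gray level; the near-plane distance and the 2-cm step below are assumptions for illustration only:

```python
def depth_to_gray(depth_cm, near_cm=50.0, step_cm=2.0):
    # Nearer objects map to higher (brighter) pixel values; each
    # `step_cm` of additional depth lowers the gray level by one.
    level = 255 - int((depth_cm - near_cm) / step_cm)
    return max(0, min(255, level))  # clamp to the 8-bit range
```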

As illustrated in FIG. 3, in the case where the player 100 is holding out both hands, those held-out hands are closer to the infrared sensor 3 than the torso of the player 100. In other words, the depth of both hands of the player 100 is smaller than that of the torso. Accordingly, as illustrated in FIG. 4, pixels corresponding to both hands of the player 100 are expressed as brighter (brightness is higher) than pixels corresponding to the torso.

In this embodiment, similarly to the CCD camera 2, the infrared sensor 3 generates the depth image at predetermined time intervals (for example, every 1/60th of a second). Based on the photographed image acquired by the CCD camera 2 and the depth image acquired by the infrared sensor 3, the three-dimensional position information is generated relating to the position of the player 100.

For example, there is generated such a composite image (RGBD data) that is obtained by adding the depth information (D: depth) indicated by the depth image to the photographed image (RGB data) acquired by the CCD camera 2. In other words, the composite image contains, for each pixel, color information (lightness of each of R, G, and B) and the depth information.

It should be noted that in generating the composite image, the position of at least one of the photographed image and the depth image is corrected based on the positional offset between the CCD camera 2 and the infrared sensor 3. For example, in a case where the CCD camera 2 and the infrared sensor 3 are spaced apart by 2 cm in the horizontal direction, the coordinates of each pixel of the depth image are shifted by the number of pixels corresponding to 2 cm, to thereby correct the position.
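That correction can be sketched as a horizontal shift of the depth image. A rigid per-row shift is a simplification; a real calibration would also account for differing fields of view and lens distortion:

```python
def align_depth_to_rgb(depth_image, shift_px):
    # Shift every row of the depth image horizontally by `shift_px`
    # pixels to compensate for the baseline between the CCD camera
    # and the infrared sensor. Pixels shifted outside the frame are
    # dropped; uncovered pixels are left at depth 0.
    h, w = len(depth_image), len(depth_image[0])
    shifted = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sx = x + shift_px
            if 0 <= sx < w:
                shifted[y][sx] = depth_image[y][x]
    return shifted
```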

The three-dimensional position information is generated based on the composite image. In this embodiment, description is given by taking, as an example, a case where the three-dimensional position information represents the three-dimensional coordinate corresponding to each of the parts (for example, head, shoulder, etc.) of the body of the player 100.

Specifically, for example, the three-dimensional position information is generated in the following manner.

First, as described above, based on the depth image, pixels corresponding to the outline of the player 100 are identified. Pixels enclosed within the outline of the player 100 are the pixels corresponding to the body of the player 100.

Next, in the photographed image, the color information (lightnesses of R, G, and B) of the above-mentioned pixels enclosed within the outline is referred to. Based on the color information of the photographed image, pixels corresponding to each part of the body of the player 100 are identified. For this identification method, for example, a publicly-known method is applicable, such as a pattern matching method in which the object (that is, each part of the body of the player 100) is extracted from the image through a comparison with a comparison image (training image).

Alternatively, for example, pixels corresponding to the positions of the head, both elbows, etc. of the player 100 may be identified by calculating a velocity vector of each part of the body based on a change in color information of each pixel of the photographed image and then detecting a motion vector of each pixel based on an optical flow representing the movement of the object (for example, gradient method or filtering method).

Based on the pixel values (RGBD values) of the pixels identified as described above, the three-dimensional coordinates of the head, both elbows, etc. of the player 100 are calculated. For example, the three-dimensional coordinates are generated by carrying out predetermined matrix transformation processing on those pixel values. The matrix transformation processing is executed through, for example, a matrix operation similar to the transformation processing performed in 3D graphics between the world coordinate system and the screen coordinate system. Specifically, the RGB value indicating the color information of the pixel and the D value indicating the depth are substituted into a predetermined transformation matrix, to thereby calculate the three-dimensional coordinates of the pixel. In this way, the three-dimensional coordinates of each part of the player 100 are calculated.
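The patent does not disclose the concrete matrix, but a comparable pixel-plus-depth back-projection can be sketched with a pinhole-camera model; the intrinsic parameters fx, fy, cx, cy below are hypothetical stand-ins for the calibrated values such a transformation would use:

```python
def pixel_to_world(xs, ys, depth, fx, fy, cx, cy):
    # Back-project a pixel (xs, ys) with measured depth into a 3-D
    # point: fx, fy are focal lengths in pixels and (cx, cy) is the
    # principal point. The image Ys-axis grows downward while the
    # world Yw-axis grows upward, hence the sign flip on yw.
    xw = (xs - cx) * depth / fx
    yw = (cy - ys) * depth / fy
    zw = depth
    return (xw, yw, zw)
```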

It should be noted that for the method of calculating the three-dimensional coordinates that correspond to a pixel based on the pixel value (RGBD value), a publicly-known method may be applied, and the calculation method is not limited to the above-mentioned example. Alternatively, for example, the coordinate transformation may be performed using a lookup table.

FIG. 5 is a diagram illustrating an example of the three-dimensional position information generated by the position detecting device 1. As illustrated in FIG. 5, as the three-dimensional position information, for example, each part of the player 100 and the three-dimensional coordinates are stored in association with each other.
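The association of FIG. 5 can be pictured as a simple mapping from body parts to three-dimensional coordinates. The coordinate values below are invented for this sketch and are not taken from the figure.

```python
# Illustrative layout of the three-dimensional position information:
# each part of the player 100 is stored in association with its
# three-dimensional coordinate (values are invented for this sketch).
three_d_position_info = {
    "head P1":       (0.02, 0.05, 1.62),
    "right hand P7": (0.31, -0.40, 1.05),
    "left heel P14": (-0.10, 0.12, 0.03),
}

def part_coordinate(info, part):
    """Look up the three-dimensional coordinate of a body part."""
    return info[part]
```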

FIG. 6 is a diagram illustrating the position of the player 100, which is identified by the three-dimensional position information. In this embodiment, for example, a predetermined position corresponding to the position detecting device 1 (for example, the measurement reference position) is set as an origin point Ow. For example, the origin point Ow represents the three-dimensional coordinate corresponding to the measurement reference position of the infrared sensor 3. It should be noted that the position of the origin point Ow may be set anywhere in the three-dimensional space in which the player 100 exists. For example, the three-dimensional coordinate corresponding to the origin point Os of the photographed image may be set as the origin point Ow.

As illustrated in FIG. 6, in this embodiment, description is given by taking, as an example, a case where sets of three-dimensional coordinates corresponding to, for example, the head P1, neck P2, right shoulder P3, left shoulder P4, right elbow P5, left elbow P6, right hand P7, left hand P8, chest P9, waist P10, right knee P11, left knee P12, right heel P13, left heel P14, right toe P15, and left toe P16 of the player 100 are acquired as the three-dimensional position information.

It should be noted that the part of the body of the player 100, which is indicated by the three-dimensional position information, may be a part that is determined in advance from the player's skeletal frame. For example, any part of the body may be used as long as the part is identifiable by the above-mentioned pattern matching method.

In this embodiment, as described above, based on the photographed image and the depth image which are generated at the predetermined time intervals, the three-dimensional position information is generated at predetermined time intervals (for example, every 1/60th of a second). The generated three-dimensional position information is transmitted from the position detecting device 1 to the game device 20 at predetermined time intervals.

The game device 20 receives the three-dimensional position information transmitted from the position detecting device 1, and recognizes the position of the body of the player 100 based on the three-dimensional position information. Specifically, if the player 100 has performed an action of dancing or kicking a ball, the three-dimensional position information changes in response to this action, and hence the game device 20 recognizes the movement of the player based on the change in three-dimensional position information. The game device 20 executes the game while recognizing the movement of the body of the player based on the three-dimensional position information, details of which are described later.

Next, description is given of a space in which the position detecting device 1 can detect the player 100 (hereinafter, referred to as detectable space 60).

FIG. 7 is a diagram illustrating a space to be photographed by the position detecting device 1. As illustrated in FIG. 7, the detectable space 60 (space enclosed with broken lines of FIG. 7) is, for example, a predetermined space within the field of view of the CCD camera 2. The field of view of the CCD camera 2 is determined based, for example, on the line-of-sight and the angle of view of the CCD camera 2.

Of the space photographed by the position detecting device 1 (that is, space within the field of view), the detectable space 60 is such a space as to allow accurate capturing of the movement of the player 100.

For example, in a case where the position detecting device 1 and the player 100 are located too close to each other (for example, 1 meter or less apart), the position detecting device 1 is unable to photograph the entire body of the player 100. In such a case, for example, if the head, a foot, or the like of the player 100 is not contained in the photographed image (FIG. 2), the position detecting device 1 is unable to acquire accurate three-dimensional position information. Therefore, a space relatively close to the position detecting device 1 is excluded from the detectable space 60 even if the space is within the field of view of the CCD camera 2.

Further, for example, in a case where the position detecting device 1 and the player 100 are located too far from each other (for example, 5 meters or more apart), the infrared light is attenuated, which results in the position detecting device 1 being unable to detect the reflected light. In such a case, the position detecting device 1 is unable to acquire accurate depth information. Therefore, a space relatively far from the position detecting device 1 is excluded from the detectable space 60 even if the space is within the field of view of the CCD camera 2.

Further, for example, in a case where the standing position of the player 100 is displaced in the horizontal direction (for example, Yw-axis direction), the position detecting device 1 is unable to photograph the entire body of the player 100. In such a case, the right side or the left side of the body of the player 100 is omitted from the photographed image (FIG. 2), and hence the position detecting device 1 is unable to acquire accurate three-dimensional position information. Therefore, spaces close to both horizontal edges of the field of view are excluded from the detectable space 60 even if those spaces are within the field of view of the CCD camera 2.

As illustrated in FIG. 7, the detectable space 60 is, for example, a space obtained by excluding the respective spaces described above from the space within the field of view of the CCD camera 2. In other words, the detectable space 60 is a space in which the position detecting device 1 can generate accurate three-dimensional position information when the player 100 is standing inside the space. The size (volume), shape, and position of the detectable space 60 may be determined in advance by, for example, a game creator, or may be changed according to a state of a room where the position detecting device 1 is installed.
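The exclusions described above can be summarized in a small containment test. The near and far limits follow the example distances mentioned in the text; the angular values are assumptions made for this sketch, not parameters of the device.

```python
import math

# Example limits for the detectable space 60: the near and far limits
# follow the distances mentioned in the text; the angular values are
# assumptions made for this sketch.
NEAR_LIMIT = 1.0       # meters: too close to photograph the whole body
FAR_LIMIT = 5.0        # meters: infrared light too attenuated to detect
HALF_VIEW_ANGLE = 0.4  # radians: assumed horizontal half angle of view
EDGE_MARGIN = 0.1      # radians kept clear of both horizontal edges

def in_detectable_space(x, y):
    """x: distance along the line-of-sight of the CCD camera 2;
    y: horizontal (Yw-axis) offset; returns True if the point lies
    inside the detectable space 60."""
    if not (NEAR_LIMIT < x < FAR_LIMIT):
        return False
    # Exclude the spaces close to both horizontal edges of the view.
    return abs(math.atan2(y, x)) < HALF_VIEW_ANGLE - EDGE_MARGIN
```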

In this embodiment, a determination subject space 70 is set inside the detectable space 60. As illustrated in FIG. 7, for example, the determination subject space 70 is set at a predetermined position inside the detectable space 60 associated with the position detecting device 1. Further, the determination subject space 70 contains a representative point 71 for specifying a position at which the determination subject space 70 is to be placed.

The determination subject space 70 is used to define a space in which the player 100 needs to be located. The size (volume) and shape of the determination subject space 70 may be determined in advance by, for example, a game creator. On the other hand, the position of the determination subject space 70 is changed, for example, according to the position of the player 100, details of which are described later.

Here, description is given of the significance of the determination subject space 70 illustrated in FIG. 7. As described above, in the case where the player 100 is in the detectable space 60, in principle, the game device 20 can detect the action of the player 100.

However, there is a case where the player 100 is unable to move freely in the detectable space 60. One example is a case where the player 100 plays the game in a living room or the like of their house as illustrated in FIG. 1.

As illustrated in FIG. 1, items of furniture F, such as a desk and a chair, are placed in the living room, which is also bounded by walls and the like. Further, there is a case where another player 100 or a person who is watching the game is in the living room. Thus, in a case where the player 100 actually plays the game, various obstacles are often present inside the detectable space 60, and hence, in such a case, the player is unable to move freely in the detectable space 60.

Further, in a case where the player 100 gets absorbed in the gameplay, there is a risk that the player 100 will not notice the existence of obstacles in their surroundings. Specifically, there is a risk that the player 100 will move their body despite the existence of an obstacle and hit their body against the obstacle. Further, in a case where a plurality of players 100 play the game simultaneously, there is a risk that those players 100 will hit each other because each of the players moves their body to play the game.

In view of this, in this embodiment, the determination subject space 70 is set inside the detectable space 60, and the player is prompted to play the game in the determination subject space 70. Specifically, as long as the player 100 plays the game in the determination subject space 70, it is possible to reduce the risk of the player 100 hitting another player 100 or an obstacle. Therefore, the determination subject space 70 serves to show the player 100 a safe space in which the player 100 has a low risk of hitting an obstacle.

Normally, the player 100 clears away surrounding items of furniture F from their standing position to ensure safety in their surroundings, and then starts to play the game. Thus, the position of the determination subject space 70 is determined based, for example, on the position of the player 100 at the time of game start (alternatively, immediately before or after the game start). For example, the position of the determination subject space 70 is set so as to contain a place where the player 100 is standing at the time of the game start (alternatively, immediately before or after the game start).
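A minimal sketch of this placement, assuming the representative point 71 is the center of the determination subject space 70 and using made-up dimensions:

```python
# Assumed half-sizes of the determination subject space 70 (the real
# size is determined in advance by, for example, a game creator).
SPACE_HALF_WIDTH = 0.75   # meters, horizontal (Yw) direction
SPACE_HALF_DEPTH = 0.75   # meters, line-of-sight (Xw) direction

def place_determination_space(player_xy):
    """Set the representative point 71 so that the space contains the
    standing position of the player 100 at the time of game start.
    Here the representative point is modeled as the space's center."""
    return player_xy

def contains(rep_point, player_xy):
    """Check whether the player's position is inside the space."""
    rx, ry = rep_point
    px, py = player_xy
    return (abs(px - rx) <= SPACE_HALF_DEPTH and
            abs(py - ry) <= SPACE_HALF_WIDTH)
```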

If the player 100 remains inside the determination subject space 70 set as described above, there is a high possibility that there will be no such obstacle as a desk or a chair in their surroundings, and hence the player 100 can play the game more safely.

It should be noted that the setting method for the position of the determination subject space 70 is not limited to the above-mentioned example. For example, a position on an extension of the line-of-sight of the CCD camera 2 may be set as the initial position of the determination subject space 70. In this case, it is possible to prompt the player 100 to stand at a position facing the position detecting device 1 (for example, a position in the vicinity of the front of a TV set) to play the game.

It should be noted that in the example described above, one player 100 plays the game, but a plurality of players 100 may play the game. In the case where there are a plurality of players 100, through the same processing as described above, the three-dimensional position information of each player 100 is generated. Specifically, based on the number of outlines of the players 100, the position detecting device 1 can recognize the number of the players 100. The same processing as described above is executed with respect to pixels corresponding to each of the plurality of players 100, and hence it is possible to generate the three-dimensional position information of the plurality of players 100.

Further, when the player 100 is identified from the photographed image acquired by the position detecting device 1, an object having a predetermined height (for example, one meter) or less may be excluded. Specifically, in a case such as where the player 100 is sitting on the floor and thus their sitting height is equal to or less than the predetermined height, there is a risk that the player 100 will not be detected accurately. Therefore, in such a case, the player 100 may be excluded from detection.

As described above, the game is executed based on the three-dimensional position information of the player 100 in the determination subject space 70. Hereinafter, an example of the game is described.

1-3. Game to be Executed on Game Device

In this embodiment, description is given by taking, as an example, a case where the game device 20 recognizes the standing position and the action of the player based on the three-dimensional position information to execute a dance game.

For example, the game device 20 executes a game configured such that the player 100 dances to movements of a game character on a game screen 50. In this game, for example, the player 100 is required to play the game without moving away from a predetermined standing position. In view of this, according to the game screen 50 displayed in this embodiment, in a case where the player 100 has stepped out of the determination subject space 70, it is possible to prompt the player 100 to return to the determination subject space 70.

FIG. 8 is a diagram illustrating an example of the game screen 50 displayed by the game device 20. As illustrated in FIG. 8, the game screen 50 includes, for example, a game character 51, a spotlight 52, a spotlight area 53 being an area illuminated by the spotlight 52, and a message 54. In the game according to this embodiment, in principle, the game character 51 stands within the spotlight area 53 (focused area) and dances.

The lightness (brightness) of the spotlight area 53 is higher (brighter) than the lightness of the other area. On the other hand, the lightness of an area outside the spotlight area 53 is lower (darker) than the lightness of the spotlight area 53. Specifically, in a case where the game character 51 moves out of the spotlight area 53, the game character 51 becomes less visible.

The game character 51 moves the respective parts of its body, thereby serving to show a dance action to be performed by the player 100. According to movements of the body of the game character 51, the player 100 dances in front of the position detecting device 1.

For example, if the game character 51 has stepped its right foot forward, the player 100 steps their right foot forward as well. Further, for example, if the game character 51 has performed an action of raising its left hand, the player 100 performs an action of raising their left hand as well. In a case where the player 100 has succeeded in moving their body according to the action of the game character 51, for example, the message 54 that reads “GOOD” is displayed on the game screen 50.

Further, in a case where the player 100 has stepped out of the determination subject space 70, the message 54 to that effect is displayed on the game screen 50.

FIG. 9 is a diagram illustrating, as an example, the game screen 50 displayed by the game device 20 in the case where the player 100 has stepped out of the determination subject space 70. As illustrated in FIG. 9, for example, the message 54 that reads “CAUTION” is displayed on the game screen 50. Specifically, because the player 100 is outside the relatively safe determination subject space 70, the message 54 that issues a warning is displayed.

Further, in this case, the display position of the spotlight area 53 may be configured to correspond to the determination subject space 70 of the position detecting device 1. Hereinafter, this example is described. Specifically, in the case where the player 100 is inside the determination subject space 70, the game character 51 is located at a position with high lightness. In other words, in this case, the game character 51 is located in the spotlight area 53.

On the other hand, in the case where the player 100 has stepped out of the determination subject space 70, the game character 51 is located at a position with low lightness. In other words, in this case, the game character 51 is located in the area outside the spotlight area 53.

FIG. 10 is a diagram illustrating the position detecting device 1 and the player 100 viewed from an Xw-Zw plane (that is, from the side). FIG. 10 illustrates a case where the player 100 has moved backward while dancing. As illustrated in FIG. 10, the player 100 has moved backward and therefore is out of the determination subject space 70.

If the player 100 is unaware of this fact and continues the gameplay, there is a risk that the player 100 will hit the furniture located behind them, such as a sofa. Thus, there is displayed a game screen 50 that prompts the player 100 to move forward to return to the inside of the determination subject space 70.

FIG. 11 is an example of the game screen 50 displayed in the case where the player 100 has stepped out of the determination subject space 70. The game screen 50 of FIG. 11 is displayed in the case where the player 100 is standing at the position of FIG. 10. As illustrated in FIG. 11, the game character 51 is displayed in the area outside the spotlight area 53 (for example, at a predetermined position in the rear). As described above, the area outside the spotlight area 53 is set low in lightness.

Specifically, because the game character 51 is displayed in a relatively dark area, the player 100 finds the game character 51 less visible. In such a case, it is conceivable that the player 100 will move forward so as to cause the game character 51 to move to the spotlight area 53, which is bright and thus makes the game character 51 more visible. Specifically, in a case where the player 100 has stepped backward out of the determination subject space 70 when viewed from the position detecting device 1, by causing the position of the game character 51 to move out of the spotlight area 53, it is possible to prompt the player 100 to move toward the determination subject space 70.
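The presentation described above can be sketched as a mapping from the player's displacement to the character's screen position. The screen constants, the display scale, and the mirroring of the displacement are assumptions made for illustration, not the game's actual rendering logic.

```python
# Assumed screen constants for this sketch.
SPOTLIGHT_CENTER = (400, 300)   # screen pixels
PIXELS_PER_METER = 150          # display scale for the displacement

def character_screen_position(player_offset, inside_space):
    """player_offset: (forward, lateral) displacement in meters of the
    player 100 from the representative point 71. While the player is
    inside the determination subject space 70, the game character 51
    dances in the bright spotlight area 53; otherwise it is drawn in
    the darker area outside it, mirroring the player's displacement."""
    if inside_space:
        return SPOTLIGHT_CENTER
    fwd, lat = player_offset
    cx, cy = SPOTLIGHT_CENTER
    # Drawing the character in the darker area prompts the player to
    # move back toward the determination subject space 70.
    return (cx + int(lat * PIXELS_PER_METER),
            cy - int(fwd * PIXELS_PER_METER))
```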

FIG. 12 is a diagram illustrating the position detecting device 1 and the player 100 viewed from an Xw-Yw plane (that is, from above). FIG. 12 illustrates a case where the player 100 has moved in the horizontal direction (for example, Yw-axis direction) while dancing. As illustrated in FIG. 12, for example, the player 100 is displaced in the horizontal direction and is therefore out of the determination subject space 70.

FIG. 13 is an example of the game screen 50 displayed in the case where the player 100 has stepped out of the determination subject space 70. The game screen 50 of FIG. 13 is displayed in the case where the player 100 is standing at the position of FIG. 12.

Specifically, because the game character 51 is displayed in a relatively dark area, the player 100 finds the game character 51 less visible. The player 100 moves leftward so as to cause the game character 51 to move to the spotlight area 53, which is bright and thus makes the game character 51 more visible. Specifically, in a case where the player 100 has stepped in the horizontal direction out of the determination subject space 70 when viewed from the position detecting device 1, it is possible to prompt the player 100 to move toward the determination subject space 70.

As described above, for example, in a dance game configured such that the player 100 dances to the movements of the game character 51, there is a case where the standing position of the player 100 changes gradually during the gameplay. Specifically, even though the player 100 has ensured safety in their surroundings at the time of starting the game, if the player 100 gets absorbed in the game, there is a risk that the player 100 will move closer to an obstacle. To address this, the game device 20 sets the determination subject space 70 for showing the standing position of the player 100, and hence it is possible to show the standing position to the player 100.

Incidentally, in the case where the player 100 has stepped out of the determination subject space 70, if the position of the game character 51 is moved out of the spotlight area 53 as illustrated in FIG. 11 or FIG. 13, the game character 51 becomes less visible. This makes it possible to guide the player 100 to a safe position, but there is a case where no obstacle is actually placed in the area outside the determination subject space 70. In such a case, there is a fear of an inconvenience in which the player 100 finds it difficult to play the game because the determination subject space 70 is fixed to its initial position. Hereinafter, description is given of detailed processing relating to technology that solves this inconvenience.

First, detailed description is given of configurations of the position detecting device 1 and the game device 20.

1-4. Configuration of Position Detecting Device

FIG. 14 is a diagram illustrating a hardware configuration of the position detecting device 1. As illustrated in FIG. 14, the position detecting device 1 includes a microprocessor 10, a storage unit 11, a photographing unit 12, a depth measuring unit 13, an audio processing unit 14, and a communication interface unit 15. The respective components of the position detecting device 1 are connected to one another by a bus 16 so as to be able to exchange data thereamong.

The microprocessor 10 controls the respective units of the position detecting device 1 according to an operating system and various kinds of programs which are stored in the storage unit 11.

The storage unit 11 stores programs and various kinds of parameters which are used for operating the operating system, the photographing unit 12, and the depth measuring unit 13. Further, the storage unit 11 stores a program for generating the three-dimensional position information based on the photographed image and the depth image.

The photographing unit 12 includes the CCD camera 2 and the like. The photographing unit 12 generates, for example, the photographed image of the player 100.

The depth measuring unit 13 includes the infrared sensor 3 and the like. The depth measuring unit 13 generates the depth image based, for example, on the TOF acquired using the infrared sensor 3.

As described above, the microprocessor 10 generates the three-dimensional position information based on the photographed image generated by the photographing unit 12 and the depth image generated by the depth measuring unit 13. The microprocessor 10 identifies the positions of pixels corresponding to the respective parts (for example, head P1 to left toe P16) of the player 100 based on the photographed image.

Next, the microprocessor 10 executes coordinate transformation processing and calculates the three-dimensional coordinate based on the RGBD values of the identified pixels. The coordinate transformation processing is performed based on the matrix operation as described above. Through a series of those processing steps, the three-dimensional position information (FIG. 5) is generated at the predetermined time intervals (for example, every 1/60th of a second).

The audio processing unit 14 includes the microphone 4 and the like. For example, the audio processing unit 14 can identify a position at which the player 100 has made a sound based on time lags among sounds detected using a plurality of (for example, three) microphones. Further, as the microphone 4 of the audio processing unit 14, a unidirectional microphone that detects sounds originating from a sound source located along the line-of-sight of the CCD camera 2 may be applied.

The communication interface unit 15 is an interface for transmitting various kinds of data, such as the three-dimensional position information, to the game device 20.

1-5. Configuration of Game Device

FIG. 15 is a diagram illustrating a hardware configuration of the game device 20. As illustrated in FIG. 15, the game device 20 according to this embodiment includes a home-use game machine 21, a display unit 40, an audio output unit 41, an optical disk 42, and a memory card 43. The display unit 40 and the audio output unit 41 are connected to the home-use game machine 21. For example, a home-use television set is used as the display unit 40. Further, for example, a speaker integrated into the home-use television set is used as the audio output unit 41.

The optical disk 42 and the memory card 43 are information storage media, and are inserted into the home-use game machine 21.

The home-use game machine 21 is a publicly-known computer game system, and, as illustrated in FIG. 15, includes a bus 22, a microprocessor 23, a main memory 24, an image processing unit 25, an audio processing unit 26, an optical disk reproducing unit 27, a memory card slot 28, a communication interface (I/F) 29, a controller interface (I/F) 30, and a controller 31. Components other than the controller 31 are accommodated in an enclosure of the home-use game machine 21.

The bus 22 is used for exchanging addresses and data among the units constituting the home-use game machine 21. Specifically, the microprocessor 23, the main memory 24, the image processing unit 25, the audio processing unit 26, the optical disk reproducing unit 27, the memory card slot 28, the communication interface 29, and the controller interface 30 are connected to one another by the bus 22 so as to be able to communicate data thereamong.

The microprocessor 23 executes various kinds of information processing based on an operating system stored in a ROM (not shown), or programs read from the optical disk 42 or the memory card 43.

The main memory 24 includes, for example, a RAM. The program and data read from the optical disk 42 or the memory card 43 are written into the main memory 24 as necessary. The main memory 24 is also used as a working memory for the microprocessor 23.

Further, the main memory 24 stores the three-dimensional position information received from the position detecting device 1 at the predetermined time intervals. The microprocessor 23 controls the game based on the three-dimensional position information stored in the main memory 24.

The image processing unit 25 includes a VRAM, and renders, based on image data transmitted from the microprocessor 23, the game screen 50 in the VRAM. The image processing unit 25 converts the game screen 50 into video signals, and outputs the video signals to the display unit 40 at a predetermined timing.

The audio processing unit 26 includes a sound buffer. The audio processing unit 26 outputs, from the audio output unit 41, various kinds of audio data (game music, game sound effects, messages, etc.) that have been read from the optical disk 42 into the sound buffer.

The optical disk reproducing unit 27 reads a program and data recorded on the optical disk 42. In this embodiment, description is given by taking, as an example, a case where the optical disk 42 is used for supplying the program and the data to the home-use game machine 21. Alternatively, for example, another information storage medium (for example, memory card 43 or the like) may be used. Further, the program and the data may be supplied to the home-use game machine 21 via a data communication network such as the Internet.

The memory card slot 28 is an interface for the memory card 43 to be inserted into. The memory card 43 includes a nonvolatile memory (for example, EEPROM etc.). The memory card 43 stores various kinds of game data, such as saved data.

The communication interface 29 is an interface for establishing communication connection to a communication network such as the Internet.

The controller interface 30 is an interface for establishing wireless connection or wired connection to the controller 31. As the controller interface 30, an interface compliant with, for example, the Bluetooth (registered trademark) interface standard may be used. It should be noted that the controller interface 30 may be an interface for establishing wired connection to the controller 31.

1-6. Functions to be Implemented on Game Device

FIG. 16 is a functional block diagram illustrating a group of functions to be implemented on the game device 20. As illustrated in FIG. 16, on the game device 20, there are implemented a game data storage unit 80, a position acquiring unit 82, a determination unit 84, a game processing execution unit 86, a determination subject space changing unit 88, and a display control unit 90. Those functions are implemented by the microprocessor 23 operating according to programs read from the optical disk 42.

1-6-1. Game Data Storage Unit

The game data storage unit 80 is mainly implemented by the main memory 24 and the memory card 43. The game data storage unit 80 stores information necessary for executing the game. For example, the game data storage unit 80 stores animation information indicating how the game character 51 moves its body.

Further, for example, the game data storage unit 80 stores reference action information for identifying an action to be performed by the player 100.

FIG. 17 is a diagram illustrating an example of the reference action information. As illustrated in FIG. 17, as the reference action information, time information indicating a timing at which an action is to be performed and information for identifying an action to be performed by the player 100 are stored. The time information indicates, for example, an elapsed time after the game is started. In a data storage example illustrated in FIG. 17, for example, a time t1 indicates that the player 100 should perform an action of putting their right foot forward.

As described above, the game character 51 plays a role of showing an action to be performed by the player 100, and thus, when the time t1 arrives, the game character 51 performs an action that looks like putting its right foot forward. The animation information is created in such a manner as to correspond to the reference action information illustrated in FIG. 17. Specifically, every time a time indicated by the time information stored in the reference action information arrives, the game character 51 performs a predetermined animation action based on the animation information.
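The pairing of time information with actions, and the triggering that happens when each time arrives, can be pictured as follows. The times and action names are invented for this sketch, not taken from FIG. 17.

```python
# Invented times and action names, in the shape of FIG. 17.
reference_actions = [
    (1.0, "put right foot forward"),
    (2.5, "raise left hand"),
    (4.0, "put left foot forward"),
]

def due_actions(elapsed, last_elapsed):
    """Return the actions whose timing arrived since the previous
    frame; in the game, each such action would also trigger the
    corresponding animation of the game character 51."""
    return [action for t, action in reference_actions
            if last_elapsed < t <= elapsed]
```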

Further, for example, the game data storage unit 80 stores action determination criterion information, which serves as a condition for making a determination as to the action of the player based on the three-dimensional position information.

FIG. 18 is a diagram illustrating an example of the action determination criterion information. As illustrated in FIG. 18, as the action determination criterion information, for example, information for identifying the movement of the body of the player 100 and a determination criterion to be satisfied by the three-dimensional position information are stored in association with each other. The determination criterion includes, for example, a change amount, a change direction, a change speed, and the like of the three-dimensional coordinate of each part of the player 100. Specifically, for example, the determination criterion is a condition to be satisfied by the motion vector (three-dimensional vector) of each part of the player 100.

In a case where “putting the right foot forward” is the movement of the body which is stored in the action determination criterion information, for example, conditions relating to the change amounts, the change directions, and the change speeds of the sets of the three-dimensional coordinates of the right heel P13 and the right toe P15 are associated with this movement of the body. In this case, if the change amounts, the change directions, and the change speeds of the sets of the three-dimensional coordinates of the right heel P13 and the right toe P15 satisfy the conditions stored in the action determination criterion information, it is determined that the player 100 has put their right foot forward.

The same applies to other actions of the player 100 (for example, punching with the right hand, etc.), and the action of the player 100 is determined based on whether or not the three-dimensional coordinates indicated by the three-dimensional position information satisfy the conditions stored in the action determination criterion information. Specifically, in this embodiment, the action determination criterion information stores information for making a determination as to the dancing of the player 100. It should be noted that the action determination criterion information may be stored in a ROM (not shown) or the like of the game device 20.
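A minimal sketch of the determination for "putting the right foot forward": the change of the right heel P13 and right toe P15 between frames is compared against threshold conditions. All threshold values here are assumptions, not the actual criteria of FIG. 18.

```python
# Threshold values are assumptions, not the actual criteria of FIG. 18.
MIN_FORWARD_CHANGE = 0.15   # meters moved toward the device
MAX_SIDEWAYS_DRIFT = 0.10   # meters of allowed horizontal drift

def put_right_foot_forward(prev, curr):
    """prev/curr map part names to (x, y, z) coordinates, where the
    x axis points from the player toward the position detecting
    device 1. Both the right heel P13 and the right toe P15 must move
    forward without drifting sideways."""
    for part in ("right heel P13", "right toe P15"):
        dx = prev[part][0] - curr[part][0]       # change toward device
        dy = abs(curr[part][1] - prev[part][1])  # sideways drift
        if dx < MIN_FORWARD_CHANGE or dy > MAX_SIDEWAYS_DRIFT:
            return False
    return True
```

A full criterion would also bound the change speed, as the text notes; the per-frame change amount stands in for that here.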

Further, the game data storage unit 80 stores, for example, determination subject space information for identifying the determination subject space 70. For example, in a case where the shape of the determination subject space 70 is such a truncated pyramid as illustrated in FIG. 7, the length of each side of the determination subject space 70 and information indicating the representative point 71 are stored. That is, based on those items of information, the position of the determination subject space is identified. Further, the length of each side of the determination subject space 70 may be a value determined in advance.

For example, when the player 100 starts the game, the initial position of the representative point 71 is determined. For example, the position of the determination subject space 70 is determined so as to contain the position of the player 100 when the game is started. Specifically, for example, the representative point 71 is determined so as to correspond to the standing position of the player 100 at the time of starting the game. Alternatively, for example, the representative point 71 may be a point located along the line-of-sight of the CCD camera 2.

It should be noted that information that may be used as the determination subject space information is not limited to the above-mentioned example. The determination subject space information may be any information as long as the information allows the position and the size of the determination subject space 70 to be identified. For example, in the case where the shape of the determination subject space 70 is a truncated pyramid, the determination subject space information may be information indicating the upper left vertices and the lower right vertices of the top surface and the bottom surface of the determination subject space 70 and information indicating the representative point 71.

Further, the game data storage unit 80 stores information for identifying the detectable space 60. Similarly to the determination subject space information, this information may be any information as long as the information allows the position and the size of the detectable space 60 to be identified.

1-6-2. Position Acquiring Unit

The position acquiring unit 82 is mainly implemented by the microprocessor 23. The position acquiring unit 82 acquires the three-dimensional position information (FIG. 5) from position information generating means (microprocessor 10) for generating the three-dimensional position information relating to the position of the player 100 in the three-dimensional space based on the photographed image acquired from the position detecting device (photographing unit 12) for photographing the player 100 and the depth information relating to a distance between the measurement reference position of the depth measuring means (depth measuring unit 13) and the player 100.

In this embodiment, the position acquiring unit 82 acquires the three-dimensional position information generated by the microprocessor 10 (position information generating means) of the position detecting device 1.

1-6-3. Determination Unit

The determination unit 84 is mainly implemented by the microprocessor 23. The determination unit 84 determines whether or not the position of the player 100 in the three-dimensional space is contained in the determination subject space 70. For example, in a case where any one of the sets of the three-dimensional coordinates contained in the three-dimensional position information is outside the determination subject space 70, it is determined that the position of the player 100 corresponding to the three-dimensional position information is not contained in the determination subject space 70.

It should be noted that the determination method performed by the determination unit 84 may be any method as long as the method is performed based on the three-dimensional position information and the determination subject space 70, and that the determination method of the determination unit 84 is not limited thereto. For example, in a case where the sets of the three-dimensional coordinates corresponding to a plurality of (for example, three) of the plurality of (for example, sixteen) parts of the player 100 indicated by the three-dimensional position information are outside the determination subject space 70, it may be determined that the position of the player 100 is not contained in the determination subject space 70.
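
Both variants of this determination can be sketched as below. As a simplification not taken from the specification, the determination subject space 70 is modeled as an axis-aligned box rather than a truncated pyramid; the containment test would differ for the pyramid shape, but the counting logic is the same.

```python
# Illustrative sketch of the determination unit 84. The space is
# simplified to an axis-aligned box (the patent's example shape is a
# truncated pyramid); coordinates and bounds are invented.

def inside(space, point):
    """True when `point` lies within the box `space` = (min_corner, max_corner)."""
    (x0, y0, z0), (x1, y1, z1) = space
    x, y, z = point
    return x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1

def player_outside(space, parts, threshold=1):
    """True when at least `threshold` tracked parts lie outside the space.
    threshold=1 reproduces the 'any one coordinate' rule; threshold=3
    corresponds to the relaxed variant mentioned above."""
    outside = sum(1 for p in parts.values() if not inside(space, p))
    return outside >= threshold

space = ((-1.0, 0.0, 0.5), (1.0, 2.0, 3.5))
parts = {"P10": (0.0, 1.0, 2.0),   # waist: inside
         "P13": (1.5, 0.1, 2.0)}   # right heel: outside
```

With these values, the strict rule reports the player as outside while the three-part variant does not, which illustrates why the relaxed variant is more tolerant of a single stray coordinate.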

1-6-4. Game Processing Execution Unit

The game processing execution unit 86 is mainly implemented by the microprocessor 23. The game processing execution unit 86 executes game processing based on a result of a determination made by the determination unit 84. Details of operation of the game processing execution unit 86 are described later (see S105, S106, and S107 of FIG. 19).

1-6-5. Determination Subject Space Changing Unit

The determination subject space changing unit 88 is mainly implemented by the microprocessor 23. In a case where it is determined that the position of the player 100 in the three-dimensional space is not contained in the determination subject space 70, the determination subject space changing unit 88 changes the position of the determination subject space 70 based on the position of the player 100 in the three-dimensional space. Details of operation of the determination subject space changing unit 88 are described later (see S108 and S109 of FIG. 19).

1-6-6. Display Control Unit

The display control unit 90 is mainly implemented by the microprocessor 23. The display control unit 90 displays the game screen 50 on the display unit 40. In this embodiment, the display control unit 90 causes display means (display unit 40) to display the game screen 50 containing the game character 51 and the focused area (spotlight area 53) having its lightness set higher than that of the other area.

Further, the display control unit 90 includes means for controlling the positional relation between the display position of the game character 51 and the display position of the focused area based on the positional relation between the position of the player 100 in the three-dimensional space and the determination subject space 70. Details of operation of the display control unit 90 are described later (see S102 of FIG. 19).

1-7. Processing to be Executed on Game Device

FIG. 19 is a flow chart illustrating an example of processing to be executed on the game device 20. The processing of FIG. 19 is executed by the microprocessor 23 operating according to programs read from the optical disk 42. For example, the processing of FIG. 19 is executed at predetermined time intervals (for example, every 1/60th of a second).

As illustrated in FIG. 19, first, the microprocessor 23 (position acquiring unit 82) acquires the three-dimensional position information of the player 100 (S101).

The microprocessor 23 (display control unit 90) changes the position of the game character 51 to be displayed on the game screen 50 (S102). In S102, for example, the display position of the game character 51 is changed based on the positional relation between the three-dimensional position information of the player 100 and the representative point 71. For example, a determination is made as to the positional relation between the three-dimensional coordinate of the waist P10 contained in the three-dimensional position information of the player 100 and the representative point 71. Specifically, a direction D from the representative point 71 of the determination subject space 70 toward the three-dimensional coordinate of the waist P10 of the player 100 and a distance L therebetween are acquired (FIG. 7).

Next, the display position of the game character 51 is changed so that the positional relation between the display position of the game character 51 and a guidance position 55 of the spotlight area 53 corresponds to the positional relation between the three-dimensional coordinate of the waist P10 of the player and the representative point 71 of the determination subject space 70.

For example, as illustrated in FIG. 11 and FIG. 13, the display position of the game character 51 is changed from the guidance position 55 of the spotlight area 53 to a position 57 obtained by shifting the guidance position 55 of the spotlight area 53 by a distance Ls corresponding to the above-mentioned distance L in a direction Ds corresponding to the above-mentioned direction D. The direction Ds and the distance Ls are calculated based, for example, on the direction D and the distance L, respectively, and a predetermined mathematical expression. The predetermined mathematical expression may be, for example, a predetermined matrix (for example, a projection matrix) for transforming a three-dimensional vector into a two-dimensional vector.

Further, the guidance position 55, which is a position for guiding the game character 51, corresponds to the representative point 71. For example, the guidance position 55 is a position located a predetermined distance above a center point of the spotlight area 53.
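
The processing of S102 can be sketched as follows. The "predetermined mathematical expression" here is an assumed one: a simple scaled orthographic drop of the depth axis; the patent only requires some matrix mapping a three-dimensional vector to a two-dimensional one, so the scale factor and screen convention below are illustrative.

```python
# Illustrative sketch of S102: shifting the game character 51 from the
# guidance position 55 by a screen offset derived from the offset
# between the waist P10 and the representative point 71. The scale
# factor and the orthographic projection are assumptions.

PIXELS_PER_METER = 200.0  # illustrative scale factor

def screen_offset(representative, waist):
    """Map the 3D offset (direction D, distance L combined) to a 2D
    screen offset (direction Ds, distance Ls combined)."""
    dx = waist[0] - representative[0]  # direction D, X component
    dy = waist[1] - representative[1]  # direction D, Y component
    # Screen Y grows downward, so world "up" maps to negative screen Y.
    return (dx * PIXELS_PER_METER, -dy * PIXELS_PER_METER)

def character_position(guidance, representative, waist):
    """Display position 57 of the game character 51."""
    ox, oy = screen_offset(representative, waist)
    return (guidance[0] + ox, guidance[1] + oy)

# Player's waist 0.5 m to the right of the representative point 71:
pos = character_position((320, 240), (0.0, 1.0, 2.0), (0.5, 1.0, 2.0))
```

When the waist P10 coincides with the representative point 71, the offset is zero and the character sits exactly at the guidance position 55, which matches the behavior described above.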

Through the processing of S102, the display position of the game character 51 is controlled. Specifically, by referring to the positional relation between the game character 51 and the spotlight area 53 which are displayed on the game screen 50, the player 100 can recognize whether or not the position of the body of the player 100 is out of the determination subject space 70.

As a result, the player can adjust their own standing position. Further, in a case where the game character 51 has moved out of the spotlight area 53, the game character 51 becomes less visible, and hence it is conceivable that the player will unconsciously adjust their own standing position so that the game character 51 is positioned within the spotlight area 53. By controlling the display position of the game character 51 as described above, it also becomes possible to make the player unconsciously adjust their own standing position.

Referring back to FIG. 19, the microprocessor 23 (display control unit 90) updates the posture of the game character 51 displayed on the game screen 50 based on animation data (S103).

The microprocessor 23 (determination unit 84) determines whether or not at least one position of the body of the player 100 indicated by the three-dimensional position information is outside the determination subject space 70 (S104). The determination of S104 is performed by, for example, comparing the three-dimensional position information and the determination subject space information. Specifically, for example, a determination is made based on whether or not the three-dimensional coordinates (FIG. 5) contained in the three-dimensional position information are inside the determination subject space 70 (FIG. 7).

In a case where the position of the player is not outside the determination subject space 70 (S104; N), that is, in a case where all the positions corresponding to the player 100 are inside the determination subject space 70, the microprocessor 23 (game processing execution unit 86) determines whether or not the player 100 has moved their body according to the movement of the body of the game character 51 (S105).

In S105, it is determined whether or not the player 100 has performed an action similar to the action (movement of the body) performed by the game character 51. This determination is executed based, for example, on the three-dimensional position information, the reference action information (FIG. 17), and the determination criterion information (FIG. 18).

In a case where the reference action information is the data storage example illustrated in FIG. 17, for example, it is indicated that the game character 51 puts its right foot forward at the time t1. In this case, at a time at which the game character 51 puts its right foot forward (hereinafter, referred to as “reference time”), it is determined whether or not the player has put their right foot forward. The reference time is, for example, a time within a predetermined period including times (for example, time t1) stored in the reference action information.

Here, the “predetermined period” is, for example, a period from a start time that is a predetermined time before the reference time until an end time that is a predetermined time after the reference time. In a case where the player 100 has put their right foot forward within the above-mentioned predetermined period, it is determined that the player has moved their foot according to the movement of the foot of the game character 51. In other words, it is determined that the player 100 has performed an action according to the movement of the game character 51. As described above, whether or not the player 100 has put their right foot forward is determined based on the determination criterion information (FIG. 18).
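
The timing check around the reference time can be sketched as follows; the half-second margins are illustrative values, not taken from the specification.

```python
# Illustrative sketch: deciding whether the player's action falls within
# the predetermined period around a reference time (for example, time t1
# stored in the reference action information). Margins are assumptions.

BEFORE_MARGIN = 0.5  # seconds before the reference time (assumed)
AFTER_MARGIN = 0.5   # seconds after the reference time (assumed)

def action_in_window(action_time, reference_time):
    """True when the action occurred within the predetermined period,
    i.e. between the start time and the end time around the reference."""
    start = reference_time - BEFORE_MARGIN
    end = reference_time + AFTER_MARGIN
    return start <= action_time <= end
```

An action at 1.3 seconds therefore counts against a reference time t1 of 1.0 second, while one a full second late does not.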

In a case where it is determined that the player 100 has performed an action according to the movement of the game character 51 (S105; Y), the microprocessor 23 (game processing execution unit 86) displays the message 54 such as “GOOD” on the game screen 50 (S106).

On the other hand, in a case where it is not determined that the player 100 has performed an action according to the movement of the game character 51 (S105; N), the microprocessor 23 does not display such a message as in S106 and ends the processing.

On the other hand, in a case where at least one position of the body of the player is outside the determination subject space 70 (S104; Y), the microprocessor 23 (game processing execution unit 86) displays the message 54 such as “CAUTION” on the game screen (S107).

The microprocessor 23 (determination subject space changing unit 88) determines whether or not a state in which at least one position of the body of the player is outside the determination subject space 70 has continued for a reference period (for example, three seconds) (S108).

In a case where the state in which at least one position of the body of the player is outside the determination subject space 70 has continued for the reference period (S108; Y), the microprocessor 23 (determination subject space changing unit 88) changes the position of the determination subject space 70 (S109).
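
The continuation check of S108 can be sketched as a per-frame counter, given that the processing of FIG. 19 runs every 1/60th of a second. The frame-counting formulation below is an implementation assumption; the patent only requires that the outside state persist for the reference period.

```python
# Illustrative sketch of S108: tracking how long the player has been
# continuously outside the determination subject space 70. update() is
# assumed to be called once per frame (every 1/60th of a second).

FRAMES_PER_SECOND = 60
REFERENCE_PERIOD_FRAMES = 3 * FRAMES_PER_SECOND  # three-second example

class OutsideTimer:
    def __init__(self):
        self.frames = 0

    def update(self, outside):
        """Returns True when the outside state has lasted for the
        reference period; any frame back inside resets the count."""
        self.frames = self.frames + 1 if outside else 0
        return self.frames >= REFERENCE_PERIOD_FRAMES
```

The reset on re-entry is what prevents a brief excursion from triggering the change of the determination subject space 70, in line with the design intent described in this section.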

In S109, for example, the three-dimensional coordinate of the waist P10 in the three-dimensional position information is referred to. Subsequently, the position of the determination subject space 70 is changed so that the representative point 71 of the determination subject space 70 coincides with the three-dimensional coordinate corresponding to the waist P10 of the player.

FIG. 20 is a diagram illustrating the determination subject space 70 after the change. As illustrated in FIG. 20, the position of the determination subject space 70 is changed so that the position of the waist P10 of the player 100 coincides with the representative point 71.

It should be noted that the change method for the position of the determination subject space 70 performed in S109 may be any method as long as the position of the player 100 is contained in the determination subject space 70 based on the position of the player 100 indicated by the three-dimensional position information, and that the change method is not limited to the above-mentioned example. In addition, for example, the position of the determination subject space 70 may be changed so that an average value of the sets of three-dimensional coordinates contained in the three-dimensional position information coincides with the representative point 71.
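
Both change methods for S109 can be sketched as a translation of the space; the coordinate values are invented for illustration.

```python
# Illustrative sketch of S109: translating the determination subject
# space 70 so that its representative point 71 coincides with the waist
# P10 (or, in the variant above, with the average of all tracked parts).

def recenter(representative, target):
    """Translation vector to apply to the whole space so that the
    representative point 71 moves onto `target`."""
    return tuple(t - r for r, t in zip(representative, target))

def average_position(parts):
    """Average of the sets of three-dimensional coordinates, for the
    variant that recenters on the player's mean position."""
    n = len(parts)
    return tuple(sum(p[i] for p in parts.values()) / n for i in range(3))

waist = (0.5, 1.0, 2.5)
shift = recenter((0.0, 1.0, 2.0), waist)  # move every vertex by `shift`
```

Applying `shift` to every vertex of the space leaves its size and shape unchanged while moving the representative point onto the player, matching FIG. 20.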

For example, in a case where the player 100 has moved closer to an obstacle, it is conceivable that the player 100 will notice that fact within a predetermined time period and return to the original position. In view of the above, in a case where the state in which the player 100 is out of the determination subject space 70 before the change has continued for the predetermined time period, it is conceivable that there is a high possibility that there is no obstacle around the current standing position of the player 100. Thus, in this case, in order to allow the player 100 to continue the gameplay, as illustrated in FIG. 20, the position of the determination subject space 70 is changed so that the position of the body of the player 100 is contained in the determination subject space 70.

On the other hand, in a case where the state in which at least one position of the body of the player is outside the determination subject space 70 has not continued for the reference period (S108; N), the microprocessor 23 ends the processing. Specifically, in this case, the microprocessor 23 does not perform the processing of displaying the message 54 such as “GOOD” based on the movement of the body of the player 100. In other words, by making no response to the dance action performed by the player 100, it is also possible to make the player 100 understand that the player 100 is not in the determination subject space 70.

1-8. Summary of Embodiment

In the game device 20 described above, when the game processing is executed, whether or not the message 54 such as “CAUTION” is to be displayed or whether or not detection of the action of the player 100 is to be avoided is determined based on whether or not the player 100 is in the determination subject space 70. Further, the position of the determination subject space 70 is changed based on the position of the player 100 in the three-dimensional space, and hence it is possible to change the determination subject space 70 for the player 100 to be able to continue the gameplay while prompting the player 100 to stay in the determination subject space 70.

As described above, at the time of starting the game (alternatively, immediately before or after starting the game), the determination subject space 70 is set at a position at which the surroundings of the player 100 are safe. Accordingly, by guiding the player 100 to this safe position, it is possible to reduce a risk that the player 100 will hit an obstacle or another player 100. Therefore, even if a game is configured to require the player 100 to move their body, the player 100 can play the game safely.

Further, based on the positional relation between the position of the body of the player 100 and the position of the determination subject space 70, the positional relation between the display position of the game character 51 and the display position of the spotlight area 53 is controlled. For example, in the case where the position of the body of the player 100 is out of the determination subject space 70, the game character 51 is located outside the spotlight area 53 (see FIG. 11 and FIG. 13).

According to the game device 20, by referring to the positional relation between the game character 51 and the spotlight area 53 which are displayed on the game screen 50, the player 100 can recognize whether or not the position of the body of the player 100 is out of the determination subject space 70. As a result, in the case where the standing position of the player 100 has changed during the gameplay, the player 100 can know that their standing position has changed. Therefore, the player 100 can adjust their own standing position.

Further, in the case where the game character 51 has moved out of the spotlight area 53, the game character 51 becomes less visible. Accordingly, it is conceivable that the player 100 will try to unconsciously adjust their own standing position so that the game character 51 is located within the spotlight area 53. As described above, according to the game device 20, it is also possible to make the player 100 unconsciously adjust their own standing position.

Further, in the game device 20, in the case where the state in which at least one position of the body of the player 100 is outside the determination subject space 70 has continued for the reference period (for example, three seconds), the position of the determination subject space 70 is changed so that the position of the body of the player 100 is contained in the determination subject space 70 (see FIG. 20).

According to the above-mentioned processing of the game device 20, in the state in which the position of at least one part of the body of the player 100 is outside the determination subject space 70 because the standing position of the player 100 has changed during the gameplay, it is possible to continue the game even if the player 100 does not adjust their standing position. As described above, in the case where the state in which the player 100 is outside the determination subject space 70 has continued for the reference period, there is a high possibility that no obstacle is around the standing position of the player 100, and hence it is possible to guarantee safety for the player 100 even if the position of the determination subject space 70 is changed.

Further, for example, if the position of the determination subject space 70 is changed even in a case where any one position of the body of the player is outside the determination subject space 70 for a brief moment, there arises a fear that the player will become confused instead. In this respect, the game device 20 is capable of preventing the player from feeling such confusion.

2. Modified Examples

It should be noted that the present invention is not limited to the embodiment described above.

2-1. First Modified Example

In S102 of FIG. 19, the display position of the spotlight area 53 (and the spotlight 52) may be changed so that the positional relation between the display position of the game character 51 and the guidance position 55 of the spotlight area 53 corresponds to the positional relation between the position of the player 100 (for example, the three-dimensional coordinate of the waist P10) and the representative point 71 of the determination subject space 70. In other words, the spotlight area 53 may be moved instead of moving the game character 51.

Alternatively, the display positions of both the game character 51 and the spotlight area 53 (and the spotlight 52) may be changed so that the positional relation between the display position of the game character 51 and the guidance position 55 of the spotlight area 53 corresponds to the positional relation between the position of the player 100 (for example, the three-dimensional coordinate of the waist P10) and the representative point 71 of the determination subject space 70.

Further, without changing the relative positions of the game character 51, the spotlight area 53, and the like, the display position of the game character 51 may be moved to the right-hand side, the left-hand side, or the like of the game screen 50. Specifically, for example, in a case where the game screen 50 is a screen showing a situation of a virtual game space viewed from a virtual camera, by changing the position of the virtual camera, the display position of the game character 51 is changed as described above.

FIG. 21 is a diagram illustrating a case where the display position of an image contained in the game screen 50 has been changed. The game screen 50 illustrated in FIG. 21 is displayed, for example, in a case where the player 100 has moved to the right-hand side of the determination subject space 70 with respect to the position detecting device 1. The position of the virtual camera is changed to the left, and thus, for example, the game character 51 located in the vicinity of the center of the game screen 50 is moved to the right in relation to a center point of the game screen 50.

Further, for example, like an area 50 a, the vicinity of a left end portion of the game screen 50 is displayed in black. The width and the position of the area 50 a are determined based, for example, on the distance L and the direction D between the representative point 71 and the waist P10 of the player 100. It seems to the player 100 that nothing is displayed in the area 50 a located in the vicinity of the left end portion of the game screen 50. In this case, it is conceivable that the player 100 will move to their left, trying to move the display position of the game character 51 back to the original position. Thus, according to the game screen 50, it is possible to guide the player 100 to the inside of the determination subject space 70.
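
A sketch of how the area 50 a might be derived follows. The specification only says the width and position depend on the distance L and the direction D; the scale factor, the clamping, and the one-quarter-screen cap below are assumptions.

```python
# Illustrative sketch of the first modified example: deriving the edge
# and width of the blacked-out area 50a from the horizontal offset
# between the waist P10 and the representative point 71. Scale and cap
# are assumptions, not values from the specification.

SCREEN_WIDTH = 640
PIXELS_PER_METER = 200.0

def blackout_area(representative_x, waist_x):
    """Returns (side, width): which edge of the game screen 50 to black
    out, and how many pixels wide the area 50a should be."""
    offset = (waist_x - representative_x) * PIXELS_PER_METER
    # Player moves right -> virtual camera pans left -> left edge goes black.
    side = "left" if offset > 0 else "right"
    width = min(abs(offset), SCREEN_WIDTH // 4)  # clamp to a quarter screen
    return side, int(width)
```

For a player 0.5 m to the right of the representative point, this yields a left-edge black band, nudging the player back to their left as described above.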

By performing the display control of the game screen 50 as described above, it is possible to notify the player 100 that their standing position is displaced, without changing the relative positions of respective images (game character 51 and the like) contained in the game screen 50. Here, the description above is directed to the case where the position of the virtual camera is changed. However, the notification of the standing position of the player 100 may be performed by changing the angle of view or the line-of-sight of the virtual camera.

2-2. Second Modified Example

Further, the game screen 50 only needs to show the positional relation between the standing position of the player 100 and the representative point 71 of the determination subject space 70, and thus the example of the game screen 50 is not limited to the example of this embodiment.

FIG. 22 is a diagram illustrating another example of the game screen 50. On the game screen 50 illustrated in FIG. 22, a player character 51 a (first game character) corresponding to the player 100 and an instructor character 51 b (second game character) are displayed.

In this case, for example, the player 100 moves their body according to the movement of the instructor character 51 b. Then, based on the movement of the player 100, the player character 51 a performs an action. Similarly to the embodiment, in a case where the player 100 has succeeded in performing the action, the message 54 such as “GOOD” is displayed.

Alternatively, the player character 51 a and the instructor character 51 b may move in the same manner, and the player 100 may move their body according to the movement of the player character 51 a and the instructor character 51 b.

In the second modified example, the positional relation between the player character 51 a and the instructor character 51 b is changed based on the positional relation between the position of the player 100 and the determination subject space 70. For example, in the case where the position of the player 100 is contained in the determination subject space 70, the player character 51 a is displayed substantially in front of the instructor character 51 b as illustrated in FIG. 22.

On the other hand, for example, in the case where the position of the player 100 is out of the determination subject space 70 as illustrated in FIG. 10, the player character 51 a is displayed at a position displaced far from the instructor character 51 b as illustrated in FIG. 23, for example. Further, the message 54 such as “CAUTION” is displayed.

Further, for example, in the case where the position of the player 100 is out of the determination subject space 70 as illustrated in FIG. 12, the player character 51 a is displayed at a position significantly displaced sideways from the instructor character 51 b as illustrated in FIG. 24, for example.

In the second modified example, for example, processing similar to the processing of S102 of FIG. 19 is executed. Specifically, at least one of the display positions of the player character 51 a and the instructor character 51 b is changed so that the positional relation between the display position of the player character 51 a and the display position of the instructor character 51 b corresponds to the positional relation between the position of the player 100 and the representative point 71 of the determination subject space 70.

For example, first, the three-dimensional coordinate of the waist P10 of the player 100 is referred to. Subsequently, a determination is made as to the positional relation between the three-dimensional coordinate of the waist P10 of the player 100 and the representative point 71 of the determination subject space 70. For example, a difference between the three-dimensional coordinate of the waist P10 of the player 100 and the representative point 71 of the determination subject space 70 is acquired. Specifically, the direction D from the representative point 71 of the determination subject space 70 toward the three-dimensional coordinate of the waist P10 of the player 100 and the distance L therebetween are acquired.

After that, the display position of the player character 51 a is changed so that the positional relation between the display position of the player character 51 a and the display position of the instructor character 51 b corresponds to the positional relation between the three-dimensional coordinate of the waist P10 of the player 100 and the representative point 71 of the determination subject space 70. For example, as illustrated in FIG. 23 and FIG. 24, the display position of the player character 51 a is changed from a basic position 56 set in front of the instructor character 51 b to a position 57 obtained by shifting the basic position 56 by the distance Ls corresponding to the above-mentioned distance L in the direction Ds corresponding to the above-mentioned direction D.

According to the second modified example, by referring to the positional relation between the player character 51 a and the instructor character 51 b, the player 100 can recognize whether or not the position of the body of the player 100 is out of the determination subject space 70. As a result, in such a case where the standing position of the player 100 has changed during the gameplay, the player 100 can know that their standing position has changed, and accordingly can adjust their own standing position. Therefore, the player 100 can play the game in a relatively safe place within the determination subject space 70.

Here, in the case where the position of the player character 51 a is displaced from the front of the instructor character 51 b or displaced far from the instructor character 51 b, it is generally conceivable that the player 100 will try to set the position of the player character 51 a to the front of the instructor character 51 b.

Specifically, in the case where the position of the player character 51 a is displaced from the front of the instructor character 51 b or displaced far from the instructor character 51 b, the player 100 conceivably feels difficulty in imitating the movement of the instructor character 51 b. Therefore, it is conceivable that the player 100 will unconsciously adjust their own standing position so that the position of the player character 51 a is set to the front of the instructor character 51 b.

As described above, by controlling the positional relation between the player character 51 a and the instructor character 51 b, it is possible to make the player 100 unconsciously adjust their own standing position.

In the second modified example, too, the display control as illustrated in FIG. 21 may be performed. Specifically, without changing the relative positions of the player character 51 a and the instructor character 51 b, the display positions of the player character 51 a and the instructor character 51 b may be moved to the right-hand side, the left-hand side, or the like of the game screen 50. In this case, too, similarly to the case illustrated in FIG. 21, by changing the position of the virtual camera, for example, the display positions of the player character 51 a and the instructor character 51 b are changed as described above.

FIG. 25 is a diagram illustrating a case where the display position of an image contained in the game screen 50 has been changed. The game screen 50 illustrated in FIG. 25 is displayed, for example, in the case where the player 100 has moved to the right-hand side of the determination subject space 70 with respect to the position detecting device 1. The position of the virtual camera is changed to the left, and thus, for example, the player character 51 a and the instructor character 51 b located in the vicinity of the center of the game screen 50 are moved to the right in relation to the center point of the game screen 50.

Further, for example, similarly to FIG. 21, the area 50 a is displayed. In this case, it is conceivable that the player 100 will move to their left so as to move the display positions of the player character 51 a and the instructor character 51 b back to the original positions. Thus, according to the game screen 50, it is possible to guide the player 100 to the inside of the determination subject space 70.
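The guidance behavior described above can be sketched as follows (an illustrative Python sketch only, not part of the disclosed embodiment; the function name and the gain parameter are hypothetical). Shifting the virtual camera opposite to the player's lateral displacement makes the characters appear displaced on the game screen 50, prompting the player 100 to step back toward the determination subject space 70.

```python
def camera_offset_for_guidance(player_x: float, space_center_x: float,
                               gain: float = 1.0) -> float:
    """Return a lateral virtual-camera offset.

    Moving the camera away from the side toward which the player has
    drifted makes the characters appear displaced toward that side of
    the screen, prompting the player to step back toward the
    determination subject space.
    """
    displacement = player_x - space_center_x  # positive when the player drifts right
    return -gain * displacement               # camera moves left; characters appear right
```

With this sign convention, a player who has drifted to the right of the determination subject space sees the characters displaced to the right of the screen, as in FIG. 25, and moves left to restore them.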

2-3. Other Modified Examples

It should be noted that the present invention is not limited to the embodiment and the modified examples which are described above, and that various modifications may be made as needed without departing from the gist of the present invention.

(1) For example, the three-dimensional position information indicating the position of the player 100 has been described by taking, as an example, the data storage example illustrated in FIG. 5. However, the three-dimensional position information transmitted from the position detecting device 1 may be any information as long as the information allows the position (for example, standing position) of the player 100 to be identified, and thus the data storage example is not limited to the example of FIG. 5. Alternatively, for example, the three-dimensional position information may be information that indicates a distance and a direction from a reference point of the player 100 (for example, a point corresponding to the head) to each part of the body.
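The alternative representation mentioned in (1) can be sketched as follows (an illustrative Python sketch; the data layout and names are hypothetical, since the patent does not prescribe a storage format). Absolute per-part coordinates are re-expressed as a distance and a direction (offset vector) from a reference part such as the head.

```python
import math

def to_reference_relative(parts: dict, reference: str = "head") -> dict:
    """Re-express absolute body-part coordinates relative to a reference part.

    `parts` maps a body-part name to an (x, y, z) tuple; the result maps
    each other part to its offset vector from the reference point and the
    corresponding distance.
    """
    rx, ry, rz = parts[reference]
    out = {}
    for name, (x, y, z) in parts.items():
        if name == reference:
            continue  # the reference part itself needs no relative entry
        dx, dy, dz = x - rx, y - ry, z - rz
        out[name] = {"offset": (dx, dy, dz),
                     "distance": math.sqrt(dx * dx + dy * dy + dz * dz)}
    return out
```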

(2) Further, for example, the description above has been given by taking the example in which the position information generating means for generating the three-dimensional position information based on the photographed image and the depth information (depth image) is included in the position detecting device 1. However, the position information generating means may be included in the game device 20. Specifically, the game device 20 may receive the photographed image and the depth image from the position detecting device 1, to thereby generate the three-dimensional position information based on those images.

(3) Further, for example, the description above has been given by taking, as a method of analyzing the movement of the player 100 based on the three-dimensional position information, the example in which a comparison is made between the action determination criterion information illustrated in FIG. 18 and the change amount, the change direction, the change speed, etc. of the three-dimensional coordinate of each part of the player 100. The analysis method for the movement of the player 100 may be any method as long as the method is performed based on the three-dimensional position information, and thus the analysis method is not limited to the above-mentioned example. Alternatively, for example, the movement of the player 100 may be analyzed based on values acquired by substituting the three-dimensional coordinates contained in the three-dimensional position information into a predetermined mathematical expression.
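The comparison against action determination criterion information described in (3) can be illustrated with a minimal sketch (Python; illustrative only, with a hypothetical criterion format, since FIG. 18 is not reproduced here). Each criterion requires a minimum signed change of a body part along one axis, so both the change amount and the change direction are checked.

```python
def matches_action(prev: dict, curr: dict, criteria: dict) -> bool:
    """Check per-part coordinate changes against an action criterion.

    `prev` and `curr` map a part name to an (x, y, z) tuple for two
    successive frames. `criteria` maps a part name to (axis_index,
    min_change); a positive min_change requires movement in the positive
    axis direction, a negative one in the negative direction.
    """
    for part, (axis, min_change) in criteria.items():
        change = curr[part][axis] - prev[part][axis]
        if min_change >= 0 and change < min_change:
            return False
        if min_change < 0 and change > min_change:
            return False
    return True
```

A change-speed check could be added in the same way by dividing the change amount by the frame interval.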

(4) Further, for example, in the case where a plurality of players 100 play the game, such control may be performed that prevents the determination subject spaces 70 corresponding to the respective players 100 from overlapping each other. For example, in a case where two players 100 play the game, the three-dimensional position information contains sets of the three-dimensional coordinates for the two players. In a case where changing the determination subject space 70 so that the representative point 71 moves to the three-dimensional coordinate of the waist P10 of one player 100 causes the changed determination subject space 70 to overlap the determination subject space 70 of the other player 100, there is a risk that the two players will hit each other, and thus control may be performed so as to prevent such change.

Specifically, in the case where the game executed by the game device 20 is played by a plurality of players 100, the determination subject space changing unit 88 may include means for inhibiting change in the case where changing the position of the determination subject space 70 corresponding to one player 100 causes the changed determination subject space 70 to overlap the determination subject space 70 corresponding to another player 100.
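The inhibiting means described above can be sketched as follows (an illustrative Python sketch; the determination subject space 70 is simplified here to an axis-aligned box given by its minimum and maximum corners, and all names are hypothetical).

```python
def boxes_overlap(a, b) -> bool:
    """a, b: ((min_x, min_y, min_z), (max_x, max_y, max_z))."""
    (amin, amax), (bmin, bmax) = a, b
    return all(amin[i] < bmax[i] and bmin[i] < amax[i] for i in range(3))

def try_move_space(space, new_center, other_space):
    """Recenter `space` on `new_center`, keeping its size.

    The move is refused (the original space is returned) if the moved
    space would overlap the other player's determination subject space.
    """
    mn, mx = space
    half = [(mx[i] - mn[i]) / 2 for i in range(3)]
    moved = (tuple(new_center[i] - half[i] for i in range(3)),
             tuple(new_center[i] + half[i] for i in range(3)))
    return space if boxes_overlap(moved, other_space) else moved
```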

(5) Further, in the first modified example and the second modified example, the reference points, which are referred to when the display control unit 90 controls the display position and which represent the positional relation between the position of the player 100 and the determination subject space 70, are set to the three-dimensional coordinate of the waist P10 and the representative point 71, respectively. The display control unit 90 only needs to determine whether or not to control the display position based on the three-dimensional position information corresponding to the player 100 and information identifying the position of the determination subject space 70, and thus information items to be compared are not limited to the above-mentioned example. For example, a comparison may be made between an average value of sets of the three-dimensional coordinates contained in the three-dimensional position information and one arbitrary point within the determination subject space 70.
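The alternative comparison mentioned in (5) can be sketched as follows (an illustrative Python sketch; the tolerance-based comparison is a hypothetical example of one way to compare the averaged coordinate with a point in the determination subject space 70).

```python
def player_representative_point(coords):
    """Average the three-dimensional coordinates of the body parts to
    obtain a single representative point for the player."""
    n = len(coords)
    return tuple(sum(c[i] for c in coords) / n for i in range(3))

def within_tolerance(point_a, point_b, tol) -> bool:
    """True if the two points agree to within `tol` on every axis."""
    return all(abs(point_a[i] - point_b[i]) <= tol for i in range(3))
```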

(6) Further, in this embodiment, the method of measuring the depth of the player 100 has been described by taking, as an example, the case of performing calculation based on the TOF of the infrared light. However, the measuring method is not limited to the example of this embodiment. Alternatively, for example, a method of performing triangulation, a method of performing three-dimensional laser scanning, or the like may be applied. Further, the description has been given by taking the example in which the depth information is acquired as the depth image, but the depth information is not limited thereto. The depth information may be any information as long as the information allows the depth of the player 100 to be identified, and hence the depth information may be a value indicating the TOF, for example.
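The TOF-based calculation mentioned in (6) can be illustrated with a minimal sketch (Python; illustrative only). Because the infrared light travels to the player 100 and back, the one-way depth is half the round-trip distance.

```python
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_from_tof(round_trip_seconds: float) -> float:
    """Compute the depth of the player from the measured time of flight
    of the infrared light (round trip, hence the division by two)."""
    return SPEED_OF_LIGHT * round_trip_seconds / 2
```

For example, a round-trip time on the order of 20 nanoseconds corresponds to a depth of roughly three meters.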

(7) Further, the determination subject space 70 has been described by taking, as an example, the shape illustrated in FIG. 7, but the shape of the determination subject space 70 is not limited thereto. The determination subject space 70 may have any shape as long as the shape allows the position at which the player 100 should be standing to be identified, and hence the determination subject space 70 may have a spherical shape, for example. In this case, the game data storage unit 80 stores the representative point 71 of the determination subject space 70 (for example, center point of the sphere) and information for identifying the radius of the sphere.
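For the spherical shape mentioned in (7), the containment determination reduces to a distance check against the stored representative point 71 and radius (an illustrative Python sketch; names are hypothetical).

```python
import math

def in_spherical_space(player_pos, center, radius) -> bool:
    """True if the player's representative coordinate lies inside a
    spherical determination subject space given by its center point
    (representative point) and radius."""
    return math.dist(player_pos, center) <= radius
```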

(8) Further, in this embodiment, only the position of the determination subject space 70 is changed with the size thereof kept as it is. However, the position of the determination subject space 70 may instead be changed by changing its size. Specifically, in the case where the state in which the player 100 is out of the determination subject space 70 has continued for the predetermined time period, the position of the determination subject space 70 may be changed in such a manner that the determination subject space 70 is enlarged in the direction in which the player 100 is out of the determination subject space 70.
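The enlargement described in (8) can be sketched as follows (an illustrative Python sketch; the determination subject space 70 is again simplified to an axis-aligned box, and the margin parameter is a hypothetical addition). The space is extended on each axis only as far as needed to contain the player's position.

```python
def enlarge_toward_player(space, player_pos, margin=0.0):
    """Extend an axis-aligned determination subject space just far enough,
    on each axis, to contain the player's position (plus an optional
    margin); axes on which the player is already inside are untouched."""
    mn = list(space[0])
    mx = list(space[1])
    for i in range(3):
        if player_pos[i] < mn[i]:
            mn[i] = player_pos[i] - margin
        elif player_pos[i] > mx[i]:
            mx[i] = player_pos[i] + margin
    return tuple(mn), tuple(mx)
```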

(9) Further, for example, in this embodiment, the game device 20 makes a determination as to the movement of the player 100 based on the three-dimensional position information. However, the position detecting device 1 may make a determination as to the movement of the player 100. In this case, the determination criterion information (FIG. 18) is stored in the position detecting device 1. Specifically, a determination is made as to the movement of the player 100 by the position detecting device 1, and only information indicating a result of the determination is transmitted to the game device 20.

(10) Further, in the case where the player 100 is out of the determination subject space 70, the display control processing performed by the display control unit 90 is not limited to this embodiment and the modified examples (FIG. 11, FIG. 13, FIG. 21, FIG. 23, FIG. 24, and FIG. 25). It is only necessary to perform display control that notifies the player 100 of being out of the determination subject space 70. For example, the entire determination subject space 70 may correspond to the entire display area of the game screen 50. Specifically, in the case where the player 100 is out of the determination subject space 70, the game character 51 may be made invisible on the game screen 50.

(11) Alternatively, for example, in the case where the player 100 is not in the determination subject space 70, the display control unit 90 may perform predetermined image processing on an image contained in the game screen 50. Specifically, for example, noise processing may be performed on the game character 51, which is a focus target of the player 100, to thereby make the game character 51 less visible.

(12) Further, in this embodiment, the dance game has been described as an example of the game to be executed on the game device 20. The game to be executed on the game device 20 may be any game as long as the movement of the player 100 is detected to execute the game processing, and thus the kind of the game to be executed is not limited thereto. Alternatively, for example, the game may be a sport game such as a soccer game, a fighting game, or the like.

While there have been described what are at present considered to be certain embodiments of the invention, it will be understood that various modifications may be made thereto, and it is intended that the appended claims cover all such modifications as fall within the true spirit and scope of the invention.

Patent Citations

- US20060066723 * (filed Sep 9, 2004; published Mar 30, 2006; Canon Kabushiki Kaisha): Mobile tracking system, camera and photographing method
- US20090221374 * (filed Apr 26, 2009; published Sep 3, 2009; Ailive Inc.): Method and system for controlling movements of objects in a videogame
Referenced by

- US8217327 * (filed Feb 17, 2009; published Jul 10, 2012; Samsung Electronics Co., Ltd.): Apparatus and method of obtaining depth image
- US8520901 (filed Jun 10, 2011; published Aug 27, 2013; Namco Bandai Games Inc.): Image generation system, image generation method, and information storage medium
- US8854304 (filed Jun 10, 2011; published Oct 7, 2014; Namco Bandai Games Inc.): Image generation system, image generation method, and information storage medium
- US20130059661 * (filed Jun 27, 2012; published Mar 7, 2013; Zeroplus Technology Co., Ltd.): Interactive video game console
Classifications

U.S. Classification: 463/36
International Classification: A63F9/24
Cooperative Classification: A63F13/06, A63F2300/1018, A63F2300/1087
European Classification: A63F13/06
Legal Events

Date: Mar 9, 2011; Code: AS; Event: Assignment
Owner name: KONAMI DIGITAL ENTERTAINMENT CO., LTD., JAPAN
Effective date: 20110221
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:YAMAGUCHI, TAKESHI;REEL/FRAME:025927/0391