Publication number: US20060141433 A1
Publication type: Application
Application number: US 11/022,774
Publication date: Jun 29, 2006
Filing date: Dec 28, 2004
Priority date: Dec 28, 2004
Also published as: US7646934, US20060140485, US20100285874
Inventors: Cheung Hing, Hiromu Ueshima
Original Assignee: Hing Cheung C, Hiromu Ueshima
Method of detecting position of rectangular object and object detector
US 20060141433 A1
Abstract
A method of detecting a position of a rectangular object includes the steps of: capturing an image of the object by an image sensor having a rectangular image plane having four edges; detecting, for each edge of the four edges, a distance to a point of the image of the object closest to said each edge; and determining a position of a predefined point of the image of the object based on the detected distances.
Images (15)
Claims (20)
1. A method of detecting a position of a rectangular object comprising the steps of:
capturing an image of the object by an image sensor having a rectangular image plane having four edges;
detecting, for each edge of the four edges, a distance to a point of the image of the object closest to said each edge; and
determining a position of a predefined point of the image of the object based on the detected distances.
2. A method as recited in claim 1 wherein the step of determining includes the step of determining a position of the center point of the image of the object based on the distances.
3. A method as recited in claim 2 wherein a coordinate system having a first axis and a second axis is defined on the image plane, the first axis of the coordinate system being perpendicular to a first pair of opposite edges of the image plane, and the second axis of the coordinate system being perpendicular to a second pair of the edges of the image plane, and
the step of determining a position of the center point includes the step of:
determining first-axis coordinates of points closest to respective edges of the first pair of edges;
calculating a first-axis coordinate of the center point by averaging the first-axis coordinates determined in the step of determining first-axis coordinates;
determining second-axis coordinates of points closest to respective edges of the second pair of edges; and
calculating a second-axis coordinate of the center point by averaging the second-axis coordinates determined in the step of determining second-axis coordinates.
4. A method as recited in claim 3 further comprising the step of calculating an angle θ which one of the edges of the image of the object forms with one of the four edges of the rectangular image plane using the first-axis coordinates determined in the step of determining first-axis coordinates, and the second-axis coordinates determined in the step of determining second-axis coordinates.
5. A method as recited in claim 4 wherein
the first-axis coordinates determined in the step of determining first-axis coordinates include coordinate values R1x and L1x where R1x>L1x; and
the second-axis coordinates determined in the step of determining second-axis coordinates include coordinate values T1y and B1y where T1y>B1y; and wherein the step of calculating an angle includes the step of calculating the angle θ by the following equation:
θ = tan⁻¹[(T1y − B1y)/(R1x − L1x)].
6. A method as recited in claim 5 further comprising the step of rounding the angle θ to a nearest one of a predetermined set of angles.
7. A method as recited in claim 6 wherein the predetermined set of angles includes a series of angles increasing by a specific common difference.
8. A method as recited in claim 1 wherein
the object includes a rectangular retro-reflective strip, and
the step of capturing comprises the steps of:
turning on a lighting device;
capturing and storing a first image of the object by the image sensor while the lighting device is on;
turning off the lighting device;
capturing and storing a second image of the object by the image sensor while the lighting device is off; and
subtracting the second image from the first image.
9. A method as recited in claim 8 wherein the image plane of the image sensor has a plurality of pixels each producing a pixel signal having a plurality of signal levels;
the method further including a step of down-sampling the pixel signals of the plurality of pixels of the image sensor to 1-bit signals.
10. A method as recited in claim 9 wherein the step of detecting comprises the steps of, for each edge of the four edges of the image plane, scanning the down-sampled rectangular image plane starting from said each edge in a direction toward an opposite edge until a point having a predetermined first value is found.
11. An object detector for detecting a position of a rectangular object comprising:
an image sensor having a rectangular image plane having four edges;
a distance detector that detects, for each edge of the four edges, a distance to a point of the image of the object closest to said each edge; and
a position determiner that determines a position of a predefined point of the image of the object based on the detected distances.
12. An object detector as recited in claim 11 wherein the position determiner includes a center position determiner that determines a position of the center point of the image of the object based on the distances.
13. An object detector as recited in claim 12 wherein a coordinate system having a first axis and a second axis is defined on the image plane, the first axis of the coordinate system being perpendicular to a first pair of opposite edges of the image plane, and the second axis of the coordinate system being perpendicular to a second pair of the edges of the image plane, and
the center position determiner includes:
a first-axis coordinate determiner that determines first-axis coordinates of points closest to respective edges of the first pair of edges;
a first-axis coordinate calculator that calculates a first-axis coordinate of the center point by averaging the first-axis coordinates determined by the first-axis coordinate determiner;
a second-axis coordinate determiner that determines second-axis coordinates of points closest to respective edges of the second pair of edges; and
a second-axis coordinate calculator that calculates a second-axis coordinate of the center point by averaging the second-axis coordinates determined by the second-axis coordinate determiner.
14. An object detector as recited in claim 13 further including an angle calculator that calculates an angle θ which one of the edges of the image of the object forms with one of the four edges of the rectangular image plane using the first-axis coordinates determined by the first-axis coordinate determiner, and the second-axis coordinates determined by the second-axis coordinate determiner.
15. An object detector as recited in claim 14 wherein the first-axis coordinates determined by the first-axis coordinate determiner include coordinate values R1x and L1x where R1x>L1x;
the second-axis coordinates determined by the second-axis coordinate determiner include coordinate values T1y and B1y where T1y>B1y; and wherein
the angle calculator calculates the angle θ by the following equation:
θ = tan⁻¹[(T1y − B1y)/(R1x − L1x)].
16. An object detector as recited in claim 14 further including a wireless transmitter that transmits the position of the predetermined point of the image of the object via wireless communication.
17. An object detector as recited in claim 11, wherein the object includes a rectangular retro-reflective strip,
the object detector further including:
a light source;
a light source controller that causes the light source to periodically emit a light;
an exposure controller that causes the image sensor to capture a first image while the light source is emitting a light and to capture a second image while the light source is not emitting a light;
and an image subtracting device that subtracts the second image from the first image.
18. An object detector as recited in claim 11 wherein the image plane of the image sensor has a plurality of pixels each producing a pixel signal having a plurality of signal levels;
the object detector further including a down sampling circuit that down-samples the pixel signals of the plurality of pixels of the image sensor to 1-bit signals.
19. An object detector as recited in claim 18 wherein the down sampling circuit includes a comparator having a first input connected to receive the pixel signals and a second input connected to a predetermined threshold level voltage.
20. An object detector as recited in claim 11 further including a wireless transmitter that transmits the position of the predetermined point of the image of the object via wireless communication.
Description
BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention generally relates to a method and an apparatus for detecting the position of an object and its angle with respect to a specific reference and, more particularly, to a method and an apparatus that precisely detect the position and angle of a tool used in computer games.

2. Description of the Background Art

Sports computer games directed to baseball, football, golf, tennis, table tennis, bowling, and so on form one of the categories of computer games. Most of these sports games require associated tools for playing: a bat for baseball, a racket for tennis or table tennis, a bowling ball for bowling, to name a few. The game program running on a CPU (Central Processing Unit) of a game apparatus creates a virtual game situation in which a user is supposed to be a player, generates a video image of the surroundings, and shows the image on a television set (TV). When a specific situation arises, the player is requested to take an action using the tool. In response to the player's action, the game program changes the virtual situation, and the player is requested to take the next action.

Take a golf game as an example. At the start of a game, the golf game program creates a scene of a teeing ground. The green can be seen on the far side of the teeing ground and the virtual golf ball is placed at the center (or any other place) of the teeing ground. When the scene changes and the golf ball is displayed at the center of the screen, the player “addresses” an image sensor unit placed on the floor and tries to hit the virtual ball with a club, i.e., swings the club above the image sensor unit.

When the player swings the club, the image sensor detects the positions of the moving club head, and an associated computation program within the image sensor unit computes the speed and the direction of the club head. The detected speed and direction are applied to the golf game program. In response, the golf game program computes the resultant trajectory of the imaginary golf ball hit by the imaginary golf club in accordance with the direction and the speed of the club head, and creates a new game situation in accordance with the new position of the golf ball.

Naturally, specific hardware is necessary for detecting the position of the club head. Japanese Patent Application Laying-Open (Tokkai) No. 2004-85524 discloses an apparatus for detecting such positions of a game tool. The apparatus is used in a computer golf game and includes a stroboscope having four LED's (light emitting diodes), a CMOS (Complementary Metal-Oxide-Silicon) image sensor (hereinafter “CIS”), and a processor. A retro-reflector is attached to the bottom (sole) of a club head or a putter head. The retro-reflector has a long rectangular shape with circular ends. The apparatus is connected to a TV monitor and a golf game program running on the processor generates the video image of a virtual golf course in response to the player's action with the club or the putter.

In operation, the CIS captures two kinds of images: images while the stroboscope LED's are on (emitting light), and images while the stroboscope LED's are off. The image signals are applied to the processor, where the necessary computation is carried out.

When the LED's are emitting light, the retro-reflector reflects that light back to the CIS; therefore, the CMOS sensor forms an image of the retro-reflector. Other light sources also form images on the CIS. When the LED's are off, the retro-reflector does not reflect the light and its image is not formed; only the other light sources form their images. By computing the difference between these two kinds of images, therefore, the processor can detect the image of the retro-reflector separately from the other images.
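The lit/unlit subtraction described above can be sketched in Python as follows. This is a hypothetical illustration, not code from the patent; the function name and the array representation of frames are assumptions.

```python
import numpy as np

def isolate_reflector(frame_led_on, frame_led_off):
    """Subtract the LED-off frame from the LED-on frame.

    Ambient light sources appear in both frames and cancel out in
    the difference; the retro-reflector, bright only while the
    LED's are lit, is what remains.
    """
    on = np.asarray(frame_led_on, dtype=np.int16)
    off = np.asarray(frame_led_off, dtype=np.int16)
    # Clip negative differences (sensor noise) back to zero.
    return np.clip(on - off, 0, 255).astype(np.uint8)
```

Any pixel that is equally bright in both frames cancels to zero, so a steady lamp in the scene disappears while the reflector survives.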

The processor detects the two points farthest from each other in an image of the retro-reflector. These two points indicate the two ends of the mid line of the retro-reflector; knowing the X and Y coordinates of these points, the processor can determine the position of the club head or the putter head as the average of these two points. By computing this point for each of the captured images, the processor computes the direction and the speed of the movement of the club head. The processor can also compute the angle θ between the line connecting the two end points of the retro-reflector and a prescribed reference line. From this angle θ, the angle of the head face can be computed.

A golf game program running on the processor processes these data, determines the trajectory of the virtual golf ball, and creates the next virtual situation.

However, in order to determine the two farthest points in the image of the retro-reflector, the processor has to compute the distance between each pair of points in the image of the retro-reflector. This is a relatively complicated operation and requires a considerable amount of computing time. Further, the CIS has a 32×32-pixel image plane with 8 bits per pixel. The data size of one image therefore amounts to 8192 bits = 1024 bytes. The processor needs to receive the data from the CIS, store the data, and carry out the above-described computations on the stored data.

Therefore, a processor with relatively high performance is necessary in order to carry out the computation necessary for the game in real time. Also, the processor needs to have storage with a capacity large enough to store the data output from the CIS. This results in a computer game machine with a relatively high cost. Because children are the main users of the computer game machines, the game machines should be inexpensive although they should have enough performance to fully operate in real time.

SUMMARY OF THE INVENTION

Therefore, one of the objects of the present invention is to provide an object detector that detects a position of an object with a simple operation and a method thereof.

Another object of the present invention is to provide an object detector that detects a position of an object with a smaller amount of computation compared with the prior art, and a method thereof.

Yet another object of the present invention is to provide an object detector having a simple structure that detects a position of an object with a smaller amount of computation compared with the prior art, and a method thereof.

In accordance with a first aspect of the present invention, a method of detecting a position of a rectangular object includes the steps of: capturing an image of the object by an image sensor having a rectangular image plane having four edges; detecting, for each edge of the four edges, a distance to a point of the image of the object closest to said each edge; and determining a position of a predefined point of the image of the object based on the detected distances.

The distances from the respective edges of the image plane to the four points closest to those edges can be detected with a simple operation that does not require a large amount of computation time. Therefore, a method that can detect a position of an object with a simple operation can be provided.

The step of determining may include the step of determining a position of the center point of the image of the object based on the distances.

Preferably, a coordinate system having a first axis and a second axis is defined on the image plane. The first axis of the coordinate system is perpendicular to a first pair of opposite edges of the image plane, and the second axis of the coordinate system is perpendicular to a second pair of the edges of the image plane. The step of determining a position of the center point may include the step of: determining first-axis coordinates of points closest to respective edges of the first pair of edges; calculating a first-axis coordinate of the center point by averaging the first-axis coordinates determined in the step of determining first-axis coordinates; determining second-axis coordinates of points closest to respective edges of the second pair of edges; and calculating a second-axis coordinate of the center point by averaging the second-axis coordinates determined in the step of determining second-axis coordinates.

Scanning the image plane from the four edges in search of the four points closest to them can be implemented with a simple algorithm. By detecting these four points, the center point of the image of the object is easily calculated. Therefore, a simple method for detecting a position of an object is provided.

More preferably, the method further includes the steps of calculating an angle θ that one of the edges of the image of the object forms with one of the four edges of the rectangular image plane using the first-axis coordinates determined in the step of determining first-axis coordinates, and the second-axis coordinates determined in the step of determining second-axis coordinates.

Still more preferably, the first-axis coordinates determined in the step of determining first-axis coordinates include coordinate values R1x and L1x where R1x>L1x; and the second-axis coordinates determined in the step of determining second-axis coordinates include coordinate values T1y and B1y where T1y>B1y. The step of calculating an angle may include the step of calculating the angle θ by the following equation: θ = tan⁻¹[(T1y − B1y)/(R1x − L1x)].

By simply detecting the four coordinate values T1y, B1y, R1x and L1x of the four points, the angle θ can be computed. There is no need to know all eight coordinate values of the four points.

Further preferably, the object includes a rectangular retro-reflective strip, and the step of capturing includes the steps of: turning on a lighting device; capturing and storing a first image of the object by the image sensor while the lighting device is on; turning off the lighting device; capturing and storing a second image of the object by the image sensor while the lighting device is off; and subtracting the second image from the first image.

The image plane of the image sensor may have a plurality of pixels each producing a pixel signal having a plurality of signal levels, and the method further includes the step of down-sampling the pixel signals of the plurality of pixels of the image sensor to 1-bit signals.

Preferably, the step of detecting includes the steps of, for each edge of the four edges of the image plane, scanning the down-sampled rectangular image plane starting from said each edge in a direction toward an opposite edge until a point having a predetermined first value is found.

An object detector for detecting a position of a rectangular object in accordance with another aspect of the present invention includes: an image sensor having a rectangular image plane having four edges; a distance detector that detects, for each edge of the four edges, a distance to a point of the image of the object closest to said each edge; and a position determiner that determines a position of a predefined point of the image of the object based on the detected distances.

Preferably, the object includes a rectangular retro-reflective strip, and the object detector further includes: a light source; a light source controller that causes the light source to periodically emit a light; an exposure controller that causes the image sensor to capture a first image while the light source is emitting a light and to capture a second image while the light source is not emitting a light; and an image subtracting device that subtracts the second image from the first image.

More preferably, the image plane of the image sensor has a plurality of pixels each producing a pixel signal having a plurality of signal levels; and the object detector further includes a down sampling circuit that down-samples the pixel signals of the plurality of pixels of the image sensor to 1-bit signals.

Still more preferably, the object detector further includes a wireless transmitter that transmits the position of the predetermined point of the image of the object via wireless communication.

The foregoing and other objects, features, aspects and advantages of the present invention will become more apparent from the following detailed description of the present invention when taken in conjunction with the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 illustrates an overall arrangement of a golf game system 30 in accordance with one embodiment of the present invention;

FIG. 2 shows a game cassette 76 including a CPU and a memory that stores a golf game program, and an adaptor 46 for the game cassette 76 having TV connection capabilities and IR communication capability;

FIG. 3 is a perspective view of a swing detector 44 for detecting the direction and the speed of a club head as well as an angle of its face in accordance with the embodiment;

FIG. 4 shows a golf club 42 for a golf game used with the swing detector 44 shown in FIG. 3;

FIG. 5 shows a functional block diagram of the swing detector 44;

FIG. 6 schematically shows the image plane of CIS 146 of the swing detector 44 shown in FIG. 5 and an image of a retro-reflector strip 124 of the golf club 42 shown in FIG. 4;

FIG. 7 is a waveform diagram of the signals within swing detector 44 shown in FIG. 3;

FIG. 8 is waveform diagrams of an image signal outputted from CIS 146 to down sampling comparator 150 shown in FIG. 5 and an image signal down-sampled by down sampling comparator 150;

FIGS. 9 to 12 show the overall control structure of the golf club detecting program running on the processor of swing detector 44;

FIG. 13 shows directions of a clubface that can be detected by swing detector 44;

FIG. 14 shows a detected direction 344 of the movement of the golf club with reference to a predetermined reference direction 342;

FIG. 15 shows a conventional way of determining a direction of a golf ball movement hit by a golf club;

FIG. 16 shows a novel way of determining a direction of a golf ball in accordance with the first embodiment;

FIG. 17 shows the detected angle θ2 of the clubface in accordance with the first embodiment; and

FIG. 18 shows how the direction of a golf ball in the screen is determined in the embodiment of the present invention.

DESCRIPTION OF THE PREFERRED EMBODIMENTS

Overall Arrangement of the System

FIG. 1 shows an overall arrangement of a golf game system 30 in accordance with one embodiment of the present invention. Referring to FIG. 1, golf game system 30 includes: an adaptor 46 having connection facility to TV 48 via a cable 52 and a wireless IR (Infrared) communication capability; and a game cassette 76 that is to be mounted on adaptor 46.

Referring to FIG. 2, adaptor 46 has a housing 72 and a receiving stage 74 that moves up and down within housing 72. A connector is provided within the housing of adaptor 46 and by pushing down receiving stage 74, the connector is exposed. Adaptor 46 further has an IR emitting/receiving window 70 for IR communication.

Game cassette 76 has a connector 78 with connector pins Tn. When game cassette 76 is put on receiving stage 74 and pushed down, receiving stage 74 moves down and connector 78 is coupled with the connector (not shown) of adaptor 46. Although not illustrated, game cassette 76 includes a CPU and a memory that stores a golf game program. Through the connection between connector 78 and the connector of adaptor 46, the processor of game cassette 76 can utilize the IR communication capability of adaptor 46. The processor can also apply the video image of a golf game to TV 48 shown in FIG. 1.

Referring again to FIG. 1, golf game system 30 further includes: a golf club 42 which a player 40 uses to play the golf game; and a swing detector 44 for detecting the position of the head of golf club 42 as well as the angle of clubface of golf club 42 with reference to a predefined reference direction. Swing detector 44 also has a wireless IR communication capability and can transmit the detected position of the head of golf club 42 as well as the angle of the clubface to adaptor 46 through the IR light 50.

Structure of Swing Detector 44

Referring to FIG. 3, swing detector 44 includes a relatively flat housing 80. Swing detector 44 further includes: an IR LED 106 for transmitting data; a power switch 90; four switches 98, 100, 102, and 104 for adjusting the function of swing detector 44; a CIS 146; and two IR LED's 94 and 96 for exposure provided on either side of CIS 146, all arranged on the upper surface of housing 80. The arrangement of the circuitry within swing detector 44 will be described later with reference to FIG. 5.

Referring to FIG. 4, golf club 42 includes a shaft 120 and a club head 122 with a neck 121 connected to shaft 120. On the bottom (sole) of club head 122, a retro-reflector strip 124 having a rectangular shape is attached. Retro-reflector strip 124 has two sets of edges: longer ones and shorter ones. Retro-reflector strip 124 is attached to club head 122 so that its longer edges are parallel to the edge of the clubface.

Referring to FIG. 5, in addition to IR LED 106, IR LED's 94 and 96 and four buttons 98, 100, 102 and 104, swing detector 44 includes as its inner circuitry: CIS 146 having 32H (Horizontal)×32V (Vertical) resolution outputting VOUTS signal, which includes a series of pixel values quantized to 8 levels; a down sampling comparator 150 connected to receive the VOUTS signal from CIS 146 for down-sampling the VOUTS signal to a 1-bit binary signal; an MCU (Micro Controller Unit) 148 that receives the output of down sampling comparator 150 for computing the position of the center point of the club head as well as the angle of the clubface; and a power LED 152 embedded within power key 90 shown in FIG. 3 for the indication of power on and off. Although not shown, MCU 148 has an internal memory, registers, and a processor.

Down sampling comparator 150 includes a Schmitt trigger. In this embodiment, the positive-going threshold and the negative-going threshold of the Schmitt trigger are the same: VTH. When the level of the input signal goes higher than the threshold VTH, the output of down sampling comparator 150 immediately goes High. When the level of the input signal falls below the threshold VTH, the output of down sampling comparator 150 immediately falls to Low. Thus, the VOUTS signal, which is a multi-level signal, is converted into a 1-bit binary signal.
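A software analogue of this thresholding can be sketched as follows. Since the positive- and negative-going thresholds are equal in this embodiment, a plain comparison reproduces the comparator's behavior; the function name and the default threshold value are illustrative assumptions, not values from the patent.

```python
def downsample_to_1bit(rows, v_th=4):
    """Reduce multi-level pixel values to a 1-bit image, mimicking
    the comparator: a pixel becomes 1 (High) when its level exceeds
    the threshold V_TH and 0 (Low) otherwise."""
    return [[1 if pixel > v_th else 0 for pixel in row] for row in rows]
```

A true Schmitt trigger with distinct thresholds would need to remember its previous output state; with equal thresholds, that state never matters.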

Swing detector 44 further includes: a battery box 140 operatively coupled to power key 90; a voltage regulator circuit 142 for regulating the voltage outputted by battery box 140 and for supplying power to MCU 148 and other circuits in swing detector 44 via power lines; and a power control switch 144 that, under control of MCU 148, supplies the power from voltage regulator circuit 142 to CIS 146 so that CIS 146 captures images at prescribed timings. Power control switch 144 and CIS 146 receive control commands from MCU 148 via a control bus 149. The outputs of CIS 146 and down sampling comparator 150 are connected to the input of MCU 148 via a data bus 151.

Referring to FIG. 6, MCU 148 finds the angle θ which one of the edges of the image 182 of retro-reflector strip 124 forms with one of the edges of the image plane of CIS 146 in the following manner. First, a coordinate system is defined on the image 180 (i.e., on the image plane of CIS 146). MCU 148 scans the image 180 captured by CIS 146 row by row from the top to the bottom, searching for an image 182 of retro-reflector strip 124. The first bright point, at a row with a y-coordinate T1y, indicates the topmost corner 190 of the image 182. Likewise, MCU 148 scans image 180 column by column from the rightmost column until it finds the rightmost bright point. This point indicates the column with an x-coordinate value R1x of the corner 192 of image 182. In a similar manner, MCU 148 finds the leftmost bright point 196 at a point with x-coordinate L1x and the bottom bright point 194 with a y-coordinate B1y. Here, T1y>B1y holds; likewise, R1x>L1x holds. In other words, in this operation, the distances from the respective edges of image 180 to the four points 190, 192, 194 and 196 closest to those edges are detected, and then their x- or y-coordinate values are computed.
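The four edge scans might look like the following in Python. This is a sketch under the assumption that the down-sampled frame is a list of rows of 0/1 values; the function name, variable names, and the flipped y-axis handling are mine, not the patent's.

```python
def find_extreme_points(image):
    """Scan a 1-bit image from each of the four edges inward and
    return (T1y, B1y, R1x, L1x): the y-coordinates of the first
    bright rows met from the top and bottom edges, and the
    x-coordinates of the first bright columns met from the right
    and left edges.  Row index 0 is the top of the frame, so
    y-values are flipped to make T1y > B1y as in the patent's
    coordinate system."""
    height, width = len(image), len(image[0])
    t1y = b1y = r1x = l1x = None
    for y in range(height):                      # scan from the top edge
        if any(image[y]):
            t1y = (height - 1) - y
            break
    for y in range(height - 1, -1, -1):          # scan from the bottom edge
        if any(image[y]):
            b1y = (height - 1) - y
            break
    for x in range(width - 1, -1, -1):           # scan from the right edge
        if any(row[x] for row in image):
            r1x = x
            break
    for x in range(width):                       # scan from the left edge
        if any(row[x] for row in image):
            l1x = x
            break
    return t1y, b1y, r1x, l1x
```

Each scan stops at the first bright pixel, so the cost is linear in the frame size rather than quadratic in the number of bright points.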

Points 190, 192, 194 and 196 correspond to the four corners of image 182 of retro-reflector strip 124. The coordinates (X, Y) of the center point 198 of the image 182 of retro-reflector strip 124 are then computed by:
X=(L1x+R1x)/2
Y=(T1y+B1y)/2.
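As a small illustration (the function name is an assumption), the averaging above is:

```python
def center_point(t1y, b1y, r1x, l1x):
    """Center of the strip image from the four extreme coordinates:
    X = (L1x + R1x) / 2, Y = (T1y + B1y) / 2."""
    return (l1x + r1x) / 2, (t1y + b1y) / 2
```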

The angle θ, which the longer edge of image 182 of retro-reflector strip 124 makes with the x-axis, is determined by: tan θ = Δy/Δx = (T1y − B1y)/(R1x − L1x), hence θ = tan⁻¹[(T1y − B1y)/(R1x − L1x)].

By the above-described simple computation, the position of the center point of retro-reflector strip 124 and its angle with respect to the x-axis can be computed. This requires a relatively small amount of computation compared with the prior art.
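The arctangent step can be sketched as below; this is an illustration, not the patent's code, and the choice of `atan2` (which also handles a vertical strip, where R1x = L1x, without dividing by zero) is my own.

```python
import math

def strip_angle(t1y, b1y, r1x, l1x):
    """Angle θ in degrees between the strip's longer edge and the
    x-axis: θ = arctan((T1y − B1y) / (R1x − L1x))."""
    return math.degrees(math.atan2(t1y - b1y, r1x - l1x))
```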

FIG. 7 shows the waveforms of the signals among CIS 146, MCU 148 and down sampling comparator 150 shown in FIG. 5. Referring to FIG. 7, “FS” is the frame signal for synchronization of circuits external to CIS 146. One cycle period of signal FS is predetermined by a clock signal (SCLK) and, in this embodiment, it equals 12288 clock cycles. In this embodiment, CIS 146 captures an image while signal FS is at the Low level. This period will be called an exposure time “Texp” hereinafter. When CIS 146 is ready to output the captured image signal, signal FS is at the High level.

The time period during which CIS 146 captures an image (hereinafter the “internal exposure time”) depends on the setting of a specific 8-bit register E0(7:0) internal to CIS 146. The setting may be changed externally. The exposure time Texp is divided into 255 (=2⁸−1) parts. CIS 146 determines the internal exposure time as Texp times the register value E0(7:0) divided by 255. Thus, if the register value E0(7:0) is 200, the internal exposure time will be Texp*200/255, as shown in FIG. 7.
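Expressed in clock cycles, the ratio above can be computed as follows; the integer truncation is my assumption, since the patent only gives the ratio Texp × E0(7:0)/255.

```python
def internal_exposure_clocks(texp_clocks, e0):
    """Internal exposure time in clock cycles: Texp × E0(7:0) / 255.

    texp_clocks: length of the exposure period Texp in clock cycles
    e0:          value of the 8-bit register E0(7:0), in 0..255
    """
    return texp_clocks * e0 // 255
```

With E0(7:0) = 255 the full exposure period Texp is used; with E0(7:0) = 0 the internal exposure time is zero.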

When signal FS is at the High level, i.e., signal FS indicates the data transfer period, CIS 146 is ready to transfer the captured image data VOUTS. The pulses of signal STR provide the timings for holding and sampling VOUTS at down sampling comparator 150. During the data transfer period, signal STR includes 32×32+1 pulses. At each of the falling edges of these pulses, down sampling comparator 150 samples the VOUTS signal 220, compares the level of VOUTS signal 220 with the threshold level VTH 221, and outputs the result as a 1-bit signal 222. The first data is a dummy and is discarded; therefore, down sampling comparator 150 outputs 32×32 pixel data within the data transfer period. VOUTS signal 220 shows the intensity of the image quantized to 8 levels. This signal is reduced to the 1-bit signal and is supplied to MCU 148.

Because the image signal is reduced to 1-bit 32×32 pixel signals, memory capacity of MCU 148 required for storing the image data is substantially reduced and an MCU with relatively low cost can be used.
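The memory saving can be made concrete: 32×32 pixels at 1 bit each fit in 128 bytes, versus 1024 bytes at 8 bits per pixel. A minimal sketch of packing the 1-bit pixels into bytes (the packing order is an assumption for illustration):

```python
def pack_bits(pixels):
    """Pack a flat list of 1-bit pixels into bytes, MSB first.

    A 32x32 binary image (1024 pixels) packs into 128 bytes,
    an eighth of the storage needed at 8 bits per pixel.
    """
    assert len(pixels) % 8 == 0, "pixel count must be a multiple of 8"
    out = bytearray()
    for i in range(0, len(pixels), 8):
        byte = 0
        for bit in pixels[i:i + 8]:
            byte = (byte << 1) | (bit & 1)
        out.append(byte)
    return bytes(out)
```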

FIG. 8 shows the down sampling carried out by down sampling comparator 150. VOUTS outputted from CIS 146 has 8-bit resolution as shown in waveforms 220 (FIG. 8(b)). Down sampling comparator 150 compares the level of VOUTS with a predetermined threshold level 221 and outputs the resultant 1-bit binary signal as shown by the waveform 222 (FIG. 8(a)).
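The single-threshold comparison of FIG. 8 can be sketched as follows (a simplified model; the hysteresis behavior described later in the operation section is omitted here):

```python
def down_sample(samples, vth):
    """Reduce multi-level VOUTS samples to 1-bit values.

    Each sample is compared with the threshold VTH; the output is 1
    when the sample is at or above the threshold, 0 otherwise.
    """
    return [1 if v >= vth else 0 for v in samples]
```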

Program Structure of Swing Detector 44

FIGS. 9 to 12 show the overall control structure of the program running on MCU 148 of swing detector 44 for controlling CIS 146, capturing the image of retro-reflector strip 124, and computing the position of its center point and its angle θ with reference to the x-axis.

Referring to FIG. 9, after the power-on, the program starts at step 240 where registers of MCU 148 are initialized. At step 242, MCU 148 clears its RAM (random access memory). Then, at step 244, PIO (programmed input/output) setting of MCU 148 is carried out. At step 246, MCU 148 reads the option code setting, resets CIS 146, and sets up the registers of CIS 146 in accordance with the option code setting. At step 248, the watchdog timer is reset.

At step 250, it is determined whether the signal FS is Low or not. If not, the control returns to step 250 and the determination is repeated until the signal FS is Low. When signal FS is Low, MCU 148 turns on the exposure IR LED's 94 and 96 (see FIGS. 3 and 5). At step 254, exposure IR LED's 94 and 96 are kept on until the signal FS is High. When the signal FS is found to be High, exposure IR LED's 94 and 96 are turned off at step 256.

Referring to FIG. 10, MCU 148 waits at step 258 until the signal STR is at its falling edge. When signal STR is at its falling edge, MCU 148 reads the VOUTS down-sampled by down sampling comparator 150 at step 260.

At step 262, it is determined whether all 32×32 data have been received from CIS 146. If not, control returns to step 258. When all of the 32×32 data have been received, control goes to step 264, where the RAM within MCU 148 loaded with the received data is organized. The 32×32 data received at steps 258 to 262 form the exposure data.

At step 266, MCU 148 tries to get key press data. At step 268, a sleep counter (not shown) within MCU 148 is checked and it is determined whether the sleep counter has overflowed. If it has overflowed, control goes to step 270; otherwise, it goes to step 280 (FIG. 11).

At step 270, MCU 148 controls power control switch 144 to stop the power supply to CIS 146 and enters the sleep mode. At step 272, MCU 148 turns on the sleep LED, which is the power LED shown in FIGS. 3 and 5. At step 274, MCU 148 waits for a predetermined period by a delay loop. After the predetermined period, MCU 148 turns off the sleep LED at step 276. At step 278, it is determined whether a key is pressed. If there is no key press, control returns to step 270 and MCU 148 enters the sleep mode again. If there is a key press, control jumps back to step 240 and MCU 148 carries out steps 240 et seq. again.

When it is determined at step 268 that the sleep counter has not overflowed, control goes to step 280 shown in FIG. 11. Referring to FIG. 11, at step 280, MCU 148 waits until the signal FS is High. When the signal FS is High, MCU 148 turns on power on LED 152 at step 282 and waits until the signal FS is Low at step 284. By turning on power on LED 152, MCU 148 indicates that MCU 148 and CIS 146 are operating. When the signal FS is Low, MCU 148 turns off power on LED 152. By turning off the power on LED 152, MCU 148 indicates that it will not receive any key input.

Next, at step 288, MCU 148 waits until the signal STR is at its falling edge. When the signal STR is at its falling edge, MCU 148 again reads VOUTS data at step 290. Steps 288 and 290 are repeated until it is determined at step 292 that 32×32 data have been received. The 32×32 data received at steps 288 to 292 form the dark data. Then, control goes to step 294, where MCU 148 subtracts the dark data from the exposure data. By this operation, images of light sources other than retro-reflector strip 124 are removed from the 32×32 exposure data. Control goes to step 296 shown in FIG. 12.
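For 1-bit pixels the subtraction at step 294 reduces to a per-pixel mask: a pixel survives only if it is bright in the exposure frame but not in the dark frame. A minimal sketch over flat pixel lists:

```python
def subtract_dark(exposure, dark):
    """Remove ambient light sources from a binary exposure image.

    Keeps a pixel only if it is bright in the exposure frame (IR LEDs on)
    and not bright in the dark frame (IR LEDs off), leaving only the
    retro-reflected light.
    """
    return [e & (1 - d) for e, d in zip(exposure, dark)]
```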

At step 296, it is determined whether there is any bright point in the image or any key press. If there is a bright point or a key press, control goes to step 298; otherwise, control goes to step 318.

At step 298, it is determined whether there is a key press but no bright point in the image. If so, control goes to step 314; otherwise, control goes to step 300.

At step 300, MCU 148 scans the 32×32 image from the top row to the bottom row until it finds the topmost bright point T1y. At step 302, MCU 148 scans the image from the bottom row to the top row to get the bottommost bright point B1y. At step 304, MCU 148 scans the image from the left column to the right column to get the leftmost bright point L1x. Finally, at step 306, MCU 148 scans the image from the right column to the left column to get the rightmost bright point R1x.
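The four scans of steps 300 to 306 can be sketched as follows (function and variable names are illustrative; the firmware would scan row by row and stop at the first hit, but the result is the same):

```python
def scan_extremes(img):
    """Find the topmost, bottommost, leftmost and rightmost bright points
    in a square binary image (img[row][col], row 0 at the top).

    Returns (T1y, B1y, L1x, R1x), or None if the image has no bright point.
    """
    n = len(img)
    rows = [y for y in range(n) if any(img[y])]          # rows with a bright pixel
    cols = [x for x in range(n) if any(img[y][x] for y in range(n))]
    if not rows:
        return None
    return rows[0], rows[-1], cols[0], cols[-1]
```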

At step 308, MCU 148 calculates center point (X, Y) of the image of retro-reflector strip 124 by the following equations (1):
X = (L1x + R1x)/2
Y = (T1y + B1y)/2  (1)

At step 310, it is determined whether the game is in an angle mode where the angle of the clubface is considered in the golf game. If it is not in the angle mode, control goes to step 314; otherwise, control goes to step 312 where club angle θ is calculated by the following equation (2):
θ = tan⁻¹((T1y − B1y)/(L1x − R1x))  (2)
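Equations (1) and (2) can be sketched together as follows (a guard for a vertical strip, where L1x equals R1x, is added as an assumption; the patent's equation (2) would divide by zero in that case):

```python
import math

def center_and_angle(t1y, b1y, l1x, r1x):
    """Equations (1) and (2): center point of the strip image and its
    angle with respect to the x-axis, in degrees."""
    x = (l1x + r1x) / 2
    y = (t1y + b1y) / 2
    dx = l1x - r1x
    dy = t1y - b1y
    theta = 90.0 if dx == 0 else math.degrees(math.atan(dy / dx))
    return (x, y), theta
```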

Then control goes to step 314. At step 314, MCU 148 sets up the IR output pattern for the IR communication to adaptor 46 in accordance with the computed result.

The data format of the position data and angle data for IR communication includes 22 bits. The first bit is a start bit, which is always 1. The next thirteen bits represent the X and Y coordinates of the center point, including parity bits. Because X and Y are each in the range from 0 to 31 (32 pixels), 5 bits are required to represent each of the X and Y coordinates. The parity bits comprise three bits.

The next four bits represent the club angle. The angle computed at step 312 is rounded to the nearest 15 degrees (15°) as shown by the twelve angles θ1 to θ12 in FIG. 13. Thus, the club angle requires 4 bits in transmission.
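The rounding to 15° increments can be sketched as follows (the mapping from angle to index θ1 to θ12 is an assumption; the patent only states that twelve 15° steps are encoded in 4 bits):

```python
def quantize_angle(theta_deg):
    """Round a club angle (degrees) to the nearest multiple of 15 degrees
    and return an index 0-11; the twelve 15-degree steps span 180 degrees,
    so the index fits in the 4 transmitted bits."""
    return round(theta_deg / 15) % 12
```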

The next three bits indicate the pressed key. If no key is pressed, these three bits are not transmitted.
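Assembling the frame can be sketched as a bit list. Note the fields enumerated in the text (1 start + 13 coordinate/parity + 4 angle + 3 key) sum to 21 bits against the stated 22, so the exact layout, field order, and parity scheme below are assumptions for illustration only:

```python
def pack_frame(x, y, angle_idx, key=None):
    """Assemble an IR frame as a bit list: start bit, 5-bit X, 5-bit Y,
    3 parity bits, 4-bit angle index, and an optional 3-bit key code.
    Field order and the even-parity scheme are illustrative assumptions."""
    def to_bits(value, width):
        return [(value >> i) & 1 for i in reversed(range(width))]
    bits = [1] + to_bits(x, 5) + to_bits(y, 5)       # start bit, then X and Y
    px, py = sum(to_bits(x, 5)) % 2, sum(to_bits(y, 5)) % 2
    bits += [px, py, (px + py) % 2]                  # three parity bits
    bits += to_bits(angle_idx, 4)                    # club angle index
    if key is not None:                              # key bits omitted if no key
        bits += to_bits(key, 3)
    return bits
```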

Referring again to FIG. 12, at step 316, MCU 148 resets the sleep mode counter. At step 320, MCU 148 outputs the IR data set up at step 314. The golf game program running on adaptor 46 can then utilize the data and change the game situation. After step 320, the control returns to step 248 shown in FIG. 9.

When it is determined at step 296 that there is neither a bright point in the 32×32 image nor a key press, control goes to step 318. At step 318, MCU 148 clears the IR output patterns. Then control goes to step 320, where the cleared IR output pattern is output to adaptor 46.

Operation of Swing Detector 44

Swing detector 44 of the present embodiment operates as follows. At the time of power-up, MCU 148 of swing detector 44 initializes its registers (FIG. 9, step 240), clears its RAM (step 242), sets up PIO settings (step 244), and reads the option code setting and starts supplying power to CIS 146 (step 246). In response to the power supply, CIS 146 starts capturing images. During the exposure period, CIS 146 sets the signal FS at the Low level, and during the transfer period, CIS 146 sets the signal FS at the High level.

At step 248, MCU 148 resets the watchdog timer and waits for the signal FS from CIS 146 to be Low (FIG. 9, step 250). When the signal FS becomes Low, this indicates that CIS 146 is in the exposure period, and MCU 148 turns on IR LED's 94 and 96 for exposure. CIS 146 captures the image during the exposure time. MCU 148 waits for the signal FS to be High at step 254. When CIS 146 is ready to output VOUTS, it sets the signal FS to the High level, and MCU 148 exits step 254 and turns off IR LED's 94 and 96 for exposure at step 256.

Referring to FIG. 7, the signal FS and the signal STR attain the High level at the same time. During the transfer period, the signal FS stays at the High level and the signal STR alternates between the Low level and the High level with a specific period. At each of the falling edges of the signal STR, CIS 146 starts outputting data VOUTS showing the intensity of a pixel of the captured image quantized to eight levels, as shown in FIG. 8(b).

The output of down sampling comparator 150 rises to the High level when the level of VOUTS is equal to or higher than the positive going threshold. It falls to the Low level when the level of VOUTS is lower than the negative going threshold. An example of the output of down sampling comparator 150 is shown in FIG. 8(a).
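This two-threshold behavior is a Schmitt-trigger style comparison, and can be sketched as follows (a simulation of the comparator's state over a sample sequence; names are illustrative):

```python
def hysteresis_compare(samples, vth_pos, vth_neg):
    """Two-threshold (hysteresis) comparison of a sample sequence.

    The output goes High when the input reaches the positive-going
    threshold, goes Low when it drops below the negative-going threshold,
    and holds its previous state in between.
    """
    out, state = [], 0
    for v in samples:
        if v >= vth_pos:
            state = 1
        elif v < vth_neg:
            state = 0
        out.append(state)
    return out
```

The gap between the two thresholds prevents a noisy signal hovering near a single threshold from toggling the 1-bit output rapidly.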

Referring again to FIG. 10, at steps 258 to 262, at each of the falling edges of the signal STR, MCU 148 reads VOUTS down sampled by down sampling comparator 150. When 32×32 data are received, MCU 148 organizes RAM and tries to get key data. The received data forms the exposure data.

If the sleep counter is found to have overflowed at step 268, MCU 148 enters the sleep mode until any of the keys is pressed. If the sleep counter has not overflowed, MCU 148 waits until the signal FS is Low at step 280 (FIG. 11). When the signal FS is Low, CIS 146 is again in the exposure period, and MCU 148 turns on power on LED 152 at step 282 (FIG. 11), indicating that MCU 148 and CIS 146 are operating. Then, MCU 148 waits until the signal FS is High at step 284. During this period, CIS 146 captures the image without IR LED's 94 and 96 lighting. When the signal FS is High, CIS 146 is in the transfer period and MCU 148 turns off power on LED 152, indicating that MCU 148 will not accept any key.

At steps 288 to 292, MCU 148 receives the 32×32 image VOUTS data outputted from CIS 146 and down sampled by down sampling comparator 150. The image forms the dark data.

At step 294, MCU 148 subtracts the dark data from the exposure data received at steps 258 to 262 (FIG. 10). The resulting data includes, if any, only the exposure data of retro-reflector strip 124.

At steps 296 and 298 (FIG. 12), MCU 148 determines whether the resulting image includes a bright point. If it does, then referring to FIG. 6, MCU 148 scans the image from the top row downward to get the topmost bright point T1y at step 300 (FIG. 12), from the bottom row upward to get the bottommost bright point B1y at step 302, from left to right to get the leftmost bright point L1x at step 304, and from right to left to get the rightmost bright point R1x at step 306.

At step 308, MCU 148 calculates the coordinates (X, Y) of the center point of the image of retro-reflector strip 124 by equations (1). If the game is in the angle mode, MCU 148 calculates the club angle by equation (2).

At step 314, MCU 148 sets up IR output pattern. At step 316, it resets the sleep mode counter and outputs the IR data utilizing IR communication window 106 shown in FIGS. 3 and 5 to adaptor 46.

By repeating the above-described operation, swing detector 44 can detect the position of retro-reflector strip 124 (FIG. 4), i.e., the position of the head of golf club 42, and the club angle, and transmit the detected data to adaptor 46. Adaptor 46 receives the data, calculates the trajectory of the imaginary golf ball, and changes the game situation.

Use of the Club Angle

The golf game program running on the CPU of adaptor 46 can use the information of the X and Y coordinates of the center of the club head and the club angle in the following manner. First, by computing the difference between the coordinates detected at different times, the game program can compute the moving direction of the center point of the club head. Using this information together with the angle of the clubface, the game program can compute the direction of the imaginary golf ball trajectory.

In this connection, the golf game program running on adaptor 46 adopts a novel way of determining the direction of the golf ball trajectory. Referring to FIG. 14, assume that the moving direction 344 of the club head (the movement of the center point of retro-reflector strip 124) in the 32×32 image plane 340 makes an angle θs with a reference line 342, which is parallel to the y-axis of image plane 340.
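Deriving θs from two successive center points can be sketched as follows (the sign convention, and the use of atan2 rather than a plain arctangent, are assumptions for illustration):

```python
import math

def swing_deviation(prev_center, curr_center):
    """Angle (degrees) between the club-head movement, given by two
    successive center points, and the reference line parallel to the
    y-axis of the image plane."""
    dx = curr_center[0] - prev_center[0]
    dy = curr_center[1] - prev_center[1]
    return math.degrees(math.atan2(dx, dy))
```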

In the prior art, as shown in FIG. 15, the golf game program screen 360 would show a target arrow 364 directed to the golf hole (not shown) and, given the angle θs, the program would determine the trajectory of the imaginary golf ball in the direction 366 that makes the angle θs with the reference line 362, which is parallel to the y-axis of screen 360. (Note here that the direction of the y-axis is taken opposite to that of the y-axis in FIG. 6.)

In contrast, the golf game program running on adaptor 46 determines the trajectory of the imaginary golf ball in the following manner.

Referring to FIG. 16, given the angle θs, the golf game program in this embodiment adds the angle θs not to the reference line 382 of the screen 380 but to the direction of the arrow 384 that is directed to the target golf hole, resulting in the direction 386. By this arrangement, the player can address the imaginary golf ball so that the swing line is on the line directed to the imaginary target golf hole. If the player swings the golf club so that the angle θs is zero, the imaginary golf ball will go in the direction of the target hole. Thus, the game will be much more like a real golf game.

In determining the trajectory of the imaginary golf ball, the club angle is further taken into consideration in a certain play mode (the “angle mode”) in this embodiment. In the angle mode, the trajectory of the golf ball is determined as shown in FIGS. 17 and 18.

Referring to FIG. 17, let us assume that the club angle detected by swing detector 44 of the present embodiment is θ2. This means that the angle that a clubface 408 makes with the line 406 normal to the trajectory of the club head 404 is θ2. Further assuming that the player tries to hit the imaginary golf ball 400 in the direction of the target arrow 402, but with a slight deviation of movement by an angle θs, the golf game program determines the trajectory of the imaginary golf ball 400 as follows.

Referring to FIG. 18, an imaginary golf ball 422 is displayed on the screen 420. Target arrow 424 is also shown, directed to the target golf hole. Given the club angle θ2 and the deviation angle θs of the club head movement, the program first adds the angle θs to the angle of arrow 424, resulting in the direction 426. The program then adds the clubface angle θ2 to the direction 426, resulting in a direction 428 further deviated from target arrow 424.
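The angle composition of FIGS. 16 and 18 reduces to a simple sum (names are illustrative; angles are in degrees measured from some screen reference):

```python
def ball_direction(target_arrow_deg, theta_s_deg, club_angle_deg=0.0):
    """Trajectory direction in this embodiment: the deviation angle θs
    is added to the target-arrow direction (not to the screen reference
    line), and in angle mode the clubface angle θ2 is added as well."""
    return (target_arrow_deg + theta_s_deg + club_angle_deg) % 360.0
```

A zero-deviation swing (θs = 0) with a square clubface (θ2 = 0) thus sends the ball exactly along the target arrow.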

By this arrangement, the golf game will be more realistic and the game will be much more amusing than the prior art golf games.

As has been described, swing detector 44 can detect the position of the center point of the club, and further the angle of the clubface. A sequence of these data is transmitted to adaptor 46 (FIG. 1) via IR communication. Thus, the golf game program running on the CPU of game cassette 76 mounted on adaptor 46 can utilize these data, and the resultant golf game will be more amusing than the prior art.

Although the present invention has been described using an embodiment directed to a computer golf game, it is not limited thereto. The present invention can be applied to any kind of position detector as long as the image of the object is rectangular. Further, there is no need to use a retro-reflector strip. As long as the object can reflect light and form a rectangular image on the image plane of the image sensor, a detector in accordance with the present invention can detect the position and the angle of the object.

The embodiments described here are mere examples and should not be interpreted as restrictive. The scope of the present invention is determined by each of the claims with appropriate consideration of the written description of the embodiments, and embraces modifications within the meaning of, and equivalent to, the language of the claims.

Referenced by
US7682237 (filed Sep 22, 2004; published Mar 23, 2010; SSD Company Limited): Music game with strike sounds changing in quality in the progress of music and entertainment music system
Classifications
U.S. Classification434/247
International ClassificationG09B19/00, A63B69/00, G09B9/00
Cooperative ClassificationA63B2225/50, A63B2220/807, A63B24/0003, A63F2300/1093, A63B2220/05, G01P3/68, G06K9/00355, A63B69/36, A63B69/3676, A63B67/04, G06T7/004, A63B2220/806, A63F13/10, A63F2300/1062, A63B2069/0008, A63F2300/8011, A63B69/00, A63B69/3614, A63B69/38, G01S17/89
European ClassificationG01P3/68, G01S17/89, A63B69/36C2, G06T7/00P, A63F13/10, G06K9/00G2, A63B24/00A
Legal Events
Dec 23, 2004: Assignment to SSD COMPANY LIMITED, JAPAN. Assignment of assignors interest; assignors: HING, CHEUNG CHUEN; UESHIMA, HIROMU. Reel/Frame: 016130/0309. Effective date: Dec 15, 2004.