Instrument Game System and Method
US 20080200224 A1

Publication number: US 20080200224 A1
Publication type: Application
Application number: US 11/865,681
Publication date: Aug 21, 2008
Filing date: Oct 1, 2007
Priority date: Feb 20, 2007
Also published as: US8835736, US20130036897, US20130065656, WO2008103269A1, WO2008103269A8
Inventor: Jakob Parks
Original Assignee: Gametank Inc.
External links: USPTO, USPTO Assignment, Espacenet
Abstract
A game system and method that use an instrument as an input encourage a user to play along with the game's soundtrack on an actual instrument (e.g., guitar, bass, etc.). The game cues the player to play notes and/or chords on the instrument at the appropriate times, and data is collected from the instrument via a connection between the instrument and the apparatus running the game. The game then scores the user based on the note/chord and timing information it receives.
Claims(25)
1. A method for evaluating a live instrument performance, comprising:
providing a set of arrangement performance data, the set of arrangement performance data further comprising a plurality of arrangement data points wherein each arrangement data point further comprises a note and a time tag associated with each note;
receiving a sequence of live instrument performance data points, wherein each live instrument performance data point has a note and a time tag;
determining, for a particular arrangement data point having a particular arrangement note and a particular arrangement time tag, if any live instrument performance data points in the plurality of live instrument performance data points have a note equal to the particular arrangement note and a time tag that is within a time window around the particular arrangement time tag that identifies matching live performance data points; and
scoring, if there are the matching live performance data points, the live instrument performance by comparing the notes and time tags of the matching live instrument performance data points with the particular note and particular time tag of the particular arrangement data point.
2. The method of claim 1, wherein scoring the live instrument performance further comprises scoring negatively the live instrument performance if no live instrument performance data point has a note equal to the particular arrangement note or a time tag that is within a time window around the particular arrangement time tag.
3. The method of claim 1, wherein the live instrument performance further comprises a live stringed instrument performance.
4. The method of claim 1, wherein the instrument further comprises a guitar, a bass, a violin, a banjo, a piano, a voice, a clarinet or a steel drum.
5. The method of claim 1 further comprising determining a periodicity component from the sequence of live instrument performance data points, converting the periodicity component to a frequency component, and converting the frequency component into a value representative of a note in the live instrument performance.
6. The method of claim 1, wherein scoring the live instrument performance further comprises comparing a time difference between the particular arrangement time tag and the time tag of the live instrument performance data point to generate a time difference value.
7. The method of claim 6, wherein each arrangement data point further comprises a first time window around the particular time tag and a second time window around the particular time tag longer than the first time window, and wherein scoring the live musical performance further comprises assigning a first score if the live instrument performance note matches the particular note and the live instrument performance time tag is within the first time window and assigning a second score that is lower than the first score when the live instrument performance note matches the particular note and the live instrument performance time tag is within the second time window but not within the first time window.
8. The method of claim 1, wherein the time window further comprises a time window whose time range is adjusted based on a difficulty level of the instrument performance data or a profile of a user performing the live instrument performance.
9. The method of claim 2 further comprising adjusting a note of one of the live instrument performance data points by one or more octaves, comparing the adjusted note to the particular arrangement data point note and scoring the live instrument performance positively if the adjusted note is equal to the particular arrangement note and not scoring the adjusted note if the adjusted note does not match the particular arrangement note.
10. The method of claim 1, wherein receiving the sequence of live instrument performance data points further comprises sampling a live instrument performance.
11. An apparatus for evaluating a live instrument performance, comprising:
a storage unit that stores a set of arrangement instrument performance data, the set of arrangement instrument performance data further comprising a plurality of arrangement data points wherein each arrangement data point further comprises a note and a time tag associated with each note;
a computing device coupled to the storage unit, the computing device having an instrument interface that is capable of receiving a sequence of live instrument performance data points, wherein each live instrument performance data point has a note and a time tag; and
the computing device further comprising a game unit having a scoring unit that determines, for a particular arrangement data point having a particular arrangement note and a particular arrangement time tag, if any live instrument performance data points in the plurality of live instrument performance data points have a note equal to the particular arrangement note and a time tag that is within a time window around the particular arrangement time tag that identifies matching live performance data points, and that scores, if there are matching live performance data points, the live instrument performance by comparing the notes and time tags of the matching live instrument performance data points with the particular note and particular time tag of the particular arrangement data point.
12. The apparatus of claim 11, wherein the scoring unit negatively scores the live instrument performance if no live instrument performance data point has a note equal to the particular arrangement note or a time tag that is within a time window around the particular arrangement time tag.
13. The apparatus of claim 11, wherein the instrument further comprises a stringed instrument.
14. The apparatus of claim 11, wherein the instrument further comprises a guitar, a bass, a violin, a banjo, a piano, a voice, a clarinet or a steel drum.
15. The apparatus of claim 11, wherein the scoring unit further comprises a time window unit that defines a first time window around the particular time tag and a second time window around the particular time tag longer than the first time window, and wherein scoring the live musical performance further comprises assigning a first score if the live instrument performance note matches the particular note and the live instrument performance time tag is within the first time window and assigning a second score that is lower than the first score when the live instrument performance note matches the particular note and the live instrument performance time tag is within the second time window but not within the first time window.
16. The apparatus of claim 11, wherein the instrument interface samples a user performance using the instrument.
17. The apparatus of claim 11, wherein the computing device further comprises a personal computer, a game console, a networked computer or a gaming apparatus.
18. A computer readable medium having stored thereon instructions which, when executed by a processor, cause the processor to perform the operations of:
providing a set of arrangement performance data, the set of arrangement performance data further comprising a plurality of arrangement data points wherein each arrangement data point further comprises a note and a time tag associated with each note;
receiving a sequence of live instrument performance data points, wherein each live instrument performance data point has a note and a time tag;
determining, for a particular arrangement data point having a particular arrangement note and a particular arrangement time tag, if any live instrument performance data points in the plurality of live instrument performance data points have a note equal to the particular arrangement note and a time tag that is within a time window around the particular arrangement time tag that identifies matching live performance data points; and
scoring, if there are the matching live performance data points, the live instrument performance by comparing the notes and time tags of the matching live instrument performance data points with the particular note and particular time tag of the particular arrangement data point.
19. A method for displaying the time and pitch cues to the player of a musical instrument performance, the method comprising:
providing a representation of a musical instrument having one or more elements;
providing a plurality of note symbols representing a musical instrument performance to be played by the player, each note symbol having a graphical symbol and a character within the graphical symbol that represents a position of the musical instrument to be played by the player to generate a particular note;
displaying the representation of a musical instrument having one or more elements and the plurality of note symbols traveling across the representation of the musical instrument; and
cueing the player to play the particular note when the note symbol associated with the particular note crosses an element of the representation of a musical instrument.
20. The method of claim 19, wherein each note symbol further comprises an alphanumeric symbol that specifies a chord associated with the particular pitch for the note symbol.
21. The method of claim 19 further comprising displaying a score of the player.
22. The method of claim 19 further comprising displaying a performance meter that displays the performance of the player.
23. The method of claim 19, wherein displaying the representation further comprises displaying a user interface in which a vertical position of the note symbol represents a string of the musical instrument, a number inside of the note symbol corresponds to a fret of the musical instrument, and wherein cueing the player further comprises cueing the player, when an event of the note symbol traveling in a horizontal direction and crossing the graphical element occurs, to play the string represented by the vertical position of the note symbol and hold down the fret of the musical instrument indicated by the number inside of the note symbol.
24. A method for changing the difficulty of an instrument game, the method comprising:
providing an arrangement having a plurality of notes to be played by a player;
determining a level of difficulty of a game for a particular player; and
adjusting the difficulty of the arrangement by changing the number of notes in the arrangement to be played by the player based on the determined level of difficulty for the player.
25. The method of claim 24, wherein adjusting the difficulty further comprises increasing the difficulty of the arrangement by adding more notes in the arrangement to be played by the player and decreasing the difficulty of the arrangement by reducing a number of notes in the arrangement to be played by the player.
Description
PRIORITY CLAIM

This application claims priority under 35 USC 119(e) and 120 to U.S. Provisional Patent Application Ser. No. 60/902,066, filed on Feb. 20, 2007 and entitled “A Music Video Game with Stringed Instrument Input”, which is incorporated herein by reference.

FIELD

A system and method for game playing are described. More particularly, a system and method for using a musical instrument as an input to a game, and the game driven by that instrument input, are disclosed.

BACKGROUND

Video games generally are well known, and video games and gaming systems with music-based games are also known. The game systems may be either personal computer/gaming console systems (e.g., the Microsoft® Xbox® or Sony® PlayStation® 2) or stand-alone gaming consoles such as might be found in an arcade. Examples of these types of games include Dance Dance Revolution, in which a user attempts to follow a dance routine set to music and is scored based on how accurately the user's dance routine matches the exemplary routine, and Guitar Hero, in which the user has a controller that looks like a guitar, plays along with a song and is scored based on how closely the user's playing matches the exemplary song. It is desirable to provide a game system and method in which an actual instrument is the input controller to the game system, and it is to this end that the present invention is directed.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1A illustrates an example of an implementation of a game system;

FIG. 1B illustrates an example of a user interface of a stringed instrument example of the game system;

FIG. 2 illustrates an example of a user interface of a stringed instrument example of the game system;

FIGS. 3A and 3B illustrate examples of the user interface of a note moving toward the play area of the stringed instrument example of the game system;

FIG. 3C illustrates another example of the user interface of FIG. 1B;

FIGS. 4A and 4B illustrate an example of a hit event and a miss event of the stringed instrument example of the game system;

FIG. 5 illustrates an example of the string, fret and time variables of the stringed instrument example of the game system;

FIG. 6 illustrates an example of an action indicator interface of the stringed instrument example of the game system;

FIG. 7 illustrates an example of another action indicator interface of the stringed instrument example of the game system;

FIG. 8 illustrates yet another example of an action indicator interface of the stringed instrument example of the game system;

FIG. 9 illustrates yet another example of an action indicator interface of the stringed instrument example of the game system;

FIG. 10 illustrates an example of a performance meter user interface of the stringed instrument example of the game system;

FIG. 11 illustrates a method for scoring notes in the stringed instrument example of the game system;

FIGS. 12A and 12B illustrate a hit scoring event and a miss scoring event of the stringed instrument example of the game system;

FIG. 13 illustrates an example of a method for scoring the notes of the stringed instrument example of the game system;

FIG. 14 illustrates an example of the stringed instrument example of the game system in which several time windows are used to score a note;

FIG. 15 illustrates an example of the user interface for selecting a level of difficulty of the stringed instrument example of the game system;

FIG. 16 illustrates examples of a sequence of notes with different difficulty levels in the stringed instrument example of the game system;

FIG. 17 illustrates examples of another sequence of notes with different difficulty levels in the stringed instrument example of the game system;

FIG. 18 illustrates an example of an arrangement of a musical arrangement of the stringed instrument example of the game system;

FIG. 19 illustrates an example of a menu in the stringed instrument example of the game system;

FIG. 20 illustrates an example of a select arrangement user interface of the stringed instrument example of the game system;

FIGS. 21A and 21B illustrate an audio and video selection user interface of the game system;

FIG. 22 illustrates a sound input device and gain user interface of the game system;

FIG. 23 illustrates an example of a hardware implementation of a video game system that incorporates the stringed instrument example of the game system; and

FIG. 24 illustrates further details of an analysis module of the exemplary embodiment of the game system shown in FIG. 1A.

DETAILED DESCRIPTION OF ONE OR MORE EMBODIMENTS

The game system and method are particularly applicable to a personal computer based, guitar based game system and method with the exemplary user interface described below, and it is in this context that the system and method will be described. It will be appreciated, however, that the system and method have greater utility because: 1) the game system can be implemented with other musical or melodic instruments including, for example, a bass, violin, banjo, piano, voice, clarinet, steel drums, etc.; 2) it can be implemented on other gaming apparatus, such as gaming consoles or stand-alone gaming units (such as the Microsoft® Xbox® system, the Sony® PlayStation®, Nintendo® Wii®, etc.); 3) it can be implemented in peer-to-peer, ASP model or client/server architectures or as an Internet game; and 4) it can be implemented using other user interfaces and features that are not specifically described below in the exemplary embodiments, which are provided to illustrate the game system and method. Now, an example of an implementation of the game system is described in more detail to illustrate its functions and principles.

FIG. 1A illustrates an example of an implementation of a game system 80 in which the game system is implemented as a software based stand-alone system. The system 80 may include a game unit 81, such as a cabinet or stand-alone unit, and an instrument 82, such as any musical or melodic instrument including, for example, a bass, violin, banjo, piano, voice, clarinet, steel drums, etc., that is used as an input to the game unit 81 via an interface 82a such as a USB cable, an amplifier cord with an adapter for a computer sound card, a networking cable carrying musical data information, a microphone, etc. The game unit may include a display 83 that is capable of displaying the user interface of the game to the user (an example of which is described below in more detail with reference to FIG. 1B), one or more processing units 84, a storage unit 86 (that may be a combination of a persistent storage device, such as a hard disk drive or ROM, and a memory such as SRAM or DRAM), and an operating system 88 that controls the operation of the game system and a game module 90 that reside in the storage unit. The game module, in this embodiment, may be a plurality of lines of computer code. In other embodiments, the game module may also be implemented in hardware or a combination of hardware and software. The game module may include modules for game administration (level difficulty functions), the musical instrument interface and game scoring. When the game system is active, the game module is loaded into the memory and then executed by the one or more processing units to implement the functions and operations of the game system described in more detail below. The game system permits a user to play an instrument along with an arrangement displayed on the display (using the instrument as an input to the game system via the interface) and then scores the user based on the accuracy with which the user plays the arrangement shown on the display, as described in more detail below.

FIG. 1B illustrates an example of a user interface 100 of a stringed instrument example of the game system. The example user interface is for illustration purposes only; the game system may use other user interfaces and is not limited to any particular user interface design. The example user interface may include a background graphic 102 that may consist of a number of images or a virtual environment and may be two dimensional or three dimensional. An example of a two dimensional background graphic with a single image (e.g., wallpaper) or a series of images (e.g., a movie, animation, music video, etc.) is shown in FIG. 1B. FIG. 2 shows another embodiment of a user interface 200 that is three dimensional and may include an animated character or characters 202, possibly playing a virtual instrument 204 and surrounded by virtual props 206 (audio equipment, a stage, an audience, etc.).

Returning to FIG. 1B, the exemplary user interface may further include an action indicator interface 104 that may include a note field 106, one or more notes 108 superimposed on top of the note field 106 and a play area 110. In one embodiment of the game system, the horizontal position of a note in the action indicator interface 104 indicates the time to play the note (cue time), the vertical position of the note indicates the string to play it on (cue string), and the number inside each note indicates the fret that is to be pressed down (cue fret) on the string to generate the correct pitch. In this embodiment, the one or more notes 108 move horizontally towards the play area 110 and the play area 110 is stationary. FIGS. 3A and 3B show this horizontal motion of the notes relative to the play area 110, with FIG. 3A at an earlier time than FIG. 3B. In the game system, the action indicator interface 104 cues the user to play the appropriate note at a specific time: when the note overlaps the play area, the user is to play that note.
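To make the cue variables concrete, each cue in the note field can be thought of as a (string, fret, time) triple. The following Python sketch is illustrative only and is not taken from the patent; the NoteCue name, the field layout, and the string-numbering convention are assumptions.

```python
from dataclasses import dataclass

@dataclass
class NoteCue:
    """One cue in the note field: which string, which fret, and when to play."""
    string: int   # cue string, e.g. 1 (high E) through 6 (low E) on a guitar
    fret: int     # cue fret to hold down; 0 for an open string
    time: float   # cue time in seconds from the start of the arrangement

# A short arrangement fragment: the open 3rd string, then the 2nd fret on the
# 3rd string one second later.
arrangement = [NoteCue(string=3, fret=0, time=4.0),
               NoteCue(string=3, fret=2, time=5.0)]
```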

FIG. 3C illustrates the expected user response to the action indicator interface 104. The top row of the user interface corresponds to the user playing the bottom string on a guitar (cue string). The number inside the note corresponds to the user holding down a particular fret of the guitar, such as the 2nd fret, with his/her finger (cue fret). The overlap of the note with the play area indicates that the user should play the cued string with the cued fret pressed at that instant (cue time), thereby producing a note that matches the arrangement note if played correctly and at the correct time.

If the user plays the cued note at the cued time, a “Hit” is awarded. If the user does not play the cued note, or waits too long to play the cued note, a “Miss” is awarded. FIGS. 4A and 4B show a hit event (when the user plays the correct note at the correct time) and a miss event (when the user fails to play the correct note at the correct time), respectively. In some embodiments of the game system, if the note is judged as a “Hit”, the note's graphical symbol may change its appearance (e.g., glow, explode, turn a bright color, etc.); otherwise, if the note is judged as a “Miss”, the graphical symbol for the note may change its appearance differently (e.g., fade out, shrink, turn a dark color, etc.).

In some embodiments of the game system, the user interface 100 shown in FIG. 1B may show notes 108 of different sizes and/or lengths, wherein the size and/or length of a note shows the player how long to hold the note, with note 108 illustrating a “short note” and note 109 illustrating a “long note”.

In some embodiments of the game system, a note 111 may take on a special characteristic (e.g., a glowing star, blinking, or moving in a wavy path) that distinguishes it from other notes in the note field 106 and may be “hit” by the user (played correctly) for an additional score or to otherwise enhance the player's in-game status. In some embodiments, the judgment of the last played note is displayed on-screen in a textual format 113 so that the player receives immediate feedback on the played note. The judgment 113 may read “Great”, “Good”, “Fair”, “Poor”, “Miss”, etc.

The action indicator interface 104 of the game system may be modified in various ways in other embodiments of the game system. For example, each note 108 (shown as a circle in the example in FIG. 1B) may use another graphical representation (e.g., squares, stars, arrows, etc.).

As another example, the user interface described above, in which the horizontal position of the note indicates the time to play it (cue time), the vertical position indicates the string to play it on (cue string), and the number inside the note indicates the fret that is to be pressed down (cue fret) on the string to generate the correct pitch, is only one way to cue stringed musical instrument play. The variables that cue the play (which string, which fret, and what time) may be arranged spatially (horizontally spaced, vertically spaced, depth-wise spaced) and demarcated by notation (using numbers, letters, colors, shapes, etc.), and these arrangements have many permutations, as shown in FIG. 5. Examples of these different user interfaces are shown in FIGS. 6-9. FIG. 6 shows an embodiment of the action indicator interface 104 with a note field 602, one or more notes 604, and a play area 606 wherein the horizontal position of the note indicates the cue time, the vertical position of the note represents the cue fret, and the number inside the note represents the cue string. FIG. 7 shows another embodiment of the action indicator interface 104 with a note field 702, one or more notes 704, and a play area 706 wherein the depth-wise position of the note indicates the cue time, the horizontal position indicates the cue string, and the numbers inside the notes represent the cue fret. FIG. 8 shows another embodiment of the action indicator interface 104 with a note field 802 and one or more notes 804 in which the horizontal position represents the cue fret, the vertical position represents the cue string, and the numbers inside the notes represent the cue time (i.e. the number of seconds to wait before playing the appropriate string/fret combination). Finally, FIG. 9 shows another embodiment of the action indicator interface 104 with a note field 902, one or more notes 904, and a play area 906 in which the horizontal position represents the cue fret, the depth-wise position represents the cue string, and the vertical position represents the cue time (i.e. when to play the note depends on how fast the note falls, and the string/fret combination is cued by where the note falls in the play area). Any of the embodiments shown in FIGS. 1-9 may be further modified by using unique colors, characters, or shapes instead of numbers to cue the appropriate string/note/time. For instance, the six colors red, green, blue, cyan, yellow, and magenta can be used to demarcate the cue string on a 6-string guitar. Also, for instance, note characters may be used to demarcate the cue note (e.g., “C#”, “D”, “B flat”, etc.). In addition to spacing along the traditional axes (i.e. horizontal, vertical, depth-wise), additional embodiments may space along non-traditional axes (e.g., diagonal). Additionally, there is no requirement that any or all axes be orthogonal (e.g., two axes may be parallel, near parallel, or otherwise not angled at 90 degrees).

The game system user interface may also include, in some embodiments, a performance feedback interface, wherein several components of the user interface 100 loosely define a mechanism for a player to receive feedback on their performance in the game. In some embodiments, the user interface 100 may include a score window 112 or other graphic that is used to present the player's current score during their performance. In some embodiments, a performance meter 114 may also be used to present the player's current performance, which is a measure of the number of hit notes and missed notes; if the player's performance falls below a predetermined level, the game may end. FIG. 10 shows alternative embodiments of performance meters. In some embodiments, the performance meter is a bar graph filled or unfilled with colors or patterns based on the player's performance, shown by 1000/1002 in FIG. 10. In some embodiments, the performance meter resembles an analog meter whose needle moves up and down to indicate the player's performance, shown by 1004 in FIG. 10.

The user interface 100 of the game system in some embodiments may further comprise a chord track 116 that may be, for example, located above the note field 106. During game play, chord information appears in the chord track 116 and scrolls from right to left towards the stationary play area 110. Each piece of chord data lines up with the corresponding note(s) 108 in the note field 106 to show the association between the two.

The user interface 100 of the game system in some embodiments may further comprise a signal feedback interface, wherein several components in the user interface 100 loosely define a mechanism for the player to gauge the quality and power of the instrument signal being fed into the game. In some embodiments, a volume meter 118 shows the total volume of the player's instrument during instrument play. In some embodiments, a signal plot 120 shows a plot of the power of the player's signal versus note, so that the signal plot shows peaks at the tone(s) the player is playing.

The user interface 100 of the game system in some embodiments may further comprise a combo feedback interface wherein several components in the user interface 100 loosely define a mechanism for the player to receive information on the number of correctly played notes that have been “hit” in a row (i.e. a combo of correct notes). In some embodiments, textual information 122 displays the current combo number. In some embodiments, a combo bar 124 is used, containing a graphical representation of the number of combo notes played together 126, as well as a display of a score multiplier 128 gained because of successful combos.

FIG. 11 illustrates a method for scoring notes in the stringed instrument example of the game system. If the player plays the arrangement note within the time window allotted around the cued time, the arrangement note is scored as a “Hit” (1102). If the wrong note is played (1104), or the arrangement note is played but not within the time window (1106), no judgment is given. Therefore, it is possible for the player to play several wrong notes but still receive a “Hit” after finally playing the correct arrangement note. If the arrangement note is never played, then a “Miss” is scored.
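A minimal sketch of this judgment logic is shown below, assuming the live performance has already been converted into (note, time) pairs; the function name, window width, and data layout are illustrative assumptions rather than the patent's implementation.

```python
# Judge one arrangement note against a list of performed (note, time) pairs.
def judge_note(cued_note, cue_time, performance, window=0.2):
    """Return ('Hit', time) if the cued note appears in the performance within
    +/- window seconds of the cue time, otherwise ('Miss', None). Wrong notes
    and out-of-window plays are simply ignored, as described above."""
    matches = [t for (note, t) in performance
               if note == cued_note and abs(t - cue_time) <= window]
    if not matches:
        return "Miss", None
    # If the cued note was played more than once inside the window, score the
    # occurrence closest to the cued time (the behavior illustrated in FIG. 13).
    best = min(matches, key=lambda t: abs(t - cue_time))
    return "Hit", best

performance = [("A", 3.90), ("G", 4.05), ("G", 4.12)]
print(judge_note("G", 4.0, performance))   # ('Hit', 4.05)
print(judge_note("E", 4.0, performance))   # ('Miss', None)
```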

FIGS. 12A and 12B illustrate a “hit” scoring event and a “miss” scoring event, respectively. As shown in FIG. 12A, the arrangement note “G” has been cued (1202) accompanied by a time window that is shown (1204). A “Hit” is scored in FIG. 12A because the performance by the user contains the note “G” (1206) within the time window (1204). In FIG. 12B, the arrangement note “G” has also been cued (1208) with a time window (1210). However, a “Miss” is scored in FIG. 12B because no note “G” is played in the user performance within the time window. Generally, the live instrument performance of the player will be a continuous signal (with pitches) that is therefore converted, in a known manner, into notes with time tags so that the game system can compare the notes of the arrangement with the notes of the live instrument performance. To accomplish this conversion, the system (such as the analysis module described with reference to FIG. 24 below) may determine the periodicity component of the pitch so that the periodicity component can be converted into a frequency, which can then be converted into a note.
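As one example of such a known conversion, the sketch below estimates the periodicity of a sampled frame by autocorrelation, converts it to a frequency, and maps the frequency to the nearest equal-tempered note. The specific algorithm and parameters are assumptions for illustration, not the method required by the patent.

```python
import numpy as np

def detect_note(frame, sample_rate=44100):
    """Periodicity -> frequency -> note: estimate the dominant period of the
    frame by autocorrelation, convert the period to a frequency, then map the
    frequency to the nearest equal-tempered note name and octave."""
    frame = frame - np.mean(frame)
    corr = np.correlate(frame, frame, mode="full")[len(frame) - 1:]
    # Skip past the zero-lag peak: start searching after the first negative lag.
    start = np.argmax(corr[1:] < 0) + 1 if np.any(corr[1:] < 0) else 1
    period = start + np.argmax(corr[start:])          # lag of the strongest peak
    freq = sample_rate / period
    # Convert frequency to a MIDI note number (A4 = 440 Hz = MIDI 69).
    midi = int(round(69 + 12 * np.log2(freq / 440.0)))
    names = ["C", "C#", "D", "D#", "E", "F", "F#", "G", "G#", "A", "A#", "B"]
    return names[midi % 12], midi // 12 - 1           # note name and octave

# Example: a synthetic 196 Hz tone (the open 3rd string, G3, on a guitar).
t = np.arange(0, 0.05, 1 / 44100)
print(detect_note(np.sin(2 * np.pi * 196.0 * t)))     # ('G', 3)
```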

FIG. 13 shows which of the performance notes by the user (1302) will be judged/scored if the player plays the cued arrangement note multiple times within the time window. For example, a note “G” in the arrangement has been cued (1304) accompanied by a time window (1306). The player has played “G” twice within the time window (1306), at time 1308 and time 1310. However, the note at time 1308 is closer in time to the arrangement note 1304 and is therefore the one selected for scoring.

In some embodiments, there may be several time windows associated with an arrangement note 1402, as shown in FIG. 14, wherein four different time windows are shown. Each successive time window allows the player a greater time tolerance for playing the correct arrangement note. In some embodiments, the scoring may be done by giving higher scores to the user performance notes that fall in the smaller time windows. For instance, notes falling in windows 1404, 1406, 1408, and 1410 may be judged as “Great”, “Good”, “Fair”, and “Poor” and be given scores of 4, 3, 2, and 1, respectively. Also, there is no requirement that the time window be symmetrical, as more of the window can be given after the exact cued time 1402 than before it, or vice versa.
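The tiered-window scoring can be sketched as follows. The ratings and scores follow the example above, but the window widths (and their symmetry) are invented for illustration.

```python
# Nested time windows around a cued note, smallest first, each with a rating
# and a score; the widths below are illustrative only.
TIERS = [(0.05, "Great", 4), (0.10, "Good", 3), (0.20, "Fair", 2), (0.40, "Poor", 1)]

def rate_timing(cue_time, played_time, tiers=TIERS):
    """Return (rating, score) for the smallest window containing the played
    note, or ('Miss', 0) if it falls outside the largest window."""
    error = abs(played_time - cue_time)
    for width, rating, score in tiers:
        if error <= width:
            return rating, score
    return "Miss", 0

print(rate_timing(10.0, 10.04))   # ('Great', 4)
print(rate_timing(10.0, 10.15))   # ('Fair', 2)
```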

In some embodiments, the scoring of notes can be done independently of the octave of the note, so that notes played one or more octaves higher or lower than the cued note will still be scored positively (i.e. as a “Hit”). In these embodiments, the note of the live instrument performance data point is adjusted by one or more octaves and the adjusted note is then compared to the arrangement note. The live instrument performance is scored positively if the adjusted note is equal to the arrangement note, and the live musical performance is not scored if the adjusted note does not match the arrangement note.
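Octave-independent matching amounts to comparing pitch classes, since shifting a note by whole octaves leaves its pitch class unchanged. A small sketch, assuming MIDI note numbers for both the played and the cued note:

```python
def octave_independent_match(played_midi, cued_midi):
    """True if the played note equals the cued note after shifting by whole
    octaves, i.e. the two notes share a pitch class (12 semitones per octave)."""
    return (played_midi - cued_midi) % 12 == 0

print(octave_independent_match(67, 55))   # True:  G4 vs G3
print(octave_independent_match(56, 55))   # False: G#3 vs G3
```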

The game system may include a level scoring module. In the game, each level is scored based on the number of “Hits” versus “Misses” awarded to the player. In embodiments with multiple time windows, “Hits” would be subdivided further into “Great”, “Good”, “Fair”, “Poor”, etc. In some embodiments, scoring for a level is done by multiplying the number of judged notes of each rating by a multiplier assigned to that rating (e.g., Miss-0, Poor-1, Fair-2, Good-3, Great-4). In some embodiments, a score above a certain amount will unlock one or more items (e.g., a song, a new character, a new character outfit or guitar, etc.). In some embodiments, a score below a certain amount will “fail” the player and thus not allow the player to progress to the next level.
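Using the example multipliers above, a level score can be computed as in the following sketch; the pass threshold is an assumption added for illustration.

```python
# Example multipliers from the text above.
MULTIPLIERS = {"Miss": 0, "Poor": 1, "Fair": 2, "Good": 3, "Great": 4}

def score_level(judgments, pass_threshold=50):
    """Sum the per-note multipliers and decide whether the player passes the level."""
    total = sum(MULTIPLIERS[j] for j in judgments)
    return total, total >= pass_threshold

judgments = ["Great", "Good", "Miss", "Fair", "Great"]
print(score_level(judgments, pass_threshold=10))   # (13, True)
```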

The game system may also adjust the difficulty of each level of the game. For example, as shown in FIG. 15, the same song may be played at several different difficulty levels using a select difficulty screen 1500.

In the game system, different arrangements of musical pieces can be used to give more difficult and challenging experiences of playing the same musical piece, as shown in FIG. 16. The piece shown, “Mary Had a Little Lamb”, has its rhythmic components shown by 1602. An “Easy” arrangement of the piece 1604 may be composed by cueing only every 4th note. An arrangement more difficult than the Easy arrangement, denoted as “Normal” 1606, cues every 2nd note. An arrangement more difficult than Normal, denoted as “Hard” 1608, cues the player to play every note in the melody. An arrangement more difficult than Hard, denoted as “Expert” 1610, cues the player to add grace notes 1612 and other extra note runs 1614 to the original musical piece. Furthermore, as an arrangement is made more difficult, the time window for each note may be made smaller than the time window for the same note in an easier version of the arrangement.
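The every-Nth-note technique can be sketched as a simple thinning of the full arrangement. The difficulty-to-step mapping below follows the example above; the rest (names, use of plain lists) is an assumption for illustration, and the cues could equally be NoteCue objects from the earlier sketch.

```python
# How many cues of the full ("Hard") arrangement to skip per kept cue.
DIFFICULTY_STEP = {"Easy": 4, "Normal": 2, "Hard": 1}

def thin_arrangement(full_arrangement, difficulty):
    """Keep every Nth cue of the full arrangement for the chosen difficulty."""
    step = DIFFICULTY_STEP[difficulty]
    return full_arrangement[::step]

melody = list(range(16))                         # stand-in for 16 cued notes
print(len(thin_arrangement(melody, "Easy")))     # 4
print(len(thin_arrangement(melody, "Normal")))   # 8
```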

An alternate arrangement technique is illustrated in FIG. 17. The piece shown, “Mary Had a Little Lamb”, has its rhythmic components shown by 1702. An “Easy” arrangement of the piece 1704 may be composed by cueing every note in the melody. An arrangement more difficult than Easy, denoted as “Normal” 1706, cues additional harmonies to be played on other strings in synchronization with the original melody. An arrangement more difficult than Normal, denoted as “Hard” 1708, cues even more additional harmonies to be played on other strings in synchronization with the original melody. In this way, the difficulty of any arrangement can be adjusted by the game system.

In addition, arrangements of songs do not have to follow the traditional melodies, as shown in FIG. 18. In particular, arrangements may be designed in which music theory fundamentals (e.g., scales, chords, arpeggios, etc.) are cued instead. The piece shown, “Mary Had a Little Lamb”, has its rhythmic components shown by 1802. While the melody is shown in 1804, an equally valid series of notes consists of a major scale 1806 in the same key as the musical selection. In some embodiments, more difficult arrangements of musical pieces contain a more difficult series of notes to be played together in sequence (e.g., guitar riffs).

FIG. 19 illustrates an example of a progression of menu screens in the stringed instrument example of the game system, wherein 1902 shows a non-interactive progression of screens, which may include a splash screen 1904 that displays the game developer's logo, a logo screen 1906 that displays the game logo, a demonstration screen 1908 that shows the game being autoplayed or played by a computer, and a list of high scores 1910. The user is taken to the interactive progression of screens 1912 after the user interacts with the game (e.g., presses Enter on the keyboard). The main menu 1914 lists the available options. The select difficulty screen 1916 allows the player to select the desired song difficulty (FIG. 15). The select music screen allows the player to select a song to play (FIG. 20). The game play screen 1920 is the main game screen (FIG. 1B), which may be paused and then resumed. After game play, the player is taken to an evaluation screen 1922 to review their performance. From the main menu 1914, the player may select the setup instrument screen 1924 to tune their instrument and set up an appropriate sound input device and signal gain (FIG. 22). Also from the main menu 1914, the user may select other options 1926, which give the player the ability to adjust video options 1928 (fullscreen or windowed game, screen resolution, etc.) (FIG. 21B) or audio options 1930 (music volume, sound effects volume, etc.) (FIG. 21A).

FIG. 23 illustrates an example of a hardware implementation of a video game system that incorporates the stringed instrument example of the game system. The game system may include a system bus 2302, a ROM 2306 that holds the operating system and a memory 2308 (such as a RAM) that holds the game program 2309. The game system may also include external storage 2310 that can be a computer's hard drive, an optical disk, a flash memory drive, etc. The game system also has a sound module 2312 that connects to the speaker 2314 and a video module 2316 that processes graphics and connects to the display 2318, which can be a computer monitor, TV, or arcade screen. The game system may also have a peripheral input 2320 that takes input from the user's keyboard, mouse, buttoned game controllers, racing wheels, etc., and a sound input 2322 that takes input from the user's musical instrument and can be a USB cable, microphone, amplifier cord with adapter for a computer sound card, networking cable carrying musical data information, etc. The game system may also have a network interface 2324 that takes data in and sends data out over a network for networked play; it can send or receive game data (player profiles, “Hits”, “Misses”, etc.), sound data (from a musical instrument), or music data (e.g., .mp3 or .ogg data from a music file).

FIG. 24 illustrates further details of an analysis module 2400 that is part of the exemplary embodiment shown in FIG. 1A. The analysis module may receive an instrument input that is fed into a processing unit 2401, such as a digital signal processing unit (DSP), that detects one or more notes (and a time tag for each note) in the live instrument input using known pitch conversion and note detection processes (described above) programmed into the DSP. The note and time tag information may be fed into a compare module 2402 (implemented as one or more lines of computer code in one embodiment) that queries the arrangement storage at a particular time period and then compares the live performance notes and time tags to a set of arrangement performance notes and time tags that may be stored in a buffer 2403 as shown. The comparison may be done by determining if the notes match and, if the notes match, then finding the live instrument note with the smallest time error. The compare module then may output a time error to a score module 2404 (implemented as one or more lines of computer code in one embodiment) that generates score data which is output to the player and also output to a performance module 2405 (implemented as one or more lines of computer code in one embodiment) that outputs performance data that indicates the performance level of the particular player.
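The compare and score stages of FIG. 24 can be sketched end to end as follows; detected (note, time) pairs from the instrument input are compared against buffered arrangement data, and the resulting time errors feed a score. The data layout, window width, and the hit-ratio performance measure are illustrative assumptions rather than the patented implementation.

```python
def analyze(live_events, arrangement, window=0.2):
    """live_events and arrangement are lists of (note, time) pairs. For each
    arrangement note, find the matching live note with the smallest time error
    (if any) and accumulate simple per-note results and an overall hit ratio."""
    results = []
    for cued_note, cue_time in arrangement:
        errors = [abs(t - cue_time) for (n, t) in live_events
                  if n == cued_note and abs(t - cue_time) <= window]
        results.append(("Hit", min(errors)) if errors else ("Miss", None))
    hits = sum(1 for rating, _ in results if rating == "Hit")
    return results, hits / len(arrangement)   # per-note results and hit ratio

arrangement = [("G", 1.0), ("A", 2.0), ("B", 3.0)]
live = [("G", 1.05), ("A", 2.3), ("B", 2.95)]
print(analyze(live, arrangement))
# ([('Hit', 0.05...), ('Miss', None), ('Hit', 0.05...)], 0.666...)
```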

While the foregoing has been with reference to a particular embodiment of the invention, it will be appreciated by those skilled in the art that changes in this embodiment may be made without departing from the principles and spirit of the invention, the scope of which is defined by the appended claims.

Classifications
U.S. Classification: 463/7
International Classification: A63F9/24
Cooperative Classification: A63F2300/8047, A63F13/10, A63F2300/1062, A63F2300/61, G10H2220/015, G10H2210/091, G10H1/368, G10H2220/151, A63F2300/638
European Classification: A63F13/10, G10H1/36K7
Legal Events
Sep 22, 2010 (AS, Assignment): Owner: UBISOFT ENTERTAINMENT, FRANCE. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: GAMETANK, INC.; REEL/FRAME: 025028/0616. Effective date: 20100614.
Oct 3, 2007 (AS, Assignment): Owner: GAMETANK INC., CALIFORNIA. Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: PARKS, JAKOB; REEL/FRAME: 019916/0444. Effective date: 20071001.