|Publication number||US7758427 B2|
|Application number||US 11/623,534|
|Publication date||Jul 20, 2010|
|Filing date||Jan 16, 2007|
|Priority date||Nov 15, 2006|
|Also published as||EP2099542A2, US8079907, US20080113698, US20080113797, US20120094730, WO2008061169A2, WO2008061169A3|
|Original Assignee||Harmonix Music Systems, Inc.|
This application is a continuation of U.S. patent application Ser. No. 11/560,195 filed on Nov. 15, 2006, and titled “METHOD AND APPARATUS FOR FACILITATING GROUP MUSICAL INTERACTION OVER A NETWORK.”
This invention relates to electronic music systems and, more particularly, to an electronic music system by which game players interact musically with one another in real-time over a network.
Music is a temporal medium, the organization of sound in time. Accordingly, music making is highly timing sensitive. When a musician presses a key on a piano, the musician expects the result to be immediately audible. Any delay in hearing the sound, even as brief as a few milliseconds, produces a perceived sluggishness that impedes the ability of the musician to use the instrument.
Music making is also often a collaborative effort among many musicians who interact with each other. One form of musical interaction popular among non-musicians is provided by a video game genre known as “rhythm-action,” which requires a player to perform phrases from a pre-recorded musical composition using the video game's input device to simulate a musical instrument. The best-known example of this genre is the BEATMANIA series of games published by Konami Co., Ltd. of Japan. An example of the game environment provided by BEATMANIA is shown in
Multiplayer gaming increasingly incorporates various networking technologies that allow multiple players to compete against each other from remote physical locations via networks, and networked multiplayer gaming has become extremely popular. Unfortunately, however, the latency inherent in networked communication imposes a significant engineering and design burden on video game developers: data signals are often subject to large and unpredictable transmission delays. These transmission delays do not significantly impact turn-based games (such as chess) or other game genres in which timing sensitivity is not critical to gameplay. In action games and other “real-time” games, however, gameplay is extremely sensitive to the timing of various events, and transmission delays inherently result in inconsistencies continually forming between the local game states of the various players of a networked game. Consequently, developers of timing-sensitive networked games have had to invent various methods for gracefully performing “conflict resolution” to resolve divergent local game states.
The rhythm-action genre has a unique attribute, however, that makes traditional conflict resolution methods inapplicable. Specifically, the core activity of multiplayer rhythm-action involves simultaneous music-making, which is highly timing sensitive, by two or more players. If these two players are separated by a network, the data representing musical notes played by one player will incur transmission delays when being sent to the other player. If note data were simply transmitted to a receiving machine it would trigger corresponding audio that would sound “out of sync” to the receiving player, resulting in cacophony. One solution to this problem would be to mute the audio from remote players on the local player's machine. However, this would significantly degrade the entertainment value of the game experience by destroying musical communication between the players.
Therefore, a need exists for a system and method that enable musicians to achieve the experience of real-time musical interaction over a high-latency network, such as the Internet.
It is an object of the invention to provide a system and method that a group of individuals connected to a network can use to compete with one another in real time in a rhythm-action game.
In one aspect, the present invention relates to a method for facilitating real-time interaction between players of a game. First music performance input data is received from a local player, the first music performance input data representing a first musical performance. Audio output responsive to the received first music performance input is generated. Second music performance input data from a remote player is received via a network. The received second music performance input data represents a musical performance by the remote player. Emulation data is created from the received second music performance input data and a local approximation of the remote musical performance is generated using the emulation data. The local approximation of the remote musical performance is synchronous with the local musical performance.
In some embodiments, the music performance input data is generated by a local player using a gamepad, a simulated musical instrument, a simulated guitar, a simulated drum, a simulated musical keyboard, a simulated turntable, or a simulated microphone. In other embodiments, a note is sounded to indicate a successful input or an error tone is sounded to indicate an unsuccessful input. In still other embodiments, second music performance input data from a remote player is received from an interim server between the two players. The received second music performance input data represents a musical performance by the remote player.
In still other embodiments the emulation data is created by performing a moving average of timing deltas between received second music performance events or by performing a moving average of received second music performance events.
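As an illustrative sketch only, a moving average of timing deltas between received performance events might be computed as follows (the class name, window size, and tick-based timestamps are assumptions for illustration; the patent does not prescribe an implementation):

```python
from collections import deque
from typing import Optional

class TimingDeltaAverager:
    """Maintain a moving average of the timing deltas between
    successively received remote music-performance events.
    The window size is an illustrative assumption."""

    def __init__(self, window: int = 8):
        self.deltas = deque(maxlen=window)      # most recent deltas only
        self.last_timestamp: Optional[float] = None

    def observe(self, timestamp: float) -> Optional[float]:
        """Record an event timestamp (e.g. in ticks) and return the
        current average inter-event delta, or None if too few events."""
        if self.last_timestamp is not None:
            self.deltas.append(timestamp - self.last_timestamp)
        self.last_timestamp = timestamp
        if not self.deltas:
            return None
        return sum(self.deltas) / len(self.deltas)
```

Feeding the averager a stream of event timestamps yields a smoothed estimate of the interval between the remote player's notes, which could then drive a local approximation of the remote performance.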
In another aspect, the present invention relates to a method for facilitating real-time interaction between players of a game. Music performance input data from a remote player is received over a network. The received music performance input data represents a musical performance by the remote player. Emulation data from the received music performance input data is created and a local approximation of the remote musical performance is created using the emulation data. The local approximation is synchronous with a local musical performance.
In some embodiments, the received music performance input data represents a musical performance by the remote player and is received from an interim server. In other embodiments, the emulation data is created by performing a moving average of timing deltas between received music performance events or a moving average of received music performance events.
In still another aspect, the present invention relates to a system for facilitating real-time interaction between players of a game. The system includes: means for receiving first music performance input data from a local player, the first music performance input data representing a first musical performance; means for generating audio output responsive to the received first music performance input; means for receiving, via a network, second music performance input data from a remote player, the received second music performance input data representing a musical performance by the remote player; means for creating emulation data from the received second music performance input data; and means for generating a local approximation of the remote musical performance using the emulation data, the local approximation synchronous with the local musical performance.
In some embodiments, the means for receiving first music performance input data comprises a simulated musical instrument, a simulated guitar, a simulated drum, a simulated musical keyboard, a simulated turntable, or a simulated microphone. In other embodiments, the means for generating audio output plays a note to indicate a successful input or plays an error tone to indicate an unsuccessful input.
The invention is pointed out with particularity in the appended claims. The advantages of the invention described above, as well as further advantages of the invention, may be better understood by reference to the following description taken in conjunction with the accompanying drawings, in which:
Referring now to
Although depicted in
In some embodiments, the spatial lane does not extend perpendicularly from the image plane of the display but instead extends obliquely from the image plane of the display. In further embodiments, the lane may be curved or may be some combination of curved portions and straight portions. In still further embodiments, the lane may form a closed loop through which the viewer may travel, such as a circular or ellipsoid loop.
As shown in
It should be understood that the display of three-dimensional “virtual” space is an illusion achieved by mathematically “rendering” two-dimensional images from objects in a three-dimensional “virtual space” using a “virtual camera,” just as a physical camera optically renders a two-dimensional view of real three-dimensional objects. Animation may be achieved by displaying a series of two-dimensional views in rapid succession, similar to motion picture films that display multiple still photographs per second.
To generate the three-dimensional space, each object in the three-dimensional space is typically modeled as one or more polygons, each of which has associated visual features such as texture, transparency, lighting, shading, anti-aliasing, z-buffering, and many other graphical attributes. The combination of all the polygons with their associated visual features can be used to model a three-dimensional scene. A virtual camera may be positioned and oriented anywhere within the scene. In many cases, the camera is under the control of the viewer, allowing the viewer to scan objects. Movement of the camera through the three-dimensional space results in the creation of animations that give the appearance of navigation by the user through the three-dimensional environment.
A software graphics engine may be provided which supports three-dimensional scene creation and manipulation. A graphics engine generally includes one or more software modules that perform the mathematical operations necessary to “render” the three-dimensional environment, which means that the graphics engine applies texture, transparency, and other attributes to the polygons that make up a scene. Graphic engines that may be used in connection with the present invention include Gamebryo, manufactured by Emergent Game Technologies of Calabasas, Calif., the Unreal Engine, manufactured by Epic Games, and Renderware, manufactured by Criterion Software of Austin, Tex. In other embodiments, a proprietary graphic engine may be used. In many embodiments, a graphics hardware accelerator may be utilized to improve performance. Generally, a graphics accelerator includes video memory that is used to store image and environment data while it is being manipulated by the accelerator.
In other embodiments, a three-dimensional engine may not be used. Instead, a two-dimensional interface may be used. In such an embodiment, video footage of a band can be used in the background of the video game. In others of these embodiments, traditional two-dimensional computer-generated representations of a band may be used in the game. In still further embodiments, the background may be only slightly related, or unrelated, to the band. For example, the background may be a still photograph or an abstract pattern of colors. In these embodiments, the lane 220, 240, 260 may be represented as a linear element of the display, such as a horizontal, vertical, or diagonal element.
Referring back to
As the game elements 224, 244, 264 move along a respective lane 220, 240, 260, musical data represented by the game elements 224, 244, 264 may be substantially simultaneously played as audible music. In some embodiments, audible music represented by a game element 224, 244, 264 is only played (or only played at full or original fidelity) if a player successfully “performs the musical content” by capturing or properly executing the game element 224, 244, 264. In some embodiments, a musical tone is played to indicate successful execution of a musical event by a player. In other embodiments, a stream of audio is played to indicate successful execution of a musical event by a player. In certain embodiments, successfully performing the musical content triggers or controls the animations of the avatars 210, 230, 250. In other embodiments, the audible music, tone, or stream of audio represented by a game element 224, 244, 264 is modified, distorted, or otherwise manipulated in response to the player's proficiency in executing game elements associated with a lane 220, 240, 260. For example, various digital filters can operate on the audible music, tone, or stream of audio before it is played to the game player. Various parameters of the filters can be dynamically and automatically modified in response to the player capturing game elements associated with a lane 220, 240, 260, allowing the audible music, tone, or stream of audio to be degraded if the player performs poorly or enhanced if the player performs well. For example, if a player fails to execute a game event, the audible music, tone, or stream of audio represented by the failed event may be muted, played at less than full volume, or filtered to alter its sound. In certain embodiments, a “wrong note” sound may be substituted for the music represented by the failed event.
Conversely, if a player successfully executes a game event, the audible music, tone, or stream of audio may be played normally. In some embodiments, if the player successfully executes several, successive game events, the audible music, tone, or stream of audio associated with those events may be enhanced, for example, by adding an echo or “reverb” to the audible music. The filters can be implemented as analog or digital filters in hardware, software, or any combination thereof. Further, application of the filter to the audible music output, which in many embodiments corresponds to musical events represented by game elements 224, 244, 264, can be done dynamically, that is, during play. Alternatively, the musical content may be processed before game play begins. In these embodiments, one or more files representing modified audible output may be created and musical events to output may be selected from an appropriate file responsive to the player's performance.
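Where the musical content is processed before game play begins, selecting a pre-rendered audio variant responsive to the player's recent performance might be sketched as follows (the file names, thresholds, and hit-rate window are illustrative assumptions, not taken from this description):

```python
def select_audio_variant(recent_results: list) -> str:
    """Pick which pre-rendered audio file to draw the next musical
    event from, based on the fraction of recent events the player
    executed successfully. Thresholds are illustrative assumptions."""
    if not recent_results:
        return "normal.wav"
    hit_rate = sum(1 for r in recent_results if r) / len(recent_results)
    if hit_rate >= 0.9:
        return "enhanced.wav"   # e.g. pre-processed with echo or reverb
    if hit_rate < 0.5:
        return "degraded.wav"   # e.g. pre-processed muted or filtered
    return "normal.wav"
```

A streak of successful events thus maps to the enhanced file, and poor play maps to the degraded one, without requiring real-time filtering during play.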
In addition to modification of the audio aspects of game events based on the player's performance, the visual appearance of those events may also be modified based on the player's proficiency with the game. For example, failure to execute a game event properly may cause game interface elements to appear more dimly. Alternatively, successfully executing game events may cause game interface elements to glow more brightly. Similarly, the player's failure to execute game events may cause their associated avatar 210, 230, 250 to appear embarrassed or dejected, while successful performance of game events may cause their associated avatar 210, 230, 250 to appear happy and confident. In other embodiments, successfully executing game elements associated with a lane 220, 240, 260 causes the avatar 210, 230, 250 associated with that lane 220, 240, 260 to appear to play an instrument, for example, the drummer avatar 230 will appear to strike the correct drum for producing the audible music. Successful execution of a number of successive game elements, or notes, may cause the corresponding avatar 210, 230, 250 to execute a “flourish,” such as kicking their leg, pumping their fist, performing a guitar “windmill,” spinning around, winking at the “crowd,” or throwing drum sticks.
Player interaction with the game element 224, 244, 264 may be required in a number of different ways. In general, the player is required to provide input when a game element 224, 244, 264 passes under or over a respective one of a set of target markers 228, 248, 268 disposed on the lane 220, 240, 260. For example, the player associated with avatar 210 (lead guitar) or avatar 250 (bass guitar) may use a specialized controller that simulates a guitar to interact with the game, such as a Guitar Hero SG Controller, manufactured by RedOctane of Sunnyvale, Calif. In this embodiment, the player executes the game element by activating the “strum bar” while pressing the correct fret button of the controller when the game element 224, 264 passes under the target markers 228, 268. In other embodiments, the player may execute a game element by performing a “hammer on” or “pull off,” which requires quick depression or release of a fret button without activation of the strum bar. In other embodiments, the player may be required to perform a game element using a “whammy bar” provided by the guitar controller. For example, the player may be required to bend the pitch of a note represented by a game element using the whammy bar. In some embodiments, the guitar controller may also use one or more “effects pedals,” such as reverb or fuzz, to alter the sound reproduced by the gaming platform.
The player associated with the middle avatar 230 (drummer) may also use a specialized controller that simulates a drum kit to interact with the game, such as the DrumMania drum controller, manufactured by Topway Electrical Appliance Co., Ltd. of Shenzhen, China. In some embodiments, the drum controller provides four drum pads and a kick drum. In other embodiments, the drum controller surrounds the player, as a “real” drum kit would. In still other embodiments, the drum controller is designed to look and feel like an analog drum kit. In these embodiments, a game element may be associated with a particular drum. The player strikes the indicated drum when the game element 244 passes under the target marker 248 to successfully execute game element 244. In other embodiments, a player may use a standard game controller to play, such as a DualShock game controller, manufactured by Sony Corporation.
In some embodiments, a player is associated with a “turntable” or “scratch” track. In these embodiments, the player may provide input using a simulated turntable such as the turntable controller sold by Konami Corporation.
Referring now to
In other embodiments, a player may interact with the game and cooperate or compete with other players by executing specific dance moves in synchrony with music content. As shown in
Although described above in the context of a single player providing a single type of input, a single player may provide one or more types of input simultaneously. For example, a single player may provide dance and vocal input simultaneously. Another example is a single player providing instrument-based input (such as for a lead guitar track, bass guitar track, rhythm guitar track, keyboard track, drum track, or other percussion track) and vocal input simultaneously. As another example, a single player may provide instrument-based input (such as for a lead guitar track, bass guitar track, rhythm guitar track, keyboard track, drum track, or other percussion track) and dance input simultaneously.
Referring back to
Local play may be competitive or it may be cooperative. Cooperative play is when two or more players work together in an attempt to earn a combined score. Competitive play is when a player competes against another player in an attempt to earn a higher score. In other embodiments, competitive play involves a team of cooperating players competing against another team of cooperating players in an attempt to achieve a higher team score than the other team. Competitive local play may be head-to-head competition using the same instrument, head-to-head competition using separate instruments, simultaneous competition using the same instrument, or simultaneous competition using separate instruments.
In one embodiment, competition in local play occurs when two or more players use the same type of instrument controller to play the game, for example, guitar controllers. One embodiment of such competition is depicted in
This embodiment of head-to-head play may be extended to allow the players to use different types of game controllers and, therefore, to perform different portions of the musical composition. For example, one player may elect to play using a guitar-type controller while a second player may play using a drum-type controller. Alternatively, each player may use a guitar-type controller, but one player elects to play “lead guitar” while the other player elects to play “rhythm guitar” or, in some embodiments, “bass guitar.” In these examples, the gaming platform reproduces the instruments other than the guitar when it is the first player's turn to play, and the lane associated with the first player is populated with gems representing the guitar portion of the composition. When it is time for the second player to compete, the gaming platform reproduces the instruments other than, for example, the drum part, and the second player's lane is populated with gems representing the drum portion of the musical composition. In some of these embodiments, a scalar factor may be applied to the score of one of the players to compensate for the differences in the parts of the musical composition.
In still other embodiments, the players may compete simultaneously, that is, each player may provide a musical performance at the same time as the other player. In some embodiments, both players may use the same type of controller. In these embodiments, each player's lane provides the same pattern of game elements and each player attempts to reproduce the musical performance identified by those elements more faithfully than the other player. In other embodiments, the players use different types of controllers. In these embodiments, one player attempts to reproduce one portion of a musical composition while the other player attempts to reproduce a different portion of the same composition.
In any of these forms of competition, the relative performance of a player may affect their associated avatar. For example, the avatar of a player that is doing better than the competition may, for example, smile, look confident, glow, swagger, “pogo stick,” etc. Conversely, the losing player's avatar may look depressed, embarrassed, etc.
Instead of competing, the players may cooperate in an attempt to achieve a combined score. In these embodiments, the score of each player contributes to the score of the team, that is, a single score is assigned to the team based on the performance of all players. As described above, a scalar factor may be applied to the score of one of the players to compensate for the differences in the parts of the musical composition.
In some embodiments, one or more of the players may participate remotely.
When a networked multiplayer game session begins at the direction of one of the players, that player's gaming platform 510 (the “host”) transmits a “start” instruction to all other gaming platforms participating in the networked game, and the game begins on all platforms. A timer begins counting on each gaming platform, each player's game cues are displayed, and each player begins attempting to perform the musical composition.
Gameplay on gaming platform 510 is independent from game play on gaming platform 510′, except that each player's gaming platform contains a local copy of the musical event data for all other players. The timers on the various gaming platforms communicate with each other via the network 550 to maintain approximate synchrony using any number of the conventional means known in the art.
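The synchronization mechanism is left to conventional means known in the art; one such convention is round-trip offset estimation, sketched here under the assumption of symmetric network delay (the function name and the single-exchange model are illustrative, not from this description):

```python
def estimate_clock_offset(t_send: float, t_remote: float,
                          t_recv: float) -> float:
    """Estimate the remote-minus-local clock offset from one
    ping exchange: the local timer reads t_send when the request
    leaves and t_recv when the reply arrives; the remote timer
    read t_remote when it replied. Assumes symmetric delay."""
    round_trip = t_recv - t_send
    # The remote timestamp was taken roughly half a round trip
    # after the request left, so compare it to that midpoint.
    return t_remote - (t_send + round_trip / 2.0)
```

In practice such estimates would be repeated and smoothed, but even a coarse offset keeps the platforms' timers in approximate synchrony, which is all the emulation scheme below requires.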
The gaming platforms 510, 510′ also continually transmit game score data to each other, so that each system (and player) remains aware of the game score of all other systems (and players). Similarly, this is accomplished by any number of means known in the art. Note that this data is not particularly timing sensitive, because if there is momentary disagreement between any two gaming platforms regarding the score (or similar game-related parameters), the consequences to gameplay are negligible.
In one embodiment, as each player plays the game at their respective location, an analyzer module 580, 580′ on that player's gaming platform 510, 510′ continually extracts data from an event monitor 585, 585′ regarding the local player's performance, referred to hereafter as “emulation data.” Emulation data may include any number of parameters that describe how well the player is performing; several examples are discussed below.
Each analyzer module 580, 580′ continually transmits the emulation data it extracts over the network 550 using transceiver 590, 590′; each event monitor 585, 585′ continually receives the other gaming platform's emulation data transmitted over the network 550.
In one embodiment, the emulation data essentially contains a statistical description of a player's performance in the recent past. The event monitor 585, 585′ uses received emulation data to create a statistical approximation of the remote player's performance.
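A statistical description of this kind might be represented by a simple data structure such as the following sketch (all field names are assumptions for illustration; no particular format is prescribed):

```python
from dataclasses import dataclass

@dataclass
class EmulationData:
    """Illustrative statistical summary of a remote player's recent
    performance, transmitted in place of raw note events."""
    window_beats: int        # how far back the statistics look
    hit_fraction: float      # fraction of events correctly reproduced
    misses_per_beat: float   # rate of incorrectly performed events
    avg_timing_error: float  # mean timing error of misses, in ticks
```

Because such a summary is small and only loosely coupled to any single note, it tolerates network delay far better than a stream of individual note events would.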
In one particular example, an incoming emulation parameter from a remote player indicates that the most recent remote event was correctly reproduced. When the local event monitor 585, 585′ reaches the next note in the local copy of the remote player's note data, it will respond accordingly by “faking” a successfully played note, triggering the appropriate sound. That is, the local event monitor 585, 585′ will perform the next musical event from the other players' musical event data, even though that event was not necessarily actually performed by the other player's event monitor 585, 585′. If instead the emulation parameter had indicated that the most recent remote event was a miss, no sound would be triggered.
In another particular example, an incoming emulation parameter from a remote player indicates that during the last 8 beats, 75% of events were correctly reproduced and 25% were not correctly reproduced. When the local event monitor 585 reaches the next note in the local copy of the remote player's note data, it will respond accordingly by randomly reproducing the event correctly 75% of the time and not reproducing it correctly 25% of the time.
In another particular example, an incoming emulation parameter from a remote player indicates that during the last 4 beats, 2 events were incorrectly performed, with an average timing error of 50 “ticks.” The local event monitor 585, 585′ will respond accordingly by randomly generating incorrect events at a rate of 0.5 misses-per-beat, displacing them in time from nearby notes by the specified average timing error.
The above three cases are merely examples of the many types of emulation parameters that may be used. In essence, the remote player performances are only emulated (rather than exactly reproduced) on each local machine.
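The hit/miss and miss-rate emulation strategies in the examples above might be sketched as follows (the function names, the random displacement model, and the tick units are illustrative assumptions):

```python
import random

def emulate_hit(hit_fraction: float, rng: random.Random) -> bool:
    """Decide whether the local approximation 'fakes' the next note
    in the remote player's note data as a hit, matching the remote
    player's reported hit fraction (e.g. 0.75 for 75%)."""
    return rng.random() < hit_fraction

def emulate_misses(note_ticks: list, misses_per_beat: float,
                   beats: float, avg_error_ticks: float,
                   rng: random.Random) -> list:
    """Generate incorrectly timed events at the reported miss rate,
    displacing each chosen note from its written time by the average
    timing error (displacement direction chosen at random)."""
    n_misses = min(len(note_ticks), round(beats * misses_per_beat))
    chosen = rng.sample(note_ticks, n_misses)
    return [t + rng.choice([-1, 1]) * avg_error_ticks for t in chosen]
```

For instance, with a reported rate of 0.5 misses per beat over 4 beats and an average error of 50 ticks, the sketch generates two events, each displaced 50 ticks from a written note, approximating the remote player's errors without reproducing them exactly.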
In this embodiment, the analyzer module 580, 580′ may extract musical parameters from the input and transmit them over a network 550 to a remote gaming platform. For example, the analyzer module 580, 580′ may simply transmit the input stream over a network 550 or it may extract the information into a more abstract form, such as “faster” or “lower.” Although described in the context of a two-player game, the technique may be used with any number of players.
Still referring to
In other embodiments, the transmitted data is associated with a flag that indicates whether the transmitted data represents a successfully executed musical event or an unsuccessfully executed musical event. In these embodiments, the analyzer 580, 580′ provides a locally-generated emulation parameter to the event monitor 585, 585′ based on the flag associated with the transmitted data.
One unusual side effect of these techniques is that each local player does not hear an exact reproduction of the remote players' performances; only a statistical approximation. However, these statistical approximations have two countervailing positive attributes: because they are synchronized to the local player's timer and the local copy of the remote players' note data, they are synchronous with the local player's performance; and while not exact reproductions, they are “close enough” to effectively communicate to the local player the essence of how well the remote players are performing musically. In this model, delays in the transmission of the data over the network 550 do not have the intolerable side effect of causing cacophonous asynchronicity between the note streams triggering sounds on each player's local system.
Referring now to
In some embodiments, multiple players participate in an online face-off between two bands. A “band” is two or more players that play in a cooperative mode. In some embodiments, the two bands need to have the same types of instruments at the same difficulty level selection, i.e., a guitarist playing on “hard” and a bassist playing on “medium” playing against a guitarist playing on “hard” and a bassist playing on “medium.” In other embodiments, the two bands still need to have the same types of instruments but the difficulty selections can be different: Players participating at a lower difficulty level simply have fewer gems to contribute to the overall score. The song to be played may be selected after the teams have been paired up. Alternatively, a band may publish a challenge to play a particular song and a team may accept the challenge.
Referring back to
In some particular embodiments, members of cooperating bands may be local to one another or remote from one another. Similarly, members of competing bands may be local to one another or remote from one another. In an extreme example, each player is remote from every other player.
In some embodiments, players may form persistent bands. In these embodiments, those bands may only compete when at least a majority of the band is available online. In some of these embodiments, if a member of a persistent band is not online, and the other band members want to compete, a gaming platform may substitute for the missing band member. Alternatively, a player unaffiliated with the band may substitute for the missing band member. In still other embodiments, a stream of emulation parameters stored during a previous performance by the missing band member may be substituted for the player.
In other embodiments, an online venue may be provided allowing players to form impromptu bands. Impromptu bands may dissolve quickly or they may become persistent bands.
The present invention (including, without limitation, the timer 340 and the event monitor 320) may be provided as one or more computer-readable programs embodied on or in one or more articles of manufacture. The article of manufacture may be a floppy disk, a hard disk, a CD-ROM, a flash memory card, a PROM, a RAM, a ROM, or a magnetic tape. In general, the computer-readable programs may be implemented in any programming language, such as LISP, PERL, C, C++, or PROLOG, or in any byte code language such as JAVA. The software programs may be stored on or in one or more articles of manufacture as object code.
Having described certain embodiments of the invention, it will now become apparent to one of skill in the art that other embodiments incorporating the concepts of the invention may be used. Although the described embodiments relate to the field of rhythm-action games, the principles of the invention can extend to other areas that involve musical collaboration or competition by two or more users connected to a network. Therefore, the invention should not be limited to certain embodiments, but rather should be limited only by the spirit and scope of the following claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US3897711 *||Feb 20, 1974||Aug 5, 1975||Harvey Brewster Elledge||Music training device|
|US5270475 *||Mar 4, 1991||Dec 14, 1993||Lyrrus, Inc.||Electronic music system|
|US5739457 *||Sep 26, 1996||Apr 14, 1998||Devecka; John R.||Method and apparatus for simulating a jam session and instructing a user in how to play the drums|
|US6009457 *||Apr 22, 1996||Dec 28, 1999||Rocket Network, Inc.||Distributed real-time communications system|
|US6011212 *||Jan 27, 1997||Jan 4, 2000||Harmonix Music Systems, Inc.||Real-time music creation|
|US6031174 *||Sep 23, 1998||Feb 29, 2000||Yamaha Corporation||Generation of musical tone signals by the phrase|
|US6149523||Feb 28, 1997||Nov 21, 2000||Namco Ltd.||Image synthesis method, games machine and information storage medium with sequence checking|
|US6212571 *||Mar 4, 1997||Apr 3, 2001||Nec Corporation||Server|
|US6225547 *||Oct 28, 1999||May 1, 2001||Konami Co., Ltd.||Rhythm game apparatus, rhythm game method, computer-readable storage medium and instrumental device|
|US6253228 *||Mar 31, 1997||Jun 26, 2001||Apple Computer, Inc.||Method and apparatus for updating and synchronizing information between a client and a server|
|US6342665||Feb 14, 2000||Jan 29, 2002||Konami Co., Ltd.||Music game system, staging instructions synchronizing control method for same, and readable recording medium recorded with staging instructions synchronizing control program for same|
|US6343315 *||May 12, 1999||Jan 29, 2002||Lodgenet Entertainment Corporation||Entertainment/Information system having disparate interactive devices|
|US6353169 *||Apr 25, 2000||Mar 5, 2002||Gibson Guitar Corp.||Universal audio communications and control system and method|
|US6353174 *||Dec 10, 1999||Mar 5, 2002||Harmonix Music Systems, Inc.||Method and apparatus for facilitating group musical interaction over a network|
|US6482087||May 14, 2001||Nov 19, 2002||Harmonix Music Systems, Inc.||Method and apparatus for facilitating group musical interaction over a network|
|US6541692||Jun 28, 2001||Apr 1, 2003||Allan Miller||Dynamically adjustable network enabled method for playing along with music|
|US6645067 *||Feb 10, 2000||Nov 11, 2003||Konami Co., Ltd.||Music staging device apparatus, music staging game method, and readable storage medium|
|US6726567 *||Dec 17, 1999||Apr 27, 2004||Vinod Khosla||Simulated real time game play with live event|
|US6897779 *||Feb 22, 2002||May 24, 2005||Yamaha Corporation||Tone generation controlling system|
|US6979767 *||Dec 18, 2002||Dec 27, 2005||Medialab Solutions Llc||Systems and methods for creating, modifying, interacting with and playing musical compositions|
|US6985966 *||Mar 29, 2000||Jan 10, 2006||Microsoft Corporation||Resynchronizing globally unsynchronized multimedia streams|
|US7044857 *||Oct 15, 2002||May 16, 2006||Klitsner Industrial Design, Llc||Hand-held musical game|
|US7074999 *||Jan 29, 2003||Jul 11, 2006||Sitrick David H||Electronic image visualization system and management and communication methodologies|
|US7145070 *||Jul 14, 2003||Dec 5, 2006||Thurdis Developments Limited||Digital musical instrument system|
|US7151214 *||Apr 9, 2001||Dec 19, 2006||Thurdis Developments Limited||Interactive multimedia apparatus|
|US7164076 *||May 14, 2004||Jan 16, 2007||Konami Digital Entertainment||System and method for synchronizing a live musical performance with a reference performance|
|US7206811 *||Mar 13, 2003||Apr 17, 2007||Oracle International Corp.||System and method for facilitating real-time collaborating by collapsing a queue for a slow client|
|US7277958 *||Mar 12, 2002||Oct 2, 2007||Edgestream, Inc.||Re-assembly of streaming files from separate connections|
|US7334024 *||Feb 10, 2005||Feb 19, 2008||Cyberfone Technologies, Inc||System for transmission of voice and data over the same communications line|
|US7346698 *||Dec 20, 2000||Mar 18, 2008||G. W. Hannaway & Associates||Webcasting method and system for time-based synchronization of multiple, independent media streams|
|US7390954 *||Oct 19, 2005||Jun 24, 2008||Yamaha Corporation||Electronic musical apparatus system, server-side electronic musical apparatus and client-side electronic musical apparatus|
|US7391791 *||Dec 17, 2002||Jun 24, 2008||Implicit Networks, Inc.||Method and system for synchronization of content rendering|
|US7405355 *||Dec 10, 2004||Jul 29, 2008||Music Path Inc.||System and method for video assisted music instrument collaboration over distance|
|US7467184 *||Jul 7, 2003||Dec 16, 2008||Canon Kabushiki Kaisha||Method and device for data processing in a communication network|
|US20020007723 *||Mar 19, 2001||Jan 24, 2002||Ludwig Lester F.||Processing and generation of control signals for real-time control of music signal processing, mixing, video, and lighting|
|US20020045484 *||Sep 18, 2001||Apr 18, 2002||Eck Charles P.||Video game distribution network|
|US20020128736 *||Dec 8, 2000||Sep 12, 2002||Hirotada Yoshida||Game machine|
|US20020169014 *||May 14, 2001||Nov 14, 2002||Eran Egozy||Method and apparatus for facilitating group musical interaction over a network|
|US20040154461||Feb 7, 2003||Aug 12, 2004||Nokia Corporation||Methods and apparatus providing group playing ability for creating a shared sound environment with MIDI-enabled mobile stations|
|US20050214728 *||Mar 3, 2005||Sep 29, 2005||Yamaha Corporation||Data delivery apparatus and method, and terminal apparatus|
|US20070000374 *||Jun 30, 2005||Jan 4, 2007||Body Harp Interactive Corporation||Free-space human interface for interactive music, full-body musical instrument, and immersive media controller|
|US20070140510 *||Oct 11, 2006||Jun 21, 2007||Ejamming, Inc.||Method and apparatus for remote real time collaborative acoustic performance and recording thereof|
|US20070163427 *||Dec 19, 2005||Jul 19, 2007||Alex Rigopulos||Systems and methods for generating video game content|
|US20070191401 *||Feb 20, 2007||Aug 16, 2007||Rainer Albert||Indolylmaleimide derivatives|
|WO1998014898A2||Sep 19, 1997||Apr 9, 1998||Philips Electronics Nv||Latency effect in multi-player video game reduced by surrogate agent|
|WO2005113096A1||May 3, 2005||Dec 1, 2005||Eran B Egozy||Vocal training system and method with flexible performance evaluation criteria|
|1||International Search Report for International Application No. PCT/US2007/084753, Date of Mailing May 20, 2008 (3 pages).|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7902446||Feb 20, 2009||Mar 8, 2011||Oem, Incorporated||System for learning and mixing music|
|US7923620 *||May 29, 2009||Apr 12, 2011||Harmonix Music Systems, Inc.||Practice mode for multiple musical parts|
|US7982114||May 29, 2009||Jul 19, 2011||Harmonix Music Systems, Inc.||Displaying an input at multiple octaves|
|US8017854||May 29, 2009||Sep 13, 2011||Harmonix Music Systems, Inc.||Dynamic musical part determination|
|US8026435||May 29, 2009||Sep 27, 2011||Harmonix Music Systems, Inc.||Selectively displaying song lyrics|
|US8044289 *||Mar 8, 2010||Oct 25, 2011||Samsung Electronics Co., Ltd||Electronic music on hand portable and communication enabled devices|
|US8076564||May 29, 2009||Dec 13, 2011||Harmonix Music Systems, Inc.||Scoring a musical performance after a period of ambiguity|
|US8080722 *||May 29, 2009||Dec 20, 2011||Harmonix Music Systems, Inc.||Preventing an unintentional deploy of a bonus in a video game|
|US8119896 *||Oct 12, 2010||Feb 21, 2012||Smith L Gabriel||Media system and method of progressive musical instruction|
|US8207438||Feb 8, 2011||Jun 26, 2012||Jammit, Inc.||System for learning an isolated instrument audio track from an original, multi-track recording|
|US8278543||Feb 8, 2011||Oct 2, 2012||Jammit, Inc.||Method of providing musicians with an opportunity to learn an isolated track from an original, multi-track recording|
|US8278544||Feb 8, 2011||Oct 2, 2012||Jammit, Inc.||Method of learning an isolated instrument audio track from an original, multi-track work|
|US8283545||May 23, 2011||Oct 9, 2012||Jammit, Inc.||System for learning an isolated instrument audio track from an original, multi-track recording through variable gain control|
|US8317614 *||Apr 15, 2008||Nov 27, 2012||Activision Publishing, Inc.||System and method for playing a music video game with a drum system game controller|
|US8319084||May 25, 2011||Nov 27, 2012||Jammit, Inc.||Method of studying an isolated audio track from an original, multi-track recording using variable gain control|
|US8367923||May 23, 2011||Feb 5, 2013||Jammit, Inc.||System for separating and mixing audio tracks within an original, multi-track recording|
|US8439733||Jun 16, 2008||May 14, 2013||Harmonix Music Systems, Inc.||Systems and methods for reinstating a player within a rhythm-action game|
|US8444464||Sep 30, 2011||May 21, 2013||Harmonix Music Systems, Inc.||Prompting a player of a dance game|
|US8444486||Oct 20, 2009||May 21, 2013||Harmonix Music Systems, Inc.||Systems and methods for indicating input actions in a rhythm-action game|
|US8449360||May 29, 2009||May 28, 2013||Harmonix Music Systems, Inc.||Displaying song lyrics and vocal cues|
|US8465366||May 29, 2009||Jun 18, 2013||Harmonix Music Systems, Inc.||Biasing a musical performance input to a part|
|US8476517||May 25, 2011||Jul 2, 2013||Jammit, Inc.||Variable timing reference methods of separating and mixing audio tracks from original, musical works|
|US8481838 *||Jan 17, 2012||Jul 9, 2013||Guitar Apprentice, Inc.||Media system and method of progressive musical instruction based on user proficiency|
|US8550908||Mar 16, 2011||Oct 8, 2013||Harmonix Music Systems, Inc.||Simulating musical instruments|
|US8562403||Jun 10, 2011||Oct 22, 2013||Harmonix Music Systems, Inc.||Prompting a player of a dance game|
|US8568234||Mar 16, 2011||Oct 29, 2013||Harmonix Music Systems, Inc.||Simulating musical instruments|
|US8586849||Jul 19, 2012||Nov 19, 2013||L. Gabriel Smith||Media system and method of progressive instruction in the playing of a guitar based on user proficiency|
|US8636572||Mar 16, 2011||Jan 28, 2014||Harmonix Music Systems, Inc.||Simulating musical instruments|
|US8663013||Jul 8, 2009||Mar 4, 2014||Harmonix Music Systems, Inc.||Systems and methods for simulating a rock band experience|
|US8678895||Jun 16, 2008||Mar 25, 2014||Harmonix Music Systems, Inc.||Systems and methods for online band matching in a rhythm action game|
|US8678896||Sep 14, 2009||Mar 25, 2014||Harmonix Music Systems, Inc.||Systems and methods for asynchronous band interaction in a rhythm action game|
|US8686269||Oct 31, 2008||Apr 1, 2014||Harmonix Music Systems, Inc.||Providing realistic interaction to a player of a music-based video game|
|US8690670||Jun 16, 2008||Apr 8, 2014||Harmonix Music Systems, Inc.||Systems and methods for simulating a rock band experience|
|US8702485||Nov 5, 2010||Apr 22, 2014||Harmonix Music Systems, Inc.||Dance game and tutorial|
|US8721441 *||Jan 15, 2008||May 13, 2014||Activision Publishing, Inc.||Competitive music video game with instrument simulation|
|US8777747||Sep 14, 2012||Jul 15, 2014||Activision Publishing, Inc.||System and method for playing a music video game with a drum system game controller|
|US8835736 *||Sep 11, 2012||Sep 16, 2014||Ubisoft Entertainment||Instrument game system and method|
|US8847053||Oct 14, 2011||Sep 30, 2014||Jammit, Inc.||Dynamic point referencing of an audiovisual performance for an accurate and precise selection and controlled cycling of portions of the performance|
|US8874243||Mar 16, 2011||Oct 28, 2014||Harmonix Music Systems, Inc.||Simulating musical instruments|
|US8907193||Mar 14, 2011||Dec 9, 2014||Ubisoft Entertainment||Instrument game system and method|
|US8986090||Mar 2, 2012||Mar 24, 2015||Ubisoft Entertainment||Interactive guitar game designed for learning to play the guitar|
|US8989521 *||Nov 23, 2011||Mar 24, 2015||Google Inc.||Determination of dance steps based on media content|
|US9024166||Sep 9, 2010||May 5, 2015||Harmonix Music Systems, Inc.||Preventing subtractive track separation|
|US9120016||Nov 20, 2009||Sep 1, 2015||Ubisoft Entertainment||Interactive guitar game designed for learning to play the guitar|
|US9132348||Sep 12, 2012||Sep 15, 2015||Ubisoft Entertainment||Instrument game system and method|
|US20080200224 *||Oct 1, 2007||Aug 21, 2008||Gametank Inc.||Instrument Game System and Method|
|US20090258686 *||Apr 15, 2008||Oct 15, 2009||Mccauley Jack J||System and method for playing a music video game with a drum system game controller|
|US20120057842 *||Aug 29, 2011||Mar 8, 2012||Dan Caligor||Method and Apparatus for Remote Voice-Over or Music Production and Management|
|US20120071238 *||Sep 20, 2010||Mar 22, 2012||Karthik Bala||Music game software and input device utilizing a video player|
|US20130036897 *||Feb 14, 2013||Ubisoft Entertainment S.A.||Instrument game system and method|
|US20140033900 *||Jul 31, 2012||Feb 6, 2014||Fender Musical Instruments Corporation||System and Method for Connecting and Controlling Musical Related Instruments Over Communication Network|
|U.S. Classification||463/42, 709/203, 84/645, 709/231, 84/609, 463/35, 84/667, 709/217, 463/31, 463/7|
|International Classification||G06F19/00, G06F17/00, A63F9/24, A63F13/00|
|Cooperative Classification||A63F2300/8047, A63F2300/1062, A63F2300/303, A63F2300/534, A63F2300/305, A63F13/12, A63F2300/638|
|Dec 3, 2008||AS||Assignment|
|Jan 11, 2011||AS||Assignment|
Owner name: COLBECK PARTNERS II, LLC, AS ADMINISTRATIVE AGENT,
Free format text: SECURITY AGREEMENT;ASSIGNORS:HARMONIX MUSIC SYSTEMS, INC.;HARMONIX PROMOTIONS & EVENTS INC.;HARMONIX MARKETING INC.;REEL/FRAME:025764/0656
Effective date: 20110104
|Jan 20, 2014||FPAY||Fee payment|
Year of fee payment: 4