US 20060183545 A1
An entertainment device comprising a housing or base unit that supports a display surface is disclosed. A display generating device displays visual images on the display surface. The display generating device may be a projection-type display device or a display panel device. A touch/proximity sensing device detects positions of a user appendage on the display surface in the course of a game or activity. In addition, users may interact with the entertainment device with an input controller device that comprises buttons, directional pad devices, etc. A control unit, connected to the touch/proximity sensing device and to the display generating device, is responsive to signals from the at least one input controller and/or position detections from the touch/proximity sensing device to, among other operations, alter the visual image displayed on the display surface.
1. An entertainment device comprising:
a display surface;
a display generating device operable to display a visual image on the display surface;
a touch/proximity sensing device that detects positions of a user appendage in proximity to the display surface;
a control unit operably coupled to the touch/proximity sensing device and to the display generating device; and
at least one input controller operably coupled to the control unit;
wherein the control unit is responsive to signals from the at least one input controller and position detections from the touch/proximity sensing device to generate and supply data to the display generating device for displaying visual images on the display surface.
2. The entertainment device of
3. The entertainment device of
4. The entertainment device of
5. The entertainment device of
6. The entertainment device of
7. The entertainment device of
8. The entertainment device of
9. The entertainment device of
10. The entertainment device of
11. The entertainment device of
12. The entertainment device of
13. The entertainment device of
14. The entertainment device of
15. The entertainment device of
16. The entertainment device of
17. An entertainment device comprising:
a base unit;
a display surface on said base unit;
a plurality of player positions on said base unit circumscribing said display surface;
a display generating device operable to display a visual image on said display surface;
a touch/proximity sensing device that detects positions of a user appendage on said display surface;
a control unit operably coupled to the touch/proximity sensing device and to the display generating device;
an input controller at each player position that is operably coupled to the control unit; and
wherein the control unit is responsive to signals from the controller and signals representing user appendage positions detected by the touch/proximity sensing device to alter data supplied to said display generating device.
18. The entertainment device of
19. The entertainment device of
20. The entertainment device of
21. The entertainment device of
22. An entertainment device comprising:
a housing having multiple player positions;
means for generating a visual image;
means on said housing for displaying said visual image on an image display surface;
means for detecting positions of a user appendage on said image display surface;
an input controller at each player position for receiving user input; and
control means coupled to said means for generating a visual image, said means for detecting positions of a user appendage on said image display surface, and said input controller, said control means operable to alter data supplied to the means for generating a visual image in response to signals from said input controller and signals from said means for detecting positions of a user appendage on said image display surface.
23. A method for generating entertainment in a multi-user device, comprising the steps of:
(a) displaying visual images on a surface around which there are a plurality of user positions;
(b) monitoring touch or proximity positions of a user's appendage on said surface;
(c) receiving input from a user via a user input device at one of said plurality of user positions; and
(d) controlling said step (a) based upon detected touch or proximity positions of said user's appendage on said surface and input received from a user via said user input device.
24. The method of
25. The method of
26. The method of
27. The method of
28. The method of
29. An entertainment device comprising:
a touch or proximity-sensitive display surface;
a plurality of user positions around said display surface;
a display generating device operable to display a visual image on said display surface;
a control unit operably coupled to said display generating device, wherein said control unit executes a program to supply data to said display generating device to generate visual images for display on said display surface according to said program, and wherein said control unit supplies said data to said display generating device to change the orientation of said visual images for viewing by a user at a particular user position.
30. The entertainment device of
31. The entertainment device of
32. The entertainment device of
33. A method for generating entertainment in a multi-user entertainment device, comprising:
(a) displaying visual images on a touch or proximity-sensitive surface around which there are a plurality of user positions;
(b) controlling said step (a) to rotate said visual images on said touch or proximity-sensitive surface from one user position to another user position.
34. The method of
35. An entertainment device comprising:
a base unit that supports a display surface;
a display generating device operable to display a visual image on said display surface;
at least first and second support members that attach to said base unit and are operable to move between a first position and a second position with respect to said base unit;
first and second connectors attached to distal ends of said first and second support members that removably hold said display generating device above said display surface when said support members are in said first position; and
a cover member that fits over said display surface of said base unit and having a storage recess to store said display generating device therein when removed from said first and second connectors, said cover member having a body with surface portions that mate with corresponding surfaces of said first and second connectors when said first and second members are in said second position.
36. The entertainment device of
37. The entertainment device of
38. The entertainment device of
This application claims priority to U.S. Provisional Application No. 60/625,108, filed Nov. 5, 2004, the entirety of which is incorporated herein by reference.
The present invention relates to a multi-user entertainment device that displays visual images and generates audio in coordinated response to touch and other user interaction.
Interactive electronic game devices have evolved to somewhat complex systems that provide audio and visual output in a variety of forms. These devices can be useful to provide entertainment to children as well as serving as a learning tool.
Many entertainment devices of this type heretofore known have limited, if any, multi-user or multiplayer capability. In addition, these games are not flexible in terms of the type of user input devices that can be used. Very often they are limited to unique controllers that operate only with a particular game device. Many prior art devices also use outdated analog technologies and do not take advantage of the many types of data or content available in digital format. Moreover, many interactive game devices require the use of physical game pieces, which are easily lost or misplaced, in combination with displayed images.
It would be desirable to provide an interactive entertainment device that is fully digital and embodied in a flexible hardware platform that can bring an endless variety of environments and experiences in a way not heretofore known.
Briefly, an entertainment device is provided comprising a housing or base unit that supports a display surface. A display generating device displays visual images on the display surface. The display generating device may be a projection-type display device or a display panel device. A touch/proximity sensing device detects positions of a user's appendage on the display surface in the course of a game or activity. Users may also interact with the entertainment device with at least one input controller device that comprises buttons, directional pad devices, etc. A control unit is responsive to signals from the at least one input controller and/or position detections from the touch/proximity sensing device to, among other operations, alter the visual image displayed on the display surface and/or generate accompanying audio in the form of game sounds, music, etc.
Users may interact with the entertainment device at each of a plurality of user positions that are located around the display surface. When a transition in an activity is made from one user position to another, the control unit controls the display generating device to rotate the displayed visual images so that they are properly aligned with the other user position. In addition, the control unit adjusts how it interprets touch position detections made by the touch/proximity sensing device during such a transition from one user position to another.
The Entertainment Device Generally
In accordance with the present invention, a game/entertainment/creativity device is disclosed. The game system may include a display generating device that displays visual images on a display surface. The display generating device may be a projection-type display device or a display panel device. A touch/proximity sensing device detects positions of a user appendage on the display surface in the course of a game or activity. Users may also interact with the entertainment device with at least one input controller device that comprises buttons, directional pad devices, etc. A control unit is responsive to signals from the at least one input controller and/or position detections from the touch/proximity sensing device to, among other operations, alter the visual image displayed on the display surface and/or generate accompanying audio.
The housing or base unit 1100 of the entertainment device 1000 of the present invention may also include one or more input controllers, two of which are shown at reference numerals 1120(1) and 1120(2). In the embodiment of the entertainment device 1000 of the present invention illustrated in
The input controllers 1120(1), 1120(2), 1120(3) may comprise a conventional video game controller. Additionally, the input controllers 1120(1), 1120(2), 1120(3) may comprise any type of user-manipulable electronic input device (e.g., a directional pad, steering wheel, joystick, touchpad, dancepad, or motion-sensitive implement (an implement such as a bat, racquet, paddle, etc. housing a motion sensor)) without departing from the scope of the present invention. For example, the input controllers 1120(1), 1120(2), 1120(3) may include a directional pad 1120 a(1), 1120 a(2) and several buttons 1120 b(1), 1120 b(2), respectively. One or more of the bays 1150 may be “universal” in that it may also accept and operably connect to other accessory devices such as microphones, electric musical instruments (e.g., keyboard), and other user-manipulable electronic input devices such as dancepads, joysticks, steering wheels, etc.
The entertainment device 1000 in accordance with an embodiment of the present invention may also include a display portion 1130. The display portion 1130 of this embodiment includes a display housing 1180 with a display generating device 1175 housed therein. As shown in
As shown in
There is also a port 1190 on the housing 1100 at one of the sides 1102, 1104, 1106 and 1108 that receives a program cartridge 1192 that contains computer or microprocessor instructions for one or more games or activities for one or more players of the device 1000 (in the embodiment illustrated in
The housing 1100 of the entertainment device 1000 of the present invention includes indentations or grooves 1140 configured to permit the support portions 1132, 1134, 1136, and 1138 to be moved/adjusted from the projection position to the storage/folded position. Furthermore, the housing 1100 of the entertainment device 1000 of the present invention may incorporate an audio output generating device (e.g., a speaker or speakers (for stereo sound)). Finally, the housing 1100 of the entertainment device 1000 of the present invention may incorporate a removable media storage/playback unit (e.g., a CD, DVD, ROM cartridge—not illustrated) operably coupled to the control unit. The removable media storage/playback unit may be configured to provide additional game/entertainment content to the control unit and the display generating device 1175.
The Electrical Systems
The electrical system of the device 1000 of the present invention may include a projection sub-system 3400 that includes the display generating device 1175 (
In order to generate audio output for the device 1000, the electrical system includes a stereo coder/decoder (CODEC) 3500 that is connected to the system controller 3000. The CODEC 3500 is responsive to commands and data received from the system controller 3000 to produce sound in the form of music, speech, or other sound that is synchronized to the data representing the displayed visual images produced by the display generating device 1175. Audio output may be produced by left and right speakers 3510 and 3520, as well as headphone ports 3530 and 3540 connected to the CODEC 3500.
The electrical system of the device 1000 may also include a program cartridge interface 3600 that communicates data stored on a ROM cartridge 1192 to the system controller 3000. In addition, the electrical system of the device 1000 may utilize a memory card interface 3700 connected to the system controller 3000. The memory card interface 3700 may support one or more of a variety of memory card formats, including Multimedia™ memory card, Smartmedia™, and Compactflash™.
The accessory block shown at reference numeral 3650 may include controllers/interfaces for devices such as an audio system, television, compact disk (CD) or digital video disk (DVD), or other accessory devices such as musical instruments (e.g., keyboard), optical devices (cameras), and other user-manipulable electronic input devices such as dancepads, joysticks, steering wheels, etc.
As illustrated in
The system controller 3000 coordinates displayed image data with the positions of a player's hand or finger on the touch-sensitive surface as gathered by the touch surface controller 3200. In so doing, the system controller 3000, based on instructions contained in a particular ROM cartridge 1192, will generate image display output and/or audio output, and change its interaction to another player using the device 1000.
Display Generating Device
Although an LCD projector-type system is described above for the display generating device 1175, any type of display generating system may be used without departing from the spirit and scope of the present invention. For example, a rear projection system could be utilized. Furthermore, display generating systems such as an LCD panel, plasma display panel, or a digital light processing (DLP) device could be utilized to perform the function of the display generating device 1175. Still other image generating technologies that are useful in connection with the device 1000 are a high temperature polysilicon panel (HTPS) and a MEMS reflective display device.
Regardless of the type of display generating device 1175 used, the system controller 3000 (
In addition, a projection type display generating device may be rotated from its normal projection position so as to project images onto a wall or other surface, rather than onto the display surface 1110. This feature may be useful in the event a user wishes to view images or watch a video presentation on a DVD, CD, etc.
In accordance with one embodiment of the present invention, the touch-sensitive display surface 1110 may include a vellum projection screen. More specifically, the vellum projection screen may be a vacuum-metallized vellum screen with a mirrored back portion for improved reflectivity. As referenced above, the touch-sensitive display surface 1110 need not include a projection-type screen (a projector and a separate screen), and may comprise additional appropriate integrated touch-sensitive display surfaces such as an LCD or plasma touch panel display, as described hereinafter in connection with
Touch-Sensitive Surface Sub-System
As shown in
The plastic spacer 515, which forms the recess surface 130, may be approximately 0.080″ thick and is placed on top of the array 142 to act as an insulator so that a touch surface of a sensor is separated from the matrix 142 by at least this amount. The spacer 515 may be styrene or ABS with a dielectric constant between about 2 and 3, although the thickness and dielectric constant can be adjusted to achieve the desired sensitivity. The function of the spacer 515 is to provide a stable response from the matrix 142 (when touched by finger/appendage 505). The width and thickness of the column traces 248 (vertical columns) and row traces 246 (horizontal rows) should be kept to a minimum at the cross-points to reduce the capacitive effect at each of the cross-points, but are preferably increased between the cross-points and around the cross-points, for example, by widening the individual row and column traces into four-pointed stars or diagonal squares or the like around and between the cross-point locations.
The conductive plane 510 is spaced approximately one-quarter inch (5 mm) below the matrix 142. The conductive plane 510 provides shielding for the matrix 142 and as a result, affects the area sensed around each cross-point in the matrix 142.
Referring back to
Generally, baseline or reference values of signals generated by the sensor matrix 142 are read and stored without human interaction with the arrays to obtain a reference value for each cross-point. The reference value of each cross-point sensor is individually determined and updated. Preferably, each is a running “average” of successive scan values (e.g., approximately sixteen) for the cross-point. Successive scans are compared to the reference values to determine the proximity of a human finger or other extremity. Data may be accumulated starting at zero when the device 1000 is powered on.
Operation of the touch surface sensing circuitry 3100 is as follows (and is illustrated in
After the initial values from the sensor array 142 are stored, the sensor array 142 is cyclically and continually scanned, and the results for each cross-point sensor are compared with the stored reference values, which are themselves cyclically and continuously updated. If any individual cross-point sensor value has a differential from its reference value that is greater than a predetermined or threshold amount (“threshold”), the touch surface controller 3200 will mark the point as “touched” or “selected”. A fixed threshold is established for the device 1000 by characterizing the device 1000 during manufacture. For the circuitry, materials and structure described, it has been found that with an applied 3300 millivolt, 250 kHz square wave signal, individual cross-point sensors of the sensor array 142 output signals of about 2200 millivolts±400 millivolts without user interaction. Deflection of the signal (i.e. a drop in detected signal strength) at each cross-point sensor location for user contacts ranging between that of a large adult directly touching the surface to a small child touching the surface ranges from about 1600 millivolts in the first case to only about 200-300 millivolts in the second case. The threshold may be set as close as possible to the smallest expected user-generated deflection. In the device 1000 being described, the threshold is set for less than 200 millivolts, between about 190 and 200 millivolts, for each cross-point sensor. If the measured voltage value for the cross-point being sensed is less than the reference value in memory by an amount equal to or greater than the threshold amount, the point is considered touched and is “marked” as such by the touch surface controller 3200. If the difference is less than the threshold, the reference value is updated each 64-millisecond period (full scan time), resulting in a settling of the reference values after about one second.
After the sensor array 142 is scanned, cross-points that have been “marked” as touched for two scan cycles are considered valid and selected for further processing by a “best candidate” algorithm as will be described.
When the sensor array 142 is scanned, each cross-point data value is initially compared to a “High Limit” value. If the data value exceeds this High Limit value, it is ignored as a candidate for that scan and ignored for updating the reference value for that sensor. The purpose of the High Limit value is to prevent abnormally high data values from causing a cross-point sensor to appear permanently pressed.
As noted above, for each array scan, each time the data value associated with a cross-point sensor is read, it is compared against the reference value, which may be thought of and herein referred to as a “Running Average” associated with that cross-point sensor. If the data value is less than the Running Average minus the threshold, the cross-point sensor is considered “touched” for that scan. The threshold is the fixed data value mentioned above (i.e. 190 to 200 millivolts) that represents the minimum deflection which is expected to indicate that a cross-point sensor is considered touched.
If the data value does not indicate that the cross-point sensor is considered touched (that is, the data value is not less than the Running Average minus the threshold), then the data value is used to update the Running Average. Upon power-up of the device 1000, the Running Average for each point is set to zero. Each time the data value for a cross-point sensor is not greater than the High Limit, and not low enough to indicate that the cross-point sensor is touched, the data value is used to update the Running Average for that point. The formula used to compute the new Running Average is as follows:
With the above knowledge, the function of the High Limit algorithm can now be explained. The reference value/running average algorithm can be fooled by situations where high levels of interference exist and the cross-point sensor readings climb significantly. Without the High Limit cut-off, abnormally high data values (due to a continuous noise source) could eventually result in an abnormally high Running Average for a given cross-point sensor. Then, when the scanned data values return to their nominal value range, if the data values being scanned are low enough that they fall below the abnormally high Running Average minus the threshold, the cross-point sensor will be considered touched. This will result in newly scanned data values never being used in the calculation of the Running Average and therefore will not allow the Running Average to be lowered to its normal level, causing the cross-point sensor to appear permanently touched during the duration of use of device 1000. Consequently, the only sensor data which is used or stored is that data which is less than the High Limit. A High Limit value of 3100 millivolts (about fifty percent higher than the nominal voltage) may be appropriate.
The following describes a “Fast Recovery” algorithm. This algorithm compares the latest reading from a cross-point to the reference value or Running Average. If the latest reading is higher by more than the Fast Recovery Threshold, the reference value will be set equal to the latest reading. This algorithm counters a situation where the user “hovers” a finger over a point for an extended period of time, which artificially forces the reference value down. A quick release and touch of the same point in this situation may cause the system not to respond, because the differential between the reference value and latest reading is not more than the touch threshold value (threshold).
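The per-cross-point logic described above (threshold detection, Running Average update, High Limit guard, and Fast Recovery) can be sketched as follows. This is illustrative only: the constants mirror the values quoted in the text, but the averaging formula itself and the Fast Recovery Threshold value are assumptions (a 16-sample exponential moving average, consistent with the approximately sixteen scans mentioned earlier, and a 400 millivolt recovery threshold).

```python
TOUCH_THRESHOLD = 195  # millivolts; text gives between about 190 and 200
HIGH_LIMIT = 3100      # millivolts; about fifty percent above the nominal 2200 mV
FAST_RECOVERY = 400    # millivolts; hypothetical Fast Recovery Threshold (not given in text)
AVG_WINDOW = 16        # running average over approximately sixteen scans (assumed formula)

def update_cross_point(reading_mv, running_avg_mv):
    """Process one scan reading for a single cross-point sensor.

    Returns (touched, new_running_avg).
    """
    # High Limit guard: abnormally high readings are ignored entirely,
    # so a continuous noise source cannot inflate the reference value.
    if reading_mv > HIGH_LIMIT:
        return False, running_avg_mv

    # Fast Recovery: if the reading jumps well above the reference
    # (e.g., after a long "hover"), snap the reference to the reading.
    if reading_mv > running_avg_mv + FAST_RECOVERY:
        return False, reading_mv

    # Touch detection: a touch depresses the signal, so a reading more
    # than TOUCH_THRESHOLD below the reference marks the point touched.
    if reading_mv < running_avg_mv - TOUCH_THRESHOLD:
        return True, running_avg_mv  # reference is not updated while touched

    # Otherwise fold the reading into the Running Average.
    new_avg = running_avg_mv + (reading_mv - running_avg_mv) / AVG_WINDOW
    return False, new_avg
```

In this sketch the two-consecutive-scan debounce is applied by the caller, which would track how many successive scans each cross-point has reported touched.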
The previous section described in detail how each cross-point of the 256 cross-point sensor array 142 is determined to be activated (i.e. “touched” or “selected”) or not. During each scan, every cross-point sensor is considered to be activated/touched or not.
After each scan, the touched points are processed to identify a “best candidate”. Generally speaking, the best candidate is the cross-point sensor selected by the touch surface controller 3200 as the point most likely to have been intended by the user in touching the sensor. It is the touched point which is highest (most northern/Top), or the highest and most left (i.e. most northwestern/Top Left) if two potential candidates of equal height are activated on the sensor array 142. For convenience, these will be referred to collectively as simply “the most northwestern” point. Also, the cross-point sensor preferably must be “touched” for two consecutive 64 millisecond scans to be considered as the new most northwestern point of the sensor.
The touch surface controller 3200 first identifies a set of touched sensors. It next identifies those which have been touched for at least two consecutive 64 millisecond cycles. These are the new most northwestern candidate sensors. Once the best candidate has been chosen, its identification/location is communicated from the touch surface controller 3200 to the system controller 3000.
Once a new most northwestern point (cross-point sensor) has been chosen, a “Southern Lockout” algorithm takes effect for the sensor array 142. The Southern Lockout algorithm causes any point of the same array touched in subsequent scans below the new most northwestern point to be ignored until the earlier of (a) the expiration of one second while the new most northwestern point remains selected, or (b) the release of the new most northwestern point. After the lockout, all cross-points of the array become candidates for new most northwestern point. This algorithm covers the situation where the user rests the heel of the pointing hand on the array after finger-touching the array (as a young child may be prone to do).
A “Peak Search” algorithm may be employed after a new most northwestern point of the sensor array 142 is identified. The cross-point sensors immediately East (right), South (below) and Southeast (below right) of the new most northwestern point sensor are examined for touch, and the relative deflections of any touched sensors among the four are compared to one another. The one sensor of those up to four sensors having the greatest deflection (i.e. change from the reference value/Running Average) is selected as the “Best Candidate” and its identity/location/position is passed to the main (base unit) system controller 3000.
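The most-northwestern selection with the Peak Search refinement can be sketched as follows. This is illustrative only: a touched point is represented as a (row, col) pair on an assumed 16x16 grid, with rows increasing southward and columns increasing eastward, and `deflection` maps each touched point to its drop below the reference value in millivolts. The two-scan debounce and Southern Lockout are omitted for brevity.

```python
def best_candidate(touched, deflection):
    """Return the cross-point most likely intended by the user.

    touched:    set of (row, col) points marked touched this scan
    deflection: dict mapping (row, col) -> signal drop in millivolts
    """
    if not touched:
        return None

    # Most northwestern point: smallest row (most northern), then
    # smallest column (most western) as the tie-breaker.
    nw = min(touched, key=lambda rc: (rc[0], rc[1]))

    # Peak Search: compare the NW point against its East, South and
    # Southeast neighbors, keeping whichever has the greatest deflection.
    r, c = nw
    neighbors = [(r, c + 1), (r + 1, c), (r + 1, c + 1)]
    candidates = [nw] + [p for p in neighbors if p in touched]
    return max(candidates, key=lambda p: deflection[p])
```

As the text notes, this favors the fingertip (typically the most northern contact with the strongest local deflection) over the heel of the hand resting further south on the array.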
Each time a new best candidate is selected, its position is transferred by the touch surface sensor controller 3200 to the system controller 3000. The system controller 3000 would then decide how to use this information (interrupt current activity or not, use a neighbor cross-point sensor instead of the best candidate, etc.).
The device 1000 will determine if there are multiple hands placed on the touch surface. In the event that the system controller 3000 sees two hands placed on the sensor, it will look to see if either input has a clearly defined most northern point. If so, it will select this input as the best candidate. Instead of having to generate an audio output to direct the user to use “one finger at a time” or any other appropriate statement when the device 1000 cannot determine with reasonable accuracy the likely input, this technique can select a “best candidate” based on the above-described algorithm.
Other types of touch-sensitive surface or position detection technologies may be utilized without departing from the scope of the present invention. For example, analog resistive or capacitive touch panels may be used, as may digital camera CCD technology, so-called gesture recognition technology, or heat-sensitive, color-sensitive, pattern-sensing, object-sensing or any other contextual sensing technology based on electro-physical material properties, photo-reflective properties or photo-absorption properties.
Still another alternative is to use an LCD monitor with an integrated touch panel. This alternative embodiment is shown in
Examples of other types of touch or proximity sensing technologies that may be used with the entertainment device 1000 of the present invention include pressure-sensitive switch matrices such as a Mylar® switch matrix, proximity sensing antenna arrays and proximity sensing capacitive arrays. Some examples of additional appropriate touch-sensitive display surfaces are LCD or plasma touch panel displays.
Exemplary Games and Game System Operation
With general reference to
A procedure useful to adjust the orientations during a game or activity is shown at reference numeral 5000 in
“Select the Song You Wish to Hear During Your Next Turn
This re-orientation process is repeated when transitioning from Player 2 to any other player. It should be understood that if only 2 or 3 players are active in a particular game or activity, the system controller would know to re-orient the image display data and touch position responsiveness accordingly. Moreover, this re-orientation process can be applied to an entertainment device that has fewer than 4 or more than 4 player positions such that the re-orientation is not a simple 90 degree adjustment. This re-orientation process 5000 applies for a display generating device that is a display panel or monitor 4000 as well as an image projection system 1175. In the case of an image projection system 1175 (such as in
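The re-orientation of touch position responsiveness described above can be sketched as follows. This is illustrative only: it assumes a 16x16 sensor grid (256 cross-points, consistent with the sensor array described earlier) and a simple 90-degree clockwise rotation per player position; as the text notes, layouts with fewer or more than four player positions would need a different mapping, and the displayed image data would be rotated by the same amount.

```python
GRID = 16  # 16 x 16 = 256 cross-point sensors

def rotate_touch(row, col, quarter_turns):
    """Map a raw (row, col) touch into the active player's frame of
    reference, rotating 90 degrees clockwise per quarter turn."""
    for _ in range(quarter_turns % 4):
        # One clockwise quarter turn: (r, c) -> (c, GRID - 1 - r)
        row, col = col, (GRID - 1) - row
    return row, col
```

With this mapping the game logic can interpret touches in a single fixed orientation, while each player's position determines the number of quarter turns applied to both the display data and the incoming touch coordinates.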
The following are generic examples of games or activities that may be played on the entertainment device 1000 of the present invention. The instructions, scripts, and programs for these games or activities may be embodied in a removable memory cartridge device 1192, as described above. The games or activities are software programs containing digital data for animated characters accompanied by voice, music, and other graphical elements. The games/activities may involve sequential, interactive, narrative stories that contain puzzles, activities or games interwoven as challenges to provide a progressively rewarding experience for the players.
Digitally Animated Adventure Game
One type of game is an animated adventure game where one or more animated characters are displayed and the character(s) negotiate a variety of activities, such as an underwater amusement park. Each player may select a particular character and negotiate a simulated displayed game board, for example, collecting certain items in order to win the game. A player's character may progress on the displayed game board using an electronic or virtual roll of the dice, for example. In addition, when a player lands on a particular spot on the game board, the player may be prompted, through visual and audio stimulus, to engage in a particular activity in order to earn a particular item or “ticket” award that counts towards winning the game. A player may accumulate tickets in order to redeem them for certain animated or displayed items. A player may engage in these so-called mini-games or activities (including educational or learning activities) using the input controllers or the touch-sensitive display screen. These mini-games may be distributed randomly throughout the game board each time a new game is started.
Portions of the visual display proximate each player's position at the entertainment device may be dedicated to tracking each player's digital scorecard concerning their progress in the game. The scorecard may show a player's character and which items the player has collected.
There may be virtual animated “vendors” that appear randomly at different spots on the game board to “sell” certain items to players who have collected a sufficient number of tickets. The items that can be purchased may be used by a player during play of the game (e.g., a rolling bonus, a time bonus, etc.), while others may be used against opponents (lose a turn, etc.). The game may also include sudden appearance of certain animated characters that give bonus tickets to certain players, for example, or play special side games or activities.
Digitally Animated Adventure Tales
Another type of game may involve a digital book that combines a traditional storybook, a children's activity book, and web-type flash games. A player or user becomes part of the adventure, helping the animated characters complete certain challenges and reach their goals. Each so-called “page” of the storybook includes a full-screen combination of artwork, a story line, object identification, and animated “hot spots”. As the story is read to the user, or as an animated character speaks, the accompanying text will appear on-screen and highlight. Each phrase or sentence will highlight individually as those words are heard as voiceover. On certain pages, several objects are tagged as “identifiable hotspots.” When a child touches one of these objects, that word or phrase is said aloud. Certain areas and objects on the pages are tagged for special animations, so that if a child touches that area, the name of that object is said aloud, and an animation or other reward will be revealed. In addition, certain pages of the storybook may contain mini-games, activities, or challenges (including educational or learning activities) related to the storyline.
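The hotspot behavior described above (touching a tagged region causes its word to be spoken aloud and, for specially tagged regions, an animation or other reward to play) can be sketched as a simple dispatch from touch coordinates to tagged page regions. This is an illustrative sketch only, not the disclosed implementation; the `HotSpot` fields and the `speak`/`animate` callbacks are hypothetical stand-ins for the device's audio and display subsystems.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class HotSpot:
    # Axis-aligned bounding box on the display surface, in touch-sensor units
    x: int
    y: int
    w: int
    h: int
    word: str                          # word or phrase spoken when touched
    animation: Optional[str] = None    # optional reward animation to play

    def contains(self, tx, ty):
        return self.x <= tx < self.x + self.w and self.y <= ty < self.y + self.h

def handle_touch(page_hotspots, tx, ty, speak, animate):
    """Dispatch a touch at (tx, ty) to the first matching hotspot on the page."""
    for spot in page_hotspots:
        if spot.contains(tx, ty):
            speak(spot.word)
            if spot.animation:
                animate(spot.animation)
            return spot
    return None
```

Touches that land outside every tagged region simply produce no response, matching the page behavior described above.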
The game and activity examples described above highlight the necessity of re-orienting the displayed visual image according to which player is active in a game. For example, the animated characters may be intended for a particular player. Consequently, the device needs to keep track of players' turns in the game, and re-orient certain displayed visual images toward the player whose turn it currently is. Moreover, if the game calls for detecting a touch or proximity command from a player, the device also re-orients which positions on the sensor array it needs to respond to for the currently active player.
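The re-orientation described above amounts to a coordinate rotation: sensor-array positions (and, symmetrically, displayed images) are rotated in 90-degree steps according to which side of the device the active player occupies. The sketch below illustrates one way this mapping could work, assuming four players seated around the display; the seat-to-rotation table and function names are hypothetical and not taken from the disclosed embodiment.

```python
def rotate_point(x, y, width, height, quarter_turns):
    """Rotate a grid point by 90-degree clockwise steps within a width-by-height grid."""
    for _ in range(quarter_turns % 4):
        x, y = height - 1 - y, x       # one clockwise quarter turn
        width, height = height, width  # grid dimensions swap after each turn
    return x, y

# Hypothetical map from player seat (1-4 around the display) to rotation steps
PLAYER_ROTATION = {1: 0, 2: 1, 3: 2, 4: 3}

def sensor_to_game_coords(raw_x, raw_y, width, height, active_player):
    """Map a raw sensor-array position into the active player's frame of reference."""
    return rotate_point(raw_x, raw_y, width, height, PLAYER_ROTATION[active_player])
```

With such a mapping, the control unit can keep a single game-logic coordinate frame and simply rotate both the rendered image and the incoming touch positions whenever the active player changes.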
While the invention has been described in detail and with reference to specific embodiments thereof, it will be apparent to one skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope thereof. For example, the housing 1100 of the entertainment device 1000 of the present invention may include headphone jacks for a user's convenience. Additionally, the entertainment device 1000 of the present invention may include a gesture recognition system, including a camera and/or sensors to sense, model, and react to a user's hand motions, adding an extra dimension of interactivity to the entertainment device 1000. Also, the entertainment device 1000 of the present invention may include a rechargeable power source. The entertainment device 1000 of the present invention may also include night vision goggles, a magnifying glass, or a special optical device that would allow a user to reveal secret codes, cards, letters, or other information displayed in certain wavelengths of light on the touch-sensitive display surface 1110. Furthermore, the entertainment device 1000 of the present invention may include deluxe input controllers which include all of the features of the at least one input controller 1120(1), 1120(2), 1120(3), 1120(4) and also may include an onboard display screen displaying individual user messages (e.g., scrabble letters, hidden game clues, etc.). Also, the entertainment device 1000 of the present invention may include a memory unit (removable or non-removable) for storing game/player related information (such as high scores, etc.). Finally, the housing 1100 of the entertainment device 1000 of the present invention may include light sources to identify which user is in control of the entertainment device 1000 (i.e., which user's turn it is to control the entertainment device 1000).
Thus, it is intended that the present invention covers the modifications and variations of this invention provided they come within the scope of the appended claims and their equivalents.