|Publication number||US6809776 B1|
|Application number||US 09/403,536|
|Publication date||Oct 26, 2004|
|Filing date||Dec 29, 1997|
|Priority date||Apr 23, 1997|
|Also published as||US 09/403,536, PCT/US1997/024210, US 6809776 B1|
|Inventors||Theodore Frederick Simpson|
|Original Assignee||Thomson Licensing S.A.|
This application claims the benefit of Provisional Application No. 60/044,097, filed Apr. 23, 1997.
The present invention relates to the display of alphanumeric or graphic information by an image reproducing device such as a kinescope, and, in particular, to a system and method for controlling the video level of an image reproducing device depending on the region and the content of the information being displayed.
Electronic devices such as televisions and personal computers (PCs) typically require a video monitor and an associated video display controller/processor for displaying images so that information can be conveyed to the user. The present inventor has recognized, however, that differences in the nature and content of the material to be shown place disparate requirements on the display. For example, two types of material with different requirements are:
1. Text and line-oriented graphics: These require very sharp video edges, such as black-to-white and white-to-black transitions, and therefore, need a very well-focused, crisp display. These types of materials also tend to have a large percentage of their content displayed at full brightness. Black text on white background is typical, simulating print media on white paper. Since cathode-ray-tubes tend to have larger electron spots at high currents, causing “blooming” and poorer focus, these types of material should be displayed at lower beam currents. This is typical of the way that computer desktop monitors are operated.
2. Real-world photographic-style or television broadcast images: These tend to naturally have softer edges, and so do not have the same requirement for the very sharp focus that text and line graphics do. It is another characteristic of this material that it typically has a lower duty cycle of high-brightness content (i.e., most of the video is not at full brightness), but a few bright peaks are required to give the material a realistic “punch” and vividness. This is typical of the way that commercial television receivers are operated. To obtain that realism, the brightest peaks of this type of material should be displayed at a higher brightness level than the text and graphics described above.
In EP-A-0729273, a television set is disclosed having the capability to adjust a display parameter for either the main or a PIP image. Likewise, in EP-A-0569018, a television set is disclosed which independently adjusts a display parameter of a main display and a sub-display. Both of these disclosures, however, adjust the display parameter based on an instantaneous determination of the real-time signal being processed. In addition, neither of these references shows any recognition that the signal being processed may be from a source other than a television.
The present inventor has recognized that a problem arises when a display must exhibit both types of material, each of which has its own requirements. This type of setup is found, for example, in a device which combines the functions of a personal computer (PC) with a television. For example, a personal computer can be outfitted with a television tuner card, so that a user may watch a television program in one window provided by the computer's graphical operating system (e.g., Microsoft Windows 95®), while working on a spreadsheet application displayed in another window. Another example is a television receiver which is also being used as a PC monitor or has built-in PC capabilities for surfing the internet or performing other computing functions. Video drive level adjustments need to be made to optimize each type of material being displayed.
Another situation where a display must exhibit both types of material is, for example, when a television has an electronic program guide or some other text/graphical On Screen Display (OSD) information that needs to be conveyed to the users, along with real-world, television broadcast information. In addition, such a television may include internet access so that computer graphics and text can be downloaded from the internet for viewing on the television display. The computer graphics and text may be displayed, for example, in a subpicture of a television with a Picture-in-Picture (PIP) or Picture-out-of-Picture (POP) capability, while the main picture is showing the television broadcast channel. These different viewing materials may each require a different drive level to provide the proper picture on the same screen.
The inventor has recognized that drive level adjustment between two or more types of material may be made manually, if only one is shown on the display at a time. This might be done with the monitor's contrast control, for example. However, with systems such as those that combine the functionality of a PC and TV, it is common that more than one type of material will appear on the screen simultaneously. For this reason, it is desirable for the system to be able to adjust the video drives for each type of the display material in each region of the display screen automatically and independently, depending on the content of the material being displayed.
Accordingly, a video processing system and method is provided for presenting an image having more than one region on a display. The system has a control processor which can determine, for each region, the type (e.g., graphics, text, computer programs, broadcast TV video, etc.) of the image material being displayed in the region. The system also has a video driver coupled to the display for providing image signals having contrast and brightness characteristics. The control processor generates a control signal for causing the video driver to adjust the contrast and brightness characteristics of the image signals according to the type of material being displayed in the region.
Another aspect of the present invention provides for manual adjustment of the image characteristics for each of the regions independently, therefore, allowing a user to adjust the drive level for each region to suit his or her own taste.
The invention will be further explained by reference to the drawings in which:
FIG. 1 shows a block diagram of a television incorporating the principles of the present invention;
FIG. 2 shows the steps of an exemplary process to accomplish the present invention;
FIG. 3 is a block diagram of an exemplary PC which incorporates the principles of the present invention;
FIG. 4 is another exemplary process for implementing the present invention;
FIGS. 5A and 5B depict displays having different materials being shown in different regions in accordance with the present invention; and
FIG. 6 is an example of a subroutine which allows a user to adjust the drive level for each of the regions being displayed.
FIG. 1 shows a television receiver for receiving and processing both analog NTSC television signals and internet information. The system shown in FIG. 1 has a first input 1100 for receiving television signal RF_IN at RF frequencies and a second input 1102 for receiving baseband television signal VIDEO IN. Signal RF_IN may be supplied from a source such as an antenna or cable system, while signal VIDEO IN may be supplied, for example, by a video cassette recorder (VCR) or a gaming device (both not shown in FIG. 1). Tuner 1105 and IF processor 1130 operate in a conventional manner for tuning and demodulating a particular television signal that is included in signal RF_IN. IF processor 1130 produces baseband video signal VIDEO representing the video program portion of the tuned television signal. IF processor 1130 also produces a baseband audio signal that is coupled to an audio processing section (not shown in FIG. 1) for further audio processing. Although FIG. 1 shows input 1102 as a baseband signal, the television receiver could include a second tuner and IF processor similar to units 1105 and 1130 for producing a second baseband video signal from either signal RF_IN or from a second RF signal source.
The system shown in FIG. 1 also includes a main microprocessor (μP) 1110 for controlling components of the television receiver such as tuner 1105, picture-in-picture processing unit 1140, video signal processor 1155, and StarSight® data processing module 1160. As used herein, the term “microprocessor” represents various devices including, but not limited to, microprocessors, microcomputers, microcontrollers, control processors, and controllers. Microprocessor 1110 controls the system by sending and receiving both commands and data via serial data bus I2C BUS, which utilizes the well-known I2C serial data bus protocol. More specifically, central processing unit (CPU) 1112 within μP 1110 executes control programs contained within memory, such as EEPROM 1127 shown in FIG. 1, in response to commands provided by a user, e.g., via IR remote control 1125 and IR receiver 1122. For example, activation of a “CHANNEL UP” feature on remote control 1125 causes CPU 1112 to send a “change channel” command along with channel data to tuner 1105 via I2C BUS. As a result, tuner 1105 tunes the next channel in the channel scan list. The control program stored in EEPROM 1127 also includes software for implementing the operations shown in FIG. 2.
Main microprocessor 1110 also controls the operation of a communications interface unit 1113 for providing the capability to download and upload information from the internet. Communication interface unit 1113 includes, for example, a modem for connecting to an Internet service provider, e.g., via a telephone line or via a cable television line. The communication capability allows the system shown in FIG. 1 to provide email capability and internet related features such as web browsing in addition to receiving television programming.
CPU 1112 controls functions included within μP 1110 via bus 1119 within μP 1110. In particular, CPU 1112 controls auxiliary data processor 1115 and on-screen display (OSD) processor 1117. One function of the auxiliary data processor 1115 is to extract auxiliary data such as StarSight® data from video signal PIPV.
The StarSight® system is an Electronic Program Guide (EPG) provided by StarSight Telecast, Inc. An EPG is an interactive, on-screen equivalent to the TV listings found in local newspapers or other print media. The information contained in an EPG includes programming characteristics such as channel number, program title, start time, end time, elapsed time, time remaining, rating (if available), topic, theme, and a brief description of the program's content. Aspects of the StarSight® system are described in U.S. Pat. Nos. 5,353,121, 5,479,268, and 5,479,266, issued to Young et al. and assigned to StarSight Telecast, Inc.
StarSight® data is typically received only on a particular television channel, and the television receiver must tune that channel to extract StarSight® data. To prevent StarSight® data extraction from interfering with normal use of the television receiver, CPU 1112 initiates StarSight® data extraction by tuning the particular channel only during a time period when the television receiver is usually not in use (e.g., 2:00 AM). At that time, CPU 1112 configures decoder 1115 such that auxiliary data is extracted from horizontal line intervals such as line 16 that are used for StarSight® data. CPU 1112 controls the transfer of extracted StarSight® data from decoder 1115 via I2C BUS to StarSight® module 1160. A processor internal to the module formats and stores the data in memory within the module. In response to the StarSight® EPG display being activated (e.g., a user activating a particular key on remote control 1125), CPU 1112 transfers formatted StarSight® EPG display data from StarSight® module 1160 via I2C BUS to OSD processor 1117.
OSD processor 1117 operates in a conventional manner to produce R, G, and B video signals OSD_RGB that, when coupled to a display device, will produce a displayed image representing on-screen display information such as graphics and/or text comprising an EPG, or graphics and/or text downloaded from the internet as described below. OSD processor 1117 also produces control signal FSW, which is intended to control a fast switch for inserting signals OSD_RGB into the system's video output signal at times when an on-screen display is to be displayed. For example, when a user enables an EPG, e.g., by activating a particular switch on remote control 1125, CPU 1112 enables processors 1115 and 1117 so that processor 1115 first requests and receives EPG data from StarSight® module 1160 via I2C BUS. Processor 1117 then produces signals OSD_RGB representing the EPG data. Processor 1117 also produces signal FSW indicating when the EPG is to be displayed.
Another function of the OSD processor 1117 is to generate computer text or graphics obtained from the internet, in cooperation with the communication interface unit 1113 and the auxiliary data processor 1115. The communication interface unit 1113 demodulates the analog information into digital format and passes it to the auxiliary data processor 1115 for further processing. The OSD processor then formats this digital information into RGB signals suitable for use by the video signal processor 1155. As described above, the OSD processor produces control signal FSW, which is intended to control a fast switch for inserting signals OSD_RGB into the system's video output signal at times when internet graphics and text are to be displayed.
Video signal processor (VSP) 1155 performs conventional video signal processing functions, such as luma and chroma processing and contrast and brightness adjustment. Output image signals produced by VSP 1155 are suitable for coupling to a display device, e.g., a kinescope or LCD device (not shown in FIG. 1), for producing a displayed image. VSP 1155 also includes a fast switch for coupling signals produced by OSD processor 1117 to the output video signal path at times when graphics and/or text is to be included in the displayed image. The fast switch is controlled by control signal FSW which is generated by OSD processor 1117 in main microprocessor 1110 at times when text and/or graphics are to be displayed.
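The fast-switch behavior described above can be sketched as a simple per-pixel selector. This is an illustrative software model of the hardware switch, not an implementation of VSP 1155 itself:

```python
def fast_switch(video_rgb, osd_rgb, fsw):
    """Per-pixel fast switch: when control signal FSW is asserted,
    the OSD processor's RGB output (OSD_RGB) replaces the normal
    video signal in the output path; otherwise the video passes
    through unchanged."""
    return osd_rgb if fsw else video_rgb

# During an EPG overlay, FSW is asserted for pixels belonging to the
# on-screen display, so those pixels come from OSD_RGB.
```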
The input signal for VSP 1155 is signal PIPV that is output by picture-in-picture (PIP) processor 1140. When a user activates PIP mode, signal PIPV represents a large picture (large pix) into which a small picture (small pix) is inset. When PIP mode is inactive, signal PIPV represents just the large pix, i.e., no small pix signal is included in signal PIPV. PIP processor 1140 provides the described functionality in a conventional manner using features included in unit 1140 such as a video switch, analog-to-digital converter (ADC), RAM, and digital to analog converter (DAC).
As described above, and as shown in FIG. 5A, the subpicture 501 of the PIP image 500 of the television may be used to present text and/or graphics information from the internet, or text and/or graphics information from the EPG, while the main picture 502 is showing a TV broadcast channel. This is accomplished by feeding the processed digital signals from the OSD processor 1117 directly to one of the inputs of the PIP processor. As is well known in the art, the video switch typically contained in the PIP processor 1140 receives all the signal inputs (e.g., VIDEO, VIDEO IN, and the signal from the OSD as shown in FIG. 1). The PIP processor, under the control of the main microprocessor 1110, then selects and switches the appropriate signals to be displayed during the appropriate main and subpicture scanning intervals. Conventional subsampling techniques are used to form the subpicture.
For an EPG display, the display data included in the EPG display is produced by OSD processor 1117 and included in the output signal by VSP 1155 in response to fast switch signal FSW. When controller 1110 detects activation of the EPG display, e.g., when a user presses the appropriate key on remote control 1125, controller 1110 causes OSD processor 1117 to produce the EPG display using information such as program guide data from StarSightŪ module 1160. Controller 1110 causes VSP 1155 to combine the EPG display data from OSD processor 1117 and the video image signal in response to signal FSW to produce a display including EPG. The EPG can occupy all or only a portion of the display area.
When the EPG display is active, controller 1110 executes another control program stored in EEPROM 1127. The control program monitors the location of a position indicator, such as a cursor and/or highlighting, in the EPG display. A user controls the location of the position indicator using direction and selection keys of remote control 1125. Alternatively, the system could include a mouse device. Controller 1110 detects activation of a selection device, such as clicking a mouse button, and evaluates current cursor location information in conjunction with EPG data being displayed to determine the function desired, e.g., tuning a particular program. Controller 1110 subsequently activates the control action associated with the selected feature.
Examples of suitable components for implementing the exemplary embodiment include an ST9296 microprocessor produced by SGS-Thomson Microelectronics for providing the features associated with μP 1110; an M65616 picture-in-picture processor produced by Mitsubishi for providing the described basic PIP functionality associated with PIP processor 1140; and an LA7612 video signal processor produced by Sanyo for providing the functions of VSP 1155.
In accordance with the present invention, the main microprocessor 1110, under the supervision of the control programs contained within memory EEPROM 1127, will direct the video signal processor 1155 to provide the proper drive levels to the display, via the bus I2C BUS. As shown in the processing steps of FIG. 2, as the video signal processor is providing drive current for picture rasters, the main microprocessor 1110 will first determine which region the rasters belong to. A region, for example, may be a main or subpicture in the PIP mode. As discussed above, the main picture may be displaying the TV images and the subpicture of the PIP may be showing internet text and/or graphics or an EPG, or vice versa.
The microprocessor 1110 will then determine whether the material being displayed is: 1) a computer text/graphics image such as information obtained from the internet, 2) a television broadcast image, or 3) OSD information such as an EPG, as shown in steps 210, 220 and 230 of FIG. 2. The main microprocessor 1110 makes this determination by coordinating and monitoring the operations of the OSD processor 1117, CPU 1112, and the auxiliary data processor 1115. The main microprocessor 1110 will then provide a control signal to the video signal processor so that the video signal processor 1155 can provide the desired video drive level to the display for the particular material being displayed during the associated scanning interval.
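The decision of steps 210, 220 and 230 can be modeled as a lookup from material type to a drive-level scale factor. The enum names and scale values below are illustrative assumptions (the 0.25 figure echoes the 4-to-1 attenuation example given later for the PC embodiment; the OSD value is an assumption):

```python
from enum import Enum

class Material(Enum):
    COMPUTER_TEXT_GRAPHICS = 1   # internet text/graphics (step 210)
    TV_BROADCAST = 2             # television broadcast image (step 220)
    OSD_EPG = 3                  # on-screen display such as an EPG (step 230)

# Illustrative scale factors: broadcast video at full drive for
# realistic bright peaks; text/graphics attenuated so CRT electron
# spots stay small and the display stays sharply focused.
DRIVE_SCALE = {
    Material.TV_BROADCAST: 1.0,
    Material.COMPUTER_TEXT_GRAPHICS: 0.25,
    Material.OSD_EPG: 0.25,
}

def drive_control(material):
    """Return the drive-level scale the video signal processor should
    apply for a region showing this type of material."""
    return DRIVE_SCALE[material]
```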
In addition, FIG. 1 includes a control switch 1118 coupled to microprocessor 1110. The control switch 1118 can be used to select the mode of operation for the television set. For example, when a user selects the “auto mode”, the television operates automatically to adjust the image characteristics of the drive signals for each display region as described above. When the switch is in the “manual mode 1” position, the user may only adjust the image characteristic of the entire screen. When the switch is in “manual mode 2” position, the television may provide a prompt for the user to select the desired image characteristic for each region. This aspect of the invention will be further described in regard to a PC implementation shown in FIGS. 3 and 4.
FIG. 3 shows another example of an electronic device implementing a video drive control system in accordance with the present invention. The example is a computer system having a television tuner card installed in one of its computer card slots.
The computer system 10 is under the control of a CPU processor 110. The computer includes a display processor 130 for controlling a kine drive 135 of an associated display monitor 136. The display processor 130, under the control of the CPU processor 110 via a computer operating system to be discussed later, provides for each pixel a desired video drive level.
The computer system 10 includes a television tuner card 100 for receiving television signals to be displayed. It is known, for example, to concurrently display a television image in one window under, for example, a windows-based operating system, while performing other computing applications in other windows, as shown in FIG. 5B. An example of an expansion card having a TV tuner that can be used with an IBM-compatible PC is the “All-In-Wonder”™ card made by ATI Technologies Inc. of Canada.
FIG. 4 shows in a flowchart form a feature of the system shown in FIG. 3. In FIG. 4, the operating system of the PC has a video processing subroutine 400 to provide the proper display drive control for each pixel of the display according to the principles of the present invention.
The operating system, which contains the video processing subroutine 400, provides to the display processor 130 intensity information such as contrast and brightness characteristics for each pixel, as each pixel is being displayed on the display monitor 136 in the display scanning process. This intensity information typically corresponds to R, G, and B drive levels for each pixel on the display.
The operating system knows the location and the boundary of the windows or regions since the operating system is responsible for coordinating the resources among the different applications being displayed. Therefore, as each pixel is being displayed in the CRT scanning process, the operating system first determines which region or application this pixel belongs to, as shown in Step 401.
Once this is determined, the operating system then obtains a normal drive level (X) for this pixel from the corresponding application program, as shown in step 402. The drive levels (X) for a pixel in an application are normally represented internally as numbers, which can be expressed as fractions of the maximum drive level. These might be thought of, for example, as 0, 0.25, 0.50, 0.75, and 1.00, although many more levels are usually incorporated.
To exert the required control of regions in accordance with the type of the material or information being displayed, these drive numbers, received from the application being displayed in a window, would be multiplied by a second number Y as shown in step 403. The second number Y is an attenuation number associated with the window currently being scanned. For example, the high-drive picture regions corresponding to real-world photographic style or television broadcast images might have an assigned attenuation number Y of 1.0 (no attenuation), but the computer-related text/graphics regions might have an assigned attenuation number Y of 0.25 (4 to 1 attenuation).
Each pixel appearing in a window, which is of a certain type, is thereafter assigned an associated, final, attenuation level or a drive signal level Z. This level, Z, is derived from, for example, the multiplication of the normal application video level X for each pixel in the region by the assigned attenuation number Y for that region or window, resulting in independent control of each region's video level appropriate for its type of material being presented, as shown in step 404. This number Z is then provided to the display processor 130 which will drive the kine drive 135 and the associated display 136 at the appropriate level for each pixel. This process is then repeated for each pixel of the displayed image until the system 10 is turned off.
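The per-pixel pipeline of steps 401 through 404 can be sketched as follows. The function and parameter names are illustrative; the region lookup and attenuation table stand in for the bookkeeping the operating system already performs:

```python
def scan_pixels(pixels, region_of, attenuation):
    """Sketch of steps 401-404 for a sequence of displayed pixels.

    pixels:      list of (position, X) pairs, where X is the
                 application's normal drive level as a fraction of
                 maximum (step 402).
    region_of:   maps a pixel position to its window/region id
                 (step 401).
    attenuation: maps a region id to its attenuation number Y
                 (step 403).
    Returns the final drive levels Z = X * Y that are handed to the
    display processor (step 404).
    """
    return [x * attenuation[region_of(pos)] for pos, x in pixels]
```

For example, a full-brightness pixel (X = 1.0) in a text/graphics window with Y = 0.25 yields Z = 0.25, while the same X in a broadcast-video window with Y = 1.0 passes through at full drive, giving each region a video level appropriate to its material.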
In accordance with another aspect of the present invention, the system shown in FIG. 3 may provide a user interface feature to provide user adjustment of the final drive level for each window or application on the screen. This function is similar to that described above when switch 1118 of the television shown in FIG. 1 is in “manual mode 2” position.
FIG. 6, steps 600 to 606, shows in flowchart form a user interface feature according to the present invention. The feature illustrated in FIG. 6 may be implemented as a subroutine of the operating system that is executed by CPU 110 of the system shown in FIG. 3. The user can, for example, press a key on a keyboard (not shown) of the computer system 10 to invoke this feature, as shown in step 600. Thereafter, the computer system 10 will prompt the user via the display screen 136 to enter a new, overriding attenuation number Y for each region shown on the screen, as shown in step 603. Once a number is entered by the user, the computer system overrides the old attenuation number Y previously derived by the CPU processor with this new number Y, thereby causing the final drive level Z for this particular window to be adjusted according to the user's taste. Thus the user is able to manually adjust the drive level for each of the regions currently being displayed on the computer screen.
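The override of step 603 amounts to replacing one entry in the per-region attenuation table, after which the normal Z = X * Y computation uses the user's value. A minimal sketch, with an illustrative table shape:

```python
def override_attenuation(attenuation, region_id, user_y):
    """Step 603: replace the attenuation number Y previously derived
    for one window/region with the user-entered value, so that
    subsequent pixels in that region are driven at the user-selected
    final level Z = X * user_y."""
    attenuation[region_id] = user_y
    return attenuation

# After the override, a guide window previously attenuated 4-to-1
# (Y = 0.25) can be brightened to the user's preferred level.
```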
The above described concepts according to the present invention can be extended, if desired, to several levels of control for numerous types of presentation materials, to optimize the drive level of each region, according to the content.
It is to be understood that the embodiments and variations shown and described herein are for illustrations only and that various modifications may be implemented by those skilled in the art without departing from the scope and spirit of the invention.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5204748||Dec 11, 1991||Apr 20, 1993||Thomson Consumer Electronics, Inc.||Beam current limiting arrangement for a television system with picture-in-picture provisions|
|US5969767 *||Sep 4, 1996||Oct 19, 1999||Matsushita Electric Industrial Co., Ltd.||Multipicture video signal display apparatus with modified picture indication|
|US5977946 *||Jan 16, 1997||Nov 2, 1999||Matsushita Electric Industrial Co., Ltd.||Multi-window apparatus|
|EP0569018A1||May 6, 1993||Nov 10, 1993||Matsushita Electric Industrial Co., Ltd.||Gradation correcting apparatus|
|EP0675644A2||Mar 23, 1995||Oct 4, 1995||Kabushiki Kaisha Toshiba||Television receiver with picture-in-picture|
|EP0729273A2||Dec 20, 1995||Aug 28, 1996||Matsushita Electric Industrial Co., Ltd.||Compensation voltage generating apparatus for multipicture display and video display apparatus using it|
|US8797260||May 6, 2006||Aug 5, 2014||Sony Computer Entertainment Inc.||Inertially trackable hand-held controller|
|US8840470||Feb 24, 2009||Sep 23, 2014||Sony Computer Entertainment America Llc||Methods for capturing depth data of a scene and applying computer actions|
|US8961313||May 29, 2009||Feb 24, 2015||Sony Computer Entertainment America Llc||Multi-positional three-dimensional controller|
|US8976265||Oct 26, 2011||Mar 10, 2015||Sony Computer Entertainment Inc.||Apparatus for image and sound capture in a game environment|
|US9130898||Mar 24, 2009||Sep 8, 2015||Qwest Communications International Inc.||Transmitting utility usage data via a network interface device|
|US9177387||Feb 11, 2003||Nov 3, 2015||Sony Computer Entertainment Inc.||Method and apparatus for real time motion capture|
|US20010043284 *||May 22, 2001||Nov 22, 2001||Sanyo Electric Co., Ltd.||Digital broadcasting receiver|
|US20020088004 *||Apr 21, 2001||Jul 4, 2002||Sony Corporation||System and method for interactive television|
|US20020129376 *||Mar 9, 2001||Sep 12, 2002||Tadamasa Kitsukawa||Virtual channel system for web appliance, including interactive television|
|US20020144258 *||Mar 9, 2001||Oct 3, 2002||Tadamasa Kitsukawa||System and method for billing for interactive television|
|US20020144288 *||Mar 9, 2001||Oct 3, 2002||Tadamasa Kitsukawa||System and method for allowing access to web sites using interactive television|
|US20020157092 *||Apr 23, 2001||Oct 24, 2002||Sony Corporation||System and method for pulling internet content onto interactive television|
|US20020157100 *||Apr 23, 2001||Oct 24, 2002||Sony Corporation||Electronic program guide including virtual channels for interactive television|
|US20020157107 *||Apr 23, 2001||Oct 24, 2002||Sony Corporation||Interactive television system|
|US20020157108 *||Apr 23, 2001||Oct 24, 2002||Sony Corporation||Gateway screen for interactive television|
|US20020157109 *||Apr 21, 2001||Oct 24, 2002||Sony Corporation||System and method for interactive television|
|US20020171772 *||Mar 1, 2002||Nov 21, 2002||Funke Eric Peter||Unit for and method of video signal enhancement and image display apparatus provided with such a video signal enhancement unit|
|US20020196367 *||May 22, 2002||Dec 26, 2002||Hideaki Yui||Display control apparatus and method, and recording medium and program therefor|
|US20040070620 *||Sep 29, 2003||Apr 15, 2004||Hirotoshi Fujisawa||Display device, display method, and program|
|US20050063418 *||Sep 23, 2003||Mar 24, 2005||Case Michael L.||Tuner module utilizing device-specific controller|
|US20050117054 *||Oct 2, 2004||Jun 2, 2005||Funai Electric Co., Ltd.||Flat panel television|
|US20050264702 *||May 26, 2005||Dec 1, 2005||Sharp Kabushiki Kaisha||Image display device, image display method, and television receiver|
|US20060079326 *||Dec 12, 2005||Apr 13, 2006||Microsoft Corporation||Video Division Detection|
|US20060248562 *||Jun 26, 2006||Nov 2, 2006||Sony Corporation Inc.||System and method for interactive television|
|US20080005302 *||Jun 29, 2006||Jan 3, 2008||Microsoft Corporation||Composition of local user interface with remotely generated user interface and media|
|US20080034029 *||Jun 15, 2006||Feb 7, 2008||Microsoft Corporation||Composition of local media playback with remotely generated user interface|
|US20090199253 *||Aug 10, 2007||Aug 6, 2009||Shanda Computer (Shanghai) Co., Ltd.||System and Method for Accessing Internet Via TV and PC Connecting Set and a TV Connecting Set|
|US20090219443 *||Dec 1, 2008||Sep 3, 2009||Kabushiki Kaisha Toshiba||Video signal processing apparatus, and digital television broadcast receiver controlling method|
|US20100026722 *||Dec 18, 2007||Feb 4, 2010||Tetsujiro Kondo||Display control apparatus, display control method, and program|
|US20100033502 *||Oct 13, 2006||Feb 11, 2010||Freescale Semiconductor, Inc.||Image processing apparatus for superimposing windows displaying video data having different frame rates|
|US20100275227 *||Oct 2, 2009||Oct 28, 2010||Samsung Electronics Co., Ltd.||Display apparatus and control method of the same|
|US20110072081 *||Mar 24, 2011||Microsoft Corporation||Composition of local media playback with remotely generated user interface|
|US20150009418 *||Jul 19, 2012||Jan 8, 2015||Sharp Kabushiki Kaisha||Video display device and television receiving device|
|U.S. Classification||348/565, 348/678, 348/E05.119, 348/E05.112, 348/687, 348/E05.114|
|International Classification||H04N5/45, H04N5/46, H04N5/57|
|Cooperative Classification||H04N5/46, G09G2320/0686, H04N21/4782, H04N5/45, H04N5/445, H04N21/4622, H04N5/57, H04N21/4854, H04N21/4316, H04N21/4318, H04N21/4143|
|European Classification||H04N21/462S, H04N21/41P2, H04N21/4782, H04N5/45, H04N5/57, H04N5/445|
|May 7, 2004||AS||Assignment|
|Mar 7, 2008||FPAY||Fee payment||Year of fee payment: 4|
|Mar 5, 2012||FPAY||Fee payment||Year of fee payment: 8|