|Publication number||US7410260 B2|
|Application number||US 11/161,465|
|Publication date||Aug 12, 2008|
|Filing date||Aug 4, 2005|
|Priority date||Aug 4, 2005|
|Also published as||CN101238409A, CN101238409B, EP1920294A2, EP1920294A4, US20070030460, WO2007019241A2, WO2007019241A3|
|Publication number||11161465, 161465, US 7410260 B2, US 7410260B2, US-B2-7410260, US7410260 B2, US7410260B2|
|Inventors||David J. Mehrl|
|Original Assignee||Texas Instruments Incorporated|
|Patent Citations (9), Referenced by (8), Classifications (15), Legal Events (2)|
Disclosed embodiments relate to optical projection devices, and more particularly to integrating an image sensor with an optical projection device along with an input device for smart screen capability and other enhancements.
Optical projection devices such as business projectors are prevalent in the marketplace for making presentations to clients and customers. A presentation can be recorded for future playback if the client or customer is not physically present to see it, but recording such presentations requires additional equipment. In addition, a presenter may need to further explain his or her presentation, often by writing on a whiteboard or a chalkboard.
An image capture device, such as a charge-coupled device (CCD) camera or a complementary metal oxide semiconductor (CMOS) image sensor, may be integrated into a front or rear projection device, such as a business projector or a high-definition television (HDTV). In one embodiment, an input device, such as a laser pointer, may be used in conjunction with the integrated device to enable and facilitate interactive presentations. The laser pointer may be pulsed to improve identifying and tracking of a laser light spot on a projected image. In another embodiment, the application may be extended to rear projection systems for gaming and interactive graphics.
Integrating a CCD camera 106 with a projector 102 provides several value-added functions. The camera 106 may assess the distance between the projector 102 and the display screen 110 with ambient light sensing, and reduce lamp power in order to extend lamp life and reduce heat. This can also be accomplished by examining the apparent size of the projected image 108 acquired by image processing in the camera 106: the smaller the image 108 appears to the camera 106, the farther away the screen 110 is. This technique can also be combined with ambient light sensing to determine the amount of lamp power reduction. Additionally, the CCD camera 106 may monitor and correct image 108 brightness or balance color/intensity as necessary to compensate for lamp aging. Color/intensity corrections may be achieved by boosting white levels or specific color levels. The projector 102 may also alert or warn the user when the light source needs replacement.
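The lamp-power logic above can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical calibration (a reference apparent image width and a maximum ambient level); the function names, thresholds, and the 50%–100% power range are assumptions, not from the patent.

```python
def estimate_relative_distance(image_width_px, reference_width_px):
    """Apparent size shrinks roughly in proportion to distance: a smaller
    projected image in the camera frame implies a farther-away screen."""
    return reference_width_px / image_width_px

def lamp_power_fraction(image_width_px, ambient_lux,
                        reference_width_px=1000, max_ambient_lux=500):
    """Combine apparent image size with ambient light sensing to pick a
    lamp power between 50% and 100% of maximum (assumed range)."""
    distance = estimate_relative_distance(image_width_px, reference_width_px)
    # Nearer screens (larger apparent image) need less lamp power.
    size_factor = min(1.0, distance)
    # Darker rooms also need less lamp power.
    ambient_factor = min(1.0, ambient_lux / max_ambient_lux)
    # Use whichever factor demands more light, never exceeding full power.
    return min(1.0, 0.5 + 0.5 * max(size_factor, ambient_factor))

# A close screen in a dark room allows reduced power, extending lamp life.
print(lamp_power_fraction(image_width_px=1500, ambient_lux=50))
```

Combining both cues, as the passage suggests, avoids dimming the lamp in a bright room just because the screen is close.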
Real-time image processing capabilities may be provided, such as by a digital signal processor (DSP), to detect movement of the laser light spot 120 from the laser pointer 118. Such processing capabilities may be provided by the integrated processing electronics hardware 112 and software 114 embedded within the projector 102 as previously described. As the projector 102 projects an image 108 onto a screen 110, the projection screen 110 functions like a smart board. The presenter may use the laser pointer 118 in its normal mode by pushing a first normal operating button 122 and projecting a laser light spot 120 onto the screen 110 as illustrated in
If a user wants to effect a “click” action, he or she presses harder on button 124 of the laser pointer 118, which responds with a tactile “click” feedback response (much like the “click” action of a conventional mouse button). This action closes a switch contact pair, causing the laser pointer 118 to pulse at about 15 Hz. Once the system 100 recognizes the pulsing as a mouse click event, it takes the appropriate actions. The system 100 continues to monitor and track where the laser light spot 120 resides in the image 108 in order to capture “drag” operations, sending mouse drag events that contain the X, Y pixel positions of the current laser light spot 120 location within the image 108. It continues to monitor the image 108 until the laser light spot 120 stops pulsing, at which point it issues a left mouse click “break” event. By monitoring the subsequent motion of the laser light spot 120 and tracking it until the user releases the “mouse button” on the laser pointer 118, the laser pointer 118 doubles as a mouse (it can be “clicked”, “dragged”, or “released”), thereby providing portable built-in “smart screen” capability. The ensuing “mouse commands” can then be used by the display or television electronics, or sent to a connected system, such as a PC or set-top box.
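The click/drag/release behavior above is naturally expressed as a small state machine. The sketch below is illustrative only: the per-frame `(pulsing, x, y)` interface and the event names are assumptions standing in for the system's actual mouse-event plumbing.

```python
def mouse_events(frames):
    """frames: iterable of (pulsing: bool, x: int, y: int), one per CCD
    frame. Returns mouse events mirroring the pointer's command mode."""
    events = []
    dragging = False
    for pulsing, x, y in frames:
        if pulsing and not dragging:
            events.append(("mouse_down", x, y))   # pulsing starts: "make"
            dragging = True
        elif pulsing and dragging:
            events.append(("mouse_drag", x, y))   # track the moving spot
        elif not pulsing and dragging:
            events.append(("mouse_up", x, y))     # pulsing stops: "break"
            dragging = False
    return events

# Spot starts pulsing at (10, 10), drags to (30, 30), then releases.
print(mouse_events([(True, 10, 10), (True, 20, 20),
                    (True, 30, 30), (False, 30, 30)]))
```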
Additionally, during “tracking” of the laser light spot 120 to catch “drag” operations, localized regions of the image 108 may be zoomed in on to reduce the real-time processing overhead. Once the laser pointer 118 has been identified as being in command mode and the position of the laser light spot 120 has been located within the image 108, the spot's position changes minimally from one acquired CCD frame to the next. Since analyzing the full image 108 for mouse movement is a processing-intensive operation, limiting the analysis to a much smaller portion of the image 108 minimizes that overhead.
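Restricting the search to a window around the last known spot position can be sketched as a simple region-of-interest (ROI) computation. The 64×64 window size and the function name are assumptions for illustration; the patent does not specify a window size.

```python
def roi_bounds(last_x, last_y, frame_w, frame_h, half=32):
    """Return (x0, y0, x1, y1) of a (2*half)-pixel square window around
    the last spot position, clamped to the frame edges, so only a small
    sub-image needs to be analyzed on each new frame."""
    x0 = max(0, last_x - half)
    y0 = max(0, last_y - half)
    x1 = min(frame_w, last_x + half)
    y1 = min(frame_h, last_y + half)
    return x0, y0, x1, y1

# The spot moves little between consecutive frames, so a 64x64 window
# replaces a full 640x480 search -- about 1.3% of the pixels.
x0, y0, x1, y1 = roi_bounds(100, 120, 640, 480)
print((x1 - x0) * (y1 - y0), 640 * 480)
```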
In another embodiment, the pulsing circuitry and/or frequency of the laser pointer 118 may be synchronized to the room's AC wiring or to the frame rate of the CCD camera 106. Alternately, the pulsing circuitry of the laser pointer 118 could be synchronized to the projector 102 frame rate by means of a small optical sensor (not shown) built into the laser pointer 118 that would sense the flicker rate of the ambient light produced by the projector screen 110. In yet another embodiment, a small wireless transmitter in the projector 102 combined with a small wireless receiver (not shown) in the laser pointer 118 could allow the projector 102 to send a synchronous signal to the laser pointer 118. In each of these cases, whether relying on ambient 60 Hz AC fields, projector light flicker, or a wireless signal, the synchronization signals would drive a phase-lock loop in the laser pointer 118 that would provide a precise phase-locked synchronization signal, so that the laser pointer 118 could optionally pulse in exact step with the projector 102 and/or camera 106 frame rate.
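A software analog of the phase-lock behavior can illustrate the idea: the pointer nudges its predicted pulse time toward each detected sync edge (from the AC field, optical flicker, or wireless signal). The first-order update rule, the loop gain, and the function name are assumptions for this sketch, not the patent's circuit.

```python
def phase_lock(edge_times, period=1 / 60, gain=0.5):
    """Track external sync edges with a first-order phase correction;
    return the predicted time of the next pulse after the last edge."""
    next_pulse = edge_times[0] + period
    for t in edge_times[1:]:
        error = t - next_pulse               # how far our phase is off
        next_pulse += period + gain * error  # correct a fraction of it
    return next_pulse

# Edges drift slightly late each 60 Hz cycle; the loop tracks them.
edges = [0.0, 0.0171, 0.0339, 0.0505, 0.0672]
print(phase_lock(edges))
```

A hardware PLL does the same thing continuously in analog or digital form; the point is that the pointer's pulse phase converges onto the external reference.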
Frame differencing of a pulsed laser light as described above may enhance image detection and sensitivity. A video processor (not shown) would perform conventional analysis of sequential frames to detect regions of the image 108 that are pulsating at 15 Hz, and thus be able to initially detect the presence of the pulsating laser light spot 120. Use of the relatively slow 15 Hz frequency would assure that, within any two successively captured frames of the image 108 (at a typical 30 frames per second), at least one of those frames would capture the laser light spot 120 with the laser pointer 118 either “on” or “off” for the entire frame exposure. It is not necessary for the camera's 106 frame rate to be synchronized to the projector's 102 frame rate. The methodology for detecting the presence of the 15 Hz pulsating laser pointer 118 is to successively subtract sequentially acquired frames (performing a pixel-by-pixel subtraction of the pixel intensities and taking the absolute value of this “difference” image), and then to look for sudden changes that highlight a transition (associated with the rising or falling edge of the pulsing laser pointer 118). Additionally, the frame differencing technique may be coupled with closely related synchronous detection techniques to boost sensitivity and reduce noise. The combined techniques use the hardware sensor 112 and processing electronics 114 to superimpose a mouse pointer on the image 108, giving the user audio/visual feedback 116 in the event that the laser light spot 120 is hard to see.
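The frame-differencing step itself can be shown directly: subtract sequential frames pixel by pixel, take the absolute value, and flag pixels whose intensity jumps, marking a rising or falling edge of the pulsing spot. This is a pure-Python sketch on tiny grayscale frames; the threshold value is an assumption.

```python
def frame_difference(frame_a, frame_b):
    """Absolute per-pixel intensity difference of two equal-size frames
    (lists of rows of grayscale values)."""
    return [[abs(a - b) for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(frame_a, frame_b)]

def find_transitions(frame_a, frame_b, threshold=100):
    """Pixels where the difference image exceeds the threshold are
    candidates for the laser spot turning on or off."""
    diff = frame_difference(frame_a, frame_b)
    return [(x, y) for y, row in enumerate(diff)
            for x, v in enumerate(row) if v > threshold]

# Frame N has the spot off; frame N+1 catches it on at pixel (1, 1).
off = [[10, 12, 11], [11, 10, 12], [12, 11, 10]]
on  = [[10, 12, 11], [11, 250, 12], [12, 11, 10]]
print(find_transitions(off, on))
```

In a real implementation the same subtraction would run on full CCD frames (typically in a DSP), with the 15 Hz periodicity of the flagged pixels confirming the laser spot rather than transient scene motion.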
In another embodiment, the pulsing laser pointer 118 may employ time-encoded signals. Multiple time-sequence patterns and pulses may be generated corresponding to the actions of a computer “mouse.” For example, the “mouse button down”, “mouse button hold”, and “mouse button up” events may be supported with a single button (122 or 124) on the laser pointer 118. When the user first pushes the button (“mouse button down”), the laser pointer 118 sends out a pulse train at frequency f1. While the button is held down (“mouse button hold”), the laser pointer 118 automatically changes to a second frequency f2 indicating that the button is being held. When the button (122 or 124) is released (“mouse button up”), the laser pointer 118 sends out a short burst of pulses at frequency f3 indicating that the mouse button has been released. The distinct frequencies make the different events easier to identify. The laser pointer 118 could also have additional buttons (not shown) that send out other frequencies or otherwise specially encoded temporal patterns.
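Decoding such time-encoded signals amounts to measuring the pulse frequency and matching it to the nearest known event frequency. The specific values chosen for f1, f2, and f3 below (only the 15 Hz click rate appears in the disclosure), the tolerance, and the function names are illustrative assumptions.

```python
F1, F2, F3 = 15.0, 10.0, 20.0   # assumed Hz for: down / hold / release

def measured_frequency(pulse_times):
    """Estimate pulse frequency from successive pulse timestamps."""
    intervals = [b - a for a, b in zip(pulse_times, pulse_times[1:])]
    return len(intervals) / sum(intervals)

def classify_event(pulse_times, tolerance=1.0):
    """Map a burst of pulses to the nearest known event frequency."""
    f = measured_frequency(pulse_times)
    for name, target in (("mouse_button_down", F1),
                         ("mouse_button_hold", F2),
                         ("mouse_button_up", F3)):
        if abs(f - target) <= tolerance:
            return name
    return "unknown"

# Four pulses 0.1 s apart -> 10 Hz -> the assumed "hold" frequency f2.
print(classify_event([0.0, 0.1, 0.2, 0.3]))
```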
The above-described embodiments may also be used in rear projection systems 200, such as rear projection televisions (RPTVs) or high-definition televisions (HDTVs), for games (e.g., shooting-type games), interactive graphics, surfing the Internet, and the like.
It will be appreciated by those of ordinary skill in the art that the invention can be embodied in other specific forms without departing from the spirit or essential character thereof. The presently disclosed embodiments are therefore considered in all respects to be illustrative and not restrictive. The scope of the invention is indicated by the appended claims rather than the foregoing description, and all changes that come within the meaning and ranges of equivalents thereof are intended to be embraced therein.
Additionally, the section headings herein are provided for consistency with the suggestions under 37 C.F.R. § 1.77 or otherwise to provide organizational cues. These headings shall not limit or characterize the invention(s) set out in any claims that may issue from this disclosure. Specifically and by way of example, although the headings refer to a “Technical Field,” the claims should not be limited by the language chosen under this heading to describe the so-called technical field. Further, a description of a technology in the “Background” is not to be construed as an admission that technology is prior art to any invention(s) in this disclosure. Neither is the “Summary” to be considered as a characterization of the invention(s) set forth in the claims found herein. Furthermore, any reference in this disclosure to “invention” in the singular should not be used to argue that there is only a single point of novelty claimed in this disclosure. Multiple inventions may be set forth according to the limitations of the multiple claims associated with this disclosure, and the claims accordingly define the invention(s), and their equivalents, that are protected thereby. In all instances, the scope of the claims shall be considered on their own merits in light of the specification, but should not be constrained by the headings set forth herein.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4067026||Jul 19, 1976||Jan 3, 1978||George Pappanikolaou||Front projection system embodying a single lens|
|US5235363 *||May 10, 1991||Aug 10, 1993||Nview Corporation||Method and apparatus for interacting with a computer generated projected image|
|US5422693 *||May 28, 1993||Jun 6, 1995||Nview Corporation||Method and apparatus for interacting with a computer generated projected image|
|US6246446||Jun 10, 1997||Jun 12, 2001||Texas Instruments Incorporated||Auto focus system for a SLM based image display system|
|US6474819||Mar 21, 2001||Nov 5, 2002||Texas Instruments Incorporated||Combination overhead projector and electronic display device|
|US6802611||Oct 22, 2002||Oct 12, 2004||International Business Machines Corporation||System and method for presenting, capturing, and modifying images on a presentation board|
|US6877863||Jun 12, 2003||Apr 12, 2005||Silicon Optix Inc.||Automatic keystone correction system and method|
|US6979087 *||Oct 31, 2002||Dec 27, 2005||Hewlett-Packard Development Company, L.P.||Display system with interpretable pattern detection|
|US7036938 *||Oct 31, 2002||May 2, 2006||Microsoft Corporation||Pen projection display|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7768527 *||Nov 20, 2006||Aug 3, 2010||Beihang University||Hardware-in-the-loop simulation system and method for computer vision|
|US8297755 *||Sep 7, 2007||Oct 30, 2012||Koninklijke Philips Electronics N.V.||Laser projector with alerting light|
|US8434873||Mar 31, 2010||May 7, 2013||Hong Kong Applied Science and Technology Research Institute Company Limited||Interactive projection device|
|US8538367||Jun 29, 2009||Sep 17, 2013||Qualcomm Incorporated||Buffer circuit with integrated loss canceling|
|US8842096||Jan 7, 2011||Sep 23, 2014||Crayola Llc||Interactive projection system|
|US20100073580 *||Sep 7, 2007||Mar 25, 2010||Koninklijke Philips Electronics N.V.||Laser projector with alerting light|
|US20110119638 *||Nov 17, 2009||May 19, 2011||Babak Forutanpour||User interface methods and systems for providing gesturing on projected images|
|US20120218294 *||Feb 23, 2012||Aug 30, 2012||Sanyo Electric Co., Ltd.||Projection-type image display apparatus|
|U.S. Classification||353/28, 348/745, 353/42, 375/295, 353/29, 353/79|
|International Classification||H04N3/22, H04L27/00, G03B21/26, G03B21/14, G03B21/00|
|Cooperative Classification||G03B21/14, G03B17/48|
|European Classification||G03B17/48, G03B21/14|
|Aug 4, 2005||AS||Assignment|
Owner name: TEXAS INSTRUMENTS INCORPORATED, TEXAS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MEHRL, DAVID J;REEL/FRAME:016352/0631
Effective date: 20050715
|Jan 27, 2012||FPAY||Fee payment|
Year of fee payment: 4