|Publication number||US8020997 B2|
|Application number||US 11/973,456|
|Publication date||Sep 20, 2011|
|Filing date||Oct 9, 2007|
|Priority date||Oct 9, 2007|
|Also published as||US20090091714|
|Inventors||Richard Aufranc, Bruce A. Stephens, Olan Way|
|Original Assignee||Hewlett-Packard Development Company, L.P.|
Various devices for displaying images exist. One example is the digital image projector. Digital image projectors are widely used to project color images generated from digital signals onto a display surface. In some cases, the display surface may be the front of a reflective display screen, for example, in a theater or conference room. In other cases, the display surface may be the rear of a semi-transparent diffusive screen of a rear-projection display monitor or projection television.
Portable digital image projectors are common. Such digital image projectors, while connected to a personal computer or other image/video signal source, typically sit on a supporting surface and are directed at a display surface on which images or video is to be shown. Many of these projectors use transmissive or reflective liquid crystal displays. Other such projectors use different imaging devices, such as digital micro-mirrors. These projectors can display images one at a time or as a sequence of images, as in the case of video.
Digital projectors are typically designed so that undistorted rectangular images are projected on the display surface when the projector is placed horizontally on a level support surface with the projector's optical axis lined up perpendicular to the display surface. However, if this alignment or orientation is not made, the resulting image on the display surface may be distorted. In many cases, the distorted image will appear as a trapezoid or an arbitrarily shaped quadrilateral. The non-rectangular shape of the resulting projected image is referred to as keystoning.
The accompanying drawings illustrate various embodiments of the principles described herein and are a part of the specification. The illustrated embodiments are merely examples and do not limit the scope of the claims.
Throughout the drawings, identical reference numbers designate similar, but not necessarily identical, elements.
As indicated above, unless a projector is carefully aligned with a corresponding display surface (e.g., a projection screen), the resulting image on the display surface can appear distorted, or “keystoned.” As used herein and in the appended claims, “keystoning” refers to a projected image that has a non-rectangular shape due to misalignment between the display surface and the projector or projectors that are projecting the image. As above, such an image may also be referred to as “keystoned.”
Keystoning is undesirable, not only because viewers find this warping to be distracting, but also because the distortion can significantly affect the interpretation of visual information such as graphs, bar charts and technical drawings that are being displayed. Keystoning can be prevented or corrected by aligning the projection system's optical axis so that it is perpendicular to the display surface and ensuring that the image is not rotated with respect to the display surface.
Consequently, one technique to eliminate keystoning is to manually adjust the physical position of the digital image projector by moving it around with respect to the corresponding display surface, tilting and rotating the projector, etc., until an image without keystoning, i.e., a nearly rectangular image, is displayed. However, in many situations, it may not be feasible to sufficiently physically adjust the position of the projector. For example, a properly oriented support surface relative to the display surface may not be available. In other examples, the projector may be positioned at an angle above or below the display surface due to the configuration of the projection area.
Some projectors can compensate for keystoning electronically, without physically moving or reorienting the projector. This can be done by adjusting the projector's optical axis, for example by repositioning optical elements within the projector or by using adjustable support structures integrated into the projector housing. The image produced by the projector can also be skewed electronically to compensate for keystoning.
Such solutions may involve a menu-based approach in which a menu system is provided to the user. The user is then able to adjust keystoning and other distortions by changing certain parameters within the menu system. However, controlling the variety of parameters involved in correcting keystoning can be complicated, time consuming, and beyond the ability of many users. Another such solution requires the user to manually adjust elements of the projection system to eliminate keystoning. This can also be time consuming and further requires the user to have physical access to the projector.
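Electronic skew compensation of the kind described above is commonly modeled as a plane-to-plane projective mapping (a homography). The following sketch is illustrative only, not the patent's implementation: it solves for the eight homography parameters from four corner correspondences, so a source image could be pre-warped with the inverse mapping to cancel physical keystoning. All function names are assumptions.

```python
# Illustrative sketch: a homography maps rectangular source corners to the
# keystoned quadrilateral the projector actually produces; pre-warping with
# its inverse cancels the distortion. Pure-Python solve, no dependencies.

def solve(a, b):
    """Solve the linear system a x = b by Gauss-Jordan elimination with pivoting."""
    n = len(a)
    m = [row[:] + [b[i]] for i, row in enumerate(a)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(m[r][col]))
        m[col], m[piv] = m[piv], m[col]
        for r in range(n):
            if r != col and m[r][col]:
                f = m[r][col] / m[col][col]
                m[r] = [x - f * y for x, y in zip(m[r], m[col])]
    return [m[i][n] / m[i][i] for i in range(n)]

def homography(src, dst):
    """3x3 homography h (with h[2][2] = 1) mapping each src corner onto dst."""
    a, b = [], []
    for (x, y), (u, v) in zip(src, dst):
        # u = (h11 x + h12 y + h13) / (h31 x + h32 y + 1), similarly for v.
        a.append([x, y, 1, 0, 0, 0, -u * x, -u * y]); b.append(u)
        a.append([0, 0, 0, x, y, 1, -v * x, -v * y]); b.append(v)
    h = solve(a, b) + [1.0]
    return [h[0:3], h[3:6], h[6:9]]

def apply_h(h, x, y):
    """Apply homography h to point (x, y)."""
    w = h[2][0] * x + h[2][1] * y + h[2][2]
    return ((h[0][0] * x + h[0][1] * y + h[0][2]) / w,
            (h[1][0] * x + h[1][1] * y + h[1][2]) / w)
```

In practice the corrective warp would be applied per pixel (or by graphics hardware); this sketch only shows how the mapping itself can be recovered from four point pairs.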
With respect to keystoning, some projection systems have additional complications. Some projection systems utilize more than one projector. In some such systems, the projected images from multiple projectors are “tiled” in order to form a bigger image or several smaller images. In other multi-projector systems, the projected images from different projectors may be superimposed in order to improve resolution, sharpness, brightness, contrast, etc. These systems may also provide redundancy. In multi-projector systems, however, keystoning may be an issue with each projector in the system, thereby compounding the adjustments that are necessary to avoid or correct distortion. Each individual projected image, and the total composite image, must be adjusted so that the resulting image appears to have right-angled corners.
In both single and multi-projector projection systems, in order to solve a keystoning problem, it may be helpful for the user to define a bounding box. As used herein and in the appended claims, the term “bounding box” refers to a box defined on the display surface in which the projected image is to appear. Typically, the bounding box is rectangular, but this is not necessarily so. The bounding box may be marked with lines, physical structure, projected light or any other means of indicating the boundary or outline of the bounding box. In creating the bounding box, the user may determine the desired size and aspect ratio of the projected image. Defining a bounding box for projected images can be difficult.
To address the issues involved in defining a bounding box for projected images, the present specification provides various systems and methods for easily and intuitively defining bounding boxes. The principles of the present specification enable a user to easily define a bounding box that may then be used to align and display a projected image. The methods discussed below are especially useful in multi-projector systems that involve aligning and displaying a ‘tiled’ image.
In the following description, for purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the present systems and methods. It will be apparent, however, to one skilled in the art that the present systems and methods may be practiced without these specific details. Reference in the specification to “an embodiment,” “an example” or similar language means that a particular feature, structure, or characteristic described in connection with the embodiment or example is included in at least that one embodiment, but not necessarily in other embodiments. The various instances of the phrase “in one embodiment” or similar phrases in various places in the specification are not necessarily all referring to the same embodiment.
The principles disclosed herein will now be discussed with respect to illustrative systems and methods.
The computing element (120) of the present embodiment may be a desktop computer, laptop, server or other computing device. In other embodiments, the computing element (120) may be integrated into the digital image projector (110). The digital image sensor (130) and optical beam generators (140) may also be integrated into the image projector (110) in some embodiments.
The display surface (150) may be a screen, wall, or other surface suitable for viewing projected images. In some instances, the display surface (150) may be reflective and configured to be viewed from the same side on which the image from the projector (110) is received. In other embodiments, the display surface may be transmissive and configured so as to allow the image (115) to be viewed from the opposite side of the surface (150) onto which it is projected.
In the system (100) of the present embodiment, the optical beam generators are directed towards the display surface (150), creating orientation indicators (145) on the display surface (150). In the present embodiment, these indicators (145) are adjusted by the user until they are located at approximately the four corners of the desired bounding box. This bounding box may or may not currently encompass or correspond to the image (115) projected by the image projector (110), which is shown in this figure as an irregular polygon to represent the distortions that may be caused by a misaligned projector (110). The digital image sensor (130) is configured so as to be able to sense the desired bounding box as defined by the indicators (145), i.e., the portion (135) of the display surface (150) upon which the image (115) is desired to be projected.
After the orientation indicators (145) have been adjusted by a user, the image sensor (130) images the display surface including the orientation indicators (145). From this image, the computing element (120) determines the relative positions of the indicators (145) on the display surface (150). For example, an algorithm can process the image starting at the center and scanning outwards until the indicators (145) are identified. For example, the scan could be spiral, radial or raster starting in the center. The indicators (145) can be identified by a particular number of clustered image pixels of a particular color, shape or other characteristic indicative of the orientation indicators (145). The computing element (120) can then use this information to interpolate the desired bounding box within the field of view of the image sensor (130) and in correspondence with the display surface. The computing element (120) of the present embodiment may include a user interface that displays, and allows the user to change, certain parameters of the bounding box.
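One possible form of the center-outward indicator search described above can be sketched as follows. This is an assumption-laden simplification: the image is a 2D list in which 1 marks an "indicator-colored" pixel (a real system would threshold on color, shape, or brightness), the scan expands in square rings from the center, and clusters are grouped by flood fill.

```python
# Sketch of indicator detection: scan the sensor image in rings expanding
# from the center, flood-fill each run of marker-colored pixels into a
# cluster, and report cluster centroids as indicator positions (x, y).

def find_indicators(img, min_cluster=1):
    h, w = len(img), len(img[0])
    cy, cx = h // 2, w // 2
    seen, clusters = set(), []
    for ring in range(max(h, w)):  # visit pixels roughly center-outward
        for y in range(max(0, cy - ring), min(h, cy + ring + 1)):
            for x in range(max(0, cx - ring), min(w, cx + ring + 1)):
                if (y, x) in seen or not img[y][x]:
                    continue
                # Flood-fill the 4-connected cluster containing this pixel.
                stack, pixels = [(y, x)], []
                seen.add((y, x))
                while stack:
                    py, px = stack.pop()
                    pixels.append((py, px))
                    for ny, nx in ((py+1, px), (py-1, px), (py, px+1), (py, px-1)):
                        if 0 <= ny < h and 0 <= nx < w and img[ny][nx] \
                                and (ny, nx) not in seen:
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                if len(pixels) >= min_cluster:
                    ys = sum(p[0] for p in pixels) / len(pixels)
                    xs = sum(p[1] for p in pixels) / len(pixels)
                    clusters.append((xs, ys))
    return clusters
```

With four indicators near the corners of the desired bounding box, the four returned centroids are the corner estimates the computing element would use.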
Upon the calculation of the bounding box, the computing element (120) may control the projector (110) to display the image received from the image sensor (130) with the bounding box overlaid or superimposed on the projected image. Also displayed may be text or other indicators of the size, aspect ratio, etc. of the bounding box. At this point the user may either change the details of the bounding box or accept them. For instance, the computing element (120) may display corner markers and allow the user to manipulate them to make adjustments to the bounding box's location or size. Once the bounding box has been accepted, the computing element (120) may communicate with the projector (110) and adjust the projected image until the projected image (115) fills the bounding box specified by the user.
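The size and aspect-ratio readout mentioned above could be derived from the four corners in a straightforward way. A minimal sketch, with hypothetical names; averaging opposite edges is one simple convention for a slightly non-rectangular box.

```python
# Hypothetical helper: report width, height, and aspect ratio of a
# bounding box given its four corners in screen coordinates.
import math

def box_stats(corners):
    """corners: [top-left, top-right, bottom-right, bottom-left] as (x, y)."""
    tl, tr, br, bl = corners
    dist = lambda a, b: math.hypot(a[0] - b[0], a[1] - b[1])
    width = (dist(tl, tr) + dist(bl, br)) / 2   # average of top/bottom edges
    height = (dist(tl, bl) + dist(tr, br)) / 2  # average of left/right edges
    return {"width": width, "height": height, "aspect": width / height}
```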
Turning now to
Once a bounding box has been defined, the computing element (120) may begin adjusting the images (115) projected by the image projectors (110) based on the specified bounding box until the desired image (155,
Turning now to
Turning now to
Referring now to
The position of these indicators is then captured (step 505) by an image sensing device and transmitted to a computing element. The computing element determines a desired bounding box from the positioning of the indicators relative to the field of view of the image sensing device. If the computing element includes a display screen or monitor, the computing element may display a representation of the corresponding bounding box, in some cases, relative to the image to be projected or a representation thereof. In other embodiments, particularly where the indicators used are dots or points of light, the computing element may control the associated projector to include a corresponding bounding box in the image projected (step 510). In either case, the computing element may also display other information in textual or other form about the bounding box, for example, the aspect ratio (step 510).
The user may then adjust (step 515) the bounding box by manipulating an image displayed on a computer monitor or projected on the display surface, for example. The projector or projectors may then be activated and project an image towards the display surface (step 520). The image displayed (step 525) may be corrected for keystoning and other distortions before being projected so as to appear in the defined bounding box.
In some embodiments the image may be a tiled image in which a plurality of projectors are projecting a portion of the total image. This process may be iterated if desired, in order to improve or change the aspect ratio or size.
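One way a rectangular bounding box might be divided among several projectors for the tiled case described above is sketched below. The row-of-tiles layout and the `overlap` parameter (extra shared width for edge blending) are assumptions for illustration, not details from the specification.

```python
# Sketch: split a bounding box (x, y, width, height) into one sub-rectangle
# per projector, side by side, with an optional horizontal overlap region.

def tile_bounding_box(x, y, width, height, n_projectors, overlap=0.0):
    """Return one (x, y, w, h) tile per projector, left to right."""
    tile_w = (width + overlap * (n_projectors - 1)) / n_projectors
    tiles = []
    for i in range(n_projectors):
        tx = x + i * (tile_w - overlap)  # each tile starts where overlap begins
        tiles.append((tx, y, tile_w, height))
    return tiles
```

With `overlap=0` the tiles abut exactly; with a positive overlap, adjacent tiles share a strip in which the images could be blended.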
Referring now to
In another example, there may be fiducial marks on the display surface indicating the desired projection area within the display surface. In some examples, the fiducial marks may not be actual marks on the display surface, but may instead be projected onto the display surface by the projector system. In that case, the user may be able to reposition the projected fiducials using the interface or controls of the projector system to define the desired bounding box.
Next, the computing element will calculate (step 610) a bounding box based on the detected boundaries or fiducial marks of the display surface. In some embodiments, this bounding box may be substantially the same size and geometry as the display surface.
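The bounding-box calculation of step 610 could, in a simple case, reduce to taking the extent of the detected boundary or fiducial points. The sketch below is a hypothetical simplification (axis-aligned extent in sensor coordinates); a real system might instead fit a quadrilateral to the detected edges.

```python
# Sketch of step 610 in a simplified form: the axis-aligned bounding box
# (x, y, width, height) of the detected boundary/fiducial points.

def bounding_box(points):
    """points: iterable of (x, y) detections on the display surface."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (min(xs), min(ys), max(xs) - min(xs), max(ys) - min(ys))
```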
The bounding box's information is then displayed (step 615) for the user. For example, the image captured by the image sensing device may be provided to the user on a computer monitor, with a bounding box overlaid on the image to be projected as shown in
As mentioned above,
Referring now to
The captured image is then provided (step 805) to a computing element. With the computing element, the user may then define (step 810) the remaining sides of the bounding box, and adjust (step 815) the bounding box to the desired proportions and size relative to the display surface. In other embodiments, the computing element may automatically produce a bounding box based on the two edges of the image. In either case, the system then prepares (step 820) to project an image within the defined bounding box, and the image is then projected (step 825) in the defined bounding box.
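Completing the remaining sides of the bounding box from two captured edges (step 810) admits a simple geometric sketch: if the two edges share a corner, the fourth corner of a parallelogram-shaped box follows by vector addition. The function and its argument names are illustrative assumptions, not from the specification.

```python
# Hedged sketch of step 810: given the shared corner of two adjacent
# captured edges and the far end of each edge, compute the fourth corner
# of the parallelogram that completes the bounding box.

def complete_box(corner, end_a, end_b):
    """corner: shared corner; end_a/end_b: far ends of the two captured edges."""
    cx, cy = corner
    fourth = (end_a[0] + end_b[0] - cx, end_a[1] + end_b[1] - cy)
    return [corner, end_a, fourth, end_b]
```

The user (or the computing element, in the automatic variant described above) could then scale this box to the desired proportions, per step 815.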
Turning now to
In one embodiment, illustrated in
The preceding description has been presented only to illustrate and describe embodiments and examples of the principles described. This description is not intended to be exhaustive or to limit these principles to any precise form disclosed. Many modifications and variations are possible in light of the above teaching.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US5574835||Nov 2, 1995||Nov 12, 1996||Silicon Engines, Inc.||Bounding box and projections detection of hidden polygons in three-dimensional spatial databases|
|US6753907||Nov 14, 2000||Jun 22, 2004||Justsystem Corporation||Method and apparatus for automatic keystone correction|
|US6791542||Jun 17, 2002||Sep 14, 2004||Mitsubishi Electric Research Laboratories, Inc.||Modeling 3D objects with opacity hulls|
|US7119833||Dec 3, 2003||Oct 10, 2006||University Of Kentucky Research Foundation||Monitoring and correction of geometric distortion in projected displays|
|US7125122||Feb 2, 2004||Oct 24, 2006||Sharp Laboratories Of America, Inc.||Projection system with corrective image transformation|
|US20030231173||Jun 17, 2002||Dec 18, 2003||Wojciech Matusik||Modeling 3D objects with opacity hulls|
|US20040091084||Oct 30, 2003||May 13, 2004||Griffith Lionell K.||3D projection method|
|US20040101191||Nov 14, 2003||May 27, 2004||Michael Seul||Analysis, secure access to, and transmission of array images|
|US20050063582||Aug 27, 2004||Mar 24, 2005||Samsung Electronics Co., Ltd.||Method and apparatus for image-based photorealistic 3D face modeling|
|US20050184958 *||Mar 29, 2005||Aug 25, 2005||Sakunthala Gnanamgari||Method for interactive user control of displayed information by registering users|
|US20060098167 *||Nov 9, 2005||May 11, 2006||Casio Computer Co., Ltd.||Projector device, projecting method and recording medium in which projection control program is recorded|
|US20070040800 *||Aug 18, 2005||Feb 22, 2007||Forlines Clifton L||Method for stabilizing and precisely locating pointers generated by handheld direct pointing devices|
|US20070242233 *||Apr 13, 2006||Oct 18, 2007||Nokia Corporation||Relating to image projecting|
|GB2399631A||Title not available|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US8274613 *||Aug 27, 2009||Sep 25, 2012||Seiko Epson Corporation||Display masks for display and calibration in projector-based display systems|
|US20110050873 *||Aug 27, 2009||Mar 3, 2011||Steve Nelson||Display Masks for Display and Calibration in Projector-Based Display Systems|
|U.S. Classification||353/30, 353/94, 353/28, 353/42|
|Cooperative Classification||H04N9/3147, H04N9/3194, G03B21/14, G03B21/26, H04N9/3185|
|European Classification||G03B21/26, G03B21/14|
|Oct 9, 2007||AS||Assignment|
Owner name: HEWLETT-PACKARD DEVELOPMENT COMPANY, L.P., TEXAS
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AUFRANC, RICHARD;STEPHENS, BRUCE A.;WAY, OLAN;REEL/FRAME:019996/0781;SIGNING DATES FROM 20071005 TO 20071006
|Feb 26, 2015||FPAY||Fee payment|
Year of fee payment: 4