|Publication number||US6857746 B2|
|Application number||US 10/430,977|
|Publication date||Feb 22, 2005|
|Filing date||May 7, 2003|
|Priority date||Jul 1, 2002|
|Also published as||CA2490795A1, EP1550103A2, EP1550103A4, US20040001182, WO2004003880A2, WO2004003880A3, WO2004003880A8|
|Publication number||10430977, 430977, US 6857746 B2, US 6857746B2, US-B2-6857746, US6857746 B2, US6857746B2|
|Inventors||Chad D. Dyner|
|Original Assignee||Io2 Technology, Llc|
This invention is described in my U.S. Provisional Application No. 60/392,856 filed on Jul. 1, 2002.
This invention relates to augmented reality input/output interfaces involving free-space imaging displays, environments, simulation, and interaction.
Current technologies attempt to create the visual perception of a free-floating image through the manipulation of depth cues generated from two-dimensional data employing well-established techniques. A few examples of these include stereoscopic imaging via shutter or polarized glasses, as well as auto-stereoscopic technologies composed of lenticular screens directing light from a conventional display, or real-imaging devices utilizing concave mirror arrangements. All of these technologies suffer convergence and accommodation limitations, a consequence of the disparity between the original two-dimensional image-generating data and its perceived spatial location, resulting in user eyestrain and fatigue due to the difficulty of focusing on an image that does not truly exist where it is perceived to be.
In order to resolve this visual limitation, the image and its perceived location must coincide spatially. A well-established method satisfying this constraint is projection onto an invisible surface, which inherently possesses a true spatially perceived image location; yet prior art methods rendered poor image fidelity. Projection onto non-solid screens was first suggested in 1899 by Just, in U.S. Pat. No. 620,592, where an image was projected onto a simple water screen, known in the art as fog screen projection. Since then, general advancements to image quality have depended solely on improving the laminar quality of the screen, which correlates directly with image quality. As such, prior art methodologies limit the crispness, clarity, and spatial image stability based solely on the dynamic properties of the screen, which intrinsically produce a relatively spatially unstable image. Minor screen fluctuations further compound image distortion. Image fidelity was further compromised, and image aberrations amplified, by the easily discernible screen detracting from the intended objective of free-space imaging. Advancements in this invention allow the device to be self-sustaining, overcome prior art limitations of image stability and fidelity, improve viewing angles, and incorporate additional interactive capabilities.
One of the main disadvantages found in prior art was the reliance on a supply of screen-generating material. These devices depended on either a refillable storage tank for the screen-generating material, or had to be positioned in or around a large body of water, such as a lake, in order to operate. Operation in a closed environment such as a room was therefore time-limited, requiring refilling or a plumbing connection for constant operation. This dependence severely limited the ease of operation, portability, and placement of the device. Furthermore, some fog screen projection systems changed the operating environment by over-saturating the surrounding ambient air with particulates, such as humidity or other ejected gases. The constant stream of ejected material created a dangerous environment, capable of short-circuiting electronics as well as producing a potential health hazard of mold build-up in a closed space, such as a room. The dehumidification process disclosed both in Kataoka's U.S. Pat. No. 5,270,752 and in Ismo Rakkolainen's WAVE white paper was not employed to collect moisture for generating the projection surface screen, but rather to increase laminar performance as a separate detached aspirator. The present invention employs a condensate extraction method specifically to serve as a self-sustained particle cloud manufacturing and delivery system.
Furthermore in prior art, while the projection surface can be optimized for uniformity, thickness, and planarity by improving laminar performance, the inherent nature of a dynamic system's natural tendency towards turbulence will ultimately affect the overall imaging clarity or crispness and image spatial stability, such as image fluttering. These slight changes, caused by common fluctuating air currents and other environmental conditions found in most indoor and outdoor environments, induce an unstable screen, thereby affecting the image. Prior art attempted to solve these image degradation and stability issues by relying on screen refinements to prevent the transition of laminar to turbulent flow. Kataoka's U.S. Pat. No. 5,270,752 included improvements to minimize boundary layer friction between the screen and surrounding air by implementing protective air curtains, thereby increasing the ejected distance of the screen size while maintaining a relatively homogeneous laminar thin screen depth and uniform particulate density for a stable image. While a relatively laminar screen can be achieved using existing methodologies, generating a spatially stable and clear image is limited by depending solely on improvements to the screen. Unlike projecting onto a conventional physical screen with a single first reflection surface, the virtual projection screen medium invariably exhibits thickness, and consequently any projection imaged is visible throughout the depth of the medium. As such, the image is viewed most clearly when directly in front, on-axis. This is because the identical images stacked through the depth of the screen are aligned directly behind one another and on-axis with respect to the viewer. While the image is most clearly observed on-axis, it suffers a significant viewing limitation on a low-particulate (low-density) screen.
Generating a highly visible image on an invisible to near-invisible screen required high-intensity illumination to compensate for the low transmissivity and reflectivity of the screen cloud, forcing the viewer to look directly into the bright projection source. While in a high particulate count (high-density) particle cloud scenario a lower-intensity illumination can compensate for the high reflectivity of the screen, this invariably causes the screen to become visibly distracting, as well as requiring a larger and more powerful system to collect the greater amount of airborne particulates.
Additional advancements described in this invention automatically monitor changing environmental conditions such as humidity and ambient temperature to adjust cloud density, microenvironment and projection parameters in order to minimize the visibility of the particle cloud screen. This invention improves invisibility of the screen and image contrast in the multisource embodiment by projecting multiple beams at the image location to maximize illumination intensity and minimize the individual illumination source intensities.
Prior art also created a limited clear viewing zone, on or near on-axis. The projection source fan angle generates an increasingly off-axis projection towards the edges of the image, where fidelity falls off because the front surface of the medium images a slightly offset picture throughout the depth of the medium with respect to the viewer's line of sight. Since the picture is imaged through the depth of the screen, the viewer not only sees the intended front-surface image, as on a conventional screen, but all the unintended illuminated particulates throughout the depth of the screen, resulting in an undefined and blurry image. In this invention, a multisource projection system provides continuous on-axis illumination, visually stabilizing the image and minimizing image flutter.
This invention does not suffer from any of these aforementioned limitations, by incorporating a self-sustaining particle cloud manufacturing process, significant advances to imaging projection, advances to the microenvironment improving image fidelity, and additional interactive capabilities.
This invention provides a method and apparatus for generating true high-fidelity, full color, high-resolution free-space video or still images with interactive capabilities. The composed video or still images are clear, have a wide viewing angle, possess additional user input interactive capabilities, and can render discrete images, each viewed from separate locations surrounding the device. None of these attributes is possible with present augmented reality devices, existing fog screen projections, or current displays, nor are they disclosed in prior art.
The system comprises a self-generating means for creating a dynamic, invisible or near-invisible, non-solid particle cloud, by collecting and subsequently ejecting condensate present in the surrounding air, in a controlled atomized fashion, into a laminar, semi-laminar or turbulent particle cloud. A projection system consisting of an image generating means and projection optics projects an image or images onto said particle cloud. The instant invention projects still images or dynamic images, text or information data onto an invisible to near-invisible particle cloud screen surface. The particle cloud exhibits reflective, refractive and transmissive properties for imaging purposes when a directed energy source illuminates the particle cloud. A projection system comprising single or multiple projection sources illuminates the particle cloud in a controlled manner, in which the particulates or elements of the particle cloud act as a medium where the controlled focus and intersection of light generate a visible three-dimensional spatially addressable free-space illumination where the image is composed.
Furthermore, any physical intrusion, occurring spatially within the particle cloud image region, is captured by a detection system and the intrusion such as a finger movement, enables information or image to be updated and interacted with in real-time. This input/output (I/O) interface provides a novel display and computer interface, permitting the user to select, translate and manipulate free-space floating visual information beyond the physical constraints of the device creating the image. This invention provides a novel augmented reality platform for displaying information coexisting spatially as an overlay within the real physical world. The interactive non-solid free floating characteristics of the image allow the display space to be physically penetrable for efficient concurrent use between physical and ‘virtual’ activities in multi-tasking scenarios including collaborative environments for military planning, conferencing, and video gaming, as well as presentation displays for advertising and point-of-sales presentations.
The invention comprises significant improvements over existing non-physical screens to display clear images, independent of the purely laminar screen found in the prior art, by functioning with non-laminar, semi-laminar and turbulent particle clouds. Novel advancements to the microenvironment deployment method, by means of a multiple-stage equalization chamber and baffles, generate an even laminar airflow, reducing pressure gradients and boundary layer friction between the particle cloud and the surrounding air. Furthermore, the electronic environmental management control (EMC) attenuates particle cloud density by controlling the amount of particulates generated and ejected in conjunction with the particle cloud exit velocity, thereby ensuring an invisible to near-invisible screen. This delicate balance of particle cloud density and illumination intensity was not possible in the prior art, and therefore the cloud was either highly visible or of too low a density to generate a bright image. Further advancements to the projection system improve upon viewing angle limitations inherent in prior art, as well as image fluttering caused by turbulence within the screen. Furthermore, the invention's self-contained and self-sustaining system is capable of producing a constant stream of cloud particles by condensing moisture from the surrounding air, thereby allowing the system to operate independently without affecting the general operating environment. Furthermore, the invention incorporates interactive capabilities, absent in prior art.
The multiple projection source of this invention has the capacity to produce multi-imaging, whereby discrete images projected from various sources can each be viewed from different locations. This allows a separate image to be generated and viewed independently from the front and rear of the display, for use, for example, in video-gaming scenarios, where opposing players observe their separate "points of view" while still being able to observe their opponent through the image. In addition, the multisource projection redundancy prevents the occlusion found in the prior art, where a person standing between the projection source and the screen blocks the image from being displayed.
By projecting from solely one side, the display can also serve as a one-way privacy display, where the image is visible from one side and mostly transparent from the other, something not possible with conventional displays such as television, plasma, or computer CRT and LCD monitors. Varying the projected illumination intensity and cloud density can further attenuate the image transparency and opacity, a function not possible with existing displays. Furthermore, since the image is not contained within a "physical box" comprising a front, flat physical screen, as in a conventional display, the image is capable of taking on numerous geometries that are not limited to a flat plane. Furthermore, the dimensions of the image can be substantially larger than the dimensions of the device creating the image, since the image is not constrained to a physical enclosure such as a conventional LCD or CRT. The display can also take on varying geometric shapes, generating particle cloud surfaces other than a flat plane, such as cylindrical or curved surfaces. For these particle cloud types, adaptive or corrective optics compensate for variable focal distances of the projection.
Applications for this technology are wide-ranging, since the displayed image is non-physical and therefore unobtrusive. Imaged information can be displayed in the center of a room, where people or objects can move through the image, for use in teleconferencing, or can be employed as a 'virtual' heads-up display in a medical operating theater, without interfering with surgery. The system of this invention not only frees up space where a conventional display might be placed, but, due to its variable opacity and multi-viewing capability, allows multiple parties to gather around the device to freely view, discuss and interact collaboratively with the image and each other. The device can be hung from the ceiling, placed on walls or on the floor, or concealed within furniture such as a desk, and can project images from all directions, allowing the image to be retracted when not in use. A scaled-down version allows portable devices such as PDA's and cell phones to have 'virtual' large displays and an interactive interface in a physically small enclosure.
The basic elements of the invention are illustrated in the
Signals originating from an external source (12), a VCR, DVD, video game, computer or other video source, pass through optional scan converter (38), to processing unit (6), to decode the incoming video signal. Stored video data (13), contained for example on a hard disk, flash memory, optical, or alternate storage means, can be employed as the source of content. The processing unit (6), receives these signals, interprets them and sends instructions to graphics board (7), which generates video signal (8), which is sent to an image generating means (9), producing a still or video image. The image generator (9), comprises a means of displaying still or video data for projection, which may be a liquid crystal display, (LCD), digital light processing unit (DLP), organic light emitting diodes (OLED's) or a laser based means of directing or modulating light from any illumination source used to generate a still or video image. Single image delivery optics (10), comprising telecentric projection optics, may include adaptive anamorphic optics for focusing onto non-linear screens, such as curved surface screens. Components (38, 6, 7, 8, 9, 10) may also be replaced by a video projector in a simplified embodiment. Anamorphic optics and digital keystone correction are also employed to compensate for off-axis projection onto non-parallel surfaces.
In the preferred multisource embodiment, a single projection source (9) includes a multi-delivery optical path (20), comprising a series of lenses, prisms, beamsplitters, mirrors, as well as other optical elements required to split the generated image to “phantom” source locations surrounding the perimeter of the device and redirect the projection beam onto particle cloud (5). In an alternate multi-image generation embodiment, multiple images are generated on either a single image generator, such as one projection unit or a plurality of them (19), and are directed, using a single optical delivery path (10), or multiple delivery paths using multi-delivery optics (20), splitting and recombining the projection. Optical or software based means, well known in the art, or a combination of both means are employed to compensate and correct image focus caused from off-axis projection including image trapezoidal keystone correction for one or more axis (i.e. 4 point keystoning). In all instances, the directed projection illuminates particle cloud (5), where free-space image (11) appears to be floating in protective microenvironment (37) within the surrounding air (21). Microenvironment (37) functions to increase boundary layer performance between the particle cloud and the ambient surrounding air by creating a protective air current of similar ejection velocity to that of particle cloud (5). This microenvironment (37), and particle cloud (5) characteristics can be continuously optimized to compensate for changing environmental conditions, in order to minimize cloud visibility, discussed in further detail below.
In the interactive embodiment, coexisting spatially with image (11) is an input detectable space (39), allowing the image to serve as an input/output (I/O) device. Physical intrusion within the input detectable space (39) of particle cloud (5), such as a user's finger, a stylus or another foreign object, is recognized as an input instruction (14). The input is registered when an illumination source (16), comprised of a specific wavelength, such as infrared (IR) source, is directed towards the detectable space highlighting the intrusion. Illumination comprises a means to reflect light off a physical object within a defined detectable region by utilizing a laser line stripe, IR LED's, or conventional lamp or can include the same illumination source from the image projection illuminating the detectable space. In its preferred embodiment, reflected light scattered off the user's finger or other input means (14) is captured by optical sensor (15). Optical sensor or detector (15) may include a charge-coupled device (CCD), complementary metal-oxide silicon (CMOS) sensor or a similar type of detector or sensor capable of capturing image data.
Sensor (15) is capable of filtering unwanted 'noise' by operating at a limited or optimized sensitivity response similar or equal to the illumination source (16) wavelength, either by employing a specific bandwidth sensor, utilizing band-pass filters, or a combination of both. Light beyond the frequency response bandwidth of the sensor is ignored or minimized, diminishing background interference and recognizing only intentional input (14). The coordinate in space where the intrusion is lit by the illumination source corresponds to an analogous two- or three-dimensional location within a computer environment, such as in a graphic user interface (GUI), where the intrusion input (14) functions as a mouse cursor, analogous to a virtual touch-screen. The sensor-captured coordinates of the highlighted intrusion are sent to controller (17), which reads and interprets the highlighted input data using blob recognition or gesture recognition software at processing unit (6) or controller (17). Tracking software, coupled for example with mouse emulation software, instructs the operating system or application running on processing unit (6) to update the image accordingly in the GUI. Other detection system variations comprise the use of ultrasonic detection, proximity-based detection or radar-based detection, all capable of sensing positional and translational information.
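The detection pipeline described above — threshold the band-pass-filtered sensor frame, locate the highlighted intrusion, and map it to a cursor position — can be sketched as follows. This is a minimal illustrative model, not the patent's implementation; the threshold value, frame size, and function names are assumptions.

```python
import numpy as np

def detect_input(frame, threshold=200):
    """Locate a highlighted intrusion (e.g. a fingertip lit by the IR
    source) in a sensor frame and return its centroid, or None.

    frame: 2-D array of sensor intensities; values above `threshold`
    stand in for IR reflection off the intruding object, a crude
    proxy for the band-pass filtering described in the text.
    """
    mask = frame > threshold
    if not mask.any():
        return None                       # no intrusion in the frame
    ys, xs = np.nonzero(mask)
    return float(xs.mean()), float(ys.mean())  # (x, y) blob centroid

def to_cursor(point, frame_shape, screen_wh):
    """Map a sensor-frame coordinate to a GUI cursor position."""
    x, y = point
    h, w = frame_shape
    sw, sh = screen_wh
    return int(x / w * sw), int(y / h * sh)

# A synthetic frame with one bright blob standing in for a fingertip.
frame = np.zeros((120, 160))
frame[40:44, 80:84] = 255
pt = detect_input(frame)
print(to_cursor(pt, frame.shape, (1024, 768)))
```

A real system would add temporal smoothing and gesture classification on top of this per-frame centroid, as the text's mention of tracking and gesture recognition software suggests.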
In its preferred embodiment, this invention operates solely on a power source, independent of a water source, by producing its own particle cloud material. By passing the surrounding air through a heat pump, air is cooled and drops below its dew point, where condensate can be removed and collected for the cloud material. One method well known in the art comprises a dehumidification process by which a compressor propels coolant through an evaporator coil, dropping the temperature of the coils or fins and allowing moisture in the air to condense while the condenser expels heat. Another variation includes the use of a series of solid-state Peltier TEC modules, such as a sandwich of two ceramic plates with an array of small Bismuth Telluride (Bi2Te3) "couples" in between, which produce condensation that can be collected on the cold side. Other variations include extracting elements from the ambient air, such as nitrogen or oxygen, as well as other gases, to manufacture supercooled gases or liquids by expansion, and as a result create the thermal gap to generate the condensate cloud material. Another method includes electrochemical energy conversion, such as is employed in fuel cell technology, consisting of two electrodes sandwiched around an electrolyte in which water and electricity are produced. Oxygen passing over one electrode and hydrogen over the other generates electricity to run the device, water for the cloud material, and heat as a by-product.
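The condensate-harvesting step above hinges on cooling air below its dew point. A quick way to see how far the heat pump must cool the air is the Magnus approximation for dew point, shown below; the formula and coefficients are standard meteorological values, not taken from the patent.

```python
import math

def dew_point_c(temp_c, rel_humidity):
    """Approximate dew point in Celsius via the Magnus formula.

    temp_c: ambient temperature in Celsius
    rel_humidity: relative humidity as a fraction (0..1)
    """
    a, b = 17.625, 243.04  # standard Magnus coefficients
    gamma = math.log(rel_humidity) + a * temp_c / (b + temp_c)
    return b * gamma / (a - gamma)

# Room air at 25 C and 50% RH condenses on surfaces below ~13.9 C,
# so the evaporator coil or TEC cold plate must run cooler than that.
print(round(dew_point_c(25.0, 0.50), 1))
```

This also shows why the EMC monitors ambient humidity: drier air pushes the dew point down, demanding more cooling power for the same condensate yield.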
The particle cloud composition consists of a vast number of individual condensate spheres held together by surface tension, with a mean diameter in the one to ten micron region, too small to be visible individually by a viewer, yet large enough to provide an illuminated cloud for imaging. The focus and controlled illumination intensity onto the overall cloud allow the individual spheres to act as lenses, transmitting and focusing light at highest intensity on-axis, whereby the viewer positioned directly in front of both screen and projection source views the image at its brightest and clearest. In the multisource embodiment, the directing of light from multiple sources onto the particle cloud ensures that a clear image is viewable from all around, providing continuous on-axis viewing. The on-axis imaging transmissivity of the cloud screen, coupled with the multisource projection, ensures a clear image regardless of the viewer's position and compensates for any aberration caused by turbulent breakdown of the cloud. Intersecting light rays from multiple sources further maximize illumination at the intended image location by localizing the sum of illumination from each projection source striking the particle cloud imaging location. In this way, illumination falloff onto unintended surfaces beyond the particle cloud is minimized; in prior art, the majority of the light passed through the screen and created a brighter picture on a surface beyond, rather than on the intended particle cloud. Similarly, multisource projection further minimizes the individual projection source luminosity, allowing the viewer to look directly on-axis without being inundated by a single high-intensity projection source, as found in the prior art.
In an alternate embodiment, the particle cloud material can include fluorescence emissive additives or doped solutions, creating an up- or down-fluorescence conversion with a specific excitation source, utilizing a non-visible illumination source to generate a visible image. Soluble non-toxic additives injected into the cloud stream at any point in the process can include, for example, Rhodamine or other xanthene tracer dyes, each with specific absorption spectra excited by a cathode, laser, visible, ultraviolet (UV) or IR stimulation source. A tri-mixture of red, green and blue visibly emissive dyes, each excited by a specific wavelength, generates a visible full-spectrum image. These additives have low absorption delay times and fluorescence lifetimes in the nanosecond to microsecond region, preventing a blurry image from the dynamically moving screen and generating a high fluorescence yield for satisfactory imaging luminosity. An integrated or separate aspirator module collects the additives from the air and prevents these additive dyes from scattering into the surrounding air.
In prior art, lenticular screens have been utilized to selectively direct a predefined image by means of a lens screen so that a particular eye or position of the viewer will render a discrete image. Similarly, when this invention's particle cloud screen is illuminated at an intensity level below that at which internal refraction and reflection occur within each sphere, producing scattered diffused light rays, the individual particle spheres act as small lenslets exhibiting optical characteristics similar to lenticular imaging, allowing the cloud to perform as a lenticular imaging system. This concept is further explained in
On-axis illumination intensity is determined by source intensity and the depth of the cloud which is represented in polar diagram
Maximizing condensate collection is critical, as condensation is a power-intensive process. Increasing airflow and maximizing the surface area of the evaporator are essential for ensuring constant operation and minimizing overload on the heat exchanger, TEC's or compressor. In a solid-state TEC embodiment, compressor (45) would be absent and evaporator (33) and condenser (41) would be replaced by the hot and cold sides of a TEC module, with appropriate heat sinks to collect moisture on the cold side and draw heat on the other side. Due to the time lag before condensate formation, vessel (43) allows the device to run for a duration while condensate is formed and collected. The stored condensate travels beyond check valve (51), controlling the appropriate quantity via sensor or switch (55), and enters nebulizing expansion chamber (52) for use in the particle cloud manufacturing process.
In the preferred embodiment, expansion chamber (52) employs electro-mechanical atomizing to vibrate a piezoelectric disk or transducer (53), oscillating ultrasonically and atomizing the condensate, generating a fine cloud mist of microscopic particulates for subsequent deployment. Alternate cloud mist generating techniques can be employed, including thermal foggers, thermal cooling using cryogenics, spray or atomizing nozzles, or additional means of producing a fine mist. The design of the chamber prevents larger particles from leaving expansion chamber (52), while allowing the mist to form within expansion chamber (52). A level sensor (55), such as a mechanical float switch or optical sensor, maintains a specific fluid level within expansion chamber (52) to keep the particulate production regulated. When the fluid surface (54) drops, valve (51) opens, thereby maintaining a predefined depth for optimized nebulization.
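The float-switch regulation of the fluid surface described above is essentially a hysteresis controller: open the valve when the level drops, close it once the optimal nebulization depth is restored. A minimal sketch, with illustrative setpoints not taken from the patent:

```python
def valve_command(level_mm, open_below=18.0, close_at=20.0, valve_open=False):
    """Hysteresis control for check valve (51): open when the fluid
    surface (54) drops below `open_below`, close once it refills to
    `close_at`. The two setpoints are hypothetical example values.

    level_mm: current fluid depth reported by level sensor (55)
    valve_open: the valve's current state, held inside the dead band
    """
    if level_mm < open_below:
        return True                 # too shallow: admit condensate
    if level_mm >= close_at:
        return False                # optimal depth reached: shut off
    return valve_open               # in the dead band, hold state

print(valve_command(17.0))                    # below low setpoint: open
print(valve_command(19.0, valve_open=True))   # still filling: stay open
print(valve_command(20.5, valve_open=True))   # target depth: close
```

The dead band between the two setpoints prevents the valve from chattering as nebulization continuously draws the level down.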
Fan or blower (56), injects air into chamber (52), mixing with the mist generated by nebulizer (53), and the air/mist mixture is ejected through center nozzle (57) at a velocity determined by the height required for creating particle cloud (58). Furthermore, nozzle (57) can comprise a tapered geometry so as to prevent fluid buildup at the lip of nozzle (57). Ejection nozzle (57) may have numerous different shapes, such as curved or cylindrical surfaces, to create numerous extruded particle cloud possibilities. Particle cloud (58) comprises a laminar, semi-laminar or turbulent flow for deployment as the particle cloud screen for imaging.
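Since the text ties the ejection velocity at nozzle (57) to the height the particle cloud must reach, a first-cut ballistic estimate (neglecting drag, buoyancy, and entrainment, all of which matter in practice) gives a feel for the magnitudes involved. None of these numbers appear in the patent.

```python
import math

def ejection_velocity(height_m, g=9.81):
    """Rough nozzle exit velocity for the stream to coast up to
    `height_m`, treating the ejected air/mist column ballistically:
    v = sqrt(2 * g * h). A sizing aid only; real flows lose momentum
    to drag and entrainment and would need a higher exit velocity.
    """
    return math.sqrt(2 * g * height_m)

# For an illustrative 0.5 m tall particle cloud screen:
print(round(ejection_velocity(0.5), 2), "m/s")
```

The EMC would then trade this velocity off against the Reynolds-number constraint discussed below, since a faster stream transitions to turbulence sooner.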
Fans (59 and 60) draw ambient air, or expelled air from the heat exchanger, through vents (61 and 88), comprising baffles, or vents, to produce a laminar protective air microenvironment (62, 63) enveloping cloud screen (58). For laminar particle cloud screen (58), this microenvironment improves boundary layer performance by decreasing boundary layer friction and improving the laminar quality of screen (58) for imaging.
It is important to note that in the prior art, the Reynolds number was the determining factor for image quality and maximum size, but because this invention integrates multisource projection, the reliance on laminar quality is diminished. The Reynolds number R = ρVD/μ, where μ is the viscosity, V the velocity, ρ the density and D the thickness of the stream, determines whether the stream is laminar or turbulent, and this transition point was the limiting factor in the prior art. Furthermore, the EMC continuously modifies the microenvironment and particle cloud ejection velocity to compensate for changes in particle cloud density in order to minimize the visibility of the cloud. The change in particle cloud density directly affects the viscosity of the cloud, and therefore the ejection velocities must change accordingly to maximize laminar flow.
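The Reynolds-number check above is straightforward to compute. The sketch below uses textbook properties of air and an illustrative stream geometry; the transition threshold of 2300 is the classic pipe-flow figure and is used here only as an example, since the actual transition point depends on the nozzle geometry and the microenvironment.

```python
def reynolds(density, velocity, thickness, viscosity):
    """Reynolds number R = rho * V * D / mu for the ejected stream."""
    return density * velocity * thickness / viscosity

def is_laminar(r, transition=2300.0):
    """Below the (geometry-dependent) transition value, flow is
    laminar. 2300 is the classic pipe-flow figure, illustrative only.
    """
    return r < transition

# Humid air near 20 C: rho ~ 1.2 kg/m^3, mu ~ 1.8e-5 Pa*s, with a
# 10 mm thick stream ejected at 1 m/s (all illustrative values).
r = reynolds(1.2, 1.0, 0.01, 1.8e-5)
print(round(r), is_laminar(r))
```

This also illustrates the EMC coupling described in the text: raising cloud density alters the effective viscosity and density terms, so the ejection velocity must be retuned to keep R below the transition value.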
The ejected particle cloud continues on trajectory (64) along a straight path, producing the particle cloud surface or volume for imaging, and eventually disperses at (85), where it is no longer used for imaging purposes. Particulates of the screen at (58) return to device (84) to create a continuous-loop system. The particle cloud's moisture-laden air returns back into device (84), not impacting the moisture level in the room where the device is operating. The density of the cloud is continuously monitored for its invisibility by onboard environmental diagnostics management control EMC (66), which monitors ambient parameters including, but not limited to, humidity, temperature and ambient luminosity, which factors are collected by a plurality of sensors (65). Sensors (65) can comprise, for example, a photodiode or photo-sensor, temperature, barometric, as well as other climatic sensors to collect data. Sensor information is interpreted by diagnostics management control (66), which adjusts the density of screen (58) by optimizing the intensity of particle cloud manufacturing at (53), and the luminosity of projection from source (69), with respect to ambient humidity and ambient luminosity to control the invisibility of the cloud screen (58). A photo-emitter placed on one side of the particle cloud and a photo-detector on the opposite side can be employed to calculate the visibility of the cloud by monitoring the amount of light passing from emitter to detector, thereby maximizing the invisibility of the cloud.
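One step of the EMC feedback loop just described — using the emitter-to-detector transmission reading to nudge the nebulizer drive toward a near-invisible cloud — might look like the following. The proportional form, target transmission, and gain are all illustrative assumptions, not values from the patent.

```python
def adjust_density(transmission, nebulizer_duty, target=0.97, gain=0.5):
    """One proportional step of the EMC density loop.

    transmission: fraction of photo-emitter light reaching the
        detector (0..1); high means a thin, near-invisible cloud,
        low means a dense, visible one.
    nebulizer_duty: current transducer (53) drive level, 0..1.

    If the cloud is denser than the invisibility target (transmission
    below `target`), back off particulate production; if it is thinner
    than needed, drive production up. Constants are hypothetical.
    """
    new = nebulizer_duty + gain * (transmission - target)
    return min(1.0, max(0.0, new))  # clamp to the valid drive range

print(adjust_density(0.90, 0.60))  # dense, visible cloud: back off
print(adjust_density(0.99, 0.60))  # thinner than target: drive up
```

A production controller would fold in the other sensor inputs the text lists (humidity, temperature, ambient luminosity) and co-adjust projection luminosity, but the feedback direction is the same.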
Images stored on an internal image or data storage device, such as programmable memory, CD, DVD, or computer (67), or an external computer, including ancillary external video sources such as TV, DVD, or videogame (68), produce the raw image data that is formed on an image generating means (70). Image generating means (70) may include an LCD display, acousto-optical scanner, rotating mirror assembly, laser scanner, or DLP micromirror to produce and direct an image through optical focusing assembly (71).
Illumination source (69), operating within an electromagnetic spectrum, such as a halogen bulb, xenon-arc lamp, UV or IR lamp, or LEDs, directs a beam of emissive energy consisting of monochromatic or polychromatic, coherent or non-coherent, visible or invisible illumination, ultimately toward cloud screen (58). In a substitute embodiment the illumination source consists of high-intensity LEDs, an RGB white-light laser, or a single coherent source, where image-generating means (70) operates above or below the visible spectrum. Light directed from illumination source (69) toward image generating means (70) passes through focusing optics (71), producing light rays (76) directed to an external location serving as a "phantom" delivery source location (77). Phantom source (77) may employ one or more optical elements, including a mirror or prism (83), to redirect or steer the projection (79, 80) toward particle cloud (58).
Collimating optics such as a parabolic mirror, lenses, prisms, or other optical elements may be employed at anamorphic correction optics (77 or 78) to compensate the projection for off-axis keystoning in one or more axes. Alternatively, electronic keystone correction may be applied to image generator (70). Anamorphic correction optics (78) may also include beam-splitting means for directing the light passing through the image generator to various sources, such as source (77) positioned at a location around the perimeter of cloud (58), keeping the beam collimated until it reaches source (77). Beam splitting can employ plate or cube beam-splitters, or rotating scanning mirrors with electronic shutters or optical choppers, dividing the original source projection into a plurality of projections. Projection beams (76) are steered toward a single phantom source or a plurality of phantom sources or locations surrounding cloud (58), redirecting light rays (79, 80) onto said cloud (58) for imaging. Imaging light rays (81, 82) traveling beyond particle cloud (58) continue to fall off and, owing to the limited depth-of-field range of optics (71, 78, 83), appear out of focus.
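The electronic keystone correction mentioned above can be sketched geometrically: a scanline projected farther from the lens spreads wider, so each scanline is pre-shrunk by the inverse of that spread. The throw distance, tilt angle, and frame height below are hypothetical illustrations; the patent does not give a correction formula.

```python
import math

# Sketch of per-scanline electronic keystone pre-correction for an
# off-axis projector: each row is pre-scaled horizontally by the ratio
# of the nominal throw to that row's actual throw, so the projected
# image lands rectangular. Geometry values are illustrative only.

def keystone_prescale(row, rows, tilt_deg, throw=1.0, height=0.5):
    """Return the horizontal pre-scale factor for one scanline.

    row 0 is the bottom of the frame; rows-1 is the top. Rows that land
    farther away (larger throw) get a factor < 1, nearer rows > 1.
    """
    tilt = math.radians(tilt_deg)
    # Vertical offset of this scanline relative to the optical axis.
    y = (row / (rows - 1) - 0.5) * height
    # Distance from the lens to where this scanline lands on the screen.
    distance = throw + y * math.tan(tilt)
    return throw / distance

# With zero tilt every scanline keeps unit scale.
assert abs(keystone_prescale(0, 480, 0.0) - 1.0) < 1e-9
```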
The detection system comprises illumination source (72) directing illumination beam (130) to produce a single (131) or dual (131, 132) stripe plane of light, in which an intrusion is captured by optical sensor (86) within the cone of vision of the sensor image boundary (133, 134) of cloud screen (58). Alternatively, two separate sources may be employed to generate the two planes, or the device may operate using exclusively one plane of light. When a foreign object penetrates the planar light source (131, 132) parallel to the image, the illumination reflects off the intrusion and is captured by optical sensor (86). Detected information is sent via signal (135) to computer (67), running the current software or operating system (OS), to update image generator (70) according to the input information. The device may also include audio feedback acknowledging a selection or interaction with the non-solid image, thereby providing the user feedback that the intangible image cannot supply haptically.
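The path from sensor (86) to image generator (70) can be sketched as a coordinate mapping: the sensor reports a reflected highlight in camera pixels, and mapping it into image coordinates lets the OS treat the intrusion like a pointer event. The calibration rectangle and linear mapping below are hypothetical illustrations, not specified in the patent.

```python
# Sketch of routing a detected intrusion to the image generator: a
# highlight at camera pixel (cx, cy) inside the calibrated region of the
# cloud is mapped linearly onto image coordinates. The calibration
# rectangle is a hypothetical illustration.

def camera_to_image(cx, cy, cam_rect, img_size):
    """Map a camera-pixel hit (cx, cy) inside the calibrated region
    (cam_rect = (x0, y0, x1, y1)) onto image coordinates (width, height)."""
    x0, y0, x1, y1 = cam_rect
    w, h = img_size
    u = (cx - x0) / (x1 - x0) * w
    v = (cy - y0) / (y1 - y0) * h
    return u, v

# A hit at the center of the calibrated region maps to the image center.
u, v = camera_to_image(320, 240, (160, 120, 480, 360), (1024, 768))
```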
In the preferred embodiment of the invention, the detection system utilizes optical machine-vision means to capture physical intrusion within the detectable perimeter of the image, but other detection methods may be employed. These include, for example, acoustic methods such as ultrasonic detectors and illumination-based methods such as IR detectors, used to locate and position physical objects, such as a hand or finger, for real-time tracking. The area in which the image is composed is monitored for any foreign physical intrusion, such as a finger, hand, pen, or other physical object such as a surgical knife. The detectable space corresponds directly to an overlaid area of the image, allowing the image, coupled with the detection system, to serve as an I/O interface that can be manipulated through the use of a computer. To diminish external detection interference, the preferred embodiment relies on an optical detector (86) operating at a narrow band within the invisible spectrum, minimizing captured ambient background light illuminating undesired background objects unrelated to the user input. The operating detection wavelength furthermore does not interfere with the imaging and remains unnoticed by the user. The preferred embodiment utilizes a narrow-bandwidth illumination source (72) beyond the visible spectrum, such as infrared (IR) or near-infrared (NIR) illumination, subsequently collimated into a beam. The beam generated by illumination source (72) is sent to one or a plurality of line-generating means, such as a line-generating cylindrical lens or rotating mirror, to produce a single or dual illuminated plane of light (73, 74) coexisting spatially parallel to, or on top of, the image on cloud (58). This interactive process is described more clearly below.
Signal (228) attenuates particle cloud manufacturing density (216) by controlling the amount of particulates generated, regulating the supply voltage or current to the ultrasonic atomizer. Similarly, signal (228) can vary the outlet opening through which particulates escape the expansion chamber, thereby controlling the amount of particulates (217) ejected into the cloud (221). Since the amount of particulates ejected is directly proportional to the viscosity, as defined in the Reynolds equation, modifying the particulate density (the amount of material in the air) requires a proportional change in both the particle cloud ejection velocity (218) and the microenvironment ejection velocity (219). Signal (228) controls this ejection velocity by varying fan speed, for example using pulse-width modulation, to alter the exit velocities of particle cloud (221) and microenvironment (220).
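The proportional coupling between density and the two exit velocities can be sketched as a single update rule. The base duty cycle and proportionality constant below are hypothetical; the patent states only that density and both velocities must change together.

```python
# Sketch of coupling particulate density to fan PWM duty: when the
# controller changes the density setpoint, both the particle cloud fan
# and the microenvironment fan are adjusted proportionally so the two
# flows stay matched. The constant k is a hypothetical illustration.

def flow_update(density_setpoint, base_duty=0.5, k=0.4):
    """Return (cloud_duty, micro_duty) PWM duty cycles in [0, 1] for a
    density setpoint in [0, 1]; both flows scale together so the
    microenvironment continues to shield the cloud."""
    duty = base_duty + k * (density_setpoint - 0.5)
    duty = min(1.0, max(0.0, duty))
    return duty, duty  # both fans track the same setpoint

cloud, micro = flow_update(0.75)
```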
Augmenting these detectors, or operating as a separate unit, a cloud visibility detector (224) comprises an illumination source (222), such as a photo-emitter or laser, and a corresponding photo-detector (223), such as a cadmium sulfide photocell. Detector (223) and illumination source (222), each disposed at opposite ends of the particle cloud, are arranged so that a known quantity of light from illumination source (222) passes through particle cloud (221) and is received by the opposite detector. The loss in signal strength, due to light reflected off particle cloud (221) and not received by detector (223), corresponds to the density and therefore the visibility of the cloud. This signal (225) can be sent to controller (214) to regulate density and velocity, modifying the visibility of cloud (221). Similarly, another method employs an airborne particulate counter (226) to acquire air-sample data within particle cloud (221), determining the particle count corresponding to the density or visibility of particle cloud (221). Particulate data (227) is sent to controller (214), which instructs (228) the adjustment of particle cloud manufacturing (216) and exit velocities (215) in the same way as the previously described methods.
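The emitter/detector arrangement can be sketched with a Beer-Lambert style attenuation model, where transmitted light decays exponentially with density and path length. The extinction coefficient and path length are hypothetical; the patent only specifies that signal loss tracks cloud density.

```python
import math

# Sketch of inferring relative cloud density from the fraction of
# emitter light that reaches the opposite photo-detector. Assumes a
# Beer-Lambert attenuation model with illustrative constants.

def cloud_density_from_signal(emitted, received, path_len=0.3, k=8.0):
    """Estimate relative particle density (dimensionless) from the
    transmitted fraction of light crossing the cloud."""
    transmittance = received / emitted
    # Beer-Lambert: T = exp(-k * density * L)  =>  density = -ln(T) / (k * L)
    return -math.log(transmittance) / (k * path_len)

# Full transmission implies zero density (a fully invisible cloud).
assert abs(cloud_density_from_signal(1.0, 1.0)) < 1e-12
```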
The detection system is isolated for clearer explanation in
An illumination source (167) with a spectral output matched to the frequency response of the detector, such as an IR laser, projects a beam through a line generator and collimator (166); the beam reflects off beam splitter (176) toward mirror (165) and mirror (108), forming two separate IR light planes (109 and 177). Line-generating techniques well known in the art for creating a plane of light, such as those employing rotating mirrors or cylindrical lenses, for example Ohmori's U.S. Pat. No. 5,012,485, can be employed at (108, 165). Finger (111) intersects beam (109), reflecting light back to detector (159) for real-time capture. Similarly, finger (112), intersecting both beams (109 and 177), reflects two separate highlights captured by detector (159). In another embodiment, each detectable light plane operates at a different wavelength. Alternatively, the invention can operate using a single detectable light plane together with dwell software, well known in the art, or register a selection when the plane is penetrated twice in rapid succession to "double click," as in a computer OS.
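The plane-crossing logic above can be sketched as a small event classifier: crossing one plane reads as a hover, crossing both reads as a touch, and two touches in rapid succession read as the "double click" alternative to dwell. Timestamps are in seconds; the 0.4 s window is a hypothetical illustration.

```python
# Sketch of interpreting light-plane crossings as pointer gestures.
# One plane crossed = hover; both planes = touch; two touches within a
# short window = double click. The window length is illustrative only.

def classify(events, double_click_window=0.4):
    """events: list of (timestamp, planes_crossed) tuples with
    planes_crossed in {1, 2}. Returns a list of gesture labels."""
    gestures = []
    last_touch = None
    for t, planes in events:
        if planes == 1:
            gestures.append("hover")
        else:
            # A second touch soon after the first upgrades to double click.
            if last_touch is not None and t - last_touch <= double_click_window:
                gestures.append("double_click")
            else:
                gestures.append("touch")
            last_touch = t
    return gestures

out = classify([(0.0, 1), (0.5, 2), (0.7, 2), (2.0, 2)])
```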
While a description of the preferred embodiment of the present invention has been given, further modifications and alterations will occur to those skilled in the art. It is therefore understood that all such modifications and alterations be considered as within the spirit and scope of the invention as defined by the appended claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US620592||Dec 14, 1898||Mar 7, 1899||Ornamental fountain|
|US3334816||Nov 30, 1964||Aug 8, 1967||Kurita Industrial Co Ltd||Apparatus for projecting an image on a jet of water|
|US3901443||Jan 7, 1974||Aug 26, 1975||Tdk Electronics Co Ltd||Ultrasonic wave nebulizer|
|US4974779||Apr 12, 1989||Dec 4, 1990||Ishikawajima-Harima Heavy Industries Co., Ltd.||Screen forming apparatus and method|
|US5012485||Sep 8, 1989||Apr 30, 1991||Minolta Camera Kabushiki Kaisha||Laser beam scanning apparatus|
|US5067653||Jun 12, 1990||Nov 26, 1991||Ishikawajima-Harima Heavy Industries Co., Ltd.||Screen forming apparatus and method|
|US5095386||May 1, 1990||Mar 10, 1992||Charles Lescrenier||Optical system for generating lines of light using crossed cylindrical lenses|
|US5168531||Jun 27, 1991||Dec 1, 1992||Digital Equipment Corporation||Real-time recognition of pointing information from video|
|US5270752||Dec 4, 1992||Dec 14, 1993||Ushio U-Tech Inc.||Method and apparatus for a fog screen and image-forming method using the same|
|US5311335||Feb 24, 1992||May 10, 1994||Crabtree Allen F||Method for generating holographic images|
|US5445322||Oct 21, 1994||Aug 29, 1995||Aquatique U.S.A.||Apparatus for projecting water to form an insubstantial screen for receiving images|
|US5553459||Jul 26, 1994||Sep 10, 1996||The Watermarker Corp.||Water recovery device for reclaiming and refiltering atmospheric water|
|US5669221||Apr 8, 1996||Sep 23, 1997||Worldwide Water, Inc.||Portable, potable water recovery and dispensing apparatus|
|US5767842||Apr 21, 1995||Jun 16, 1998||International Business Machines Corporation||Method and device for optical input of commands or data|
|US5989128||Jan 8, 1998||Nov 23, 1999||Universal Studios, Inc.||Flame simulation|
|US6058718||May 29, 1998||May 9, 2000||Forsberg; Francis C||Portable, potable water recovery and dispensing apparatus|
|US6076931||Nov 14, 1997||Jun 20, 2000||Aurora Systems, Inc.||De-centered lens group for use in an off-axis projector|
|US6195069||Nov 18, 1993||Feb 27, 2001||Pinecone Imaging Corporation||Method and apparatus for 3-dimensional motion picture display|
|US6243054||Mar 31, 2000||Jun 5, 2001||Deluca Michael||Stereoscopic user interface method and apparatus|
|US6300986||Oct 2, 1997||Oct 9, 2001||Adrian Robert Leigh Travis||Flat-panel display|
|US6329987||Dec 2, 1998||Dec 11, 2001||Phil Gottfried||Lenticular image and method|
|US20040046747 *||Sep 26, 2001||Mar 11, 2004||Eugenio Bustamante||Providing input signals|
|US20040080820 *||Jan 15, 2002||Apr 29, 2004||Karri Palovuori||Method and apparatus for forming a projection screen or a projection volume|
|WO2002056111A1||Jan 15, 2002||Jul 18, 2002||Karri Palovuori||Method and apparatus for forming a projection screen or a projection volume|
|1||Rakkolainen et al., "WAVE - A Walk-thru Virtual Environment"|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7222966 *||Jan 21, 2004||May 29, 2007||Microsoft Corporation||Projection system and method|
|US7400342 *||Nov 21, 2003||Jul 15, 2008||Carl-Zeiss-Stiftung||Optical observation apparatus with video device|
|US7401924||May 24, 2007||Jul 22, 2008||Microsoft Corporation||Projection system and method|
|US7460282 *||Apr 30, 2003||Dec 2, 2008||Texas Instruments Incorporated||Dynamic pattern generation for optical signal processing|
|US7593159 *||Sep 30, 2004||Sep 22, 2009||Panasonic Corporation||Display device|
|US7673994 *||Jul 24, 2006||Mar 9, 2010||Seiko Epson Corporation||Image display apparatus and control method for the same|
|US7675513||Dec 2, 2008||Mar 9, 2010||Evans & Sutherland Computer Corp.||System and method for displaying stereo images|
|US7710643||Sep 26, 2007||May 4, 2010||Alion Science And Technology Corporation||Apparatus for and method of delivering visual image into air|
|US7733298 *||Oct 19, 2004||Jun 8, 2010||Hewlett-Packard Development Company, L.P.||Display device|
|US7763841||May 27, 2009||Jul 27, 2010||Microsoft Corporation||Optical component for a depth sensor|
|US7949202 *||Feb 10, 2006||May 24, 2011||Seiko Epson Corporation||Image processing system, projector, and image processing method|
|US8042748||Mar 2, 2009||Oct 25, 2011||Zodiac Pool Systems, Inc.||Surface disruptor for laminar jet fountain|
|US8139110||Nov 1, 2007||Mar 20, 2012||Northrop Grumman Systems Corporation||Calibration of a gesture recognition interface system|
|US8166421||Jan 13, 2009||Apr 24, 2012||Primesense Ltd.||Three-dimensional user interface|
|US8177141||Dec 19, 2008||May 15, 2012||Zodiac Pool Systems, Inc.||Laminar deck jet|
|US8180114||Jun 5, 2008||May 15, 2012||Northrop Grumman Systems Corporation||Gesture recognition interface system with vertical display|
|US8207651||Sep 16, 2009||Jun 26, 2012||Tyco Healthcare Group Lp||Low energy or minimum disturbance method for measuring frequency response functions of ultrasonic surgical devices in determining optimum operating point|
|US8234578||Jul 25, 2006||Jul 31, 2012||Northrop Grumman Systems Corporation||Networked gesture collaboration system|
|US8249334||May 10, 2007||Aug 21, 2012||Primesense Ltd.||Modeling of humanoid forms from depth maps|
|US8262236||Jun 30, 2008||Sep 11, 2012||The Invention Science Fund I, Llc||Systems and methods for transmitting information associated with change of a projection surface|
|US8267526||Oct 27, 2008||Sep 18, 2012||The Invention Science Fund I, Llc||Methods associated with receiving and transmitting information related to projection|
|US8279168||Dec 7, 2006||Oct 2, 2012||Edge 3 Technologies Llc||Three-dimensional virtual-touch human-machine interface system and method therefor|
|US8308304||Oct 27, 2008||Nov 13, 2012||The Invention Science Fund I, Llc||Systems associated with receiving and transmitting information related to projection|
|US8339379||Feb 15, 2009||Dec 25, 2012||Neonode Inc.||Light-based touch screen|
|US8345920||Jun 20, 2008||Jan 1, 2013||Northrop Grumman Systems Corporation||Gesture recognition interface system with a light-diffusive screen|
|US8376558||Jun 30, 2008||Feb 19, 2013||The Invention Science Fund I, Llc||Systems and methods for projecting in response to position change of a projection surface|
|US8384005||Jul 11, 2008||Feb 26, 2013||The Invention Science Fund I, Llc||Systems and methods for selectively projecting information in response to at least one specified motion associated with pressure applied to at least one projection surface|
|US8390169||Feb 20, 2012||Mar 5, 2013||Covidien Lp||Low energy or minimum disturbance method for measuring frequency response functions of ultrasonic surgical devices in determining optimum operating point|
|US8396252||May 20, 2010||Mar 12, 2013||Edge 3 Technologies||Systems and related methods for three dimensional gesture recognition in vehicles|
|US8403501||Jun 30, 2008||Mar 26, 2013||The Invention Science Fund, I, LLC||Motion responsive devices and systems|
|US8406859||Aug 10, 2009||Mar 26, 2013||Board Of Regents, The University Of Texas System||Digital light processing hyperspectral imaging apparatus|
|US8416217||Mar 20, 2012||Apr 9, 2013||Neonode Inc.||Light-based finger gesture user interface|
|US8430515||Jun 30, 2008||Apr 30, 2013||The Invention Science Fund I, Llc||Systems and methods for projecting|
|US8432448||Aug 10, 2006||Apr 30, 2013||Northrop Grumman Systems Corporation||Stereo camera intrusion detection system|
|US8467599||Aug 31, 2011||Jun 18, 2013||Edge 3 Technologies, Inc.||Method and apparatus for confusion learning|
|US8471830||Jul 6, 2007||Jun 25, 2013||Neonode Inc.||Scanning of a touch screen|
|US8480086 *||Mar 26, 2010||Jul 9, 2013||Universal Entertainment Corporation||Gaming device that intercepts light|
|US8523087||Oct 24, 2011||Sep 3, 2013||Zodiac Pool Systems, Inc.||Surface disruptor for laminar jet fountain|
|US8540381||Jun 30, 2008||Sep 24, 2013||The Invention Science Fund I, Llc||Systems and methods for receiving information associated with projecting|
|US8565479||Aug 11, 2010||Oct 22, 2013||Primesense Ltd.||Extraction of skeletons from 3D maps|
|US8567954 *||Jun 30, 2011||Oct 29, 2013||Disney Enterprises, Inc.||3D display system with rear projection screens formed of water mist or spray|
|US8569925||Mar 4, 2013||Oct 29, 2013||Covidien Lp||Low energy or minimum disturbance method for measuring frequency response functions of ultrasonic surgical devices in determining optimum operating point|
|US8582866||Feb 10, 2011||Nov 12, 2013||Edge 3 Technologies, Inc.||Method and apparatus for disparity computation in stereo images|
|US8582867||Sep 11, 2011||Nov 12, 2013||Primesense Ltd||Learning-based pose estimation from depth maps|
|US8589824||Jul 13, 2006||Nov 19, 2013||Northrop Grumman Systems Corporation||Gesture recognition interface system|
|US8594425||Aug 11, 2010||Nov 26, 2013||Primesense Ltd.||Analysis of three-dimensional scenes|
|US8602564||Aug 22, 2008||Dec 10, 2013||The Invention Science Fund I, Llc||Methods and systems for projecting in response to position|
|US8608321||Jun 30, 2008||Dec 17, 2013||The Invention Science Fund I, Llc||Systems and methods for projecting in response to conformation|
|US8625855||Feb 7, 2013||Jan 7, 2014||Edge 3 Technologies Llc||Three dimensional gesture recognition in vehicles|
|US8641203||Jul 28, 2008||Feb 4, 2014||The Invention Science Fund I, Llc||Methods and systems for receiving and transmitting signals between server and projector apparatuses|
|US8644599||May 20, 2013||Feb 4, 2014||Edge 3 Technologies, Inc.||Method and apparatus for spawning specialist belief propagation networks|
|US8655093||Feb 10, 2011||Feb 18, 2014||Edge 3 Technologies, Inc.||Method and apparatus for performing segmentation of an image|
|US8666144||Feb 10, 2011||Mar 4, 2014||Edge 3 Technologies, Inc.||Method and apparatus for determining disparity of texture|
|US8674966||Mar 20, 2012||Mar 18, 2014||Neonode Inc.||ASIC controller for light-based touch screen|
|US8705877||Nov 15, 2011||Apr 22, 2014||Edge 3 Technologies, Inc.||Method and apparatus for fast computational stereo|
|US8718387||Dec 12, 2011||May 6, 2014||Edge 3 Technologies, Inc.||Method and apparatus for enhanced stereo vision|
|US8723787||May 12, 2009||May 13, 2014||The Invention Science Fund I, Llc||Methods and systems related to an image capture projection surface|
|US8733952||Feb 27, 2009||May 27, 2014||The Invention Science Fund I, Llc||Methods and systems for coordinated use of two or more user responsive projectors|
|US8740391 *||Mar 13, 2013||Jun 3, 2014||Eski Inc.||Devices and methods for providing a distributed manifestation in an environment|
|US8761509||Nov 15, 2011||Jun 24, 2014||Edge 3 Technologies, Inc.||Method and apparatus for fast computational stereo|
|US8775023||Nov 25, 2013||Jul 8, 2014||Neonode Inc.||Light-based touch controls on a steering wheel and dashboard|
|US8781217||Apr 21, 2013||Jul 15, 2014||Primesense Ltd.||Analysis of three-dimensional scenes with a surface model|
|US8787663||Feb 28, 2011||Jul 22, 2014||Primesense Ltd.||Tracking body parts by combined color image and depth processing|
|US8798358||Oct 9, 2013||Aug 5, 2014||Edge 3 Technologies, Inc.||Apparatus and method for disparity map generation|
|US8810551||Mar 30, 2013||Aug 19, 2014||Neonode Inc.||Finger gesture user interface|
|US8820939||Sep 30, 2008||Sep 2, 2014||The Invention Science Fund I, Llc||Projection associated methods and systems|
|US8824737||Apr 21, 2013||Sep 2, 2014||Primesense Ltd.||Identifying components of a humanoid form in three-dimensional scenes|
|US8857999||Aug 22, 2008||Oct 14, 2014||The Invention Science Fund I, Llc||Projection in response to conformation|
|US8872762||Dec 8, 2011||Oct 28, 2014||Primesense Ltd.||Three dimensional user interface cursor control|
|US8881051||Jul 5, 2012||Nov 4, 2014||Primesense Ltd||Zoom-based gesture user interface|
|US8884926||Jul 15, 2014||Nov 11, 2014||Neonode Inc.||Light-based finger gesture user interface|
|US8891859||Jan 1, 2014||Nov 18, 2014||Edge 3 Technologies, Inc.||Method and apparatus for spawning specialist belief propagation networks based upon data classification|
|US8907894||Mar 25, 2010||Dec 9, 2014||Northridge Associates Llc||Touchless pointing device|
|US8918252||Jun 24, 2014||Dec 23, 2014||Neonode Inc.||Light-based touch controls on a steering wheel|
|US8933876||Dec 8, 2011||Jan 13, 2015||Apple Inc.||Three dimensional user interface session control|
|US8936367||Jul 11, 2008||Jan 20, 2015||The Invention Science Fund I, Llc||Systems and methods associated with projecting in response to conformation|
|US8939586||Jul 11, 2008||Jan 27, 2015||The Invention Science Fund I, Llc||Systems and methods for projecting in response to position|
|US8944608||Jul 11, 2008||Feb 3, 2015||The Invention Science Fund I, Llc||Systems and methods associated with projecting in response to conformation|
|US8955984||Sep 30, 2008||Feb 17, 2015||The Invention Science Fund I, Llc||Projection associated methods and systems|
|US8959013||Sep 25, 2011||Feb 17, 2015||Apple Inc.||Virtual keyboard for a non-tactile three dimensional user interface|
|US8970589||Jul 24, 2011||Mar 3, 2015||Edge 3 Technologies, Inc.||Near-touch interaction with a stereo camera grid structured tessellations|
|US8972902||Aug 22, 2008||Mar 3, 2015||Northrop Grumman Systems Corporation||Compound gesture recognition|
|US8983178||Oct 9, 2013||Mar 17, 2015||Edge 3 Technologies, Inc.||Apparatus and method for performing segment-based disparity decomposition|
|US9002099||Mar 6, 2013||Apr 7, 2015||Apple Inc.||Learning-based estimation of hand and finger pose|
|US9019267||Oct 30, 2012||Apr 28, 2015||Apple Inc.||Depth mapping with enhanced resolution|
|US9030498||Aug 14, 2012||May 12, 2015||Apple Inc.||Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface|
|US9035876||Oct 17, 2013||May 19, 2015||Apple Inc.||Three-dimensional user interface session control|
|US9035917||Feb 16, 2014||May 19, 2015||Neonode Inc.||ASIC controller for light-based sensor|
|US9047507||May 2, 2012||Jun 2, 2015||Apple Inc.||Upper-body skeleton extraction from depth maps|
|US9052777||Mar 20, 2012||Jun 9, 2015||Neonode Inc.||Optical elements with alternating reflective lens facets|
|US9092093||Jan 6, 2015||Jul 28, 2015||Neonode Inc.||Steering wheel user interface|
|US9122311||Aug 23, 2012||Sep 1, 2015||Apple Inc.||Visual feedback for tactile and non-tactile user interfaces|
|US9134599||Sep 25, 2012||Sep 15, 2015||Pentair Water Pool And Spa, Inc.||Underwater image projection controller with boundary setting and image correction modules and interface and method of using same|
|US9152853||Dec 2, 2013||Oct 6, 2015||Edge 3 Technologies, Inc.||Gesture recognition in vehicles|
|US9158375||Dec 23, 2012||Oct 13, 2015||Apple Inc.||Interactive reality augmentation for natural interaction|
|US9164654||Jun 17, 2009||Oct 20, 2015||Neonode Inc.||User interface for mobile computer unit|
|US9201501||Dec 23, 2012||Dec 1, 2015||Apple Inc.||Adaptive projector|
|US9213443||Apr 15, 2010||Dec 15, 2015||Neonode Inc.||Optical touch screen systems using reflected light|
|US9218063||Aug 23, 2012||Dec 22, 2015||Apple Inc.||Sessionless pointing user interface|
|US9229311 *||Sep 28, 2014||Jan 5, 2016||Active Ion Displays, Inc.||Projection display device with vapor medium screen|
|US9229534||Feb 27, 2013||Jan 5, 2016||Apple Inc.||Asymmetric mapping for tactile and non-tactile user interfaces|
|US9262074||Sep 28, 2014||Feb 16, 2016||Neonode, Inc.||Finger gesture user interface|
|US9285874||Feb 9, 2012||Mar 15, 2016||Apple Inc.||Gaze detection in a 3D mapping environment|
|US9286028||May 7, 2014||Mar 15, 2016||Eski Inc.||Devices and methods for providing a distributed manifestation in an environment|
|US9323395||Jan 20, 2015||Apr 26, 2016||Edge 3 Technologies||Near touch interaction with structured light|
|US9324154||Mar 27, 2014||Apr 26, 2016||Edge 3 Technologies||Method and apparatus for enhancing stereo vision through image segmentation|
|US9342146||Aug 7, 2013||May 17, 2016||Apple Inc.||Pointing-based display interaction|
|US9360746||Feb 10, 2015||Jun 7, 2016||Pentair Water Pool And Spa, Inc.||Underwater image projection system and method|
|US9377863||Mar 24, 2013||Jun 28, 2016||Apple Inc.||Gaze-enhanced virtual touchscreen|
|US9377865||May 29, 2013||Jun 28, 2016||Apple Inc.||Zoom-based gesture user interface|
|US9377874||Nov 2, 2007||Jun 28, 2016||Northrop Grumman Systems Corporation||Gesture recognition light and video image projector|
|US9389710||Nov 24, 2014||Jul 12, 2016||Neonode Inc.||Light-based controls on a toroidal steering wheel|
|US9417700||May 20, 2010||Aug 16, 2016||Edge3 Technologies||Gesture recognition systems and related methods|
|US9423608||Sep 25, 2012||Aug 23, 2016||Pentair Water Pool And Spa, Inc.||Multidimensional rotary motion apparatus moving a reflective surface and method of operating same|
|US9435997||Aug 1, 2013||Sep 6, 2016||Pentair Water Pool And Spa, Inc.||Multidimensional rotary motion apparatus moving a reflective surface and method of operating same|
|US9437002||Sep 25, 2014||Sep 6, 2016||Elwha Llc||Systems and methods for a dual modality sensor system|
|US9454225||Aug 7, 2013||Sep 27, 2016||Apple Inc.||Gaze-based display control|
|US9459758||May 29, 2013||Oct 4, 2016||Apple Inc.||Gesture-based interface with enhanced features|
|US20030231365 *||Apr 30, 2003||Dec 18, 2003||So John Ling Wing||Dynamic pattern generation for optical signal processing|
|US20040104998 *||Nov 21, 2003||Jun 3, 2004||Carl-Zeiss-Stiftung Trading As Carl Zeiss||Optical observation apparatus with video device|
|US20050157262 *||Jan 21, 2004||Jul 21, 2005||James Reichert||Projection system and method|
|US20060082874 *||Oct 19, 2004||Apr 20, 2006||Anderson Daryl E||Display device|
|US20060181687 *||Feb 10, 2006||Aug 17, 2006||Seiko Epson Corporation||Image processing system, projector, and image processing method|
|US20070035826 *||Sep 30, 2004||Feb 15, 2007||Toshifumi Yokoyama||Display device|
|US20070046903 *||Jul 24, 2006||Mar 1, 2007||Seiko Epson Corporation||Image display apparatus and control method for the same|
|US20070132721 *||Dec 7, 2006||Jun 14, 2007||Edge 3 Technologies Llc||Three-Dimensional Virtual-Touch Human-Machine Interface System and Method Therefor|
|US20070197274 *||Mar 27, 2007||Aug 23, 2007||Dugan Brian M||Systems and methods for improving fitness equipment and exercise|
|US20070216870 *||May 24, 2007||Sep 20, 2007||Microsoft Corporation||Projection system and method|
|US20080013826 *||Jul 13, 2006||Jan 17, 2008||Northrop Grumman Corporation||Gesture recognition interface system|
|US20080028325 *||Jul 25, 2006||Jan 31, 2008||Northrop Grumman Corporation||Networked gesture collaboration system|
|US20080043106 *||Aug 10, 2006||Feb 21, 2008||Northrop Grumman Corporation||Stereo camera intrusion detection system|
|US20080180798 *||Sep 26, 2007||Jul 31, 2008||Alion Science And Technology Corporation||Apparatus for and method of delivering visual image into air|
|US20080244468 *||Jun 5, 2008||Oct 2, 2008||Nishihara H Keith||Gesture Recognition Interface System with Vertical Display|
|US20090103780 *||Dec 17, 2008||Apr 23, 2009||Nishihara H Keith||Hand-Gesture Recognition Method|
|US20090115721 *||Nov 2, 2007||May 7, 2009||Aull Kenneth W||Gesture Recognition Light and Video Image Projector|
|US20090116742 *||Nov 1, 2007||May 7, 2009||H Keith Nishihara||Calibration of a Gesture Recognition Interface System|
|US20090183125 *||Jan 13, 2009||Jul 16, 2009||Prime Sense Ltd.||Three-dimensional user interface|
|US20090189878 *||Feb 15, 2009||Jul 30, 2009||Neonode Inc.||Light-based touch screen|
|US20090231331 *||Dec 2, 2008||Sep 17, 2009||Evans & Sutherland Computer Corporation||System and method for displaying stereo images|
|US20090309718 *||Jul 11, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Systems and methods associated with projecting in response to conformation|
|US20090309826 *||Jun 17, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Systems and devices|
|US20090310035 *||Jul 28, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Methods and systems for receiving and transmitting signals associated with projection|
|US20090310036 *||Aug 22, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Methods and systems for projecting in response to position|
|US20090310039 *||Jan 27, 2009||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Methods and systems for user parameter responsive projection|
|US20090310088 *||Jun 30, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Systems and methods for projecting|
|US20090310093 *||Jun 30, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Systems and methods for projecting in response to conformation|
|US20090310094 *||Jul 11, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Systems and methods for projecting in response to position|
|US20090310095 *||Jul 11, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Systems and methods associated with projecting in response to conformation|
|US20090310096 *||Jul 11, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of Delaware||Systems and methods for transmitting in response to position|
|US20090310097 *||Aug 22, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Projection in response to conformation|
|US20090310098 *||Aug 22, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Methods and systems for projecting in response to conformation|
|US20090310099 *||Oct 27, 2008||Dec 17, 2009||Searete Llc,||Methods associated with receiving and transmitting information related to projection|
|US20090310101 *||Sep 30, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Projection associated methods and systems|
|US20090310102 *||Sep 30, 2008||Dec 17, 2009||Searete Llc.||Projection associated methods and systems|
|US20090310103 *||Feb 27, 2009||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Methods and systems for receiving information associated with the coordinated use of two or more user responsive projectors|
|US20090310104 *||Feb 27, 2009||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Methods and systems for coordinated use of two or more user responsive projectors|
|US20090310144 *||Jun 30, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Systems and methods for transmitting information associated with projecting|
|US20090311965 *||Oct 27, 2008||Dec 17, 2009||Searete Llc,||Systems associated with receiving and transmitting information related to projection|
|US20090313150 *||Oct 30, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Methods associated with projection billing|
|US20090313151 *||Oct 30, 2008||Dec 17, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Methods associated with projection system billing|
|US20090316952 *||Jun 20, 2008||Dec 24, 2009||Bran Ferren||Gesture recognition interface system with a light-diffusive screen|
|US20090324138 *||May 12, 2009||Dec 31, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Methods and systems related to an image capture projection surface|
|US20090326681 *||Jun 30, 2008||Dec 31, 2009||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Systems and methods for projecting in response to position|
|US20100017872 *||Jun 17, 2009||Jan 21, 2010||Neonode Technologies||User interface for mobile computer unit|
|US20100034457 *||May 10, 2007||Feb 11, 2010||Tamir Berliner||Modeling of humanoid forms from depth maps|
|US20100050133 *||Aug 22, 2008||Feb 25, 2010||Nishihara H Keith||Compound Gesture Recognition|
|US20100056928 *||Aug 10, 2009||Mar 4, 2010||Karel Zuzak||Digital light processing hyperspectral imaging apparatus|
|US20100066689 *||Jul 2, 2009||Mar 18, 2010||Jung Edward K Y||Devices related to projection input surfaces|
|US20100066983 *||Jul 2, 2009||Mar 18, 2010||Jung Edward K Y||Methods and systems related to a projection surface|
|US20100155497 *||Dec 19, 2008||Jun 24, 2010||Zodiac Pool Systems, Inc.||Laminar Deck Jet|
|US20100155498 *||Mar 2, 2009||Jun 24, 2010||Zodiac Pool Systems, Inc.||Surface disruptor for laminar jet fountain|
|US20100184511 *||Mar 26, 2010||Jul 22, 2010||Universal Entertainment Corporation||Gaming device|
|US20100235786 *||Mar 11, 2010||Sep 16, 2010||Primesense Ltd.||Enhanced 3d interfacing for remote devices|
|US20100238138 *||Apr 15, 2010||Sep 23, 2010||Neonode Inc.||Optical touch screen systems using reflected light|
|US20100295783 *||May 20, 2010||Nov 25, 2010||Edge3 Technologies Llc||Gesture recognition systems and related methods|
|US20110043485 *||Jul 6, 2007||Feb 24, 2011||Neonode Inc.||Scanning of a touch screen|
|US20110052006 *||Aug 11, 2010||Mar 3, 2011||Primesense Ltd.||Extraction of skeletons from 3d maps|
|US20110090147 *||Mar 25, 2010||Apr 21, 2011||Qualstar Corporation||Touchless pointing device|
|US20110164032 *||Jan 7, 2010||Jul 7, 2011||Prime Sense Ltd.||Three-Dimensional User Interface|
|US20110176119 *||Aug 22, 2008||Jul 21, 2011||Searete Llc, A Limited Liability Corporation Of The State Of Delaware||Methods and systems for projecting in response to conformation|
|US20110211754 *||Feb 28, 2011||Sep 1, 2011||Primesense Ltd.||Tracking body parts by combined color image and depth processing|
|US20120154760 *||Oct 7, 2011||Jun 21, 2012||Kaz Usa, Inc.||Humidifying device with a projection mechanism|
|US20130003020 *||Jun 30, 2011||Jan 3, 2013||Disney Enterprises, Inc.||3d display system with rear projection screens formed of water mist or spray|
|US20150092266 *||Sep 28, 2014||Apr 2, 2015||Active Ion Displays, Inc.||Projection display device with vapor medium screen|
|US20160042553 *||Feb 13, 2015||Feb 11, 2016||Pixar||Generating a Volumetric Projection for an Object|
|DE102012106181A1||Jul 10, 2012||Jan 16, 2014||Miele & Cie. Kg||Household appliance, e.g. a dishwasher, refrigerator, or freezer, with a display device for displaying appliance information, the display device having a projector for projecting the appliance information onto a projection area|
|WO2012131554A2||Mar 23, 2012||Oct 4, 2012||Manfredo Giuseppe Mario Ferrari||Improved equipment for generating a free air volume suitable for projecting holographic images|
|WO2013100786A1 *||Dec 30, 2011||Jul 4, 2013||Limited Liability Company «Cachalot»||Method and apparatus for producing a non-solid-state projection screen|
|WO2014046566A1 *||Jan 21, 2013||Mar 27, 2014||"Displair" Limited Liability Company||Method and device for forming an aerosol projection screen|
|U.S. Classification||353/28, 353/62, 353/122, 239/522, 239/20, 239/590.5, 359/460, 239/275, 239/18, 359/443|
|International Classification||G09G5/00, G09G3/02, G09G3/20, G03B21/60, G09F9/00, G03B21/62, G09F19/18, G03B21/00|
|Cooperative Classification||G09F19/18, G03B21/608|
|European Classification||G09F19/18, G03B21/60|
|May 7, 2003||AS||Assignment|
Owner name: IO2 TECHNOLOGY, LLC, DELAWARE
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DYNER, CHAD D.;REEL/FRAME:014064/0332
Effective date: 20030506
|Sep 1, 2008||REMI||Maintenance fee reminder mailed|
|Sep 19, 2008||SULP||Surcharge for late payment|
|Sep 19, 2008||FPAY||Fee payment|
Year of fee payment: 4
|Oct 8, 2012||REMI||Maintenance fee reminder mailed|
|Feb 22, 2013||LAPS||Lapse for failure to pay maintenance fees|
|Apr 16, 2013||FP||Expired due to failure to pay maintenance fee|
Effective date: 20130222