|Publication number||US7242466 B2|
|Application number||US 10/814,517|
|Publication date||Jul 10, 2007|
|Filing date||Mar 31, 2004|
|Priority date||Mar 31, 2004|
|Also published as||US20050225749|
|Inventors||Yuan Kong, Glen Larsen|
|Original Assignee||Microsoft Corporation|
Embodiments of the present invention relate to the field of computer input devices, and particularly pointing devices, employing light striking an encoded surface for identifying absolute position on the encoded surface and/or relative movement on the encoded surface for providing computer input.
Utilizing a visible-light laser pointer to designate a position on a display is a known means for providing input to a control device such as a computer. For example, a display, such as an image projected on a screen, may include particular command areas, viewable by the user, and corresponding to particular commands. A camera is directed toward the display to provide image information to a computer, or controller. Thus, when a user directs a visible-light laser beam toward the display, specifically toward a particular command area, the controller can compare the projected image and the detected image to determine the difference between the two images, which corresponds to the location on the display where the laser beam strikes the display. If the laser beam is striking one of the command areas, the controller can execute a command associated with the command area. An example of such a command is to change the image on the display. Such devices are limited to referencing laser beam position on projected images only. The projected image must be stored within the controller for comparison with the laser-modified image. Such devices are not capable of determining a position of a laser spot on a surface wherein the unaltered image of the surface is not stored within the controller.
Another known input device determines the location of an infrared laser beam striking a display by utilizing an infrared camera in fixed relative position to the display. The infrared camera sends an infrared video signal to an image processing unit to digitize the video signal to determine the location of the infrared laser spot on the display. In addition, the device has the capability of sensing the location of multiple lasers pointed at the display simultaneously. Each laser may have a particular shape (e.g., three spots, a plus sign), also detectable by the infrared camera, such that a particular cursor associated with each particular pointer may be imaged upon the display in a position corresponding to the location of the particular laser pointer. Such devices are limited to circumstances wherein an infrared camera may be placed in fixed relation to the display for the determination of the absolute position of the laser spot on the display.
Yet another known device eliminates the camera of the above-noted examples and utilizes a detector located on the axis of an image projector. The device collects light from each on-screen pixel via a single detector during a scanning procedure, whereby the value of each pixel is collected individually, utilizing a pivoting mirror. During scanning, the detector can determine the presence and location of a spot illuminated by the laser pointer. Like the previous devices, such a device is useful for determining the location of an illuminated spot within a projected image, but is not useful in determining the location of an illuminated spot on any surface.
Another known input device utilizes a pen-shaped device for use with a patterned writing surface for detecting the location of the device with respect to the patterned surface. The patterned surface includes features that reflect light, such as infrared light, whereas the remaining areas of the patterned surface do not reflect such light. For example, the stylus includes an infrared light-emitting diode for projecting infrared light onto the patterned surface and a sensor sensitive to infrared light for detecting infrared light reflected by the features of patterned surface. Thus, when the device projects infrared light over a particular area of the patterned surface, the sensor detects a reflected sub-pattern of the patterned surface, which corresponds to the position of the device with respect to the patterned surface. By processing this sub-pattern, the location of the device with respect to the patterned surface may be determined. By determining this location, the location and movement of the device over the surface may be determined. Such a device is useful for handwriting recognition, for example, because the device must abut against, or be held a short distance from, the patterned surface in order to determine the position of the device. But such a device is not generally useful for pointing to a location on a patterned surface remote from the device some distance.
Accordingly, an improved pointing system is desired to address one or more of these and other disadvantages. Aspects of such a pointing system involve an encoded surface and a device having a collimated light source, a detector, and a controller, for pointing the device and a light beam having a wavelength outside the visible light spectrum toward the encoded surface for identifying a position on the encoded surface and determining the position where the collimated light beam strikes the encoded surface corresponding to where the device is pointing. In particular, embodiments of this invention relate to pointing systems, pointing devices, and methods capable of projecting a collimated light beam onto the encoded surface to scatter the collimated light beam, detecting at least a portion of the scattered light, and responding to the detected portion of the scattered light to determine the absolute position where the collimated light beam strikes the encoded surface and/or any relative movement of the position where the collimated light beam strikes the encoded surface, which corresponds to where the device is pointing or any relative movement of where the device is pointing, respectively. Moreover, the features of the present invention described herein are less laborious and easier to implement than currently available techniques as well as being economically feasible and commercially practical.
In accordance with one aspect of the invention, a pointing system has an encoded surface and a pointing device for use with the encoded surface, wherein the device is remote from the encoded surface during pointing. The pointing device includes a collimated light source for projecting a collimated light beam onto the encoded surface. The encoded surface scatters the collimated light beam striking the encoded surface. A detector associated with the collimated light source detects at least a portion of the scattered light. A controller is associated with the detector and configured to respond to the detected portion of the scattered light to determine a position where the collimated light beam strikes the encoded surface. The position corresponds to where the device is pointing.
In another aspect of the invention, a pointing device for use with an encoded surface remote from the pointing device has a collimated light source, a detector, and a controller generally as set forth above. The device further comprises a housing, wherein the collimated light source and the detector mount on the housing.
In yet another aspect of the invention, a method determines a position where a collimated light beam of a pointing device strikes an encoded surface remote from the pointing device, the location corresponding to where the device is pointing. The method includes projecting the collimated light beam from the pointing device onto the encoded surface. The encoded surface has light-scattering properties for scattering the collimated light beam. The method further includes detecting at least a portion of the light scattered by the encoded surface and determining the position where the collimated light beam strikes the encoded surface, which corresponds to where the device is pointing, as a function of a characteristic of the detected scattered light.
Alternatively, the invention may comprise various other methods and apparatuses.
Other features will be in part apparent and in part pointed out hereinafter.
Each collimated light source 29A,29B projects a collimated light beam C onto the encoded surface 23. The collimated light beams C are not visible to the human eye. In one example, the collimated light beams C comprise infrared light, although other types of non-visible radiation (e.g., microwaves) are also contemplated as within the scope of the invention. Exemplary light sources draw as little current as possible, ensuring that the light source may be used in a cordless device application without unduly limiting the battery life of the device. The collimated light source 29 may be of any suitable type, such as a resonant cavity light-emitting diode (RC-LED), a vertical cavity surface-emitting laser (VCSEL), or an edge-emitting laser diode (EELD). Other lasers and sources of collimated light may also be utilized without departing from the scope of the claimed invention.
As will be discussed in detail below, the encoded surface 23 is configured to scatter the collimated light beam C striking the encoded surface, thereby diffusing the light in many directions. Where the collimated light source 29 emits only infrared radiation, the encoded surface 23 is configured to scatter only infrared light. Other wavelengths of light simply pass through or are absorbed by the encoded surface 23.
The encoded surface comprises at least one scattering feature 37 that substantially scatters infrared light in a manner to maximize the signals detected, as discussed in detail below. The encoded surface 23 also comprises at least one non-scattering feature 39 that does not substantially scatter infrared light. In most examples discussed herein, many scattering features 37 and non-scattering features 39 are included within the encoded surface 23 and cooperate to create a pattern on the encoded surface. In particular, the encoded surface 23 is encoded with a digital pattern, and the controller 33 is configured to determine position as a function of the digital pattern. The infrared scattering features 37 of the encoded surface 23 are transparent to visible light. The encoded pattern is not necessarily continuous, but may comprise many scattering features 37 that together form a two-dimensional bit pattern. The bit pattern may be coded by an appropriate algorithm, such that any subset area of a defined size contains a number of bits whose arrangement can be decoded to indicate the position of the subset area. The collimated light source 29 of the device 25 illuminates the encoded surface 23 with a collimated, infrared light beam C, and the detector 31 detects the bit pattern as the individual scattering features 37 scatter the infrared light. The pattern on the encoded surface 23 is not detectable by the human eye and allows complete transmission of light at visible wavelengths. Such a pattern may serve to provide absolute position information, relative displacement information, or both, as will be discussed in greater detail below. In one example, the scattering features 37 are infrared coatings, as discussed in detail below and in Appendix A.
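A bit pattern in which any fixed-size window decodes to a unique position can be realized with a De Bruijn sequence, in which every window of a given length occurs exactly once. The patent does not specify the coding algorithm; the following Python sketch is illustrative only, showing the one-dimensional principle with an assumed window length of 8 features.

```python
def de_bruijn(k: int, n: int) -> str:
    """Binary De Bruijn sequence B(k, n): every length-n window
    (read cyclically) appears exactly once."""
    a = [0] * (k * n)
    seq = []

    def db(t, p):
        if t > n:
            if n % p == 0:
                seq.extend(a[1:p + 1])
        else:
            a[t] = a[t - p]
            db(t + 1, p)
            for j in range(a[t - p] + 1, k):
                a[t] = j
                db(t + 1, t)

    db(1, 1)
    return "".join(map(str, seq))

# Encode one axis of the surface: every 8-feature window is unique,
# so imaging any 8 consecutive features yields an absolute position.
n = 8
pattern = de_bruijn(2, n)           # 256 bits
ext = pattern + pattern[:n - 1]     # unroll the cyclic sequence
position_of = {ext[i:i + n]: i for i in range(len(pattern))}

window = pattern[100:100 + n]       # what the detector "sees"
assert position_of[window] == 100
```

A two-dimensional pattern of this kind can be built analogously, so that the small sub-pattern illuminated by the beam determines both coordinates.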
With the collimated light scattering in many directions from the encoded surface 23, the orientation of the collimated light beam C with respect to the encoded surface is relatively unimportant (with respect to detecting the light scattered from the encoded surface). For example, the collimated light beam C may be oriented at several acceptable angles relative to the encoded surface 23 because the surface scatters the light in many directions, including toward the detector 31. As used herein, scattering may also be considered reflecting the collimated light in many directions.
In one example, the encoded surface 23 is visible-light transparent, such that visible light striking the encoded surface passes through freely. This type of encoded surface 23 is essentially invisible to the user because it does not reflect visible light, making the encoded surface particularly appropriate for adding to virtually any surface. For example, the encoded surface may be incorporated into a display 41 (e.g., a cathode ray tube (CRT), a liquid crystal display (LCD), an organic light-emitting diode (OLED) display, a projected image, or a plasma display panel (PDP)), a screen, a whiteboard, a wall, an appliance, or any other surface, thereby encoding the surface for interaction with the device 25 and likely a networked computer. For example, a networked microwave oven may include an encoded surface 23 over a series of activation areas, or buttons, of the microwave. Thus, the microwave may be used customarily without use of the encoded surface 23. In addition, however, by pointing the collimated light beam C of the device 25 at an activation area, the detector 31 can detect the scattered light, indicating that the particular activation area of interest should be activated, and send a command corresponding to the activation area on the microwave via a personal computer. The encoded surface 23 may either be incorporated into the product during manufacture or applied to the product after manufacture to enhance its functionality. An explanation of techniques for creating such an encoded surface 23 is set forth in detail below in Appendix A, although other techniques are also contemplated as within the scope of the claimed invention.
Because the encoded surface 23 may be applied to virtually any surface, the size of the encoded surface is not limited. In particular, in one example the encoded surface includes an area of at least 0.13 square meters (200 square inches). Moreover, the encoded surface 23 may be as large as a large television, a projection screen, or even a large wall. The size of the encoded surface 23 is unimportant, as long as the collimated light beam C can reach the encoded surface and the scattered light can reach the detector 31.
The detector 31 is associated with the collimated light source 29 for detecting at least a portion of the scattered light. Suitable detectors 31 may include photodetectors, CCDs (charge-coupled devices), CMOS (complementary metal-oxide semiconductor) sensors, or other detector arrays, such as those integral with the collimated light source 29. In one example, the detector 31 is a photodetector comprising at least four elements for detecting at least a portion of the scattered light. The detector 31 may comprise additional elements if additional light detection is required by the controller to determine position, such as when absolute positioning is desired.
The device 25 may further comprise a filter 45 for substantially filtering out light of a wavelength irrelevant to the detected portion of scattered light. In the case where the detected scattered light is infrared, the filter 45 may be an infrared light passing/visible light blocking filter for substantially filtering out visible light, ensuring the detector only sees the intended scattered infrared light.
The controller 33 is associated with the detector 31 and is configured to respond to the detected portion of the scattered light to determine a position where the collimated light beam C strikes the encoded surface 23, the position corresponding to where the device is pointing. Those skilled in the art will recognize that the controller 33 may be either a processor or an application-specific integrated circuit (ASIC), among other things. The controller 33 may further comprise image processing firmware or circuitry to process the detected scattered light, as would be understood by one skilled in the art.
In one embodiment, when the encoded surface 23 is mounted on or incorporated with a display 41, the controller 33 signals the display to display an image corresponding to the absolute position where the collimated light beam strikes the encoded surface. Thus, when the pointing device 25 points toward a particular area upon the encoded surface 23 of the display 41, an image, such as a cursor, appears on the display corresponding to the position where the pointing device is pointing. To enhance the precision with which the pointing device 25 can refer to a particular position on the encoded surface 23, the device may further comprise a position adjustment mechanism 51 selectable by the user for manually adjusting the location of the image on the display 41. The position adjustment mechanism 51 may be of any type known in the art, such as a trackball or a touchpad.
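A controller driving such a cursor must translate the absolute position decoded from the encoded surface into display coordinates. A minimal Python sketch of that mapping follows; the surface dimensions (in pattern-feature units) and display dimensions (in pixels) are hypothetical and not specified by the patent.

```python
def surface_to_display(x_s, y_s, surface_wh, display_wh):
    """Linearly map a decoded surface position (in pattern-feature units)
    to the corresponding pixel on the display overlaid by the surface."""
    sw, sh = surface_wh
    dw, dh = display_wh
    return (round(x_s * (dw - 1) / (sw - 1)),
            round(y_s * (dh - 1) / (sh - 1)))

# Hypothetical 256x256-feature surface over a 1920x1080 display:
assert surface_to_display(0, 0, (256, 256), (1920, 1080)) == (0, 0)
assert surface_to_display(255, 255, (256, 256), (1920, 1080)) == (1919, 1079)
```

The position adjustment mechanism 51 would then apply a small user-controlled offset to the pixel coordinates this mapping produces.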
The device 25 further comprises a housing 55 associated with the collimated light source 29, the detector 31, and the controller 33 for containing and protecting the components of the device. The housing 55 may take any form, without departing from the scope of the claimed invention. For example, the housing 55 may be in the shape of a remote control, an optical pointer, or any other pointing device.
In addition to the collimated light sources 29A,29B discussed above, the pointing device may further comprise a visible light source 61 for projecting a visible light beam V, such as a laser beam, toward the encoded surface 23 at substantially the same position where the collimated light beam C strikes the encoded surface. Because the collimated light beam C is not within the visible spectrum, a user may have difficulty determining exactly where the device 25 is pointed. This is particularly true where the encoded surface 23 is incorporated with a surface not capable of imaging a cursor, such as a large screen or wall with an encoded surface. In these situations, the visible light beam V aids the user in aiming the pointing device 25 at the desired location.
Although not shown, the pointing device 25 may additionally comprise an optic, or optics, arranged between the collimated light source 29 and the encoded surface 23 for directing the collimated light beam C. Such optics may be converging or diverging optics, and the optics may also be integral with the collimated light source 29.
The components of the device 25 may further be mounted on a common substrate (not shown). Specifically, the collimated light source 29, the detector 31, and the controller 33 may mount adjacent one another on the same substrate, providing the added benefit during manufacturing of allowing these components to be added to the device 25 as a single assembly. Moreover, the single substrate aids in packaging the device 25 because the assembly is compact, and lowers cost because only a single substrate is necessary. Mounting the detector 31 and the collimated light source 29 on the same substrate also allows these two components to be placed closer to one another for compactness. The substrate may comprise at least one of a micro-chip, a printed circuit board (PCB), and a leadframe.
In operation, the pointing device 25 may be remote from the encoded surface 23 during pointing. In one example, the device 25 is at least 15 centimeters (6 inches) from the encoded surface 23. In another example, the device 25 is at least 90 centimeters (3 feet) from the encoded surface 23. The device 25 will operate at relatively large distances, as long as the collimated light beam C can reach the encoded surface 23 and sufficient scattered light can reach the detector 31. Such a device 25 is particularly useful for off-desk navigation, large displays, presentations, collaborations, and home-based screens that may extend beyond 3 meters (10 feet) across.
In one example, the position determined on the encoded surface 23 is a relative position. The controller 33 responds to the detected portion of the scattered light to determine any relative movement of the position where the collimated light beam C strikes the encoded surface 23, which corresponds to any relative movement of where the device 25 is pointing. When the collimated light beam C projects onto the encoded surface 23, the scattering features 37 will scatter the light at the particular wavelength corresponding to the wavelength of the light source 29. The spatial and/or time variations of the detected scattered light may then be utilized to deduce the position and displacement of the pointing device 25 operated by the user, which in turn may be utilized to drive software of a connected computer.
In a particular example of an encoded surface 23 utilized for determining relative position, a repetitive pattern may be utilized, as long as the pattern is different in the x-direction than in the y-direction. Such a differentiation between the x-direction and y-direction encoding may be readily achieved by various means, such as geometrical differences (e.g., the width, or spacing, of the pattern), as well as physical properties (e.g., surface roughness and reflectivity). Additionally, any crossing points of the x-direction and y-direction features will provide a third type of unique scattering signature, reflected either in the time duration or in the signal amplitude of the corresponding detected pulse.
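One way to exploit the geometric x/y differentiation described above is to classify each detected pulse by its duration. The Python sketch below assumes illustrative feature widths (the patent gives no numerical values): narrow x-features produce short pulses, wider y-features produce longer pulses, and crossings produce longer pulses still.

```python
def classify_pulses(durations, x_width=1.0, y_width=2.0, tol=0.25):
    """Label each detected pulse duration as an x-feature, a y-feature,
    or a crossing of both. Widths and tolerance are hypothetical."""
    labels = []
    for d in durations:
        if abs(d - x_width) <= tol:
            labels.append('x')       # narrow feature: x-direction stripe
        elif abs(d - y_width) <= tol:
            labels.append('y')       # wider feature: y-direction stripe
        else:
            labels.append('crossing')  # widest signature: x/y intersection
    return labels

assert classify_pulses([1.0, 2.1, 3.5]) == ['x', 'y', 'crossing']
```

In practice the same discrimination could instead use pulse amplitude, as the text notes, with the same thresholding structure.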
Utilizing the encoded surface 23 described above, when the collimated light beams C move across the encoded surface, the detectors 31 send a time-series of signals to the controller that may be digitized as square pulses 71A,71B.
In the example depicted, the spot size D of the collimated light beam C is comparable in size to the scattering features 37 of the encoded surface 23. For this example, only time-series information from a 2×2 photodetector needs to be analyzed, much like the conventional ball mouse. But spatial resolution of this approach may be limited by spot size D. Thus, for a diverging collimated light beam C and a large distance E between the encoded surface 23 and the device 25, the resolution of location detection may be limited as the spot size D increases. In another example, to improve resolution, diffraction techniques may be utilized, wherein a phase grating pattern (not shown) may be used to generate a fringe pattern, and the 2×2 photodetector senses and counts the fringes to deduce the motion of the collimated light beam C with respect to the encoded surface 23.
In one example providing relative position of the pointing device 25, the device 25 comprises a second collimated light source 29B and a second detector 31B associated with the second collimated light source. The second collimated light source 29B projects a second collimated light beam C onto the encoded surface 23, and the encoded surface scatters the second collimated light beam striking the encoded surface. This second detector 31B detects at least a portion of the light scattered from the second collimated light beam C. Generally speaking, one skilled in the art may modify the number of collimated light sources 29, the optics associated with the light sources, and the arrangement of the light sources, optics, and detectors 31 to produce a variety of devices 25.
In another example, the determined position on the encoded surface 23 is an absolute position. For absolute position determination, the encoded surface 23 may include scattering features arranged in spatially varying patterns (e.g., barcode-like patterns) for detecting absolute positions. In absolute positioning examples, the controller 33 requires additional information around the pixels of interest. For example, the encoded surface 23 may be encoded with varying gray-scale or varying reflectivity, among other possibilities, to achieve spatial position-coding.
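A Gray code is one concrete way to realize such spatial position-coding, because adjacent positions differ in only a single feature, so a slightly misaligned read is off by at most one position. The patent does not name Gray coding, and the 8-bit strip width below is an assumption; the Python sketch only illustrates the idea.

```python
def to_gray(n: int) -> int:
    """Convert a position index to its Gray-coded representation."""
    return n ^ (n >> 1)

def from_gray(g: int) -> int:
    """Recover the position index from a Gray-coded reading."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# Each row of the surface carries its coordinate as a Gray-coded strip of
# scattering (1) / non-scattering (0) features, read LSB-first:
row = 37
strip = [(to_gray(row) >> i) & 1 for i in range(8)]   # 8 imaged features
read = sum(bit << i for i, bit in enumerate(strip))   # detector output
assert from_gray(read) == 37
```

Gray-scale or reflectivity levels could encode more than one bit per feature in the same fashion, shortening the strip the detector must image.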
In another example, a method determines a position where a collimated light beam of a pointing device strikes an encoded surface remote from the pointing device. The location corresponds to where the device is pointing. The method comprises projecting the collimated light beam from the pointing device onto the encoded surface generally as set forth above. The encoded surface has light-scattering properties for scattering the collimated light beam. The method further comprises detecting at least a portion of the light scattered by the encoded surface. Finally, the method comprises determining the position where the collimated light beam strikes the encoded surface, which corresponds to where the device is pointing, as a function of a characteristic of the detected scattered light. The method may further comprise utilizing the position information to display an image on a display associated with the encoded surface. The image corresponds to the position where the collimated light beam strikes the encoded surface. Moreover, the method may further comprise utilizing the position information to execute a command on a computer associated with the pointing device. The command corresponds to an item on a display associated with the encoded surface, whereby the item corresponds to the position where the collimated light beam strikes the encoded surface.
The computer 130 typically has at least some form of computer readable media. Computer readable media, which include both volatile and nonvolatile media, removable and non-removable media, may be any available medium that can be accessed by computer 130. By way of example and not limitation, computer readable media comprise computer storage media and communication media. Computer storage media include volatile and nonvolatile, removable and non-removable media implemented in any method or technology for storage of information such as computer readable instructions, data structures, program modules or other data. For example, computer storage media include RAM, ROM, EEPROM, flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other medium that can be used to store the desired information and that can be accessed by computer 130. Communication media typically embody computer readable instructions, data structures, program modules, or other data in a modulated data signal such as a carrier wave or other transport mechanism and include any information delivery media. Those skilled in the art are familiar with the modulated data signal, which has one or more of its characteristics set or changed in such a manner as to encode information in the signal. Wired media, such as a wired network or direct-wired connection, and wireless media, such as acoustic, RF, infrared, and other wireless media, are examples of communication media. Combinations of any of the above are also included within the scope of computer readable media.
The system memory 134 includes computer storage media in the form of removable and/or non-removable, volatile and/or nonvolatile memory. In the illustrated embodiment, system memory 134 includes read only memory (ROM) 138 and random access memory (RAM) 140. A basic input/output system 142 (BIOS), containing the basic routines that help to transfer information between elements within computer 130, such as during start-up, is typically stored in ROM 138. RAM 140 typically contains data and/or program modules that are immediately accessible to and/or presently being operated on by processing unit 132.
The computer 130 may also include other removable/non-removable, volatile/nonvolatile computer storage media.
The drives or other mass storage devices and their associated computer storage media discussed above provide storage of computer readable instructions, data structures, program modules, and other data for computer 130.
A user may enter commands and information into computer 130 through input devices or user interface selection devices such as a keyboard 180 and a pointing device 182 (e.g., a mouse, trackball, pen, or touch pad). Other input devices (not shown) may include a microphone, joystick, game pad, satellite dish, scanner, finger tracker, or the like. These and other input devices are connected to processing unit 132 through a user input interface 184 that is coupled to system bus 136, but may be connected by other interface and bus structures, such as a parallel port, game port, PS/2 port or a Universal Serial Bus (USB). A monitor 188 or other type of display device is also connected to system bus 136 via an interface, such as a video interface 190. In addition to the monitor 188, computers often include other peripheral output devices (not shown) such as a printer and speakers, which may be connected through an output peripheral interface (not shown).
The computer 130 may operate in a networked environment using logical connections to one or more remote computers, such as a remote computer 194. The remote computer 194 may be a personal computer, a server, a router, a network PC, a peer device, or other common network node, and typically includes many or substantially all of the elements described above relative to computer 130. The logical connections include a local area network (LAN) 196 and a wide area network (WAN) 198, but may also include other networks.
When used in a local area networking environment, computer 130 is connected to the LAN 196 through a network interface or adapter 186. When used in a wide area networking environment, computer 130 typically includes a modem 178 or other means for establishing communications over the WAN 198, such as the Internet. The modem 178, which may be internal or external, is connected to system bus 136 via the user input interface 184, or other appropriate mechanism. In a networked environment, program modules depicted relative to computer 130, or portions thereof, may be stored in a remote memory storage device (not shown).
Generally, the data processors of computer 130 are programmed by means of instructions stored at different times in the various computer-readable storage media of the computer. Programs and operating systems are typically distributed, for example, on floppy disks or CD-ROMs. From there, they are installed or loaded into the secondary memory of a computer. At execution, they are loaded at least partially into the computer's primary electronic memory. The invention described herein includes these and other various types of computer-readable storage media when such media contain instructions or programs for implementing the operations described below in conjunction with a microprocessor or other data processor.
For purposes of illustration, programs and other executable program components, such as the operating system, are illustrated herein as discrete blocks. It is recognized, however, that such programs and components reside at various times in different storage components of the computer, and are executed by the data processor(s) of the computer.
Although described in connection with an exemplary computing system environment, including computer 130, the invention is operational with numerous other general purpose or special purpose computing system environments or configurations. The computing system environment is not intended to suggest any limitation as to the scope of use or functionality of the invention. Moreover, the computing system environment should not be interpreted as having any dependency or requirement relating to any one or combination of components illustrated in the exemplary operating environment. Examples of well known computing systems, environments, and/or configurations that may be suitable for use with the invention include, but are not limited to, personal computers, server computers, hand-held or laptop devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, mobile telephones, personal digital assistants (PDAs), network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. The present invention is also applicable for non-computer applications, such as television remote cursor control, among others.
Those skilled in the art will note that the order of execution or performance of the methods illustrated and described herein is not essential, unless otherwise specified. That is, it is contemplated by the inventors that elements of the methods may be performed in any order, unless otherwise specified, and that the methods may include more or fewer elements than those disclosed herein.
When introducing elements of the present invention or the embodiment(s) thereof, the articles “a,” “an,” “the,” and “said” are intended to mean that there are one or more of the elements. The terms “comprising,” “including,” and “having” are intended to be inclusive and mean that there may be additional elements other than the listed elements.
In view of the above, it will be seen that the several objects of the invention are achieved and other advantageous results attained.
As various changes could be made in the above products and methods without departing from the scope of the invention, it is intended that all matter contained in the above description and shown in the accompanying drawings shall be interpreted as illustrative and not in a limiting sense.
There are three major techniques employed today for optical thin-film coating to achieve the desired transmission/reflection/absorption characteristics: thermal vaporization, ion-assisted thermal vaporization, and sputtering. Of the three techniques, the ion-assisted thermal vaporization technique has the following benefits:
The deposition may be carried out on crystal, glass, flexible or rigid plastic substrates, or other surfaces of interest to serve as the two-dimensional surface. The likely thin-film structure is a non-metallic multilayer film. The film should possess suitable mechanical, reliability, and optical properties, such as proper forward-scattering effects.
The typical process for discovering and developing the proper thin film coating is:
The software model only accounts for optical characteristics; thus, one needs to examine mechanical and other properties, such as hardness and adhesion, in addition to optical characterization.
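As an illustrative sketch only (not part of the patent disclosure), the optical-modeling step mentioned above is commonly performed with the transfer-matrix method, which computes the normal-incidence reflectance of a multilayer stack from each layer's refractive index and thickness. The layer indices, thicknesses, and the quarter-wave example below are assumed values for illustration:

```python
import cmath
import math

def multilayer_reflectance(n0, layers, ns, wavelength):
    """Normal-incidence reflectance of a thin-film stack via the
    transfer-matrix method. `layers` is a list of (index, thickness)
    pairs from the incident-medium side; n0 and ns are the refractive
    indices of the incident medium and the substrate."""
    # Running characteristic matrix, initialized to the identity.
    m11, m12, m21, m22 = 1.0, 0.0, 0.0, 1.0
    for n, d in layers:
        delta = 2 * math.pi * n * d / wavelength  # phase thickness of layer
        c, s = cmath.cos(delta), cmath.sin(delta)
        a11, a12 = c, 1j * s / n
        a21, a22 = 1j * n * s, c
        # Multiply the running matrix by this layer's matrix.
        m11, m12, m21, m22 = (m11 * a11 + m12 * a21, m11 * a12 + m12 * a22,
                              m21 * a11 + m22 * a21, m21 * a12 + m22 * a22)
    num = n0 * m11 + n0 * ns * m12 - m21 - ns * m22
    den = n0 * m11 + n0 * ns * m12 + m21 + ns * m22
    return abs(num / den) ** 2

# Example (assumed values): a quarter-wave MgF2 layer (n ~ 1.38) on glass
# (n ~ 1.52) should reflect less than bare glass at the design wavelength.
wl = 550e-9
R_coated = multilayer_reflectance(1.0, [(1.38, wl / (4 * 1.38))], 1.52, wl)
R_bare = ((1.0 - 1.52) / (1.0 + 1.52)) ** 2
```

A model of this kind covers only the optical response; as noted above, mechanical properties such as hardness and adhesion must be evaluated separately.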
The patterning can be realized through one of the following methods:
With ink-jet printing, a variation of the above involves mixing suitable elements into the ink so that the ink emits, e.g., fluorescence detectable by the camera with the correct optical filtering. In this way, patterns may be detected even when the encoded patterns are shadowed by the regular printing of text and graphics.
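As a minimal sketch (not from the patent), once optical filtering isolates the fluorescent emission, the encoded marks appear as bright regions in the camera image and can be located by thresholding followed by connected-component grouping. The image layout, threshold, and 4-connectivity choice here are illustrative assumptions:

```python
def extract_marks(image, threshold):
    """Return centroids (row, col) of bright marks, e.g., fluorescent
    encoded dots, in a filtered camera image given as a 2-D list of
    intensities. Above-threshold pixels are grouped by a 4-connected
    flood fill, and each group's centroid is reported."""
    rows, cols = len(image), len(image[0])
    seen = [[False] * cols for _ in range(rows)]
    marks = []
    for r0 in range(rows):
        for c0 in range(cols):
            if image[r0][c0] >= threshold and not seen[r0][c0]:
                stack, pixels = [(r0, c0)], []
                seen[r0][c0] = True
                while stack:
                    r, c = stack.pop()
                    pixels.append((r, c))
                    for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        rr, cc = r + dr, c + dc
                        if (0 <= rr < rows and 0 <= cc < cols and
                                not seen[rr][cc] and image[rr][cc] >= threshold):
                            seen[rr][cc] = True
                            stack.append((rr, cc))
                # Centroid of this connected group of bright pixels.
                ys = sum(p[0] for p in pixels) / len(pixels)
                xs = sum(p[1] for p in pixels) / len(pixels)
                marks.append((ys, xs))
    return marks

# Example (assumed data): two bright marks in a 4x6 filtered image.
image = [
    [0, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 0],
    [0, 9, 9, 0, 0, 8],
    [0, 0, 0, 0, 0, 0],
]
marks = extract_marks(image, 5)
```

The recovered centroids would then be decoded against the known encoding scheme to yield absolute position.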
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4364035||Jun 15, 1981||Dec 14, 1982||Kirsch Steven T||Electro-optical mouse|
|US4499374 *||May 21, 1982||Feb 12, 1985||Mitutoyo Mfg. Co., Ltd.||Photoelectrical encoder employing an optical grating|
|US4719455||Jan 24, 1986||Jan 12, 1988||Louis William M||Integrating pointing device|
|US4794384||Apr 9, 1987||Dec 27, 1988||Xerox Corporation||Optical translator device|
|US5107541||Jul 5, 1990||Apr 21, 1992||National Research Development Corporation||Method and apparatus for capturing information in drawing or writing|
|US5155355 *||Sep 5, 1990||Oct 13, 1992||Mitutoyo Corporation||Photoelectric encoder having a grating substrate with integral light emitting elements|
|US5274361||Aug 15, 1991||Dec 28, 1993||The United States Of America As Represented By The Secretary Of The Navy||Laser optical mouse|
|US5442147||Apr 3, 1992||Aug 15, 1995||Hewlett-Packard Company||Position-sensing apparatus|
|US5574480||May 8, 1995||Nov 12, 1996||Kensington Microware Limited||Computer pointing device|
|US5604345 *||Nov 16, 1995||Feb 18, 1997||Mitutoyo Corporation||Optical encoder having a combination of a uniform grating and a non-uniform grating|
|US5644139||Aug 14, 1996||Jul 1, 1997||Allen; Ross R.||Navigation technique for detecting movement of navigation sensors relative to an object|
|US5712658||Sep 6, 1994||Jan 27, 1998||Hitachi, Ltd.||Information presentation apparatus and information display apparatus|
|US5729009||Jun 7, 1995||Mar 17, 1998||Logitech, Inc.||Method for generating quasi-sinusoidal signals|
|US5793357||Nov 15, 1993||Aug 11, 1998||Ivey; Peter Anthony||Device and method for determining movement of a surface|
|US5852434||Dec 18, 1995||Dec 22, 1998||Sekendur; Oral F.||Absolute optical position determination|
|US5907152||Jun 19, 1997||May 25, 1999||Logitech, Inc.||Pointing device utilizing a photodetector array|
|US5914783||Mar 24, 1997||Jun 22, 1999||Mitsubishi Electric Information Technology Center America, Inc.||Method and apparatus for detecting the location of a light source|
|US6031218||Mar 8, 1999||Feb 29, 2000||Logitech, Inc.||System and method for generating band-limited quasi-sinusoidal signals|
|US6057540||Apr 30, 1998||May 2, 2000||Hewlett-Packard Co||Mouseless optical and position translation type screen pointer control for a computer system|
|US6124587||Dec 23, 1998||Sep 26, 2000||Logitech Inc.||Pointing device utilizing a photodetector array|
|US6246482||Mar 8, 1999||Jun 12, 2001||Gou Lite Ltd.||Optical translation measurement|
|US6256016||Jun 5, 1997||Jul 3, 2001||Logitech, Inc.||Optical detection system, device, and method utilizing optical matching|
|US6281882||Mar 30, 1998||Aug 28, 2001||Agilent Technologies, Inc.||Proximity detector for a seeing eye mouse|
|US6310988||Aug 31, 1998||Oct 30, 2001||Xerox Parc||Methods and apparatus for camera pen|
|US6323839||Nov 6, 1997||Nov 27, 2001||Canon Kabushiki Kaisha||Pointed-position detecting apparatus and method|
|US6330057||Mar 8, 1999||Dec 11, 2001||Otm Technologies Ltd.||Optical translation measurement|
|US6331848||Apr 24, 1997||Dec 18, 2001||U.S. Philips Corporation||Projection display system|
|US6424407||Mar 9, 1998||Jul 23, 2002||Otm Technologies Ltd.||Optical translation measurement|
|US6448977||Feb 15, 2000||Sep 10, 2002||Immersion Corporation||Textures and other spatial sensations for a relative haptic interface device|
|US6452683||Mar 8, 1999||Sep 17, 2002||Otm Technologies Ltd.||Optical translation measurement|
|US6454482||Oct 20, 2000||Sep 24, 2002||Silverbrook Research Pty Ltd||Universal pen|
|US6455840||Oct 28, 1999||Sep 24, 2002||Hewlett-Packard Company||Predictive and pulsed illumination of a surface in a micro-texture navigation technique|
|US6474888||Oct 20, 2000||Nov 5, 2002||Silverbrook Research Pty Ltd.||Universal pen with code sensor|
|US6498604||May 20, 1999||Dec 24, 2002||Kanitech A/S||Input device for a computer|
|US6515651||Sep 24, 1998||Feb 4, 2003||International Business Machines Corporation||Reversible wireless pointing device|
|US6570104||May 26, 2000||May 27, 2003||Anoto Ab||Position determination|
|US6621068 *||Jun 27, 2001||Sep 16, 2003||Mitutoyo Corporation||Optical encoder and method of fabricating its sensor head|
|US6667695||Jun 25, 2002||Dec 23, 2003||Anoto Ab||Position code|
|US6674427||Oct 2, 2000||Jan 6, 2004||Anoto Ab||Position determination II—calculation|
|US6689966||Mar 21, 2001||Feb 10, 2004||Anoto Ab||System and method for determining positional information|
|US6707027||Nov 6, 2001||Mar 16, 2004||Koninklijke Philips Electronics N.V.||Method of measuring the movement of an input device|
|US6759647 *||Jan 16, 2003||Jul 6, 2004||Harmonic Drive Systems, Inc.||Projection encoder|
|US6918538||Apr 29, 2003||Jul 19, 2005||Symbol Technologies, Inc.||Image scanning device having a system for determining distance to a target|
|US20030103037||Dec 5, 2001||Jun 5, 2003||Em Microelectronic-Marin Sa||Sensing device for optical pointing devices such as an optical mouse|
|US20040061680||Jul 7, 2003||Apr 1, 2004||John Taboada||Method and apparatus for computer control|
|US20050035947||Aug 15, 2003||Feb 17, 2005||Microsoft Corporation||Data input device for tracking and detecting lift-off from a tracking surface by a reflected laser speckle pattern|
|EP0295720A2||Jun 20, 1988||Dec 21, 1988||Omron Tateisi Electronics Co.||Laser speckle velocity-measuring apparatus|
|GB2272763A||Title not available|
|WO1997043607A1||May 9, 1997||Nov 20, 1997||Michel Sayag||Method and apparatus for generating a control signal|
|1||Asakura et al., "Dynamic Laser Speckles and Their Application to Velocity Measurements of the Diffuse Object," Applied Physics, 1981, 179-194.|
|2||Meyer, "Pen Computing: A Technology Overview and A Vision," ACM SIGCHI Bulletin, Jul. 1995, pp. 46-90, vol. 27, Issue 3, ACM Press, New York, USA.|
|3||Ohtsubo et al., "Velocity Measurement of a Diffuse Object by Using Time-Varying Speckles," Optical and Quantum Electronics, 1976, pp. 523-529, Chapman and Hall Ltd., Great Britain.|
|4||Optical Mouse Saves Space, The Online Photonics Resource, http://optics.org, Dec. 8, 2003, 2 pp., United States.|
|5||Optical Scrolling, The Online Photonics Resource, http://optics.org, Dec. 8, 2003, 1 pg., United States.|
|6||Prototype Device, The Online Photonics Resource, http://optics.org, Dec. 8, 2003, 1 pg., United States.|
|7||Schnell et al., "Detection of Movement with Laser Speckle Patterns: Statistical Properties," Optical Society of America, Jan. 1998, pp. 207-216, vol. 15, No. 1.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7667855 *||Feb 29, 2008||Feb 23, 2010||International Business Machines Corporation||Providing position information to computing equipment installed in racks of a datacenter|
|US20090219536 *||Feb 29, 2008||Sep 3, 2009||International Business Machines Corporation||Providing Position Information To Computing Equipment Installed In Racks Of A Datacenter|
|US20100060567 *||Sep 5, 2008||Mar 11, 2010||Microsoft Corporation||Controlling device operation relative to a surface|
|EP2336715A1||Nov 25, 2010||Jun 22, 2011||Chung Shan Institute of Science and Technology, Armaments Bureau, M.N.D.||Method and system for positioning by using optical speckle|
|International Classification||G06F3/03, G01J1/42, G01B11/26|
|Cooperative Classification||G06F3/0321, G06F3/0317|
|Jun 25, 2004||AS||Assignment|
Owner name: MICROSOFT CORPORATION, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KONG, YUAN;LARSEN, GLEN;REEL/FRAME:015502/0479
Effective date: 20040608
|Dec 8, 2010||FPAY||Fee payment|
Year of fee payment: 4
|Dec 9, 2014||AS||Assignment|
Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034541/0477
Effective date: 20141014
|Dec 29, 2014||FPAY||Fee payment|
Year of fee payment: 8