|Publication number||USRE40880 E1|
|Application number||US 11/529,693|
|Publication date||Aug 25, 2009|
|Filing date||Sep 28, 2006|
|Priority date||May 17, 2000|
|Also published as||US6611252, US6798401, US20030193479|
|Inventors||Douglas P. DuFaux|
|Original Assignee||P. Milton Holdings, Llc|
This is a continuation of application Ser. No. 09/572,349, filed on May 17, 2000 now U.S. Pat. No. 6,611,252.
The present invention relates generally to data input devices and, more particularly, to data input devices adapted for use with portable communications and computing equipment.
Demand for compact communications and computing equipment has increased dramatically over the last decade. Computers that can be held in the palm of your hand and wireless phones that fit in a shirt pocket are two examples of popular miniaturized machines. More recently, demand for wireless e-mail and Internet access has begun to soar, with experts projecting future demand to rise at unprecedented rates.
One problem associated with miniaturized communications and computing equipment is having a convenient way to input data, such as character and pointing device data, into such equipment. Early miniaturized computing equipment, typical of the 1990s, included a miniaturized keyboard that was scaled to fit the desired size of the computing equipment. Typing even a few words on such systems is quite laborious because the operator's fingers are typically too large to use the device as a traditional keyboard. Portable communication equipment, on the other hand, typically includes a conventional 12-button keypad for data input. It is extremely difficult to use this set-up to enter non-numerical data. For example, to enter the word CALL, an operator would hit the button marked “2-A-B-C” three times for C, the same button once for A, the button marked “5-J-K-L” three times for L, and finally the “5-J-K-L” button three times again for the final L.
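The multi-press scheme just described can be sketched in a few lines; the keypad mapping and function name below are illustrative only and not part of the disclosure:

```python
# Minimal sketch of multi-press ("multi-tap") keypad entry as described
# above. The KEYPAD layout and helper name are illustrative assumptions.
KEYPAD = {
    "2": "ABC", "3": "DEF", "4": "GHI", "5": "JKL",
    "6": "MNO", "7": "PQRS", "8": "TUV", "9": "WXYZ",
}

def multitap(presses):
    """Decode (key, press_count) pairs: each key cycles through its letters."""
    return "".join(KEYPAD[key][count - 1] for key, count in presses)

# Entering CALL takes ten presses in total:
word = multitap([("2", 3), ("2", 1), ("5", 3), ("5", 3)])  # → "CALL"
```

Ten presses for a four-letter word illustrates why this form of entry is so laborious.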
To ease the problem of character input, some manufacturers of both communications and computing equipment have recently developed pen-type portable devices in which no physical keyboard or keypad is provided and data input is carried out using a miniaturized virtual keyboard displayed on a touch-sensitive screen. The pen can also be used as a pointing device to select items on the screen. An additional feature of many touch-sensitive screen systems is the ability to write characters with a pen or stylus that the device recognizes as individual characters. Another recent development is the collapsible keyboard, such as those currently being marketed by Palm Computing. This keyboard may be folded and carried in a briefcase or even a pocket, and is opened and plugged into the miniaturized equipment before use. Yet another development is voice recognition. However, this technology is not currently highly reliable and, as a result, input errors are common. Furthermore, numerous circumstances arise when voice input is not practical or appropriate. Moreover, voice recognition is not suitable for entering pointing device information.
While each of these methods represents a form of improvement over previous technologies, the need remains for a data input device for use with miniaturized communications and computing equipment that allows an operator to easily input characters and data into such equipment. Preferably, such an input device would incorporate wireless techniques to sense the position and motion of the operator's fingers to allow the user to enter data without the use of a physical keyboard or a pointing device.
A data input device having these desired features has now been developed. Broadly speaking, the data input device of the present invention optically interfaces with an operator to detect the position of objects within a particular input zone, e.g., an area defined as a “virtual keyboard” in which the operator may interact to enter character data into associated computing equipment. Preferably, the objects are the operator's fingers placed within the input zone. As a character input device, each character corresponds to a unique arrangement and position of the objects within the input zone. As a pointing device, the relative motion of the operator's fingers within the input zone defines the input data. The input device includes a source of optical sensor light that illuminates the input zone. The source of optical sensor light may be ambient light surrounding the operator or a light emitting device adapted to emit light in a direction toward the operator's fingers. Preferably, the source of optical sensor light covers the entire input zone. The sensor light reflects off the objects in a direction generally back toward the input device. The data input device also includes an optical detector arranged to receive the reflected sensor light as a reflected light pattern representing the relative position of the operator's fingers within the input zone. The optical detector converts the reflected light pattern to an electrical signal representing the particular character data desired to be entered by the operator. A microprocessor then receives the electrical signal and correlates the electrical signal to character or position/motion data.
In one embodiment of the virtual data input device, an image generator is used to project an optical image that represents character data, e.g., an image of a real keyboard. The image generator may be formed by an optical element, such as a stencil, mask, holographic element, mirror array, or other suitable device known in the art of image projection, designed to allow light to pass through portions of the optical element and a light generator positioned to emit visible light through the portions of the optical element, whereby the light passed through the optical element forms the optical image.
The present invention also includes a novel method to enter character data into communications or computing equipment based on the position of objects within an input zone, e.g., an operator's fingers on a virtual keyboard. The position of the objects within the input zone uniquely corresponds to particular character data. A reference position is established to associate an initial position of the objects within the input zone and to associate a plurality of positions of the objects within the input zone to unique character data. A source of optical sensor light is provided to illuminate the input zone such that the sensor light reflects off the objects. The reflected sensor light is then received as a reflected light pattern representing the position of the objects within the input zone. The reflected light pattern is converted to an electrical signal that is then correlated to the unique selection of character data, which may then be input into the electronic equipment.
These and other features, aspects, and advantages of the present invention will become better understood with regard to the following description, appended claims, and accompanying drawings wherein:
These drawings are provided for illustrative purposes only and should not be used to unduly limit the scope of the present invention.
The light generator 22 delivers a visible light beam 24 to a collimating lens 26. The collimating lens 26 expands the light beam 24 and emits collimated rays 28 of light having an increased cross-sectional area. Rays 28 are then projected through a deflective optical element 30 and through lens 32. Lens 32 expands the rays 28 to project an image, such as of a full size keyboard, at a suitable distance from the input device 10 and preferably adjacent an operator. Alternatively, the image generator 20 may be designed to project light through the deflective optical element 30 at a particular distance without the use of lens 32. A diverging image beam 34 is produced that projects an image, such as an image 80 of a conventional keyboard, onto any surface. The image 80 is typically formed from a plurality of discrete sub-images 82 that may represent, for example, particular keys on a conventional keyboard. The image 80 generally defines an input zone, i.e., the two-dimensional area within which the virtual data input device 10 will detect the position of the operator's fingers and correlate such position to unique character data. Each of the discrete sub-images 82 may then be selected by the operator for input into the equipment 70 based upon a unique position of objects within the input zone, e.g., a particular arrangement of fingers on the “virtual keyboard.”
The light generator 22 may include features to control the brightness, contrast and focal length of image 80. In addition, collimating lens 26 and/or lens 32 may be adjustable to allow the operator to control the position and focus of the projected image 80.
As shown in
To provide accurate input data and to reduce the possibility of input errors, the optical sensor receiving unit 50 may incorporate features to limit the wavelength of light detected by the light sensing device 52. For example, the light sensing device 52 may be designed to detect only the wavelength of light emitted by the light emitting device 42. Thus, the receiving unit 50 may include a light filter 56 to eliminate light of other wavelengths. Alternatively, the light sensing device 52 may be tuned to be wavelength sensitive or selective and designed for the particular wavelength of light emitted by the light emitting device 42.
As those skilled in the art will appreciate, the optical sensor receiving unit 50 requires a source of optical sensor light, which is described above as the light source 40. In an alternative embodiment, the virtual data input device 10 may rely upon ambient light existing in the operator's surroundings as the source of optical sensor light to be reflected off of the operator's fingers and detected by the light sensing device 52. This embodiment, which eliminates the need for the components illustrated in
The optical sensor receiving unit 50 transmits the electrical signal that it generates to a microprocessor 60, which may be included within the virtual data input device 10. Alternatively, the optical sensor receiving unit 50 may send the electrical signal to a microprocessor 60 of the equipment 70 such that the signals may be interpreted and appropriate data input into the equipment 70. Thus, the microprocessor 60 interprets the signal from the optical sensor receiving unit 50, which is then delivered to and recognized by the communications and computing equipment 70. The microprocessor 60 preferably includes an algorithm to determine which of the character data is selected. As an example, the microprocessor 60 may include a stored database of character data signals and compare the received electrical signals to the signals stored in the database to correlate the received signals to character data. It may be preferred for the operator to first “teach” the system the unique position and/or motion of his/her fingers for a given character. This information may be stored in a database for reference during operation.
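The teach-then-correlate procedure described above might be sketched as follows. A two-number tuple stands in for the electrical signal, and a simple nearest-match rule with a tolerance is assumed; none of these names or values come from the patent:

```python
# Hypothetical sketch: "teach" stores a signal per character; "correlate"
# returns the character whose stored signal is closest to the received
# one, or None when nothing falls within the tolerance (in which case,
# as the text notes, no character is input during that cycle).
import math

def teach(database, character, signal):
    database[character] = signal

def correlate(database, signal, tolerance=0.1):
    best_char, best_dist = None, tolerance
    for char, stored in database.items():
        dist = math.dist(stored, signal)   # Euclidean distance (Python 3.8+)
        if dist < best_dist:
            best_char, best_dist = char, dist
    return best_char

db = {}
teach(db, "j", (0.40, 0.55))
teach(db, "f", (0.10, 0.52))
match = correlate(db, (0.41, 0.56))        # close to the taught "j" signal
miss = correlate(db, (0.90, 0.90))         # matches nothing stored → None
```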
The step of reading the operator's finger positions may be accomplished in a variety of ways. For example, the electrical signal generated by the optical sensor receiving unit 50 and transmitted to microprocessor 60 may be read substantially continuously (e.g., every tenth, hundredth, or thousandth of a second) by microprocessor 60 and a character determined after the signal changes. For example, when the operator's fingers are positioned stationary over the “home keys,” i.e., “asdf” and “jkl;”, the signal from the optical sensor receiving unit 50 remains constant and no character data would be transmitted to the equipment 70. If the operator desires to input the letter “j”, the right index finger positioned on that letter would be raised and then repositioned on the “j”. After the motion is complete, the microprocessor 60 would detect a change in the signal from the optical sensor receiving unit 50, compare the new signal to the previous signal and to a compiled look-up database, and correctly read the newly inputted character. If the new signal does not match any signal stored in the database, no data or character is input during that particular cycle.
Once the operator is aligned with the image, the virtual data input device 10 begins to repetitively attempt to read new character data in a series of detection cycles. Thus, after a preset delay time (e.g., 10 milliseconds), the optical sensor receiving unit 50 receives a pattern of light reflected off of the operator's fingers that represents the position of the operator's fingers within an input zone, i.e., on the image. As described earlier, the reflected light may originate from an optical sensor light source 40 or from ambient light in the operator's surroundings. The reflected light pattern is converted to an electrical signal that is then compared to the electrical signal generated in the last detection cycle. If the electrical signal differs from the signal in the previous cycle, the signal is then used to determine the character (or key) associated with such signal. If the signal corresponds to a recognizable character, the microprocessor 60 accepts such character and inputs it to the equipment 70.
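The repeated detection cycle described in the last two paragraphs can be sketched as a polling loop. Here `read_signal`, `correlate`, and `emit` stand in for the hardware read, the database comparison, and the input to the equipment, all of which the text describes only abstractly; the concrete signal strings in the usage lines are hypothetical:

```python
# Sketch of the detection cycle: wait a preset delay, read the reflected-
# light signal, and act only when the signal differs from the previous
# cycle's signal and correlates to a recognizable character.
import time

def detection_loop(read_signal, correlate, emit, delay=0.010, cycles=4):
    previous = None
    for _ in range(cycles):
        time.sleep(delay)            # preset delay time, e.g. 10 ms
        signal = read_signal()       # reflected light pattern -> signal
        if signal != previous:       # unchanged pattern: input nothing
            char = correlate(signal)
            if char is not None:     # unrecognized pattern: input nothing
                emit(char)
        previous = signal

# Hypothetical usage: fingers rest on the home keys for two cycles,
# then move to (and stay on) the position taught for "j".
signals = iter(["home", "home", "j_sig", "j_sig"])
typed = []
detection_loop(lambda: next(signals), {"j_sig": "j"}.get, typed.append, delay=0)
```

Only one “j” is emitted even though the finger rests on the key for two cycles, matching the change-detection rule in the text.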
The operation of the virtual data input device 10 may be described with reference to
The method, which is similar in many respects to the method illustrated in
A microprocessor may be used to execute an algorithm to determine which of a plurality of selection options (e.g., keys) is selected. For example, the microprocessor may compare the electrical signal to a stored database of signals representing unique character data selections. Alternatively, the microprocessor may compare the electrical signal to the previous electrical signal to relate the position of a pointing device on a display screen.
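In pointing-device mode, the comparison against the previous cycle's signal reduces to a relative-motion computation; the coordinate representation below is an assumption for illustration:

```python
# Hypothetical sketch: in mouse mode the microprocessor reports how far
# the finger position moved between cycles rather than looking the
# signal up in a character database.
def pointer_delta(previous, current):
    """Return (dx, dy) cursor motion between two finger-position signals."""
    if previous is None:
        return (0, 0)                # first cycle: no motion yet
    px, py = previous
    cx, cy = current
    return (cx - px, cy - py)

delta = pointer_delta((5, 7), (8, 6))    # finger moved between cycles
```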
As one will understand upon reading the disclosure of the present invention, the use of a projected image is solely for the operator's benefit. Just as many people can readily type on a keyboard without looking down to “hunt and peck” for keys, one may use the virtual data input device without the use of the projected image. It is preferred that there is a mechanism to align the operator's fingers to the virtual data input device 10. Alternatives to the projected image include a simple sheet of paper that is located at a predetermined distance and position from the equipment (e.g., the equipment is rested on a spot marked on the paper). However, since the system is ultimately comparing the relative positions of the operator's fingers, there is an acceptable range of space for the operator to place his/her fingers and, therefore, alignment of the device 10 is not a necessity.
In another embodiment, the virtual data input device 10 may include the ability to “learn” additional keystrokes. Thus, the operator may assign certain finger positions to a character or set of characters similar to a macro in popular word processing and spreadsheet software programs. Additionally, the system may incorporate a “Hot Key” that allows the operator to switch between modes (e.g., from keyboard to mouse) by simply moving his/her fingers to a predetermined position.
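The “learn” and “Hot Key” features might be sketched together as follows; the signal names, macro text, and the two mode labels are all assumptions, not details from the disclosure:

```python
# Hypothetical sketch: a taught position can expand to a string of
# characters (a macro), and one reserved position toggles the device
# between keyboard and mouse modes.
HOT_KEY_SIGNAL = "hotkey_sig"

class VirtualInput:
    def __init__(self):
        self.macros = {}             # taught signal -> character(s)
        self.mode = "keyboard"

    def learn(self, signal, text):
        """Assign a finger position to a character or set of characters."""
        self.macros[signal] = text

    def handle(self, signal):
        if signal == HOT_KEY_SIGNAL:    # toggle keyboard <-> mouse
            self.mode = "mouse" if self.mode == "keyboard" else "keyboard"
            return ""
        return self.macros.get(signal, "")

vi = VirtualInput()
vi.learn("sig_close", "Sincerely yours,")    # macro-style expansion
expansion = vi.handle("sig_close")
vi.handle(HOT_KEY_SIGNAL)                    # operator switches modes
```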
The principles of the present invention may be used as a character input device for various forms of portable communications and computer equipment, including cell phones, Internet-ready cell phones, palm-size computers, personal computers, laptop computers, notebook computers, personal digital assistants, and the like. The invention may also be used to input character data into portable hybrid communications and computing equipment such as palm-size computer telephones, standard desktop computers, hand-held data receiving or entry devices, GPS receivers, inventory management devices, and portable computing equipment integrated with automobiles, aircraft and the like. In addition, the virtual data input device may be used to detect the position of other forms of input means beyond an operator's fingers, such as a writing stylus or a computer mouse. Moreover, the optical sensor light source 40 and the optical sensor receiving unit 50 may be configured to detect the position of an operator's fingers or other objects on a computer display screen. Thus, the features of the present invention can allow the virtual data input device to act as a form of touch sensitive screen.
Although the present invention has been described in considerable detail with reference to certain presently preferred embodiments thereof, other embodiments are possible without departing from the spirit and scope of the present invention. Therefore the appended claims should not be limited to the description of the preferred versions contained herein.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4177354||Apr 17, 1978||Dec 4, 1979||Bell Telephone Laboratories, Incorporated||Graphic communications apparatus|
|US4713535 *||Sep 4, 1985||Dec 15, 1987||Rhoades Randy L||Optical keyboard|
|US4980685 *||Feb 24, 1987||Dec 25, 1990||Alain Souloumiac||Scanning optical keyboard|
|US5450148 *||Apr 18, 1994||Sep 12, 1995||Yu S. Lin||Laser pointer with selectable pointer patterns|
|US5457454 *||Sep 21, 1993||Oct 10, 1995||Fujitsu Limited||Input device utilizing virtual keyboard|
|US5581484 *||Jun 27, 1994||Dec 3, 1996||Prince; Kevin R.||Finger mounted computer input device|
|US5718496 *||Jun 25, 1996||Feb 17, 1998||Digital Optics Corporation||Projection pointer|
|US5757361 *||Mar 20, 1996||May 26, 1998||International Business Machines Corporation||Method and apparatus in computer systems to selectively map tablet input devices using a virtual boundary|
|US5785439 *||May 19, 1997||Jul 28, 1998||Product Engineering & Manufacturing, Inc.||Environmentally safe machine control security switch|
|US5815126 *||May 21, 1996||Sep 29, 1998||Kopin Corporation||Monocular portable communication and display system|
|US5909210 *||Mar 20, 1997||Jun 1, 1999||Compaq Computer Corporation||Keyboard-compatible optical determination of object's position|
|US5938308 *||Aug 4, 1997||Aug 17, 1999||Digital Optics Corporation||Projection pointer|
|US5977935 *||Aug 12, 1994||Nov 2, 1999||Seiko Epson Corporation||Head-mounted image display device and data processing apparatus including the same|
|US6002390 *||Nov 21, 1997||Dec 14, 1999||Sony Corporation||Text input device and method|
|US6022126 *||Jun 26, 1997||Feb 8, 2000||Sekinos Co., Ltd.||Laser pointer|
|US6037882 *||Sep 30, 1997||Mar 14, 2000||Levy; David H.||Method and apparatus for inputting data to an electronic system|
|US6097374 *||Mar 6, 1998||Aug 1, 2000||Howard; Robert Bruce||Wrist-pendent wireless optical keyboard|
|US6167469 *||May 18, 1998||Dec 26, 2000||Agilent Technologies, Inc.||Digital camera having display device for displaying graphical representation of user input and method for transporting the selected digital images thereof|
|US6218967 *||Apr 1, 1997||Apr 17, 2001||Kyosti Veijo Olavi Maula||Arrangement for the optical remote control of apparatus|
|US6266048 *||Aug 27, 1998||Jul 24, 2001||Hewlett-Packard Company||Method and apparatus for a virtual display/keyboard for a PDA|
|US6281878 *||Jan 26, 1998||Aug 28, 2001||Stephen V. R. Montellese||Apparatus and method for inputing data|
|US6611252 *||May 17, 2000||Aug 26, 2003||Dufaux Douglas P.||Virtual data input device|
|US6614422 *||Feb 11, 2000||Sep 2, 2003||Canesta, Inc.||Method and apparatus for entering data using a virtual input device|
|US20050024324 *||Dec 30, 2003||Feb 3, 2005||Carlo Tomasi||Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device|
|1||IBM "Virtual Keyboard", IBM Technical Disclosure Bulletin, Mar. 1990, pp. 359-360.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US8552983 *||Jul 8, 2008||Oct 8, 2013||Hsien-Hsiang Chiu||Intelligent robotic interface input device|
|US8959013 *||Sep 25, 2011||Feb 17, 2015||Apple Inc.||Virtual keyboard for a non-tactile three dimensional user interface|
|US9305229||Nov 29, 2012||Apr 5, 2016||Bruno Delean||Method and system for vision based interfacing with a computer|
|US20100103106 *||Jul 8, 2008||Apr 29, 2010||Hsien-Hsiang Chiu||Intelligent robotic interface input device|
|US20120078614 *||Sep 25, 2011||Mar 29, 2012||Primesense Ltd.||Virtual keyboard for a non-tactile three dimensional user interface|
|U.S. Classification||345/168, 345/170|
|International Classification||G06F3/033, G06F3/042, G09G5/00|
|Apr 28, 2009||AS||Assignment|
Owner name: P. MILTON HOLDINGS, LLC, DELAWARE
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:TREE FROG TECHNOLOGIES, LLC;REEL/FRAME:022607/0835
Effective date: 20060429
Owner name: TREE FROG TECHNOLOGIES, LLC, NEW YORK
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:DUFAUX, DOUGLAS P.;REEL/FRAME:022607/0824
Effective date: 20060620
|Feb 24, 2012||FPAY||Fee payment|
Year of fee payment: 8
|Nov 19, 2015||AS||Assignment|
Owner name: BENHOV GMBH, LLC, DELAWARE
Free format text: MERGER;ASSIGNOR:P. MILTON HOLDINGS, LLC;REEL/FRAME:037094/0504
Effective date: 20150811
|Feb 23, 2016||FPAY||Fee payment|
Year of fee payment: 12