WO1994012897A1 - Diffusion-assisted position location, particularly for visual pen detection - Google Patents

Diffusion-assisted position location, particularly for visual pen detection

Info

Publication number
WO1994012897A1
Authority
WO
WIPO (PCT)
Prior art keywords
accordance
intensity distribution
photodetector
image
elements
Prior art date
Application number
PCT/US1993/011170
Other languages
French (fr)
Inventor
David I. Dunthorn
Original Assignee
Dunthorn David I
Priority date
Filing date
Publication date
Application filed by Dunthorn David I filed Critical Dunthorn David I
Priority to EP94903272A priority Critical patent/EP0671018A4/en
Priority to JP6512560A priority patent/JPH08506193A/en
Priority to AU57276/94A priority patent/AU5727694A/en
Publication of WO1994012897A1 publication Critical patent/WO1994012897A1/en

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F 3/0421: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01V: GEOPHYSICS; GRAVITATIONAL MEASUREMENTS; DETECTING MASSES OR OBJECTS; TAGS
    • G01V 8/00: Prospecting or detecting by optical means
    • G01V 8/10: Detecting, e.g. by using light barriers

Definitions

  • the present invention relates generally to systems for optically determining the direction of an object relative to an imaging system or for optically determining the position of an object such as a pointer and, more particularly, to the use of imaging systems placed about a computer display device to monitor the motions of a pointer.
  • touch screen systems wherein a user interacts with a computer system by touching or pointing to various locations within a touch screen area, typically associated with a display such as a CRT or other display device.
  • the touch screen serves as easily used data input apparatus because the operator can quickly and easily feed data into a computer by indicating various specific positions of the touch screen area.
  • the term "pointer" refers to any suitable pointing object, externally illuminated or self-illuminated, which is easily moved relative to a two-dimensional plane surface or area.
  • the pointer may comprise a pen, stylus, finger, or any typically long and slim element.
  • touch screen means apparatus used to locate the position of a pointer within a generally planar viewing field.
  • viewing field may be an area defined on or near the surface of solid material, such as near an inert panel, or may be a geometric plane defined in space or in the air.
  • Such "touch screens” have many possible applications, for example: pointing to or selecting a particular image or region on a video screen or on a panel, plate or tablet to select or indicate a specific item, informational element, letter, number, symbol, parameter line, region or characteristic; to locate or track a pointer held generally perpendicular to the plane and moved along the plane for plotting data, drawing a graph or picture or for laying out a network or diagram for indicating or tracking the relative motion or coordinates of movement or trajectory of a moving object; for use as a custom keyboard in an electronic input module; and so forth.
  • the terminology "Visual Pen Detection" is employed herein to refer to the subject invention.
  • visual pen refers to any “pointer” or other object whose direction relative to an imaging system or position is to be determined.
  • camera is employed herein as a synonym for imaging system, where a "camera" may be characterized as an electronic device for capturing either one- or two-dimensional images and converting them to electrical signals for further processing.
  • the Visual Pen Detection system of the invention employs two or more cameras placed strategically about a computer display to obtain the position of the computer operator's hand or an object such as a pen or stylus.
  • the precise position of the hand or object is determined using trigonometric triangulation.
  • Pointer position, as well as signals derived from the change of position or the rate of change of position, can be interpreted as commands to the computer system.
  • the Visual Pen Detection devices of the invention most closely resemble what collectively have been called touch screens in the prior art. Touch screens themselves have been largely divided into two groups: overlay, in which a device sized to fit the display is placed over or attached to the display screen itself; and non-overlay, in which no such display-screen-covering device is required. Although not touch screens, the Visual Pen Detection devices of the invention could be classed as non-overlay.
  • Overlay touch screens usually require placing a specially prepared device over the computer display and using the device to determine the position, in two dimensions, of the operator's finger or some other stylus.
  • Typical examples involve one- or two-layer arrangements of glass or plastic sheets which have been coated or patterned with special materials so that the position of a stylus or finger touch can be determined by making resistance or capacitance measurements on the device.
  • Another approach broadcasts acoustic waves on the surface of a specially prepared piece of glass and determines two-dimensional positions by timing when the waves have been attenuated by a touch from a finger or other sufficiently compliant stylus. This latter device lays claim to a third-dimension sensing capability, but the "third dimension" actually relates to firmness of touch, since no detection at all takes place until the finger or stylus is firmly in contact with the display screen.
  • Non-overlay touch screens have thus far been largely unsuccessful in the marketplace, due largely to the awkwardness of the approaches which have heretofore been taken.
  • One such device includes a linear array of perhaps twenty-five infrared light-emitting diodes placed along one side of a display and a matched linear array of infrared detectors along the other side. When touching the display screen, the operator's finger breaks a light beam, thus indicating position on the vertical axis of the display.
  • a similar arrangement at the top and bottom of the display is employed to determine horizontal axis position.
  • this prior art system provides relatively low precision in determining two-dimensional position. Because the diodes have to be placed a significant distance out from the display screen, parallax is also a problem.
  • each imaging system includes a photodetector, such as a charge-coupled device (CCD) having a plurality of elements organized as an array, and a lens to focus an image of the pointer object onto the photodetector.
  • a deliberately diffuse or blurred image is employed.
  • the diffusion produces a characteristic "bell-shaped" or Gaussian intensity distribution.
  • the image-producing capability which is essential to prior art systems is abandoned because we already know what the image should look like.
  • there is a trade-off between position-determining capability and object-recognition capability which is exploited by the subject invention.
  • Another advantage is that the diffusion of the invention serves to make the system relatively insensitive to objects other than the pointer which the system is designed to recognize.
  • a system for optically determining the direction of an object comprises a photodetector having a plurality of detector elements extending in at least one dimension, and elements for projecting a diffuse image of the object onto the photodetector such that each of the detector elements produces an output signal corresponding to the intensity of that portion of the diffuse image which is projected onto the particular detector element.
  • the diffuse image has a characteristic intensity distribution.
  • the system additionally includes a data analyzer connected for receiving the output signals from the detector elements, which are taken as a data set, and is operable to locate the position of the diffuse image on the photodetector by recognizing a characteristic intensity distribution of the diffuse image.
  • the position of the diffuse image can be located by performing a least squares fit of a predetermined intensity distribution function (e.g. a Gaussian distribution function) known to correspond with the characteristic intensity distribution to the output signals from the detector elements.
  • a transform, a filter, or other signal processing technique can be used on the output signals from the detector elements to extract the intensity distribution function.
  • a transform may be performed on the output signals from the detector elements taken as a data set. Any one of a variety of techniques may be employed to produce the required diffusion. Examples include de-focusing the lens by positioning the photodetector where it is not in the image plane (focal plane) of the lens.
  • a diffuser such as a frosted screen or a layer of petroleum jelly in the optical path may be employed.
  • Various forms of fiber optic diffusers may be employed, such as fiber optic fans.
  • a system for optically determining the position of an object within a generally planar viewing field includes at least a pair of imaging systems in the same plane as the viewing field and arranged to view the object within the viewing field from different angular positions.
  • Each of the imaging systems includes a photodetector having a plurality of detector elements extending in at least one dimension, and elements for projecting a diffuse image of the object onto the photodetector such that each of the detector elements produces an output signal corresponding to the intensity of that portion of the diffuse image which is projected onto the particular detector element.
  • the diffuse image has a characteristic intensity distribution.
  • a data analyzer is connected for receiving the output signals from the detector elements, taken as a data set, and operable to locate the position of the diffuse image on each of the photodetectors by recognizing the characteristic intensity distribution of the diffuse image, and thereby to determine the direction of the object relative to each of the imaging systems, and to determine the position of the object by triangulation based on the thus-determined directions.
  • FIG. 1A is a touch screen device embodying the invention, partially in schematic form
  • FIG. 1B is a side view of the touch screen device, partially in schematic form.
  • FIGS. 2A and 2B are diagrams depicting resolution over different areas of the touch screen when two cameras are employed;
  • FIG. 3 depicts the geometry involved in determining x and y coordinates by triangulation, given angles
  • FIGS. 4A and 4B are diagrams comparable to FIGS. 2A and 2B, but employing three cameras;
  • FIG. 5 is a plot of light intensity versus position along a photodetector array for both a sharp image and a diffuse image
  • FIG. 6 is a plot of light intensity versus position along a photodetector array in the case of motion of the diffuse image
  • FIG. 7 is a schematic representation of a simple lens with its focal center arranged to give the lens 90° coverage
  • FIG. 8 is a schematic representation of a focused lens system resolving a single point directly in front of the lens onto a linear sensor
  • FIG. 9 is a schematic representation similar to FIG. 8, but showing the use of a diffuser
  • FIG. 10 is a schematic representation of a lens system which produces diffusion by de-focusing the image
  • FIG. 11 is a schematic representation of a lens system depicting the situation wherein a light source is positioned above the plane which should be imaged on a linear sensor;
  • FIG. 12 is a schematic representation of a focused image system under the same conditions as FIG. 11 ;
  • FIG. 13 depicts a fiber optic fan employed as a diffuser;
  • FIG. 14 depicts a fiber optic fan with an auxiliary diffusion sheet
  • FIG. 15 depicts a lens and diffuser employed with a bundle of optical fibers.
  • a computer display device 30 has a pair of suitably-mounted imaging systems or cameras 32 and 34 positioned at the upper corners, and an optional central camera 36 to improve resolution near the upper middle region of the display 30.
  • the requirement for a camera is that it produce a representation of a field of view from its vantage point that can be used in an analysis to determine the position of specific objects in that field of view. This may likely involve the reduction of the image to digital form for use in a computing device, but it is also possible to produce a device in which analog video data are used directly in determining the position of objects in the field of view.
  • the camera may therefore be a real video camera in the usual sense of the word, such as the ½" CCD (charge-coupled device) arrays currently in common use, producing a digitized two-dimensional depiction of the scene in terms of the levels of light (visible or invisible) perceived from the vantage point.
  • the use of a lens or similar device in a camera allows the position information to be gathered using a small, standard device rather than a device which must be varied to match the dimensions of the display as is the case with most touch screen technologies.
  • the cameras 32, 34 and 36 include respective linear CCD photodetectors 38, 40 and 42, such as Texas Instruments TSL-214 devices. Each TSL-214 device has 64 photosensitive cells arranged in a linear array approximately 5/16 inch in length.
  • the cameras 32, 34 and 36 also include respective fixed lenses 44, 46 and 48.
  • the focal length of the lenses 44, 46 and 48 is in the 3.0 to 4.5 mm focal length range.
  • the corner cameras 32 and 34 have nominal 90° wide aperture lenses 44 and 46, while the central camera 36 has a 150° to 180° coverage lens 48.
  • the lenses 44, 46 and 48 are positioned so as to focus at distances greater than the distance across the display 30, or even "beyond infinity".
  • the photodetectors 38, 40 and 42 do not lie in image planes in which an image of a pointer object 50 is sharply focused.
  • Outputs of the photodetectors 38, 40 and 42 are connected to a data analyzer 52 which comprises an analog-to-digital converter 54, and a computer suitably programmed to perform the analysis described in detail hereinbelow.
  • a pair of primary infrared light sources 56 and 58 are positioned near the corner cameras 32 and 34, and each have a nominal 90° spread, with a relatively narrow spread out of plane.
  • Optional secondary infrared light sources 60 and 62 are also provided, with a slightly narrower in-plane spread. All four infrared light sources 56, 58, 60 and 62 are characterized as high output.
  • a bezel 64 which has a background frame 66 that is non-reflective at the infrared wavelengths employed.
  • in the configuration of FIGS. 2A, 2B, 3, 4A and 4B there are just two camera positions, in the upper corners of a display as depicted.
  • the display can obviously be of any shape and the "upper" corners may be any adjacent corners.
  • cameras can be positioned anywhere about the active area.
  • the sensitivity pattern of this arrangement is shown in FIGS. 2A and 2B. As shown, the lines are spaced apart by an equal angle. It can be seen that the coverage of the rectangle, although non-linear, is quite good with the exception of the region near the top of the rectangle between the two cameras.
  • FIG. 3 represents the geometry for triangulation, where the x and y coordinates are given by x = L·tan α / (tan α + tan β) and y = x·tan β.
  • the third camera can be positioned along the side or in another corner, as shown in FIG. 4A, which gives complete and accurate coverage of the entire rectangle.
  • FIG. 4B (which corresponds with FIG. 1) shows that the accuracy in the upper region is substantially improved.
  • Lighting considerations are important to the overall accuracy of the system.
  • the system can be made to work using ambient light, but greater dependability can be achieved using controlled lighting.
  • the three basic forms of lighting are using a self-illuminated stylus, using a lighted background, and using specific light sources to illuminate a stylus or finger. Each method can be more useful in specific situations.
  • Controlled, specific, light sources allow backgrounding, that is, taking periodic camera readings with the specific illumination turned off, and also allow operation using only a narrow band of light wavelengths.
  • LED's or IRLED's are particularly useful in this regard.
  • Such specific sources can be few or numerous and can be spaced around the rectangle of interest as required. Care must be taken to arrange or shade such light sources so that they do not shine directly into any camera or to sequence camera readings and illumination sources so that this does not happen.
  • a favored mode places two or more light sources along the top of the rectangle with the cameras, with the light sources lensed in such a way as to produce essentially a sheet of light parallel to the rectangular surface. That sheet of light registers as a band of light across the finger or stylus which passes through it.
  • Background readings can be taken when the light sources are momentarily switched off and can then be subtracted from the illuminated readings to produce a signal free of background interference for analysis. It is the preferred mode of operation to provide a background that is black. This blackness, or absorption of light from the illumination source, need prevail only in the wavelengths of light sensed by the cameras. Thus, if the light sources are infrared and the cameras are supplied with infrared-passing filters, the background may be visibly white or colored, so long as it absorbs in the infrared. It is also possible to operate without a special background if it is known that there will be no interfering, highly-illuminated objects.
  • x and y are not restricted to be within the active region of the display, so this system is quite capable of sensing that an illuminated object is beyond the rectangle of interest and can be made to ignore such signals.
  • a system using an illuminated background also can use a narrow band of light wavelengths to minimize interference from external illumination sources. In this case, however, it is often more convenient to use a diffused wide-band background illumination and a narrow-band filter on the cameras. Illuminated backgrounds can be especially useful in situations where ambient light is bright and confusing enough to hamper the operation of a specific light source design.
  • a self-illuminated stylus has the disadvantage of requiring a special stylus.
  • the advantages however are several.
  • the stylus may send out modulated or other signals to indicate touch-down or other operator messages.
  • a significant aspect of the invention is the use of diffuse or fuzzy images, produced such as by a deliberate lack of focus.
  • Referring to FIG. 5, if one begins to look carefully at the process of using a high-precision photodetector, such as a linear CCD array, to determine the position of a point light source, such as an illuminated pen tip, it is seen as a very inefficient use of the photodetector array. Typically, only one or two cells are illuminated, often to saturation of the cell, and the rest of the cells are dark. The position of the pen is determined by the number of the cell that is illuminated, or the centroid number if more than one cell is illuminated. The sharp peaked curve 100 in FIG. 5 illustrates this behavior.
  • CCD cells are however quite capable of sensing the intensity of light, typically distinguishing hundreds of levels of intensity.
  • intensity information is used at just two levels: present or absent.
  • related prior art devices use focused cameras to produce a sharp image at the CCD plane and thus define the stylus or finger position.
  • intensity information is employed to advantage.
  • Because this shape 102 is characteristic, and in fact symmetric in this case, if one samples the light intensity at several points 104, 106, 108, 110, 112, 114 and 116 within the curve, it is possible to determine the position of the maximum intensity to a small fraction of the distance between sample points.
  • many fewer CCD cells, for example, are required to determine the position of the light source to a given precision using a diffuse, blurred image. Note that samples taken at the evenly spaced positions 104, 106, 108, 110, 112, 114 and 116 completely miss the sharp image 100, while such samples would quite adequately determine the position of the diffuse image 102.
  • the diffusion serves to make the system relatively insensitive to image details other than the pointer which the system is designed to recognize.
  • Although a lens and diffusion plate are used in the example above, the fact that most clearly separates diffusion-assisted and sharp-image position detection is that no lens, slit, or other light-ray focusing device is actually required by the diffusion-assisted method.
  • Possible sources of diffusion include: 1) de-focus or other lens- (or pinhole- or slit-) related imaging imperfections, 2) intentionally placing a diffusion device such as a frosted screen or a layer of petroleum jelly in the optical path, 3) digitally- or analog-derived combination of the input intensities to produce mathematical or other non-optical diffusion, and 4) diffusion of the pointer object itself relative to a point source of light.
  • As constrained by the typical size of a touch screen, the human finger itself is relatively broad and is thus a naturally diffuse pointer object in comparison to a point source.
  • These various sources of diffusion are subject to varying amounts of control in the design of a system, and the required diffusion will generally result from a partially controlled combination of the several sources.
  • diffusion is used to produce an intensity curve of known shape and with intensities varying along its length, the Gaussian distribution being physically and mathematically very usable.
  • motion of the known shape can then be tracked as it passes over a discrete array of sensors and located with a precision that is a small fraction of the distance between sensors.
  • a Gaussian curve 120 is again shown in FIG. 6, with a shifted version 122 shown in dotted line representing an actual movement to the left of about one-fourth the distance between sensor elements.
  • the sensor elements to the left of the central peak experience varying, but very specific, increases in light intensity, while those to the right experience decreases.
  • interpolation is applied with little, if any, regard for the known characteristics of the shape, and the interpolation polynomial or other mathematical form is applied to as few points as possible in the vicinity of the area of interest, the concept being that whatever mathematical form is being used will provide a better approximation to the real curve over a small region.
  • the interest would be in finding the peak intensity and its position. We are much more interested in finding and tracking the position and characteristics of the overall intensity shape curve than in tracking its localized peak intensity, which may move somewhat relative to the position of the overall shape.
  • the approximate Gaussian we are tracking (the blurred image of a finger, for example) is likely to be the sum of many closely spaced Gaussians from individual point light sources on the finger. As hand position changes, these individual Gaussian sources will shift somewhat relative to one another while still maintaining a good approximation to a complete Gaussian overall. It is thus more stable and generally better to track the complete shape rather than interpolating the peak intensity.
  • FIG. 7 shows a simple lens 150 with its focal center arranged to give the lens 90° coverage.
  • the coverage can be considered to be linear with angle, and the position of the lens 150 arranged to give the proper coverage.
  • the usual requirement that a wide angle lens provide accurate edge-to-edge focus is not at all important.
  • non-linear angle coverage in extreme wide angle lenses is not important, since any non-linearity can be corrected, if necessary, as part of the x, y position calculation.
  • Some attention must be given to aperture shadowing at wide angles.
  • the so-called cosine correction whereby the circular lens aperture looks like an ellipse when viewed from the side is the first of these effects. This phenomenon, which effectively reduces light passage at side angles, is often accentuated by the three-dimensional aspect of the lens.
  • FIG. 8 shows a focused lens system resolving a single point 154 directly in front of lens 156 onto a linear sensor 158, which runs perpendicular to the surface of the paper.
  • the image can be diffused by placing a diffusing sheet 160 directly in front of the linear sensor 158.
  • the actual diffusion sheet 160 would be much less conspicuous than shown in FIG. 9, very likely being a spray coating on the surface of the linear sensor array chip 158.
  • FIG. 10 schematically shows a lens system which is producing diffusion by de-focusing the image, in this case by using a lens 162 with a longer natural focal length positioned the same distance from the linear sensor array as in the figures above.
  • the image plane is at 164.
  • the triangular shape shown is in reality a three-dimensional cone, so that several adjacent cells of the linear sensor array 158 receive light from the object point.
  • This representation is somewhat simplistic in that the true distribution of light approximates a Gaussian distribution rather than falling off sharply at the edges of the cone, but the general concept is correct.
  • a defocused image has a side effect that is of importance in designing a diffused image system.
  • a light source 166 is positioned above the plane 168 which should be imaged on the linear sensor 158. Because the image is de-focused, the cone of light from the source is picked up by the sensor, giving a false or interfering signal. Thus, even if a special black background is erected to cover the image plane which the sensor is designed to see, a de-focused system can also see extraneous light sources which are at a low angle above the barrier.
  • a focused image system does not have this problem (although internal reflections may still create similar problems in focused systems). Even if a diffusion sheet is used in the focused system, the sheet can be narrow enough to just cover the linear sensor 158, and thus not be subject to the same problem as in FIG. 11. In practice, a compromise must be struck. The background, and even more so the extraneous light sources, are necessarily relatively far from the lens, and a lens which is focused for distance will still be diffuse for nearer objects. Thus, in practice, the lens is more nearly focused for distance, and supplementary diffusion is used if necessary for satisfactory performance near the edges of the display.
  • the light intensity pattern on the linear sensor 158 is essentially a sum of Gaussian distributions each resulting from some illuminated source in the field.
  • the position of the finger or stylus can be represented by the mean, μ, of the Gaussian which corresponds to the finger or stylus.
  • FIGS. 13, 14 and 15 show three different arrangements of lenses and fiber optics.
  • FIGS. 13 and 14 have no lens at all.
  • An optical fiber has the characteristic that it will accept light which enters an acceptance cone which typically ranges about ±30° from the center axis of the fiber. This makes the optical fiber a diffuser in itself, without a lens or other image-sharpening device.
  • the acceptance cone for each fiber takes in a slightly different range of angles over the entire scene and the arrangement behaves much like a lens-and-diffuser system, but without requiring a lens.
  • the fiber fans have the property of averaging light intensity over a range of angles, as depicted in FIG. 13. The arrangement in FIG. 14 adds an auxiliary diffusion sheet 190.
  • the optical fiber by itself has a sharper cut-off of light intensity at the extreme angles of the acceptance cone than would be the case for an ordinary diffuser, and the use of a diffusion sheet 190 can correct this if it proves desirable to do so. Note that there is no reason a diffuser could not be effectively used between the fibers and the CCD sensor rather than at the opposite end of the fibers.
  • FIG. 15 uses a lens and diffusion sheet to feed light directly to an optical fiber, so that the effect is more nearly that of sampling a specific angle in the graph of FIG. 5.
  • the configuration of FIG. 15 offers more control over the degree of diffusion than is possible in either FIGS. 13 or 14. If this increased control is necessary, it is available.
  • this system of the invention requires a separate detection of touchdown. When an illuminated stylus is used, this can be achieved by having a tip switch in the stylus that causes a separate signal to be sent to the system controller.
  • There is a currently active market for LED matrix touch screens which detect the finger above the display and thus do not detect touch well; this behavior limits the usefulness of these devices. With finger-activated devices, some other detection is required.
  • a touch detector 67 consisting of a piezoelectric speaker surrounded on each side by foam sponge (1/8" uncompressed) to act as a spring allowing small motions and positioned under the rear edge of a standard CRT display.
  • the output of detector 67 is applied through A/D converter 54 to data analyzer 52.
  • the sudden shifts in the mean of the harmonic motions detected by this device indicate that the display has been touched.
  • detector 67 is intended to illustrate one simple example of a motion detector system for detecting touch of the screen, and other known touch detection systems could be substituted therefor.
  • Careful implementation and data analysis can also handle "backgrounding" in a fairly straightforward manner.
  • the illumination can be either internal or external to the stylus.
  • the CCD sensors may give false readings. If the LED illumination is cycled on and off in synchronization with the CCD reading cycles, the illumination may be intentionally left off for some cycles and the resulting CCD reading used as background.
  • ADC: analog-to-digital converter
  • the intensity data are corrected for background intensity (if necessary) by subtracting out a background value for each of the 64 cells.
  • background values typically comprise the average intensity seen by the cell in quiescent periods.
  • the objective is to produce a clean signal that does not contain spurious peaks of intensity that might confound the following process, particularly in the locating of the peak region. In a well-controlled lighting situation, background subtraction may not be necessary at all.
  • the data are then searched for a region including a peak value of intensity. This is currently done by searching for the set of n sequential points for which the intensities sum to the highest value. Our experience is that values of n in the range of 6 to 10 produce good results with a 64 element sensor.
  • the central position of the n points is then taken as a reference. Starting from the central point, and going in both directions, the array of data is examined to find the first point index below some threshold percentage (typically 60% to 80%) of the average intensity in the peak region. The lowest index in this region is called l and the highest, h. This process defines a region which encompasses the peak region of the dominant Gaussian and avoids the tail areas where the dominant Gaussian may be confounded with non-dominant peaks.
  • the light intensity pattern in the selected region is primarily the Gaussian distribution resulting from the finger or stylus in the field, and its position can be represented by the mean, μ, of that Gaussian.
  • the current method used for extracting μ is by least-squares fitting of the data in the region to the Gaussian curve (although other methods might be used); a code sketch of this processing pipeline appears after this list.
  • y = a·e^(−(x − μ)² / (2σ²)), in which y is the intensity, x is the position within the array, a is the amplitude, μ is the mean, σ is the standard deviation, and each data point in the fit is given a suitably chosen weighting factor.
  • the ends of the 64 element array require special consideration during this selection and weighting process. As stated above, the distribution of weights must take place about the nominal maximum value, y_c, but if y_c is near either end of the array, it is important that the weights still center about y_c even though the end of the array will truncate the symmetrical effect. This approach causes the progression of μ to remain uniform even near the ends of the array.
  • σ represents the relative broadness of the Gaussian curve and depends not only upon the diffusiveness resulting from the camera system but also on the diffusiveness resulting from the size of the subject.
  • values of σ which should validly represent, say, a finger touching the display typically fall within a narrow range, and σ is only weakly dependent on the image size and intensity.
  • If the analysis results in a value of σ outside that range, the analysis can be rejected as not representing a true finger touch.
  • when controlled illumination is used, a valid touch within the display area will be found to produce at least a certain minimum intensity, and signals below that level can be safely rejected as invalid.
  • the value of μ, which represents a position within the sensor array, is directly proportional to the angular position in the field of view of that camera.
  • θ = θ₀ + Δθ·(μ/64), where θ is the angular position of the object in the field, θ₀ is an offset angle, and Δθ is the angular field of view represented by the 64 cells in the array. Both θ₀ and Δθ are determined through calibration in the field.
  • If α, measured counter-clockwise, is for a camera positioned in the upper right corner of a rectangle and β, measured clockwise, is for a camera positioned in the upper left corner of the same rectangle, with the distance L between the focal centers of the two cameras, then x = L·tan α / (tan α + tan β) and y = x·tan β.
  • the distance L is generally taken through field calibration rather than by direct measurement.
  • x and y are determined from the two corner cameras as above, with x being discarded.
  • the Gaussian curve fit is the same for the central camera as for the corner cameras with the exception that, if an angle-distorting lens, such as a fish-eye, is used, the angle is no longer linear with position in the 64 element array and a correction appropriate for the lens must be made.
  • the center camera is positioned at a known distance from the left corner camera and at a known distance from the top edge of the rectangle.
  • Pairs of x, y coordinates are reported whenever the above process produces valid results and the point falls within the active area of the display and the computed value of σ, the spread of the Gaussian representing the stylus or finger touching the display, falls within a valid range. If the touch sensing device senses that the display has actually been touched during a period of valid display, then the touch event is also reported.
  • the present invention provides a method and apparatus whereby the pointer object 50 shown in FIG. 1A is precisely located by photodetectors, such as photodetectors 38, 40 and 42.
  • the photodetectors 38 and 40 are positioned at the upper corners of the display 30 and the detector 42 is positioned at the top center of the display 30.
  • the pointer object 50 is located or positioned precisely with respect to the photodetectors by triangulation as previously described. Since the photodetectors are placed in a known position with respect to the display 30, the position of the pointer object 50 with respect to the screen 30 is also known.
  • the data analyzer 52 (computer) is programmed to display a cursor 51 on the screen 30 as soon as a pointer object 50 is detected near the display 30 (about 1 inch from the display 30) by the photodetectors 38, 40 and 42.
  • the cursor 51 may be displayed in a position precisely corresponding with the center of the pointer object 50 or the cursor may be displaced slightly with respect to the object 50 as illustrated in FIG. 1A where the cursor 51 is displaced upwardly and to the left of the pointer object 50. In some circumstances, a slight displacement of the cursor may prove to be an advantage.
  • without such a displacement, the finger will hide the cursor when the screen is touched.
  • the cursor 51 will remain visible to the user at all times, even when the screen is being touched.
  • a line from the center of object 50 to cursor 51 is oblique (not perpendicular) to the display 30.
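The per-camera processing described in the items above (background subtraction, locating the peak region of n sequential cells, widening to the threshold indices l and h, weighted least-squares fitting of a Gaussian to extract μ and σ, validating σ, and converting μ to an angle) can be sketched in code. The Python below is an illustrative reconstruction, not the patent's implementation: the constants, the weighting scheme, the σ validity range, and the use of scipy.optimize.curve_fit are assumptions standing in for whatever a concrete design would choose.

```python
import numpy as np
from scipy.optimize import curve_fit

N_CELLS = 64        # linear CCD length described in the patent (e.g. TSL-214)
PEAK_WINDOW = 8     # n sequential cells; the text suggests 6 to 10 works well
THRESHOLD = 0.7     # fraction of the peak-region average (60% to 80% in the text)

def gaussian(x, a, mu, sigma):
    # Intensity model y = a * exp(-(x - mu)^2 / (2 * sigma^2))
    return a * np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))

def subtract_background(raw, background):
    # Remove each cell's quiescent (background) level; clip negatives to zero.
    return np.clip(np.asarray(raw, float) - np.asarray(background, float), 0.0, None)

def find_peak_region(data, n=PEAK_WINDOW, threshold=THRESHOLD):
    # Find the n sequential cells whose intensities sum to the highest value,
    # then grow outward from the window center until the intensity drops below
    # threshold * (average intensity in the window).  Returns indices (l, h).
    sums = np.convolve(data, np.ones(n), mode="valid")
    start = int(np.argmax(sums))
    center = start + n // 2
    cutoff = threshold * data[start:start + n].mean()
    l = center
    while l > 0 and data[l - 1] >= cutoff:
        l -= 1
    h = center
    while h < len(data) - 1 and data[h + 1] >= cutoff:
        h += 1
    return l, h

def fit_gaussian(data, l, h):
    # Weighted least-squares fit over cells l..h; returns (mu, sigma).
    x = np.arange(l, h + 1, dtype=float)
    y = data[l:h + 1]
    p0 = (y.max(), x[int(np.argmax(y))], 2.0)   # crude starting guess
    noise = np.sqrt(y + 1.0)                    # assumed per-cell uncertainty (one weighting choice)
    popt, _ = curve_fit(gaussian, x, y, p0=p0, sigma=noise)
    a, mu, sigma = popt
    return mu, abs(sigma)

def mu_to_angle(mu, theta0, dtheta):
    # theta = theta0 + dtheta * mu / 64, with theta0 and dtheta from field calibration.
    return theta0 + dtheta * mu / N_CELLS

def camera_angle(raw, background, theta0, dtheta, sigma_range=(1.0, 6.0)):
    # Full per-camera chain: returns the object's angle, or None for an invalid reading.
    data = subtract_background(raw, background)
    l, h = find_peak_region(data)
    mu, sigma = fit_gaussian(data, l, h)
    if not (sigma_range[0] <= sigma <= sigma_range[1]):
        return None                             # sigma outside the assumed valid "finger" range
    return mu_to_angle(mu, theta0, dtheta)
```

In use, each camera's 64 readings would be processed this way to obtain one angle; the two angles are then combined by the triangulation relation given in the Description (x = L·tan α / (tan α + tan β), y = x·tan β), and, when the optional center camera is present, its angle refines x near the top of the display.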

Abstract

Systems and methods for optically determining the direction of an object, such as a pointer (50), relative to an imaging system, particularly systems where triangulation is employed to determine the location of a pointer (50) within a generally planar viewing field, such as a touch screen (30). Rather than employing focused imaging systems to produce a sharp image at the plane of a photodetector, and to thus define the visual pen or finger position, a deliberately diffuse or blurred image is employed. The diffusion produces a characteristic 'bell-shaped' or Gaussian intensity distribution (104-116). By recognizing the characteristic intensity distribution, the position of the maximum intensity, and thus the direction of the object (50), can be determined to a small fraction of the distance between sample points, with an accordingly much higher resolution than focused systems. In a preferred embodiment, the position location system is incorporated into a computer system (52, 54, 30) as a touch screen apparatus and a cursor (51) is displayed on the screen as soon as a pointer (50) is detected above the screen (30).

Description

DIFFUSION-ASSISTED POSITION LOCATION, PARTICULARLY FOR VISUAL PEN DETECTION
BACKGROUND OF THE INVENTION

The present invention relates generally to systems for optically determining the direction of an object relative to an imaging system or for optically determining the position of an object such as a pointer and, more particularly, to the use of imaging systems placed about a computer display device to monitor the motions of a pointer. There are so-called "touch screen" systems wherein a user interacts with a computer system by touching or pointing to various locations within a touch screen area, typically associated with a display such as a CRT, or other display device. The touch screen serves as easily used data input apparatus because the operator can quickly and easily feed data into a computer by indicating various specific positions of the touch screen area. As employed herein, the term "pointer" refers to any suitable pointing object, externally illuminated or self-illuminated, which is easily moved relative to a two-dimensional plane surface or area. By way of example and not limitation, the pointer may comprise a pen, stylus, finger, or any typically long and slim element.
As employed herein, the term "touch screen" means apparatus used to locate the position of a pointer within a generally planar viewing field. In accordance with the invention, such viewing field may be an area defined on or near the surface of solid material, such as near an inert panel, or may be a geometric plane defined in space or in the air.
Such "touch screens" have many possible applications, for example: pointing to or selecting a particular image or region on a video screen or on a panel, plate or tablet to select or indicate a specific item, informational element, letter, number, symbol, parameter line, region or characteristic; to locate or track a pointer held generally perpendicular to the plane and moved along the plane for plotting data, drawing a graph or picture or for laying out a network or diagram for indicating or tracking the relative motion or coordinates of movement or trajectory of a moving object; for use as a custom keyboard in an electronic input module; and so forth. For convenience, the terminology "Visual Pen Detection" is employed herein to refe to the subject invention; it will be appreciated, however, that "visual pen" refers to any "pointer" or other object whose direction relative to an imaging system or position is to be determined. In addition, the term "camera" is employed herein as a synonym to imaging system, where a "camera" may be characterized as an electronic device for capturing either one- or two-dimensional images and converting them to electrical signals for further processing.
In overview, the Visual Pen Detection system of the invention employs two or more cameras placed strategically about a computer display to obtain the position of the computer operator's hand or an object such as a pen or stylus. The precise position of the hand or object is determined using trigonometric triangulation. Pointer position, as well as signals derived from the change of position or the rate of change of position, can be interpreted as commands to the computer system.
The Visual Pen Detection devices of the invention most closely resemble what collectively have been called touch screens in the prior art. Touch screens themselves have been largely divided into two groups: overlay, in which a device sized to fit the display is placed over or attached to the display screen itself; and non-overlay, in which no such display-screen-covering device is required. Although not touch screens, the Visual Pen Detection devices of the invention could be classed as non-overlay.
At the present time, by far the most commonly used touch screen technologies are overlay. Overlay touch screens usually require placing a specially prepared device over the computer display and using the device to determine the position, in two dimensions, of the operator's finger or some other stylus. Typical examples involve one- or two-layer arrangements of glass or plastic sheets which have been coated or patterned with special materials so that the position of a stylus or finger touch can be determined by making resistance or capacitance measurements on the device. Another approach broadcasts acoustic waves on the surface of a specially prepared piece of glass and determines two-dimensional positions by timing when the waves have been attenuated by a touch from a finger or other sufficiently compliant stylus. This latter device lays claim to a third-dimension sensing capability, but the "third dimension" actually relates to firmness of touch, since no detection at all takes place until the finger or stylus is firmly in contact with the display screen.
Non-overlay touch screens have thus far been largely unsuccessful in the marketplace, due largely to the awkwardness of the approaches which have heretofore been taken. One such device includes a linear array of perhaps twenty-five infrared light-emitting diodes placed along one side of a display and a matched linear array of infrared detectors along the other side. When touching the display screen, the operator's finger breaks a light beam, thus indicating position on the vertical axis of the display. A similar arrangement at the top and bottom of the display is employed to determine horizontal axis position. As implemented, this prior art system provides relatively low precision in determining two-dimensional position. Because the diodes have to be placed a significant distance out from the display screen, parallax is also a problem. While this system obviously has the ability to detect the approach of a finger or stylus some distance above the display, it actually is forced to use this above-display point as precisely equivalent to touching the display. This detection of a touch before it actually happens results in a poor feel and lack of tactile response, and contributes greatly to the lack of success of the sensor.
It will be appreciated that, in systems of the type employing imaging systems placed strategically about a computer display, each imaging system includes a photodetector, such as a charge-coupled device (CCD) having a plurality of elements organized as an array, and a lens to focus an image of the pointer object onto the photodetector. Assuming the position of the object image on the sensor array can be accurately determined, the direction of the object relative to the imaging system can be determined in a straightforward manner since position within the photodetector array is related to the angular position within the field of view of the imaging system, typically in a nearly proportional manner. Given directions relative to at least two imaging systems, pointer object position can be determined by triangulation, as is disclosed for example in Denlinger U.S. Pat. No. 4,782,328. Prior to the present invention, the resolution of such systems has generally been limited by the number of photodetector elements, although interpolation in some cases has been used to infer an image position between two detector elements, for example. As described in greater detail hereinbelow, such is a highly inefficient use of a CCD array.
SUMMARY OF THE INVENTION

Accordingly, it is an object of the invention to provide improved systems and methods for optically determining the direction of an object relative to an imaging system, particularly for application in systems for optically determining the position of an object within a generally planar viewing field, such as a touch screen.
Briefly, and in accordance with an overall aspect of the invention, rather than employing focused imaging systems to produce a sharp image at the plane of a photodetector, and to thus define the stylus or finger position, a deliberately diffuse or blurred image is employed. As explained in detail hereinbelow, it is possible to determine the position of the maximum intensity, and thus the direction of the object, to a small fraction of the distance between sample points, with an accordingly much higher resolution than prior art focused systems. The diffusion produces a characteristic "bell-shaped" or Gaussian intensity distribution. The image-producing capability which is essential to prior art systems is abandoned because we already know what the image should look like. There is a trade-off between position-determining capability and object-recognition capability which is exploited by the subject invention. Another advantage is that the diffusion of the invention serves to make the system relatively insensitive to objects other than the pointer which the system is designed to recognize.
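As a rough illustration of why a diffuse image allows sub-element resolution, the short Python sketch below synthesizes a Gaussian intensity profile centered between two detector elements, samples it at the element positions, and recovers the center by least-squares fitting. It is a minimal sketch, not from the patent: the number of elements, the width, and the noise level are arbitrary assumptions, and scipy.optimize.curve_fit stands in for whatever fitting routine an implementation might use.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, a, mu, sigma):
    # Bell-shaped intensity model: a * exp(-(x - mu)^2 / (2 * sigma^2))
    return a * np.exp(-(x - mu) ** 2 / (2.0 * sigma ** 2))

cells = np.arange(16, dtype=float)       # 16 detector elements (arbitrary count)
true_mu = 7.37                           # true image center, between elements 7 and 8
samples = gaussian(cells, 1.0, true_mu, 2.0)
samples += np.random.normal(0.0, 0.01, cells.size)   # small assumed read noise

popt, _ = curve_fit(gaussian, cells, samples, p0=(1.0, 8.0, 2.0))
print(f"recovered center: {popt[1]:.3f}  (true value {true_mu})")
# The fitted mu typically lands within a few hundredths of an element spacing,
# whereas a sharply focused spot would only identify a single whole cell.
```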
In accordance with a more particular aspect of the invention, a system for optically determining the direction of an object comprises a photodetector having a plurality of detector elements extending in at least one dimension, and elements for projecting a diffuse image of the object onto the photodetector such that each of the detector elements produces an output signal corresponding to the intensity of that portion of the diffuse image which is projected onto the particular detector element. As noted above, the diffuse image has a characteristic intensity distribution. The system additionally includes a data analyzer connected for receiving the output signals from the detector elements, which are taken as a data set, and is operable to locate the position of the diffuse image on the photodetector by recognizing a characteristic intensity distribution of the diffuse image. For example, the position of the diffuse image can be located by performing a least squares fit of a predetermined intensity distribution function (e.g. a Gaussian distribution function) known to correspond with the characteristic intensity distribution to the output signals from the detector elements. Alternately, a transform, a filter, or other signal processing technique can be used on the output signals from the detector elements to extract the intensity distribution function. As yet another alternative, a transform may be performed on the output signals from the detector elements taken as a data set. Any one of a variety of techniques may be employed to produce the required diffusion. Examples include de-focusing the lens by positioning the photodetector where it is not in the image plane (focal plane) of the lens. A diffuser such as a frosted screen or a layer of petroleum jelly in the optical path may be employed. Various forms of fiber optic diffusers may be employed, such as fiber optic fans. In accordance with another aspect of the invention, there is provided a system for optically determining the position of an object within a generally planar viewing field. The system includes at least a pair of imaging systems in the same plane as the viewing field and arranged to view the object within the viewing field from different angular positions. Each of the imaging systems includes a photodetector having a plurality of detector elements extending in at least one dimension, and elements for projecting a diffuse image of the object onto the photodetector such that each of the detector elements produces an output signal corresponding to the intensity of that portion of the diffuse image which is projected onto the particular detector element. Again, the diffuse image has a characteristic intensity distribution. A data analyzer is connected for receiving the output signals from the detector elements, taken as a data set, and operable to locate the position of the diffuse image on each of the photodetectors by recognizing the characteristic intensity distribution of the diffuse image, and thereby to determine the direction of the object relative to each of the imaging systems, and to determine the position of the object by triangulation based on the thus-determined directions.
BRIEF DESCRIPTION OF THE DRAWINGS

While the novel features of the invention are set forth with particularity in the appended claims, the invention, both as to organization and content, will be better understood and appreciated, along with other objects and features thereof, from the following detailed description taken in conjunction with the drawings, in which:
FIG. 1A is a touch screen device embodying the invention, partially in schematic form;
FIG. 1B is a side view of the touch screen device, partially in schematic form. FIGS. 2A and 2B are diagrams depicting resolution over different areas of the touch screen when two cameras are employed;
FIG. 3 depicts the geometry involved in determining x and y coordinates by triangulation, given angles;
FIGS. 4A and 4B are diagrams comparable to FIGS. 2A and 2B, but employing three cameras;
FIG. 5 is a plot of light intensity versus position along a photodetector array for both a sharp image and a diffuse image; FIG. 6 is a plot of light intensity versus position along a photodetector array in the case of motion of the diffuse image;
FIG. 7 is a schematic representation of a simple lens with its focal center arranged to give the lens 90° coverage; FIG. 8 is a schematic representation of a focused lens system resolving a single point directly in front of the lens onto a linear sensor;
FIG. 9 is a schematic representation similar to FIG. 8, but showing the use of a diffuser;
FIG. 10 is a schematic representation of a lens system which produces diffusion by de-focusing the image;
FIG. 11 is a schematic representation of a lens system depicting the situation wherein a light source is positioned above the plane which should be imaged on a linear sensor; FIG. 12 is a schematic representation of a focused image system under the same conditions as FIG. 11 ; FIG. 13 depicts a fiber optic fan employed as a diffuser;
FIG. 14 depicts a fiber optic fan with an auxiliary diffusion sheet; and
FIG. 15 depicts a lens and diffuser employed with a bundle of optical fibers.
DETAILED DESCRIPTION

Referring first to FIG. 1, a computer display device 30 has a pair of suitably-mounted imaging systems or cameras 32 and 34 positioned at the upper corners, and an optional central camera 36 to improve resolution near the upper middle region of the display 30.
In general, the requirement for a camera is that it produce a representation of a field of view from its vantage point that can be used in an analysis to determine the position of specific objects in that field of view. This may likely involve the reduction of the image to digital form for use in a computing device, but it is also possible to produce a device in which analog video data are used directly in determining the position of objects in the field of view. The camera may therefore be a real video camera in the usual sense of the word, such as the ½" CCD (charge-coupled device) arrays currently in common use, producing a digitized two-dimensional depiction of the scene in terms of the levels of light (visible or invisible) perceived from the vantage point. It may also be an arrangement of devices such as lenses, light carrying devices such as fiber-optic cables, electric signal carrying cables, and light-sensing devices such as CCD arrays. So long as the camera effect is achieved, the physical layout of equipment to achieve the effect can be varied as required to produce a reasonable physical layout as constrained by the resulting cost and appearance of the device. The use of a lens or similar device in a camera allows the position information to be gathered using a small, standard device rather than a device which must be varied to match the dimensions of the display as is the case with most touch screen technologies.
In the particular arrangement of FIG. 1, the cameras 32, 34 and 36 include respective linear CCD photodetectors 38, 40 and 42, such as Texas Instruments TSL-214 devices. Each TSL-214 device has 64 photosensitive cells arranged in a linear array approximately 5/16 inch in length. The cameras 32, 34 and 36 also include respective fixed lenses 44, 46 and 48. By way of example and not limitation, the focal length of the lenses 44, 46 and 48 is in the 3.0 to 4.5 mm focal length range. The corner cameras 32 and 34 have nominal 90° wide aperture lenses 44 and 46, while the central camera 36 has a 150° to 180° coverage lens 48. Significantly, the lenses 44, 46 and 48 are positioned so as to focus at distances greater than the distance across the display 30, or even "beyond infinity". Thus, the photodetectors 38, 40 and 42 do not lie in image planes in which an image of a pointer object 50 is sharply focused.
Outputs of the photodetectors 38, 40 and 42 are connected to a data analyzer 52 which comprises an analog-to-digital converter 54, and a computer suitably programmed to perform the analysis described in detail hereinbelow.
A pair of primary infrared light sources 56 and 58 are positioned near the corner cameras 32 and 34, and each has a nominal 90° spread, with a relatively narrow spread out of plane. Optional secondary infrared light sources 60 and 62 are also provided, with a slightly narrower in-plane spread. All four infrared light sources 56, 58, 60 and 62 are characterized as high output. Associated with the display 30 is a bezel 64 which has a background frame 66 that is non-reflective at the infrared wavelengths employed.
General principles relating to the FIG. 1 configuration will now be discussed with reference to FIGS. 2A, 2B, 3, 4A and 4B. Preferably, there are just two camera positions in the upper corners of a display as depicted. (Although the discussion is of rectangles herein, the display can obviously be of any shape and the "upper" corners may be any adjacent corners. In fact, cameras can be positioned anywhere about the active area.) Since what the cameras distinguish is essentially the angular position of the finger or stylus, the sensitivity pattern of this arrangement is shown in FIGS. 2A and 2B. As shown, the lines are spaced apart by an equal angle. It can be seen that the coverage of the rectangle, although non-linear, is quite good with the exception of the region near the top of the rectangle between the two cameras.
FIG. 3 represents the geometry for triangulation, where the x and y coordinates are given by the equations x = L·tan α / (tan α + tan β) and y = x·tan β. Directly between the cameras there is a singularity in the descriptive equations, where α = β = 0.
From FIG. 2A, it is apparent that the vertical position is well-determined in the region between the cameras, but that slight errors in the angle lead to large errors in the sensed horizontal position.
In practice, this singularity is not a problem. Response is usably accurate except for a narrow band near the top of the rectangle. Most displays are designed so that the cameras can be positioned some distance outside the rectangle that represents the active display area, as in FIG. 2B. In fact, most current touch screen technologies have similar requirements for an inactive region near the edges of a display.
When required, however, accuracy in the region between cameras can be improved by adding the third camera position. Depending upon other requirements, lighting in particular, the third camera can be positioned along the side or in another corner, as shown in FIG. 4A, which gives complete and accurate coverage of the entire rectangle.
If conditions dictate, accuracy in the region between cameras can be substantially improved by positioning a third camera between the two corner cameras. A 180° camera is shown, although cameras serving lesser angles clearly would also be useful in increasing accuracy. Although in this case the three cameras are still situated so that singularities exist between them,
FIG. 4B (which corresponds with FIG. 1) shows that the accuracy in the upper region is substantially improved.
Lighting considerations are important to the overall accuracy of the system. The system can be made to work using ambient light, but greater dependability can be achieved using controlled lighting. The three basic forms of lighting are using a self-illuminated stylus, using a lighted background, and using specific light sources to illuminate a stylus or finger. Each method can be more useful in specific situations.
Controlled, specific light sources allow backgrounding, that is, taking periodic camera readings with the specific illumination turned off, and also allow operation using only a narrow band of light wavelengths. LEDs or IRLEDs are particularly useful in this regard. Such specific sources can be few or numerous and can be spaced around the rectangle of interest as required. Care must be taken to arrange or shade such light sources so that they do not shine directly into any camera, or to sequence camera readings and illumination sources so that this does not happen. A favored mode places two or more light sources along the top of the rectangle with the cameras, with the light sources lensed in such a way as to produce essentially a sheet of light parallel to the rectangular surface. That sheet of light registers as a band of light across the finger or stylus which passes through it. Background readings can be taken when the light sources are momentarily switched off and can then be subtracted from the illuminated readings to produce a signal free of background interference for analysis. It is the preferred mode of operation to provide a background that is black. This blackness, or absorption of light from the illumination source, need prevail only in the wavelengths of light sensed by the cameras. Thus, if the light sources are infrared and the cameras are supplied with infrared-passing filters, the background may be visibly white or colored, so long as it absorbs in the infrared. It is also possible to operate without a special background if it is known that there will be no interfering, highly-illuminated objects. As can be seen from the equations, x and y are not restricted to be within the active region of the display, so this system is quite capable of sensing that an illuminated object is beyond the rectangle of interest and can be made to ignore such signals. A system using an illuminated background also can use a narrow band of light wavelengths to minimize interference from external illumination sources. In this case, however, it is often more convenient to use a diffused wide-band background illumination and a narrow-band filter on the cameras. Illuminated backgrounds can be especially useful in situations where ambient light is bright and confusing enough to hamper the operation of a specific light source design.
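By way of illustration only, the backgrounding step just described might be sketched in software as follows; the function name and the averaging over several frame pairs are illustrative assumptions, not part of the described apparatus:

```python
import numpy as np

def subtract_background(lit_frames, dark_frames):
    """Average (sources on) minus (sources off) frame pairs to suppress ambient light.

    lit_frames, dark_frames: sequences of 64-element intensity arrays read from one
    camera with the controlled illumination switched on and off on alternate cycles.
    """
    lit = np.mean(np.asarray(lit_frames, dtype=float), axis=0)
    dark = np.mean(np.asarray(dark_frames, dtype=float), axis=0)
    # Negative residues after subtraction carry no information; clamp them to zero.
    return np.clip(lit - dark, 0.0, None)
```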
A self-illuminated stylus has the disadvantage of requiring a special stylus. The advantages, however, are several. First, because the stylus itself is the light source, there is no difficulty in providing a light which overpowers ambient illumination. Second, accuracy can be significantly better than is possible with finger touch, since the "stylus" is smaller and the light source makes its true point position better defined. Third, because the light source is relatively intense, there is less problem in determining when the stylus is actually in use. And fourth, the stylus may send out modulated or other signals to indicate touch-down or other operator messages.
As noted above, a significant aspect of the invention is the use of diffuse or fuzzy images, produced such as by a deliberate lack of focus. With reference now to FIG. 5, if one begins to look carefully at the process of using a high-precision photodetector, such as a linear CCD array, to determine the position of a point light source, such as an illuminated pen tip, it is seen to be a very inefficient use of the photodetector array. Typically, only one or two cells are illuminated, often to saturation of the cell, and the rest of the cells are dark. The position of the pen is determined by the number of the cell that is illuminated, or the centroid number if more than one cell is illuminated. The sharp peaked curve 100 in FIG. 5 illustrates this behavior.
CCD cells are, however, quite capable of sensing the intensity of light, typically distinguishing hundreds of levels of intensity. In the typical prior art approach, intensity information is used at just two levels: present or absent. Furthermore, it is not at all clear that there is any way of using intensity information to improve the process. Thus, related prior art devices use focused cameras to produce a sharp image at the CCD plane and thus define the stylus or finger position.
In accordance with the subject invention, intensity information is employed to advantage. Thus, if one completely abandons the concept of a sharp image and places a diffusion plate over the lens, for example, a point light source, such as an illuminated pen tip, will produce a characteristic "bell-shaped" intensity distribution 102. Because this shape 102 is characteristic, in fact symmetric in this case, if one samples the light intensity at several points 104, 106, 108, 110, 112, 114 and 116 within the curve, it is possible to determine the position of the maximum intensity to a small fraction of the distance between sample points. Thus many fewer CCD cells, for example, are required to determine the position of the light source to a given precision using a diffuse, blurred image. Note that samples taken at the evenly spaced positions 104, 106, 108, 110, 112, 114 and 116 completely miss the sharp image 100, while such samples would quite adequately determine the position of the diffuse image 102.
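The effect can be illustrated with a short numerical sketch; the cell count, source position and spread are arbitrary, and a simple intensity-weighted centroid stands in here for the full curve fit described later:

```python
import numpy as np

cells = np.arange(0.0, 64.0)      # cell centers, one unit apart
true_pos = 31.37                  # true source position, between cells

# Sharp (focused) image: essentially all light lands in a single cell.
sharp = np.zeros_like(cells)
sharp[int(round(true_pos))] = 1.0

# Diffuse image: a bell-shaped (Gaussian) spread over several cells.
sigma = 3.0
diffuse = np.exp(-(cells - true_pos) ** 2 / (2.0 * sigma ** 2))

# The diffuse image recovers the position to a small fraction of a cell;
# the sharp image only yields the index of the brightest cell.
centroid = np.sum(cells * diffuse) / np.sum(diffuse)
print(int(np.argmax(sharp)), round(centroid, 3))   # 31 versus ~31.37
```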
We are not getting something for nothing here. We have abandoned the image-producing capability which is so key to the previous systems because we already know what the image should look like. In information theory terms, we are using the ability to precisely determine light intensity to augment an accurate but imprecise position measurement. There is a trade-off between position-determining capability and object-recognition capability, so the pointer object must be very distinctive in intensity. This is easily true of an illuminated-tip pen. It is also true of an opaque object, such as a finger, obscuring a diffuse light source, such as a diffuse light panel mounted around the perimeter of a display screen. It is also true of a properly illuminated object. Rather than being a disadvantage, the diffusion serves to make the system relatively insensitive to image details other than the pointer which the system is designed to recognize. Although a lens and diffusion plate is used in the example above, the fact that most clearly separates diffusion-assisted and sharp-image position detection is that no lens, slit, or other sharp light-ray focusing device is actually required by the diffusion-assisted method.
We have discovered that the intentional use of diffuse images carries a significant side benefit. Because de-focusing is one means of producing a diffuse image, focus is a minor concern at most in constructing a camera for our use. In a focused-camera application, one must pay careful attention to maintaining a depth-of-field sufficient to keep the stylus or finger in reasonably sharp focus over the complete range of distances from the camera that the stylus or finger might be placed. Indeed, in that usage, lack of focus can lead directly to a lack of accuracy or precision. Thus in focused-camera devices, the lens aperture must be kept small, and often a pinhole or slit is used to ensure adequate depth-of-field. This in turn requires that bright incident illumination or an illuminated stylus be used. In one instance with our intentionally diffuse system, we have used a lens with an aperture approximately twice its effective focal length (f/0.5) with complete success. Likewise, for our usage the optical quality of the lens is of only minor importance; most of the imperfections that are usually of concern in lens design merely contribute to the diffusion. Diffusion can be achieved through any of several mechanisms, and will in any case be the combined result of several mechanisms. Depending upon the situation, the precise nature of diffusion can be selected to advantage.
Possible sources of diffusion include: 1) de-focus or other lens- (or pinhole- or slit-) related imaging imperfections, 2) intentionally placing a diffusion device such as a frosted screen or a layer of petroleum jelly in the optical path, 3) digital- or analog-derived combination of the input intensities to produce mathematical or other non-optical diffusion (see the short sketch following this paragraph), and 4) diffusion of the pointer object itself relative to a point source of light. As constrained by the typical size of a touch screen, the human finger itself is relatively broad and is thus a naturally diffuse pointer object in comparison to a point source. These various sources of diffusion are subject to varying amounts of control in the design of a system and the required diffusion will generally result from a partially controlled combination of the several sources.
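A minimal sketch of the third source, mathematical (non-optical) diffusion, might blur the raw sensor readings numerically with a small smoothing kernel; the kernel shape and width chosen here are illustrative assumptions:

```python
import numpy as np

def mathematical_diffusion(intensities, kernel_sigma=1.5):
    """Blur raw sensor intensities with a small Gaussian kernel, standing in
    for optical diffusion when little is present in the hardware."""
    intensities = np.asarray(intensities, dtype=float)
    radius = int(3 * kernel_sigma)
    offsets = np.arange(-radius, radius + 1)
    kernel = np.exp(-offsets ** 2 / (2.0 * kernel_sigma ** 2))
    kernel /= kernel.sum()                      # preserve total intensity
    return np.convolve(intensities, kernel, mode="same")
```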
As just described with reference to FIG. 5, diffusion is used to produce an intensity curve of known shape and with intensities varying along its length, the Gaussian distribution being physically and mathematically very usable. Referring now also to FIG. 6, motion of the known shape can then be tracked as it passes over a discrete array of sensors and located with a precision that is a small fraction of the distance between sensors. A Gaussian curve 120 is again shown in FIG. 6, with a shifted version 122 shown in dotted line representing an actual movement to the left of about one-fourth the distance between sensor elements. The sensor elements to the left of the central peak experience varying, but very specific, increases in light intensity, while those to the right experience decreases. A further slight difference in position makes a corresponding slight difference in the length of each of the arrows in the diagram. From this it will be appreciated that measurement of movement of the curve 120, 122 between two sensor element positions is essentially a continuous process, the precision being limited more by the error in the individual intensity measurements than by the distance between sensors. Of course, other factors, such as how well the intensity shape is known, also influence the precision.
Although there are similarities, the process of finding the exact position of a curve of known characteristics is not the same as what is commonly called interpolation. Typically, interpolation is applied with little, if any, regard for the known characteristics of the shape, and the interpolation polynomial or other mathematical form is applied to as few points as possible in the vicinity of the area of interest, the concept being that whatever mathematical form is being used will provide a better approximation to the real curve over a small region. In doing this, the interest would be in finding the peak intensity and its position. We are much more interested in finding and tracking the position and characteristics of the overall intensity shape curve than in tracking its localized peak intensity, which may move somewhat relative to the position of the overall shape.
The approximate Gaussian we are tracking, the blurred image of a finger, for example, is likely to be the sum of many closely spaced Gaussians from individual point light sources on the finger. As hand position changes, these individual Gaussian sources will shift somewhat relative to one another while still maintaining a good approximation to a complete Gaussian overall. It is thus more stable and generally better to track the complete shape rather than interpolating the peak intensity.
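A small numerical illustration of why the whole shape is the more stable quantity to track; the positions and amplitudes below are arbitrary:

```python
import numpy as np

cells = np.arange(64.0)

def gaussian(mu, sigma=2.5, amp=1.0):
    return amp * np.exp(-(cells - mu) ** 2 / (2.0 * sigma ** 2))

# A "finger" modelled as several closely spaced point sources.
finger = gaussian(30.2, amp=0.8) + gaussian(31.0) + gaussian(31.9, amp=0.9)

# The overall shape remains close to a single Gaussian; its centroid moves
# smoothly even when the single brightest cell (the local peak) jumps.
centroid = np.sum(cells * finger) / np.sum(finger)
peak_cell = int(np.argmax(finger))
print(round(centroid, 2), peak_cell)
```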
Considering now several possible sources of diffusion in greater detail, FIG. 7 shows a simple lens 150 with its focal center arranged to give the lens 90° coverage. In the simplest of cases, the coverage can be considered to be linear with angle, and the position of the lens 150 arranged to give the proper coverage. Because we use a diffuse image, the usual requirement that a wide angle lens provide accurate edge-to-edge focus is not at all important. Likewise, non-linear angle coverage in extreme wide angle lenses is not important, since any non-linearity can be corrected, if necessary, as part of the x, y position calculation. Some attention must be given to aperture shadowing at wide angles. The so-called cosine correction, whereby the circular lens aperture looks like an ellipse when viewed from the side, is the first of these effects. This phenomenon, which effectively reduces light passage at side angles, is often accentuated by the three-dimensional aspect of the lens.
FIG. 8 shows a focused lens system resolving a single point 154 directly in front of a lens 156 onto a linear sensor 158, which runs perpendicular to the surface of the paper. As shown in FIG. 9, the image can be diffused by placing a diffusing sheet 160 directly in front of the linear sensor 158. The actual diffusion sheet 160 would be much less conspicuous than shown in FIG. 9, very likely being a spray coating on the surface of the linear sensor array chip 158. FIG. 10 schematically shows a lens system which is producing diffusion by de-focusing the image, in this case by using a lens 162 with a longer natural focal length positioned the same distance from the linear sensor array as in the figures above. Thus the image plane is at 164. The triangular shape shown is in reality a three-dimensional cone, so that several adjacent cells of the linear sensor array 158 receive light from the object point. This representation is somewhat simplistic in that the true distribution of light approximates a Gaussian distribution rather than falling off sharply at the edges of the cone, but the general concept is correct.
The use of a defocused image has a side effect that is of importance in designing a diffused image system. In FIG. 11 , a light source 166 is positioned above the plane 168 which should be imaged on the linear sensor 158. Because the image is de-focused, the cone of light from the source is picked up by the sensor, giving a false or interfering signal. Thus, even if a special black background is erected to cover the image plane which the sensor is designed to see, a de-focused system can also see extraneous light sources which are at a low angle above the barrier.
As shown in FIG. 12, a focused image system does not have this problem (although internal reflections may still create similar problems in focused systems). Even if a diffusion sheet is used in the focused system, the sheet can be narrow enough to just cover the linear sensor 158, and thus not be subject to the same problem as in FIG. 11. In practice, a compromise must be struck. The background, and even more so the extraneous light source, are necessarily relatively far from the lens, and a lens which is focused for distance will still be diffuse for nearer objects. Thus, in practice, the lens is more nearly focused for distance, and supplementary diffusion is used if necessary for satisfactory performance near the edges of the display.
The light intensity pattern on the linear sensor 158 is essentially a sum of Gaussian distributions, each resulting from some illuminated source in the field. The position of the finger or stylus can be represented by the mean, μ, of the Gaussian which corresponds to the finger or stylus. There are many ways of analyzing the intensity data to extract μ, and the following has been found to be a reasonable compromise between computational intensity and dependability of the algorithm.
The diffusion-assisted position locating concept also makes it possible to employ fiber optic elements 180 as an active part of the system rather than just passive message carriers. FIGS. 13, 14 and 15 show three different arrangements of lenses and fiber optics.
The configurations of FIGS. 13 and 14 have no lens at all. An optical fiber has the characteristic that it will accept light which enters an acceptance cone which typically ranges about ±30° from the center axis of the fiber. This makes the optical fiber a diffuser in itself, without a lens or other image sharpening device. By arranging the fibers in a fan shape, as shown in FIGS. 13 and 14, the acceptance cone for each fiber takes in a slightly different range of angles over the entire scene and the arrangement behaves much like a lens-and-diffuser system, but without requiring a lens. Thus, the fiber fans have the property of averaging light intensity over a range of angles, as depicted in FIG. 13. The arrangement in FIG. 14 uses a diffuser 190 to better control the amount and nature of diffusion experienced in a fiber. The optical fiber by itself has a sharper cut-off of light intensity at the extreme angles of the acceptance cone than would be the case for an ordinary diffuser, and the use of a diffusion sheet 190 can correct this if it proves desirable to do so. Note that there is no reason a diffuser could not be effectively used between the fibers and the CCD sensor rather than at the opposite end of the fibers.
FIG. 15 uses a lens and diffusion sheet to feed light directly to an optical fiber, so that the effect is more nearly that of sampling a specific angle in the graph of FIG. 5. Clearly, the configuration of FIG. 15 offers more control over the degree of diffusion than is possible in either FIG. 13 or 14. If this increased control is necessary, it is available. For best performance, this system of the invention requires a separate detection of touchdown. When an illuminated stylus is used, this can be achieved by having a tip switch in the stylus that causes a separate signal to be sent to the system controller. Although there is a currently active market for LED matrix touch screens which detect the finger above the display and thus do not detect touch well, this behavior limits the usefulness of these devices. With finger-activated devices, some other detection is required. We use methods which involve detecting the small motion of the display that comes with the force of touch. We have used with success a touch detector 67 consisting of a piezoelectric speaker surrounded on each side by foam sponge (1/8" uncompressed) to act as a spring allowing small motions, positioned under the rear edge of a standard CRT display. The output of detector 67 is applied through A/D converter 54 to data analyzer 52. The sudden shifts in the mean of the harmonic motions detected by this device indicate that the display has been touched. These are true touch events if the camera devices also detect that the display screen is being touched. Such motion can be detected employing piezo devices, strain gauges, devices which depend upon resistive changes, as well as devices in which the motion brings closer the plates of a variable capacitor, or LVDT-like devices in which the inductance of a coil is varied through motion. It will be understood that detector 67 is intended to illustrate one simple example of a motion detector system for detecting touch of the screen, and other known touch detection systems could be substituted therefor.
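One way such a mean-shift test on the digitized piezo signal might be sketched; the window size, threshold, and function name are assumptions for illustration and are not taken from the described detector:

```python
import numpy as np

def touch_detected(samples, window=32, threshold=4.0):
    """Flag a touch when the short-term mean of the vibration signal shifts
    abruptly relative to its recent history (threshold in units of the
    recent standard deviation)."""
    samples = np.asarray(samples, dtype=float)
    if samples.size < 2 * window:
        return False
    recent = samples[-window:]
    history = samples[-2 * window:-window]
    shift = abs(recent.mean() - history.mean())
    return shift > threshold * (history.std() + 1e-9)
```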
Careful implementation and data analysis can also handle "backgrounding" in a fairly straightforward manner. As a practical illustration, consider a system in which the stylus (or finger) is illuminated using infrared LEDs as a source. The illumination can be either internal or external to the stylus. In environments which may experience intense stray illumination, as from auto headlights, carnival lights, or the like, the CCD sensors may give false readings. If the LED illumination is cycled on and off in synchronization with the CCD reading cycles, the illumination may be intentionally left off for some cycles and the resulting CCD reading used as background. By way of specific example, using a clock-driven analog-to-digital converter (ADC), data are taken representing the light intensity observed at each of the 64 sensors of the Texas Instruments TSL214 in each of the two or three cameras being used. This data taking operation is initiated at regular intervals, typically in the range of 1/25th to 1/100th of a second, the interval being decided mainly by the time required to produce an intensity amplitude sufficient for analysis in the dimmest area of the field.
These data are read into the digital computer memory (for example using direct memory access, an interrupt, or by polling the ADC). The data are then analyzed using the following process: First, the intensity data are corrected for background intensity (if necessary) by subtracting out a background value for each of the 64 cells. These background values typically comprise the average intensity seen by the cell in quiescent periods. The objective is to produce a clean signal that does not contain spurious peaks of intensity that might confuse the following process, particularly in the locating of the peak region. In a well-controlled lighting situation, background subtraction may not be necessary at all.
The data are then searched for a region including a peak value of intensity. This is currently done by searching for the set of n sequential points for which the intensities sum to the highest value. Our experience is that values of n in the range of 6 to 10 produce good results with a 64 element sensor. The central position of the n points is then taken as a reference. Starting from the central point, and going in both directions, the array of data is examined to find the first point index below some threshold percentage (typically 60% to 80%) of the average intensity in the peak region. The lowest index in this region is called l and the highest, h. This process defines a region which encompasses the peak region of the dominant Gaussian and avoids the tail areas where the dominant Gaussian may be confounded with non-dominant peaks.
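A minimal sketch of this peak-region search; the function name and default values of n and the threshold are illustrative, and the exact treatment of the region boundary may differ from the implementation described:

```python
import numpy as np

def find_peak_region(intensity, n=8, threshold=0.7):
    """Locate the dominant peak region in a 64-cell scan.

    Returns (l, h): the lowest and highest cell indices of the region found by
    the n-point windowed-sum search and the threshold walk-out described above.
    """
    intensity = np.asarray(intensity, dtype=float)
    # n-point window with the largest summed intensity.
    window_sums = np.convolve(intensity, np.ones(n), mode="valid")
    start = int(np.argmax(window_sums))
    center = start + n // 2
    cutoff = threshold * intensity[start:start + n].mean()
    # Walk outward from the center while intensity stays above the cutoff.
    l = center
    while l > 0 and intensity[l - 1] >= cutoff:
        l -= 1
    h = center
    while h < intensity.size - 1 and intensity[h + 1] >= cutoff:
        h += 1
    return l, h
```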
The light intensity pattern in the selected region is primarily the Gaussian distribution resulting from the finger or stylus in the field, and its position can be represented by the mean, μ, of that Gaussian. The current method used for extracting μ is by least-squares fitting of the data in the region to the Gaussian curve (although other methods might be used). The Gaussian distribution is y = (m/(σ√(2π)))·e^[-(x - μ)²/(2σ²)], in which y is the intensity, x is the position within the array, running from 1 to 64, and σ² is the variance. This equation can be transformed to ln y = A0 + A1·x + A2·x², where A0 = ln(m/(σ√(2π))) - μ²/(2σ²), A1 = μ/σ², and A2 = -1/(2σ²). Thus, if the Aᵢ are determined, μ = -A1/(2·A2). The form for ln y above is suitable for applying linear least squares, whereby the coefficients A0, A1 and A2 are obtained by solving the weighted least-squares normal equations over the n = h - l + 1 points selected for the peak region, with wᵢ a suitably chosen weighting factor. In the standard application of least squares to the original Gaussian equation for yᵢ, the weighting factor would be chosen as wᵢ = yᵢ to compensate for the logarithmic transformation that was required to use linear least squares. In the method currently used, however, this weighting factor is used to aid in producing a smooth progression of μ as the Gaussian distribution moves across the array of cells. This is achieved by using weights which are largest near the central, peak values of the distribution and diminish to much smaller values at the ends of the selected arrays. If yc is the central (nominal maximum) intensity, and p = max(c, n - c - 1), then one suitable weighting factor is wᵢ = (p - |c - i|)^q and another is wᵢ = 1/(|c - i| + 1)^q, each of which has been used successfully with q = 1, 2 and 3. Optimal choice of the weighting function depends upon the diffusive characteristics of the cameras. Because in this process all of the intensities usually have been selected to be within 60% or closer to the highest intensity, the logarithmic transformation is usually superfluous in application. Because it saves computation time, yᵢ is generally used in place of ln yᵢ in the above equations. Additionally, when using weighting factors similar to those described above, it is seen that the inverse least-squares matrix does not depend upon the intensities yᵢ and can therefore be computed and inverted in advance. Because sets of n points, where n < 64, are generally used, and n is variable, when using this approach it is necessary to pre-compute a number of inverse matrices and to select the proper one of these to use in each computation.
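Put together, the region fit just described might be sketched as follows; this is an illustrative version under stated assumptions, it solves the normal equations directly rather than using the precomputed inverse matrices mentioned above, uses yᵢ in place of ln yᵢ as the text permits, and adopts one of the weightings discussed:

```python
import numpy as np

def fit_gaussian_region(intensity, l, h, q=2):
    """Weighted quadratic fit over cells l..h, returning (mu, sigma_squared).

    Fits A0 + A1*x + A2*x**2 to the intensities with weights largest near the
    central cell, then recovers mu = -A1/(2*A2) and sigma^2 = -1/(2*A2).
    """
    x = np.arange(l, h + 1, dtype=float)
    y = np.asarray(intensity[l:h + 1], dtype=float)
    c = l + int(np.argmax(y))                 # nominal maximum (central) cell
    w = 1.0 / (np.abs(c - x) + 1.0) ** q      # one of the weightings discussed above
    # Weighted normal equations for the quadratic coefficients.
    X = np.vander(x, 3, increasing=True)      # columns: 1, x, x**2
    W = np.diag(w)
    A0, A1, A2 = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    mu = -A1 / (2.0 * A2)
    sigma_sq = -1.0 / (2.0 * A2)
    return mu, sigma_sq
```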
The ends of the 64 element array require special consideration during this selection and weighting process. As stated above, the distribution of weights must take place about the nominal maximum value, yc, but if yc is near either end of the array, it is important that the weights still center about yc even though the end of the array will truncate the symmetrical effect. This approach causes the progression of μ to remain uniform even near the ends of the array.
Note that this approach yields σ² = -1/(2·A2) as well as μ. The value of σ is a useful diagnostic. The value of σ represents the relative broadness of the Gaussian curve and depends not only upon the diffusiveness resulting from the camera system but also on diffusiveness resulting from the size of the subject. Thus values of σ which should validly represent, say, a finger touching the display typically fall within a narrow range, and σ is only weakly dependent on the image size and intensity. When the analysis results in a value of σ outside that range, the analysis can be rejected as not representing a true finger touch. Similarly, when controlled illumination is used, a valid touch within the display area will be found to produce at least a certain minimum intensity, and signals below that level can be safely rejected as invalid. With an ordinary camera and lens, the value of μ, which represents a position within the sensor array, is directly proportional to the angular position in the field of view of that camera.
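Such an acceptance test might be sketched as below; the numeric limits are placeholders to be established by calibration, not values from the text:

```python
def touch_is_valid(sigma_sq, peak_intensity,
                   sigma_range=(1.5, 6.0), min_intensity=40.0):
    """Reject fits whose width or brightness falls outside the band expected
    for a real finger or stylus."""
    sigma = sigma_sq ** 0.5
    return (sigma_range[0] <= sigma <= sigma_range[1]
            and peak_intensity >= min_intensity)
```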
Thus, α = α0 + Δα·(μ/64), where α is the angular position of the object in the field, α0 is a calibration bias to correct for the true camera angular position, and Δα is the angular field of view represented by the 64 cells in the array. Both α0 and Δα are determined through calibration in the field. Similarly, for the second camera, β = β0 + Δβ·(μ/64), where β represents the angle of the object in the field of view of that camera.
If α, measured counter-clockwise, is for a camera positioned in the upper right corner of a rectangle and β, measured clockwise, is for a camera positioned in the upper left corner of the same rectangle, with the distance L between the focal centers of the two cameras, then x = L·tan α / (tan α + tan β) and y = x·tan β, where x is measured from left to right across the rectangle and y is measured from top to bottom of the rectangle. The distance L is generally taken through field calibration rather than by direct measurement.
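The calibration and triangulation steps translate directly into a short sketch; the dictionary keys for the calibration constants are illustrative names only:

```python
import math

def angles_from_mu(mu_right, mu_left, cal):
    """Convert fitted array positions (mu, in cells 1..64) to field angles using
    calibration constants alpha0, d_alpha, beta0, d_beta (all in radians)."""
    alpha = cal["alpha0"] + cal["d_alpha"] * mu_right / 64.0
    beta = cal["beta0"] + cal["d_beta"] * mu_left / 64.0
    return alpha, beta

def triangulate(alpha, beta, L):
    """x measured left to right, y top to bottom, L = distance between cameras."""
    denom = math.tan(alpha) + math.tan(beta)
    x = L * math.tan(alpha) / denom
    y = x * math.tan(beta)
    return x, y
```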
If it is necessary to use a third camera to adequately cover the less precise region at the top between the two cameras, that camera may be placed in another corner, in which case the data analysis simply pairs the new camera with one of the two at the upper corners and proceeds as above.
If a super-wide angle camera is used between the two at the top corners of the rectangle, then x and y are determined from the two corner cameras as above, with x being discarded. The Gaussian curve fit is the same for the central camera as for the corner cameras with the exception that, if an angle-distorting lens, such as a fish-eye, is used, the angle is no longer linear with position in the 64 element array and a correction appropriate for the lens must be made.
If the center camera is positioned a distance C from the left corner camera and a distance Y above the line between the corner cameras, and the angle for the center camera, γ, is measured counterclockwise, then x = C + (Y + y)·tan(γ - 90°), where y is as determined from the corner cameras. Again, C and Y are generally determined by field calibration rather than by direct measurement.
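The center-camera correction is a one-line computation; the function below is an illustrative sketch of the stated formula:

```python
import math

def x_from_center_camera(gamma, y, C, Y):
    """Recompute x from the center camera's angle gamma (radians, measured
    counterclockwise), given y from the corner cameras and the calibration
    offsets C (horizontal) and Y (vertical) of the center camera."""
    return C + (Y + y) * math.tan(gamma - math.pi / 2.0)
```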
Pairs of x, y coordinates are reported whenever the above process produces valid results and the point falls within the active area of the display and the computed value of σ, the spread of the Gaussian representing the stylus or finger touching the display, falls within a valid range. If the touch sensing device senses that the display has actually been touched during a period of valid display, then the touch event is also reported.
From the above discussion, it will be appreciated that the present invention provides a method and apparatus whereby the pointer object 50 shown in FIG. 1A is precisely located by photodetectors, such as photodetectors 38, 40 and 42. In the preferred embodiment, the photodetectors 38 and 40 are positioned at the upper corners of the display 30 and the detector 42 is positioned at the top center of the display 30. In this arrangement, the pointer object 50 is located or positioned precisely with respect to the photodetectors by triangulation as previously described. Since the photodetectors are placed in a known position with respect to the display 30, the position of the pointer object 50 with respect to the screen 30 is also known. In the preferred embodiment, the data analyzer 52 (computer) is programmed to display a cursor 51 on the screen 30 as soon as a pointer object 50 is detected near the display 30 (about 1 inch from the display 30) by the photodetectors 38, 40 and 42. The cursor 51 may be displayed in a position precisely corresponding with the center of the pointer object 50, or the cursor may be displaced slightly with respect to the object 50, as illustrated in FIG. 1A where the cursor 51 is displaced upwardly and to the left of the pointer object 50. In some circumstances, a slight displacement of the cursor may prove to be an advantage. For example, if a finger is used as the pointer object 50 and the cursor 51 is positioned directly in the center of the finger, the finger will hide the cursor when the screen is touched. By displacing the cursor upwardly and slightly to the left, the cursor 51 will remain visible to the user at all times, even when the screen is being touched. In such a case, a line from the center of object 50 to cursor 51 is oblique (not perpendicular) to the display 30. Thus, by displacing the cursor 51 slightly with respect to the pointer object 50, one may actually achieve greater precision, because the user will always have visual feedback as to the cursor position, even when the screen is being touched.
Also, by displaying a cursor before the screen is actually being touched, the user may more accurately position the pointer object 50 and touch the screen at the desired location, which is normally required on a touch screen to select a desired function. While specific embodiments of the invention have been illustrated and described herein, it is realized that numerous modifications and changes will occur to those skilled in the art. It is therefore to be understood that the appended claims are intended to cover all such modifications and changes as fall within the true spirit and scope of the invention.

CLAIMS:
1. A system for optically determining the direction of an object, said system comprising: a photodetector having a plurality of detector elements extending in at least one dimension; elements for projecting a diffuse image of the object onto said photodetector such that each of said detector elements produces an output signal corresponding to the intensity of that portion of the diffuse image which is projected onto the particular detector element, the diffuse image having a characteristic intensity distribution; and a data analyzer connected for receiving the output signals from said detector elements and operable to locate the position of the diffuse image on said photodetector by recognizing the characteristic intensity distribution of the diffuse image.
2. A system in accordance with claim 1 , wherein said elements for projecting a diffuse image comprise a lens arranged to focus a sharp image of the object at an image plane, in which said photodetector is not positioned.
3. A system in accordance with claim 2, wherein said photodetector is positioned between said lens and the image plane.
4. A system in accordance with claim 1 , wherein said elements for projecting a diffuse image comprise a focusing lens and a diffuser positioned between said lens and said photodetector.
5. A system in accordance with claim 1, wherein said elements for projecting a diffuse image comprise a plurality of optical fibers, one end of each of said optical fibers being arranged to collect optical radiation and the other end of each of said optical fibers being arranged to deliver optical radiation to one of said detector elements.
6. A system in accordance with claim 5, wherein said one ends of said optical fibers are arranged in a fan configuration so that each optical fiber collects optical radiation from a different direction.
7. A system in accordance with claim 5, wherein an optical path is defined between the object and said photodetector, and which system further comprises a diffuser positioned in said optical path.
8. A system in accordance with claim 6, wherein an optical path is defined between the object and said photodetector, and which system further comprises a diffuser positioned in said optical path.
9. A system in accordance with claim 1 , wherein said data analyzer includes elements for locating the position of the diffuse image on said photodetector by performing a least squares fit of a predetermined intensity distribution function known to correspond with the characteristic intensity distribution to the output signals from said detector elements.
10. A system in accordance with claim 9, wherein the predetermined intensity distribution function is a Gaussian intensity distribution function.
11. A system in accordance with claim 1, wherein said data analyzer includes elements for locating the position of the diffuse image on said photodetector by performing a signal processing operation to extract from the output signals from said detector elements the characteristics of a predetermined intensity distribution function known to correspond with the characteristic intensity distribution.
12. A system in accordance with claim 11 , wherein the predetermined intensity distribution function is a Gaussian intensity distribution function.
13. A system in accordance with claim 1 , wherein said data analyzer includes elements for locating the position of the diffuse image on said photodetector by performing a transform on the output signals from said detector elements taken as a data set.
14. A system in accordance with claim 1 , which comprises an element for illuminating the object with radiation having a recognizable characteristic.
15. A method for optically determining the direction of an object, said method comprising: projecting a diffuse image of the object onto a photodetector having a plurality of detector elements extending in at least one dimension such that each of the detector elements produces an output signal corresponding to the intensity of that portion of the diffuse image which is projected onto the particular detector element, the diffuse image having a characteristic intensity distribution; locating the position of the diffuse image on the photodetector by recognizing the characteristic intensity distribution of the diffuse image.
16. A method in accordance with claim 15, wherein said step of projecting a diffuse image comprises employing a lens to focus a sharp image of the object at an image plane in which the photodetector is not positioned.
17. A method in accordance with claim 15, in which said step of projecting a diffuse image comprises employing a plurality of optical fibers, with one end of each of the optical fibers being arranged to collect optical radiation, and the other end of each of the optical fibers being arranged to deliver optical radiation to one of the detector elements.
18. A method in accordance with claim 17, which comprises employing a plurality of optical fibers with the one ends of the optical fibers arranged in a fan configuration so that each optical fiber collects optical radiation from a different direction.
19. A method in accordance with claim 15, wherein said step of projecting a diffuse image comprises employing a diffuser.
20. A method in accordance with claim 15, wherein said step of locating the position of the diffuse image on the photodetector comprises performing a least squares fit of a predetermined intensity distribution function known to correspond with the characteristic intensity distribution to the detector element output signals.
21. A method in accordance with claim 20, wherein the predetermined intensity distribution function is a Gaussian intensity distribution function.
22. A method in accordance with claim 15, wherein said step of locating the position of the diffuse image on the photodetector comprises performing a signal processing operation to extract from the output signals from said detector elements the characteristics of a predetermined intensity distribution function known to correspond to the characteristic intensity distribution.
23. A method in accordance with claim 22, wherein the predetermined intensity distribution function is a Gaussian intensity distribution function.
24. A method in accordance with claim 15, wherein said step of locating the position of the diffuse image on the photodetector comprises performing a transform on the output signals from said detector elements.
25. A system for optically determining the position of an object within a generally planar viewing field, said system comprising: at least a pair of detector systems in the same plane as the viewing field and arranged to view the object within the viewing field from different angular positions, each of said detector systems including a photodetector having a plurality of detector elements extending in at least one dimension, and elements for projecting a diffuse image of the object onto said photodetector such that each of said detector elements produces an output signal corresponding to the intensity of that portion of the diffuse image which is projected onto the particular detector element, the diffuse image having a characteristic intensity distribution; a data analyzer connected for receiving the output signals from said detector elements, and operable to locate the position of the diffuse image on each of said photodetectors by recognizing the characteristic intensity distribution of the diffuse image and thereby to determine the direction of the object relative to each of said detector systems, and to determine the position of the object by triangulation based on the determined directions.
26. A system in accordance with claim 25, wherein said elements for projecting a diffuse image comprise a lens for each of said detector systems arranged to focus a sharp image of the object at an image plane in which said photodetector is not positioned.
27. A system in accordance with claim 26, wherein, in each of said detector systems, said photodetector is positioned between said lens and the image plane.
28. A system in accordance with claim 25, wherein said elements for projecting a diffuse image comprise, for each of said detector systems, a focusing lens and a diffuser positioned between said lens and said photodetector.
29. A system in accordance with claim 28, wherein said elements for projecting a diffuse image comprise, for each of said detector systems, a plurality of optical fibers, one end of each of said optical fibers being arranged to collect optical radiation and the other end of each of said optical fibers being arranged to deliver optical radiation to one of said detector elements.
30. A system in accordance with claim 29, wherein, for each of said detector systems, said one ends of said optical fibers are arranged in a fan configuration so that each optical fiber collects optical radiation from a different direction.
31. A system in accordance with claim 29, wherein, for each of said detector systems, an optical path is defined between the object and said photodetector, and which system further comprises a diffuser positioned in said optical path.
32. A system in accordance with claim 30, wherein, for each of said detector systems, an optical path is defined between the object and said photodetector, and which system further comprises a diffuser positioned in said optical path.
33. A system in accordance with claim 25, wherein said data analyzer includes elements for locating the position of the diffuse images on each of said photodetectors by performing a least squares fit of a predetermined intensity distribution function known to correspond with the characteristic intensity distribution to the output signals from said detector elements.
34. A system in accordance with claim 33, wherein the predetermined intensity distribution function is a Gaussian intensity distribution function.
35. A system in accordance with claim 25, wherein said data analyzer includes elements for locating the position of the diffuse images on each of said photodetectors by performing a signal processing operation to extract from the output signals from said detector elements the characteristics of a predetermined intensity distribution function known to correspond with the characteristic intensity distribution.
36. A system in accordance with claim 35, wherein the predetermined intensity distribution function is a Gaussian intensity distribution function.
37. A system in accordance with claim 25, wherein said data analyzer includes elements for locating the position of the diffuse image on each of said photodetectors by performing a transform on the output signals from said detector elements taken as a data set.
38. A system for determining the position of an object adjacent to a surface of a display having a perimeter, said system comprising: at least a pair of detector systems positioned outside of and adjacent to the perimeter of the display and arranged to view the object from different angular positions and to view the object both when it is positioned on the surface of the display and when the object is positioned adjacent to the surface of the display but spaced apart from the display, each of said imaging systems including a photodetector and elements for projecting a light pattern corresponding to the object onto said photodetector such that said photodetector produces output signals corresponding to the intensity distribution of the light pattern; and a data analyzer connected for receiving the output signals and operable to locate the position of the light pattern on each of said photodetectors and thereby to determine the direction of the object relative to each of said imaging systems, to determine the position of the object by triangulation based on the determined directions, and to display a cursor on the display at a position corresponding to the position of the object, said cursor being displayed when the object is adjacent to, but spaced apart from, the display.
39. The system of claim 38, further comprising a touch sensor for sensing a touch of the object to the display, producing a touch signal when a touch is sensed, and transmitting the touch signal to said data analyzer.
40. The system of claim 38 wherein said data analyzer displays the cursor at a position on the display that is shifted with respect to the object so that a line from the object to the cursor is oblique (not perpendicular) with respect to the screen.

US7274356B2 (en) 2003-10-09 2007-09-25 Smart Technologies Inc. Apparatus for determining the location of a pointer within a region of interest
US7355593B2 (en) 2004-01-02 2008-04-08 Smart Technologies, Inc. Pointer tracking across multiple overlapping coordinate input sub-regions defining a generally contiguous input region
US7232986B2 (en) * 2004-02-17 2007-06-19 Smart Technologies Inc. Apparatus for detecting a pointer within a region of interest
JP4522113B2 (en) * 2004-03-11 2010-08-11 キヤノン株式会社 Coordinate input device
US7460110B2 (en) 2004-04-29 2008-12-02 Smart Technologies Ulc Dual mode touch system
US7492357B2 (en) * 2004-05-05 2009-02-17 Smart Technologies Ulc Apparatus and method for detecting a pointer relative to a touch surface
US7538759B2 (en) 2004-05-07 2009-05-26 Next Holdings Limited Touch panel display system with illumination and detection provided from a single edge
US8120596B2 (en) 2004-05-21 2012-02-21 Smart Technologies Ulc Tiled touch system
US7653260B2 (en) * 2004-06-17 2010-01-26 Carl Zeiss MicroImaging GmbH System and method of registering field of view
US8582924B2 (en) * 2004-06-30 2013-11-12 Carl Zeiss Microimaging Gmbh Data structure of an image storage and retrieval system
US20060028457A1 (en) * 2004-08-08 2006-02-09 Burns David W Stylus-Based Computer Input System
US7692627B2 (en) * 2004-08-10 2010-04-06 Microsoft Corporation Systems and methods using computer vision and capacitive sensing for cursor control
CA2578378A1 (en) * 2004-08-25 2006-03-09 Moondoggie Technologies, Inc Method and apparatus for liquid crystal displays
CN101184545B (en) * 2004-12-21 2011-07-20 康宁股份有限公司 Method and system for identifying and repairing defective cells in a plugged honeycomb structure
JP5231809B2 (en) 2005-01-12 2013-07-10 スィンクオプティクス インコーポレイテッド Handheld vision type absolute pointing system
US7852317B2 (en) * 2005-01-12 2010-12-14 Thinkoptics, Inc. Handheld device for handheld vision based absolute pointing system
US7213745B2 (en) * 2005-06-22 2007-05-08 De La Rue International Limited Financial transactions processing system including customer display screen
US20070031043A1 (en) * 2005-08-02 2007-02-08 Perz Cynthia B System for and method of intelligently directed segmentation analysis for automated microscope systems
EP1758062A3 (en) 2005-08-23 2007-10-03 De La Rue International Limited Flexible, multi-mode financial transactions processing systems and methods
US20070091109A1 (en) * 2005-09-13 2007-04-26 Roscoe Atkinson Image quality
JP4915071B2 (en) * 2005-09-22 2012-04-11 株式会社ニコン Microscope and virtual slide creation system
US7782296B2 (en) * 2005-11-08 2010-08-24 Microsoft Corporation Optical tracker for tracking surface-independent movements
US7534988B2 (en) * 2005-11-08 2009-05-19 Microsoft Corporation Method and system for optical tracking of a pointing object
US7946406B2 (en) 2005-11-12 2011-05-24 Cummins-Allison Corp. Coin processing device having a moveable coin receptacle station
US20070165007A1 (en) * 2006-01-13 2007-07-19 Gerald Morrison Interactive input system
US20070205994A1 (en) * 2006-03-02 2007-09-06 Taco Van Ieperen Touch system and method for interacting with the same
US7980378B2 (en) 2006-03-23 2011-07-19 Cummins-Allison Corporation Systems, apparatus, and methods for currency processing control and redemption
US7755026B2 (en) * 2006-05-04 2010-07-13 CandleDragon Inc. Generating signals representative of sensed light that is associated with writing being done by a user
US8913003B2 (en) * 2006-07-17 2014-12-16 Thinkoptics, Inc. Free-space multi-dimensional absolute pointer using a projection marker system
US20080052750A1 (en) * 2006-08-28 2008-02-28 Anders Grunnet-Jepsen Direct-point on-demand information exchanges
US7929749B1 (en) 2006-09-25 2011-04-19 Cummins-Allison Corp. System and method for saving statistical data of currency bills in a currency processing device
US9442607B2 (en) * 2006-12-04 2016-09-13 Smart Technologies Inc. Interactive input system and method
US20080166175A1 (en) * 2007-01-05 2008-07-10 Candledragon, Inc. Holding and Using an Electronic Pen and Paper
US8538123B1 (en) 2007-03-09 2013-09-17 Cummins-Allison Corp. Apparatus and system for imaging currency bills and financial documents and method for using the same
US8417017B1 (en) 2007-03-09 2013-04-09 Cummins-Allison Corp. Apparatus and system for imaging currency bills and financial documents and method for using the same
GB2486832A (en) 2007-03-09 2012-06-27 Cummins Allison Corp Document processing system using blind balancing
US8115753B2 (en) 2007-04-11 2012-02-14 Next Holdings Limited Touch screen system with hover and click input methods
US9176598B2 (en) * 2007-05-08 2015-11-03 Thinkoptics, Inc. Free-space multi-dimensional absolute pointer with improved performance
US8094137B2 (en) 2007-07-23 2012-01-10 Smart Technologies Ulc System and method of detecting contact on a display
CA2697856A1 (en) * 2007-08-30 2009-03-05 Next Holdings, Inc. Low profile touch panel systems
WO2009029767A1 (en) 2007-08-30 2009-03-05 Next Holdings, Inc. Optical touchscreen with improved illumination
TWI339808B (en) * 2007-09-07 2011-04-01 Quanta Comp Inc Method and system for distinguishing multiple touch points
JP4404927B2 (en) * 2007-10-10 2010-01-27 シャープ株式会社 Display system and indication position detection method
US9171454B2 (en) * 2007-11-14 2015-10-27 Microsoft Technology Licensing, Llc Magic wand
KR100942431B1 (en) * 2007-12-11 2010-02-17 주식회사 토비스 Touch coordinates perception method using a complementary metal oxide semiconductor and a light source, and touch screen system using the same
US8405636B2 (en) 2008-01-07 2013-03-26 Next Holdings Limited Optical position sensing system and optical position sensor assembly
FR2927433B1 (en) * 2008-02-12 2011-02-25 Peugeot Citroen Automobiles Sa CONTROL INTERFACE DEVICE FOR MOTOR VEHICLE
US8346574B2 (en) 2008-02-29 2013-01-01 Dako Denmark A/S Systems and methods for tracking and providing workflow information
US20090278816A1 (en) * 2008-05-06 2009-11-12 Next Holdings Limited Systems and Methods For Resolving Multitouch Scenarios Using Software Filters
US8902193B2 (en) 2008-05-09 2014-12-02 Smart Technologies Ulc Interactive input system and bezel therefor
US8952894B2 (en) * 2008-05-12 2015-02-10 Microsoft Technology Licensing, Llc Computer vision-based multi-touch sensing using infrared lasers
US8847739B2 (en) * 2008-08-04 2014-09-30 Microsoft Corporation Fusing RFID and vision for surface object tracking
US8221229B2 (en) * 2008-10-27 2012-07-17 Sony Computer Entertainment Inc. Spherical ended controller with configurable modes
US8339378B2 (en) * 2008-11-05 2012-12-25 Smart Technologies Ulc Interactive input system with multi-angle reflector
US20100225588A1 (en) * 2009-01-21 2010-09-09 Next Holdings Limited Methods And Systems For Optical Detection Of Gestures
US8929640B1 (en) 2009-04-15 2015-01-06 Cummins-Allison Corp. Apparatus and system for imaging currency bills and financial documents and method for using the same
US8437532B1 (en) 2009-04-15 2013-05-07 Cummins-Allison Corp. Apparatus and system for imaging currency bills and financial documents and method for using the same
US8391583B1 (en) 2009-04-15 2013-03-05 Cummins-Allison Corp. Apparatus and system for imaging currency bills and financial documents and method for using the same
JP2012530303A (en) 2009-06-18 2012-11-29 バーント インターナショナル リミテッド System and method for detecting and tracking objects obstructing radiation on a surface
US8692768B2 (en) 2009-07-10 2014-04-08 Smart Technologies Ulc Interactive input system
JP5374266B2 (en) * 2009-07-22 2013-12-25 株式会社シロク Optical position detector
TWI394072B (en) * 2009-08-21 2013-04-21 Largan Precision Co Ltd Apparatus for detecting a touching position on a flat display and a method thereof
US8421761B2 (en) * 2009-08-26 2013-04-16 General Electric Company Imaging multi-modality touch pad interface systems, methods, articles of manufacture, and apparatus
TWI498785B (en) * 2009-10-08 2015-09-01 Silicon Motion Inc Touch sensor apparatus and touch point detection method
US20110095977A1 (en) * 2009-10-23 2011-04-28 Smart Technologies Ulc Interactive input system incorporating multi-angle reflecting structure
JP2011090604A (en) * 2009-10-26 2011-05-06 Seiko Epson Corp Optical position detection apparatus and display device with position detection function
CH702146A1 (en) * 2009-11-04 2011-05-13 Ininet Solutions Gmbh A method for three-dimensional support of the manual operation of graphical user interfaces.
US8446392B2 (en) * 2009-11-16 2013-05-21 Smart Technologies Ulc Method for determining the location of a pointer in a pointer input region, and interactive input system executing the method
US20110128218A1 (en) * 2009-12-01 2011-06-02 Smart Technologies Ulc Interactive input system and bezel therefor
JP5368585B2 (en) * 2010-01-15 2013-12-18 パイオニア株式会社 Information processing apparatus, method thereof, and display apparatus
TW201126397A (en) * 2010-01-18 2011-08-01 Acer Inc Optical touch control display and method thereof
US20110187678A1 (en) * 2010-01-29 2011-08-04 Tyco Electronics Corporation Touch system using optical components to image multiple fields of view on an image sensor
JP5530809B2 (en) * 2010-06-01 2014-06-25 株式会社日立ソリューションズ Position detection apparatus and image processing system
TWI417774B (en) * 2010-06-28 2013-12-01 Pixart Imaging Inc Optical distance determination device, optical touch monitor system and method for measuring distance of a touch point on an optical touch panel
JP5489886B2 (en) * 2010-06-30 2014-05-14 キヤノン株式会社 Coordinate input device, light receiving device in the device, and manufacturing method thereof
US20120007804A1 (en) * 2010-07-12 2012-01-12 Smart Technologies Ulc Interactive input system and method
TWI421753B (en) * 2010-08-12 2014-01-01 Lite On Semiconductor Corp Calibration method and detection device for an optical touch panel, and optical touch panel
TWI439905B (en) * 2011-01-10 2014-06-01 Young Lighting Technology Corp Touch module and touch detecting method
JP2012173029A (en) * 2011-02-18 2012-09-10 Seiko Epson Corp Optical position detection apparatus and display system with input function
FR2976093B1 (en) * 2011-06-01 2013-08-16 Thales Sa OPTICAL TRANSMITTER AND RECEIVER TOUCH SYSTEM
JP2013024579A (en) * 2011-07-15 2013-02-04 Seiko Epson Corp Optical position detector and display system with input function
US10019112B2 (en) * 2011-10-25 2018-07-10 Semiconductor Components Industries, Llc Touch panels with dynamic zooming and low profile bezels
CN102778976B (en) * 2012-02-07 2016-03-30 北京京东方光电科技有限公司 Method for determining the position of a touch point on a touch screen, and touch screen
KR101315033B1 (en) 2012-06-27 2013-10-04 주식회사 스마트센스테크놀러지 Apparatus for sensing the position of an object
GB2506849A (en) * 2012-09-26 2014-04-16 Light Blue Optics Ltd A touch sensing system using a pen
US9141876B1 (en) 2013-02-22 2015-09-22 Cummins-Allison Corp. Apparatus and system for processing currency bills and financial documents and method for using the same
US20140379421A1 (en) 2013-06-25 2014-12-25 The Nielsen Company (Us), Llc Methods and apparatus to characterize households with media meter data
AU2014360691B2 (en) * 2013-12-02 2019-05-23 Unlicensed Chimp Technologies, Llc Local positioning and response system
US9277265B2 (en) 2014-02-11 2016-03-01 The Nielsen Company (Us), Llc Methods and apparatus to calculate video-on-demand and dynamically inserted advertisement viewing probability
US9740338B2 (en) 2014-05-22 2017-08-22 Ubi interactive inc. System and methods for providing a three-dimensional touch screen
US10219039B2 (en) 2015-03-09 2019-02-26 The Nielsen Company (Us), Llc Methods and apparatus to assign viewers to media meter data
US9684407B2 (en) 2015-04-22 2017-06-20 Samsung Electronics Co., Ltd. Method and apparatus for determining shape and orientation of a touch object on handheld devices
US9848224B2 (en) 2015-08-27 2017-12-19 The Nielsen Company (Us), Llc Methods and apparatus to estimate demographics of a household
US10791355B2 (en) 2016-12-20 2020-09-29 The Nielsen Company (Us), Llc Methods and apparatus to determine probabilistic media viewing metrics
CN108828612B (en) * 2018-04-23 2021-03-09 北京小米移动软件有限公司 Distance sensing assembly and mobile terminal
JP2019053769A (en) * 2018-12-04 2019-04-04 シャープ株式会社 Touch detection apparatus, touch detection system, touch detection method, and program
CA3128038A1 (en) 2019-01-31 2020-08-06 Rypplzz, Inc. Systems and methods for augmented reality with precise tracking
US10969523B1 (en) * 2019-10-18 2021-04-06 Center For Quantitative Cytometry Apparatus and method to obtain intrinsic still and video images without the use of filters or dichroic mirrors
JP2023529132A (en) * 2020-05-30 2023-07-07 センター フォー クオンティテイティブ サイトメトリー Apparatus and method for acquiring unique still and video images without the use of filters or dichroic mirrors

Family Cites Families (40)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2986596A (en) * 1953-08-31 1961-05-30 Jr Wardlaw M Hammond Television writing pick-up systems
US3114283A (en) * 1960-10-31 1963-12-17 Bausch & Lomb Light sensing method and apparatus therefor
US3455637A (en) * 1964-08-07 1969-07-15 Giannini Controls Corp Method and apparatus for measuring the opacity of sheet material
DE1560382A1 (en) * 1965-07-24 1970-09-17 Heberlein & Co Ag Electro-optical arrangement for thread cleaner
AT261266B (en) * 1965-09-18 1968-04-10 Schneider Co Optische Werke Fiber-optic analog-to-digital converter
US3411011A (en) * 1965-09-18 1968-11-12 Schneider Co Optische Werke Fiber-optic cable comprising rows of interleaved light-conducting fibers with masking of fiber portions in zones common to fibers of adjacent rows
US3476482A (en) * 1967-09-27 1969-11-04 Conrac Corp Opacimeter for comparing light from different areas of sample sheet
US3535537A (en) * 1968-04-09 1970-10-20 Us Navy Optical line splitter having light conducting strips with fused input end and flared output ends
US3584142A (en) * 1968-12-13 1971-06-08 Bell Telephone Labor Inc Interactive computer graphics using video telephone
US3673327A (en) * 1970-11-02 1972-06-27 Atomic Energy Commission Touch actuable data input panel assembly
DE2360781A1 (en) * 1973-03-02 1975-10-30 Britz Hans Ernst ELECTRONIC EVALUATION FOR OPTOELECTRONIC ANTENNA SYSTEM FOR LASER RAYS AND RAYS WITH SIMILAR SPREADING CONDITIONS
CH596621A5 (en) * 1976-06-30 1978-03-15 Cerberus Ag
US4459022A (en) * 1980-10-16 1984-07-10 United Technologies Corporation Fiber optic angular sensor
US4491727A (en) * 1981-07-01 1985-01-01 Ramot University Authority For Applied Research Solar radiation sensor and system including same for measuring solar radiation distribution
US4558313A (en) * 1981-12-31 1985-12-10 International Business Machines Corporation Indicator to data processing interface
DE3300849A1 (en) * 1983-01-13 1984-07-19 Standard Elektrik Lorenz Ag, 7000 Stuttgart DEVICE FOR DETERMINING THE DIRECTION OF THE OPTICAL RADIATION
US4547666A (en) * 1983-01-19 1985-10-15 National Computer Systems, Inc. Mark array sense reader with sequential output signals
US4565999A (en) * 1983-04-01 1986-01-21 Prime Computer, Inc. Light pencil
US4507557A (en) * 1983-04-01 1985-03-26 Siemens Corporate Research & Support, Inc. Non-contact X,Y digitizer using two dynamic ram imagers
US4561017A (en) * 1983-08-19 1985-12-24 Richard Greene Graphic input apparatus
US4648052A (en) * 1983-11-14 1987-03-03 Sentient Systems Technology, Inc. Eye-tracker communication system
US4550250A (en) * 1983-11-14 1985-10-29 Hei, Inc. Cordless digital graphics input device
US4943806A (en) * 1984-06-18 1990-07-24 Carroll Touch Inc. Touch input device having digital ambient light sampling
US4682159A (en) * 1984-06-20 1987-07-21 Personics Corporation Apparatus and method for controlling a cursor on a computer display
US4654648A (en) * 1984-12-17 1987-03-31 Herrington Richard A Wireless cursor control system
DE3513350C1 (en) * 1985-04-13 1986-06-26 Messerschmitt-Bölkow-Blohm GmbH, 8012 Ottobrunn Device for the detection and direction detection of optical radiation, esp. Laser radiation
US4949079A (en) * 1985-04-19 1990-08-14 Hugh Loebner Brightpen/pad graphic device for computer inputs and the like
US5073770A (en) * 1985-04-19 1991-12-17 Lowbner Hugh G Brightpen/pad II
US4688933A (en) * 1985-05-10 1987-08-25 The Laitram Corporation Electro-optical position determining system
JPS61269456A (en) * 1985-05-17 1986-11-28 Alps Electric Co Ltd Arrangement structure of optical element
US4782328A (en) * 1986-10-02 1988-11-01 Product Development Services, Incorporated Ambient-light-responsive touch screen data input method and system
US4746770A (en) * 1987-02-17 1988-05-24 Sensor Frame Incorporated Method and apparatus for isolating and manipulating graphic objects on computer video monitor
US4988981B1 (en) * 1987-03-17 1999-05-18 Vpl Newco Inc Computer data entry and manipulation apparatus and method
US4820050A (en) * 1987-04-28 1989-04-11 Wells-Gardner Electronics Corporation Solid-state optical position determining apparatus
US4811004A (en) * 1987-05-11 1989-03-07 Dale Electronics, Inc. Touch panel system and method for using same
US4891630A (en) * 1988-04-22 1990-01-02 Friedman Mark B Computer vision system with improved object orientation technique
US4891508A (en) * 1988-06-30 1990-01-02 Hewlett-Packard Company Precision infrared position detector apparatus for touch screen system
US4986662A (en) * 1988-12-19 1991-01-22 Amp Incorporated Touch entry using discrete reflectors
US4936683A (en) * 1989-06-22 1990-06-26 Summagraphics Corporation Optical tablet construction
US5055840A (en) * 1990-01-16 1991-10-08 Carroll Touch Incorporated Infrared touch input device and light emitted activation circuit

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4806747A (en) * 1988-02-19 1989-02-21 The Perkin-Elmer Corporation Optical direction of arrival sensor with cylindrical lens
US5079414A (en) * 1990-10-09 1992-01-07 Gte Government Systems Corporation Tracking telescope using an atomic resonance filter

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP0671018A4 *

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104991648A (en) * 2015-07-20 2015-10-21 江苏惠通集团有限责任公司 Screen input system
CN104991648B (en) * 2015-07-20 2018-05-11 江苏惠通集团有限责任公司 Screen input system

Also Published As

Publication number Publication date
US5317140A (en) 1994-05-31
AU5727694A (en) 1994-06-22
EP0671018A4 (en) 2000-02-23
EP0671018A1 (en) 1995-09-13
JPH08506193A (en) 1996-07-02

Similar Documents

Publication Publication Date Title
US5317140A (en) Diffusion-assisted position location particularly for visual pen detection
US7442914B2 (en) System and method of determining a position of a radiation emitting element
US7557935B2 (en) Optical coordinate input device comprising few elements
EP0055366B1 (en) System for remotely displaying and sensing information using shadow parallax
US7705835B2 (en) Photonic touch screen apparatus and method of use
US5686942A (en) Remote computer input system which detects point source on operator
US5484966A (en) Sensing stylus position using single 1-D image sensor
US7893924B2 (en) Data input device
CN1928801B (en) Position detection system using laser speckle
JP5950130B2 (en) Camera-type multi-touch interaction device, system and method
US7859519B2 (en) Human-machine interface
US4420261A (en) Optical position location apparatus
US7313255B2 (en) System and method for optically detecting a click event
US20160209948A1 (en) Human-machine interface
US20030226968A1 (en) Apparatus and method for inputting data
US20090160815A1 (en) Detection of an incident light distribution
CN109696192A (en) Optical bio gage probe with automatic gain and spectrum assignment
KR20040107479A (en) A touch pad, a stylus for use with the touch pad, and a method of operating the touch pad
KR20010051563A (en) Optical digitizer using curved mirror
JP2001175415A (en) Coordinate inputting/detecting device
JP6233941B1 (en) Non-contact type three-dimensional touch panel, non-contact type three-dimensional touch panel system, non-contact type three-dimensional touch panel control method, program, and recording medium
JPH05298016A (en) Input device for graphics
JP2593162B2 (en) Coordinate and external force detector
JP5517834B2 (en) Coordinate input device, control method therefor, and program
Fujieda et al. Detection of finger height for a multi-touch mouse

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AT AU BB BG BR BY CA CH CZ DE DK ES FI GB HU JP KP KR KZ LK LU MG MN MW NL NO NZ PL PT RO RU SD SE SK UA VN

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 1994903272

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1994903272

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

NENP Non-entry into the national phase

Ref country code: CA

WWW Wipo information: withdrawn in national office

Ref document number: 1994903272

Country of ref document: EP