US20010048519A1 - CMOS-Compatible three-dimensional image sensing using reduced peak energy - Google Patents

CMOS-Compatible three-dimensional image sensing using reduced peak energy Download PDF

Info

Publication number
US20010048519A1
US20010048519A1 US09/876,373 US87637301A
Authority
US
United States
Prior art keywords
signal
distance
homodyne
cos
high frequency
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
US09/876,373
Other versions
US6587186B2 (en
Inventor
Cyrus Bamji
Edoardo Charbon
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Microsoft Technology Licensing LLC
Original Assignee
Canesta Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canesta Inc filed Critical Canesta Inc
Assigned to CANESTA, INC. reassignment CANESTA, INC. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: BAMJI, CYRUS, CHARBON, EDOARDO
Priority to US09/876,373 priority Critical patent/US6587186B2/en
Publication of US20010048519A1 publication Critical patent/US20010048519A1/en
Priority to PCT/US2001/048219 priority patent/WO2002049339A2/en
Priority to JP2002550710A priority patent/JP4533582B2/en
Priority to EP01987386A priority patent/EP1356664A4/en
Priority to AU2002239608A priority patent/AU2002239608A1/en
Publication of US6587186B2 publication Critical patent/US6587186B2/en
Application granted granted Critical
Assigned to MICROSOFT CORPORATION reassignment MICROSOFT CORPORATION ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: CANESTA, INC.
Assigned to MICROSOFT TECHNOLOGY LICENSING, LLC reassignment MICROSOFT TECHNOLOGY LICENSING, LLC ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MICROSOFT CORPORATION
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S7/00Details of systems according to groups G01S13/00, G01S15/00, G01S17/00
    • G01S7/48Details of systems according to groups G01S13/00, G01S15/00, G01S17/00 of systems according to group G01S17/00
    • G01S7/491Details of non-pulse systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C3/00Measuring distances in line of sight; Optical rangefinders
    • G01C3/02Details
    • G01C3/04Adaptation of rangefinders for combination with telescopes or binoculars
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06Systems determining position data of a target
    • G01S17/08Systems determining position data of a target for measuring distance only
    • G01S17/32Systems determining position data of a target for measuring distance only using transmission of continuous waves, whether amplitude-, frequency-, or phase-modulated, or unmodulated
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/50Systems of measurement based on relative movement of target
    • G01S17/58Velocity or trajectory determination systems; Sense-of-movement determination systems
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/88Lidar systems specially adapted for specific applications
    • G01S17/89Lidar systems specially adapted for specific applications for mapping or imaging
    • G01S17/8943D imaging with simultaneous measurement of time-of-flight at a 2D array of receiver pixels, e.g. time-of-flight cameras or flash lidar
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S17/00Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/87Combinations of systems using electromagnetic waves other than radio waves

Definitions

  • the invention relates generally to range finder type image sensors, and more particularly to such sensors as may be implemented on a single integrated circuit using CMOS fabrication, and especially to reducing power consumption of systems utilizing such sensors.
  • Electronic circuits that provide a measure of distance from the circuit to an object are known in the art, and may be exemplified by system 10 of FIG. 1.
  • imaging circuitry within system 10 is used to approximate the distance (e.g., Z 1 , Z 2 , Z 3 ) to an object 20 , the top portion of which is shown more distant from system 10 than is the bottom portion.
  • system 10 will include a light source 30 whose light output is focused by a lens 40 and directed toward the object to be imaged, here object 20 .
  • Other prior art systems do not provide an active light source 30 and instead rely upon and indeed require ambient light reflected by the object of interest.
  • Various fractions of the light from source 30 may be reflected by surface portions of object 20 , and is focused by a lens 50 .
  • This return light falls upon various detector devices 60 , e.g., photodiodes or the like, in an array on an integrated circuit (IC) 70 .
  • Devices 60 produce a rendering of the luminosity of an object (e.g., 20 ) in the scene from which distance data is to be inferred.
  • devices 60 might be charge coupled devices (CCDs) or even arrays of CMOS devices.
  • CCDs typically are configured in a so-called bucket-brigade whereby charge detected by a first CCD is serially coupled to an adjacent CCD, whose output in turn is coupled to a third CCD, and so on.
  • This bucket-brigade configuration precludes fabricating processing circuitry on the same IC containing the CCD array.
  • CCDs provide a serial readout as opposed to a random readout. For example, if a CCD range finder system were used in a digital zoom lens application, even though most of the relevant data would be provided by a few of the CCDs in the array, it would nonetheless be necessary to read out the entire array to gain access to the relevant data, a time-consuming process. In still and some motion photography applications, CCD-based systems might still find utility.
  • the upper portion of object 20 is intentionally shown more distant than the lower portion, which is to say distance Z 3 >Z 2 >Z 1 .
  • the field of view is sufficiently small such that all objects in focus will be at substantially the same distance. But in general, luminosity-based systems do not work well.
  • the upper portion of object 20 is shown darker than the lower portion, and presumably is more distant than the lower portion.
  • circuits 80 , 90 , 100 within system 10 in FIG. 1 would assist in this signal processing.
  • IC 70 includes CCDs 60
  • other processing circuitry such as circuits 80 , 90 , 100 is formed off-chip.
  • IR autofocus systems for use in cameras or binoculars produce a single distance value that is an average or a minimum distance to all targets within the field of view.
  • Other camera autofocus systems often require mechanical focusing of the lens onto the subject to determine distance.
  • these prior art focus systems can focus a lens onto a single object in a field of view, but cannot simultaneously measure distance for all objects in the field of view.
  • Stereoscopic images allow a human observer to more accurately judge the distance of an object.
  • It is challenging for a computer program to judge object distance from a stereoscopic image. Errors are often present, and the required signal processing requires specialized hardware and computation.
  • Stereoscopic images are at best an indirect way to produce a three-dimensional image suitable for direct computer use.
  • scanning laser range finding systems raster scan an image by using mirrors to deflect a laser beam in the x-axis and perhaps the y-axis plane.
  • the angle of deflection of each mirror is used to determine the coordinate of an image pixel being sampled.
  • Such systems require precision detection of the angle of each mirror to determine which pixel is currently being sampled. Understandably, having to provide precision moving mechanical parts adds bulk, complexity, and cost to such range finding systems. Further, because these systems sample each pixel sequentially, the number of complete image frames that can be sampled per unit time is limited.
  • There is a need for a system that can produce direct three-dimensional imaging.
  • Such a system should be implementable on a single IC that includes both detectors and circuitry to process detection signals.
  • Such a single-IC system should be implementable using CMOS fabrication techniques, should require few discrete components, and should have no moving components.
  • The system should be able to output data from the detectors in a non-sequential or random fashion.
  • Such a system should require relatively low peak light-emitting power so that inexpensive light emitters may be employed.
  • the present invention provides such a system.
  • the present invention provides a system that measures distance and velocity data in real time using time-of-flight (TOF) data rather than relying upon luminosity data.
  • the system is CMOS-compatible and provides such three-dimensional imaging without requiring moving parts.
  • the system may be fabricated on a single IC containing both a two-dimensional array of CMOS-compatible pixel detectors that sense photon light energy, and associated processing circuitry.
  • In applicant's earlier invention, a microprocessor on the IC repeatedly triggered a light source, preferably an LED or laser, whose light output pulses were at least partially reflected by points on the surface of the object to be imaged.
  • A large but brief pulse of optical energy was required, for example a peak pulse energy of perhaps 10 W, a pulse width of about 15 ns, and a repetition rate of about 3 KHz. While average energy in applicant's earlier system was only about 1 mW, the desired 10 W peak power essentially dictated the use of relatively expensive laser diodes as the preferred light source.
  • Each pixel detector in the detector array had associated electronics to measure time-of-flight from transmission of an optical energy pulse to detection of a return signal. In that invention, the transmission of high peak power narrow energy pulses required the use of high bandwidth pixel detector amplifiers.
  • the present invention transmits periodic signals having a high frequency component, which signals have low average power and low peak power, e.g., tens of mW rather than watts.
  • Emitting low peak power periodic signals with a high frequency component such as sinusoidal optical signals permits using inexpensive light sources and simpler, narrower bandwidth pixel detectors. Bandwidths can be on the order of a few hundred KHz with an operating (emitted energy) frequency of about 200 MHz. Good resolution accuracy is still obtainable using a low peak power optical emitter in that the effective duty cycle is greater than the output from a narrow-pulsed optical emitter of higher peak power.
  • phase shift θ due to time-of-flight is: θ = 2·ω·z/C = 2·(2·π·f)·z/C, where z is the distance to a point on the target object and C is the speed of light.
  • several different modulation frequencies of optically emitted energy may be used, e.g., f 1 , f 2 , f 3 . . . , to determine z modulo C/(2·f 1 ), C/(2·f 2 ), C/(2·f 3 ).
  • the use of multiple different modulation frequencies advantageously can reduce aliasing. If f 1 , f 2 , f 3 are integers, aliasing is reduced to the least common multiple of f 1 , f 2 , f 3 , denoted LCM(f 1 , f 2 , f 3 ).
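Under the modulo relation above, a single modulation tone leaves z ambiguous beyond C/(2·f), and a second frequency can resolve the ambiguity. The sketch below assumes the standard round-trip relation θ = 2·(2·π·f)·z/C; the specific frequencies, target distance, and search depth are illustrative choices, not figures from the patent.

```python
import math

C = 3.0e8  # speed of light, m/s

def phase_from_distance(z, f):
    """Round-trip phase shift theta = 2*(2*pi*f)*z/C, wrapped to [0, 2*pi)."""
    return (4 * math.pi * f * z / C) % (2 * math.pi)

def distance_from_phase(theta, f):
    """Invert the relation above; the result is z modulo C/(2*f)."""
    return theta * C / (4 * math.pi * f)

def unambiguous_range(f):
    return C / (2 * f)

def phase_err(a, b):
    """Smallest angular difference between two wrapped phases."""
    d = abs(a - b) % (2 * math.pi)
    return min(d, 2 * math.pi - d)

# A single 200 MHz tone is ambiguous beyond C/(2*f) = 0.75 m.
f1, f2 = 200e6, 210e6          # illustrative pair of modulation frequencies
z_true = 1.30                   # metres; beyond the single-frequency range
t1 = phase_from_distance(z_true, f1)
t2 = phase_from_distance(z_true, f2)

# Brute-force disambiguation: of the distances consistent with the f1 phase,
# keep the one whose predicted f2 phase best matches the f2 measurement.
candidates = [distance_from_phase(t1, f1) + n * unambiguous_range(f1)
              for n in range(10)]
z_est = min(candidates, key=lambda z: phase_err(phase_from_distance(z, f2), t2))
```

With these numbers the f1 measurement alone would report only 0.55 m (1.30 m modulo 0.75 m); the second frequency recovers the full distance.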
  • the mixing product S 1 ·S 2 will be 0.5·A·{cos(2·ω·t+θ)+cos(θ)} and will have an average value of 0.5·A·cos(θ).
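The averaging step above can be checked numerically: the product of the reference cos(ω·t) and the return A·cos(ω·t+θ), averaged over whole periods, leaves 0.5·A·cos(θ), with the double-frequency term cancelling. The amplitude, phase, and sample counts below are arbitrary illustrative values.

```python
import math

A, theta = 0.7, 1.1          # arbitrary attenuation and TOF phase shift
f = 200e6                    # 200 MHz modulation, as in the text
w = 2 * math.pi * f
N = 100_000                  # samples
T = 50 / f                   # average over exactly 50 full periods

# Time-average of S1*S2 = cos(w t) * A*cos(w t + theta)
avg = sum(
    math.cos(w * t) * A * math.cos(w * t + theta)
    for t in (i * T / N for i in range(N))
) / N
```

The computed average matches 0.5·A·cos(θ) to numerical precision because the cos(2·ω·t+θ) term sums to zero over an integer number of periods.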
  • the amplitude or brightness A of the detected return signal may be measured separately from each pixel detector output.
  • each pixel detector in the detector array has its own dedicated electronics that includes a low noise amplifier to amplify the signal detected by the associated pixel detector, a variable phase delay unit, a mixer, a lowpass filter, and an integrator.
  • the mixer mixes the output of low noise amplifier with a variable phase delay version of the transmitted sinusoidal signal.
  • the mixer output is lowpass filtered, integrated and fedback to control phase shift of the variable phase delay unit.
  • the analog phase information is readily digitized, and an on-chip microprocessor can then calculate z-values from each pixel detector to an associated point on the target object.
  • the microprocessor further can calculate dz/dt (and/or dx/dt, dy/dt) and other information if desired.
  • the on-chip measurement information may be output in random rather than sequential order, and object tracking and other measurements requiring a three-dimensional image are readily made.
  • the overall system is small, robust and requires relatively few off-chip discrete components.
  • On-chip circuitry can use such TOF data to readily simultaneously measure distance and velocity of all points on an object or all objects in a scene.
  • FIG. 1 is a diagram showing a generic luminosity-based range finding system, according to the prior art
  • FIG. 2A depicts a transmitted periodic signal with high frequency components transmitted by the present invention, here an ideal cosine waveform
  • FIG. 2B depicts the return waveform with phase-delay for the transmitted signal of FIG. 2A, as used by the present invention
  • FIG. 3 is a block diagram of a preferred implementation of the present invention.
  • FIG. 4 is block diagram showing an individual pixel detector with associated electronics, according to the present invention.
  • the present invention advantageously transmits and detects optical energy that is periodic with a high frequency component, and relies upon phase shift between transmitted and detected waveforms to discern time-of-flight and thus z-distance data.
  • pulsed-type periodic waveforms may be used, the present invention will be described with respect to the emission and detection of sinusoidal waveforms, as such waveforms are rather easily analyzed mathematically.
  • periodic pulsed waveforms with a high frequency component, including imperfect sinusoidal waveforms, are representable mathematically as groupings of perfect sinusoidal waveforms of varying coefficients and frequency multiples.
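The decomposition claim above can be checked with a small discrete Fourier transform: a 50% duty-cycle square wave resolves into a dominant fundamental plus odd harmonics with shrinking coefficients. The sample count and normalisation here are illustrative choices.

```python
import cmath
import math

N = 1024
# 50% duty-cycle square wave, values +1 / -1
square = [1.0 if i < N // 2 else -1.0 for i in range(N)]

def dft_mag(x, k):
    """Magnitude of the k-th DFT bin, normalised so a unit cosine gives 1.0."""
    s = sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
    return abs(s) * 2 / N

fundamental = dft_mag(square, 1)   # close to 4/pi
second      = dft_mag(square, 2)   # even harmonics vanish for 50% duty
third       = dft_mag(square, 3)   # close to 4/(3*pi)
```

The results track the textbook Fourier series of a square wave, (4/π)·(sin ωt + sin 3ωt/3 + ...), which is why a square-wave emitter behaves to first order like the idealized sinusoid analyzed in the text.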
  • the transmission and detection of such waveforms can advantageously permit use of relatively inexpensive low peak-power optical emitters, and the use of relatively lower bandwidth amplifiers. This is in contrast to applicant's referenced U.S. Pat. No. ______ (2001) in which a low duty cycle pulse train of narrow pulse widths was emitted by a very high peak power optical emitter.
  • FIG. 2A depicts the high frequency component of an exemplary idealized periodic optical energy signal as emitted by the present invention, here a signal represented as cos(ω·t).
  • the signal is depicted as though it were AC-coupled in that any magnitude offset is not present.
  • the operative frequency of the transmitted signal preferably is in the few hundred MHz range, and the average and the peak transmitted power may be relatively modest, e.g., less than about 50 mW or so.
  • FIG. 2B depicts the returned version of the transmitted waveform, denoted A·cos(ω·t+θ), where A is an attenuation coefficient and θ is a phase shift resulting from the time-of-flight (TOF) of the energy in traversing the distance from the present invention to the target object.
  • Specifying a repetition rate of the transmitted periodic optical energy signal involves tradeoffs that include considerations of the transmitted waveshape and duty cycle, the desired granularity in resolving z-distance, and peak power requirements for the optical energy emitter.
  • a transmitted periodic signal whose high frequency component is a few hundred MHz, e.g., 200 MHz, will be consistent with z-distance resolution on the order of a few cm or so, assuming eight-bit analog-to-digital conversion of the detected phase-shift information.
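The 200 MHz figure above implies a back-of-envelope range budget, sketched below under the idealised assumption that the digitised phase maps linearly onto the unambiguous range; the per-LSB step is a best case, and the quoted "few cm" leaves margin for noise and nonlinearity.

```python
# Idealised quantisation bound for the figures in the text (assumed model).
C = 3.0e8   # speed of light, m/s
f = 200e6   # modulation frequency from the text

z_max = C / (2 * f)        # unambiguous range: 0.75 m
lsb_8bit = z_max / 2**8    # ideal 8-bit quantisation step, ~2.9 mm
```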
  • If the duty cycle of the high frequency component of the transmitted periodic signal is about 50%, e.g., an idealized sinusoidal waveform or the equivalent, the peak power required from the optical energy emitter will be about 10 mW.
  • If the duty cycle were instead reduced to about 1%, the optical energy emitter peak power would have to be increased to about 500 mW, and so on. It will be appreciated that the ability to use a low peak power optical emitter is one of the distinguishing factors between the present invention and applicant's above-referenced U.S. patent.
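The peak-power figures above follow from holding average optical power fixed while shrinking the duty cycle: peak power scales as average power divided by duty cycle. The 5 mW average used below is an illustrative assumption chosen to reproduce the text's 10 mW / 500 mW endpoints, not a figure stated in the patent.

```python
# Peak vs. average optical power for a fixed energy budget (illustrative).
def peak_power_w(avg_power_w, duty_cycle):
    """Peak power needed to deliver avg_power_w at the given duty cycle."""
    return avg_power_w / duty_cycle

avg_w = 5e-3                           # assumed ~5 mW average emitted power
p_50pct = peak_power_w(avg_w, 0.50)    # ~50% duty (sinusoid-like): 10 mW peak
p_1pct  = peak_power_w(avg_w, 0.01)    # 1% duty (narrow pulses): 500 mW peak
```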
  • FIG. 3 is a block diagram depicting the present invention 200 , a three-dimensional imaging system that preferably is fabricated on a single IC 210 .
  • System 200 requires no moving parts and relatively few off-chip components.
  • System 200 includes an optical emitter, for example a low power laser diode, or low power LED, that can output a periodic signal with 50 mW or so peak power when driven with a repetition rate of a few hundred MHz and, in the preferred embodiment, a duty cycle close to 50%.
  • optical emitters are made from materials such as GaAlAs, whose bandgap energies are quite different from that of silicon, from which CMOS IC 210 is preferably fabricated.
  • FIG. 3 depicts optical emitter 220 as being off-chip 210
  • the phantom lines surrounding emitter 220 denote that an optical emitter 220 made of CMOS-compatible materials could indeed be fabricated on IC 210 .
  • Light source 220 is preferably a low peak power LED or a laser that emits energy with a wavelength of perhaps 800 nm, although other wavelengths could instead be used. Below 800 nm wavelength, emitted light starts to become visible and laser fabrication becomes more difficult. Above 900 nm CMOS/silicon photodiode efficiency drops off rapidly, and in any event, 1100 nm is the upper wavelength for a device fabricated on a silicon substrate, such as IC 210 . By using emitted light having a specific wavelength, and by filtering out incoming light of different wavelength, system 200 can operate with or without ambient light. The ability of system 200 to function in the dark can be advantageous in certain security and military type imaging applications.
  • Off-chip mounted lens 290 preferably focuses filtered incoming light energy onto sensor array 230 such that each pixel detector 240 x receives light from only one particular point (e.g., an object surface point) in the field of view.
  • the properties of light wave propagation allow an ordinary lens 290 to be used to focus the light onto the sensor array. If a lens ( 290 ′) is required to focus the optical light energy transmitted from emitter 220 , a single lens could be used for 290 , 290 ′ if a mirror-type arrangement were used.
  • Typical LED or laser diode emitters 220 have a shunt capacitance of perhaps 100 pF.
  • An inductance (perhaps a few nH) may be placed in parallel with this capacitance, where the combined inductance-capacitance resonates at the periodic frequency of the emitter, typically a few hundred MHz.
  • Alternatively, an inductance (again a few nH) can be series-coupled to the emitter and its parasitic capacitance. If desired, such inductance can be derived using a bonding wire to the emitter.
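The "few nH" figure above is consistent with resonating the emitter's roughly 100 pF shunt capacitance at the roughly 200 MHz modulation frequency, using the standard resonance relation f = 1/(2·π·√(L·C)); the helper name below is illustrative.

```python
import math

def resonant_inductance_h(f_hz, c_farads):
    """Inductance (henries) that resonates with c_farads at f_hz:
    solves f = 1 / (2*pi*sqrt(L*C)) for L."""
    return 1.0 / ((2 * math.pi * f_hz) ** 2 * c_farads)

# ~100 pF parasitic capacitance resonated at 200 MHz -> about 6.3 nH,
# in line with the text's "a few nH".
L_nh = resonant_inductance_h(200e6, 100e-12) * 1e9
```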
  • CMOS-compatible IC 210 will preferably have fabricated thereon oscillator driver 225 , array 230 (comprising perhaps 100×100 (or more) pixel detectors 240 and 100×100 (or more) associated electronic processing circuits 250 ), microprocessor or microcontroller unit 260 , memory 270 (which preferably includes random access memory or RAM and read-only memory or ROM), and various computing and input/output (I/O) circuitry 280 , including, for example, an analog/digital (A/D) conversion unit providing 10-bit A/D conversions of phase information θ detected by the various pixel detectors in array 230 .
  • I/O circuit 280 preferably can also provide a signal to control frequency of the oscillator 225 that drives the energy emitter 220 .
  • the DATA output line shown in FIG. 3 represents any or all information that is calculated by the present invention using phase-shift information from the various pixel detectors 240 in array 230 .
  • microprocessor 260 can examine consecutive frames stored in RAM 270 to identify objects in the field of view scene. Microprocessor 260 can then compute z-distance and can compute object velocity dz/dt, dx/dt, dy/dt. Further, microprocessor 260 and associated on-chip circuitry can be programmed to recognize desired image shapes, for example a user's fingers in an application using system 200 to detect user interaction with a virtual input device. The data provided by microprocessor 260 could be reduced to keystroke information in such an application.
  • any or all of this data can be exported from the IC to an external computer for further processing, for example via a universal serial bus. If microprocessor 260 has sufficient computational power, additional on-chip processing may occur as well. Note too that output from the array of CMOS-compatible detectors 240 may be accessed in a random manner if desired, which permits outputting TOF DATA in any order.
  • Although a sinusoid or cosine waveform is assumed for ease of mathematical representation, a periodic waveform with similar duty cycle, repetition rate, and peak power may instead be used, e.g., square waves.
  • average and peak power is advantageously quite modest in the present invention, for example 10 mW.
  • the cost of optical emitter 220 is perhaps thirty cents, compared to a cost of many dollars for the high peak power laser diode in applicant's earlier invention, described in U.S. Pat. No. ______ (2001).
  • the incoming optical energy detected by different pixel detectors 240 can have different phase θ since different times-of-flight or distances z are involved. As will be described, it is the function of electronics 250 associated with each pixel detector 240 in array 230 to examine and determine the relative phase delay, in cooperation with microprocessor 260 and software stored in memory 270 executed by the microprocessor. In an application where system 200 images a data input mechanism, perhaps a virtual keyboard, microprocessor 260 may process detection data sufficient to identify which of several virtual keys or regions on the virtual device have been touched by a user's finger or stylus.
  • the DATA output from system 200 can include a variety of information, including without limitation distance z, velocity dz/dt (and/or dx/dt, dy/dt) of object 20 , and object identification, e.g., identification of a virtual key contacted by a user's hand or stylus.
  • controller unit 260 may perform z distance to object and object velocity (dz/dt, dy/dt, dx/dt) calculations.
  • the DATA output line shown in FIG. 3 represents any or all such information that is calculated by the present invention using phase-shift information from the various pixel detectors 240 .
  • the two-dimensional array 230 of pixel sensing detectors is fabricated using standard commercial silicon technology. This advantageously permits fabricating a single IC 210 that includes the various pixel detectors 240 and their associated circuits 250 , as well as circuits 225 , 260 , 270 , 280 , and preferably the energy emitter 220 as well. Understandably, the ability to fabricate such circuits and components on the same IC with the array of pixel detectors can shorten processing and delay times, due to shorter signal paths. In FIG. 3, while system 200 may include focusing lens 290 and/or 290 ′, it is understood that these lenses will be fabricated off IC chip 210 .
  • Each pixel detector 240 is equivalent to a parallel combination of a current source, an ideal diode, shunt impedance, and noise current source, and will output a current proportional to the amount of incoming photon light energy falling upon it.
  • CMOS fabrication is used to implement the array of CMOS pixel diodes or photogate detector devices.
  • photodiodes may be fabricated using a diffusion-to-well, or a well-to-substrate junction.
  • Well-to-substrate photodiodes are more sensitive to infrared (IR) light, exhibit less capacitance, and are thus preferred.
  • FIG. 4 shows a portion of IC 210 and of array 230 , and depicts pixel detectors 240 - 1 through 240 -x, and each diode's associated exemplary electronics 250 - 1 through 250 -x.
  • pixel diodes 240 and two associated electronic circuits 250 are depicted, however an actual array will include hundreds or thousands or more of such pixel detectors and associated electronic circuits.
  • A dedicated A/D converter could be provided as part of each electronics circuit 250 - 1 through 250 -x, as opposed to implementing an omnibus A/D function on IC chip 210 .
  • periodic emissions from optical source 220 are sinusoidal or sinusoidal-like with a high frequency component of a few hundred MHz.
  • It suffices for amplifier 300 to have a bandwidth of perhaps 100 KHz or so, perhaps as low as tens of KHz, because all of the frequencies of interest are themselves close to this modulation frequency. It will be appreciated that providing hundreds or thousands of low noise, relatively low bandwidth amplifiers 300 on IC 210 is an easier and more economical undertaking than providing high bandwidth amplifiers able to pass narrow pulses, as in applicant's parent invention.
  • array 230 can function with relatively small bandwidth amplifiers 300 as the output from each amplifier 300 is coupled directly to a first input of an associated mixer 310 that receives as a second input a signal of like frequency as that present at the first input. It is noted that if each amplifier 300 and its associated mixer 310 were implemented as a single unit, it could suffice for the overall unit to have a bandwidth on the order of tens of KHz, and a high frequency response also on the order of tens of KHz.
  • Each circuit 250 -x couples the output of the associated low noise amplifier 300 to the first input of a mixer 310 .
  • mixer 310 may be implemented in many ways, for example using Gilbert cells, digital multipliers, etc.
  • each mixer will homodyne the amplified detected output signal S 2 from an associated pixel detector 240 with a generator 225 signal S 1 .
  • the mixer output product S 1 ·S 2 will be 0.5·A·{cos(2·ω·t+θ)+cos(θ)} and will have an average value of 0.5·A·cos(θ).
  • the amplitude or brightness A of the detected return signal may be measured separately from each pixel detector output. In practice, a ten-bit analog-to-digital resolution of A·cos(θ) will result in about 1 mm resolution for z-measurements.
  • Each multiplier 310 will have a second input that is coupled to the output of a variable phase delay (VPD) unit 320 .
  • VPD units 320 may be implemented in many ways, for example using a series-coupled string of inverters whose operating power supply voltage is varied to speed up or slow down the ability of each inverter to pass a signal.
  • VPD 320 adds a variable time delay ψ to the cos(ω·t) signal derived from generator 225 .
  • Mixer 310 then mixes the amplified cos(ω·t+θ) signal output by amplifier 300 with the cos(ω·t+ψ) signal output by VPD 320 .
  • Mixer 310 now outputs signals including 0.5·A·{cos(θ−ψ)+cos(2·ω·t+θ+ψ)}.
  • the output of mixer 310 is coupled to the input of a low pass filter 340 that preferably has a bandwidth of 100 Hz or so to a few KHz or so, such that the output from filter 340 will be a low frequency signal proportional to 0.5·A·cos(θ−ψ).
  • This low frequency signal is now input to an integrator 330 whose output will be ψ x for pixel detector 240 x .
  • phase shift θ due to time-of-flight may be given by: θ = 2·ω·z/C = 2·(2·π·f)·z/C, where z is the distance to the target object and C is the speed of light.
  • microprocessor 260 can command generator 225 to output sinusoidal drive signals of chosen frequencies, e.g., f 1 , f 2 , f 3 , etc.
  • Distance z can then be determined modulo LCM(a 1 , a 2 , a 3 )/D.
  • the output signal from each lowpass filter 340 will be, ideally, null. For example, should the output signal from a lowpass filter 340 go positive, then the output signal from the associated integrator 330 will add more phase shift to drive the lowpass filter output back towards a null state.
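The nulling behaviour described above can be sketched as a toy discrete-time loop: the lowpass output, proportional to cos(θ−ψ), feeds an integrator whose output advances the VPD phase ψ until the product nulls at θ−ψ = −π/2, after which the TOF phase is recovered from the settled ψ. The gain and iteration count are illustrative choices, not patent values.

```python
import math

A, theta = 1.0, 0.8     # return amplitude and TOF phase shift (the unknowns)
psi, gain = 0.0, 0.05   # VPD phase and integrator gain (illustrative)

for _ in range(5000):
    # Lowpass-filtered mixer output: 0.5*A*cos(theta - psi)
    error = 0.5 * A * math.cos(theta - psi)
    # Integrator: a positive filter output adds more phase shift,
    # driving the filter output back towards null.
    psi += gain * error

# The loop settles at the stable null theta - psi = -pi/2,
# so the TOF phase is recovered as psi - pi/2.
theta_recovered = psi - math.pi / 2
```

This is the delay-lock-loop analogy the text draws: the feedback continuously re-nulls the homodyne product, so ψ tracks θ even as the target distance changes.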
  • the phase angles are preferably converted from analog format to digital format, for example using an analog/digital converter function associated with electronics 280 .
  • This improved accuracy is in contrast to prior art systems that attempt to mix signals having a constant phase value for all pixels.
  • prior art approaches are relatively simple to implement, and the present invention could also mix signals with a constant phase, if desired.
  • microprocessor 260 can then execute software, e.g., stored or storable in memory 270, to calculate z-distances (and/or other information) using the above mathematical relationships. If desired, microprocessor 260 can also command generator 225 to output discrete frequencies, e.g., f1, f2, f3 . . . to improve system performance by reducing or even eliminating aliasing errors.
  • The configuration of FIG. 4 is analogous to a delay lock loop, as the homodyne output signal is used to vary phase (ψ) between optical emitter 220 and the signal input to mixer 310 along with the sensed return signal output from amplifier 300.
  • the output signal from integrator 330 will be proportional to ψ.
  • delay lock loop configurations such as shown in FIG. 4 can exhibit low noise peaks, especially at frequencies near the fundamental frequency of interest.
  • the delay lock loop configuration of FIG. 4 is analogous to a zero-intermediate frequency (IF) system.
  • in-band spectrum aliasing and out-of-band noise may be attenuated using less critical and more cost-effective components and circuits, for example switched capacitor filters.
  • low-frequency 1/f type noise appears to be more greatly attenuated in-band in low-IF systems since there is no direct mixing with the carrier frequency.
  • After low pass filtering, low IF systems return a signal proportional to cos(ωc·t+Φ), where ωc is the IF frequency, i.e., the difference between the optical modulation frequency (e.g., the signal emitted from 220) and the frequency with which it is mixed.
  • This signal can be further mixed with a signal from a local oscillator having frequency ωc.
  • homodyning in the present invention can be implemented as a two-step process.
  • First the detected return signal S 2 is multiplied with a signal having not exactly the same frequency but a frequency synchronized with the original frequency of the emitted signal S 1 .
  • ωc is the difference between the two frequencies.
  • This intermediate difference frequency can then be homodyned again with ωc to return ψ at the output of each integrator 330.
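The two-step homodyne described above rests on the product-to-sum identity cos(a)·cos(b)=0.5·{cos(a−b)+cos(a+b)}. The following check uses invented amplitude, phase and IF values:

```python
import math

A, PHI = 0.8, 0.6                                 # invented values
w, wc = 2 * math.pi * 200e6, 2 * math.pi * 10e3   # carrier and assumed IF

for t in (1.3e-9, 2.7e-8, 5.5e-7):
    # Step 1: mix the detected signal with a synchronized, offset frequency.
    step1 = A * math.cos(w * t + PHI) * math.cos((w - wc) * t)
    low_if = 0.5 * A * math.cos(wc * t + PHI)            # kept by lowpass
    high = 0.5 * A * math.cos((2 * w - wc) * t + PHI)    # filtered out
    assert abs(step1 - (low_if + high)) < 1e-6
    # Step 2: homodyne the intermediate-frequency term against cos(wc*t).
    step2 = low_if * math.cos(wc * t)
    dc = 0.25 * A * math.cos(PHI)                        # kept by lowpass
    ripple = 0.25 * A * math.cos(2 * wc * t + PHI)
    assert abs(step2 - (dc + ripple)) < 1e-6
print("DC term encodes PHI:", dc)
```

After both lowpass stages the surviving DC term is proportional to cos(Φ), so the phase information survives the detour through the intermediate frequency.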
  • While z-distance is determined from TOF information acquired from phase delay Φ, it is noted that the relative brightness of the signals returned from target object 20 can also provide useful information.
  • the amplitude coefficient “A” on the return signal is a measure of relative brightness.
  • With slight alteration, a maximum (rather than null) lowpass filter output signal could instead be used, in which case the maximum lowpass filter output signal would represent brightness coefficient A.
  • Such a configuration could be implemented using a signal 90° out-of-phase with the output from VPD 320 to modulate another copy of the output of the low noise amplifier 300 .
  • the average amplitude of the thus-modulated signal would be proportional to coefficient A in the incoming detected return signal.
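One way to see why a 90° out-of-phase copy recovers brightness A is the standard in-phase/quadrature argument. The sketch below is an idealized illustration with invented values, not the circuit of the disclosure:

```python
import math

A, PHI = 0.6, 1.2                  # invented brightness and TOF phase
w = 2 * math.pi * 200e6            # assumed modulation frequency
N, T = 200000, 1000 / 200e6        # average over 1000 whole periods

def dc_of_product(ref_phase):
    """DC value of the detected signal mixed with a shifted reference."""
    return sum(A * math.cos(w * (k * T / N) + PHI)
               * math.cos(w * (k * T / N) + ref_phase)
               for k in range(N)) / N

i = dc_of_product(0.0)             # in-phase mix   ->  0.5*A*cos(PHI)
q = dc_of_product(-math.pi / 2)    # quadrature mix -> -0.5*A*sin(PHI)
brightness = 2 * math.sqrt(i * i + q * q)
print(brightness)                  # recovers A regardless of PHI
```

Because cos² + sin² = 1, the combined magnitude of the in-phase and quadrature averages yields A independently of the phase delay.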
  • Movement of objects within a detected image contour can be computed, e.g., by microprocessor 260 , by identifying contour movements between frames of acquired data.
  • the pixel detectors within the contour can all be assigned a uniform velocity that is the velocity of the contour. Since objects can be identified using their contours, one can track objects of interest using the on-chip processor 260.
  • IC chip 210 can export a single value (DATA) that can represent change in location of the entire object 20 whenever it has moved.
  • a single vector representing the change in location of the object of interest may instead be sent. So doing results in a substantial reduction in IC chip input/output and can greatly reduce off-chip data processing requirements.
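The single-vector idea can be illustrated with a toy computation; the contour coordinates below are invented for illustration:

```python
# Toy illustration: reduce whole-object motion between two frames to a
# single vector. The contour coordinates are invented.
frame1 = [(10, 5), (12, 5), (12, 8), (10, 8)]    # contour in frame k
frame2 = [(13, 7), (15, 7), (15, 10), (13, 10)]  # same contour in frame k+1

def centroid(points):
    n = len(points)
    return (sum(x for x, _ in points) / n, sum(y for _, y in points) / n)

c1, c2 = centroid(frame1), centroid(frame2)
motion = (c2[0] - c1[0], c2[1] - c1[1])
print(motion)    # one (dx, dy) vector exported instead of per-pixel data
```

Exporting only this vector, rather than every pixel's z-value, is what reduces the IC's input/output traffic and off-chip processing load.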
  • system 200 may be called upon to recognize an object that is a virtual input device, for example a keyboard whose virtual keys are “pressed” by a user's fingers.
  • a three-dimensional range-finding TOF system is used to implement virtual input devices. As a user's hand or stylus “presses” a virtual key or region on such device, the system using TOF measurements can determine which key or region is being “pressed”.
  • the system can then output the equivalent of key stroke information to a companion device, for example a PDA that is to receive input data from the interaction of a user with the virtual input device.
  • the present invention may be used in such application, in which case DATA in FIG. 3 could represent keystroke identification information that has been processed on-chip by microprocessor 260 .
  • microprocessor 260 executing software perhaps associated with memory 270 can control modulation of generator 225 and detection by the various electronic circuits 250 .
  • detection signals may be processed using special image processing software. Since system 200 preferably can be battery operated due to its low power consumption, when such software determines that sufficient image resolution is attained, operating power may be terminated selectively to various portions of array 230. Further, if sufficient photon energy reaches array 230 to ensure adequate detection, the shape of signals output by emitter 220 could be changed. For example, the peak power and/or duty cycle of the emitter energy could be reduced, thus reducing overall power consumption by system 200.
  • the design tradeoffs in changing the shape of the optical energy output signal involve considerations of z-resolution accuracy, user safety, and power handling capacity of emitter 220 .
  • the overall system advantageously can be operated from a small battery in that peak and average power from optical emitter 220 is preferably in the tens of mW range. Nonetheless distance resolution is in the cm range, and signal/noise ratios are acceptable.

Abstract

A three-dimensional time-of-flight (TOF) system includes a low power optical emitter whose idealized output S1=cos(ω·t) is reflected by a target distance z away, and is detected by a two-dimensional array of pixel detectors and associated narrow bandwidth detector electronics and associated processing circuitry preferably fabricated on a common CMOS IC. The idealized detected reflected energy S2=A·cos(ω·t+Φ), where phase shift Φ is proportional to TOF or z, and z=Φ·C/(2·ω)=Φ·C/{2·(2·π·f)} and is known modulo 2πC/(2·ω)=C/(2·f). Phase Φ, distance z, and other information, is determined by homodyne-mixing S2 with the drive signal to the optical emitter. The idealized mixer output per each pixel detector is 0.5·A·{cos(2·ω·t+Φ)+cos(Φ)} and has average value 0.5·A·cos(Φ). The overall system is small, robust and requires relatively few off-chip discrete components. On-chip circuitry can use TOF data to simultaneously measure distance, object point velocity, object contours, including user interface with virtual input devices.

Description

    RELATION TO PREVIOUSLY FILED APPLICATIONS
  • Priority is claimed from applicant's co-pending U.S. provisional patent application serial No. 60/209,948 filed on Jun. 6, 2000 entitled “3D Imaging Using Multiple Pixel Phase Detection on a CMOS Chip”. Applicant also refers to and incorporates by reference herein U.S. utility application Ser. No. 09/401,059 filed Sep. 22, 1999 entitled “CMOS-Compatible Three-Dimensional Image Sensor IC”, now U.S. Pat. No. ______ (2001).[0001]
  • FIELD OF THE INVENTION
  • The invention relates generally to range finder type image sensors, and more particularly to such sensors as may be implemented on a single integrated circuit using CMOS fabrication, and especially to reducing power consumption of systems utilizing such sensors. [0002]
  • BACKGROUND OF THE INVENTION
  • Electronic circuits that provide a measure of distance from the circuit to an object are known in the art, and may be exemplified by system 10 of FIG. 1. [0003] In the generalized system of FIG. 1, imaging circuitry within system 10 is used to approximate the distance (e.g., Z1, Z2, Z3) to an object 20, the top portion of which is shown more distant from system 10 than is the bottom portion. Typically system 10 will include a light source 30 whose light output is focused by a lens 40 and directed toward the object to be imaged, here object 20. Other prior art systems do not provide an active light source 30 and instead rely upon and indeed require ambient light reflected by the object of interest.
  • Various fractions of the light from source 30 may be reflected by surface portions of object 20, and focused by a lens 50. [0004] This return light falls upon various detector devices 60, e.g., photodiodes or the like, in an array on an integrated circuit (IC) 70. Devices 60 produce a rendering of the luminosity of an object (e.g., 20) in the scene from which distance data is to be inferred. In some applications devices 60 might be charge coupled devices (CCDs) or even arrays of CMOS devices.
  • CCDs typically are configured in a so-called bucket-brigade whereby light-detected charge by a first CCD is serial-coupled to an adjacent CCD, whose output in turn is coupled to a third CCD, and so on. This bucket-brigade configuration precludes fabricating processing circuitry on the same IC containing the CCD array. Further, CCDs provide a serial readout as opposed to a random readout. For example, if a CCD range finder system were used in a digital zoom lens application, even though most of the relevant data would be provided by a few of the CCDs in the array, it would nonetheless be necessary to readout the entire array to gain access to the relevant data, a time consuming process. In still and some motion photography applications, CCD-based systems might still find utility. [0005]
  • As noted, the upper portion of object 20 is intentionally shown more distant than the lower portion, which is to say distance Z3>Z2>Z1. [0006] In a range finder autofocus camera environment, one might try to have devices 60 approximate average distance from the camera (e.g., from Z=0) to object 20 by examining relative luminosity data obtained from the object. In some applications, e.g., range finding binoculars, the field of view is sufficiently small such that all objects in focus will be at substantially the same distance. But in general, luminosity-based systems do not work well. For example, in FIG. 1, the upper portion of object 20 is shown darker than the lower portion, and presumably is more distant than the lower portion. But in the real world, the more distant portion of an object could instead be shinier or brighter (e.g., reflect more optical energy) than a closer but darker portion of an object. In a complicated scene, it can be very difficult to approximate the focal distance to an object or subject standing against a background using change in luminosity to distinguish the subject from the background. In such applications, circuits 80, 90, 100 within system 10 in FIG. 1 would assist in this signal processing. As noted, if IC 70 includes CCDs 60, other processing circuitry such as 80, 90, 100 is formed off-chip.
  • Unfortunately, reflected luminosity data does not provide a truly accurate rendering of distance because the reflectivity of the object is unknown. Thus, a distant object surface with a shiny surface may reflect as much light (perhaps more) than a closer object surface with a dull finish. [0007]
  • Other focusing systems are known in the art. Infrared (IR) autofocus systems for use in cameras or binoculars produce a single distance value that is an average or a minimum distance to all targets within the field of view. Other camera autofocus systems often require mechanical focusing of the lens onto the subject to determine distance. At best these prior art focus systems can focus a lens onto a single object in a field of view, but cannot simultaneously measure distance for all objects in the field of view. [0008]
  • In general, a reproduction or approximation of original luminosity values in a scene permits the human visual system to understand what objects were present in the scene and to estimate their relative locations stereoscopically. For non-stereoscopic images such as those rendered on an ordinary television screen, the human brain assesses apparent size, distance and shape of objects using past experience. Specialized computer programs can approximate object distance under special conditions. [0009]
  • Stereoscopic images allow a human observer to more accurately judge the distance of an object. However it is challenging for a computer program to judge object distance from a stereoscopic image. Errors are often present, and the required signal processing requires specialized hardware and computation. Stereoscopic images are at best an indirect way to produce a three-dimensional image suitable for direct computer use. [0010]
  • Many applications require directly obtaining a three-dimensional rendering of a scene. But in practice it is difficult to accurately extract distance and velocity data along a viewing axis from luminosity measurements. Nonetheless many applications require accurate distance and velocity tracking, for example an assembly line welding robot that must determine the precise distance and speed of the object to be welded. The necessary distance measurements may be erroneous due to varying lighting conditions and other shortcomings noted above. Such applications would benefit from a system that could directly capture three-dimensional imagery. [0011]
  • Although specialized three dimensional imaging systems exist in the nuclear magnetic resonance and scanning laser tomography fields, such systems require substantial equipment expenditures. Further, these systems are obtrusive, and are dedicated to specific tasks, e.g., imaging internal body organs. [0012]
  • In other applications, scanning laser range finding systems raster scan an image by using mirrors to deflect a laser beam in the x-axis and perhaps the y-axis plane. The angle of deflection of each mirror is used to determine the coordinate of an image pixel being sampled. Such systems require precision detection of the angle of each mirror to determine which pixel is currently being sampled. Understandably, having to provide precision moving mechanical parts adds bulk, complexity, and cost to such range finding systems. Further, because these systems sample each pixel sequentially, the number of complete image frames that can be sampled per unit time is limited. [0013]
  • In summation, there is a need for a system that can produce direct three-dimensional imaging. Preferably such system should be implementable on a single IC that includes both detectors and circuitry to process detection signals. Such single IC system should be implementable using CMOS fabrication techniques, should require few discrete components and have no moving components. Optionally, the system should be able to output data from the detectors in a non-sequential or random fashion. Very preferably, such system should require relatively low peak light emitting power such that inexpensive light emitters may be employed. [0014]
  • The present invention provides such a system. [0015]
  • SUMMARY OF THE PRESENT INVENTION
  • The present invention provides a system that measures distance and velocity data in real time using time-of-flight (TOF) data rather than relying upon luminosity data. The system is CMOS-compatible and provides such three-dimensional imaging without requiring moving parts. The system may be fabricated on a single IC containing both a two-dimensional array of CMOS-compatible pixel detectors that sense photon light energy, and associated processing circuitry. [0016]
  • In applicant's referenced utility application, now U.S. Pat. No. ______ (2001), a microprocessor on the IC continuously triggered preferably an LED or laser light source whose light output pulses were at least partially reflected by points on the surface of the object to be imaged. For good image resolution, e.g., a cm or so, a large but brief pulse of optical energy was required, for example, a peak pulse energy of perhaps 10 W, a pulse width of about 15 ns, and a repetition rate of about 3 KHz. While average energy in applicant's earlier system was only about 1 mW, the desired 10 W peak power essentially dictated the use of relatively expensive laser diodes as a preferred energy light source. Each pixel detector in the detector array had associated electronics to measure time-of-flight from transmission of an optical energy pulse to detection of a return signal. In that invention, the transmission of high peak power narrow energy pulses required the use of high bandwidth pixel detector amplifiers. [0017]
  • By contrast, the present invention transmits periodic signals having a high frequency component, which signals have low average power and low peak power, e.g., tens of mW rather than watts. [0018] Periodic signals of optical energy such as an ideal sinusoid S1=cos(ω·t) are relatively straightforward to analyze and will be assumed herein. Emitting low peak power periodic signals with a high frequency component such as sinusoidal optical signals permits using inexpensive light sources and simpler, narrower bandwidth pixel detectors. Bandwidths can be on the order of a few hundred KHz with an operating (emitted energy) frequency of about 200 MHz. Good resolution accuracy is still obtainable using a low peak power optical emitter in that the effective duty cycle is greater than the output from a narrow-pulsed optical emitter of higher peak power.
  • Assume that the energy emitted from the optical source is approximately S1=K·cos(ω·t) where K is an amplitude coefficient, ω=2πf, frequency f is perhaps 200 MHz, and that distance z separates the optical energy emitter from the target object. [0019] For ease of mathematical representation, K=1 will be assumed although coefficients less than or greater than one may be used. The term “approximately” is used in recognition that perfect sinusoid waveforms can be difficult to generate. Due to the time-of-flight required for the energy to traverse distance z, there will be a phase shift Φ between the transmitted energy and the energy detected by a photo detector in the array, S2=A·cos(ω·t+Φ). Coefficient A represents brightness of the detected reflected signal and may be measured separately using the same return signal that is received by the pixel detector.
  • The phase shift Φ due to time-of-flight is: [0020]
  • Φ=2·ω·z/C=2·(2·π·f)·z/C
  • where C is the speed of light, 300,000 Km/sec. Thus, distance z from energy emitter (and from detector array) is given by: [0021]
  • z=Φ·C/(2·ω)=Φ·C/{2·(2·π·f)}
  • Distance z is known modulo 2πC/(2·ω)=C/(2·f). If desired, several different modulation frequencies of optically emitted energy may be used, e.g., f1, f2, f3 . . . , to determine z modulo C/(2·f1), C/(2·f2), C/(2·f3). [0022] The use of multiple different modulation frequencies advantageously can reduce aliasing. If f1, f2, f3 are integers, aliasing is reduced to the least common multiple of f1, f2, f3, denoted LCM(f1, f2, f3). If f1, f2, f3 are not integers, they preferably are modeled as fractions expressible as a1/D, a2/D, and a3/D, where each ai is an integer, and D represents the greatest common divisor (GCD) of a1, a2, a3. From the above, distance z may be determined modulo LCM(a1, a2, a3)/D.
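These relationships can be made concrete with a short numeric sketch; the frequency pair, search step and true distance below are illustrative assumptions:

```python
import math

C = 3.0e8                              # speed of light, m/s

def z_from_phase(phi, f):
    """z = phi*C / (2*(2*pi*f)), known modulo C/(2*f)."""
    return phi * C / (2 * (2 * math.pi * f))

f1, f2 = 200e6, 150e6                  # assumed modulation frequencies
r1, r2 = C / (2 * f1), C / (2 * f2)    # ambiguity ranges: 0.75 m, 1.0 m

z_true = 2.35                          # metres; beyond either range alone
m1, m2 = z_true % r1, z_true % r2      # what each frequency would report

# Search for the distance consistent with both residues; it is unique
# modulo the combined range (3.0 m for this frequency pair).
step = 0.0005
z = next(k * step for k in range(int(3.0 / step))
         if abs(k * step % r1 - m1) < 1e-6 and abs(k * step % r2 - m2) < 1e-6)
print(z_from_phase(math.pi / 2, f1), z)   # ~0.1875 m single-tone; ~2.35 m combined
```

A single 200 MHz tone can only place the target within a 0.75 m window; combining residues from two frequencies extends the unambiguous range to 3.0 m here, which is the aliasing-reduction effect described above.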
  • In the present invention, phase Φ and distance z are determined by mixing (or homodyning) the signal detected by each pixel detector S2=A·cos(ω·t+Φ) with the signal driving the optical energy emitter S1=cos(ω·t). [0023] The mixing product S1·S2 will be 0.5·A·{cos(2·ω·t+Φ)+cos(Φ)} and will have an average value of 0.5·A·cos(Φ). If desired, the amplitude or brightness A of the detected return signal may be measured separately from each pixel detector output.
  • To implement homodyne determination of phase Φ and distance z, each pixel detector in the detector array has its own dedicated electronics that includes a low noise amplifier to amplify the signal detected by the associated pixel detector, a variable phase delay unit, a mixer, a lowpass filter, and an integrator. The mixer mixes the output of the low noise amplifier with a variable phase delay version of the transmitted sinusoidal signal. The mixer output is lowpass filtered, integrated and fed back to control phase shift of the variable phase delay unit. In the equilibrium state, the output of each integrator is the phase ψ (where ψ=Φ±π/2) associated with the TOF or distance z between the associated pixel detector and a point a distance z away on the target object. The analog phase information is readily digitized, and an on-chip microprocessor can then calculate z-values from each pixel detector to an associated point on the target object. The microprocessor further can calculate dz/dt (and/or dx/dt, dy/dt) and other information if desired. [0024]
  • The on-chip measurement information may be output in random rather than sequential order, and object tracking and other measurements requiring a three-dimensional image are readily made. The overall system is small, robust and requires relatively few off-chip discrete components. On-chip circuitry can use such TOF data to readily simultaneously measure distance and velocity of all points on an object or all objects in a scene. [0025]
  • Other features and advantages of the invention will appear from the following description in which the preferred embodiments have been set forth in detail, in conjunction with their accompanying drawings.[0026]
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • FIG. 1 is a diagram showing a generic luminosity-based range finding system, according to the prior art; [0027]
  • FIG. 2A depicts a transmitted periodic signal with high frequency components transmitted by the present invention, here an ideal cosine waveform; [0028]
  • FIG. 2B depicts the return waveform with phase-delay for the transmitted signal of FIG. 2A, as used by the present invention; [0029]
  • FIG. 3 is a block diagram of a preferred implementation of the present invention; and [0030]
  • FIG. 4 is block diagram showing an individual pixel detector with associated electronics, according to the present invention.[0031]
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENT
  • The present invention advantageously transmits and detects optical energy that is periodic with a high frequency component, and relies upon phase shift between transmitted and detected waveforms to discern time-of-flight and thus z-distance data. Although pulsed-type periodic waveforms may be used, the present invention will be described with respect to the emission and detection of sinusoidal waveforms, as such waveforms are rather easily analyzed mathematically. However it is to be understood that periodic pulsed waveforms with a high frequency component including imperfect sinusoidal waveforms are representable mathematically as groupings of perfect sinusoidal waveforms of varying coefficients and frequency multiples. The transmission and detection of such waveforms can advantageously permit use of relatively inexpensive low peak-power optical emitters, and the use of relatively lower bandwidth amplifiers. This is in contrast to applicant's referenced U.S. Pat. No. ______ (2001) in which a low duty cycle pulse train of narrow pulse widths was emitted by a very high peak power optical emitter. [0032]
  • FIG. 2A depicts the high frequency component of an exemplary idealized periodic optical energy signal as emitted by the present invention, here a signal represented as cos(ωt). The signal is depicted as though it were AC-coupled in that any magnitude offset is not present. As described below, the operative frequency of the transmitted signal preferably is in the few hundred MHz range, and the average and the peak transmitted power may be relatively modest, e.g., less than about 50 mW or so. [0033]
  • A portion of the transmitted energy reaches a target object and is at least partially reflected back toward the present invention, to be detected. FIG. 2B depicts the returned version of the transmitted waveform, denoted A·cos(ωt+Φ), where A is an attenuation coefficient, and Φ is a phase shift resulting from the time-of-flight (TOF) of the energy in traversing the distance from the present invention to the target object. Knowledge of TOF is tantamount to knowledge of distance z from a point on the object target, e.g., [0034] target 20, to the recipient pixel detector in the array of detectors within a system according to the present invention.
  • Specifying a repetition rate of the transmitted periodic optical energy signal involves tradeoffs that include considerations of the transmitted waveshape and duty cycle, the desired granularity in resolving z-distance, and peak power requirements for the optical energy emitter. For example, a transmitted periodic signal whose high frequency component is a few hundred MHz, e.g., 200 MHz, will be consistent with z-distance resolution on the order of a few cm or so, assuming eight-bit analog-to-digital conversion of the detected phase shift information. If the duty cycle of the transmitted periodic signal is about 50%, e.g., an idealized sinusoidal waveform or the equivalent, the peak power required from the optical energy emitter will be about 10 mW. Of course if the transmitted waveform duty cycle were decreased to say 1%, the optical energy emitter peak power would have to be increased to about 500 mW, and so on. It will be appreciated that the ability to use a low peak power optical emitter is one of the distinguishing factors between the present invention and applicant's above-referenced U.S. patent. [0035]
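The duty-cycle/peak-power tradeoff described above is simple arithmetic: for a fixed average optical power, peak power scales inversely with duty cycle. A sketch with an assumed 5 mW average:

```python
average_mw = 5.0                      # assumed fixed average optical power
for duty in (0.50, 0.10, 0.01):
    peak_mw = average_mw / duty       # peak power needed at this duty cycle
    print(f"duty cycle {duty:5.0%} -> peak {peak_mw:6.1f} mW")
```

At a 50% duty cycle this reproduces the ~10 mW peak figure above, and at 1% the ~500 mW figure.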
  • The processing and use of phase shift information in the present invention will now be described with reference to FIG. 3, a block diagram depicting the present invention 200, a three-dimensional imaging system that preferably is fabricated on a single IC 210. [0036] System 200 requires no moving parts and relatively few off-chip components.
  • [0037] System 200 includes an optical emitter, for example a low power laser diode, or low power LED, that can output a periodic signal with 50 mW or so peak power when driven with a repetition rate of a few hundred MHz and, in the preferred embodiment, a duty cycle close to 50%. At present useful optical emitters are made from materials such as GaAlAs, whose bandgap energies are quite different than that of silicon, from which CMOS IC 210 is preferably fabricated. Thus, while FIG. 3 depicts optical emitter 220 as being off-chip from IC 210, the phantom lines surrounding emitter 220 denote that an optical emitter 220 made of CMOS-compatible materials could indeed be fabricated on IC 210.
  • [0038] Light source 220 is preferably a low peak power LED or a laser that emits energy with a wavelength of perhaps 800 nm, although other wavelengths could instead be used. Below 800 nm wavelength, emitted light starts to become visible and laser fabrication becomes more difficult. Above 900 nm CMOS/silicon photodiode efficiency drops off rapidly, and in any event, 1100 nm is the upper wavelength for a device fabricated on a silicon substrate, such as IC 210. By using emitted light having a specific wavelength, and by filtering out incoming light of different wavelength, system 200 can operate with or without ambient light. The ability of system 200 to function in the dark can be advantageous in certain security and military type imaging applications. Off-chip mounted lens 290 preferably focuses filtered incoming light energy onto sensor array 230 such that each pixel detector 240x receives light from only one particular point (e.g., an object surface point) in the field of view. The properties of light wave propagation allow an ordinary lens 290 to be used to focus the light onto the sensor array. If a lens (290′) is required to focus the optical light energy transmitted from emitter 220, a single lens could be used for 290, 290′ if a mirror-type arrangement were used. Typical LED or laser diode emitters 220 have a shunt capacitance of perhaps 100 pF. Thus in driving emitter 220, it would be advantageous to place a small inductance (perhaps a few nH) in parallel with this capacitance, where the combined inductance-capacitance resonate at the periodic frequency of the emitter, typically a few hundred MHz. Alternatively, inductance (again a few nH) can be series-coupled to the emitter and its parasitic capacitance. If desired, such inductance can be derived using a bonding wire to the emitter.
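The parallel-resonance condition above fixes the inductance at L=1/((2·π·f)²·C). Using the ~100 pF shunt capacitance and ~200 MHz frequency mentioned in the text:

```python
import math

C_par = 100e-12        # emitter shunt capacitance from the text, ~100 pF
f = 200e6              # periodic drive frequency, ~200 MHz
L_res = 1 / ((2 * math.pi * f) ** 2 * C_par)
print(round(L_res * 1e9, 2), "nH")   # a few nH, as stated above
```

The result of a few nH is small enough to be realized by the bonding wire to the emitter, as the text suggests.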
  • CMOS-compatible IC 210 will preferably have fabricated thereon oscillator driver 225, array 230 (comprising perhaps 100×100 (or more) pixel detectors 240 and 100×100 (or more) associated electronic processing circuits 250), microprocessor or microcontroller unit 260, memory 270 (which preferably includes random access memory or RAM and read-only memory or ROM), and various computing and input/output (I/O) circuitry 280, including, for example an analog/digital (A/D) conversion unit providing 10-bit A/D conversions of phase information Φ detected by the various pixel detectors in array 230. [0039] Depending upon implementation, a single on-chip A/D converter function could be provided, or a dedicated A/D converter could be provided as part of each electronic processing circuit 250. I/O circuit 280 preferably can also provide a signal to control frequency of the oscillator 225 that drives the energy emitter 220.
  • The DATA output line shown in FIG. 3 represents any or all information that is calculated by the present invention using phase-shift information from the various pixel detectors 240 in array 230. [0040] Preferably microprocessor 260 can examine consecutive frames stored in RAM 270 to identify objects in the field of view scene. Microprocessor 260 can then compute z-distance and can compute object velocity dz/dt, dx/dt, dy/dt. Further, microprocessor 260 and associated on-chip circuitry can be programmed to recognize desired image shapes, for example a user's fingers in an application using system 200 to detect user interface with a virtual input device. The data provided by microprocessor 260 could be reduced to keystroke information in such an application. Any or all of this data (denoted DATA in FIG. 3) can be exported from the IC to an external computer for further processing, for example via a universal serial bus. If microprocessor 260 has sufficient computational power, additional on-chip processing may occur as well. Note too that output from the array of CMOS-compatible detectors 240 may be accessed in a random manner if desired, which permits outputting TOF DATA in any order.
  • Among its other functions, microprocessor 260 acting through interface circuit 280 causes driver 225 to oscillate periodically with a desired duty cycle at a desired frequency, for example f1=200 MHz. [0041] In response to signals from oscillator driver 225, laser diode or LED 220 emits optical energy at the desired frequency, e.g., f1=200 MHz and duty cycle. Again, while a sinusoid or cosine waveform is assumed for ease of mathematical representation, a periodic waveform with similar duty cycle, repetition rate and peak power may be used, e.g., perhaps squarewaves. As noted, average and peak power is advantageously quite modest in the present invention, for example 10 mW. As a result, the cost of optical emitter 220 is perhaps thirty cents compared to a cost of many dollars for a high peak power laser diode in applicant's earlier invention, described in U.S. Pat. No. ______ (2001).
  • [0042] The optical energy whose periodic high frequency component is ideally represented as S1=cos(ω·t) is focused by optional lens 290′ upon target object 20, some distance z away. At least some of the optical energy falling upon target 20 will be reflected back towards system 200 and will be detected by one or more pixel detectors 240 in array 230. Due to the distance z separating system 200 (more particularly, a given pixel detector 240 in array 230) and the target point on object 20, the detected optical energy will be delayed in phase by some amount Φ that is proportional to time-of-flight, or to the separation distance z. The incoming optical energy detected by different pixel detectors 240 can have different phase Φ since different times-of-flight or distances z are involved. As will be described, it is the function of electronics 250 associated with each pixel detector 240 in array 230 to examine and determine the relative phase delay, in cooperation with microprocessor 260 and software stored in memory 270 executed by the microprocessor. In an application where system 200 images a data input mechanism, perhaps a virtual keyboard, microprocessor 260 may process detection data sufficient to identify which of several virtual keys or regions on a virtual device, e.g., a virtual keyboard, have been touched by a user's finger or stylus. Thus, the DATA output from system 200 can include a variety of information, including without limitation distance z, velocity dz/dt (and/or dx/dt, dy/dt) of object 20, and object identification, e.g., identification of a virtual key contacted by a user's hand or stylus.
  • [0043] Preferably IC 210 also includes a microprocessor or microcontroller unit 260, memory 270 (which preferably includes random access memory or RAM and read-only memory or ROM), and various computing and input/output (I/O) circuitry 280. For example, an output from I/O circuit 280 can control the frequency of oscillator 225 that drives energy emitter 220. Among other functions, controller unit 260 may calculate z-distance to the object and object velocity (dz/dt, dy/dt, dx/dt). The DATA output line shown in FIG. 3 represents any or all such information that is calculated by the present invention using phase-shift information from the various pixel detectors 240.
  • [0044] Preferably the two-dimensional array 230 of pixel sensing detectors is fabricated using standard commercial silicon technology. This advantageously permits fabricating a single IC 210 that includes the various pixel detectors 240 and their associated circuits 250, as well as circuits 225, 260, 270, 280, and preferably the energy emitter 220 as well. Understandably, the ability to fabricate such circuits and components on the same IC with the array of pixel detectors can shorten processing and delay times, due to shorter signal paths. While system 200 in FIG. 3 may include focusing lens 290 and/or 290′, it is understood that these lenses will be fabricated off IC chip 210.
  • [0045] Each pixel detector 240 is equivalent to a parallel combination of a current source, an ideal diode, shunt impedance, and noise current source, and will output a current proportional to the amount of incoming photon light energy falling upon it. Preferably CMOS fabrication is used to implement the array of CMOS pixel diodes or photogate detector devices. For example, photodiodes may be fabricated using a diffusion-to-well, or a well-to-substrate junction. Well-to-substrate photodiodes are more sensitive to infrared (IR) light, exhibit less capacitance, and are thus preferred.
  • [0046] FIG. 4 shows a portion of IC 210 and of array 230, and depicts pixel detectors 240-1 through 240-x, and each diode's associated exemplary electronics 250-1 through 250-x. For ease of illustration only two pixel diodes 240 and two associated electronic circuits 250 are depicted; however an actual array will include hundreds or thousands or more of such pixel detectors and associated electronic circuits. As noted, if desired a dedicated A/D converter could be provided as part of each electronics circuit 250-1 through 250-x, as opposed to implementing an omnibus A/D function on IC chip 210.
  • [0047] Let us now consider detection of incoming optical energy by pixel detector 240-1. Assuming that a low power LED or laser diode or the like 220 emits optical radiation having idealized high frequency component S1=cos(ω·t), a fraction of such radiation reflected from a point on the surface of target 20 (distance z away) is given by S2=A·cos(ω·t+Φ). Upon receiving this incoming radiation, pixel detector 240-1 outputs a signal that is amplified by low noise amplifier 300. An exemplary amplifier 300 might have a gain of perhaps 12 dB.
  • [0048] As noted, periodic emissions from optical source 220 are sinusoidal or sinusoidal-like with a high frequency component of a few hundred MHz. Despite this high optical emission frequency, it suffices for amplifier 300 to have a bandwidth of perhaps 100 KHz or so, perhaps as low as tens of KHz, because all of the frequencies of interest are themselves close to this modulation frequency. It will be appreciated that providing hundreds or thousands of low noise, relatively low bandwidth amplifiers 300 on IC 210 is an easier and more economical undertaking than providing high bandwidth amplifiers able to pass narrow pulses, as in applicant's parent invention.
  • [0049] As will be clear upon examining FIG. 4, array 230 can function with relatively small bandwidth amplifiers 300 because the output from each amplifier 300 is coupled directly to a first input of an associated mixer 310, which receives as a second input a signal of like frequency to that present at the first input. It is noted that if each amplifier 300 and its associated mixer 310 were implemented as a single unit, it could suffice for the overall unit to have a bandwidth on the order of tens of KHz, and a high frequency response also on the order of tens of KHz.
  • [0050] As shown in FIG. 4, when comparing the detected signal to the transmitted signal, there will be a phase shift Φ that is related to TOF and to distance z.
  • [0051] Each circuit 250-x couples the output of the associated low noise amplifier 300 to the first input of a mixer 310. Those skilled in the art of signal processing and circuit design will appreciate that mixer 310 may be implemented in many ways, for example using Gilbert cells, digital multipliers, etc.
  • [0052] In essence, each mixer will homodyne the amplified detected output signal S2 from an associated pixel detector 240 with a generator 225 signal S1. Assuming that the optical energy emitted has an idealized high frequency component represented as a sine wave or cosine wave, the mixer output product S1·S2 will be 0.5·A·{cos(2·ω·t+Φ)+cos(Φ)} and will have an average value of 0.5·A·cos(Φ). If desired, the amplitude or brightness A of the detected return signal may be measured separately from each pixel detector output. In practice, ten-bit analog-to-digital resolution of A·cos(Φ) will result in about 1 mm resolution for z-measurements.
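The averaging step above can be checked numerically. The sketch below is not part of the patent (the function name is ours): it multiplies an idealized emitted signal cos(ω·t) by a return A·cos(ω·t+Φ) and averages over one modulation period, so the 2·ω·t term cancels and the mean approaches 0.5·A·cos(Φ):

```python
import math

def homodyne_average(A, phi, omega=2 * math.pi * 200e6, n=10000):
    """Average S1*S2 = cos(w*t) * A*cos(w*t + phi) over one modulation period."""
    T = 2 * math.pi / omega            # one period of the 200 MHz component
    total = 0.0
    for k in range(n):
        t = k * T / n
        total += math.cos(omega * t) * A * math.cos(omega * t + phi)
    return total / n                   # the cos(2*w*t + phi) term averages out

avg = homodyne_average(A=1.0, phi=math.pi / 3)   # ~0.5 * 1.0 * cos(60 deg) = 0.25
```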
  • [0053] Each multiplier 310 will have a second input that is coupled to the output of a variable phase delay (VPD) unit 320. VPD units 320 may be implemented in many ways, for example using a series-coupled string of inverters whose operating power supply voltage is varied to speed-up or slow-down the ability of each inverter to pass a signal. A first input to each VPD unit 320 will be derived from signal generator 225, and will be S1=cos(ω·t), give or take a signal coefficient. Assume that VPD 320 adds a variable phase delay ψ to the cos(ω·t) signal derived from generator 225. Mixer 310 then mixes the amplified cos(ω·t+Φ) signal output by amplifier 300 with the cos(ω·t+ψ) signal output by VPD 320. Mixer 310 now outputs signals including 0.5·A·{cos(Φ−ψ)+cos(2·ω·t+Φ+ψ)}. The output of mixer 310 is coupled to the input of a low pass filter 340 that preferably has a bandwidth of 100 Hz or so to a few KHz or so, such that the output from filter 340 will be a low frequency signal proportional to 0.5·A·cos(Φ−ψ). This low frequency signal is input to an integrator 330 whose output will be ψx for pixel detector 240-x.
  • [0054] Note that mixer 310 is driven by two signals, each having the same frequency as that emitted by optical emitter 220, albeit with a phase difference (Φ−ψ). Because the two signals being mixed are derived from the same reference source frequency, the mixing process is referred to as homodyning rather than heterodyning. Note that if phase shift ψ>Φ, the polarity of the signal output from integrator 330 changes. In the configuration shown in FIG. 4, the phase shift ψx=Φx±90° associated with the return signal detected by each pixel detector 240-x is available from that pixel detector's integrator 330-x.
  • [0055] The phase shift Φ due to time-of-flight may be given by:
  • Φ=2·ω·z/C=2·(2·π·f)·z/C
  • [0056] where C is the speed of light, 300,000 Km/sec. Thus, distance z from energy emitter 220 to a pixel detector 240-x in array 230 is given by:
  • z=Φ·C/(2·ω)=Φ·C/{2·(2·π·f)}
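The two relationships above can be captured directly in code. This is an illustrative sketch, not part of the patent (the function names are ours); it uses C = 3·10⁸ m/s and the factor of 2 that accounts for the round trip:

```python
import math

C = 3.0e8  # speed of light in m/s

def phase_to_distance(phi, f):
    """z = phi * C / (2 * (2*pi*f)); the extra factor 2 reflects the round trip."""
    return phi * C / (2.0 * (2.0 * math.pi * f))

def distance_to_phase(z, f):
    """Inverse relation: phi = 2 * (2*pi*f) * z / C."""
    return 2.0 * (2.0 * math.pi * f) * z / C

# At f = 200 MHz the ambiguity interval C/(2*f) is 0.75 m, so a phase shift
# of pi (half a cycle) corresponds to z = 0.375 m.
z = phase_to_distance(math.pi, 200e6)
```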
  • [0057] Distance z is known modulo 2πC/(2·ω)=C/(2·f). Using several different modulation frequencies such as f1, f2, f3 . . . permits determining distance z modulo C/(2·f1), C/(2·f2), C/(2·f3), etc., and further avoids, or at least reduces, aliasing. For example, microprocessor 260 can command generator 225 to output sinusoidal drive signals of chosen frequencies, e.g., f1, f2, f3, etc. If f1, f2, f3 are integers, aliasing is reduced to the least common multiple of f1, f2, f3, denoted LCM(f1, f2, f3). If f1, f2, f3 are not integers, they preferably are modeled as fractions expressible as a1/D, a2/D, a3/D, where each ai is an integer, and D=GCD(a1, a2, a3), where GCD denotes greatest common divisor. Distance z can then be determined modulo LCM(a1, a2, a3)/D.
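The widened ambiguity interval from several integer modulation frequencies can be illustrated as follows. This sketch is ours, not the patent's: it assumes the combined interval is the least common multiple of the individual intervals C/(2·fi), which for integer frequencies equals C/(2·GCD(f1, f2, . . . )):

```python
from math import gcd

C = 3.0e8  # speed of light in m/s

def unambiguous_range(freqs_hz):
    """z is known modulo C/(2*f_i) for each integer modulation frequency f_i;
    the combined interval is the LCM of those moduli, i.e. C/(2*gcd(f1, f2, ...))."""
    g = 0
    for f in freqs_hz:
        g = gcd(g, f)          # gcd(0, f) == f, so the loop seeds itself
    return C / (2.0 * g)

# A single 200 MHz tone repeats every 0.75 m; adding a 210 MHz tone
# (gcd 10 MHz) stretches the unambiguous interval to 15 m.
r = unambiguous_range([200_000_000, 210_000_000])
```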
  • [0058] The configuration of FIG. 4 presents a closed-loop feedback circuit that reaches a stable point when the two input signals to each mixer 310 are 90° out of phase with respect to each other, e.g., ψx=Φx±90°. At this ±90° out-of-phase steady-state, the output signal from each lowpass filter 340 will be, ideally, null. For example, should the output signal from a lowpass filter 340 go positive, then the output signal from the associated integrator 330 will add more phase shift to drive the lowpass filter output back towards a null state.
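A minimal simulation of this feedback behavior (an idealized sketch of ours, ignoring noise and the 2·ω·t ripple that lowpass filter 340 removes) shows the integrator driving ψ until cos(Φ−ψ)=0, i.e. until the mixer inputs are 90° apart:

```python
import math

def simulate_dll(phi, gain=5.0, dt=0.01, steps=2000):
    """Idealized loop: the averaged mixer output 0.5*A*cos(phi - psi) feeds an
    integrator whose output sets the variable phase delay psi."""
    A, psi = 1.0, 0.0
    for _ in range(steps):
        lowpass_out = 0.5 * A * math.cos(phi - psi)  # averaged homodyne product
        psi += gain * lowpass_out * dt               # integrator advances psi
    return psi

psi = simulate_dll(phi=0.7)
# converges toward psi = phi + pi/2, where the lowpass output is nulled
```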
  • [0059] When the feedback system is at a stable state, the pixel detector electronics 250-x in array 230 provide various phase angles ψ1, ψ2, ψ3, . . . ψN, where ψx=Φx±90°. The phase angles are preferably converted from analog format to digital format, for example using an analog/digital converter function associated with electronics 280. Advantageously there is little accuracy loss for phase values close to Φ=0 and to Φ=−π. This improved accuracy is in contrast to prior art systems that attempt to mix signals having a constant phase value for all pixels. However, such prior art approaches are relatively simple to implement, and the present invention could also mix signals with a constant phase, if desired. Advantageously microprocessor 260 can then execute software, e.g., stored or storable in memory 270, to calculate z-distances (and/or other information) using the above mathematical relationships. If desired, microprocessor 260 can also command generator 225 to output discrete frequencies e.g., f1, f2, f3 . . . to improve system performance by reducing or even eliminating aliasing errors.
  • [0060] The configuration of FIG. 4 is analogous to a delay lock loop, in that the homodyne output signal is used to vary the phase ψ of the signal, derived from optical emitter 220, that is input to mixer 310 along with the sensed return signal output from amplifier 300. As a byproduct of this configuration, the output signal from integrator 330 will be proportional to ψ. Advantageously, delay lock loop configurations such as shown in FIG. 4 can exhibit low noise peaks, especially at frequencies near the fundamental frequency of interest, an advantage not shared by typical phase lock loop configurations.
  • [0061] In essence, the delay lock loop configuration of FIG. 4 is analogous to a zero-intermediate frequency (IF) system. By using a low-IF system instead, in-band spectrum aliasing and out-of-band noise may be attenuated using less critical and more cost-effective components and circuits, for example switched capacitor filters. Furthermore, low-frequency 1/f type noise appears to be more greatly attenuated in-band in low-IF systems since there is no direct mixing with the carrier frequency. After low pass filtering, low-IF systems return a signal proportional to cos(ωc·t+Φ−ψ), where ωc is the IF frequency, i.e., the difference between the optical modulation frequency (e.g., of the signal emitted from 220) and the frequency with which it is mixed. This signal can be further mixed with a signal from a local oscillator having frequency ωc. Alternatively, one can digitize the return signal and extract ψ using digital multipliers and filters.
  • [0062] Requirements for linearity of homodyne type detectors can be further relaxed if very low near-channel interference is present, a condition that advantageously reduces circuit power dissipation and complexity. Alternatively, low-IF systems can enjoy further circuit simplification resulting from the less stringent specifications required on the low pass filter and on mixer linearity.
  • [0063] Thus, if desired, homodyning in the present invention can be implemented as a two-step process. First the detected return signal S2 is multiplied by a signal whose frequency is not exactly that of the emitted signal S1 but is synchronized with it. Thus, instead of obtaining cos(Φ) at the input of each integrator 330, what results is a signal cos(ωc·t+Φ−ψ), where ωc is the difference between the two frequencies. This intermediate difference frequency can then be homodyned again with ωc to return ψ at the output of each integrator 330.
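The two-step process can be sketched numerically. The toy example below uses our own parameter choices, not the patent's: a 200 MHz return is first mixed against a 190 MHz local oscillator, and the resulting 10 MHz intermediate-frequency tone is then correlated against cos(ωc·t) and sin(ωc·t) to recover the phase:

```python
import math

def low_if_phase(phi, f=200e6, f_lo=190e6, fs=2e9, n=20000):
    """First mix the return cos(w*t + phi) with a nearby local oscillator,
    then correlate the resulting IF tone against cos/sin at wc = w - w_lo."""
    w, w_lo = 2 * math.pi * f, 2 * math.pi * f_lo
    wc = w - w_lo                               # intermediate (difference) frequency
    i_acc = q_acc = 0.0
    for k in range(n):
        t = k / fs
        s2 = math.cos(w * t + phi)              # detected return (brightness A = 1)
        if_sig = s2 * math.cos(w_lo * t)        # first mix: 0.5*cos(wc*t + phi) + HF term
        i_acc += if_sig * math.cos(wc * t)      # second mix, in-phase
        q_acc += if_sig * math.sin(wc * t)      # second mix, quadrature
    return math.atan2(-q_acc, i_acc)            # recovered phase shift

phase = low_if_phase(0.6)   # recovers ~0.6 rad
```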
  • [0064] Thus, various implementations may be used with the present invention to generate phase angle ψ=Φ±90°. Assume that a given application requires acquisition of an image at a frame rate of 30 frames/second. In such an application, it suffices to sample phase angle ψ during A/D conversion with a sample period of about 30 ms. This sample rate is commensurate with the relatively low bandwidth otherwise present within electronics 250-x, as shown in FIG. 4. In practice, system 200 can provide z-distance resolution of about 1 cm, and in practical applications z-range will be within perhaps 100 m or less.
  • [0065] Although z-distance is determined from TOF information acquired from phase delay ψ, it is noted that the relative brightness of the signals returned from target object 20 can also provide useful information. The amplitude coefficient “A” on the return signal is a measure of relative brightness.
  • [0066] Note that while the preferred embodiment shown in FIG. 4 uses a feedback configuration that seeks to achieve a minimum output signal from the lowpass filters 340, a maximum lowpass filter output signal could instead be used with slight alteration. A maximum lowpass filter output signal would represent brightness coefficient A. Such a configuration could be implemented using a signal 90° out-of-phase with the output from VPD 320 to modulate another copy of the output of the low noise amplifier 300. The average amplitude of the thus-modulated signal would be proportional to coefficient A in the incoming detected return signal.
  • [0067] Movement of objects within a detected image contour can be computed, e.g., by microprocessor 260, by identifying contour movements between frames of acquired data. The pixel detectors within the contour can all be assigned a uniform velocity that is the velocity of the contour. Since objects can be identified using their contours, one can track objects of interest using the on-chip processor 260. As such, if desired, IC chip 210 can export a single value (DATA) that represents the change in location of the entire object 20 whenever it has moved. Thus, instead of exporting from the IC chip an entire frame of pixels at the frame rate, a single vector representing the change in location of the object of interest may instead be sent. So doing results in a substantial reduction in IC chip input/output and can greatly reduce off-chip data processing requirements.
  • [0068] In other applications, system 200 may be called upon to recognize an object that is a virtual input device, for example a keyboard whose virtual keys are “pressed” by a user's fingers. For example, in co-pending U.S. application Ser. No. 09/502,499, filed Feb. 11, 2000, and entitled “Method and Apparatus for Entering Data Using a Virtual Input Device”, a three-dimensional range-finding TOF system is used to implement virtual input devices. As a user's hand or stylus “presses” a virtual key or region on such a device, the system using TOF measurements can determine which key or region is being “pressed”. The system can then output the equivalent of keystroke information to a companion device, for example a PDA that is to receive input data from the interaction of a user with the virtual input device. The present invention may be used in such an application, in which case DATA in FIG. 3 could represent keystroke identification information that has been processed on-chip by microprocessor 260.
  • [0069] As noted, microprocessor 260, executing software perhaps associated with memory 270, can control modulation of generator 225 and detection by the various electronic circuits 250. If desired, detection signals may be processed using special image processing software. Since system 200 preferably can be battery operated due to its low power consumption, when such software determines that sufficient image resolution is attained, operating power may be terminated selectively to various portions of array 230. Further, if sufficient photon energy reaches array 230 to ensure adequate detection, the shape of signals output by emitter 220 could be changed. For example, the peak power and/or duty cycle of the emitter energy could be reduced, thus reducing overall power consumption by system 200. The design tradeoffs in changing the shape of the optical energy output signal involve considerations of z-resolution accuracy, user safety, and power handling capacity of emitter 220.
  • [0070] In summary, the overall system advantageously can be operated from a small battery in that peak and average power from optical emitter 220 is preferably in the tens of mW range. Nonetheless distance resolution is in the cm range, and signal/noise ratios are acceptable.
  • [0071] Modifications and variations may be made to the disclosed embodiments without departing from the subject and spirit of the invention as defined by the following claims.

Claims (20)

What is claimed is:
1. A method to determine distance z between a pixel detector and a target, the method comprising the following steps:
(a) illuminating said target with optical energy having a periodic waveform that includes a high frequency component S1(ω·t);
(b) disposing said pixel detector so as to detect an optical energy signal having a high frequency component S2(ω·t)=A·S1(ω·t−Φ) reflected from said target, where A is a coefficient proportional to brightness of said target, and Φ is phase shift proportional to time-of-flight of light over said distance z; and
(c) signal-processing said signal S2(ω·t) to generate a phase signal Φ proportional to said distance z.
2. The method of claim 1, wherein at step (a), said high frequency component S1(ω·t) is approximated by S1(ω·t)=cos(ω·t).
3. The method of claim 1, wherein:
step (b) includes providing an array of pixel detectors; and
step (c) includes generating said phase signal Φ for a detected said signal output by each of said pixel detectors.
4. The method of claim 1, wherein step (c) includes homodyne-mixing said signal S2(ω·t) with a signal proportional to S1(ω·t+ψ).
5. The method of claim 4, wherein step (c) includes subjecting said input signal S1(ω·t) to a variable phase delay to generate said S1(ω·t+ψ).
6. The method of claim 4, wherein at steady-state, ψ=Φ±90°.
7. The method of claim 4, wherein step (c) further includes varying said ψ to find zero average value for a homodyne product S1·S2.
8. The method of claim 4, wherein step (c) further includes reducing high frequency components in S1·S2 to yield an average value for a homodyne product S1·S2.
9. The method of claim 4, wherein step (c) further includes integrating an average value for said homodyne product S1·S2 to produce said ψ.
10. The method of claim 1, further including estimating magnitude of said coefficient A.
11. The method of claim 1, further including a step of estimating magnitude of said coefficient A by homodyne-mixing S2 with S1(ω·t+ψ+π/2).
12. The method of claim 4, wherein step (c) includes homodyne-mixing said S2 with a signal of close frequency that is phase locked onto S1 to yield an intermediate frequency signal ωc, and at least one step selected from a group consisting of (i) homodyne-mixing a resulting intermediate signal again with said ωc, and (ii) directly digitizing said intermediate frequency signal ωc and extracting Φ using digital signal processing.
13. The method of claim 1, wherein step (a) includes generating a plurality of discrete frequencies ωi selected to reduce aliasing.
14. The method of claim 1, wherein each said step is carried out by circuitry fabricated on a CMOS integrated circuit, said integrated circuit including an array of pixel detectors each identical to said pixel detector.
15. The method of claim 14, wherein said integrated circuit includes a microprocessor, and at least step (c) is executed by said microprocessor.
16. A CMOS-implementable integrated circuit (IC) time of flight (TOF) measurement system used with an optical emitter to determine distance z between said IC and a target, the IC including:
a generator coupleable to said optical emitter to cause said optical emitter to output a signal having a high frequency component representable as S1(ω·t);
an array of pixel detectors to detect an optical energy signal having a high frequency component representable as S2i(ω·t)=A·S1(ω·t−Φi) reflected from said target, where i is an integer, A is a coefficient proportional to brightness of said target, and Φ is phase shift proportional to time-of-flight of light over said distance z;
for each of said pixel detectors, an associated electronic circuit;
wherein said circuit is coupled to receive and signal process said signal S2(ω·t) and to generate a phase signal Φ proportional to said distance z.
17. The IC of claim 16, wherein said high frequency component S1(ω·t) is approximated by S1(ω·t)=cos(ω·t).
18. The IC of claim 16, wherein said electronic circuit signal processes by homodyne-mixing said signal S2(ω·t) with a signal proportional to S1(ω·t+ψ), wherein at steady-state, ψ=Φ±90°.
19. The IC of claim 18, wherein said electronic circuit signal processes by subjecting said input signal S1(ω·t) to a variable phase delay to generate said S1(ω·t+ψ).
20. The IC of claim 18, wherein said electronic circuit further signal processes by carrying out at least one function selected from a group consisting of (a) varying said ψ to find zero average value for a homodyne product S1·S2, (b) reducing high frequency components in S1·S2 to yield an average value for a homodyne product S1·S2, (c) integrating an average value for said homodyne product S1·S2 to produce said ψ, and (d) estimating magnitude of said coefficient A.
US09/876,373 2000-06-06 2001-06-06 CMOS-compatible three-dimensional image sensing using reduced peak energy Expired - Lifetime US6587186B2 (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
US09/876,373 US6587186B2 (en) 2000-06-06 2001-06-06 CMOS-compatible three-dimensional image sensing using reduced peak energy
AU2002239608A AU2002239608A1 (en) 2000-12-11 2001-12-11 Cmos-compatible three-dimensional image sensing using quantum efficiency modulation
EP01987386A EP1356664A4 (en) 2000-12-11 2001-12-11 Cmos-compatible three-dimensional image sensing using quantum efficiency modulation
JP2002550710A JP4533582B2 (en) 2000-12-11 2001-12-11 A CMOS compatible 3D image sensing system using quantum efficiency modulation
PCT/US2001/048219 WO2002049339A2 (en) 2000-12-11 2001-12-11 Cmos-compatible three-dimensional image sensing using quantum efficiency modulation

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US20994800P 2000-06-06 2000-06-06
US09/876,373 US6587186B2 (en) 2000-06-06 2001-06-06 CMOS-compatible three-dimensional image sensing using reduced peak energy

Publications (2)

Publication Number Publication Date
US20010048519A1 true US20010048519A1 (en) 2001-12-06
US6587186B2 US6587186B2 (en) 2003-07-01

Family

ID=26904672

Family Applications (1)

Application Number Title Priority Date Filing Date
US09/876,373 Expired - Lifetime US6587186B2 (en) 2000-06-06 2001-06-06 CMOS-compatible three-dimensional image sensing using reduced peak energy

Country Status (1)

Country Link
US (1) US6587186B2 (en)

Cited By (27)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2374743A (en) * 2001-04-04 2002-10-23 Instro Prec Ltd Surface profile measurement
US20030147002A1 (en) * 2002-02-06 2003-08-07 Eastman Kodak Company Method and apparatus for a color sequential scannerless range imaging system
WO2003085413A2 (en) * 2002-04-08 2003-10-16 Matsushita Electric Works, Ltd. Three dimensional image sensing device using intensity modulated light
US20060119833A1 (en) * 2003-02-19 2006-06-08 Leica Geosystems Ag Method and device for deriving geodetic distance data
US7248344B2 (en) 2001-04-04 2007-07-24 Instro Precision Limited Surface profile measurement
US20070280626A1 (en) * 2006-05-24 2007-12-06 Haddock Joshua N Optical rangefinder for an electro-active lens
NL1032435C2 (en) * 2006-09-05 2008-03-06 Maasland Nv Device for automatically milking a dairy animal.
WO2011051286A1 (en) * 2009-10-28 2011-05-05 Ifm Electronic Gmbh Camera system
US20110188028A1 (en) * 2007-10-02 2011-08-04 Microsoft Corporation Methods and systems for hierarchical de-aliasing time-of-flight (tof) systems
CN102298149A (en) * 2010-06-25 2011-12-28 原相科技股份有限公司 Power-saving time-difference distance measuring system and method capable of promoting precision and moveable detection efficiency
GB2485997A (en) * 2010-11-30 2012-06-06 St Microelectronics Res & Dev Camera using a Single Photon Avalanche Diode (SPAD) array
GB2485991A (en) * 2010-11-30 2012-06-06 St Microelectronics Res & Dev Camera using a Single Photon Avalanche Diode (SPAD) array
GB2486165A (en) * 2010-11-30 2012-06-13 St Microelectronics Res & Dev Oven using a Single Photon Avalanche Diode (SPAD) array
US9462168B2 (en) 2010-07-28 2016-10-04 Ifm Electronic Gmbh Light propagation time camera system having signal path monitoring
US20170123052A1 (en) * 2015-11-03 2017-05-04 Hexagon Technology Center Gmbh Optoelectronic surveying device
US10020813B1 (en) 2017-01-09 2018-07-10 Microsoft Technology Licensing, Llc Scaleable DLL clocking system
JP2019506768A (en) * 2015-12-16 2019-03-07 フェイスブック・テクノロジーズ・リミテッド・ライアビリティ・カンパニーFacebook Technologies, Llc Range gate type depth camera parts
US10234561B2 (en) 2016-05-09 2019-03-19 Microsoft Technology Licensing, Llc Specular reflection removal in time-of-flight camera apparatus
CN109643523A (en) * 2017-07-14 2019-04-16 深圳市汇顶科技股份有限公司 Pixel circuit and image sensing
CN111702306A (en) * 2015-06-24 2020-09-25 伊利诺斯工具制品有限公司 Time-of-flight camera for welding machine vision
US20210156881A1 (en) * 2019-11-26 2021-05-27 Faro Technologies, Inc. Dynamic machine vision sensor (dmvs) that performs integrated 3d tracking
US11450233B2 (en) 2019-02-19 2022-09-20 Illinois Tool Works Inc. Systems for simulating joining operations using mobile devices
US11521512B2 (en) 2019-02-19 2022-12-06 Illinois Tool Works Inc. Systems for simulating joining operations using mobile devices
US11545045B2 (en) 2015-03-09 2023-01-03 Illinois Tool Works Inc. Methods and apparatus to provide visual information associated with welding operations
US11645936B2 (en) 2019-11-25 2023-05-09 Illinois Tool Works Inc. Weld training simulations using mobile devices, modular workpieces, and simulated welding equipment
US11721231B2 (en) 2019-11-25 2023-08-08 Illinois Tool Works Inc. Weld training simulations using mobile devices, modular workpieces, and simulated welding equipment
US11818484B2 (en) 2020-11-26 2023-11-14 Samsung Electronics Co., Ltd. Imaging device

Families Citing this family (207)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7831358B2 (en) * 1992-05-05 2010-11-09 Automotive Technologies International, Inc. Arrangement and method for obtaining information using phase difference of modulated illumination
US6710770B2 (en) * 2000-02-11 2004-03-23 Canesta, Inc. Quasi-three-dimensional method and apparatus to detect and localize interaction of user-object and virtual transfer device
US6968073B1 (en) * 2001-04-24 2005-11-22 Automotive Systems Laboratory, Inc. Occupant detection system
JP2003115584A (en) * 2001-10-05 2003-04-18 Canon Inc Imaging apparatus and reader
US7176438B2 (en) * 2003-04-11 2007-02-13 Canesta, Inc. Method and system to differentially enhance sensor dynamic range using enhanced common mode reset
WO2004093318A2 (en) * 2003-04-11 2004-10-28 Canesta, Inc. Method and system to differentially enhance sensor dynamic range
US7406181B2 (en) * 2003-10-03 2008-07-29 Automotive Systems Laboratory, Inc. Occupant detection system
US7274815B1 (en) 2003-10-09 2007-09-25 Sandia Corporation Parallel phase-sensitive three-dimensional imaging camera
US20050148432A1 (en) * 2003-11-03 2005-07-07 Carmein David E.E. Combined omni-directional treadmill and electronic perception technology
US7317955B2 (en) * 2003-12-12 2008-01-08 Conmed Corporation Virtual operating room integration
US7317954B2 (en) 2003-12-12 2008-01-08 Conmed Corporation Virtual control of electrosurgical generator functions
JP2007526453A (en) * 2004-01-28 2007-09-13 カネスタ インコーポレイテッド Single chip red, green, blue, distance (RGB-Z) sensor
KR100636483B1 (en) 2004-06-25 2006-10-18 삼성에스디아이 주식회사 Transistor and fabrication method thereof and light emitting display
US20060244720A1 (en) * 2005-04-29 2006-11-02 Tracy James L Collapsible projection assembly
US8027029B2 (en) * 2007-11-07 2011-09-27 Magna Electronics Inc. Object detection and tracking system
US8203699B2 (en) 2008-06-30 2012-06-19 Microsoft Corporation System architecture design for time-of-flight system having reduced differential pixel size, and time-of-flight systems so designed
DE102009001894B4 (en) * 2009-03-26 2018-06-28 pmdtechnologies ag Robot system with 3D camera
DE102009026845A1 (en) 2009-06-09 2010-12-16 Ifm Electronic Gmbh Measuring device for measuring system for motor vehicle for determining tire parameter, has light source for illuminating tire surface with intensity modulated light and time-of-flight sensor for detecting light reflected from tire surface
US8681124B2 (en) 2009-09-22 2014-03-25 Microsoft Corporation Method and system for recognition of user gesture interaction with passive surface video displays
DE102009045555A1 (en) 2009-10-12 2011-04-14 Ifm Electronic Gmbh Security camera has three-dimensional camera based on photonic mixer devices, where two-dimensional camera and three-dimensional camera are associated for active illumination
DE102009045558B4 (en) 2009-10-12 2021-07-15 pmdtechnologies ag Camera system
DE102009045553B4 (en) 2009-10-12 2014-10-09 Ifm Electronic Gmbh Time of flight measurement system
DE102009045600B4 (en) 2009-10-12 2021-11-04 pmdtechnologies ag Camera system
DE102009046107A1 (en) 2009-10-28 2011-05-05 Ifm Electronic Gmbh System and method for interaction between a person and a machine
DE102009046109A1 (en) 2009-10-28 2011-05-05 Ifm Electronic Gmbh Position determining system for e.g. agricultural working machine, has evaluation device assigned to camera such that position and orientation of vehicle is determined in reference system based on detected detection marks
DE102009046124A1 (en) 2009-10-28 2011-05-05 Ifm Electronic Gmbh Method and apparatus for calibrating a 3D TOF camera system
DE102009046628A1 (en) 2009-11-11 2011-05-12 Ifm Electronic Gmbh Method for detecting occupancy density in e.g. a bus in public passenger transport, involves determining occupancy density and/or passenger distribution of a transport region based on a depth image
DE102010001113B4 (en) 2010-01-21 2023-02-16 pmdtechnologies ag Illumination for a time-of-flight camera
DE102010002250B4 (en) 2010-02-23 2022-01-20 pmdtechnologies ag surveillance system
DE102010003409B4 (en) 2010-03-29 2022-06-09 pmdtechnologies ag Time of flight camera
DE102010003411A1 (en) 2010-03-29 2011-09-29 Ifm Electronic Gmbh Time-of-flight camera e.g. photo mixture detector camera, operating method, involves detecting phase shift of electromagnetic radiation for two various modulation frequencies, where difference of detected phase shifts is formed
DE102010003544A1 (en) 2010-03-31 2011-10-06 Ifm Electronic Gmbh Three-dimensional time-of-flight camera i.e. photonic mixer device, for use with car, has receiver optics arranged in such manner that pixel line of pixel array detects equal or larger portion of monitored area in spatial direction
DE102011007464A1 (en) 2010-04-19 2011-10-20 Ifm Electronic Gmbh Method for visualizing scene, involves selecting scene region in three-dimensional image based on distance information, marking selected scene region in two-dimensional image and presenting scene with marked scene region on display unit
DE102010030779A1 (en) 2010-06-30 2012-01-05 Ifm Electronic Gmbh System for monitoring and controlling actuatable passage barrier e.g. bollard, has control device allowing closing of barrier only if collision of barrier with objects is precluded in security region
DE102010038566A1 (en) 2010-07-28 2012-02-02 Ifm Electronic Gmbh Time-of-flight camera, i.e. three-dimensional time-of-flight camera, has light sources arranged such that the photosensor is illuminated when the light sources are switched on, the light sources being connected to a modulator
DE102010043768B3 (en) * 2010-09-30 2011-12-15 Ifm Electronic Gmbh Time of flight camera
DE102010043723B4 (en) 2010-11-10 2022-03-10 pmdtechnologies ag time-of-flight camera system
GB2485990A (en) * 2010-11-30 2012-06-06 St Microelectronics Res & Dev An optical user-input device using SPADs
GB2485994A (en) 2010-11-30 2012-06-06 St Microelectronics Res & Dev Navigation device using a Single Photon Avalanche Diode (SPAD) detector
GB2485993A (en) 2010-11-30 2012-06-06 St Microelectronics Res & Dev Sports equipment comprising proximity detector using single photon avalanche diode (SPAD) for measuring the speed of an object struck by the sports equipment
DE102010062616B4 (en) 2010-12-08 2020-02-13 pmdtechnologies ag Optical rangefinder
DE102010063418B4 (en) 2010-12-17 2016-08-11 Ifm Electronic Gmbh Time of flight camera system with data channel
DE102010063579A1 (en) 2010-12-20 2012-06-21 Ifm Electronic Gmbh Optical range finder has reset devices that are controlled so as to control discharge of accumulation gates when voltage of accumulation gates reaches or exceeds threshold value
DE102011089636A1 (en) 2010-12-22 2012-06-28 PMD Technologie GmbH Light propagation time camera system, has evaluation unit connected with light propagation time sensor and photosensor and designed to start distance measurement depending on signals of photosensor
DE102011089642B4 (en) 2010-12-22 2023-02-16 pmdtechnologies ag time-of-flight sensor
DE102011089629B4 (en) 2010-12-22 2022-08-04 pmdtechnologies ag Time-of-flight camera and method for operating one
KR101722641B1 (en) 2010-12-23 2017-04-04 삼성전자주식회사 3D image acquisition apparatus and method of extractig depth information in the 3D image acquisition apparatus
US9857868B2 (en) 2011-03-19 2018-01-02 The Board Of Trustees Of The Leland Stanford Junior University Method and system for ergonomic touch-free interface
DE102012203341A1 (en) 2011-03-25 2012-09-27 Ifm Electronic Gmbh Two-dimensional/three-dimensional illumination for a two-dimensional camera and a three-dimensional camera, particularly a time-of-flight camera, has a two-dimensional light source that provides light in the viewing direction of the two-dimensional camera
DE102011006613B4 (en) 2011-03-31 2023-11-30 pmdtechnologies ag lighting circuit
US8840466B2 (en) 2011-04-25 2014-09-23 Aquifi, Inc. Method and system to create three-dimensional mapping in a two-dimensional game
DE102011081560A1 (en) 2011-08-25 2013-02-28 Ifm Electronic Gmbh Time of flight camera system with signal path monitoring
DE102011081563B4 (en) 2011-08-25 2015-06-25 Ifm Electronic Gmbh Time of flight camera system with signal path monitoring
DE102011081561A1 (en) 2011-08-25 2013-02-28 Ifm Electronic Gmbh Time of flight camera system with signal path monitoring
DE102011082103B4 (en) 2011-09-02 2017-08-24 Audi Ag Safety system for a motor vehicle
GB2494663A (en) 2011-09-14 2013-03-20 St Microelectronics Res & Dev A system and corresponding method for monitoring vibration isolators
DE102012200572A1 (en) 2012-01-16 2013-07-18 Ifm Electronic Gmbh Two-dimensional three-dimensional camera for security system of motor vehicle, has control unit connected with two evaluation units and is configured to generate safety signal and security parameter for object in object list
US8854433B1 (en) 2012-02-03 2014-10-07 Aquifi, Inc. Method and system enabling natural user interface gestures with an electronic system
DE102012203596B4 (en) 2012-03-07 2023-11-23 pmdtechnologies ag Time of flight sensor
DE102012203616B4 (en) 2012-03-07 2016-02-18 Ifm Electronic Gmbh Lighting for a photoflash camera and method of making such
DE102012204512B4 (en) 2012-03-21 2020-08-06 pmdtechnologies ag Device for phase measurement of a modulated light
DE102013205600B4 (en) 2012-03-29 2019-09-19 pmdtechnologies ag Monitoring device for illumination of a light transit time measurement system
DE102013205607A1 (en) 2012-03-29 2013-10-02 Ifm Electronic Gmbh Analog-to-digital converter circuit for e.g. three-dimensional time-of-flight camera system, has evaluation unit that is configured such that error response is initiated if digital constant voltage lies outside tolerance voltage range
DE102013205605B4 (en) 2012-03-29 2018-02-01 pmdtechnologies ag Illumination for a light transit time measurement system
DE102012205217B4 (en) * 2012-03-30 2015-08-20 Ifm Electronic Gmbh Information display system with a virtual input zone
DE102012205212B4 (en) * 2012-03-30 2015-08-20 Ifm Electronic Gmbh Information display system with a virtual input zone and method for operating an information display system
DE102013207148A1 (en) 2012-05-03 2013-11-07 Ifm Electronic Gmbh Time-of-flight camera, i.e. PMD time-of-flight camera, for arrangement in the upper area of a motor-car windscreen, has a polarizing filter arranged in the exit beam of the lighting unit, such that the camera receives a preferred light polarization
DE102012207328A1 (en) 2012-05-03 2013-11-07 Audi Ag System for acquisition of track profile for vehicle, has control device connected with light-time cameras and determining spacing and dimensions of uneven portion of track profile based on components of vehicle
DE102013207147A1 (en) 2012-05-03 2013-11-07 Ifm Electronic Gmbh Road surface profile detecting system for motor car, has time-of-flight cameras which detect road surface profile and determine distance and dimension of irregularities in overlapping monitored areas of cameras
DE102013207651A1 (en) 2012-05-21 2013-11-21 Ifm Electronic Gmbh Time of flight camera system
DE102013207647A1 (en) 2012-05-21 2013-11-21 Ifm Electronic Gmbh Method for operating a time-of-flight camera system, involves detecting the phase shift of the emitted or received signal for a modulation frequency in a phase measuring cycle, and performing multiple phase measurement cycles
DE102013207654B4 (en) 2012-05-21 2021-12-23 pmdtechnologies ag Time-of-flight camera system
DE102013207648B4 (en) 2012-05-21 2021-12-02 pmdtechnologies ag Time-of-flight camera system
US9664790B2 (en) 2012-05-21 2017-05-30 pmdtechnologies ag Time of flight camera system
DE102013207653B4 (en) 2012-05-21 2018-11-29 pmdtechnologies ag Time of flight camera system
DE102013207652B4 (en) 2012-05-21 2018-05-30 pmdtechnologies ag Time of flight camera system
DE102013207650B4 (en) 2012-05-21 2021-11-25 pmdtechnologies ag Time-of-flight camera system
DE102013208106A1 (en) 2012-05-24 2013-11-28 Ifm Electronic Gmbh Method for operating depth image-camera system, involves determining depth map based on data of light propagation time camera, and another depth image based on data of two-dimensional images of two-dimensional image acquisition
DE102012208995B4 (en) 2012-05-29 2017-12-07 pmdtechnologies ag Time of flight camera system with data channel
DE102013208804B4 (en) 2012-05-30 2016-07-21 Pmdtechnologies Gmbh Light transit time sensor with switchable background light suppression
DE102013208805B4 (en) 2012-05-30 2016-09-08 Pmdtechnologies Gmbh Light transit time sensor with buffer
DE102013208802A1 (en) 2012-05-30 2013-12-05 Pmdtechnologies Gmbh Time-of-flight (TOF) light sensor of a TOF camera for use in a three-dimensional TOF (3D-TOF) camera system, has time-of-flight pixels that include different color filters distributed in a predetermined pattern on the sensor area
US9098739B2 (en) 2012-06-25 2015-08-04 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching
US9111135B2 (en) 2012-06-25 2015-08-18 Aquifi, Inc. Systems and methods for tracking human hands using parts based template matching using corresponding pixels in bounded regions of a sequence of frames that are a specified distance interval from a reference camera
DE102013213660A1 (en) 2012-07-12 2014-01-16 Ifm Electronic Gmbh Circuit arrangement for photonic mixer detector-camera system, has reference light-time sensor arranged outside sensor element, and modulation signal of modulator is supplied to reference light-time sensor through output circuit
US8836768B1 (en) 2012-09-04 2014-09-16 Aquifi, Inc. Method and system enabling natural user interface gestures with user wearable glasses
DE102013218439A1 (en) 2012-09-13 2014-03-13 Ifm Electronic Gmbh Lighting module for time-of-flight camera installed in motor vehicle e.g. motor car, has optical system which is provided with beam shaping optical unit and auxiliary optical unit, and focal plane that is formed outside of main portion
DE102013218647A1 (en) 2012-09-17 2014-05-15 Pmdtechnologies Gmbh Time-of-flight sensor, e.g. photonic mixer device (PMD) sensor for use in a time-of-flight camera system, has a time-of-flight pixel whose readout node is connected to a compression transistor
DE102012220702A1 (en) 2012-11-13 2014-05-15 Ifm Electronic Gmbh Monitoring system for detecting persons and/or objects in e.g. escalator, has controller that is connected with safety device so as to initiate safety responses
DE102012220648B4 (en) 2012-11-13 2015-11-26 Ifm Electronic Gmbh PMD camera with a volume determination
DE102012220650B4 (en) 2012-11-13 2016-07-28 Ifm Electronic Gmbh PMD camera with a speed determination
DE102012223295A1 (en) 2012-12-14 2014-06-18 Pmdtechnologies Gmbh Photonic mixer device (PMD) camera system, e.g. time-of-flight camera system, has a phase control unit that is provided to control the phase position of the illumination in response to a signal of the control sensor and a signal of the modulator
DE102013225439A1 (en) 2012-12-14 2014-06-18 Pmdtechnologies Gmbh Time-of-flight sensor, i.e. photonic mixer device sensor, for use in a time-of-flight camera system for optical distance measurement, has a reference time-of-flight pixel for receiving modulated light and comprising timing control
DE102012223301A1 (en) 2012-12-14 2014-06-18 Pmdtechnologies Gmbh Time-of-flight sensor for time-of-flight camera system, has time-of-flight pixel and multiple reference time-of-flight pixels for receiving modulated reference light, where two reference pixels have different dimensioned modulation gates
DE102012223298A1 (en) 2012-12-14 2014-06-18 Pmdtechnologies Gmbh Time-of-flight sensor, e.g. for a photonic mixer device camera system, has a time-of-flight pixel and a reference time-of-flight pixel for receiving modulated reference light, where the reference pixel exhibits a nonlinear characteristic curve
DE102012223302A1 (en) 2012-12-14 2014-06-18 Pmdtechnologies Gmbh Light transit time camera e.g. three-dimensional (3D) time-of-flight (TOF) camera has phase control unit that controls phase of modulation signal of light source, based on signal of optical transducer
DE102013225438B4 (en) 2012-12-14 2017-02-23 pmdtechnologies ag Time of flight sensor with reference pixels
WO2014095539A1 (en) 2012-12-17 2014-06-26 Pmdtechnologies Gmbh Light propagation time camera with a motion detector
DE102013223586A1 (en) 2012-12-17 2014-06-18 Pmdtechnologies Gmbh Time-of-flight camera, particularly a three-dimensional time-of-flight camera, has an illuminating light source for emitting modulated light and a time-of-flight sensor for detecting modulated light reflected from an object
US9602807B2 (en) 2012-12-19 2017-03-21 Microsoft Technology Licensing, Llc Single frequency time of flight de-aliasing
US9092665B2 (en) 2013-01-30 2015-07-28 Aquifi, Inc Systems and methods for initializing motion tracking of human hands
US9129155B2 (en) 2013-01-30 2015-09-08 Aquifi, Inc. Systems and methods for initializing motion tracking of human hands using template matching within bounded regions determined using a depth map
DE102013203088B4 (en) 2013-02-25 2017-05-11 pmdtechnologies ag Time of flight camera system
DE102013203925B4 (en) 2013-03-07 2015-10-22 Ifm Electronic Gmbh Control system for vehicle headlights
DE102014205586A1 (en) 2013-03-28 2014-10-02 Pmdtechnologies Gmbh Time of flight camera system
DE102014205587A1 (en) 2013-03-28 2014-10-02 Pmdtechnologies Gmbh Time of flight camera system
DE102014205585B4 (en) 2013-03-28 2016-02-25 Pmdtechnologies Gmbh Method for operating a time of flight camera and time of flight camera system
US9298266B2 (en) 2013-04-02 2016-03-29 Aquifi, Inc. Systems and methods for implementing three-dimensional (3D) gesture based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
DE102014206236B4 (en) 2013-04-04 2020-11-19 pmdtechnologies ag Time-of-flight camera for a vehicle
DE102014206898B4 (en) 2013-04-10 2020-11-19 pmdtechnologies ag Time-of-flight camera for a vehicle
DE102013209044A1 (en) 2013-05-15 2014-11-20 Ifm Electronic Gmbh Control unit for a light transit time camera system
DE102013209162A1 (en) 2013-05-16 2014-11-20 Pmdtechnologies Gmbh Transit Time Sensor
DE102013209161A1 (en) 2013-05-16 2014-12-04 Pmdtechnologies Gmbh Transit Time Sensor
DE102014209337A1 (en) 2013-05-17 2014-11-20 Ifm Electronic Gmbh System and method for detecting a hazardous area
DE102014209338A1 (en) 2013-05-17 2014-11-20 Ifm Electronic Gmbh Time of flight camera system for free field recognition
DE102014208327A1 (en) 2013-05-17 2014-11-20 Ifm Electronic Gmbh Method and system for controlling a media output device
DE102014209371B4 (en) 2013-05-17 2018-11-15 pmdtechnologies ag System for controlling a work machine with a boom
DE102014211543A1 (en) 2013-06-21 2014-12-24 Ifm Electronic Gmbh Method and device for detecting gestures in a vehicle environment
DE102013214677B3 (en) * 2013-07-26 2014-10-30 PMD Technologie GmbH Time of flight camera system
DE102013012466B4 (en) 2013-07-26 2019-11-07 Audi Ag Operating system and method for operating a vehicle-side device
US9798388B1 (en) 2013-07-31 2017-10-24 Aquifi, Inc. Vibrotactile system to augment 3D input systems
DE102013216833A1 (en) 2013-08-23 2015-02-26 Ifm Electronic Gmbh 3D camera
DE102013218443A1 (en) 2013-09-13 2015-03-19 Ifm Electronic Gmbh illumination optics
DE102013220385B4 (en) 2013-10-09 2015-06-11 Ifm Electronic Gmbh Lighting for a light transit time camera system
DE102013019210A1 (en) 2013-11-15 2015-05-21 Audi Ag Lighting device for the passenger compartment of a motor vehicle and method for controlling the lighting device
DE102013223555A1 (en) 2013-11-19 2015-05-21 Ifm Electronic Gmbh A light transit time camera system for a vehicle and method of operating such
DE102013224937A1 (en) 2013-12-05 2015-06-11 Pmdtechnologies Gmbh Time of flight camera system
US9507417B2 (en) 2014-01-07 2016-11-29 Aquifi, Inc. Systems and methods for implementing head tracking based graphical user interfaces (GUI) that incorporate gesture reactive interface objects
US9619105B1 (en) 2014-01-30 2017-04-11 Aquifi, Inc. Systems and methods for gesture based interaction with viewpoint dependent user interfaces
DE102015202499B4 (en) 2014-02-17 2023-01-19 pmdtechnologies ag Runtime camera with static gesture recognition
DE102014204423B4 (en) 2014-03-11 2021-06-02 pmdtechnologies ag Time of flight camera system
DE102014204424A1 (en) 2014-03-11 2015-09-17 Pmdtechnologies Gmbh Time of flight camera system
DE102015205826B4 (en) 2014-04-04 2020-03-12 pmdtechnologies ag Distance measuring system with time-of-flight pixel line
DE102015205840A1 (en) 2014-04-04 2015-10-08 Ifm Electronic Gmbh Distance measuring system with light time pixel line
DE102015205841B4 (en) 2014-04-04 2023-05-11 pmdtechnologies ag Distance measurement system with time-of-flight pixel line
DE102015205927B4 (en) 2014-04-07 2023-03-16 pmdtechnologies ag Distance measurement system with time-of-flight measurement and triangulation
DE102014207163A1 (en) 2014-04-15 2015-10-15 Pmdtechnologies Gmbh Time of flight camera system
DE102015204124A1 (en) 2014-05-02 2015-11-05 Ifm Electronic Gmbh Decoupling element for a light guide for guiding light on a light transit time sensor
DE102015207567A1 (en) 2014-05-02 2015-11-05 Ifm Electronic Gmbh Light shaping optics and light guide structure for a light transit time sensor
DE102015217314A1 (en) 2014-09-12 2016-03-17 Ifm Electronic Gmbh monitoring system
DE102015223674B4 (en) 2014-12-01 2023-09-07 pmdtechnologies ag Time-of-flight sensor for an optical range finder
DE102015223675B4 (en) 2014-12-01 2023-09-07 pmdtechnologies ag Time-of-flight sensor for an optical range finder
DE102014225046A1 (en) 2014-12-05 2016-06-09 Ifm Electronic Gmbh Camera system with modulated color illumination
DE102015224615A1 (en) 2014-12-15 2016-06-16 Pmdtechnologies Gmbh Light transit time sensor with transistor sharing
DE102015222380B4 (en) 2014-12-16 2021-12-09 pmdtechnologies ag Distance measurement system
DE102015222379A1 (en) 2014-12-16 2016-06-16 Ifm Electronic Gmbh Distance measuring system
DE102015222381A1 (en) 2014-12-16 2016-06-16 Ifm Electronic Gmbh Distance measuring system
US10773329B2 (en) 2015-01-20 2020-09-15 Illinois Tool Works Inc. Multiple input welding vision system
DE102015202501B4 (en) 2015-02-12 2022-01-13 pmdtechnologies ag time-of-flight sensor
WO2016144741A1 (en) 2015-03-06 2016-09-15 Illinois Tool Works Inc. Sensor assisted head mounted displays for welding
US9977242B2 (en) 2015-03-26 2018-05-22 Illinois Tool Works Inc. Control of mediated reality welding system based on lighting conditions
DE102016205073B4 (en) 2015-03-30 2021-08-26 pmdtechnologies ag Time of flight sensor
US10419723B2 (en) 2015-06-25 2019-09-17 Magna Electronics Inc. Vehicle communication system with forward viewing camera and integrated antenna
US10137904B2 (en) 2015-10-14 2018-11-27 Magna Electronics Inc. Driver assistance system with sensor offset correction
US11027654B2 (en) 2015-12-04 2021-06-08 Magna Electronics Inc. Vehicle vision system with compressed video transfer via DSRC link
US10191154B2 (en) 2016-02-11 2019-01-29 Massachusetts Institute Of Technology Methods and apparatus for time-of-flight imaging
US10703204B2 (en) 2016-03-23 2020-07-07 Magna Electronics Inc. Vehicle driver monitoring system
US10571562B2 (en) 2016-03-25 2020-02-25 Magna Electronics Inc. Vehicle short range sensing system using RF sensors
DE102017203564A1 (en) 2016-04-11 2017-10-12 pmdtechnologies ag Time of flight camera system with function monitoring
US10190983B2 (en) 2016-04-15 2019-01-29 Massachusetts Institute Of Technology Methods and apparatus for fluorescence lifetime imaging with pulsed light
US10534081B2 (en) 2016-05-02 2020-01-14 Magna Electronics Inc. Mounting system for vehicle short range sensors
US10040481B2 (en) 2016-05-17 2018-08-07 Magna Electronics Inc. Vehicle trailer angle detection system using ultrasonic sensors
US10768298B2 (en) 2016-06-14 2020-09-08 Magna Electronics Inc. Vehicle sensing system with 360 degree near range sensing
EP3482503A4 (en) 2016-07-08 2020-03-04 Magna Electronics Inc. 2D MIMO radar system for vehicle
US10239446B2 (en) 2016-07-13 2019-03-26 Magna Electronics Inc. Vehicle sensing system using daisy chain of sensors
US10708227B2 (en) 2016-07-19 2020-07-07 Magna Electronics Inc. Scalable secure gateway for vehicle
US10641867B2 (en) 2016-08-15 2020-05-05 Magna Electronics Inc. Vehicle radar system with shaped radar antennas
US10852418B2 (en) 2016-08-24 2020-12-01 Magna Electronics Inc. Vehicle sensor with integrated radar and image sensors
US10677894B2 (en) 2016-09-06 2020-06-09 Magna Electronics Inc. Vehicle sensing system for classification of vehicle model
US10836376B2 (en) 2016-09-06 2020-11-17 Magna Electronics Inc. Vehicle sensing system with enhanced detection of vehicle angle
US10347129B2 (en) 2016-12-07 2019-07-09 Magna Electronics Inc. Vehicle system with truck turn alert
US10462354B2 (en) 2016-12-09 2019-10-29 Magna Electronics Inc. Vehicle control system utilizing multi-camera module
US10703341B2 (en) 2017-02-03 2020-07-07 Magna Electronics Inc. Vehicle sensor housing with theft protection
US10782388B2 (en) 2017-02-16 2020-09-22 Magna Electronics Inc. Vehicle radar system with copper PCB
US11536829B2 (en) 2017-02-16 2022-12-27 Magna Electronics Inc. Vehicle radar system with radar embedded into radome
US11142200B2 (en) 2017-02-23 2021-10-12 Magna Electronics Inc. Vehicular adaptive cruise control with enhanced vehicle control
US10884103B2 (en) 2017-04-17 2021-01-05 Magna Electronics Inc. Calibration system for vehicle radar system
CN108955584B (en) * 2017-05-23 2021-02-05 上海汽车集团股份有限公司 Pavement detection method and device
US10870426B2 (en) 2017-06-22 2020-12-22 Magna Electronics Inc. Driving assistance system with rear collision mitigation
CN208376630U (en) 2017-06-30 2019-01-15 麦格纳电子(张家港)有限公司 The vehicle vision system communicated with trailer sensor
US11150342B2 (en) 2017-09-07 2021-10-19 Magna Electronics Inc. Vehicle radar sensing system with surface segmentation using interferometric statistical analysis
US10962641B2 (en) 2017-09-07 2021-03-30 Magna Electronics Inc. Vehicle radar sensing system with enhanced accuracy using interferometry techniques
US10877148B2 (en) 2017-09-07 2020-12-29 Magna Electronics Inc. Vehicle radar sensing system with enhanced angle resolution using synthesized aperture
US10962638B2 (en) 2017-09-07 2021-03-30 Magna Electronics Inc. Vehicle radar sensing system with surface modeling
US10933798B2 (en) 2017-09-22 2021-03-02 Magna Electronics Inc. Vehicle lighting control system with fog detection
US11391826B2 (en) 2017-09-27 2022-07-19 Magna Electronics Inc. Vehicle LIDAR sensor calibration system
US11486968B2 (en) 2017-11-15 2022-11-01 Magna Electronics Inc. Vehicle Lidar sensing system with sensor module
US10816666B2 (en) 2017-11-21 2020-10-27 Magna Electronics Inc. Vehicle sensing system with calibration/fusion of point cloud partitions
US11167771B2 (en) 2018-01-05 2021-11-09 Magna Mirrors Of America, Inc. Vehicular gesture monitoring system
US11199611B2 (en) 2018-02-20 2021-12-14 Magna Electronics Inc. Vehicle radar system with T-shaped slot antennas
US11047977B2 (en) 2018-02-20 2021-06-29 Magna Electronics Inc. Vehicle radar system with solution for ADC saturation
CN109212546B (en) * 2018-09-27 2021-05-18 北京伟景智能科技有限公司 Method and device for calculating depth direction measurement error of binocular camera
US11808876B2 (en) 2018-10-25 2023-11-07 Magna Electronics Inc. Vehicular radar system with vehicle to infrastructure communication
US11683911B2 (en) 2018-10-26 2023-06-20 Magna Electronics Inc. Vehicular sensing device with cooling feature
US11638362B2 (en) 2018-10-29 2023-04-25 Magna Electronics Inc. Vehicular radar sensor with enhanced housing and PCB construction
US11454720B2 (en) 2018-11-28 2022-09-27 Magna Electronics Inc. Vehicle radar system with enhanced wave guide antenna system
US11096301B2 (en) 2019-01-03 2021-08-17 Magna Electronics Inc. Vehicular radar sensor with mechanical coupling of sensor housing
US11332124B2 (en) 2019-01-10 2022-05-17 Magna Electronics Inc. Vehicular control system
US11294028B2 (en) 2019-01-29 2022-04-05 Magna Electronics Inc. Sensing system with enhanced electrical contact at PCB-waveguide interface
US11609304B2 (en) 2019-02-07 2023-03-21 Magna Electronics Inc. Vehicular front camera testing system
US11333739B2 (en) 2019-02-26 2022-05-17 Magna Electronics Inc. Vehicular radar system with automatic sensor alignment
US11267393B2 (en) 2019-05-16 2022-03-08 Magna Electronics Inc. Vehicular alert system for alerting drivers of other vehicles responsive to a change in driving conditions
DE112021000497T5 (en) 2020-01-10 2022-11-24 Magna Electronics, Inc. Communication System and Procedures
US11823395B2 (en) 2020-07-02 2023-11-21 Magna Electronics Inc. Vehicular vision system with road contour detection feature
US11749105B2 (en) 2020-10-01 2023-09-05 Magna Electronics Inc. Vehicular communication system with turn signal identification
DE102021111602A1 (en) 2021-05-05 2022-11-10 Ifm Electronic Gmbh Computer-implemented method for correcting artifacts in measurement data generated by a time-of-flight 3D sensor, a corresponding computer program, a corresponding computer-readable medium and a PMD detector
DE102021112402A1 (en) 2021-05-12 2022-11-17 Ifm Electronic Gmbh time-of-flight sensor

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2265514B (en) * 1992-03-28 1995-08-16 Marconi Gec Ltd A receiver-transmitter for a target identification system
US5831719A (en) * 1996-04-12 1998-11-03 Holometrics, Inc. Laser scanning system
DE19821974B4 (en) 1998-05-18 2008-04-10 Schwarte, Rudolf, Prof. Dr.-Ing. Apparatus and method for detecting phase and amplitude of electromagnetic waves
FR2780163B1 (en) * 1998-06-18 2000-08-11 Agence Spatiale Europeenne INCOHERENT LASER DOPPLER TELESCOPY SYSTEM

Cited By (49)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7248344B2 (en) 2001-04-04 2007-07-24 Instro Precision Limited Surface profile measurement
GB2374743A (en) * 2001-04-04 2002-10-23 Instro Prec Ltd Surface profile measurement
US20030147002A1 (en) * 2002-02-06 2003-08-07 Eastman Kodak Company Method and apparatus for a color sequential scannerless range imaging system
WO2003085413A2 (en) * 2002-04-08 2003-10-16 Matsushita Electric Works, Ltd. Three dimensional image sensing device using intensity modulated light
WO2003085413A3 (en) * 2002-04-08 2004-04-15 Matsushita Electric Works Ltd Three dimensional image sensing device using intensity modulated light
US20050145773A1 (en) * 2002-04-08 2005-07-07 Yusuke Hashimoto Spatial information detecting device using intensity-modulated light
US7119350B2 (en) 2002-04-08 2006-10-10 Matsushita Electric Works, Ltd. Spatial information detecting device using intensity-modulated light and a beat signal
US20060119833A1 (en) * 2003-02-19 2006-06-08 Leica Geosystems Ag Method and device for deriving geodetic distance data
US7982859B2 (en) * 2003-02-19 2011-07-19 Leica Geosystems Ag Method and device for deriving geodetic distance data
US7656509B2 (en) * 2006-05-24 2010-02-02 Pixeloptics, Inc. Optical rangefinder for an electro-active lens
US20070280626A1 (en) * 2006-05-24 2007-12-06 Haddock Joshua N Optical rangefinder for an electro-active lens
US10743512B2 (en) 2006-09-05 2020-08-18 Maasland N.V. Implement for automatically milking a dairy animal
NL1032435C2 (en) * 2006-09-05 2008-03-06 Maasland Nv Device for automatically milking a dairy animal.
AU2007293812B2 (en) * 2006-09-05 2010-11-25 Maasland N.V. Implement for automatically milking a dairy animal
US8807080B2 (en) 2006-09-05 2014-08-19 Maasland N.V. Implement for automatically milking a dairy animal
WO2008030086A1 (en) * 2006-09-05 2008-03-13 Maasland N.V. Implement for automatically milking a dairy animal
RU2473211C2 (en) * 2006-09-05 2013-01-27 Масланд Н.В. Device for automatic milking of dairy cattle
US20100186675A1 (en) * 2006-09-05 2010-07-29 Maasland N.V. Implement for automatically milking a dairy animal
US10039259B2 (en) 2006-09-05 2018-08-07 Maasland N.V. Implement for automatically milking a dairy animal
US10750712B2 (en) 2006-09-05 2020-08-25 Maasland N.V. Implement for automatically milking a dairy animal
US20110188028A1 (en) * 2007-10-02 2011-08-04 Microsoft Corporation Methods and systems for hierarchical de-aliasing time-of-flight (tof) systems
US8629976B2 (en) * 2007-10-02 2014-01-14 Microsoft Corporation Methods and systems for hierarchical de-aliasing time-of-flight (TOF) systems
WO2011051286A1 (en) * 2009-10-28 2011-05-05 Ifm Electronic Gmbh Camera system
CN102298149A (en) * 2010-06-25 2011-12-28 原相科技股份有限公司 Power-saving time-difference ranging system and method capable of improving precision and motion-detection efficiency
WO2012012607A3 (en) * 2010-07-21 2012-05-10 Microsoft Corporation Methods and systems for hierarchical de-aliasing time-of-flight (tof) systems
CN102393515A (en) * 2010-07-21 2012-03-28 微软公司 Method and system for lossless dealiasing in time-of-flight (TOF) systems
US9462168B2 (en) 2010-07-28 2016-10-04 Ifm Electronic Gmbh Light propagation time camera system having signal path monitoring
US10085310B2 (en) 2010-11-30 2018-09-25 Stmicroelectronics (Research & Development) Limited Application using a single photon avalanche diode (SPAD)
GB2485991A (en) * 2010-11-30 2012-06-06 St Microelectronics Res & Dev Camera using a Single Photon Avalanche Diode (SPAD) array
GB2486165A (en) * 2010-11-30 2012-06-13 St Microelectronics Res & Dev Oven using a Single Photon Avalanche Diode (SPAD) array
GB2485997A (en) * 2010-11-30 2012-06-06 St Microelectronics Res & Dev Camera using a Single Photon Avalanche Diode (SPAD) array
US11862035B2 (en) 2015-03-09 2024-01-02 Illinois Tool Works Inc. Methods and apparatus to provide visual information associated with welding operations
US11545045B2 (en) 2015-03-09 2023-01-03 Illinois Tool Works Inc. Methods and apparatus to provide visual information associated with welding operations
CN111702306A (en) * 2015-06-24 2020-09-25 伊利诺斯工具制品有限公司 Time-of-flight camera for welding machine vision
US11679452B2 (en) 2015-06-24 2023-06-20 Illinois Tool Works Inc. Wind turbine blade and wind turbine power generating apparatus
US11035935B2 (en) * 2015-11-03 2021-06-15 Hexagon Technology Center Gmbh Optoelectronic surveying device
US20170123052A1 (en) * 2015-11-03 2017-05-04 Hexagon Technology Center Gmbh Optoelectronic surveying device
JP2019506768A (en) * 2015-12-16 2019-03-07 フェイスブック・テクノロジーズ・リミテッド・ライアビリティ・カンパニーFacebook Technologies, Llc Range gate type depth camera parts
US10234561B2 (en) 2016-05-09 2019-03-19 Microsoft Technology Licensing, Llc Specular reflection removal in time-of-flight camera apparatus
US10302768B2 (en) 2016-05-09 2019-05-28 Microsoft Technology Licensing, Llc Multipath signal removal in time-of-flight camera apparatus
US10020813B1 (en) 2017-01-09 2018-07-10 Microsoft Technology Licensing, Llc Scaleable DLL clocking system
US10320399B2 (en) 2017-01-09 2019-06-11 Microsoft Technology Licensing, Llc Scaleable DLL clocking system
CN109643523A (en) * 2017-07-14 2019-04-16 深圳市汇顶科技股份有限公司 Pixel circuit and image sensing
US11450233B2 (en) 2019-02-19 2022-09-20 Illinois Tool Works Inc. Systems for simulating joining operations using mobile devices
US11521512B2 (en) 2019-02-19 2022-12-06 Illinois Tool Works Inc. Systems for simulating joining operations using mobile devices
US11645936B2 (en) 2019-11-25 2023-05-09 Illinois Tool Works Inc. Weld training simulations using mobile devices, modular workpieces, and simulated welding equipment
US11721231B2 (en) 2019-11-25 2023-08-08 Illinois Tool Works Inc. Weld training simulations using mobile devices, modular workpieces, and simulated welding equipment
US20210156881A1 (en) * 2019-11-26 2021-05-27 Faro Technologies, Inc. Dynamic machine vision sensor (dmvs) that performs integrated 3d tracking
US11818484B2 (en) 2020-11-26 2023-11-14 Samsung Electronics Co., Ltd. Imaging device

Also Published As

Publication number Publication date
US6587186B2 (en) 2003-07-01

Similar Documents

Publication Publication Date Title
US6587186B2 (en) CMOS-compatible three-dimensional image sensing using reduced peak energy
US6580496B2 (en) Systems for CMOS-compatible three-dimensional image sensing using quantum efficiency modulation
US6515740B2 (en) Methods for CMOS-compatible three-dimensional image sensing using quantum efficiency modulation
US7379163B2 (en) Method and system for automatic gain control of sensors in time-of-flight systems
Hussmann et al. A performance review of 3D TOF vision systems in comparison to stereo vision systems
EP2729826B1 (en) Improvements in or relating to the processing of time-of-flight signals
US6323942B1 (en) CMOS-compatible three-dimensional image sensor IC
US10000000B2 (en) Coherent LADAR using intra-pixel quadrature detection
CN111758047B (en) Single chip RGB-D camera
KR101722641B1 (en) 3D image acquisition apparatus and method of extractig depth information in the 3D image acquisition apparatus
JP4533582B2 (en) A CMOS compatible 3D image sensing system using quantum efficiency modulation
US20160299218A1 (en) Time-of-flight-based systems using reduced illumination duty cycles
KR20030040490A (en) System and method for signal acquisition in a distance meter
US11393115B2 (en) Filtering continuous-wave time-of-flight measurements, based on coded modulation images
US11531094B2 (en) Method and system to determine distance using time of flight measurement comprising a control circuitry identifying which row of photosensitive image region has the captured image illumination stripe
CN111123289B (en) Depth measuring device and measuring method
CN111708039A (en) Depth measuring device and method and electronic equipment
US20210041541A1 (en) Phase anti-aliasing using spread-spectrum techniques in an optical distance measurement system
KR101145132B1 (en) The three-dimensional imaging pulsed laser radar system using geiger-mode avalanche photo-diode focal plane array and auto-focusing method for the same
JP4401989B2 (en) 3D image information acquisition system
CN113366383A (en) Camera device and automatic focusing method thereof
WO2021034409A1 (en) Depth sensor with interlaced sampling structure
US4979816A (en) Range sensing system
Christie et al. Design and development of a multi-detecting two-dimensional ranging sensor
JP2001268445A (en) Photosensor and three-dimensional shape measuring device

Legal Events

Date Code Title Description
AS Assignment

Owner name: CANESTA, INC., CALIFORNIA

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:BAMJI, CYRUS;CHARBON, EDOARDO;REEL/FRAME:011906/0694

Effective date: 20010606

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

REFU Refund

Free format text: REFUND - PAYMENT OF MAINTENANCE FEE, 8TH YR, SMALL ENTITY (ORIGINAL EVENT CODE: R2552); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: REFUND - 7.5 YR SURCHARGE - LATE PMT W/IN 6 MO, SMALL ENTITY (ORIGINAL EVENT CODE: R2555); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FEPP Fee payment procedure

Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

REMI Maintenance fee reminder mailed
AS Assignment

Owner name: MICROSOFT CORPORATION, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:CANESTA, INC.;REEL/FRAME:025790/0458

Effective date: 20101122

FPAY Fee payment

Year of fee payment: 8

SULP Surcharge for late payment

Year of fee payment: 7

AS Assignment

Owner name: MICROSOFT TECHNOLOGY LICENSING, LLC, WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:MICROSOFT CORPORATION;REEL/FRAME:034541/0001

Effective date: 20141014

FPAY Fee payment

Year of fee payment: 12