|Publication number||US7312766 B1|
|Application number||US 10/088,747|
|Publication date||Dec 25, 2007|
|Filing date||Sep 22, 2000|
|Priority date||Sep 22, 2000|
|Publication number||088747, 10088747, PCT/2000/1063, PCT/CA/0/001063, PCT/CA/0/01063, PCT/CA/2000/001063, PCT/CA/2000/01063, PCT/CA0/001063, PCT/CA0/01063, PCT/CA0001063, PCT/CA001063, PCT/CA2000/001063, PCT/CA2000/01063, PCT/CA2000001063, PCT/CA200001063, US 7312766 B1, US 7312766B1, US-B1-7312766, US7312766 B1, US7312766B1|
|Inventors||Eric C. Edwards|
|Original Assignee||Canadian Space Agency|
|Patent Citations (15), Non-Patent Citations (1), Referenced by (54), Classifications (17), Legal Events (3)|
The present invention generally relates to telepresence systems and more particularly relates to motion compensation in telepresence systems.
The field of remote control has come a long way since the days of watching a model aircraft fly under the control of a handheld controller. Robotics and remote robotic manipulation have created a strong and pressing need for better remote control systems. Obviously, an ideal form of remote control involves providing an operator with all the sensations of operating the remote robot without the inherent dangers, travel, and so forth. In order to achieve this, a telepresence system is used.
Telepresence systems are sensory feedback systems for allowing sensing and monitoring of remote systems. A typical telepresence sensor is a camera paired with a head mounted display. The system provides visual feedback from a remote location to an operator. For example, in a telepresence system for an automobile, the front windshield is provided with a camera. The controls of the vehicle are provided with actuators for automatically manipulating same. An operator is provided with a duplicate of the cabin of the car. The windshield is replaced with a display, and the controls are linked via communications to the actuators within the vehicle. Turning the steering wheel in the duplicate cabin causes the steering wheel in the vehicle to turn. Similarly, the camera captures images in front of the vehicle, and these are displayed on the display in the duplicate cabin.
Presently, there is a trend toward providing the visual feedback using a head mounted display (HMD). A head mounted display is one small display, or two, mounted for being worn on a user's head. Advantageously, an HMD with two displays provides stereo imaging, allowing a user to perceive depth of field. Alternatively, such an HMD provides two identical images, one to each display. Unfortunately, the head mounted display only presents a user with information from approximately in front of the user. Thus, when a user turns their head, the image seen and the expected image differ. Therefore, the camera is mounted on a mechanism which moves in accordance with detected HMD movement, so that the image before the user is in accordance with the user's head position.
Generally, it is an object of telepresence systems to provide a visual sensation of being in the place of the robot and a control system for controlling the robot as well. Thus, telepresence systems aim to provide feedback that is appropriate to different situations.
Unfortunately, a camera does not move in exact synchronisation with the HMD so the image is not perfectly aligned with the expectations of the user during head motion. This misalignment can result in disorientation and nausea on the part of an operator.
The disclosures in U.S. Pat. No. 5,579,026 issued on Nov. 26, 1996 in the name of Tabata and in U.S. Pat. No. 5,917,460 issued on Jun. 29, 1999 in the name of Kodama focus on image display for use in, for example, virtual reality and games. These patents describe a head mounted display in which the position of the projected image can be displaced in response to a control unit or in response to the rotational motion of the operator's head. The essence of the head-tracking implementation is that, from the user's perspective, the image can be made to remain substantially stationary in space during head movements by being manipulated in a manner opposite to the movements. Significantly, the patents do not relate to visual telepresence using slaved cameras. In the slaved camera implementation, the camera should follow the motion of the HMD and, as such, compensation for HMD motion is unnecessary since the image is always of a direction in which the head is directed.
Further, because U.S. Pat. No. 5,579,026 relates to displaying a simulated planar image, such as a simulation of a television screen located in virtual space in front of the user, the patent provides for a fixed frame of reference relative to a wearer of the HMD. The images in any direction are simulated and thus formed as needed. Unfortunately, in telepresence systems, the video data relating to a particular direction of view is often unavailable. This complicates the system significantly and, as such, the prior art relating to video data display is not truly applicable, and one of skill in the art would not refer to it.
In U.S. Pat. No. 5,917,460 issued on Jun. 29, 1999 in the name of Kodama a system addressing the three-axes displacement (up/down, left/right, frontwards/backwards) of an HMD is provided. The displacement appears to be linear and is accommodated through a mechanical mechanism. The displays are moved in response to detected movement of the head and, as such, objects remain somewhat stationary from the visual perspective of the user.
It is not well suited to use in telepresence wherein a camera tracks the motion of the HMD. One of skill in the art, absent hindsight, would not be drawn to maintaining a visual reference when a head is turned, for a telepresence system wherein a camera is rotated in response to head movement. Of course, the different problem results in a different solution.
For example, in telepresence systems, the delay between camera image capture and head motion is often indeterminate. It is not a workable solution to implement the system of the above referenced patents to solve this problem. Because of the unknown delays caused by camera response time and communication delays, the solution is not trivial.
In U.S. Pat. No. 5,933,125 a system is disclosed using prediction of head movement to pre-compensate for the delay expected in the generation of a virtual image, nominally in a simulated environment. By this means, a time lag in the generation of imagery is compensated for by shifting the scene to provide a stable visual frame of reference. This method is applicable to short delays and small displacements, where head tracking information can be used to predict the next head position with reasonable accuracy. The patent discloses 100 msec as a normal value. Effective prediction of head motion is aided by comprehensive information about head movement, including angular head velocity and angular acceleration. For small head movements, the induced errors are small and typically occur over a short period of time. The disclosed embodiments rely on knowledge of the time delay, which is nominally considered to be constant. Unfortunately, when the time delays grow large enough to allow substantial motion of the head, the errors in the predictive algorithm are unknown and the system is somewhat unworkable.
Furthermore, U.S. Pat. No. 5,933,125 cannot compensate for unanticipated image movement, only that which occurs in correct response to the operator's head movement. Also, it does not relate to visual telepresence systems using remote slave cameras.
It would be highly advantageous to provide a system that does not rely on any form of prediction for compensation and which works with variable delays between image capture and image display.
In order to overcome these and other shortcomings of the prior art, it is an object of the invention to provide a method of compensating for time delays between head motion and camera motion in telepresence systems.
The invention relates to a method and apparatus that provides a wearer of an HMD with a stable frame of visual reference in cases where there may be time delays or unwanted motion within the visual capture/visual display systems.
According to the invention, in order to eliminate some of the disorientation caused by time delays in camera motion when a head motion occurs, an image shown on the display of a head-mounted display (HMD) is offset relative to the field of view of the HMD until the camera position is again synchronised with the HMD position. Offsetting of the image results in areas of the display for which no image information is available. These display areas are provided with fill data in the form of a solid shading or some feature set for providing visual cues. When the transformed images again overlap the display, the fill is no longer necessary.
In accordance with the invention there is provided a method of motion compensation for head mounted displays. The method includes the following steps: providing an image from an image capture device to a head mounted display including a monitor having a field of view; providing camera position data associated with the image; providing head position data; adjusting the image location relative to the field of view of the monitor in accordance with the camera position data and the head position data; and, displaying portions of the image at the adjusted locations, those portions remaining within the field of view.
Typically position data includes at least one of orientation data and location data. Location data is also referred to as displacement data. Typically, portions of the field of view without image data are filled with a predetermined fill. When none of the image data is for display within the field of view, the entire field of view is filled with the predetermined fill.
For example, the image is adjusted by the following steps: determining an offset between the head mounted display position and the camera position; and, offsetting the image such that it is offset an amount equal to the offset between the head mounted display position and the camera position.
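The offset determination described above can be sketched in a few lines. This is an illustrative sketch only; the two-axis (yaw/pitch) representation and the helper names are assumptions, not details taken from the patent:

```python
# Sketch of the claimed compensation step: the displayed image is offset
# by the difference between the sensed HMD position and the camera
# position associated with the received frame. Names are illustrative.

def compute_offset(hmd_orientation, camera_orientation):
    """Angular offset (degrees) between HMD and camera orientations,
    given as (yaw, pitch) pairs."""
    return (hmd_orientation[0] - camera_orientation[0],
            hmd_orientation[1] - camera_orientation[1])

# The HMD has turned 5 degrees while the camera still points straight
# ahead; the image is shifted within the field of view by this offset
# until the camera catches up and the offset returns to zero.
offset = compute_offset((5.0, 0.0), (0.0, 0.0))
```

When the camera reaches the commanded orientation, `compute_offset` returns (0.0, 0.0) and the image is displayed without adjustment.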
Advantageously, such a system is not limited by the accuracy of a predictive process nor by the time delay between image capture and image display. Instead, it is reactive, and uses sensed information on HMD position and camera position to formulate a transformation for the captured image. The present invention has no limit to the time delays for which compensation is possible since the required head position information and camera position information are sensed at different times allowing compensation for any delay between sensing one and then sensing the other.
Further advantageously, the present invention requires no knowledge of the time delay in the system and functions properly in the presence of non-constant time delays. There is no requirement that the time delay be measured and it is not used in determining the transform of the image.
The invention will now be described in conjunction with the drawings in which:
The present invention is described with reference to telepresence systems operating over long distances such that significant time delays occur between head motion and image display of an image for a current head orientation. It is, however, equally applicable when a camera drive mechanism provides insufficient response rate to allow comfortable viewing of images during normal head motion. It is also applicable in situations where unwanted and unmodeled motion of the camera is possible, such as when the camera is mounted on a moving platform.
When the mechanism 13 for pointing the camera is physically coupled to the first computer 7, the camera 11 begins to move when HMD motion is detected. The lag between camera motion and HMD motion is determined by communication delays, which are very small, processing delays, which may be minimal, and pointing mechanism performance, which varies. These delays often result in an image provided from the camera 11 remaining static while the HMD 1 is in motion or moving substantially slower than the HMD motion. Of course, since the operator's mind expects motion within the visual frame, this is disconcerting and often results in nausea and disorientation.
This problem is even more notable when communication delay times are significant, such as during terrestrial control of systems in space. There, the delay is on the order of seconds and, as such, the disorientation of an operator during HMD motion is significant. Significantly, disorientation is a cause of operator fatigue, limiting the time an operator can use a system during a day.
Referring again to
The HMD position values are used to determine a current HMD orientation in a coordinate space analogous to that of the camera 11. As such, an offset between camera orientation and HMD orientation is determinable. Since the HMD 1 is being worn by an operator 5, the HMD orientation is directly correlated to the position of the head of the operator 5. Of course, the direct correlation is related to sensed position data and in use is generally an approximate direct correlation due to a refresh rate of the HMD position sensor. The offset between the camera orientation and the HMD orientation is related to a delay between the local system and the remote system.
Therefore, when a non-zero offset is determined, the first computer offsets the image provided by the camera relative to the field of view of the HMD in order to compensate for the determined offset. Referring to
Though the described instantaneous corrections shown in
The image and position data are then transmitted to the first computer 7. When the image and position data are received, they are prepared for processing at the first computer 7. Then, the position data of the HMD 1 is acquired by the first computer 7 and is used to transform the image in accordance with the invention. The transformed image is provided to the display and is displayed thereon to the operator 5. Because the HMD position data is gathered immediately before it is needed, the delay between HMD position data capture and display of the transformed image is very small and results in little or no operator disorientation.
Concurrently, position data is provided to the mechanism 13 at intervals and the mechanism moves the camera 11 in accordance with received position data and a current orientation of the camera 11.
Typically, the step of transforming the image comprises the following steps, some of which are performed in advance. A correlation between angular movement and display or image pixels is determined such that an offset of α degrees results in displacement of the image by N pixels in a first direction and by M pixels in a second other direction. A transform for rotating the image based on rotations is also determined. Preferably, the transforms are sufficiently simple to provide fast image processing. That said, a small image processing delay is acceptable, since it forms substantially all of the delay in displaying the data.
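The angle-to-pixel correlation might be sketched as below. The calibration constants, fill value, and sign convention are assumptions for illustration; a real system would calibrate pixels-per-degree from the display and camera optics:

```python
# Illustrative pixel-shift transform: a calibrated pixels-per-degree
# factor maps the angular offset onto a translation of the image within
# the field of view; uncovered pixels receive a solid fill.
PIXELS_PER_DEGREE_X = 20  # assumed calibration values
PIXELS_PER_DEGREE_Y = 20
FILL = 0  # solid shading for areas without image data

def shift_image(image, offset_deg):
    """Translate a 2-D greyscale image (list of rows) by an angular
    (yaw, pitch) offset, filling uncovered regions with FILL."""
    dx = round(offset_deg[0] * PIXELS_PER_DEGREE_X)
    dy = round(offset_deg[1] * PIXELS_PER_DEGREE_Y)
    h, w = len(image), len(image[0])
    out = [[FILL] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            sy, sx = y + dy, x + dx
            # copy only source pixels that fall inside the image
            if 0 <= sy < h and 0 <= sx < w:
                out[y][x] = image[sy][sx]
    return out
```

With this convention, an offset large enough that no source pixel lands in the field of view yields an entirely filled display, matching the behaviour described earlier.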
Once the image data is received, it is stored in memory for fast processing thereof. The HMD position data is acquired and is compared to the camera position data. The difference is used in performing the transform to correct the image position for any HMD motion unaccounted for by the mechanism 13, as of yet. Also, the method corrects for unintentional movements of the camera 11 when the camera position sensor is independent of the mechanism 13, for example with an inertial position sensor.
In the above embodiment, a general purpose processor is used to transform the image. In an alternative embodiment, a hardware implementation of the transform is used. A hardware implementation is less easily modified, but has a tremendous impact on performance. Using parallel hardware transformation processors, an image can be transformed in a small fraction of the time necessary for performing a software transformation of the image.
The computer 107 uses the camera position data and the image along with data received from the head tracker 103 to transform the image in accordance with the invention. As is evident, the delay between HMD motion and camera motion is measurable in seconds. The delay between camera image capture and receipt of the image at the computer 107 is also measurable in seconds. As such, significant disorientation of the user results absent application of the present invention.
Alternatively, the camera captures images of areas larger than can be displayed and only a portion of the image is displayed. This is considered less preferable since it increases the bandwidth requirements, often to no benefit, because the additional data is not displayed.
Advantageously, when implemented with independent position indicators for each of the HMD 1 and the camera 11 and independent from the mechanism 13 for moving the camera, all types of motion are compensated for including inaccuracies of the mechanism 13, delays induced by communications, delays induced by the mechanism 13, processing delays, fine operator motions and so forth.
When processing is done local to the HMD or on a computer at a same location with minimal delays therebetween, each image is accurately aligned on the display within a time delay error related only to the processing and the delay in reading HMD sensor data.
Thus, discontinuous scene changes are converted into smooth transitions in accordance with the expected visual result.
It is also within the scope of the invention to process the image data prior to display thereof in order to determine features or locations within the image data to highlight or indicate within the displayed image. For example, contrast may be improved for generally light or dark images. Also, features may be identified and labeled or highlighted. Alternatively, icons or other images are superimposed on the displayed image without processing thereof.
Alternatively, the control values are determined in the mechanism for pointing the camera instead of by the first computer. In such an embodiment, the HMD position data is transmitted to the remote system wherein a camera movement related to the HMD movement is determined and initiated.
The above described embodiment compensates for orientation—motion about any of three rotational axes. Alternatively, the invention compensates for displacement—linear motion along an axis. Further alternatively, the invention compensates for both linear motion and motion about any of the rotational axes. Displacement and orientation are both forms of position and data relating to one or both is referred to here and in the claims, which follow, as position data.
The above described embodiment does not correct images for perspective distortion. Doing so is feasible within the concept of time/motion compensation according to the invention, however it is not generally applicable to use with a single camera, since the depth of field of the observed scene varies. It would require capturing of depth data using a range sensor or a three-dimensional vision system.
Though the above embodiment is described with reference to a physical communication link or a wireless communication link between different components, clearly, either is useful with the invention so long as it is practicable. Also, though the HMD is described as a computer peripheral, it could be provided with an internal processor and act as a stand alone device.
According to another embodiment of the invention, areas within the field of view that do not correspond to displayed image locations are filled with current image data relating to earlier captured images for those locations. Preferably, any earlier captured images are deemphasized within the field of view in order to prevent the operator from being confused by “stale” image data. For example, each image received from the camera is buffered with its associated position data. When some areas within the field of view are not occupied by image data, the processor determines another image having image data for those locations within the field of view, the locations determined in accordance with the transform performed based on the camera position data associated with the earlier captured image and with the current HMD position data. The image data is then displayed at the determined location(s) in a “transparent” fashion. For example, it may be displayed with a lower contrast appearing almost ghostlike. Alternatively, the colours are faded to provide this more ghostlike appearance. Further alternatively, it is displayed identically to the current image data.
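The "ghostlike" de-emphasis of stale image data can be sketched as follows. The mid-grey fade and the use of `None` to mark uncovered pixels are illustrative assumptions; any contrast- or colour-fading scheme described above would serve:

```python
# Sketch of the stale-fill variant: pixels uncovered by the current
# frame are filled from an earlier buffered frame, faded toward
# mid-grey so they read as "ghostlike". Greyscale values are 0-255.
GHOST_FACTOR = 0.5  # assumed de-emphasis factor

def deemphasize(pixel):
    """Fade a stale pixel toward mid-grey to lower its contrast."""
    return int(128 + (pixel - 128) * GHOST_FACTOR)

def fill_uncovered(display_row, stale_row):
    """Keep current data where available (None marks uncovered pixels);
    otherwise fall back to de-emphasized stale data."""
    return [p if p is not None else deemphasize(s)
            for p, s in zip(display_row, stale_row)]
```

Displaying the stale data identically to the current data, as the further alternative above notes, corresponds to a GHOST_FACTOR of 1.0.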
The above description is by way of example and is not intended to limit the foregoing claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4028725 *||Apr 21, 1976||Jun 7, 1977||Grumman Aerospace Corporation||High-resolution vision system|
|US4568159 *||Nov 26, 1982||Feb 4, 1986||The United States Of America As Represented By The Secretary Of The Navy||CCD Head and eye position indicator|
|US5307271 *||Sep 28, 1990||Apr 26, 1994||The United States Of America As Represented By The Secretary Of The Navy||Reflexive teleoperated control system for a remotely controlled vehicle|
|US5579026||May 13, 1994||Nov 26, 1996||Olympus Optical Co., Ltd.||Image display apparatus of head mounted type|
|US5684498 *||Jun 26, 1995||Nov 4, 1997||Cae Electronics Ltd.||Field sequential color head mounted display with suppressed color break-up|
|US5917460||Jul 5, 1995||Jun 29, 1999||Olympus Optical Company, Ltd.||Head-mounted type image display system|
|US5933125 *||Nov 27, 1995||Aug 3, 1999||Cae Electronics, Ltd.||Method and apparatus for reducing instability in the display of a virtual environment|
|US5978015 *||Oct 12, 1995||Nov 2, 1999||Minolta Co., Ltd.||Stereoscopic system with convergence and dioptric power adjustments according to object distance|
|US5980256 *||Feb 13, 1996||Nov 9, 1999||Carmein; David E. E.||Virtual reality system with enhanced sensory apparatus|
|US5984475 *||Dec 4, 1998||Nov 16, 1999||Mcgill University||Stereoscopic gaze controller|
|US6152854 *||Feb 22, 1999||Nov 28, 2000||Carmein; David E. E.||Omni-directional treadmill|
|US6307589 *||May 4, 1998||Oct 23, 2001||Francis J. Maquire, Jr.||Head mounted camera with eye monitor and stereo embodiments thereof|
|US6317127 *||Oct 16, 1996||Nov 13, 2001||Hughes Electronics Corporation||Multi-user real-time augmented reality system and method|
|US6327381 *||Jan 9, 1998||Dec 4, 2001||Worldscape, Llc||Image transformation and synthesis methods|
|US6580448 *||May 13, 1996||Jun 17, 2003||Leica Microsystems Ag||Process and device for the parallel capture of visual information|
|1||*||Hirose et al, "Transmission of Realistic Sensation: Development of Virtual Dome," Proc. IEEE VRAIS 93, IEEE Neural Networks Council, Piscataway, N.J., Jan. 1993, pp. 125-131.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7499893 *||Dec 5, 2005||Mar 3, 2009||Blue Oak Mountain Technologies, Inc.||System method for simulating conciousness|
|US7505607 *||Dec 17, 2004||Mar 17, 2009||Xerox Corporation||Identifying objects tracked in images using active device|
|US7731588 *||Sep 28, 2006||Jun 8, 2010||The United States Of America As Represented By The Secretary Of The Navy||Remote vehicle control system|
|US7810039 *||Mar 8, 2005||Oct 5, 2010||Canon Kabushiki Kaisha||Image displaying method, image displaying program, and display|
|US7867141 *||Jan 11, 2011||Panasonic Electric Works Co., Ltd.||Physical activity measuring system|
|US8040421||Jul 22, 2008||Oct 18, 2011||Fujifilm Corporation||Image display device, portable device with photography function, image display method and computer readable medium which controls the display based on an attitude angle of the display|
|US8063798 *||Nov 22, 2011||Honeywell International Inc.||Methods and apparatus to assist pilots under conditions in which spatial disorientation may be present|
|US8086551||Dec 27, 2011||Blue Oak Mountain Technologies, Inc.||Electronic system with simulated sense perception and method of providing simulated sense perception|
|US8088042 *||Dec 7, 2004||Jan 3, 2012||Elisa Oyj||Method, system, measurement device and receiving device for providing feedback|
|US8229163 *||Aug 22, 2008||Jul 24, 2012||American Gnc Corporation||4D GIS based virtual reality for moving target prediction|
|US8482649||Sep 7, 2011||Jul 9, 2013||Fujifilm Corporation||Image display device, portable device with photography function, image display method and computer readable medium|
|US8953057 *||May 22, 2007||Feb 10, 2015||Canon Kabushiki Kaisha||Display apparatus with image-capturing function, image processing apparatus, image processing method, and image display system|
|US8963804||Oct 30, 2008||Feb 24, 2015||Honeywell International Inc.||Method and system for operating a near-to-eye display|
|US9063566||Nov 30, 2011||Jun 23, 2015||Microsoft Technology Licensing, Llc||Shared collaboration using display device|
|US9086790||Jul 16, 2010||Jul 21, 2015||Canon Kabushiki Kaisha||Image displaying method, image displaying program, and display|
|US9245389 *||Dec 2, 2013||Jan 26, 2016||Sony Corporation||Information processing apparatus and recording medium|
|US9245428||Mar 14, 2013||Jan 26, 2016||Immersion Corporation||Systems and methods for haptic remote control gaming|
|US20050093891 *||Nov 4, 2003||May 5, 2005||Pixel Instruments Corporation||Image orientation apparatus and method|
|US20050223333 *||Mar 8, 2005||Oct 6, 2005||Canon Kabushiki Kaisha||Image displaying method, image displaying program, and display|
|US20060020174 *||Mar 7, 2005||Jan 26, 2006||Yoshihiro Matsumura||Physical activity measuring system|
|US20060133648 *||Dec 17, 2004||Jun 22, 2006||Xerox Corporation.||Identifying objects tracked in images using active device|
|US20060142986 *||Dec 5, 2005||Jun 29, 2006||Czora Gregory J||System method for simulating conciousness|
|US20070072662 *||Sep 28, 2006||Mar 29, 2007||Templeman James N||Remote vehicle control system|
|US20070135264 *||Dec 31, 2006||Jun 14, 2007||Outland Research, Llc||Portable exercise scripting and monitoring device|
|US20070258658 *||Apr 25, 2007||Nov 8, 2007||Toshihiro Kobayashi||Information processing apparatus and control method thereof, image processing apparatus, computer program, and storage medium|
|US20070268316 *||May 22, 2007||Nov 22, 2007||Canon Kabushiki Kaisha||Display apparatus with image-capturing function, image processing apparatus, image processing method, and image display system|
|US20070275826 *||May 18, 2007||Nov 29, 2007||Polar Electro Oy||Method and wrist device|
|US20080032870 *||Aug 2, 2006||Feb 7, 2008||Shen Yi Wu||Method and apparatus of counting steps for treadmill|
|US20080108481 *||Dec 7, 2004||May 8, 2008||Ilkka Limma||Method, System, Measurement Device and Receiving Device for Providing Feedback|
|US20080146416 *||Dec 13, 2006||Jun 19, 2008||Motorola, Inc.||Generation of user activity feedback|
|US20090086047 *||Jul 22, 2008||Apr 2, 2009||Fujifilm Corporation||Image display device, portable device with photography function, image display method and computer readable medium|
|US20090087029 *||Aug 22, 2008||Apr 2, 2009||American Gnc Corporation||4D GIS based virtual reality for moving target prediction|
|US20090119821 *||Nov 14, 2007||May 14, 2009||Jeffery Neil Stillwell||Belt with ball mark repair tool|
|US20090295602 *||Dec 3, 2009||Honeywell International Inc.||Methods and apparatus to assist pilots under conditions in which spatial disorientation may be present|
|US20100109975 *||Oct 30, 2008||May 6, 2010||Honeywell International Inc.||Method and system for operating a near-to-eye display|
|US20100309224 *||Dec 9, 2010||Canon Kabushiki Kaisha||Image displaying method, image displaying program, and display|
|US20120162411 *||Jun 28, 2012||Electronics And Telecommunications Research Institute||Method and apparatus for operation of moving object in unstructured environment|
|US20120218408 *||Oct 23, 2009||Aug 30, 2012||Yujian Wang||Method and system for improving video surveillance|
|US20140160129 *||Dec 2, 2013||Jun 12, 2014||Sony Corporation||Information processing apparatus and recording medium|
|US20140176591 *||Dec 26, 2012||Jun 26, 2014||Georg Klein||Low-latency fusing of color image data|
|US20150015707 *||Jul 10, 2013||Jan 15, 2015||Subc Control Limited||Telepresence method and system for tracking head movement of a user|
|US20150015708 *||Jul 10, 2013||Jan 15, 2015||Subc Control Limited||Telepresence method and system for supporting out of range motion|
|US20150309311 *||Jul 25, 2014||Oct 29, 2015||Lg Electronics Inc.||Head mounted display and method for controlling the same|
|US20150379772 *||May 5, 2015||Dec 31, 2015||Samsung Display Co., Ltd.||Tracking accelerator for virtual and augmented reality displays|
|CN103869468A *||Dec 3, 2013||Jun 18, 2014||Sony Corporation||Information processing device and recording medium|
|DE102014015871A1||Oct 25, 2014||Apr 28, 2016||Audi Ag||Display system for a motor vehicle, motor vehicle having a display system, and method for operating a display system|
|EP2131153A2 *||May 29, 2009||Dec 9, 2009||Honeywell International Inc.||Methods and apparatus to assist pilots under conditions in which spatial disorientation may be present|
|EP2466361A1 *||Dec 14, 2011||Jun 20, 2012||ALSTOM Transport SA||Head-up display for a railway vehicle|
|EP2615831A1 *||Jan 13, 2012||Jul 17, 2013||ATS Group (IP Holdings) Limited||Adapting images from moving surveillance cameras for presentation to an operator|
|WO2013082041A1 *||Nov 27, 2012||Jun 6, 2013||Mcculloch Daniel||Shared collaboration using head-mounted display|
|WO2013116407A1 *||Jan 30, 2013||Aug 8, 2013||Stephen Latta||Coordinate-system sharing for augmented reality|
|WO2013168169A3 *||May 8, 2013||Jan 3, 2014||Israel Aerospace Industries Ltd.||Remote tracking of objects|
|WO2014065700A1 *||Jun 14, 2013||May 1, 2014||Gazzaev Sarmat Muratovich||System for producing animated images, for example video images|
|WO2014111923A1 *||Jan 14, 2014||Jul 24, 2014||Israel Aerospace Industries Ltd||Remote tracking of objects|
|U.S. Classification||345/8, 248/115, 359/630, 348/E07.088|
|International Classification||H04N9/47, H04N7/00, H04N7/18, G02B27/14, G09G5/00|
|Cooperative Classification||G02B2027/0187, H04N5/232, H04N7/185, G02B27/017, G02B2027/0138|
|European Classification||H04N7/18D2, G02B27/01C, H04N5/232|
|Mar 22, 2002||AS||Assignment|
Owner name: CANADIAN SPACE AGENCY, CANADA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:EDWARDS, ERIC C.;REEL/FRAME:012990/0827
Effective date: 20020320
|May 25, 2011||FPAY||Fee payment|
Year of fee payment: 4
|May 15, 2015||FPAY||Fee payment|
Year of fee payment: 8