|Publication number||US7554549 B2|
|Application number||US 10/984,488|
|Publication date||Jun 30, 2009|
|Filing date||Nov 8, 2004|
|Priority date||Oct 1, 2004|
|Also published as||CA2582451A1, CA2582451C, EP1797537A2, EP1797537A4, EP1797537B1, US20060071934, WO2006039497A2, WO2006039497A3|
|Inventors||Mark Sagar, Remington Scott|
|Original Assignee||Sony Corporation, Sony Pictures Entertainment Inc.|
This patent application claims priority pursuant to 35 U.S.C. § 119(e) to the following provisional patent applications: (a) Ser. No. 60/616,049, filed Oct. 4, 2004 for SYSTEM AND METHOD FOR CAPTURING FACIAL AND EYE MOTION; and (b) Ser. No. 60/615,268, filed Oct. 1, 2004 for TRACKING OF FACIAL FEATURE MOVEMENTS WITHOUT VISUAL FACE CAPTURE.
1. Field of the Invention
The present invention relates to three-dimensional graphics and animation, and more particularly, to a motion tracking system that enables capture of facial and eye motion of a performer without the use of cameras for use in producing a computer graphics animation.
2. Description of Related Art
Motion capture systems are used to capture the movement of a real object and map it onto a computer generated object. Such systems are often used in the production of motion pictures and video games for creating a digital representation of a person that is used as source data to create a computer graphics (CG) animation. In a typical system, a performer wears a suit having markers attached at various locations (e.g., having small reflective markers attached to the body and limbs) and digital cameras record the movement of the performer from different angles while illuminating the markers. The system then analyzes the images to determine the locations (e.g., as spatial coordinates) and orientation of the markers on the performer's suit in each frame. By tracking the locations of the markers, the system creates a spatial representation of the markers over time and builds a digital representation of the performer in motion. The motion is then applied to a digital model, which may then be textured and rendered to produce a complete CG representation of the actor and/or performance. This technique has been used by special effects companies to produce incredibly realistic animations in many popular movies.
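The marker-location step described above — recovering a marker's spatial coordinates from two or more camera views — can be illustrated with a minimal midpoint triangulation. This is a textbook method shown only as a sketch; the patent does not prescribe any particular algorithm, and the camera rays here are assumed to be already calibrated:

```python
import numpy as np

def triangulate_midpoint(origin_a, dir_a, origin_b, dir_b):
    """Estimate a marker's 3D position from two camera rays.

    Each camera contributes a ray (its optical center plus a direction
    toward the marker seen in its image).  The marker is taken as the
    midpoint of the shortest segment between the two rays.
    """
    a = np.asarray(dir_a, dtype=float)
    b = np.asarray(dir_b, dtype=float)
    oa = np.asarray(origin_a, dtype=float)
    ob = np.asarray(origin_b, dtype=float)
    # Solve for ray parameters s, t minimizing |oa + s*a - (ob + t*b)|.
    m = np.array([[a @ a, -(a @ b)],
                  [a @ b, -(b @ b)]])
    rhs = np.array([(ob - oa) @ a, (ob - oa) @ b])
    s, t = np.linalg.solve(m, rhs)
    return ((oa + s * a) + (ob + t * b)) / 2.0

# Two cameras on the x-axis, both seeing a marker at (0, 0, 5).
p = triangulate_midpoint([-1, 0, 0], [1, 0, 5], [1, 0, 0], [-1, 0, 5])
```

Repeating this per marker and per frame yields the spatial representation of the markers over time that the digital model is built from.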
Motion capture systems are also used to track the motion of facial features of an actor to create a representation of the actor's facial motion and expression (e.g., laughing, crying, smiling, etc.). As with body motion capture, markers are attached to the actor's face and cameras record the actor's expressions. Since facial movement involves relatively small muscles in comparison to the larger muscles involved in body movement, the facial markers are typically much smaller than the corresponding body markers, and the cameras typically have higher resolution than cameras usually used for body motion capture. The cameras are typically aligned in a common plane with physical movement of the actor restricted to keep the cameras focused on the actor's face. The facial motion capture system may be incorporated into a helmet or other implement that is physically attached to the actor so as to uniformly illuminate the facial markers and minimize the degree of relative movement between the camera and face.
An advantage of motion capture systems over traditional animation techniques, such as keyframing, is the capability of real-time visualization. The production team can review the spatial representation of the performer's motion in real-time or near real-time, enabling the actor to alter the physical performance in order to capture optimal data. Moreover, motion capture systems detect subtle nuances of physical movement that cannot be easily reproduced using other animation techniques, thereby yielding data that more accurately reflects natural movement. As a result, animation created using source material that was collected using a motion capture system will exhibit a more lifelike appearance.
Notwithstanding these advantages of motion capture systems, a drawback of conventional motion capture systems is that they cannot capture eye motion. Since the markers cannot be affixed to the performer's eyes, the eye movement is not detected by the motion capture cameras. This eye movement must then be added during the subsequent CG animation process. In addition to making the animation process more cumbersome, the resulting animation product is less realistic since it may not include subtle eye movement that occurs during a performance.
Another drawback of conventional motion capture systems that rely upon cameras is that motion data of a performer may be occluded by interference with other objects, such as props or other actors. Specifically, if a portion of the body or facial markers is blocked from the field of view of the digital cameras, then data concerning that body or facial portion is not collected. This results in an occlusion or hole in the motion data. While the occlusion can be filled in later during post-production using conventional computer graphics techniques, the fill data lacks the quality of the actual motion data, resulting in a defect of the animation that may be discernable to the viewing audience. To avoid this problem, conventional motion capture systems limit the number of objects that can be captured at one time, e.g., to a single performer. This also tends to make the motion data appear less realistic, since the quality of a performer's performance often depends upon interaction with other actors and objects. Moreover, it is difficult to combine these separate performances together in a manner that appears natural.
Outside of the entertainment industry, there are many other circumstances in which it would be desirable to capture or track facial muscle and/or eye movement without reliance upon optical cameras. For example, automatic speech recognition devices, access control systems, electronic storage and retrieval systems for personal profiles and medical/dental screening systems could make use of such analysis techniques. Speech recognition systems that utilize analyses of facial features may find wide application in noisy environments where it is difficult to utilize acoustic speech recognition alone, e.g., in a military aircraft or in a factory. Each of these potential applications presently lacks an effective means for accurately translating an individual's facial features into useful electronic data. This is particularly problematic where the individual is continually changing facial orientation with respect to the detection equipment.
Accordingly, it would be desirable to provide a motion tracking system that overcomes these and other drawbacks of the prior art. More specifically, it would be desirable to provide a motion tracking system that enables faithful capture of subtle facial and eye motion of a performer without the use of cameras.
The present invention overcomes the drawbacks of the prior art by providing a motion tracking system that enables faithful capture of subtle facial and eye motion. The invention uses a surface electromyography (EMG) detection method to detect muscle movements, and an electrooculogram (EOG) detection method to detect eye movements. Signals corresponding to the detected muscle and eye movements are used to control an animated character to exhibit the same movements performed by a performer.
More particularly, an embodiment of the motion tracking animation system comprises a plurality of pairs of electromyography (EMG) electrodes adapted to be affixed to a skin surface of a performer at plural locations corresponding to respective muscles, and a processor operatively coupled to the plurality of pairs of EMG electrodes. The processor includes programming instructions to perform the functions of acquiring EMG data from the plurality of pairs of EMG electrodes. The EMG data comprises electrical signals corresponding to muscle movements of the performer during a performance. The programming instructions further include processing the EMG data to provide a digital model of the muscle movements, and mapping the digital model onto an animated character. As a result, the animated character will exhibit the same muscle movements as the performer.
In an embodiment of the invention, a plurality of pairs of electrooculogram (EOG) electrodes are adapted to be affixed to the skin surface of the performer at locations adjacent to the performer's eyes. The processor is operatively coupled to the plurality of pairs of EOG electrodes and further includes programming instructions to perform the functions of acquiring EOG data from the plurality of pairs of EOG electrodes. The EOG data comprises electrical signals corresponding to eye movements of the performer during a performance. The programming instructions further provide processing of the EOG data and mapping of the processed EOG data onto the animated character. This permits the animated character to exhibit the same eye movements as the performer.
A more complete understanding of the motion tracking system that enables capture of facial and eye motion of a performer for use in producing a computer graphics animation will be afforded to those skilled in the art, as well as a realization of additional advantages and objects thereof, by a consideration of the following detailed description of the preferred embodiment. Reference will be made to the appended sheets of drawings which will first be described briefly.
As will be further described below, the present invention satisfies the need for a motion tracking system that enables faithful capture of subtle facial and eye motion of a performer without the use of cameras. In the detailed description that follows, like element numerals are used to describe like elements illustrated in one or more of the drawings.
Referring first to
The muscular electrode pairs include electrodes 120_1, 122_1 through 120_N, 122_N, which are coupled to the electrode interface 112 through respective electrical conductors 116_1, 118_1 through 116_N, 118_N. Also, a ground electrode 126 is coupled to the electrode interface 112 through electrical conductor 124. In a preferred embodiment of the invention, the muscular electrode pairs comprise surface electromyography (EMG) electrodes that measure the voltage difference caused by a depolarization wave that travels along the surface of a muscle when the muscle flexes. The signals detected by the surface electrodes are typically in the range of 5 mV. The electrodes should be aligned with the expected direction of an electrical impulse (or aligned perpendicular to impulses that should be excluded).
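The raw differential signal from such an electrode pair oscillates around zero as the depolarization wave passes; a common first processing step is full-wave rectification followed by smoothing to obtain an activation envelope. The sketch below illustrates that idea only — the window length, sample counts, and noise levels are illustrative assumptions, not values from the patent:

```python
import numpy as np

def emg_envelope(samples, window=50):
    """Estimate muscle activation from a raw surface-EMG trace.

    Rectify the signal, then apply a moving-average filter so the
    result tracks overall muscle activity rather than individual
    depolarization spikes.  (Sketch only; window is an assumption.)
    """
    rectified = np.abs(np.asarray(samples, dtype=float))
    kernel = np.ones(window) / window
    return np.convolve(rectified, kernel, mode="same")

# Synthetic trace: quiet baseline, a ~5 mV flex burst, quiet again.
rng = np.random.default_rng(0)
signal = np.concatenate([
    0.0001 * rng.standard_normal(200),  # rest
    0.005 * rng.standard_normal(200),   # muscle flex (~5 mV scale)
    0.0001 * rng.standard_normal(200),  # rest
])
envelope = emg_envelope(signal)
```

The envelope is large inside the flex region and near zero at rest, giving a per-channel activation level suitable for the fitting step described later.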
The eye motion electrode pairs include electrodes 136_1, 138_1 and 136_2, 138_2, which are coupled to the electrode interface 112 through respective electrical conductors 132_1, 134_1 and 132_2, 134_2. Also, a ground electrode 144 is coupled to the electrode interface 112 through electrical conductor 142. In a preferred embodiment, eye motion is detected by acquiring and measuring electro-oculogram (EOG) signals from the eyes.
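The EOG works because the eye acts as an electric dipole (cornea positive, retina negative), so the differential voltage between a pair of electrodes placed beside the eyes varies roughly linearly with gaze angle over a moderate range. A minimal conversion might look like the following; the gain value is a hypothetical per-performer calibration constant, not a figure from the patent:

```python
def eog_to_gaze_deg(v_left, v_right, gain_deg_per_mv):
    """Convert a horizontal EOG electrode-pair reading to a gaze angle.

    `v_left` and `v_right` are the electrode voltages (mV) referenced
    to the ground electrode; `gain_deg_per_mv` comes from calibration,
    e.g., having the performer fixate targets at known angles.
    """
    return (v_right - v_left) * gain_deg_per_mv

# Assumed calibration: 1 mV of differential signal ~ 20 degrees.
angle = eog_to_gaze_deg(v_left=-0.25, v_right=0.25, gain_deg_per_mv=20.0)
# 0.5 mV difference -> 10 degrees to the right
```

A second, vertically placed pair handles up/down gaze the same way, and the two angles together drive the animated character's eyes.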
As known in the medical art, the upper facial muscles are responsible for changing the appearance of the eyebrows, forehead, and upper and lower eyelids. The Frontalis muscles in the upper portion of the face contract isotonically towards static insertion points on the cranium, enabling the surface tissue (i.e., skin) to bunch and wrinkle perpendicularly to the direction of the muscle. The lower facial muscles are made up of several distinct groups, including the Zygomaticus major muscles that contract in an angular direction from the lips toward the cheekbones, the Orbicularis Oculi muscles that are circular or elliptical in nature and extend around the eyes, the Orbicularis Oris muscles that extend around the mouth, the Buccinator muscles that contract horizontally toward the ears, and others controlling various miscellaneous actions. The muscles of the mouth have particularly complex muscular interaction. The Orbicularis Oris is a sphincter muscle with no attachment to bone. Three primary muscles, i.e., the M. Levator Labii Superioris and Alaeque Nasi, join from above, while the M. Buccinator joins at the major node of the mouth and contracts horizontally. The M. Depressor Anguli Oris, M. Depressor Labii Inferioris and Mentalis each contract obliquely and vertically. Facial expressions are formed by complex and combined movements of these upper and lower facial muscles.
Referring briefly to
Returning now to
The EMG signal from subsequent performances is fitted to the calibration expressions to generate weighted coefficients for each expression component at step 308. The weighted coefficients (x) are generated by solving an equation F(x)=B for x, where F is the set of calibration expressions generated at step 306, and B is the plurality of EMG channels for each frame processed at step 304. For each facial expression component, an analogous 3D computer graphics model of a face is generated at step 310. The 3D computer graphics model represents the same facial expression using a different set of parameters. For example, geometric coordinates can be used as parameters to represent the facial expression. In another example, muscle strengths can be used as parameters to represent the facial expression. When the weighted components are combined with the 3D computer graphics model at step 312, the result is a facial expression geometry matching that of the original performance. The final geometry of the facial expression is expressed as a sum of the 3D computer graphics model configured with a set of parameters and the weighted components, and rendered as an animation frame at step 314.
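Treating F as a matrix with one column per calibration expression and one row per EMG channel, solving F(x)=B for each frame is a least-squares fit. The sketch below shows that step with made-up channel values and placeholder geometry; the patent does not specify a particular solver:

```python
import numpy as np

# F: one column per calibration expression, one row per EMG channel.
# Values are per-channel activation levels recorded while the performer
# held each calibration expression (numbers here are illustrative).
F = np.array([
    [1.0, 0.0, 0.2],
    [0.0, 1.0, 0.1],
    [0.3, 0.2, 1.0],
    [0.1, 0.0, 0.4],
])

# B: the processed EMG channels for one frame of the performance.
B = np.array([0.6, 0.2, 0.9, 0.4])

# Solve F(x) = B in the least-squares sense; x[i] is the weighted
# coefficient saying how strongly expression i is present this frame.
x, *_ = np.linalg.lstsq(F, B, rcond=None)

# The frame's facial geometry is the weighted sum of the 3D models
# built for each calibration expression (tiny arrays stand in for
# real vertex data here).
expression_models = [np.eye(2) * (i + 1) for i in range(3)]
frame_geometry = sum(w * m for w, m in zip(x, expression_models))
```

Repeating the solve per frame yields a weight trajectory over the performance, which drives the blended 3D model at steps 312 and 314.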
Referring now to
The bundle of electrode conductors may be bound together to facilitate ease of movement of the performer. The electrode interface 112 may multiplex the various signals onto a single conductor for communication to the motion tracking processor 108. The electrode interface 112 may be tethered to the motion tracking processor 108 by an electrical or fiberoptic conductor, or a wireless connection may be utilized to further enhance freedom of movement of the performer. Alternatively, the electrode interface 112 may include a storage device, such as a hard disk drive or flash memory module, that permits the performance data to be stored locally, and then downloaded to the motion tracking processor 108 at a later time. In an embodiment of the invention, the electrode interface 112 may be carried on the back of the performer, with the bundle of electrode conductors directed to the back of the performer's head and down along the spine to the electrode interface 112. In another alternative, the electrode interface 112 may be carried in a hat worn by the performer, which would promote free movement of the head and neck without interference by the bundle of electrode conductors. It should be appreciated that the electrodes and conductors may be integrated with a flexible mask that is form-fitted to the performer's face.
The motion tracking processor will produce a digital model of an animation character using known techniques. The digital model may be based on a plurality of digital photographs taken of the actor from various angles that are assembled together to produce a three-dimensional (3D) image file or digital model. Software tools to produce a 3D image file are well known in the art, as discussed above. For example, a 3D facial structure may be generated from a plurality of spline surfaces that each define a portion of the surface of the facial structure. The spline surfaces are formed from individual line segments that are collectively referred to as a “wire frame.” The facial structure represents a sub-facial structure that lies below the skin surface. An outer surface tissue having desired texture and color will be applied to this sub-facial structure as part of a subsequent animation process to produce an animated character. It should be understood that the shape of the facial structure will control the facial expressions formed by the animated character.
Once the digital model is created, a virtual facial muscle structure is overlaid onto the digital model of the facial structure. The human facial muscle structure is well understood in the medical literature, and the position and interconnection of the individual muscles can be readily mapped onto the digital model. Muscle actuation is achieved in the digital model by compressing a selected muscle vector or group of muscle vectors to accomplish repositioning of the corresponding plurality of vertices and thereby reshape the surface of the digital model. The facial and eye motion data captured from the performer is then mapped to the respective muscle vectors of the digital model. As a result, the animated digital model will exhibit the same facial and eye movements from the captured performance. It should be appreciated that the performer's actions and expressions can be mapped onto any other kind of animated face, such as a child, animal or different looking adult face.
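Muscle actuation of the kind described — compressing a muscle vector to reposition nearby vertices — can be sketched with a simple linear muscle model. The Gaussian falloff and all numeric values below are illustrative assumptions, not the patent's exact deformation model:

```python
import numpy as np

def actuate_muscle(vertices, attach, insert, activation, falloff=0.05):
    """Pull mesh vertices toward a muscle's static attachment point.

    A linear muscle runs from its attachment on the sub-facial
    structure (`attach`) to its insertion in the skin (`insert`).
    Contracting it moves vertices toward the attachment, weighted by
    a Gaussian falloff with squared distance from the insertion.
    """
    vertices = np.asarray(vertices, dtype=float)
    insert = np.asarray(insert, dtype=float)
    direction = np.asarray(attach, dtype=float) - insert
    dist2 = np.sum((vertices - insert) ** 2, axis=1)
    weights = activation * np.exp(-dist2 / falloff)
    return vertices + weights[:, None] * direction

# A vertex at the insertion point moves most; a distant one barely moves.
verts = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
moved = actuate_muscle(verts, attach=[0.0, 0.5, 0.0],
                       insert=[0.0, 0.0, 0.0], activation=0.5)
```

Mapping the captured EMG weights onto the activations of such muscle vectors is what lets the digital model reproduce the performer's expressions.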
As noted above, the foregoing motion tracking system will enable the close coordination of facial muscle and eye motion. For example, an eye blink will be detected as a spike of the EMG data for the muscles surrounding the eyes. When the spike is detected, the animation process could insert an eye blink to correspond with the data. This would add significantly to the realism of the resulting animation. The above-described motion tracking system has advantages over traditional facial motion tracking systems because the implementations of the present invention require no camera or motion capture volume. This means that the implementation will still work even if the face to be tracked is occluded from view. Further, the implementations of the present invention automatically parameterize motion capture into muscle activity, independent of final geometry, and thus aid facial motion capture re-targeting.
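Detecting the blink spike described above amounts to finding where the activation envelope of the eye-region EMG channels crosses a threshold from below. A minimal detector follows; the threshold and sample values are illustrative, and in practice the threshold would be set from calibration data for the orbicularis oculi channels:

```python
import numpy as np

def detect_blinks(envelope, threshold):
    """Return sample indices where an EMG activation envelope rises
    through `threshold` -- each rising crossing is treated as the
    start of an eye blink to be inserted into the animation.
    """
    env = np.asarray(envelope, dtype=float)
    above = env > threshold
    # A rising crossing: below threshold at i-1, above at i.
    rising = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    return rising.tolist()

# Illustrative envelope with two brief spikes.
env = [0.1, 0.1, 0.9, 0.8, 0.1, 0.1, 0.95, 0.1]
blinks = detect_blinks(env, threshold=0.5)
# two rising crossings: samples 2 and 6
```

Each detected index would trigger an eyelid-close event on the animated character, synchronized with the rest of the captured facial motion.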
While the foregoing description addressed only the capture of facial muscle movement, it should be appreciated that the same technology would permit capture of body muscle movement. The electrode pairs could be affixed to various body muscle groups in the same manner as the facial muscles described above. Likewise, a form-fitted body suit could be arranged with the EMG electrodes provided therein to improve the ease and accuracy of affixing the electrodes to the performer's body. This form of body muscle capture could further be an enhancement to conventional motion capture using optical systems, in that muscle flexure could be captured in addition to joint rotation. This would result in more lifelike animation of human movement.
It should be understood that the eye motion process described above could also be used with conventional motion capture systems using optical data to capture motion. The eye motion could then be synchronized with the optically captured body and/or facial motion in the same manner as described above. Moreover, an EMG data capture system used in conjunction with a conventional optical motion capture system would provide enhanced data regarding a performance, and could be used at times when optical data is occluded due to optical interference during a performance.
Having thus described a preferred embodiment of a system and method that enables capture of facial and eye motion of a performer without the use of cameras, it should be apparent to those skilled in the art that certain advantages of the invention have been achieved. It should also be appreciated that various modifications, adaptations, and alternative embodiments thereof may be made within the scope and spirit of the present invention. The invention is further defined by the following claims.
|Cited Patent||Filing date||Publication date||Applicant||Title|
|US4320768 *||Jul 17, 1979||Mar 23, 1982||Georgetown University Medical Center||Computerized electro-oculographic (CEOG) system|
|US4690142 *||Dec 10, 1980||Sep 1, 1987||Ross Sidney A||Method and system for utilizing electro-neuro stimulation in a bio-feedback system|
|US4836219 *||Jul 8, 1987||Jun 6, 1989||President & Fellows Of Harvard College||Electronic sleep monitor headgear|
|US5517021 *||Oct 28, 1994||May 14, 1996||The Research Foundation State University Of New York||Apparatus and method for eye tracking interface|
|US5694939 *||Oct 3, 1995||Dec 9, 1997||The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration||Autogenic-feedback training exercise (AFTE) method and system|
|US6026321||Mar 31, 1998||Feb 15, 2000||Suzuki Motor Corporation||Apparatus and system for measuring electrical potential variations in human body|
|US6032072 *||Jan 30, 1998||Feb 29, 2000||Aspect Medical Systems, Inc.||Method for enhancing and separating biopotential signals|
|US6079829 *||Nov 4, 1999||Jun 27, 2000||Bullwinkel; Paul E.||Fiber optic eye-tracking system utilizing out-of-band light source|
|US6249292 *||May 4, 1998||Jun 19, 2001||Compaq Computer Corporation||Technique for controlling a presentation of a computer generated object having a plurality of movable components|
|US6324296||Dec 4, 1997||Nov 27, 2001||Phasespace, Inc.||Distributed-processing motion tracking system for tracking individually modulated light points|
|US6379393 *||Sep 14, 1999||Apr 30, 2002||Rutgers, The State University Of New Jersey||Prosthetic, orthotic, and other rehabilitative robotic assistive devices actuated by smart materials|
|US6673027 *||Jun 6, 2002||Jan 6, 2004||Peter Fischer||Posture measurement and feedback instrument for seated occupations|
|US6720984 *||Jun 13, 2000||Apr 13, 2004||The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration||Characterization of bioelectric potentials|
|US6774885 *||Sep 22, 1999||Aug 10, 2004||Motek B.V.||System for dynamic registration, evaluation, and correction of functional human behavior|
|US6788333||Jul 7, 2000||Sep 7, 2004||Microsoft Corporation||Panoramic video|
|US7012634||Mar 28, 2002||Mar 14, 2006||Eastman Kodak Company||System and method for calibrating an image capture device|
|US7012637||Jul 27, 2001||Mar 14, 2006||Be Here Corporation||Capture structure for alignment of multi-camera capture systems|
|US7068277 *||May 23, 2003||Jun 27, 2006||Sony Corporation||System and method for animating a digital facial model|
|US7106358||Dec 30, 2002||Sep 12, 2006||Motorola, Inc.||Method, system and apparatus for telepresence communications|
|US20020188216 *||May 3, 2002||Dec 12, 2002||Kayyali Hani Akram||Head mounted medical device|
|US20030160791 *||Jul 11, 2001||Aug 28, 2003||Gaspard Breton||Facial animation method|
|US20030215130||Feb 10, 2003||Nov 20, 2003||The University Of Tokyo||Method of processing passive optical motion capture data|
|US20040095352 *||Nov 27, 2001||May 20, 2004||Ding Huang||Modeling object interactions and facial expressions|
|US20040155962||Feb 11, 2003||Aug 12, 2004||Marks Richard L.||Method and apparatus for real time motion capture|
|US20040166954||Mar 3, 2004||Aug 26, 2004||Callaway Golf Company||Static pose fixture|
|US20040179013 *||May 23, 2003||Sep 16, 2004||Sony Corporation||System and method for animating a digital facial model|
|US20050261559 *||May 17, 2005||Nov 24, 2005||Mumford John R||Wireless physiological monitoring system|
|US20060004296 *||Jun 30, 2004||Jan 5, 2006||Huiku Matti V||Monitoring subcortical responsiveness of a patient|
|US20070299484 *||Jul 19, 2007||Dec 27, 2007||Greenberg Robert J||Video processing methods for improving visual acuity and/or perceived image resolution|
|US20080021516 *||Jul 19, 2007||Jan 24, 2008||Greenberg Robert J||Video processing methods for improving visual acuity and/or perceived image resolution|
|1||*||Joyce, C. A. et al.; "Tracking eye fixations with electroocular and electroencephalographic recordings;" 2002; Psychophysiology; Cambridge University Press; vol. 39; pp. 607-618.|
|2||*||Lapatki, B. G. et al.; "A surface EMG electrode for the simultaneous observation of multiple facial muscles;" Mar. 15, 2003; Journal of Neuroscience Methods; vol. 123, no. 2; pp. 117-128.|
|3||*||Lee, Sooha Park; "Facial Animation System with Realistic Eye Movement Based on a Cognitive Model for Virtual Agents;" 2002; Dissertation in Computer and Information Science; University of Pennsylvania; pp. 1-4, 21 and 51-53.|
|4||*||Lucero, Jorge C. and Munhall, Kevin G.; "A Model of Facial Biomechanics for Speech Production;" Nov. 1999; Journal of the Acoustical Society of America; vol. 106, no. 5; pp. 2834-2842.|
|5||*||Malmivuo, J. et al.; "Bioelectromagnetism: Principles and Applications of Bioelectric and Biomagnetic Fields;" 1995; Oxford University Press; Chapter 28, pp. 538-549.|
|Citing Patent||Filing date||Publication date||Applicant||Title|
|US7821407||Jan 29, 2010||Oct 26, 2010||Applied Technology Holdings, Inc.||Apparatus, systems, and methods for gathering and processing biometric and biomechanical data|
|US7825815||Jan 29, 2010||Nov 2, 2010||Applied Technology Holdings, Inc.||Apparatus, systems, and methods for gathering and processing biometric and biomechanical data|
|US7931604 *||Oct 15, 2008||Apr 26, 2011||Motek B.V.||Method for real time interactive visualization of muscle forces and joint torques in the human body|
|US7978081||Nov 17, 2006||Jul 12, 2011||Applied Technology Holdings, Inc.||Apparatus, systems, and methods for communicating biometric and biomechanical information|
|US8130225 *||Apr 13, 2007||Mar 6, 2012||Lucasfilm Entertainment Company Ltd.||Using animation libraries for object identification|
|US8144153||Nov 20, 2007||Mar 27, 2012||Lucasfilm Entertainment Company Ltd.||Model production for animation libraries|
|US8170656 *||Mar 13, 2009||May 1, 2012||Microsoft Corporation||Wearable electromyography-based controllers for human-computer interface|
|US8199152||Apr 13, 2007||Jun 12, 2012||Lucasfilm Entertainment Company Ltd.||Combining multiple session content for animation libraries|
|US8452458||May 5, 2008||May 28, 2013||Motek Bv||Method and system for real time interactive dynamic alignment of prosthetics|
|US8542236||Jan 16, 2007||Sep 24, 2013||Lucasfilm Entertainment Company Ltd.||Generating animation libraries|
|US8681158||Mar 5, 2012||Mar 25, 2014||Lucasfilm Entertainment Company Ltd.||Using animation libraries for object identification|
|US8908960||Sep 9, 2011||Dec 9, 2014||Lucasfilm Entertainment Company Ltd.||Three-dimensional motion capture|
|US8928674||Jun 8, 2012||Jan 6, 2015||Lucasfilm Entertainment Company Ltd.||Combining multiple session content for animation libraries|
|US8941665||Mar 21, 2012||Jan 27, 2015||Lucasfilm Entertainment Company Ltd.||Model production for animation libraries|
|US9037530||Mar 29, 2012||May 19, 2015||Microsoft Technology Licensing, Llc||Wearable electromyography-based human-computer interface|
|US9142024||Jul 21, 2009||Sep 22, 2015||Lucasfilm Entertainment Company Ltd.||Visual and physical motion sensing for three-dimensional motion capture|
|US9275487 *||Apr 8, 2009||Mar 1, 2016||Pixar Animation Studios||System and method for performing non-affine deformations|
|US9401025||Sep 21, 2015||Jul 26, 2016||Lucasfilm Entertainment Company Ltd.||Visual and physical motion sensing for three-dimensional motion capture|
|US9424679||Nov 21, 2014||Aug 23, 2016||Lucasfilm Entertainment Company Ltd.||Three-dimensional motion capture|
|US9508176||Nov 16, 2012||Nov 29, 2016||Lucasfilm Entertainment Company Ltd.||Path and speed based character control|
|US20080170077 *||Jan 16, 2007||Jul 17, 2008||Lucasfilm Entertainment Company Ltd.||Generating Animation Libraries|
|US20080170078 *||Apr 13, 2007||Jul 17, 2008||Lucasfilm Entertainment Company Ltd.||Using animation libraries for object identification|
|US20080170777 *||Apr 13, 2007||Jul 17, 2008||Lucasfilm Entertainment Company Ltd.||Combining multiple session content for animation libraries|
|US20090082701 *||Oct 15, 2008||Mar 26, 2009||Motek Bv||Method for real time interactive visualization of muscle forces and joint torques in the human body|
|US20090326406 *||Mar 13, 2009||Dec 31, 2009||Microsoft Corporation||Wearable electromyography-based controllers for human-computer interface|
|US20100117837 *||Jan 29, 2010||May 13, 2010||Applied Technology Holdings, Inc.||Apparatus, systems, and methods for gathering and processing biometric and biomechanical data|
|US20100121227 *||Jan 29, 2010||May 13, 2010||Applied Technology Holdings, Inc.||Apparatus, systems, and methods for gathering and processing biometric and biomechanical data|
|US20100121228 *||Jan 29, 2010||May 13, 2010||Applied Technology Holdings, Inc.||Apparatus, systems, and methods for gathering and processing biometric and biomechanical data|
|US20100131113 *||May 5, 2008||May 27, 2010||Motek Bv||Method and system for real time interactive dynamic alignment of prosthetics|
|US20100164862 *||Jul 21, 2009||Jul 1, 2010||Lucasfilm Entertainment Company Ltd.||Visual and Physical Motion Sensing for Three-Dimensional Motion Capture|
|US20100201500 *||Nov 17, 2006||Aug 12, 2010||Harold Dan Stirling||Apparatus, systems, and methods for communicating biometric and biomechanical information|
|US20100201512 *||Nov 17, 2006||Aug 12, 2010||Harold Dan Stirling||Apparatus, systems, and methods for evaluating body movements|
|US20100204616 *||Jun 19, 2009||Aug 12, 2010||Applied Technology Holdings, Inc.||Apparatus, systems, and methods for gathering and processing biometric and biomechanical data|
|US20150302627 *||Jun 29, 2015||Oct 22, 2015||Rearden Mova, Llc For The Benefit Of Rearden, Llc||Apparatus and method for performing motion capture using a random pattern on capture surfaces|
|U.S. Classification||345/473, 600/546, 345/474|
|International Classification||G06T15/70, A61B5/04, G06T13/00|
|Cooperative Classification||G06F3/015, A61B5/1126, A61B5/6814, A61B5/0488, A61B5/1114, G06F3/013, A61B2562/04, G06T13/40|
|European Classification||G06F3/01B8, A61B5/0488, G06F3/01B4, A61B5/11N2, A61B5/11W, G06T13/40|
|Nov 8, 2004||AS||Assignment|
Owner name: SONY CORPORATION, JAPAN
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAGAR, MARK;SCOTT, REMINGTON;REEL/FRAME:015982/0404
Effective date: 20041104
Owner name: SONY PICTURES ENTERTAINMENT INC., CALIFORNIA
Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:SAGAR, MARK;SCOTT, REMINGTON;REEL/FRAME:015982/0404
Effective date: 20041104
|Dec 31, 2012||FPAY||Fee payment|
Year of fee payment: 4