Publication number: US 20010020200 A1
Publication type: Application
Application number: US 09/858,673
Publication date: Sep 6, 2001
Filing date: May 15, 2001
Priority date: Apr 16, 1998
Also published as: US 6233504, US 6385509
Inventors: Hari Das, Tim Ohm, Curtis Boswell, Robert Steele
Original Assignee: California Institute of Technology, a California nonprofit organization
Tool actuation and force feedback on robot-assisted microsurgery system
US 20010020200 A1
Abstract
An input control device with force sensors is configured to sense hand movements of a surgeon performing a robot-assisted microsurgery. The sensed hand movements actuate a mechanically decoupled robot manipulator. A microsurgical manipulator, attached to the robot manipulator, is activated to move small objects and perform microsurgical tasks. A force-feedback element coupled to the robot manipulator and the input control device provides the input control device with an amplified sense of touch in the microsurgical manipulator.
Claims(5)
What is claimed is:
1. A microsurgery system, comprising:
a robot manipulator having a plurality of mechanically decoupled joints, said plurality of mechanically decoupled joints allowing actuation of a joint without affecting motion of any other joints;
an effector coupled to said robot manipulator to apply a force to an object; and
a force feedback element adapted to amplify a return force from said effector.
2. The system of claim 1, further comprising:
a master control system coupled to said robot manipulator, said master control system allowing an operator to input hand movement, where said hand movement is converted into an amount of force applied at the effector.
3. The system of claim 1, wherein the return force of said force feedback element provides an exaggerated sense of feel, such that said sense of feel allows an operator to feel smaller force feedback than that without said force feedback element.
4. A method of performing a microsurgery, comprising:
converting operator hand movements into a series of forces to be applied by a manipulator;
determining a series of manipulator movements, said movements performed by a combination of manipulator joint movements;
actuating the manipulator joint movements;
providing an amplified feedback of a return force felt by said manipulator.
5. The method of claim 4, wherein said actuating includes providing movement of each joint in a mechanically decoupled orientation such that a joint movement is made without affecting movement of any other joints.
Description
    CROSS-REFERENCE TO RELATED APPLICATIONS
  • [0001]
    This application is a continuation of U.S. patent application Ser. No. 09/292,761, filed Apr. 14, 1999 and issued as U.S. Pat. No. 6,233,504 on May 15, 2001, which claims benefit of the priority of U.S. Provisional Application Serial No. 60/082,013, filed Apr. 16, 1998 and entitled “A Tool Actuation and Force Feedback on Robot Assisted Microsurgery System.”
  • ORIGIN OF INVENTION
  • [0002]
    The invention described herein was made in performance of work under a NASA contract, and is subject to the provisions of Public Law 96-517 (35 U.S.C. 202) in which the Contractor has elected to retain title.
  • BACKGROUND
  • [0003]
    The present specification generally relates to robotic devices and particularly to a mechanically decoupled six-degree-of-freedom tele-operated robot system.
  • [0004]
    Robotic devices are commonly used in factory-based environments to complete tasks such as placing parts, welding, and spray painting. Many of these robotic devices do not have completely mechanically decoupled axes with passed actuation, in which actuation is transferred through one joint in order to actuate another joint without affecting the motion of any other joint. The devices are also large and bulky and cannot effectively perform small-scale tasks, such as microsurgical operations. In addition, these devices are not tendon-driven systems, and thus do not have the low backlash that is desirable for microsurgical operations.
  • [0005]
    A decoupled six-degree-of-freedom robot system is disclosed in U.S. Pat. Nos. 5,710,870 and 5,784,542, issued to Ohm et al. The robot system has an input device functioning as a master to control a slave robot with passed actuation capabilities, high dexterity, six degrees of freedom with all six axes completely mechanically decoupled, low inertia, low friction, and force-feedback capabilities.
  • [0006]
    The robot system, disclosed in the above-referenced patents, is a tendon-driven system without any backlash, and is therefore capable of precisely positioning surgical instruments for performing microsurgical operations.
  • SUMMARY
  • [0007]
    The inventors noticed, as a result of several simulated microsurgical operations, that the integration of a high-precision micromanipulator with a highly sensitive force sensor into the slave robot can enhance the surgeon's feel of soft tissues. This allows effective performance of microsurgical tasks with a hand-motion resolution of less than 10 microns. The force sensor readings are used to amplify forces with high resolution to an input device on the master control. The amplified forces allow the surgeon operating the master control handle to feel the soft tissues with greater sensitivity and to move the handle with exaggeration and precision. In addition, push button switches mounted on the master control handle provide operator control of system enable and of the micromanipulator.
  • [0008]
    In one aspect, the present disclosure involves robot-assisted tasks for use in microsurgery. An input control device with force sensors is configured to sense hand movements of an operator. The sensed hand movements actuate a mechanically decoupled robot manipulator. A microsurgical manipulator, attached to the robot manipulator, is activated to move small objects and perform microsurgical tasks. A force-feedback element coupled to the robot manipulator and the input control device provides the input control device with an amplified sense of touch in the microsurgical manipulator.
  • [0009]
    In some embodiments, the input control device has a handle with activation switches to enable or disable control of the robot manipulator. The activation switches also allow movement of the microsurgical manipulator.
  • [0010]
    In another aspect, a virtual reality system is disclosed. The virtual reality system includes a plurality of input control devices configured to sense operator body movements. Each device has a plurality of mechanically decoupled joints, allowing force-sensed actuation to be transferred through one joint in order to actuate another joint without affecting the motion of any other joints. The operator body movements are translated into corresponding movements in a virtual reality environment. A plurality of force-feedback elements provides the input control devices with feedback of the senses created in the virtual reality environment.
  • [0011]
    In a further aspect, a virtual augmentation system for a real-environment configuration is disclosed. The system includes a plurality of input control devices configured to sense operator body movements. Each device has a plurality of joints that are mechanically decoupled, where the operator body movements are translated into corresponding movements in a real environment with certain limitations placed on the movements by a virtual reality environment. A plurality of force-feedback elements provides the input control devices with feedback of the senses created in the virtual reality environment to limit movements in the real environment.
  • [0012]
    In a further aspect, a microsurgical training system is disclosed. The system includes a master input control device configured to sense operator body movements. The system also includes at least one force-feedback element coupled to the master input control device and at least one slave device coupled to the force-feedback element. The force-feedback element is configured to receive the operator body movements from the master input control device. The operator body movements of the master input control device are replicated in the slave device.
  • [0013]
    In one embodiment, a data collection and storage device is coupled to the master input control device. The data collection and storage device is used to collect and store the operator body movements for subsequent replay.
  • [0014]
    The details of one or more embodiments are set forth in the accompanying drawings and the description below. Other embodiments and advantages will become apparent from the following description and drawings, and from the claims.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • [0015]
    These and other aspects will be described in reference to the accompanying drawings wherein:
  • [0016]
    FIG. 1 is an overview block diagram of components of the robot-assisted microsurgery (RAMS) system.
  • [0017]
    FIG. 2 is a perspective view of a slave robot arm.
  • [0018]
    FIG. 3 is one embodiment of the end effector of the slave robot arm.
  • [0019]
    FIG. 4 is a perspective view of a master control device.
  • [0020]
    FIG. 5 is a front view of a master control device handle.
  • [0021]
    FIG. 6 is a block diagram of a master handle switch interface board.
  • [0022]
    FIG. 7 is one embodiment of the RAMS system illustrating the advantages of compact size and light weight.
  • [0023]
    FIG. 8 illustrates a simulated eye microsurgery procedure using the RAMS system.
  • [0024]
    Like reference numbers and designations in the various drawings indicate like elements.
  • DETAILED DESCRIPTION
  • [0025]
    Micro-surgeons often use a microscope with 20 to 100 times magnification to help them visualize their microscopic work area. The microsurgical operations performed by these surgeons require manipulation of skin and tissue on the order of about 50 microns. A microsurgical manipulator, such as micro-forceps, can often scale down the surgeon's hand motions to less than 10 microns. This allows the average surgeon to perform at the level of the best surgeons with high levels of dexterity. In addition, the best surgeons will be able to perform surgical procedures beyond the capability of human hand dexterity. The integration of the high-precision microsurgical manipulator with a highly sensitive force sensor into the slave robot enhances the surgeon's feel of soft tissues and allows effective performance of microsurgical tasks with a hand-motion resolution of less than 10 microns.
  • [0026]
    The force sensor readings are used to amplify forces with high resolution to an input device on the master control. The amplified forces allow the surgeon operating the master control handle to feel the soft tissues with greater sensitivity and to move the handle with greater exaggeration and precision.
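    The mapping described in the preceding paragraphs reduces to two scalings: hand displacements at the master handle are scaled down before being commanded to the slave tip, and forces sensed at the slave tip are scaled up before being reflected back to the handle. The Python sketch below illustrates that relationship; the specific scale factors are assumptions chosen for the example, not values taken from this disclosure.

```python
# Illustrative sketch of the master-slave scaling described above.
# MOTION_SCALE and FORCE_GAIN are assumed example values only.

MOTION_SCALE = 0.1   # 10:1 scale-down: 100 um of hand motion -> 10 um at the tip
FORCE_GAIN = 20.0    # forces at the slave tip are amplified before reflection

def slave_tip_command(master_delta_um):
    """Scale a master handle displacement (microns) down to a slave tip command."""
    return [d * MOTION_SCALE for d in master_delta_um]

def master_feedback_force(slave_tip_force_n):
    """Amplify the force measured at the slave tip for reflection on the master handle."""
    return [f * FORCE_GAIN for f in slave_tip_force_n]

if __name__ == "__main__":
    hand_move = [50.0, 0.0, -30.0]           # surgeon's hand motion in microns (x, y, z)
    tip_force = [0.002, 0.0, 0.001]          # measured tip forces in newtons
    print(slave_tip_command(hand_move))      # -> [5.0, 0.0, -3.0] microns at the tip
    print(master_feedback_force(tip_force))  # -> [0.04, 0.0, 0.02] N felt at the handle
```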
  • [0027]
    FIG. 1 shows an overview block diagram of components of the robot-assisted microsurgery (RAMS) system. The components of the RAMS system have been categorized into four subsystems. They are the mechanical subsystem 102, the electronics subsystem 104, the servo-control and high-level software subsystem 106 and the user interface subsystem 108.
  • [0028]
    The mechanical subsystem 102 includes a master control system 110 with an input device 112 and a slave robot arm 114 with associated motors, encoders, gears, cables, pulleys and linkages that cause the tip 116 of the slave robot to move under computer control and to measure the surgeon's hand motions precisely. The subsystem 102 also includes slave and master force sensor interfaces 126, 128, and a master input device handle switch interface 150.
  • [0029]
    The electronics subsystem 104 ensures that a number of error conditions are handled gracefully. Components of the electronics subsystem 104 are a Versa Module EuroCard (VME) chassis 120, an amplifier chassis 122 and safety electronics 124.
  • [0030]
    The VME chassis 120 contains VME processor boards 130 used for high-level system control. The VME chassis 120 also contains two sets of Programmable Multi-Axis Controller (PMAC) servo-control cards 134, power supplies, and two cable interface boards 132.
  • [0031]
    The amplifier chassis 122 contains the six slave robot motor drive amplifiers 140 and three master control device motor drive amplifiers 142. The amplifier chassis 122 also includes a system control electronics board 144 and an amplifier power supply 146.
  • [0032]
    The safety control electronics 124 includes the control electronics board and brake relay board. The purpose of the braking function is to hold the motors in place when they are not under amplifier control. Programmable logic devices (PLDs) in the safety control electronics module 124 monitor amplifier power, the operator control buttons and the HALT button, and a watchdog signal from the high-level software and control processor. Any anomaly triggers brakes to be set on the slave robot joints and a fault LED to be lit. The operator must reset the safety control electronics to re-activate the system.
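    The anomaly-detection logic described above can be sketched as follows. This is an illustration only, with assumed signal names; in the actual system this monitoring is implemented in programmable logic devices rather than in software.

```python
# Illustrative sketch only: the real system performs this monitoring in
# programmable logic devices in the safety electronics, not in Python.

from dataclasses import dataclass

@dataclass
class SafetySignals:
    amplifier_power_ok: bool     # amplifier power supply status
    halt_pressed: bool           # operator HALT button
    watchdog_ok: bool            # heartbeat from the high-level control processor

def check_safety(s: SafetySignals):
    """Return (set_brakes, light_fault_led) for the monitored signals."""
    anomaly = (not s.amplifier_power_ok) or s.halt_pressed or (not s.watchdog_ok)
    # Any anomaly sets the brakes on the slave robot joints and lights the fault
    # LED; the operator must reset the safety electronics to re-activate the system.
    return anomaly, anomaly
```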
  • [0033]
    The servo-control and high-level software subsystem 106 is implemented in hardware and software. The subsystem 106 includes the servo-control boards 134 and the computational processor boards 130. Servo-control software functions include setting up the control parameters, running the servo-loop on the servo-control boards 134 to control the six motors, implementing the communication between the computation and servo-control boards 134, initializing the servo-control system, and communicating with the electronics subsystem 104 and the user interface subsystem 108.
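    As a rough illustration of what running the servo-loop involves, the sketch below shows a single proportional-derivative position update for one joint. The gains and loop rate are assumptions; the disclosure does not specify the control law or its parameters.

```python
# Minimal sketch of a per-joint position servo update of the kind run on the
# servo-control boards. KP, KD and DT are assumed values for illustration.

KP, KD = 0.8, 0.05   # assumed proportional and derivative gains
DT = 0.001           # assumed 1 kHz servo period in seconds

def servo_step(desired_pos, measured_pos, prev_error):
    """One servo-loop update for a single joint; returns (motor_command, error)."""
    error = desired_pos - measured_pos
    command = KP * error + KD * (error - prev_error) / DT
    return command, error

# The servo loop repeats this update for each of the six motors every period.
```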
  • [0034]
    The user interface subsystem 108 interfaces with a user, controls initialization of the system software and hardware, implements a number of demonstration modes of robot control and computes both the forward and inverse kinematics.
  • [0035]
    In one embodiment of the subsystem 108, the user specifies the control modes of the system through a graphic user interface (GUI) implemented on a computer system, such as a personal computer (PC) or a workstation. Commands entered into the GUI are transmitted over an Ethernet connection or by a serial interface and are received on the real-time software side of the system.
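    A minimal sketch of how such a GUI command might be transmitted to the real-time side over Ethernet is shown below. The message format, host address, and port are assumptions made for illustration; the disclosure states only that commands travel over an Ethernet connection or a serial interface.

```python
# Illustrative sketch of sending a control-mode command from the GUI host to
# the real-time software over Ethernet. The JSON format, host, and port are
# assumptions for demonstration only.

import json
import socket

def send_mode_command(mode: str, host: str = "192.168.0.10", port: int = 5000) -> None:
    """Transmit a control-mode selection to the real-time software."""
    message = json.dumps({"command": "set_mode", "mode": mode}).encode("utf-8")
    with socket.create_connection((host, port), timeout=1.0) as sock:
        sock.sendall(message)

# Example: select a demonstration mode from the GUI
# send_mode_command("teleoperation")
```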
  • [0036]
    FIG. 2 shows the slave robot arm 114. The arm 114 is a six-degrees-of-freedom tendon-driven robotic arm designed to be compact yet exhibit very precise relative positioning capability while maintaining a very large work volume. Physically, the arm measures 2.5 cm in diameter and is 25 cm long from its base 200 to the tip 202. The arm 114 is mounted to a cylindrical base housing 200, measuring 12 cm in diameter by 18 cm long, that contains all of the drives that actuate the arm.
  • [0037]
    The joints of the arm 114 include a torso joint 204, a shoulder joint 206, an elbow joint 208, and a wrist joint 210. The torso joint 204 rotates about an axis aligned with the base axis 212 and positioned at the point where the arm 114 emerges from its base 200. The shoulder joint 206 rotates about two axes 214 that are in the same plane and perpendicular to the preceding links. The elbow joint 208 also rotates about two axes 216 that are in the same plane and perpendicular to the preceding links. The wrist joint 210 makes three-axis rotations: pitch, yaw, and roll.
  • [0038]
    The slave wrist 210 design utilizes a dual universal joint to give a three-degrees-of-freedom, singularity-free, mechanically decoupled joint that operates in a full hemisphere of motion. The master wrist design uses a universal joint to transmit rotational motion through the joint while allowing pitch and yaw motions about the joint, resulting in singularity-free motion in three degrees of freedom over a smaller range of motion.
  • [0039]
    FIG. 3 shows one embodiment of the end effector 220 of the slave robot. The end effector 220 is a force-sensor-instrumented micro-forceps 304 actuated by a miniature DC motor 302. The end effector 220 permits simultaneous sensing of force interactions at the robot tip 306 and manipulation with the forceps 304. Force interactions measured with the force sensor 300 are amplified by an amplifier 308, processed, and used to drive the master arm to amplify the sense of touch at the master handle.
  • [0040]
    FIG. 4 shows a master control device 110 similar to the slave robot 114. The device 110 also has six tendon-driven joints. The master control device 110 is 2.5 cm in diameter and 25 cm long. The base 400 of the master control device 110 houses high-resolution optical encoders for position sensing. Since the smallest incremental movement during microsurgery is about 10 microns, 10 encoder counts is the minimum desirable incremental movement. As a result, one encoder count corresponds to one-micron movement at the tip of the end effector 306. High-resolution encoders are necessary for reducing the amount of gearing needed to achieve the required positional resolution while limiting friction.
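    The resolution requirement above is simple arithmetic: with a smallest useful tip motion of about 10 microns and at least 10 encoder counts spanning it, each count must correspond to about 1 micron of tip motion. The short example below works this through; the assumed tip travel per output revolution is illustrative only and is not a figure from this disclosure.

```python
# Worked example of the resolution requirement described above.
# TIP_TRAVEL_PER_REV_UM is an assumed value for illustration only.

MIN_TIP_MOTION_UM = 10.0       # smallest incremental movement in microsurgery (~10 um)
MIN_COUNTS_PER_MOTION = 10     # at least 10 counts should span that movement

required_um_per_count = MIN_TIP_MOTION_UM / MIN_COUNTS_PER_MOTION
print(required_um_per_count)   # -> 1.0 micron of tip motion per encoder count

# With an assumed tip travel per joint revolution, the needed counts per
# revolution (encoder resolution times gearing) follows directly:
TIP_TRAVEL_PER_REV_UM = 40_000.0   # assumed: 40 mm of tip travel per output revolution
counts_needed = TIP_TRAVEL_PER_REV_UM / required_um_per_count
print(int(counts_needed))          # -> 40000 counts per output revolution
```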
  • [0041]
    In addition, the base 400 preferably includes three arm motors and three wrist motors to create the force-feedback capability on the torso 402, shoulder 404, and elbow 406 axes, and the three-axis wrist 408, respectively. The wrist 408 is coupled to a six-axis force sensor 410 that is coupled to a handle 412.
  • [0042]
    FIG. 5 shows the master control device handle 412. Three push button switches mounted on the handle 412 provide operator control of the system and of the opening and closing of the micro-forceps 304 on the end effector 220 of the slave robot arm. The enable switch 500 enables operator control of the system. The open switch 502 and the close switch 504 control the microsurgical manipulator 304 at the tip of the end effector 306 by opening and closing the micro-forceps 304, respectively.
  • [0043]
    FIG. 6 shows a block diagram of the master handle switch interface board 150. One switch 600 is used to inform the system computer that slave motion should be enabled. The output circuit is a relay 606 that turns system enable on or off. The other two switches 602, 604 cause the slave robot manipulator 304 to move in with one switch and out with the other; if both or neither switch is activated, no motion occurs.
  • [0044]
    The switches 600, 602, 604 each have a resistor in series with its contacts. All switch-resistor pairs are connected in parallel providing a two-terminal switch sensor circuit connecting the nodes 610 and 612. The resistors are selected with different weighting values so that each switch has a different effect on the total resistance of the switch sensor. The switch sensor circuit is one element in a two-element voltage divider network. When different switches and combinations of switches are activated the voltage divider output changes.
  • [0045]
    The voltage divider network output changes are measured by a 7-bit analog-to-digital converter (ADC) 614. The numbers generated by the ADC output reflect which switches are activated. The ADC numbers are decoded into eight discrete ranges using a lookup table 616. The states are modified in the decode logic to eliminate unwanted conditions; for example, activating both motor directions causes no motor action.
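    The weighted-resistor scheme can be checked numerically: each closed switch adds a different parallel resistance, so every on/off combination of the three switches yields a distinct divider voltage and therefore a distinct ADC range. The sketch below demonstrates this with assumed resistor and reference-voltage values; the disclosure gives no component values.

```python
# Illustrative model of the weighted switch-resistor sensor and its decoding.
# All resistor values and the 5 V reference are assumptions for demonstration.

from itertools import product

R_ENABLE, R_OPEN, R_CLOSE = 10_000.0, 22_000.0, 47_000.0   # series resistor per switch
R_FIXED = 10_000.0                                          # lower leg of the divider
V_REF = 5.0

def sensor_resistance(enable: bool, open_sw: bool, close_sw: bool) -> float:
    """Parallel combination of the series resistors of the closed switches."""
    branches = [r for r, on in ((R_ENABLE, enable), (R_OPEN, open_sw), (R_CLOSE, close_sw)) if on]
    if not branches:
        return float("inf")                 # no switch pressed: the divider sees an open circuit
    return 1.0 / sum(1.0 / r for r in branches)

def divider_voltage(enable: bool, open_sw: bool, close_sw: bool) -> float:
    r_top = sensor_resistance(enable, open_sw, close_sw)
    return 0.0 if r_top == float("inf") else V_REF * R_FIXED / (R_FIXED + r_top)

# Each of the eight switch combinations maps to a distinct voltage, which a
# low-resolution ADC can separate into eight discrete ranges via a lookup table.
for combo in product((False, True), repeat=3):
    print(combo, round(divider_voltage(*combo), 3))
```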
  • [0046]
    The enable output circuit is a single-pole-double-throw relay 606 whose contacts are wired to an input port on the main computer. The motor driver output has two bipolar drivers 608 that can drive the motor in either direction or not at all.
  • [0047]
    FIG. 7 shows one embodiment of the RAMS system. The figure illustrates the advantages of compact size and light weight. The entire electronics and servo-control subsystems, containing the VME chassis, the amplifier chassis and the force-control boards, are installed on a movable rack 700. A computer, such as a laptop 702, can be placed on top of the rack 700. The slave robot 704 and the master control device 706 can be placed around an operating table with interface cables connecting them to the rack 700.
  • [0048]
    Other advantages of the RAMS system include easy manipulation of the slave robot arm and manipulator, large work envelope, decoupled joints, low backlash, and low stiction.
  • [0049]
    The slave robot arm and manipulator can be easily maneuvered using the master input device handle and the push-button switches. The switch-operated indexed motion allows the surgeon to efficiently control the robot arm and manipulator.
  • [0050]
    The RAMS system provides a large work envelope because each joint of the slave robot arm 114 has a large range of motion. The torso has 165 degrees of motion while both the shoulder and elbow have a full 360 degrees of motion. This high range of motion is attained by the double-jointed scheme. The wrist design has 180 degrees of pitch and yaw with 540 degrees of roll. Such large motion ranges increase the work volume and reduce the chance of a joint reaching a limit during operation.
  • [0051]
    The mechanically decoupled slave and master arm joints of the RAMS system simplify kinematic computations. Furthermore, mechanically decoupled joints provide partial functionality even if one joint fails.
  • [0052]
    The RAMS system provides low backlash by using dual drive trains that are pre-loaded relative to one another. Low backlash is essential for doing fine manipulations. Five of the six degrees-of-freedom have zero backlash and the sixth, which is a result of the wrist design, has low backlash.
  • [0053]
    The RAMS system also provides low stiction by incorporating precision ball bearings at every rotating location of the robot. This reduces metal-to-metal sliding and minimizes stiction. Low stiction is effective in providing small incremental movements without overshooting or instability.
  • [0054]
    FIG. 8 illustrates a simulated eye microsurgery procedure successfully conducted using the RAMS system. The procedure demonstrated was the removal of a microscopic 0.015-inch diameter particle from a simulated eyeball 800.
  • [0055]
    The RAMS system was demonstrated in other procedures, including a dual-arm suturing procedure. Two RAMS systems were configured as left and right arms to successfully perform a nylon suture to close a 1.5 mm long puncture in a thin sheet of latex rubber.
  • [0056]
    The RAMS system can be used in many other applications, such as a haptic device in a virtual reality (VR) system, synthetic fixtures or virtual augmentation of the real environment, a simulator for training in microsurgical procedures, and a data collection system for measuring the hand motions made by an operator.
  • [0057]
    Although the RAMS system was not developed as a VR system, components of the RAMS system are applicable to VR systems. In one application, the master control arm is a unique haptic device that presents virtual or real force interaction to the user related to touch perception and feedback. The master control arm's ability to measure hand motions to less than 10 microns in translation and to 0.07 degrees in orientation, together with its pencil grasp, makes it ideal as an interface for positioning and feeling a probe in a virtual environment.
  • [0058]
    In another application, synthetic fixtures or virtual augmentation of the real environment are implemented on the RAMS system to assist the user in performing complex tasks. For example, in the eye surgery procedure, constraints on the motion of the slave robot are implemented to allow the surgical instrument mounted on the slave robot to pivot freely about the entry point in the eyeball. Activation of this mode causes loss of user control in two degrees of freedom of the slave robot. The automated control system prevents motion that would move the instrument against the eyeball wall.
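    A minimal sketch of this kind of motion constraint is shown below: the commanded tip velocity is projected onto the instrument axis through the entry point, so lateral translation at the entry is suppressed while insertion and retraction remain possible. The vector conventions and the projection itself are illustrative assumptions, not the control law of this disclosure.

```python
# Illustrative sketch of a virtual-fixture constraint about an entry point.
# The conventions and the choice to remove lateral translation are assumptions
# used to show the idea of constrained pivoting.

import numpy as np

def constrain_to_pivot(commanded_tip_velocity: np.ndarray,
                       tip_position: np.ndarray,
                       entry_point: np.ndarray) -> np.ndarray:
    """Remove the velocity component that would translate the instrument
    sideways at the entry point, keeping only insertion/retraction along the
    instrument shaft; pivoting about the entry point is left to the
    orientation degrees of freedom and is not affected here."""
    axis = tip_position - entry_point
    axis = axis / np.linalg.norm(axis)            # unit vector along the instrument shaft
    along = np.dot(commanded_tip_velocity, axis) * axis
    return along                                   # lateral translation is suppressed

# Example: a command with a sideways component is reduced to pure insertion.
v = constrain_to_pivot(np.array([1.0, 0.5, 2.0]),
                       tip_position=np.array([0.0, 0.0, 10.0]),
                       entry_point=np.array([0.0, 0.0, 50.0]))
print(v)   # -> [0. 0. 2.]: only the component along the shaft axis remains
```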
  • [0059]
    In another application, the user interface part of the RAMS system can be used as a simulator to train for microsurgical procedures. Expert guidance to a novice surgeon can be implemented by replicating the expert motions of a master device on a similar device held by the novice.
  • [0060]
    In a further application, the RAMS system can also be used as a data collection system for measuring the hand motions made by an operator of the system. This data is useful for characterizing the performance of the user. Much may be learned from analysis and characterization of the collected data, including evaluation of the potential microsurgical abilities of surgical residents, prediction of the skill level of a surgeon at any given time, and insight into the nature of highly skilled manual dexterity.
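    A minimal sketch of such data collection is shown below: master handle positions are logged with timestamps during a procedure and can later be saved for replay or reduced to simple dexterity statistics. The record format and the path-length metric are assumptions made for illustration.

```python
# Illustrative sketch of logging operator hand motions for later replay and
# analysis. The record format and the path-length metric are assumptions.

import json
import time

class MotionRecorder:
    def __init__(self):
        self.samples = []   # list of (timestamp, [x, y, z] tip position in microns)

    def record(self, tip_position_um):
        self.samples.append((time.time(), list(tip_position_um)))

    def save(self, path):
        with open(path, "w") as f:
            json.dump(self.samples, f)

    def path_length_um(self):
        """Total distance travelled by the tip; one simple dexterity statistic."""
        total = 0.0
        for (_, a), (_, b) in zip(self.samples, self.samples[1:]):
            total += sum((p - q) ** 2 for p, q in zip(a, b)) ** 0.5
        return total
```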
  • [0061]
    Although only a few embodiments have been described in detail above, those of ordinary skill in the art certainly understand that modifications are possible. For example, as an alternative to constraining the motion of the slave robot in a microsurgery procedure, forces can be simulated on the master handle that would guide the user into making safe motions. All such modifications are intended to be encompassed within the following claims.
Classifications
U.S. Classification: 700/260, 700/245, 700/248
International Classification: G06F3/01, A61B19/00, G06F3/00, B25J9/18, B25J9/16
Cooperative Classification: A61B34/71, G06F3/011, G05B2219/40144, A61B34/30, G05B2219/45123, G06F3/016, A61B34/37, A61B34/76, B25J9/1689, G05B2219/40193
European Classification: A61B19/22B, G06F3/01F, G06F3/01B, B25J9/16T4