Gaze-Contingent Motor Channelling, haptic constraints and associated cognitive demand for robotic MIS

Med Image Anal. 2012 Apr;16(3):612-31. doi: 10.1016/j.media.2010.07.007. Epub 2010 Aug 1.

Abstract

The success of MIS is coupled with an increasing demand on surgeons' manual dexterity and visuomotor coordination due to the complexity of instrument manipulation. The use of master-slave surgical robots avoids many of the drawbacks of MIS but, at the same time, increases the physical separation between the surgeon and the patient. Tissue deformation, combined with the restricted workspace and visibility of an already cluttered environment, can raise critical issues for surgical precision and safety. Reconnecting the essential visuomotor sensory feedback is important for the safe practice of robot-assisted MIS procedures. This paper introduces a novel gaze-contingent framework for real-time haptic feedback and virtual fixtures by transforming visual sensory information into physical constraints that can interact with the motor sensory channel. We demonstrate how motor tracking of deforming tissue can be made more effective and accurate through the concept of Gaze-Contingent Motor Channelling. The method is also extended to 3D by introducing the concept of Gaze-Contingent Haptic Constraints, where eye gaze is used to dynamically prescribe and update safety boundaries during robot-assisted MIS without prior knowledge of the soft-tissue morphology. Initial validation results on both simulated and robot-assisted phantom procedures demonstrate the potential clinical value of the technique. To assess the cognitive demand associated with the proposed concepts, functional Near-Infrared Spectroscopy is used and preliminary results are discussed.
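The core idea of motor channelling is a haptic force that attracts the instrument tip toward the surgeon's current fixation point. The abstract does not specify the controller, so the sketch below is a hypothetical illustration: a saturated spring law with assumed `stiffness` and `max_force` parameters, not the paper's actual formulation.

```python
import numpy as np

def channelling_force(tip_pos, gaze_pos, stiffness=0.8, max_force=5.0):
    """Spring-like attractive force pulling the instrument tip toward the
    gaze fixation point (illustrative sketch; gains are assumed values).
    """
    error = np.asarray(gaze_pos, dtype=float) - np.asarray(tip_pos, dtype=float)
    force = stiffness * error
    magnitude = np.linalg.norm(force)
    if magnitude > max_force:
        # Saturate the force so haptic feedback stays bounded and stable
        # even when gaze and tip are far apart.
        force *= max_force / magnitude
    return force

# Small deviation: force is proportional to the tip-to-gaze error.
print(channelling_force([0.0, 0.0, 0.0], [1.0, 0.0, 0.0]))   # [0.8 0.  0. ]
# Large deviation: force is clipped to max_force.
print(channelling_force([0.0, 0.0, 0.0], [10.0, 0.0, 0.0]))  # [5. 0. 0.]
```

In a gaze-contingent haptic-constraint setting, the same idea can be inverted: instead of an attractive channel, gaze defines a forbidden region and the force repels the tip from the boundary.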

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Cognitive Reserve / physiology
  • Eye Movement Measurements
  • Fixation, Ocular / physiology*
  • Humans
  • Minimally Invasive Surgical Procedures / methods*
  • Robotics / methods*
  • Surgery, Computer-Assisted / methods*
  • Touch / physiology*
  • User-Computer Interface*