
Pint of Robotics: Perceptually-enabled Interactions towards the Operating Theatre of the Future, Dr Alexandros Kogkas, Imperial College London

Date
Wednesday 25 November 2020, 16:30-17:30
Location
Zoom - e-mail tsfc@leeds.ac.uk for access

Abstract: Patient safety and quality of care remain the focus of the smart operating room of the future. Among the most detrimental factors are suboptimal communication among staff, poor flow of information, staff workload and fatigue, and compromised ergonomics and sterility in the operating room. While technological developments constantly transform the operating room layout and the interaction between surgical staff and machinery, a vast array of opportunities arises for the design of systems and approaches that can enhance patient safety and improve workflow and efficiency. In today's age of information, the surgical domain endeavours to follow the industrial paradigm shift towards data-driven processes for improved products and services. To this end, perceptually-enabled data is the foundation stone that will lead to cognition-guided surgery.

The Smart-OR project relies on a real-time gaze-contingent framework that aims to enhance the operator's ergonomics by allowing perceptually-enabled, touchless and natural interaction with the environment. The main feature of the proposed framework is its ability to acquire and utilise the wealth of information provided by the human visual system to allow touchless interaction with medical devices in the operating room. On this basis, a gaze-guided robotic scrub nurse, a gaze-controlled robotised flexible endoscope and a gaze-guided assistive robotic system have been developed.

The introduction of a robotic scrub nurse as an integral component of the Smart-OR may address nursing shortages and empower the team by enabling the surgeon and the human scrub nurse to perform a wider variety of tasks more efficiently and safely. It achieves this by offering a ‘third hand’, but, most importantly, one that is under the firm control of the operating surgeon. The platform allows hands-free, gaze-driven interaction with a screen and a robotic arm, which acts as a robotic nurse assistant by transferring surgical instruments to the surgeon. The surgeon uses natural gaze, via wearable eye-tracking glasses, to select surgical instruments on a screen, in turn prompting the robot to deliver the desired instrument. The platform was tested with surgical teams simulating a common operative scenario with representative theatre staffing and operative field set-up. The results provide valuable insights into the feasibility of integrating the gaze-contingent framework into clinical practice without significant workflow disruption. Most importantly, the data generated by embedding the framework in the surgical workflow can provide invaluable perceptually-enabled information and contribute to the recently emerged field of Surgical Data Science, towards Surgery 4.0.
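To make the interaction model concrete, the Python sketch below shows one plausible shape for such a gaze-driven selection loop: the eye tracker's gaze point is mapped to on-screen instrument regions, and a sustained dwell confirms the selection and triggers delivery. This is a minimal illustration under stated assumptions, not the Smart-OR implementation; the names (get_gaze, deliver_instrument, DWELL_TIME_S, the region layout) are hypothetical placeholders for whatever eye-tracking and robot APIs the platform actually uses.

    """Minimal sketch (not the authors' implementation) of gaze-dwell
    instrument selection: a fixation held on an on-screen icon for a set
    dwell time triggers the robot to deliver that instrument."""

    import time

    DWELL_TIME_S = 1.0  # assumed dwell threshold before a selection is confirmed

    # Hypothetical screen layout: instrument name -> (x0, y0, x1, y1) in pixels
    INSTRUMENT_REGIONS = {
        "scalpel": (0, 0, 200, 200),
        "forceps": (200, 0, 400, 200),
        "scissors": (400, 0, 600, 200),
    }

    def region_under_gaze(gaze_xy):
        """Return the instrument whose on-screen region contains the gaze point."""
        x, y = gaze_xy
        for name, (x0, y0, x1, y1) in INSTRUMENT_REGIONS.items():
            if x0 <= x < x1 and y0 <= y < y1:
                return name
        return None

    def selection_loop(get_gaze, deliver_instrument):
        """Poll gaze samples and confirm a selection after a sustained dwell.

        get_gaze() -> (x, y) screen coordinates from the eye tracker (assumed).
        deliver_instrument(name) commands the robot arm to fetch the
        instrument (assumed).
        """
        current, dwell_start = None, None
        while True:
            target = region_under_gaze(get_gaze())
            if target != current:
                # Gaze moved to a new region (or off-screen): restart the dwell timer.
                current, dwell_start = target, time.monotonic()
            elif current is not None and time.monotonic() - dwell_start >= DWELL_TIME_S:
                deliver_instrument(current)        # hand off to the robotic scrub nurse
                current, dwell_start = None, None  # reset for the next selection
            time.sleep(0.02)  # ~50 Hz polling, typical of wearable eye trackers

A dwell trigger of this kind is one common way to avoid the "Midas touch" problem in gaze interfaces, where every glance would otherwise count as a command; the real system may well use a different confirmation mechanism.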

Short Bio: Alexandros Kogkas received a Diploma in Electrical & Computer Engineering from the University of Patras, Greece, and a PhD in Medical Robotics from the Department of Surgery and Cancer, Imperial College London. His PhD research focused on developing a gaze-contingent framework for perceptually-enabled interactions in the operating theatre, under the supervision of Dr George Mylonas (engineer) and Prof Ara Darzi (surgeon). He is currently a Research Associate at the Hamlyn Centre for Robotic Surgery, Imperial College London. His research interests include human-computer interfaces, human-robot interaction, eye-tracking, computer vision and medical devices.