RoboPatient: Robot-Assisted Learning of Constrained Haptic Information Gain

The project builds on a soft abdominal phantom with sensorised internal organs, which will be extended to display facial expressions of pain when organs with physiological conditions are physically examined.

This is an EPSRC-funded project led by Imperial College London, with the University of Cambridge and the University of Oxford as partners.

This project will investigate how human participants innovate haptic exploration behaviours to detect abnormalities in soft tissue under palpation force constraints. We focus on training medical students to perform physical examinations on patients under constraints imposed by pain expressions, which are conditioned by different gender and cultural backgrounds.

Primary examination of a patient by a General Practitioner (GP) often involves physical examination to estimate the condition of internal organs in the abdomen. Facial expressions during palpation are often used as feedback to test a range of medical hypotheses for diagnosis. This leaves room for misinterpretation when the GP finds it difficult to establish a stable understanding of the patient’s background. This is a critical medical interaction challenge in the UK, where gender and cultural interactions among both GPs and patients are diverse.

Given the task of estimating the most likely value of a variable, such as the position of a hard formation in soft tissue, humans (including examining doctors) use various internal and external control actions, such as varying finger stiffness, the shape and orientation of the fingers, the depth of indentation, and the position and velocity of the fingers. We call this behavioural lensing of haptic information. A deeper understanding of how behavioural lensing happens under constraints is particularly important for improving the quality of physician training and developing robust physical examination methods. In the case of examining a patient, behavioural constraints are imposed by pain expressions that can render diverse interpretations depending on the cultural and gender context of the interaction between the physician and the patient.
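The idea of haptic information gain under a force constraint can be illustrated with a minimal sketch. This is purely an assumed toy model for illustration, not the project's actual method: a hard lump sits at one of `n` sites under soft tissue, each press returns a noisy stiffness reading, and the pain-imposed force cap limits press depth, so readings get noisier as the allowed force shrinks. A Bayesian belief over the lump's position is then updated after each press.

```python
import numpy as np

# Toy 1-D palpation model (all names and numbers are illustrative assumptions).
rng = np.random.default_rng(0)

n = 20                     # candidate lump sites along the abdomen
true_pos = 7               # ground-truth lump site (known only to the simulator)
force_limit = 0.5          # pain-imposed palpation force cap, in (0, 1]
sigma = 0.3 / force_limit  # reading noise grows as the force cap tightens

def probe(x):
    """Noisy stiffness reading at site x under the force constraint."""
    return (1.0 if x == true_pos else 0.0) + rng.normal(0.0, sigma)

def entropy(p):
    """Shannon entropy (nats) of a belief vector."""
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

# Bayesian belief over the lump site, updated after each press of a
# simple sweeping palpation policy.
belief = np.full(n, 1.0 / n)
for step in range(5 * n):                    # five sweeps over all sites
    x = step % n
    z = probe(x)
    mu = (np.arange(n) == x).astype(float)   # predicted reading per hypothesis
    likelihood = np.exp(-((z - mu) ** 2) / (2 * sigma ** 2))
    belief = belief * likelihood
    belief /= belief.sum()

print("estimated lump site:", int(np.argmax(belief)))
print("residual entropy (nats): %.3f" % entropy(belief))
```

Tightening `force_limit` inflates `sigma`, so the belief remains more entropic after the same number of presses; this is the sense in which pain expressions constrain the information gained per palpation action.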

Robustness of medical examination can be improved by introducing new technology-assisted tools during medical education. Clinical sessions in medical training involve demonstrations by an experienced GP trainer on real patients. However, it is difficult to provide a consistent range of examples across different student groups, because each lesson depends on the patients available in the ward. Moreover, patients resent repeated examination by students.

In this project, we propose to use a robotic patient with sensorised and controllable internal organs, together with a detailed finite-element-based abdominal tissue model, to visualise enhanced sensor data so that students can gain a deeper insight into a tutor's demonstrations for a given set of symptoms. We will use a modular robotic face design to present pain expressions with different combinations of gender and cultural backgrounds during manual abdominal palpation. The palpation force constraints presented by such facial expressions, together with physical sensor data visualisation, will allow medical students to explore their own variations of examination across different interaction contexts and thereby improve their physical examination behaviours.

The outcome of this project will be a new robot-assisted taught module for medical students. First trials will be conducted at the University of Surrey, after which the module will be introduced to other GP trainers through workshops of the Royal College of General Practitioners. Student and tutor feedback from the pilot trials will be used to improve the robo-patient design in a user-centred co-design framework.

The team of investigators:

Dr. Thrishantha Nanayakkara (principal investigator), Dyson School of Design Engineering, Imperial College London

Dr. Nejra Van Zalk (Co-investigator), Dyson School of Design Engineering, Imperial College London

Dr. Mazdak Ghajari (Co-investigator), Dyson School of Design Engineering, Imperial College London

Dr. Fumiya Iida (Co-investigator), Department of Engineering, Cambridge University

Professor Simon de Lusignan (Co-investigator), Nuffield Department of Primary Care Health Sciences, University of Oxford

Media and outreach workshops

The Engineer article titled “RoboPatient combines AR and robotics to train medics”.

Workshop on Human-Robot Medical Interaction at the 2020 IEEE International Conference on Human-Robot Interaction.

Soft Robotic Lung Phantom to show how tissue stiffness differences can lead to ventilator induced lung injury (VILI) during COVID-19 ventilation.

Workshop on Robot-Assisted Training for Primary Care at the 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems.