Suggested project scopes

The following suggested project scopes are starting points for discussion to identify a specific question you want to address over 12 weeks. You can discuss with the module leaders Thrishantha Nanayakkara, Petar Kormushev, Nicolas Rojas, and Weston Baxter to refine a research question to investigate in depth, or you can propose a new project. You can do a project in a team of up to 3 members. You will work in the robotics research labs in the basement of the Dyson Building to conduct experiments, and you will also get the support of the research teams in those labs.

Robo-patient (Supervised by Dr. Thrishantha Nanayakkara): Dr. Thrishantha Nanayakkara is currently leading an EPSRC-funded project called the “Robo-patient project” with partners from the Radcliffe Hospital, Oxford University, and Cambridge University. The objective of this project is to design and test a robotic face that can express the patient’s pain level while a physician examines an affected area of the patient’s body. This pain feedback allows the physician trainee to learn physical examination techniques that are less painful for patients.

You can test whether a 2D or a 3D robotic face gives a trainee physical examiner a better perception of the level of pain the patient feels when an affected area of the body is palpated.

The student(s) can choose to project an animated face onto a soft robotic face and use a clever combination of animation and facial tissue actuation to present pain expressions that render a maximum sense of agency. An animated video projection could allow presenting patients of different genders and cultural backgrounds without having to fabricate many soft robotic faces. Students will also measure the palpation forces, movements, and muscle activity of the physical examiner using sensors already available in the laboratory. These data allow you to quantify whether a 2D or a 3D face leads to the greater change in palpation behaviour across learning trials.
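As a rough illustration of how that quantification could look, the sketch below compares peak palpation force across learning trials: a negative trend in peak force would suggest the trainee is learning to palpate more gently. The force traces and the slope-based metric are illustrative assumptions, not the lab's actual analysis pipeline.

```python
# Hypothetical sketch: track how peak palpation force changes across
# learning trials. The force traces below are illustrative stand-ins
# for the lab's force-sensor recordings.

def peak_force(trace):
    """Peak absolute force (N) in one palpation trial."""
    return max(abs(f) for f in trace)

def learning_slope(peaks):
    """Least-squares slope of peak force over trial index: a negative
    slope suggests palpation is becoming gentler across trials."""
    n = len(peaks)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(peaks) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, peaks))
    den = sum((x - mean_x) ** 2 for x in xs)
    return num / den

# Illustrative per-trial force traces (N) for one participant
trials = [[3.1, 4.8, 2.0], [2.9, 4.1, 1.8], [2.5, 3.6, 1.7]]
peaks = [peak_force(t) for t in trials]
print(peaks)                           # [4.8, 4.1, 3.6]
print(round(learning_slope(peaks), 3)) # -0.6 (N per trial)
```

Running the same metric separately for the 2D-face and 3D-face conditions would give two slopes that can be compared statistically.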

Robotic assistant (Supervised by Dr. Petar Kormushev): For this project, the students will use our Pepper robot. The goal of this project is twofold: (1) exploratory, to study the built-in capabilities of the Pepper robot, and (2) practical, to implement a real-world human-robot interaction (HRI) demo using Pepper. The aim is to come up with a meaningful task for Pepper to perform in order to assist a human in some way. For example, an information assistance task, where a person asks for information and the robot provides it in a multi-modal way (e.g. by talking, showing information on the screen, using hand gestures, or moving the wheeled base). The information could be something relevant to us, e.g. about the Dyson School, the research or location of staff members, or about Imperial’s campus. The research question to address is how to design the interaction so that it maximises the use of the robot’s available capabilities to provide the most useful information in a fast and easy-to-access way. Additionally, it would be interesting to adapt the behaviour of the robot to the human user, e.g. according to the user’s gender, age, or other differences relevant to the task. Students may decide to dress the robot appropriately, or to give it a gender identity, if that makes the interaction task more realistic. Several projects can be run for different interaction tasks.
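One way to think about "maximising the use of available capabilities" is as a modality-selection problem. The sketch below is a hypothetical, robot-agnostic planner that picks which output channels to combine for a given query; the query types, modality names, and distance rule are all illustrative assumptions (on the real Pepper, each modality would map to a NAOqi service such as speech, the tablet, or gestures).

```python
# Hypothetical multi-modal response planner for an information-assistance
# task. Modality names and the distance threshold are illustrative.

def plan_response(query_type, user_distance_m):
    """Choose the set of modalities used to answer a query."""
    modalities = ["speech"]  # speech is assumed always available
    if query_type == "directions":
        # Point and orient the base toward the destination
        modalities += ["gesture", "base_rotation"]
    elif query_type == "staff_info":
        # Show a photo / office number on the chest tablet
        modalities.append("tablet")
    if user_distance_m > 2.0 and "tablet" in modalities:
        modalities.remove("tablet")  # screen is unreadable from afar
    return modalities

print(plan_response("staff_info", 1.0))   # ['speech', 'tablet']
print(plan_response("directions", 3.0))   # ['speech', 'gesture', 'base_rotation']
```

A study could then compare fixed single-modality answers against this kind of adaptive combination to see which users find faster and easier to access.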

Human dexterity (Supervised by Dr. Nicolas Rojas): Students will be briefed to study human manual dexterity in activities of daily living (ADLs) such as eating and dressing, taking into account the gender dimension to differentiate the needs of women and men according to age. Surprisingly, scientific knowledge about human manipulation is limited; indeed, it seems we know more about ape manipulation than about human manual dexterity. To fill this gap, students will select a single ADL task of interest to them and will propose experiments to determine gender- and age-based manipulation primitives and behaviour using electromyography and sensor gloves. In addition to the scientific knowledge obtained, the research results will also be useful to inform the design and control of prosthetic hands and hand wearables for different human groups.

Haptic feedback displays (Supervised by Dr. Thrishantha Nanayakkara): Students will be briefed to design a wearable haptic feedback interface to remotely feel a robotic probe that examines soft tissue to locate a buried hard nodule. The soft tissue represents the clinical scenario of locating a tumor in soft tissue. The user will be able to control the remote probe already available in the lab to improve haptic perception. This scenario allows the students to understand which design features of the wearable haptic feedback interface give the best perception of small variations in hardness of the remote soft tissue.
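At its core, the interface has to map a stiffness estimate at the probe to a drive level on the wearable display. The sketch below shows one minimal version of that mapping; the stiffness range and the linear force/indentation estimate are illustrative design assumptions, not lab specifications.

```python
# Hypothetical tele-palpation feedback mapping: stiffness estimated at
# the remote probe (force / indentation depth) is scaled to a 0..1
# vibrotactile drive level on the wearable. All ranges are illustrative.

def estimate_stiffness(force_n, indentation_m):
    """Local stiffness (N/m) from probe force and indentation depth."""
    return force_n / indentation_m

def drive_level(stiffness, s_min=200.0, s_max=2000.0):
    """Map stiffness to a 0..1 actuator drive, clipped to the display range."""
    level = (stiffness - s_min) / (s_max - s_min)
    return min(1.0, max(0.0, level))

soft   = estimate_stiffness(0.5, 0.002)  # ~250 N/m, healthy tissue
nodule = estimate_stiffness(3.0, 0.002)  # ~1500 N/m, hard inclusion
print(round(drive_level(soft), 3))    # 0.028
print(round(drive_level(nodule), 3))  # 0.722
```

The design question is then which rendering of that drive level (vibration amplitude, squeeze force, skin stretch, etc.) makes small differences in stiffness most perceptible.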

Autonomous soft grasping (Supervised by Dr. Nicolas Rojas): Students will be briefed to develop an autonomous grasping system to pick objects from agglomerates using soft hands. Such a solution can be useful to automate monotonous tasks in e-commerce order fulfillment. Students will select one warehouse automation problem (e.g., kitting, parts sorting, unloading a bin of random objects) and then implement a solution for it based on: 1) a self-adaptive robot hand made of soft materials such as silicone, installed on a UR5 robot arm; 2) a vision sensor such as a Time-of-Flight or an RGB-D camera; and 3) a data-driven (e.g., machine learning) or model-based control algorithm.
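One small but central decision step in such a pipeline is choosing which object in the clutter to pick next. The sketch below uses a simple heuristic, assumed here for illustration: grasp the topmost object first, since it is least likely to be occluded. The object names and heights are placeholders for output from the vision pipeline.

```python
# Hypothetical target-selection step for pick-from-clutter: given
# segmented objects with top-surface heights estimated from an RGB-D
# depth image, pick the topmost (least occluded) object first.
# Real input would come from the vision system; the chosen target
# would then be passed to the UR5 grasp planner.

def next_grasp_target(objects):
    """objects: {name: top_surface_height_m}; returns the name to pick,
    or None when the bin is empty."""
    if not objects:
        return None
    return max(objects, key=objects.get)

bin_contents = {"box_a": 0.12, "toy_b": 0.19, "bag_c": 0.07}
print(next_grasp_target(bin_contents))  # toy_b
```

A self-adaptive soft hand relaxes how precise this selection needs to be, since the hand conforms to the object on contact.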

A robotic knee joint for efficient locomotion (Supervised by Dr. Thrishantha Nanayakkara): Recent work suggests that the cam profile of the human knee joint provides a joint-angle-dependent damping profile that makes biped walking very efficient. This project will use a biped passive dynamic walker to design and compare a novel knee joint. Such an innovation can be useful for future prosthetics.
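To make "angle-dependent damping" concrete, a minimal model is a resistive torque tau = -b(theta) * omega, where the damping coefficient b(theta) is highest near full extension and drops as the knee flexes. The coefficient shape and values below are illustrative assumptions for a sketch, not measured human or walker data.

```python
# Hypothetical angle-dependent knee damping model: tau = -b(theta) * omega.
# b(theta) is assumed largest near full extension (theta = 0), roughly as
# the cam profile is suggested to provide. Values are illustrative only.
import math

def damping_coefficient(theta_rad, b_min=0.1, b_max=1.0):
    """Damping (N*m*s/rad): high near extension, low when flexed."""
    return b_min + (b_max - b_min) * max(0.0, math.cos(theta_rad))

def knee_damping_torque(theta_rad, omega_rad_s):
    """Resistive torque (N*m) opposing the knee's angular velocity."""
    return -damping_coefficient(theta_rad) * omega_rad_s

print(round(knee_damping_torque(0.0, 2.0), 2))          # -2.0 (firm near extension)
print(round(knee_damping_torque(math.pi / 2, 2.0), 2))  # -0.2 (nearly free when flexed)
```

A candidate knee design could be evaluated by how closely its measured torque-angle-velocity behaviour matches a profile like this, and by the walker's resulting gait efficiency.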

Myoelectric control of a modular prosthetic hand (Supervised by Dr. Nicolas Rojas): The OLYMPIC hand is a fully modular prosthetic hand design with finger- and wrist-level modularity, allowing tendon-driven fingers to be removed and attached without tools, re-tendoning, or rewiring. Its innovative design places the motors behind the hand for remote actuation of the tendons, which are contained solely within the fingers. For this project, students will be briefed to implement a control platform for the OLYMPIC hand based on myoelectric signals. The students will select sensors and develop the signal acquisition system, with the objective of implementing a control scheme for the prosthetic hand.
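A common starting point for myoelectric control is a windowed RMS envelope compared against a calibrated threshold. The sketch below shows that idea in its simplest form; the sample values, threshold, and open/close command set are illustrative assumptions, not the project's specified control scheme.

```python
# Hypothetical myoelectric trigger: the RMS amplitude of a short EMG
# window is compared against a calibrated threshold to decide whether
# the hand should close. Signal values and threshold are illustrative;
# real data would come from the acquisition system the students build.
import math

def rms(window):
    """Root-mean-square amplitude of one EMG window (mV)."""
    return math.sqrt(sum(x * x for x in window) / len(window))

def hand_command(window, threshold_mv=0.5):
    """'close' when muscle activation exceeds the threshold, else 'open'."""
    return "close" if rms(window) > threshold_mv else "open"

resting     = [0.02, -0.03, 0.01, -0.02]
contracting = [0.9, -1.1, 0.8, -1.0]
print(hand_command(resting))      # open
print(hand_command(contracting))  # close
```

Proportional control (drive speed scaled by the envelope) or pattern recognition over multiple channels are natural next steps once this baseline works.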

Goat hoof inspired shoe sole (Supervised by Dr. Thrishantha Nanayakkara): A slip-resistant shoe would be very useful for elderly people to go outdoors without worrying about slipping and falling. Our recent experiments to understand how mountain goat hooves reduce slip led to the discovery that the stiffness of certain joints plays a key role in slip resistance. The hoof can passively deform and vibrate to reduce slip when the key joints in the hoof are within a certain stiffness range. In this project, you will use biological inspiration and recent robotic findings to design and test a shoe sole printed on a multi-material 3D printer or made by depositing soft silicone rubber layers. The experiments will characterise directional slip resistance and compare at least two designs. You will use laboratory equipment already available, such as an XY table and a 6-axis force/torque sensor, to conduct the experiments.
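The basic slip metric from the force/torque data is the ratio of tangential to normal force at slip onset, computed separately for each pull direction so that the two sole designs can be compared. The sketch below shows that computation; the force readings are illustrative, not measured values.

```python
# Hypothetical slip metric from 6-axis force/torque data: the ratio of
# tangential to normal force at the moment of slip gives the effective
# friction coefficient for one pull direction. Sample forces below are
# illustrative stand-ins for XY-table experiments.
import math

def friction_coefficient(fx, fy, fz):
    """Tangential/normal force ratio; fz is the normal load (N)."""
    return math.hypot(fx, fy) / abs(fz)

# Illustrative slip-onset readings (N) for one sole in two directions
forward  = friction_coefficient(fx=8.0, fy=0.0, fz=20.0)
sideways = friction_coefficient(fx=0.0, fy=6.0, fz=20.0)
print(round(forward, 2))   # 0.4
print(round(sideways, 2))  # 0.3
```

Sweeping the pull direction over a full circle would give each design a directional slip-resistance profile, making anisotropy (as in the goat hoof) directly visible.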

Soft palm for realistic handshakes (Supervised by Dr. Nicolas Rojas): Students will be briefed to study human handshakes, taking into account the gender dimension to differentiate the results of women and men according to age. The project will focus on proposing experiments to determine the key variables that make a handshake feel human (e.g., force, compliance), and students will use the results to design and implement a soft robotic palm and robot hand that achieve human-like handshakes. The objective of this robotic hand is to improve human-robot interaction in assistive technologies.