The REINS project designs and investigates haptic communication interfaces (reins) between a human agent and a mobile robot guide. The reins will facilitate joint navigation and inspection of a space under conditions of low visibility, as occur frequently in fire fighting. The REINS project aims to map the communication landscape in which humans (fire fighters, but also the visually impaired) might work with robots, with an emphasis on tactile and haptic interaction.
Interview on CBS/KCBS Radio News, San Francisco:
In a low-visibility hazardous environment, robotic solutions can help to detect obstacles in a path, along with properties such as their lightness and texture, in order to guide a human follower. Besides the lack of visibility, fire scenes are often very noisy, which limits the reliability of audio feedback from a guider. In such conditions a haptic (and possibly tactile) interface is a natural way for a guider and a follower to communicate. Therefore, characterizing human-human interaction in a haptic communication scenario, where one partner is blindfolded, can provide a viable basis for designing optimal human-robot interaction algorithms to serve humans working in hazardous environments such as fire fighting.
A good solution would be to deploy an intelligent robot that moves in front of its human counterparts to inform them of what lies ahead and of the best paths to the target location, much like a dog guiding a blind person. How human guiders learn to guide blindfolded followers along an arbitrarily complex path therefore provides an ideal scenario for studying how optimal and stable controllers could be learnt for such human-robot interaction problems. We study how the polynomial parameters of a linear state-dependent controller evolve across learning trials in which a guider leads a blindfolded follower along an arbitrarily wiggly path using a hard rein to provide haptic signals.
Full video of experiments: