Human-Robot Collaboration

ergoCub FOR PHYSICAL HUMAN-ROBOT INTERACTION

 

 

We give humanoid robots the ability to help and collaborate with humans. 

 

 

Why


In modern societies, the demand for physical assistance to humans is increasing.
In factories, production workers execute repetitive tasks that, in the long run, often cause musculoskeletal diseases. In clinics, orthopedic patients need orthoses and prostheses to cope with their daily impairments. At home, elderly people require a wide range of physical assistance to compensate for their muscles slowly losing strength.

We therefore need robot collaborators that perceive humans and correct the inefficient collaboration and unergonomic interaction that, in the long term, lead to musculoskeletal diseases.



What


Robots can fulfill the human need for physical assistance. Traditional robots, however, are designed to act for humans, whereas human-robot collaboration requires robots to act with humans in a shared workspace.
Robots that are nowadays proficient in physical interaction must therefore become just as proficient in physical collaboration.

We need to develop safe, dependable systems that can perceive, react to, and collaborate with human beings; to understand the biomechanics of human collaborative motion; to track, understand, and predict human motion in real time in dynamic environments; to integrate cognition technologies into human-robot collaboration; and to develop tools for intuitive collaboration that increase human performance.

To pursue these objectives, we attempt to answer the following two research questions:

Q1: How can a robot help a human?

Q2: How can a human help a robot?

A fundamental concern is to define, from a mathematical perspective, what human (or robot) help actually is. Part of our theoretical research is directed at tackling these open questions.



How

Research on wearable sensors for force sensing


Sandals with force/torque sensors

We have developed sandals equipped with force/torque sensors made in-house at IIT, from which the interaction forces between the human feet and the floor can be precisely measured. Each force/torque sensor is also equipped with an IMU and two temperature sensors.
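As a rough illustration of how such a six-axis measurement is used, the sketch below computes the centre of pressure of the foot from a measured force/torque wrench. The sensor-frame convention, the contact threshold, and the function name are illustrative assumptions, not the actual processing running on the sandals.

```python
import numpy as np

def center_of_pressure(force, torque, min_fz=1.0):
    """Estimate the centre of pressure (CoP) in the sensor frame from a
    6-axis force/torque measurement.

    Assumes the sensor frame has its z-axis normal to the sole and its
    origin on the contact plane; `force` (N) and `torque` (N*m) are
    3-vectors.  Returns None when the normal force is too small to give a
    reliable CoP.
    """
    fx, fy, fz = force
    tx, ty, _ = torque
    if abs(fz) < min_fz:            # foot not (firmly) in contact
        return None
    # For a planar contact at z = 0, the CoP satisfies:
    #   tx =  cop_y * fz,   ty = -cop_x * fz
    cop_x = -ty / fz
    cop_y = tx / fz
    return np.array([cop_x, cop_y, 0.0])

# Example with a made-up measurement: ~600 N of vertical load applied
# slightly ahead of the sensor origin.
wrench_force = np.array([5.0, -2.0, 600.0])
wrench_torque = np.array([12.0, -30.0, 0.5])
print(center_of_pressure(wrench_force, wrench_torque))   # ~[0.05, 0.02, 0.0]
```

The contact threshold matters because the CoP is ill-conditioned when the normal force approaches zero, which is why the sketch returns None in that case.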

Sandals with tactile-sensor-based insole instead of force/torque sensors

To reduce the cost of a pair of sandals, it would be ideal to substitute the force/torque sensors with cheaper ones. Hence, together with the iCub research line, we developed an insole able to measure the pressure distribution produced by the foot in contact with the sandal. The insole is an array of capacitive tactile sensors, and the accompanying video shows the activation of the sensor array when the human foot exerts pressure on the sandal.
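In the same spirit, the sketch below shows how an array of capacitive taxel readings can be turned into a pressure map, a total normal force, and a centre of pressure. The grid size, taxel pitch, and linear calibration are made-up assumptions, not the parameters of the actual insole.

```python
import numpy as np

# Illustrative insole parameters (assumptions, not the IIT insole specs).
TAXEL_PITCH_M = 0.01           # assumed 1 cm spacing between taxels
PA_PER_COUNT = 50.0            # assumed linear raw-count -> pressure gain
TAXEL_AREA_M2 = TAXEL_PITCH_M ** 2

def insole_summary(raw_counts):
    """raw_counts: 2-D array of taxel readings (baseline already removed).
    Returns the pressure map (Pa), the total normal force (N), and the
    centre of pressure (m) in the insole grid frame."""
    pressure = np.clip(raw_counts, 0, None) * PA_PER_COUNT        # Pa
    forces = pressure * TAXEL_AREA_M2                             # N per taxel
    total_force = forces.sum()
    if total_force < 1e-6:
        return pressure, 0.0, None
    # Centre of pressure as the force-weighted mean taxel position.
    rows, cols = np.indices(raw_counts.shape)
    cop = np.array([(cols * forces).sum(), (rows * forces).sum()])
    cop = cop / total_force * TAXEL_PITCH_M
    return pressure, total_force, cop

raw = np.zeros((32, 12))
raw[8:12, 4:8] = 200.0         # fake heel-strike activation
_, f_tot, cop = insole_summary(raw)
print(f"total force ~ {f_tot:.1f} N, CoP ~ {cop} m")
```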

[Video gallery: Human-Robot Collaboration]


Research on the online estimation of human musculoskeletal stresses

The wearable sensors described above are fundamental for retrieving the external forces acting on the human. They are complemented by other wearable sensors that measure human motion: we use the Xsens wearable sensors to measure the position and orientation of the human limbs, and then apply online estimation algorithms developed at IIT to retrieve the human posture.
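The core idea behind this posture retrieval can be sketched as follows: each IMU provides the orientation of a body link in a common world frame, and the joint configuration is recovered from the relative rotation between parent and child links. The hinge-joint model, axis, and function name below are illustrative assumptions rather than the actual IIT estimation pipeline.

```python
import numpy as np
from scipy.spatial.transform import Rotation as R

def hinge_joint_angle(world_R_parent, world_R_child,
                      axis=np.array([0.0, 1.0, 0.0])):
    """Return the angle (rad) of a 1-DoF hinge joint about `axis`,
    expressed in the parent frame, from the two measured link orientations."""
    # Relative rotation between the two links measured in the world frame.
    parent_R_child = world_R_parent.inv() * world_R_child
    rotvec = parent_R_child.as_rotvec()
    # Signed angle about the assumed hinge axis.
    return float(np.dot(rotvec, axis))

# Example: thigh level, shank pitched forward by 30 degrees about y.
thigh = R.identity()
shank = R.from_euler("y", 30, degrees=True)
print(np.degrees(hinge_joint_angle(thigh, shank)))   # ~30.0
```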

By combining the human motion, the forces measured by the sensorised sandals, and the human model, we can also estimate the human musculoskeletal stresses.

We have developed online algorithms, based on Maximum-a-Posteriori (MAP) estimation, that estimate the human musculoskeletal stresses in any human configuration.
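A minimal sketch of such a MAP estimator, under a linear-Gaussian assumption, is given below: the dynamic variables (joint torques, accelerations, contact forces, ...) are stacked in a vector d, the sensor readings are stacked in y, and a Gaussian prior encodes the model constraints. The matrices in the example are tiny placeholders, not the actual human dynamics model.

```python
import numpy as np

def map_estimate(Y, y, Sigma_y, mu_d, Sigma_d):
    """Return the MAP estimate of d for the Gaussian model
       y = Y d + v,  v ~ N(0, Sigma_y),  d ~ N(mu_d, Sigma_d)."""
    Sy_inv = np.linalg.inv(Sigma_y)
    Sd_inv = np.linalg.inv(Sigma_d)
    precision = Sd_inv + Y.T @ Sy_inv @ Y          # posterior information matrix
    rhs = Sd_inv @ mu_d + Y.T @ Sy_inv @ y
    return np.linalg.solve(precision, rhs)

# Toy example: two dynamic variables, three noisy measurements.
Y = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [1.0, 1.0]])
y = np.array([1.1, 2.2, 3.0])
d_hat = map_estimate(Y, y,
                     Sigma_y=0.05 * np.eye(3),
                     mu_d=np.zeros(2),
                     Sigma_d=10.0 * np.eye(2))
print(d_hat)      # close to [1.0, 2.1]
```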

The video above visualises the outcome of our estimation algorithms for human musculoskeletal stresses. More precisely, the human performs some random motions while the robot stays still. The whiter the circles around the human avatar on the right-hand side, the higher the estimated musculoskeletal stresses. The yellow arrows at the avatar's feet represent the estimated forces between the human and the floor.
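As a small sketch of the colour convention described above, the snippet below maps a normalised stress value to a grey level, so that whiter means higher stress. The normalisation bound and function name are illustrative.

```python
def stress_to_grey(stress, max_stress=100.0):
    """Map a stress value to an (r, g, b) grey level with components in [0, 1]."""
    level = max(0.0, min(stress / max_stress, 1.0))
    return (level, level, level)

print(stress_to_grey(20.0))   # dark grey  -> low estimated stress
print(stress_to_grey(95.0))   # near white -> high estimated stress
```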

[Video: Research on the online estimation of human musculoskeletal stresses]

Research on the control of human-robot and physical interactions

[Video: Research on the control of human-robot and physical interactions]
