Researchers at Georgia Tech have developed a technique that allowed a robot to teach itself to pull a hospital gown onto someone’s arm. By analyzing nearly 11,000 computer simulations of the procedure, the robot learned how to pull the gown over an arm without imparting dangerous forces. The technique could lead to robotic systems that help dress incapacitated patients.
Over one million Americans need someone to help them get dressed. Many of these people are elderly, injured, or ill. A robotic system that could help people get dressed in the morning would be very useful, but manipulating cloth onto limbs is a complex task for a robot. To help address this issue, the Georgia Tech research team developed a technique that lets a PR2 robot learn some of these skills through trial and error.
“People learn new skills using trial and error. We gave the PR2 the same opportunity,” said Zackory Erickson, a researcher involved in the study. “Doing thousands of trials on a human would have been dangerous, let alone impossibly tedious. But in just one day, using simulations, the robot learned what a person may physically feel while getting dressed.”
The robot doesn’t use vision; instead, it measures the forces it feels as it applies the garment to the arm. From the nearly 11,000 simulated trials, the robot became skilled at estimating the forces it applies to the arm, and it identified the gentlest and most reliable techniques for pulling the gown over the arm.
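To make the idea concrete, here is a deliberately simplified sketch of learning a force estimate from many simulated trials. Everything in it is an illustrative assumption, not the Georgia Tech code: the `simulated_trial` force model, the discrete pull angles, and the tabular averaging stand in for their physics simulator and learned model.

```python
# Toy sketch: estimate, from many simulated dressing trials, how much force
# each candidate pulling action imparts on the arm. All names and the force
# model below are hypothetical stand-ins.
import random

random.seed(0)

def simulated_trial(angle_deg):
    """Stand-in for one cloth-dressing simulation: returns the peak force (in
    newtons) felt by the arm when the gown is pulled at the given angle.
    A pull aligned with the arm (0 degrees) is assumed to be gentlest."""
    return 1.0 + 0.05 * abs(angle_deg) + random.gauss(0, 0.2)

# Run many simulated trials per action and average the observed force --
# a tabular stand-in for the learned force-estimation model.
force_table = {}
for angle in range(-60, 61, 10):
    trials = [simulated_trial(angle) for _ in range(500)]
    force_table[angle] = sum(trials) / len(trials)

# The learned estimates recover the trend baked into the simulator:
# pulls aligned with the arm are predicted to impart the least force.
best_angle = min(force_table, key=force_table.get)
```

The robot in the study learns a far richer model from full cloth simulations, but the loop is the same in spirit: simulate many trials, record the forces, and use the aggregate to predict the forces a new action would cause.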
However, the robot is not just reactive; it can also plan ahead during the dressing procedure. “The key is that the robot is always thinking ahead,” said Charlie Kemp, another researcher involved in the study. “It asks itself, ‘If I pull the gown this way, will it cause more or less force on the person’s arm? What would happen if I go that way instead?’”
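The “thinking ahead” Kemp describes can be sketched as a one-step lookahead: before each motion, predict the force every candidate pull would cause and take the gentlest one. The predictor and the elbow-snag scenario below are made-up illustrations, not the study’s learned model.

```python
# Toy sketch of one-step lookahead: at each step the robot asks "how much
# force would each candidate pull cause?" and picks the gentlest motion.
# The force predictor is a hypothetical stand-in for the learned model.

def predicted_force(position, direction):
    """Hypothetical learned estimator of the force (N) that pulling in
    `direction` ('along', 'up', or 'down') would impart, given the gown's
    position on the arm (0.0 = hand, 1.0 = shoulder). Assumes the sleeve
    snags near the elbow unless the robot lifts the gown slightly."""
    if 0.4 <= position <= 0.6:  # elbow region: pulling straight would snag
        return {"along": 6.0, "up": 1.5, "down": 8.0}[direction]
    return {"along": 1.0, "up": 2.0, "down": 2.0}[direction]

def dress_arm(steps=10):
    """Advance the gown from hand to shoulder, at each step choosing the
    candidate motion with the lowest predicted force; track the peak force."""
    path, peak = [], 0.0
    for i in range(steps):
        position = i / steps
        direction = min(("along", "up", "down"),
                        key=lambda d: predicted_force(position, d))
        peak = max(peak, predicted_force(position, direction))
        path.append(direction)
    return path, peak
```

Running `dress_arm()` shows the planner switching to an upward lift only in the elbow region, keeping the peak force well below what pulling straight through the snag would cause.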
At present, the robot can dress one arm in a hospital gown, but a robotic system that allows for a more complete dressing will require further research.