
#Industry News

Sensor Converts Forearm Signals to Control Prosthetic Hands

Researchers at the University of California, Berkeley, have developed a wearable sensor that can measure electrical signals in the forearm and use AI to correlate them with hand gestures, such as the movements of individual fingers.

The team demonstrated that the system can control a robotic prosthetic hand, suggesting it could give amputees a way to perform delicate movements with such devices.

The flexible sensor measures electrical signals at 64 discrete points on the forearm, and an electronic chip then uses AI to interpret these signals as specific hand gestures. A user can train the system to recognize their own unique hand gestures, and so far the team has trained it to accurately recognize 21 different gestures, including a flat hand, a thumbs-up, and a fist.
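
To make that concrete, here is a minimal sketch, in Python, of how a 64-channel forearm recording might be cut into short windows and reduced to one value per electrode before classification. The sampling rate, window length, and the choice of root-mean-square (RMS) features are illustrative assumptions; the article does not specify how the chip extracts features.

```python
import numpy as np

SAMPLE_RATE_HZ = 1000   # assumed sampling rate (not stated in the article)
WINDOW_MS = 200         # assumed analysis window length
N_CHANNELS = 64         # 64 discrete measurement sites, as described
N_GESTURES = 21         # number of gestures the team has trained so far

def window_signal(emg, window_ms=WINDOW_MS, fs=SAMPLE_RATE_HZ):
    """Split a (samples, 64) forearm recording into non-overlapping windows."""
    step = int(fs * window_ms / 1000)
    n_windows = emg.shape[0] // step
    return emg[: n_windows * step].reshape(n_windows, step, emg.shape[1])

def rms_features(windows):
    """One root-mean-square value per channel per window -> (n_windows, 64)."""
    return np.sqrt(np.mean(windows ** 2, axis=1))

# Two seconds of placeholder 64-channel data -> 10 windows of 64 features each;
# each row is one electrical "pattern" for the classifier to label as a gesture.
fake_emg = np.random.randn(2 * SAMPLE_RATE_HZ, N_CHANNELS)
print(rms_features(window_signal(fake_emg)).shape)   # (10, 64)
```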

“When you want your hand muscles to contract, your brain sends electrical signals through neurons in your neck and shoulders to muscle fibers in your arms and hands,” said Ali Moin, a researcher involved in the study. “Essentially, what the electrodes in the cuff are sensing is this electrical field. It’s not that precise, in the sense that we can’t pinpoint which exact fibers were triggered, but with the high density of electrodes, it can still learn to recognize certain patterns.”

The AI interprets the signals on-board, without relying on cloud computing, which makes the interpretation faster and helps keep patient data secure and private. “In our approach, we implemented a process where the learning is done on the device itself,” said Jan Rabaey, another researcher involved in the project. “And it is extremely quick: You only have to do it one time, and it starts doing the job. But if you do it more times, it can get better. So, it is continuously learning, which is how humans do it.”
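
As a rough illustration of that on-device workflow, the sketch below uses a deliberately simple nearest-centroid learner rather than the team's actual algorithm (which the next paragraph identifies as hyperdimensional computing). The class and method names are hypothetical; the point is that one quick calibration pass makes the model usable immediately, and every later update refines it locally, with no cloud round trip.

```python
import numpy as np

class CentroidGestureModel:
    """Illustrative on-device learner: one centroid (mean feature vector) per
    gesture, refined with a running mean so it keeps improving over time."""

    def __init__(self):
        self.centroids = {}   # gesture label -> mean feature vector
        self.counts = {}      # gesture label -> number of examples seen

    def update(self, features, label):
        """Fold one labeled window into the model; used both for the first
        quick calibration pass and for every later refinement."""
        if label not in self.centroids:
            self.centroids[label] = np.asarray(features, dtype=float).copy()
            self.counts[label] = 1
        else:
            self.counts[label] += 1
            self.centroids[label] += (features - self.centroids[label]) / self.counts[label]

    def predict(self, features):
        """Return the gesture whose centroid is closest to the new window."""
        return min(self.centroids,
                   key=lambda g: np.linalg.norm(features - self.centroids[g]))

# One short calibration session, then the model works right away;
# calling update() again later simply sharpens the centroids on the device.
model = CentroidGestureModel()
model.update(np.random.rand(64), "fist")
model.update(np.random.rand(64), "thumbs_up")
print(model.predict(np.random.rand(64)))
```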

The AI system is based on a hyperdimensional computing algorithm, which allows it to constantly update itself as new information becomes available. For instance, if the electrical signals change because a user’s skin becomes sweaty, the system can incorporate this new information into its data interpretation.
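
The article does not give implementation details, but a generic hyperdimensional computing classifier along the following lines shows the idea: each feature vector is encoded into a very long ±1 “hypervector”, each gesture is represented by a bundled (summed) prototype, and adapting to changed conditions such as sweaty skin is simply more bundling. The 10,000-dimensional random-projection encoding is an assumption for illustration, not the team's published design.

```python
import numpy as np

D = 10_000        # hypervector dimensionality (a typical HDC choice; assumed)
N_FEATURES = 64   # one feature per electrode site
rng = np.random.default_rng(0)

# Fixed random bipolar projection that encodes a 64-value feature vector
# into a single +/-1 hypervector.
PROJECTION = rng.choice([-1, 1], size=(N_FEATURES, D))

def encode(features):
    """Map a 64-dimensional feature vector to a bipolar hypervector."""
    return np.sign(features @ PROJECTION)

class HDClassifier:
    """Gesture prototypes are bundled (summed) hypervectors; classification is
    a cosine-similarity lookup, and updating the model is just more bundling."""

    def __init__(self):
        self.prototypes = {}   # gesture label -> accumulated hypervector

    def update(self, features, label):
        hv = encode(features)
        self.prototypes[label] = self.prototypes.get(label, np.zeros(D)) + hv

    def predict(self, features):
        hv = encode(features)
        def similarity(proto):
            return np.dot(proto, hv) / (np.linalg.norm(proto) * np.linalg.norm(hv))
        return max(self.prototypes, key=lambda g: similarity(self.prototypes[g]))

# If the signal statistics drift (say, the skin becomes sweaty), calling
# update() with freshly labeled windows folds the new conditions into the
# prototypes instead of retraining from scratch.
clf = HDClassifier()
clf.update(np.random.rand(64), "flat_hand")
clf.update(np.random.rand(64), "fist")
print(clf.predict(np.random.rand(64)))
```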

“In gesture recognition, your signals are going to change over time, and that can affect the performance of your model,” said Moin. “We were able to greatly improve the classification accuracy by updating the model on the device.”

The researchers hope that the system will allow for delicate prosthetic control. See a video below about how the sensors are made and how they can control a prosthetic hand.

Video: An armband to control prosthetic hands

Details

  • Berkeley, CA, USA
  • Ali Moin