Researchers develop device that reads hand gestures


Engineers at the University of California (UC) Berkeley have developed a device that can recognise hand gestures and could one day be used to control prosthetics or interact with electronic devices.

According to the researchers, the system, which combines wearable biosensors with AI, could one day enable people to type on a computer without a keyboard or drive a car without a steering wheel.

Ali Moin, doctoral student in UC Berkeley’s Department of Electrical Engineering and Computer Sciences, said: “Prosthetics is one important application of this technology, but besides that, it also offers a very intuitive way of communicating with computers.

“Reading hand gestures is one way of improving human-computer interaction. And, while there are other ways of doing that, by, for instance, using cameras and computer vision, this is a good solution that also maintains an individual’s privacy.”

In the study, which was published in Nature Electronics, the engineers created a hand gesture recognition system by designing a flexible armband that can read electrical signals at 64 different points on the forearm.

These signals are then fed into an electrical chip programmed with an AI algorithm capable of associating these signal patterns in the forearm with specific hand gestures.
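To illustrate the idea of matching multi-electrode signal patterns to gestures, here is a minimal sketch (not the team's actual implementation): each reading is treated as a 64-element vector of electrode amplitudes and matched to the closest stored gesture template. All names, templates, and data below are hypothetical.

```python
import math

# Hypothetical sketch: classify a 64-electrode forearm reading by
# nearest-template matching against stored gesture patterns.
# Real systems use learned models; this only illustrates the idea
# of associating signal patterns with specific gestures.

NUM_ELECTRODES = 64

def distance(a, b):
    """Euclidean distance between two 64-element signal vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(reading, templates):
    """Return the gesture whose template is closest to the reading."""
    return min(templates, key=lambda g: distance(reading, templates[g]))

# Toy templates: 'fist' activates the first half of the electrodes,
# 'flat_hand' the second half (purely illustrative values).
templates = {
    "fist":      [1.0] * 32 + [0.0] * 32,
    "flat_hand": [0.0] * 32 + [1.0] * 32,
}

# A noisy reading that mostly resembles a fist.
reading = [0.9] * 32 + [0.1] * 32
print(classify(reading, templates))  # fist
```

In practice the mapping is learned rather than hand-written, but the core operation is the same: compare an incoming 64-point signal pattern to known gesture signatures.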

The team was successful in teaching the algorithm to recognise 21 individual hand gestures, including a thumbs-up, a fist, a flat hand, holding up individual fingers and counting numbers.

Moin added: “When you want your hand muscles to contract, your brain sends electrical signals through neurons in your neck and shoulders to muscle fibres in your arms and hands.

“Essentially, what the electrodes in the cuff are sensing is this electrical field. It’s not that precise, in the sense that we can’t pinpoint which exact fibres were triggered, but with the high density of electrodes, it can still learn to recognise certain patterns.”

Like other AI software, the algorithm learns how electrical signals in the arm correspond to hand gestures. The new device uses an advanced form of AI called a hyperdimensional computing algorithm, which is capable of updating itself with new information.

“In gesture recognition, your signals are going to change over time, and that can affect the performance of your model,” Moin said. “We were able to greatly improve the classification accuracy by updating the model on the device.”
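A hyperdimensional classifier in this spirit can be sketched as follows: each reading is encoded into a very high-dimensional bipolar vector, each gesture is represented by a "prototype" built by bundling (summing) example vectors, and new labelled examples can be bundled in at any time to refresh the model as signals drift. The encoding scheme, dimensions, and data here are hypothetical illustrations, not the paper's design.

```python
import random

# Illustrative hyperdimensional (HD) classifier with on-device-style
# online updating. Hypothetical parameters and toy data throughout.

D = 2000          # hypervector dimensionality
FEATURES = 64     # one value per electrode

rng = random.Random(0)
# Fixed random bipolar projection used to encode feature vectors.
projection = [[rng.choice((-1, 1)) for _ in range(FEATURES)]
              for _ in range(D)]

def encode(features):
    """Map a 64-element reading to a bipolar hypervector via random projection."""
    return [1 if sum(w * x for w, x in zip(row, features)) >= 0 else -1
            for row in projection]

def similarity(a, b):
    """Dot-product similarity between two hypervectors."""
    return sum(x * y for x, y in zip(a, b))

class HDClassifier:
    def __init__(self):
        self.prototypes = {}   # gesture name -> bundled (summed) vector

    def update(self, gesture, features):
        """Bundle a new example into the gesture's prototype (online learning)."""
        hv = encode(features)
        proto = self.prototypes.setdefault(gesture, [0] * D)
        for i, v in enumerate(hv):
            proto[i] += v

    def classify(self, features):
        hv = encode(features)
        return max(self.prototypes,
                   key=lambda g: similarity(hv, self.prototypes[g]))

clf = HDClassifier()
clf.update("fist", [1.0] * 32 + [0.0] * 32)
clf.update("flat_hand", [0.0] * 32 + [1.0] * 32)
# As signals drift over time, new labelled examples keep
# refining the prototypes without retraining from scratch:
clf.update("fist", [0.8] * 32 + [0.2] * 32)
print(clf.classify([0.9] * 32 + [0.1] * 32))
```

Because the update step is just vector addition, it is cheap enough to run on the wearable's own chip, which fits the team's point about keeping all computation local.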

The team added that no personal data is transmitted to nearby computers or devices, as all the computing occurs locally on the chip. This also speeds up computation.

Further study is required before the device is ready to become a commercial product.
