The device uses a combination of biosensors and AI software to identify the hand gestures a person intends to make by analyzing electrical signals from their arm. It's far from the first gesture recognition system designed for human-computer interaction (HCI), but the new system offers some unique benefits. Most notably, it uses a neuro-inspired hyperdimensional computing algorithm to update itself as it receives new information, such as changes to electrical signals when an arm gets sweaty.

"In gesture recognition, your signals are going to change over time, and that can affect the performance of your model," study coauthor Ali Moin explained in a statement. "We were able to greatly improve the classification accuracy by updating the model on the device."

The team screen-printed the biosensing system onto a thin PET substrate, a polymer resin typically used to produce synthetic fibers and plastic containers. The researchers picked the material for their armband because its flexibility lets it conform to the forearm's muscle movements.

The array comprises 64 electrodes, each of which detects electrical signals from a different point on the arm. This data is fed into an electronic chip, which uses the algorithm to associate the signals with specific hand gestures.

The team trained the algorithm by wrapping the armband around a user's forearm and having them perform each gesture. In testing, the system accurately classified 21 hand signals, including a fist, a thumbs-up, and counting numbers.
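To make the idea of on-device updating more concrete, here is a minimal, hypothetical sketch of a hyperdimensional classifier in Python. The 64 channels and 21 gestures come from the article; the 10,000-dimensional vectors, the encoding scheme, and the update rule are illustrative assumptions, not the authors' actual implementation.

```python
import numpy as np

# Hypothetical sketch of a hyperdimensional (HD) classifier of the kind the
# article describes: EMG samples are encoded into high-dimensional bipolar
# vectors, class prototypes are built by bundling training encodings, and the
# prototypes can be updated in place as new data (e.g. from a sweaty arm)
# arrives. Dimensionality and encoding are assumptions for illustration.

D = 10_000          # hypervector dimensionality (typical for HD computing)
N_ELECTRODES = 64   # one channel per electrode in the armband
N_GESTURES = 21     # gestures classified in the study

rng = np.random.default_rng(0)

# Random bipolar "channel" hypervectors, one per electrode.
channel_hvs = rng.choice([-1, 1], size=(N_ELECTRODES, D))

def encode(sample):
    """Encode one 64-channel EMG sample into a bipolar hypervector."""
    # Weight each channel's hypervector by its signal level, bundle by
    # summation, then binarize back to {-1, +1}.
    weighted = channel_hvs * sample[:, None]
    return np.sign(weighted.sum(axis=0) + 1e-9)

class HDClassifier:
    def __init__(self):
        # Accumulator prototypes, one per gesture class.
        self.prototypes = np.zeros((N_GESTURES, D))

    def train(self, sample, label):
        self.prototypes[label] += encode(sample)

    def predict(self, sample):
        hv = encode(sample)
        # Cosine-style similarity against every class prototype.
        norms = np.linalg.norm(self.prototypes, axis=1) + 1e-9
        sims = self.prototypes @ hv / norms
        return int(np.argmax(sims))

    def update(self, sample, label):
        # On-device adaptation: fold new examples into the existing
        # prototype instead of retraining from scratch.
        self.prototypes[label] += encode(sample)

# Quick smoke test with random data standing in for real EMG windows.
fake_windows = rng.normal(size=(5, N_ELECTRODES))
clf = HDClassifier()
for i, window in enumerate(fake_windows):
    clf.train(window, i % N_GESTURES)
print(clf.predict(fake_windows[0]))
```

Because each prototype is just a running sum, adapting to drifting signals amounts to a handful of additions rather than a full retraining pass, which is roughly why this style of model lends itself to updating on a small chip.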
A wearable future?
All of the computing is done locally on the chip, which speeds up the system and protects the user's biological data. Moin believes this combination of security and performance could turn the system into a viable commercial product.

You can read the study paper in the journal Nature Electronics.