Smart spray skin uses AI to interpret hand movements
Scientists have devised a new material that can be sprayed onto the back of a human hand to track its movements, according to a new study published in the journal Nature Electronics.
Researchers have built a prototype that recognizes simple objects by touching them and can even predictively type with two hands on an invisible keyboard.
A tiny electrical network detects the stretching and flexing of the skin as the hand moves, which is then interpreted by artificial intelligence (AI) to identify the movements.
The algorithm was able to write “No legacy is so rich as honesty” from William Shakespeare and “I am the master of my fate, I am the captain of my soul” from the poem “Invictus” by William Ernest Henley.
The researchers say this technology could have applications in a variety of fields, from gaming to sports, telemedicine and robotics.
Read more: Language was born in the hands.
Electronic devices already exist that can identify the movements and intended tasks of the human hand. But these are often bulky and require large amounts of data to be collected for each user and task to train the algorithm. Unsurprisingly, widespread adoption has been limited.
Now, researchers have created a sprayable, electrically sensitive mesh network made up of millions of polyurethane-embedded, gold-coated silver nanowires that conforms to the wrinkles and folds of the human finger and stays put unless scrubbed off with soap and water.
“As the fingers bend and twist, the nanowires in the mesh are squeezed and pulled apart, changing the electrical conductivity of the mesh. These changes can be analyzed to tell us precisely how a hand, finger, or joint moves,” explains lead author Zhenan Bao, a professor of chemical engineering at Stanford University in the US.
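The sensing principle Bao describes is piezoresistive: stretching the mesh changes its electrical resistance roughly in proportion to strain. A minimal sketch of that relationship, using invented illustrative values for the baseline resistance and the gauge factor (neither comes from the study):

```python
# Hypothetical sketch of the sensing principle: stretching the nanowire mesh
# changes its electrical resistance roughly in proportion to strain.
# R0 and GAUGE are invented illustrative values, not figures from the paper.

R0 = 100.0   # assumed baseline resistance of one mesh segment, in ohms
GAUGE = 5.0  # assumed gauge factor: sensitivity of resistance to strain

def resistance(strain):
    """Resistance of a stretched segment: R = R0 * (1 + GAUGE * strain)."""
    return R0 * (1 + GAUGE * strain)

# A finger bend stretching the mesh by 2% produces a measurable change:
print(resistance(0.02))  # roughly 110 ohms, up from the 100-ohm baseline
```

Reading that resistance across many segments of the mesh at once is what yields the signal patterns the AI then interprets.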
A lightweight Bluetooth module is also attached to wirelessly transmit those signal changes, and machine learning then interprets them.
Changing patterns in conductivity are assigned to specific tasks and gestures, such as typing the letter X on a keyboard, and the algorithm learns to recognize them. Once the algorithm is properly trained, the physical keyboard is no longer needed; the hand movements by themselves are enough.
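The idea of assigning conductivity patterns to gestures can be sketched as a simple classifier. Everything here is invented for illustration (the gesture names, the three-value "readings", and the nearest-centroid approach); the study's actual model is far more sophisticated:

```python
import random

random.seed(0)

# Hypothetical sketch: each gesture (e.g. typing "X") produces a characteristic
# pattern of conductivity changes across regions of the mesh. We model each
# reading as a small feature vector and classify by nearest class centroid.
GESTURE_TEMPLATES = {
    "type_X": [0.9, 0.1, 0.4],
    "type_O": [0.2, 0.8, 0.5],
    "pinch":  [0.6, 0.6, 0.1],
}

def noisy_reading(template, noise=0.05):
    """Simulate one mesh reading: a gesture's pattern plus sensor noise."""
    return [v + random.uniform(-noise, noise) for v in template]

def train_centroids(samples_per_gesture=20):
    """Average repeated readings into one centroid per gesture."""
    centroids = {}
    for name, template in GESTURE_TEMPLATES.items():
        readings = [noisy_reading(template) for _ in range(samples_per_gesture)]
        centroids[name] = [sum(col) / len(col) for col in zip(*readings)]
    return centroids

def classify(reading, centroids):
    """Assign a reading to the gesture whose centroid is closest."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda name: dist(reading, centroids[name]))

centroids = train_centroids()
print(classify(noisy_reading(GESTURE_TEMPLATES["type_X"]), centroids))
```

Once such a mapping is learned, a new reading alone identifies the gesture, which is why the physical keyboard becomes unnecessary.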
Read more: Promise and problems: how do we implement artificial intelligence in clinical settings and ensure patient safety?
The machine learning scheme is also more computationally efficient than existing technologies.
“We brought the rapidly adapting aspects of human learning to tasks with just a handful of trials, known as ‘meta-learning.’ This allows the device to quickly recognize arbitrary new hand tasks and users with a few quick trials,” says first author Kyun Kyu “Richard” Kim, a postdoctoral fellow in Bao’s lab.
“In addition, it’s a surprisingly simple approach to this complex challenge which means we can achieve faster computational processing time with less data because our nanogrid captures subtle details in its signals,” adds Kim.
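The few-shot adaptation Kim describes can be illustrated in miniature: instead of retraining from scratch, the system re-estimates per-gesture prototypes from just a handful of readings by a new user. The gesture names, signal values, and the per-user offset below are all invented for illustration:

```python
import random

random.seed(1)

# Hypothetical few-shot sketch: a new user's hand shifts every signal by some
# offset, and we adapt by rebuilding prototypes from only 3 readings per
# gesture rather than collecting a large per-user dataset.
BASE_PATTERNS = {"swipe": [0.8, 0.2], "tap": [0.1, 0.9]}
USER_OFFSET = 0.3  # assumed systematic shift for this new user's hand

def user_reading(gesture, noise=0.05):
    """Simulate one reading from the new user: base pattern + offset + noise."""
    base = BASE_PATTERNS[gesture]
    return [v + USER_OFFSET + random.uniform(-noise, noise) for v in base]

def adapt(shots=3):
    """Build per-gesture prototypes from only a few readings per gesture."""
    protos = {}
    for g in BASE_PATTERNS:
        readings = [user_reading(g) for _ in range(shots)]
        protos[g] = [sum(col) / len(col) for col in zip(*readings)]
    return protos

def classify(reading, protos):
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(protos, key=lambda g: dist(reading, protos[g]))

protos = adapt(shots=3)
print(classify(user_reading("swipe"), protos))  # recognized after 3 examples
```

This is only the intuition behind meta-learning, not the study's algorithm, but it shows why far less per-user data is needed than with conventional retraining.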
Because it is sprayed, the device can conform to any size or shape of hand and could be used to recognize sign language or even objects by tracing their outer surfaces with your hand. In the future, it could also potentially be adapted to the face to capture subtle emotional cues, which may enable new approaches to computer animation or even virtual meetings.