A new smart skin developed at Stanford University could one day let people type on invisible keyboards, identify objects by touch alone, or gesture and communicate with apps in immersive environments.
In a paper just published in the journal Nature Electronics, researchers describe a new type of stretchable, biocompatible material that is sprayed onto the back of the hand, much like a spray-on tan. A tiny electrical network integrated into the sprayed-on mesh senses the stretching and bending of the skin, and, with the help of AI, the researchers can interpret myriad everyday tasks from hand motions and gestures. They say the work could have applications and implications in fields as varied as gaming, sports, telemedicine, and robotics.
To date, several promising approaches have been actively studied to recognize various hand tasks and gestures, such as measuring the electrical activity of muscles with wristbands or wearing sensor-laden gloves. However, these devices are bulky because they require multiple sensor components to pinpoint the movement of every single joint. In addition, training their algorithms requires collecting large amounts of data for each user and each task. These hurdles have kept such devices out of everyday electronics.
This work is the first practical approach that is compact enough in form and adaptable enough in function for essentially any user to work with, even when data is limited. Current technologies require multiple sensor components to read every joint of the finger, making them bulky. The new device also takes a more streamlined approach on the software side to enable faster learning. Such precision could be key for virtual reality applications that deliver finely detailed motion for a more realistic experience.
The innovation that makes this possible is a sprayable, electrically sensitive mesh network embedded in polyurethane, the same durable yet stretchy material used to make skateboard wheels and to protect hardwood floors from damage. The mesh comprises millions of gold-coated silver nanowires that contact one another to form dynamic electrical pathways. The mesh is electrically active, biocompatible, and breathable, and it stays in place unless rubbed away with soap and water. It conforms intimately to the wrinkles and creases of each human finger that wears it. A lightweight Bluetooth module can then simply be attached to the mesh to transmit the changing signals wirelessly.
“As the finger bends and twists, the nanowires in the mesh are squeezed together and stretched apart, changing the electrical conductivity of the mesh. By measuring and analyzing these changes, we can tell exactly how a hand, finger, or joint is moving.”
The researchers chose to spray the mesh directly onto the skin so that it needs no supporting substrate. This key engineering decision eliminated unwanted motion artifacts and made it possible to generate multi-joint information about the fingers from a single trace of the conductive mesh.
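The sensing principle at work here is, at bottom, piezoresistance: mechanical strain changes electrical resistance. A minimal sketch of that relation uses the textbook strain-gauge formula ΔR/R = GF × ε; the gauge factor below is an entirely hypothetical placeholder, not a measured property of the Stanford nanomesh.

```python
# Illustrative sketch of the piezoresistive principle described above.
# GAUGE_FACTOR is an assumed sensitivity, NOT a value from the paper.

GAUGE_FACTOR = 20.0  # assumed: dR/R = GAUGE_FACTOR * strain

def resistance_change(strain: float) -> float:
    """Fractional resistance change for a given mechanical strain."""
    return GAUGE_FACTOR * strain

def strain_from_reading(r_now: float, r_rest: float) -> float:
    """Invert the relation: estimate strain from a resistance measurement."""
    return (r_now - r_rest) / (r_rest * GAUGE_FACTOR)

# A 5% stretch doubles the resistance at GF = 20:
print(resistance_change(0.05))            # 1.0
print(strain_from_reading(200.0, 100.0))  # 0.05
```

In a real sensor the response is nonlinear and hysteretic, which is part of why the team pairs the hardware with machine learning rather than a fixed formula.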
Because the device is sprayed on, it fits any size or shape of hand, and it could potentially be applied to the face to capture subtle emotional cues. That could enable new approaches to computer animation or lead to avatar-led virtual meetings with more realistic facial expressions and hand movements.
Then machine learning takes over. A computer monitors the changing patterns of conductivity and maps those changes to specific physical tasks and gestures. Type the letter X on a keyboard, for example, and the algorithm learns to recognize that action from the changing patterns of electrical conductivity. Once the algorithm is suitably trained, the physical keyboard is no longer needed. The same principles can be used to recognize sign language, or even to recognize objects by tracing their surfaces.
And while existing techniques are computationally intensive and require vast amounts of data that humans have to painstakingly label, the Stanford team has developed a much more computationally efficient learning scheme.
“We’ve taken the aspect of human learning that rapidly adapts to a task with only a handful of trials, known as ‘meta-learning.’ This allows the device to rapidly recognize arbitrary new hand tasks and users with a few quick trials.”
“Furthermore, with a surprisingly simple approach to this complex problem, faster computational turnaround times can be achieved with less data, as the nanomesh captures the subtle details of the signal,” Kim added. The precision with which the device can map the subtle movements of a finger is one of the key features of this innovation.
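One simple way to picture this fast adaptation, purely as an assumption-laden sketch and not the paper's method, is pretraining followed by a few gradient steps on a new user's handful of calibration trials:

```python
# Sketch of few-shot adaptation in the spirit of meta-learning. The model,
# numbers, and training scheme are all invented for illustration.

def predict(w, b, x):
    return w * x + b  # toy linear model: sensor feature -> estimated motion

def finetune(w, b, trials, lr=0.1, steps=5):
    """Take a few gradient steps of squared-error loss on a new user's trials."""
    for _ in range(steps):
        for x, y in trials:
            err = predict(w, b, x) - y
            w -= lr * err * x
            b -= lr * err
    return w, b

# Parameters assumed to have been learned across many prior users:
w0, b0 = 1.0, 0.5
# Two quick calibration trials from a new user whose true mapping is y = 2x:
new_user_trials = [(1.0, 2.0), (2.0, 4.0)]

w, b = finetune(w0, b0, new_user_trials)
# After only two trials, the adapted model tracks the new user far more closely.
```

The design point the quote makes is that starting from a good shared initialization, rather than from scratch, is what lets a few trials suffice.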
The researchers built a prototype that can recognize simple objects by touch and can even perform two-handed typing on an invisible keyboard. The algorithm was able to type out “No legacy is so rich as honesty” from William Shakespeare and “I am the master of my fate, I am the captain of my soul” from William Ernest Henley’s poem “Invictus.”
Keyboard typing video link: https://drive.google.com/open?id=1TdXiqW3bDDM0RAqJgxZmADLSh1pxBl6Y
The co-first authors are Kyun Kyu (Richard) Kim, a postdoctoral fellow in chemical engineering in the Bao Group, and Min Kim, a PhD student in the Department of Computer Science at the Korea Advanced Institute of Science and Technology (KAIST).
Additional Stanford authors include Samuel Root and Bao-Nguyen Nguyen, postdoctoral research fellows in chemical engineering in the Bao Group; Yuya Nishio, a PhD student in electrical engineering in the Bao Group; and Jeffrey B.-H. Tok, director of the Uytengsu Teaching Laboratory in chemical engineering. Zhenan Bao is also a member of Stanford Bio-X, the Stanford Cardiovascular Institute, the Maternal & Child Health Research Institute (MCHRI), the Precourt Institute for Energy, Sarafan ChEM-H, the Stanford Woods Institute for the Environment, the Wu Tsai Human Performance Alliance, and the Wu Tsai Neurosciences Institute, and an investigator of the CZ Biohub.
Additional authors include Chang-Rok Chang, Jin-Ki Min, Jae-Won Kim, Seong-Keun Han, and Jun-Hwa Choi, PhD students in the Department of Mechanical Engineering at Seoul National University; Jin Kim, a researcher at the College of Veterinary Medicine, Seoul National University; Seung-hoon Koh, a PhD student in the Department of Computer Science at KAIST; Siyoon Kim, an assistant professor at the College of Veterinary Medicine, Konkuk University; Seong-ho Cho, a professor in the Department of Computer Science at KAIST and a member of the Soft Robotics Research Center; and Seung-hwan Koh, a professor in the Department of Mechanical Engineering at Seoul National University, deputy director of the Engineering Research Institute, and a member of the Soft Robotics Research Center.
This article was supported by an eWEAR-TCCI Award for Science Writing, administered by Stanford University’s Wearable Electronics Initiative (eWEAR) and made possible with funding from eWEAR Industry Affiliates Program members Shanda Group and the Tianqiao and Chrissy Chen Institute (TCCI®).
Disclaimer: AAAS and EurekAlert! are not responsible for the accuracy of news releases posted to EurekAlert! by contributing institutions or for the use of any information through the EurekAlert system.