Low-cost tactile glove learns signatures of the human grasp

May 30, 2019 //By Rich Pell
Researchers at MIT (Cambridge, MA) have created a low-cost, sensor-packed glove that captures pressure signals as its wearer interacts with a variety of objects, providing insights that, they say, could aid the future design of prosthetics, robot grasping tools, and human–robot interactions.

The glove, say the researchers, can be used to create high-resolution tactile datasets that could enable an AI system to recognize objects through touch alone. Such information, they say, could be leveraged to help robots identify and manipulate objects, and may aid in prosthetics design.

The low-cost knitted "scalable tactile glove" (STAG) is equipped with about 550 sensors across nearly the entire hand. Each sensor captures pressure signals as the glove interacts with objects in various ways.
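As a rough illustration of what one tactile reading might look like, the sketch below treats a single frame of sensor readings as a 2D pressure map and normalizes it. The 32×32 grid shape and the raw-value range are assumptions made for illustration; the article says only that the glove carries about 550 sensors.

```python
import numpy as np

GRID = (32, 32)  # assumed layout; the article says only "about 550 sensors"

def normalize_frame(raw: np.ndarray) -> np.ndarray:
    """Scale one pressure frame to [0, 1] so frames from different
    grasps are comparable regardless of absolute force."""
    raw = raw.astype(np.float32).reshape(GRID)
    span = float(raw.max() - raw.min())
    return (raw - raw.min()) / span if span > 0 else np.zeros(GRID, np.float32)

# Hypothetical raw reading: one integer pressure value per sensor site.
frame = normalize_frame(np.random.randint(0, 1024, size=GRID))
```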

A neural network processes these signals, learning to associate pressure-signal patterns with specific objects. The trained system can then classify objects and predict their weights by feel alone, with no visual input needed.
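A minimal PyTorch sketch of such a network is shown below. The layer sizes, the two-headed design (class logits plus a scalar weight estimate), and all names here are illustrative assumptions, not the researchers' published architecture.

```python
import torch
import torch.nn as nn

class TactileNet(nn.Module):
    """Toy CNN: maps one 32x32 pressure frame to 26 class logits
    and a scalar weight estimate (illustrative, not the paper's model)."""
    def __init__(self, n_classes: int = 26):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),   # 32 -> 16
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 16 -> 8
            nn.Flatten(),
        )
        self.classify = nn.Linear(32 * 8 * 8, n_classes)  # object identity
        self.weigh = nn.Linear(32 * 8 * 8, 1)             # weight in grams

    def forward(self, x):
        h = self.features(x)
        return self.classify(h), self.weigh(h).squeeze(-1)

model = TactileNet()
logits, grams = model(torch.randn(4, 1, 32, 32))  # batch of 4 pressure frames
```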

The researchers compiled a dataset using STAG for 26 common objects, including a soda can, scissors, a tennis ball, a spoon, a pen, and a mug. Using the dataset, they say, the system predicted the objects' identities with up to 76% accuracy, and predicted the correct weights of most objects to within about 60 grams.
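For context on what those two figures measure, the short sketch below computes top-1 classification accuracy and mean absolute weight error over some hypothetical predictions; the arrays here are made-up placeholders, not the study's data.

```python
import numpy as np

# Placeholder predictions and labels; in the study these would come from
# held-out STAG recordings of the 26 objects.
true_ids   = np.array([0, 3, 3, 7, 12])
pred_ids   = np.array([0, 3, 5, 7, 12])
true_grams = np.array([350.0, 120.0, 120.0, 58.0, 300.0])
pred_grams = np.array([330.0, 150.0, 100.0, 60.0, 370.0])

accuracy = (pred_ids == true_ids).mean()      # fraction of correct identities
mae = np.abs(pred_grams - true_grams).mean()  # mean absolute error, in grams

print(f"top-1 accuracy: {accuracy:.0%}, weight MAE: {mae:.0f} g")
```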

Existing sensor-based gloves can cost thousands of dollars and often contain only around 50 sensors, capturing far less information. STAG, by contrast, produces very high-resolution data and is made from commercially available materials costing around $10.

Their tactile sensing system, say the researchers, could be used in combination with traditional computer vision and image-based datasets to give robots a more human-like understanding of interacting with objects.

"Humans can identify and handle objects well because we have tactile feedback," says Subramanian Sundaram PhD '18, a former graduate student in the Computer Science and Artificial Intelligence Laboratory (CSAIL). "As we touch objects, we feel around and realize what they are. Robots don't have that rich feedback."

"We've always wanted robots to do what humans can do, like doing the dishes or other chores," says Sundaram. "If you want robots to do these things, they must be able to manipulate objects really well."


