'Sensorized' skin gives soft robots greater awareness

February 13, 2020 | By Julien Happich
Soft robots constructed from highly compliant materials are seen as potentially safer, more adaptable and more resilient than today’s rigid robots. But accurate control feedback loops prove difficult to implement for such deformable bio-inspired robots due to their infinite degrees of freedom.

“We’re sensorizing soft robots to get feedback for control from sensors, not vision systems, using a very easy, rapid method for fabrication,” explains Ryan Truby, a postdoc in the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) who is co-first author on the paper along with CSAIL postdoc Cosimo Della Santina.

“We want to use these soft robotic trunks, for instance, to orient and control themselves automatically, to pick things up and interact with the world. This is a first step toward that type of more sophisticated automated control.”

One future aim is to help make artificial limbs that can more dexterously handle and manipulate objects in the environment.

“Think of your own body: You can close your eyes and reconstruct the world based on feedback from your skin,” says co-author Daniela Rus, director of CSAIL and the Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science. “We want to design those same capabilities for soft robots.”


The soft sensors are conductive silicone sheets cut into kirigami patterns with piezoresistive properties. (Image: Ryan L. Truby, MIT CSAIL)

The researchers’ robotic trunk comprises three segments, each with four fluidic actuators used to move the arm. They fused one sensor over each segment, with each sensor covering and gathering data from one embedded actuator in the soft robot.
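
For concreteness, the sensor-actuator layout described above might be modeled in code as follows. This is a minimal sketch in Python; the class and field names are illustrative assumptions, not taken from the researchers' software.

```python
# Minimal sketch of the trunk layout described above: three segments, each with
# four fluidic actuators and one piezoresistive kirigami sensor fused over it.
# All names here are illustrative, not from the MIT codebase.
from dataclasses import dataclass, field

@dataclass
class Segment:
    actuator_pressures: list[float] = field(default_factory=lambda: [0.0] * 4)  # four fluidic actuators
    sensor_resistance: float = 0.0  # one kirigami sensor reading per segment

trunk = [Segment() for _ in range(3)]  # three stacked segments

def read_sensor_vector(trunk: list[Segment]) -> list[float]:
    """Collect one resistance reading per segment for the state estimator."""
    return [seg.sensor_resistance for seg in trunk]
```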

To estimate the soft robot’s configuration using only the sensors, the researchers built a deep neural network to do most of the heavy lifting. They also developed a new model that kinematically describes the soft robot’s shape while vastly reducing the number of variables the network needs to process.
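
As a rough illustration, such an estimator can be sketched as a small fully connected network mapping raw sensor signals to a handful of kinematic parameters. The layer sizes, the two-parameters-per-segment state, and all names below are assumptions for the sketch, not the paper's actual architecture.

```python
# A minimal PyTorch sketch of a sensor-to-configuration estimator: raw sensor
# readings in, a reduced set of kinematic parameters out. Sizes are assumed.
import torch
import torch.nn as nn

NUM_SENSORS = 3          # one kirigami sensor per segment
NUM_SEGMENTS = 3
PARAMS_PER_SEGMENT = 2   # e.g. bending curvature in two planes (assumed)

estimator = nn.Sequential(
    nn.Linear(NUM_SENSORS, 64),
    nn.ReLU(),
    nn.Linear(64, 64),
    nn.ReLU(),
    nn.Linear(64, NUM_SEGMENTS * PARAMS_PER_SEGMENT),  # reduced kinematic state
)

sensor_signals = torch.rand(1, NUM_SENSORS)   # placeholder resistance readings
predicted_config = estimator(sensor_signals)  # shape: (1, 6)
```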

In training, the model analyzed data from its sensors to predict a configuration, and compared its predictions to the ground truth data collected simultaneously by the motion-capture system. In doing so, the model “learned” to map signal patterns from its sensors to real-world configurations, matching the robot’s true position.
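
A training loop along these lines, reusing the estimator network sketched above, might look like the following. The optimizer, loss function, and paired sensor/motion-capture dataset are all assumptions; here random tensors stand in for the real recordings.

```python
# Sketch of the supervised training described above: predict a configuration
# from sensor data, penalize the error against motion-capture ground truth.
import torch

# Hypothetical stand-in for the paired dataset: random tensors in place of the
# real sensor streams and the motion-capture ground truth.
dataloader = [
    (torch.rand(32, NUM_SENSORS), torch.rand(32, NUM_SEGMENTS * PARAMS_PER_SEGMENT))
    for _ in range(100)
]

optimizer = torch.optim.Adam(estimator.parameters(), lr=1e-3)
loss_fn = torch.nn.MSELoss()

for sensor_batch, mocap_config in dataloader:
    optimizer.zero_grad()
    prediction = estimator(sensor_batch)      # predict configuration from sensor signals
    loss = loss_fn(prediction, mocap_config)  # compare against motion capture
    loss.backward()
    optimizer.step()
```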


Related articles:

Soft robotics proprioception: Let the machine sort it out

Stretchable skin-like robot crawls and conveys objects

Shape-morphing, self-healing elastomer reacts to external stimuli

Optical lace offers soft robots a tactile sensor network

