Perception and Handling enabled by Artificial tactile SEnsing – Phase
Touch sensors inspired by human perception.
Challenges and objectives
Every day, we manipulate dozens, even hundreds of objects, and for each interaction, finger movements are informed by tactile sensations. When we position a pencil in our hand or search for a key in our pocket, the mechanoreceptors buried in our skin inform us about the state of the touched object and guide the fingers towards a stable grip. However, the exact role of the sense of touch remains poorly understood. The objective of the PHASE project is therefore to study how tactile perception influences dexterous manipulation. This knowledge enables the development of robotic tactile sensors inspired by human perception. During the project, members discovered which tactile cues encode the perception of friction. Friction between the fingers and the object is a key element of manipulation, and its perception is essential for knowing how much force the fingers should apply. A tactile sensor developed in the PHASE project is now capable of measuring the coefficient of friction of arbitrary objects simply by pressing the finger on the object. This makes it possible to finely control robotic grippers to apply the force needed to pick up and manipulate a wide variety of everyday objects, including fruits and vegetables for agricultural robotics and biological tissues in the context of surgical robotics.
Most robotic hands and grippers grasp objects with a pre-programmed force. This force is determined in advance and must be strong enough for the friction at the fingers to support the object's weight. If the object is too slippery (its friction coefficient is too low), contact is lost and the object slips out of the grip. For this reason, it is necessary to know in advance the friction coefficient between the finger and the object. The tactile sensor we developed in the PHASE project is capable of measuring the friction coefficient of arbitrary objects simply by pressing the finger on the object. It is sensitive to radial deformation, which is directly related to surface friction. Using artificial-intelligence algorithms, the tactile image can be decoded to recover the object's shape and its friction. With this essential information, it is possible to finely control robotic grippers to apply a force low enough not to damage the object, yet high enough to prevent it from slipping and to maintain a stable grip.
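The relationship between friction and the required grip force can be made concrete with a short sketch. This is a minimal illustration of the underlying physics (Coulomb friction on a two-finger grip), not the controller developed in the PHASE project; the function name and the `safety_margin` parameter are assumptions for the example.

```python
# Minimal sketch: choosing a grip force from a measured friction coefficient,
# assuming a two-finger parallel gripper and Coulomb friction. The names and
# the safety margin are illustrative, not the PHASE project's implementation.

def min_grip_force(weight_n: float, mu: float, safety_margin: float = 1.3) -> float:
    """Smallest normal force per finger (in newtons) that prevents slip.

    The object's weight is supported by two frictional contacts:
    2 * mu * N >= W, hence N >= W / (2 * mu), scaled by a safety margin.
    """
    if mu <= 0:
        raise ValueError("friction coefficient must be positive")
    return safety_margin * weight_n / (2.0 * mu)

# The same 1.5 N object needs a far larger grip force when it is slippery.
grippy = min_grip_force(weight_n=1.5, mu=0.8)    # ~1.22 N per finger
slippery = min_grip_force(weight_n=1.5, mu=0.2)  # ~4.88 N per finger
```

The sketch shows why measuring friction at first contact matters: without a reliable estimate of `mu`, the pre-programmed force must be set conservatively high, which is exactly what damages delicate objects.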
The first major result of the project is the discovery of the tactile cue for friction. With the help of an innovative optical instrument, the PHASE project showed that the perception of friction during the first moments of contact is linked to a radial deformation of the skin as the finger is compressed. This radial deformation is created by the interaction between the tissues and the surface: when friction is high, the deformation is blocked, leaving only tiny radial movements of the skin, and vice versa. The second major result of the project is that this concept of radial deformation could be distilled into a new, patented tactile sensor, sensitive to deformation and capable of measuring this pattern of radial movements, and therefore capable of detecting areas of weak adhesion so that robotic grippers can readjust their grip. These new tactile grippers can then apply a force sufficient to lift the weight of the object but weak enough not to damage it; if the object is too slippery (its friction coefficient is too low), the contact is compromised and the object escapes the grip. Artificial-intelligence algorithms decode the output of the touch sensor to determine the texture of the object being touched and its grip. This research will improve robotic grippers and enable them to handle delicate objects such as fruits and vegetables.
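The radial-deformation cue described above can be illustrated with a toy computation. This is a hedged sketch of the principle only, not the patented sensor's algorithm: the marker coordinates, function names, and slip threshold are all hypothetical.

```python
import numpy as np

# Toy illustration of the radial-deformation cue (not the patented sensor's
# algorithm). Marker positions are expressed relative to the contact centre;
# the 0.05 mm threshold is a hypothetical value chosen for the example.

def mean_radial_displacement(before: np.ndarray, after: np.ndarray) -> float:
    """Average outward motion of skin markers during finger compression."""
    r_before = np.linalg.norm(before, axis=1)
    r_after = np.linalg.norm(after, axis=1)
    return float(np.mean(r_after - r_before))

def is_low_friction(before: np.ndarray, after: np.ndarray,
                    threshold_mm: float = 0.05) -> bool:
    # High friction pins the skin to the surface (tiny radial motion);
    # low friction lets the skin spread outward as the fingertip flattens.
    return mean_radial_displacement(before, after) > threshold_mm

# Four markers (in mm) that spread outward by 10% signal a slippery contact.
before = np.array([[1.0, 0.0], [0.0, 1.0], [-1.0, 0.0], [0.0, -1.0]])
after = before * 1.1
```

Here `is_low_friction(before, after)` returns `True`, flagging a contact where the grip force should be increased before lifting.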
Willemet, L.; Huloux, N.; Wiertlewski, M. Efficient tactile encoding of object slippage. Sci Rep. 2022, 12, 13192.
Willemet, L.; Kanzari, K.; Monnoyer, J.; Birznieks, I.; Wiertlewski, M. Initial contact shapes the perception of friction. Proceedings of the National Academy of Sciences of the United States of America. 2021, 118(49), e2109109118.
Scharff, R.B.N.; Boonstra, D.; Willemet, L.; Lin, X.; Wiertlewski, M. Rapid manufacturing of color-based hemispherical soft tactile fingertips. 2022 IEEE 5th International Conference on Soft Robotics (RoboSoft). IEEE. 2022, 896-902.
Lin, X.; Willemet, L.; Bailleul, A.; Wiertlewski, M. Curvature sensing with a spherical tactile sensor using the color interference of a marker array. In 2020 IEEE International Conference on Robotics and Automation (ICRA). May 2020, 603-609.
Workshop organized at the Eurohaptics 2022 international conference on the topic of touch mechanics.
The sense of touch is the sense that captures the mechanical interaction with our surroundings. When we come into contact with an object, the mechanical deformation of our skin and the resistance opposed to our limbs inform the central nervous system about the contact conditions. From there, information such as the weight, center of gravity, texture, and slipperiness of the object is extracted. These tactile percepts inform the planning of motor commands in order to achieve the mesmerizing dexterity of the human hand.
Despite the obvious benefit of relying on the sense of touch, current commercial robotic systems are rarely equipped with tactile sensors that capture relevant information about the tactile scene. Instead, they mostly depend on vision systems to perform tasks. Several reasons motivate this technological preference. First, contrary to cameras, robotic fingers and artificial skins with sufficient resolution and robustness are not yet broadly available to researchers and industry. Second, even the best artificial sensing systems lack a framework for processing and recognizing the tactile scene. The few studies that did tackle these issues are often tied to image processing and tend to neglect frictional and adhesion properties, which are essential for the swift control of robotic hands and for surface-texture characterization.
This research program aims to bridge the current limitations of soft-sensor design and computational touch to bring the sense of touch to a wide variety of robotic applications. The sensor will leverage recent advances in soft-material construction to build an artificial fingertip that can match the perceptual capability and mechanical strength of its human counterpart. Data provided by the sensors will be used to infer the state of contact via a physically motivated computational framework based on tribology and contact mechanics.
Research on touch is still in its infancy compared with visual and auditory perception. Building a physically grounded framework around artificial touch has the same innovative potential as computer vision had 30 years ago.
Project coordination
Michael Wiertlewski (Institut des Sciences du Mouvement)
The author of this summary is the project coordinator, who is responsible for its content. The ANR declines any responsibility for its content.
Partnership
AMU (ISM) Institut des Sciences du Mouvement
ANR grant: 288,777 euros
Beginning and duration of the scientific project: September 2016 - 42 months