Tactile perception combined with vision for successful robotic housekeeping. Part 1

In the field of robotics, the ongoing effort to imitate human sensory capabilities is driven by the desire to give machines a deeper understanding of the world around them. In recent years, tactile perception and the fusion of tactile and visual senses (tactile-visual fusion) have become innovative approaches in this direction. The introduction of haptic perception into robotics represents a fundamental shift in the capabilities of machines, giving them the ability to perceive their surroundings through touch, in many ways similar to humans.

Traditional robots have relied heavily on visual perception, rarely taking into account the tactile sensations needed to interact in complex and dynamic environments.

Haptic sensors integrate various tactile characteristics such as pressure, temperature, texture and material properties, giving robots rich senses and allowing them to more skillfully interact with their environment.

In manufacturing, robots equipped with tactile sensors will perform fine assembly tasks with greater accuracy and adaptability.

In healthcare, haptic feedback has the potential to assist surgeons in minimally invasive procedures by increasing the precision of surgical instrument manipulation.

Machine learning algorithms play a key role in facilitating the integration of these diverse sensory inputs within tactile-visual fusion. For example, deep learning architectures such as convolutional neural networks (CNNs) and recurrent neural networks (RNNs) demonstrate exceptional capabilities in capturing complex correlations between multimodal perception data and objects to be recognized.

However, many factors still limit the ability of robots to perform complex tasks in real-world scenarios, which is why they remain difficult substitutes for human labor. First, there is the challenge of creating robots with human-like sensory capabilities. Take, for example, pouring hot water from a cup: the robot requires visual perception to localize the cup, and it needs tactile information such as pressure, temperature, and slippage to perform the pouring action. Just as in humans, multimodal sensory abilities such as vision, pressure, temperature, thermal-property sensing, texture, and slip are required for robots to interact effectively with their environment. In addition, fast and sensitive perception is required: a lack of fast and sensitive slip detection may cause the robot to drop an object when trying to grasp it, causing damage or endangering people.

We propose a thermosensing-based flexible robotic tactile sensor capable of simultaneous multimodal sensing of contact pressure, temperature, thermal conductivity, texture, and slip (Fig. 1). The proposed haptic-visual fusion strategy is used to grasp various objects: vision detects the object's position and pose, while slip-based haptic feedback control enables dexterous robotic grasping. It is also worth noting that, using haptic sensing, the robot can stably grasp a cup with minimal force, ensuring that the cup is neither crushed nor dropped while water is poured into it.
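The slip-based feedback idea can be illustrated with a minimal control-loop sketch. The thresholds, force limits, and increment below are hypothetical illustrations, not the authors' actual controller:

```python
def grip_with_slip_feedback(slip_signal, f_min=0.5, f_max=5.0,
                            slip_threshold=0.1, df=0.2):
    """Start at minimal grip force; increase it each time slip is detected.

    slip_signal: per-timestep slip magnitudes from the tactile sensor
    (hypothetical units). Returns the force applied at each step.
    """
    force = f_min
    history = []
    for s in slip_signal:
        if s > slip_threshold:          # slip detected -> tighten grip
            force = min(force + df, f_max)
        history.append(force)
    return history

# Object starts slipping mid-grasp; the controller raises force, then holds.
forces = grip_with_slip_feedback([0.0, 0.0, 0.3, 0.4, 0.0, 0.0])
```

The key design point is that force only increases in response to detected slip, so the grip stays as gentle as the object allows.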

We apply the proposed haptic-visual fusion method to a desktop cleaning task: the robot automatically performs a series of actions, including object localization, stable holding, and accurate object recognition, to achieve multi-item sorting and desktop cleaning.

This validates the feasibility and superiority of the proposed multimodal tactile sensor as well as the tactile-visual fusion architecture, demonstrating the promising potential of intelligent robots for household applications.


Fig. 1: Tactile-visual fusion enables the robot to perform complex tasks.

The robot has multimodal sensing capabilities, including vision and touch (perception of contact pressure, temperature, thermal conductivity, texture, and slip). With these sensing capabilities and the proposed tactile-visual fusion strategy, the robot can perform a range of tasks such as object localization, stable grasping, object recognition, and sorting.

Operating principles of multimodal flexible tactile sensor

The sensor consists of an upper sensing layer, a lower sensing layer, a PDMS layer, and a porous material in the middle (Fig. 2a). The top and bottom sensing layers have the same sensing structure: two concentric platinum (Pt) thermistors deposited on a flexible polyimide substrate. The inner thermistor has a low resistance (~50 Ω) and acts as a heater (referred to as the hot film). The outer thermistor has a higher resistance (~500 Ω, referred to as the cold film) and acts as a local temperature sensor. The radii of the hot film and cold film are 0.95 mm and 3.2 mm, respectively (Fig. 2a). The sensing principle is based on conductive heat transfer: the hot film in each sensing layer serves both as a heater and as its own temperature sensor, making it sensitive to the thermal conductivity of its surroundings.


Fig. 2: Structure, working principle and functions of the multimodal tactile sensor.

a The tactile sensor structure consists of a top sensing layer, a bottom sensing layer, PDMS, and a porous material in the middle. b Operating principle of the lower sensing layer. c Pressure response Up of the lower sensing layer during loading and unloading; the inset shows the lower limit of detection. d Operating principle of the upper sensing layer: heat transfer of the upper sensing layer in the no-contact, contact, and sliding modes. e Output signals of the tactile sensor in response to the ambient and object temperatures, respectively. f Response of the upper sensing layer when touching materials with different thermal conductivities. g Response of the upper sensing layer upon contact and sliding on PA and PEEK, respectively. h Slip and texture determined from macro and micro features of the upper sensing layer signals; the macro and micro features are extracted from the upper sensing signals by filtering during the sliding process. Error bars shown in c, e and f represent the standard deviations of five repeated measurements of one tactile sensor.

In the lower sensing layer, contact pressure causes elastic deformation of the middle porous material, changing its thermal conductivity through this piezothermal transduction (Fig. 2b). Consequently, the lower sensing layer responds to contact pressure (its output signal is denoted Up). Figure 2c shows the sensor's response curve during loading and unloading. The sensor has a wide measuring range of 20 N, a low detection limit of 0.01 N, and low hysteresis of 2.4%.
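A hysteresis figure of this kind can be computed from loading/unloading curves as the maximum output gap at the same pressure, normalized by the full-scale output. A sketch with synthetic data (the real curves come from Fig. 2c; the values below are illustrative only):

```python
def hysteresis_percent(u_load, u_unload):
    """Max |loading - unloading| output gap at the same pressure point,
    as a percentage of full-scale output."""
    full_scale = max(u_load + u_unload) - min(u_load + u_unload)
    gap = max(abs(a - b) for a, b in zip(u_load, u_unload))
    return 100.0 * gap / full_scale

# Synthetic example: the unloading curve lags slightly behind loading.
u_load   = [0.0, 1.0, 2.0, 3.0, 4.0]    # sensor output while loading (a.u.)
u_unload = [0.0, 1.1, 2.1, 3.08, 4.0]   # sensor output while unloading
h = hysteresis_percent(u_load, u_unload)
```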

As for the upper sensing layer, when the sensor comes into contact with an object, the thermal conductivity of the object affects the heat transfer of the hot film and thus can be detected. Specifically, the hot film in the upper sensing layer is electrically heated by a constant temperature difference (CTD) circuit to maintain a higher temperature than the surrounding environment and create a thermal field in the contacted object. When sliding occurs, the hot film moves to a cooler region of the object it is in contact with, resulting in a change in heat transfer that is immediately detected by the hot film. Essentially, slippage is determined by the change in heat transfer of the hot film. After sliding, heat transfer returns to the pre-slip state as the sensor remains in contact with the newly heated area. Thus, object sliding is detected using the output of the upper sensing layer (denoted Ui) (Fig. 2d). In addition, as mentioned earlier, the cold-film thermistors in the lower and upper sensing layers sense the ambient temperature and object temperature, respectively, as shown in Fig. 2e. Thanks to the CTD circuit, the tactile sensor is not affected by ambient temperature fluctuations. Figure 2f shows the change in the response of the upper sensing layer when touching objects with different thermal conductivities: as the thermal conductivity of the object increases, the output signal of the sensor also increases. Figure 2g shows the dynamic response of the sensor as it transitions from no-contact mode to contact mode, sliding mode, and back to contact mode. Since the thermal conductivity of polyamide (PA) is higher, its contact and sliding response is higher than that of the polyetheretherketone (PEEK) material. It is worth noting that changes in contact pressure do not affect slip detection.
More importantly, since the sensor is sensitive to the surface topography of the object, the response during the sliding process is filtered to extract its macro features (Umacro) and micro features (Umicro) (Fig. 2h). These features can be used to simultaneously determine the slip state and the surface texture of an object. In addition, the tactile sensor was tested over 1000 reciprocating contact-release cycles, indicating good stability and reliability.
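Splitting the sliding response into macro and micro features amounts to low-pass/high-pass filtering. A minimal sketch using a moving-average low-pass filter (the paper's exact filter design is not specified here, so this is only an illustration of the idea):

```python
import numpy as np

def split_macro_micro(u, window=51):
    """Low-pass (moving average) -> macro feature; residual -> micro feature."""
    kernel = np.ones(window) / window
    u_macro = np.convolve(u, kernel, mode="same")   # slow drift: slip state
    u_micro = u - u_macro                           # fast ripple: texture
    return u_macro, u_micro

# Synthetic sliding signal: slow heat-transfer shift plus texture ripple.
t = np.linspace(0, 2, 2000)
u = 0.5 * t + 0.05 * np.sin(2 * np.pi * 40 * t)
u_macro, u_micro = split_macro_micro(u)
```

The macro component tracks the slow heat-transfer change associated with slip, while the micro component keeps the high-frequency ripple produced by surface texture.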

Slip detection and texture recognition

Macro features in the response signals of the upper sensing layer can be used to characterize the sliding state of an object on the sensor. We select different materials (Bakelite, PE, PA, PEEK, and PPS) for testing and record the sensor responses at sliding speeds of 0 (no sliding), 5, 10, 15, 20, 30, and 50 mm/s, respectively (Fig. 3a). The results show that as the thermal conductivity of the material increases and the sliding speed accelerates, the corresponding macro feature (Umacro) increases. When the thermal conductivity of an object is known from the preceding state of stable contact, Umacro can then be used to determine the sliding speed of the object when it is grasped by the robot. In addition, we use the PPS material to determine the minimum detection threshold and response time for slip detection. As shown in Fig. 3b, even at a small sliding speed of 0.05 mm/s, the sensor still shows a clear response. Additionally, once slip occurs, the sensor response exceeds the noise level of the no-slip condition in just 4 ms (Fig. 3c). Taken together, these results indicate that the proposed tactile sensor enables ultra-sensitive and ultra-fast slip detection on various materials.
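Threshold-based onset detection of the kind behind the 4 ms figure can be sketched as follows. The 3-sigma threshold, sampling rate, and step size are hypothetical, not taken from the paper:

```python
import numpy as np

def slip_onset_latency(signal, fs, slip_start_idx, noise_window=200, k=3.0):
    """Return ms between true slip onset and the first sample whose
    deviation from the pre-slip baseline exceeds k * noise std."""
    baseline = signal[:noise_window]
    mu, sigma = baseline.mean(), baseline.std()
    for i in range(slip_start_idx, len(signal)):
        if abs(signal[i] - mu) > k * sigma:
            return 1000.0 * (i - slip_start_idx) / fs
    return None  # slip never detected

# Synthetic trace: flat noisy signal, then a step change when slip begins.
fs = 1000  # Hz (assumed sampling rate)
rng = np.random.default_rng(0)
sig = rng.normal(0, 0.01, 1000)
sig[500:] += 0.2                      # heat-transfer change at slip onset
latency_ms = slip_onset_latency(sig, fs, slip_start_idx=500)
```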


Fig. 3: Performance of slip detection and texture recognition.

a Macro features for various materials at different sliding speeds. b The tactile sensor has a low slip detection limit of 0.05 mm/s. c The tactile sensor has a fast response time of 4 ms for slip detection. d Micro features at different sliding speeds. e FFT analysis of micro features at different sliding speeds. f STFT analysis of micro features at different sliding speeds. g Tactile responses and photomicrographs of four fabrics: polyester spandex, stretch polyester knit, nylon, and encrypted rayon (F1–F4). h Tactile response for the combination of fabrics F1 and F3. i Confusion matrix for fabric recognition.

Micro features in the response signals of the upper sensing layer can be used to determine the micro-surface morphology of objects, facilitating texture recognition. Figure 3d shows the micro-feature signals obtained when the sensor slides along the surface of the PA material at a normal pressure of 2 N at different speeds (0.5, 1, 3 and 5 mm/s). As the sliding speed increases, since the surface topography of the object remains unchanged, the number of signal peaks within the same time window also increases. Performing a fast Fourier transform (FFT) analysis of these signals in the frequency domain produces the spectrum shown in Fig. 3e. The graph shows that the frequencies corresponding to the spectral peaks also increase with increasing sliding speed. At a sliding speed of 1 mm/s, the dominant frequency for PA is 4.03 Hz, so the grating period on the PA surface is approximately 248 μm. Short-time Fourier transform (STFT) imaging of the micro features provides time-dependent frequency information, depicting the STFT intensity over time (Fig. 3f). As the sliding speed increases, the frequency distribution gradually shifts to the high-frequency region. Texture detection was also performed for various other materials.
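The grating-period estimate follows from period = sliding speed / peak frequency; for example, 1 mm/s ÷ 4.03 Hz ≈ 248 μm. A sketch with a synthetic micro-feature signal (the 4 Hz ripple, sampling rate, and duration below are illustrative assumptions):

```python
import numpy as np

def texture_period_um(u_micro, fs, speed_mm_s):
    """Estimate the surface grating period (μm) from the dominant
    frequency of the micro-feature signal."""
    spectrum = np.abs(np.fft.rfft(u_micro))
    freqs = np.fft.rfftfreq(len(u_micro), d=1.0 / fs)
    f_peak = freqs[1:][np.argmax(spectrum[1:])]  # skip the DC bin
    return 1000.0 * speed_mm_s / f_peak          # mm -> μm

# Synthetic ripple at ~4 Hz, as observed for PA sliding at 1 mm/s.
fs = 200.0
t = np.arange(0, 20, 1 / fs)
u_micro = np.sin(2 * np.pi * 4.0 * t)
period = texture_period_um(u_micro, fs, speed_mm_s=1.0)
```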

Therefore, the thermal conductivity, sliding speed, and surface texture of an object can be independently determined from the upper hot-film signal (Ui), from first contact through the sliding process. When the sensor comes into contact with a material, the response of the hot film in the upper sensing layer changes significantly, and the stabilized voltage (Ui) can be used to determine the material's thermal conductivity.

When sliding occurs during robotic grasping, the sliding speed can be determined from the macro features (Umacro) extracted from the upper hot-film sensing signal, combined with the previously determined thermal conductivity.

Additionally, by performing an FFT on the micro features (Umicro) extracted from the upper hot-film sensing signal, the grating period of the material surface, reflecting its texture, can be determined.

Using the tactile sensor to measure the thermal properties of a material and to determine its texture also makes it possible to accurately differentiate between fabrics. We select ten different fabrics (polyester spandex, stretch polyester knit, nylon, encrypted rayon, cotton canvas, denim, fleece, wool-polyester fabric, cardboard, linen fabric, and lycra) and record the tactile sensor responses while sliding over their surfaces (Fig. 3g). In addition, the tactile response to a combination of fabrics is shown in Fig. 3h: the sensor contacts two fabrics in succession during the sliding process, and the corresponding macro and micro features of the sensor signal can be used to identify the fabrics in sequence. We subsequently extract macro and micro features from the tactile sensor data to perform fabric recognition, achieving an overall accuracy of 94.3% across these ten fabrics.
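A recognition pipeline of this kind pairs a feature vector per slide with a classifier. As a minimal sketch, a nearest-centroid classifier on hypothetical two-dimensional features (the fabric names, feature values, and classifier choice below are illustrative; the paper's actual classifier and its 94.3% result are not reproduced here):

```python
import numpy as np

def nearest_centroid_fit(X, y):
    """Compute one mean feature vector (centroid) per class."""
    classes = np.unique(y)
    centroids = np.array([X[y == c].mean(axis=0) for c in classes])
    return classes, centroids

def nearest_centroid_predict(X, classes, centroids):
    """Assign each sample to the class of its nearest centroid."""
    d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2)
    return classes[np.argmin(d, axis=1)]

# Hypothetical 2-D features per slide: [macro amplitude, micro peak freq].
rng = np.random.default_rng(1)
fabrics = {"nylon": [0.8, 4.0], "denim": [1.2, 2.5], "fleece": [0.5, 1.0]}
X = np.vstack([rng.normal(c, 0.05, (20, 2)) for c in fabrics.values()])
y = np.repeat(list(fabrics), 20)
classes, centroids = nearest_centroid_fit(X, y)
acc = (nearest_centroid_predict(X, classes, centroids) == y).mean()
```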
