Proceeding Paper

Tactile Sensor Analysis during Early Stages of Manipulation for Single Grasp Identification of Daily Objects †

by Vinicius Prado da Fonseca
Department of Computer Science, Memorial University of Newfoundland, St. John’s, NL A1C 5S7, Canada
Presented at the 8th International Symposium on Sensor Science, 17–28 May 2021; Available online: https://i3s2021dresden.sciforum.net/.
Published: 17 May 2021
(This article belongs to the Proceedings of The 8th International Symposium on Sensor Science)

Abstract

Dexterous robotic manipulation in unstructured environments is still challenging, despite the increasing number of robots entering human settings each day. Even though robotic manipulation provides complete solutions in factories and industries, it still lacks essential techniques, displaying clumsy or limited operation in unstructured environments. Daily objects are typically designed for the human hand, and the human somatosensory system solves the complex computations required for dexterous manipulation in unstructured settings. Borrowing concepts from the human visuotactile system can improve dexterous manipulation and increase the use of robots in unstructured environments. In humans, the required finger and wrist joint adjustments occur after fast identification of the object in the initial stages of manipulation. Fast object identification during those phases may likewise improve robotic dexterous manipulation performance. The present paper explores human-inspired concepts such as the haptic glance to develop robotic single-grasp object identification. This concept can assist the early phases of robotic manipulation, supporting automated decisions, such as the grasp type and joint positions, during manipulation tasks. The main stages developed here are the detection of sensor activation and sample collection using signal-to-noise ratio and z-score filtering on tactile data. This procedure automates touch detection and reduces the sensor space for classification. Experiments on a dataset of daily objects presented compelling results that will assist subsequent steps in the early phases of robotic grasping.

1. Introduction

The growing integration of robots in homes and hospitals has driven robotic manipulation research in unstructured environments. Still, major advances in sensing and data processing are required to achieve human-level robotic manipulation in such environments. The human somatosensory system uses tactile and visual feedback to create frames of reference for object identification and pose estimation [1]. Such information is essential for subsequent finger and wrist joint updates. One aspect of human manipulation, called haptic glance sensing, concerns fast object recognition in the absence of visual stimuli. Humans are capable of fast object recognition in occluded environments using only tactile feedback [2] in the first phases of manipulation. A single grasp approach to achieving a robotic haptic glance would enable fast classification of objects during the early phases of manipulation. Single grasp manipulation collects data during a short window at the beginning of the manipulation to support an initial object recognition that informs decisions on post-grasp finger and wrist positions.
This paper investigates the initial stages of manipulation, where sensor contact is still unknown and prone to wrong selection due to signal noise. We propose a method in which a comparison between the signal-to-noise ratios of filtered and unfiltered data, combined with z-score peak detection, defines the manipulation points. The method has two main parts: an initial sensor detection stage, followed by z-score peak detection applied to the selected sensors. Classification tasks then use only the data points of the selected sensors, so the single grasp algorithm receives updated data only from activated sensors at peak points.
The present paper presents initial work on a multi-modal visual-tactile sensing dataset [3]. The selected dataset contains manipulations of daily objects in different hand orientations.

2. Results

This section presents the results of feature selection and signal peak detection applied to tactile sensing data. Early phases of robotic manipulation require fast selection and detection of activated sensors that indicate object touch. Signal peak detection on the activated sensors also enables additional automation in the early manipulation phases. This paper introduces two separate stages: an initial feature selection followed by peak detection. The techniques evaluated here apply the signal-to-noise ratio and z-score peak detection to tactile sensing data to identify activated sensors and manipulation points of interest. Combining those stages accelerates the initial phases of robotic manipulation. All experiments used the data described in [3]; the specific experiment is indicated in each plot title.

2.1. Feature Selection

The first stage covers feature selection. A manipulator may carry several sensors, not all of which activate during manipulation. Manipulator design, object shape, or the required grasp type may affect the number of sensors involved in the manipulation. This stage compares the signal-to-noise ratio (SNR) of filtered and unfiltered data to automate the detection of sensors directly involved in the manipulation.
Savitzky–Golay filtered data [4] will most likely produce a lower SNR than unfiltered data. This reduction is significant in sensors not involved in the manipulation, which may contain only noise. Experimental data suggest that sensors involved in the manipulation do not show a reduction greater than 10%. This threshold separates the sensors involved in the manipulation from those that are not. Figure 1 shows an example of 15 sensors where only two of them (0 and 8) were involved in the manipulation.
Due to noise, thresholding the raw SNR at an arbitrary value can lead to false positives. In some cases where the raw data show a significant SNR (e.g., sensor 5), the comparison with filtered data reveals a considerable drop in SNR. Conversely, a small drop in SNR between filtered and unfiltered data indicates that the sensor participated in the manipulation, so it can be included in the sensor list used for peak detection. Figure 2 shows one example of such a selection, where sensors 0 and 8 can be used in further tasks while the other sensors do not participate in the manipulation.
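To make the selection rule concrete, the following is a minimal Python sketch: filter the raw channels with a Savitzky–Golay filter, compare per-channel SNR estimates before and after filtering, and keep the channels whose SNR dropped by no more than 10%. The data layout (samples × sensors), the SNR estimator (a simple RMS proxy), and the filter window and order are assumptions for illustration, not values taken from this paper.

```python
# Minimal sketch of SNR-based sensor selection. Assumptions not from the
# paper: data layout, SNR estimator, and Savitzky-Golay filter parameters.
import numpy as np
from scipy.signal import savgol_filter

def snr_estimate(x):
    """Crude per-channel SNR proxy: RMS of the mean-removed signal.

    The paper does not specify its SNR estimator; this proxy reproduces the
    behavior described above (a large post-filter drop for noise-only
    channels, a small one for channels carrying real structure).
    """
    return x.std(axis=0)

def select_active_sensors(data, max_drop=0.10, window=51, polyorder=3):
    """Keep sensors whose SNR drops by at most `max_drop` after filtering;
    noise-only channels drop far more than the 10% threshold."""
    filtered = savgol_filter(data, window_length=window,
                             polyorder=polyorder, axis=0)
    drop = 1.0 - snr_estimate(filtered) / snr_estimate(data)
    return np.flatnonzero(drop <= max_drop)

# Synthetic check: 15 channels, only 0 and 8 carry a slow "contact" signal.
rng = np.random.default_rng(0)
data = rng.normal(0.0, 1.0, size=(1000, 15))
t = np.linspace(0.0, 10.0, 1000)
for ch in (0, 8):
    data[:, ch] += 5.0 * np.sin(t)
print(select_active_sensors(data))  # expected: [0 8]
```

The relative drop, rather than the raw SNR value, is what drives the decision: smoothing removes most of the variance of a noise-only channel, while a channel dominated by contact forces retains almost all of its variance.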

2.2. Peak Detection Results

The single grasp phase during early manipulation requires detecting the initial and final grasping points. Z-score peak detection was successfully used to detect heart rate in [5]; even though those authors used a multi-modal tactile sensor, the same technique is applied here. Figure 3 shows z-score peak detection performed on tactile sensor data.
The peaks detected (in red) can be used either directly or to define the initial and final points of manipulation used for single grasp classification.
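For the peak detection stage, a minimal sketch in the spirit of the common thresholding ("smoothed z-score") scheme is shown below; the paper does not give its exact formulation, so the lag, threshold, and influence values here are illustrative assumptions.

```python
# Minimal sketch of z-score peak detection on one selected tactile channel,
# following the common thresholding ("smoothed z-score") scheme; the lag,
# threshold, and influence values are illustrative, not the paper's.
import numpy as np

def zscore_peaks(x, lag=30, threshold=3.5, influence=0.5):
    """Flag samples deviating more than `threshold` standard deviations
    from the rolling mean of the previous `lag` samples."""
    x = np.asarray(x, dtype=float)
    signals = np.zeros(len(x), dtype=int)
    filt = x.copy()  # running series with damped peak influence
    for i in range(lag, len(x)):
        window = filt[i - lag:i]
        mean, std = window.mean(), window.std()
        if std > 0 and abs(x[i] - mean) > threshold * std:
            signals[i] = 1 if x[i] > mean else -1
            # Damp peak samples so they do not inflate the statistics.
            filt[i] = influence * x[i] + (1 - influence) * filt[i - 1]
        else:
            filt[i] = x[i]
    return signals
```

The first and last indices where the returned signal is nonzero can then stand in for the initial and final grasping points fed to the single grasp classifier.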

3. Conclusions

This paper provides early results on automating sensor selection and peak detection for the early phases of tactile manipulation. A comparison between the filtered and unfiltered signal-to-noise ratios of tactile data provides an initial sensor selection. Z-score peak detection is then applied to the selected sensor data to identify time slices containing data points for single grasp classification.
The results presented here are still in the early stages, and future work needs to investigate further and evaluate classification tasks on the selected sensors and peak locations. The results are nonetheless compelling and suggest that sensor selection for tactile data can rely on techniques requiring little to no prior knowledge of the dataset. A future direction is the use of short-time Fourier transforms for feature extraction. Furthermore, future work must cover real-time sensor selection and peak detection in the selected data slices.

Funding

This research received funding from Memorial University's Faculty of Science.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data used in this paper can be found at https://github.com/tsinghua-rll/Visual-Tactile_Dataset.

Conflicts of Interest

The author declares no conflict of interest.

References

  1. Prado da Fonseca, V.; Alves de Oliveira, T.E.; Petriu, E.M. Estimating the Orientation of Objects from Tactile Sensing Data Using Machine Learning Methods and Visual Frames of Reference. Sensors 2019, 19, 2285.
  2. Lederman, S.J.; Klatzky, R.L. Haptic perception: A tutorial. Atten. Percept. Psychophys. 2009, 71, 1439–1459.
  3. Wang, T.; Yang, C.; Kirchner, F.; Du, P.; Sun, F.; Fang, B. Multimodal grasp data set: A novel visual–tactile data set for robotic manipulation. Int. J. Adv. Robot. Syst. 2019, 16, 172988141882157.
  4. Savitzky, A.; Golay, M.J.E. Smoothing and Differentiation of Data by Simplified Least Squares Procedures. Anal. Chem. 1964, 36, 1627–1639.
  5. Rocha Lima, B.M.; Eustaquio Alves de Oliveira, T.; da Fonseca, V.P.; Zhu, Q.; Goubran, M.; Groza, V.Z.; Petriu, E.M. Heart Rate Detection Using a Miniaturized Multimodal Tactile Sensor. In Proceedings of the 2019 IEEE International Symposium on Medical Measurements and Applications (MeMeA), Istanbul, Turkey, 26–28 June 2019; pp. 1–6.
Figure 1. Signal-to-noise ratio comparison between filtered (gray) and unfiltered (blue) data for all the sensors during manipulation.
Figure 2. Sensor values from selected sensors (left) and noise-only sensors (right).
Figure 3. The blue line shows raw data and the green line shows filtered data (top). The red line shows the z-score peak detection for the same signal (bottom).
