
Article
Tactile Object Recognition for Humanoid Robots Using New
Designed Piezoresistive Tactile Sensor and DCNN
Somchai Pohtongkam and Jakkree Srinonchat *
Citation: Pohtongkam, S.; Srinonchat, J. Tactile Object Recognition for Humanoid Robots Using New Designed Piezoresistive Tactile Sensor and DCNN. Sensors 2021, 21, 6024. https://doi.org/10.3390/s21186024
Academic Editor: Nunzio Cennamo
Received: 4 August 2021
Accepted: 6 September 2021
Published: 8 September 2021
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Copyright: © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Department of Electronics and Telecommunication Engineering, Rajamangala University of Technology Thanyaburi,
Khlong Luang 12110, Thailand; somchai_po@mail.rmutt.ac.th
* Correspondence: jakkree.s@en.rmutt.ac.th; Tel.: +66-897-775-038
Abstract:
A tactile sensor array is a crucial component for applying physical sensing to a humanoid robot. This work focused on developing a palm-sized tactile sensor array (56.0 mm × 56.0 mm) for object recognition with a humanoid robot hand. The sensor was based on PCB technology and operated on the piezoresistive principle. A conductive polymer composite sheet was used as the sensing element, and the sensor matrix array was 16 × 16 pixels. The sensitivity of the sensor was evaluated, and the sensor was installed on the robot hand. Tactile images from 20 object classes, with resolution enhanced using bicubic interpolation, were used to train and test 19 different DCNNs. InceptionResNetV2 provided superior performance, with 91.82% accuracy. However, using a multimodal learning method that combined InceptionResNetV2 and XceptionNet, the highest recognition rate of 92.73% was achieved. Moreover, this recognition rate improved further when object exploration was applied.
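As a side note on the resolution-enhancement step mentioned in the abstract, the sketch below upscales one 16 × 16 tactile frame with bicubic interpolation. It assumes SciPy is available; the random frame values and the 4× scale factor are illustrative, not the authors' exact settings.

```python
import numpy as np
from scipy.ndimage import zoom

# Hypothetical 16 x 16 tactile frame (normalized pressure values).
rng = np.random.default_rng(0)
frame = rng.random((16, 16))

# Bicubic interpolation (spline order 3) enhances resolution,
# here 16 x 16 -> 64 x 64, before feeding the image to a DCNN.
enhanced = zoom(frame, 4, order=3)

print(frame.shape, enhanced.shape)  # (16, 16) (64, 64)
```

Upscaling a low-resolution tactile map this way lets standard image-classification networks, which expect larger inputs, be reused without architectural changes.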
Keywords: tactile sensor; tactile object recognition; DCNN; humanoid robot; transfer learning
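The combination of two networks described in the abstract can be approximated by late fusion of their class probabilities. The sketch below averages the softmax outputs of two hypothetical 20-class heads; this is one common fusion rule, offered only as an assumption, not necessarily the exact multimodal method used in the paper.

```python
import numpy as np

def softmax(x):
    """Numerically stable softmax over the last axis."""
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

# Hypothetical logits for one tactile image over 20 object classes,
# standing in for the outputs of two different DCNN backbones.
rng = np.random.default_rng(1)
logits_a = rng.normal(size=20)  # e.g. an InceptionResNetV2-style head
logits_b = rng.normal(size=20)  # e.g. an XceptionNet-style head

# Late fusion: average the per-class probabilities, then pick the argmax.
fused = (softmax(logits_a) + softmax(logits_b)) / 2
pred = int(np.argmax(fused))
print(pred)
```

Averaging probabilities rather than logits keeps each network's contribution on the same scale regardless of how confident its raw outputs are.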
1. Introduction
Unlike humans who can identify objects by touching, humanoid robots do not have
this capability due to the lack of suitable tactile sensors and efficient recognition processing
systems. The critical development of humanoid robot technology can be divided into
two parts: (1) robot anatomy [1]; (2) the robot nervous system [2]. Developing a physical
structure and human-like learning ability is necessary to enable robots to operate in a home
or office environment. In addition, the development of robots having a human-like hand
structure is desirable [3–5]. This study examines a humanoid robot’s physical sensory system that can recognize objects by touch. Its essential function is developed based on the human physical sensory system [6]. In the object learning and recognition systems of humanoid robots that employ artificial haptic perception [7–11], pressure sensors or tactile sensors are utilized [7–11], and the obtained information is sent to a computer for analysis [10]. Object
learning and recognition systems are similar to the human sensory system where nerve-
ending receptors (e.g., Ruffini endings and Pacinian receptors) obtain information sent to
the brain for interpretation. There have been numerous studies describing the development
of robotic hands. These studies focus on tactile sensor arrays for robot hand artificial skin
application [7–11]. Human sensory recognition is a complicated action resulting from the biosensor system in the body, which includes three modes of data perception [6]. The first
mode is tactile perception where contact with the skin of the fingers or palm provides
information on the contact geometry or pressure profile.
A tactile sensor array produces this mode of data perception for robots and presents
data in a 2D format or tactile image [10]. The second perception mode is kinesthetic
perception, a perception from motion such as rubbing or scrubbing objects. For robots,
this mode of data perception is produced by tactile sensors on the fingertips or palm
from dragging the sensor across the object, and presents data in a 1D format [12]. The third
perception mode is global object shape, where perception data is gathered through the