
Article
A Sensor Fused Rear Cross Traffic Detection System Using Transfer Learning
Jungme Park * and Wenchang Yu
Citation: Park, J.; Yu, W. A Sensor Fused Rear Cross Traffic Detection System Using Transfer Learning. Sensors 2021, 21, 6055. https://doi.org/10.3390/s21186055
Academic Editor: Subhas Mukhopadhyay
Received: 16 August 2021
Accepted: 6 September 2021
Published: 9 September 2021
College of Engineering, Kettering University, Flint, MI 48504-6214, USA; yu4416@kettering.edu
* Correspondence: jpark@kettering.edu
Abstract: Recent emerging automotive sensors and innovative technologies in Advanced Driver Assistance Systems (ADAS) increase the safety of driving a vehicle on the road. ADAS enhance road safety by providing early warning signals for drivers and by controlling the vehicle to mitigate a collision. A Rear Cross Traffic (RCT) detection system is an important ADAS application. Rear-end crashes are a frequently occurring type of collision, accounting for approximately 29.7% of all crashes. The RCT detection system detects obstacles at the rear while the car is backing up. In this paper, a robust sensor fused RCT detection system is proposed. By combining the information from two radars and a wide-angle camera, the locations of the target objects are identified using the proposed sensor fusion algorithm. Then, a Convolutional Neural Network (CNN) model adapted through transfer learning is used to classify the object type. The experiments show that the proposed sensor fused RCT detection system reduces the processing time by a factor of 15.34 compared with the camera-only system, while achieving 96.42% accuracy. The experimental results demonstrate that the proposed sensor fused system provides robust object detection accuracy and fast processing time, both of which are vital for deploying an ADAS.
Keywords: ADAS; object detection; Convolutional Neural Network; sensor fusion; rear cross traffic; radar; camera
1. Introduction
Most traffic accidents occur due to human error. Rear-end crashes are a frequently occurring type of collision, accounting for approximately 29.7% of all crashes [1]. Recent emerging automotive sensors and innovative technologies in computer vision enhance car and road safety. Advanced Driver Assistance Systems (ADAS) are intelligent systems that help drivers avoid collisions and increase driving safety, such as Automated Emergency Braking (AEB), Blind Spot Detection (BSD), and Lane Departure Warning (LDW). ADAS have been proven to reduce road fatalities by detecting obstacles in advance, generating warning signals for drivers, and controlling the vehicle accordingly.
A Rear Cross Traffic (RCT) detection system is one of the ADAS applications, activated when the vehicle is driven in reverse. The RCT detection system warns the driver when obstacles are detected near the backing path. This is a challenging task because obstacles can approach quickly from the sides, requiring the system to react appropriately within a short time. The RCT detection system detects objects in blind spots or at locations where obstacles are difficult to see through the mirrors.
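As a rough illustration of the warning behavior described above (not the method proposed in this paper), the following Python sketch estimates the time for a laterally approaching object to reach the backing path under a constant-velocity assumption and raises a warning when that time falls below a threshold. The data fields, the 2 s threshold, and the path half-width are illustrative assumptions.

```python
from dataclasses import dataclass

# Hypothetical radar track for a rear cross-traffic object.
# Fields and units are illustrative assumptions, not the paper's interface.
@dataclass
class RadarTrack:
    lateral_offset_m: float   # lateral distance from the backing-path centerline (m)
    lateral_speed_mps: float  # speed toward the path; > 0 means approaching (m/s)
    range_m: float            # distance behind the ego vehicle (m)

def should_warn(track: RadarTrack,
                path_half_width_m: float = 1.5,
                time_threshold_s: float = 2.0,
                max_range_m: float = 10.0) -> bool:
    """Constant-velocity time-to-path check for a single track (a sketch)."""
    if track.range_m > max_range_m or track.lateral_speed_mps <= 0.0:
        return False  # too far behind, or not moving toward the backing path
    gap_m = max(track.lateral_offset_m - path_half_width_m, 0.0)
    time_to_path_s = gap_m / track.lateral_speed_mps
    return time_to_path_s < time_threshold_s

# Example: a vehicle 6 m to the side, closing at 4 m/s, 5 m behind the bumper.
print(should_warn(RadarTrack(lateral_offset_m=6.0, lateral_speed_mps=4.0, range_m=5.0)))
```

A deployed RCT system would of course track multiple objects and fuse radar and camera information, as discussed in the remainder of this paper; the sketch only conveys why fast-approaching side traffic leaves little reaction time.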
Currently, many commercial RCT detection systems are implemented using radar sensors. However, in many ADAS applications, a single sensor is not sufficient for system accuracy. A radar sensor can measure object speed and range accurately and works under adverse weather conditions. However, radar measurements are often noisy and have low resolution. Furthermore, a radar sensor cannot classify object types. On the other hand, a camera sensor has the advantages of low cost and high resolution. However, the camera sensor is susceptible to illumination changes. Therefore,