Article
Infer Thermal Information from Visual Information: A Cross Imaging Modality Edge Learning (CIMEL) Framework
Shuozhi Wang 1, Jianqiang Mei 2, Lichao Yang 1 and Yifan Zhao 1,*
Citation: Wang, S.; Mei, J.; Yang, L.; Zhao, Y. Infer Thermal Information from Visual Information: A Cross Imaging Modality Edge Learning (CIMEL) Framework. Sensors 2021, 21, 7471. https://doi.org/10.3390/s21227471
Academic Editors: YangQuan Chen, Subhas Mukhopadhyay, Nunzio Cennamo, M. Jamal Deen, Junseop Lee and Simone Morais
Received: 11 October 2021
Accepted: 6 November 2021
Published: 10 November 2021
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Copyright: © 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
1 School of Aerospace, Transport and Manufacturing, Cranfield University, Bedford MK43 0AL, UK; shuozhi.wang@cranfield.ac.uk (S.W.); lichao.yang@cranfield.ac.uk (L.Y.)
2 School of Electronic Engineering, Tianjin University of Technology and Education, Tianjin 300222, China; meijianqiang@tute.edu.cn
* Correspondence: yifan.zhao@cranfield.ac.uk; Tel.: +44-(0)-1234729
Abstract: The measurement accuracy and reliability of thermography are largely limited by the relatively low spatial resolution of infrared (IR) cameras in comparison to digital cameras. Using a high-end IR camera to achieve high spatial resolution can be costly, or sometimes infeasible due to the high sample rate required. There is therefore a strong demand to improve the quality of IR images, particularly at edges, without upgrading the hardware, in the context of surveillance and industrial inspection systems. This paper proposes a novel Conditional Generative Adversarial Network (CGAN)-based framework to enhance IR edges by learning high-frequency features from corresponding visual images. A dual discriminator, focusing on edges and on content/background respectively, is introduced to guide the cross imaging modality learning of the U-Net generator in the high- and low-frequency domains. Results demonstrate that the proposed framework can effectively enhance barely visible edges in IR images without introducing artefacts, while the content information is well preserved. Unlike most similar studies, this method only requires IR images at testing time, which increases its applicability in scenarios where only one imaging modality is available, such as active thermography.
Keywords: image enhancement; edge detection; deep learning; thermography
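To illustrate the edge/content separation that the dual discriminator described above operates on, the following NumPy sketch splits an image into a low-frequency (content) component and a high-frequency residual that concentrates at edges. This is purely our own illustration of the frequency split, not the authors' implementation; the box-blur filter and all function names are assumptions.

```python
import numpy as np

def split_frequencies(img, k=3):
    """Split an image into a low-frequency (content) part and a
    high-frequency (edge) residual using a simple k x k box blur.
    Illustrative stand-in for the two signals the edge- and
    content-focused discriminators would each judge."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")  # replicate borders
    h, w = img.shape
    low = np.zeros_like(img, dtype=float)
    for i in range(h):
        for j in range(w):
            low[i, j] = padded[i:i + k, j:j + k].mean()
    high = img - low  # residual carries edges and fine texture
    return low, high

# A toy image with one sharp vertical boundary (a crude "edge")
img = np.zeros((8, 8))
img[:, 4:] = 1.0
low, high = split_frequencies(img)
# The high-frequency residual is zero in flat regions and
# non-zero only in the columns adjacent to the boundary.
```

The decomposition is exact by construction (`low + high` reconstructs the input), which mirrors the idea that the generator must satisfy both discriminators at once: sharpening edges in the high-frequency band while leaving the low-frequency content untouched.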
1. Introduction
Infrared (IR) is a kind of electromagnetic radiation with a longer wavelength than that of visible light. Infrared thermography has been widely used in different fields, such as monitoring [1], medicine [2], psychophysiology [3], nondestructive testing (NDT) [4] and so forth.
Although significant progress has been achieved in IR imaging, the spatial resolution is still one of the major limiting factors and bottlenecks for industrial thermography applications, mainly due to the high cost of sensors. Typically, the pixel dimension of thermography is 640 × 480, which is relatively low compared with modern RGB photography. Although there are some high-end IR cameras with improved spatial resolution, these cameras are usually much more expensive. Furthermore, even at the same spatial resolution, the boundaries of objects in thermal images are not as sharp as those in digital images. From the perspective of the imaging principle, a digital imaging system typically acquires images with CCD or CMOS sensors, based on differences in the intensity of light in the range of 0.4–0.7 µm reflected by the surface of the observed target, yielding high contrast and improved resolution. Infrared thermal imaging, in contrast, is based on receiving radiant energy at longer wavelengths, in the range of 3–12 µm. Owing to the limited minimum resolvable temperature difference between the object and the background, together with the distance from which it is measured, the target is quickly submerged in the dark background. This phenomenon tends to blur the acquired IR images. This is particularly problematic in active thermography, where the boundary of