Citation: Zhao, D.; Mo, B.; Zhu, X.; Zhao, J.; Zhang, H.; Tao, Y.; Zhao, C. Dynamic Multi-Attention Dehazing Network with Adaptive Feature Fusion. Electronics 2023, 12, 529. https://doi.org/10.3390/electronics12030529
Academic Editor: Silvia Liberata Ullo
Received: 26 December 2022; Revised: 15 January 2023; Accepted: 16 January 2023; Published: 19 January 2023
Copyright: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Article
Dynamic Multi-Attention Dehazing Network with Adaptive Feature Fusion
Donghui Zhao 1,*, Bo Mo 1, Xiang Zhu 2, Jie Zhao 1,3, Heng Zhang 4, Yimeng Tao 1 and Chunbo Zhao 1
1 Beijing Institute of Technology, Beijing 100081, China
2 Beijing Building Materials Research Institute Co., Ltd., Beijing 100041, China
3 North Navigation Control Technology Co., Ltd., Beijing 100176, China
4 Shanghai Electro-Mechanical Engineering Institute, Shanghai 201109, China
* Correspondence: zhaodonghui99@foxmail.com
Abstract: This paper proposes a Dynamic Multi-Attention Dehazing Network (DMADN) for single image dehazing. The proposed network consists of two key components: the Dynamic Feature Attention (DFA) module and the Adaptive Feature Fusion (AFF) module. The DFA module provides pixel-wise and channel-wise weights for input features, considering that the haze distribution in a degraded image is always uneven and that the values differ across channels. The AFF module, based on an adaptive mixup operation, restores spatial information missing from high-resolution layers. Most previous works have concentrated on increasing model scale to improve dehazing performance, which makes deployment on edge devices difficult. Instead, we introduce contrastive learning into our training process, leveraging both positive and negative samples to optimize the network. This strategy effectively improves the quality of the output without increasing the model's complexity or inference time in the testing phase. Extensive experimental results on synthetic and real-world hazy images demonstrate that DMADN achieves state-of-the-art dehazing performance with a competitive number of parameters.
Keywords: dehazing; CNN; feature attention; feature fusion; contrastive learning
1. Introduction
Haze is a common atmospheric phenomenon caused by particles floating in the air. Because this turbid medium hinders light propagation, images taken in haze are often subject to some degree of degradation. Hazy input images degrade the performance of high-level computer vision systems (such as object detection [1,2] and scene understanding [3,4]). A dependable high-level computer vision system, however, must work well under various kinds of interference [5,6]. Developing dehazing techniques is therefore a significant step toward improving the robustness of high-level computer vision systems.
Previous works [7,8] have proposed the atmosphere scattering model to explain the process of hazy image generation. Specifically, it assumes that:
I(x) = J(x)t(x) + A(1 − t(x)) (1)
where I(x) and J(x) are the degenerated hazy and clear images, A is the atmosphere light intensity, and t(x) is the medium transmission map. We also have t(x) = e^(−βd(x)), where β and d(x) are the atmosphere scattering parameter and the scene depth, respectively.
Early dehazing methods [9–19] are based on priors observed in natural scenes; He et al. [12] proposed the dark channel prior (DCP), which is the masterpiece of the prior-based methods. However, prior-based dehazing methods are not effective in certain scenarios. In recent years, the Convolutional Neural Network (CNN) has been proven effective for dehazing [20–27]. DehazeNet [21] first reconstructs the haze-free image by estimating A