Citation: Teague, S.; Chahl, J. Strapdown Celestial Attitude Estimation from Long Exposure Images for UAV Navigation. Drones 2023, 7, 52. https://doi.org/10.3390/drones7010052
Academic Editors: Andrzej Łukaszewicz, Wojciech Giernacki, Zbigniew Kulesza, Jaroslaw Pytka and Andriy Holovatyy
Received: 13 December 2022
Revised: 9 January 2023
Accepted: 9 January 2023
Published: 12 January 2023
Copyright: © 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Article
Strapdown Celestial Attitude Estimation from Long Exposure Images for UAV Navigation
Samuel Teague 1,* and Javaan Chahl 1,2
1 School of Engineering, University of South Australia, Mawson Lakes, SA 5095, Australia
2 Joint and Operations Analysis Division, Defence Science and Technology Group, Melbourne, VIC 3207, Australia
* Correspondence: samuel.teague@mymail.unisa.edu.au
Abstract: Strapdown celestial imaging sensors provide a compact, lightweight alternative to their gimbaled counterparts. Strapdown imaging systems typically require a wider field of view, and consequently longer exposure intervals, leading to significant motion blur. The motion blur for a constellation of stars results in a constellation of trails on the image plane. We present a method that extracts the path of these star trails and uses a linearized weighted least squares approach to correct noisy inertial attitude measurements. We demonstrate the validity of this method through its application to synthetically generated images, and subsequently observe its relative performance using real images. The findings of this study indicate that the motion blur present in strapdown celestial imagery yields an a posteriori mean absolute attitude error of less than 0.13 degrees in the yaw axis, and 0.06 degrees in the pitch and roll axes (3σ), for a calibrated wide-angle camera lens. These findings demonstrate the viability of low-cost, wide-angle, strapdown celestial attitude sensors on lightweight UAV hardware.
Keywords: celestial; stellar; navigation; strapdown; attitude
1. Introduction
The use of stabilized celestial navigation sensors for uncrewed aerial vehicle (UAV) attitude determination is well documented [1]. With recent demand for size-, weight- and power-constrained systems, strapdown celestial sensors have become more common. A strapdown [2] celestial navigation sensor is rigidly mounted to the airframe, causing its imagery to be subjected to motion artefacts from the aircraft, such as actuation, vibration and turbulence. The length of the exposure window is the primary factor governing the severity of the resultant motion blur. For wide-angle lenses, it is necessary to use longer exposure windows to increase the total light energy incident on the sensor. Under stable flight conditions, longer exposure windows enable the detection of higher magnitude stars, and consequently provide a more accurate attitude estimate. Under motion, however, the longer exposure window results in "smearing" of star images, leaving a trail as seen in Figure 1. This image shows a region of interest (ROI) containing a single star trail captured in-flight by a strapdown celestial imaging system. We can see from this ROI that the resultant trail tends to be noisy, and that the angular velocity tends to change throughout the exposure interval.
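The linearized weighted least squares correction mentioned in the abstract can be illustrated with a minimal sketch. The following is not the authors' implementation; it assumes a generic Gauss-Newton/WLS step in which a Jacobian J maps small attitude perturbations (roll, pitch, yaw) to star-trail residuals on the image plane, with W weighting each residual. All variable names and values are illustrative.

```python
import numpy as np

def wls_attitude_update(attitude, J, r, W):
    """Return attitude corrected by one linearized WLS step.

    attitude : (3,) current roll, pitch, yaw estimate [rad]
    J        : (m, 3) Jacobian of trail residuals w.r.t. attitude
    r        : (m,) observed-minus-predicted star-trail residuals
    W        : (m, m) diagonal weight matrix (measurement confidence)
    """
    # Normal equations: (J^T W J) delta = J^T W r
    JtW = J.T @ W
    delta = np.linalg.solve(JtW @ J, JtW @ r)
    return attitude + delta

# Synthetic check: with residuals exactly linear in the perturbation,
# a single step recovers the true attitude offset.
rng = np.random.default_rng(0)
J = rng.standard_normal((10, 3))
true_delta = np.array([0.01, -0.02, 0.005])
r = J @ true_delta
W = np.eye(10)
corrected = wls_attitude_update(np.zeros(3), J, r, W)
```

In practice the residuals are nonlinear in attitude, so such a step would be iterated from the noisy inertial estimate, with W chosen to down-weight poorly resolved trail segments.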
The premise for this research comes from the hypothesis that the observed star trails contain high-resolution information pertaining to the attitude of the aircraft during the exposure window. We present a method which estimates high-resolution attitude data from long-exposure images, provided a low-resolution approximation is available from the autopilot (e.g., from an inertial measurement unit). This method makes use of the long-exposure strapdown imagery simulation presented in [3] to provide an initial approximation of the star trail location and orientation, and corrects for attitude and attitude rate errors from the