Citation: Ugwitz, P.; Kvarda, O.; Juříková, Z.; Šašinka, Č.; Tamm, S. Eye-Tracking in Interactive Virtual Environments: Implementation and Evaluation. Appl. Sci. 2022, 12, 1027. https://doi.org/10.3390/app12031027
Academic Editor: Enrico Vezzetti
Received: 4 December 2021; Accepted: 17 January 2022; Published: 19 January 2022
Article
Eye-Tracking in Interactive Virtual Environments:
Implementation and Evaluation
Pavel Ugwitz 1, Ondřej Kvarda 1,*, Zuzana Juříková 2, Čeněk Šašinka 3 and Sascha Tamm 4
1 Department of Geography, Faculty of Science, Masaryk University, Kotlářská 267/2, 611 37 Brno, Czech Republic; ugwitz.pavel@mail.muni.cz
2 Department of Educational Sciences, Faculty of Arts, Masaryk University, Arna Nováka 1, 602 00 Brno, Czech Republic; jurikova@mail.muni.cz
3 Department of Information and Library Studies, Faculty of Arts, Masaryk University, Arna Nováka 1, 602 00 Brno, Czech Republic; cenek.sasinka@mail.muni.cz
4 Institute of Psychology, Freie Universität Berlin, Kaiserswerther Straße 16-18, 14195 Berlin, Germany; tamm@zedat.fu-berlin.de
* Correspondence: kvarda.ondrej@mail.muni.cz
Abstract: Not all eye-tracking methodology and data processing are equal. While the use of eye-tracking is intricate because of its grounding in visual physiology, traditional 2D eye-tracking methods are supported by software, tools, and reference studies. This is less true for eye-tracking methods applied in virtual reality (imaginary 3D environments). Previous research has regarded the domain of eye-tracking in 3D virtual reality as an untamed realm with unaddressed issues. The present paper explores these issues, discusses possible solutions at a theoretical level, and offers example implementations. The paper also proposes a workflow and software architecture that encompasses an entire experimental scenario, including virtual scene preparation and operationalization of visual stimuli, experimental data collection and considerations for ambiguous visual stimuli, post hoc data correction, data aggregation, and visualization. The paper is accompanied by examples of eye-tracking data collection and evaluation based on ongoing research of indoor evacuation behavior.
Keywords: eye-tracking; virtual reality; eye-tracking algorithms; data collection; data visualization; dynamic environments; interactive environments
1. Introduction
Researchers are generally fond of valid, reliable measurement methods and tools
that allow the capture of variables containing rich information and high interpretative
value. Eye-tracking may be considered such a profitable method because it captures eye
movements and their derived patterns and finds application in the analysis of many types
of spatial and visual compositions.
1.1. The Argument for Eye-Tracking
One could argue that the measurement variables acquired by eye-tracking have broad
applicability, especially in cognitive science and its applications. The argument defines two
types of variables: extrinsic and intrinsic variables [1]. Extrinsic variables (e.g., size) can
be measured straightforwardly without the need for expert knowledge in constructing or
operating the measuring instrument, whereas intrinsic variables are more challenging to
access and measure, as their measurement tools make use of other variables or physics-
based conversions to acquire the intrinsic value (e.g., measurement of blood pressure). In
addition, not all phenomena can be measured by a device situated in the physical realm; for
example, stress levels are not only expressed physically by blood pressure, heartbeat, pupil
dilation, etc., but also by high-level cognitive and neurological factors such as stimulus
habituation, nervous system excitability, etc. These are called constructs [2] and generally