ABSTRACT
In 2014, 25% of all organizations polled across industry said that a lack of infosec skills was a problem. In 2015, an
Enterprise Strategy Group (ESG) survey found that 28% reported a shortage of infosec skills (Trendmicro, 2015).
With the growing threat of cybercrime and mounting national security concerns, increasing the number of qualified
cybersecurity professionals has become a national imperative. As the cybersecurity landscape is shaped by new
technologies, unknown threats, and increasing vulnerability in a dynamic environment, there is an established need to
rapidly develop innovative, effective, efficient, and responsive cybersecurity education initiatives (Dark & Mirkovic, 2015).
One such initiative recently piloted by the Department of Defense is the Cyber Operations Academy Course
(COAC). The first pilot began in May 2015 at the Fort McNair campus in Washington, D.C. The six-month
immersive course enrolled 20 participants, mostly military personnel drawn from all four service branches, with
varied backgrounds and little, if any, cyber experience. Structured as an authentic, problem-based course employing
cooperative and collaborative learning models, the pilot included instruction in foundations, defensive/offensive
operations, programming, social engineering, and skills integration. The course leveraged cyber ranges and capture
the flag (CTF) activities and was supported by four “fireteam” leads serving as facilitators, coaches, and subject
matter experts. By the end of the course, students had developed cyber capabilities and tools, developed and
deployed exploits, detected and responded to incidents, and used social engineering to exploit “targets.” In
performance comparisons with existing cyber protection teams deployed at DoD installations, the students were as
capable, and in some cases more capable. In pre/post comparisons, students exhibited potentially large knowledge
gains. This paper discusses the nature of the course’s pedagogy; the challenge of developing representations of
learning outcomes and performance; and the challenges of developing performance-based assessments to
authentically and objectively assess students’ knowledge and skills in the context of the course.
ABOUT THE AUTHOR
Dr. P. Shane Gallagher is employed by the Institute for Defense Analyses and is supporting the Advanced
Distributed Learning (ADL) Initiative and OUSD Force Training as a learning scientist and education specialist. He
received his Ph.D. in Instructional Technology from George Mason University and MA in Educational Technology
from the University of New Mexico. Currently, Dr. Gallagher provides learning science and methodological
direction for applied research projects and cybersecurity assessment and is the lead researcher for assessing the
development of the ADL Total Learning Architecture. Dr. Gallagher has directed research on video game design for
cognitive adaptability and on the learning science implications of the xAPI’s design, and is also researching methods
to apply the xAPI and its syntax to describe social learning interactions and human performance, especially within
cyber-physical contexts. He has led research projects in cognition and game design and R&D projects in learning
object content models, simulations, reusable pedagogical models, organizational readiness, and knowledge
management. He has been recognized by the NASA Johnson Space Center (JSC) Chief Knowledge Officer for his
work assessing JSC’s knowledge management readiness, and he has authored papers and chapters on neuroscience,
cognition, game design, and innovative learning technology applications and specifications.