11 July 2023
The Newest Weapon in Irregular
Warfare – Artificial Intelligence
By Mohamad Mirghahari
On the morning of 22 May 2023, an artificial intelligence (AI)-generated image of an explosion at
the Pentagon surfaced online and spread like wildfire across social media. Multiple news sources
reported and shared the AI-generated image on their platforms. Markets responded to the reports
and the image: the S&P 500 index fell within minutes of the reporting, causing a $500 billion
market cap swing, even though the image was quickly proven to be fake.
Artificial intelligence provides an ever-expanding set of new tools that can be applied in irregular
warfare, from targeted disinformation campaigns to military deception (MILDEC). In 2012, a
Department of Defense (DoD) Joint Publication defined MILDEC as content “intended to deter
hostile actions, increase the success of friendly defensive actions, or to improve the success of any
potential friendly offensive action.” The Pentagon deep fake (or AI-generated image), which served
to negatively impact the U.S. economy and create a substantial amount of confused and misleading
reporting, demonstrates that this technology can be used for military deception purposes.
By using artificial intelligence to create different media for influence, one can potentially create
the illusion of an ongoing war, an attack, a resistance movement, and other forms of collateral for
information operations. This use of AI can meet the goals of MILDEC as defined by the DoD.
The image of the explosion at the Pentagon is just the tip of the iceberg of how AI could be used not only
to drive disinformation, but also to conduct economic sabotage. Across multiple domains, AI can be an
essential part of achieving the objectives of any military operation.
AI can also be used to directly support irregular warfare activities, such as cyber and influence operations, in a
number of ways, both strategically and tactically.
For example, AI can support military deception by automating the creation and
dissemination of disinformation, removing its development from the hands of the
planners and teams normally required to build influence campaigns and increasing
the volume of disinformation or messages that can be pushed out. Conventional
wisdom suggests that increasing the rate of message output or using highly visible
entities as amplifiers (such as celebrity “influencers”) lends a “stickiness” factor to the
message being disseminated, making it more contagious and giving it a more
lasting impact, an idea popularized in Malcolm Gladwell’s book The Tipping Point.
Beyond conventional approaches, artificial intelligence, when coupled with
algorithms designed to influence targeted audiences, can generate realistic fake news,
social media posts, and other content that manipulates public opinion, confuses
adversaries, creates negative or positive sentiment, influences networks, or diverts a
population’s attention, at a pace that makes it difficult for governments, military
forces, and news outlets to verify or confirm whether an image or recording is fake.