CRS Report IF11150

https://crsreports.congress.gov
Updated November 14, 2022
Defense Primer: U.S. Policy on Lethal Autonomous Weapon Systems
Lethal autonomous weapon systems (LAWS) are a special
class of weapon systems that use sensor suites and
computer algorithms to independently identify a target and
employ an onboard weapon system to engage and destroy
the target without manual human control of the system.
Although these systems are not yet in widespread
development, it is believed they would enable military
operations in communications-degraded or -denied
environments in which traditional systems may not be able
to operate.
Contrary to a number of news reports, U.S. policy does not
prohibit the development or employment of LAWS.
Although the United States does not currently have LAWS
in its inventory, some senior military and defense leaders
have stated that the United States may be compelled to
develop LAWS in the future if U.S. competitors choose to
do so. At the same time, a growing number of states and
nongovernmental organizations are appealing to the
international community for regulation of or a ban on
LAWS due to ethical concerns.
Developments in both autonomous weapons technology and
international discussions of LAWS could hold implications
for congressional oversight, defense investments, military
concepts of operations, treaty-making, and the future of
war.
U.S. Policy
Then-Deputy Secretary of Defense Ashton Carter issued
DOD’s policy on autonomy in weapons systems,
Department of Defense Directive (DODD) 3000.09 (the
directive), in November 2012. U.S. defense officials have
stated that they plan to release an updated directive by the
end of 2022.
Definitions. There is no agreed definition of lethal
autonomous weapon systems that is used in international
fora. However, DODD 3000.09 provides definitions for
different categories of autonomous weapon systems for the
purposes of the U.S. military. These definitions are
principally grounded in the role of the human operator with
regard to target selection and engagement decisions, rather
than in the technological sophistication of the weapon
system.
DODD 3000.09 defines LAWS as “weapon system[s] that,
once activated, can select and engage targets without
further intervention by a human operator.” This concept of
autonomy is also known as “human out of the loop” or “full
autonomy.” The directive contrasts LAWS with human-
supervised, or “human on the loop,” autonomous weapon
systems, in which operators have the ability to monitor and
halt a weapon’s target engagement. Another category is
semi-autonomous, or “human in the loop,” weapon systems
that “only engage individual targets or specific target
groups that have been selected by a human operator.” Semi-
autonomous weapons include so-called “fire and forget”
weapons, such as certain types of guided missiles, that
deliver effects to human-identified targets using
autonomous functions.
The directive does not cover “autonomous or semi-
autonomous cyberspace systems for cyberspace operations;
unarmed, unmanned platforms; unguided munitions;
munitions manually guided by the operator (e.g., laser- or
wire-guided munitions); mines; [and] unexploded explosive
ordnance,” nor subject them to its guidelines.
Role of human operator. DODD 3000.09 requires that all
systems, including LAWS, be designed to “allow
commanders and operators to exercise appropriate levels of
human judgment over the use of force.” As noted in an
August 2018 U.S. government white paper, “‘appropriate’
is a flexible term that reflects the fact that there is not a
fixed, one-size-fits-all level of human judgment that should
be applied to every context. What is ‘appropriate’ can differ
across weapon systems, domains of warfare, types of
warfare, operational contexts, and even across different
functions in a weapon system.”
Furthermore, “human judgment over the use of force” does
not require manual human “control” of the weapon system,
as is often reported, but rather broader human involvement
in decisions about how, when, where, and why the weapon
will be employed. This includes a human determination that
the weapon will be used “with appropriate care and in
accordance with the law of war, applicable treaties, weapon
system safety rules, and applicable rules of engagement.”
To aid this determination, DODD 3000.09 requires that
“[a]dequate training, [tactics, techniques, and procedures],
and doctrine are available, periodically reviewed, and used
by system operators and commanders to understand the
functioning, capabilities, and limitations of the system’s
autonomy in realistic operational conditions.” The directive
also requires that the weapon’s human-machine interface be
“readily understandable to trained operators” so they can
make informed decisions regarding the weapon’s use.
Weapons review process. DODD 3000.09 requires that the
software and hardware of all systems, including lethal
autonomous weapons, be tested and evaluated to ensure
they