FY 2012 Annual Report

INTRODUCTION
Since my confirmation as Director of Operational Test and Evaluation (DOT&E) in 2009, I have implemented initiatives to improve the quality of test and evaluation (T&E) within the Department of Defense. I have emphasized early engagement of testers in the requirements process, improving system suitability by designing reliability into systems from the outset, and integrating developmental, operational, and live fire testing. Implementing these initiatives has revealed the need for an additional area of focus – the requirement to incorporate statistical rigor in planning, executing, and evaluating the results of testing.
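To make this notion of statistical rigor concrete, consider how a test planner might size an operational test. The sketch below (an illustration of a standard binomial calculation, not an example taken from this report) computes the minimum number of failure-free trials needed to demonstrate a stated reliability at a stated confidence level:

    import math

    def zero_failure_trials(reliability: float, confidence: float) -> int:
        # Smallest n such that reliability**n <= 1 - confidence: if all
        # n trials succeed, the requirement is demonstrated at the stated
        # confidence (the classical zero-failure "success run" test).
        return math.ceil(math.log(1.0 - confidence) / math.log(reliability))

    # Illustrative numbers only: demonstrating 0.90 reliability at
    # 80 percent confidence requires 16 consecutive successful trials.
    print(zero_failure_trials(0.90, 0.80))  # -> 16

Calculations of this kind make the trade-space explicit: tightening the reliability requirement or the confidence level drives the number of test events, and therefore test cost and schedule, sharply upward.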
There are significant opportunities to improve the efficiency and the outcomes of testing by increasing interactions between the testing and requirements communities. In particular, there should be early focus on the development of operationally relevant, technically feasible, and testable requirements. In this Introduction, I discuss the crucial role the T&E community can and should play as requirements are developed. Additionally, I describe DOT&E efforts to institutionalize the use of statistical rigor as part of determining requirements and in T&E. I also provide an update on the Department's efforts to implement reliability growth planning and improve the reliability and overall suitability of our weapon systems. And lastly, I describe challenges and new developments in the area of software T&E.
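The reliability growth planning mentioned above rests on quantitative growth models fitted to test data. As a hedged illustration (the choice of model and all numbers here are assumptions, not figures from this report), the sketch below projects mean time between failures (MTBF) using the widely applied Duane growth model:

    def duane_mtbf(mtbf_initial: float, t_initial: float,
                   t: float, alpha: float) -> float:
        # Duane model: cumulative MTBF grows as a power law of test time,
        # MTBF(t) = MTBF_i * (t / t_i) ** alpha, where alpha is the growth
        # rate estimated from observed failure data.
        return mtbf_initial * (t / t_initial) ** alpha

    # Illustrative numbers only: 20 hours MTBF after the first 100 test
    # hours, with a growth rate of 0.3, projects to roughly 40 hours MTBF
    # after 1,000 cumulative test hours.
    print(round(duane_mtbf(20.0, 100.0, 1000.0, 0.3), 1))  # -> 39.9

A planned growth curve of this kind gives program managers a benchmark: if the MTBF observed at a test milestone falls below the curve, corrective action is needed before proceeding toward operational testing.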
Last year, I added a new section to my Annual Report assessing systems under my oversight in 2010 – 2011 with regard to problem discovery during testing. My assessment fell into two categories: systems with significant issues observed in operational testing that should, in my view, have been discovered and resolved prior to the commencement of operational testing, and systems with significant issues observed during early testing that, if not corrected, could adversely affect my evaluation of those systems' effectiveness, suitability, and survivability during Initial Operational Test and Evaluation (IOT&E). This year, I am providing an update to the status of those systems identified last year, as well as my assessment of systems under my oversight in 2012 within those two categories.
THE ROLE OF T&E IN REQUIREMENTS
There is an inherent and necessary link between the requirements and the test communities. The requirements community must state our fighting force's needs in the form of concrete, discrete capabilities or requirements. The testing community must then assess a system that is developed and produced to meet those requirements to determine whether it provides the military capability being sought; that is, we evaluate the system's operational effectiveness and suitability when used by our forces in combat. In my opinion, the collaboration required between the requirements and the test communities to discharge these responsibilities needs to be strengthened.
In my report last year, I discussed the Defense Acquisition Executive (DAE) independent assessment of concerns that the Department's developmental and operational test communities' approach to testing drives undue requirements, excessive cost, and added schedule into programs. The DAE assessment team "found no significant evidence that the testing community typically drives unplanned requirements, cost, or schedule into programs." However, they did note that there were four specific areas that needed attention:

"The need for closer coordination and cooperation among the requirements, acquisition, and testing communities; the need for well-defined testable requirements; the alignment of acquisition strategies and test plans; and the need to manage the tension between the communities."
The lack of critically needed collaboration among the technical, test, and requirements communities is not new. The 1986 Packard Commission found that success in new programs depends on "an informed trade-off between user requirements, on one hand, and schedule and cost, on the other." It therefore recommended creation of a new body representing both military users and acquisition/technology experts. This ultimately led to the creation of the Joint Requirements Oversight Council (JROC), which includes the military operators as formal members but includes, as advisors only, the acquisition and test communities. In 1998, the National Research Council (NRC) identified the need for greater interaction between the test and the requirements communities; the NRC pointed out that operational test personnel should be included in the requirements