Lessons learned in the process of conducting the verification and validation of a live, virtual, and constructive distributed environment
To bring reality into models and simulations (M&S), the Department of Defense (DoD) combines constructive M&S with real equipment operated by humans in field environments. When such a live, virtual, and constructive distributed environment (LVC-DE) is assembled, there are ample opportunities for success or failure depending on many issues. Each M&S tool, along with the means used to connect it to the others, must be examined independently. The combined M&S, the interfaces, and the data they exchange must then be tested to confirm that the entire system is interoperable and achieves its intended goals. Verification and Validation (V&V) is responsible for systematically investigating, creating, and documenting the artifacts needed to assess the credibility of such an LVC-DE. The ultimate goal of V&V is to evaluate the capability, accuracy, and usability of the LVC-DE. The Battlespace Modeling and Simulation V&V Branch has extensive experience performing V&V of LVC-DEs. In a recent project, the task consisted of conducting V&V of the LVC-DE, the supporting infrastructure, and the legacy M&S tools. From a V&V perspective, many things were done correctly; however, several adjustments were necessary to improve the credibility of the LVC-DE. This paper discusses lessons learned during the implementation and provides recommendations for future LVC-DE applications.