

Software Logistics Primer (For Training Purposes Only)
April 2010
Version 2.0

Table of Contents

Acknowledgements
Change Page
1.0 Introduction
  1.1 Purpose and Scope
  1.2 Background
2.0 What you need to [Know]
  2.1 Operational Definitions
  2.2 Software Acquisition and Management Policies (Source: AIR-4.1)
  2.3 Processes, Models and Frameworks
  2.4 Software Support Considerations
3.0 What you need to [Do]
  3.1 Solicitation Development and Proposal Review Considerations
  3.2 Metrics and Cost Estimating
  3.3 Other Logistics Planning Considerations
4.0 Where to [Go] for more information
  4.1 Training
  4.2 Best Practices
  4.3 Helpful Web Sites and Resources in Preparation for SOW or SOO Development
Appendix A – Software-Specific Considerations for the System Engineering Technical Reviews (SETRs) (Source: AIR-4.1)
Appendix B – Sample Cost Estimating Template
Appendix C – Program Related Engineering
Appendix D – Review of Funding Appropriations/Categories (Source: AIR-4.1)
Glossary of Acronyms

Acknowledgements

This Software Logistics Primer is the result of a collaborative team effort consisting of cross sections of Naval Air Systems Command (NAVAIR) and Defense Acquisition University (DAU). The people listed here contributed their expertise, time, and effort in developing a first-of-its-kind software logistics primer for the NAVAIR community:

Pall Arnason - NAVAIR 6.0T / NAWCAD 664
Lorenda Batson - NAVAIR 4.1
Cynthia Black - NAVAIR 4.512
Ken Bogdan - NAVAIR 4.14
Dr. William Conroy - DAU Mid-Atlantic
Matthew Cosgrove - NAVAIR 6.645
William Devlin - NAWCAD 6.71
John Johnston - NAVAIR 4.22
Stephen King - NAVAIR 4.1103
William Lankford - DAU Mid-Atlantic
Christobol Mendez - NAWCWD 6.633
Joe Moschler - DAU Mid-Atlantic
Gary Paramski - NAWCAD 6.64
Mark Roth - NAVAIR 4.1
George Shryock - NAVAIR 6.61
Ernest “Tad” Sylivant - NAWCAD 6.7 / Cherry Point SSA
Barbara Williams - NAVAIR 4.141
Jeanette Aley - NAWCAD 6.645

Please direct any questions about this document to Matthew Cosgrove, AIR-6.645, at 301.757.6624 or matthew.cosgrove@navy.mil.

Change Page

Revision 1, August 2008: Original.
Revision 2, April 2010: Update to prior Logistics Management for Software Student Pilot; update to DoD 5000.02; added information on the following topics: DO-178B; ARINC 653; Naval Data Distribution System (NDDS); Life Cycle Sustainment Plan (LCSP); update to Logistics Requirements Funding Summary (LRFS).

1.0 Introduction

1.1 Purpose and Scope

The role of software as the most critical part of weapons systems is growing. As an example, 80% of the functionality in modern-day aircraft, like the F-35 Joint Strike Fighter (JSF), is dependent on software.1 Consequently, software has become a major cost driver in weapons system development and life-cycle support. Logisticians play a major role in establishing and executing

the weapons systems support environment as well as estimating the support costs. Software growth will continue as technology advances, so it is critical for logisticians to have the tools necessary to make informed decisions relative to the support of software-intensive systems. This document will not make you an expert in software logistics, and it is not a how-to cookbook. This short primer is intended to be a knowledge and awareness builder with emphasis placed on what you need to [Know], what you need to [Do], and where you can [Go] for more information. This is a living document, which will be improved upon over time as NAVAIR builds its body of knowledge in this critical support area. It includes some fundamental principles necessary for software acquisition logistics planning and some pointers to sources of information that will enhance your ability to plan and execute software support.

1.2 Background

Nearly all DoD systems rely on software for their operation. Software touches virtually every facet of military systems, from the more common Information Technology (IT) systems to the less obvious "embedded" software-intensive systems. Because the vast majority of system functionality is now being implemented via software (vice hardware), the Assistant Secretary of the Navy for Research, Development and Acquisition (ASN(RD&A)) Software Acquisition Management Focus Team believes that all current systems should be considered "software-intensive" unless the Program Manager can explain why they are not. Therefore, there is no need to define what a software-intensive system is. Software is embedded in the aircraft, weapons, ground stations, and support equipment that the Naval Air Systems Command (NAVAIR) delivers to the fleet and supports throughout their life cycle. It adds tools and weapons capabilities that would likely not be possible otherwise. With the advent of software-driven Portable Electronic Maintenance Aides (PEMAs), diagnostics and

prognostics, and maintenance data collection systems, software is also an increasingly critical part of the maintenance environment. Like hardware, software support is the set of activities to ensure that software continues to fulfill its intended role in system operation.2 But there are important differences between hardware and software support. Software cannot be seen or touched in the physical sense, so there must be a greater reliance on and faith in highly-disciplined development and sustainment processes. Software logistics has many elements that together form the discipline of developing and writing the software and then providing it to the right place at the right time with the right training and documentation.3

2.0 What you need to [Know]

2.1 Operational Definitions

To adequately plan for software support, the acquisition logistician must be a value-added and integral part of the software acquisition processes described in later portions of this section. This requires a working knowledge of the following.

Note: This is not intended to be an exhaustive or completely detailed list, but represents key knowledge areas identified by NAVAIR Subject Matter Experts (SMEs). The authors of this document recommend gaining a strong working knowledge of these subject areas. The "Where to [Go] for more information" section provides a recommended reading list.

A. Capability Maturity Model Integration (CMMI®) - A management strategy that emphasizes continuous improvement rather than meeting a standard. Two different representations are available for the CMMI®, a continuous representation (previously used by the System Engineering CMM) and a staged representation (previously used by both the Software and Software Acquisition CMMs). The staged representation shows progress as a series of five levels. Each of these levels is described by certain attributes characterizing its level of competency. Each level is associated with process areas, and each process area is

described in terms of common practices that support that level's goals. The Software Engineering Institute (SEI) recently released CMMI® for Acquisition Version 1.2. It borrows many process areas from CMMI® for Development Version 1.2 and adds several new process areas focused on acquisition. These levels, descriptions, and process areas are defined further under processes, models, and frameworks later in this section.

B. Software Coding - The design must be translated into a machine-readable form. The code generation step performs this task. Programming tools like compilers, interpreters, debuggers, etc., are used to generate the code. Different high-level programming languages like C, C++, Pascal, and Java are used for coding. Also, auto-code generation tools are becoming more prevalent, as are languages such as C# (or C Sharp). These tools are intended to provide more error and boundary checking. For example, input buffers are checked to ensure they do not overflow; this mitigates a

large vulnerability in software.

C. Software Configuration Management - Configuration Management of software requires the same disciplined approach as hardware. The configuration items are usually referred to as Computer Software Configuration Items (CSCIs). It is important to note that CSCI documentation should include the architectural and detailed designs. Architectural design of each CSCI should include identification of the software units comprising the CSCI, their interfaces, and a concept of execution among them and the traceability between the software units and the CSCI. This should include all applicable items in the architectural design and traceability sections of the Software Design Description (SDD) Data Item Description (DID) (DIDs are identified in more detail in the next section). Depending on Contract Data Requirements List (CDRL) provisions, design pertaining to interfaces may be included in SDDs or in Interface Design Descriptions (IDDs). The CSCI detailed design

provides a description of each software unit. The result should include all applicable items in the detailed design section of the SDD DID. Depending on CDRL provisions, design pertaining to interfaces may be included in SDDs or in IDDs, and design of software units that are databases or that access or manipulate databases may be included in SDDs or in Database Design Descriptions (DBDDs). Further discussion about Software CM will be included in Section 2.51.

D. Software Operations and Support (O&S) Costs (software maintenance & modifications) - The labor, material, and overhead costs incurred after deployment in supporting the update, maintenance and modification, integration, and configuration management of software. Depot-level maintenance activities, government software centers, laboratories, or contractors may incur these costs. These costs include operational, maintenance, support and diagnostic software programs for the primary system, support equipment, and

training equipment. The respective costs of operating and maintaining the associated computer and peripheral equipment in the Software Support Activity (SSA) and the cost to conduct all testing of the software should also be included. The costs of major redesigns that change the system's mission are not included. Correction of Deficiencies includes the costs to develop, test, and deploy software changes that correct defects in defense systems. Software Enhancements include the costs to develop, test, and deploy software that enhances defense systems, and are included as long as those changes do not alter the basic mission of the system.4 O&S costs include, but are not limited to, the following cost elements:

• Training and training support
• Configuration management activities
• Software tracking tools and analysis
• Issuance and documentation of updates
• Licensing fees and accreditation
• Maintenance costs

E. Software Data Rights - The Government typically

receives only standard license rights to use that computer software, computer software documentation or technical data in certain limited ways and only if the proper data rights clauses are in your contract. These standard rights may or may not meet your needs. It is the responsibility of the contracting officer to put the proper data rights clauses in your contract, but it is your responsibility to provide the contracting officer with a complete assessment of your work effort. This assessment should include a determination of your present and projected future uses of the software or other deliverables. This assessment is called a “Data Rights Requirements Analysis” (DRRA) and should be conducted prior to contract award, taking into consideration such factors as multiple-site or shared-use requirements, and whether the Government’s software maintenance philosophy will require the rights to modify or have third parties modify the software. If the DRRA determines that the standard

data rights clauses do not provide sufficient rights to meet your needs and the future needs of the federal government, additional rights may be obtained through negotiations with the contractor, sometimes at an additional cost. These negotiations will be conducted for you by the contracting officer.5

F. Software Design - The software design phase falls between requirements definition and integration. As shown in the figure below, the inputs to the software design process are the software requirements. The software developed during the design phase is passed on to the integration and test phase. The design process involves an iterative series of progressively more detailed phases that eventually produce coded software modules ready for integration. Each of these phases is further defined below. Remember that there may be more activities or additional iterations of the three shown depending on the development life cycle used. For example, the spiral development model will go

through this process two or more times to arrive at the final product.

G. Software Documentation - The software development library should contain a controlled collection of software, documentation, other intermediate and final software products, and associated tools and procedures used to facilitate the orderly development and subsequent support of software.6

H. Software Maintenance - Modification of a software product after delivery to correct faults, to improve performance or other attributes, or to adapt the product to a modified environment.7 Software maintenance typically consists of the following activities:

• Corrective maintenance: reactive modification to correct discovered problems
• Adaptive maintenance: modification to keep software usable in a changed environment
• Perfective maintenance: modification to improve performance or maintainability
• Preventive maintenance: modification to detect and correct latent faults

I. Software Support Environment - A host computer system and other

related equipment and procedures located in a facility that provides a total support capability for the software of a target computer system (or a set of functionally and physically related target computer systems). The environment enables the performance of a full range of services including: performance evaluation, system and software generation, development and testing of changes, simulation, emulation, training, software integration, configuration management, and operational distribution for the software. There are typically two types of software support environments:

• Developmental Software Support Environment (DSSE) - Those approved contracting activity resources identified by a software contractor to be used to support the software requirements under the contracted efforts.
• Life Cycle Software Support Environment (LCSSE) - Those contracting activity resources used by the life cycle software support activity to provide total life cycle software support for assigned

target computer systems.8

J. Software Requirements Definition - Like hardware, software requirements define capabilities or conditions possessed by a system or a component needed to solve a problem or achieve an objective. They can be divided into two categories, functional and nonfunctional. Functional requirements describe what a system should do in response to a specific stimulus. Nonfunctional requirements include performance and system constraints that affect development and design.

K. Software Testing (Verification and Validation) - Planning for verification and validation of software begins at the onset of development. During verification, the work product (the completed portion of the software being developed and its accompanying documentation) is reviewed and examined by one or more persons in order to discover defects. This process helps to prevent potential bugs, which may cause failure of the project. Verification and validation processes go hand in hand, but the validation

process starts after the verification process ends (after coding of the product ends). Each verification activity (Requirement Specification Verification, Functional Design Verification, etc.) has its corresponding validation activity (Functional Validation/Testing, Code Validation/Testing, System/Integration Validation, etc.). During the validation process, test plans, test suites, and test cases are developed, which are used during the various phases of validation. The phases involved in validation are: Code Validation/Testing, Integration Validation/Integration Testing, Functional Validation/Functional Testing, and System/User Acceptance Testing/Validation.

L. Software Support Activity (SSA) - A Software Support Activity assumes the role of providing post-deployment life cycle support for modifications or upgrades made to a system's software following the system's initial fielding. System modifications and upgrades include multi-system changes, block changes, preplanned product

improvements, repair of deficiencies reported by the user, and other types of system change packages. The SSA organization typically compiles these needed updates into formal software releases to avoid disrupting the fielded system. Software development activities performed by an SSA in providing life cycle support are the same as those carried out during the development effort that led to the first fielding. They are tailored, as appropriate, to reflect the effort required to implement each change package, update pertinent documentation, verify the changes, and distribute the changes to users.

M. Software Reusability - Because of the tremendous cost involved in developing new software, there is persistent pressure to reuse software. Software reuse is not a panacea and must be considered carefully using well thought out risk assessment criteria. The following are examples of specific criteria, but should not be considered an exhaustive list (a simple scoring sketch follows the list). This is also a good list of considerations

for any software product, regardless of whether it is reused or newly coded:

• Ability to provide required capabilities and meet required constraints
• Ability to provide required safety, security, and privacy
• Reliability/maturity, as evidenced by established track record
• Testability
• Accessibility, ensure that the software accommodates the widest range of users including those with disabilities
• Transportability, as evidenced by flexibility to run on multiple operating systems such as Linux, Unix, Microsoft Windows
• Interoperability with other system and system-external elements
• Fielding issues, including:
  - Restrictions on copying/distributing the software or documentation
  - License or other fees applicable to each copy
• Maintainability, including:
  - Likelihood the software product will need to be changed
  - Feasibility of accomplishing that change
  - Availability and quality of documentation and source files
  - Likelihood that the current

version will continue to be supported by the supplier
  - Impact on the system if the current version is not supported
  - The acquirer's data rights to the software product
  - Warranties available
  - Short and long-term cost impacts of using the software product
  - Technical, cost, and schedule risks and tradeoffs in using the software product
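To make the comparison of reuse candidates against criteria like these more repeatable, some teams use a simple weighted scoring sheet. The sketch below is one illustrative way to do that in Python; the criterion names are paraphrased from the list above, but the weights, scores, and threshold are hypothetical values invented for the example, not NAVAIR guidance.

```python
# Illustrative reuse risk-assessment scoring sheet (weights and threshold are
# hypothetical, not NAVAIR policy). Each criterion is scored 0 (poor) to 5
# (excellent) by the evaluating team; the weighted total supports, but does
# not replace, an engineering judgment about whether to reuse the software.

CRITERIA_WEIGHTS = {
    "required_capabilities_and_constraints": 5,
    "safety_security_privacy": 5,
    "reliability_maturity_track_record": 4,
    "testability": 3,
    "accessibility": 2,
    "transportability": 3,
    "interoperability": 4,
    "fielding_issues": 2,
    "maintainability": 4,
    "data_rights": 4,
    "warranties": 1,
    "cost_impacts": 3,
    "technical_cost_schedule_risk": 4,
}

def score_candidate(scores: dict) -> float:
    """Return a weighted percentage score for one reuse candidate."""
    max_total = 5 * sum(CRITERIA_WEIGHTS.values())
    total = sum(CRITERIA_WEIGHTS[name] * scores.get(name, 0)
                for name in CRITERIA_WEIGHTS)
    return 100.0 * total / max_total

if __name__ == "__main__":
    # Example: a mature COTS component with unresolved data rights questions.
    candidate = {name: 4 for name in CRITERIA_WEIGHTS}
    candidate["data_rights"] = 1
    candidate["fielding_issues"] = 2
    pct = score_candidate(candidate)
    print(f"Weighted reuse suitability: {pct:.1f}%")
    print("Recommend detailed risk review" if pct < 75 else "Proceed to detailed evaluation")
```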

2.2 Software Acquisition and Management Policies (Source: AIR-4.1)

The intent of this section is to make the reader aware of some of the more recognized policies, regulations, and laws that govern the acquisition and sustainment of software, including software architecture. It is by no means exhaustive, but can serve as a quick reference. For more specific detail, bookmark the Defense Acquisition University (DAU) website and visit it frequently as you develop your own reference library of relevant policy documents: http://www.dau.mil/ Information regarding software acquisition may also be found in the NAVAIR Acquisition Guidebook (Engineering Disciplines, Sections G and J): http://nawctsd.navair.navy.mil/Resources/Library/Acqguide/Acqguide.htm

DoD software acquisition and sustainment policies are derived from a number of laws and regulatory documents that include specific direction and guidance relative to:

• Developing software system architectures that support open system concepts – for improved maintenance
• Exploiting commercial off-the-shelf (COTS) computer systems products – to reduce the cost for acquiring NAVAIR systems
• Allowing for incremental improvements using modular, reusable, extensible software – to improve agility
• Identifying and exploiting software reuse opportunities – to reduce costs and errors
• Planning for software life cycle support and acquiring needed documentation and tools – to ensure that we can meet the warfighter's needs during operation and support

2.2.1 Public Law Relative to Software Acquisition

A. Clinger-Cohen Act of 1996 (now referred to as

Title 40/CCA)
• Governs the acquisition of software-intensive systems within the Federal Government
• Seven requirements:
  - Capital Planning and Investment Control Process Content
  - Performance and Results Based Management
  - Appointment of Chief Information Officers (CIO)
  - Investment Accountability
  - National Institute of Standards and Technology (NIST)
  - Modular Contracting
  - National Security Systems (NSS)

B. Government Performance and Results Act (GPRA)
• Requires agencies to conduct strategic planning and use performance measurements to assess progress

C. Section 804, National Defense Authorization Act of FY 2003
• Mandates that the Secretaries of the Army, Navy, and Air Force, and heads of Defense Agencies, establish Software Acquisition Process Improvement Programs (SAPIPs)
• Documents processes, criteria to determine performance effectiveness, and metrics for performance measurement and mechanisms to ensure adherence that are consistent with the Defense Acquisition System

D. Federal Information Security Management Act (FISMA)
• Places requirements on government agencies and components to improve the security of federal information and information systems

2.2.2 Key Federal Architecture Requirements and Policies

A. Office of Management and Budget (OMB) Memo 97-16
• Aligns requirements with processes that support agencies' missions and goals; adequate interoperability, redundancy, and security of information systems; and application and maintenance of a collection of standards by which an agency evaluates and acquires new systems

B. Federal Acquisition Regulation (FAR) Part 39, Acquisition of Information Technology
• Regulates exchange or sale of Information Technology (IT), security and privacy for computer systems, standards, acquisition of Automated Data Processing (ADP) by contractors, acquisition of telecommunications services, and other appropriations issues

C. OMB Memo 05-23
• For all major IT projects within the federal government, this OMB memo requires the use of performance measurement baselines, the assignment of qualified project managers for each project, and the use of an Earned Value Management System (EVMS)

D. Chairman of the Joint Chiefs of Staff Instruction (CJCSI) 3170.01C
• Requires the development of integrated architecture products for supporting acquisition documentation:
  - Joint Capability Integration and Development System (JCIDS)
  - Information Support Plans (formerly called C4I Support Plans)
  - Capability gap and redundancy analysis

E. CJCSI 6212.01C
• Requires that architecture products be used in the J-6 interoperability and supportability certification process
• Specifies which architecture products are required for the Initial Capabilities Document (ICD), Capability Development Document (CDD), and Capability Production Document (CPD)

F. Secretary of the Navy Instruction (SECNAVINST) 5000.2C
• Establishes Department of the Navy (DoN) acquisition policy
• Directs

the Program Executive Officer (PEO)/Program Manager (PM) to develop mission integrated architectures in support of the CDD/CPD process
• ASN (RD&A) Chief Engineer (CHENG) to assist the Requirements Officers (RO) and PMs in the development of operational and system architectural views

2.3 Processes, Models and Frameworks

When used together, the following key processes, models, and frameworks provide for efficiency and effectiveness of software acquisition, development and support. They are based on best practices throughout industry, government, and academia and are designed to work with each other:

• Basic systems engineering
• Institute of Electrical and Electronics Engineers, Inc./Electronic Industries Alliance (IEEE/EIA) 12207 software life cycle processes
• DO-178B Software Considerations in Airborne Systems and Equipment Certification
• ARINC 653 Avionics Application Standard Software Interface
• Capability Maturity Model Integration (CMMI®) best practices framework
• Systems Engineering Technical Review (SETR) acquisition framework

2.3.1 Basic Systems Engineering for Software

The process of software engineering follows the elements of the systems engineering process. Systems engineering for software involves the following five functions:

• Problem definition
• Solution analysis
• Process planning
• Process control
• Product evaluation

These functions are manifested in the following life cycle processes which, other than planning, are defined in greater detail in the operational definitions section:

• Software Development Planning
  - Define customer needs and required functionality
  - Plan the project
• Software Requirements Analysis
  - Document the requirements
  - Validate and baseline requirements
• Software Design
  - Develop the design based on the requirements
  - Validate that the design meets all requirements and a baseline is established
  - Produce the item based on the design
• Software

testing/validation
  - Perform system validation
  - Ensure all requirements are met and the required functionality performs as expected

[Figure: IEEE 12207 Software Engineering Model]

2.3.2 IEEE/EIA 12207 Software Life Cycle Processes

The IEEE/EIA 12207 process framework is considered the Navy standard for software life cycle management. It is a common framework/structure that supports the life cycle of software from conceptualization through retirement. It was designed to support the functions of software acquisition, development, supply (or deployment), as well as operations and maintenance. These functions are supported through rigorous quality assurance, configuration management, joint review, audit, verification, validation, problem resolution, and documentation processes. The 12207 framework is also designed to:

• Manage and improve the organization's processes and personnel
• Establish software management and engineering environments based upon the life cycle processes as

adapted and tailored to serve business needs
• Foster improved understanding between customers and vendors and among the parties involved in the life cycle of a software product

IEEE/EIA 12207 is packaged in three parts:

• IEEE/EIA 12207.0, Standard for Information Technology-Software Life Cycle Processes: Contains International Organization for Standardization/International Electrotechnical Commission (ISO/IEC) 12207 in its original form and six additional annexes (E through J): Basic Concepts, Compliance, Life Cycle Process Objectives, Life Cycle Data Objectives, Relationships, and Errata (errors)
• IEEE/EIA 12207.1, Guide for ISO/IEC 12207, Standard for Information Technology-Software Life Cycle Processes-Life Cycle Data: Provides additional guidance on recording life cycle data
• IEEE/EIA 12207.2, Guide for ISO/IEC 12207, Standard for Information Technology-Software Life Cycle Processes-Implementation Considerations: Provides additions, alternatives, and clarifications to the ISO/IEC 12207 life cycle processes as derived from United States practices

A model of the resultant life cycle processes is depicted in the figure below, which is followed by a description of key products resulting from the processes.

[Figure: IEEE/EIA 12207 Process Framework. Primary life cycle processes: Acquisition, Supply, Development, Operation, Maintenance. Supporting life cycle processes: Documentation, Configuration Management, Quality Assurance, Verification, Validation, Joint Review, Audit, Problem Resolution. Organizational life cycle processes: Management, Infrastructure, Improvement, Training.]

There are a number of key products resulting from employment of the processes within the 12207 framework:

• Development Process Plan
  - Defines the objectives, standards, and software life cycle models to be used in the software development processes
  - We (NAVAIR) typically do a Software Development Plan (SDP) instead
• Project Management Plan
  - Defines the technical and managerial

processes necessary to satisfy project requirements
  - Organization structure, plan for managing project, engineering environment, quality assurance, configuration management, work breakdown structure, etc. (can be in SDP)
• Software Requirements Description (SRD)
  - Specifies the requirements for a software item and the methods to be used to ensure that each requirement has been met
  - Used as the basis for architecture, design, and qualification testing of a software item
• Software Architecture Description (SAD)
  - Describes the software item-wide design decisions and the software item architectural design. It includes, but is not limited to:
    - Software architecture general description
    - Software component definition
    - Identification of software requirements allocated to each software component
    - Software component concept of execution
    - Resource limitations and the strategy for managing each resource and its limitation
• Software Design Description (SDD)
  - Describes the design of a software item and how the software item satisfies the software requirements, including algorithms and data structures. It typically includes, but is not limited to:
    - Software item input/output description
    - Static relationships of software units
    - Concept of execution, including data flow and control flow
    - Requirements traceability: software component-level requirements traceability and software unit-level requirements traceability (a simple mechanical check is sketched after this list)
    - Rationale for software item design
    - Reuse element identification

• Software Interface Design Description (SIDD)
  - Describes the interface characteristics of one or more systems, subsystems, hardware items, software items, manual operations, or other system components
  - May describe any number of interfaces
  - We (NAVAIR) tend to use an Interface Control Document instead
  - Includes a test plan with the following attributes:
    - Plans for testing
    - Environment to be used
    - Schedule of events
    - Resources to include personnel, labs, platforms, and ranges
• Test Procedures
  - Describes the test preparations, test cases, and test procedures to be used to perform testing of a software item, system, or subsystem
  - Enables the acquirer to assess the adequacy of the testing
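Because the SDD is expected to show both component-level and unit-level requirements traceability, it can help to check a delivered traceability matrix mechanically. The short sketch below is a hypothetical illustration (the requirement, component, and unit names are invented); it simply flags requirements that do not trace to at least one software component and one software unit.

```python
# Hypothetical traceability check: every requirement should trace to at least
# one software component and at least one software unit. Names are invented
# for illustration; a real matrix would come from the SRD/SDD deliverables.

requirements = ["SRS-001", "SRS-002", "SRS-003"]

req_to_components = {
    "SRS-001": ["Navigation"],
    "SRS-002": ["Display", "Navigation"],
    # SRS-003 intentionally has no component allocation in this example
}

req_to_units = {
    "SRS-001": ["nav_filter.c"],
    "SRS-002": ["hud_render.c"],
    "SRS-003": [],
}

def untraced(reqs, matrix):
    """Return the requirements with no entries in the given traceability matrix."""
    return [r for r in reqs if not matrix.get(r)]

missing_components = untraced(requirements, req_to_components)
missing_units = untraced(requirements, req_to_units)

print("Requirements lacking component-level traceability:", missing_components)
print("Requirements lacking unit-level traceability:", missing_units)
```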

2.3.3 Software Considerations in Airborne Systems and Equipment Certification (DO-178B)

This guidance for software development is accepted by the Federal Aviation Administration (FAA) as a means of certifying software in avionics. DO-178B is primarily concerned with the development processes. As a result, certification to DO-178B requires delivery of multiple supporting documents and records. The quantity of items needed for DO-178B certification, and the amount of information that they must contain, is determined by the level of certification being sought. The certification level is either A, B, C, D, or E. Correspondingly, these DO-178B levels describe the consequences of a potential failure of the software. The Design Assurance Level (DAL) is determined from the safety assessment process and hazard analysis by examining the effects of software failure in the system. The analysis categories are as follows:

• A. Catastrophic – failure may cause a crash
• B. Hazardous – failure has a large negative impact on safety or performance, or reduces the ability of the crew to operate the plane due to physical distress or a higher workload, or causes serious or fatal injuries among the passengers
• C. Major – failure is significant, but has a lesser impact than a hazardous failure (for example, leads to passenger discomfort rather than injuries)
• D. Minor – failure is noticeable, but has a lesser impact than a major failure (for example, causing passenger inconvenience or a routine flight plan change)
• E. No effect – failure has no impact on safety, aircraft operation, or crew workload
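The relationship between the failure condition categories above and the DO-178B software levels can be captured in a small lookup. The sketch below simply encodes the mapping stated in the list; it is an illustration of the classification only, not a substitute for the formal safety assessment and hazard analysis.

```python
# DO-178B software level by failure condition category (per the list above).
# Illustrative helper only; the level is actually assigned through the safety
# assessment process, not a table lookup.

FAILURE_CONDITION_TO_LEVEL = {
    "catastrophic": "A",
    "hazardous": "B",
    "major": "C",
    "minor": "D",
    "no effect": "E",
}

def software_level(failure_condition: str) -> str:
    """Return the DO-178B level associated with a failure condition category."""
    return FAILURE_CONDITION_TO_LEVEL[failure_condition.strip().lower()]

if __name__ == "__main__":
    for condition in ("Catastrophic", "Major", "No effect"):
        print(f"{condition}: Level {software_level(condition)}")
```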

2.3.4 Avionics Application Standard Software Interface (ARINC 653)

ARINC 653 is a software specification for space and time partitioning. It defines an Application Programming Interface (API) for avionics software following the architecture of Integrated Modular Avionics (IMA). It is part of the ARINC 600-Series Standards for Digital Aircraft and Flight Simulators. ARINC 653 defines an Application Executive (APEX) for space and time partitioning that may be used wherever multiple applications need to share a single processor and memory, in order to guarantee that one application cannot bring down another in the event of application failure. Logisticians need to realize the importance of the ARINC 653 specification. Applications that use the ARINC 653 API can be more easily ported from one ARINC 653 operating system to another than those that do not. This standard interface promotes software sustainability. If software is developed to the ARINC 653 specification, it can be easily used on other ARINC 653 operating systems, promoting commonality and transportability.
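The idea behind time partitioning can be illustrated without the APEX API itself: the operating system cycles through a fixed schedule of partition time windows, so an overrun or fault in one application cannot consume another partition's processor time. The sketch below is a purely conceptual illustration in Python (the partition names and window lengths are invented); it is not ARINC 653 code and does not use the APEX services.

```python
# Conceptual illustration of time partitioning (NOT the ARINC 653 APEX API).
# A fixed major frame is divided into partition windows; each application runs
# only inside its own window, so a misbehaving partition cannot steal time
# from the others. Partition names and budgets are invented for this example.

MAJOR_FRAME = [            # one repeating schedule, durations in milliseconds
    ("FlightControl", 20),
    ("Navigation", 10),
    ("Display", 10),
    ("Maintenance", 5),
]

def run_partition(name: str, budget_ms: int) -> None:
    # Stand-in for dispatching the partition's processes; a real ARINC 653
    # operating system enforces the time budget itself.
    print(f"  {name}: executing for at most {budget_ms} ms")

def run_major_frames(count: int) -> None:
    """Cycle through the fixed schedule; the order and budgets never change."""
    for frame in range(count):
        print(f"Major frame {frame}")
        for name, budget_ms in MAJOR_FRAME:
            run_partition(name, budget_ms)

if __name__ == "__main__":
    run_major_frames(2)
```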

2.3.5 Capability Maturity Model Integration® (CMMI®)

For the uninitiated, the following section will provide some top-level concepts that must be explored further to gain a working knowledge of CMMI® and its critical importance for the development and support of software. As of this writing, CMMI® Version 1.2 is the latest offering. In order to gain a working knowledge of CMMI® and its implications for your program, bookmark the Software Engineering Institute (SEI) CMMI® website and visit it often: http://www.sei.cmu.edu/cmmi/

Capability Maturity Model® Integration (CMMI®) is a process improvement approach that provides organizations with the essential elements of effective processes. It can be used to guide process improvement across a project, a division, or an entire organization. CMMI® helps to integrate traditionally separate organizational functions, set process improvement goals and priorities, provide guidance for quality processes, and provide a point of reference for appraising

current processes.9 CMMI® is a framework that consists of best engineering practices to address development and maintenance of products and services covering the entire product life cycle from initial concept through delivery and maintenance. CMMI® is internationally recognized and within the United States is utilized by industry, the government, and academia. According to the Software Engineering Institute, CMMI® supports the premise that "the quality of a system or product is highly influenced by the quality of the process used to develop and maintain it." In terms of benefits, CMMI®:

• Embodies software engineering, systems engineering, and acquisition in a single framework
• Is essentially a process improvement model
• Provides an integrated approach to software and systems engineering as a means for reaching business objectives:
  - Improve quality
  - Reduce costs
  - Optimize schedules
  - Gauge how well the process improvement program is working
• Provides methodologies and guidelines for the establishment and improvement of processes, supporting efficiency and effectiveness
• Serves as a standard by which software development and support is measured for improvement initiatives
• Enhances consistency

2.3.5.1 CMMI® for Development (Version 1.2)

CMMI® for Development (Version 1.2) contains 22 process areas divided into 4 categories and focuses on process area capability as measured by capability levels (known as the Continuous Representation):

• Process management (5)
• Project Management (6)
• Engineering (6)
• Support (5)

The same 22 process areas are broken out into 5 maturity levels (1, the lowest, through 5, the highest) and focus on organizational maturity as measured by maturity levels (known as the Staged Representation):

• Initial
• Managed (7)
• Defined (11) – *minimum required level
• Quantitatively managed (2)
• Optimizing (2)

The maturity levels and associated key process areas are depicted in the figure

below.

* In 2006, Dr. Delores Etter, then ASN (RD&A), issued a memorandum requiring all Requests for Proposal (RFPs) for Software Development to include contract language that provides the Navy with confidence that software integrators and development contractors have well-documented, standardized processes as well as continuous software improvement practices, equivalent to that articulated by CMMI® Level III.

2.3.5.2 CMMI® for Acquisition (Version 1.2)

The CMMI® for Acquisition works with Version 1.2 of CMMI® for Development. All CMMI® Version 1.2 processes for Acquisition focus on the activities of the acquirer:

• 16 process areas are CMMI® Model Foundation (CMF) process areas that cover:
  - Process management
  - Project management
  - Support process areas
• 6 process areas focus on practices specific to acquisition:
  - Agreement management
  - Acquisition requirements development
  - Acquisition technical management
  - Acquisition validation
  - Acquisition verification
  - Solicitation and supplier agreement development

2.3.5.3 CMMI® Lessons Learned

The program logistician should have a working knowledge of the CMMI® construct and application in order to gain insight into the process maturity of developers and acquirers and adequately assess potential supportability risks. A key lesson learned is that while a whole division or company may have a CMMI® staged representation rating, NAVAIR has experienced problems with individual teams not working at that level. Consequently:

• Good artifacts for one project may overshadow a lack of, or poor, artifacts from another
• Your team may have been at early stages and had no artifacts for later stages
• The division or company may have a CMMI® Level 3 or higher, but your team may be working at Level 1
• The team may not be using the processes that received the rating
• The team may have been developed from other teams that did not participate in the rating

It is also important to note that CMMI® ratings

are good for three years and then need to be assessed again, preferably by an independent assessor.

2.3.6 System Engineering Technical Review (SETR)

The Systems Engineering Technical Review policy and process is described in NAVAIRINST 4355.19D. The SETR was developed by NAVAIR and is being adapted for use within the entire DoD. It includes an iterative program timeline that maps the technical reviews to the acquisition process described in DoD 5000 documentation. The SETR applies to all personnel supporting all NAVAIR and Aviation Program Executive Officer (PEO) programs involved with the design, development, test and evaluation, acquisition, in-service support, and disposal of naval aviation weapon systems and equipment.10

2.3.6.1 SETR Timing and Software Specific Reviews

The SETR reviews are timed and sequenced in concert with the phases of the program:

• Materiel Solution Analysis – ends with Milestone A
• Technology Development – ends with Milestone B
• Engineering and Manufacturing Development – ends with Milestone C
• Production & Deployment – ends with Initial Operating Capability (IOC)
• Operations & Support – ends with System Disposal

Software considerations are integral to every SETR (please see Appendix A). Specific details of each SETR can be located at https://nserc.navy.mil or on the DAU Knowledge Sharing website at https://acc.dau.mil/CommunityBrowser.aspx?id=22513 It is highly recommended that you visit one of these two sources to review the SETR processes and checklists. You will find that each checklist contains a logistics and software radio button near the top of the checklist that will hyperlink to those respective areas of the checklist. The SETR process has been updated in DoD INST 5000.02. The following chart depicts the latest roadmap:

[Figure: Defense Acquisition Management System. Roadmap mapping the SETR reviews to the acquisition phases (Materiel Solution Analysis, Technology Development, Engineering and Manufacturing Development, Production & Deployment, Operations & Support) and milestones A, B, and C. Reviews shown: Initial Technical Review (ITR), Alternative Systems Review (ASR), Systems Requirements Review (SRR), System Functional Review (SFR), Preliminary Design Review (PDR), Post-PDR Assessment (Post-PDRA), Critical Design Review (CDR), Post-CDR Assessment (Post-CDRA), Test Readiness Review (TRR), System Verification Review (SVR), Functional Configuration Audit (FCA), Production Readiness Review (PRR), Operational Test Readiness Review (OTRR), Physical Configuration Audit (PCA), Technology Readiness Assessment (TRA), In-Service Review (ISR).]

2.4 Software Support Considerations

Hardware support activities are typically dominated by corrective maintenance (fixing and/or replacing a failed part) and preventive

maintenance (to prolong service life). When hardware fails, the repair person replaces the failed part with an identical but functioning part. When software fails, the software engineer does not replace the offending code with an identical piece of code. The code must be modified to function correctly.11 Like hardware, you must plan for fixing software; unlike hardware, fixing defects typically does not dominate software support activities. In all, software has eight drivers of change: defect corrections, threats, [policy or] doctrine, safety, interoperability, hardware changes, technology insertion, and functional changes.12 As a result, there are four primary sustainment activities in DoD systems that dominate software support, which must be planned for in your support strategy:

• Improving performance and other attributes*
• Adapting software to new hardware
• Adding features and functions to the software to respond to new user requirements and/or threat environments
• Improving efficiency and reliability13

These sustainment activities are manifested in the maintenance activities defined in the beginning of this chapter and repeated here:

• Corrective maintenance: reactive modification to correct discovered problems
• Adaptive maintenance: modification to keep software usable in a changed environment
• Perfective maintenance: modification to improve performance or maintainability
• Preventive maintenance: modification to detect and correct latent faults

* Important Note: The Office of the Secretary of Defense (OSD) does not currently include this in Operations and Support (O&S) costs, but it is under review for inclusion in O&S cost estimating. Currently, there must be a distinction between significant performance enhancement modifications and those related to reliability and safety improvements. The latter fall into the O&S cost realm.14
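One way to use the eight drivers of change and the four maintenance categories together in support planning is to tag each incoming change request with its driver and roll the requests up by maintenance category. The mapping shown below is one plausible interpretation offered only for illustration (the primer lists the drivers and categories but does not prescribe a mapping), and the change-request data is invented.

```python
# Illustrative roll-up of change requests by maintenance category. The
# driver-to-category mapping below is an assumption made for this example,
# not a mapping defined in this primer or in NAVAIR policy.

DRIVER_TO_CATEGORY = {
    "defect corrections": "corrective",
    "hardware changes": "adaptive",
    "technology insertion": "adaptive",
    "interoperability": "adaptive",
    "threats": "perfective",
    "policy or doctrine": "perfective",
    "functional changes": "perfective",
    "safety": "preventive",
}

def summarize(change_requests):
    """Count change requests per maintenance category."""
    totals = {"corrective": 0, "adaptive": 0, "perfective": 0, "preventive": 0}
    for cr_id, driver in change_requests:
        totals[DRIVER_TO_CATEGORY[driver]] += 1
    return totals

if __name__ == "__main__":
    sample = [("CR-101", "defect corrections"),
              ("CR-102", "hardware changes"),
              ("CR-103", "functional changes"),
              ("CR-104", "safety")]
    print(summarize(sample))
```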

2.4.1 Software Replication, Distribution, Installation, and Training (RDIT) Process – Logistics Elements

Software maintenance activities require modification of the software, which then must be distributed to the fleet. The RDIT process – originally developed by the Army – offers key planning considerations for logisticians to consider in developing their support strategy. The planning and implementation of RDIT must begin with the emerging baseline software product and requires a thorough review of the Concept of Operations (CONOPS). It is important to note that while each Integrated Logistics Support (ILS) Element may not serve a specific function in software acquisition, when developing and/or fielding a software program, the logistician is responsible for ensuring that each element is directly or indirectly accounted for. When evaluating RDIT support considerations, one should consider the provisions of the Assistant Program Manager for Logistics (APML) Logistics Handbook and more specifically, the support requirements of the Life Cycle Sustainment Plan (LCSP) (formerly the Acquisition

Logistics Support Plan (ALSP)):

• Maintenance Planning
  - A traditional maintenance plan specifically for software might not apply; however, the discipline of maintenance planning, that is, the analysis of what failures could occur, what skill level is needed to fix the projected failures, etc., still applies to the software. Software maintenance is, in essence, a change to the software as the code is being re-written. Despite the fact that an O-Level technician will probably not be performing maintenance on software, the logistician is still responsible for planning for the life cycle support of the software.

• Supply Support
  - Any hardware requirements for housing, loading, recording, transferring, and storing of software data must be planned for to ensure the end user has all equipment necessary to perform all functions required of the software program.

• Technical Data
  - Technical data packages and manuals – drawings, specs, standards, performance requirements, packaging, operational and maintenance documents – are required when determining types of interfaces required to integrate a software system. Based on the sustainment strategy, it must be considered whether data rights and software source code should be procured, or whether open source software is available, to minimize costs of licensing fees with vendors.

• Computer Resources Support
  - Software Support Activities usually drive the need for Computer Resources for a software product. For example, a Software Integration Lab (SIL) may be required to replicate aircraft avionics and aid in troubleshooting and testing any software changes.

• Packaging, Handling, Storage & Transportation (PHS&T)
  - PHS&T planning specifically for software might not be applicable, but there are some things to consider. Transportation of software might be done via Personal Computer Memory Card International Association (PCMCIA) card. Although the PCMCIA card would not be PHS&T for the software, it is beneficial to think about how software will be provided to the end user. If software is field loadable, the Support Equipment element might be more applicable. PHS&T planning is imperative to ensure that all hardware systems that may be loaded with software are not exposed to any environment (electromagnetic interference, vibration, temperatures, humidity, shock, etc.) that would damage the software.

• Support Equipment (SE)
  - Software products may not require any SE, but SE can certainly enable supportability of a software system. For example, common SE such as the Memory Loader Verifier Set (MLVS) and the Next Generation Software Loader (NGSL) are critical to minimizing sustainment cost. Most likely, the software within a Weapons Replaceable Assembly (WRA) will change several times in its life cycle. The least expensive, fastest method of updating the software is loading the new software at the Organizational Level of maintenance using Common Support Equipment (CSE).

• Manpower & Personnel
  - Manpower requirements determination requires that the proper personnel are employed to operate, maintain, and support the software, and to train the end users of the software system.

• Training & Training Support
  - Training and Training Support is required for hardware devices, services, media, materials, curriculum, manuals, facilities, systems, instructors, and support personnel. End users of software systems may need training in the form of on-the-job training, formal classroom training, computer-based training, or other methods. The training methods and systems required for training are as imperative to software systems as to hardware systems.

• Facilities
  - Facilities planning is critical to hardware and software in the areas of ship integration, buildings, power, chambers, test equipment, library, and equipment cooling. Facilities may require special temperature or humidity controls to eliminate the possibility of interference with or damage to hardware, which could result in software failure. Under Computer Resources Support, a Software Integration Lab was mentioned. Labs or other facilities required to test, operate, and/or maintain the software product need to be considered when considering this ILS element.

• Design Interface
  - Reliability, maintainability, survivability, standardization, interoperability, transportability, human factors engineering, and safety must be addressed for software products. When attempting to influence the design of a software system/product, the logistician should emphasize standardization (such as ARINC 653) that allows for transportability of the software from one environment to another.

• Configuration Management (CM)
  - The same principles and discipline are in effect for software configuration management as for hardware configuration management. A major challenge that a logistician will face with software CM is the Operational Flight Program (OFP). In one scenario, the

OFP, the embedded software within a WRA, changes but the hardware configuration of the WRA remains the same, thus modifying the system's configuration. This leads to the possibility of supply support issues in which the WRA part number may remain the same, but the end user receives a WRA with an incompatible software load (see the sketch at the end of this element's discussion). As was discussed in the Support Equipment element, the logistician must plan for the OFP to change during the life cycle of a WRA. Allowing for software loading at the O-Level or I-Level of maintenance using Common Support Equipment promotes cost-effective software sustainment. Also, CM becomes easier to manage. If the hardware configuration of the WRA remains constant, and just the OFP/software changes via software loading at the O/I-Level, the part number of the WRA can remain the same. The primary method of distributing software to the Fleet is via the Naval Data Distribution System (NDDS). NDDS provides the war fighter with a worldwide web accessible,

single source – for authorized software, related products, and intelligence user data files up to SECRET classification – regardless of customer location. NDDS will standardize best practices for data delivery within the NAVAIR community and then expand to other SYSCOM communities. https://ndds.navair.navy.mil
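The supply support scenario described above (a WRA whose part number is unchanged while its embedded OFP has been updated) lends itself to a simple configuration check before an asset is issued or installed. The sketch below is hypothetical; the part numbers, aircraft configurations, and OFP identifiers are invented, and it only illustrates the kind of check a CM or supply system might perform.

```python
# Hypothetical CM check: is the OFP loaded in a WRA the approved software load
# for the aircraft configuration it is about to be installed in? Part numbers
# and OFP versions below are invented for illustration.

# Approved (WRA part number, aircraft configuration) -> required OFP version
APPROVED_LOADS = {
    ("123-4567-01", "Lot 30"): "OFP 5.2",
    ("123-4567-01", "Lot 31"): "OFP 6.0",
}

def load_is_compatible(wra_part_number, aircraft_config, loaded_ofp):
    """Return True if the WRA's loaded OFP matches the approved baseline."""
    required = APPROVED_LOADS.get((wra_part_number, aircraft_config))
    return required is not None and required == loaded_ofp

if __name__ == "__main__":
    # Same part number, but this asset still carries the older software load.
    ok = load_is_compatible("123-4567-01", "Lot 31", "OFP 5.2")
    print("Compatible" if ok else "Incompatible: reload OFP before issue")
```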

• Logistics Requirements Funding Summary (LRFS)
  - Pending NAVAIR-6.0 LRFS policy, the LRFS Guidebook found on the APML Essentials page provides the new LRFS format/structure. The table below shows the new LRFS structure for each of the Logistics Elements to be addressed in the LRFS:

LRFS Logistics Elements:
1. ILS Program Management
2. Logistics Engineering Analysis, Plans, & Studies
3. Technical Data / Automated Logistics Environment (ALE)
4. Spares & PHS&T
5. Peculiar Support Equipment
6. Common Support Equipment
7. Training & Training Systems
8. Test Support, Site Activation, & Facilities
9. Mod Kits & Government Furnished Equipment (GFE)
10. Sustainment

The ILS Program Management element identifies the funding requirements for Gov/Mil/CSS personnel to manage the logistics program. It also captures the funding required for Configuration Management. The Logistics/Engineering Analysis, Plans, and Studies element identifies funding required for the development of logistics analysis, plans, and studies. Most of these requirements are for the logistics Contract Data Requirements List (CDRL). The Analysis, Plans, and Studies element lays the foundation for the Logistics Elements that follow. For example, the APML requires Supply Support data, Provisioning Data, and Interim Support Item Lists first. Then the APML uses that information to budget for the actual procurement of Spare Parts for Supply Support. Each Element then focuses on the specific products that are required for supporting the weapon system. For the purpose of this document, the focus will be on the impacts software has on the Elements. Most NAVAIR

software is tied to hardware in some way. If you are supporting a software-only product, you would need to identify each element and sub-element for its applicability to your software-only program. A key software document developed in the Logistics/Engineering Analysis, Plans, and Studies element is the Computer Resources Life Cycle Management Plan (CRLCMP). The APML should identify the requirements and funding needed to develop, update, and maintain the CRLCMP for the software program.

The Technical Data/ALE element in the LRFS is essential for software. Data rights/source code requirements would be defined under this element. Any software product must be able to operate within the ALE; those requirements associated with the ALE are documented in this element as well. Each additional LRFS Element is traditionally associated with procurement of equipment. Software requirements must be examined closely in order to ensure they are met. The RDIT process and the methodology utilized to tailor its

principles to develop a viable software support strategy must be considered in the context of the acquisition strategy for the program and the logistics planning requirements of the LCSP. In addition to the LCSP, the CRLCMP describes the development, acquisition, test and support plans over the life cycle of computer resources integral to or used in direct support of systems, and may be included as part of the [LCSP].15 With the development of the LCSP and CRLCMP in mind, the following are key planning considerations and associated metrics provided by the Software Engineering Center (SEC) of the Army Communications and Electronics Command (CECOM). It may be helpful to consider these activities in the context of the general software sustainment process model provided at the end of this section:

• Replication
  - Commercial-Off-The-Shelf (COTS) / Government Off-The-Shelf (GOTS)
  - Throughput capability of equipment
  - Cost of equipment
  - Time to replicate
  - Security classification
  - Existing usable facility or process
  - Identification of software
    - Computer Program Identification Number (CPIN)
    - National Stock Number (NSN)
  - Labeling
    - Standard format and location
  - Total cost to replicate
  - Media type
• Distribution
  - Time in distribution channels to reach the user
  - Security classification
  - Type of packaging required (if applicable)
  - Availability of web-based and/or satellite-based services for distribution
  - Cost to distribute
  - Interoperability coordination
• Installation
  - Time to install
  - Team required
  - Synchronized coordination necessary
• Training
  - Classroom
  - Documents (self instruction)
  - Interactive satellite instruction
  - Computer-aided instruction

RDIT Readiness Metrics Considerations:

• Time
  - Software transmission
  - Courier process
  - Classified distribution
  - Worldwide web distribution
• Process
  - Fleet acceptance process
  - Interoperability checks
  - Downloading
    - Operational
    - Maintenance
• Resources
  - Equipment/Staffing
  - Direct Support Staff Sufficient?
  - Direct Support Downloader Mission Accepted?
  - Replication Equipment Available in Theater?
  - Validation of Replicated Media Required?
  - Test Bed Available for Replicated Media?
  - Downloading of Firmware Required?
  - Training on All Downloading Equipment Available? Provided?

[Figure: General Software Sustainment Process16]

3.0 What you need to [Do]

3.1 Solicitation Development and Proposal Review Considerations

The Acquisition Logistician must participate in the design of the product/system to assist/facilitate the "engineering in" of supportability and maintainability of the software product. The APML Essentials web site (https://home.navair.navy.mil/air66wiki) is a useful tool that provides easily updated, essential knowledge to the APML user community in various topical areas of logistics acquisition and system sustainment. Invaluable information is available ranging

from topics such as development of the solicitation and review of proposals through system disposal.  Solicitations and Proposals: Ensure that the principles, practices, techniques and disciplines of “software logistics and design for supportability” will be met by the supplier/developer. These will include but are not limited to:  Structured programming  Object-oriented programming  Development of software with sustainment in mind  High-order programming languages  Well-defined and routinely followed development processes  Computer-aided Software Engineering (CASE) tools  Robust and consistent documentation  Configuration management for all software builds  Past performance  Extent of CMMI® certification (Level III minimum) 3.11 Development of Statement of Work (SOW) and / or Statement of Objectives (SOO) Ensure the contract language and data requirement(s) specifically addresses state of the art software development and support processes:

Contract Language - As part of the ASN (RD&A) Software Process Improvement Initiative of 2006/2007, Dr. Delores Etter issued a memorandum requiring specific contract language for all contracts containing software development, acquisition, and life cycle support with RFPs issued after January 1, 2007. The memo is titled “Software Process Improvement Contract Language.”

A. CDRLs and DIDs - The ASN (RD&A) contract language memorandum includes specific provisions for the inclusion of an SDP as a required CDRL and detailed language for the associated DID.

B. Data Rights - Ensure data rights are thoroughly examined in the context of the anticipated support strategy. The importance of doing this cannot be overemphasized. Software deliveries must include source code, documentation, and all that is needed in addition to the software itself to continue maintaining the software. A detailed explanation of the implications of data rights and how to ensure proper consideration for the contract

can be found in section 2.1F of this document.

C. The RDIT Process - Ensure the RDIT process, or some equivalent of it, is considered in the solicitation, review of proposals, support strategy, and logistics planning documents. A detailed explanation of the RDIT process can be found in section 2.5.1 of this document.

3.2 Metrics and Cost Estimating

Software metrics are a unique challenge, as NAVAIR Reliability and Maintainability expertise is centered on hardware. However, to ensure with confidence that the software will perform as intended, metrics must be established. First, establish what metrics will be used and for what purpose. Begin with the development, support, and cost estimating process:

A. Development and Support - Directly engage with your Integrated Product Team (IPT) Software System Engineers from AIR-4.1 to determine the development metrics. Pay particular attention to units of measure such as Software Lines of Code (SLOC) or Function Points, as you will utilize these metrics as inputs for the AIR-4.2 cost estimators. You should also gain a working knowledge of what productivity metrics will be used and to what extent Earned Value Management (EVM) will be applied. Productivity and EVM metrics should be monitored throughout the development phase, as they are leading indicators of software quality and of the contractor's ability to support it once fielded. In addition, a key driver of sustainment costs is the degree of environmental change and the effect this will have on software and data. You should work closely with AIR-4.2 to gain an understanding of these impacts on programs today, enabling improved estimation in the future. Industry also recognizes the degree to which the feedback loop to cost estimators is broken.
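To make the EVM terms above concrete, the following minimal Python sketch computes the standard earned value indices, Cost Performance Index (CPI = EV/AC) and Schedule Performance Index (SPI = EV/PV), plus a simple Estimate At Completion (EAC = BAC/CPI). The dollar figures are invented for illustration only; your program's actual thresholds and reporting come from the EVM baseline, not this sketch.

    # Illustrative only: standard Earned Value Management (EVM) indices.
    # BAC = Budget At Completion, PV = Planned Value, EV = Earned Value, AC = Actual Cost.
    def evm_indices(bac: float, pv: float, ev: float, ac: float) -> dict:
        cpi = ev / ac    # Cost Performance Index (<1.0 means over cost)
        spi = ev / pv    # Schedule Performance Index (<1.0 means behind schedule)
        eac = bac / cpi  # simple Estimate At Completion, assuming current CPI holds
        return {"CPI": round(cpi, 2), "SPI": round(spi, 2), "EAC": round(eac, 1)}

    # Made-up example: $10M software build, 40% of work planned, 35% earned, $4M spent.
    print(evm_indices(bac=10_000_000, pv=4_000_000, ev=3_500_000, ac=4_000_000))
    # -> {'CPI': 0.88, 'SPI': 0.88, 'EAC': 11428571.4}

A CPI or SPI trending below 1.0 during development is exactly the kind of leading indicator referred to above.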

B. Cost - The AIR-4.2 Cost Department represents a critical functional area for any software-intensive program and must be engaged early to develop life cycle cost estimates for software development through delivery and sustainment.

You must clearly understand what development metrics and sustainment decisions will drive operations and support costs, as well as how to estimate the impacts. Directly engage with the AIR-4.2 cost estimators assigned to the program to determine what type of cost model will be utilized for O&S cost estimating, and establish input templates for the cost model. Critical parameters to consider are how much code is involved, the type of code (new, reused, or modified), what the maintenance environment will be, and potential requirements. Another key driver of sustainment costs is the degree of environmental change and the effect this will have on software and data. It is important to work closely with AIR-4.2 to gain an understanding of these impacts on your program, to enable continuous improvement of the cost estimation process. A sample cost estimating template is provided as an attachment to this guide in Appendix B, along with some simple instructions for its use.

The cost estimating process forms the basis for development costs as well as the Operations and Maintenance (O&M) costs. The program logistician must work closely with the IPT and AIR-4.2 to gain consistent cost estimating processes and capture the resulting budgetary items in the LRFS once the budgetary controls are known. Note: A review of types of funding is provided in Appendix D of this document.

C. Software Metrics - Ensure that your IPT has Reliability & Maintainability (R&M) engineers embedded within it and that you directly engage with them to establish software R&M metrics. This is a key activity in terms of translating warfighter requirements to supportability goals. Be certain to identify and include fleet stakeholders in these discussions (the Commander Naval Air Forces (CNAF) Class Desk is usually a good place to start). Software metrics are a unique challenge. It is very difficult to take traditional hardware-centric metrics, such as Mean Time Between Failure (MTBF), and apply them to software. The purpose behind predictive metrics is to have a high level of confidence that the product will work reliably and can be maintained easily. Future revisions of this document will include more information on software metrics.
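As one hedged illustration of how a software analogue to MTBF is sometimes derived, the sketch below computes a mean time between software trouble reports and a defect density from fielded operating hours. These formulas and field names are notional assumptions, not a NAVAIR-prescribed metric set; the actual metrics should be agreed with your R&M engineers and fleet stakeholders.

    # Notional example: simple software reliability indicators derived from
    # fielded operating hours and software trouble reports (STRs).
    def software_reliability_metrics(operating_hours: float,
                                     str_count: int,
                                     ksloc: float) -> dict:
        mtbf_hours = operating_hours / str_count if str_count else float("inf")
        failure_rate_per_khr = 1000.0 * str_count / operating_hours
        defect_density = str_count / ksloc  # STRs per thousand source lines
        return {"MTBF_hours": round(mtbf_hours, 1),
                "failures_per_1000_hours": round(failure_rate_per_khr, 2),
                "defects_per_KSLOC": round(defect_density, 3)}

    # Made-up values: 12,000 fleet operating hours, 15 STRs, 250 KSLOC of code.
    print(software_reliability_metrics(12_000, 15, 250))
    # -> {'MTBF_hours': 800.0, 'failures_per_1000_hours': 1.25, 'defects_per_KSLOC': 0.06}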

3.3 Other Logistics Planning Considerations

In addition to the items discussed in the Software Support Considerations portion of what you need to [Know], here are some complementary actions for support planning:

A. Independent Logistics Assessment (ILA) – NAVSO P-3692, Department of the Navy Guide for Conducting Independent Logistics Assessments, Element 14 (Computer Resources and Software Support) should be reviewed early in the program for specific logistics assessment criteria relative to computer and software support. The ILA guide can be found at:
https://www.navsopubs.donhq.navy.mil/authreq/orders.nsf/AllDocsID/42DE5E14BE3E3EA6852571F1005C13B7/$File/ILA%20Handbook 21Sep06ve10.pdf?OpenElement

B. Command Information

Officer (CIO) Community of Practice (COP) – The NAVAIR CIO COP can be found at:
https://mynavair.navair.navy.mil/portal/server.pt/?open=512&objID=1019&PageID=0&parentname=Login&parentid=1&cached=true&mode=2&userID=16434
This site contains information regarding Information Assurance (IA), Functional Area Managers (FAM), Functional Data Managers (FDM), the DoD Information Technology Portfolio Repository-Department of Navy (DITPR-DON), and many other Information Technology (IT) topics. Document any security concerns in the LCSP and/or CRLCMP, along with what will be done for accreditation. The Clinger-Cohen Act of 1996 (Title 40/CCA), also called the Cohen Act, the CCA, or the CIO Act, was enacted to improve acquisition and management practices pertaining to information resources. Ensure that your IPT has a software security expert assigned to it who is well versed in the CCA and other security requirements. Be certain to also consider software disposal requirements for the

software, software discs and other media, firmware, and host hardware. The two primary issues are software classification (level of protection required) and software security (adequacy of protective measures). Be certain to maintain an open dialogue with your software security representative so you have a clear understanding of supportability risks.

C. Organic Support - Your program may decide to use organic support in the form of an SSA for some or all of its software support. Again, it is critical to obtain the data rights necessary to perform this support, as discussed in Section 2.1F of this document. It is equally important to establish teaming and partnering relationships with the SSAs early in the program, to include participation in software testing (verification and validation). In addition, the logistician must have a thorough understanding of and become a regular participant in the PRE process. PRE is the Program of Record which provides post-IOC (Initial Operating

Capability) software support for Navy and Marine Corps Tactical Software Systems. A review of the PRE process is provided in Appendix C of this document.

D. Software Sustainment and Product Improvement Planning – Consider developing a Software Sustainment and Product Improvement checklist. The following example checklist (taken from the Air Force's Guidelines for Successful Acquisition and Management (GSAM) of Software-Intensive Systems) is provided to assist you in understanding the sustainment and product improvement issues that should be examined during the acquisition phase and throughout the life cycle. This should not be construed as a required checklist, but rather as an example to help you formulate the right questions within your IPTs in the pre-solicitation phase, during review of proposals, and prior to the milestone itself.

Sustainment
1. Is all your software developed with a goal to facilitate its future sustainment?
2. Do you understand the four types of sustainment

and their purposes?
3. Do you understand the place and purpose of the sustainment phase in the software life cycle?
4. Do you understand your sustainment process?
5. Is there a sustainment plan?
6. Is there a process in place to gather problem reports and upgrade requests for the software?
7. Does the plan provide for reviewing, evaluating, and prioritizing upgrade requests?
8. Are all sustainment activity steps included in the plan?
9. Is there a transition plan to move to the upgraded system?
10. Have all activities been planned and organized to keep interference and operating system downtime to a minimum?
11. Does the plan call for running critical systems redundantly during testing and installation?
12. Do the deliveries include source code, documentation, and all that is needed in addition to the software itself to continue maintaining the software?
13. Are all products under configuration control?
14. Do you

understand the processes used by the PRE program to document, prioritize, track, and fund software sustainment efforts? (added to list by NAVAIR)

Product Improvement
15. Is your organization following a product improvement strategy?
16. Do you know where your organization is, capability-wise, relative to ISO 9001, CMMI®, or your chosen improvement standard?
17. Do you have a plan for achieving higher levels of capability?
18. Do your product improvement efforts emphasize improving processes to achieve product improvement?
19. Are your development processes documented?
20. Are your development processes consistent and repeated?
21. Does the leadership of your organization support continuous process improvement?
22. Is your organization committed to achieving a state of continuous improvement versus a certificate of compliance with standards?

E. Software Support Framework – Consider laying out a framework for SSAs. It is always helpful to have an

example or tool, so the following example is provided for consideration. While the terms may differ, the key activities will be similar and will be particularly useful if overlaid with the SETR process timeline.

Software Support Activities and the Acquisition Life Cycle (diagram mapping the activities below across the Pre-Deployment stage, the Software Support Stage (Initial Software Development), and the Support Stage (The Redevelopment Phase)):

• Develop software support strategy
• Select support concept
• Ensure software supportability
• Implement Transition Plan
• Investigate alternative support concepts
• Propose software quality requirements
• Evaluate software quality
• Acquire & install SEE/STE
• Propose software support acquisition requirements
• Certify software documents & technical data
• Acquire SSA resources
• Influence product
• PMO establishes a Computer Resources IPT (CR-IPT)
• Identify SSA resource requirements
• Prepare initial Computer Resource Plans
• Review developer's

support documentation and transition plans for software (e.g., STrP)
• Incorporate developer's support documentation and transition plans for software
• Update Computer Resource Plans (each stage)
• Staff and train personnel
• Demonstrate support capability
• Manage support
• Conduct support
• Provide software logistics support
• Perform software configuration management
• Maintain Computer Resource Plans

Acronyms from the Software Support Activities and Acquisition Life Cycle diagram:
• PMO - Program Management Office
• SEE - Software Engineering Environment
• SSA - Software Support Activity
• STE - Software Test Environment

4.0 Where to [Go] for more information

4.1 Training

Before embarking on your reading adventure, here are some recommendations. If you find yourself going to or already part of a software-intensive program, there are courses offered by Defense Acquisition

University (DAU) that you should consider. The first is Basic Software Acquisition Management (SAM 101). This course is 100% online and can be accessed through the DAU Register Now website (https://atrrs.army.mil/channels/registernow/). SAM 101 will count toward your 80-hour continuous learning requirement and is consistent with the “Core Plus” DAWIA certification philosophy, which encourages better connection to job needs. For more advanced learning, Intermediate Software Acquisition Management (SAM 201) and Advanced Software Acquisition Management (SAM 301) are also available through Register Now and are classroom-based.

4.2 Best Practices

As you might imagine, there are many resources available on software best practices. Precious few offer significant detail with regard to software logistics. We have attempted to provide a short listing of sources that is both useful and relevant.

A. Guidelines for Successful Acquisition and Management of Software Intensive Systems (GSAM) –

Developed by the U.S. Air Force's Software Technology Support Center, it offers comprehensive guidance and useful checklists. It can be located on the Air Force's Software Technology Support Center (STSC) website. Another online resource that can be found at the STSC website is CrossTalk – the Journal of Defense Software Engineering. This journal can be accessed via the web, or you may request a subscription sent via postal mail. CrossTalk covers up-to-date practical software engineering technologies, practices, and field experiences.

B. Contract Language – As part of the ASN (RD&A) Software Process Improvement Initiative of 2006/2007, Dr. Delores Etter issued a memorandum requiring specific contract language, CDRLs, and DIDs for all contracts containing software development, acquisition, and life cycle support with RFPs issued after January 1, 2007. The memo is titled “Software Process Improvement Contract Language.”

C. Cost Estimating and Metrics - Software cost

estimating processes and techniques, along with software metrics, are continuously evolving and improving.

D. Software Engineering - In 2004, the Institute of Electrical and Electronics Engineers (IEEE) approved for release their Software Engineering Body of Knowledge (SWEBOK). This is a large reference that attempts to capture generally accepted knowledge relating to software. The SWEBOK can be found at: http://www.swebok.org/

E. Software Development Processes - The Software Engineering Institute (SEI) released a technical report in March of 2007, Understanding and Leveraging a Supplier's CMMI® Efforts: A Guidebook for Acquirers. The report number is CMU/SEI-2007-TR-004. This is an excellent resource for anyone trying to understand what a contractor or government team is trying to communicate with respect to their CMMI® capability or maturity. This, along with the numerous sources available at the website, will provide the reference material necessary to understand CMMI®. The SEI CMMI®

website can be found at: http://www.sei.cmu.edu/cmmi/

4.3 Helpful Web Sites and Resources in Preparation for SOW or SOO Development

http://www.dau.mil/ - Defense Acquisition University (DAU)
https://acc.dau.mil/CommunityBrowser.aspx?id=17608 - DAU Systems Engineering Community of Practice
https://acc.dau.mil/CommunityBrowser.aspx?id=18008 - Acquisition Center for Service Contracting
http://www.arnet.gov/ - Federal Acquisition Central for regulations, systems, resources, opportunities, and training
http://www.acq.osd.mil/dpap/ - Defense acquisition and procurement policy
http://www.archives.gov/about/index.html - National Archives
http://www.stsc.hill.af.mil/index.html - Software Technology Support Center at Hill AFB
http://software.gsfc.nasa.gov/index.cfm - NASA Goddard Space Flight Center Software Process Improvement
https://akss.dau.mil/default.aspx - Acquisition, Technology & Logistics (AT&L) Knowledge Sharing System
http://www.ieee.org/portal/site - IEEE web site with standards and publications

https://nserc.navy.mil/ - Systems Engineering Technical Review (SETR) [use NMCI password with server (either NADSUSWE or NADSUSEA) and user name]
http://www.psmsc.com/ - Practical Software and System Measurement (metrics)
http://www.sei.cmu.edu - Software Engineering Institute (CMMI®)
http://www.ndia.org/ - National Defense Industrial Association
http://iase.disa.mil/index2.html - Information Assurance Support Environment
https://rdit.army.mil/rdithome.htm - US Army's Software Replication, Distribution, Installation and Training (RDIT) Resources Website

Appendix A – Software-Specific Considerations for the System Engineering Technical Reviews (SETRs) (Source: AIR-4.1)

System Requirements Review (SRR) Purpose and Software Considerations
• Technical assessment that establishes the system specification of the system under review to ensure a reasonable expectation of being judged operationally effective & suitable
• Ensures the Capability Development Document (CDD), DoD Directives, statutory and

regulatory guidance, and applicable public law have been correctly and completely represented in the system specification and can be developed within program cost and schedule constraints
• Systems requirements are evaluated to determine whether they are fully defined and consistent with the mature system solution, and whether traceability of systems requirements to the CDD is maintained
• Assesses the performance requirements as captured in the system specification and ensures the preferred system solution:
  - Is consistent with the system specification
  - Correctly captures derived and correlated requirements
  - Has well-understood acceptance criteria and processes
  - Is achievable through available technologies resulting from the Technology Development phase

SRR Software Related Products
• Software Development Plan (SDP) – also a CDRL requirement
  - Schedule
  - Processes
  - Software Development Environment
• System Specification
• Initial Software Requirements

Description (SRD)
• Modeling and Simulation Plan
• Supporting Products
  - Systems Engineering Plan
  - Risk Management Plan
  - Measurement Plan
  - Quality Assurance Plan
• Measurement Data
  - Cost
  - Size
  - Requirements Volatility

System Functional Review (SFR) Purpose and Software Considerations
• Technical assessment that establishes the system functional baseline of the system under review to ensure a reasonable expectation of being judged operationally effective and suitable
• Assesses the decomposition of the system specification to system functional specifications derived from usage case analysis
• Functional requirements for operations and maintenance are assigned to subsystems, hardware, software, or support after detailed reviews of the architecture in the environment in which it will be employed
• The system's lower-level performance requirements are evaluated to determine whether they are fully defined and consistent with the mature system concept, and

whether traceability of lower-level systems requirements to top-level system performance and the CDD is maintained
• The development of representative operational usage cases for the system
• Determines whether the system's functional definition is fully decomposed to its lower level, and that the team is prepared to start preliminary design for hardware
• Risk management, measurement, quality assurance, and configuration management processes are deemed fully functional

SFR Software Related Products
• Updates to SRR products
  - SDP
  - System Specification
  - Modeling and Simulation Plan
  - Supporting Products: Systems Engineering Plan, Risk Management Plan, Measurement Plan, Quality Assurance Plan, Cost and Size Estimates
• Updated (more detail) SRD
• Interface Control Documents
• Draft Test Plan
• Measurement Data
  - Cost
  - Size
  - Requirements Volatility

Software Specification Review (SSR) Purpose
• Technical assessment that establishes the

software requirements baseline to ensure the preliminary design, and ultimately the software solution, has a reasonable expectation of being judged operationally effective and suitable
• The software's lower-level performance requirements are fully defined and consistent with a mature system concept, and traceability of lower-level software requirements to top-level system performance and the CDD is maintained
• A review of the finalized Computer Software Configuration Item (CSCI) requirements and operational concept
• The Software Requirements Specification (SwRS) or Software Requirements Description (SRD), Interface Requirements Specification (IRS) or Software Interface Requirements Description (SIRD), Software Integration Plan, and the user's Concept of Operation Description or User Documentation Description form a satisfactory basis for proceeding into preliminary software design

SSR Software Related Products
• Updates to SRR and SFR products
• Final SwRS or SRD
• Requirements Verification Matrix
  - CDD to specifications to SwRS or SRD
• Interface Control Documents (ICDs) and Interface Requirements Specification (IRS) or Software Interface Requirements Description (SIRD)
• Declassification, Anti-Tamper, Open Architecture (OA), and Information Assurance (IA) requirements
• Completed Test Plan
• Software Integration Plan
• Measurement Data
  - Cost
  - Size
  - Requirements Volatility
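To illustrate what the verification matrix and "traceability in both directions" mean in practice, the following sketch models the matrix as a simple mapping from a CDD-level requirement to specification, software requirement, and test identifiers, and flags breaks in the chain. The identifiers and structure are hypothetical and are not a prescribed SETR artifact format.

    # Hypothetical sketch: a requirements verification matrix as a dictionary and a
    # check that every CDD requirement traces forward to a test, and every test
    # traces back to a requirement (traceability in both directions).
    matrix = {
        "CDD-001": {"spec": "SS-3.2.1", "swrs": "SRS-101", "test": "TC-017"},
        "CDD-002": {"spec": "SS-3.2.4", "swrs": "SRS-114", "test": None},  # gap: no test yet
    }
    test_cases = {"TC-017", "TC-099"}  # TC-099 has no parent requirement

    forward_gaps = [req for req, links in matrix.items() if not links["test"]]
    traced_tests = {links["test"] for links in matrix.values() if links["test"]}
    orphan_tests = test_cases - traced_tests

    print("Requirements with no verifying test:", forward_gaps)  # ['CDD-002']
    print("Tests with no parent requirement:", orphan_tests)     # {'TC-099'}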

Preliminary Design Review (PDR) Purpose and Software Considerations
• Technical assessment that establishes the physically allocated baseline to ensure that the system has a reasonable expectation of being judged operationally effective and suitable
• Assesses the allocated design captured in subsystem product specifications for each configuration item in the system and ensures that each function, in the functional baseline, has been allocated to one or more system configuration items
• Subsystem specifications for hardware and software, along with

associated ICDs, enable detailed design or procurement of subsystems
• A successful review is predicated on the team's determination that the subsystem requirements, subsystem preliminary design, results of peer reviews, and plans for development and testing form a satisfactory basis for proceeding into detailed design and test procedure development

PDR Software-Related Products
• Updates to SRR, SFR, SSR products
• Top-Level Software Design Description and/or Software Architecture Description
• Completed Test Plan
• Draft Test Procedures
• Traceability from design documentation to subsystem test requirements
• Representative mission profiles
• Measurement Data
  - Size
  - Defects
  - Requirements Volatility

Critical Design Review (CDR) Purpose and Software Considerations
• Technical assessment that establishes the build baseline to ensure that the system has a reasonable expectation of being judged operationally effective and suitable
• Assesses the final design as captured in product specifications for each configuration item in the system, and ensures that each item has been captured in detailed documentation
• Product specifications for software enable coding of the CSCI
• A successful review is predicated on the team's determination that the subsystem requirements, subsystem detail design, results of peer reviews, and plans for testing form a satisfactory basis for proceeding into implementation, demonstration, and test

requirements, subsystem detail design, results of peer reviews, and plans for testing form a satisfactory basis for proceeding into implementation, demonstration, and test CDR Software-Related Products  Updates to SRR, SFR, SSR, PDR products  All specifications and requirements documentation are complete/stable  Detailed Software Design Description  Units Test Procedures  Traceability Verification Matrix  CDD to specifications to requirements documentation to design to test procedures  Traceability in both directions  Measurement Data  Size  Defects  Maturity  Requirements Volatility Integration Readiness Review (IRR) Purpose and Software Considerations  Technical assessment that establishes the configuration to be used in integration test to ensure that the system has a reasonable expectation of being judged operationally effective and suitable 39  A product and process assessment to ensure that hardware and software or software

components are ready to begin integrated configuration item (CI) testing
• Assesses prior component or unit-level testing adequacy, test planning, test objectives, test methods and procedures, and scope of tests to determine if required test resources have been properly identified and coordinated to support planned tests
• Verifies the traceability of planned tests to program, engineering data, analysis, and certification requirements
• Testing is based upon the Test Plan
  - Begun in the requirements phase and completed during design
• Conducted after the test and/or validation procedures and unit-level testing are complete

IRR Software Related Products
• Updates to SRR, SFR, SSR, PDR, CDR products
• Approved Integration Test Plan
• Approved Integration Test Procedures
• Format for Integration Test Report
• Completed Integration Test Verification Matrix
• Measurement Data
  - Quality/Maturity
  - Defects

Test Readiness Review (TRR) Purpose and Software Considerations

• Technical assessment that establishes the configuration used in test to ensure that the system has a reasonable expectation of being judged operationally effective and suitable
• TRR is a multi-disciplined product and process assessment to ensure that the subsystem, system, or system of systems is stable in configuration and is ready to proceed into formal test
• Assesses prior unit-level and system integration testing adequacy, test planning, test objectives, test methods and procedures, and scope of tests to determine if required test resources have been properly identified and coordinated to support planned tests
• Verifies the traceability of planned tests to program, engineering, analysis, and certification requirements
• Determines the completeness of test procedures and their compliance with test plans and descriptions
• Assesses the impact of known discrepancies to determine if testing is appropriate prior to implementation of the corrective fix
• The TRR process is

equally applicable to all tests in all phases of an acquisition program

TRR Software Related Products
• Updates to SRR, SFR, SSR, PDR, CDR, IRR products
• Traceability Analysis
• Approved Test Plan
• Approved Test Procedures
• Format for Test Report
• Completed Test Verification Matrix
• Measurement Data
  - Quality/Maturity
  - Defects

Flight Readiness Review (FRR) Purpose and Software Considerations
• Technical assessment that establishes the configuration used in flight test to ensure that the system has a reasonable expectation of being judged operationally effective and suitable
• Assesses the system and test environment to ensure that the system under review can proceed into flight test with:
  - NAVAIR airworthiness standards met
  - Objectives clearly stated
  - Flight test data requirements clearly identified
  - Acceptable risk management plan defined and approved
• Ensures that:
  - Proper coordination has occurred between engineering and flight test

  - All applicable disciplines understand and concur with:
    - The scope of effort that has been identified
    - How this effort will be executed to derive the data necessary (to satisfy airworthiness and test and evaluation requirements) to ensure the weapon system evaluated is ready to proceed to flight test

FRR Software-Related Products
• Updates to SRR, SFR, SSR, PDR, CDR, IRR, TRR products
• Approved Flight Test Plan
• Approved Test Points/Flight Scenarios/Knee Cards
• Format for Flight Test Report
• Measurement Data
  - Quality/Maturity
  - Defects/Test Hour

Operational Test Readiness Review (OTRR) Purpose and Software Considerations
• Multi-disciplined product and process assessment to ensure that the system under review can proceed into Operational Test and Evaluation (OT&E) with a high probability that the system will successfully complete operational testing
• Successful performance during Operational Evaluation (OPEVAL) generally indicates the system being

tested is effective and suitable for fleet introduction
• The decision to enter production may be based on this successful determination
• Of critical importance to this review is the understanding of available system performance to meet the CPD
• Operational requirements defined in the CPD must match the requirements tested to in the Test and Evaluation Master Plan (TEMP)

OTRR Software-Related Products
• Updates to SRR, SFR, SSR, PDR, CDR, IRR, TRR, FRR products
• Approved TEMP
• Approved Test Points/Flight Scenarios
• Format for Operational Test Report & Quick-look Report
• Measurement Data
  - Quality/Maturity
  - Defects/Test Hour

Appendix B – Sample Cost Estimating Template

A compilation of the data outlined and described in the embedded checklist/model input sheet provides the data necessary to run AIR-4.2's Constructive Cost Model (COCOMO) to derive software development and support cost estimates.
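For orientation only, the sketch below shows the general shape of a COCOMO-style calculation: estimated effort in person-months grows nonlinearly with size in thousands of source lines of code (KSLOC). The coefficients are the published Basic COCOMO values for an in-house ("organic") project; AIR-4.2's calibrated model, cost drivers, and input template will differ, so treat this strictly as an illustration of why size and code-type inputs dominate the estimate.

    # Illustration of a Basic COCOMO-style estimate (organic-mode coefficients).
    # Real estimates should come from AIR-4.2's calibrated model and cost drivers.
    def basic_cocomo_organic(ksloc: float) -> tuple[float, float]:
        effort_pm = 2.4 * ksloc ** 1.05           # effort in person-months
        duration_months = 2.5 * effort_pm ** 0.38 # nominal schedule in months
        return round(effort_pm, 1), round(duration_months, 1)

    for size in (10, 50, 100):  # new/modified size in KSLOC (made-up values)
        effort, months = basic_cocomo_organic(size)
        print(f"{size} KSLOC -> ~{effort} person-months over ~{months} months")

The exponent greater than 1.0 is one reason size growth and reuse assumptions receive so much attention in software cost reviews.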

Appendix C – Program Related Engineering

Program Related Engineering (PRE)
• PRE is the Program of Record which provides post-IOC software support for Navy and USMC Tactical Software Systems.
• NAE Objectives for PRE:
  - Fund capability-based products which include only the part of the infrastructure needed for each deliverable
  - Eliminate low-priority work with little or no benefit to the Fleet
  - Prioritize capability-based products through a Fleet/TYCOM process ending with Air Boss approval

PRE Support
• PRE breaks software support into two main categories, Fleet Response Activity (FRA) and Capability Defect Package (CDP).
• FRA consists of Issue Triage, Fleet Services, Workarounds, CDP Development, and Basic Lab Maintenance. The resource requirement for these categories is only the effort which cannot be applied to a CDP effort.
• CDPs are collections of Software Trouble Reports that result in a software product delivery to

the fleet. SSAs perform analysis of STRs and prepare CDPs for input and prioritization within their own community OAG prior to submittal to CNAF.
• FRA and CDP support levels are submitted by SSAs to their respective PMAs, and a NAVAIR review board reviews all submittals.
• CNAF ranks and stacks all CDPs with regard to the overall capability priority to the fleet. CNAF also weighs in on the FRA requirement, especially for aircraft phasing out.

PRE Execution Cycle Overview (diagram): TMS Prioritized Issues and Emergent Fleet Urgent Issues feed the NAVAIR FRA & CDP Review and CNAF CDP Validation & Ranking; Allocation produces an Execution Plan and Execution Plan Review, followed by Execution with quarterly and end-of-year reports; change requests and execution tracking drive re-planning and re-allocation.

Team Funding Paradigm Shift
• Funding SSA teams via FRA and CDPs is a transition from an older model which focused first on required infrastructure (Min Core), then on product. (Diagram: legacy Min Core funding covered management, software labs and data, and support (tracking, analysis, OAG, PRE) for all teams, with no metrics, only dollars; under the current model, FRA covers Issue Triage and Fleet Services, with CDP development costs removed from FRA.)

• The current model is dramatically more responsive to fleet needs, but can result in an SSA team not receiving enough PRE work to sustain a previous Min Core capability.

Source: Ms. Lorenda Batson, AIR-4.1, NAVAIR PRE Program Manager

Appendix D – Review of Funding Appropriations/Categories (Source: AIR-4.1)

• Operations & Maintenance, Navy (O&M,N)
  - Operation and maintenance support of the Navy
  - Available for obligation for one year, and available for obligation adjustments and expenditure for an additional five years
• Other Procurement, Navy (OP,N or OPN)
  - For investment items that do not fall into a specific area for which funds are separately approved, e.g., aircraft, weapons, military construction, etc.
  - Available for obligation for three years and an additional five years for adjustments and expenditure

• Aircraft Procurement, Navy (AP,N or APN)
  - Appropriated for the procurement and modification of naval aircraft and support equipment, including training devices/systems
  - Available for obligation for three years plus five more years for adjustments and expenditure
• Weapons Procurement, Navy (WP,N or WPN)
  - Like APN, but for weapons such as missiles, bombs, ammunition, etc.
• Research, Development, Test, and Evaluation (RDT&E)
  - Funds are appropriated for two years with a goal of 100% obligation in the first year of availability
  - After the two years of availability for obligation, the funds are available for an additional five years for adjustments and expenditure
  - Broken into 7 budget activities:

    Budget Activity   Research Category   Title
    1                 6.1                 Basic Research
    2                 6.2                 Applied Research
    3                 6.3a                Advanced Technology Development
    4                 6.3b                Demonstration & Validation
    5                 6.4                 Engineering & Manufacturing Development
    6                 6.5                 RDT&E Management Support
    7                 6.6                 Operational Systems Development

    Note: 6.1 through 6.3a are referred to as Science and Technology (S&T) under the purview of the Director, Defense Research & Engineering (DDR&E)

• Shipbuilding and Conversion, Navy (SC,N or SCN)
  - Procurement of ships and the conversion of existing ships
  - Available for obligation for five years plus five more for adjustments and expenditure
• Foreign Military Sales (FMS)
  - Payment of bills associated with properly executed and approved FMS cases, including services, development, procurement, and maintenance
  - No expiration

Glossary of Acronyms

ADP - Automated Data Processing
ALSP - Acquisition Logistics Support Plan
APEX - Application Executive
API - Application Programming Interface
APN - Aircraft Procurement, Navy
ASN RD&A - Assistant Secretary of the Navy for Research, Development and Acquisition
DAU - Defense Acquisition University
EVM - Earned Value Management
CASE - Computer-Aided Software Engineering
CCA - Clinger-Cohen Act of 1996 (Title 40/CCA)
CDD - Capability Development Document
CDR - Critical Design Review
CDRL - Contract Data Requirements List or Contract Data Deliverable List
CI - Configuration Item
CIO - Chief/Command Information Officer
CM - Configuration Management
CMMI® - Capability Maturity Model Integrated
CNAF - Commander Naval Air Forces
CONOPS - Concept of Operations
COTS - Commercial Off-The-Shelf
CPIN - Computer Program Identification Number
CPD - Capability Production Document
CRLCMP - Computer Resources Life Cycle Management Plan
CSCI - Computer Software Configuration Items
CSE - Common Support Equipment
CJCSI - Chairman of the Joint Chiefs of Staff Instruction
DBDD - Database Design Description
DID - Data Item Description
DoD or DOD - Department of Defense
DoN or DON - Department of the Navy
DRRA - Data Rights Requirements Analysis
DSSE - Developmental Software Support Environment

FAR - Federal Acquisition Regulation
FISMA - Federal Information Security Management Act
FMS - Foreign Military Sales
FRR - Flight Readiness Review
GOTS - Government Off-The-Shelf
GPRA - Government Performance and Results Act
GSAM - Guidelines for Successful Acquisition and Management
ICD - Interface Control Document
IEC - International Electrotechnical Commission
IEEE - Institute of Electrical and Electronics Engineers
IMA - Integrated Modular Avionics
IPT - Integrated Product Team
IRR - Integration Readiness Review
IRS - Interface Requirements Specification
IA - Information Assurance
IT - Information Technology
ISO - International Organization for Standardization
JSF - Joint Strike Fighter
LCSP - Life Cycle Sustainment Plan
LCSSE - Life Cycle Software Support Environment
MLVS - Memory Loader Verifier Set
NAVAIR - Naval Air Systems Command
NGSL - Next Generation Software Loader
NIST - National Institute of Standards and Technology
NSN - National Stock Number

NSS - National Security Systems
OA - Open Architecture
OFP - Operational Flight Program
O&M - Operations and Maintenance
O&S - Operations and Sustainment
OMB - Office of Management and Budget
OPEVAL - Operational Evaluation
OPN - Other Procurement, Navy
OSD - Office of the Secretary of Defense
OT&E - Operational Test and Evaluation
OTRR - Operational Test Readiness Review
PCMCIA - Personal Computer Memory Card International Association
PEMA - Portable Electronic Maintenance Aide
PDR - Preliminary Design Review
PHS&T - Packaging, Handling, Storage and Transportation
PEO - Program Executive Officer
PM - Program Manager
PRE - Program Related Engineering
RDIT - Replication, Distribution, Installation, and Training
RFP - Request for Proposal
SAM - Software Acquisition Management
SDD - Software Design Description
SDP - Software Development Plan
SEI - Software Engineering Institute
SETR - System Engineering Technical Review
SIDD - Software Interface Design Description
SLOC - Software Lines of Code
SAD - Software Architecture Description

SAPIP - Software Acquisition Process Improvement Program
SCN - Shipbuilding and Conversion, Navy
SE - Support Equipment
SFR - System Functional Review
SIRD - Software Interface Requirements Description
SOO - Statement of Objectives
SOW - Statement of Work
SRD - Software Requirements Description
SRR - System Requirements Review
SSA - Software Support Activity
SSR - Software Specification Review
STSC - Software Technology Support Center
SWEBOK - Software Engineering Body of Knowledge
SwRS - Software Requirements Specification
TEMP - Test and Evaluation Master Plan
TP - Test Plan
TRR - Test Readiness Review
WPN - Weapons Procurement, Navy
WRA - Weapons Replaceable Assembly

Notes

1. Ms. Barbara Williams, AIR-4.1
2. J-STD-016
3. SOFTWARE LOGISTICS - THE EMERGING GIANT
4. Mr. John Johnston, AIR-4.22
5. NAVAIR Acquisition Guide 2006/2007
6. MIL-STD-498
7. Source: IEEE Standard for Software Maintenance, IEEE Std 1219-1993
8. DOD-STD-1467, Software Support Environment
9. Software Engineering Institute (SEI): http://www.sei.cmu.edu/cmmi/general/index.html
10. NAVAIRINST 4355.19D, System Engineering Technical Review (SETR)

11. Guidelines for Successful Acquisition and Management (GSAM) of Software-Intensive Systems: Weapon Systems, Command and Control Systems, Management Information Systems
12. SOFTWARE LOGISTICS - THE EMERGING GIANT
13. Software Acquisition Management (SAM) 101
14. Mr. John Johnston, AIR-4.22
15. NAVSO P-3692, Department of the Navy Guide for Conducting Independent Logistics Assessments, September 2006
16. Department of Energy (DOE) Software Engineering Methodology, Chapter 10