NAVAIRINST 3905.1
AIR-5.1

NAVAIR INSTRUCTION 3905.1

From: Commander, Naval Air Systems Command

Subj: TEST REPORTING POLICY FOR AIR VEHICLES, AIR VEHICLE WEAPONS AND AIR VEHICLE INSTALLED SYSTEMS TESTS

Ref: (a) DoD Directive 3200.12, DoD Scientific and Technical Information (STI) Program (STIP) of 11 Feb 98
(b) DoD Instruction 3200.14, Principles and Operational Parameters of the DoD Scientific and Technical Information Program of 13 May 97
(c) NAVAIRINST 3960.4B
(d) Integrated Systems Evaluation Experimentation and Test (ISEET) Test Reporting Handbook
(e) SECNAVINST 5000.2D
(f) SECNAVINST 3900.29E
(g) NAVAIRINST 3960.5A
(h) DoD Directive 5230.24, Distribution Statements on Technical Documents of 18 Mar 87

Encl: (1) Report Approval Matrix

1. Purpose. To establish Naval Air Systems Command (NAVAIR) policies, processes, responsibilities, and requirements for the preparation, review, and approval of reports on the results of flight, ground, and laboratory tests of air vehicles, air
vehicle weapons, and air vehicle installed systems. This instruction ensures compliance with references (a) and (b), which mandate formal reporting whenever Test and Evaluation (T&E) funds are expended in the conduct of tests.

2. Cancellation. This instruction supersedes and cancels the AIR-4.11/55 Standard Operating Procedure 52131 of 27 August 1996; Integrated Systems Evaluation, Experimentation, and Test (ISEET) Director's Note of 25 June 2007; NAVAIRWARCENACDIVINST 5214.1 of 9 April 1994, Report Policy; and NAVAIRWARCENACDIVINST 5213.1 of 9 April 1994, Developmental Test/Operational Test Transition Report.

3. Scope. This instruction applies in all cases where NAVAIR aircraft or personnel are involved in the direct execution of T&E activity in support of NAVAIR programs. This includes NAVAIR T&E Group personnel supporting the Naval Air Warfare Center Aircraft Division (NAWCAD), the Naval Air Warfare Center Weapons Division (NAWCWD), and activities
supported by Naval Air Systems Command Headquarters: the Program Management Group (AIR-1.0) and all Naval Aviation Program Executive Officers (PEOs). When non-NAVAIR-sponsored projects (Commercial Services Agreements, other DoD agencies, etc.) require a report, this instruction applies; report format may be tailored to the specific requirements detailed in tasking documents. Limitations to the scope of this instruction are discussed further under Policy in paragraph 5.

4. Background.

a. NAVAIR T&E of aircraft and systems is a collaborative effort between the NAVAIR T&E Group and program Integrated Product Teams (IPTs). A key tenet of the Competency Aligned Organization (CAO) is that the IPT or Externally Directed Team (EDT) is responsible and accountable for the product. Under this construct, the NAVAIR T&E Group will provide, to the maximum extent possible, engineers and aircrew to flight test teams that are empowered to plan, conduct, and report test results. It is vital
that the NAVAIR T&E Group and the IPTs work together to ensure proper test planning, execution, analysis, and reporting. With respect to test reporting, the CAO must ensure that test teams produce test results and decision-making information in a timely manner. That information must be of high quality, mission-oriented, balanced, and technically correct, and must include both qualitative assessments and quantitative engineering data.

b. Project Officers and Project Engineers are the primary authors of technical documentation reflecting the results and information gathered from T&E events conducted on test aircraft under the purview of the Commanders of the Naval Test Wings Atlantic and Pacific (Test Wings). The ISEET Competency Leadership (which includes Engineering Divisions and Squadrons) reviews documentation as delineated in enclosure (1). Responsibility for the approval of accurate and unbiased test results, including program-level, decision-quality data, conclusions, and
recommendations covered by this instruction, rests with senior members of the ISEET Department (AIR-5.1), as shown in the "approved by" column of enclosure (1).

c. This document provides instruction on the generation, review, and approval of T&E reports, and on the associated roles and responsibilities. These reports provide results to sponsors, managers, technical personnel, and interested organizations within and outside NAVAIR and DoD. Documenting T&E results ensures that NAVAIR meets the current and future needs of sponsors and warfighters, and is consistent with best practices of knowledge management. Publication and centralized report archiving also fulfill the DoD Scientific and Technical Information Program (STIP) requirements outlined in references (a) and (b).

5. Policy.

a. Test results shall be documented and approved whenever:

(1) T&E efforts (including Science and Technology (S&T) and other non-acquisition programs) are
conducted using NAVAIR resources (i.e., aircraft and/or personnel) to perform testing in support of NAVAIR programs;

(2) the purpose of an on-aircraft, laboratory, training system, or flight test is to produce decision-quality data to support an acquisition decision authority on behalf of NAVAIR;

(3) a Developmental Test (DT) period is transitioned to an Operational Assessment (OA) or Operational Test (OT) period;

(4) T&E results from tests conducted on NAVAIR aircraft are to be submitted as part of a certification process owned by another US Government agency or a foreign government; and/or

(5) NAVAIR personnel are externally tasked to execute a test, analyze data, and report results.

b. In all the scenarios addressed in paragraph 5a above, ISEET Test and Experimentation Coordination Team (TECT) members shall determine the applicability of this instruction to a specific test program. The specific reporting requirements associated with a NAVAIR test plan will be identified in the Test
Reports/Deliverables Plan (TRDP) and approved during the test plan approval process in accordance with reference (c).

c. NAVAIRINST 3905.1 does not apply to the following areas:

(1) air vehicle tests that use NAVAIR resources solely in a supporting role (e.g., when NAVAIR range support aircraft are used to gather telemetry (TM) or other data);

(2) T&E of non-NAVAIR air vehicles using NAVAIR resources (e.g., facilities) in support of non-NAVAIR products or customers, when NAVAIR is not responsible for test data generation, analysis, or reporting; and

(3) when NAVAIR personnel or air vehicles are used in support of an OT period.

6. Responsibilities. Coordinated efforts are required for the effective analysis of test data; the drafting of technical documentation; and the timely review, approval, and publication of test reports. NAVAIR T&E Group leaders and test team members shall be familiar with references (a) through (h), which address
documentation requirements associated with the expenditure of T&E funds, as well as guidance to authors, reviewers, and approvers of test reports. The following actions are required of designated personnel with respect to the test reporting process.

a. Lead Test Engineer (LTE). The LTE is a person designated by the System T&E Management Division (AIR-5.11). When a team does not have an AIR-5.11 LTE, these duties will be performed by another team member as stated in the Test Project Initiation Document and/or Project Planning Memo. The incumbent may be the 5.1x test team lead, Project Officer, Project Engineer, EDT leader, Platform Coordinator, or other designated individual. The LTE will:

(1) review the TRDP and, to the maximum extent practicable, facilitate timely reporting of results;

(2) coordinate with competency leadership and project sponsors when test results from multiple test plans are to be presented in a consolidated reporting product (e.g., a DT/OT Transition Report);

(3) ensure
that associated schedule, funding, and Project Officer/Project Engineer team resources are available and adequate for timely reporting;

(4) monitor and track the status of all project reports across the team, and maintain a current team status roll-up that is available to competency leadership and the TECT;

(5) notify the TECT and competency leadership when report delivery deadlines will not be met, and participate in the negotiation of a revised TRDP with the local program sponsor and competency leadership; and

(6) in the absence of a local program sponsor, address the requirements imposed by references (a) and (b) during contract negotiations to ensure that DoD STIP mandates are met and included in contract data requirements lists. T&E data produced as a result of foreign military sales programs shall be submitted to the Defense Technical Information Center (DTIC) within disclosure limitations imposed by the Navy.

b. Local Program Sponsor. The local program sponsor is a person
designated by the program manager. The incumbent is usually the Assistant Program Manager for Test and Evaluation (APMT&E). When a program does not have an APMT&E, these duties can be performed by an IPT leader, an EDT leader, or other designated individual. The local program sponsor will:

(1) ensure that the project sponsor understands the importance of reporting test results with respect to stewardship of T&E processes (including lessons learned);

(2) ensure that reporting requirements reflect project tasking;

(3) coordinate with competency leadership and the project LTE to ensure that associated schedule, funding, and Project Officer/Project Engineer team resources are available and adequate for timely reporting;

(4) review the TRDP for appropriate report format and delivery timing; and

(5) address the requirements imposed by references (a) and (b) during contract negotiations to ensure that DoD STIP mandates are met and included in contract data requirements
lists. T&E data produced as a result of foreign military sales programs shall be submitted to DTIC within disclosure limitations imposed by the Navy.

c. Project Officer/Project Engineer. The Project Officer/Project Engineer will:

(1) assemble a TRDP that adequately reflects the sponsor's and ISEET's reporting requirements per reference (c);

(2) capture the progress of the test project via Daily Flight Reports;

(3) ensure that mission relevance and supporting data are accurately presented in an unbiased manner;

(4) ensure the engineering veracity of the test data, coordinating with other engineering competencies as appropriate;

(5) conduct appropriate engineering analysis and evaluation of test data; such analysis and evaluation is a team function performed by test team members from various T&E and engineering competencies;

(6) prepare draft T&E reports;

(7) review the Capabilities Development Document (CDD), Capabilities Production
Document (CPD), and Test and Evaluation Master Plan (TEMP), if applicable, to ensure that the test report presents the information necessary to support an acquisition milestone, a decision meeting, and/or preparation for Operational Test of Measures of Effectiveness, Measures of Suitability, Critical Technical Parameters, and Key Performance Parameters; and

(8) ensure that all documentation created adheres to the appropriate security classification guidance and contains the appropriate distribution statements.

d. NAVAIR T&E Group Branch Heads. The NAVAIR T&E Group Branch Heads will:

(1) work with test team leadership to ensure the team is adequately staffed to facilitate accurate and timely publication;

(2) ensure that employees under their supervision receive training on report writing and on their responsibilities in safeguarding intellectual property in accordance with reference (d);

(3) encourage senior/experienced officers and engineers to actively
coach, mentor, and advise personnel assigned to IPTs, EDTs, and other test teams; and

(4) assist test teams in the preparation of reports by acting as technical discipline specialists during the review and approval process.

e. NAVAIR T&E Group Division Heads. The NAVAIR T&E Group Division Heads will ensure adequate technical review of test reports and will have review and approval authority as defined in enclosure (1) and the project TRDP.

f. TECT. The TECT will:

(1) have review and approval authority for test reports and project TRDPs, as defined in enclosure (1);

(2) maintain the ISEET Test Reporting Handbook, reference (d); and

(3) coordinate with the NAVAIR STIP focal point to ensure that a report archiving process exists.

g. Test Squadron Commander. The Squadron Commander will have approval authority for test reports involving aircraft or resources under squadron purview, as defined in enclosure (1). This authority may be delegated in writing to designated individuals.
h. Test Wing Commanders. The Test Wing Commanders will have approval authority for DT/OT Transition Reports, as defined in enclosure (1). This authority may be delegated in writing to designated individuals.

i. United States Naval Test Pilot School (USNTPS). USNTPS will include applicable report-writing training, using reference (d), as part of its training program to provide guidance to officers and engineers responsible for report generation.

j. Operational Security Coordinator. When applicable, the coordinator acts as a member of the team and assists in the review of any classified annex or in the assignment of overall classification, if other than unclassified.

7. Test Report Documentation. Reports shall include results and technical information in sufficient detail to allow verification of accuracy and of the quantitative rationale for any recommended programmatic decisions. All efforts will meet the minimum Development Test and Evaluation (DT&E) reporting requirements
outlined in reference (e). All customer reporting requirements shall be established during the project-planning phase and shall be determined in concert with competency requirements for full documentation and archiving of the effort. These details will be negotiated between the test team and the sponsor, and will be included in the TRDP portion of the flight test plan, as outlined in reference (c). The assigned ISEET TECT will review the adequacy of each TRDP (i.e., format and timing) as part of the test plan review and approval process. Test results in the form of drawings, photographs, video, software, and/or hardware components should also be considered during preparation of a TRDP. All T&E reports shall be prepared in accordance with the formatting and marking requirements of reference (e), enclosure (5), and reference (f). Reports shall be forwarded to DTIC for DoD archiving, as stipulated in references (a) and (b).
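The TRDP content negotiated above amounts to a small structured record per deliverable. The following Python sketch is purely illustrative and not part of this instruction; the class name and fields are assumptions chosen to mirror the details paragraph 7 says are negotiated (report type, governing format, delivery timing, and associated deliverables).

```python
# Illustrative sketch only; not part of the instruction. Models one
# hypothetical TRDP line item as a test team's tracking tool might.
from dataclasses import dataclass, field


@dataclass
class TRDPItem:
    report_type: str          # e.g., "Report of Test Results"
    format_ref: str           # governing format guidance
    due: str                  # negotiated delivery timing
    deliverables: list[str] = field(default_factory=list)  # drawings, video, etc.


# Example entry negotiated between the test team and the sponsor.
item = TRDPItem(
    report_type="Report of Test Results",
    format_ref="ISEET Test Reporting Handbook, reference (d)",
    due="90 days after last test event",
    deliverables=["photographs", "telemetry data"],
)
print(f"{item.report_type} due {item.due}")
```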
8. Test Report Types. Several test report types, both formal and informal, are available to support the needs of the test team and its customers. Guidelines for standardizing the formats associated with each formal report type are presented in reference (d).

a. Formal Reports. In general, a formal report is required to officially document NAVAIR T&E Group test results, conclusions, and recommendations. Formal reports require TECT approval unless otherwise delegated in the Project Planning Memorandum and/or the TRDP of the project test plan.

(1) Deficiency Report (DR). The DR is used to document system-related technical issues identified during developmental testing, concept demonstration testing, experimentation, and/or Concept of Operations (CONOPS) development. The DR is specifically designed to provide program management with sufficient understanding of technical issues to facilitate an appropriate corrective action plan. As such, the DR has a specific, directed format that must be followed at all times. The DR is governed and managed by
NAWCAD/WD. The NAVAIR Technical Assurance Board (NTAB) also governs DRs for programs under NTAB monitoring, in accordance with reference (e), enclosure (5).

(a) Deficiency Classification. A Part I, Part II, or Part III classification is assigned to each deficiency to rate the severity of its mission impact. Guidelines for determining classification and the timing of correction recommendations are provided herein; these guidelines will aid in narrowing the subjectivity of the process. The term "system," as used in the definitions, includes: weapon systems (e.g., manned and unmanned air vehicles, weapons, and associated mission systems); major aviation training devices (e.g., operational flight trainers; integrated weapon, sensor, system, and tactics trainers; etc.); air traffic control and landing systems; launch and recovery systems; support equipment; and software.

1. A Part I classification indicates a severe deficiency, the correction of which is necessary
because it adversely affects one or more of the following:

a. the safety of the aircrew, operator, or maintainer, or the airworthiness of the system;

b. accomplishment of the primary or alternate mission(s), as well as required interaction with other mission-critical systems (including classified information processing systems), and/or the capability of the system to perform its intended function; or

c. the effectiveness or readiness of the aircrew, operator, or support personnel.

Note: Airworthiness is defined as the property of a particular air system configuration to safely attain, sustain, and terminate flight in accordance with the approved usage and limits. Remote possibilities or unlikely sequences of events shall not be used as a basis for Part I classification.

2. A Part II classification indicates a deficiency of lesser severity than a Part I, one that does not degrade the system to a state precluding accomplishment of primary or alternate mission(s) or training objectives, but whose
correction will result in a significant improvement in the effectiveness of the system, aircrew, or support personnel.

3. A Part III classification indicates a deficiency that is annoying and/or requires minimal compensation to attain adequate performance.

(b) Timing of Correction. Correction timing statements may be tailored, with TECT concurrence, to better serve the needs of a program, in particular those associated with experimentation and concept demonstration projects. Recommended timing for correction of deficiencies is as follows:

1. Part I deficiencies may employ "stars" (asterisks (*)) to designate a specific recommended time for accomplishment of corrective action. While stars will, by association, reflect the relative seriousness of the deficiency, the focus of stars should remain on the timing recommendation.

a. Double Star Part I (Part **I). Safety-of-flight deficiencies, or deficiencies that would preclude or degrade
accomplishment of primary or alternate mission(s) to such a degree as to prohibit further flights or preclude further testing of the system, shall be assigned a Part **I, with recommended correction prior to a specified test phase (e.g., Sea Trials, OA, Initial Operational T&E (IOT&E), etc.). For concept demonstration or experimentation projects, this could include recommended correction prior to a specific program acquisition phase (e.g., Milestone B) or a test event within a phase.

b. Single Star Part I (Part *I). Deficiencies that should be corrected before operational deployment or another specific phase of mission employment of the system, or for which excessive operator compensation is required to accomplish the primary or alternate mission(s), shall be assigned a Part *I. For concept demonstration or experimentation projects, excessive operator compensation or extraordinary operating limitations may also highlight a low level of system technical maturity that will adversely affect acquisition
test phases and schedules.

c. Part I (no stars). A deficiency that should be corrected when possible but that may not interfere with deployment of operational squadrons or mission employment by other squadrons, such as training squadrons.

2. Part II deficiencies should be corrected when practicable.

3. Part III deficiencies should be avoided in future designs.
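The Part I/II/III classifications and star timing conventions above form a small controlled vocabulary. The following Python sketch is purely illustrative and not part of this instruction; the enum and field names are hypothetical, intended only to show how a tracking tool might encode the scheme.

```python
# Illustrative sketch only; not part of the instruction. Mirrors the
# Part I/II/III severity classes and the "star" timing convention above.
from dataclasses import dataclass
from enum import Enum


class Part(Enum):
    I = 1     # severe: safety, airworthiness, or mission accomplishment affected
    II = 2    # lesser severity; correction yields significant improvement
    III = 3   # annoying; minimal compensation needed for adequate performance


class Stars(Enum):
    DOUBLE = 2  # Part **I: correct prior to a specified test phase (e.g., OA)
    SINGLE = 1  # Part *I: correct before operational deployment
    NONE = 0    # Part I (no stars): correct when possible


@dataclass
class Deficiency:
    description: str
    part: Part
    stars: Stars = Stars.NONE  # stars apply to Part I deficiencies only

    def label(self) -> str:
        """Render the conventional label, e.g. 'Part **I' or 'Part III'."""
        prefix = "*" * (self.stars.value if self.part is Part.I else 0)
        return f"Part {prefix}{'I' * self.part.value}"


# Example: a hypothetical safety-of-flight deficiency recommended for
# correction prior to a specified test phase.
d = Deficiency("TM dropout during catapult launch", Part.I, Stars.DOUBLE)
print(d.label())  # -> Part **I
```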
(2) Report of Test Results (RTR). The RTR is the most common of all report types by which test results, conclusions, and recommendations are officially documented for the customer. The RTR's primary purpose is to provide sufficient technical information and recommendations to form an authoritative basis upon which programmatic decisions may be made and milestone reviews can be conducted.

(3) Interim Summary Report (ISR). The ISR is a highly abridged hybrid report using RTR and memorandum format elements. It is used to rapidly communicate conclusions and recommendations about a system under test, generally to provide a basis for an acquisition milestone decision (e.g., Low-Rate Initial Production). The ISR consists of a body with pertinent information, particularly conclusions and recommendations, and an appendix containing significant DRs and, if applicable, TEMP criteria evaluation status. A more extensive follow-up report, such as an RTR, is required to fully document test results.

(4) DT/OT Transition Report. The DT/OT Transition Report provides program management with a risk assessment for advancing the system under test into a dedicated OT phase (e.g., OA, IOT&E, etc.). The report specifically addresses the system's demonstrated performance against TEMP Critical Operational Issues and associated performance requirements. The DT/OT format can be tailored to meet Milestone Decision Reporting obligations/requirements.

(5) Test Engineering Data Report (TEDR). The TEDR is used to document and release test engineering data. Technical discussion, conclusions, and recommendations
within the TEDR are limited to aspects of the data itself or of the data collection methods; the TEDR contains no conclusions or recommendations about the system under test. System suitability and other system-related conclusions and recommendations are covered by other test report types, such as the RTR.

(6) Technical Information Memorandum (TIM). The TIM is used to document results from internal scientific, engineering, and technical investigations; to publish symposia and other types of professional papers; and to forward journal articles to DTIC. It is designed to provide test-related technical feedback to science and technology development programs, such as Joint Capability Technology Demonstrations. The TIM does not provide conclusions or recommendations of a programmatic nature, such as suitability for acquisition.

(7) Quarterly Status Report (QSR). The QSR provides a program-level executive summary for all NTAB-monitored programs. For information on preparing
and distributing a QSR, consult reference (g).

b. Informal Reports. Informal reports document day-to-day test progress and provide a repository of unofficial test results. They are not intended for making programmatic decisions or for supporting formal milestone reviews.

(1) Daily Flight Report. The "Daily" is an internal test team report used to record quantitative test data and analysis and to document qualitative observations and comments from a single test event (flight, ground, or laboratory). Typically small and focused in scope, Daily Reports cumulatively provide the basis for a subsequent formal report.

(2) Periodic Status Brief. A briefing that provides the customer with periodic assessments of the test program. Unpublished conclusions and recommendations will not be presented. Status Briefs can be communicated in any format agreed to between the test team and the customer. Information copies should be forwarded to the TECT.
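One recurring distinction in section 8 is whether a report type may carry programmatic conclusions and recommendations about the system under test. The following Python sketch is an illustrative paraphrase only, not part of this instruction; the dictionary and its one-line purpose strings are assumptions summarizing paragraphs 8a(2) through 8a(6).

```python
# Illustrative sketch only; not part of the instruction. Encodes whether a
# formal report type may draw conclusions about the system under test,
# paraphrased from section 8a of the instruction above.
FORMAL_REPORTS = {
    "RTR":  {"purpose": "officially document results for programmatic decisions",
             "system_conclusions": True},
    "ISR":  {"purpose": "rapid conclusions/recommendations before a milestone",
             "system_conclusions": True},
    "TEDR": {"purpose": "release test engineering data",
             "system_conclusions": False},  # data/collection methods only
    "TIM":  {"purpose": "technical feedback to S&T programs",
             "system_conclusions": False},  # nothing programmatic
}

for name, info in FORMAL_REPORTS.items():
    tag = "may" if info["system_conclusions"] else "may not"
    print(f"{name}: {info['purpose']}; {tag} draw conclusions about the system")
```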
9. Test Report Preparation. Drafting of test reports is performed primarily by the Project Officer and Project Engineer, incorporating inputs from applicable test team members and outside technical experts. Detailed guidance for preparing reports is included in the ISEET Test Reporting Handbook, reference (d).

10. Test Report Review and Approval. Formal test reports are the mechanism for official communication of test results, conclusions, and recommendations to the program sponsor. Critical acquisition milestone decisions are made based on these reports. Therefore, a thorough technical review is crucial to ensuring that reports are technically accurate, that mission relation is adequately addressed, and that conclusions and recommendations are supported by data. Review by experts from outside the test team is important to ensuring that all aspects, including technical accuracy, have been considered. Prior to submitting a report for formal review, it is advisable to have some level of internal team
review. The scope of this review depends on the size and complexity of the project and the type of report. The internal review team typically consists of test team members other than the primary drafter(s) and should include others outside the team, such as technical specialists, Branch Heads, Senior T&E Engineers, and others with appropriate technical expertise. Internal review requirements are determined by the test team; larger teams may identify them in a CONOPS document. Formal review requirements for the various types of reports are delineated in enclosure (1). The formal review is intended to ensure the technical accuracy of results and that conclusions and recommendations adequately address program requirements, including accurate mission relation. Approval authority for all reports covered by this instruction is vested in the Director, ISEET (AIR-5.1) and the personnel outlined in enclosure (1). Delegation to non-ISEET personnel must be addressed via a separate delegation letter from the
Director, ISEET. Further delegation within ISEET can be made by the Director, ISEET or by those personnel listed in enclosure (1). Delegation of both review and approval shall be a function of the scope and complexity of the project and of the technical and program risks associated with the information contained in the report. The specific review and approval chain for each report will be negotiated as part of TRDP preparation. Personnel must ensure that reports reviewed and approved under their purview are objective and balanced and present technically sound conclusions and recommendations.

11. Security Classification. Test reports will, by necessity, be prepared at a classification level appropriate to their content. All test reports shall be archived in accordance with program security regulations. At a minimum, all reports shall be marked "FOR OFFICIAL USE ONLY" with an associated distribution limitation statement.

12. Test Report Distribution. Appropriate library facilities,
document control centers, or data distribution systems will be utilized to review, publish, and archive all scientific and technical reports distributed by the command, in accordance with reference (h), to ensure policy compliance and quality standards. Whenever possible, all ISEET-generated reports will be archived with the Technical Publication Library.

13. Instruction Review. The process owner, AIR-5.1, shall review this instruction at least annually and prepare revisions as necessary. Revisions to the enclosure of this instruction may be approved on an "as needed" basis and shall be reported in the annual review of this instruction.

DAVID ARCHITZEL

Distribution: NAVAIR Directives can now be found on https://homepages.navair.navy.mil/directives/ or locally on https://mynavair.navair.navy.mil

REPORT REVIEW AND APPROVAL MATRIX

Formal Reports

Deficiency Report
  Drafted by: Test Team (1)
  Reviewed by: Division Head(s) (2); NTAB Rep (3); Chief Test Engineer (CTE) (2); Chief Test Pilot (CTP) (2); Squadron Commander for NTAB DRs; Technical Specialist(s) (4)
  Approved by: Test Wing Commander for NTAB DRs (2); Squadron Commander for non-NTAB DRs (2)

Report of Test Results
  Drafted by: Test Team (1)
  Reviewed by: Division Head(s) (2); CTP; CTE
  Approved by: CTE (2); exception is Squadron Commander (2) if a previously unreleased Part I deficiency is reported

Interim Summary Report
  Drafted by: Test Team (1)
  Reviewed by: Division Head(s) (2); CTP; CTE
  Approved by: Squadron Commander (2) or CTE

DT/OT Transition Report
  Drafted by: Test Team (1)
  Reviewed by: Division Head(s) (2); CTE; CTP; Technical Specialist(s) (4)
  Approved by: Test Wing Commander (5)

Test Engineering Data Report
  Drafted by: Test Team (1)
  Reviewed by: Division Head(s) (2)
  Approved by: Division Head (2) if one Division, otherwise CTE (2)

Technical Information Memorandum
  Drafted by: Test Team (1)
  Reviewed by: Technical Specialist(s) (4)
  Approved by: Division Head (2)

NTAB Quarterly Status Report
  Drafted by: Test Team
  Reviewed by: CTE; CTP
  Approved by: Squadron Commander or as delegated

Informal Reports

Daily Flight Report
  Drafted by: Test Team
  Reviewed by: As determined by team Concept of Operations
  Approved by: N/A

Periodic Status Brief
  Drafted by: Test Team
  Reviewed by: As determined by team Concept of Operations
  Approved by: N/A

NOTES:
(1) Internal test team review is part of the drafting process.
(2) Can be delegated in writing.
(3) As defined in NAVAIRINST 3960.5A, reference (g); applicable to NTAB programs only.
(4) As required by Competency Leadership or the TECT; should be identified in the test plan.
(5) Usually delegated to the Squadron Commander.

Enclosure (1)
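Because the enclosure (1) matrix is effectively a lookup table, a team tracking report routing could encode it directly. The following Python sketch is illustrative only and abridged to three report types; the function and key names are hypothetical, and enclosure (1) remains the authoritative source.

```python
# Illustrative sketch only; not part of the instruction. Restates part of
# the enclosure (1) matrix as a lookup table; enclosure (1) governs.
APPROVAL_MATRIX = {
    "Report of Test Results": {
        "drafted_by": "Test Team",
        "reviewed_by": ["Division Head(s)", "CTP", "CTE"],
        "approved_by": "CTE (Squadron Commander if a previously unreleased "
                       "Part I deficiency is reported)",
    },
    "Interim Summary Report": {
        "drafted_by": "Test Team",
        "reviewed_by": ["Division Head(s)", "CTP", "CTE"],
        "approved_by": "Squadron Commander or CTE",
    },
    "DT/OT Transition Report": {
        "drafted_by": "Test Team",
        "reviewed_by": ["Division Head(s)", "CTE", "CTP", "Technical Specialist(s)"],
        "approved_by": "Test Wing Commander (usually delegated to Squadron Commander)",
    },
}


def routing_chain(report_type: str) -> str:
    """Return a one-line summary of the review/approval chain for a report."""
    row = APPROVAL_MATRIX[report_type]
    reviewers = ", ".join(row["reviewed_by"])
    return (f"{report_type}: drafted by {row['drafted_by']}; "
            f"reviewed by {reviewers}; approved by {row['approved_by']}")


print(routing_chain("DT/OT Transition Report"))
```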