DAG Chapter 1 - PM Technical Management Roles



Table columns: PM Roles | PM Actions | PM’s Necessary Product or Outcome | Activities | Amplifying Steps and Guidance

TM 1 Engineering Management: Ability to manage a systems engineering process; assess the government/contractor’s systems engineering approach, activities, and products.

1.1 Technical Planning

1.1.1 Establish, update, and critically evaluate a plan for the technical management of an acquisition activity.
Product/Outcome: Systems Engineering Plan (SEP), updated at each major event (milestone)
1.1.1.1 Establish and/or review the current SEP IAW current systems engineering planning guidance.
Steps:
- Establish a common configuration management approach for the SEP, IMS, and related documents.
- Identify any impact to cost, schedule, and/or performance as a result of these changes.
- Evaluate the systems engineering documents for currency, confirming at each milestone that they reflect changing program conditions.

1.1.2 Ensure engineering processes are coordinated and applied properly throughout a system's life cycle, consistent with the Systems Engineering Plan.
Product/Outcome: Integrated Master Plan (IMP) and Integrated Master Schedule (IMS), updated at each major event (milestone)
1.1.2.1 Align and resource program technical plans (TEMP, LCSP, etc.) with the SEP via the IMP and IMS.
Steps:
1. Ensure the IMP is reviewed periodically and updated at least at each milestone.
2. Ensure the IMS is reviewed monthly and coordinated within the government enterprise and with all involved contractors.
3. Ensure that program planning (and resourcing) reflects the RAM approach.
4. Ensure the RAM efforts across the program support cost and supportability projections for lifetime support.
References:
DoDI 5000.02, Enclosure 3
DoD RAM-C Report Manual, June 2009

1.1.3 Apply software acquisition management principles (historic or emerging) needed to make sound decisions for planning and executing an acquisition program.
Product/Outcome: Software Development Plan
1.1.3.1 Assess program strategy/plan options that leverage agile or test-based acquisition approaches.
1.1.3.2 Manage software development and sustainment, including the evolution of an incrementally fielded capability [may be software-intensive], using a test-driven, prototype-based, iterative build-test-fix-test-deploy capability development process.
Steps:
- Review the Interim DoDI 5000.02; four of the six program models in the interim 5000.02 have significant software activities, and the remaining two models likely include software.
- Plan and resource the transition and interfaces of related systems.
- Consider the development and sustainment of software at every decision point in the acquisition life cycle, since it can be a major portion of the total system life-cycle cost.
- Consider a phased software development approach using testable software builds and/or fieldable software increments to enable developers to deliver capability in a series of manageable, intermediate products, gain user acceptance and feedback for the next build or increment, and reduce the overall level of risk.
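The build-test-fix-test-deploy process named in 1.1.3.2 can be pictured as a simple loop over an increment's test results. A minimal illustrative sketch only (the test harness, data, and "fix" action are hypothetical stand-ins for real development work):

```python
# Illustrative sketch of a build-test-fix-test loop for an incrementally
# fielded software capability (1.1.3.2). All names and data are hypothetical.

def run_tests(build):
    """Return the names of failing tests for this build (stand-in harness)."""
    return [name for name, passed in build["tests"].items() if not passed]

def build_test_fix_cycle(build, max_fix_rounds=3):
    """Iterate fix rounds until the build's tests pass or the budget is spent."""
    for rounds_used in range(max_fix_rounds):
        failures = run_tests(build)
        if not failures:
            # All tests pass: the increment is ready to field.
            return {"fielded": True, "rounds": rounds_used}
        for name in failures:
            build["tests"][name] = True  # stand-in for real remediation work
    return {"fielded": not run_tests(build), "rounds": max_fix_rounds}

increment = {"tests": {"msg_format": True, "failover": False}}
result = build_test_fix_cycle(increment)
```

The point of the sketch is that each increment carries its own pass/fail evidence, so "ready to field" is a testable gate rather than a judgment call.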
References:
DoDI 5000.02, Defense Acquisition Program Models
DoDI 5000.02, Enclosure 3

1.1.4 Ensure cybersecurity processes are coordinated and applied properly throughout a system's life cycle.
Product/Outcome: Cybersecurity Strategy
1.1.4.1 Develop and document the Risk Management Framework (RMF).
1.1.4.2 Align the RMF with key technical planning documents (program requirements documents, systems engineering, test and evaluation, and product support plans).
Steps:
- Develop and implement the Risk Management Framework.
- Identify and document cybersecurity risks.
References:
DoDI 5000.02, Enclosure 11
CJCSI 6510.01F, "Information Assurance (IA) and Support to Computer Network Defense (CND)," February 9, 2011
DoDI 8500.01, Cybersecurity
DoDI 8500.02, Information Assurance (IA) Implementation
DoDI 8510.01, Risk Management Framework for DoD IT
CJCSI 6212.01F, "Net Ready Key Performance Parameter (NR KPP)," March 21, 2012
Section 811 of P.L. 106-398

1.1.5 Manage re-use of legacy hardware and/or software.
Product/Outcome: Reuse Plan
1.1.5.1 Evaluate the identification, assessment, and handling of the risks associated with a reusability plan (H/W and/or S/W).
1.1.5.2 Develop the S/W and/or H/W reuse plan while handling inherent risk (e.g., obsolescence and required missionization).
Steps:
- Identify resources for a S/W reuse repository.
- Ensure integration with the program IMS.

1.2 Requirements Decomposition

1.2.1 Ensure a requirements management process provides traceability back to user-defined capabilities.
Product/Outcome: Requirements Traceability Matrix (RTM)
1.2.1.1 Evaluate a Requirements Traceability Matrix (RTM) in terms of its ability to continually capture all of a program’s approved requirements.
Steps:
- Ensure the RTM addresses requirements decomposition, derivation, and allocation history, records the rationale for all entries and changes, and provides traceability from the lowest-level component all the way back to the user-defined capabilities (ICD, CDD, CPD).
- Develop the RTM to manage the design requirement process and keep it aligned with changes resulting from the JCIDS and CSB, and vice versa.
- Ensure cost and schedule parameters are adjusted as requirements change.
References:
Systems Engineering, DAG Chapter 4

1.2.2 Describe the need to convert functional and behavioral expectations to technical requirements.
Product/Outcome: Technical Requirements
1.2.2.1 Evaluate technical requirements for affordability, achievability, and traceability to stakeholders’ requirements, expectations, and perceived constraints.
1.2.2.2 Validate and baseline the technical requirements.
1.2.2.3 Define the technical problem scope and the related design and product constraints.
Steps:
- Understand and reference the Concept of Operations (CONOPS).
- Ensure connections are made to user requirements (e.g., JCIDS) and the mission operation summary/mission profile.

1.2.3 Ensure the design incorporates reliability, availability, and maintainability requirements across a system's life cycle.
Product/Outcome: Reliability Growth Curves; RAM-C Report
1.2.3.1 Develop and document reliability growth expectations.
1.2.3.2 Assess reliability growth progress versus the RAM-C expectations.
Steps:
- Ensure that program planning (and resourcing) reflects the RAM approach.
- Use RAM across the program to support cost and supportability projections for lifetime support.
References:
DoDI 5000.02, Enclosure 3
DoD RAM-C Report Manual, June 2009

1.2.4 Ensure the open systems architecture design is compatible with user performance, interoperability, and product support requirements and desired capabilities.
Product/Outcome: Open Systems Architecture Certification; SEP; LCSP
1.2.4.1 Define the system design to achieve interoperability with existing and planned DoD/Service/Agency systems.
1.2.4.2 Incorporate a modular open systems approach (MOSA) in order to optimize the design for effective and efficient product support.
Steps:
1. Survey existing and planned modular and open systems and lessons learned.
2. Determine which of those systems are candidates for the system.
3. Consider the life-cycle support costs and benefits of incorporating the existing systems.
4. Coordinate design decisions with stakeholders.
5. Document decisions in the LCSP in coordination with the SEP.
References:
DoDI 5000.02, Enclosure 3
"DoD Open System Architecture Contract Guidebook for Program Managers," December 15, 2011

1.2.5 Ensure the information technology design requirement considers interoperability as well as trusted systems and networks.
Product/Outcome: IT Interoperability Plan
1.2.5.1 Evaluate (and document) the degree to which information technology (IT) design requirements consider both interoperability and trusted systems/networks.
Steps:
- Establish a working group of key IT and NSS stakeholders.
- Develop interoperability requirements.
- Develop requirements for trusted systems and networks.
- Iterate (with stakeholders) development of the IT interoperability plan.
References:
DoDI 5000.02, Enclosure 11
DoDD 4630.05, "Interoperability and Supportability of Information Technology (IT) and National Security Systems (NSS)," May 5, 2004
DoDI 4630.8, "Procedures for Interoperability and Supportability of Information Technology (IT) and National Security Systems (NSS)," June 30, 2004
DoDI 8320.02, "Sharing Data, Information, and Information Technology (IT) Services in the Department of Defense," August 5, 2013
DoDI 8410.03, "Network Management (NM)," August 29, 2012
Product/Outcome: Criticality Analysis
1.2.5.2 Conduct a criticality analysis to identify mission-critical functions and critical components related to trusted systems and networks.
Steps:
- Establish the objectives, scope, and approach of the analysis effort.
- Assess relevant Trusted Systems and Networks plans.
- Assess relevant implementation activities in Program Protection Plans, cybersecurity plans, and related documentation.
References:
DoDI 5000.02, Enclosure 11
DoDI 5200.44, "Protection of Mission Critical Functions to Achieve Trusted Systems and Networks (TSN)," November 5, 2012
Product/Outcome: Supplier Threat Analysis
1.2.5.3 Conduct a threat analysis of suppliers of critical components (Supplier All-Source Threat Analysis).
Steps:
- Determine all available suppliers of critical components.
- Integrate results of the criticality analysis with relevant suppliers as appropriate.
- Incorporate the above results into the development of the Supplier Threat Analysis.
References:
DoDI 5000.02, Enclosure 11
DoDI 5200.44

1.3 Technical Assessment

1.3.1 Manage the process to document, coordinate, and substantiate the transition of system elements to the next level in the SE process.
Product/Outcome: IMS
1.3.1.1 Evaluate the implementation of an event-based review process that transitions a product (HW and/or SW) from concept through initial operating capability (IOC).
Steps:
1. Establish a program roadmap consisting of key program events (PE), especially technical reviews (e.g., SRR, SFR, PDR, CDR, TRR, PRR, etc.).
2. Ensure alignment of all technical planning documents (e.g., SEP, TEMP, LCSP) with program planning documents (e.g., PMP, risk management plan).
3. Clarify entry and exit criteria for all program events (PE) in terms of significant accomplishments (SA) and accomplishment criteria (AC), and as a result generate an IMP.
4. Ensure all relevant scope for achieving the PEs, SAs, and ACs is traced to the program WBS.
5. Based on the results of steps 1-4 above, create the program Integrated Master Plan (IMP).
6. Transfer IMP information into the program Integrated Master Schedule (IMS) and align ACs with the appropriate tasks.

1.3.2 Ensure a process for monitoring and selecting a design solution that translates the outputs of the Requirements Development and Logical Analysis processes into alternative design solutions and selects a final design solution.
Product/Outcome: Successful accomplishment of SRR/SFR
1.3.2.1 Chair, or participate substantially in, program technical reviews through and including the System Functional Review (SFR).
Steps:
- When chairing the review, ensure all issues related to the SFR are successfully addressed and recorded with appropriate actions assigned.
- When “substantially participating” in the review, conduct actions that directly influence maturation of a design from system level through and including CDR-level clarity.
- Ensure the final design solution is producible.
Guidance:
The reviews relevant to this competency include the ASR, SRR, and SFR.

1.3.3 Ensure the process for monitoring the implementation effort actually yields the lowest-level system elements in the system hierarchy.
Product/Outcome: Successful accomplishment of CDR (exit criteria)
1.3.3.1 Chair, or participate substantially in, program technical reviews through and including the Critical Design Review (CDR).
1.3.3.2 Evaluate the integration activities of lower-level system elements into a higher-level system element in the physical and logical architecture.
Steps:
- When chairing the review, ensure all issues related to the CDR are successfully addressed and recorded with appropriate actions assigned.
- When “substantially participating” in the review, conduct actions that directly influence maturation of a design from system level through and including CDR-level clarity.
- Ensure producibility is a critical consideration at the review, since it is part of the exit criteria for the CDR that sets the stage for production.
Guidance:
The reviews relevant to this competency include the SRR, SFR, and in particular the PDR and CDR.

1.3.4 Identify, explain, and employ measures to assess the technical maturity of a design solution relative to operational performance requirements.
Product/Outcome: Documented performance assessment and forecasting; entry and exit criteria included in the SEP
1.3.4.1 Evaluate the alignment of technical performance measures to ensure delivery of the desired operational performance requirements.
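Tracking a technical performance measure (1.3.4.1 and, below, 1.3.5) amounts to comparing a measured value against its planned profile and its threshold, and watching the remaining margin. A minimal illustrative sketch (the TPM, values, and units are hypothetical):

```python
# Illustrative TPM tracking sketch: classify one TPM observation against
# its planned value and threshold, and flag margin erosion.

def tpm_status(measured, planned, threshold, higher_is_better=False):
    """Classify a TPM observation. For a weight-like TPM, lower is better."""
    if higher_is_better:
        margin = measured - threshold   # headroom above the floor
        on_track = measured >= planned
    else:
        margin = threshold - measured   # headroom below the ceiling
        on_track = measured <= planned
    return {"margin": margin, "on_track": on_track, "breach": margin < 0}

# Hypothetical empty-weight TPM: planned 1180 kg at this review,
# threshold (requirement ceiling) 1250 kg, measured 1210 kg.
status = tpm_status(measured=1210, planned=1180, threshold=1250)
```

In this example the TPM still has 40 kg of margin against the threshold but is off its planned profile, which is exactly the "trended and forecasted" signal 1.3.5 asks the PM to act on before a breach occurs.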
1.3.4.2 Develop and track entry and exit criteria to be used to determine readiness for the program to proceed.
Steps:
- Analyze all KPPs and KSAs in terms of key design drivers; compare KPP/KSA analysis results with relevant information from technical planning documents such as the SEP, TEMP, PPP, and LCSP.
- Develop technical performance measures (TPMs) and align them with the WBS.
- Establish key technical decision points (such as design reviews) and determine associated significant accomplishments and accomplishment criteria (per an IMP).
- Integrate decision criteria into the SEP, as appropriate, and publish them formally as entry and exit criteria for each review.
Guidance:
"Documented performance assessment and forecasting" can take place in any number of venues, to include PMRs, TIMs, and design reviews.
The translation of desired performance/operating capabilities (KPP, COI, KSA) into system development terms is typically accomplished via TPMs.

1.3.5 Ensure technical measures are continually assessed (tracked, trended, and forecasted) to support program decisions.
Product/Outcome: Technical performance measures (TPMs) and technical measures
1.3.5.1 Identify and implement hardware and software metrics appropriate for each level of a program.
1.3.5.2 Identify and implement technical support measures to be incorporated as part of the QASP.
1.3.5.3 Identify and track technical measures associated with the handling of risks and opportunities.
Steps:
- Establish technical performance measures (TPMs) and/or functionality points/indicators as appropriate to ensure insight into KPPs, KSAs, or other desired performance.
- Align TPMs and/or functionality with associated design/functional requirements and WBS elements.
- Establish key indicators of technical process maturity/execution.
- Integrate TPMs and technical process indicators into risk handling plans.
- Input TPMs, risk handling tasks/options, and technical process indicators into IMS tasks and associated closure criteria.
Guidance:
"Level of a program" refers to how the system is decomposed or managed, from the component level up to the system level.
"TPMs" refer to technical measures directly linked to attaining the program KPPs and KSAs, e.g., system weight, speed, S/W reliability, etc.
Technical measures are linked to the technical tracking of the program, e.g., status of process performance, status of requirements, status of interfaces, status of technical documentation, etc.

1.3.6 Assess whether technical measures are causing the correct (expected) organizational and contractual behavior.
1.3.6.1 Evaluate technical measures in the context of program cost and schedule impacts.
1.3.6.2 Evaluate technical measures in the context of contract incentives and contractor performance assessments.
Steps:
- Establish technical performance measures (TPMs) and/or functionality points/indicators as appropriate to ensure insight into KPPs, KSAs, or other desired performance.
- Align TPMs and/or functionality with associated design/functional requirements and WBS elements.
- Establish key indicators of technical process maturity/execution.
- Align TPMs and desired functionality with contractor incentives, as appropriate.
- Integrate TPMs and technical process indicators into risk handling plans.
- Input TPMs, risk handling tasks/options, and technical process indicators into IMS tasks and associated closure criteria.

1.3.7 Plan for and/or evaluate a system's readiness to operate in the intended environment (e.g., information assurance, airworthiness, seaworthiness, net-ready).
Product/Outcome: Letter of certification
1.3.7.1 Develop and implement plans within appropriate technical planning documents to ensure system certification.
1.3.7.2 Assess methods, tools, and procedures, including development of artifacts, to ensure the system is certified/approved for use in its intended operating environment.
Steps:
1. Review certification sources and associated requirements together with the relevant certification authority to determine unique/tailored expectations.
2. Document tailored (unique to program and conditions) certification requirements for the desired capability/characteristic.
3. Confirm with the certification authority that the tailored requirements are adequate for certification.
4. Establish a working group including all stakeholders that influence, or are influenced by, the certification requirements.
5. Integrate certification requirements into program requirements and planning documentation, to include technical requirements documents, technical planning, IMS, IMP, test (DT/OT) planning, and the risk/opportunity register.
References:
DAG Chapter 4

1.3.8 Conduct Post Implementation Review (PIR).
Product/Outcome: PIR
1.3.8.1 Plan and conduct a PIR in coordination with the Functional Sponsor.
Steps:
- Ensure the PIR reports the degree to which doctrine, organization, training, materiel, leadership and education, personnel, facilities, and policy changes have achieved the established measures of effectiveness for the desired capability.
- Evaluate systems to ensure a positive return on investment and decide whether continuation, modification, or termination of the systems is necessary to meet mission requirements.
- Document lessons learned from the PIR.
Guidance:
The Functional Sponsor, in coordination with the DoD Component CIO and Program Manager, is responsible for developing a plan and conducting a PIR for all fully deployed IT, including NSS.
References:
DoDI 5000.02, Enclosure 11, para 4

1.4 Decision Analysis

1.4.1 Apply, evaluate, and explain multiple approaches to decision analysis concerning technical challenges.
Product/Outcome: A documented significant program decision
1.4.1.1 Identify decisions requiring technical information to be traceable and defendable.
1.4.1.2 Apply, evaluate, and explain multiple approaches to decision analysis in order to ensure, as a minimum, a satisfactory solution to technical problems.
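One common decision-analysis approach that fits activity 1.4.1.2 is a weighted-sum decision matrix: score each alternative against the evaluation criteria and rank by weighted total. A minimal illustrative sketch (the criteria, weights, alternatives, and ratings are all hypothetical):

```python
# Illustrative weighted-sum decision matrix: rank alternatives by the
# weighted total of their criterion ratings (higher is better).

def score(alternatives, weights):
    """Return alternatives ranked by weighted score, best first."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must sum to 1"
    totals = {
        name: sum(weights[c] * ratings[c] for c in weights)
        for name, ratings in alternatives.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

weights = {"performance": 0.5, "cost": 0.3, "schedule": 0.2}
alternatives = {
    "A": {"performance": 8, "cost": 6, "schedule": 7},
    "B": {"performance": 6, "cost": 9, "schedule": 8},
}
ranking = score(alternatives, weights)
```

Recording the weights and ratings alongside the decision is what makes the documented program decision traceable and defendable (1.4.1.1); the same matrix can also be re-run to show sensitivity of the outcome to the weights.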
Steps:
- Clearly establish decision goals while framing the technical decision challenge/problem within the context of program capabilities, cost, schedule, and performance constraints/requirements.
- Identify the options, alternatives, and/or courses of action that are likely to be considered.
- Characterize the risks of information uncertainties, including information that is likely to be missing, unreliable, conflicting, noisy (irrelevant), or confusing.
- Define evaluation criteria for the decision, to include defining the difference between what is optimal versus "good enough."
- Evaluate likely alternative choices and, to the degree practicable, model, simulate, or otherwise describe the implementation results of each alternative.
- Select, justify, and record the recommended alternative and/or course of action.

1.5 Configuration Management

1.5.1 Articulate the program technical insights provided by the configuration management process.
Product/Outcome: Configuration Management Plan
1.5.1.1 Evaluate a program’s configuration management process and its supporting activities for consistency with MIL-HDBK-61A.
Steps:
1. Ensure configuration management planning is complete and documented in the program’s Systems Engineering Plan (SEP) and supporting Configuration Management Plan (CMP).
2. Ensure the configuration management program enterprise is adequately resourced.
3. Continually assess the relationship between the program’s configuration management process, program requirements, and the pace of growth of those requirements.
References:
ANSI/EIA-649-B-2011, "Configuration Management Standard"
MIL-HDBK-61A
DAG Section 4.3.7

1.5.2 Employ configuration management methods and best practices to establish and maintain consistency of a product's attributes with its requirements and product configuration information.
Product/Outcome: Approved Engineering Change Proposal, Request for Deviation, and/or Request for Waiver
1.5.2.1 Approve or prepare, analyze, and/or coordinate an Engineering Change Proposal, Request for Deviation, and/or Request for Waiver.
1.5.2.2 Support or chair a Configuration Control Board (CCB).
Steps:
1. Ensure the impact to the functional, allocated, and/or product baseline is clarified.
2. Assess the cost and schedule implications and ensure incorporation into performance measurement and forecasting.
3. Ensure relevant DIDs and CDRL items are reviewed, tailored as appropriate, and incorporated into the contract.
References:
MIL-HDBK-61A, Appendix D, ECP Management Guide

1.6 Technical Data Management

1.6.1 Apply the principles, procedures, and tools of data management and associated data rights.
Product/Outcome: Data Management Strategy (as integrated in the Acquisition Strategy)
1.6.1.1 Develop and implement a data management strategy for the long-term needs for product data necessary to develop, acquire, manufacture, operate, support, maintain, and dispose of the acquisition program.
Steps:
- Assess and determine the data required to design, manufacture, and sustain the system.
- Assess and determine the data required to support re-competition for production, sustainment, or upgrade.
References:
DoD 5010.12-M, Procedures for the Acquisition and Management of Technical Data

1.6.2 Oversee and appraise a program’s Technical Data Management Plan.
Product/Outcome: Technical Data Management Plan
1.6.2.1 Determine the appropriate data rights required (quantity and quality) to ensure the Government can use the data across the program life cycle to the appropriate legal extent.
1.6.2.2 Invoke and document a disciplined process to plan, acquire, access, manage, protect, and use data of a technical nature to support the total life cycle of the system.
Steps:
- Assess the risks to the program of not acquiring the desired product data and/or data rights due to cost or other considerations.
- Determine the contractual actions needed to acquire the data and data rights from contractors.
- Determine, plan, and budget for the Information Technology (IT) repositories/environment, access controls, configuration management actions, maintenance, and disposal that must be funded over the acquisition life cycle to manage the data.

1.6.3 Appraise the importance and legal complexity of data and the associated rights for both hardware and software, including documentation and source code.
Product/Outcome: Data Rights Requirements Analysis (DRRA)
1.6.3.1 Analyze planned uses of software in an acquisition program, as well as possible future uses of the software products or tools, to determine the Government’s need for data and associated data rights.
Steps:
- Continually assess (updated at each key decision point, such as design reviews and program milestones) the contractor’s stated needs relative to intellectual property.
- Identify the need for intellectual property.
- Identify the data rights you have for your program.
- Determine the risk to the program of acquiring or not acquiring the data and associated rights.
- Consider the possible need for data rights by parties other than the Federal Government (e.g., FMS customers, third-party maintainers and/or users).

1.6.4 Oversee and assess a program life-cycle data management method for an item, system, or facilities (including COTS).
Product/Outcome: Data management assessment
1.6.4.1 Evaluate the program’s life-cycle data management method for items, systems, and facilities, including COTS.
Steps:
1. Review existing and planned enterprise-level data management system (DoD, Service, Agency) capabilities.
2. Leverage existing requirements to maximize interoperability and minimize cost, burden, and footprint.
3. Review system-unique data management requirements and perform an alternatives analysis.
4. Establish appropriate agreements for enterprise-level common requirements.
5. Prepare appropriate specifications for system-unique requirements.

1.7 Interface Management

1.7.1 Oversee a process to ensure all interfaces are defined and in compliance with the system elements and other systems.
Product/Outcome: Product Interface Management Plan
1.7.1.1 Approve, or substantively review and provide feedback on, the SEP and the SEMP to ensure they incorporate an Interface Management Plan.
Steps:
- Ensure internal and external interfaces and their requirement specifications are documented.
- Identify preferred and discretionary interface standards and their profiles.
- Evaluate the justification for selection of, and the procedure for upgrading, interface standards.
- Ensure the certifications and tests applicable to each interface or standard are identified.
- Align with the program’s configuration management plan.
Product/Outcome: Development contract SOW and CDRL list
1.7.1.2 Evaluate the ability of the contract SOW and CDRLs to adequately reflect all system interface requirements.
Steps:
- Ensure documented interface requirements serve critical functions at all levels of the system.
- Develop functional and physical architectures.
- Facilitate competitive bids.
- Enable integration of systems and lower-level system elements.
- Support system maintenance, future enhancements, and upgrades.
- Provide input data for continuous risk management efforts.
References:
DAG Section 4.3.5, Requirements Management Process
Product/Outcome: Design review artifact approval
1.7.1.3 Evaluate the artifacts produced at the System Requirements Review (SRR), Preliminary Design Review (PDR), Critical Design Review (CDR), Physical Configuration Audit (PCA), and Post Implementation Review (PIR).
Steps:
- Identify the appropriate artifacts associated with each design review.
- Assess, based on unique program conditions, the desired content and maturity of each document at each review.
- Integrate desired artifact characteristics with entry and exit review criteria for each review.
- Using the IMS, adjust or ensure the timing of artifact deliveries allows adequate time for Government review and contractor response.
- Integrate desired characteristics and timing of artifacts directly into SOW and CDRL requirements.
References:
DAG Chapter 4
DoDI 5000.02, Enclosure 11, para 4
Product/Outcome: Documented CCB decisions; current and complete interface control documents (ICDs) and specifications
1.7.1.4 Conduct, or substantively participate in, an Interface Control Working Group (ICWG) to execute the interface management control process flow.
Steps:
1. Establish system and subsystem interface requirements.
2. Align stakeholders with each interface or set of interfaces.
3. Establish ICWG membership based on identified stakeholders.
4. Evolve/modify ICWG membership as the program matures/changes and additional interfaces are revealed or refined.
References:
Configuration Management: MIL-HDBK-61A
Data Item Description (DID), "Systems Engineering Management Plan (SEMP)," DI-SESS-81785
OSD’s SEP Preparation Guide
Open Systems Architecture Guidebook for Program Managers
DAG Section 4.3.5, Requirements Management Process

TM 2 Defense Business Systems: Ability to evaluate the complexity of a software effort; synthesize the development, integration, and test management activities; and apply appropriate measurement techniques to support embedded system software, computer systems, or IT-related programs.

2.1 DBS Certification

2.1.1 Interpret and comply with Defense Business Systems certification/accreditation guidance.
Product/Outcome: DBSMC certification approval
2.1.1.1 Develop and implement plans within appropriate technical planning documents to ensure DBSMC certification approval.
Steps:
- Ensure DBSMC certification approval is received prior to any obligation of funds for acquisition.
- Ensure programs are re-certified at least annually.
Guidance:
The Defense Business Systems Management Committee (DBSMC), chaired by the Deputy Secretary of Defense, is the approval authority for all statutorily required DBS certifications and will document such decisions. The Milestone Decision Authority (MDA) [when at the OSD or Military Department level] will serve as a member of the DBSMC.
References:
DoDI 5000.02, November 25, 2013, Enclosure 12
Office of the DoD Deputy Chief Management Officer, "Defense Business Systems Investment Management Process Guidance," June 2012

2.2 DBS Acquisition Approach Preparation

2.2.1 Develop and evaluate an acquisition approach for a Defense Business System.
Product/Outcome: Problem Statement; Business Case
2.2.1.1 Review and analyze the Problem Statement developed by the functional user.
2.2.1.2 Co-develop and update the Business Case with the sponsor.
Steps:
- For DBS, use a "Problem Statement" in lieu of an Initial Capabilities Document (ICD).
- For DBS, employ an "Acquisition Approach" as part of the Business Case instead of an "Acquisition Strategy."
- Ensure the Business Case also includes information from the LCSP.
References:
DoDI 5000.02, Enclosure 12, paras 4 and 5

TM 3 Test and Evaluation Management: Ability to develop, document, staff, and assess a test and evaluation program within the life cycle of an acquisition program.

3.1 Test Planning

3.1.1 Describe how test and evaluation (T&E) activities verify the system will ultimately meet user requirements.
Product/Outcome: Evaluation Framework Matrix
3.1.1.1 Identify evaluation criteria to support test and evaluation planning efforts via the requirements documents.
3.1.1.2 Given all key program documents, define top-level metrics/measures required for decision support.
3.1.1.3 Identify, assess, and as necessary negotiate the types of tests or evaluation activities available and/or appropriate to confirm system performance.
Steps:
1. Review the following documents when identifying evaluation criteria:
   - CDD/CPD
   - Program Protection Plan (PPP)
   - Cybersecurity Plan (IA)
   - Information Support Plan (ISP)
   - System Threat Assessment (STA)
   - Environmental, Safety and Occupational Health (ESOH)
   - Concept of employment/operation (CONOPS)
   - Operational Mode Summary and Mission Profile (OMS/MP)
2. Understand unique program conditions (e.g., specific service, organization, or mission area) when conducting task 3.1.1.3.

3.1.2 Oversee comprehensive T&E planning from component development through realistic or operational T&E into production and deployment.
Product/Outcome: TEMP
3.1.2.1 Develop and assess Part I of the TEMP, which includes the mission, system description, threats involved, and program history, to include any previous testing as well as any key capabilities or interfaces.
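The Evaluation Framework Matrix called for in 3.1.1 is, at its core, a traceability table from each requirement to its measures and planned test events, and it feeds directly into TEMP development. A minimal illustrative sketch (the requirements, measures, and event names are hypothetical):

```python
# Illustrative evaluation-framework sketch: trace each capability
# requirement to its evaluation measure and planned test events, then
# flag any requirement with no planned evaluation (a coverage gap).

framework = [
    {"req": "KPP-1 Net-Ready", "measure": "message completion rate",
     "events": ["DT-2", "IOT&E"]},
    {"req": "KSA-3 Reliability", "measure": "MTBF (operating hours)",
     "events": ["DT-1", "DT-2"]},
    {"req": "KPP-2 Availability", "measure": "operational availability Ao",
     "events": []},  # gap: no test event planned yet
]

gaps = [row["req"] for row in framework if not row["events"]]
```

Running a gap check like this at each TEMP update gives an objective answer to "will every requirement be evaluated somewhere?" before test resources are locked in.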
3.1.2.2 Ensure the development test plans are sufficiently robust to demonstrate readiness for operational test.
3.1.2.3 Provide input for any special tests or certification requirements.
3.1.2.4 Evaluate the efficiency and effectiveness of the program's modeling and simulation plans to support program risk and opportunity handling and successful execution of the development and test activities.
3.1.2.5 Explain the roles and responsibilities of the WIPT, Integrated Test Team (ITT), Combined Test Team (CTT), and Operational Test Agency (OTA) as necessary to address all the T&E issues and document support for test activities.
3.1.2.6 Ensure the test and evaluation plan aligns with the authorized budget for the program.
3.1.2.7 Incorporate T&E requirements into the contract (statement of work, CDRLs, associated requests for proposals, etc.).
3.1.2.8 Ensure the overall test support/facilities resources will be available and capable of executing the test program.
Steps:
- Coordinate with all stakeholders for key documents to identify the content.
- Ensure the TEMP is consistent with key program documents: Acquisition Strategy (AS), SEP, LCSP, PPP, ICD, CDD, CPD, Environmental, Safety and Occupational Health (ESOH), and the concept of employment/operation, including the operational mode summary and mission profile (OMS/MP).
- Facilitate a common T&E database.
- Assess the adequacy of the T&E management forum to address all the T&E issues and documentation.
- Oversee the development of the evaluation strategy to ensure technology maturity growth through modeling and simulation, DT, and OT.
The evaluations must provide indicators that the system will perform in its intended environment.
References:
Defense Acquisition Guidebook (DAG) Section 9
Memorandum for users of the DOT&E TEMP Guidebook and the Interim DoDI 5000.02
DAG Section 9.5
DoD Integrated Product and Process Development Handbook; DAG Section 9.2
"Rules of the Road - A Guide for Leading a Successful Integrated Product Team"; DAG Section 9.5.5.3
Product/Outcome: MAIS IOT&E Plan
3.1.2.9 Plan for Limited Deployment for a Major Automated Information System (MAIS) program.
Steps:
- Analyze requirements for limited deployment of a MAIS.
- Evaluate the MDA decision (and associated DOT&E input) relative to MAIS program deployment.
References:
DoDI 5000.02, Enclosure 11, para 8

3.1.3 Ensure the comprehensive test and evaluation activities address stated and emerging (hardware, software, environmental, and services) risks and opportunities.
Product/Outcome: Test and Evaluation Master Plan
3.1.3.1 Ensure the development test and evaluation activities address the stated handling of technical (hardware and software) risks and opportunities.
3.1.3.2 Ensure the operational test and evaluation activities address the stated handling of the users' risks and opportunities.
Steps:
- Ensure the SEP addresses the following: software-unique risks; inclusion of software in technical reviews; identification, tracking, and reporting of metrics for software technical performance, process, progress, and quality; software safety and security considerations; and software development resources.
- Ensure software assurance vulnerabilities and risk-based remediation strategies are assessed, planned for, and included in the Program Protection Plan (PPP).
- Integrate SEP and PPP information into the TEMP.
References:
DoDI 5000.02, Enclosure 3, "Systems Engineering," para 11, "Software"
3.1.3.3 Integrate Environment, Safety and Occupational Health (ESOH) risk management into the overall systems engineering process for all T&E activities throughout the system's life cycle.
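ESOH risk management (3.1.3.3) typically classifies each identified hazard by severity and probability and maps the pair to a qualitative risk level that drives the acceptance authority. The sketch below is loosely patterned on a MIL-STD-882-style risk assessment matrix; the category names follow the standard, but the cell-to-level assignments here are illustrative, not an authoritative reproduction of the standard's matrix:

```python
# Simplified, illustrative hazard risk classification in the style of a
# MIL-STD-882 risk matrix. Cell assignments are an approximation only.

SEVERITY = {"Catastrophic": 1, "Critical": 2, "Marginal": 3, "Negligible": 4}
PROBABILITY = {"Frequent": "A", "Probable": "B", "Occasional": "C",
               "Remote": "D", "Improbable": "E"}

def risk_level(severity, probability):
    """Map a (severity, probability) pair to a qualitative risk level."""
    s, p = SEVERITY[severity], PROBABILITY[probability]
    if (s == 1 and p in "ABC") or (s == 2 and p in "AB"):
        return "High"
    if (s == 1 and p == "D") or (s == 2 and p == "C") or (s == 3 and p in "AB"):
        return "Serious"
    if (p == "E" and s >= 3) or (s == 4 and p in "CD"):
        return "Low"
    return "Medium"

level = risk_level("Critical", "Occasional")
```

Programs should take the actual cell assignments and acceptance authorities from the current version of MIL-STD-882 and their Service implementation, not from this sketch.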
3.1.3.4 Prepare and maintain a Compliance Schedule that includes the test schedules and locations identified in the TEMP to enable consideration of potential impacts to the environment and completion of appropriate documentation.
Steps:
1. Identify and assess potential ESOH hazards.
2. Ensure the safe conduct of test activities and compliance with applicable environmental requirements by eliminating ESOH hazards.
3. Identify hazardous materials, wastes, and pollutants (discharges/emissions/noise) associated with the T&E program and develop plans for minimization and/or safe disposal.
References:
MIL-STD-882
NEPA / E.O. 12114
3.1.4 Recognize decision factors in T&E (requirements, resources, product maturity including hardware/software, and developmental reviews) needed to confirm readiness to start the test.
Test Readiness Review
3.1.4.1 Chair or substantially participate in / contribute to a Test Readiness Review to ensure and document that the subsystem or system under development is ready to proceed into formal test.
Steps:
When chairing the review, ensure all issues related to the TRR are addressed and recorded with appropriate actions assigned.
When substantially participating in or contributing to a review, actions include, but are not limited to, the following activities:
Determine whether all planned preliminary, informal, functional, unit-level, subsystem, system, and qualification tests have been conducted and whether results are satisfactory for entering the next phase of testing.
Review how the test risks are being assessed and mitigated within the test plan and determine what actions or adjustments should be made to the overall program.
Identify implications for overall program cost, schedule and product performance.
Reference: DAG Section 9
Operational Test Readiness Review
3.1.4.2 Chair or substantially participate in / contribute to an OTRR to ensure and document that the system under review can proceed into Operational Test and Evaluation (OT&E) with a high probability the system will successfully complete operational testing.
Steps:
When chairing the review, ensure all issues related to the OTRR are addressed and recorded with appropriate actions assigned.
When substantially participating in or contributing to a review, actions include, but are not limited to, the following activities:
Review Developmental Test and Evaluation (DT&E) results to assess the system's progress against the key performance parameters, key system attributes, and critical technical parameters in the Test and Evaluation Master Plan (TEMP).
Review how the test risks are being assessed and mitigated within the test plan and determine what actions or adjustments should be made to the overall program.
Identify implications for overall program cost, schedule and product performance.
Reference: DAG Section 9
3.1.5 Evaluate realistic tests or the OT&E program that will determine the operational effectiveness and suitability of a system under realistic operational conditions in an operational environment.
OT&E Test Plan
3.1.5.1 Ensure OT&E test planning considers appropriate use of accredited M&S to support DT&E, OT&E, and LFT&E, coordinated through the T&E WIPT.
3.1.5.2 Assess the reliability growth required for the system to achieve its reliability threshold during OT&E.
Steps:
1. Maximize the use of an integrated testing approach in the OT&E test planning to reduce resource requirements without compromising either DT&E or OT&E objectives.
2. Maximize training and exercise activities in the OT&E to increase the realism and scope of both the OT&E and training, and to reduce testing costs.
3. With the OTA's participation in early DT&E and M&S, identify technology risks and provide operational insights to the PM, the JCIDS process participants, and acquisition decision-makers.
4. Ensure OTA involvement in monitoring, or participating in, all relevant DT&E activities, and wherever possible draw upon test results with the actual system, subsystem, or key components, or with operationally meaningful surrogates.
5. Ensure OT&E uses threat or threat-representative forces, targets, and threat countermeasures, validated by the DIA or the DoD Component intelligence agency, as appropriate, and approved by DOT&E.
6. Ensure OT&E evaluates cybersecurity on any system collecting, storing, transmitting, or processing unclassified or classified information.
3.1.6 Recognize security and safety compliance (such as people and the item/system under test) and environmental requirements constraints to protect resources and comply with established policies.
Integrated Cybersecurity Risk Management Plan
3.1.6.1 Integrate the Cybersecurity Risk Management Framework steps and activities into the test and evaluation (T&E) program.
3.1.6.2 Ensure alignment of the security-related testing construct in the Program Protection Plan and the TEMP.
Steps:
1.
Working with the Program Manager, T&E subject matter experts, and applicable certification stakeholders, assist the user in writing testable measures for cybersecurity, and include these measures in the Test and Evaluation Master Plan (TEMP).
2. Ensure that the TEMP documents the threats to be used, based on the best current information available from the intelligence community.
3. Ensure that the TEMP documents a strategy and resources for cybersecurity T&E.
4. Ensure the test program will include, as much as possible, activities to test and evaluate a system in a mission environment with a representative cyber-threat capability.
5. Ensure that software in all systems will be assessed for vulnerabilities, including penetration testing from an emulated threat in an operationally realistic environment for higher-criticality systems.
3.1.7 Recognize the role of data alignment in supporting specific test objectives required to successfully conduct an overall evaluation.
Test Report
3.1.7.1 Assess the degree to which a test and evaluation program ensures alignment of test data with test objectives.
3.1.7.2 Evaluate the use and application of collected test data in light of impacts to operational requirements.
Steps:
1. Identify the test and evaluation framework required to shape desired data collection.
2. Determine the cost, schedule and performance ramifications relative to accomplishments, gaps and/or new testing relative to the existing framework.
3.2 Test Execution
3.2.1 (DT): Oversee a comprehensive T&E program to validate system specifications and requirements, including use of modeling and simulation.
TEMP
Test and evaluation reports
3.2.1.1 Apply outputs of the T&E program (including M&S) to assess whether the evolving design meets system design-to or build-to specifications and user requirements.
3.2.1.2 Track actual schedule and test resources being used to execute the T&E program to enable early identification of any projected shortfalls or schedule impacts.
Elevate to the program governance body as a potential need for management reserve.
3.2.1.3 Maintain insight into contractor T&E activities to track progress in accomplishing test objectives and meeting subsystem or system performance requirements.
3.2.1.4 Evaluate the T&E efforts of other participating government activities to ensure efficient and timely execution as well as seamless integration with other test efforts.
3.2.1.5 Ensure test infrastructure and tools to be used in operational tests undergo verification, validation, and accreditation (VV&A) by the intended user or appropriate agency.
Steps:
Track the execution of T&E to elevate any key performance issues early for consideration of design trade-offs.
Ensure the ITT consists of empowered representatives of test data producers and consumers (to include all applicable stakeholders) to execute and/or adapt the test strategy efficiently throughout the acquisition life cycle.
Ensure all test infrastructure and/or tools (e.g., models, simulations, automated tools, synthetic environments) used to support acquisition decisions are verified, validated, and accredited by the intended user or appropriate agency. Obtain workaround options from the TPWG to mitigate cost/schedule impacts of any VV&A issues.
Consider appropriate integration of operational test considerations in development tests.
3.2.2 (OT): Manage the Department/Agency process to ensure the system does not enter IOT&E before it is sufficiently mature to successfully pass suitability and effectiveness measures.
System maturity metrics for IOT&E entrance criteria
3.2.2.1 Ensure the Operational Test Readiness Review system maturity metrics support the decision to proceed into IOT&E.
3.2.2.2 Assess system requirements against the latest threat information.
3.2.2.3 Ensure user involvement in early system assessments/tests.
Steps:
Establish critical technical parameters and track progress throughout DT&E.
Coordinate with the operational test director to articulate the scope of planned operational testing, the critical operational issues, and the measures of effectiveness and suitability.
TM 4 Manufacturing Management: Ability to manage and apply the principles of manufacturing management in order to influence the design process, transition to production, and execution of the manufacturing plan. Able to analyze program office and contractor status/plans for transition to production.
4.1 Manufacturing Planning and Transition
4.1.1 Oversee management actions leading to an adequate and efficient manufacturing capability and production (if applicable) of the minimum quantity necessary to provide production or production-representative articles for IOT&E.
Systems Engineering Plan (SEP)
Statement of Work (SOW) and associated CDRLs
CDR conduct and Post-CDR Assessment
4.1.1.1 Formulate SEP and contract requirements that drive manufacturing considerations into the system design and engineering process.
4.1.1.2 Ensure PQM considerations are addressed in technical and design reviews.
4.1.1.3 Evaluate whether results of early producibility assessments warrant changes to system-level requirements to drive a more efficient manufacturing capability and production.
4.1.1.4 Ensure the manufacturing strategy/plan aligns to the authorized budget, resources and schedule for the program.
4.1.1.5 With the customer and industry partner(s), evaluate the trade space between design choices, producibility, and cost/schedule.
Steps:
1. Ensure the SEP addresses how manufacturing trades should be factored into the design process.
2. Ensure the SOW addresses Manufacturing Readiness Levels and process proofing as part of considerations for entrance/exit criteria for PDR, CDR, and Production Readiness Review(s).
3.
Assess manufacturing readiness early in the program life cycle, as early as the AoA and Milestone A.
Reference: Defense Manufacturing Management Guide (available on the DAU ACC).
4.1.2 Establish an initial production base (along with initial spares production) for the system, expandable to an effective full-rate production.
Low-rate initial production decision
Production Readiness Review report
Milestone C briefing
4.1.2.1 Discern LRIP readiness based on demonstrated manufacturing capability and program maturity.
4.1.2.2 Incorporate production requirements into the contract (statement of work, CDRL, associated request for proposals, etc.).
4.1.2.3 Ensure the overall production support/facilities resources will be available and capable of executing the manufacturing program.
Steps:
Establish conditions that will determine "stability" of the design.
Once the design is determined to be stable, update the program cost estimate based on, at a minimum, revised production process yields and labor costs.
4.1.3 Establish and optimize a full-rate production base for the system, based on LRIP and successful operational testing.
PCA
PRR report
Full-rate production decision
4.1.3.1 Discern FRP readiness based on demonstrated manufacturing capability and program maturity.
Steps:
Update the program cost estimate.
Consider post-production shutdown (and restart) issues.
4.1.4 Understand material management functions, manufacturing facilities design, and associated issues/implications of new product and process technologies.
Manufacturing program review assessment/report
4.1.4.1 Conduct, or substantially participate in, periodic/ongoing reviews of manufacturing performance.
4.1.4.2 Apply, in coordination with DCMA, quality control, quality assurance, and validation and verification techniques associated with the PQM environment in order to ensure quality products.
4.1.4.3 Evaluate supply chain management roles and responsibilities (including DMSMS) in the design and execution of a production line.
4.1.4.4 Evaluate the application of appropriate recognized standards for product performance and production processes (e.g., Six Sigma tools, NIST, ISO, ANSI).
Steps:
Perform the manufacturing readiness assessment at all milestones.
Plan and schedule resources for the manufacturing readiness assessment as required.
Ensure tooling and special test equipment are planned for and resourced.
References: SD-22 DMSMS Guidebook and the Defense Manufacturing Management Guide (available on the DAU ACC)
4.2 Manufacturing Shutdown
4.2.1 Plan for production line shutdown and post-production support.
Manufacturing Shutdown Plan
4.2.1.1 Develop, or substantively participate in developing, a plan and budget to manage production line shutdown.
4.2.1.2 Alternatively, take an existing shutdown plan to review and assess, and create a written analysis report highlighting significant lessons learned.
4.2.1.3 Plan and budget for what is to be done with tooling and facilities.
Steps:
Assess all manpower and personnel activity associated with a manufacturing shutdown.
Appraise, disposition and document all special machinery relevant to manufacturing shutdown.
Analyze, evaluate, and document all manufacturing-related methods and processes associated with program shutdown and determine a plan of action to preserve them.
Assess, analyze and document all materials, including government-owned tooling, associated with programmatic shutdown.
Appraise all manufacturing measuring devices and programmatic measurement data necessary for transition.
Evaluate, analyze, and assess all detailed design drawings that have been completed for key technologies and components for transition.
Assess and develop a plan to archive the Initial Quality Assurance Plan and the Quality Management System.
Develop and implement a budget plan for government-funded facilities used for storage of associated manufacturing items and artifacts.
In conjunction with the PCO, develop an approach to close relevant contracts.
4.2.2 Execute a system-level production line shutdown and transition to post-production supply chain support.
System Verification Review approval
Production shutdown decision
4.2.2.1 Analyze and plan for post-production support activities and address them during the System Verification Review.
4.2.2.2 Ensure special tooling and production facilities are properly placed in long-term storage or disposed of.
4.2.2.3 Ensure all production line documentation is properly archived.
4.2.2.4 Ensure post-production supply chain processes are established to address failure/defect investigation support.
Steps:
1. Assess the extent of post-production support required based on the individual program and item.
2. Allocate failure/defect investigation support as required from design agent production engineering resources.
3.
Plan and budget for the personnel resources in the event resources are needed for both of these items.
Reference: DoD SVR checklist available on the DAU ACC site
TM 5 Product Support Management: Ability to manage and apply the principles of DoD life cycle logistics management in order to influence the design process and the development and execution of the product support plan. This includes an understanding and implementation of appropriate actions on supply chain management and environmental matters relating to the development, manufacture, and disposal of facilities and end items.
5.1 Product Support Planning
5.1.1 Oversee management actions leading to an adequate and efficient hardware and software product support capability.
Requirements analysis
Integrated Logistics Assessment
Supportability analysis
Product support strategy
LCSP
5.1.1.1 Ensure all user requirements documents are reviewed and program planning documents are appropriately aligned to provide an affordable life cycle solution.
5.1.1.2 Analyze overall life cycle risks and opportunities related to cost, schedule and product support impacts.
5.1.1.3 Evaluate legacy system support requirements and corresponding documentation.
5.1.1.4 Determine the optimal use of either a traditional or performance-based logistics support plan.
5.1.1.5 Create a traditional and/or Performance Based Logistics strategy.
Steps:
1. Assess KPPs and KSAs for life-cycle support implications.
2. Identify program and system design trade-offs to comply with functional, quality, and constraint requirements.
3. Identify life-cycle support features from the legacy system that we want to sustain in the new system (minimize redundancies).
4. Identify life-cycle support features in the legacy system that we do not want to continue in the new system.
5. Review legacy system life-cycle support lessons learned documentation.
6. Interview and observe end users and sustainment personnel operating and sustaining the legacy system.
7.
Identify organizations and activities that have roles and responsibilities supporting the test and evaluation strategy throughout the program acquisition life cycle.
References:
DAG Section 9.5.4
5.1.2 Evaluate the definition, importance, application and oversight of DoD life cycle sustainment metrics.
KSAs
Product support related TPMs
5.1.2.1 Identify measures and metrics that contribute to Operational Availability for the system.
5.1.2.2 Perform analysis to determine the reasonableness of the measures and metrics (within program cost, schedule, and performance constraints).
Steps:
1. Review key metric parameters and estimates over the system life cycle.
2. Ensure the metrics have been conveyed to the appropriate contract and organic support documentation.
3. Compare sustainment measures and metrics with a legacy or analogous system.
4. Assess the costs of collecting and analyzing sustainment measures.
5.2 Product Support Management
5.2.1 Evaluate and optimize the logistical burden (footprint) that an item/system will place on the user.
Life-Cycle Support Plan (Maintenance Concept section)
5.2.1.1 Assess the overall maintenance concept.
Steps:
1. Review maintenance scenarios with stakeholders.
2. Refine maintenance support requirements to minimize burden and footprint.
3. Document findings in the Maintenance Concept section of the LCSP.
5.2.2 Develop and implement a software support plan, including development, modification, upgrades, and retirement or replacement of software and/or information technology products.
Life-Cycle Support Plan (Information Support Plan section)
Maintenance data management system requirements
5.2.2.1 Prepare and/or evaluate a program Information Support Plan.
5.2.2.2 Derive maintenance data management system requirements from sustainment concepts and plans.
Steps:
1. Review maintenance scenarios with stakeholders.
2. Identify retiring, upgraded, and existing software-intensive components.
3. Define alternatives to maintain the system software through the life cycle.
4.
Document the most advantageous alternatives given the program objectives in the Information Support Plan section of the LCSP.
5. Ensure continued compliance with the Clinger-Cohen Act.
5.2.3 Develop and implement a performance-based agreement for a hardware, hardware/software, or information technology based program.
Service level agreements
Performance-based inputs to the IMP and IMS
5.2.3.1 Define the information system's failure criteria and expected performance level in coordination with the end users.
5.2.3.2 Assess methods and tools used to determine the quality, efficiency, effectiveness and practicality of information technology systems.
5.2.3.3 Integrate sustainment tasks into the Integrated Master Plan (IMP) and Integrated Master Schedule (IMS).
Steps:
1. Establish requirements for development of a Service Level Agreement (SLA).
2. Evaluate SLA stakeholder needs as appropriate.
3. Develop performance measures to optimize the effectiveness of the SLA based on SLA requirements and stakeholder needs.
5.2.4 Manage ongoing sustaining engineering assessments of the fielded item/system and facilities.
Sustainment contract and contract modifications
Data-driven decisions regarding sustainment engineering
Performance and RAM design improvements
5.2.4.1 Define (for a system or component) the sustaining engineering requirements.
5.2.4.2 Evaluate and plan sustaining engineering activities for a system.
5.2.4.3 Translate the results of field data reports into actions for sustaining engineering (Reliability-Based Logistics).
Steps:
1. Define the requirement for engineering support following system fielding.
2. Justify and budget funding for sustaining engineering.
3. Review field data reports to find trends and early warning signs.
4. Identify potential actions for further investigation by sustaining engineering.
5.
Provide direction, as appropriate, to field units regarding sustaining engineering discoveries.
5.2.5 Implement and oversee program activities to identify, track, fund, and correct obsolescence and DMSMS risks, opportunities and issues.
Program Obsolescence and DMSMS Strategy
5.2.5.1 Perform risk and opportunity assessment of system components with regard to obsolescence.
5.2.5.2 Evaluate, execute and track obsolescence corrective actions based on risk and opportunity analysis.
5.2.5.3 Assess issues identified by users, system analysts, or part notification agencies and facilitate appropriate corrective actions.
Steps:
1. With representatives from the supply chain, review the Bill of Materials for risks and opportunities related to obsolescence.
2. Conduct additional market research.
3. Prepare the business case for proactive changes in manufacturing sources.
4. Develop a process for tracking obsolescence issues.
5.2.6 Manage the engineering and test of technical or procedural solutions to neutralize any harmful aspects of an item/system and its disposal.
Updates to the LCSP Disposal section
5.2.6.1 Evaluate, avoid and/or minimize the use and/or impact of environmentally hazardous materials in the system.
5.2.6.2 Review and comply with the National Environmental Policy Act and other safety and occupational health laws, regulations and policies.
Steps:
1. During design, avoid or reduce environmentally hazardous materials in the system.
2. Perform trade-offs to avoid environmentally hazardous materials.
3. Follow environmental laws, regulations, and policies regarding the disposal of hazardous components.
4. Review sustainment concepts and operations for environmental and safety issues.
5. Adjust sustainment tasks (or make changes to the system) to avoid environmental or safety issues.
6.
Monitor sustainment operations to ensure compliance, or make adjustments to procedures as necessary.
References:
National Environmental Policy Act (NEPA)
5.3 Supply Chain Management
5.3.1 Align a supply chain management concept within the context of an agency or the joint supply chain architecture.
Program Supply Chain Management Concept
5.3.1.1 Assess program-specific SCM needs within the Joint Supply Chain Architecture (JSCA) guidance/elements, as appropriate.
5.3.1.2 Assess and balance the SCM needs of the program against the potential needs of other programs and Services.
Reference: DoD Integrated Product Support Element Guidebook
Steps:
1. Perform cost-benefit analysis for system-unique SCM requirements.
2. Document decisions in the LCSP in coordination with the SEP.
5.3.2 Coordinate management actions involving production, inventory, location, and transportation of program materiel items (and associated information and financial transactions).
Life-Cycle Sustainment Plan updates and operating procedures
DSOR
5.3.2.1 Evaluate and establish procedures for the transport, storage, and issuance of materials at the point of need.
5.3.2.2 Engage data systems to track and analyze supply chain related components.
5.3.2.3 Ensure budgets are aligned to meet acquisition lead times and inventory demands.
Steps:
1. Review material support concepts.
2. Conduct logistics demonstrations.
3.
Establish standard operating procedures, contracts, and agreements with Services/Agencies (DLA)/Commodity Commands/Arsenals/Depots/Shipyards as necessary.
References:
10 USC 2460, 2464, and 2466
5.3.3 Interpret supply chain related risks with components and commodities acquired from program development through operations.
Program Protection Plan
LCSP
5.3.3.1 Ensure the PPP clearly addresses supply chain related risks.
5.3.3.2 Ensure the contracts in all phases of a program incorporate actions related to identifying, reporting, and addressing supply chain risks.
Steps:
Determine all available suppliers of critical components.
Integrate results of the criticality analysis with relevant suppliers as appropriate.
Incorporate the above results into the development of the Supplier Threat Analysis.
Integrate supply chain threat findings into the PPP.
References:
DoDI 5200.39
DoDI 5200.44
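The criticality analysis named in the steps above can be sketched as a simple weighted scoring of components, so that supply chain risk attention (and Supplier Threat Analysis effort) is prioritized consistently across the Bill of Materials. The factors, weights, and component data below are hypothetical illustrations, not a method prescribed by DoDI 5200.39 or 5200.44:

```python
# Hypothetical component criticality scoring for supply chain risk
# prioritization. The factors and weights are illustrative assumptions,
# not drawn from DoDI 5200.39/5200.44 or any official criticality method.

from dataclasses import dataclass

@dataclass
class Component:
    name: str
    sole_source: bool         # only one qualified supplier exists
    obsolescence_risk: float  # 0 (none) .. 1 (imminent DMSMS issue)
    lead_time_weeks: int
    mission_critical: bool    # failure defeats a mission-essential function

def criticality_score(c: Component) -> float:
    """Weighted 0-100 score; higher means more supply chain risk attention."""
    score = 0.0
    score += 30 if c.sole_source else 0          # single point of failure
    score += 30 * c.obsolescence_risk            # DMSMS exposure
    score += min(c.lead_time_weeks, 52) / 52 * 20  # lead time, capped at a year
    score += 20 if c.mission_critical else 0     # mission impact
    return score

# Hypothetical Bill of Materials entries for illustration only.
bom = [
    Component("Radar T/R module", True, 0.6, 40, True),
    Component("Chassis fastener kit", False, 0.1, 2, False),
    Component("Legacy FPGA", True, 0.9, 26, True),
]
for c in sorted(bom, key=criticality_score, reverse=True):
    print(f"{c.name}: {criticality_score(c):.1f}")
```

On a real program the factor list and thresholds would come from the program's criticality analysis and threat inputs; the point of the sketch is that a repeatable scoring rule makes the PPP's supply chain risk rankings auditable rather than ad hoc.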