Quality Management Plan CD




|Prepared By: |Reviewed By: |Approved By: |Deliverable Information: |

B1, Godrej Industries Complex

Eastern Express Highway

Vikhroli (E), Mumbai 400079.


|Classification: SECRET/HIGHLY CONFIDENTIAL/CONFIDENTIAL/UNRESTRICTED (Tick the Relevant Classification) |


|Do not forward or copy data in part or full without explicit permission of ------ |

|Data access is limited to ------------------------(URL of list) |

|Use Strong authentication / EFS Encryption / Lock in a Drawer |

|Log access in a register |

|Retention period is -------- ( default as per Policy 123) |


|Check the relevant Handling guidelines above. |

Revision history

|Version |Date |Author |Reviewed by |Comments |

| | | | | |

| | | | | |

| | | | | |

| | | | | |

Table of Contents

1 Introduction
1.1 Definition and Acronyms
1.2 References used while preparing this document
1.3 Assumptions and Dependencies
2 Relationship between Stakeholder Requirements and Metrics
3 Compose Project Defined Process
3.1 Compose Process for AD Projects
4 Goals for Metrics
4.1 (a) Goals for High Priority Metrics (Critical Objectives) for AD/Solution Factory Projects
4.1 (b) Goals for Other Priority (medium and low) and all other mandatory CD Metrics for AD/Solution Factory Projects
4.2 (a) Goals for High Priority Metrics (Critical Objectives) for Agile Projects
4.2 (b) Goals for other Priority Metrics (medium and low) and all other mandatory metrics for Agile Projects
4.3 (a) Goals for High Priority Metrics (Critical Objectives) for Testing Projects
4.3 (b) Goals for other Priority Metrics (medium and low) and all other mandatory metrics for Testing Projects
4.4 Statistical Monitoring of Sub-Processes
5. Measures and their Granularity and the Mapping with the Tool
5.1 Common measures across process types (AD, Testing and Support)
5.2 Application Development Projects
5.3 Testing Projects
5.4 Agile Projects
5.5 Measures and their Mapping with the Tool
6. Analysis Strategy
6.1 Method of Data Validation
6.2 Method of Analysis and Reporting
7. Customer Satisfaction Survey
8. Verification and Validation
9. Delivery Excellence (DEx) Activities
10. Causal Analysis and Resolution Plan
10.1 Techniques for Defect Analysis
10.2 Identifying Actions from CAR
10.3 Driving Improvements using CAR

1 Introduction

This document describes the development and quality assurance processes, quality objectives and goals of the team. It is intended for use by team members and lays down the procedures to be followed for all Quality Management activities, the quality standards the project team will conform to, and the metrics to be collected.

1.1 Definition and Acronyms

Definition and Acronyms used in this document.

1.2 References used while preparing this document

Work Package

Metrics Goals

Process Performance baseline

Measurement Guidelines

1.3 Assumptions and Dependencies

Assumptions followed during preparation of this document and dependencies (e.g. Software Configuration Management and Project Management Plan).

2 Relationship between Stakeholder Requirements and Metrics

Below is a broad classification of stakeholders, stakeholder requirements, metric focus and metrics. Analyze these from your project's perspective and, if there are any conflicts between the requirements, resolve them.

The Project Manager will set the priority of each metric based on the business objectives and client requirements (contract, SOW, DCSO). Collection of a metric is not optional even when its priority is Low; the priority is only reflected in the analysis. Priority can be set as High, Medium or Low, but all metrics cannot carry the same priority at the same time.

Use the myMetrics tool from the Quality Management System to identify the applicable measures and metrics based on your project type. The myMetrics tool provides details of each metric in terms of formula, processes/sub-processes, description, direct/indirect classification, and the leading/lagging indicators. A leading indicator provides early warning of future events; a lagging indicator depicts the parameters that might be impacted by your current process performance. Refer to Appendix A for further details.

When a business goal has multiple critical metrics, define the priority of each metric and the order of priority among metrics that share the same priority. Each critical metric is marked with a unique identifier (M1, M2, M3, etc.).

Business goals are arranged sequentially, in order of priority for the project, in the tables below.

|Custom Development |

|# |Stakeholder |Stakeholder requirement defined in |Business Goal |Metric |Priority |Priority # |
|M1 |End Client / Client Team |Contract |High Quality Delivered |Delivered Defect Rate |High |1 |
|M2 | | | |Delivered Defect Density |High |2 |
|M3 |ATCI Leadership / Management |Organizational Priorities / Performance Objectives |Cost to Serve |Cost to Serve |Medium |1 |
|M4 | | | |Cost Variance |High |1 |
|M5 | | | |Schedule/End Date Variance |High |2 |
|M6 | | |Delivery Excellence |CTSS |Medium |1 |
|M7 | | |Industrialization |Productivity |Medium |1 |

|Testing |

|# |Stakeholder |Stakeholder requirement defined in |Business Goal |Metric |Priority |Priority # |
|M1 |End Client / Client Team |Contract |High Quality Delivered |Defect Leakage Ratio |High |1 |
|M2 | | | |Rejection Ratio |Medium |1 |
|M3 |ATCI Leadership / Management |Organizational Priorities / Performance Objectives |Cost to Serve |Cost Variance |High |1 |
|M4 | | | |Schedule/End Date Variance |High |2 |
|M5 | | |Delivery Excellence |Delivered Defect Rate |High |1 |
|M6 | | | |Delivered Defect Density |High |2 |
|M7 | | |Industrialization |Test Creation & Execution Productivity |High |1 |
|M8 | | | |Test Automation Penetration |Medium |1 |

|Agile |

|# |Stakeholder |Stakeholder requirement defined in |Business Goal |Metric |Priority |Priority # |
|M1 |End Client / Client Team |Contract |High Quality Delivered |Delivered Defect Rate |High |1 |
|M2 |ATCI Leadership / Management |Organizational Priorities / Performance Objectives |Cost to Serve |Cost Variance |High |1 |
|M3 | | | |Schedule/End Date Variance |High |2 |
|M4 | | |Delivery Excellence |Code Refactoring Percentage |Low |1 |
|M5 | | |Industrialization |Velocity |High |1 |
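The authoritative formula for each metric comes from the myMetrics tool. Purely as an illustration of how a few of the metrics above are commonly computed, a minimal Python sketch follows; the definitions used here are typical industry conventions assumed for this example, not the tool's formulas, and all figures are placeholders.

```python
# Illustrative only: typical definitions are assumed here; the authoritative
# formulas for these metrics are defined in the myMetrics tool.

def cost_variance(actual_effort: float, planned_effort: float) -> float:
    """Cost Variance as a percentage of planned effort (assumed definition)."""
    return (actual_effort - planned_effort) / planned_effort * 100.0

def schedule_variance(actual_days: float, planned_days: float) -> float:
    """Schedule/End Date Variance as a percentage of planned duration (assumed)."""
    return (actual_days - planned_days) / planned_days * 100.0

def delivered_defect_density(delivered_defects: int, size: float) -> float:
    """Delivered defects per unit of size, e.g. per Function Point (assumed)."""
    return delivered_defects / size

if __name__ == "__main__":
    print(f"Cost Variance: {cost_variance(1180, 1100):.1f}%")        # placeholder effort in hours
    print(f"Schedule Variance: {schedule_variance(95, 90):.1f}%")    # placeholder duration in days
    print(f"Delivered Defect Density: {delivered_defect_density(4, 850):.4f} defects/FP")
```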

3 Compose Project Defined Process

This section shall include details on the project’s defined process i.e. the best process combination which will help the project meet its objectives.

3.1 Compose Process for AD Projects

Steps followed for composing the project’s defined process:

1. Refer to the scope of the project

2. List the major phases/activities of the project and brainstorm the various alternatives (tools, methods, etc.) available to execute the sub-processes

3. Case 3a: No Project baseline (past releases data) available

• Use the Project Predictor tool, which considers PPB data, and derive the predictions against the set targets

Case 3b: Project baseline available

If a project baseline is available, the prediction model/tool to be used is selected based on the methodology followed in the project (an illustrative Monte Carlo sketch follows the table below).

|Methodology Followed |Prediction Model |
|Waterfall methodology |Monte Carlo Simulation using Crystal Ball |
|Parallel Processing methodology |Monte Carlo Simulation using Crystal Ball |
|Agile Methodology |Agile Prediction Model |
|Testing Only Projects |Testing Prediction Model |
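Crystal Ball is the named simulation tool for Waterfall and Parallel Processing projects. Purely as a rough illustration of the underlying Monte Carlo idea, the sketch below samples phase-wise effort from triangular distributions assumed to be fitted from the project baseline; the phase names, figures and target are placeholders, not project data.

```python
import random

# Minimal Monte Carlo sketch of what the simulation does conceptually:
# sample each phase's effort from a distribution fitted to baseline data and
# check how often the total stays within the target. All figures are placeholders.

PHASES = {            # (optimistic, most likely, pessimistic) effort in person-days
    "Analyze": (40, 50, 70),
    "Design":  (60, 75, 100),
    "Build":   (150, 180, 240),
    "Test":    (80, 100, 140),
}
TARGET_EFFORT = 420   # assumed project objective

def simulate_total() -> float:
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in PHASES.values())

def probability_of_meeting_target(runs: int = 10_000) -> float:
    hits = sum(simulate_total() <= TARGET_EFFORT for _ in range(runs))
    return hits / runs

if __name__ == "__main__":
    print(f"P(total effort <= {TARGET_EFFORT}): {probability_of_meeting_target():.2%}")
```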

Case 1: Process combination output can meet the project objective:

4. Compose the defined process

5. Based on the methodology followed in the project, run Crystal Ball/Project Predictor (for sensitivity analysis) to identify the critical sub-processes

6. Automation and Process Maturity Scores derived from AOF (Automation Opportunity Finder) assessments can be used as an additional aid to identify the critical sub-processes

7. Feed in actual values from completed phases as and when they become available. Revisit the composed process at the end of each milestone, sub-process or phase to ensure the current project capability can meet the project objectives

Case 2: Process combination output cannot meet the project objective:

4. Identify improvement initiatives using CAR, DAR, OID or Risk Analysis to address the gap in meeting the project objectives. Document the results of the analysis

5. Use OptQuest/PPM to analyze the impact of various options/improvement initiatives in each phase. Analyze the forecast provided by the tool

6. If the project objective is still not met, or if there are risks with the alternate methods/improvement areas, negotiate the targets with the client.

7. Identify the critical sub-processes using sensitivity analysis (an illustrative sketch follows the options table below). The identified risks need to be tracked in the Risk log

8. Automation and Process Maturity Scores derived from AOF (Automation Opportunity Finder) assessments can be used as an additional aid to identify the critical sub-processes

9. Feed in actual values from completed phases as and when they become available. Revisit the composed process at the end of each milestone, sub-process or phase to ensure the current project capability can meet the project objectives

|Phases/Sub processes |Options Considered |Best Option Chosen |

| | | |

| | | |

| | | |

| | | |
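Sensitivity analysis itself is performed in Crystal Ball/the Project Predictor. As a rough illustration of the idea only, the sketch below ranks sub-processes by how strongly their phase-level measures move with an outcome metric across past releases; all names and data are made up for this example, and statistics.correlation requires Python 3.10+.

```python
from statistics import correlation  # Python 3.10+

# Illustrative sensitivity ranking: the sub-process measures most strongly
# correlated with the outcome metric are candidates for statistical monitoring.
# All data below is made up for illustration.

releases = {
    "review_effort_pct":  [8.0, 6.5, 9.2, 5.0, 7.1],
    "unit_test_coverage": [72, 65, 80, 60, 70],
    "rework_effort_pct":  [12, 15, 9, 18, 13],
}
delivered_defect_rate =   [0.8, 1.1, 0.6, 1.4, 0.9]

ranked = sorted(
    ((name, abs(correlation(values, delivered_defect_rate)))
     for name, values in releases.items()),
    key=lambda item: item[1], reverse=True,
)
for name, strength in ranked:
    print(f"{name:20s} |correlation| = {strength:.2f}")
```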

4 Goals for Metrics

The goals can be arrived at using three different approaches. Project goals have to be segregated into:

• Goals for High Priority Metrics (Critical Objectives) – These are goals against all the High Priority Stakeholder metrics

• Goals for Priority metrics (medium and low) and all other mandatory metrics. – These are goals against all the Medium/Low Priority stakeholder metrics and all the leading/lagging indicators for the High Priority metrics.

Steps to arrive at Goals:

1. Arrive at the goal based on data, used in the following order of preference:

• Projects previous phase

• Similar project

• Process Performance Baseline

• Benchmarks

• KT Phase data

• Contractual commitment

2. The project can use the prediction model outputs (Crystal Ball/Project Predictor) as the basis for setting the metric or sub-process level goals (a sketch of goal derivation from baseline data follows).
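As an illustration of step 1, a minimal sketch of deriving a goal band from the first available data source in the order above; previous phase data is assumed to be available here, and the one-sigma band is only one possible convention, not a prescribed rule.

```python
from statistics import mean, stdev

# Minimal sketch of deriving a metric goal from previous phase data (assumed to
# be the first available source). The +/- one sigma band is one possible
# convention for the goal range, not a mandated one.

previous_phase_defect_rate = [0.9, 1.1, 0.8, 1.0, 1.2]   # placeholder data

centre = mean(previous_phase_defect_rate)
spread = stdev(previous_phase_defect_rate)
goal = {"target": centre, "lower": centre - spread, "upper": centre + spread}

print(f"Goal: {goal['target']:.2f} (band {goal['lower']:.2f} .. {goal['upper']:.2f})")
```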

If a process type is not applicable, remove the corresponding table and mark it as not applicable.

Within a process type, if a metric is not applicable, mark it as NA; do not delete the row.

(e.g. Testing project X has only execution responsibility, hence Test Creation Productivity is NA)

The Organization Goals are available in QMS Metrics page ()

The PM should have a strategy for achieving each goal; this could be based on the predictions, industrialization principles, etc.

The tables given below shall be appropriately selected based on the type of project. The high priority metrics goals and other priority metrics goals for the project need to be segregated and monitored appropriately.

The table below lists all the high priority metrics identified for the critical business goals for the project. All these metrics would be statistically monitored and controlled by the project.

4.1 (a) Goals for High Priority Metrics (Critical Objectives) for AD/Solution Factory Projects

|# |Metric |Strategy for Achieving Goal |

|M1 |Delivered Defect Rate |Monitoring using Control Charts |

|M2 |Delivered Defect Density |Monitoring using Control Charts |

|M4 |Cost Variance |Monitoring using Scatter Plots |

|M5 |Schedule/End Date Variance |Monitoring using Scatter Plots |

The table below lists all the other priority metrics, i.e. the leading/lagging metrics for the critical metrics. The mapping between the critical metrics and the other priority metrics needs to be built into the table. These are additional mandatory metrics that the project will track and report on a periodic basis.

4.1 (b) Goals for Other Priority (medium and low) and all other mandatory CD Metrics for AD/Solution Factory Projects

|Metric |Strategy for Achieving Goal |Helps in Achieving |

|Cost To Serve | |M3 |

|CTSS | |M6 |

|Productivity | |M7 |

|Overall Defect Rate | |M1, M2, M6 |

|Build Quality Rate | |M1,M2 |

|Peer Review Effectiveness | |M1,M2 |

|Testing Effectiveness | |M1,M2 |

|Stage wise Defect Rate | |M1,M2 |

|Overall Defect Density | |M1,M2 |

|Build Quality Density | |M1,M2 |

|Cost of Rework | |M1,M2, M3 |

|Cost of Quality | |M1,M2, M3 |

|Change Request Impact | |M4, M5 |

|% Non Productive Effort | |M4 |

|Defect Removal Efficiency | |M1,M2 |

4.2 (a) Goals for High Priority Metrics (Critical Objectives) for Agile Projects

|# |Metric |Strategy for Achieving Goal |

|M1 |Delivered Defect Rate | |

|M2 |Cost Variance | |

|M3 |Schedule/End Date Variance | |

|M5 |Velocity | |

4.2 (b) Goals for other Priority Metrics (medium and low) and all other mandatory metrics for Agile Projects

|Metric |Strategy for Achieving Goal |Helps in Achieving |

|Code Refactoring Percentage | |M4 |

|Percentage Code Refactoring Effort | |M4 |

|Change Request Impact | |M2,M3 |

|Regression Testing Efficiency | |M1,M2,M3 |

|Overall Defect Rate | |M1 |

|Cost of Quality | |M1 |

|Cost of Poor Quality | |M2,M3,M5 |

|Test Cases Pass Percentage | |M1 |

|Test Case Execution Trend | |M5 |

Note: Please refer to the AD goals when setting goals for Agile projects. Agile-specific goals will be available after a few months of data collection on Agile projects.

4.3 (a) Goals for High Priority Metrics (Critical Objectives) for Testing Projects

|# |Metric |Strategy for Achieving Goal |

|M1 |Defect Leakage Ratio | |

|M3 |Cost Variance | |

|M4 |Schedule/End Date Variance | |

|M5 |Delivered Defect Rate | |

|M6 |Delivered Defect Density | |

|M7 |Test Creation & Execution Productivity | |

4.3 (b) Goals for other Priority Metrics (medium and low) and all other mandatory metrics for Testing Projects

|Metric |Strategy for Achieving Goal |Helps in Achieving |
|Rejection Ratio | |M2 |
|Test Automation Penetration | |M8 |
|Testing Defect Removal Efficiency | |M1, M6, M7 |
|Test Support Efficiency | |M1, M6, M7, M2 |
|Test Creation Productivity | |M7, M3, M4 |
|Test Execution Productivity | |M7, M3, M4 |
|Defect Rejection Ratio | |M2, M1, M5, M6 |
|Overall Defect Rate | |M5, M6, M1 |
|Defects per Test Case | |M2 |
|Defects per Test Case Executed | |M7 |
|Test Case Pass Percentage | |M5, M6, M1 |
|Change Request Impact | |M3, M4 |
|Cost of Quality | |M3 |
|Cost of Rework | |M3 |

*** Only mandatory metrics need to be mentioned in the table, as optional metrics do not require goals/targets to be defined against them.

4.4 Statistical Monitoring of Sub-Processes

Identification of Sub processes

AD Projects

The project's process comprises all the activities executed to deliver the client requirements. Some of these activities need stringent monitoring to ensure that the project does not deviate from its targets and the client's priorities; these are called critical sub-processes. Processes or sub-processes that are critical to meeting the ATCI, client and project objectives are chosen for statistical monitoring to understand the performance and variation of the process.

The sub-processes listed in the table below are critical to meeting the project objectives; hence they are chosen for statistical monitoring and are tracked using control charts for the identified parameters.

The Sub-process Identification guidelines are used to identify the critical sub-processes. Other statistical techniques used to identify sub-processes for statistical monitoring are:

• Sensitivity Analysis

• Test of Hypothesis

|# |Business Objective/Goal |Sub process |Parameter |Chart used |Granularity and Frequency of plot |Sub process goals |
|M1 |Delivered Defect Rate | | | | | |
|M2 |Delivered Defect Density | | | | | |
|M4 |Cost Variance | | | | | |
|M5 |Schedule/End Date Variance | | | | | |

Monitoring the Selected Sub processes

The project will monitor the performance of the selected critical sub-processes to determine their capability to satisfy the project's objectives, and will identify corrective actions as necessary. Regular analysis will be done on sub-process performance with a focus on the following points:

1. Compare the project objectives to the control limits of the sub-process parameter

2. Monitor changes in the selected sub-process's process capability

3. Identify and document areas of improvement to increase process stability and capability

4. Take corrective action as necessary to address sub-process capability deficiencies

Analysis results will be documented in the control charts.
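As an illustration of the control-chart monitoring described above, a minimal XmR (individuals and moving range) sketch follows; the sample data and the choice of parameter are assumptions of this example, and the project's statistical tool remains the source of record for the actual charts and limits.

```python
from statistics import mean

# Minimal XmR (individuals / moving range) control chart sketch for one
# sub-process parameter. Sample data and the parameter are assumed.

observations = [4.1, 3.8, 4.5, 4.0, 5.2, 3.9, 4.3, 4.7]   # e.g. review defect rate per build

moving_ranges = [abs(b - a) for a, b in zip(observations, observations[1:])]
centre = mean(observations)
mr_bar = mean(moving_ranges)

ucl = centre + 2.66 * mr_bar   # standard XmR constant for individuals charts
lcl = centre - 2.66 * mr_bar

print(f"CL={centre:.2f}  UCL={ucl:.2f}  LCL={lcl:.2f}")
for i, x in enumerate(observations, start=1):
    flag = "OUT OF CONTROL" if not (lcl <= x <= ucl) else ""
    print(f"point {i}: {x:.2f} {flag}")
```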

5. Measures and their Granularity and the Mapping with the Tool

The following section gives the granularity requirements of measures for the various process types. Select the process types applicable to your project; otherwise mark them as not applicable.

E.g. Project A's activities involve Build, Assembly and Product Testing, but Project A still needs to spend time reviewing the Design as a transition-point activity, which helps reduce risk and improve quality and productivity. For Project A, the Design Review cell is applicable, while the Task and Rework activities are not applicable.

The tables given below shall be appropriately selected based on the type of project

5.1 Common measures across process types (AD, Testing and Support)

Defects

The attributes of a defect are given below. This needs to be captured for all lifecycle stages

• Injected stage

• Detected stage

• Severity

• Cause

• Priority

• Type

Defect attributes can be identified by different persons. The preferred role for identifying each attribute is given below so that the time spent is minimized. Mention which role will identify each attribute in the project.

|Attribute |Best identified by |
|Injected stage |Author for testing defects, Reviewer for review defects |
|Detected stage |Reviewer, tester |
|Severity |Reviewer, tester |
|Type |Reviewer, tester |
|Priority |Team lead, Project Manager |
|Cause |Author |
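A minimal sketch of how the defect attributes listed above could be captured as a record; the field names and example values are assumptions for illustration, not the schema of any project tool.

```python
from dataclasses import dataclass

# Minimal sketch of a defect record carrying the attributes listed above.
# Field names and example values are assumptions, not the schema of any tool.

@dataclass
class Defect:
    injected_stage: str   # e.g. "Design"
    detected_stage: str   # e.g. "System Test"
    severity: str         # e.g. "High"
    cause: str            # e.g. "Incomplete requirement"
    priority: str         # e.g. "Medium"
    defect_type: str      # e.g. "Logic"

d = Defect("Design", "System Test", "High", "Incomplete requirement", "Medium", "Logic")
print(d)
```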

Reviews

The attributes of reviews are given below

• Review Type – Desktop Review or Peer Review or SME Review or Group Review or Inspection

• Preparation effort

• Review effort

• Size of the work product

• Number of defects found

Risks

The attributes of risk are given below

• Probability

• Impact

5.2 Application Development Projects

The attributes of effort are given in the following Stage vs. Activity table. Mention which cells are applicable to your project.

|Stage |Task |Review |Rework |Requirement changes |
|Analyze* | | | | |
|Design* | | | | |
|Detailed Technical Design/Build Application Components* | | | | |
|Build – Component / Unit testing** | |NA | | |
|Test – Assembly / Integration testing** | |NA | | |
|Test – Product / System Testing** | |NA | | |
|Deploy – Acceptance and Warranty |NA |NA | | |
| | | | | |
|Plan – Project Mgmt | | | | |
|Project Training | | | | |
|Configuration Mgmt | | | | |
|Defect prevention | | | | |

Both planned and Actual effort is required for all the applicable activities.

NA – Activity type not applicable

* - Analyze, Design and Build includes the effort spent for Test scoping and test planning.

**- Testing includes the effort for Test setup, test execution and test reporting

Defects and reviews as given in common measures

Size

• FP / Number of logical Source lines of code / Normalized components

• Requirements count (initial and Final)

Schedule

• Start Date (planned, actual) for all the tasks

• End date (planned, actual) for all the tasks

5.3 Testing Projects

• Test Case

– Manual / Automation

– Created / Executed

– Added to the Repository / Reused from the repository

– Test cases given by the client

– Unique test cases executed

Definition of Test case - Test case will contain 10 steps

• Effort

• Defects

– Defects detected during each phase of life cycle which are applicable. Details like stage injected, stage detected, severity, defect type, cause etc are captured for review defects.

– For defects found in test execution, Severity is the only mandatory attribute.

5.4 Agile Projects

• Effort and schedule at sprint/story-point level. If phases/sub-processes are applicable, planned and actual effort and schedule need to be tracked at phase level.

• Defects with details of severity and phase injected/detected

• Number of components – Total/refactored

• Number of story points in the sprint

• Count of CRs, and effort added due to CRs
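A minimal sketch of rolling the sprint-level measures listed above into velocity and code refactoring percentage; the definitions used and all figures are assumptions of this example, not prescribed formulas.

```python
# Minimal roll-up sketch for the sprint-level measures listed above.
# The velocity and refactoring-percentage definitions are assumed, and all
# numbers are placeholders.

sprints = [
    {"completed_points": 27, "components_total": 40, "components_refactored": 6},
    {"completed_points": 30, "components_total": 42, "components_refactored": 4},
]

velocity = sum(s["completed_points"] for s in sprints) / len(sprints)
refactoring_pct = (
    sum(s["components_refactored"] for s in sprints)
    / sum(s["components_total"] for s in sprints) * 100
)

print(f"Average velocity: {velocity:.1f} story points per sprint")
print(f"Code refactoring percentage: {refactoring_pct:.1f}%")
```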

5.5 Measures and their Mapping with the Tool

This information is required for all the process types

|Sr. No |Measure |Unit |Typical Attributes of Measures |Tool used |Field in the tool used for capturing the data |Frequency of back up |Remarks |
| | | | | | | | |

6. Analysis Strategy

6.1 Method of Data Validation

Data validation can be done using Gage R&R. DART has in-built data validation rules, but tools by themselves will not ensure the integrity of the data. Formal validation by the project ensures the accuracy of the data and avoids mistakes.

|Sr. No |Measure |Frequency at which data validation will be done |Responsible person |
|1 |Effort |Weekly |PM |
|2 |Defects |Weekly |PM |
|3 |Size |Weekly |PM |
|4 |Schedule |Weekly |PM |
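DART's in-built rules are the primary check. As an illustration of the kind of formal, project-side validation that can be added, a minimal sketch follows; the rules, thresholds and field names are assumptions of this example, not DART's rules.

```python
# Minimal sketch of project-side data validation before submission to DART.
# Rules, thresholds and field names are assumptions for illustration.

STAGES = ["Analyze", "Design", "Build", "Test", "Deploy"]

def validate_effort(record: dict) -> list[str]:
    issues = []
    if record["actual_hours"] < 0:
        issues.append("actual effort cannot be negative")
    if record["actual_hours"] > 80:                       # assumed weekly ceiling
        issues.append("weekly effort above 80 hours, check for double booking")
    return issues

def validate_defect(record: dict) -> list[str]:
    issues = []
    if STAGES.index(record["detected_stage"]) < STAGES.index(record["injected_stage"]):
        issues.append("detected stage cannot precede injected stage")
    return issues

print(validate_effort({"actual_hours": 92}))
print(validate_defect({"injected_stage": "Build", "detected_stage": "Design"}))
```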

6.2 Method of Analysis and Reporting

Metrics shall be collected and analyzed at the completion/release of a module in case of Application Development Projects.

The Project Manager shall ensure that the derived metrics, analyzed at module release or on a monthly basis as the case may be, are submitted in DART. This data will be used by the Metrics team to generate the organization-wide Process Performance Baseline. For projects that do not use DART, the PM shall document the collected and analyzed metrics in the Metrics Analysis Report available in ATCI-QMS.

For projects following Agile methods, measures are listed in the Agile MCT

Trend analysis should be performed to evaluate and monitor the project's performance against plan over the project life cycle. In addition, outlier analysis and pattern analysis should be performed monthly to monitor and control the identified sub-processes.
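A minimal sketch of one way to flag trend and outliers in a monthly metric series; the slope-and-IQR conventions and all figures are assumptions of this example (statistics.linear_regression requires Python 3.10+).

```python
from statistics import linear_regression, quantiles   # Python 3.10+

# Minimal sketch of monthly trend and outlier analysis for one metric series.
# The slope-and-IQR conventions are assumptions; data is placeholder only.

months = list(range(1, 9))
cost_variance_pct = [2.0, 2.5, 1.8, 3.0, 2.2, 7.5, 2.8, 3.1]

slope, intercept = linear_regression(months, cost_variance_pct)
print(f"Trend: {'worsening' if slope > 0 else 'improving'} ({slope:+.2f} points/month)")

q1, _, q3 = quantiles(cost_variance_pct, n=4)
iqr = q3 - q1
outliers = [(m, v) for m, v in zip(months, cost_variance_pct)
            if v < q1 - 1.5 * iqr or v > q3 + 1.5 * iqr]
print("Outliers:", outliers)
```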

Reporting:

|Sr. No |Report |Frequency at which analysis and reporting will be done |Responsible person |
|1 |SSR reporting |Weekly |PM |
|2 |DART reporting for Governance through mDART |Weekly |PM |
|3 |Control Charts |Weekly/Monthly |PM |

7. Customer Satisfaction Survey

This section defines the frequency of conducting the customer feedback analysis for the project. The feedback received is analyzed and reported as part of the Organization Process Performance Baseline. Refer to the Customer Feedback process.

8. Verification and Validation

This section defines the verification and validation strategies that will be adopted by the project team.

For all reviews, the Review form or a project-specific tool will be used to document the review findings. All defects will be categorized in the form, the errors will be stated, and these will be tracked to completion. Any documented review defect whose fix is rejected by the reviewer will be sent back to the modifier, and the process may iterate until the reviewer accepts the change.

The closure of these defects will require verification/re-test by the reviewer. The following parameters will be considered to decide the type and extent of re-review when closing defects.

• The complexity of the application/module

• Business impact of the application/module

• Defect severity

• Number of components/objects/modules impacted by the defect as per Requirement Traceability Matrix

• Amount of rework effort spent to fix the defect

• Stage at which the defect was identified

This assessment, with details of the above parameters and the decision on the type and extent of re-review, needs to be captured as a separate row in the table below (a hypothetical scoring sketch follows).
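A hypothetical scoring aid for combining the parameters above into a re-review decision; the weights, thresholds and field names are all assumptions of this example, not a defined project rule.

```python
# Hypothetical scoring aid for deciding the extent of re-review after defect closure.
# Weights, thresholds and field names are assumptions of this example only.

def re_review_extent(defect: dict) -> str:
    score = 0
    score += {"Low": 1, "Medium": 2, "High": 3}[defect["severity"]]
    score += {"Low": 1, "Medium": 2, "High": 3}[defect["business_impact"]]
    score += min(defect["impacted_components"], 3)       # from the traceability matrix
    score += 1 if defect["rework_hours"] > 8 else 0      # assumed rework threshold
    if score >= 8:
        return "full re-review of the work product"
    if score >= 5:
        return "re-review of impacted components only"
    return "defect-fix validation only"

print(re_review_extent({"severity": "High", "business_impact": "Medium",
                        "impacted_components": 4, "rework_hours": 12}))
```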

For testing, the Test Conditions Report template or a project-specific tool can be used. Based on the process database, past project data and the criticality of the document or deliverable, the type of review to be conducted will be identified.

Type of Reviews: Peer Review, Group Review, SME review, Desktop review, Inspection, Structured Walkthrough, Round-robin Review

|Sl. No |Deliverables / Documents |Verification/Validation |Planned Date |Method |Responsibility |Remarks |
|1 |Requirement Definition Document / Detailed Design Document / Code |Review |Will be planned using MSPS/Project specific tool |Peer review |PM/TL/Tech Lead |The detailed planning & tracking will be done using MSPS/Project specific tool |
|2 |Tested Code |Testing |Will be planned using MSPS/Project specific tool |UT/ST/AT/IT |Developer/TL/Tester |The detailed planning & tracking will be done using MSPS/Project specific tool |
|3 |Tested Code |Re-review |Will be planned using MSPS/Project specific tool |Defect fix validation |Reviewer/Developer/Tester |Detailed assessment based on defined parameters |

9. Delivery Excellence (DEx) Activities
