Test Plan (Unit and Integration) Template
TEST PLAN (UNIT AND INTEGRATION)
Project or System Name
U.S. Department of Housing and Urban Development
Month, Year
Revision Sheet
|Release No. |Date |Revision Description |
|Rev. 0 |5/30/00 |Test Plan (Unit and Integration) Template and Checklist |
|Rev. 1 |6/8/00 |Correction to text in Section 2.0 |
|Rev. 2 |4/10/02 |Conversion to WORD 2000 format |
Test Plan Authorization Memorandum
I have carefully assessed the Test Plan for the (System Name). This document has been completed in accordance with the requirements of the HUD System Development Methodology.
MANAGEMENT CERTIFICATION - Please check the appropriate statement.
______ The document is accepted.
______ The document is accepted pending the changes noted.
______ The document is not accepted.
We fully accept the changes as needed improvements and authorize initiation of work to proceed. Based on our authority and judgment, the continued operation of this system is authorized.
_______________________________ _____________________
NAME DATE
Project Leader
_______________________________ _____________________
NAME DATE
Operations Division Director
_______________________________ _____________________
NAME DATE
Program Area/Sponsor Representative
_______________________________ _____________________
NAME DATE
Program Area/Sponsor Director
TEST PLAN
TABLE OF CONTENTS
1.0 GENERAL INFORMATION
  1.1 Purpose
  1.2 Scope
  1.3 System Overview
  1.4 Project References
  1.5 Acronyms and Abbreviations
  1.6 Points of Contact
    1.6.1 Information
    1.6.2 Coordination
2.0 TEST DEFINITION
  2.x [Test Identifier and Type]
    2.x.1 Requirements to be Tested
    2.x.2 Expected Results
    2.x.3 Test Hierarchy
    2.x.4 Extent of Test
    2.x.5 Test Data
      2.x.5.1 Test Data Reduction
      2.x.5.2 Input Test Data Control
      2.x.5.3 Output Test Data Control
      2.x.5.4 Data Recovery
      2.x.5.5 Test Data Handling
    2.x.6 Input Commands
    2.x.7 Output Notification
    2.x.8 Support Software
    2.x.9 Error Handling
    2.x.10 Test Conditions
    2.x.11 Extent of Test
    2.x.12 Test Constraints
3.0 TEST EXECUTION
  3.1 Test Schedule
  3.2 Test Progression
  3.3 Test Criteria
    3.3.1 Tolerance
    3.3.2 Samples
    3.3.3 System Breaks
  3.4 Test Control
  3.5 Test Procedures
    3.5.1 Setup
    3.5.2 Initialization
    3.5.3 Preparation
    3.5.4 Termination
    3.5.5 Test Cycle Performance Activities
1.0 GENERAL INFORMATION
NOTE TO AUTHOR: Highlighted, italicized text throughout this template is provided solely as background information to assist you in creating this document. Please delete all such text, as well as the instructions in each section, prior to submitting this document. ONLY YOUR PROJECT-SPECIFIC INFORMATION SHOULD APPEAR IN THE FINAL VERSION OF THIS DOCUMENT.
The Test Plan (Unit and Integration) establishes the tests to be performed and the testing schedules, and identifies responsibilities for testing the system during development activities.
1.1 Purpose
Describe the purpose of the Test Plan.
1.2 Scope
Describe the scope of the Test Plan as it relates to the project.
1.3 System Overview
Provide a brief system overview description as a point of reference for the remainder of the document. In addition, include the following:
1. Responsible organization
2. System name or title
3. System code
4. System category
   a. Major application: performs clearly defined functions for which there is a readily identifiable security consideration and need
   b. General support system: provides general ADP or network support for a variety of users and applications
5. Operational status
   a. Operational
   b. Under development
   c. Undergoing a major modification
6. System environment and special conditions
1.4 Project References
Provide a list of the references that were used in preparation of this document. Examples of references are:
1. Previously developed documents relating to the project
2. Documentation concerning related projects
3. HUD standard procedures documents
1.5 Acronyms and Abbreviations
Provide a list of the acronyms and abbreviations used in this document and the meaning of each.
1.6 Points of Contact
1.6.1 Information
Provide a list of the points of organizational contact (POCs) that may be needed by the document user for informational and troubleshooting purposes. Include type of contact, contact name, department, telephone number, and e-mail address (if applicable). Points of contact may include, but are not limited to, helpdesk POC, development/maintenance POC, and operations POC.
1.6.2 Coordination
Provide a list of organizations that require coordination between the project and its specific support function (e.g., installation coordination, security, etc.). Include a schedule for coordination activities.
2.0 TEST DEFINITION
2.x [Test Identifier and Type]
This section provides a description of the unique name or descriptor of the test and describes the type of test and objective. Each test should be under a separate section header, 2.0 - 2.x.
Test types may include the following:
1. Requirements Validation
2. Functional Testing
3. Performance/Volume/Stress Testing
4. Security Testing
5. Ease of Use
6. Operational Testing
7. Documentation Testing
8. Procedure Testing
9. Interface Testing
Refer to Section 5.1 of the System Development Methodology (SDM) for a complete description of each of these test types.
2.x.1 Requirements to be Tested
Describe the system and program requirements that will be tested with this particular test.
2.x.2 Expected Results
Describe the expected results of the testing performed.
2.x.3 Test Hierarchy
Describe the location of the test in the hierarchy of the testing to be performed.
2.x.4 Extent of Test
Identify the software units and programs that are to be included in the test. Identify the program functions and interfaces that will be tested.
2.x.5 Test Data
Identify the test data required for the particular test, and describe it down to the data field level.
2.x.5.1 Test Data Reduction
The available techniques may include:
1. Manual collection and collation of system test output into test sequence order, followed by verification of the results
2. Semiautomatic inspection of test results as obtained by data recording means using a test data reduction program, followed by manual inspection of selected test results that do not lend themselves to complete reduction by automatic means
3. Automatic inspection of test results specifically recorded for manipulation by the test data reduction program
Test results as recorded include all items of test significance. The test data reduction program contains an image of correct data output for an item-by-item comparison of data, and provides a summary of an evaluated test as output. Describe the technique to be used for manipulation of the raw test data into a form suitable for evaluation, if applicable.
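As an illustrative sketch only (not part of this template), an item-by-item comparison of recorded test output against an image of correct output might look like the following; all item identifiers and values are hypothetical:

```python
# Illustrative sketch of automatic test data reduction: compare each
# recorded output item against an expected "image" of correct output,
# and produce a summary of the evaluated test. All names are hypothetical.

def reduce_test_data(recorded, expected):
    """Compare recorded output to the expected image, item by item."""
    mismatches = []
    for item_id, expected_value in expected.items():
        actual_value = recorded.get(item_id)
        if actual_value != expected_value:
            mismatches.append((item_id, expected_value, actual_value))
    return {
        "items_checked": len(expected),
        "items_failed": len(mismatches),
        "mismatches": mismatches,
        "passed": not mismatches,
    }

summary = reduce_test_data(
    recorded={"case_001": "OK", "case_002": "ERR-14"},
    expected={"case_001": "OK", "case_002": "OK"},
)
```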
2.x.5.2 Input Test Data Control
Describe the manner in which input data are controlled to test the system with a minimum number of data types and values; exercise the system with a range of bona fide data types and values that test for overload, saturation, and other “worst case” effects; and test the system with bogus data and values that test for rejection of irregular input.
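The three categories of input data described above can be organized explicitly so that each is exercised deliberately. The sketch below is illustrative only; all values are hypothetical:

```python
# Illustrative grouping of input test data into the three categories
# described above: bona fide values, "worst case" values, and bogus
# values that the system must reject. All values are hypothetical.

input_test_data = {
    # minimum set of bona fide data types and values
    "nominal": [1, 250, 999],
    # overload / saturation / "worst case" values
    "worst_case": [0, 10**9, -1],
    # bogus data and values that test for rejection of irregular input
    "bogus": ["not-a-number", None, ""],
}

def classify(value):
    """Return which category a candidate input value belongs to."""
    for category, values in input_test_data.items():
        if value in values:
            return category
    return "unclassified"
```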
2.x.5.3 Output Test Data Control
Identify the media and location of data produced by the test. Describe the manner in which output data are analyzed to detect whether an output is produced; evaluate output as a basis for continuation of the test sequence; and evaluate the test output against the required output to assess system performance.
2.x.5.4 Data Recovery
Describe how original data will be recovered before and after test execution.
2.x.5.5 Test Data Handling
Describe how test data will be identified, maintained, and version-controlled.
2.x.6 Input Commands
Describe the manner in which input commands are used to control initialization of the test, to halt or interrupt the test, to repeat unsuccessful or incomplete tests, to alternate modes of operation as required by the test, and to terminate the test. Describe job control language to be executed, if applicable.
2.x.7 Output Notification
Describe the manner in which output notifications (messages output by the system concerning status or limitations on internal performance) are controlled in order to:
1. Indicate readiness for the test
2. Provide indications of irregularities in input test data or test database because of normal or erroneous test procedures
3. Present indications of irregularities in internal processing of test data because of normal or erroneous test procedures
4. Provide indications on the control, status, and results of the test as available from any auxiliary test software
2.x.8 Support Software
Identify external or existing programs needed to support the test. Include version and release numbers if appropriate. Identify utilities or other data manipulation tools that will be used to create or modify test data, to create erroneous data, and to create “staged” data to test all system interfaces.
2.x.9 Error Handling
Describe procedures for reporting errors, test results, and reworking/retesting programs.
2.x.10 Test Conditions
Indicate whether the test is to be performed using the normal input and database, or whether some special test input is to be used.
2.x.11 Extent of Test
Describe the extent of testing employed by this test. Where limited testing is to be used, the test requirements will be presented either as a percentage of a well-defined, total quantity or as a number of samples of discrete operating conditions or values. Identify all test drivers and stubs. Discuss the rationale for adopting limited testing.
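A test driver and a stub of the kind identified here can be sketched as follows. This is illustrative only; the unit, the stub, and all values are hypothetical, not part of the template:

```python
# Illustrative sketch of a test driver and a stub: the stub stands in
# for an interface outside the extent of this test, and the driver
# exercises the unit under test with controlled inputs. All names and
# values are hypothetical.

def apply_surcharge(amount_cents, surcharge_lookup):
    """Unit under test: add a surcharge obtained through an external interface."""
    return amount_cents + surcharge_lookup()

def surcharge_stub():
    """Stub replacing the real surcharge service; returns a fixed value."""
    return 250

def run_driver():
    """Driver: call the unit with a controlled input and verify the result."""
    result = apply_surcharge(10_000, surcharge_stub)
    assert result == 10_250
    return result
```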
2.x.12 Test Constraints
Describe any anticipated limitations imposed on the test because of system or test conditions (timing, interfaces, equipment, personnel).
3.0 TEST EXECUTION
3.1 Test Schedule
Provide a listing or graphic depicting the locations at which the test will be scheduled and the timeframes during which the test will be conducted. Identify the duration of the test at each site.
3.2 Test Progression
Explain the manner in which progression is made from one test to the next, so that the cycle or activity for each test is completely performed.
3.3 Test Criteria
Describe the rules by which test results will be evaluated.
3.3.1 Tolerance
Discuss the range over which a data output value or system performance parameter can vary and still be considered acceptable.
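A tolerance of this kind amounts to a simple acceptance check. The sketch below is illustrative only; the 0.5 percent relative tolerance is a hypothetical value, not a requirement of this template:

```python
# Illustrative tolerance check: a data output value is accepted if it
# falls within a stated tolerance of the required value. The 0.5%
# relative tolerance used here is a hypothetical example.
import math

def within_tolerance(actual, required, rel_tol=0.005):
    """Return True if actual is within rel_tol of the required value."""
    return math.isclose(actual, required, rel_tol=rel_tol)
```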
3.3.2 Samples
Establish the minimum number of combinations or alternatives of input conditions and output conditions that can be exercised to constitute an acceptable test.
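One way to enumerate such combinations is to take the Cartesian product of the discrete input conditions. The sketch below is illustrative only; the conditions named are hypothetical:

```python
# Illustrative sketch: enumerate the combinations of discrete input
# conditions that constitute a minimum acceptable sample set. The
# condition names and values are hypothetical.
from itertools import product

record_types = ["new", "update", "delete"]
user_roles = ["clerk", "supervisor"]

# 3 record types x 2 user roles = 6 combinations to exercise
sample_set = list(product(record_types, user_roles))
```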
3.3.3 System Breaks
Discuss the maximum number of interrupts, halts, or other system breaks which may occur because of non-test conditions.
3.4 Test Control
Indicate whether the test is to be controlled by manual, semiautomatic, or automatic means.
3.5 Test Procedures
3.5.1 Setup
Describe or refer to standard operating procedures that describe the activities associated with setup of the computer facilities to conduct the test, including all routine machine activities.
3.5.2 Initialization
Itemize, in test sequence, the activities associated with establishing the testing conditions, starting with the equipment in the set-up condition. Initialization may include such functions as:
1. Readout of control function locations and critical data from indicators and storage locations for reference purposes
2. Queuing of data input values for the test
3. Queuing of test support software
4. Coordination of personnel actions associated with the test
3.5.3 Preparation
Describe, in sequence, special operations such as:
1. Inspection of test conditions
2. Data dumps
3. Instructions for data recording
4. Modifications of the database
5. Interim evaluation of results
3.5.4 Termination
Itemize, in sequence, the activities associated with termination of the test, such as:
1. Readout and location of critical data from indicators for reference purposes
2. Termination of operation of time-sensitive test support software and test apparatus
3. Collection of system and operator records and logs of test performance and results
3.5.5 Test Cycle Performance Activities
Describe the step-by-step procedures to perform the activities in the test (run test, verify results and correct errors, rerun test). Assign each step a number; this number, along with critical test data and test procedure information, shall be recorded on a test procedure form for test control and recording of results.
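The run / verify / rerun cycle described above can be sketched as a simple loop that records each numbered step for the test procedure form. This is illustrative only; the test function, rerun limit, and step records are hypothetical:

```python
# Illustrative sketch of the test cycle: run the test, verify the
# result, and rerun (after correction) until it passes or a rerun
# limit is reached. Each step is recorded with its assigned number,
# as on a test procedure form. All names and values are hypothetical.

def run_test_cycle(test_fn, max_reruns=3):
    """Run/verify/rerun a test; return the recorded steps."""
    steps = []
    for attempt in range(1, max_reruns + 1):
        passed = test_fn()
        steps.append((attempt, "run test", "pass" if passed else "fail"))
        if passed:
            break
    return steps

# Example: a test that fails once, is corrected, then passes on rerun.
outcomes = iter([False, True])
steps = run_test_cycle(lambda: next(outcomes))
```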