What is a test plan?
A test plan is a document that details the objectives, strategy, standards, schedule, and resources for testing the whole or part of a software product. It can be a master plan or a plan for a specific kind of testing like unit testing, integration testing, usability testing, performance testing, etc.
To put it simply, a test plan answers questions like:
- What is the scope of testing?
- What will be tested?
- How will it be tested?
- Which tools will be used to automate testing?
- Which tool will be used to manage the testing process?
- When will test cases be written?
- Who will write them?
- Who will execute them?
- Under what conditions will testing be deemed a success?
- How much time will all this take?
- How much will it cost?
What does a test plan contain?
Different organizations use different formats. However, let's look at the popular IEEE 829 format, which is relatively simple to understand, and walk through its contents with some real-world examples.
1. Test Plan Identifier
A unique string to identify this test plan. E.g., PR-01-M-v1.2
If you have multiple test plans, use a naming convention to identify which represents what easily. For example, suppose you have different plans for different testing levels. In that case, you can use the characters U, I, S, and M in the identifier to represent unit, integration, system, and master, respectively.
It is also beneficial to include the version number in the identifier to avoid confusion between different versions.
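A convention like this is easier to keep consistent if it is checked mechanically. The sketch below is one way to validate identifiers; the exact pattern (project code, plan number, level letter, version) is an assumption for illustration, not part of IEEE 829.

```python
import re

# Hypothetical pattern: project code, plan number, level (U/I/S/M), version.
# E.g., PR-01-M-v1.2 -> project "PR", plan 01, master plan, version 1.2.
IDENTIFIER = re.compile(
    r"^(?P<project>[A-Z]{2,})-(?P<number>\d{2})-(?P<level>[UISM])-v(?P<version>\d+\.\d+)$"
)

def parse_plan_id(plan_id: str) -> dict:
    """Return the identifier's parts, or raise ValueError if it is malformed."""
    match = IDENTIFIER.match(plan_id)
    if match is None:
        raise ValueError(f"Invalid test plan identifier: {plan_id!r}")
    return match.groupdict()

print(parse_plan_id("PR-01-M-v1.2"))
# {'project': 'PR', 'number': '01', 'level': 'M', 'version': '1.2'}
```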
2. References
List all documents, along with their correct version numbers, that form the basis of this test plan. For example:
- Requirements specifications
- System specifications
- Design documents
- Test process standards
- Development standards
- Compliance requirements
3. Executive Summary
Write a summary of this plan's objectives and scope. Include other details that an executive may find noteworthy.
4. Test Items
List the technical details of items to be tested. These items should not be from an end-user perspective. For example, you can write that the User-Authentication-SSOv1.2 module will be unit tested here. You can also refer to the software specification documents here for clarity.
5. Software Quality Risks
Identify all risks related to the quality of the software under test. For example:
- Dependency on a third-party product
- Ability to use a new automation tool
- Quality of test cases to test complex algorithms
- Inheriting components with a history of poor quality
- Unsatisfactory results from unit testing
- Poor documentation that can lead to misinterpretation of customer requirements
- Government regulations
6. Features to be Tested
List all the end-user features that you will test in this release. You can include their impact or risk using a simple rating scale like High, Medium, and Low. Unlike section 4, which is technical, you should write this section from the perspective of end-users, omitting technical jargon, acronyms, and version numbers.
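A simple rating scale like this also gives you an ordering: higher-risk features deserve attention first. The sketch below shows one way to encode it; the feature names and their ratings are hypothetical examples.

```python
from enum import IntEnum

class Risk(IntEnum):
    """Simple High/Medium/Low rating scale for end-user features."""
    LOW = 1
    MEDIUM = 2
    HIGH = 3

# Hypothetical end-user features with assumed risk ratings.
features = [
    ("Password reset by email", Risk.HIGH),
    ("Profile photo upload", Risk.LOW),
    ("Single sign-on login", Risk.HIGH),
    ("Dark mode toggle", Risk.MEDIUM),
]

# Address the highest-risk features first.
for name, risk in sorted(features, key=lambda f: f[1], reverse=True):
    print(f"{risk.name:6} {name}")
```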
7. Features not to be Tested
In this section, you will list all the end-user features that you will not test in this release, along with their reasons like:
- The component is already used in a production system and is stable.
- The feature will be released but neither documented nor available to end-users.
- The functionality will be marked as pre-alpha and only released to get end-user feedback.
8. Test Strategy
If this plan is for a specific level (e.g., unit testing), it should be consistent with the master plan. Your test strategy should answer questions like:
- Which metrics will be collected? (e.g., Defect Density)
- Which automation and coverage tools will be used and where?
- What will be the hardware and software environments?
- How will each component be tested?
- On what factors will test cases be selected for regression testing?
- What constitutes smoke testing?
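Of the metrics mentioned, defect density is one of the most common. It is typically computed as defects per thousand lines of code (KLOC); the helper below is a minimal sketch of that calculation.

```python
def defect_density(defect_count: int, lines_of_code: int) -> float:
    """Defect density expressed as defects per thousand lines of code (KLOC)."""
    if lines_of_code <= 0:
        raise ValueError("lines_of_code must be positive")
    return defect_count / (lines_of_code / 1000)

# 12 defects found in a 24,000-line module -> 0.5 defects per KLOC.
print(defect_density(12, 24000))  # 0.5
```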
9. Pass and Fail Criteria
This section should explain the completion criteria of this plan. It will depend on the testing level. For example, if the plan is for the unit testing level, the pass criteria could include the following conditions:
- All test cases have been peer-reviewed.
- The code coverage tool has indicated comprehensive coverage by the unit test cases.
- All test cases have been executed.
- There are no high severity bugs.
- The defect density is below the specified threshold.
The master-level criteria could include:
- All lower-level plans have passed
- No open high-risk bugs
- All open bugs have been reviewed and classified as low-risk
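Criteria like the unit-level ones above lend themselves to a mechanical check. The sketch below is a minimal illustration; the 95% coverage threshold and 0.5 defects/KLOC ceiling are assumed values, not requirements of the standard.

```python
def unit_level_passed(cases_peer_reviewed: bool,
                      coverage: float,
                      cases_executed: int,
                      cases_total: int,
                      high_severity_bugs: int,
                      density: float,
                      max_density: float = 0.5) -> bool:
    """Apply illustrative unit-level pass criteria.

    The 0.95 coverage threshold and 0.5 defects/KLOC ceiling are
    assumptions for this sketch, not IEEE 829 requirements.
    """
    return (cases_peer_reviewed                  # all test cases peer-reviewed
            and coverage >= 0.95                 # comprehensive coverage
            and cases_executed == cases_total    # all test cases executed
            and high_severity_bugs == 0          # no high severity bugs
            and density < max_density)           # defect density below threshold

print(unit_level_passed(True, 0.97, 120, 120, 0, 0.3))  # True
```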
10. Suspension and Resumption Criteria
Sometimes you reach a point where continuing with testing adds no value; you are just wasting resources. For example, if a primary test case fails, there is no point in continuing because the failure indicates a fundamental problem with the code, and fixing it would require re-running all the test cases anyway. The prudent approach is to stop testing, have the developer fix the code, and then retest the entire test suite.
This section lists the conditions under which you should pause testing. For example:
- When a high-risk test case fails
- After three medium severity defects in a module
- After any bug in the security authorization code
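Suspension conditions like these can also be encoded as a simple check that a test manager (or a CI gate) evaluates after each test run. The sketch below is a hypothetical illustration of the three conditions listed above.

```python
def should_suspend(high_risk_case_failed: bool,
                   medium_defects_per_module: dict,
                   security_auth_bugs: int) -> bool:
    """Return True when any of the suspension conditions is met:
    a high-risk test case failed, three or more medium severity
    defects accumulated in one module, or any bug was found in the
    security authorization code.
    """
    return (high_risk_case_failed
            or any(count >= 3 for count in medium_defects_per_module.values())
            or security_auth_bugs > 0)

# Three medium severity defects in the "checkout" module -> suspend.
print(should_suspend(False, {"checkout": 3, "search": 1}, 0))  # True
```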
11. Test Deliverables
While developers will provide the software product, what will the QA team deliver? This section answers this question. Some items could be:
- Test plan documents for lower levels
- Test cases
- Tools and their outputs
- Data generators
12. Remaining Tasks
If this plan only addresses a phase or step of the master plan, this section will list the remaining items in the master plan so that the authors of the following one can quickly start from where this plan ends without wasting time.
13. Environmental Requirements
This section lists the hardware and software required to perform the testing described in this plan.
14. Staffing and Training
This section lists the roles required for testing. It also states the training requirements: for the software itself and for any automation tools needed for testing.
15. Roles and Responsibilities
This section lists the roles and responsibilities required for various activities like:
- Deciding risk for test items
- Designing test cases
- Creating test data
- Training and configuration of automation tools and other management software needed for testing
- Executing test cases
- Making qualitative go/no-go decisions
16. Schedule
Create a schedule for this plan based on validated estimates of effort and the availability of personnel. It will depend on the development schedule, since you can only test completed code. Therefore, the plan should schedule tasks relative to development dates.
A good schedule will have tasks organized by subtasks, dependencies, constraints, and milestones.
This section should also bring out critical aspects of the schedule and include contingency plans.
17. Project Risks and Contingencies
This section should list the risks associated with this plan and their mitigation plans. Unlike section 5, which deals with risks to software quality, this section deals with project risks.
A few examples of project risks are:
- Lack of resources
- Unavailability or delay of required hardware, software, or tools
- Delays in training
- Changes in customer requirements
Like any other risk management process, you should start with risks obtained from brainstorming and interviewing people involved in this test plan.
18. Approvals
In this section, you describe the approval process for this test plan. You can also include the approval processes for other milestones. A general guideline is to have business people sign off on business-related items and technical staff sign off on technical items.
19. Glossary
Significant problems often start with minor miscommunications. Use this section to define uncommon terms or acronyms used in this test plan.