
What Is a Test Plan and How to Write One?


Well-written documentation lies at the core of an efficient QA process. Documentation brings structure and logic to the testing activities. In a way, it unites project team members around the same goal, providing a clear understanding of the hierarchy, tasks, and expected outcomes.

One such document is a Test Plan. In this article, you will learn:

  • what a software Test Plan is;
  • how it is different from a Test Strategy;
  • what role it plays in the project;
  • types of plans;
  • why a team needs a Test Plan;
  • and, most importantly, how to draw it up.

Before proceeding to the definitions and explanations, we’d like to clarify one more important term from the QA field – a test artifact. Test artifacts are by-products generated during the software testing process and shared with the project team. Simply put, these are documents that bridge the communication gap between all team members. Now, let’s move on to a Test Plan and its specifics.

What Is a Test Plan?

A Test Plan is a test artifact that describes the actions to take place during the testing process – from the overall strategy to the criteria for identifying errors. It also outlines the logic of completing the tasks, includes a risk assessment, and explains how those risks can be resolved effectively.

Test Plan Structure

A Test Plan has a clearly defined structure established by IEEE 829 – the industry standard for software and system test documentation. Thus, you can prepare a template and use it for every project, filling it with specific data.

Building a Test Plan document based on the IEEE 829 standard has many benefits. First and foremost, familiarity with the document structure makes it easier to write and use a Test Plan. The IEEE 829 standard eliminates any futile discussions about what to include and in what order. Instead, a Team Lead and QA Engineers focus on other activities.

So, according to the IEEE 829 standard, a Test Plan should consist of 19 items:

  1. Test Plan Identifier
  2. References
  3. Introduction
  4. Test Items
  5. Software Risk Issues
  6. Features to Be Tested
  7. Features Not to Be Tested
  8. Approach
  9. Item Pass/Fail Criteria
  10. Suspension Criteria and Resumption Requirements
  11. Test Deliverables
  12. Remaining Test Tasks
  13. Environmental Needs
  14. Staffing and Training Needs
  15. Responsibilities
  16. Schedule
  17. Planning Risks and Contingencies
  18. Approvals
  19. Glossary

1) Test Plan Identifier

In this section, we add the QA provider’s name and company logo, the name of the document, its version, and the year of creation. In other words, it is the title page of your Test Plan.

2) References

The next step is to include the history of the document. For this, add the Table of Changes. It should contain the following columns:

  • date;
  • version;
  • description;
  • and author.

With this table, a team can record and track changes, which helps manage the document and the process it describes efficiently.
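
For illustration, filled-in rows of such a table might look like this (the dates, versions, and names below are hypothetical):

  Date        Version  Description                        Author
  2024-03-01  1.0      Initial version of the Test Plan   J. Smith
  2024-03-15  1.1      Added Features Not to Be Tested    J. Smith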

3) Introduction

Here, we briefly state what we are going to do during the project. The introduction is a note to a client. In a couple of sentences, describe what services the QA team will provide and why. For example:

Our company provides Functional and UI testing to detect bugs in the software product before the release. We perform the detailed testing of the stated functionality to help achieve the set business goals for your software product.

Include all the types of testing you agreed to cover, but don’t go into detail. It is enough to generalize at this stage.

4) Test Items

Test Items are the general areas of functionality to be tested – for example, installation, registration, checkout, etc. In a way, it is a short description of the content of the Test Plan. Later on, each of the items will be explained in detail. The list can be expanded or shortened depending on the task or type of testing.

5) Software Risk Issues

In this section, we describe the risks a team may face during testing. For example, if a deadline is set for the summer period, it is reasonable to assume that people may take vacations, and the delivery date may slip. You should consider both human resources and technical aspects, mentioning everything significant in the document.

6) Features to Be Tested

This part covers what the majority imagines a Test Plan should present – a more precise list of features to inspect during a specified time. For example, we mentioned the checkout functionality in Test Items. In Features to Be Tested, we should list separate components of the flow: entering delivery information, choosing a payment method, order confirmation, etc. As for the timing, a client and a QA company discuss it before documentation writing.

7) Features Not to Be Tested

Here, you list the features that the QA team is not going to test for a particular reason. It doesn’t matter why this scope is not covered. Just don’t forget to state which features remain outside your tasks and are the client’s responsibility.

8) Approach

Then, we describe the techniques and types of testing we are going to use. We also include test cases in this section. This way, a client gets a complete picture of the testing activities.
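
To give an idea of the level of detail, a single test case entry in this section might look like this (the identifier, steps, and expected result below are hypothetical):

  Test Case ID: TC-017
  Title: Checkout – pay with a saved credit card
  Preconditions: the user is logged in and has at least one item in the cart
  Steps: 1. Open the cart. 2. Proceed to checkout. 3. Select the saved card. 4. Confirm the order.
  Expected Result: the order is placed, and a confirmation email is sent.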

9) Item Pass/Fail Criteria

Each test case is assigned a ‘Pass’ or ‘Fail’ status depending on two criteria:

  • the existence and severity of bugs;
  • the level of successfully executed requirements.

And don’t forget to identify entry and exit criteria for the testing. They mark the beginning and the end of the testing process.

An entry criterion describes what needs to be done before testing begins. For example, you may need to have:

  • finalized reference documents;
  • a software product packaged the way customers will get it;
  • specific software utilities, configuration files, or data;
  • product requirements and other documentation, etc.

The entry criteria show whether the team is ready to start testing. It is helpful to make a list of items to use as inputs and request the materials necessary for running the tests.

An exit criterion describes what you consider necessary to complete a test. QA teams often try to make exit criteria a condition for software delivery, but it is not realistic. This decision is for a Product Owner (or another person in charge) to make.

Here’s an example of test exit criteria:

All scheduled tests have been completed, all fixed defects have been checked, notifications of all new defects found have been issued. All failed points, such as failure of a certain set of tests due to hardware malfunction, have been documented.
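
Returning to the pass/fail criteria, the decision logic can be made very concrete. Below is a minimal sketch in Python; the severity levels and the 95% requirement-coverage threshold are hypothetical examples – such figures are agreed upon per project, not prescribed by any standard:

  # A minimal sketch of item pass/fail logic; thresholds are hypothetical examples.
  def item_passes(open_bugs, executed_ok, total_requirements):
      """Return True if a test item meets the agreed pass criteria."""
      # Criterion 1: no open bugs of critical or high severity.
      has_blocking_bug = any(bug["severity"] in ("critical", "high") for bug in open_bugs)
      # Criterion 2: at least 95% of the requirements executed successfully.
      coverage = executed_ok / total_requirements
      return not has_blocking_bug and coverage >= 0.95

  # Example: one open medium-severity bug, 58 of 60 requirements passed.
  print(item_passes([{"id": "BUG-12", "severity": "medium"}], 58, 60))  # True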

10) Suspension Criteria and Resumption Requirements

The suspension/resumption criteria describe what happens if it becomes impossible to continue testing due to defects. In other words, if things go so badly that the planned testing activities cannot continue, they have to stop until the blocking defects are eliminated.

11) Test Deliverables

As the name suggests, here we inform the client about the materials they will receive as evidence of the work done. Test deliverables usually show testing outcomes in the form of metrics: the number of tests completed, bugs found, etc. In a way, metrics are numerical indicators of quality, though they shouldn’t be the only criterion for estimating the work done.
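
For illustration, here is a minimal sketch of how such metrics could be derived from raw test results; the data structure and field names are hypothetical and serve only to show the idea:

  # A minimal sketch of turning raw test results into deliverable metrics.
  results = [
      {"case": "TC-001", "status": "passed"},
      {"case": "TC-002", "status": "failed", "defect": "BUG-101"},
      {"case": "TC-003", "status": "passed"},
      {"case": "TC-004", "status": "blocked"},
  ]

  total = len(results)
  passed = sum(1 for r in results if r["status"] == "passed")
  failed = sum(1 for r in results if r["status"] == "failed")
  defects = sorted({r["defect"] for r in results if "defect" in r})

  print(f"Tests executed: {total}, passed: {passed}, failed: {failed}")
  print(f"Pass rate: {passed / total:.0%}, defects reported: {defects}")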

12) Remaining Test Tasks

An SDLC can be unpredictable. Sometimes it takes more time than initially expected to test a product and deal with risks. If the deadlines are tight, some parts of the functionality may remain untested. In this case, the team still includes the left-out tasks in the Test Plan. This section can also describe the scope of work to cover in case all the tasks are closed before the deadline.

13) Environmental Needs

In general, this section summarizes the hardware needed for testing. Here, we also mention the tools used during testing. If needed, you can describe the equipment and its features. This is necessary if, for example, a project requires a VR kit, specific devices that need to be purchased, etc.
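
As a hypothetical example, an environment description might list something like:

  Devices: iPhone 13 (iOS 17), Samsung Galaxy S22 (Android 14)
  Browsers: Chrome 124, Safari 17, Firefox 125
  Environment: a staging server mirroring the production configuration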

14) Staffing and Training Needs

If we get a task to test software for nuclear reactors, it is likely that the team won’t fully understand the specifics. Exaggerations aside, when the team is to test a project from an industry they are not familiar with, it makes sense to have a lecture or short course from experts. It will help to understand the particularities of a project and make the work more efficient.

15) Responsibilities

This clause describes the areas of responsibility of each member of the QA team. A convenient way of providing this information is a table with three columns – name, position, and responsibility.
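
For example, rows of such a table could look like this (the names and scope are hypothetical):

  Name       Position       Responsibility
  A. Brown   QA Team Lead   Test planning, reporting, communication with the client
  M. Green   QA Engineer    Functional testing of the checkout flow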

16) Schedule

A Test Plan should also feature project deadlines. The team needs to estimate the pace of work – that is, how much time the testing will take. If there are several testing phases, clarify their order and timing.
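
For instance, a phased schedule could be laid out like this (the phases and dates are hypothetical):

  Phase                 Start     End
  Smoke testing         June 3    June 4
  Functional testing    June 5    June 14
  Regression testing    June 17   June 21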

17) Planning Risks and Contingencies

This section overlaps with the Software Risk Issues mentioned above. In addition to the list of risks, we provide explanations on how to handle those risks and what to do in force majeure circumstances.

18) Approvals

Every test document features the names or positions of the specialists who prepared it and of those who should approve it and give the team the green light to use it.

19) Glossary

Finally, there is a section where you can explain the core concepts used in the Test Plan. A Glossary helps prevent misinterpretation of the terminology.

Types of Test Plans

Despite the standard structure, there are several types of test plans. The differentiation is based on the particularities of the described tasks and the scope of work. To be more specific, QA teams tend to use the following classification:

  • Level-specific test plans – unit, integration, system, and acceptance test plans.
  • Type-specific test plans – functional test plan, performance test plan, usability test plan, automation test plan, etc.
  • Master Test Plan – a comprehensive QA Test Plan. It features high-level information, structures the testing process in detail, and rarely undergoes changes or revisions.

Compared to a Test Plan, a Master Test Plan is more static. That’s the key difference. As a rule, a project team uses one Master Test Plan and several shorter Test Plans for different levels or types of testing that describe individual modules of the same application.

Regardless of the type, you can create a Test Plan without any specific tools. You may come across the phrase “Test Plan management tools,” but the wording is a bit off. A Test Plan is a document, and the only tool you need to manage it is a text editor. Usually, people actually mean test management tools, like TestRail, Testpad, QMetry, Kualitee, etc. Perhaps it’s another popular tool, Azure DevOps Test Plans, that causes the confusion.

Test Plan and Test Strategy – What’s the Difference?

Many find it complicated to tell the difference between a Test Strategy and a Test Plan in software testing. Nevertheless, these are two different documents. In short, a Test Plan is more detailed and covers more aspects than a Test Strategy. The latter is often used at the organizational level and rarely changes. Meanwhile, a Test Plan is more dynamic and used at the project level. A Strategy is usually a part of a Plan.

There is also a QA strategy that stretches beyond testing and encompasses other quality assurance activities and methodologies. You can learn more about what’s inside a QA strategy in one of our previous posts.

To Sum Up

As you can see, a Test Plan is voluminous, often difficult to write, but a crucial testing artifact. It guides the team through a well-structured testing process, preventing a lot of stressful situations and misunderstandings. Moreover, a Test Plan helps team members stay on the same page since stakeholders and developers can have access to it.

Test Plan writing requires strong analytical skills, attention to detail, and the ability to think several moves ahead. Though complicated to develop, the result always pays off. So, even if documentation writing seems less interesting than testing, only together do they make the QA process efficient.

Ready to speed up the testing process?