How the Solar webProxy team applies DoR and DoD criteria in product testing

Hello! I am Ekaterina Vasilyeva, a testing engineer at Solar Group. In our work there is an eternal question: how do you make testing fast, high-quality, and effective? And do you know what helps? Proper organization of the process. At Solar, for example, we actively use the concepts of DoR (Definition of Ready) and DoD (Definition of Done) when testing products. Although these criteria are more common in development, they have proven incredibly useful for us testers. They help us understand clearly when a task is ready for testing and when we can breathe a sigh of relief and say: “Done!” As a result, deadlines are not missed and releases ship on schedule. In this article, using Solar webProxy as an example, I will explain how DoR and DoD help us improve the quality of testing and what difficulties we ran into when introducing these criteria.

What are DoR and DoD?

Definition of Ready (DoR) is a set of conditions that a task must meet before it is submitted for testing. In other words, DoR answers the question: “What is needed to start testing?” Here are some examples:

  • Sufficient time has been allocated to complete the task.

  • Resources are available to conduct testing: an appropriate test environment or the availability of the required hardware capacity for testing.

  • The employees’ competencies correspond to the assigned task, and the roles of each employee involved in testing are distributed.

  • There are no defects blocking testing.

  • The acceptance criteria that will be used to evaluate the success of testing are defined. Yes, Definition of Ready already uses Definition of Done a little bit, but more on that later.

  • And, of course, the requirements are formulated! Clear, concise, agreed upon with all stakeholders, containing enough information without being overloaded with it, with well-developed user stories and a transparent understanding of the functionality.

The more of these characteristics a task has, the easier and brighter the tester's life will be: the test design will be more complete and, as a result, quality control of the product more thorough. Defining the DoR ensures that the team has everything necessary to start working on tasks and can achieve the set goals.
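To make this less abstract, here is a minimal sketch of how a DoR checklist can be captured as a data structure and verified before testing starts. It is purely illustrative (the field names are my own shorthand), not the tooling we actually use:

```python
from dataclasses import dataclass, fields

@dataclass
class DefinitionOfReady:
    """Illustrative DoR checklist: every criterion must hold before testing starts."""
    time_allocated: bool = False
    test_environment_available: bool = False
    roles_assigned: bool = False
    no_blocking_defects: bool = False
    acceptance_criteria_defined: bool = False
    requirements_agreed: bool = False

    def unmet(self) -> list[str]:
        """Names of the criteria that are not yet satisfied."""
        return [f.name for f in fields(self) if not getattr(self, f.name)]

    def is_ready(self) -> bool:
        return not self.unmet()

task_dor = DefinitionOfReady(time_allocated=True, requirements_agreed=True)
print(task_dor.is_ready())  # False: the task is not ready for testing yet
print(task_dor.unmet())     # what still has to be resolved before testing begins
```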

The second term, Definition of Done, or “definition of completion,” establishes the criteria by which it is possible to determine that a testing task is complete. Examples include:

  • The test design was developed based on the requirements presented and agreed upon.

  • Test cases are successfully closed and functionality requirements are tested against the described use cases.

  • Testing was performed for all declared test environments.

  • All defects and critical errors that blocked testing have been fixed.

  • There are no unresolved non-critical bugs, or their number is reduced to a minimum, and they themselves are documented and included in the sprint backlog.
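The same idea extends naturally to DoD. A minimal sketch, again purely illustrative rather than our actual tooling:

```python
# Illustrative DoD checklist as a plain mapping: the task counts as "done"
# only when every exit criterion holds.
dod = {
    "test design agreed upon": True,
    "test cases passed": True,
    "all declared environments covered": True,
    "blocking defects fixed": True,
    "remaining minor bugs documented in the backlog": False,
}

def is_done(criteria: dict[str, bool]) -> bool:
    return all(criteria.values())

print(is_done(dod))                                  # False
print([name for name, ok in dod.items() if not ok])  # what still blocks closure
```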

One might think that since Definition of Ready governs the beginning of the process and Definition of Done its completion, these criteria are applied only at the start and end of testing. In fact, both sets of criteria are formed, agreed upon, and communicated to the whole team at the very start of the work, and then guide the process throughout.

What are the advantages of DoR and DoD in testing?

For effective testing, we need clear and unambiguous criteria by which the team immediately understands that it can start testing and can confidently declare the process complete. DoR and DoD allow us to more accurately estimate the scope of work and allocate time for completing tasks, understand what preparatory activities are required, and clearly plan the sprint.

So, DoR and DoD increase the transparency and quality of the testing process. However, in addition to the positive effect, they have their drawbacks.

Limitations of DoR and DoD in testing

It would seem that the more detailed and precise everything is defined before testing begins, the faster and more successful testing will be. True, but only up to a point.

DoR and DoD criteria that are too broad or too hard to satisfy, as well as overly strict adherence to them, can slow the testing process down. Instead of agreeing on sufficient requirements and starting testing, the team will keep carefully specifying unimportant aspects. This delays both the start and the completion of testing.

Implementing and maintaining Definition of Ready and Definition of Done itself requires time and effort from the team – an additional burden that can shift the product release date. The negative impact is felt especially at the start, when the team is still getting used to the criteria and building them into its process. There is a risk that the team underestimates the amount of work or fails to take all the necessary aspects into account. In addition, the criteria may change depending on the project or situation, so the team must update them periodically, which also costs resources.

Flexibility can also suffer when using DoR and DoD: overly rigid conditions can greatly limit the team's adaptability to change. Any change in the user story or product requirements will mean redoing the work of defining the DoR and DoD criteria.

Implementation of DoR and DoD in the testing process

Given the drawbacks described above, Definition of Ready and Definition of Done need to be used wisely, skillfully balancing between frustrating bureaucracy and a productive tool. Gradually, as the team applies these concepts, it will gain a better understanding of how to use them effectively to optimize the testing process.

Based on our experience, I recommend following these rules:

  • The criteria must be formulated clearly and unambiguously. All team members must understand and agree with the criteria.

  • The criteria should be reviewed and updated regularly. As the project and operating conditions change, the DoR and DoD may also require adjustment.

  • Make the concepts part of the daily process. Before starting work on a task, make sure it complies with the DoR, and before closing the task, check that all DoD items are met (see the sketch after this list).

  • Transparency and accessibility: the document with the terms of the DoR and DoD must be available to all team members.
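To make the third rule concrete, here is a toy sketch of how such day-to-day gating might look if expressed in code. All names are invented and this is not a real tracker integration: a task may enter testing only when its DoR holds, and may be closed only when its DoD holds.

```python
# Toy task lifecycle (hypothetical, illustrative only): DoR gates the start
# of testing, DoD gates closing the task.
class Task:
    def __init__(self, name: str, dor: dict[str, bool], dod: dict[str, bool]):
        self.name, self.dor, self.dod = name, dor, dod
        self.state = "todo"

    def start_testing(self) -> None:
        unmet = [k for k, v in self.dor.items() if not v]
        if unmet:
            raise RuntimeError(f"{self.name}: DoR not met: {unmet}")
        self.state = "in_testing"

    def close(self) -> None:
        unmet = [k for k, v in self.dod.items() if not v]
        if unmet:
            raise RuntimeError(f"{self.name}: DoD not met: {unmet}")
        self.state = "done"

task = Task("content filtering",
            dor={"requirements agreed": True},
            dod={"test cases passed": False})
task.start_testing()        # fine: the DoR holds
try:
    task.close()
except RuntimeError as err:
    print(err)              # DoD not met: ['test cases passed']
```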

Example of using DoR and DoD when testing the Solar webProxy product

Solar webProxy is an innovative software service for secure access to web resources. Its operating principle is similar to a firewall: it is installed as an intermediate element in the network infrastructure and controls all data transmitted between staff, internal company systems, and external Internet resources. Unlike a firewall, however, Solar webProxy offers higher efficiency and rich, flexible functionality for various network security tasks. It comes with a built-in antivirus, a firewall, a web resource categorization system, and content filtering, in which you can flexibly configure access policies for particular users to various resources and files, for example, prohibiting employees from uploading files with the .doc extension to the external network.
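To give a feel for the kind of rule just described, here is a hypothetical sketch of the decision such a policy encodes. This is not Solar webProxy's actual configuration format or API; all names are invented for illustration:

```python
# Hypothetical illustration of an upload-blocking rule; NOT the product's
# real policy syntax.
BLOCKED_UPLOAD_EXTENSIONS = {".doc"}

def decide(direction: str, filename: str) -> str:
    """Return 'block' for outbound files whose extension is forbidden."""
    ext = "." + filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    if direction == "upload" and ext in BLOCKED_UPLOAD_EXTENSIONS:
        return "block"
    return "allow"

print(decide("upload", "report.doc"))    # block
print(decide("download", "report.doc"))  # allow
```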

I'll tell you how we use DoR and DoD in practice on the Solar webProxy project using the example of developing and testing one of the features.

One of the product updates included the ability to flexibly configure content filtering for unrecognized file types and to monitor interactions with them, since such files are a potential threat to network security. Development of a new feature on our team always begins with a general meeting that includes analysts, stakeholders, developers, and testers. At such meetings, known in the Scrum methodology as PBR (Product Backlog Refinement), the team analyzes whether the task needs to be implemented. In this case, the trigger was a customer request.

Next, based on the user story and their own experience, each participant proposes additions or optimizations to improve the user experience, including in the area of testing.

The next step is creating the test design. The testing team, drawing on all the information received at the PBR, forms test cases. In addition to test cases for the user interface, practical functionality checks are carried out, including verifying that a policy for user interaction with files of unknown types can be configured: blocking downloads and uploads and displaying warnings.
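To illustrate what such a functional check might look like if automated, here is a hedged sketch: the proxy address, upload URL, and expected status code are all assumptions made for the example, not our real test infrastructure.

```python
# Hypothetical automated check that uploading a file of an unrecognized type
# through the proxy is blocked when the corresponding policy is enabled.
import requests

PROXIES = {"http": "http://proxy.example.internal:3128"}  # invented address

def test_unknown_extension_upload_blocked():
    resp = requests.post(
        "http://upload.example.com/files",            # invented endpoint
        files={"file": ("payload.xyz123", b"data")},  # unrecognized file type
        proxies=PROXIES,
        timeout=10,
    )
    # The proxy is expected to reject the transfer; 403 is an assumption here.
    assert resp.status_code == 403
```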

Next, after the test design has been approved, the test cases are executed, and defects found along the way are logged and fixed. Once the work is complete, a control check confirms that testing has passed: the feature functions according to the requirements, the test cases have been passed successfully, bugs have been closed or documented and moved to the backlog, and, where necessary, the feature's behavior has been described in the documentation.

The entire algorithm described above, including the stages of forming and applying the DoR and DoD criteria, is shown in the diagram below. This is just an example of how we use these tools on the Solar webProxy team, but you can take it as a basis, adapt it to your own business processes, and use it as a checklist for “readiness” and “completion”.

Results of applying DoR and DoD criteria in testing

The most important question: how do you test the maximum scope in the minimum time? The DoR and DoD criteria do not provide a ready-made answer. They are only tools that, in capable hands, become a powerful aid and help optimize the testing process. Try implementing them and adapting them to your business processes; perhaps this will help your product pass the testing stage more successfully and efficiently.

In conclusion, I will share a checklist that will help you compile your own list of criteria for applying the Definition of Ready and Definition of Done concepts in software testing.

Before taking a new task into a sprint, answer the following questions:

Questions for DoR formation

  • What does the implementation of the task help to solve?
    – Ready: it makes the functionality better, clearer, more convenient, or more efficient.
    – Not ready: the quality of the functionality will not change much.

  • What is the need for implementation?
    – Ready: the presence of this function is critical for the customer.
    – Not ready: it may someday be useful to one of the customers.

  • Has the task been agreed upon with all parties involved?
    – Ready: everyone involved confirmed the need to implement the task.
    – Not ready: the decision to take the task into work was made by a narrow circle of people.

  • What is the user story?
    – Ready: the story is transparent and reflects behavior close to real user behavior.
    – Not ready: the user story is unclear or missing.

  • Are there alternative ways to accomplish the task?
    – Ready: no, only working through the task directly will solve the problem.
    – Not ready: the problem can be solved in a less labor-intensive way.

  • How efficient is solving the problem this way?
    – Ready: the effort spent will pay off with a substantial contribution to the product's functionality.
    – Not ready: the result will be small compared with the effort spent.

  • What are the technical requirements for the task?
    – Ready: the technical requirements are clear to everyone involved.
    – Not ready: the technical requirements are vague and lack specificity.

  • Is there time to complete the task?
    – Ready: the task is included in the sprint and time is allocated for it.
    – Not ready: not enough time has been allocated for the task.

  • Are there defects that prevent the task from being completed?
    – Ready: there are no defects, or they are non-critical.
    – Not ready: there are blocking defects.

  • Is the test environment defined?
    – Ready: yes.
    – Not ready: no.

  • Has the test design been agreed upon and approved?
    – Ready: yes.
    – Not ready: no.

Questions for DoD formation

  • Have all test cases passed successfully?
    – Done: the test cases were completed fully and successfully.
    – Not done: there are test cases that did not pass.

  • Does the completed task meet the stated requirements?
    – Done: yes, the stated requirements are fully implemented.
    – Not done: the result differs from the stated requirements.

  • Is the user story reproduced in full?
    – Done: the described story was reproduced in full.
    – Not done: the user story cannot be fully reproduced.

Ekaterina Vasilyeva

1st category test engineer at Solar Group of Companies
