
"Calendar of the tester" for July. Analytics testing

Meet the authors of the July article in the Tester's Calendar, Andrey Marchenko and Marina Tretyakova, test analysts at Kontur. This month they talk about workflow models for analytics testing and how they started testing analytics before the development phase. Their experience will be useful for managers, testers, and analysts in mid-sized product teams that do not live at startup pace and for whom quality matters more than speed.





Analytics Testing Workflow Models


Model 1


The tester starts working with the analytics after the finished task has been handed over to them. They check the task, reading the analytics as documentation of what the developer did. Everyone makes mistakes, regardless of their level of professionalism, so defects can be in the analytics or in the developer's code.


Minuses:



Errors in the analytics that are only detected during testing are expensive or very expensive to fix.

Pros:



Model 2


The tester joins the task before it is handed over to development. They look at the prototypes for the task or simply read the documentation. The tester asks the analyst all their questions about the task, and the analyst promptly addresses the comments. The tester then compiles acceptance tests (a sketch of what these might look like is shown a little below).


Minuses:



Pros:



If the development team does not have peer review among analysts, testing the analytics improves product quality and reduces the risk of handing a task over to development with errors in the requirements.
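To make Model 2 more concrete, here is a minimal sketch of acceptance tests written directly from the analytics, before any code exists. The module and function names (`feedback_service`, `submit`, `get`) are invented for illustration; real tests would target whatever the reviewed task actually describes.

```python
import pytest

# The module under test does not exist yet, so the suite is written against
# the interface promised by the analytics and skipped until development
# delivers it. "feedback_service" is a hypothetical name.
feedback = pytest.importorskip("feedback_service")


def test_response_is_stored_after_submission():
    """Analytics: every submitted response must be recorded and stored."""
    response_id = feedback.submit(user_id=42, text="Great feature")
    assert feedback.get(response_id).text == "Great feature"


def test_empty_response_is_rejected():
    """Analytics: an empty response must not be accepted."""
    with pytest.raises(feedback.ValidationError):
        feedback.submit(user_id=42, text="")
```

Writing the tests this early also doubles as a check of the analytics itself: if a test cannot be formulated, the requirement is probably not testable.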


When we recommended the second model to testers working on the first model, we often heard:



The restructuring of the development process is a serious managerial task.

Embedding analytics testing in the development process


For almost a year now, the Kontur.Normativ project has had a mandatory "analytics testing" step embedded in its development process. The team did not arrive at this immediately; the impetus was the growing number of tasks returned from the testing stage back to analytics for further rework.


This was especially painful for large tasks with new functionality. Tasks handed over to the testing stage on the front end were raw: they often broke on the simplest scenarios and were implemented differently than intended because of ambiguous definitions and terms in the analytics.


An analytics testing process does not appear at the snap of a finger or by magic. It is hard work, but it can be broken down into stages.


Stage 0. Sell the team on the idea of analytics testing


You can easily imagine the situation: you suddenly get feedback on your work with comments, suggestions, and corrections. The first thought of any normal person will be: "Why did you decide to check up on me? Don't you trust me? Are you monitoring the quality of my work?" At this stage it is very important that the analyst does not get the feeling that the quality of their work is being assessed and that they will be dismissed if the testing goes badly.


Many of these questions can be defused if the idea is framed as: "this will let us learn about new functionality earlier, speed up the testing phase, and prevent defects before they reach the code."


When stage 0 is passed, you can move on.


Stage 1. Introduce analytics testing into the development process


Having convinced the team, we move on to embedding analytics testing in the daily workflow. Initially, the workflow looked like this:



And after the introduction of the new step:



If your team has several analysts who review each other's work before a task is handed over to development, then a higher-quality text reaches testing. Keep in mind that a review of the analytics is not the same as testing it; it is only a part of it.


Stage 2. Analytics testing


For some tasks, prototypes replace the text version of the analytics.


In that case the prototype is checked in the same way as the text would be. If prototypes complement the analytics, it is useful to look at the design layouts of the future functionality before reading the documentation. This is your only chance to look at the task as a user who has not read the specification and does not know how everything works and should work.


What can be checked in analytics:


1. The proposed solution satisfies the objectives of the task.


For example, if the goal of the task is to collect feedback from users, then the solution should involve recording and storing user responses.


2. Unambiguous interpretation.


For example, the wording "show information for the current day" can be interpreted in different ways: as "show information for the day selected in the settings" or as "show information for today's date."
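The difference becomes obvious once both readings are written down as code. A small sketch, where the settings object and repository are placeholders invented for illustration:

```python
from datetime import date


def info_for_day_from_settings(settings, repository):
    # Reading 1: "current day" means the day selected in the settings.
    return repository.info_for(settings.selected_day)


def info_for_today(repository):
    # Reading 2: "current day" means today's calendar date.
    return repository.info_for(date.today())
```

Until the analytics picks one reading, the developer and the tester may implement and check two different features.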


3. Feasibility of the solution.


Feasibility is the ability to implement the requirements written in the analytics given the known limitations of the development environment, the programming language, and algorithmic complexity. Good analysts keep in mind an algorithm by which the problem they have described could be solved. It is not a given that the developers will follow this algorithm (they know more and will find ways to make it optimal, etc.), but its very existence indicates that the task is feasible.


4. Testability.


It is not clear how to check that the condition "improve search results" is met. But if the condition is rewritten as "search results should be shown to the user within 1 second after pressing the Search control," it becomes clear how to check it.
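Once the condition is reworded this way, it maps directly onto an automated check. A minimal sketch, assuming a hypothetical `search()` entry point in the application under test:

```python
import time

from app import search  # hypothetical entry point of the feature under test


def test_search_results_shown_within_one_second():
    # The reworded requirement gives a concrete threshold to assert against.
    started = time.monotonic()
    results = search("invoice")
    elapsed = time.monotonic() - started

    assert results, "the user must see some results"
    assert elapsed < 1.0, f"search took {elapsed:.2f}s, the requirement is < 1 s"
```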


5. Presence of alternative scenarios.


In the wording "If the number and date are specified, we print the number and the date. If the date is not specified, we print only the number," not all scenarios are covered.
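The gap is easy to see if the wording is turned into one branch per combination of inputs. A sketch with a hypothetical `print_header` function:

```python
def print_header(number, date):
    # Covered by the wording in the analytics:
    if number is not None and date is not None:
        return f"{number} {date}"   # number and date are specified
    if number is not None and date is None:
        return str(number)          # date is not specified
    # Not covered by the wording:
    #  - the number is missing but the date is specified
    #  - neither the number nor the date is specified
    raise NotImplementedError("the analytics does not define these cases")
```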



6. Handling exceptions.


In the formulation "a document can be uploaded only in Excel format," it is not clear what should happen if we upload files in other formats and what error we will see when doing so.
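One way to remove the ambiguity is to have the analytics spell out both the accepted formats and the exact error shown for everything else, which then translates directly into code. A sketch with invented names:

```python
import os

ALLOWED_EXTENSIONS = {".xls", ".xlsx"}


class UnsupportedFormatError(Exception):
    """Raised for files that are not in an accepted Excel format."""


def validate_upload(filename: str) -> None:
    # The analytics should state the accepted formats and the exact
    # error message shown to the user for everything else.
    extension = os.path.splitext(filename)[1].lower()
    if extension not in ALLOWED_EXTENSIONS:
        raise UnsupportedFormatError(
            f"'{filename}' cannot be uploaded: only Excel files (.xls, .xlsx) are supported"
        )
```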


Artifacts of analytics testing


What artifacts can remain after analytics testing:



Developer checklist: mandatory, concise checks of the basic scenarios that must work before the task can be handed over to testing. It is also a tool for improving code quality: before submitting the task to testing, the developer runs through the checklist and quickly finds bugs on their own (a sketch of such a checklist is shown below).


The developer has to be persuaded to go through the tester's checklist, removing the internal objection of "they're checking up on me" by emphasizing "speeding up the process, speeding up testing, improving quality." As a result, our developers run through these checklists and are glad that it was not the tester who found the errors, but they themselves ("no one will know that I messed up in such a trivial scenario").
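The checklist itself can live anywhere, from a wiki page to a short smoke suite the developer runs before handing the task over. A sketch of the latter, with hypothetical module and scenario names:

```python
import pytest

# Only the concise, must-work scenarios go here; the developer runs the
# suite before the task goes to testing. "document_module" is a placeholder
# for the feature under development.
app = pytest.importorskip("document_module")


def test_document_with_number_and_date_is_printed():
    assert app.print_header(number=10, date="2018-07-01") == "10 2018-07-01"


def test_document_without_date_prints_only_the_number():
    assert app.print_header(number=10, date=None) == "10"


def test_non_excel_upload_is_rejected_with_a_clear_error():
    with pytest.raises(app.UnsupportedFormatError):
        app.validate_upload("report.pdf")
```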


What is the result


At first glance it seems that introducing a new stage into the development process will only increase time to market, but this is an illusion. At first, of course, the analytics testing process will be new and unpracticed, and the tester will spend more time on it. Later, with experience, they will be able to run it quickly. And the results obtained at the analytics testing stage will reduce the time spent on the actual testing stage and cut the number of returns to a minimum.


This analytics testing process has been implemented in several Kontur teams. The development teams gained a number of undeniable advantages:



We believe you will be able to gain these advantages as well by introducing the analytics testing stage in your project.


List of calendar articles:
Try a different approach
Reasonable pair testing
Feedback: as it happens
Optimize tests
Read the book
Analytics testing
The tester must catch the bug, read Kaner and organize a move.
Load service
Metrics in QA service
Test security
Know your customer
Disassemble backlog



Source: https://habr.com/ru/post/417287/

