Metrics of the 1C agile team in JIRA

It was inconvenient: stakeholders and the team had no clear picture of our progress, whether we were on time, or whether we were moving at the right pace. We realized we lacked JIRA analytics, metric visualizations, and dashboards to identify the team's growth areas, see where feature delivery time needed to be reduced, and track the dynamics from month to month and from sprint to sprint.

What productivity analytics did we use?

The purpose of collecting metrics is to provide real information about the progress of development, from idea to a ready business application.
The company had no ready-made tool, so we had to find and develop one ourselves.
We already had an issue tracker in JIRA, so we decided to use its capabilities and studied its task and sprint reports.

Creating workspaces and dashboards

One sprint report was not enough, and switching between reports was inconvenient, so we created a separate dashboard.

Sprint status reporting

We added various graphs and dashboards; at the top there are convenient filters by sprint, project, and status.

At any time, we could open the board and see the task statuses.

Pie chart by status

There were many different graphs, and we had to choose which ones would be useful for analyzing our work.

By product – plan vs. actual

We added a visual showing the percentage of tasks completed in the sprint: how many tasks were planned, how many were implemented, and the ratio between the two.
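As a back-of-the-envelope sketch (our own illustration of the metric, not a JIRA feature), the plan/actual ratio boils down to:

```python
def sprint_completion(planned: int, completed: int) -> float:
    """Percentage of planned sprint tasks that were actually completed."""
    if planned == 0:
        return 0.0
    return round(100 * completed / planned, 1)

# Hypothetical sprint: 20 tasks planned, 17 implemented.
print(sprint_completion(planned=20, completed=17))  # 85.0
```

The same ratio works for story points instead of task counts.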

By type of task

Dashboards can be built for whatever analytics you need; for example, calculated separately for "Products" (1C: Billing, 1C: Accounting, 1C: ZUP), and separately for project and non-project tasks.

Analytics on the number of tasks in the context of Sprint/Resolution

We added a dashboard for the performance metric – the number of closed tasks (resolution: Resolved or Canceled) in each sprint – to understand how fast we are moving.
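A minimal sketch of that count, assuming a hypothetical export of issues as (sprint, resolution) pairs rather than JIRA's actual API:

```python
from collections import Counter

# Hypothetical issue export: (sprint, resolution) pairs; None means still open.
issues = [
    ("Sprint 1", "Resolved"), ("Sprint 1", "Canceled"), ("Sprint 1", None),
    ("Sprint 2", "Resolved"), ("Sprint 2", "Resolved"),
]

CLOSED = {"Resolved", "Canceled"}  # resolutions we count as closed

closed_per_sprint = Counter(
    sprint for sprint, resolution in issues if resolution in CLOSED
)
print(dict(closed_per_sprint))  # {'Sprint 1': 2, 'Sprint 2': 2}
```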

We added a dashboard for remaining time broken down by tags; in the tags we kept the analytics we wanted to track (for example, separate tasks for personnel records, accounting, C&B, etc.).

Tasks can be estimated in hours or in story points. For more accurate planning of sprints and quarters, and for project management, we needed to know how much work the team can complete in a given period of time – a sprint.

Velocity is a Scrum metric that lets a team evaluate its performance and forecast future work. It is measured in story points completed per sprint. It helps the team determine how many tasks can be included in the next sprint, estimate when the project will be completed, and analyze its performance to look for ways to improve.
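The velocity arithmetic above can be sketched as follows; the sprint history and backlog size here are made-up numbers:

```python
import math

def velocity(points_per_sprint: list[int]) -> float:
    """Average story points the team completes per sprint."""
    return sum(points_per_sprint) / len(points_per_sprint)

def sprints_to_finish(backlog_points: int, avg_velocity: float) -> int:
    """Forecast: whole sprints needed to burn down the remaining backlog."""
    return math.ceil(backlog_points / avg_velocity)

# Hypothetical history of the last four sprints and a 120-point backlog.
v = velocity([21, 18, 24, 17])    # 20.0
print(sprints_to_finish(120, v))  # 6
```

In practice, teams often forecast against the worst and best recent sprints as well, to get a range rather than a single date.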

We added a dashboard for analyzing all Epics, tracking the progress of each one individually and how many tasks within each epic were resolved.

We added dashboards that showed us the number of issues created and resolved.

We used Burndown charts. They help you see whether the team takes on too few tasks when planning a sprint (if it always finishes ahead of schedule) or, on the contrary, plans too much and cannot complete everything within the sprint, carrying tasks over.
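The ideal burndown line that these charts plot against actual progress can be sketched like this (a simple linear model of our own, not JIRA's implementation):

```python
def ideal_burndown(total_points: int, sprint_days: int) -> list[float]:
    """Ideal remaining-work line: linear from the full scope down to zero."""
    return [total_points * (sprint_days - day) / sprint_days
            for day in range(sprint_days + 1)]

def pace(actual_remaining: float, ideal_remaining: float) -> str:
    """Compare actual remaining work against the ideal line for the same day."""
    if actual_remaining < ideal_remaining:
        return "ahead"   # finishing early: the sprint may be under-planned
    if actual_remaining > ideal_remaining:
        return "behind"  # carry-over risk: the sprint may be over-planned
    return "on track"

# Hypothetical 40-point sprint over 10 working days.
ideal = ideal_burndown(40, 10)
print(ideal[3])              # 28.0  (ideal points left after day 3)
print(pace(34.0, ideal[3]))  # behind
```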

What problems did we encounter?

  1. Management by numbers only. It is important not to rely on measured metrics in isolation, but to combine them with informal observations that reinforce each other. Otherwise you can get distortions: unimportant results despite beautiful numbers on the dashboards, and reduced employee morale.

  2. Too many metrics that do not help in decision making. We added every possible chart and tried every widget available on the board, for fear of forgetting to measure something. In fact, most of it turned out to be redundant. We spent time collecting and analyzing data, but the analytics did not help us make better decisions, so we made it a rule to ask ourselves:

    • What decisions will this metric help us make?

    • Which metrics are sufficient for making those decisions?

    • Do we need to measure this right now?

  3. Using the same metrics at all stages of the project and development. We faced the problem that the indicators we measured at the Design stage were no longer relevant at the Testing stage. By then we had other issue types in JIRA, other tags and statuses, so we urgently had to rebuild so that our dashboards and project progress stayed up-to-date. It is important to remember that when everything changes, you need to adapt flexibly.

  4. Metrics used for purposes other than analyzing and improving the product. It is important not to distort data for your own purposes or use it for punishment. A metric should not serve career growth alone, and there should be no manipulation of data. Metrics should primarily be aimed at team efficiency, achieving results, and product development.

Rules for using metrics

There is an opinion that metrics do more harm than good. That can happen if these useful tools are implemented incorrectly. To avoid it, follow these rules:

  1. Use metrics in conjunction with other tools to analyze the big picture.

  2. Use them only when they help improve decision making.

  3. Watch out for people gaming the metrics and changing their behavior to fit them.

  4. Interpret metric values carefully.

Rules like these have helped guide our team toward better metrics and, more importantly, better business results.

Bottom line

  • We understand the capacity, speed, predictability of the sprint and the project as a whole;

  • We increased the percentage of closed tasks and, thanks to forecasts, minimized tasks carried over from sprint to sprint;

  • We began introducing a culture of metrics in the company and sharing our practice with other development teams;

  • The development process has become more transparent for the team;

  • We increased the loyalty of business users, received positive feedback, and team morale improved!
