Building a system for monitoring the quality of frontend projects

Hello everyone! Let’s continue the series of stories about what the Web Core team at DomClick is doing. In the previous article, we described how we create a design system. This time I would like to share the story of how we developed Front Radar, a quality monitoring system for frontend projects. Our company has many projects created by a large number of teams, so we needed a way to check how up to date those projects are and to identify problem areas that negatively affect the client experience.

The goals we pursued when implementing this project:

  • evaluate how up to date the project is (timely updates of dependencies);

  • check for vulnerabilities;

  • check that dependencies are used legally (license checks);

  • collect statistics on the size of the project;

  • collect statistics on parameters such as Performance, SEO, and Accessibility.

Front Radar is a REST service that consists of a number of microservices and launches them depending on the API being called and the set of parameters. Today the set of these microservices is as follows:

  • Dependencies license checker

  • Dependencies update checker

  • Dependencies safety checker

  • Bundle sizes checker

  • NPM sizes checker

  • Lighthouse

  • W3C Validator

The service is launched at two stages of checks: as a blocking Quality Gate in the pipeline at the build stage, and at a second stage after a successful deployment.

Embedding Front Radar in the pipeline as a quality gate

The first stage of checks is launched via a standard web-hook with the following set of parameters: the project ID, the release version, the repository URL, and a data object with information about the sizes of the files the web client needs to function (used by the Bundle sizes checker). This is enough for us to pull the project from the repository, install the dependencies, and analyze the project with each of the microservices. We save the analysis results in the database with the project ID as the key. This way we keep all the statistics and can assess the state of our code base from release to release.
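To give an idea of the contract, a hypothetical first-stage payload might look like this (the field names are illustrative, not our exact schema):

// Hypothetical first-stage web-hook payload (field names are illustrative)
{
  projectId: 5,                // project ID in Front Radar
  version: '2.14.0',           // release version
  repositoryUrl: 'https://git.example.com/team/project.git',
  bundle: {                    // data for the Bundle sizes checker
    sumSize: 40355,
    sumSizeZip: 10285,
    files: [{ name: 'main.js', size: 11022, sizeZip: 4033 }],
  },
}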

Dependencies license checker

It verifies that our projects only use libraries whose licenses allow free commercial use. Under the hood, the service uses the license-checker library, after pre-installing the dependencies of the project pulled from the repository. After the check, the utility gives us an answer in this format:

{
  'resolve-url@0.2.1': {
    licenses: 'MIT',
    repository: 'https://github.com/lydell/resolve-url',
    publisher: 'Simon Lydell',
    path: '/Users/royroev/WebstormProjects/package-stats/temp/project-5/node_modules/resolve-url',
    licenseFile: '/Users/royroev/WebstormProjects/package-stats/temp/project-5/node_modules/resolve-url/LICENSE'
  },
  'resolve@1.12.0': {
    licenses: 'MIT',
    repository: 'https://github.com/browserify/resolve',
    publisher: 'James Halliday',
    email: 'mail@substack.net',
    url: 'http://substack.net',
    path: '/Users/royroev/WebstormProjects/package-stats/temp/project-5/node_modules/resolve',
    licenseFile: '/Users/royroev/WebstormProjects/package-stats/temp/project-5/node_modules/resolve/LICENSE'
  },
  ...
}

I would like to note that the utility collects information not only on the first-level dependencies specified in package.json, but also traverses the entire dependency tree.

Once we receive the license information object, we filter out libraries that ship with a restrictive license and mark them as ones to look out for. We also save this data in the database to track emerging problems from version to version.
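For illustration, here is a rough sketch of that filtering using license-checker’s programmatic API (the allow-list below is illustrative, not our actual policy):

const checker = require('license-checker');

// Licenses we consider safe for free commercial use (illustrative list)
const ALLOWED = ['MIT', 'ISC', 'BSD-2-Clause', 'BSD-3-Clause', 'Apache-2.0'];

checker.init({ start: './temp/project-5' }, (err, packages) => {
  if (err) throw err;
  // Keep only packages whose license is not on the allow-list
  const suspicious = Object.entries(packages)
    .filter(([, info]) => !ALLOWED.includes(info.licenses));
  suspicious.forEach(([name, info]) =>
    console.warn(`needs attention: ${name} (${info.licenses})`));
});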

Dependencies update checker

It collects statistics on dependency updates. Updating the dependencies you use in a timely manner keeps the project up to date: with each release, communities improve their libraries, reduce their weight, and eliminate vulnerabilities and bugs that could harm your project. This module also gives us insight into how teams update the UI Kit components, since the Web Core team improves its products every day and wants other teams to pick up those updates. If someone does not, we have to identify whatever is preventing the update and eliminate it.

In this module we also check whether library versions are pinned, namely whether lock files are present. This is necessary so that when building the project the team does not run into dependency problems such as vulnerabilities, license changes, breaking changes, and plain old bugs. You can learn more about why this is needed in an article by our colleague. A rough sketch of both checks is shown below.
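Here is what the two checks could look like in a minimal sketch (the project path is illustrative; note that npm outdated exits with a non-zero code when outdated packages exist):

const { execSync } = require('child_process');
const fs = require('fs');

// npm outdated exits non-zero when something is outdated,
// so we also read stdout from the thrown error
function getOutdated(cwd) {
  try {
    return JSON.parse(execSync('npm outdated --json', { cwd }).toString() || '{}');
  } catch (e) {
    return JSON.parse(e.stdout.toString() || '{}');
  }
}

const projectDir = './temp/project-5';
const outdated = getOutdated(projectDir);
console.log(`outdated dependencies: ${Object.keys(outdated).length}`);

// Version pinning check: the project should commit a lock file
const hasLock = ['package-lock.json', 'yarn.lock']
  .some((f) => fs.existsSync(`${projectDir}/${f}`));
console.log(`lock file present: ${hasLock}`);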

Dependencies safety checker

It checks dependencies for vulnerabilities. We use npm audit, which queries the Node Security Project vulnerability database. You can see how npm audit works in this article. For us this service is blocking: if it finds a library with a vulnerability among the dependencies, we have to block the publication of the project.
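A simplified sketch of such a blocking check (npm audit exits with a non-zero code when it finds vulnerabilities, and the exact JSON shape depends on the npm version):

const { execSync } = require('child_process');

// npm audit exits non-zero when vulnerabilities are found,
// so the report has to be read from the error object
function audit(cwd) {
  try {
    execSync('npm audit --json', { cwd });
    return { vulnerable: false };
  } catch (e) {
    return { vulnerable: true, report: JSON.parse(e.stdout.toString()) };
  }
}

const { vulnerable } = audit('./temp/project-5');
if (vulnerable) {
  // Blocking Quality Gate: fail the pipeline step
  process.exit(1);
}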

Bundle sizes checker

It gathers statistics on the sizes of the bundles built for the web client. To do this, the incoming web-hook carries a data object with information about the sizes of all the files the project needs to function. This data is collected in CI/CD with a recursive traversal of the folder containing the built project. An example of the object:

{
  sumSize: 40355,
  sumSizeZip: 10285,
  files: [
    {
      name: 'main.js',
      size: 11022,
      sizeZip: 4033,
    },
    {
      name: 'vendors.js',
      size: 22142,
      sizeZip: 8012,
    },
    ...
  ]
}
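Such an object could be assembled with a small Node.js script along these lines (a minimal sketch assuming the build output lands in ./dist; gzip here stands in for whatever compression is applied when serving):

const fs = require('fs');
const path = require('path');
const zlib = require('zlib');

// Recursively collect every file in the build output folder
function walk(dir) {
  return fs.readdirSync(dir, { withFileTypes: true }).flatMap((entry) => {
    const full = path.join(dir, entry.name);
    return entry.isDirectory() ? walk(full) : [full];
  });
}

function collectSizes(buildDir) {
  const files = walk(buildDir).map((file) => {
    const content = fs.readFileSync(file);
    return {
      name: path.relative(buildDir, file),
      size: content.length,                   // raw size in bytes
      sizeZip: zlib.gzipSync(content).length, // size after gzip
    };
  });
  return {
    sumSize: files.reduce((sum, f) => sum + f.size, 0),
    sumSizeZip: files.reduce((sum, f) => sum + f.sizeZip, 0),
    files,
  };
}

console.log(JSON.stringify(collectSizes('./dist'), null, 2));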

We also save this data in the database to build statistics for the project. Now we can analyze, from release to release, how the project gains or loses weight, and if the current release weighs significantly more, we mark such a build with a “needs attention” flag.


The second stage of checks is launched immediately after the project is deployed (for web clients this is the publication of the project to the production environment, for npm packages the publication to the npm registry). As in the first stage, a web-hook is triggered, but with a different set of parameters: the project identifier, an array of project page URLs (for a web client), and the name of the package in the registry (for an npm package).
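Again purely for illustration, such a payload might look like this (the field names are hypothetical):

// Hypothetical second-stage web-hook payload (field names are illustrative)
{
  projectId: 5,
  urls: ['https://example.com/', 'https://example.com/catalog'], // web client
  packageName: '@scope/ui-kit',                                  // npm package
}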

NPM sizes checker

It gathers statistics on the sizes of built npm packages. Our company has its own private package registry in which we publish our libraries for reuse by other teams, and of course we would like to collect statistics on the size of such packages. In this service we generate an index.js file (entryFile) that imports our library, install the library with the command npm i $package-name, and use webpack to build our entryFile. After a successful build we get a webpack stats JSON object that contains information about the sizes of the JS and CSS files. Based on historical data, we can build statistics and identify positive or negative dynamics.
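A rough sketch of this flow (the package name and paths are illustrative; measuring CSS would additionally require the appropriate loaders and plugins):

const fs = require('fs');
const path = require('path');
const { execSync } = require('child_process');
const webpack = require('webpack');

const pkg = '@scope/ui-kit';       // hypothetical package name from the web-hook
const workDir = './temp/npm-size';

// Prepare a throwaway project, generate the entry file, and install the package
fs.mkdirSync(workDir, { recursive: true });
fs.writeFileSync(path.join(workDir, 'index.js'), `import '${pkg}';`);
execSync('npm init -y', { cwd: workDir });
execSync(`npm i ${pkg}`, { cwd: workDir });

webpack(
  {
    mode: 'production',
    context: path.resolve(workDir),
    entry: './index.js',
    output: { path: path.resolve(workDir, 'dist') },
  },
  (err, stats) => {
    if (err) throw err;
    // The stats object contains the sizes of the emitted assets
    stats.toJson().assets.forEach((a) => console.log(`${a.name}: ${a.size} bytes`));
  }
);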

Lighthouse

There are several ways to monitor your web client using Lighthouse:

  1. Through the PageSpeed API (the setup is described in detail in this article). It runs on a schedule and is not tied to your CI/CD, so there is no binding to releases.

  2. Through lighthouse-ci. A very handy tool with a lot of settings. You can run a Puppeteer script if your page, for example, sits behind authorization. This method suits you if you build the project with one of the supported CI/CD tools: GitHub Actions, Travis CI, Circle CI, GitLab CI, Jenkins (based on Ubuntu), or Google Cloudbuild. If you have something different, you have to dig into someone else’s code or go to option 3.

  3. Write your own service based on a Node.js server and the Lighthouse npm library, with your own rules, settings, and custom scripts.

We chose option 3 because we wanted to fine-tune the service and to be able to launch it both when a new version of the project is published and on a schedule. After the web client is published to the production environment, we start checking the array of URLs received in the web-hook using Lighthouse. The service is a Node.js server on which we first run chrome-launcher; if we need to analyze a web client that sits behind authorization, we run authorization scripts (our autotests), and then we start checking pages with the Lighthouse library:

const chromeFlags = [
  '--disable-gpu',
  '--headless',
  '--no-zygote',
  '--no-sandbox',
  '--disable-dev-shm-usage',
];

const config = {
  extends: 'lighthouse:default',
  settings: {
    emulatedFormFactor: 'mobile',
    useThrottling: true,
  },
};

// Launch the headless Chrome instance that Lighthouse will drive
const chrome = await chromeLauncher.launch({ chromeFlags });

const flags = {
  port: chrome.port,
  output: 'json',
  'max-wait-for-load': 500000,
};

const result = await lighthouse(url, flags, config);

/*
result = 

{
  "performance-score": 83,
  "accessibility-score": 98,
  "best-practices-score": 100,
  "seo-score": 100,
  "pwa-score": 100,
  "firstContentfulPaint": 2341,
  "firstMeaningfulPaint": 2341,
  "largestContentfulPaint": 3736,
  "firstCPUIdle": 4951,
  "interactive": 5101,
  "speedIndex": 2341,
  "estimatedInputLatency": 13,
  "totalBlockingTime": 221,
  "maxPotentialFID": 210,
  "cumulativeLayoutShift": 0.009377760145399306,
  "cumulativeLayoutShiftAllFrames": 0
}

*/

We know that Lighthouse results can vary depending on run conditions, so we run five checks on each page and average every metric. Then we aggregate the data, save it to the database, and try to build infographics that our teams can understand.
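A rough sketch of how this averaging could be layered on top of the code above (assuming the standard return shape of the lighthouse Node API, where the report lives under lhr; the metric selection is illustrative):

// Run Lighthouse several times for one URL and average each metric
async function collectAveraged(url, runs = 5) {
  const samples = [];
  for (let i = 0; i < runs; i += 1) {
    const chrome = await chromeLauncher.launch({ chromeFlags });
    const { lhr } = await lighthouse(url, { port: chrome.port, output: 'json' }, config);
    samples.push({
      performanceScore: lhr.categories.performance.score * 100,
      firstContentfulPaint: lhr.audits['first-contentful-paint'].numericValue,
      interactive: lhr.audits.interactive.numericValue,
    });
    await chrome.kill();
  }
  // Average every collected metric across all runs
  return Object.keys(samples[0]).reduce((avg, key) => ({
    ...avg,
    [key]: samples.reduce((sum, s) => sum + s[key], 0) / samples.length,
  }), {});
}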

Also, with each release, we save a Lighthouse report, which displays the results and recommendations for improving the site.

W3C Validator

It is a Node.js application that uses the html-validator library. We receive error and warning data in JSON format, which we summarize to track the dynamics from release to release, and in HTML format to show recommendations to developers.
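For illustration, a minimal sketch with the html-validator package (treating everything that is not an error as a warning, which is a simplification of the W3C message types):

const validator = require('html-validator');

// Validate a page and count errors vs. warnings;
// format: 'json' returns the W3C validator's message list
async function checkPage(url) {
  const { messages } = await validator({ url, format: 'json' });
  const errors = messages.filter((m) => m.type === 'error').length;
  const warnings = messages.length - errors;
  return { errors, warnings };
}

checkPage('https://example.com/').then(console.log);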

Conclusion

All of these indicators affect the customer experience of using our products, so based on all the collected metrics we make recommendations for improving each project’s code base and flag all deviations from web development best practices. Teams try to correct the deviations within their technical-debt quota.

We do not plan to stop at the checks listed above: we intend to connect additional systems for analyzing our web projects, such as Favicon Checker, SEO Checker, Link Checker, CSS Validator, and CSS Stats. I hope this article was useful to you and helps you organize a similar monitoring system. Of course, I’m waiting for you in the comments and will always be glad to answer your questions and remarks.
