Mass duplication of titles and descriptions (SEO)

What is the article about?

One of the problems that sites with a very large number of pages (on the order of tens of millions) can face is identical title and description meta tags across all of those pages. In this article, we will look at ways to at least partially improve the SEO situation in this case.

What are meta tags in general and why are they needed?

In simple terms, meta tags are special markup on a page that is not visible to the user on the page itself but tells search engines how to title the site in search results and what brief description to show for it. The more informative and attractive the meta title and meta description, the more attractive the site looks in search results. This increases the number of clicks on it and, accordingly, the site’s organic traffic – one of the main goals of SEO. An example is in the screenshots below.

[Screenshot: snippet in search results]

[Screenshot: the tags in the page code]
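
For reference, these tags sit in the page’s <head> and look like this (the values here are purely illustrative):

<title>Page title shown in search results</title>
<meta name="description" content="A short description of the page that search engines may show in the snippet.">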

Table of contents

Method 1: Unique title and description (captain obvious)

Method 2: 301 redirect (if there are pages that are no longer relevant)

Method 3: Clean-param directive in robots.txt (only for Yandex search robots)

Method 4: Close the page from indexing with the robots meta tag (does not work if the page is blocked in robots.txt)

Method 5: Add a canonical link tag (for search robots it is only a recommendation)

Unique title and description

The very first thing that comes to mind when a webmaster complains about a large number of identical meta titles and meta descriptions is to make them unique. This is a great idea, but what do you do when there are not 5 or 10 such pages, but 5 or 10 million?

The main trick is that you don’t have to edit all the title and description meta tags manually. As a rule, any web page carries some key piece of unique information (otherwise, what is the point of the page?) that can be “pulled” into the meta title and meta description. For example, each article has a title and a short description, and these are ideal for the meta title and meta description – provided, of course, that the articles on your site actually have different titles and different content. Each product has its own unique name and description, and so on. You just need to configure the pulling of these fields into the appropriate meta tags. By the way, if an article description runs over 250 characters, there is nothing wrong with trimming the text to the last full sentence that fits within that limit: the first 200–250 characters typically contain enough key information and keywords.

Can you give a simple, concrete example?

Certainly!

Let’s say you have an online store, “KupiVsyo”, selling household appliances. Right now, the meta title and meta description on every product page are simply duplicated from the main page, that is:

meta-title as is: Buy goods from “KupiVsyo” – a large online store selling household appliances.
meta-description as is: The large online store “KupiVsyo” has a lot of different household appliances! Kettles, irons, multicookers and other household appliances – buy online.

That is, the page for the kettle “Super Kettle XXX” currently has the same title and description as the main page of the store. Now, on each product’s page, add the product name to these meta tags and get:

meta-title to be: Buy Super Kettle XXX in “KupiVsyo” – a large online store selling household appliances.
meta-description to be: The large online store “KupiVsyo” has Super Kettle XXX and a lot of different household appliances! Kettles, irons, multicookers and other household appliances – buy online.

Bingo! By setting up automatic filling of the title and description in this way on all product cards, you will get rid of a huge number of duplicates, as the sketch below shows. Just don’t forget that the great and mighty Russian language is full of declensions and conjugations, so, since the pulled-in text cannot be inflected, you need to build the meta-title and meta-description templates so that it still reads correctly.
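
To make the mechanics concrete, here is a minimal sketch in Python. The templates mirror the examples above; the field name and the 250-character limit are illustrative assumptions, not a prescription:

# A minimal sketch of the templating step: pull each product's unique
# name into store-wide templates, trimming the description to the last
# full sentence that fits within ~250 characters.

TITLE_TEMPLATE = 'Buy {name} in "KupiVsyo" - a large online store selling household appliances.'
DESCRIPTION_TEMPLATE = (
    'The large online store "KupiVsyo" has {name} and a lot of different '
    'household appliances! Kettles, irons, multicookers and other '
    'household appliances - buy online.'
)

def truncate_to_sentence(text: str, limit: int = 250) -> str:
    """Trim text to the last full sentence that fits within the limit."""
    if len(text) <= limit:
        return text
    cut = text[:limit]
    last_period = cut.rfind(".")
    # Fall back to a hard cut if no sentence boundary was found.
    return cut[: last_period + 1] if last_period != -1 else cut

def build_meta(product_name: str) -> dict:
    """Insert the product name into the store-wide meta templates."""
    return {
        "title": TITLE_TEMPLATE.format(name=product_name),
        "description": truncate_to_sentence(DESCRIPTION_TEMPLATE.format(name=product_name)),
    }

print(build_meta("Super Kettle XXX"))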

301 redirect

If you have pages that are no longer relevant for some reason (for example, you rolled out a new design for part of the site and ended up with two different URLs serving pages with identical content but different design and layout), then you can set up a 301 redirect from them to the relevant pages. You can read more about the types of redirects, why they are needed, and how to set them up in this article or this one. In short, a 301 redirect is a permanent, forced redirection from one URL to another, which helps remove the unnecessary URLs from the index.

Using kettles as an example (for dummies)

Let’s consider the same online store “KupiVsyo” and the “Super Kettle XXX” product card. Say you decide to redesign the product card, so you end up with two pages at different URLs with the same content (and, accordingly, the same meta tags). And you have a million such products, that is, a million duplicates. By setting up a 301 redirect from the URL of the old card to the URL of the new one, you remove all the old card pages from the index. A minimal sketch of such a redirect follows below.
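
Mechanically, the redirect can live in the web server config or right in the application code. For instance, a sketch in Flask, where the old URL path is purely hypothetical and the new one matches the card URL used later in this article:

# A minimal sketch using Flask; the old path "/old-design/..." is an
# assumption for illustration only.
from flask import Flask, redirect

app = Flask(__name__)

@app.route("/old-design/chayniki/<product_id>")
def old_card(product_id):
    # 301 = moved permanently: search robots drop the old URL from the
    # index and transfer its signals to the new one.
    return redirect(f"/bitovaya-tehnika/chayniki/{product_id}", code=301)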

Clean-param directive in robots.txt

If for one reason or another your page URLs carry GET parameters (for example, you track which search engine a visitor came from, or pass other parameters), then you need to hide them from indexing so that search robots do not treat these URLs as different pages. To do this, add the Clean-param directive to the robots.txt file and list the relevant parameters in it.

It is important to note that only Yandex robots honor this directive; Google robots ignore it.

Back to the Dummies

Let’s say your kettle card “Super Kettle XXX” normally lives at the URL www.kupivse.ru/bitovaya-tehnika/chayniki/123123. However, it is important for you to track which search engine people come from to this card, which is why you add a UTM tag.

As a result, when coming from Google the URL looks like www.kupivse.ru/bitovaya-tehnika/chayniki/123123?utm_source=google, and from Yandex like www.kupivse.ru/bitovaya-tehnika/chayniki/123123?utm_source=yandex. To avoid duplication in this case, add the following line to the robots.txt file:

Clean-param: utm_source
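
In context, the entry might look like this. Yandex’s syntax lets you join several parameters with & and optionally restrict the rule to a path prefix; the extra parameters and the path shown here are illustrative:

User-agent: Yandex
Clean-param: utm_source&utm_medium&utm_campaign /bitovaya-tehnika/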

Bingo! You have beaten the duplication caused by GET parameters (for Yandex robots only).

Block a page from indexing with the robots meta tag

To hide a page from indexing, you can add the following meta tag to the page’s <head>:

<meta name="robots" content="noindex, nofollow">

It is important to note that if a page is blocked from crawling in the robots.txt file, the robot never loads it, so neither the robots meta tag nor its header equivalent has any effect.
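
The same directives can also be sent as the X-Robots-Tag HTTP response header, which is handy for non-HTML resources such as PDF files:

X-Robots-Tag: noindex, nofollow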

No dummies this time, but the search engines’ own documentation covers this meta tag in more detail.

Add a canonical link tag

It wouldn’t hurt to supplement almost all of the methods described above with a laconic <link rel="canonical"> element. It is advisory in nature and helps search robots understand which page should be considered canonical, which in turn reduces the number of duplicates.
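
For the kettle card from the earlier examples, the element in the page’s <head> could look like this:

<link rel="canonical" href="https://www.kupivse.ru/bitovaya-tehnika/chayniki/123123">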

No dummies here either, but the search engines’ documentation covers the canonical tag in more detail.
