Is it okay if 30–40% of URLs return 404 errors? Google on how its search bot treats broken links

Webmasters worried by a long list of broken links in Google Search Console can relax. It turns out that even when nearly half of a site's pages respond with 404 errors, this is normal as far as the Google search bot is concerned.

Google's John Mueller said that this is a natural consequence of how the crawler works as it analyzes a site and every page on it. The search bot keeps re-crawling pages that have been deleted, even pages removed years ago, and there is no way to stop the search engine from revisiting these "dead" URLs.

So what should you do about it? Nothing. If you are sure the page no longer exists, you can relax: Googlebot will crawl the broken link and simply ignore it. 404 errors will not affect your Core Web Vitals assessment or the site's position in search results.
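If you want to confirm which of your URLs really return a 404 (rather than a "soft 404" that serves a 200 page), a quick status-code audit is easy to script. Below is a minimal sketch in Python using the `requests` library; the `urls.txt` input file is a hypothetical example, one URL per line:

```python
# check_404s.py - minimal status-code audit.
# Assumes a urls.txt file with one URL per line (hypothetical input).
import requests

with open("urls.txt") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    try:
        # HEAD is usually enough to read the status code without
        # downloading the body; some servers only answer GET properly.
        resp = requests.head(url, allow_redirects=True, timeout=10)
        print(f"{resp.status_code}  {url}")
    except requests.RequestException as exc:
        print(f"ERR  {url}  ({exc})")
```

Any URL printed with `404` is the harmless case Mueller describes; a deleted page that still prints `200` is the one worth fixing.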

By the way, Google plans to update its ranking factors in May of this year and add the new Page Experience algorithm, which will consist of seven parameters. Google has also announced a new Core Web Vitals tool, which will combine and replace the reporting in Lighthouse, Chrome DevTools, PageSpeed Insights, and Search Console's Speed Report.
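Core Web Vitals data for a page can already be pulled programmatically through the public PageSpeed Insights v5 API, which returns both Lighthouse lab results and Chrome UX Report field data. Here is a minimal sketch; treat the exact response keys as an assumption to verify against the API documentation:

```python
# cwv_check.py - fetch Core Web Vitals field data for one URL via
# the public PageSpeed Insights v5 API (an API key is only needed
# for heavier automated use).
import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

def fetch_field_metrics(url: str) -> dict:
    resp = requests.get(PSI_ENDPOINT, params={"url": url}, timeout=60)
    resp.raise_for_status()
    data = resp.json()
    # "loadingExperience" holds real-user (CrUX) field data; the
    # metric key names below are assumptions based on the v5 docs.
    return data.get("loadingExperience", {}).get("metrics", {})

if __name__ == "__main__":
    metrics = fetch_field_metrics("https://example.com/")
    for name in ("LARGEST_CONTENTFUL_PAINT_MS",
                 "FIRST_INPUT_DELAY_MS",
                 "CUMULATIVE_LAYOUT_SHIFT_SCORE"):
        if name in metrics:
            print(name, metrics[name].get("percentile"))
```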

Big ranking changes are expected in Google in 2021, so it is worth getting familiar with the new algorithm and the measurement tools now.
