Sitemap audit, orphan pages and zombies



Following the success of my zombie pages method detailed in late 2018, I have found a new way to spot zombie pages that can degrade your SEO. Find out how to do it with an advanced sitemap audit, plus a tip to get Google to process your zombie-page fixes faster.




Reminder: having too many zombie pages on your site can hurt your SEO. But above all, improving a zombie page turns it into a good page that generates revenue.



In this new guide, I explain how to find zombie pages that a crawl cannot find... as well as a tip to get Google to take your corrections into account very quickly.



With this method, I have spotted zombie pages on many sites, and it has allowed me to make some great optimizations!



What this technique and the associated tip have in common is that they are based on an "advanced" sitemap audit. You can do it "by hand" (it can take a while) or with the tool of your choice; what matters is understanding the principle.



For my part, I use RM Sitemaps coupled with RM Tech, the two SEO audit tools of my platform My Ranking Metrics. The RM Sitemaps audit includes all the features and tips described in this guide.




Zombie orphan pages


Orphan page definition



Let's start at the beginning: in SEO, what is an orphan page?




An orphan page is a page that exists on the site but is not linked to from any other page, at least not by links that search engines can see.



There are many reasons that can explain the presence of orphan pages.



First, there is the case of good pages that have become orphaned.



For example, for more or less obscure technical reasons, a page may no longer be reachable via links that Google can follow:






  • the pages that link to it are now blocked from crawling (robots.txt file)

  • the pages that link to it (or the links themselves) are generated or handled by JavaScript in a way that Google cannot process

  • etc.




That's not all: the cause is often human error. For example, you manually deleted old pages from your site, but they were the only ones linking to the page in question.



Conversely, there are the bad orphan pages (candidates for becoming zombies...):






  • a whole series of URLs that produce what I call the black mass: URLs that should never have existed but that Google has crawled and sometimes indexed. Often you will say "it's the CMS's fault"...

  • pages with extremely thin content

  • products that are no longer available for sale but whose product page is still displayed

  • etc.






OK, but if they are indexed, Google will keep sending its crawlers back to them, so is it really that serious to have these kinds of pages?



The problem of orphan pages

Yes!




Whether the orphan page is good quality or mediocre, it is an SEO problem that must be corrected!



A good-quality page that is orphaned is handicapped: its SEO performance is reduced. If it is a product page currently on sale, or any other page that could generate revenue, its performance is significantly lower. Indeed:






  • it does not receive any popularity (PageRank) from your internal links

  • without internal linking, its semantic optimization suffers

  • users have a hard time finding it on the site

  • it is crawled less often

  • etc.




If only you knew which good pages are orphaned, you could reintegrate them into your internal linking so that they regain good rankings...



A poor orphan page degrades the average quality of the site as perceived by Google, without you realizing it. It is therefore important to improve the quality of these pages or, depending on the case, to get rid of them.




How to find orphan pages?


The classic method based on sitemaps is to follow these steps:






  • crawl your site (with a crawler that follows all links allowed for crawling) and filter to keep only the indexable pages

  • make an exhaustive list of all the pages that should be indexed: if everything is set up properly, you already have it in a sitemap (or several)

  • by difference, find your orphan pages: "present in the sitemaps but not found by a crawl that follows the links" (see the sketch below)
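To make the "by difference" step concrete, here is a minimal Python sketch. It assumes a standard XML sitemap (not a sitemap index) and that your crawler can export its list of indexable URLs to a plain text file; the sitemap URL and file name below are placeholders.

```python
import urllib.request
import xml.etree.ElementTree as ET

SITEMAP_URL = "https://www.example.com/sitemap.xml"  # placeholder: your sitemap
CRAWL_EXPORT = "crawled_urls.txt"                    # hypothetical export from your crawler

NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# URLs listed in the sitemap (the pages you want indexed)
with urllib.request.urlopen(SITEMAP_URL) as response:
    tree = ET.parse(response)
sitemap_urls = {loc.text.strip() for loc in tree.iter(NS + "loc")}

# Indexable URLs actually reached by following links (one URL per line)
with open(CRAWL_EXPORT, encoding="utf-8") as f:
    crawled_urls = {line.strip() for line in f if line.strip()}

# Orphan candidates: listed in the sitemap but never reached by the crawl
for url in sorted(sitemap_urls - crawled_urls):
    print(url)
```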




This is already good; I hope you have been using it for a long time.







Now let's go further with this advanced method of discovering orphan pages via Google Analytics:






  • crawl your site (with a crawler that follows all links allowed for crawling) and filter to keep only the indexable pages

  • via the Google Analytics API, retrieve the URLs of pages that received visits over the past year (organic traffic only) and filter to keep only the indexable pages

  • by difference, find your orphan pages: "seen in Google Analytics but not found by a crawl that follows the links" (see the sketch below)
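As an illustration of the Google Analytics step, here is a minimal Python sketch assuming the (legacy) Universal Analytics Reporting API v4 with a service account; the key file name and view ID are placeholders, and with a GA4 property you would use the Analytics Data API instead.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

KEY_FILE = "service-account.json"   # hypothetical service account key file
VIEW_ID = "123456789"               # hypothetical Universal Analytics view ID

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/analytics.readonly"])
analytics = build("analyticsreporting", "v4", credentials=creds)

# Landing pages that received at least one organic session over the last year
report = analytics.reports().batchGet(body={
    "reportRequests": [{
        "viewId": VIEW_ID,
        "dateRanges": [{"startDate": "365daysAgo", "endDate": "today"}],
        "dimensions": [{"name": "ga:landingPagePath"}],
        "metrics": [{"expression": "ga:sessions"}],
        "filtersExpression": "ga:medium==organic",
        "pageSize": 10000,
    }]
}).execute()

rows = report["reports"][0]["data"].get("rows", [])
visited_paths = {row["dimensions"][0] for row in rows}

# Then, as before, compare with the indexable URLs found by your crawl
# (after normalizing GA paths against full URLs):
# orphans = visited_paths - crawled_paths
```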




Do the same to find orphan pages via Google Search Console:






  • crawl your site (with a crawler that follows all links allowed for crawling) and filter to keep only the indexable pages

  • via the Google Search Console API, retrieve the URLs of pages that generated impressions over the past year and filter to keep only the indexable pages

  • by difference, find your orphan pages: "displayed in Google's SERPs but not found by a crawl that follows the links" (see the sketch below)
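And a similar sketch for the Search Console step, assuming the Search Console API (searchanalytics.query) with a service account that has access to the property; the key file name and site URL are placeholders.

```python
from datetime import date, timedelta

from google.oauth2 import service_account
from googleapiclient.discovery import build

KEY_FILE = "service-account.json"        # hypothetical service account key file
SITE_URL = "https://www.example.com/"    # your verified Search Console property

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"])
gsc = build("searchconsole", "v1", credentials=creds)

end = date.today()
start = end - timedelta(days=365)

# Pages that generated at least one impression in Google's results over the last year
# (25000 is the per-request row limit; for larger sites, paginate with startRow)
response = gsc.searchanalytics().query(
    siteUrl=SITE_URL,
    body={
        "startDate": start.isoformat(),
        "endDate": end.isoformat(),
        "dimensions": ["page"],
        "rowLimit": 25000,
    }).execute()

impression_urls = {row["keys"][0] for row in response.get("rows", [])}

# Orphans: seen in the SERPs but not found by a link-following crawl
# orphans = impression_urls - crawled_urls
```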




Reminder: RM Sitemaps includes orphan page detection; everything is automated.




What to do with orphan pages?


In the end, you get a complete list of orphan pages. Here's what to do:






  • if it's a good page, reconnect it to the rest of the site! Add links to it from the most relevant pages (based on their semantic context). I got great results with a client's product pages (9% of them were orphaned...)

  • if it's a poor-quality page that can be improved, get to work!

  • otherwise, it's a horrible zombie that must be destroyed. That is the purpose of the rest of this guide.





Accelerated cleaning of zombie pages


It's worth repeating because it's important: my zombie pages method consists of:






  • checking whether there are zombie pages that could be dragging down your SEO

  • having a pre-calculated index for each page to prioritize your actions (an illustrative sketch follows this list)

  • having as much data as possible to understand the cause (why the page is a zombie)

  • fixing the problem so that the page becomes high quality and generates traffic

  • as a last resort, de-indexing or deleting unrecoverable pages
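The actual zombie index is computed by the audit tool; purely to illustrate the idea of prioritizing pages from data, here is a hypothetical Python sketch that ranks indexable pages by how few impressions and clicks they generated over a year. This is a toy heuristic, not the formula used by RM Tech, and the input file and column names are assumptions.

```python
import csv

# Hypothetical input: one row per indexable URL with a year of Search Console data,
# e.g. exported via the API sketches above. Assumed columns: url, impressions, clicks.
with open("pages_gsc_year.csv", encoding="utf-8") as f:
    pages = list(csv.DictReader(f))

def zombie_score(page):
    # Toy heuristic: the fewer clicks and impressions, the more "zombie" the page.
    impressions = int(page["impressions"])
    clicks = int(page["clicks"])
    return 1.0 / (1 + clicks * 10 + impressions)

# Highest scores first = pages to improve (or de-index) in priority
for page in sorted(pages, key=zombie_score, reverse=True)[:50]:
    print(f'{zombie_score(page):.3f}  {page["url"]}')
```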







What to do with zombie pages?


Whether you have improved the quality of a zombie page or, on the contrary, had it de-indexed, it is better if you can make Google understand quickly that you have corrected the problem.




To speed things up, use sitemaps!




Tip: create separate sitemaps; this allows for more precise and effective analysis in Search Console.






1. For the good pages


You have worked hard to improve the quality of some pages whose zombie index was too high. Great! Now the goal is for Google to become aware of it as soon as possible.






  • Group all the pages concerned into a sitemap. Keep it simple: a text file with one URL per line is more than enough.

  • Give it a name that reminds you what it contains and declare it in Search Console (see the sketch after this list).

  • Over the following days, monitor the indexing rate of these pages by studying the Index Coverage report. Instead of leaving it on "All known pages", filter to show only the pages in this file.
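As a sketch, here is how you could generate such a text sitemap and declare it programmatically, assuming the Search Console API (sitemaps.submit) and the same service-account setup as above; the file name and URLs are placeholders. You can of course also upload the file and declare it by hand in the Search Console interface.

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE_URL = "https://www.example.com/"
SITEMAP_NAME = "sitemap-improved-pages.txt"  # hypothetical name that says what it contains

# 1. Write the improved URLs, one per line, to a plain text sitemap
improved_urls = [
    "https://www.example.com/improved-page-1",
    "https://www.example.com/improved-page-2",
]
with open(SITEMAP_NAME, "w", encoding="utf-8") as f:
    f.write("\n".join(improved_urls) + "\n")

# 2. Upload the file to the root of your site (FTP, deploy, etc.), then declare it
creds = service_account.Credentials.from_service_account_file(
    "service-account.json", scopes=["https://www.googleapis.com/auth/webmasters"])
gsc = build("searchconsole", "v1", credentials=creds)
gsc.sitemaps().submit(siteUrl=SITE_URL, feedpath=SITE_URL + SITEMAP_NAME).execute()
```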




You should reach 100% indexing and see traffic increase. If that is not the case, explain the situation in my forum.




2. For the zombie pages


If some pages are unrecoverable or should never have been indexed in the first place, it is even more urgent to remove them from Google's index.



Of course, if all the pages are grouped in a single directory of your site and the entire directory needs to go, you can simply request its removal in Search Console. But what if the pages are mixed in with others you want to keep?



The trick to getting Google to de-index pages quickly is to put them in a sitemap (a simple text file) and declare it in Search Console. Remember that a sitemap does not get pages indexed, it gets them crawled. By crawling the pages, Google will discover that you are requesting their de-indexing (meta robots noindex tag or a 404/410 status code). Of course, do not make the mistake of blocking them in the robots.txt file. A small verification sketch follows.
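Before declaring that de-indexing sitemap, it can help to check that each URL really does tell Google to drop it and is not blocked by robots.txt. Here is a minimal Python sketch, assuming the requests library is installed; the sitemap file name is a placeholder, and the noindex detection is deliberately rough (a proper check would parse the meta robots tag and the X-Robots-Tag header).

```python
import urllib.robotparser

import requests

ROBOTS_URL = "https://www.example.com/robots.txt"
SITEMAP_FILE = "sitemap-zombies-to-remove.txt"  # hypothetical de-indexing sitemap

robots = urllib.robotparser.RobotFileParser(ROBOTS_URL)
robots.read()

with open(SITEMAP_FILE, encoding="utf-8") as f:
    urls = [line.strip() for line in f if line.strip()]

for url in urls:
    # Googlebot must be allowed to crawl the URL, otherwise it will never see the signal
    if not robots.can_fetch("Googlebot", url):
        print(f"BLOCKED by robots.txt (mistake!): {url}")
        continue
    resp = requests.get(url, timeout=10)
    # Rough check for a meta robots noindex somewhere in the HTML
    noindex = "<meta" in resp.text.lower() and "noindex" in resp.text.lower()
    if resp.status_code in (404, 410) or noindex:
        print(f"OK ({resp.status_code}{', noindex' if noindex else ''}): {url}")
    else:
        print(f"STILL INDEXABLE ({resp.status_code}): {url}")
```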



You should see these pages disappear from Google over the following days (or weeks if there are a lot of them). If it takes too long, you have probably made a mistake somewhere; in that case:






  • do a new SEO audit to check that all your actions were carried out correctly

  • explain the situation in the forum





And you?


Have you ever searched for orphan pages on your site? Including with this advanced method using Analytics and Search Console?



What techniques do you use to fix your zombie pages and get Google to take your fixes into account faster?

I look forward to your comments!