Was 301ing an expired domain to your Sitemap.xml nerfed?

Hey. This was an old-school way to boost your site's indexation rate. You buy an expired domain that has a lot of links pointing to it and is therefore crawled by Googlebot often. Then you 301 it to your sitemap.xml file. All of your site's pages get crawled and indexed, and you pick up a PageRank boost too. Super good hack 10 years ago.
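
Mechanically it's just a catch-all 301 from the expired domain to your sitemap URL. A minimal sketch in Python, assuming you front the expired domain with a tiny server instead of using a registrar redirect or an .htaccess rule (the domains here are made up):

```python
# Sketch: answer every request on the expired domain with a 301 to the
# money site's sitemap. Assumes the expired domain's DNS points at the
# box running this; the target URL is hypothetical.
from http.server import BaseHTTPRequestHandler, HTTPServer

SITEMAP_URL = "https://your-money-site.com/sitemap.xml"  # placeholder target

class RedirectHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Permanent redirect: any path on the expired domain -> your sitemap
        self.send_response(301)
        self.send_header("Location", SITEMAP_URL)
        self.end_headers()

    do_HEAD = do_GET  # Googlebot sometimes issues HEAD requests

if __name__ == "__main__":
    # Port 80 needs root; in practice you'd sit this behind whatever
    # hosting/redirect setup you already use.
    HTTPServer(("0.0.0.0", 80), RedirectHandler).serve_forever()
```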

The issue is that I've seen a Google update stating they've cracked down on expired domain abuse. Does anyone know if this stopped working because of what they said in that update? I'll go test it with 50k in expired domains, or I can just ask someone here :smile:

Thanks in advance!
 
There's really no need to work on boosting your indexation rate if you're publishing quality content/pages (even eCom). Most CMSs will report new content to ping lists that Google crawls, and most importantly you should be signed up for Search Console, where you attach your sitemap_index and all your sub-sitemaps. Google will then crawl them any time they're updated.
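
For what it's worth, the sitemap_index is just a plain XML file in the sitemaps.org format that lists your sub-sitemaps. A rough sketch of generating one in Python (the sub-sitemap URLs are placeholders):

```python
# Sketch: build a sitemap_index.xml you'd submit once in Search Console.
# The sub-sitemap URLs are placeholders for whatever your CMS generates.
import xml.etree.ElementTree as ET

NS = "http://www.sitemaps.org/schemas/sitemap/0.9"
sub_sitemaps = [
    "https://example.com/post-sitemap.xml",
    "https://example.com/page-sitemap.xml",
    "https://example.com/product-sitemap.xml",
]

root = ET.Element("sitemapindex", xmlns=NS)
for url in sub_sitemaps:
    sm = ET.SubElement(root, "sitemap")
    ET.SubElement(sm, "loc").text = url

ET.ElementTree(root).write("sitemap_index.xml",
                           encoding="utf-8", xml_declaration=True)
```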

Google's entire advantage over the others is that they crawl and index everything to build a complete link graph and generate PageRank flows. I understand that may change now that AI people are crapping out 100,000 pages a day, but even then they may just "distrust" a domain instead.
 