Indexing Site WITHOUT Search Console?

Some options to consider:
  • Submit your site manually to ping lists such as Ping-o-Matic (WordPress does this automatically using XML-RPC; see the sketch after this list).
  • Build a handful of medium- to high-quality links (or many low-quality ones) and wait. You could prioritize highly crawled, easy-to-drop-a-link-on forums like Reddit.
  • Create social media profiles and link out to your site from them, either in the profile (usually nofollow, but I'd bet Google still crawls those links) or in posts.
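
For anyone who wants to fire that ping by hand, here's a minimal sketch of the same weblogUpdates.ping call WordPress makes, assuming Ping-o-Matic's XML-RPC endpoint at rpc.pingomatic.com; the site name and URL are placeholders:

```python
# Manually send the weblogUpdates.ping that WordPress fires automatically.
# The endpoint, site name, and URL below are assumptions/placeholders.
import xmlrpc.client

server = xmlrpc.client.ServerProxy("http://rpc.pingomatic.com/")
result = server.weblogUpdates.ping(
    "My Site Name",          # human-readable site title (placeholder)
    "https://example.com/",  # the URL you want announced (placeholder)
)
print(result)  # a struct like {'flerror': False, 'message': '...'} on success
```
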
Realistically, you shouldn't need to be concerned about this at all if Google hasn't crawled the site yet. By the time you start doing the other things you need to be doing, whether that's SEO or marketing, it should happen without your manual intervention.

If Google has crawled your site and chosen not to index it, especially your home page, it's probably lacking in text (e.g., browser-side JavaScript rendering with lots of images instead of text, or an image splash page).
 
Sure, I shouldn't worry about Google indexing my stuff if I'd made the websites manually. This is a little BH experiment, though, so I can't add any of the domains to GSC, and I also have a hard time spending money on HQ links for all of them (it would be five figures just to get one link to each...).

I'll try Ping-o-Matic and some easy links to see what works best. I have a sheet recording all the indexing experiments to see which ones do best :smile: I'll try social media profiles as well, but the same thing goes there; it takes a fuck ton of time (or money) to register them for all the sites lol
 
@Ryuzaki
What should I do if Google removed AI content from the index? On 13 August, Google rejected the AI content, so I went and deleted it. It doesn't exist anymore.

I've published new articles, but Google doesn't even crawl new pages on the site. It only crawls and indexes the homepage. I've built Web 2.0 links with tier-2 links and have a few high-quality guest posts pointing to the site. Googlebot is crawling my site more often now, but still not indexing it.

Yes, it's on WordPress with Ping-o-Matic.
Yes, it used to receive around 250 hits/day from Reddit to inner pages.

Seems like nothing is working.

My current next tasks are to buy an expired domain and 301 it to this site, as well as adding IndexNow. Anything else?
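
Here's the rough sketch I'm planning to use for the IndexNow submission, assuming you've generated a key and host the key file at the key location; the domain, key, and URLs are placeholders:

```python
# Submit a batch of URLs to the shared IndexNow endpoint.
# Domain, key, and URL list are placeholders.
import json
import urllib.request

payload = {
    "host": "example.com",
    "key": "0123456789abcdef0123456789abcdef",  # your generated key (placeholder)
    "keyLocation": "https://example.com/0123456789abcdef0123456789abcdef.txt",
    "urlList": [
        "https://example.com/new-article-1/",
        "https://example.com/new-article-2/",
    ],
}
req = urllib.request.Request(
    "https://api.indexnow.org/indexnow",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json; charset=utf-8"},
)
with urllib.request.urlopen(req) as resp:
    print(resp.status)  # 200 or 202 means the submission was accepted
```
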

 
@BakerStreet, how much text is on these pages? Are they boilerplate pages with ad-libbed content that mainly repeat each other, like location-based pages often are, as an example?

If you dig into the "Not Indexed" set of pages and locate the ones that still exist and that you do want indexed, what reason does it give? Does it give one other than "crawled but not indexed"?
 
They're about 1,500 words each, and they're not boilerplate content. Each one has subheadings and paragraphs, and each article is about a different topic (e.g., "$Brand review").

In the "not indexed" section, the pages that still exist and that I want indexed aren't there. It seems like Googlebot isn't even crawling my site past the homepage.

Besides the homepage, the last crawl date for every other page was in August.
 
@BakerStreet, that's strange. Are you certain you're not blocking crawling in your robots.txt? That's the most obvious thing that would stop Googlebot from reaching a page. The only other thing I can think of is that you've somehow added nofollow, either to the links out from your homepage or in the meta tags in your header.
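
A quick way to check that, assuming your site is at example.com (the domain and path below are placeholders): Python's built-in robotparser will tell you whether your live robots.txt blocks Googlebot from an inner page:

```python
# Check whether robots.txt blocks Googlebot from a given inner page.
from urllib import robotparser

rp = robotparser.RobotFileParser("https://example.com/robots.txt")
rp.read()  # fetch and parse the live robots.txt
print(rp.can_fetch("Googlebot", "https://example.com/some-inner-page/"))
# False means that URL is blocked for Googlebot
```
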

Have you submitted a sitemap in Search Console? If so, Google would certainly know those new inner pages exist and would crawl them at least once. If you have added one, I'd click into it through Search Console and see if the pages appear there. If not, I'd visit it in your browser and make sure it's not malformed and that there's nothing weird going on.
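
If you'd rather check it outside of Search Console, a rough sketch like this, assuming a standard sitemap at /sitemap.xml on a placeholder domain, prints every <loc> entry so you can confirm the new inner pages are actually in it:

```python
# Fetch a standard XML sitemap and list the URLs it contains.
import urllib.request
import xml.etree.ElementTree as ET

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}
with urllib.request.urlopen("https://example.com/sitemap.xml") as resp:
    root = ET.fromstring(resp.read())  # raises ParseError if malformed
for loc in root.findall(".//sm:loc", NS):
    print(loc.text)
```
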

Beyond that, it's possibly a case of Google losing indexation trust in a low-PR site that added a relatively large amount of content and then deleted it. They're moving in the direction Bing has always gone, which is to not index everything they find like they used to. They want it validated by links. I don't mean every single page, but the site as a whole in general.

That tends to be the solution to most indexation problems on newer or smaller sites: they simply need more links. Those Web 2.0 links might create pathways for spiders to crawl into your site, but that doesn't mean Google will index it; they're PageRank-zero pages on new subdomains hit with spam. The guest posts are definitely the way to go. You could even score some niche edits for cheaper if you don't care too much about the profile. You can get them niche-relevant, and while they may not be on the cleanest sites, you can at least ensure they pass PageRank.
 

A "sheet" as in a Google Sheet? Make one, set it to public with your links in it, and see if that does it. Put some relevant content in there. If it's really BH, though, and/or depending on the quantity you're talking about, they might not get indexed, but at least they'll get crawled?
 
I fucking got it! The crawl budget is something like 40 requests, two or three times a week. That's way too low for Googlebot to even crawl all the pages on my site. I need to increase the crawl budget so that Googlebot can crawl more pages and index the site.
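
For anyone curious, here's roughly how I counted it, assuming an nginx access log in the common/combined format at /var/log/nginx/access.log (the path is a placeholder, and User-Agent alone can be spoofed, so verify with reverse DNS if you need certainty):

```python
# Count requests per day whose User-Agent claims to be Googlebot.
from collections import Counter

hits_per_day = Counter()
with open("/var/log/nginx/access.log") as log:
    for line in log:
        if "Googlebot" in line:
            # the timestamp sits between '[' and the first ':',
            # e.g. [13/Aug/2023:10:15:42 +0000] -> 13/Aug/2023
            day = line.split("[", 1)[1].split(":", 1)[0]
            hits_per_day[day] += 1

for day, count in hits_per_day.items():
    print(day, count)
```
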

Here's what I'm going to do:
1.) Have a VA update 2 or 3 paragraphs in every article so that it's refreshed. Hopefully this sends freshness signals to Googlebot.
2.) Build a six-figure number of tier-2 links to my tier-1 links to boost the site's PageRank, so that Googlebot will increase the crawl budget.

Hopefully this will solve the issue within 2 weeks!
 
Better internal linking is the answer. If the bot gets stuck, it seems to stop indexing unless it's a trusted site.

Oh, and expired domains or age on the domain.
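
If you want to spot where the bot might be getting stuck, a rough orphan-page check like this, assuming a standard sitemap at /sitemap.xml on a placeholder domain, compares the sitemap against what's actually linked from the homepage (one level deep only; a real crawl would recurse):

```python
# Find sitemap URLs that aren't linked from the homepage HTML.
import re
import urllib.request
import xml.etree.ElementTree as ET

SITE = "https://example.com"  # placeholder domain
NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

with urllib.request.urlopen(f"{SITE}/sitemap.xml") as resp:
    in_sitemap = {loc.text for loc in ET.fromstring(resp.read()).findall(".//sm:loc", NS)}

with urllib.request.urlopen(SITE) as resp:
    html = resp.read().decode("utf-8", errors="replace")
linked = set(re.findall(r'href="(' + re.escape(SITE) + r'[^"]*)"', html))

for url in sorted(in_sitemap - linked):
    print("in sitemap but not linked from homepage:", url)
```
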
 