How To Get Google To Index My Website

Hello, experts in the house, please tell me how to effectively get Google to index my site faster!

Asked on October 7, 2019 in Education.
2 Answer(s)



How to get indexed by Google
Found that your website or web page isn’t indexed in Google? Try this:

Go to Google Search Console.
Navigate to the URL inspection tool.
Paste the URL you’d like Google to index into the search bar.
Wait for Google to check the URL.
Click the “Request indexing” button.
This process is good practice when you publish a new post or page. You’re effectively telling Google that you’ve added something new to your site and that they should take a look at it.
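If you want to check a page’s index status programmatically, Google’s Search Console API exposes a URL Inspection endpoint. Note that the API only reports status; the “Request indexing” button itself is available only in the Search Console UI. A rough sketch, assuming the google-api-python-client package and OAuth credentials you already hold for your property (`creds` and the example URLs are placeholders):

```python
# Sketch: inspecting a URL's index status via the Search Console
# URL Inspection API (urlInspection.index.inspect).

def build_inspection_request(page_url, site_url):
    """Build the request body expected by urlInspection.index.inspect."""
    return {"inspectionUrl": page_url, "siteUrl": site_url}

# The actual call requires authentication and would look roughly like this:
# from googleapiclient.discovery import build
# service = build("searchconsole", "v1", credentials=creds)
# result = service.urlInspection().index().inspect(
#     body=build_inspection_request("https://example.com/new-post",
#                                   "https://example.com/")
# ).execute()
# print(result["inspectionResult"]["indexStatusResult"]["verdict"])
```

The helper simply assembles the request body, so you can reuse it for batches of URLs before wiring up authentication.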

However, requesting indexing is unlikely to solve underlying problems preventing Google from indexing old pages. If that’s the case, follow the checklist below to diagnose and fix the problem.

Here are some quick links to each tactic—in case you’ve already tried some:

Remove crawl blocks in your robots.txt file
Remove rogue noindex tags
Include the page in your sitemap
Remove rogue canonical tags
Check that the page isn’t orphaned
Fix nofollow internal links
Add “powerful” internal links
Make sure the page is valuable and unique
Remove low-quality pages (to optimize “crawl budget”)
Build high-quality backlinks

1) Remove crawl blocks in your robots.txt file
Is Google not indexing your entire website? It could be due to a crawl block in something called a robots.txt file.

To check for this issue, go to yourdomain.com/robots.txt and look for either of these two snippets of code:

User-agent: Googlebot
Disallow: /

User-agent: *
Disallow: /
Both of these tell Googlebot that it isn’t allowed to crawl any pages on your site — the first targets Googlebot specifically, and the second targets all bots. To fix the issue, remove them. It’s that simple.
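You can verify what a robots.txt file actually blocks with Python’s standard library, without waiting on Search Console. This sketch parses the first snippet above and asks whether Googlebot may fetch any path:

```python
# Quick check with Python's stdlib: does this robots.txt block Googlebot?
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: Googlebot
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot is blocked from every path, including the homepage:
print(parser.can_fetch("Googlebot", "/"))          # False
print(parser.can_fetch("Googlebot", "/any-page"))  # False
```

Swap in your own robots.txt contents (or point `RobotFileParser` at the live file with `set_url` and `read`) to confirm a fix before requesting re-indexing.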

A crawl block in robots.txt could also be the culprit if Google isn’t indexing a single web page. To check if this is the case, paste the URL into the URL inspection tool in Google Search Console. Click on the Coverage block to reveal more details, then look for the “Crawl allowed? No: blocked by robots.txt” error.

This indicates that the page is blocked in robots.txt.

If that’s the case, recheck your robots.txt file for any “disallow” rules relating to the page or related subsection.

Important page blocked from indexing in robots.txt.

Remove where necessary.

2) Remove rogue noindex tags
Google won’t index pages if you tell it not to. This is useful for keeping some web pages private. There are two ways to do it: a meta robots tag in the page’s HTML, or an X-Robots-Tag HTTP response header.
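A quick way to audit a page for either mechanism is to look for “noindex” in the meta robots tag and in the X-Robots-Tag header. A minimal sketch using only Python’s standard library (`has_noindex` and the sample inputs are illustrative, not a real API):

```python
# Detect a "noindex" directive from either of the two usual sources:
# a <meta name="robots"> tag in the HTML, or an X-Robots-Tag response header.
from html.parser import HTMLParser

class NoindexMetaFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "meta"
                and a.get("name", "").lower() in ("robots", "googlebot")
                and "noindex" in a.get("content", "").lower()):
            self.noindex = True

def has_noindex(html, headers):
    """True if the page carries noindex via meta tag or HTTP header."""
    finder = NoindexMetaFinder()
    finder.feed(html)
    header_noindex = "noindex" in headers.get("X-Robots-Tag", "").lower()
    return finder.noindex or header_noindex

page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page, {}))                                      # True
print(has_noindex("<html></html>", {"X-Robots-Tag": "noindex"}))  # True
print(has_noindex("<html></html>", {}))                           # False
```

Run this against the HTML and response headers of any page that mysteriously won’t index; a stray “noindex” left over from staging is a very common culprit.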


Answered on October 7, 2019.
