Sometimes website owners can’t find their sites in Google’s search results.
This rarely happens at later stages of a website’s life cycle, but it is fairly common when a website is still young.
Nevertheless, the issue can occur at any time.
In this post, you will learn 7 reasons why your website can’t be found on Google, along with solutions for fixing each of them.
1. Why is Your Website Not Showing Up in Google: Your Website Is New to Google
When you launch a new website, Google needs some time to notice it. Nothing is going to happen in the blink of an eye. Hence, be patient!
If you want to check whether Google has noticed your website, you can search with the following operator: `site:yourwebsitedomain.com`.
If Google knows about your website, you will see at least one result. But that does not always mean the search engine knows about the exact pages you’re trying to rank.
To find out, search Google for `site:websitedomain.com/page-you-are-looking-for`.
However, if you don’t see any results, you should create a sitemap and submit it via Google Search Console.
Your sitemap will let Google know which pages it should crawl and where to find them.
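To make the sitemap idea concrete, here is a minimal sketch of how one could be generated with Python’s standard library. The domain and page paths are hypothetical placeholders, not taken from the post:

```python
# Minimal sitemap generator sketch. The base URL and page paths
# below are hypothetical placeholders -- substitute your own.
from xml.sax.saxutils import escape

def build_sitemap(base_url, paths):
    """Return a minimal XML sitemap listing base_url + each path."""
    entries = "\n".join(
        f"  <url><loc>{escape(base_url + p)}</loc></url>" for p in paths
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n"
        "</urlset>\n"
    )

if __name__ == "__main__":
    print(build_sitemap("https://example.com", ["/", "/blog/", "/contact/"]))
```

In practice most CMS platforms generate this file for you; the point is simply that each `<loc>` entry tells Google a URL you want crawled.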
2. Blocking Search Engines From Indexing Your Website
Sometimes a website owner might not want certain pages of the site to appear in the search results.
To do this, you can add a “noindex” meta tag, a snippet of HTML code with the following structure:
<meta name="robots" content="noindex"/>
Consequently, those pages that you tagged as “noindex” won’t be indexed even if you have created a sitemap and submitted it in Google Search Console.
There are cases when web developers use this meta tag to prevent Google from indexing a website at the development stage. Afterwards, they forget to remove it before releasing the website.
Fortunately, Google Search Console lists any “noindexed” pages in the “Coverage” report.
Make sure that you didn’t forget to remove this “noindex” meta tag so that your website can be indexed properly.
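If you want to spot-check a page yourself, a leftover “noindex” tag can be detected with a few lines of standard-library Python. This is a self-contained sketch: it checks an HTML string rather than fetching a live page, and the sample markup is a made-up example:

```python
# Sketch: check an HTML document for a robots "noindex" meta tag.
# Uses only the standard library; fetching the live page (e.g. with
# urllib.request) is omitted so the example stays self-contained.
from html.parser import HTMLParser

class NoindexFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if attrs.get("name", "").lower() == "robots" and \
           "noindex" in attrs.get("content", "").lower():
            self.noindex = True

def has_noindex(html):
    """Return True if the HTML contains <meta name="robots" content="...noindex...">."""
    finder = NoindexFinder()
    finder.feed(html)
    return finder.noindex

page = '<html><head><meta name="robots" content="noindex"/></head></html>'
print(has_noindex(page))  # True
```

Viewing the page source in your browser and searching for “noindex” works just as well; the script is only useful if you want to check many pages at once.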
3. Your Pages Are Blocked From Crawling by Search Engines
If you want to instruct search engines which pages they can and cannot visit, you should have a robots.txt file on your website. The URLs blocked in your robots.txt file won’t be crawled by Google, and those pages simply won’t show up in search results.
To identify the pages that are blocked from crawling, pay attention to “submitted URL blocked by robots.txt” issues in the “Coverage” report of Google Search Console.
Keep in mind that you will only get this information if Google has already attempted to crawl the URLs in your sitemap. If you have submitted it only recently, the data may not be available yet.
Luckily, you can check these pages manually. Open `yourdomain.com/robots.txt` in your browser to view the file.
Pay attention to whether there is a `Disallow: /` directive under any of the user-agents.
This directive blocks Google from crawling every page on your website.
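As a sketch of how this check can be automated, Python’s standard-library `urllib.robotparser` can parse a robots.txt file and report whether a given crawler may fetch a URL. The robots.txt contents and the URL below are hypothetical examples:

```python
# Sketch: parse a robots.txt and check whether Googlebot may crawl a URL.
# The robots.txt contents and the URL are hypothetical examples.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A blanket "Disallow: /" blocks every page for every crawler:
print(parser.can_fetch("Googlebot", "https://example.com/blog/"))  # False
```

Changing `Disallow: /` to `Disallow:` (empty) or removing the rule would make `can_fetch` return True, which is the behavior you want for pages that should rank.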
4. Absence of Quality Backlinks
Google takes many ranking factors into account when ranking your website. One of these factors is the number of high-quality backlinks pointing to your website.
Backlinks work as a kind of “vote” for certain pages on your website. The more quality backlinks a page has, the higher its chances to rank.
Therefore, if your pages don’t have enough quality backlinks, it could be the reason why your site is not showing up in Google.
To find out how many referring domains link back to your website, you can use the Site Explorer tool. Just paste your website’s URL and pay attention to the referring domains and backlinks metrics.
Now you have a clear idea of how many referring domains your website has right now.
But how many referring domains do you need to acquire to rank in the top 10 of the SERP for your target keywords?
Go to the Keywords Explorer tool and review the “SERP overview” report:
The “Domains” column shows how many referring domains each of the top-10 pages has.
With this information, you will know roughly how many referring domains your page needs to acquire to rank on Google’s first page.
5. Your Page Is Not Authoritative Enough
You probably know that Google’s ranking algorithm uses PageRank, which counts the number and quality of links pointing to a page to produce a rough estimate of how important that page is.
However, the public PageRank score was retired years ago. As a result, you can’t compare the PageRank of your pages with that of the pages in Google’s top 10.
Luckily, you can get comparable data from the URL Rating (UR) score, which takes both backlinks and internal links into account.
You can find this metric with the help of the Site Explorer tool.
To compare your UR with the top-ranking pages, review the “UR” column in the “SERP overview” report in Keywords Explorer.
If your page has a lower UR score than the top-ranking pages for a target keyword, you should boost its link authority by adding more internal links and building backlinks.
6. Your Pages Don’t Align With Search Intent
Google aims to provide users with the results that best fit their queries. Hence, you must produce content that aligns with what people want to find in search.
Your goal is to create content that matches your target audience’s search intent.
There are four types of search intent:
- Navigational (when a searcher is looking for a certain website)
- Informational (when a searcher is looking for some information)
- Transactional (when a searcher is looking to make a purchase)
- Commercial investigation (when a searcher is looking for a specific product/service but hasn’t made a final decision on whether to purchase it)
If you optimize your page for a target keyword that doesn’t align with search intent, Google won’t rank it on the first page, even if the page has a high URL Rating.
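As a rough illustration, search intent is often approximated with simple keyword heuristics. This is a toy sketch, not a production classifier, and the keyword lists are illustrative assumptions rather than an authoritative taxonomy:

```python
# Toy heuristic for guessing search intent from query keywords.
# The keyword lists are illustrative assumptions, not a real taxonomy.
TRANSACTIONAL = {"buy", "order", "coupon", "cheap", "price"}
COMMERCIAL = {"best", "review", "vs", "comparison", "top"}
NAVIGATIONAL = {"login", "signin", "homepage"}

def guess_intent(query):
    words = set(query.lower().split())
    if words & TRANSACTIONAL:
        return "transactional"
    if words & COMMERCIAL:
        return "commercial investigation"
    if words & NAVIGATIONAL:
        return "navigational"
    return "informational"  # default: the searcher wants information

print(guess_intent("buy running shoes"))        # transactional
print(guess_intent("best running shoes 2020"))  # commercial investigation
print(guess_intent("how do marathons work"))    # informational
```

In practice, the most reliable signal is simply looking at what already ranks: if the top 10 results for your keyword are all product pages, Google has judged the intent transactional, and an informational blog post is unlikely to break in.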
7. Google Penalty as a Possible Reason
It is less likely that your website isn’t showing up on Google because of a Google penalty, but you should keep this possibility in mind as well.
Penalties can be manual or algorithmic. A manual penalty happens when Google takes action against your website and removes it from the search results. An algorithmic penalty occurs when Google’s algorithm demotes your website (or a page) in the search results because of quality issues.
Frankly speaking, manual penalties are pretty rare; they usually result from deliberately violating Google’s guidelines. You will get a notification about the penalty in the “Manual actions” report in Google Search Console.
Nevertheless, you will never get a notification about an algorithmic penalty, and it is quite challenging to figure out on your own.
If you notice a sudden drop in organic traffic, it might be a signal that you have been penalized algorithmically. Your first step should be to check whether the drop coincides with a known Google algorithm update.
The best advice for you is to have a consultation with an expert in this field before you start disavowing links.
Nobody is immune to website issues, and you can never be a hundred percent sure that your website will always be found on Google.
However, by knowing these 7 reasons that might prevent your website from appearing on Google, you will have a chance to resolve these issues and keep on ranking well.
If you think this post is missing some additional information, feel free to share your thoughts in the comment section.