
Why Is The Page Or Site Missing From Google?

11.02.2022

It’s important to note that when you type something into Google hoping to see your website in the search results, you’re not actually looking for your website. You’re looking for a page on your website. That’s an important distinction.

If Google doesn’t know about the existence of the page you’re trying to rank or thinks it doesn’t deserve to rank, then it won’t show up anywhere that matters in the search results.

For that reason, to show up in Google, three things need to be true:

  • Google knows that your website exists and can find and access all your important pages.
  • You have a page that’s a relevant result for the keyword you want to show up for.
  • You’ve demonstrated to Google that your page is worthy of ranking for your target search query—more so than any other page from another website.

Possible reasons:

1. Your website is too new

It takes time for Google to discover new websites and web pages. If you only launched your site this morning, then the most straightforward explanation is that Google just hasn’t found it yet.

To check whether Google knows your website exists, run a search for site:yourwebsite.com

[Screenshot: site: search results in Google]

If there is at least one result, then Google knows about your website. If there are no results, it doesn’t. But even if Google knows about your website, it might not know about the page you’re trying to rank. Check by searching for site:yourwebsite.com/a-page-you-want-to-show-up-in-google/

[Screenshot: site: search for a specific page]

There should be one result.

If you see no results for either of these searches, create a sitemap, and submit it via Google Search Console. (It’s good practice to do this regardless.)

Search Console > Sitemaps > Enter sitemap URL > Submit

[Screenshot: submitting a sitemap in Search Console]

SIDENOTE. You’ll need to create a free Search Console account and add your website before doing this. Read this guide for instructions.

A sitemap tells Google which pages on your site are important and where to find them. It can also speed up the discovery process.
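
If your site doesn’t have a sitemap yet, generating a basic one takes only a few lines of code. Here’s a minimal sketch using Python’s standard library; the page URLs are placeholders for your own:

```python
# Build a minimal XML sitemap using only the standard library.
# The URLs below are placeholders -- substitute your site's real pages.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string for the given list of page URLs."""
    urlset = ET.Element("urlset",
                        xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

pages = [
    "https://example.com/",
    "https://example.com/blog/",
]
print(build_sitemap(pages))
```

Save the output as sitemap.xml at your site’s root, then submit its URL in Search Console as described above.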

2. You’re blocking search engines from indexing your pages

If you tell Google not to show certain pages in the search results, then it won’t.

You do that with a “noindex” meta tag, which is a piece of HTML code that looks like this:

<meta name="robots" content="noindex"/>

Pages with that code won’t be indexed, even if you created a sitemap and submitted it in Google Search Console.

You probably don’t recall ever adding that code to any of your pages, but that doesn’t mean it isn’t there. For example, WordPress adds it to every page if you check the wrong box when setting up your site.

[Screenshot: WordPress search engine visibility setting]

Many web developers also use it to stop Google from indexing a site during development, then forget to remove it before publishing.

If Google has already crawled the pages in your sitemap, it’ll tell you about any “noindexed” ones in the “Coverage” report in Google Search Console.

Just look for this error:

[Screenshot: “noindex” error in the Search Console Coverage report]

If you recently submitted your sitemap to Google and it hasn’t crawled the pages yet, run a crawl in Ahrefs Site Audit. This checks every page on your site for 100+ potential SEO issues, including the presence of “noindex” tags.

[Screenshot: “noindex” issue flagged in Ahrefs Site Audit]

Remove “noindex” tags from any pages that shouldn’t have them.
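
If you want to spot-check a page’s HTML yourself, a simplified “noindex” check can be written with Python’s standard library. Note that this sketch only looks at the robots meta tag; real crawlers also honor the X-Robots-Tag HTTP header:

```python
# A simplified check for a "noindex" robots meta tag, using only
# the standard library. Real crawlers also honor the X-Robots-Tag
# HTTP header, which this sketch does not cover.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.noindex = False

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attrs = dict(attrs)
        if (attrs.get("name") or "").lower() == "robots":
            if "noindex" in (attrs.get("content") or "").lower():
                self.noindex = True

def has_noindex(html):
    parser = RobotsMetaParser()
    parser.feed(html)
    return parser.noindex

page = '<html><head><meta name="robots" content="noindex"/></head></html>'
print(has_noindex(page))  # True
```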

3. You’re blocking search engines from crawling your pages

Most websites have something called a robots.txt file. This instructs search engines where they can and can’t go on your website. Google can’t crawl URLs blocked in your robots.txt file, which usually results in them not showing up in search results.

If you’ve submitted your sitemap via Google Search Console, it should alert you about issues related to this. Go to the “Coverage” report and look for “Submitted URL blocked by robots.txt” errors.

[Screenshot: “blocked by robots.txt” error in Search Console]

Once again, that only works if Google has already attempted to crawl the URLs in your sitemap. If you only recently submitted this, then that may not yet be the case.

If you prefer not to wait, you can check manually. Just head to yourdomain.com/robots.txt.

You should see a file like this:

[Screenshot: Ahrefs’ robots.txt file]

What you don’t want to see here is this piece of code…

Disallow: /

… under either of these user-agents:

User-agent: *

User-agent: Googlebot

Why? Because it blocks Google from crawling all the pages on your site.

You also don’t want to see a “Disallow” directive for any important content.

For example, this Disallow rule would prevent Google from crawling all the posts on our blog.

Disallow: /blog/

Remove any directives blocking content that you want to show up on Google.
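
You can also test directives programmatically. Python’s standard library includes a robots.txt parser; here’s a sketch against a made-up robots.txt file:

```python
# Check whether a robots.txt file blocks Googlebot from given URLs,
# using the standard library's robots.txt parser. The file contents
# below are a made-up example.
import urllib.robotparser

robots_txt = """\
User-agent: *
Disallow: /admin/

User-agent: Googlebot
Disallow: /blog/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(robots_txt.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/about"))      # True
```

Note that the Googlebot group overrides the * group here: per the Robots Exclusion Protocol, a crawler obeys only the most specific group that matches its user-agent.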

WARNING

Robots.txt files can be complicated, and they’re easy to mess up. If you feel that yours may be preventing pages from showing up on Google, and you don’t know much about this file, hire an expert to fix it.

4. Your page lacks backlinks

While there are hundreds of factors at play in Google’s algorithm, the number of backlinks from unique websites to a page appears to be a strong one.

If the web pages ranking above you have way more backlinks, then this could be part of the reason you’re not showing up in Google.

To see the number of unique websites (referring domains) linking to your page, paste your URL into Site Explorer or our free backlink checker.

[Screenshot: referring domains in Ahrefs’ backlink checker]

Consider building more backlinks if your page falls short.
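
As a rough illustration of why “referring domains” differ from raw backlink counts, here’s a sketch that deduplicates a list of invented backlink URLs by hostname (tools like Ahrefs may additionally fold subdomains together, which this sketch doesn’t):

```python
# Count referring domains from a list of backlink source URLs.
# Each unique hostname counts once, mirroring how "referring
# domains" metrics deduplicate links. The URLs are invented.
from urllib.parse import urlparse

backlinks = [
    "https://example-blog.com/seo-tips",
    "https://example-blog.com/link-roundup",
    "https://news.example.org/article",
    "https://forum.example.net/thread/42",
]

referring_domains = {urlparse(url).hostname for url in backlinks}
print(len(referring_domains))  # 3
```

Four backlinks, but only three referring domains, because two of the links come from the same site.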

5. Your page is lacking “authority”

Google’s ranking algorithm is based on something called PageRank, which essentially counts backlinks and internal links as votes.

To check the URL Rating (UR) of any page on your site, paste the URL into Site Explorer or our free backlink checker.

[Screenshot: URL Rating in Site Explorer]

Compare that to the UR of the top-ranking pages for your target keyword using the “SERP overview” in Keywords Explorer.

[Screenshot: URL Ratings in the Keywords Explorer SERP overview]

If the top-ranking pages have a much higher UR score than yours, it might be a sign that your lack of “link authority” is holding you back.

There are two ways to boost the authority of a web page:

  • Build more backlinks;
  • Add more internal links.
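
To see how links act as votes, here’s a toy PageRank iteration over an invented four-page link graph. It’s a simplified illustration of the idea, not Google’s actual algorithm:

```python
# A toy PageRank iteration: links act as votes for the linked page.
# This is a simplified illustration, not Google's actual algorithm.
# An edge "A": ["B", "C"] means page A links to pages B and C.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
damping = 0.85
pages = list(links)
rank = {page: 1 / len(pages) for page in pages}

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for page in pages:
        votes = sum(rank[other] / len(links[other])
                    for other in pages if page in links[other])
        new_rank[page] = (1 - damping) / len(pages) + damping * votes
    rank = new_rank

# Page C receives the most links, so it ends up ranked highest.
print(max(rank, key=rank.get))  # C
```

Notice that both kinds of edges count: a link from another page on the same site (an internal link) contributes votes the same way a backlink does, which is why both items in the list above boost authority.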

 

6. You have duplicate content issues

Duplicate content is when the same or similar web page is accessible at different URLs.

Google tends not to index duplicate content because it takes up unnecessary space in its index, a bit like keeping two copies of the same book on your bookshelf.

Instead, it usually only indexes the version that you set as the canonical.

If no canonical is set, Google attempts to identify the best version of the page to index itself.

Unfortunately, Google isn’t perfect at identifying the best version of a page on its own, so if you don’t set canonicals explicitly, it may index the wrong version.
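
To get a feel for how duplicate URLs arise, here’s a heuristic sketch that folds together common URL variants (protocol, host casing, trailing slashes, tracking parameters). Real canonicalization rules vary from site to site:

```python
# Group URL variants that likely point to the same content by
# normalising scheme, host case, trailing slashes, and tracking
# parameters. A heuristic sketch -- real canonical rules vary.
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

TRACKING_PARAMS = {"utm_source", "utm_medium", "utm_campaign", "ref"}

def normalise(url):
    parts = urlparse(url)
    query = [(k, v) for k, v in parse_qsl(parts.query)
             if k not in TRACKING_PARAMS]
    path = parts.path.rstrip("/") or "/"
    return urlunparse(("https", parts.netloc.lower(), path,
                       "", urlencode(query), ""))

# Three URL variants that all serve the same page:
variants = [
    "http://Example.com/page/",
    "https://example.com/page?utm_source=newsletter",
    "https://example.com/page",
]
print(len({normalise(u) for u in variants}))  # 1
```

All three variants collapse to a single normalised URL, which is exactly the ambiguity a canonical tag resolves for Google.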

7. You have a Google penalty

Having a Google penalty is the least likely reason for not showing up on Google. But it is a possibility.

There are two types of Google penalties.

  • Manual: This is when Google takes action to remove or demote your site in the search results. It happens when a Google employee manually reviews your website and finds that it doesn’t comply with their Webmaster Guidelines.
  • Algorithmic: This is when Google’s algorithm suppresses your website or a web page in the search results due to quality issues. It’s more a case of computer says no than human says no.

Luckily, manual penalties are extremely rare. You’re unlikely to get one unless you’ve done something drastically wrong. Google also usually alerts you about them via the “Manual actions” report in Search Console.

[Screenshot: manual penalty warning in Search Console]

If there’s no warning in there, then you probably don’t have a manual penalty.

Unfortunately, Google doesn’t tell you if your site is being filtered algorithmically—and this can be quite challenging to identify.

If you suspect an algorithmic penalty due to a recent significant drop in organic traffic, your first course of action should be to check whether that drop coincided with a known or suspected Google algorithm update.

Panguin is a useful tool for this. It overlays known algorithm update dates on your Google Analytics traffic, making it easy to spot drops that coincide with an update.
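
The same comparison can be sketched in code: given daily traffic figures and a list of update dates (both invented here), flag any update followed by a sharp drop:

```python
# Flag algorithm updates that coincide with a sharp traffic drop.
# The session counts and update dates below are invented.
from datetime import date

update_dates = [date(2022, 5, 25), date(2022, 9, 12)]

daily_sessions = {
    date(2022, 5, 23): 1200,
    date(2022, 5, 24): 1180,
    date(2022, 5, 25): 950,
    date(2022, 5, 26): 610,
    date(2022, 5, 27): 590,
}

def drops_near(update_day, sessions, threshold=0.3):
    """True if sessions fell by more than `threshold` from the last
    full day before the update to any day from the update onwards."""
    days = sorted(sessions)
    before = [d for d in days if d < update_day]
    after = [d for d in days if d >= update_day]
    if not before or not after:
        return False
    baseline = sessions[before[-1]]
    low = min(sessions[d] for d in after)
    return (baseline - low) / baseline > threshold

suspects = [d for d in update_dates if drops_near(d, daily_sessions)]
print(suspects)  # [datetime.date(2022, 5, 25)]
```

Here traffic falls roughly 50% right after the May 25 update, so that update gets flagged, while the September date has no traffic data around it and is ignored.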

If you still suspect your site has been filtered or penalized at this point, talk to an expert before taking any potentially catastrophic actions like disavowing links.