Six ways to inadvertently keep Google from indexing your website

August 24, 2018

You might be blocking Google without knowing it, which means Google won’t index all the pages of your website. In this article, you’ll learn how to block Google deliberately and how to make sure that you do not block Google by accident.

1. Errors in the robots.txt file of your website will keep Google away

The disallow directive of the robots.txt file is an easy way to exclude single files or whole directories from indexing.

To exclude individual files, add this to your robots.txt file:

User-agent: *
Disallow: /directory/name-of-file.html

To exclude whole directories, use this:

User-agent: *
Disallow: /first-directory/
Disallow: /second-directory/

  • If your website has a robots.txt file, double-check it to make sure that you do not exclude directories that you want to appear in Google’s search results.
  • Note that your website visitors can still open the pages you exclude in the robots.txt file; the disallow directive only applies to crawlers. Check your website with the website audit tool in SEOprofiler to find out if there are any issues with the robots.txt file.
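If you want to verify a rule without waiting for a crawl, Python’s standard urllib.robotparser module can answer the question “would this user agent be allowed to fetch this URL?” against any robots.txt. A minimal sketch using the example directives from above (the domain example.com and the file names are placeholders):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt rules from the example above.
robots_txt = """\
User-agent: *
Disallow: /first-directory/
Disallow: /second-directory/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A URL inside a disallowed directory is blocked ...
print(parser.can_fetch("*", "https://example.com/first-directory/page.html"))  # False
# ... while the rest of the site stays crawlable.
print(parser.can_fetch("*", "https://example.com/blog/post.html"))  # True
```

Running this kind of check against your own robots.txt before deploying it is a quick way to catch an overly broad disallow rule.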

2. Use the meta robots noindex tag and Google will go away

The meta robots noindex tag enables you to tell search engine robots that a particular page should not be indexed. To exclude a web page from the search results, add the following code in the <head> section of a web page:

<meta name="robots" content="noindex, nofollow">

In this case, search engines won’t index the page and they also won’t follow the links on the page. If you want search engines to follow the links on the page, use this tag:

<meta name="robots" content="noindex, follow">

The page then won’t appear on Google’s result pages, but the links on it will be followed. If you want to make sure that Google indexes all pages, remove this tag.

The meta robots noindex tag only influences search engine robots. Regular visitors to your website can still see the pages.
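When auditing a site, you can scan a page’s <head> for the meta robots tag to see whether it is noindexed. A minimal sketch with Python’s built-in html.parser module (the sample HTML below is made up for illustration):

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of every <meta name="robots"> tag on a page."""

    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.directives.append(attrs.get("content", ""))

# Illustrative page source; in practice you would fetch this from your site.
html = """<html><head>
<meta name="robots" content="noindex, follow">
</head><body>Hello</body></html>"""

finder = RobotsMetaFinder()
finder.feed(html)
print(finder.directives)           # ['noindex, follow']
is_noindexed = any("noindex" in d for d in finder.directives)
print(is_noindexed)                # True
```

Feeding every page of your site through a check like this makes it easy to spot a noindex tag that was left behind by mistake.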

3. The wrong HTTP status code will send Google away

The HTTP status code in the server response header tells browsers and search engine robots how to treat a URL. A page that can be accessed normally returns a “200 OK” status code. Other status codes change that behavior, for example:

    • 301 moved permanently: this request and all future requests should be sent to a new URL.
    • 403 forbidden: the server refuses to respond to the request.

For search engine optimization purposes, use a 301 redirect if you want to make sure that visitors to old pages are redirected to the new pages of your website.
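To see a 301 in action without touching a live site, the sketch below starts a throwaway local HTTP server that answers /old-page with a “301 Moved Permanently” pointing at /new-page, then fetches it with urllib, which follows the redirect the same way a browser or Googlebot would. All paths and names here are illustrative:

```python
import http.server
import threading
import urllib.request

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    """Answers /old-page with a 301 redirect to /new-page."""

    def do_GET(self):
        if self.path == "/old-page":
            self.send_response(301)
            self.send_header("Location", "/new-page")
            self.end_headers()
        else:
            self.send_response(200)
            self.send_header("Content-Type", "text/html")
            self.end_headers()
            self.wfile.write(b"<h1>New page</h1>")

    def log_message(self, *args):
        pass  # silence per-request logging

# Port 0 asks the OS for any free port.
server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

# urllib follows the 301 automatically, just like a browser or crawler.
resp = urllib.request.urlopen(f"http://127.0.0.1:{port}/old-page")
print(resp.status, resp.geturl())  # 200 and the final /new-page URL
server.shutdown()
```

The final response has status 200 and the new URL, which is exactly what you want a search engine robot to see when it requests an old page.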

4. Google won’t index password protected pages

  • If you password-protect your pages, only visitors who know the password will be able to view the content.
  • Search engine robots won’t be able to access the pages either. Password-protected pages can also have a negative influence on the user experience, so test them thoroughly.

5. If your pages require cookies or JavaScript, Google might not be able to index your pages

  • Cookies and JavaScript can also keep search engine robots away from your door. For example, you can hide content by making it accessible only to user agents that accept cookies.
  • You can also hide content behind complex JavaScript. Many search engine robots do not execute complex JavaScript code, so they won’t be able to read those pages.
  • In general, you want Google to index your pages, so avoid these techniques on content that should rank.

6. Bad WordPress or plugin configurations

  • Under Settings/Reading in WordPress there is a setting that discourages search engines from indexing your site. It is common to enable it while developing a site and then forget to uncheck it at launch.
  • In addition, most SEO plugins, such as Yoast and Ultimate, have settings to globally block a site or to easily set noindex tags. Monitor these settings so that they do not disrupt proper crawling.
