May 22, 2016

Create Custom Robots.txt File in Blogger Blog

Custom Robots.txt File
In the previous post, I discussed Basic Search Engines For Blogger. If you have read it on this blog, I hope you enjoyed it and will act on it to improve your ranking in Google search results.

In this tutorial on Blogger search engine settings, I will talk in depth about the custom robots.txt file, an important file for every blog and website. To avoid confusion with other platforms, I will focus on Blogger only, which has an option called Search Preferences where you can configure your custom robots.txt file.


What is Robots.txt?

Robots.txt is a plain text file that contains a few lines of simple code. It is saved on your blog or website's server and gives instructions to crawlers (also called spiders), such as Googlebot, which webmasters call search engine spiders. These crawlers visit your blog and index its pages in the search results.

New web pages are crawled and indexed after Google sends its crawlers to them. A crawler reads your robots.txt file first and then crawls only the pages you allow. If you disallow a page in your robots.txt file, well-behaved spiders and crawlers will follow that rule and will not index that page in their search engine.

Robots.txt File Location

The robots.txt file must be placed in the web root directory of your blog or website. If you do not add a robots.txt file to your web directory, crawlers will still visit your site, but they will crawl it with default behavior rather than following your rules.

In Blogger, you can manage your robots.txt file through the Search Preferences function, where you can also edit meta tags, custom robots header tags, and nofollow settings for different pages. We will use this function to create and add a robots.txt file.

Before you create a robots.txt file, you need to understand how crawlers interpret its rules, because a wrongly configured custom robots.txt file can hurt your search engine results.

How to Create Robots.txt file?

Creating a robots.txt file is very easy, because you can use a robots.txt generator tool to produce complete code for Blogger. Or you can simply copy the basic code below.

User-agent: *
Disallow: /search
Allow: /
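As a quick sketch of how a crawler reads these rules, here is a small Python example using the standard library's urllib.robotparser module. The URL paths in it are hypothetical examples, not real blog pages:

```python
from urllib.robotparser import RobotFileParser

# The basic Blogger robots.txt rules from above
rules = [
    "User-agent: *",
    "Disallow: /search",
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(rules)

# Label and search-result pages under /search are blocked
print(rp.can_fetch("*", "/search/label/SEO"))      # False
# An ordinary post URL is still allowed
print(rp.can_fetch("*", "/2016/01/my-post.html"))  # True
```

This matches the behavior described above: any path that begins with /search is skipped, while everything else stays crawlable.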

Rules You Must Know to Create a Robots.txt File

Look at the first two lines of the file below: they block the /search directory from all web crawlers, which keeps search and label pages on your URLs out of the index.

Here is an example:

User-agent: *
Disallow: /search

In this case, you block the /p directory from all web crawlers. On Blogger, static pages live under /p/, so if you add this rule to your robots.txt file, none of your static pages will be crawled by any spider.

Here is an example:

User-agent: *
Disallow: /p

In this case, you prevent a particular post from being indexed. Replace yyyy and mm with the post's publish year and month, and post-url.html with the post's file name. For example: /2016/01/post-url.html

User-agent: *
Disallow: /yyyy/mm/post-url.html
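Continuing the earlier sketch, the same standard-library parser can confirm how these two rules behave. The paths here (/p/contact.html, /2016/01/post-url.html, and so on) are hypothetical examples:

```python
from urllib.robotparser import RobotFileParser

# Rules combining the static-page block and a single-post block
rules = [
    "User-agent: *",
    "Disallow: /p",                      # blocks Blogger static pages under /p/
    "Disallow: /2016/01/post-url.html",  # blocks one specific post
    "Allow: /",
]

rp = RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "/p/contact.html"))           # blocked static page
print(rp.can_fetch("*", "/2016/01/post-url.html"))    # blocked post
print(rp.can_fetch("*", "/2016/02/other-post.html"))  # still allowed
```

Note that Disallow matches by prefix, so Disallow: /p blocks every path that starts with /p, not just one page.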

SEO-Friendly Robots.txt in Blogger

To tell Google how to work with your website or blog, create a robots.txt file and add it to your web root directory. If you are using Blogger, you can add your robots.txt file through the Search Preferences function in your Blogger dashboard.

Add Custom Robots.txt File in Blogger

  1. Log in to your Blogger blog
  2. Go to Dashboard ›› Settings ›› Search Preferences ›› Crawlers and indexing ›› Custom robots.txt ›› Edit ›› Yes
  3. Paste the robots.txt code for Blogger into the box
  4. Click the Save Changes button. You are done!

Custom Robots.txt File For Blogger

How to View Your Live Robots.txt File?

After you have added the robots.txt file to your blog or website, you can see it live on the web: type your domain name in the web browser, then a slash, then robots.txt. For example: yourblog.blogspot.com/robots.txt

Custom Robots.txt File Live


That is what I am discussing today: how helpful the robots.txt file is, and why you should create one and add it to your blog or website, because it helps your site work well with the Google search engine.

The result is controlled by the rules you set: if you allow Google to crawl a page, it will crawl it, and if you disallow a page, Google will skip it.

So, that is what we are sharing in today's tutorial. I hope you enjoyed it and will share it with your friends.

Keo Sopherth, Professional Blogger

Offers Blogger and WordPress website development tutorials, and simple search engine optimisation tips, with love.
