Robots.txt is a text file containing a few lines of simple code. It is stored on your site or blog's server and tells web crawlers how to crawl and index your blog in search results. That means you can restrict any page on your blog from search crawlers so that it doesn't get indexed, such as your blog's labels page, your demo page, or any other page that isn't important enough to be indexed. Remember that search crawlers read the robots.txt file before crawling any page.
Every blog hosted on Blogger has a default robots.txt file, which looks something like this:
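This is the typical default file Blogger generates, shown here with example.blogspot.com as a placeholder for your own blog's address:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED
```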
This code is divided into three sections. Let's first study each of them, and then we will learn how to add a custom robots.txt file to Blogspot blogs.
- User-agent: Mediapartners-Google
This line is for Google AdSense robots and helps them serve better ads on your blog. Whether you use Google AdSense on your blog or not, simply leave it as it is.
- User-agent: *
This applies to all robots, indicated by the asterisk (*). In the default settings, our blog's label links are restricted from indexing, which means search crawlers will not index our label page links because of the `Disallow: /search` rule.
That means any link containing the keyword `search` immediately after the domain name will be ignored. See the example below, which is a link to a label page named SEO.
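Using example.blogspot.com as a placeholder, a label page URL for a label named SEO would look like this, and would be blocked by the rule above:

```
http://example.blogspot.com/search/label/SEO
```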
And if we remove `Disallow: /search` from the above code, then crawlers will be allowed to access our entire blog and index and crawl all of its content and pages.
Here `Allow: /` refers to the homepage, which means web crawlers can crawl and index our blog's homepage.
Disallow Particular Post
Now suppose we want to exclude a particular post from indexing. Then we can add the lines below to the code.
Here `yyyy` and `mm` refer to the publishing year and month of the post, respectively. For example, if we published a post in March 2013, then we have to use the format below.
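A sketch of the directive follows, where `post-url.html` is a hypothetical placeholder for the actual post slug:

```
# General format (post-url.html is a placeholder for the real post slug)
Disallow: /yyyy/mm/post-url.html

# Example for a post published in March 2013
Disallow: /2013/03/post-url.html
```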
To make this task easy, you can simply copy the post URL and remove the blog name from the beginning.
Disallow Particular Page
If we need to disallow a particular page, we can use the same method as above. Simply copy the page URL and remove the blog address from it, which will look something like this:
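On Blogger, static pages live under the `/p/` path, so the directive would look like this, with `page-url.html` as a hypothetical placeholder for the actual page slug:

```
# page-url.html is a placeholder for the real page slug
Disallow: /p/page-url.html
```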
- Sitemap: http://example.blogspot.com/feeds/posts/default?orderby=UPDATED
This line points to the sitemap of our blog. By adding the sitemap link here, we are optimizing our blog's crawl rate: whenever web crawlers scan our robots.txt file, they find a path to our sitemap, where all the links of our published posts are listed. This makes it easy for crawlers to discover all of our posts, so there is a better chance that they crawl every blog post without missing a single one.
Note: This sitemap only tells web crawlers about the 25 most recent posts. If you want to increase the number of links in your sitemap, replace the default sitemap with the one below. It will work for the 500 most recent posts.
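A commonly used variant of the Blogger feed URL covers up to 500 posts via the `max-results` parameter (again with example.blogspot.com as a placeholder for your blog's address):

```
Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
```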
If you have more than 500 published posts on your blog, then you can use two sitemaps, like below:
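Here is a sketch of the two-sitemap setup, where the second line starts at index 501 to pick up posts beyond the first 500 (example.blogspot.com is a placeholder):

```
Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://example.blogspot.com/atom.xml?redirect=false&start-index=501&max-results=500
```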
How To Add Custom Robots.Txt to Blogger
Now for the main part of this tutorial: how to add a custom robots.txt file in Blogger. Follow the steps below to add it.
- Go to your blogger blog.
- Navigate to Settings >> Search Preferences >> Crawlers and indexing >> Custom robots.txt >> Edit >> Yes
- Now paste your robots.txt file code in the box.
- Click on Save Changes button.
- You are done!
How to Check Your Robots.txt File?
You can check this file on your blog by appending /robots.txt to the end of your blog URL in the browser. Take a look at the example below.
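For instance, with example.blogspot.com standing in for your own blog's address, you would open this URL:

```
http://example.blogspot.com/robots.txt
```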
Once you visit the robots.txt file URL, you will see the entire code that you are using in your custom robots.txt file.
This completes the tutorial on how to add a custom robots.txt file in Blogger. I have tried my best to make this tutorial as simple and useful as possible. Still, if you have any doubts or questions, feel free to ask me. Do not put any code in your custom robots.txt settings without understanding it; simply ask me to resolve your doubts, and I'll explain everything in detail. Thanks for reading this tutorial. If you like it, please support me by sharing this post on your social media profiles. Happy blogging!