
How to Create the Custom Robots.txt file in the Blogger in 2021


What is Robots.txt?

Robots.txt is a file that controls how search engines crawl your website, which means it plays a vital role in the SEO of a blog. In this article, we look at how to create a robots.txt file and add it to your Blogger blog.

The robots.txt file tells search engines which pages should be crawled and which shouldn't. In robots.txt we use the User-agent, Allow, Disallow, and Sitemap directives to give instructions to different search engines such as Google and Bing.

Robots.txt file for the Blogger:

To create a custom robots.txt file for Blogger, we first need to understand the basic structure of Blogger's robots.txt.

The default robots.txt file looks like this:

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://www.example.com/sitemap.xml

  • The first block names the bot it applies to: Mediapartners-Google, the Google AdSense crawler. The empty Disallow rule gives this bot permission to crawl every page.

  • In the next block the user agent is *, which means the rules apply to the bots of all search engines: they are disallowed from crawling the /search pages, and the Allow: / line permits them to crawl all other pages.

  • The last line of the robots.txt file contains the sitemap for the blog.

In short, the robots.txt file controls search engine bots and tells them which pages to crawl and which to skip.
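To see these rules in action, you can test them with Python's standard urllib.robotparser module. This is just a quick sketch — the example.com URLs are placeholders for your own blog's address:

```python
from urllib import robotparser

# The default Blogger rules from above
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Search pages are blocked for ordinary crawlers...
print(rp.can_fetch("*", "https://www.example.com/search/label/seo"))   # False
# ...but normal posts and pages are allowed,
print(rp.can_fetch("*", "https://www.example.com/2021/05/post.html"))  # True
# and the AdSense bot may crawl everything, including search pages.
print(rp.can_fetch("Mediapartners-Google",
                   "https://www.example.com/search/label/seo"))        # True
```

This also shows why the default file causes the duplicate-content issue discussed below: archive URLs such as /2021/05/ are allowed by the catch-all Allow: / rule.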

How to Create a Customized Perfect Robots.txt file for the Blogger:

Above, we discussed the basic robots.txt file for Blogger. Now let's discuss how to create a perfect, customized robots.txt file for Blogger.

The default robots.txt file above allows archive pages to be indexed, which causes duplicate content problems in Google Search Console. To prevent the archive section from being crawled, we add the Disallow rule /20* to the robots.txt file, since Blogger archive URLs start with the year (e.g. /2021/05/). But this rule alone also stops the crawling of the posts themselves, because Blogger post URLs start with the year too. To avoid this, we add an Allow rule for /*.html, which lets search engine bots crawl the posts and pages of your blog or website.


User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search*
Disallow: /20*
Allow: /*.html

Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-pages.xml

With these rules in place, the file above is a solid robots.txt for the SEO of a Blogger blog.
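Note that the * wildcard used in these rules is a Google extension that Python's standard urllib.robotparser does not understand, so to illustrate how Google interprets them, here is a minimal, simplified sketch of Google-style matching (longest matching rule wins, with Allow winning ties — the paths tested are hypothetical examples):

```python
import re

def google_match(pattern, path):
    """Translate a Google-style robots.txt pattern into a regex and
    match it against the start of the URL path.  '*' matches any run
    of characters; '$' anchors the end of the path."""
    regex = ""
    for ch in pattern:
        if ch == "*":
            regex += ".*"
        elif ch == "$":
            regex += "$"
        else:
            regex += re.escape(ch)
    return re.match(regex, path) is not None

# Rules from the customized file above
disallow = ["/search*", "/20*"]
allow = ["/*.html"]

def allowed(path):
    # Collect every rule that matches, tagged with its length and verdict.
    matches = [(len(p), True) for p in allow if google_match(p, path)]
    matches += [(len(p), False) for p in disallow if google_match(p, path)]
    if not matches:
        return True  # no rule matches: crawling is allowed by default
    # Most specific (longest) rule wins; on a tie, Allow (True) sorts first.
    matches.sort(key=lambda t: (t[0], t[1]), reverse=True)
    return matches[0][1]

print(allowed("/2021/05/"))              # False - archive page blocked by /20*
print(allowed("/2021/05/my-post.html"))  # True  - post rescued by /*.html
print(allowed("/search/label/seo"))      # False - search pages blocked
```

This is only an illustration of the matching idea, not Google's actual implementation, but it shows why the /*.html rule is needed alongside /20*.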

Implementation of the Robots.txt file in Blogger:


  • First of all, log in to your Blogger account.
  • Go to the Settings of your blog.
  • Scroll down to the Crawlers and indexing section.
  • Enable Custom robots.txt.
  • Click Custom robots.txt and paste in the file above.
  • Don't forget to change the example URLs to your own blog's URL.