Custom Robots.txt: Robots.txt is a file that contains a few lines of simple code. In Blogger, it's called Custom Robots.txt. As the name suggests, you can easily customize that code to suit your blog. In my previous article, I explained how to add custom robots header tags in Blogger, which is very important for improving search engine optimization. A custom robots.txt file helps in much the same way, improving your search engine presence with more advanced customization. Let's dive in!
What is Custom Robots.txt in Blogger?
In Blogger, Custom Robots.txt is a text file containing a few lines of simple code that a webmaster creates to instruct search engine bots (such as Googlebot) how to crawl the pages on their blog.
The robots.txt file is saved on every blog's server and tells search bots which pages they may crawl and index and which they may not. If a page is restricted in the robots.txt file by a "Disallow" rule, Google won't crawl that page, and as a result it won't be indexed in Google.
Google reads the robots.txt file before crawling any page on your site, so implementing proper robots.txt code in Blogger is very important for better search rankings. A blog with a proper robots.txt file also enjoys a better crawl rate. That's why it's worth understanding all the technical terminology relating to robots.txt before you edit it.
In Blogger, a robots.txt file looks something like this:
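Here is a typical default Blogger robots.txt. The address example.blogspot.com is a placeholder; your file will show your own blog's URL:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
```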
Technical terminology relating to custom robots.txt
In Blogger, the robots.txt file is divided into four sections: User-agent: Mediapartners-Google, User-agent: *, Disallow: /search, and Sitemap. Let's understand each term in detail.
What is User-agent: Mediapartners-Google?
User-agent: Mediapartners-Google addresses the Google AdSense robot. Allowing it to scan the entire page helps AdSense serve more relevant ads on your blog.
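In the robots.txt file, this section is normally left open (an empty Disallow blocks nothing), so the AdSense crawler can read every page:

```
# Empty Disallow: the AdSense crawler may read all pages
User-agent: Mediapartners-Google
Disallow:
```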
What is User-agent:*?
User-agent (or UA) names the robot a rule applies to. The asterisk (*) is a wildcard, so User-agent: * applies the rules that follow to all robots.
What is Disallow: /search?
"Disallow: /search" means URLs containing /search (label and search-result pages) will not be crawled, and therefore will not be indexed.
Note: If you remove Disallow: /search from robots.txt, Googlebot will be able to access and index your label and search pages. This is not recommended for SEO, since /search URLs create duplicate versions of your content.
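To illustrate, with Disallow: /search in place, URLs like these (hypothetical examples on a placeholder blog) are affected as follows:

```
# Blocked by Disallow: /search
https://example.blogspot.com/search/label/SEO
https://example.blogspot.com/search?q=robots

# Still crawlable (post and page URLs don't contain /search)
https://example.blogspot.com/2023/01/my-first-post.html
https://example.blogspot.com/p/about.html
```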
What is sitemap?
In Blogger, a sitemap is either an XML or HTML file that contains the URLs of the pages on your blog. The XML sitemap is submitted to search engine webmaster tools, whereas the HTML sitemap is a page on your blog. The XML sitemap helps search engine bots (crawlers) access your content, whereas the HTML sitemap helps your visitors navigate your blog.
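In the robots.txt file, the XML sitemap is declared on its own line so crawlers can find it. Blogger serves one automatically at /sitemap.xml (example.blogspot.com is a placeholder for your blog address):

```
Sitemap: https://example.blogspot.com/sitemap.xml
```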
Custom Robots.txt Code
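Putting the four sections together, a complete custom robots.txt for a Blogger blog might look like the sketch below. Replace example.blogspot.com with your own blog address; the second Sitemap line is an optional Blogger feed-based sitemap covering up to 500 posts:

```
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: https://example.blogspot.com/sitemap.xml
Sitemap: https://example.blogspot.com/atom.xml?redirect=false&start-index=1&max-results=500
```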
How to Add Custom Robots.txt to Blogger
To add a custom robots.txt file in Blogger, follow the steps below:
- Sign in to Blogger and select the appropriate blog.
- Go to Settings > Search Preferences > Custom Robots.txt > Edit > Yes.
- Now, paste your robots.txt code into the code field.
- Click “Save“. You’re done!
How to check Robots.txt?
To check the robots.txt of your blog, or any blog, just append /robots.txt to the end of the root URL. For example, to check the robots.txt of the mathzag site, add /robots.txt to its homepage URL.
◆ Last Word: I hope this article benefited you and you learned something new. Let me know how this tutorial helped you set up a custom robots.txt in the Search Preferences settings to improve your search engine visibility. If you have any issues restricting URLs other than those described in this article, drop a comment below; we're ready to help. And if this helped, please share it with your friends on social media.