How To Add a Custom Robots.txt File in Blogger


Robots.txt is a plain text file that contains a few lines of simple code. It is saved on your site or blog's server and tells search engine crawlers how to crawl and index your blog for the search results. That means you can restrict any web page on your blog from search crawlers so that it does not get indexed, such as your blog's label pages, your demo page, or any other pages that are not important enough to be listed. Keep in mind that search crawlers check the robots.txt file before crawling any web page.
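For example, if you wanted to keep one particular page out of search engines, you could block it with a Disallow rule like the one below. The /p/demo-page.html path is just a hypothetical example of a Blogger static page you might want to hide, and User-agent: * means the rule applies to every crawler.

User-agent: *
Disallow: /p/demo-page.html
Sitemap: http://www.blogname.com/sitemap.xml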

Read: How To Index Blogger Blog Posts for Search Rankings and Listings

Read: How To SEO Optimize Your Blogger Blog Titles For Higher Search Rankings and Results


Every blog hosted on the Blogger platform has a default robots.txt file, which looks something like this:

User-agent: *
Disallow:
Sitemap: http://www.blogname.com/sitemap.xml
The above is the newer version of the robots.txt file, and it looks different from the older one, where you had to add extra Sitemap lines if your blog had more than 1,000 posts. The current one covers all blog posts with no limit.
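For reference, the older approach usually listed the blog's post feed as the sitemap in batches, so larger blogs needed several Sitemap lines. The feed URLs below are only an illustration of that older pattern (with made-up batch sizes); you do not need to add them now.

Sitemap: http://www.blogname.com/atom.xml?redirect=false&start-index=1&max-results=500
Sitemap: http://www.blogname.com/atom.xml?redirect=false&start-index=501&max-results=500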

Robots.txt helps the Google search engine crawl and index your blog's labels, posts and pages. It is a quick and easy Blogger SEO tweak that should be applied to Blogger blogs if you want your blog to get on Google quickly. In this tutorial, I will show you how to add a robots.txt file in Blogger and make it effective using Google Webmaster Tools.

How To Add Custom Robots.txt to Blogger


Log in to your Blogger blog.

Navigate to Settings >> Search Preferences >> Crawlers and indexing >> Custom robots.txt


Click Edit and choose Yes.

Now paste the robots.txt code below into the box.

User-agent: *
Disallow:
Sitemap: http://www.blogname.com/sitemap.xml
Replace blogname with your own blog's address.
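For example, if your blog's address were http://www.myblog.blogspot.com (a made-up address used here only as an illustration), the last line would become:

Sitemap: http://www.myblog.blogspot.com/sitemap.xml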

Click the Save Changes button.
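If you also want to keep Blogger's label and search result pages (the URLs under /search) out of the index, a commonly used variant of the file looks like the one below. It is optional, blogname is again a placeholder for your own address, and the Mediapartners-Google block simply lets the AdSense crawler read every page.

User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /search
Allow: /

Sitemap: http://www.blogname.com/sitemap.xml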



How To Test Your Blogger Custom Robots.txt in Google Webmaster Tools


After adding the custom robots.txt file to your Blogger blog, the next thing to do is test the file in Google Webmaster Tools. When you test the file in Webmaster Tools, it connects with Blogger so that your blog posts, pages and labels get indexed effectively.

To do this, make sure you have already added and verified your blog in Google Webmaster Tools.


Log in to your Google Webmaster Tools dashboard.

Click the site you activated the robots.txt for.

On the site dashboard, click "Crawl" and, from the drop-down options, click "robots.txt Tester".

Here, you might find the robots.txt field empty if you have only just added and verified your blog in Google Webmaster Tools.

Therefore, to test your blog's robots.txt file, copy the robots.txt given above and paste it into the field as shown in the snapshot below.

Click "submit"


A page will pop up asking you to download your robots.txt file. Don't bother downloading it, because the .txt file will contain the same robots.txt given above.

Skip the second option and move to the last one, which says:

Ask Google to update  
Submit a request to let Google know your robots.txt file has been updated.
Click "Submit" to let submit your request. Then let Google do the rest.


You will get a success message:
Success! Reload the Tester page in a minute to confirm the timestamp


Now, to top it all off, just add your blog's sitemap and pages sitemap to Google Webmaster Tools. That's all.
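In practice, that means submitting two sitemap URLs in Webmaster Tools. On Blogger these are usually the following (again, replace blogname with your own address):

http://www.blogname.com/sitemap.xml (blog posts)
http://www.blogname.com/sitemap-pages.xml (static pages)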

Google will start indexing your posts like a machine. At first your posts might not be listed in search engines, but they will be crawled, and as time goes on your posts will appear in search results.

I hope this helps.

If you encounter any issue while trying this tutorial, drop it in the comment box and I will reply to you ASAP.

Cheers!





