Good robots vs bad robots

Web robots, also called “spiders”, “crawlers”, “web bots”, “search bots”, or simply “bots”, are software applications that run automated tasks on the web. Search engines use robots to crawl, index, and cache your website.
There are “good” robots and “bad” robots. Good robots like Googlebot, msnbot, googlebot-image, teoma, and many more spider your website and find pages to index.
Bad bots are usually web crawlers that cause problems for your site: they spam, steal website content, hunt for certain file extensions (for example, downloading your MP3 or video files), and harvest email addresses.
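For example, you can ask a particular bad bot to stay away from the whole site. Keep in mind that robots.txt is only a polite request and bad bots often ignore it, so real blocking needs server-side rules; the bot name here is just a placeholder:

User-agent: BadBot
Disallow: /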

Here is a sample robots.txt file for a plain HTML website:

User-agent: *
Disallow: /cgi-bin/
Disallow: /secretfolder/
Allow: /index.htm
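To see how a crawler interprets these rules, you can test them with Python’s standard-library robots.txt parser. This is just an illustration; the example.com URLs are placeholders:

```python
from urllib.robotparser import RobotFileParser

# The sample rules from above, parsed the way a polite crawler would.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /secretfolder/
Allow: /index.htm
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/index.htm"))            # True
print(rp.can_fetch("*", "https://example.com/secretfolder/page.html"))  # False
print(rp.can_fetch("*", "https://example.com/cgi-bin/script"))       # False
```

Any path not matched by a rule is allowed by default, which is why only the two disallowed folders come back False.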

You may also use a robots.txt file generator to create one.

WordPress does not create a physical robots.txt file unless you use a plug-in such as WP Robots Txt.

WordPress does, however, serve a virtual robots.txt file. It is only enabled if you select “I would like to block search engines, but allow normal visitors” in Settings/Privacy, or if a plugin enables it via a setting such as the one in the sitemap plugin.
Regardless of whether the virtual file is active, most robots will prefer an actual robots.txt file over WordPress’ virtual one.

If you would like to create a robots.txt file for a WordPress blog, it could look like this:

User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /wp-content/plugins/
Disallow: /wp-content/cache/
Disallow: /wp-content/themes/
Disallow: /wp-login.php
Disallow: /wp-register.php

Please note that when you create your own robots.txt file, it overrides WordPress’ virtual robots.txt file. You can also reference your sitemap in it; a compressed sitemap file such as sitemap.gz works too.
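To point crawlers at your sitemap, add a Sitemap directive with the full URL (the domain below is a placeholder for your own):

Sitemap: http://example.com/sitemap.gz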

A robots.txt checker is a “validator” that analyzes the syntax of a robots.txt file.
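As a rough illustration of what such a validator does, here is a minimal sketch. The list of known directives is an assumption for this example, not the full specification:

```python
# Minimal robots.txt syntax check: flags lines that are not comments,
# blank lines, or known "Field: value" directives.
# Sketch only; KNOWN_FIELDS is an assumed subset of directives.
KNOWN_FIELDS = {"user-agent", "disallow", "allow", "sitemap", "crawl-delay"}

def check_robots_txt(text):
    problems = []
    for lineno, line in enumerate(text.splitlines(), start=1):
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue  # blank lines are fine
        field, sep, _value = line.partition(":")
        if not sep or field.strip().lower() not in KNOWN_FIELDS:
            problems.append((lineno, line))
    return problems

sample = "User-agent: *\nDisalow: /cgi-bin/\n"  # note the typo "Disalow"
print(check_robots_txt(sample))  # the misspelled directive on line 2 is reported
```

A real validator also checks rule ordering and path syntax; this sketch only catches unknown or malformed directives.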

WordPress Plug-ins:

WP Robots Txt
Edit your robots.txt file from the WordPress admin