Bad bots are a big challenge for any webmaster. They inflate your traffic, eat up your bandwidth, and can even crash your server under the extra load, so your real visitors cannot view your website properly. Nowadays it is hard enough to keep a visitor on your page for even ten seconds to impress them with your products or services.

In most cases webmasters block bad bots using robots.txt, but bad bots have gotten smart: they simply ignore your robots.txt file and crawl your pages anyway. A more reliable solution is to stop bad bots with .htaccess. When a bad bot requests a page on your website, your .htaccess rules will return a "403 Forbidden" error instead.
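For comparison, this is the robots.txt approach that well-behaved crawlers honor but bad bots ignore (the bot name here is just one example):

```
# robots.txt — a polite request, not an enforcement mechanism.
# Compliant crawlers like Googlebot obey it; bad bots skip it entirely.
User-agent: MJ12bot
Disallow: /
```

Because nothing forces a crawler to read this file, the .htaccess rules below are the enforcement layer.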

Below I am pasting some useful code to block bad bots.

## Bad Spider Block Code
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} Baidu [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Firefox\/6\.0\.2 [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Semrush [NC,OR]
RewriteCond %{HTTP_USER_AGENT} MegaIndex [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Ahrefs [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Yandex [NC,OR]
RewriteCond %{HTTP_USER_AGENT} MJ12bot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} Exabot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} botify [NC,OR]
RewriteCond %{HTTP_USER_AGENT} spbot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} 8legs [NC,OR]
RewriteCond %{HTTP_USER_AGENT} DotBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} BLEXBot [NC,OR]
RewriteCond %{HTTP_USER_AGENT} PhantomJS [NC]
RewriteRule (.*) - [F,L]
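If you are on Apache 2.4, a sketch of an alternative approach uses mod_setenvif instead of mod_rewrite; the bot list here is a subset of the one above, and this assumes mod_setenvif is enabled:

```apache
# Flag requests whose User-Agent matches a known bad bot (case-insensitive).
SetEnvIfNoCase User-Agent "(Semrush|Ahrefs|MJ12bot|DotBot|BLEXBot)" bad_bot

# Deny flagged requests with a 403, allow everyone else.
<RequireAll>
    Require all granted
    Require not env bad_bot
</RequireAll>
```

You can verify a rule is working by sending a fake User-Agent from the command line, e.g. `curl -I -A "Semrush" https://yoursite.example/` should return a 403 status.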
If anyone knows of more bad bots, please share them here so our community can benefit.
