 

Ban robots from website [closed]

My website is often down because a spider is accessing too many resources. This is what my hosting company told me. They told me to ban these IP addresses: 46.229.164.98, 46.229.164.100, 46.229.164.101

But I have no idea how to do this.

I've googled a bit and I've now added these lines to .htaccess in the root:

# allow all except those indicated here
<Files *>
order allow,deny
allow from all
deny from 46.229.164.98
deny from 46.229.164.100
deny from 46.229.164.101
</Files>

Is this 100% correct? What else could I do? Please help me; I really have no idea what I should do.

testermaster asked May 13 '14 12:05


1 Answer

Based on these Project Honey Pot reports:

https://www.projecthoneypot.org/ip_46.229.164.98
https://www.projecthoneypot.org/ip_46.229.164.100
https://www.projecthoneypot.org/ip_46.229.164.101

it looks like the bot is http://www.semrush.com/bot.html

If that's actually the robot, their page says:

To remove our bot from crawling your site simply insert the following lines to your
"robots.txt" file:

User-agent: SemrushBot
Disallow: /

Of course, that does not guarantee that the bot will obey the rules. You can block it in several ways; .htaccess is one, just as you did.
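If your host runs Apache 2.4 or newer, note that the order/allow/deny directives from the question are Apache 2.2 syntax and are deprecated in 2.4. A minimal sketch of the same IP block in 2.4 syntax (assuming mod_authz_core is available, which it is by default):

```apache
# Apache 2.4+ equivalent of the order/allow/deny block:
# allow everyone except the listed addresses
<RequireAll>
    Require all granted
    Require not ip 46.229.164.98
    Require not ip 46.229.164.100
    Require not ip 46.229.164.101
</RequireAll>
```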

You can also use this little trick: deny ANY IP address whose User-Agent string contains "SemrushBot":

Options +FollowSymlinks
RewriteEngine On
RewriteBase /
# flag any request whose User-Agent begins with one of these strings
SetEnvIfNoCase User-Agent "^SemrushBot" bad_user
SetEnvIfNoCase User-Agent "^WhateverElseBadUserAgentHere" bad_user
# then refuse flagged requests (Apache 2.2 syntax)
Order Allow,Deny
Allow from all
Deny from env=bad_user
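The same User-Agent check can also be done with mod_rewrite, which the snippet above already enables; a sketch that returns 403 Forbidden for any request whose User-Agent contains "SemrushBot":

```apache
RewriteEngine On
# [NC] = case-insensitive; the pattern is unanchored, so it matches
# "SemrushBot" anywhere in the User-Agent header
RewriteCond %{HTTP_USER_AGENT} SemrushBot [NC]
# [F] sends 403 Forbidden, [L] stops processing further rules
RewriteRule .* - [F,L]
```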

This also blocks other IPs that the bot may use.

See more on blocking by User-Agent string: https://stackoverflow.com/a/7372572/953684

I should add that if your site is brought down by a spider, it usually means you have a badly written script or a very weak server.

edit:

this line

SetEnvIfNoCase User-Agent "^SemrushBot" bad_user

tries to match a User-Agent that begins with the string SemrushBot (the caret ^ means "begins with"). If you want to match SemrushBot ANYWHERE in the User-Agent string, simply remove the caret:

SetEnvIfNoCase User-Agent "SemrushBot" bad_user

The above matches if the User-Agent contains the string SemrushBot anywhere (yes, no need for .*).
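To make the difference concrete: many crawlers send a Mozilla-prefixed User-Agent such as "Mozilla/5.0 (compatible; SemrushBot/...)" (the exact string here is an assumption; check your access logs for what the bot really sends). Against such a header, only the unanchored pattern matches:

```apache
# Anchored: only matches a UA that BEGINS with "SemrushBot",
# e.g. "SemrushBot/7.0"; misses "Mozilla/5.0 (compatible; SemrushBot/7.0)"
SetEnvIfNoCase User-Agent "^SemrushBot" bad_user
# Unanchored: matches "SemrushBot" anywhere in the header,
# catching both forms above
SetEnvIfNoCase User-Agent "SemrushBot" bad_user
```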

Sharky answered Sep 20 '22 20:09