I read about one possible measure to filter out spambots and would like to get opinions (advice).
The measure works like this:
1) The login form is not directly accessible. I mean, if someone enters www.domain.com/login.php he is redirected to the index page or something like that.
2) index.php has a Login button. When the visitor clicks Login, a popup login form appears. Upon login I check $_SERVER["HTTP_REFERER"]; if it is not login.php, the attempt fails with an error.
Is such a measure useful? Maybe instead of $_SERVER["HTTP_REFERER"] I could use something else? Is such a method reasonable at all?
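For reference, a minimal sketch of the referer check described above (note the PHP superglobal key is `HTTP_REFERER`, with one R; the helper name here is made up, and the header can be omitted or forged by any client, so this is a weak signal at best):

```php
<?php
// Hypothetical helper: does the Referer header look like the request
// came from login.php? Browsers and bots may omit or forge this header,
// so this must never be the only line of defense.
function refererLooksValid(?string $referer): bool
{
    return $referer !== null && strpos($referer, 'login.php') !== false;
}

// Typical call site in the login handler:
// if (!refererLooksValid($_SERVER['HTTP_REFERER'] ?? null)) { /* reject */ }
```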
I would say: NOT REALLY. But there's a better solution, see below.
The problem is that good bots can do everything humans can. Most bots are even "better" than humans, because they know exactly how login systems work. Using JavaScript/iframe/AJAX constructs to deter bots will help against mainstream crawlers that simply search for text and password input fields. But even if your site uses a super-awesome, indirectly accessible login form, someone could still build a bot for exactly that use case.
A good solution to this problem:
Use a time-delayed login blocker! There's a tutorial on how to delay - and finally even block - login attempts after a critical number of failed logins here: http://www.codedevelopr.com/articles/throttle-user-login-attempts-in-php/
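The core of such a throttle can be sketched like this (the thresholds and the function name are illustrative, not taken from the linked tutorial; a real version would store the failure count per account in a database and reset it on a successful login):

```php
<?php
// Hypothetical throttle: how many seconds the user must wait before the
// next login attempt, based on the number of recent failed attempts.
function loginDelaySeconds(int $failedAttempts): int
{
    if ($failedAttempts < 3) {
        return 0;               // first few tries are free
    }
    if ($failedAttempts < 6) {
        return 2;               // short delay to slow down scripts
    }
    if ($failedAttempts < 10) {
        return 10;              // noticeably painful for brute force
    }
    return 3600;                // effectively block the account for an hour
}
```

Before checking the password, compare `time()` against the stored timestamp of the last attempt plus this delay, and reject the attempt early if the delay has not elapsed yet.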
Additionally, use a high-quality CAPTCHA after the first failed login.
For the really bad guys: if you get masses of failed logins from a single IP, block that IP. This is quite advanced, but common practice.
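The per-IP counting could look like this (a hypothetical in-memory sketch; in production you would persist the counts in a database or a cache such as Redis, with an expiry window so old failures age out):

```php
<?php
// Hypothetical per-IP blocker: records one more failed login for $ip in
// $failsByIp and reports whether that IP has hit the block limit.
function shouldBlockIp(array &$failsByIp, string $ip, int $limit = 20): bool
{
    $failsByIp[$ip] = ($failsByIp[$ip] ?? 0) + 1;
    return $failsByIp[$ip] >= $limit;
}
```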
(Please note: the author of that tutorial still uses mysql_query, which has been deprecated for years and was removed entirely in PHP 7. You should do it with mysqli or PDO instead.)
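For example, the tutorial's lookup of failed attempts could be done with a PDO prepared statement like this (connection details, table, and column names are illustrative; this fragment needs a running MySQL server to execute):

```php
<?php
// Sketch: replacing mysql_query() with a PDO prepared statement.
// DSN, credentials, and schema below are assumptions for illustration.
$pdo = new PDO(
    'mysql:host=localhost;dbname=app;charset=utf8mb4',
    'dbuser',
    'dbpass',
    [PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION]
);

// Prepared statements keep user input out of the SQL string,
// which also protects against SQL injection.
$stmt = $pdo->prepare(
    'SELECT attempts, last_attempt FROM login_attempts WHERE username = :username'
);
$stmt->execute(['username' => $username]);
$row = $stmt->fetch(PDO::FETCH_ASSOC);
```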