I am getting hit numerous times by crawlers on a page that triggers an API call. I would like to limit access to that page for bots that do not respect my robots.txt.
Note: This question is not a duplicate; I want rate limiting, not IP blacklisting.
Rate limiting runs within the application rather than on the web server itself. Typically it works by tracking the IP addresses that requests come from and how much time elapses between each request.
A rate-based rule tracks the rate of requests for each originating IP address, and triggers the rule action on IPs with rates that go over a limit. You set the limit as the number of requests per 5-minute time span.
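To make the idea concrete, here is a bare-bones, in-memory sketch of that rule: count requests per IP in a fixed 5-minute window and flag IPs that go over the limit. The limit value is a placeholder, not something from the question.

```ruby
LIMIT  = 100     # max requests allowed per window (placeholder)
WINDOW = 5 * 60  # 5-minute window, in seconds

# Per-IP counters: each entry stores a count and when its window started.
counters = Hash.new { |h, k| h[k] = { count: 0, window_start: Time.now } }

def over_limit?(counters, ip, limit: LIMIT, window: WINDOW)
  entry = counters[ip]
  if Time.now - entry[:window_start] > window
    # Window expired: start a fresh count for this IP.
    entry[:count] = 0
    entry[:window_start] = Time.now
  end
  entry[:count] += 1
  entry[:count] > limit
end
```

In a real deployment the counters would live in a shared store (see the Redis approach below) rather than in process memory.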
Check out the gem: Rack::Attack
Battle-tested in production environments.
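As an illustration, a throttle rule along these lines would cap traffic to the page that triggers the API call. The path, limit, and period are placeholders, not values from the question.

```ruby
# config/initializers/rack_attack.rb
# Minimal Rack::Attack sketch; "/expensive_page" and the numbers are examples only.
Rack::Attack.throttle("expensive_page/ip", limit: 5, period: 5.minutes) do |req|
  # Returning the IP makes it the throttle discriminator;
  # returning nil skips throttling for this request.
  req.ip if req.path == "/expensive_page"
end
```

Depending on your Rails and Rack::Attack versions, you may also need to add the middleware to the stack yourself; check the gem's README for the setup that matches your app.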
If you are using Redis in your project, you can very easily implement a request counter for your API. This approach lets you not only limit robot access, but also throttle different API requests with different policies based on your preferences. Take a look at this gem, or follow this guide if you want to implement the limit yourself. A rough sketch of the do-it-yourself version is below.
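Here is a rough sketch of a Redis-backed counter that applies different limits to different API paths. The `redis` gem calls (`INCR`, `EXPIRE`) are standard, but the policy table, key format, and numbers are illustrative assumptions.

```ruby
require "redis"

REDIS = Redis.new

# Hypothetical per-endpoint policies: [max requests, window in seconds]
POLICIES = {
  "/api/search" => [30, 60],    # stricter: 30 requests per minute
  "/api/items"  => [300, 300]   # looser: 300 requests per 5 minutes
}.freeze

def over_limit?(path, ip)
  limit, window = POLICIES[path]
  return false unless limit               # no policy defined for this path

  key = "throttle:#{path}:#{ip}"
  count = REDIS.incr(key)
  REDIS.expire(key, window) if count == 1 # start the window on the first hit
  count > limit
end
```

You could then call something like `head :too_many_requests if over_limit?(request.path, request.remote_ip)` in a `before_action` to reject bots that exceed the policy for a given endpoint.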