
How to limit the number of user requests made within a minute

Tags: url, php, limit, ip

A user will request a file by number via a URL like script.php?userid=222. This example would show the record of file #222.

Now I want to limit the number of files per (remote IP) user to a maximum of 5 different records in a minute. However, the user should be able to access the same record ID any number of times.

So the user could access file #222 any number of times, but if the (remote IP) user accesses more than 5 other different records in a minute, then it should show an error.

For example, suppose within a minute the following requests are made:

script.php?userid=222
script.php?userid=523
script.php?userid=665
script.php?userid=852
script.php?userid=132
script.php?userid=002

then at the last request it should show the error message.

Here is the basic code:

// Validate the input before using it
if (!isset($_GET['userid']) || empty($_GET['userid'])) {
    echo "Please enter the userid";
    die();
}
$id = $_GET['userid'];

// $ttime is the cache lifetime in hours (defined elsewhere)
if (file_exists($id.".txt") &&
        (filemtime($id.".txt") > (time() - 3600 * $ttime))) {
    $ffile = file_get_contents($id.".txt");
} else {
    $dcurl = curl_init();
    $fhandle = fopen($id.".txt", "w+");
    curl_setopt($dcurl, CURLOPT_URL, "http://remoteserver.com/data/$id");
    curl_setopt($dcurl, CURLOPT_RETURNTRANSFER, TRUE);
    curl_setopt($dcurl, CURLOPT_HTTP_VERSION, CURL_HTTP_VERSION_1_0);
    curl_setopt($dcurl, CURLOPT_TIMEOUT, 50);
    curl_setopt($dcurl, CURLOPT_FILE, $fhandle);
    curl_exec($dcurl);
    if (curl_errno($dcurl)) { // check for execution errors
        echo 'Script error: ' . curl_error($dcurl);
        exit;
    }
    curl_close($dcurl);
    fclose($fhandle);
    $ffile = file_get_contents($id.".txt");
}
asked Dec 26 '15 by smallbee


1 Answer

Instead of relying on the IP address, you could use the session mechanism. You can create a session scope via session_start(), and then store information that sticks with the same user session.

I would then suggest keeping in this session scope the list of unique IDs the user requested previously, together with the time of each request, ignoring repeated requests, which are always allowed. As soon as this list contains 5 entries with a timestamp within the last minute and a new ID is requested, you show the error and refuse the lookup.

Here is the code that does this. You should place it right after you have checked the presence of the userid argument, and before the retrieval of the file contents:

// set the variables that define the limits:
$min_time = 60; // seconds
$max_requests = 5;

// Make sure we have a session scope
session_start();

// Create our requests array in session scope if it does not yet exist
if (!isset($_SESSION['requests'])) {
    $_SESSION['requests'] = [];
}

// Create a shortcut variable for this array (just for shorter & faster code)
$requests = &$_SESSION['requests'];

$countRecent = 0;
$repeat = false;
foreach($requests as $request) {
    // See if the current request was made before
    if ($request["userid"] == $id) {
        $repeat = true;
    }
    // Count (only) new requests made in last minute
    if ($request["time"] >= time() - $min_time) {
        $countRecent++;
    }
}

// Only if this is a new request...
if (!$repeat) {
    // Check if limit is crossed.
    // NB: Refused requests are not added to the log.
    if ($countRecent >= $max_requests) {
        die("Too many new ID requests in a short time");
    }   
    // Add current request to the log.
    $countRecent++;
    $requests[] = ["time" => time(), "userid" => $id];
}

// Debugging code, can be removed later:
echo count($requests) . " unique ID requests, of which $countRecent in last minute.<br>";

// if execution gets here, then proceed with file content lookup as you have it.

Deleted session cookies...

Sessions are maintained by cookies on the client. The user may delete such cookies and so get a new session, which would allow them to make new requests regardless of what was previously requested.

One way to get around this is to introduce a cool-down period for each new session. For instance, you could make the user wait 10 seconds before a first request can be made. To do that, replace this part of the above code:

if (!isset($_SESSION['requests'])) {
    $_SESSION['requests'] = [];
}

with:

$initial_delay = 10; // 10 seconds delay for new sessions
if (!isset($_SESSION['requests'])) {
    $_SESSION['requests'] = array_fill(0, $max_requests,
        ["userid" => 0, "time" => time()-$min_time+$initial_delay] 
    );
}

This is of course less user friendly, as it affects every new session, including those of users who are not trying to cheat by deleting cookies.

Registration

A better way is to allow the lookup service only to registered users. For this you must provide a user database and an authentication system (for example, password based). The requests should be logged in the database, keyed by the user's account ID. When a new session starts, the user must authenticate again, and once authenticated the request history is retrieved from the database. This way the user cannot cheat by changing something in their client configuration (IP address, cookies, using multiple devices in parallel, ...). A minimal sketch of such a database-backed log follows.
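As an illustration only, here is a sketch of how the same limit could be enforced against a database instead of the session. It assumes a PDO connection in $pdo, a hypothetical request_log table with account_id, userid and requested_at (Unix timestamp) columns, and an $accountId value provided by your authentication layer; adapt the names to your own schema:

// Hypothetical names: $pdo (PDO connection), request_log table, $accountId from your auth layer
$min_time     = 60; // seconds
$max_requests = 5;

// Repeated requests for the same ID are always allowed
$stmt = $pdo->prepare(
    "SELECT COUNT(*) FROM request_log WHERE account_id = ? AND userid = ?");
$stmt->execute([$accountId, $id]);
$repeat = $stmt->fetchColumn() > 0;

if (!$repeat) {
    // Count the distinct new IDs this account requested in the last minute
    $stmt = $pdo->prepare(
        "SELECT COUNT(DISTINCT userid) FROM request_log
         WHERE account_id = ? AND requested_at > ?");
    $stmt->execute([$accountId, time() - $min_time]);
    if ($stmt->fetchColumn() >= $max_requests) {
        die("Too many new ID requests in a short time");
    }
    // Log the new request
    $stmt = $pdo->prepare(
        "INSERT INTO request_log (account_id, userid, requested_at) VALUES (?, ?, ?)");
    $stmt->execute([$accountId, $id, time()]);
}
// If execution gets here, proceed with the file content lookup as before.

Since this log lives on the server and is keyed by the authenticated account, clearing cookies or switching devices no longer resets the counter.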

answered Oct 19 '22 by trincot