I created an email queue table in my database. My PHP application inserts every email it needs to send into this table.
Another PHP script then looks for all unsent emails and sends them.
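Simplified, the sending script does something like this (table and column names here are just placeholders):
<?php
// Simplified sketch of the sender script (placeholder table and column names)
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'password');
$mails = $pdo->query("SELECT id, recipient, subject, body FROM email_queue WHERE sent = 0");
foreach ($mails as $mail) {
    if (mail($mail['recipient'], $mail['subject'], $mail['body'])) {
        $mark = $pdo->prepare("UPDATE email_queue SET sent = 1 WHERE id = ?");
        $mark->execute([$mail['id']]);
    }
}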
I run this script via cron. Unfortunately, cron jobs can run at most once per minute, so in the worst case a user has to wait a full minute before their email is actually sent.
My current idea for a workaround is to call the script with an additional sleep parameter and duplicate the cron entries.
Example:
* * * * * curl emails.php?sleep=0 >/dev/null 2>&1
* * * * * curl emails.php?sleep=10 >/dev/null 2>&1
* * * * * curl emails.php?sleep=20 >/dev/null 2>&1
* * * * * curl emails.php?sleep=30 >/dev/null 2>&1
* * * * * curl emails.php?sleep=40 >/dev/null 2>&1
* * * * * curl emails.php?sleep=50 >/dev/null 2>&1
In the above example the script would effectively run every 10 seconds. The first line of the emails.php script would then be:
sleep($_REQUEST['sleep']);
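A slightly more defensive version of that entry point could look like this (the validation is just a suggestion, not part of my current script):
<?php
// emails.php - optional stagger delay passed in by the duplicated cron entries
$delay = isset($_REQUEST['sleep']) ? (int) $_REQUEST['sleep'] : 0;
$delay = max(0, min(59, $delay)); // never sleep past the current minute
sleep($delay);
// ... then fetch unsent emails from the queue and send them ...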
Alternatively, this can be handled at script level: let cron start the script once per minute and loop inside it.
<?php
// cron.php - started by cron once per minute, runs the logic every 10 seconds
$expireTime = time() + 60;
while (time() < $expireTime) {
    // my php logic here
    sleep(10); // sleep for 10 seconds; change this value to change the frequency
}
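With this approach a single crontab entry is enough; the path here is just an example:
* * * * * php /path/to/cron.php >/dev/null 2>&1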
Here's a simple bash script I've written that can be used with crontab to run a command more frequently than once per minute.
You can save it as ~/bin/runEvery.sh and then add something like this to your crontab to run otherScript.sh every 5 seconds:
*/1 * * * * ~/bin/runEvery.sh 5 otherScript.sh
This is the script:
#!/bin/bash
# runEvery.sh - repeat a command every N seconds for one minute
inputPeriod=$1
runCommand=$2
RUN_TIME=60
error="no"
if [ -n "$runCommand" ]
then
    if [ -n "$inputPeriod" ]
    then
        # how many runs fit into one minute
        loops=$(( RUN_TIME / inputPeriod ))
        if [ "$loops" -eq 0 ]
        then
            loops=1
        fi
        for i in $(seq 1 "$loops")
        do
            $runCommand
            sleep "$inputPeriod"
        done
    else
        error="yes"
    fi
else
    error="yes"
fi
if [ "$error" = "yes" ]
then
    echo "runEvery - runs a command every X seconds for a minute"
    echo "Usage: runEvery.sh <period in seconds, < 60> <command to run>"
fi