cron jobs bring down the server? - php

I am using the paid host Hosting24 to run my website. I have a cron job that executes the following code every minute:
<?php
require_once('connect.php');

// Loop for roughly one minute: twelve 5-second iterations.
for ($c = 0; $c < 60; $c += 5) {
    // PHP-to-MySQL queries: SELECT / UPDATE / INSERT etc...
    sleep(5);
}

mysql_close($my_connection);
?>
I used the for loop to make the script run for one minute. Eventually the script should run for as long as I want, because the server starts a new copy every minute.
However, after I had opened my website for a short while, I could no longer connect to it. I cannot even access my cPanel.
Is my cron job script overloading the server, so that the server goes down?
How should I set up my cron job script so that it runs every minute and lasts for one minute?
Thanks.

It's been my experience that cron jobs that need to include files should use the full path to those files (the CLI environment can differ greatly from the environment inside the web server). Try that and see if it helps.
If not, the next thing to do is turn the cron job off and run the script from the CLI yourself, using top to watch the process usage. See how long your cron job takes to run.
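For example, a minimal sketch (the paths are assumptions; substitute your own account's layout):
<?php
// Resolve connect.php relative to this script instead of cron's working directory.
require_once(__DIR__ . '/connect.php');
?>
It also helps to call the PHP binary and the script by full path in the crontab itself, along these lines:
* * * * * /usr/bin/php /home/user/public_html/cronjob.php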

Related

cron job stopped working after 5 minutes

I am trying to insert several rows of data from one database server to another using a PHP script.
Every time, the cron job stops execution after exactly 5 minutes, even though I have already put these
set_time_limit(0);
ini_set('mysql.connect_timeout', '0');
ini_set('max_execution_time', '0');
settings in the PHP script.
Also, where can I find the cron job log on an nginx server?
Cron jobs typically don't have access to your environment variables, such as $PATH. Possibly it cannot find the wget binary, and thus the cron job fails.
Try replacing 'wget' in your cron job with the fully qualified path, such as /usr/bin/wget.
To find the location of wget on your system, use the command 'which':
which wget
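A hypothetical crontab entry using the full path might look like this (the URL and schedule are placeholders for your own):
*/5 * * * * /usr/bin/wget -q -O /dev/null http://example.com/cron.php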

PHP - Start a cron job manually / let the server work instead of the user

I have a quite slow script that needs to be executed frequently (about 30 times in a minute), so the user can't be the one to execute it, and a cron job runs at most once every minute.
So, is there a way (using PHP) to let the server do the work instead of the user?
This is quite easy: use a flag file.
Script running without user interaction (may be started by cron or a shell, including a PHP shell execute):
<?php
while (true) {
    // Wait as long as the flag file exists; the trigger script below deletes it.
    while (file_exists('/path/to/flagfile')) sleep(1); // Could even use usleep()
    include('/path/to/worker/script');
    // Re-create the flag so we idle until the next trigger.
    touch('/path/to/flagfile');
}
?>
Script to trigger it (started from the webserver via user interaction):
<?php
// Removing the flag file releases the worker loop above.
// The @ suppresses the warning if the flag is already gone.
@unlink('/path/to/flagfile');
echo "Processing triggered!";
?>

bash script timeout

I have an issue that stops my (slow) process. I start my slow background process from a PHP page with a button, as follows:
<form id="trial" method="post" action=""><input name="trial" value="Start!" type="submit"></form>
<?php
set_time_limit(0);
if (isset($_POST['trial'])) {
    system("/srv/www/cgi-bin/myscript.sh");
}
?>
At some point, after about 1.5 days, the process stops. I have modified php.ini and the Apache config file, inserting a very high number in the timeout directive, but it seems it does not work, or there is some other process that is stopping myscript.sh. Do you have any suggestions?
Thanks!
I'm assuming you have access to the server via SSH, based on your post.
If the real goal is to get your script to run continuously, why not log in and run
nohup myscript.sh
As long as your script behaves, it will continue to run for as long as it needs to after you close the terminal.
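A slightly fuller invocation might be (the log path is an assumption; the trailing & puts the script in the background so you get your prompt back):
nohup /srv/www/cgi-bin/myscript.sh > /tmp/myscript.log 2>&1 &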
Check the Logs
To determine why your script is failing, you'll definitely want to check /var/log/kern.log and /var/log/syslog. Look for any entries containing your script or any of its children. Your script may be getting killed off by the kernel (for exceeding limits) or erroring out at runtime.
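For example, a quick search over both logs (using the script name from the question):
grep -i myscript /var/log/syslog /var/log/kern.log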
Running the script continuously can cause problems, so set up cron to run it every 30 minutes instead:
set_time_limit(30);
system("/srv/www/cgi-bin/myscript.sh");
Cron setup:
*/30 * * * * php /path/to/your/php/file.php

php script that runs continuously and concurrently with a cron job

I have 3 scripts that do some stuff.
I want to run them continuously and concurrently.
Let's say, for example:
The first script takes 30 minutes to finish.
Second - 20 mins.
Third - 5 mins.
So I need every one of them to restart immediately after it finishes.
The 3 scripts UPDATE the same DB, but they need to work separately.
They can all run at once, but no script should be running in more than one copy at a time (my English sucks, sorry about that).
Let me explain what I mean with an example:
firstScript.php is running
secondScript.php is running
thirdScript.php is running
firstScript.php tries to start, but the previous copy is still running. Wait (till it finishes).
Maybe some shell script will do the job, but how?
Make a bash script that takes one argument, and have it do something like this:
#!/bin/bash
# Lock on the script's base name so the argument can be a full path.
LOCK="/tmp/$(basename "$1")"
if [ -f "$LOCK" ]
then
    echo "Script already running"
else
    touch "$LOCK"
    php "$1"
    rm "$LOCK"
fi
Set up a cron job to run this script and pass it the name of the PHP script you want to run.
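A hypothetical crontab for this, assuming the wrapper is saved as /usr/local/bin/run-once.sh and the scripts live in /var/www (adjust both paths):
* * * * * /usr/local/bin/run-once.sh /var/www/firstScript.php
* * * * * /usr/local/bin/run-once.sh /var/www/secondScript.php
* * * * * /usr/local/bin/run-once.sh /var/www/thirdScript.php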
You could execute a shell command just before the PHP script dies. Example:
<?php
$i = 0;
while ($i < 1000) {
    $i++; // ... the actual work happens here ...
}
// Relaunch this script just before the current run exits.
shell_exec("bin/php.exe some_script.php");
If you are working on a shared hosting account this might not work due to security restrictions.
Note that "bin/php.exe" needs to be edited for your server; point it to wherever your PHP binary is installed.

How to call drupal cron multiple times until done

I'm working in a WAMP development environment and testing how long it takes to get the site indexed. I'm doing this by running cron manually.
The problem is that if there are 700 jobs in the job_queue, each run of cron does only some of them, so I need to run cron several times. How can I keep calling cron in a loop until there are no more jobs left in the job_queue?
Also, I'm open to any Drush alternatives. I'm aware of drush cron, but this also does only some of the jobs on each run, so it needs to be run again manually.
If you want to run something all at once until it's done, cron is the wrong tool for the job; the Batch API is the tool for that. Unfortunately, the search module is written to update only on cron, but there's not a lot of code in search_cron to copy into a batch function. So I'd suggest going the batch route rather than wrapping some sort of pseudo-batch around cron, as your end goal doesn't seem to involve cron at all.
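If you do want a quick pseudo-batch wrapper anyway, a rough shell sketch might be (the job_queue table name comes from your question, and how drush sql-query formats its output can vary between versions, hence the tail):
#!/bin/bash
# Keep invoking cron until the job queue is empty.
while [ "$(drush sql-query "SELECT COUNT(*) FROM job_queue" | tail -1)" -gt 0 ]; do
    drush cron
done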
Step 1 - Run cron every minute.
Step 2 - Have the script check whether a copy of itself is already running; if so, exit the new copy.
<?php
$lid = '';
if (is_array($argv) && isset($argv[1])) {
    $lid = $argv[1];
}
if ($lid !== '') {
    $xx = array();
    // List processes and look for another copy of this script started with the same id.
    exec('ps x | grep "thefile.php ' . $lid . '" | grep -v "grep"', $xx);
    if (count($xx) > 1) {
        die('Script ' . $lid . ' running already... exiting... ' . "\n");
    }
}
Put these in cron, running every minute, to launch 3 scripts at the same time:
* * * * * php thefile.php 1
* * * * * php thefile.php 2
* * * * * php thefile.php 3
drush search-index will generate the remaining search index for you.
