I am running a crawler programmed in PHP every hour with a cron job. When everything goes as expected, the script quits automatically. However, for some reason, it sometimes gets stuck in an infinite loop. It gets worse because I use a lock file to avoid duplicate runs: when the crawler gets stuck, it never runs again until I kill it manually (ps aux -> kill).
How can I make sure that the script ends after a couple of hours whatever happens?
Should I add a line in the php code? Wouldn't it be more robust to do that directly in Linux?
The best idea that I have so far is to make a small batch file with all the necessary commands and then invoke that batch with cron instead of the php script directly.
Am I right, and what should the commands be?
Thanks
Edit: the best I found so far is: http://www.linuxquestions.org/questions/linux-general-1/how-to-kill-the-process-after-specific-time-624453
The bash script there is far too long; I was hoping for a smarter, shorter solution.
Cheers
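A shorter route, as a sketch: wrap the crawler in a small shell script that uses flock for the locking (replacing the hand-rolled lock file) and GNU coreutils' timeout to enforce the hard limit, then point cron at the wrapper instead of the PHP script. The paths and the two-hour limit are assumptions:

#!/bin/sh
# Hypothetical wrapper, e.g. /usr/local/bin/run-crawler.sh
# flock -n exits immediately if the previous run still holds the lock;
# timeout sends SIGTERM after two hours and, via -k, SIGKILL 30 seconds
# later if the crawler ignores the first signal.
exec flock -n /tmp/crawler.lock \
    timeout -k 30 2h php /path/to/crawler.php

With this in place a stuck crawler can neither outlive the limit nor block the next hourly run.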
This kills any php processes that were started more than an hour ago:
ps -eo comm,pid,etimes | awk '/^php/ && $3 > 3600 {print $2}' | xargs -r kill
3600 is the age threshold in seconds (one hour); the etimes column is the elapsed time since each process started.
P.S. You can run
ps -eo comm,pid,etimes
before and after to confirm that everything worked.
P.P.S. I know this is an old question, but someone might find it helpful.
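To automate the sweep, a crontab entry along these lines (a sketch; adjust the threshold to taste) would run it hourly:

# crontab: once an hour, kill php processes older than an hour
0 * * * * ps -eo comm,pid,etimes | awk '/^php/ && $3 > 3600 {print $2}' | xargs -r kill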
I am not very good with bash or shell scripting.
I would like to know if what I want to do is possible.
I have a big request to process in PHP. When I launch it, the server times out (and I can't extend the timeout).
I had the idea of splitting the PHP work into iterations: every time I reload my page, the script processes another chunk of a JSON file.
I want to know if I can use a script with cron to run my PHP file for as long as it has iterations left to perform, and if I can use a response from my PHP in my bash script to stop it when everything is finished.
I apologize for my English, thank you for your time.
I have searched the web several times, but I can't find an answer about driving a PHP file from a bash script with an iterator.
Thanks in advance,
Thanks to Andrea Olivato:
If it's just a timeout problem and your script would otherwise work, just run it from the command line: php script.php. Assuming you have a non-infinite loop in your script, it will automatically stop once it finishes everything.
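To the second part of the question: yes, bash can read a response from PHP and stop once everything is finished. A minimal sketch, assuming a hypothetical process_iteration.php that handles one chunk of the JSON file per run and prints done when nothing is left to do:

#!/bin/sh
# Re-run the PHP worker until it reports completion.
while true; do
    result=$(php process_iteration.php)
    [ "$result" = "done" ] && break
    sleep 1  # small pause between iterations
done

Because this runs PHP from the command line, it is not subject to the web server's timeout, which is the point of the quoted answer.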
I am trying to start a Linux shell script from PHP 5 that will run for 24 hours, but I want the web page to return within seconds. I thought this could be solved by having the script spawn off the task, but it does not seem to work.
I have been searching around for a solution or a "one shot / fire and forget" option for a couple of days without any luck.
The following example shows the problem.
In PHP 5 I make one of the following calls (I have tried a lot at this point):
passthru("dummy_script.sh");
or
system("dummy_script.sh");
or
shell_exec("dummy_script.sh");
The dummy script looks like the following:
#!/bin/sh
{
while true
do
sleep 1
done
} &
I can see that the process gets started, but the web page does not return until I run 'killall dummy_script.sh'. If I run the script manually in a terminal, it returns immediately and spawns off the loop.
Does anyone know a way I can spawn off the task without making the web page wait for it?
Hope you guys can help me out, it would be most appreciated.
To answer your question:
You may start by looking at pcntl_fork, or you may check this. Basically, you use the native fork to fork off the long-running process so your PHP frontend does not have to wait.
If you're feeling adventurous, you may put your "job" (your request to this long-running process) in a DB. A cron job then checks the DB for incoming requests, and it is the one that executes the process.
Another method is to use resque, but don't bother at this point.
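One detail worth adding for the question above: the page usually hangs because the spawned script inherits the web server's output pipes, and PHP waits for them to close. Redirecting everything to /dev/null and backgrounding the command typically lets the page return at once. A sketch of the command PHP would execute via shell_exec(), using the question's own script name:

# Fully detached: no pipe back to the web server is left open.
nohup dummy_script.sh > /dev/null 2>&1 &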
I'm creating a web service for an Android app in PHP with MySQL. I want to continuously check whether any data is available. I have no idea how to fetch data as a background process. How can I execute a query without any request, i.e. without calling a file?
I searched and found some code like:
$command = "php -d max_execution_time=50 -f myfile.php '".$param."' >/dev/null &";
exec($command);
But where should I put this code so this query will run continuously?
Yes, the ampersand trick will work. You can use something like supervisord to restart it every few hours, so that any memory leaks are dealt with. This also makes it less fragile if it crashes or hangs.
Also, you can use something like cron to run the task for 10 minutes, then die off and wait for cron to start it again; bear in mind that with most background tasks it doesn't matter if there's a short period when the task is not running, since it will catch up. It's worth checking in each run whether the previous one is still running, and exiting early if it is: that way you don't have two background tasks causing race conditions when retrieving work from your database.
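For the check-whether-it-is-still-running part, flock can do the overlap test for you. A crontab sketch, with the lock path and script path as assumptions:

# Every 10 minutes; flock -n skips the run if the lock is still held
*/10 * * * * flock -n /tmp/myfile.lock php -f /path/to/myfile.php >/dev/null 2>&1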
Finally you can use a job server, such as Gearman. This will allow you to send tasks to it in an asynchronous fashion, and they will be run by worker tasks (in either time or priority order). This is probably the most reliable approach, but it takes a bit more work to set up. There's a PHP module for this, but in my experience it's more of a hassle to use than Net_Gearman, which is available in PEAR.
I need to write a server-side program that lives on the server, and is checking a database consistently for new entries.
When a new entry shows up in the database, the program should process the data and put the results somewhere else.
It is important to highlight that the process isn't instigated by new entries showing up, but by the program checking for new entries on its own.
Some people I've spoken to brought up cron jobs, and I was curious whether this is the solution for me. I see that it has limitations: it won't run more often than once a minute. I was hoping for the program to run every 5 seconds; would I be better off writing a shell script, or is that a bootleg fix?
I'm not sure if this is conventional (?) but...
Use a database trigger on INSERT that runs an external program (PHP, Python, whatever). Which database are you using? I think this post is old but it might be of help: http://crazytechthoughts.blogspot.co.uk/2011/12/call-external-program-from-mysql.html
There is a technique I've frequently used when dealing with queues that need processing.
#!/bin/sh
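# Handle one batch of DB work, pause briefly, then restart this
# script in place so it runs forever.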
php -f checkDBAndAct.php
sleep 5
exec $0
The exec $0 part starts the script running again, replacing itself in memory, so it will run forever without issues. Any memory the PHP script uses is cleaned up whenever it exits, so that's not a problem either.
A simple line will start it, and put it into the background:
cd /x/y/z ; nohup ./loopToProcessDB.sh &
or it can similarly be started when the machine boots, by various means (such as cron's '@reboot ....')
-- from https://stackoverflow.com/a/2686100/6216
An extended version is on http://PHPscaling.com and https://gist.github.com/alister/1386212
Though I'd use an actual queue system, rather than a DB, as there are a number of downsides to bending a database to this task.
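For completeness, the @reboot route mentioned above could look like this in a crontab, reusing the same assumed path:

# crontab: start the polling loop once at boot
@reboot cd /x/y/z && ./loopToProcessDB.sh >/dev/null 2>&1 &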
Is there a way to make PHP run forever without cron?
What I want it for is to unban users after a few hours by running a MySQL query. Thanks.
If you don't have access to cron jobs on your server (I guess you are running on shared hosting?), the best alternative is to run an "external cron". Have a look at www.setcronjob.com. I have been using this for a couple of months now and it is pretty stable.
You can set it up such that it calls a script on your website whenever you want (example: http://www.yoursite.com/script.xxx).
In the script, you can run a MySQL query to check which users have been banned for a couple of hours and then unban them.
You can start your script from the command line and let it run in the background. You will have to design the script so that it never exits, looping forever and using the sleep() function to avoid unnecessary processor load. Since PHP scripts invoked from the command line have no max execution time, the script will run until you kill it manually with the kill command.
Once you've written the script you can start it with:
nohup php myscript.php &
nohup keeps the script running after you log out of the console session you started it from; otherwise it would be killed at that point. The & at the end starts the script as a new process in the background so that you can continue using the console.
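If you would rather keep the loop out of PHP entirely, the same never-exiting pattern works one level down as a shell script started the same way with nohup. A sketch; the credentials, database, table, and column names are placeholder assumptions:

#!/bin/sh
# Clear expired bans every 10 minutes, forever.
while true; do
    mysql -u dbuser -p'dbpass' mydb -e \
        "UPDATE users SET banned = 0 WHERE banned = 1 AND ban_expires <= NOW();"
    sleep 600
done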