Always running process - php

Is there any way to ensure that some process is always running?
Let's say I need this to be running:
php -f myscript.php -param1=value1
Now I do it this way:
The process is launched and, right after that, its PID is written to a file (myProcess.pid).
Then I schedule a cron job which tries to launch the process again every 5 minutes.
The launcher is actually a bash script which first checks whether myProcess.pid exists and, if it does, whether the PID in that file is really running. If it is not, the script launches the process and rewrites myProcess.pid with the new PID.
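The wrapper described above can be sketched roughly like this (the pid file path and the command are illustrative and should be adjusted to your setup):

```shell
#!/bin/sh
# Illustrative paths/command -- adjust to your installation.
PIDFILE=/tmp/myProcess.pid
CMD="php -f myscript.php -param1=value1"

# If the pid file exists and that PID is still alive, do nothing.
if [ -f "$PIDFILE" ] && kill -0 "$(cat "$PIDFILE")" 2>/dev/null; then
    exit 0
fi

# Not running (or the pid file is stale): relaunch and record the new PID.
$CMD >/dev/null 2>&1 &
echo $! > "$PIDFILE"
```

Note that `kill -0` only checks whether *some* process with that PID exists, which is exactly why the PID-reuse problem listed below can bite.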
There are several problems with that solution:
What about the "blackout" period between cron checks?
And what about the interval itself? Isn't checking every 5 minutes an unnecessary use of CPU time?
What if the file myProcess.pid is corrupted, or simply deleted by someone or something? Then the bash script launches the process again, even if it is already running.
What if the process dies and its old PID is reused by another process?
Does any better approach exist?

You'll need some kind of monitoring application like
http://supervisord.org/
http://blogs.nopcode.org/brainstorm/2011/04/21/supervisord-one-process-to-rule-them-all/
Supervisor is a client/server system that allows its users to monitor and control a number of processes on UNIX-like operating systems.
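For example, a minimal supervisord program section for the command above might look like this (program name and paths are illustrative):

```ini
[program:myscript]
command=php -f /path/to/myscript.php -param1=value1
autostart=true
autorestart=true
stdout_logfile=/var/log/myscript.out.log
stderr_logfile=/var/log/myscript.err.log
```

With `autorestart=true`, supervisord itself restarts the process when it dies, which sidesteps the pid-file and cron-interval problems entirely.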
Alternatively, you can create some daemonized php code:
http://simas.posterous.com/writing-a-php-daemon-application
http://kevin.vanzonneveld.net/techblog/article/create_daemons_in_php/

Related

using exec for long running scripts

I want to get some data from an API and save it for a user in the database. This action takes a varying amount of time, sometimes even 4 hours.
I am executing the script from PHP using exec and &, so it runs in the background.
My question is: is exec safe for long-running jobs? I don't know much about fork, Linux processes, etc., so I don't know what happens internally on the CPU cores.
Here is something I found that confused me,
http://symcbean.blogspot.com/2010/02/php-and-long-running-processes.html
Can somebody tell me if I am going in the right direction with exec?
Will the process be killed by itself after the script completes?
Thanks
Well, that article is talking about process "trees" and how a child process depends on its spawning parent.
The PHP instance starts a child process (through exec or similar). If it doesn't wait for the process output, the PHP script ends (and the response is sent to the browser, for instance), but the Apache process sits idle, waiting for its child process to finish.
The problem with this is that the child process (the long-running, 4-hour job) is not guaranteed to finish its work before Apache decides to kill its parent process (because there are too many idle processes), effectively killing the child along with it.
The article's author then suggests using a daemon and separating the child process from the parent process.
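One common way to do that separation from the shell (the setsid approach and the script path shown here are illustrative, not the article's exact method) is to start the job in its own session with its stdio detached:

```shell
# Run the long job in a new session, detached from the controlling
# terminal and with stdio redirected, so reaping the parent (e.g. by
# Apache) cannot take the child down with it.
setsid php /path/to/longThing.php < /dev/null > /dev/null 2>&1 &
```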
Edit:
Answering the question you left in the comments, here's a quick explanation of the command he uses in the article
echo /usr/bin/php -q longThing.php | at now
Starting from left to right.
echo prints to standard output (STDOUT) whatever you put after it, so...
echo /usr/bin/php -q longThing.php will print /usr/bin/php -q longThing.php to the shell.
| (the pipe) feeds the STDOUT of the previous command directly into the standard input (STDIN) of the next command.
at reads commands from STDIN and executes them at a specified time. at now means the command will be executed immediately.
So basically this is the same thing as running the following sequence in the shell:
at now - Opens the at prompt
/usr/bin/php -q longThing.php - The command we want to run
^D (by pressing Control+D) - To save the job
So, regarding your questions:
Will the child process be immediately killed after the PARENT PHP script ends?
No.
Will the child process be killed at all, in some future moment?
Yes. Apache takes care of that for you.
Will the child process finish its job before being killed?
Maybe. Maybe not. Apache might kill it before it's done. The odds of that happening increase with the number of idle processes and with the time the process takes to finish.
Sidenote:
I think the article does point in the right direction, but I dislike the idea of spawning processes directly from PHP. In fact, PHP does not have the appropriate tools for running long and/or intensive background work; with PHP alone, you have little to no control over it.
I can, however, give you the solution we found for a similar problem I faced a while ago. We created a small program that would accept and queue data processing requests (about 5 mins long) and report back when the request was finished. That way we could control how many processes could be running at the same time, memory usage, number of requests by the same user, etc...
The program was actually hosted in another LAN server, which prevented memory usage spikes slowing down the webserver.
At the front end, the user was informed through long polling when the request had completed.

How can I create, monitor and run/stop a process in linux

I have a cron job which runs every minute; it queues every request from the past minute and executes the tasks one after another. Instead, I want to run a background process that runs indefinitely, checks whether any new request has come in, and processes it immediately.
do {
    // do my stuff
} while (true);
I need to know the command to check whether the process is running: if it is not, start it; otherwise do nothing.
FYI - I'm not a Linux guy and don't know anything about bash or shell. I need PHP code which I can add to the every-minute cron job to monitor whether this process is running.
What you are looking for is service control and/or a watchdog. You can use D. J. Bernstein's daemontools or similar software.
Also, if you want to do it in PHP, you could, inside the start part of your daemon (which is what you are building), raise a flag (create a file), and then, from a cron job that runs every N minutes, run another program that checks whether the flag is raised (the file exists).
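If pgrep is available on the server, the cron-side check can also be a short shell snippet along these lines (the worker script name is a placeholder); the same commands could equally be shelled out from a PHP cron script via exec():

```shell
# Start the worker only if no instance is already running.
# "worker.php" is a placeholder name -- adjust the pattern and path.
if ! pgrep -f "php .*worker\.php" > /dev/null; then
    nohup php /path/to/worker.php > /dev/null 2>&1 &
fi
```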

Running a PHP script completely on server side

I'm having a problem where PuTTY keeps getting disconnected, so when I run a PHP script from the terminal it always gets interrupted. The script is supposed to run for several hours, so I'm not having any luck with it.
How can I run this completely on the server side? I'm reading about cron jobs, but I'm having a hard time understanding them at the moment. Is there any alternative to cron for what I need?
I have several PHP script files that need to be run, one by one, or perhaps two at a time. Any ideas?
You don't need to run it in a cron job - you can just run the PHP script inside a screen.
Simply type:
screen php /path/to/myphpscript.php
A screen will continue running even after you disconnect from PuTTY. If you need to check up on it, you can use:
screen -r
to re-attach yourself to the process and view any output.
You need to prevent the process from terminating when the session disconnects.
Something like this would work:
nohup php myscript.php
You can create a cron job to start the PHP script periodically based on a list of timed tasks. More info. You could also start the task in the background from the console, i.e. php script.php & - this would make the script a background task.
Take a look at GNU Screen; it allows you to detach and reattach a session later, which is perfect for long-running scripts. Cron is a good option if you want it to happen in a recurring fashion; one-off batch jobs can be scheduled with something like at. For more intense computing needs, you might want to look into a more full-fledged job scheduling system like TORQUE.
You can run your program in the background:
php ./yourscript.php &

Monitoring php Scripts through Gearman

I am trying to run my PHP scripts as Gearman worker code, but I also want to monitor them; if they take longer than the expected run time, I want to kill those scripts. Each script has to run on a schedule (say every 10 minutes); the Gearman client picks the scripts that are ready to run and sends them to a Gearman worker.
I tried using the following options :
1) I tried using an independent script - a normal PHP script that monitors the running process.
But this normal script cannot inform Gearman that the job was killed, so Gearman thinks the killed job is still running.
That made me think I have to synchronize the monitoring and the running of the PHP scripts within the same worker.
These jobs also need to be restarted, and the client takes care of that.
2) I am running my PHP scripts with the following command:
cd /home/amehrotra/include/core/background; php $workload;
(This is blocking: it does not go on to the next line until the script finishes executing.)
I tried using exec, but exec does not execute the scripts:
exec ("/usr/bin/php /home/amehrotra/include/core/background/$workload >/dev/null &");
3) I tried running 2 workers, one for running the PHP script and another for monitoring, but the Gearman client does not connect to two workers.
Not the coolest plan, but try using a database as the central place from which everything is controlled.
It will take some resources and time from your workers, but that is the cost of making it manageable.
Each worker will need to check for commands (stop/restart) assigned to it via the db, and it can also save some data into the db so you can see what is happening.

long time cron job via wget/curl?

I am working on cron jobs for my PHP app and planning to run them via wget/curl.
Some of my PHP cron jobs can take 2-3 hours. How do I let PHP work for 2-3 hours from the crontab? Is it good practice to run such long jobs via cron with wget/curl? Does anyone have experience with this? I also have an email queue that needs to be run every 10 seconds, but crontab is minute-level. Any suggestions for this case?
Thanks for reading.
When you use wget/curl, you are requesting a page from a webserver. Every webserver has a timeout period, so the request might time out.
Also, some hosting providers may stop processes running beyond a certain number of minutes (basically done to control rogue threads).
So it is not advisable to schedule jobs using wget/curl if they take more than a few minutes.
Try scheduling it with an actual scheduler. You can run PHP from the command line:
php [options] [-f] <file> [--] [args...]
The php command should be on the path.
You can use the following at the start of your script to tell PHP to effectively never time out:
set_time_limit(0);
What may be wiser is, if crontab is going to run the script every 24 hours, to set the timeout to 24 hours so that you don't get two copies running at the same time:
set_time_limit(24*60*60);
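Another way to guarantee that two copies never overlap, independent of PHP's timeout, is to wrap the cron command in flock; a crontab entry along these lines (every 5 minutes here; lock file and script paths are illustrative) would be:

```shell
# -n makes flock give up immediately if the previous run still holds
# the lock, so an overlapping cron run becomes a harmless no-op.
*/5 * * * * flock -n /tmp/myjob.lock php -f /path/to/script.php
```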
Crontab only allows minute-level execution because that's the most often you should really be launching a script - there are startup/shutdown costs that make more rapid scheduling inefficient.
If your application requires a queue to be checked every 10 seconds, a better strategy might be to have a single long-running script that does the checking and uses sleep() occasionally to stop itself from hogging system resources.
On a UNIX system such a script should really run as a daemon - have a look at the PEAR System_Daemon package to see how that can be accomplished simply.
I've just run into this problem with a long-running PHP script executed via cron and wget: the script terminates after 15 minutes due to the default timeout in wget.
However, I believe this method does in fact work if you set up the correct timeouts. The PHP script you run needs an unlimited run time; best to set this at the start of your long-running script:
set_time_limit(0);
When using wget you also need to remove its timeout by passing -T 0, e.g.
wget -q -T 0 http://yourhost/job.php
Be very careful not to overload your server with long running scripts.
If using wget to call the long-running cron job, consider some of its defaults that aren't desirable in this situation. By default it will retry up to 20 times (--tries=20) if no data is read within --read-timeout (900 seconds). So set tries to 1 to ensure the cron job is only called once, since you of course wouldn't want a long-running script called multiple times while it's still running.
wget -t 1 https://url/
If you're on cPanel and don't care about the output emails after each cron run, also use the -O and -q flags to hide the output, and redirect errors with 2>&1:
wget -O /dev/null -q -t 1 https://url/ 2>&1
