PHP5, shell_exec waiting for spawned Linux shell tasks to finish

I am trying to start a Linux shell script from PHP5 that will run for 24 hours, but I want the webpage to return within seconds. I thought this could be solved by making a script that spawns off the task, but it does not seem to work.
I have been searching around for a solution or a "one shot / fire and forget" option for a couple of days without any luck.
The following example shows the problem.
In PHP 5 I make one of the following calls (I have tried a lot at this point):
passthru("dummy_script.sh");
or
system("dummy_script.sh");
or
shell_exec("dummy_script.sh");
The dummy script looks like the following:
#!/bin/sh
{
while true
do
sleep 1
done
} &
I can see that the process gets started, but the webpage does not return before I do a 'killall dummy_script.sh'. If I run the script manually in a terminal it returns immediately and spawns off the loop.
Does anyone know a way I can spawn off the task without making the webpage wait for it?
Hope you guys can help me out, it would be most appreciated.

To answer your question:
You may start looking at pcntl_fork. Or you may check this. Basically, you use the native fork to fork off the long-running process so your PHP frontend does not have to wait.
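A minimal sketch of that idea (assuming the pcntl and posix extensions are available; they are usually CLI-only, so treat this as an illustration rather than drop-in web code):

$pid = pcntl_fork();
if ($pid === -1) {
    die('could not fork');
} elseif ($pid === 0) {
    // child: detach into its own session, then run the long task with
    // output redirected so no open pipe keeps the parent waiting
    posix_setsid();
    exec('dummy_script.sh > /dev/null 2>&1');
    exit(0);
}
// parent: falls through here and finishes the request immediately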
If you're feeling adventurous, you may put your "job" (your request to this long-running process) in a DB. A cron job then checks the DB for incoming requests, and it is the one that executes the process.
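A rough sketch of that DB approach (the table and column names are made up for illustration):

// run from cron every minute; launches any pending requests
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
foreach ($pdo->query("SELECT id, command FROM jobs WHERE status = 'pending'") as $job) {
    $pdo->exec("UPDATE jobs SET status = 'started' WHERE id = " . (int) $job['id']);
    shell_exec($job['command'] . ' > /dev/null 2>&1 &'); // fire and forget
}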
Another method is to use resque, but don't bother at this point.

Hot reloading running PHP script?

My architecture is composed of the following :
A supervisor controller that makes sure n instances of the same PHP script are running (and restarts them when they are not).
The script is a while loop that waits for a job from Beanstalk and processes it when it arrives. After a certain (defined) number of loops, the script exits, in order to be reloaded by supervisorctl (I do that because PHP tends to be unstable in the long run).
When I push some changes, I have to wait for the "workers" to finish their expected number of loops before they restart and take the new changes into consideration.
Is there a way to speed this up?
Thank you for your help.
Maybe simply decoupling the loop from the code that ingests the jobs, plus using the process control extension, can help. Check this out: http://www.hackingwithphp.com/16/1/8/running-programs-in-the-current-process-space
(Sorry if I'm making wrong assumptions here, I don't have enough Karma to ask questions through comments)
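To illustrate the idea from that link, an untested sketch (the Beanstalk client call and the job handler are placeholders): after each job, the worker checks whether its own file has changed on disk, and if so replaces the running process via pcntl_exec, so a deploy takes effect on the very next job instead of after the remaining loops:

$loadedAt = filemtime(__FILE__);
while (true) {
    $job = $pheanstalk->reserve(); // placeholder Beanstalk client call
    handle_job($job);              // placeholder job handler
    clearstatcache(true, __FILE__);
    if (filemtime(__FILE__) !== $loadedAt) {
        // replace this process in-place with a freshly loaded copy
        // (PHP_BINARY needs PHP >= 5.4)
        pcntl_exec(PHP_BINARY, array(__FILE__));
    }
}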

start /B doesn't start the task

I'm currently launching an asynchronous job with PHP to perform some tests.
To make it work, I found some tips on SO, like the use of popen and start:
$commande = "testu.bat";
$pid = popen('start /B ' . $commande, 'r');
$status = pclose($pid);
The testu.bat's folder is in my user PATH.
This script performs some task, and to monitor its execution, it should generate a log file, but I never get it.
Whereas if I just remove the /B option, it works fine and I get my log file.
Did I miss something about background execution? How can I catch the error informations when it is running in the background?
It appears you are operating under the assumption that the /B switch to the start command means "background". It does not. From the start usage:
B Start application without creating a new window. The
application has ^C handling ignored. Unless the application
enables ^C processing, ^Break is the only way to interrupt
the application.
Processes launched by start are asynchronous by default. Since that appears to be what you want, just run the command without the /B switch.
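In other words, something along these lines should return right away (sketch):

$commande = "testu.bat";
$pid = popen('start "" ' . $commande, 'r'); // the empty "" is start's window-title argument
$status = pclose($pid); // start returns immediately; testu.bat keeps running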
Interesting one... Ok, here's what I think is going on:
Because you run the task in the background, the PHP script will just carry on, it is not waiting for testu.bat to return anything...
Or put another way: popen does what it was instructed to do, which is starting the task in the background. But control is then handed immediately back to PHP, while the log file is still being created in the background and the PHP script carries on at the same time...
What I would do in this case is let testu.bat call the PHP script (or another PHP script) in a callback-type fashion once it has done its processing, similar to how you would use callbacks for asynchronous Ajax calls in JavaScript...
Maybe provide the callback script command as a parameter to testu.bat..?
Hope this is of any help...
I'm not quite sure about your goal here, but here is some info you might use:
for figuring out background errors, you may find these functions useful:
set_exception_handler();
set_error_handler();
register_shutdown_function();
Of course write out the errors they catch into some file.
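For example (a sketch; the log path is arbitrary):

set_error_handler(function ($no, $str, $file, $line) {
    file_put_contents('/tmp/worker.log', "error $no: $str at $file:$line\n", FILE_APPEND);
});
register_shutdown_function(function () {
    $err = error_get_last(); // also catches fatal errors
    if ($err !== null) {
        file_put_contents('/tmp/worker.log', "shutdown: {$err['message']}\n", FILE_APPEND);
    }
});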
If you do not need any data back from your requests, you can simply use:
fsockopen()
curl
and give them a short timeout (10 milliseconds). The scripts will run in the background.
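A sketch of the fsockopen variant (the host and path are placeholders):

$fp = fsockopen('127.0.0.1', 80, $errno, $errstr, 1); // short connect timeout; the point is not to wait
if ($fp) {
    fwrite($fp, "GET /long-task.php HTTP/1.1\r\nHost: 127.0.0.1\r\nConnection: Close\r\n\r\n");
    fclose($fp); // do not wait for the response; the server keeps working
}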
Alternatively if you do need the data back, you can either put it into a database and set up a loop that checks if the data has already been inserted, or simply output it into a file and check for its existence.
In my opinion start launches the specified command by creating a new prcoess in the "background". Therefore the execution of start itself "just" starts the second process and exists immediately.
However, using the /B switch, the command to be executed will be excuted in the context of the start process. Therefore the execution of the start process takes longer. Now what I suspect is that executing pclose terminates the start process and as a result of this you don't get your log file.
Maybe one solution (not testet though) could be executing something like
start /B cmd "/C testu.bat" where start just tries to execute cmd and cmd gets /C testu.bat as parameter which is the "command" it shall execute.
Another thought:
What happens if you don't call $status = pclose($pid);?
Just for people who find this trick and want it to work: in my case I just needed to activate the PHP directive ignore_user_abort, either in php.ini or via the PHP function of the same name.
Without this activated, the process is killed by pclose() without finishing the job.
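That is, at the top of the script being launched:

ignore_user_abort(true); // keep running even after the caller disconnects or pclose()s the pipe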
Your problem is most likely properly solved by using a queue system. You insert a job into a queue that a background process picks up and works on. In this way the background task is completely independent of the HTTP request that initiated the task - but you can still monitor its progress.
The two most popular software packages that can help you in your scenario:
Gearman
Check out this gist and this tutorial for installation on Windows.
RabbitMQ
Check out this tutorial for installation on Windows.
Note that implementing a queuing solution is not a 5-minute job, but it is technically the right approach for this sort of situation. For a company product this is the only viable approach; for a personal project where you just want something to work, there is a level of commitment required to see it through for the first time. If you're looking to expand your development horizons, I strongly suggest you give the queue a shot.
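To give a taste of what the Gearman variant looks like with the pecl/gearman extension ('run_tests' is a made-up job name):

// client side: queue a job and return immediately
$client = new GearmanClient();
$client->addServer('127.0.0.1', 4730);
$client->doBackground('run_tests', json_encode(array('suite' => 'all')));

// worker side: a separate long-running CLI process
$worker = new GearmanWorker();
$worker->addServer('127.0.0.1', 4730);
$worker->addFunction('run_tests', function (GearmanJob $job) {
    // ... do the actual work here ...
});
while ($worker->work());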

Running background process in PHP permanently

I'm creating a web service for an Android app in PHP with MySQL. I want to continuously check whether any data is available. I haven't got any idea how to fetch data in a background process. How can I execute a query without any request, or without calling a file?
I searched and got some code like
$command = "php -d max_execution_time=50 -f myfile.php '".$param."' >/dev/null &";
exec($command);
But where should I put this code so this query will run continuously?
Yes, the ampersand trick will work. You can use something like supervisord to restart it every few hours, so that any memory leaks are dealt with. This also makes it less fragile if it were to crash or hang.
Also, you can use something like cron to run a task for 10 minutes, and then die off and wait for cron to start it again - bear in mind that with most background tasks, it doesn't matter if there's a short period the task is not running, since it will catch up. It's worth checking in each run whether the previous one is still running, and exit early if it is: that way you don't have two background tasks causing race-conditions when retrieving work from your database.
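Checking whether the previous run is still going can be as simple as an exclusive lock on a file (sketch; the path is arbitrary):

$lock = fopen('/tmp/my-task.lock', 'c');
if (!flock($lock, LOCK_EX | LOCK_NB)) {
    exit; // the previous cron run is still working; bail out early
}
// ... do up to ~10 minutes of work; the lock is released when the script exits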
Finally you can use a job server, such as Gearman. This will allow you to send tasks to it in an asynchronous fashion, and they will be run by worker tasks (in either time or priority order). This is probably the most reliable approach, but it takes a bit more work to set up. There's a PHP module for this, but in my experience it's more of a hassle to use than Net_Gearman, which is available in PEAR.

PHP, re-run the code when finished

I'm developing a PHP service which performs numerous operations per customer, and I want it to run continuously. I've already taken a look at cron, but as far as I understood, cron makes it possible to run code at set times. This can be a bit dangerous, since we depend on the previous run having finished before the next one starts, and the time each run takes may vary as the customer base increases. So refresh, cron or other timed intervals can't be used, as far as I'm aware.
So I'm wondering if you know of any solutions where I can restart my service when it has finished, and under no circumstances start the re-run before all the code has been executed?
I'm sorry if this is answered before or is easily found on Google, I have tried to find something, but to no avail.
Edit: I could set timed intervals to be 1 hour, to be absolutely sure, but I want as little time as possible between each run.
Look at this:
http://www.godlikemouse.com/2011/03/31/php-daemons-tutorial/
What you need is a daemon that keeps running. There are more solutions than this while loop.
The following I once used in a project: http://kvz.io/blog/2009/01/09/create-daemons-in-php/ , it's also a package for PEAR: http://pear.php.net/package/System_Daemon
For more information, see the following SO links:
What is a daemon: What is daemon? Their practical use? Usage with php?
How to use: PHP script that works forever :)
Have you tried running the PHP script as a process? This has more details: http://nsaunders.wordpress.com/2007/01/12/running-a-background-process-in-php/
If you do not want to learn how to code a daemon, I recommend using software that manages processes in userland: Supervisor (http://supervisord.org/)
You just need to write a configuration file to specify which processes you want to run, and how.
It is extremely simple to configure and it is very adaptable (you can force having only one instance of your process, or instead have a fixed number of instances... etc).
It will also handle automatic restart in case your script crashes, and logging.
On the PHP side, just create a script that never quits, using a while(true) { ... } loop, and add an entry like this in supervisord's conf:
[program:your-script]
command=/usr/bin/php /path/to/your_script.php
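And your_script.php itself can be as simple as this sketch (fetch_next_job and handle are placeholders for your own queue logic):

while (true) {
    $job = fetch_next_job(); // e.g. reserve from gearman/beanstalk, poll a table...
    if ($job === null) {
        sleep(1); // do not spin when there is nothing to do
        continue;
    }
    handle($job);
}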
I'm using that software in production for a few projects (to run ruby and php gearman asynchronous workers for websites).
Try to have some custom logic where you can set a flag ON and OFF, and in your cron job check the flag before running the code inside it. I also wanted to suggest something like a queue-based solution: once you get the entry, run your processing logic, which can be either a daemon or a cron job. It will give you more control over whether your task is OK to execute now.

Running Cron jobs in parallel (PHP)

In the past, I ran a bunch of scripts each as a separate cron job. Now I'd like to run a controller script with one cron job, then have that call the scripts separately (and in parallel, all at the same time), so I don't have to create a new cron job every time I add another script.
I looked up pcntl_fork() but we don't have that installed. Can fsockopen() do this as well?
A few questions:
I saw this example, http://phplens.com/phpeverywhere/?q=node/view/254, that uses fsockopen(). Will this allow me to run PHP scripts in parallel? Note, the scripts don't interact, but I would still like to know if any of them exited prematurely with an error.
Secondly, the scripts I'm running aren't externally accessible; they are internal only. The script was previously run like so: php -f /path/to/my/script1.php. It's not a web-accessible path. Would the example in #1 work with this, or only with web-accessible paths?
Thanks for any advice you can offer.
You can use proc_open to run multiple processes without waiting for each process to finish.
You will have a process handle, you can terminate each process at any time and you can read the standard output of each process.
You can also communicate via pipes, which is optional.
Passing php /your/path/to/script.php param1 "param2 x" as the first parameter means starting a separate PHP process.
proc_open (see Example #1)
Ultimately you will want to use an infinite while loop plus usleep (or sleep) to avoid maxing out the CPU; break when all processes have finished, or after you have killed them.
Edit: you can know if a process has exited prematurely.
Edit2: a simpler way of doing the above is popen
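Putting those pieces together, a sketch (the script paths are placeholders):

$scripts = array('/path/to/my/script1.php', '/path/to/my/script2.php');
$procs = array();
foreach ($scripts as $script) {
    // empty descriptor spec: the children inherit stdout/stderr
    $procs[$script] = proc_open('php -f ' . escapeshellarg($script), array(), $pipes);
}
while ($procs) {
    foreach ($procs as $script => $proc) {
        $status = proc_get_status($proc);
        if (!$status['running']) {
            if ($status['exitcode'] !== 0) { // premature or errored exit
                echo "$script exited with code {$status['exitcode']}\n";
            }
            proc_close($proc);
            unset($procs[$script]);
        }
    }
    usleep(100000); // poll ten times per second instead of maxing out the CPU
}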
Please correct me if I'm wrong, but if I understand things correctly, the solution Tiberiu-Ionut Stan proposed implies that starting the processes with proc_open and waiting for them to finish will not be run as a cron script, but as part of a running program/service, right?
As far as I understand the cron jobs, the controller script user920050 was thinking of using would be started by cron on a schedule and each new instance would launch the processes all over again, do the waiting for them to finish and probably run in parallel with other cron-launched instances of the controller script.
