Run a PHP script in the background for a long time on Heroku

I have a PHP script, hosted on Heroku, which sends messages to a list of users. Now I want to add a delay between those messages, say a 6-minute gap between each message. So if there are 90 users, the script should run for 9 hours in the background.
I tried calling this script using AJAX so it runs in the background, adding sleep(360); inside the loop to get the 6-minute delay. But it only works for roughly 10 to 20 users; after that it stops.
foreach ($users_list["users"] as $key => $value) {
    try {
        ....
        code for sending message
        .....
    } catch (Exception $e) {
        continue;
    }
    sleep(360);
}
So I would like to know: what is the optimal way to achieve this on Heroku?

Calling a script using AJAX doesn't run it in the "background"; it just runs it asynchronously from the page you are on. In other words, it's still running in Apache, still has session data, and is still bound by the timeout settings of PHP and Apache.
To run it truly in the background you can use something like cron.
Or, if you are allowed to on your server, you can call it from the command line with exec or shell_exec. There are a few other similar functions too, such as popen, system, etc. They all do things in a slightly different way.
Some environment details will be different, and this may have a big impact on your code. For example, a lot of the entries in $_SERVER are not set or contain different information: the server's IP address may not be in there, and you won't have any session data. You won't be able to use $_GET or $_POST, but you can get the input data (from the command line call) from the $argv array, the first item being the file's path, etc.
Basically you need to call it like this:
exec('php -f "path/to/php/file.php" "arg1" "arg2"');
Calling it this way will still be blocking, meaning PHP waits for the called script to finish execution.
To go one step further and make it non-blocking, you can add (on Linux):
exec('php -f "path/to/php/file.php" "arg1" "arg2" > /dev/null &');
The & at the end is the most important bit.
Now on Windows it's a bit of a different ballgame. I've had success using this:
$WshShell = new \COM('WScript.Shell');
$cmd = 'cmd /C php "path/to/php/file.php" "arg1" "arg2"';
// second argument 0 hides the window; third argument false means don't wait for it to finish
$WshShell->Run($cmd, 0, false);
Also, on Windows, to run PHP with just php you have to add the path of the php.exe you want to use to the PATH environment variable. Otherwise you have to use the full path to the exe instead of just php.
In either case you should be very careful about putting end-user data into any command line call. There are two functions to sanitize it, but I try to just not put it in:
escapeshellarg
escapeshellcmd
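For what it's worth, a minimal sketch of the first one in use ($userInput is a hypothetical variable holding end-user data):
// $userInput is hypothetical user-supplied data; never pass it unescaped
$safeArg = escapeshellarg($userInput); // wraps the value in quotes and escapes embedded quotes
exec('php -f "path/to/php/file.php" ' . $safeArg . ' > /dev/null &');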
I wrote a wrapper class for this you can find on my GitHub
Hope it helps.

Related

Run script on other Server

I have 2 websites, hosted on 2 different servers, and they are somewhat interlinked. Sometimes I do something on Website-1 and then run a script on Website-2; for example, I edited something on Website-1 and now I want to run a script on Website-2 to update things accordingly on its server.
Until now I have been using the following code on Website-1:
$file = file_get_contents('Website-2/update.php');
But the problem with this is that my Website-1 script stops running and waits for the file to return some data, and I don't want to do anything with that data; I just want to run the script.
Is there a better way to do this, or a way to tell PHP to move on to the next line of code?
If you want to call the second site without making your user wait for a response, I would recommend using a message queue:
The Site 1 request puts a message on the queue.
A cron job checks the queue and runs the update on Site 2 when a message exists.
Common queue apps to look at:
https://aws.amazon.com/sqs/?nc2=h_m1
https://beanstalkd.github.io/
https://www.iron.io/mq
What you're trying to achieve is called a webhook, and it should be implemented with proper authentication, so that not just anybody can execute your scripts at any time and overload your server.
On server 2 you need to execute your script asynchronously via workers, threads, message queues or similar.
You can also run the asynchronous command on server 1. There are many ways to achieve this; here are some links with more on the subject:
(Async curl request in PHP)
(https://segment.com/blog/how-to-make-async-requests-in-php/)
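One common trick from those links is a fire-and-forget curl call with a very short timeout, so the caller doesn't wait for the response (a sketch; the URL is a placeholder, and the request can be dropped if the connection isn't established within the timeout):
$ch = curl_init('https://website-2.example/update.php'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOSIGNAL, true);    // needed for sub-second timeouts on some systems
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 100);   // give up after 100 ms instead of waiting
curl_exec($ch);
curl_close($ch);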
Call your remote server as normal, but in the PHP script you normally call, take all the functionality and put it in a third script. Then, from the old script, call the new one with (on Linux):
exec('php -f "{path to new script}.php" $args > /dev/null &');
The & at the end makes this a background, or non-blocking, call. Because you call it from the remote server you don't have to change anything on the calling server. The php -f runs a PHP file. The > /dev/null discards the output from that file.
On Windows you can use COM and WScript.Shell to do the same thing:
$WshShell = new \COM('WScript.Shell');
// 0 hides the window; false means don't wait for the command to finish
$oExec = $WshShell->Run('cmd /C php {path to new script}.php', 0, false);
You may want to use escapeshellarg on the filename and any arguments supplied.
So it will look like this:
1. Server1 calls Server2.
2. The called script (on Server2) runs exec and kicks off a background job (on Server2), then exits.
3. Server1 continues as normal.
4. Server2 continues the background process.
So, using your example, instead of calling:
file_get_contents('Website-2/update.php');
You will call
file_get_contents('Website-2/update_kickstart.php');
In update_kickstart.php, put this code:
<?php
exec('php -f "{path}update.php" > /dev/null &');
This will run update.php as a separate background (non-blocking) call. Because it's non-blocking, update_kickstart.php will finish and return to Server1, which can go about its business while update.php runs on Server2 independently.
Simple...
One last note: file_get_contents is a poor choice here. I would use SSH, probably via phpseclib 2.0, to connect to Server2 and run the exec command directly with a user that has access only to that file (chroot it or something similar). As it is, anyone can call that file and run it. Behind an SSH login it's protected, and with it chrooted that "special" user can only run that one file.
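A minimal sketch of that approach with phpseclib 2.0 (the hostname, credentials and paths are placeholders):
use phpseclib\Net\SSH2;

$ssh = new SSH2('server2.example.com');     // placeholder hostname
if (!$ssh->login('deployuser', 'secret')) { // placeholder credentials
    exit('SSH login failed');
}
// kick off the update in the background on Server2 and return immediately
$ssh->exec('php -f /path/to/update.php > /dev/null 2>&1 &');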

How do I know when the task from command line is finished?

This is really important, as I could not find what I am looking for on Google.
How do I know when the application (or is it more appropriate to call it a task?) executed from the command line is done? How does PHP know if the task of copying several files is done if I do this:
exec("cp -R /test/ /var/test/test");
Does the PHP script continue to the next line of code while the command is still running in the background making the copies? Or does the PHP script wait until the copy is finished? And how does a command line application notify the script when it's done (if it does)? There must be some kind of interaction going on.
PHP's exec returns a string (the last line of the command's output), so yes: your webpage will block until the command is done.
For example, this simple code:
<?php
echo exec("sleep 5; echo HI;");
?>
When executed, the page will appear to be loading for 5 seconds, then it will display:
HI
How does PHP know if the task of copying several files is done if I do this?
PHP does not know; it simply runs the command and does not care whether it worked or not, but returns the string produced by that command. That's why it's better to use PHP's copy function, because it returns TRUE/FALSE to indicate success. Or, if you are going this route, create a bash/sh script that returns 0 (FALSE) or 1 (TRUE) to tell whether the command was successful. Then you can use PHP as such:
<?php
$answer = exec("yourScript folder folder2");
if ($answer == "1") {
    // Plan A worked
} else {
    // Plan A FAILED, try Plan B
}
?>
It waits until the exec call returns, whatever it returns.
However, it might be that the exec call returns although the command it started has not yet finished. That will be the case if you detach from control of the process, for example by explicitly adding a & at the end of the command.
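To actually check the outcome of a blocking call, exec's optional second and third parameters capture the output lines and the exit code. A sketch using the cp command from the question:
$output = array();
exec("cp -R /test/ /var/test/test", $output, $exitCode); // blocks until cp finishes
if ($exitCode === 0) {
    // the copy completed successfully
} else {
    // cp reported an error (note exec only captures stdout, not stderr)
}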

php shell_exec() help needed for running a script in the background

My project calls for 3 PHP scripts that are run with if-else conditions. The first script is loaded on the index page of the site to check whether a condition is set, and if it is, it calls the second script. The second script checks whether other conditions are set, and it finally calls the last script if everything is good.
Now I could do this by just including the scripts in the if statement, but since the final result is a resource-hogging MySQL dump, I need it to run independently of the original trigger page.
Also, those scripts should continue doing their thing once triggered, regardless of the user's actions on the index page.
One last thing: it should be able to run on both Windows and *nix.
How would you do this?
Does the following code make any sense?
if ($blah != $blah_size) {
    shell_exec('php first-script.php > /dev/null 2>/dev/null &');
} else {
    // If the size matches, die
}
Thanks a million in advance.
UPDATE: just in case someone else is going through the same deal.
There seems to be a bug in PHP when running scripts as CGI, but calling PHP from the command line works with all the versions I've tested.
See the bug https://bugs.php.net/bug.php?id=11430
So instead I call the script like this:
exec("php-cli mybigfile.php > /dev/null 2>/dev/null &");
Or you could call it via the shell. It works on *nix systems, but my local Windows is hopeless, so if anyone runs it on Windows and it works, please update this.
I would not do this via shell_exec, because you'd have no control over how many of these resource-hogging processes are running at any one time; a user could go click-click-click-click and essentially halt your machine.
Instead, I'd build a work queue. Rather than running the dump directly, the script would submit a record to some sort of FIFO queue (a database table, or a text file in a directory somewhere) and then return immediately. Then you'd have a cron script that runs at regular intervals and checks the queue to see if there's any work to do. If so, it picks the oldest item and runs it. This way you're assured that only one dump is ever running at a time.
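A minimal sketch of that idea with a database table as the queue (the DSN, table, and column names are made up for illustration):
<?php
// enqueue.php - called from the web request; inserts a job and returns at once
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass'); // placeholder DSN
$pdo->prepare('INSERT INTO job_queue (task, created_at) VALUES (?, NOW())')
    ->execute(['mysql_dump']);

<?php
// worker.php - run from cron every minute; handles at most one job per run
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$job = $pdo->query('SELECT id, task FROM job_queue ORDER BY created_at LIMIT 1')
           ->fetch(PDO::FETCH_ASSOC);
if ($job) {
    // ... run the dump here ...
    $pdo->prepare('DELETE FROM job_queue WHERE id = ?')->execute([$job['id']]);
}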
The easiest way I can think of is to do:
exec("screen -d -m php long-running-script.php");
and then it will return immediately and run in the background. screen will allow you to connect to it and see what's happening.
You can also do what you're doing with 'nohup php long-running-script.php', or by writing a simple C app that does daemonize() and then execs your script.

PHP and shell_exec

I have a PHP website and I would like to execute a very long Python script in the background (300 MB of memory, 100 seconds). The process communication is done via the database: when the Python script finishes its job, it updates a field in the database, and then the website renders some graphics based on the script's results.
I can execute the Python script "manually" from bash (from any current directory) and it works. I would like to integrate it into PHP, and I tried the function shell_exec:
shell_exec("python /full/path/to/my/script") but it's not working (I don't see any output)
Do you have any ideas or suggestions? It's worth mentioning that the Python script is a wrapper over other polyglot tools (Java mixed with C++).
Thanks!
shell_exec returns a string; if you run it alone it won't produce any output. So you can write:
$output = shell_exec(...);
print $output;
First off, set_time_limit(0); will let your script run forever, so the timeout shouldn't be an issue. Second, the *exec calls in PHP do NOT necessarily use the PATH (this might depend on configuration), so your script can exit without giving any info on the problem, and quite often it turns out it simply can't find the program, in this case python. So change it to:
shell_exec("/full/path/to/python /full/path/to/my/script");
If your Python script runs on its own without problems, then it's very likely this is the issue. As for the memory, I'm pretty sure PHP won't use the same memory Python is using, so if Python uses 300 MB, PHP should stay at its default (say 1 MB) and just wait for shell_exec to finish.
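If it still produces no output, a quick way to see what's going wrong is to redirect stderr into the returned string (a debugging sketch; adjust the interpreter path for your system):
// 2>&1 merges error output into stdout so shell_exec can return it
$output = shell_exec('/usr/bin/python /full/path/to/my/script 2>&1');
var_dump($output);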
A problem could be that your script takes longer than the maximum request time defined for the server (this can be set in php.ini or httpd.conf).
Another issue could be that the server's account does not have the rights to execute or access the code or files needed for your script to run.
I found this a while back and it helped me solve my background execution problem:
function background_exec($command)
{
    if (substr(php_uname(), 0, 7) == 'Windows') {
        pclose(popen('start "background_exec" ' . $command, 'r'));
    } else {
        exec($command . ' > /dev/null &');
    }
}
Source:
http://www.warpturn.com/execute-a-background-process-on-windows-and-linux-with-php/
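Usage would then look something like this (the script path is a placeholder):
background_exec('php -f "/path/to/long-running-script.php"');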
Thanks for your answers, but none of them worked :(. I decided to implement it in a dirty way, using busy waiting, instead of triggering an event when a record is inserted.
I wrote a background process that runs forever and on each iteration checks whether there is something new in the database. When it finds a record, it executes the script and everything is fine. The idea is that I launch the background process from the shell.
I found that the issue, when I tried this, was simply that I did not compile the source on the server I was running it on. Compiling on your local machine and then uploading to your server can corrupt the binary in some way. shell_exec() should work if you compile the source you are trying to run on the same server where the script runs.

Run a PHP-script from a PHP-script without blocking

I'm building a spider which will traverse various sites and mine them for data.
Since I need to fetch each page separately, this could take a VERY long time (maybe 100 pages).
I've already set set_time_limit to 2 minutes per page, but it seems like Apache will kill the script after 5 minutes no matter what.
This isn't usually a problem since this will run from cron or something similar which does not have this time limit. However I would also like the admins to be able to start a fetch manually via a HTTP-interface.
It is not important that Apache is kept alive for the full duration; I'm going to use AJAX to trigger a fetch and check back once in a while with AJAX.
My problem is how to start the fetch from within a PHP-script without the fetch being terminated when the script calling it dies.
Maybe I could use system('script.php &') but I'm not sure it will do the trick.
Any other ideas?
$cmd = "php myscript.php $params > /dev/null 2>/dev/null &";
# when we call this particular command, the rest of the script
# will keep executing, not waiting for a response
shell_exec($cmd);
What this does is send all the STDOUT and STDERR to /dev/null, and your script keeps executing. Even if the 'parent' script finishes before myscript.php, myscript.php will finish executing.
If you don't want to use exec, you can use a PHP built-in function!
ignore_user_abort(true);
This will tell the script to keep running even if the connection between the browser and the server is dropped ;)
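Combined with the time limit, a minimal sketch of the top of the fetch script might be:
ignore_user_abort(true); // keep running even if the admin closes the tab
set_time_limit(0);       // lift PHP's execution time limit (server-level timeouts may still apply)
// ... long-running fetch loop goes here ...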
