Running PHP after request

I would like to be able to start a second script (either PHP or Python) when a page is loaded, and have it continue to run even after the user cancels or navigates away. Is this possible?

You can send a Connection: close header (together with a Content-Length), which finishes the page for your user but lets the script keep executing things "after the page loads".
There is also a simple way to ignore a user abort (see the PHP manual):
ignore_user_abort(true);
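Roughly, the combination looks like this (a minimal sketch; it assumes output isn't being compressed by zlib or the web server, and do_background_work() is a placeholder):
<?php
ignore_user_abort(true);   // keep running even if the user disconnects
set_time_limit(0);         // no time limit for the background part

ob_start();
echo 'Page content for the user';

header('Connection: close');
header('Content-Length: ' . ob_get_length());
ob_end_flush();            // send the buffered page
flush();                   // push it to the client; the browser stops loading

// everything below runs after the user already has the full page
do_background_work();      // placeholder for the long-running task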

Use process forking with pcntl.
It only works under Unix operating systems, however.
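A minimal fork sketch, assuming the pcntl extension is loaded (keep in mind that forking inside a web server process is generally discouraged; pcntl is really meant for CLI scripts):
<?php
$pid = pcntl_fork();
if ($pid === -1) {
    die('could not fork');
} elseif ($pid === 0) {
    // child process: do the long-running work, then exit
    include 'child_script.php';   // placeholder for the actual job
    exit(0);
}
// parent process: returns to the request immediately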
You can also do something like this:
exec("/usr/bin/php ./child_script.php > /dev/null 2>&1 &");

For keeping the current script running:
ignore_user_abort(true);
set_time_limit(0);
For running another script:
Open a socket to the server and request the script over HTTP with long timeouts, e.g. $sock = fsockopen('localhost', 80, $errno, $errstr, 3600); plus stream_set_timeout($sock, 3600); (note that fsockopen() takes a hostname, not a full URL; you write the HTTP request for the script's path yourself, as sketched below).
Or use exec('php path_to_script'); - this will run your script from the CLI, so you'd have to install php-cli on that server.
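Roughly what the fsockopen() variant could look like (hostname and path are placeholders, and fsockopen() only opens a TCP connection, so the HTTP request has to be written by hand):
<?php
$sock = fsockopen('localhost', 80, $errno, $errstr, 3600);
if ($sock) {
    stream_set_timeout($sock, 3600);   // allow the remote script a long run time
    fwrite($sock, "GET /path_to_script HTTP/1.1\r\n"
                . "Host: localhost\r\n"
                . "Connection: close\r\n\r\n");
    fclose($sock);   // the remote script should use ignore_user_abort(true)
}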

Another approach, if you can't use the others, is to include an img tag at the bottom of your output that requests a PHP page which does whatever it is you want to do.
The browser will still show the loading animation while that image request runs, though, so I think Karsten's suggestion is probably better (I'll try that next time I need to do this type of thing).
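For illustration, the tag could be emitted like this (background_task.php is a hypothetical script that would itself call ignore_user_abort(true)):
<?php
// a 1x1 image request at the end of the page triggers the background script
echo '<img src="/background_task.php" width="1" height="1" alt="">';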

Related

How to call file_get_contents() in PHP without waiting for result?

In some of my PHP scripts I am using this code to POST data to a URL:
$file = @file_get_contents($url, false, $context);
This only POSTs the data; the content returned by the server is empty. The executed script is really unimportant and isn't needed by the main script - it's like a log file.
Normally PHP will wait until the call has finished executing.
Is there a way to call @file_get_contents() without waiting for the result? Just fire it and run the next command without caring about the return value?
I have already searched for this problem but only found solutions for the console.
file_get_contents() is a synchronous call, so you can't simply skip it. But if the result doesn't matter, one PHP-level solution is to create a thread and put the logging logic in it; that way the process serving the request keeps going while, at (roughly) the same time, the thread does the logging.
http://php.net/manual/es/class.thread.php
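A rough sketch with the pthreads extension (note: pthreads needs a thread-safe (ZTS) build of PHP, and newer versions only work from the CLI, so this is rarely available on ordinary web hosts; the URL is a placeholder):
<?php
class LogWorker extends Thread
{
    private $url;

    public function __construct($url)
    {
        $this->url = $url;
    }

    public function run()
    {
        // the blocking request happens in this thread, not in the main one
        @file_get_contents($this->url);
    }
}

$worker = new LogWorker('http://example.com/log.php?event=x');
$worker->start();   // returns immediately; the main script carries on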
You can move the asynchronous code inside a separate PHP file, then execute it using one of the program execution functions. You need to spawn the program in such a way that PHP does not wait for it to finish. For example on Unix you can use the & operator:
<?php
shell_exec("php post.php arg1 arg2 arg3 >/dev/null 2>/dev/null &");
This is tricky on Windows but not impossible.
This isn't really possible with file_get_contents() alone, but you can use fsockopen() to achieve it.
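A sketch of that fire-and-forget approach: write the POST request and close the socket without reading the response (host, path and the data are placeholders; the target script should call ignore_user_abort(true) so it survives the early disconnect):
<?php
function post_without_waiting($host, $path, array $data)
{
    $body = http_build_query($data);
    $sock = fsockopen($host, 80, $errno, $errstr, 5);
    if (!$sock) {
        return;   // the log call is unimportant, so fail silently
    }
    fwrite($sock, "POST {$path} HTTP/1.1\r\n"
                . "Host: {$host}\r\n"
                . "Content-Type: application/x-www-form-urlencoded\r\n"
                . "Content-Length: " . strlen($body) . "\r\n"
                . "Connection: close\r\n\r\n"
                . $body);
    fclose($sock);   // don't wait for the response
}

post_without_waiting('example.com', '/log.php', array('event' => 'something'));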

Running heavy php file in background without cronjobs

How can I run a PHP file IN THE BACKGROUND after submitting a form? It has to run in the background because it usually takes a very long time.
Basically it's just like running a cronjob, except I want to trigger it manually and with my browser.
There are several possible ways to do this.
Try setting ignore_user_abort to TRUE in your script.
If set to TRUE, the script will not be terminated after the client aborts the connection.
Take a look at popen() and pclose(). You can do something like this:
pclose(popen("start php /path/to/myscript.php", "r"));
You can kick off a separate PHP process with a system() or exec() call. Something like this:
system('php /path/to/myscript.php >/dev/null 2>&1 &');
Start the request using AJAX. The browser will continue running while waiting for a response. You can even show a popup or some information when the request is finished, although you don't have to.

Detect when a script delays and allow it to skip or continue in PHP

I have a file, let's say file1.php, that executes another file using exec("php-cli -f _DAEMON.php"). After the exec() call it needs to run more code, but the problem is that _DAEMON.php, as its name says, is a daemon and never stops running, so it freezes file1.php and never lets the rest of the code run.
Is there a way to let the code continue executing even though exec("php-cli -f _DAEMON.php") has not finished? Or to detect when the call takes more than x seconds/milliseconds and then continue?
Thanks.
Maybe try using a socket (curl might work with a low timeout; offhand I'm not sure whether it will kill the script, though). Not ideal, and it adds some overhead.
http://phplens.com/phpeverywhere/?q=node/view/254
Also, doriana_gd was probably referring to something like node.js, i.e. server-side JavaScript.
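A sketch of the curl idea: give the request a very short timeout so file1.php gets control back quickly (the URL is hypothetical, and _DAEMON.php would need ignore_user_abort(true) to survive the dropped connection):
<?php
$ch = curl_init('http://localhost/start_daemon.php');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_NOSIGNAL, 1);       // needed for sub-second timeouts
curl_setopt($ch, CURLOPT_TIMEOUT_MS, 200);   // give up after 200 ms
curl_exec($ch);                              // "times out", but we move on
curl_close($ch);
// file1.php continues here while the daemon keeps running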

exec causes perpetual load

I noticed that exec and shell_exec are causing perpetual loading.
Basically, I'm trying to do something as simple as loading a PHP script in the background. When I try to do that, the page just loads and loads.
My code is as follows:
exec('php test.php -- '.escapeshellarg($param1).' > /dev/null ');
I first thought it was my other script, so I pointed it to a file with just:
echo $argv[1];
But it still loads perpetually.
Don't wait for the process to exit.
exec() waits for the process to return an exit code, so you have to detach the process from PHP (see the example below).
Oh, and since you tagged Linux, I assume you're on a Linux distro.
You could consider this as well:
http://ca1.php.net/pcntl_fork
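In other words, the call from the question only needs the stderr redirect and a trailing & so exec() can return immediately:
// redirect BOTH stdout and stderr, and background the process with &
exec('php test.php -- ' . escapeshellarg($param1) . ' > /dev/null 2>&1 &');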

php shell_exec() help needed for running a script in the background

My project calls for 3 PHP scripts that are run with if-else conditions. The first script is loaded on the index page of the site to check whether a condition is set, and if it is, it calls the second script. The second script checks whether other conditions are set, and it finally calls the last script if everything is good.
Now I could do this by just including the scripts in the if statements, but since the final result is a resource-hogging MySQL dump, I need it to run independently of the original trigger page.
Also, those scripts should continue doing their thing once triggered, regardless of the user's actions on the index page.
One last thing: it should be able to run on both Windows and *nix.
How would you do this?
Does the following code make any sense?
if ($blah != $blah_size) {
    shell_exec('php first-script.php > /dev/null 2>/dev/null &');
} else {
    // If the size matches, die
}
Thanks a million in advance.
UPDATE: just in case someone else is going through the same deal.
There seems to be a bug in PHP when running scripts as CGI, but calling the command-line binary works with all the versions I've tested under Apache.
See the bug https://bugs.php.net/bug.php?id=11430
So instead I call the script like this:
exec("php-cli mybigfile.php > /dev/null 2>/dev/null &");
Or you could call it via the shell. It works on *nix systems, but my local Windows setup is hopeless, so if anyone runs it on Windows and it works, please update this.
I would not do this via shell_exec, because you'd have no control over how many of these resource-hogging processes are running at any one time. A user could go click-click-click-click and essentially halt your machine.
Instead, I'd build a work queue. Instead of running the dump directly, the script would submit a record to some sort of FIFO queue (could be a database table or a text file in a dir somewhere) and then immediately return. Next you'd have a cron script that runs at regular intervals and checks the queue to see if there's any work to do. If so, it picks the oldest thing, and runs it. This way, you're assured that you're only ever running one dump at a time.
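A bare-bones version of that queue, assuming a hypothetical job_queue table with id, payload and created_at columns (the cron-run worker takes the oldest row, runs the dump, then deletes the row):
<?php
// --- in the web request: enqueue and return immediately ---
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$pdo->prepare('INSERT INTO job_queue (payload, created_at) VALUES (?, NOW())')
    ->execute(array('mysql-dump'));

// --- worker.php, run from cron at regular intervals ---
$job = $pdo->query('SELECT * FROM job_queue ORDER BY created_at ASC LIMIT 1')
           ->fetch(PDO::FETCH_ASSOC);
if ($job) {
    run_dump($job['payload']);   // placeholder for the resource-hogging dump
    $pdo->prepare('DELETE FROM job_queue WHERE id = ?')->execute(array($job['id']));
}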
The easiest way I can think of is to do
exec("screen -d -m php long-running-script.php");
and then it will return immediately and run in the background. screen will allow you to connect to it and see what's happening.
You can also do what you're doing with 'nohup php long-running-script.php &' (the trailing & is what sends it to the background), or by writing a simple C app that daemonizes itself and then execs your script.
