Delay php script without occupying connection - php

I want to ask if anyone knows a way to delay a PHP script without occupying a connection slot the whole time. I am not completely sure of the details, but I was told that Apache has a limit on concurrent connections, or a limit on scripts running at the same time (I can't exactly recall which). This script of mine needs to run for about 1 to 3 hours, and it doesn't really do anything heavy; it actually sleeps about 90% of the time.

If you're running a script and not expecting any response, you can run it in the terminal of the server machine with php "dir/to/php/script.php".
If the script is started remotely, you can have it close the connection early, so that it keeps running but no longer holds the connection open, by sending header('Connection: Close');
Example:
<?php
header('Connection: Close'); // must be sent before any output
echo "The server is now doing some complex actions in the background..."; //or even maybe a redirect instead
// note the argument order: destination path first, then the data to write
file_put_contents("/tmp/test.txt", file_get_contents("largest_file_in_the_world.txt"));
?>

In addition, just sending the Connection: close header wasn't enough for me; here's how to actually get the connection closed:
ignore_user_abort(true);           // keep running after the client disconnects
header("Connection: close", true);
header("Content-Length: 0", true);
ob_end_flush();                    // flush PHP's output buffer
flush();                           // flush the system-level buffer
fastcgi_finish_request();          // PHP-FPM only: finishes the response, script keeps running

Related

flush / ob_flush not working on remote server

Please read before you mark this as a duplicate.
I am running XAMPP on my local machine, and when testing via http://localhost/test.php I get the desired output, i.e. a number printed each second. But as soon as I change it to my local IP, http://10.70.52.75/test.php, it loads for 10 seconds and then gives the output in one shot. I actually need this method to collect output from a PHP script which will be running for 10-15 minutes.
I have already checked php.ini; output buffering is off.
apache_setenv('no-gzip', 1); //can comment this line out if not on Apache
header('Content-Encoding: none');
header('Content-type: text/html; charset=utf-8');
for ($i = 0; $i < 10; $i++) {
    echo $i.'<br>';
    ob_flush(); // flush PHP's buffer first...
    flush();    // ...then the system buffer
    sleep(1);
}
I tried all the hacks available on Stack Overflow:
PHP Flush/ob_flush not working
Calling ob_flush() and flush(), yet browser doesn't show any output until script finishes
and many more, but none of them worked for me. Please try running the code on your own Linux server, or try it on my setup:
http://host-1-89.linuxzoo.net/test.php
SSH to this machine as root@linuxzoo.net
Password is "secure"
Let me know if anyone succeeds in any way.

nginx: trigger slow php-fpm process, but quickly return status 200

Is it possible for nginx to trigger a php-fpm process, but then close the nginx worker and quickly return an empty page with status 200?
I have some slow php processes that need kicking off a few times a week. They can take between 3 and 4 minutes each. I trigger them with a cron manager site. The php process writes a lock file at the start, and when the process is complete an email is sent and finally the lock file is removed.
Following this guide, in my php-fpm worker pool, I have this: request_terminate_timeout = 300 and in my nginx site config I have fastcgi_read_timeout 300;
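For reference, the two settings in context (the file locations are illustrative; they vary by distribution):
; php-fpm pool configuration, e.g. /etc/php/fpm/pool.d/www.conf
request_terminate_timeout = 300
# nginx site configuration, inside the PHP location block
fastcgi_read_timeout 300;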
It works, but I don't care about the on-screen result. And the cron service I use has a time limit of 5 seconds, and after repeated timeouts, it disables the job.
Yes, I know I could fork a process in php, let it run in the background, and return a 200 to nginx. And yes, I could pay and upgrade my cron service. Nonetheless, it would be an interesting and useful thing to know, anyway.
So, is this possible, or does php-fpm require an open and "live" socket? I ask because on the "increase your timeout" page referred to above, one answer says:
"Lowest of three. It's a chain: Nginx -> PHP-FPM -> PHP. Whoever dies first will break the chain."
In other words, does that mean that I can never "trigger" a process, but then close the nginx part of the trigger?
You can.
exec() a PHP CLI script, adding a trailing & and redirecting output to a log file or /dev/null; pass any parameters as JSON or serialized data (use escapeshellarg()). The exec() call will return 0 immediately (no error; this option is sketched below); or
use PHP's ignore_user_abort(), send a Connection: close header, and flush any output buffers as well as doing a normal flush(). Put any slow code after that. You'll need to test this under nginx.
Either way, return a 202 (Accepted) status code to signify acceptance but no response body. And it's up to you to make sure your script doesn't run forever; give it a heartbeat so it touch()es a file every so often. If the file is old and the script is still running, kill it.
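A minimal sketch of the first (exec) option; the worker path, log path, and payload below are placeholders:
<?php
// Hypothetical trigger endpoint: starts a CLI worker and returns at once.
$params = escapeshellarg(json_encode(['job' => 'weekly-report']));
// Trailing & plus output redirection lets exec() return immediately.
exec("php /path/to/worker.php $params >> /var/log/worker.log 2>&1 &");
http_response_code(202); // accepted; no response body
?>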
Thanks to @Walf's answer combined with this example from the php site, this SO answer and a little fiddling, this appears to be a solution for nginx that requires no messing with any php or nginx ini or conf files.
$start = microtime(true);
ob_end_clean();
header("Connection: close"); // no trailing \r\n; newer PHP rejects headers containing newlines
header('X-Accel-Buffering: no');
header("Content-Encoding: none");
ignore_user_abort(true); // optional
ob_start();
echo ('Text user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
ob_end_clean(); // no-op here unless another buffer level is still active
sleep(35); // simulate something longer than default 30s timeout
$time_elapsed_secs = microtime(true) - $start;
echo $time_elapsed_secs; // you will never see this!
Or, at least, it works perfectly for what I want it to do. Thanks for the answers.

Run PHP script without webpage waiting for it to finish

I have some basic website tracking software that sends a JSON object with jQuery AJAX from a webpage cross-domain to a server where the data is processed by a php script. This is triggered on window.onbeforeunload.
When benchmarking my php script I have realised that the client website on a different domain is still waiting for the php script to finish running before loading the next page. For example, a visitor to a client site navigates to another page. We send the JSON object cross domain to the server to process it. If I add sleep(30); to my php script the client website will not load the next page until this php script finishes (30+ seconds).
I do not need to return any values after running this script so how can I ensure this php script runs without having any impact on the client site?
I hope I've explained myself well enough. Ask any questions if I haven't, thanks.
SOLUTION:
This is what worked for me (http://php.net/manual/en/features.connection-handling.php#93441):
ob_end_clean();
header("Connection: close"); // no trailing \r\n; newer PHP rejects headers containing newlines
header("Content-Encoding: none");
ignore_user_abort(true); // optional
ob_start();
echo ('Text user will see');
$size = ob_get_length();
header("Content-Length: $size");
ob_end_flush(); // Strange behaviour, will not work
flush(); // Unless both are called !
ob_end_clean(); // no-op here unless another buffer level is still active
//do processing here
sleep(5);
echo('Text user will never see');
//do some processing
For PHP-fpm:
To close the connection with the client (send the response) but continue running the script and processing data, this function can help you: fastcgi_finish_request
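A minimal sketch under PHP-FPM; the sleep() stands in for the real processing:
<?php
echo 'Request accepted'; // everything echoed so far becomes the client's response
fastcgi_finish_request(); // PHP-FPM only: response is sent and the connection closes
// the script keeps running in the background; do the slow work here
sleep(30);
?>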
For apache:
See this link - close a connection early

Simultaneous connections to single browser through PHP and Apache2

I have at the moment an AJAX request going to sendMail.php; it closes the connection immediately (using the header Connection: Close) and continues processing for approx 30 seconds.
But the problem I'm experiencing now is that when that same client tries to load any PHP page from that server, it has to wait until sendMail.php has finished processing.
Is there any way around this?
I was reading in some other SO questions that it may be session related, but I'm not using any sessions. I even tried calling session_write_close() at the start of the sendMail.php script.
Example code (this is hacky and over done, but it works):
//File: SendMail.php
//error_reporting(E_ALL);
error_reporting(0);
session_write_close();
set_time_limit(0);
ignore_user_abort(1);
ignore_user_abort(true);
ini_set('ignore_user_abort','1');
apache_setenv('no-gzip', 1);
apache_setenv('KeepAlive',0);
ini_set('zlib.output_compression', 0);
ini_set('output_buffering', 0);
$size = ob_get_length();
// send headers to tell the browser to close the connection
header("Content-Length: $size");
header('Connection: Close');
// flush all output
ignore_user_abort(true);
ob_end_flush();
ob_flush();
flush();
sleep(30);//The real code has more stuff, but just for example lets just say it sleeps for 30 seconds
The rest of the referenced material is a normal navigation via GET.
It almost sounds like your AJAX call is calling a script that is chewing up your available resources on the server (whether that be memory/CPU/Disk I/O/etc) and that is preventing any new scripts from being able to run. Double check to make sure that you have enough resources allocated to get the job done.
On a side note, you typically don't allow your everyday user to spawn processes that run that long on the server, simply because you're opening up the opportunity for them to overload your system and giving them an easy denial-of-service vector. I'd consider Ibere's suggestion of moving this to a cron job if that's a possibility.
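For example, a crontab entry along these lines (the schedule, script path, and log path are illustrative) runs the job server-side without tying up a web request:
# run the mail job every 10 minutes via the PHP CLI
*/10 * * * * php /path/to/sendMail.php >> /var/log/sendmail.log 2>&1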

PHP loop that runs beyond browser disconnection?

I want to have a simple PHP script that loops to do something every ten minutes. It would be hosted offsite, and I would activate it via my browser. I don't have access to the server other than my web space, so 'cron' as such isn't an option.
(I'm happy to have this stop after a certain time or number of job cycles. I just need it to continue running after I point the browser away from the page script.)
Is such a thing possible? Thanks.
It's possible, see ignore_user_abort():
set_time_limit(0);
ignore_user_abort(true);
while (true) // forever
{
    // your code
}
You can use these two functions in combination with sleep(), usleep(), time_nanosleep() or, even better, time_sleep_until() to achieve a cron-like effect.
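A minimal sketch of that pattern; do_the_job() is a hypothetical placeholder for your own work:
<?php
set_time_limit(0);
ignore_user_abort(true);
$next = time();
for ($i = 0; $i < 6; $i++) {  // e.g. six cycles, ten minutes apart
    do_the_job();             // hypothetical job function
    $next += 600;
    time_sleep_until($next);  // sleep until the next absolute timestamp
}
?>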
PHP scripts time out after a certain amount of time; they're not designed to be long-running programs. You'll have to find some way to prod the script every ten minutes.
Have a look at set_time_limit.
This is from the above page:
You can do set_time_limit(0); so that the script will run forever - however this is not recommended and your web server might catch you out with an imposed HTTP timeout (usually around 5 minutes).
Maybe you can write another script on a computer to which you have access, and then make that script request the other one periodically.
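For instance, a small poller like the following could run on a machine you control; the URL and interval are placeholders:
<?php
// Hypothetical poller: requests the remote script every ten minutes.
while (true) {
    file_get_contents('https://example.com/script.php');
    sleep(600);
}
?>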
You can look at pcntl_fork.
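A minimal sketch of forking off the work (CLI SAPI only; the pcntl extension is generally unavailable under mod_php or PHP-FPM, and run_long_job() is a hypothetical placeholder):
<?php
$pid = pcntl_fork();
if ($pid === -1) {
    exit('fork failed');
} elseif ($pid === 0) {
    // child process: do the long-running work, then exit
    run_long_job();
    exit(0);
}
// parent process: returns immediately, leaving the child running
?>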
Here's a hack for your problem:
// Anything before disconnecting, but nothing to be output to the client!
ob_end_clean();
header('Connection: close');
ob_start();
// Here you can output anything before disconnecting
echo "Bla bla bla";
$outsize = ob_get_length();
header('Content-Length: '.$outsize);
ob_end_flush();
flush();
// Do your background processing here
// and feel free to quit anytime you want.
A way to do this might be to launch a new php process from the web page, e.g.
<?php
// redirect output AND background the process so exec() returns immediately
exec("php script_that_runs_for_a_while.php > /dev/null 2>&1 &");
?>
Redirecting the output to /dev/null and adding the trailing & means (on a Linux system) that exec() returns immediately, rather than waiting for the script to finish.
So then that script that launches can do whatever it likes, since it is basically just a new process running on the server.
Note that at the start of your long-running script, you will want to use the set_time_limit function to set the max execution time to some large value (or 0 for no limit).
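For example, at the top of the long-running script:
<?php
set_time_limit(0); // 0 = no execution time limit
// ... the long-running work goes here ...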
