Error: 504 Gateway Time-out [duplicate] - php

I'm currently running an Apache server (2.2) on my local machine (Windows) which I'm using to run some PHP scripts to take care of some tedious work. One of the scripts involves a ton of moving, resizing, and downloading / uploading files to another server. I would very much like the script to run constantly so that I don't have to baby the script by starting it up again every time it times out.
set_time_limit(0);
ignore_user_abort(1);
Both are set in my script, but after about 30 minutes to an hour the script stops and I get the 504 Gateway Time-out message in my browser. Is there something I'm missing in Apache or PHP to prevent the timeout? Or should I be running the script a different way?

Or should I be running the script a different way?
Definitely. You should run your script from the command line (CLI).

If I had to implement something like this, I would use 2 different scripts:
A. process_controller.php
B. process.php
The workflow should be:
the user calls script A from a browser
script A starts script B using system() or exec(), passing it a "process token" on the command line
script B writes its execution status into a shared space: a file named after the token, or a database table; in general, something that script A can also read, using the token as a reference
the page served by script A contains an AJAX call, in polling, that asks script A for the status of the process for a given token (a controller sketch follows the token options below)
Ajax polling:
<script>
var $myToken;
function ajaxPolling() {
    $.get('process_controller.php?action=getStatus&token=' + $myToken, function (data) {
        $('.result').html(data);
    });
}
setInterval(ajaxPolling, 60 * 1000); // every minute
</script>
There are some considerations about the communication between the 2 processes, depending on how many instances of script B you need to be able to run in parallel:
Just one: you don't need a random/unique token
One per user: session_start(); $token = session_id();
More than one per user: session_start(); $token = session_id().microtime();
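Putting the workflow together, a minimal sketch of what process_controller.php could look like; the status-file location and the action names are illustrative, not prescribed:
<?php
// process_controller.php - minimal sketch (paths and action names are illustrative)
session_start();
$token = session_id();

if (isset($_GET['action']) && $_GET['action'] === 'start') {
    // Start process.php in the background, passing the token on the command line
    exec('php -f process.php ' . escapeshellarg($token) . ' > /dev/null 2>&1 &');
    echo $token;
} elseif (isset($_GET['action']) && $_GET['action'] === 'getStatus') {
    // process.php is assumed to write its status into a file named after the token
    $file = sys_get_temp_dir() . '/' . basename($_GET['token']) . '.status';
    echo is_file($file) ? file_get_contents($file) : 'unknown token';
}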

If you need to run it from your browser, you should make sure that there is no PHP execution limit in the php.ini file, and also that no limit is set in mod_php (or whatever you are using) under Apache.
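For reference, the limits in question look roughly like this; the values are examples, and a 504 in particular usually comes from a proxy or FastCGI gateway in front of Apache rather than from PHP itself:
; php.ini - remove PHP's own execution limits
max_execution_time = 0
max_input_time = -1

# Apache httpd.conf - raise the general I/O timeout, and the proxy
# timeout if a reverse proxy / gateway is the component returning 504
Timeout 3600
ProxyTimeout 3600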

Use PHP's system() to call a shell script which starts a service/background task.

Related

Run script on another server

I have 2 websites, hosted on 2 different servers. They are kind of interlinked. Sometimes I just do stuff on Website-1 and run a script on Website-2. For example, I edit something on Website-1 and then want to run a script on Website-2 to update things accordingly on its server.
Until now I have been using the following code on Website-1:
$file = file_get_contents('Website-2/update.php');
But the problem with this is that my Website-1 server script stops running and waits for the file to return some data. And I don't want to do anything with that data; I just want to run the script.
Is there a better way to do this, or a way to tell PHP to move on to the next line of code?
If you want to call the second site without making your user wait for a response, I would recommend using a message queue.
Site 1's request puts a message on the queue.
A cron job on Site 2 checks the queue and runs the update when a message exists.
Common queue apps to look at:
https://aws.amazon.com/sqs/?nc2=h_m1
https://beanstalkd.github.io/
https://www.iron.io/mq
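As a sketch of the flow with the first option, using the AWS SDK for PHP (the queue URL and message body are illustrative):
<?php
// Site 1: put a message on the queue instead of calling Website-2 directly
require 'vendor/autoload.php';
$sqs = new Aws\Sqs\SqsClient(['region' => 'us-east-1', 'version' => '2012-11-05']);
$queueUrl = 'https://sqs.us-east-1.amazonaws.com/123456789012/site2-updates'; // illustrative
$sqs->sendMessage([
    'QueueUrl'    => $queueUrl,
    'MessageBody' => json_encode(['event' => 'content-updated']),
]);

<?php
// Site 2: cron job, e.g. every minute, checks the queue and runs the update
require 'vendor/autoload.php';
$sqs = new Aws\Sqs\SqsClient(['region' => 'us-east-1', 'version' => '2012-11-05']);
$queueUrl = 'https://sqs.us-east-1.amazonaws.com/123456789012/site2-updates';
$result = $sqs->receiveMessage(['QueueUrl' => $queueUrl, 'MaxNumberOfMessages' => 10]);
foreach ((array) $result->get('Messages') as $message) {
    // ... do update.php's real work for $message['Body'] here ...
    $sqs->deleteMessage([
        'QueueUrl'      => $queueUrl,
        'ReceiptHandle' => $message['ReceiptHandle'],
    ]);
}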
What you're trying to achieve is called a webhook, and it should be implemented with proper authentication, so that not just anybody can execute your scripts at any time and overload your server.
On server 2 you need to execute your script asynchronously via workers, threads, message queues or similar.
You can also run the asynchronous command on your server 1. There are many ways to achieve this. Here are some links with more on this.
Async curl request in PHP
https://segment.com/blog/how-to-make-async-requests-in-php/
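For the authentication part, a common pattern is to sign the request body with a shared secret (an HMAC). A sketch, with the URL and secret as placeholders:
<?php
// Website-1: send a signed webhook call
$secret = 'shared-secret'; // placeholder; keep it out of version control
$body   = json_encode(['event' => 'content-updated']);
$ch = curl_init('https://website-2.example/update.php');
curl_setopt_array($ch, [
    CURLOPT_POST           => true,
    CURLOPT_POSTFIELDS     => $body,
    CURLOPT_HTTPHEADER     => [
        'Content-Type: application/json',
        'X-Signature: ' . hash_hmac('sha256', $body, $secret),
    ],
    CURLOPT_RETURNTRANSFER => true,
]);
curl_exec($ch);
curl_close($ch);

<?php
// Website-2: update.php verifies the signature before doing any work
$secret = 'shared-secret';
$body   = file_get_contents('php://input');
$sig    = isset($_SERVER['HTTP_X_SIGNATURE']) ? $_SERVER['HTTP_X_SIGNATURE'] : '';
if (!hash_equals(hash_hmac('sha256', $body, $secret), $sig)) {
    http_response_code(403);
    exit;
}
// ... hand the real work off to a background worker and return immediately ...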
Call your remote server as normal. But in the PHP script you normally call, take all the functionality and put it in a third script. Then, from the old script, call the new one with (on Linux):
exec('php -f "{path to new script}.php" ' . $args . ' > /dev/null &');
The & at the end makes this a background, or non-blocking, call. Because you call it from the remote server, you don't have to change anything on the calling server. php -f runs a PHP file. > /dev/null discards the output from that file.
On Windows you can use COM and WScript.Shell to do the same thing:
$WshShell = new \COM('WScript.Shell');
$oExec = $WshShell->Run('cmd /C php {path to new script}.php', 0, false);
You may want to use escapeshellarg on the filename and any arguments supplied.
So it will look like this:
Server1 calls Server2
The script that was called (on Server2) runs exec and kicks off a background job (on Server2), then exits
Server1 continues as normal
Server2 continues the background process
So using your example instead of calling:
file_get_contents('Website-2/update.php');
You will call
file_get_contents('Website-2/update_kickstart.php');
In update_kickstart.php put this code
<?php
exec('php -f "{path}update.php" > /dev/null &');
This will run update.php as a separate background (non-blocking) call. Because it's non-blocking, update_kickstart.php will finish and return to Server1, which can go about its business while update.php runs on Server2 independently.
Simple...
The last note is that file_get_contents is a poor choice. I would use SSH, probably via phpseclib 2.0, to connect to Server2 and run the exec command directly, with a user that has access only to that file (chroot it or something similar). As it is, anyone can call that file and run it. Behind an SSH login it's protected, and chrooted, that "special" user can only run that one file.
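A sketch with phpseclib 2.0 (installed via Composer as phpseclib/phpseclib:~2.0; the host, account, and key path are illustrative):
<?php
require 'vendor/autoload.php';

// Connect to Server2 as a restricted account that can only run update.php
$key = new \phpseclib\Crypt\RSA();
$key->loadKey(file_get_contents('/home/site1/.ssh/id_rsa'));

$ssh = new \phpseclib\Net\SSH2('server2.example');
if (!$ssh->login('updater', $key)) {
    exit('SSH login failed');
}
// Kick off the update in the background and return immediately
$ssh->exec('php -f /var/www/update.php > /dev/null 2>&1 &');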

Alerting PHP when a new event occurs in Linux (Debian)

I have a daemon program that prints to the terminal when a new device is plugged in or removed. Now I want that output printed by PHP the same way it is printed in Linux, as real-time output: when a new device is plugged in, PHP is alerted without you clicking any button, and it just prints to the screen. Whatever my daemon program prints in Linux, PHP prints too.
I also have another program which scans devices but is not a daemon; I can get its output without a problem and print it in PHP.
How am I supposed to get real-time output from my daemon program in PHP?
Thanks,
The comments were becoming long, so I'm adding an answer here.
First off, redirect stderr and stdout to a file with my-daemon >> my_logfile 2>&1, unless your daemon has a log-file option.
Then you could perhaps use inotifywait with the -m flag on modify events (if you want to parse or do something outside PHP, e.g. with bash).
Inotify can give you notifications of various changes. Here, for example, are a few lines of a bash script I use to check for new files in a specific directory:
notify()
{
    ...
    inotifywait -m -e moved_to --format "%f" "$path_mon" 2>&- |
        awk '{ print $0; fflush() }' |
        while read buf; do
            printf "NEW:[file]: %s\n" "$buf" >> "$file_noti_log"
            blah blah blah
            ...
        done
}
What this does: each time a file gets moved to $path_mon, the script enters the while loop and performs various actions defined by the script.
I haven't used inotify from PHP, but this looks like perhaps what you want:
inotify_init (a separate PECL module for PHP).
Inotify can watch various events in one or several directories, or you can target a specific file. Check man inotifywait or man inotify. You would most likely want the "modify" flag, IN_MODIFY under PHP (see PHP's Inotify Constants).
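Untested here, but a minimal sketch with the PECL inotify extension would look something like this (the log path is illustrative):
<?php
// Watch the daemon's log file for modifications
$fd    = inotify_init();
$watch = inotify_add_watch($fd, '/var/log/my_logfile', IN_MODIFY);

while (true) {
    // Blocks until at least one event is available
    $events = inotify_read($fd);
    foreach ($events as $event) {
        if ($event['mask'] & IN_MODIFY) {
            // The log grew: read the newly appended lines and react
        }
    }
}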
You could also write your own in C. I haven't read this page, but IBM's pages used to be quite good: Monitor file system activity with inotify
Another option could be to use PCNTL or similar under PHP.
it will alert php without you clicking any button
So you're talking about client side PHP.
The big problem is alerting the client browser.
For short lengths of time you could ignore the problem, just disable all buffering and send the daemon output to the browser. It's neither elegant nor workable in the long run, and it has aesthetic issues. Moreover, you can't really manipulate the output client side at all, at least not easily or cleanly.
So you need to have a program running on the client, which means Javascript. The JS and the PHP programs must communicate, and PHP must also talk to the daemon, or at least monitor what it's doing.
There are ways of doing the first using Web Sockets, or maybe multipart-x-mixed-replace, but they're not very portable yet.
You could refresh the Web page but that's wasteful, and slow.
The problem of getting the notification to the client browser is then, in my opinion, best solved with an AJAX poll. You don't get an immediate alert, but you do get alerted within seconds.
You would send a query to PHP from AJAX every, say, 10 seconds (10000 ms)
function poll_devices() {
    $.ajax({
        url: '/json/poll-devices.php',
        dataType: 'json',
        error: function (data) {
            // ...
        },
        success: function (packet) {
            setTimeout(function () { poll_devices(); }, 10000);
            // Display packet information
        },
        contentType: 'application/json'
    });
}
and the PHP would check the accumulating log and send the situation.
Another possibility is to have the PHP script block for up to 20 seconds, not enough to make AJAX time out and give up, and return immediately in case of changes. You would then employ an asynchronous AJAX function to drive the polls back-to-back.
This way, the asynchronous function starts and immediately goes to sleep while the PHP script is sleeping too. After 20 seconds the call returns and is immediately re-issued, sleeping again.
The net effect is that one connection is kept constantly open, and changes are echoed back to client-side Javascript immediately. You have to manage connection interruptions, though. But this way you only issue one call every 20 seconds and still manage to be alerted almost instantly.
Server-side, PHP can check the log file's size at the start (the last read position being saved in the session), keep the file open read-only in shared mode, and block on reads with fgets(), if the daemon allows it.
Or you could pipe the daemon to logger and get the messages into syslog. Configure syslog to send those messages to a specific unbuffered file readable by PHP. Then PHP can do everything with fopen(), ftell() and fgets(), without requiring additional notification systems.
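As a sketch of that last approach, /json/poll-devices.php could remember its last read offset in the session and block for up to 20 seconds waiting for new lines (the log path is illustrative):
<?php
session_start();
$log = '/var/log/devices.log'; // the unbuffered file syslog writes to
$pos = isset($_SESSION['log_pos']) ? $_SESSION['log_pos'] : 0;

$fh = fopen($log, 'r');
fseek($fh, $pos);

$lines    = array();
$deadline = time() + 20; // short of the AJAX timeout
while (time() < $deadline && !$lines) {
    while (($line = fgets($fh)) !== false) {
        $lines[] = rtrim($line, "\n");
    }
    if (!$lines) {
        sleep(1);
        fseek($fh, ftell($fh)); // clear EOF so fgets() sees newly appended data
    }
}

$_SESSION['log_pos'] = ftell($fh);
header('Content-Type: application/json');
echo json_encode($lines);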

How to run shell script with live feedback from PHP?

How would I execute a shell script from PHP while giving constant/live feedback to the browser?
I understand from the system function documentation:
The system() call also tries to automatically flush the web server's
output buffer after each line of output if PHP is running as a server
module.
I'm not clear on what they mean by running it as a 'server module'.
Example PHP code:
<?php
system('/var/lib/script_test.sh');
Example shell code:
#!/bin/bash
echo "Start..."
for i in {1..10}
do
echo "$i..."
sleep 1
done
echo "Done."
What this does: It will wait about 10 seconds and then flush to the output buffer.
What I want this to do: Flush to the output buffer after each line of output.
This can be done using popen(), which gives you a handle to the stdout of whatever process you open. Chunks of data can be sent to the client using ob_flush(), and the data can be displayed using an XHR.
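A minimal sketch of that approach, reusing the script path from the question:
<?php
header('Content-Type: text/plain');
while (ob_get_level() > 0) { // drop PHP's own output buffers
    ob_end_flush();
}

$handle = popen('/var/lib/script_test.sh 2>&1', 'r');
while (($line = fgets($handle)) !== false) {
    echo $line;
    flush(); // push each line to the client as it arrives
}
pclose($handle);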
One option is to have the shell script write to a file at each step, saying where it's up to. On your web page, make an AJAX call every X seconds/minutes; the AJAX call hits a PHP script which reads the status file and returns the status or completed steps (a sketch follows below).
The advantage of this approach is that the live information is available to multiple visitors, rather than just the one who actually initiated the shell script. Obviously that may or may not be desirable, depending on your needs.
The disadvantage, of course, is that the longer the AJAX interval, the more out of date the update will be.
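The PHP side of this can be very small. A sketch, assuming the shell script writes its progress to /tmp/script_status (both the path and the endpoint name are illustrative):
<?php
// status.php - polled by the AJAX call
header('Content-Type: text/plain');
$file = '/tmp/script_status';
echo is_file($file) ? file_get_contents($file) : 'not started';
The shell script would do something like echo "step $i of 10" > /tmp/script_status inside its loop.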

Not Waiting for Response from an AJAX Request

Suppose I make an AJAX HTTP Request from jQuery to a backend PHP script. The request is made, the PHP script starts running and doing its magic. Suppose I then change to another website, away from the site where the original AJAX Request was made. As well, I do this before the PHP script finishes and has time to do a HTTP Response back. Does the PHP script finish running and doing its thing even though I've switched to another website before I got the HTTP Response?
So the order is this.
I'm on website www.xyz.com
I have a jQuery handler that kicks off an AJAX request to blah.php
blah.php starts running
I go to website www.abc.com soon after without waiting for a response from blah.php
What's going on with blah.php? Is execution still going on? Did it stop? I mean it didn't get a chance to respond so...
This may depend on your server configuration, but in general the script will continue to execute despite a closed HTTP connection.
I have tested this with Apache 2 + PHP 5 as mod_php. I would expect similar behaviour with PHP as CGI and with other webservers but do not know for certain.
The best way to determine for certain on your configuration is, as @tdammers suggests: set up a test script something like the following and monitor the log.
<?php
error_log('Test script started.');
for ($i = 1; $i < 13; $i++) {
    sleep(10);
    error_log('Test script got to ' . (10 * $i) . ' seconds.');
}
error_log('Test script got to the end.');
?>
Access this script (at /test.php or whatever) then before you get any results, hit stop on your browser. This is equivalent to navigating away before your XHR returns. You could even have it as the target of an XHR and navigate away.
Then check your error log: you should have a start and then messages every 10 seconds for two minutes and an end. You can modify how high $i gets to ensure your script will reach its anticipated maximum execution time if you'd like to test that too.
You don't have to use error_log() - you could write to a file, or make some other persistent change on the server that can be checked without needing to keep the client connection open.
Script execution may stop before then because of the max_execution_time php.ini directive, but in any case that should be distinct from when the webserver times out.
Try ignore_user_abort(true);
ignore_user_abort(true);
It should stop PHP from aborting the processing of your code when the client disconnects.
You might want to check out the answers to This Question.
Basically, when you make your AJAX call to a PHP script which calls the exec() function as shown in the answers to that question, you'll get an AJAX response almost immediately, since your PHP script doesn't actually need to process anything. This way, it shouldn't matter if the user leaves the page.
Here's a small example:
ajax call in html file: $.ajax({url: 'blah.php'});
blah.php file: exec('bash -c "exec nohup setsid php really_slow_script.php > /dev/null 2>&1 &"');
And then finally in really_slow_script.php, just include the actual code you want to run.
I successfully used this kind of logic to allow users to post an already uploaded video from their account on my website to YouTube. (The video had to be sent to YouTube, and since videos are generally large files, I didn't want the user to have to wait while the video was being uploaded.)
Navigating away will trigger a disconnect message on the server. The implications of that depend entirely on what your server has been configured to do.
By default, the server will be set up so that a disconnect will not interrupt the way that the program functions. It is possible, however, to make it so that a user disconnect will trigger the function which has been registered with register_shutdown_function, garbage collection will occur, and the script will terminate.
Because it is something which can be configured several different places, it might be easiest to just run a test, but this is a php.ini directive. If you want to configure this on a global level, you can set ignore_user_abort = Off in php.ini. If you want this on a site-specific level, you can use php_value ignore_user_abort off in the htaccess in the parent directory of the current site. Otherwise you can use ignore_user_abort(false);.
Of course, there is no guarantee on a shared server that you have control of htaccess or php.ini, so you might just need to use ignore_user_abort(false);.
