PHP long script timeout with AJAX call

I'm building a website for my office that dynamically loads tables produced by a long-running PHP script. To do that, I've created a PHP file that handles AJAX GET requests (it echoes the JSON produced by the long script).
When I use my code on my local machine with a WAMP server, everything works; of course, I had to add the line ini_set('max_execution_time', 5000); to avoid the timeout issue.
My problem is that when I put the project on a real virtual machine running FreeBSD, it doesn't work: after 600 seconds the server kills my request (Fatal error: Uncaught exception 'HttpException' with message 'An error occurred : [503]').
But I noticed that if I run the PHP file directly from the command line, everything works and the script echoes my JSON.
The script essentially consists of foreach processing loops and ends by echoing the JSON.
So my question is: how can I manage this?
I was thinking about doing something like this:
call script.php from AJAX via the command line
split script.php into smaller pieces to avoid the timeout
(A scheme of the current process was attached here.)

Related

Error: 504 Gateway Time-out [duplicate]

I'm currently running an Apache server (2.2) on my local machine (Windows) which I'm using to run some PHP scripts that take care of some tedious work. One of the scripts involves a ton of moving, resizing, and downloading/uploading files to another server. I would very much like the script to run constantly so that I don't have to babysit it by starting it up again every time it times out.
set_time_limit(0);
ignore_user_abort(1);
Both are set in my script, but after about 30 minutes to an hour the script stops and I get the 504 Gateway Time-out message in my browser. Is there something I'm missing in Apache or PHP to prevent the timeout? Or should I be running the script a different way?
Or should I be running the script a different way?
Definitely. You should run your script from the command line (CLI).
If I had to implement something like this, I would use two different scripts:
A. process_controller.php
B. process.php
The workflow should be:
the user calls script A from a browser
script A starts script B using system() or exec(), passing it a "process token" via the command line
script B writes its execution status into a shared space: a file named after the token, or a database table; in general, something that script A can also read, using the token as a reference
the page served by script A contains an AJAX call, in polling, that asks script A for the status of the process for a given token (a minimal PHP sketch of this workflow follows the list)
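Here is a minimal sketch of that workflow; the file names, the /tmp status-file location, and the status strings are assumptions for illustration, not prescribed by the answer:
<?php
// process_controller.php - script A (sketch)
session_start();
$token = session_id(); // see the token options below

if (isset($_GET['action']) && $_GET['action'] === 'getStatus') {
    // report whatever status script B last wrote for this token
    $statusFile = '/tmp/' . $token . '.status';
    echo file_exists($statusFile) ? file_get_contents($statusFile) : 'not started';
    exit;
}

// start script B in the background; redirecting output and appending &
// lets exec() return immediately instead of waiting for the long job
exec('php process.php ' . escapeshellarg($token) . ' > /dev/null 2>&1 &');
echo 'started';

<?php
// process.php - script B (sketch)
$token = $argv[1];
$statusFile = '/tmp/' . $token . '.status';
file_put_contents($statusFile, 'running');
// ... the long-running work (e.g. the foreach loops) goes here ...
file_put_contents($statusFile, 'done');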
Ajax polling:
<script>
var myToken; // must be set to the token returned when the process was started
function ajaxPolling()
{
    $.get('process_controller.php?action=getStatus&token=' + myToken, function(data) {
        $('.result').html(data);
    });
}
setInterval(ajaxPolling, 60 * 1000); // every minute
</script>
There are some considerations about the communication between the two processes, depending on how many instances of script B you want to be able to run in parallel:
Just one: you don't need a random/unique token
One per user: session_start(); $token = session_id();
More than one per user: session_start(); $token = session_id().microtime();
If you need to run it from your browser, you should make sure that there is no PHP execution limit in php.ini, and also that no limit is set in mod_php (or whatever handler you are using) under Apache.
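For reference, a sketch of the two places such limits usually live, assuming a stock php.ini and Apache httpd.conf; the values are illustrative:
; php.ini - 0 removes PHP's own execution limit
max_execution_time = 0

# httpd.conf - Apache's request timeout (in seconds) can still cut the connection
Timeout 3600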
Use php's system() to call a shell script which starts a service/background task.
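A minimal sketch of that last suggestion; the script path is hypothetical, and the output redirect plus the trailing & are what let system() return without waiting:
<?php
// launch a detached background task; without the redirect and &,
// system() would block until the shell script finished
system('nohup /path/to/background-task.sh > /dev/null 2>&1 &');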

NGINX 403 forbidden with php

My server is set up with NGINX and PHP.
I need to run a PHP script that connects to two databases using mysql_connect($host, $user, $pwd). It's a rather long process and it takes a lot of time to complete.
All connections succeed, but the script never returns the result of the complete operation: when I run the file, the browser shows its loading spinner, and after some time execution stops with the message "403 Forbidden".
I have already set the following:
set_time_limit(0);
ini_set('mysql.connect_timeout', '0');
ini_set('max_execution_time', '0');
Is there any solution for this?
A 403 error means that your nginx is not configured correctly.
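As a point of comparison, a minimal PHP location block for nginx might look like the sketch below; the PHP-FPM socket path is an assumption, and fastcgi_read_timeout is the directive that usually cuts off long-running requests:
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass unix:/var/run/php-fpm.sock;
    # raise this for long-running scripts (the default is 60 seconds)
    fastcgi_read_timeout 3600;
}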
If you need to execute a long-running PHP script, it is better to use the CLI for this, for example:
php script.php
or, to keep it running after your terminal closes (stderr folded into stdout, process sent to the background):
nohup php script.php 2>&1 &
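If the job still has to be triggered from the web, here is a hedged sketch of a launcher that mirrors the nohup line above (the script and log paths are assumptions):
<?php
// kick off the long job detached from the web request,
// so nginx never has to hold the connection open
exec('nohup php /path/to/script.php > /tmp/script.log 2>&1 &');
echo 'job started';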

PHP hosted on IIS gives 500 on every alternate call

I have a PHP script that executes a batch (.bat) file using the passthru() function. The output of the batch file is printed using an echo statement.
This PHP script works absolutely fine when hosted on an Apache web server; however, the same script produces a 500.0 error on every alternate call when hosted on IIS 7.5.
I did some research and found out that if a PHP script takes a long time to execute, the browser becomes unresponsive.
Hence, I edited the PHP script to write lines like "Before executing batch file" and "After executing batch file" to a file.
Even while the 500.0 error was displayed, the file was still being updated with those lines. This shows that the script keeps executing while the browser displays the 500.0 error.
Is there any setting that can be tweaked in IIS?
This problem occurs only on IIS 7.5. When I use Apache, it works like a charm.
I've had the exact same problem as you; executing a batch file via exec(), shell_exec(), etc., would result in an internal 500 server error every other time I refreshed the page.
I resolved this by removing all PAUSE commands from the batch file.
Make sure you don't have any breaks in the flow of the batch file. That is, if user input is required at any point during the execution of the batch script, PHP will hang and the server will time out.
Hope this helps!
(I'd comment but I don't have 50 reputation)
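To illustrate, a hedged sketch; the batch file path is hypothetical, and the point is that the batch file must contain no PAUSE, CHOICE, or SET /P lines that wait for input:
<?php
// passthru() prints the batch file's output directly;
// 2>&1 folds errors into that output so nothing blocks unseen
passthru('C:\jobs\task.bat 2>&1');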

Running console program in PHP

I wrote a simple PHP code to execute a console program:
<?php
$cmd = escapeshellcmd('progName.exe arg1 arg2 arg3');
exec($cmd);
?>
If I run the command in a console directly on the server, it works. However, when I run the PHP script in the browser, it doesn't work. The progName.exe process is running (checked using Task Manager on the server), but it never finishes. This program is supposed to compute some parameters from the arguments and write the result to a binary file, as well as produce a .WAV file. Here is the error message I get in the browser:
Error Summary
HTTP Error 500.0 - Internal Server Error
C:\php\php-cgi.exe - The FastCGI process exceeded configured activity timeout
Detailed Error Information
Module: FastCgiModule
Notification: ExecuteRequestHandler
Handler: PHP
Error Code: 0x80070102
Then I wrote a simple console program that writes a sentence to a text file (writeTxt.exe hello.txt). Using the same PHP script, I ran it in the browser and it works.
I have already tried increasing the timeout on the server, but I still get the same error.
What could cause this problem?
When you execute a program in PHP using the exec function (e.g. exec('dir')), PHP waits until the program has ended; if you send it to the background instead, PHP returns immediately (see the documentation, especially the comments).
According to your posted PHP source ($cmd = escapeshellcmd('progName.exe arg1 arg2 arg3');), the program is not sent to the background by PHP - so what remains is that progName.exe...
...sends itself or a fork to the background (unlikely, but look into the sources of progName.exe)
...is waiting for input (<-- this is my favorite)
...or I missed something ;-)
As I said, I bet it is the second option. Hope that helped a bit.
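A hedged sketch of both common fixes, reusing the command from the question; the < NUL redirect closes stdin so the program cannot block waiting for input, and start /B is the usual Windows way to detach it:
<?php
$cmd = escapeshellcmd('progName.exe arg1 arg2 arg3');

// fix 1: give the program an empty stdin so any read returns immediately
exec($cmd . ' < NUL');

// fix 2: run it in the background so PHP does not wait at all
pclose(popen('start /B ' . $cmd, 'r'));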
