I have an 80MB zip file that I am uploading to 'ShareFile' through their REST API using PHP cURL. The script works fine when run via the browser, or on my local Windows machine using the PHP 7 CLI. However, when I attempt to run the script via a cron job on our client's shared hosting using /usr/local/bin/ea-php71, the script suddenly stops running mid-upload.
I have set my email address to be notified of any and all cron output.
I have an error handler registered with set_error_handler that echoes any received errors and attempts to log them to a file (this never gets called).
I have a register_shutdown_function callback that also never gets called.
Finally, I have a CURLOPT_PROGRESSFUNCTION callback (set via curl_setopt) to see what is happening with the upload; it writes the total upload bytes $uploadTotal and the current uploaded bytes $uploadCurrent to a log along with a timestamp.
(Edit) Forgot to mention I've also got verbose output enabled (CURLOPT_VERBOSE) and am sending stderr to a log file, but nothing relevant shows up there.
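For reference, the diagnostics above are wired up roughly like this (a sketch, not the actual script; the endpoint and log paths are placeholders, and the callback signature assumes PHP 5.5+ where the first parameter is the cURL handle):

<?php
// Sketch of the diagnostic cURL options described above (URL and paths are placeholders).
$verboseLog = fopen('/path/to/curl_verbose.log', 'a');

$ch = curl_init('https://example.sharefile.com/upload');        // hypothetical endpoint
curl_setopt($ch, CURLOPT_VERBOSE, true);
curl_setopt($ch, CURLOPT_STDERR, $verboseLog);                   // libcurl verbose output goes here
curl_setopt($ch, CURLOPT_NOPROGRESS, false);                     // required for the progress callback to fire
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, function ($ch, $downloadTotal, $downloaded, $uploadTotal, $uploadCurrent) {
    // Log a timestamped progress line on every callback tick.
    file_put_contents(
        '/path/to/progress.log',
        date('c') . " upload progress: [UploadTotal: $uploadTotal] [UploadCurrent: $uploadCurrent]\n",
        FILE_APPEND
    );
    return 0; // returning a non-zero value would abort the transfer
});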
Item 4 (the progress callback) has given me the most insight into the issue: when the script is run as a cron job, the upload stops around
upload progress: [UploadTotal: 87455948] [UploadCurrent: 34913772]
every time (roughly 67 seconds in).
Some final details ...
When uploading the 80MB file through cron, my error handler and shutdown function never get called, nor do I receive the cron email with any echoed output. The script just seems to PAUSE forever (it's almost as if the script were waiting for user input or something).
However:
If I run the script locally, or change the upload to a 5MB file, everything works as normal: all functions get called and I receive an email notifying me that the script has finished running.
It looks to me like I'm hitting some kind of timeout through cron, which, per everything I've read, should not be the case. I'm looking for ideas on what I could be running into, or how to troubleshoot this one.
Thank you!
Related
I'm running a script which reads a lot of data from the database. The script itself gets triggered by another script (to ensure it will not run twice or more at the same time). Unfortunately I keep running into a server timeout. I tried to solve it with
set_time_limit(0);
and
ini_set('max_execution_time', 0);
but that didn't do the trick.
What the script does is generate a CSV file, which should then be downloaded by the user (via the browser).
The file itself has to be created just in time, so I cannot create it overnight and push it, or anything like that.
Is there something like a best practice for this?
Since I cannot change the generating script itself, how can I ensure the file gets generated and the user gets informed that the file is ready for download? (I can't use mail.)
Thank you so much in advance.
I have a PHP script that executes a batch (.bat) file using the passthru() function. The output of the batch file is printed via an echo statement.
This PHP script works absolutely fine when hosted on an Apache web server; however, the same script produces a 500.0 error on every alternate call when hosted on IIS 7.5.
I did some research and found out that if a PHP script takes a long time to execute, the browser becomes unresponsive.
Hence, I edited the PHP script to write lines like "Before executing batch file" and "After executing batch file" into a file.
Even while the 500.0 error was displayed, the file was still being updated with those lines. This shows that the script keeps executing even though the browser displays the 500.0 error.
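For illustration, that marker logging wraps the passthru() call roughly like this (a sketch; the log and batch file paths are placeholders):

<?php
// Sketch of the "before/after" marker logging described above (paths are placeholders).
file_put_contents('C:\temp\script.log', date('c') . " Before executing batch file\n", FILE_APPEND);
passthru('C:\scripts\my_batch_file.bat', $exitCode);   // hypothetical batch file
file_put_contents('C:\temp\script.log', date('c') . " After executing batch file (exit code $exitCode)\n", FILE_APPEND);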
Are there any settings that can be tweaked in IIS?
This problem occurs only on IIS 7.5; with Apache it works like a charm.
I've had the exact same problem as you: executing a batch file via exec(), shell_exec(), etc., would result in a 500 internal server error every other time I refreshed the page.
I resolved this by removing all PAUSE commands from the batch file.
Make sure you don't have any breaks in the flow of the batch file. That is, if user input is required at any point during execution of the batch script, PHP will hang and the server will time out.
Hope this helps!
(I'd comment but I don't have 50 reputation)
Suppose I make an AJAX HTTP request from jQuery to a backend PHP script. The request is made, and the PHP script starts running and doing its magic. Suppose I then move to another website, away from the site where the original AJAX request was made, and I do this before the PHP script finishes and has time to send an HTTP response back. Does the PHP script finish running and doing its thing even though I've switched to another website before I got the HTTP response?
So the order is this.
I'm on website www.xyz.com
I have a jQuery handler that kicks off an AJAX request to blah.php
blah.php starts running
I go to website www.abc.com soon after without waiting for a response from blah.php
What's going on with blah.php? Is execution still going on? Did it stop? I mean it didn't get a chance to respond so...
This may depend on your server configuration, but in general the script will continue to execute despite a closed HTTP connection.
I have tested this with Apache 2 + PHP 5 as mod_php. I would expect similar behaviour with PHP as CGI and with other webservers but do not know for certain.
The best way to determine this for certain on your configuration is, as @tdammers suggests, to set up a test script something like the following and monitor the log.
<?php
// Log progress every 10 seconds for two minutes so you can see how far the script got.
error_log('Test script started.');
for ($i = 1; $i < 13; $i++) {
    sleep(10);
    error_log('Test script got to ' . (10 * $i) . ' seconds.');
}
error_log('Test script got to the end.');
?>
Access this script (at /test.php or whatever) then before you get any results, hit stop on your browser. This is equivalent to navigating away before your XHR returns. You could even have it as the target of an XHR and navigate away.
Then check your error log: you should have a start and then messages every 10 seconds for two minutes and an end. You can modify how high $i gets to ensure your script will reach its anticipated maximum execution time if you'd like to test that too.
You don't have to use error_log() - you could write to a file, or make some other persistent change on the server that can be checked without needing to keep the client connection open.
The script execution may stop before then because of the max_execution_time php.ini directive, but in any case this should be distinct from when the webserver times out.
Try ignore_user_abort(true);
ignore_user_abort(true);
With that set, PHP should not abort processing of your code when the client disconnects.
You might want to check out the answers to This Question.
Basically, when you make your AJAX call to a PHP script which calls the exec() function as shown in the answers to that question, you'll get an AJAX response almost immediately, since your PHP script doesn't actually need to process anything itself. This way, it shouldn't matter if the user leaves the page.
Here's a small example:
ajax call in html file: $.ajax({url: 'blah.php'});
blah.php file: exec('bash -c "exec nohup setsid php really_slow_script.php > /dev/null 2>&1 &"');
And then finally in really_slow_script.php, just include the actual code you want to run.
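Purely for illustration, really_slow_script.php could be as minimal as this (the work and log path below are made up):

<?php
// really_slow_script.php - runs detached from the web request, so it can take as long as it needs.
// Everything here is illustrative; put your real long-running work in its place.
file_put_contents('/tmp/slow_job.log', date('c') . " job started\n", FILE_APPEND);

sleep(300);   // stand-in for the actual slow work (e.g. pushing a large file to a remote API)

file_put_contents('/tmp/slow_job.log', date('c') . " job finished\n", FILE_APPEND);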
I successfully used this kind of logic to allow users to post an already uploaded video from their account on my website to YouTube. (The video had to be sent to YouTube, and since videos are generally large files, I didn't want the user to have to wait while the upload completed.)
Navigating away will trigger a disconnect message on the server. The implications of that depend entirely on what your server has been configured to do.
By default, the server will be set up so that a disconnect will not interrupt the way that the program functions. It is possible, however, to make it so that a user disconnect will trigger the function which has been registered with register_shutdown_function, garbage collection will occur, and the script will terminate.
Because it is something which can be configured in several different places, it might be easiest to just run a test, but this is a php.ini directive. If you want to configure this at a global level, you can set ignore_user_abort = Off in php.ini. If you want this at a site-specific level, you can use php_value ignore_user_abort off in the .htaccess in the parent directory of the current site. Otherwise you can use ignore_user_abort(false);.
Of course, there is no guarantee on a shared server that you have control of htaccess or php.ini, so you might just need to use ignore_user_abort(false);.
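A quick test along those lines might look like the following (a sketch; the timings and log wording are arbitrary):

<?php
// Sketch: check whether a client disconnect terminates the script on this server.
ignore_user_abort(false);   // the default: allow a disconnect to abort the script

register_shutdown_function(function () {
    error_log('Shutdown function ran; connection_aborted = ' . connection_aborted());
});

for ($i = 1; $i <= 12; $i++) {
    sleep(5);
    echo '.';   // PHP only notices the disconnect when it tries to send output
    flush();
    error_log('Still running after ' . (5 * $i) . ' seconds');
}
error_log('Reached the end of the script.');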
I have developed an eblast (email blast) application.
The program is used to send an email to a list of recipients.
The recipients' email addresses are grabbed from an xls file,
and the program is set to send 10 emails at a time and then sleep for 30 seconds.
It uses ob_flush() and flush() to output the stream of progress and display it in the frontend, roughly as sketched below.
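Here's roughly what that sending loop looks like (a sketch only; the xls helper, mailer, and batch size are assumptions about the application, not its actual code):

<?php
// Sketch of the batched sending loop described above. load_recipients_from_xls()
// and the plain mail() call are stand-ins for whatever the real application uses.
$recipients = load_recipients_from_xls('recipients.xls');   // hypothetical helper

foreach (array_chunk($recipients, 10) as $batchNumber => $batch) {
    foreach ($batch as $address) {
        mail($address, 'Subject', 'Message body');
    }

    // Stream progress to the browser so the client can watch it in the frontend.
    echo 'Sent batch ' . ($batchNumber + 1) . ' (' . (($batchNumber + 1) * 10) . " emails so far)<br>\n";
    ob_flush();
    flush();

    sleep(30);
}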
Yesterday my client tested it with 9000 recipients (it should take around 10 hours),
and he told me the program had stopped. I found the log file showed the program stopping at 65XX emails,
meaning the program had already sent 6XXX emails (around 7 hours in).
This problem never happens with the cron job, but only when executed through the web browser.
My friend told me it is all because of the long sleeps?
He suggested using a cron job; however, my application already has a cron job set up,
and the client just wants a feature to send the emails immediately.
Is there any other solution? Should PHP call a Linux command that executes a separate PHP email-sending script?
Long-running processes in Apache or IIS are tricky. The problem is that if anything happens, like a restart of the webserver or a timeout, you lose your work. You are better off keeping this simple and making it a cron job, but if you are up for the challenge it is possible to work around.
I've gotten around occasional webserver restarts by saving the state of my process into a database, along with a script that continually hits the page to check that it's up and working. When the long-running process first loads, it checks whether it should be running and whether it should continue an existing job. In your case the saved state might be the line number in the Excel file, as in the sketch below.
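A minimal sketch of that checkpoint idea, assuming a database table and helper functions that are entirely made up here:

<?php
// Sketch: remember the last processed row so a restarted run can pick up where it
// left off. The table, columns, and helpers below are illustrative, not real code.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

// Where did the previous run stop?
$lastRow = (int) $pdo->query("SELECT last_row FROM job_state WHERE job = 'eblast'")->fetchColumn();

$rows = load_rows_from_xls('recipients.xls');    // hypothetical helper
for ($i = $lastRow + 1; $i < count($rows); $i++) {
    send_one_email($rows[$i]);                   // hypothetical helper

    // Persist progress after every row, so a webserver restart loses at most one row.
    $stmt = $pdo->prepare("UPDATE job_state SET last_row = ? WHERE job = 'eblast'");
    $stmt->execute([$i]);
}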
It ends up being a lot of extra work, and you need to be very careful. From the sounds of your project, I would keep it simple and go the cron job route you mentioned.
My solution is to set your cron job to run every minute.
However, you should save the state of your cron job so that it doesn't run twice at the same time.
I usually do it this way (Note that this cron is intended to run every minute):
<?php
// Lock-file guard: if a previous run is still in progress, cron.stat exists and we bail out.
// $rootdir is assumed to be defined elsewhere in the application.
if (stat_check_file('cron.stat')) {
    die("Found CRON.STAT, Exit!");
} else {
    stat_create_stat_file('cron.stat');
    //do your long process here...
}
stat_delete_stat_file('cron.stat');

function stat_check_file($filename)
{
    global $rootdir;
    return file_exists($rootdir . '/' . $filename);
}

function stat_create_stat_file($filename)
{
    global $rootdir;
    touch($rootdir . '/' . $filename);
}

function stat_delete_stat_file($filename)
{
    global $rootdir;
    if (stat_check_file($filename)) {
        @unlink($rootdir . '/' . $filename);
    }
}
Now, in your cron job, simply load the xls, process it, and write a log to either a database or a file.
Then, in your panel, read that log and display it so that your client can see, at any moment, that xxx emails have been sent and xxx emails are still to go; something like the sketch below.
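A very rough illustration of that logging and read-back (the file name and line format are made up):

<?php
// In the cron job: append one progress line per batch.
file_put_contents('eblast_progress.log', date('c') . " sent=6510 total=9000\n", FILE_APPEND);

// In the panel: read the last line and show current progress.
$lines = file('eblast_progress.log', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$last  = end($lines);
if (preg_match('/sent=(\d+) total=(\d+)/', $last, $m)) {
    echo $m[1] . ' emails sent, ' . ($m[2] - $m[1]) . ' to go';
}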
I have a PHP script that calls a .bat file using system(). The output is written to the screen, and I derive some values from parsing this output. This is running on a Windows 2003 IIS server, PHP v5.2.0.
Specifically I am using this script to launch an Amazon EC2 instance and assign an IP address to it. It has worked great for me so far but recently the problem started.
Here is the code
$resultBatTemp = system("cmd /C C:\Inetpub\ec2\my_batch_file_to_launch_instance.bat");
$resultBat = (string)$resultBatTemp;
$instanceId = substr($resultBat, 9, 10);   // pull the instance id out of the batch output
...
Once I have this instance ID, I can run another batch file that associates an IP address with the instance. It would appear that the instance does get launched, but I never get the output on the screen.
For some reason this has all stopped working: the page freezes and never refreshes. I also need to completely exit Safari or Mozilla, otherwise all pages from the website fail to load; only when I relaunch the browser can I view the website again. I've connected to the webserver that hosts these scripts and checked the PHP error log, but nothing shows there. I've opened a DOS prompt and entered the commands from the bat file that way, and it connects to Amazon and launches the instance fine. I've isolated this bit of code and removed the system() command, and the rest of the script runs fine, so it appears that the hold-up is with outputting the results of the bat file.
Recently I purchased a new domain name for the site, so this script is now running from that domain. Might this be causing the problem?
thanks
------------------------------------------------UPDATE-----------------------------------------------
Well, hope this helps someone. I didn't find out what was wrong, but I created a new PHP file with a simple system() command that called a .bat file, and then a non-existent .bat file, expecting to get an error back, but nothing; just the usual hang for ages. So I restarted IIS and that fixed the problem. Don't know what was wrong, but that did the trick.
Maybe first check what the system() call returns; according to the documentation, it returns FALSE in case of failure. Also, including your my_batch_file_to_launch_instance.bat in the question might help in solving it.
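For example, something along these lines (a sketch reusing the command from the question):

<?php
// Sketch: detect whether system() itself failed. It returns false on failure,
// otherwise the last line of the command's output.
$lastLine = system("cmd /C C:\Inetpub\ec2\my_batch_file_to_launch_instance.bat", $exitCode);
if ($lastLine === false) {
    error_log('system() failed to run the batch file');
} else {
    error_log("Batch finished with exit code $exitCode, last line: $lastLine");
}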
Try using the passthru function
Also make sure that all your commands are safe: use escapeshellarg() or escapeshellcmd() to ensure that users cannot trick the system into executing arbitrary commands.
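For instance, any user-supplied piece of the command could be escaped like this (a sketch; the batch file and parameter are hypothetical):

<?php
// Sketch: escape untrusted input before handing it to passthru(), so it cannot be
// used to inject extra shell commands. The batch file and parameter are made up.
$instanceId = isset($_GET['instance_id']) ? $_GET['instance_id'] : '';
$cmd = 'C:\Inetpub\ec2\associate_ip.bat ' . escapeshellarg($instanceId);   // hypothetical batch file
passthru($cmd, $exitCode);
echo "passthru exit code: $exitCode";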