Weird PHP error: exec() hangs sometimes on simple script - php

Heyas,
This simple exec() script, which generates a PDF file from a webpage using wkhtmltopdf, runs fine the first two times.
It first deletes the existing file, then creates the new PDF file in its place. If I run the script a second time, it deletes the file again and creates a new one, as expected. However, if I run it one more time, it deletes the file and creates a new one, but then the script seems to hang until the 30-second 504 timeout error is given. When it works, the script only takes about 3 seconds to run and return. The hang also takes down the entire server (other local PHP sites no longer respond). If I restart the PHP server, everything still hangs (no success). Interestingly, if I run the script once and then restart the PHP server, I can keep doing that without issue (but it only ever generates the PDF up to two times that way). No PHP errors are logged.
Why would it be stalling out subsequent times?
$filePath = 'C:\\wtserver\\tmp\\order_' . $orderId . '.pdf';

// delete an existing file
if (file_exists($filePath)) {
    if (!unlink($filePath)) {
        echo 'Error deleting existing file: ' . $filePath;
        return;
    }
}

// generates PDF file at C:\wtserver\tmp\order_ID.pdf
exec('wkhtmltopdf http://google.com ' . $filePath);
I've tried a simple loop that checks for the command's completion (the output file existing) and then exits, but it still hangs:
while (true) {
    if (file_exists($filePath)) {
        echo 'exit';
        exit(); // have also tried die()
        break;
    }
    // todo: add time check/don't hang
}
If I can't figure this bit out, is there a way to kill the exec'd command for now, by wrapping it somehow? The PDF still gets generated, so the command itself works, but I need to kill the hang and return a response to the user.
Solution:
You have to redirect standard output AND standard error so the process ends immediately, i.e. on Windows:
exec('wkhtmltopdf http://google.com ' . $filePath . ' > NUL 2> NUL');
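For reference, a minimal sketch of the whole flow with that fix applied; the escapeshellarg() call is an extra precaution I've added, it is not part of the original snippet:
$filePath = 'C:\\wtserver\\tmp\\order_' . $orderId . '.pdf';

// remove any previous PDF for this order
if (file_exists($filePath) && !unlink($filePath)) {
    echo 'Error deleting existing file: ' . $filePath;
    return;
}

// redirect stdout and stderr to NUL so exec() returns as soon as wkhtmltopdf exits
exec('wkhtmltopdf http://google.com ' . escapeshellarg($filePath) . ' > NUL 2> NUL');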

Do you know that you can run the executable in the background, like this?
exec($cmd . " > /dev/null &");
This way exec() returns immediately and you can carry on.
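If you go the background route, you still need to know when the PDF is ready before responding to the user. A rough sketch of a bounded polling loop, assuming a Unix-like host as in the snippet above; the 30-second deadline and 250 ms sleep are arbitrary choices of mine:
// start wkhtmltopdf in the background, discarding its output
exec('wkhtmltopdf http://google.com ' . escapeshellarg($filePath) . ' > /dev/null 2>&1 &');

$deadline = time() + 30;       // give up after 30 seconds instead of hanging
while (!file_exists($filePath)) {
    if (time() > $deadline) {
        echo 'Timed out waiting for the PDF';
        return;
    }
    usleep(250000);            // check again in 250 ms
}
// note: the file can exist before wkhtmltopdf has finished writing it,
// so a production version should also wait for the size to stop growing
echo 'PDF ready: ' . $filePath;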

Related

php large file download timeout

First time posting, so sorry if I get anything wrong.
I'm trying to create a secure file download storefront. It actually works, but only with small files. I have a 1.9 GB product to download and it keeps stopping partway through the transfer, at inconsistent sizes too: I've had up to 1 GB, but often it's 200-500 MB.
The aim is to create a space where only users with a registered account can download the file, so a direct link is not possible.
I've read elsewhere on this site that resetting the script timeout within the file read loop should get around the script time limit.
try
{
    $num_bytes = filesize("products/" . $filename);
    $mp3content = fopen("products/" . $filename, "rb") or die("Couldn't get handle");
    $bytes_read = 0;
    if ($mp3content) {
        while (!feof($mp3content)) {
            set_time_limit(30); // reset the timeout on every chunk
            $buffer = fread($mp3content, 4096);
            echo $buffer;
            $bytes_read += strlen($buffer); // count the bytes actually read
        }
        fclose($mp3content);
    }
}
catch (Exception $e)
{
    error_log("User failed to download file: " . $row['FILENAME'] . "(" . $row['MIMETYPE'] . ")\n" . $e, 1, getErrorEmail());
}
error_log("Bytes downloaded:" . $bytes_read . " of " . $num_bytes, 1, getErrorEmail());
I don't receive the final error log email on large files that fail, but I do get the emails on smaller files that succeed, so I know the code works in principle.
Turns out my hosting is the issue. The PHP code is correct, but my shared hosting environment limits all PHP scripts to 30 seconds, whereas the code above takes about 15 minutes to run its course. Unless someone can come up with a way of keeping PHP tied up in file-handling calls that don't count toward the timer, it looks like this one is stuck.
Try this one
set_time_limit(0);
I had the same problem so I thought of a different approach.
When a file is requested, I make a hard link to it in a randomly named directory inside the "download" folder and give the user that link for 4 hours.
The file URL ends up looking like this:
http://example.com/downloads/3nd83js92kj29dmcb39dj39/myfile.zip
Every call to the script scans the "download" folder and deletes any subfolders (and their contents) that are more than 4 hours old, to keep things clean.
This is not safe against brute-force attacks, but that can be worked around.
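A minimal sketch of that hard-link approach, assuming PHP 7+ (for random_bytes()) and a downloads/ directory next to products/; the paths and names here are illustrative, not taken from the original answer:
// create a hard link inside a randomly named folder and hand back the URL
$token = bin2hex(random_bytes(16));
$dir   = __DIR__ . '/downloads/' . $token;
mkdir($dir, 0755);
link(__DIR__ . '/products/' . $filename, $dir . '/' . $filename);
$url = 'http://example.com/downloads/' . $token . '/' . $filename;

// cleanup: delete download folders (and their contents) older than 4 hours
foreach (glob(__DIR__ . '/downloads/*', GLOB_ONLYDIR) as $old) {
    if (filemtime($old) < time() - 4 * 3600) {
        array_map('unlink', glob($old . '/*'));
        rmdir($old);
    }
}
Hard links only work within the same filesystem; across filesystems you would need a symlink or a copy instead.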

bash script to run sql query on wordpress db

I am having a weird problem. This is a 3-step scenario.
I have code which downloads a video from YouTube into my FTP directory, given a YouTube URL.
That code issues a background command to a bash script, which downloads the heavy video in the background (into the FTP directory).
When the download is complete, the bash script calls a PHP file which updates an entry in WordPress.
The problem
The video downloads fine into my FTP directory, and the bash script also works fine up until it calls my PHP file to update the db entry.
Here is my bash script code
#!/bin/bash
# $1 = wget log file, $2 = output file, $3 = video URL, $4 = video key
wget -o "$1" --output-document="$2" "$3" &
wait
/usr/bin/php ../cron/vids_pending_download.php "$4"
exit
This script is working fine and calls the PHP file which has this code.
require('../../wp-config.php');
require('../inc/inc_config.php');

$vid_key = trim($argv[1]);
#$vid_key = '123_video_key';

$sql_get_vids = "SELECT vid_id, vid_name, file_size, vid_usr FROM " . $wpdb->prefix . "video_table WHERE vid_name = '" . $vid_key . "' ";
$vid_info = $wpdb->get_row($sql_get_vids);

if ($vid_info != null) {
    echo 'video found';
} else {
    echo 'video not found';
}
Now the problem is: if I supply a fixed $vid_key to my SQL, it works perfectly, but if I take $vid_key from the argument passed in by bash, I get an empty result set. However, if I print the SQL and paste it into phpMyAdmin, it returns the record fine, which means the record is there.
Looking for help. Thanks everyone.
The problem is solved. The reason was that the bash script was returning too quickly and moving on to the next file; at that point the code which inserts the record into the db had not run yet, so the record for the returned file was not available.
The issue was in the file that submits the command to the bash script.
Thanks again to all for your help
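To guard against that race on the PHP side, the callback could retry a few times before giving up. A rough sketch: the retry count and the switch to $wpdb->prepare() are my additions rather than part of the original code, and wp-config.php is assumed to be loaded as in the snippet above so $wpdb is available:
$vid_key  = trim($argv[1]);
$vid_info = null;

// retry a few times in case the inserting code hasn't committed the row yet
for ($attempt = 0; $attempt < 5 && $vid_info === null; $attempt++) {
    $vid_info = $wpdb->get_row(
        $wpdb->prepare(
            "SELECT vid_id, vid_name, file_size, vid_usr FROM {$wpdb->prefix}video_table WHERE vid_name = %s",
            $vid_key
        )
    );
    if ($vid_info === null) {
        sleep(2); // give the inserting code a moment to finish
    }
}

echo $vid_info !== null ? 'video found' : 'video not found';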

Is there anyway to detect unix command "cat" finish merging files?

I was wondering, is there any way to detect when cat has finished merging the files, so I can zip them?
For example, if I have a simple case like this, a folder with multiple files:
/var/www/testing/file1.iso0
/var/www/testing/file1.iso1
/var/www/testing/file1.iso2
/var/www/testing/file2.xls0
/var/www/testing/file2.xls1
Basically, using my web app, the user will upload file1.iso and file2.xls; using HTML5 I slice each file and upload it in parts, which results in the structure above. After all parts of a file finish uploading, I use an ajax call to merge them with the cat unix command.
$command = '(files=(' . $list_file_copy . '); cat ' . '"${files[@]}" > ' . '"' . $file_name . '"' . '; rm ' . '"${files[@]}")';
something like this.
exec($cmd . " > /dev/null &");
My question is: how can I know when file1.iso and file2.xls have finished merging, so I can send another command to zip them (e.g. turn file1.iso and file2.xls into filexxx.zip)?
Please note that there can be multiple files and each individual file can be huge (I'd say 4 GB at most, which is why each one is sliced into 100 MB parts).
You could try putting in a check for the exit code.
When a command completes, it returns a success/failure indicator to the OS or the calling program.
In the shell, test the value of $? (it gives you the exit code of the last command executed): on success the exit code is 0, so do the zip; otherwise throw a warning or error as you like.
I do that in a lot of my scripts.
Hope this helps.
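In PHP terms, that could look like the sketch below: run the cat merge in the foreground (no trailing &), check the return code that exec() reports, and only then zip. The zip command line here is illustrative, not from the original posts:
// run the merge command from above and capture its exit code
exec($command, $output, $returnCode);

if ($returnCode === 0) {
    // merge succeeded - zip the merged file
    exec('zip ' . escapeshellarg($file_name . '.zip') . ' ' . escapeshellarg($file_name), $zipOutput, $zipCode);
    if ($zipCode !== 0) {
        error_log('zip failed with exit code ' . $zipCode);
    }
} else {
    error_log('cat merge failed with exit code ' . $returnCode);
}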

Show output taken from shell_exec and display it in real time instead of after waiting 5-7min

Right now, I have code as follows.
$output = shell_exec( /* unix commands are here */ );
echo $output;
I have a website where, upon clicking a particular button, the shell commands are run and their output is displayed in the browser. This works perfectly. The only issue is that I can't see what's happening until the command is finished: I have to wait about 5-7 minutes, and then I see about a hundred lines of output all at once. I want to push the output to the browser as the command executes, so I can watch it happen in real time.
I've tried popen, proc_open, flush(), ob_start, etc. Nothing seems to be working. I also tried writing the output to a text file and reading that file incrementally in a loop. I'm a PHP beginner, so it's possible that I haven't been using any of these methods properly.
What is the simplest way to accomplish this?
Because PHP runs exec, system, passthru, etc. in blocking mode, you are very limited in possibilities. PHP waits for the command to finish executing before moving on through the script, unless you append something like the following to your command:
"> /dev/null 2>/dev/null &"
Of course, this means you no longer get the output back from the call, but maybe something like this will do:
exec('command > /cmd_file 2>&1 &'); // stdout and stderr both go to /cmd_file, command runs in the background

$file = fopen('/cmd_file', 'r');
while (!feof($file)) {
    echo fgets($file);
    sleep(1);
}
fclose($file);
Worth a shot.
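Another route the asker already hinted at is popen() plus flush(). A minimal sketch, where 'your-command' is a placeholder and the assumption is that no server-side buffering (gzip, FastCGI buffers) gets in the way:
header('Content-Type: text/plain');
while (ob_get_level() > 0) {
    ob_end_flush();             // drop any output buffers PHP has open
}

$proc = popen('your-command 2>&1', 'r');
if ($proc === false) {
    die('could not start command');
}
while (($line = fgets($proc)) !== false) {
    echo $line;
    flush();                    // push each line to the browser as it is produced
}
pclose($proc);
Whether the browser actually sees the lines in real time still depends on the web server configuration, so this is worth testing on the target host.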

Using filesize() PHP function in a Loop - Returns The Same Size

I've done a little bit of PHP coding and am familiar with aspects of it.
I have made a PHP script that runs as a cron job that will pull data from a database and if certain conditions are met, some information is written to a file.
Because there may be more than one result in the database, a loop is done to run through each result in the database.
Within that loop, I have another loop which writes data to a file. A cron job then calls that file every minute and runs its contents as a bash script.
So, the PHP loop is set up to check whether the file has anything written to it, using the filesize() function. If the file size is not zero, it sleeps for 10 seconds and checks again. Here is the code:
while (filesize('/home/cron-script.sh') != 0)
{
    sleep(10);
}
Unfortunately, when filesize() is run, it seems to place some kind of lock or something on the file. The cron job can execute the bash script without a problem, and the very last command in the script zeroes out the file:
cat /dev/null > /home/cron-script.sh
But it seems that once the while loop above has started, it locks in the original file size. As an example, I simply put the word "exit" in the cron-script.sh file and then ran this test script:
while(filesize("/home/cron-script.sh") != 0)
{
echo "filesize: " . filesize("/home/cron-script.sh");
sleep(10);
}
The loop is infinite and will continue to show "filesize: 4" when I put in the word "exit". I will then issue the command at the terminal:
cat /dev/null > /home/cron-script.sh
This clears the file while the test script above is running, but the script continues to say the file size is 4 and never sees 0, so it runs until the execution time limit is reached.
Could anyone give me some advice on how to resolve this? In essence, I just need some way of reading the file size, and if there is any data in the file, to loop through a sleep routine until the file is cleared. The file should clear within one minute (since the cron job calls cron-script.sh every minute).
Thank you!
From http://www.php.net/manual/en/function.filesize.php
Note: The results of this function are cached. See clearstatcache() for more details.
To resolve this, remember to call clearstatcache() before calling filesize():
while(filesize("/home/cron-script.sh") != 0)
{
echo "filesize: " . filesize("/home/cron-script.sh");
sleep(10);
clearstatcache();
}
The results of filesize() are cached.
You can use clearstatcache() to clear the cache on each iteration of the loop.
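As a side note, on PHP 5.3+ clearstatcache() can be told to clear the cache for just one path, which keeps the rest of the stat cache intact; a small sketch of the same loop:
while (filesize("/home/cron-script.sh") != 0)
{
    sleep(10);
    // clear the cached stat data for this one file only (PHP 5.3+)
    clearstatcache(true, "/home/cron-script.sh");
}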
