Is there any way to detect when the Unix command "cat" has finished merging files? - php

I was wondering: is there any way to detect when the cat command has finished merging files, so that I can zip the result?
For example, take a simple case of a folder with multiple files like this:
/var/www/testing/file1.iso0
/var/www/testing/file1.iso1
/var/www/testing/file1.iso2
/var/www/testing/file2.xls0
/var/www/testing/file2.xls1
Basically, using my web app, the user will upload file1.iso and file2.xls. Using HTML5, I slice each file and upload it in parts, which results in the structure above. After all parts of a file finish uploading, I use an Ajax call to merge the parts with the Unix cat command:
$command = '(files=(' . $list_file_copy . '); cat ' . '"${files[@]}" > ' . '"' . $file_name . '"' . '; rm ' . '"${files[@]}")';
and then run it in the background with something like this:
exec($cmd . " > /dev/null &");
My question is: how can I know when file1.iso and file2.xls have finished merging, so I can send another command to zip them? E.g., file1.iso and file2.xls become filexxx.zip.
Please note that there could be multiple files, and each individual file can be huge (4 GB at most, which is why each one is sliced into small 100 MB parts).

You could put a check on the exit code. When a command completes, it returns a success/failure indicator to the OS or the calling program. In the shell, testing the value of $? gives you the exit code of the last command executed; if it succeeded, the exit code is 0, so do the zip, else throw a warning or error as you like.
I do that in a lot of my scripts.
Hope this helps.
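A minimal sketch of that advice in shell, with throwaway part files standing in for the uploaded file1.iso0/1 parts (the names, directory, and the gzip stand-in for the zip step are illustrative, not from the question): run cat in the foreground and branch on its exit status, removing the parts and archiving only once the merge has reported success.

```shell
# Merge the uploaded parts, then act on cat's exit status.
dir=$(mktemp -d)
printf 'part1' > "$dir/file1.iso0"
printf 'part2' > "$dir/file1.iso1"

if cat "$dir/file1.iso0" "$dir/file1.iso1" > "$dir/file1.iso"; then
    # exit code 0: the merge finished, so it is safe to clean up and archive
    rm "$dir/file1.iso0" "$dir/file1.iso1"
    gzip "$dir/file1.iso"    # stand-in for the zip step
    echo "merged and archived"
else
    echo "merge failed with exit code $?" >&2
fi
```

From PHP, the same status is available as exec()'s third ($return_var) argument, but only if the command is run in the foreground rather than backgrounded with &.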

Related

Weird PHP error: exec() hangs sometimes on simple script

Heyas,
So this simple exec() script runs fine the first two times, trying to generate a PDF file from a webpage (using wkhtmltopdf).
It first deletes the existing file, then creates the new PDF file in its place. If I run the script a second time, it deletes the file again and creates a new one, as expected. However, if I run it one more time, it deletes the file and creates a new one, but then the script seems to hang until the 30-second 504 timeout error is given. The script, when it works, only takes about 3 seconds to run/return. It also kills the entire server (any other local PHP sites no longer work). If I restart the PHP server, everything still hangs. Interestingly, if I run the script once and then restart the PHP server, I can keep doing this without issue (but only generating the PDF up to two times). No PHP errors are logged.
Why would it be stalling out subsequent times?
$filePath = 'C:\\wtserver\\tmp\\order_' . $orderId . '.pdf';

// delete an existing file
if (file_exists($filePath)) {
    if (!unlink($filePath)) {
        echo 'Error deleting existing file: ' . $filePath;
        return;
    }
}

// generates PDF file at C:\wtserver\tmp\order_ID.pdf
exec('wkhtmltopdf http://google.com ' . $filePath);
I've tried a simple loop to check for the script's completion (successful output), and then try to exit, but it still hangs:
while (true) {
    if (file_exists($filePath)) {
        echo 'exit';
        exit(); // have also tried die()
    }
    //todo: add time check/don't hang
}
If I can't figure this bit out, for now, is there a way to kill the exec script, wrapping it somehow? The PDF is still generated, so the script is working, but I need to kill it and return a response to the user.
Solution:
You have to redirect standard output AND standard error to end the process immediately, i.e. on Windows:
exec('wkhtmltopdf http://google.com ' . $filePath . ' > NUL 2> NUL');
Do you know that you can run the executable in the background, like this:
exec($cmd . " > /dev/null &");
This way you can immediately come out of it.
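At the shell level, that call amounts to the following sketch (the sleep stands in for a long-running job). Note that it only redirects stdout; redirecting stderr as well, as in the accepted solution, avoids the caller blocking on a still-open pipe.

```shell
log=$(mktemp)
# Background the job with both streams redirected so the caller is not
# tied to its output; PHP's exec() returns as soon as the shell does.
( sleep 1; echo "done" > "$log" ) > /dev/null 2>&1 &
echo "caller continues immediately"
wait    # demo only: PHP would return here instead of waiting
cat "$log"
```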

clamdscan can't read from tmp directory

I was wondering what's wrong with my code. If I use clamscan, it works fine both reading from /tmp and with a manually specified path, but if I use clamdscan, any path under /tmp results in an error (the return code is 2). This is the code:
$command = 'clamdscan ' . escapeshellarg($_FILES['file']['tmp_name']);
$out = array();
$int = -1;
exec($command, $out, $int);
echo "\n" . $command;
echo "\n" . implode("\n", $out); // $out is an array of output lines
echo "\n This is int = " . $int;
if ($int == 0) {
    // all good, code goes here: uploads file as normal, i.e. move to
    //echo "File path : " . $file . " Return code : " . cl_pretcode($retcode);
    //echo "\n";
    move_uploaded_file($_FILES["file"]["tmp_name"], "filesave/" . $_FILES["file"]["name"]);
    echo "Stored in: " . "filesave/" . $_FILES["file"]["name"];
} else {
    echo "\n FAILED";
}
Based on the above code, it fails because $int = 2. But if I change the command to
//some file that is saved already in the directory
$command = 'clamdscan ' . '/abc/abc.txt';
It works perfectly fine.
It only fails if the command is clamdscan; with clamscan, the temp directory is fine.
Any idea?
You should really just use one of the many clamd clients out there instead of relying on exec'ing a command and parsing its output; that's super fragile and will bring you nothing but headaches in the future. For example:
http://torquecp.sourceforge.net/phpclamcli.html
If you are the do-it-yourself type, the clamd wire protocol is super simple (http://linux.die.net/man/8/clamd) and you could potentially write up a simple client in a couple of hours. Again, the benefit here is that it's a well-defined protocol with some nice features, like a streaming call that lets you run the clamd service and your webapp with completely different security credentials (heck, they can even run on different boxes).
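For the do-it-yourself route, the INSTREAM framing described in clamd(8) can be sketched as follows: the null-terminated command, then each data chunk prefixed with its length as a 4-byte big-endian integer, terminated by a zero-length chunk. This only builds the request bytes into a file; actually connecting to the clamd socket (and the socket path itself) is left out.

```shell
payload='hello'
req=$(mktemp)
printf 'zINSTREAM\0' > "$req"      # null-terminated command (10 bytes)
printf '\0\0\0\005' >> "$req"      # chunk length 5 as 4-byte big-endian
printf '%s' "$payload" >> "$req"   # the 5-byte chunk itself
printf '\0\0\0\0' >> "$req"        # zero-length chunk terminates the stream
wc -c < "$req"                     # 10 + 4 + 5 + 4 = 23 bytes
```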
I hope this helps.
Just a quick remark on using http://torquecp.sourceforge.net/phpclamcli.html as a supposedly better alternative to a DIY CLI exec: the aforementioned PHP script relies entirely on the syntax of the clamav response text as well.
Old question, but I've just had the same issue. clamdscan runs as the clamav user, which doesn't have permission to read files in /tmp. There is an additional parameter, --fdpass, which passes the calling process's open file descriptor to clamd, so the daemon never has to open the path itself.
Using $command = 'clamdscan --fdpass ' . escapeshellarg($_FILES['file']['tmp_name']); (note the space after --fdpass) gives clamd access to the temporary file even though the daemon runs as a different user.

passing php variable to bash script that uses shflags

I am trying to make a PHP program, triggered by a web submit, tell a Bash script to run with a single command-line parameter. I am using the shflags command-line parser for Bash.
The pertinent part of the PHP script is as follows:
// generate unique filename
$destinationFolder = Mage::getBaseDir('media') . DS . 'webforms' . DS . 'xml';
$filename = $destinationFolder . DS . $result->getId().'.xml';
// create folder
if (!(@is_dir($destinationFolder) || @mkdir($destinationFolder, 0777, true))) {
throw new Exception("Unable to create directory '{$destinationFolder}'.");
}
// export to file
$xmlObject->getNode()->asNiceXml($filename);
// define parameters to pass
exec ( '/opt/bitnami/apache2/htdocs/sfb/scripts/xform.sh --xmlfile'.' '.$filename);
}
}
?>
The bash script (xform.sh) (just a test script) is as follows.
#!/bin/bash
. ./shflags
echo "foo" >> /opt/bitnami/apache2/htdocs/sfb/scripts/seeds/xform/$$".txt"
echo "foo" >> /opt/bitnami/apache2/htdocs/sfb/scripts/seeds/xform/foo.txt
DEFINE_string 'xmlfilename' 'test' 'filename of current x.xml file from webforms' 'Z'
FLAGS "$@" || exit 1
eval set -- "${FLAGS_argv}"
echo "xml file was" ${FLAGS_xmlfilename} >> /opt/bitnami/apache2/htdocs/sfb/scripts/seeds/xform/foo.txt
The bash script works correctly from the command line, i.e.
$xform.sh --xmlfilename 1.xml
writes "xml file was 1.xml" to the foo.txt file.
When the PHP script is triggered from the web, the first part works correctly, i.e. it writes "foo" to the two target files, foo.txt and $$.txt. However, the xmlfilename variable is not coming through, and I really need that file name to be passed on the command line! (Note: I shouldn't need escapeshellarg because the file name is generated by my PHP program, not by user input.)
I have checked all the file permissions I can think of. xform.sh and shflags are both members of the www-data (Apache) group, owned by Apache, and a+x.
My suspicions are that the problem is related either to a) my PHP exec syntax or b) file permissions. Everything works as intended except the bit after xform.sh in this line!
exec ( '/opt/bitnami/apache2/htdocs/sfb/scripts/xform.sh --xmlfile'.' '.$filename);
UPDATE:
I've narrowed the problem some more by isolating the problem with some test code. With:
$script="echo";
$xmlfilename="$filename";
$target=">> /opt/bitnami/apache2/htdocs/sfb/scripts/seeds/xform/foo.txt";
exec ("$script $xmlfilename $target");
...
PHP correctly writes the $filename to foo.txt, so $script works when value is "echo" and $filename works too.
When I set $script to a different simple form of the xform script that (only) writes the data to the file, that also works correctly.
So the problem is specifically with something that happens when PHP tries to pass $filename as a command-line argument. Does a script run by Apache need more permissions than usual if it takes a command-line argument?
Sigh.
In your exec() call you have the flag as --xmlfile, but you are calling it from the command line as --xmlfilename, which is the name the script actually defines with DEFINE_string.
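The general failure mode can be illustrated without shflags, using plain getopts and the 'Z' short flag that the question's DEFINE_string also declares (the parse function here is a hypothetical stand-in, not the asker's script): an option the parser never defined is rejected, so its value never reaches the script and the default survives.

```shell
parse() {
    OPTIND=1
    xmlfilename='test'    # default, like DEFINE_string 'xmlfilename' 'test' ...
    while getopts 'Z:' opt "$@"; do
        case "$opt" in
            Z) xmlfilename="$OPTARG" ;;
        esac
    done
    echo "$xmlfilename"
}
parse -Z 1.xml               # defined flag: prints 1.xml
parse -X 1.xml 2>/dev/null   # undefined flag: default survives, prints test
```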

PHP, problem with exec...how do I make sure the execution is working?

I am uploading a video, which is supposed to generate three screenshot thumbnails. I have the same upload code running in both the admin and front end, but for some odd reason the thumbnail is only generated when I upload from the front end, and not from the back end...
My directory structure
root/convert.php (this is the file running through exec call)
(the following two files are the upload files running in user-end and admin-end respectively)
root/upload.php
root/siteadmin/modules/videos/edit.php
I believe convert.php is not being run from admin-side for some reason. The command is something like:
$cmd = $cgi . $config['phppath'] . ' ' . $config['BASE_DIR'] . '/convert.php ' . $vdoname . ' ' . $vid . ' ' . $ff;
echo $cmd; die;
exec($cmd. '>/dev/null &');
And echoing out the exec $cmd, I get this:
/usr/bin/php /home/testsite/public_html/dev/convert.php 1272.mp4 1272 /home/testsite/public_html/dev/video/1272.mp4
How do I make sure convert.php is being run?
EDIT: OK, now I am sure it is not being executed from admin-side, any ideas why?
http://php.net/manual/en/function.exec.php
"return_var" - If the return_var argument is present along with the output argument, then the return status of the executed command will be written to this variable.
Another way to determine whether exec actually runs the convert.php file: add some debugging info in convert.php (e.g. write something to a file when the convert.php script starts).
Just an idea:
you could print "TRUE" in the convert script when it runs successfully.
Don't add >/dev/null &.
Check the return value of exec:
$value = exec($cmd);
if ($value == 'TRUE') {
    // it ran successfully
}
chmod 755 convert.php
Also make sure the first line of convert.php is:
#!/usr/bin/php
Check the full path of the PHP CLI executable.
Also make sure convert.php has Unix line endings ("\n").
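A sketch of that check, with a throwaway child script standing in for convert.php: have the child print a marker, then inspect both its output and its exit status from the caller. In PHP, exec()'s return value and its $return_var argument give you the same two pieces of information, as long as the command is not backgrounded.

```shell
child=$(mktemp)
cat > "$child" <<'EOF'
#!/bin/sh
# stand-in for convert.php: report success explicitly
echo TRUE
exit 0
EOF
chmod 755 "$child"     # the script must be executable, as the answer notes
out=$("$child"); status=$?
echo "out=$out status=$status"
```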

php script dies when it calls a bash script, maybe a problem of server configuration

I have some problems with a PHP script that calls a Bash script. An XML file is uploaded to the PHP script, which then calls a Bash script that cuts the file into portions (for example, if an XML file of 30,000 lines is uploaded, the Bash script cuts it into portions of 10,000 lines, so there will be 3 files of 10,000 lines each).
The file is uploaded and the Bash script cuts the lines, but when the Bash script returns to the PHP script, the PHP script dies, and I don't know why... I tested the script on another server and it works fine. I don't believe it is a memory problem; maybe it is a processor problem. I don't know what to do. What can I do? (I'm using the shell_exec function in PHP to call the Bash script.)
The error only happens if the XML file has more than 8,000 lines; if the file has fewer than 8,000, everything is OK (this is relative; it depends on the amount of data, of strings, of letters each line contains).
What can you suggest? (Sorry for my bad English, I have to practice a lot xD)
I leave the code here
PHP script (at the end, after the ?>, there is HTML & JavaScript code, but it doesn't appear here, only the JavaScript... basically the HTML is only there to upload the file):
function debug($str){
    echo "\n" . date('c') . ": $str";
    $file = fopen("uploadxmltest.debug.txt", "a");
    fwrite($file, date('c') . ": $str\n");
    fclose($file);
}
try {
    if (is_uploaded_file($_FILES['tfile']['tmp_name'])) {
        debug("step 1: the file was uploaded");
        $norg = date('y-m-d') . "_" . md5(microtime());
        $nfle = "testfiles/$norg.xml";
        $ndir = "testfiles/$norg";
        $ndir2 = "testfiles/$norg";
        if (move_uploaded_file($_FILES['tfile']['tmp_name'], "$nfle")) {
            debug("step 2: the file was moved to the directory");
            debug("memory_get_usage(): " . memory_get_usage());
            debug("memory_get_usage(true): " . memory_get_usage(true));
            debug("memory_get_peak_usage(): " . memory_get_peak_usage());
            debug("memory_get_peak_usage(true): " . memory_get_peak_usage(true));
            $shll = shell_exec("./crm_cutfile_v2.sh \"$nfle\" \"$ndir\" \"$norg\" ");
            debug("result: $shll");
            debug("memory_get_usage(): " . memory_get_usage());
            debug("memory_get_usage(true): " . memory_get_usage(true));
            debug("memory_get_peak_usage(): " . memory_get_peak_usage());
            debug("memory_get_peak_usage(true): " . memory_get_peak_usage(true));
            debug("step 3: the file was cut. END");
        } else {
            debug("ERROR: I didn't move the file");
            exit();
        }
    } else {
        debug("ERROR: I didn't upload the file");
        //exit();
    }
} catch (Exception $e) {
    debug("Exception: " . $e->getMessage());
    exit();
}
?>
Test
function uploadFile(){
    alert("start");
    if (document.test.tfile.value == "") {
        alert("First you have to upload a file");
    } else {
        document.test.submit();
    }
}
Bash script with AWK
#!/bin/bash
#For single messages (one message per contact)
function cutfile(){
    lines=$( cat "$1" | awk 'END {print NR}' )
    fline="$4"
    if [ -d "$2" ]; then
        exsts=1
    else
        mkdir "$2"
    fi
    cp "$1" "$2/datasource.xml"
    cd "$2"
    i=1
    contfile=1
    while [ $i -le $lines ]
    do
        currentline=$( cat "datasource.xml" | awk -v fl=$i 'NR==fl {print $0}' )
        #creates the first file
        if [ $i -eq 1 ]; then
            echo "$fline" >>"$3_1.txt"
        else
            #creates the rest of the files when there are more than 10,000 contacts
            rsd=$(( ( $i - 2 ) % 10000 ))
            if [ $rsd -eq 0 ]; then
                echo "" >>"$3_$contfile.txt"
                contfile=$(( $contfile + 1 ))
                echo "$fline" >>"$3_$contfile.txt"
            fi
        fi
        echo "$currentline" >>"$3_$contfile.txt"
        i=$(( $i + 1 ))
    done
    echo "" >>"$3_$contfile.txt"
    return 1
}

#For multiple messages (one message for all contacts)
function cutfile_multi(){
    return 1
}

cutfile "$1" "$2" "$3" "$4"
echo 1
thanks!!!!! =D
You should probably use the split utility instead of most or all of your Bash script.
These are some comments on your code. It really needs a major overhaul.
To get the number of lines in a file:
lines=$( wc -l < "$1" )
You never use the variable exsts. Why not just do:
if [ ! -d "$2" ]; then
    mkdir "$2"
fi
Calling awk separately for every single line in the file is extremely slow. Either use one AWK script to do all or most of the work or iterate over the lines in your file using a while loop:
while read -r currentline
do
    process_line # this represents what needs to be done to each line
done < datasource.xml
In Bash you can increment an integer variable like this:
(( contfile++ ))
(( i++ ))
Your PHP script is calling your Bash script with only three arguments. Other than that I didn't really look at the PHP script.
Seriously, though, use split. It's as simple as:
split --lines 10000 --numeric-suffixes --suffix-length 4 datasource.xml "$3_"
for file in "$3_"*
do
    echo "$fline" | cat - "$file" > "$file.txt" && rm "$file"
done
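A runnable sketch of that approach on a toy file, with a 2-line chunk size standing in for 10,000 so the effect is visible (note that --numeric-suffixes and --suffix-length are GNU split options; BSD split spells these differently):

```shell
dir=$(mktemp -d); cd "$dir"
printf 'header\n' > fline.txt              # stand-in for the $fline first line
printf 'a\nb\nc\nd\ne\n' > datasource.xml  # 5 data lines -> 3 chunks of <= 2
split --lines 2 --numeric-suffixes --suffix-length 4 datasource.xml part_
for f in part_*; do
    # prepend the header line to each chunk, as the answer's loop does
    cat fline.txt "$f" > "$f.txt" && rm "$f"
done
ls part_*.txt
```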
Here are some things I would look at (mainly to know more specifics about the problem):
Try substituting different bash scripts: one that does nothing, and one that splits a large file. Does the PHP script die in one case or both?
Check permissions on the servers. Are the scripts and files being granted the appropriate permissions? (Though this doesn't explain why small files work OK.)
Check the PHP and web server error logs. It may be that the error is appearing there rather than in the browser window. (And if so, include it with your question for additional help.)
Because it seems related to memory, you could try increasing the PHP settings that deal with memory usage for uploads, though I believe that error would occur during the upload, before the bash script call.
Hope these help, I am sorry I can't offer more specific ideas.
