I have a PHP shell script that downloads a large file. It would be a bit of a luxury to be able to see the progress of the download in the shell while it's happening. Does anyone have any idea how this can be achieved (or can at least point me in the right direction)?
Thanks!
You could try to poll the filesize of the downloaded file as it's downloading, and compare it with the filesize of the file you requested.
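A minimal sketch of that polling approach, assuming the download runs in another process and the expected size is already known (for example from a Content-Length header); $filePath and $expectedBytes are placeholder values:
<?php
$filePath = '/tmp/download.bin'; // placeholder: the path the downloader is writing to
$expectedBytes = 150000000; // placeholder, e.g. taken from the Content-Length header

do {
    clearstatcache(true, $filePath); // filesize() results are cached between calls
    $done = file_exists($filePath) ? filesize($filePath) : 0;
    printf("\r%.1f%% downloaded", 100 * $done / $expectedBytes);
    sleep(1);
} while ($done < $expectedBytes);
echo "\nDownload complete\n";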
This seemed to work (unexpectedly!)
echo "wget '$feedURL'\n";
$execute = "wget -O ".$filePath." '$feedURL'\n";
$systemOutput = shell_exec($execute);
$systemOutput = str_replace( "\n", "\n\t", $systemOutput);
echo "\t$systemOutput\n";
Read the response headers to get the size of the file (if that information is available). Then keep track of how much you have downloaded, and that gives you your percentage.
How you might do that depends on what libraries/functions you are using.
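If you are fetching the file with cURL, for example, its progress callback is one way to do that tracking. A rough sketch, where $feedURL and $filePath are placeholders:
<?php
$fp = fopen($filePath, 'w');
$ch = curl_init($feedURL);
curl_setopt($ch, CURLOPT_FILE, $fp); // stream the body straight to disk
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_NOPROGRESS, false); // required, otherwise the callback never fires
curl_setopt($ch, CURLOPT_PROGRESSFUNCTION, function ($ch, $dlTotal, $dlNow, $ulTotal, $ulNow) {
    if ($dlTotal > 0) { // $dlTotal is only known if the server sends Content-Length
        printf("\r%.1f%%", 100 * $dlNow / $dlTotal);
    }
    return 0; // returning non-zero aborts the transfer
});
curl_exec($ch);
curl_close($ch);
fclose($fp);
echo "\nDone\n";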
We are using dUnzip2 in our script to unzip files before download, write a license file, then use zip.lib to zip it again and serve it. But dUnzip2 uses:
foreach ($f as $file_row => $file)
which works fine for small files, but for files larger than 10 MB it should use something like
for ($n = 1; $n < count($f); $n++) {
    $file = $f[$n];
}
The current foreach approach is causing memory limit issues on files bigger than 10 MB, and we have to increase the memory limit on the server for that library all the time. The script itself is HUGE and, to be honest, I would not dare take on the task of modifying it.
So, do you know any other unzip library that would do the same job as dUnzip2, or a better solution?
Why not use PHP's built-in Zip support: http://www.php.net/manual/en/zip.examples.php
Obviously this assumes it's enabled (it usually is); do a phpinfo() to check.
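A minimal sketch with the built-in ZipArchive class, roughly matching your unzip/re-zip scenario; the paths and the licence file name are placeholders:
<?php
$zip = new ZipArchive();
if ($zip->open('/path/to/uploaded.zip') === true) { // placeholder path
    $zip->extractTo('/tmp/work/'); // extracts straight to disk rather than into a PHP array
    $zip->close();
}

$out = new ZipArchive();
if ($out->open('/path/to/served.zip', ZipArchive::CREATE | ZipArchive::OVERWRITE) === true) {
    $out->addFile('/tmp/work/somefile.dat', 'somefile.dat'); // placeholder entry
    $out->addFile('/path/to/LICENSE.txt', 'LICENSE.txt'); // the licence you write before serving
    $out->close();
}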
I am not sure whether this is the right section or not, but I am building a file upload site and want to be able to scan the files for viruses on upload. How would I be able to do this?
Any ideas to get me started?
Thanks
The ClamAV library has a PHP binding called php-clamav. You can then scan files for viruses from within your PHP code:
if ($_FILES['file']['size'] == 0 || !is_file($_FILES['file']['tmp_name'])) {
    throw new Exception('Please select a file for upload!');
} else {
    cl_setlimits(5, 1000, 200, 0, 10485760);
    if ($malware = cl_scanfile($_FILES['file']['tmp_name'])) {
        throw new Exception($malware.' (ClamAV version: '.clam_get_version().')');
    }
}
...
Another alternative is to install the Mod_Security web application firewall. It can be configured to scan all uploaded files for viruses using modsec-clamscan.
You could try something like the following using AVG:
Windows:
<?php
// exec()'s third argument receives the command's exit code directly
exec("avgscanx.exe /SCAN=filename.ext/", $output, $result);
?>
Linux:
<?php
exec("avgscan filename.ext -a -H -c", $output, $result); // $result again holds the exit code
?>
Both platforms return the same error codes, allowing you to determine whether a scan was successful or not.
References:
http://www.avg.com/ww-en/faq.num-4443
http://www.avg.com/ww-en/faq.num-4441
http://www.avg.com/ww-en/faq.num-1854
http://www.avg.com/ww-en/faq.num-1759
It depends on your server configuration, but on Linux, for example, it's easy to install something like ClamAV and access it through the command line. You can use something like PHP's exec() to run it.
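A minimal sketch of that, assuming clamscan is installed and on the PATH; the surrounding upload handling is up to you:
<?php
$file = escapeshellarg($_FILES['file']['tmp_name']);
exec("clamscan --no-summary $file", $output, $exitCode);

// clamscan exit codes: 0 = clean, 1 = virus found, 2 = an error occurred
if ($exitCode === 1) {
    unlink($_FILES['file']['tmp_name']);
    die('Upload rejected: the file appears to be infected.');
}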
You could also use VirusTotal's public API. You can read more about it here. There is some PHP code available here.
This way you get a lot of scanners, and you don't have to run AV locally. On the other hand, you'll have to wait a while for the result.
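A rough sketch of a report lookup by file hash; the endpoint and parameter names below follow the older v2 API and are assumptions to check against the current VirusTotal documentation, and the API key is a placeholder:
<?php
$apiKey = 'YOUR_API_KEY'; // placeholder
$hash = hash_file('sha256', $_FILES['file']['tmp_name']);

$ch = curl_init('https://www.virustotal.com/vtapi/v2/file/report'); // assumed v2 endpoint
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array('apikey' => $apiKey, 'resource' => $hash)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$report = json_decode(curl_exec($ch), true);
curl_close($ch);

// 'positives' is the number of engines that flagged the file in the v2 response format
if (!empty($report['positives'])) {
    die('Upload rejected: flagged by ' . $report['positives'] . ' scanners.');
}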
So, I am using a PHP program to read a file, make some changes and then write it to a new file. After that, I call gnuplot using a system call:
system('cat sarx.conf | /usr/bin/gnuplot');
sarx.conf has the gnuplot commands to generate the plot. The problem is that if I run my PHP from the command line (it's on a Linux server), it generates the image and stores it on disk. But when I do the same thing by running the PHP in my browser, it generates the image and tries to spit it out to the browser without actually storing it on disk.
Things I tried:
I thought I might have had issues with permission settings, but that didn't help.
I also hard-coded the path where I want the image to be in sarx.conf. That didn't help either.
I also tried looking for it in the tmp directory --- no luck!
Does anyone have any ideas on how I can get this to work? I need to store this image on disk so that my website can grab it to show the plot later. Is there any PHP functionality that can grab the image and write it to disk?
There is a great LGPL-licensed PHP interface to gnuplot here: http://www.liuyi1.com/PHP-GNUPlot/
Here is how you could do something similar:
$my_file = tempnam(sys_get_temp_dir(), 'plot'); // tempnam() needs a directory and a prefix
$handle = popen('gnuplot', 'w');
fwrite($handle, "# run your gnuplot commands here\n");
fwrite($handle, "set term png\n");
fwrite($handle, "set output '".$my_file."'\n"); // gnuplot expects the filename quoted
fwrite($handle, "replot\n");
fflush($handle); // fflush(), not flush(), for a pipe handle
pclose($handle);
header('Content-Length: '.filesize($my_file));
header('Content-Type: image/png');
print file_get_contents($my_file);
unlink($my_file);
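If the goal is to keep the image on disk (as in the question) rather than stream it back, pointing set output at an absolute path the web server user can write to should be enough; the path below is only an example:
// Example only: write the plot to a fixed, web-server-writable location instead of a temp file
$plot_path = '/var/www/html/plots/sarx.png'; // placeholder path
fwrite($handle, "set output '".$plot_path."'\n");
// ...send the plot commands, pclose($handle), and the PNG stays on disk for the site to serve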
Hi, is there any possibility to download a zip file with curl and unzip it on the fly, without saving the file to disk?
For example:
...
$resultZip = curl_exec($curl);
$result = some_unzip_way($resultZip);
Thanks!
Nik
PHP's cURL has an option to decompress content if needed:
curl_setopt($ch, CURLOPT_ENCODING, '');
see this answer
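For context, a minimal request using that option (the URL is a placeholder); an empty string tells cURL to offer, and transparently decode, every Content-Encoding it supports, such as gzip and deflate:
<?php
$ch = curl_init('http://example.com/data'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_ENCODING, ''); // accept and decode any supported encoding
$body = curl_exec($ch); // already decompressed by cURL
curl_close($ch);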
It's not super easy; PHP has zip functions, but they require a file to exist. Look at the first comment on this page, where someone describes your exact scenario and gives some code:
http://php.net/manual/en/ref.zip.php
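If a short-lived temporary file is acceptable, a sketch along these lines works; the URL is a placeholder and nothing permanent is left on disk:
<?php
$ch = curl_init('http://example.com/archive.zip'); // placeholder URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$resultZip = curl_exec($ch);
curl_close($ch);

$tmp = tempnam(sys_get_temp_dir(), 'zip'); // park the download in a temp file
file_put_contents($tmp, $resultZip);

$zip = new ZipArchive();
if ($zip->open($tmp) === true) {
    for ($i = 0; $i < $zip->numFiles; $i++) {
        $contents = $zip->getFromIndex($i); // entry contents as a string, no extraction to disk
        // ...process $contents...
    }
    $zip->close();
}
unlink($tmp); // clean up the temp file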
Perhaps Perl is a better choice for that kind of operation.
What I would like to write: a PHP script to find a certain string in loads of files.
Is it possible to read the contents of thousands of text files from another FTP server without actually downloading those files (ftp_get)?
If not, would downloading them ONCE -> skip if the file already exists / re-download if the filesize differs -> search for the certain string -> ... be the easiest option?
If URL fopen wrappers are enabled, then file_get_contents can do the trick and you do not need to save the file on your server.
<?php
$find = 'mytext'; // text to find
$files = array('http://example.com/file1.txt', 'http://example.com/file2.txt'); // source files
foreach ($files as $file) {
    $data = file_get_contents($file);
    if (strpos($data, $find) !== FALSE) {
        echo "found in $file" . PHP_EOL;
    }
}
?>
[EDIT]: If the files are accessible only by FTP:
In that case, you have to use FTP URLs, like this:
$files = array('ftp://user:pass@domain.com/path/to/file', 'ftp://user:pass@domain.com/path/to/file2');
If you are going to store the files after you download them, then you may be better served to just download or update all of the files, then search through them for the string.
The best approach depends on how you will use it.
If you are going to be deleting the files after you have searched them, then you may want to also keep track of which ones you searched, and their file date information, so that later, when you go to search again, you won't waste time searching files that haven't changed since the last time you checked them.
When you are dealing with so many files, try to cache any information that will help your program to be more efficient next time it runs.
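One way to cache that bookkeeping, as a sketch: keep a small local index of each remote file's size and modification time and only re-process entries whose values have changed. The host, credentials, paths and index location are placeholders:
<?php
$indexFile = '/tmp/ftp-index.json'; // placeholder location for the cache
$index = is_file($indexFile) ? json_decode(file_get_contents($indexFile), true) : array();

$conn = ftp_connect('ftp.example.com'); // placeholder host
ftp_login($conn, 'user', 'pass');

foreach (ftp_nlist($conn, '/path/to/files') as $remote) {
    $size  = ftp_size($conn, $remote);  // -1 if the server cannot report it
    $mtime = ftp_mdtm($conn, $remote);  // -1 if the server cannot report it
    $key   = $size . ':' . $mtime;

    if (isset($index[$remote]) && $index[$remote] === $key) {
        continue; // unchanged since the last run, skip it
    }

    // ...download and search this file here...

    $index[$remote] = $key; // remember what we saw for next time
}
file_put_contents($indexFile, json_encode($index));
ftp_close($conn);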
PHP's built-in file reading functions, such as fopen()/fread()/fclose() and file_get_contents(), do support FTP URLs, like this:
<?php
$data = file_get_contents('ftp://user:password@ftp.example.com/dir/file');
// The file's contents are stored in the $data variable
If you need to get a list of the files in the directory, you might want to check out opendir(), readdir() and closedir(), which I'm pretty sure support FTP URLs.
An example:
<?php
$dir = opendir('ftp://user:password@ftp.example.com/dir/');
if (!$dir) {
    die;
}
while (($file = readdir($dir)) !== false) {
    echo htmlspecialchars($file) . '<br />';
}
closedir($dir);
If you can connect via SSH to that server, and if you can install new PECL (and PEAR) modules, then you might consider using PHP SSH2. Here's a good tutorial on how to install and use it. This is a better alternative to FTP. But if that is not possible, your only solution is file_get_contents('ftp://domain/path/to/remote/file');.
** UPDATE **
Here is a PHP-only implementation of an SSH client: SSH in PHP.
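A minimal sketch of fetching one file with the PECL ssh2 extension mentioned above; host, credentials and paths are placeholders:
<?php
$conn = ssh2_connect('example.com', 22); // placeholder host
ssh2_auth_password($conn, 'user', 'password'); // placeholder credentials

// Copy the remote file to a local temp path, then read it
$local = tempnam(sys_get_temp_dir(), 'ssh');
if (ssh2_scp_recv($conn, '/path/to/remote/file', $local)) {
    $data = file_get_contents($local);
    // ...search $data for the string...
}
unlink($local);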
With FTP you'll always have to download to check.
I do not know what kind of bandwidth you have or how big the files are, but this might be an interesting use case for running it from the cloud, such as Amazon EC2 or Google Apps (if you can download the files within the time limit).
In the EC2 case you would spin up the server for an hour to check for updates in the files and shut it down again afterwards. This will cost a couple of bucks per month and save you from potentially upgrading your line or hosting contract.
If this is a regular task, then it might be worth using a simple queue system so you can run multiple processes at once (which will hugely increase speed). This would involve a few steps:
Get a list of all files on the remote server
Put the list into a queue (you can use memcached for a basic message queuing system)
Use a separate script to get the next item from the queue.
The processing script would contain simple functionality (in a do-while loop):
// Sketch of the worker; next_item_from_queue() is a hypothetical helper around your queue
$ftp = ftp_connect($host);
do {
    $item = next_item_from_queue();
    if ($item === false) {
        break; // queue drained, stop this worker
    }
    $contents = file_get_contents($item);
    preg_match($pattern, $contents); // record any match as needed
} while (true);
ftp_close($ftp);
You could then in theory fork off multiple processes through the command line without needing to worry about race conditions.
This method is probably best suited to crons/batch processing, but it might work in this situation too.