Physical memory warning - PHP

<?php
file_put_contents("10gb.zip", fopen("http://website.website/10GB.zip", 'r'));
echo "File Downloaded!";
I am using this code to download files from a URL to my server. But when I run it, my hosting server's memory usage goes into the red -_- and the download gets stuck at 3.79 GB.
Is there any limit on downloading big files? I want to download more than 50 GB across 5 processes. Is that possible?

I would go for streaming when dealing with large files rather than copying them directly.
Following the example provided here: http://php.net/manual/en/function.stream-copy-to-stream.php
you can try:
<?php
function pipe_streams($in, $out)
{
    // Copy from one stream to the other in 8 KB chunks
    while (!feof($in)) {
        fwrite($out, fread($in, 8192));
    }
}

// The function expects stream resources, so open the URL and the target file first
$in  = fopen("http://website.website/10GB.zip", 'rb');
$out = fopen("10gb.zip", 'wb');
pipe_streams($in, $out);
fclose($in);
fclose($out);
?>
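Since the manual page linked above is for stream_copy_to_stream(), PHP can also do that loop for you. A minimal sketch with the same URLs:
<?php
$in  = fopen("http://website.website/10GB.zip", 'rb');
$out = fopen("10gb.zip", 'wb');
stream_copy_to_stream($in, $out); // copies chunk by chunk without loading the whole file into memory
fclose($in);
fclose($out);
?>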
Or use cURL (http://php.net/manual/en/book.curl.php):
<?php
$url = "http://website.website/10GB.zip";
$path = "10gb.zip";
$fp = fopen($path, 'w');             // open the local target file
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp); // have cURL write straight to the file instead of memory
$data = curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
Check https://www.sitepoint.com/performant-reading-big-files-php/ for more streaming options.
As for the stall just short of 4 GB: that is usually a 32-bit limit somewhere in the stack (a 32-bit PHP build or a filesystem with a 4 GB file-size cap), not a problem with the download code itself.

Related

PHP: Get metadata of a remote .mp3 file

I am looking for a function that gets the metadata of an .mp3 file from a URL (NOT a local .mp3 file on my server).
Also, I don't want to install http://php.net/manual/en/id3.installation.php or anything similar on my server.
I am looking for a standalone function.
Right now I am using this function:
<?php
function getfileinfo($remoteFile)
{
    $url = $remoteFile;
    $uuid = uniqid("designaeon_", true);
    $file = "../temp/" . $uuid . ".mp3";
    $size = 0;
    $ch = curl_init($remoteFile);

    //============================== Get Size ==========================//
    $contentLength = 'unknown';
    $ch1 = curl_init($remoteFile);
    curl_setopt($ch1, CURLOPT_NOBODY, true);
    curl_setopt($ch1, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch1, CURLOPT_HEADER, true);
    curl_setopt($ch1, CURLOPT_FOLLOWLOCATION, true); // only necessary if the file redirects
    $data = curl_exec($ch1);
    curl_close($ch1);
    if (preg_match('/Content-Length: (\d+)/', $data, $matches)) {
        $contentLength = (int)$matches[1];
        $size = $contentLength;
    }
    //============================== Get Size ==========================//

    if (!$fp = fopen($file, "wb")) {
        echo 'Error opening temp file for binary writing';
        return false;
    } else if (!$urlp = fopen($url, "r")) {
        echo 'Error opening URL for reading';
        return false;
    }

    try {
        $to_get = 65536;    // 64 KB
        $chunk_size = 4096; // Haven't bothered to tune this, maybe other values would work better
        $got = 0;
        $data = null;

        // Grab the first 64 KB of the file
        while (!feof($urlp) && $got < $to_get) {
            $data = $data . fgets($urlp, $chunk_size);
            $got += $chunk_size;
        }
        fwrite($fp, $data);

        // Grab the last 64 KB of the file, if we know how big it is
        if ($size > 0) {
            curl_setopt($ch, CURLOPT_FILE, $fp);
            curl_setopt($ch, CURLOPT_HEADER, 0);
            curl_setopt($ch, CURLOPT_RESUME_FROM, $size - $to_get);
            curl_exec($ch);
        }
        curl_close($ch);
        // Now $fp should hold the first and last 64 KB of the file
        fclose($fp);
        fclose($urlp);
    } catch (Exception $e) {
        fclose($fp);
        fclose($urlp);
        echo 'Error transferring file using fopen and cURL!';
        return false;
    }

    $getID3 = new getID3;
    $ThisFileInfo = $getID3->analyze($file);
    getid3_lib::CopyTagsToComments($ThisFileInfo);
    unlink($file);
    return $ThisFileInfo;
}
?>
This function downloads 64 KB from the URL of an .mp3 file, returns the metadata array via getID3 (which only works on local .mp3 files), and then deletes the 64 KB it downloaded.
The problem with this function is that it is inherently slow: it downloads 64 KB per .mp3, so imagine 1000 mp3 files.
To make my question clear: I need a fast standalone function that reads the metadata of a remote .mp3 file by URL.
Yeah, well, what do you propose? How do you expect to get data if you don't download data? There is no way to have a generic remote HTTP server send you just the ID3 data. Really, there is no magic. Think about it.
What you're doing now is already pretty solid, except that it doesn't handle all versions of ID3 and won't work for files with more than 64 KB of ID3 tags. What I would do to improve it is to use multi-cURL.
There are several PHP classes available that make this easier:
https://github.com/jmathai/php-multi-curl
$mc = EpiCurl::getInstance();
$results[] = $mc->addUrl(/* your stream URL here */); // run this in a loop, 10 at a time or so
foreach ($results as $result) {
    // Do something with the data.
}
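If you'd rather not pull in a class, here is a minimal sketch of the same idea with PHP's native curl_multi_* API, fetching just the first 64 KB of several files in parallel (the URLs are placeholders):
<?php
$urls = array('http://example.com/a.mp3', 'http://example.com/b.mp3');
$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_RANGE, '0-65535'); // ask the server for the first 64 KB only
    curl_multi_add_handle($mh, $ch);
    $handles[$url] = $ch;
}
// Run all transfers until every handle is finished
do {
    curl_multi_exec($mh, $running);
    curl_multi_select($mh);
} while ($running > 0);
foreach ($handles as $url => $ch) {
    $data = curl_multi_getcontent($ch); // first 64 KB of each file
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
    // write $data to a temp file and feed it to getID3 as in the question
}
curl_multi_close($mh);
?>
Note that CURLOPT_RANGE only helps if the server honors Range requests; otherwise you get the whole file.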

PHP function file_get_contents() gets only a few KB of a remote file

Hello, I want to download a remote zip which is about 8 MB. I wrote this simple script:
set_time_limit(0);
$zip = file_get_contents('http://web.tld/folder/download/getfile.do?filename=file.zip&_lang=Lang');
file_put_contents('zip_files/file.zip', $zip);
It works, but the stored file is not 8 MB, only 52 KB.
It's the same if I use:
set_time_limit(0);
$url = 'http://web.tld/folder/download/getfile.do?filename=file.zip&_lang=Lang';
$path = 'zip_files/file.zip';
/* get and save remote data without exceeding php memory limit */
$fp = fopen($path, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);
$data = curl_exec($ch);
curl_close($ch);
fclose($fp);
So maybe I have to use some stream option?! Thank you.
PS: I tried the Snoopy library (http://sourceforge.net/projects/snoopy/) and it's also the same, only 52 KB :P
include "libs/Snoopy-2.0/Snoopy.class.php";
$snoopy = new Snoopy;
$snoopy->submit($url);
print $snoopy->results;
Look inside the saved file (use any text editor): you will likely see it is not a zip at all, just an HTML page reporting a wrong URL or an error.
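As a quick sanity check (a sketch; the path is the one from the question), a real zip archive starts with the bytes "PK":
<?php
$path = 'zip_files/file.zip';
$head = file_get_contents($path, false, null, 0, 2); // read just the first two bytes
if ($head !== "PK") {
    echo "Not a zip archive. The file starts with:\n";
    echo substr(file_get_contents($path), 0, 500); // probably an HTML error page
}
?>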

PHP download a file through direct link and save it on my server

I'm trying to run a PHP script (on a Linux server) that will download a file through a direct download link and save it on my server.
Here is the script I'm using:
<?php
$url = 'http://download.maxmind.com/app/geoip_download?edition_id=108&date=20131015&suffix=zip&license_key=XXXXXXXXXXX';
$path = '/apps/test/';
$fp = fopen($path, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);
$data = curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
For some reason it doesn't work for me. Any suggestions?
You need to verify that the ports are open on your firewall, then use the command below (this will also save the file under its original name):
shell_exec("wget -P /apps/birst/php_test_scripts/ --content-disposition "."'"."https://download.maxmind.com/app/geoip_download?edition_id=108&suffix=zip&license_key=XXXXXXXX"."'");
Why don't you just use:
shell_exec("wget -P /target/directory/ http://download.link.com/download.zip");
Try this:
$url = 'http://download.maxmind.com/app/geoip_download?edition_id=108&date=20131015&suffix=zip&license_key=XXXXXXXXXXX';
$path = '/apps/test/';
$filepath = $path .'file.zip';
$data = file_get_contents($url);
file_put_contents($filepath, $data);
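For what it's worth, the likely reason the original snippet fails is that $path points to a bare directory, while fopen() needs a full file path. A corrected sketch of the cURL version (the file name geoip.zip is illustrative):
<?php
$url = 'http://download.maxmind.com/app/geoip_download?edition_id=108&date=20131015&suffix=zip&license_key=XXXXXXXXXXX';
$path = '/apps/test/geoip.zip'; // a full file path, not just the directory
$fp = fopen($path, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // in case the download link redirects
curl_exec($ch);
curl_close($ch);
fclose($fp);
?>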

fopen alternative for FPDF (PHP)

I am using FPDF to extract info from a PNG file. Unfortunately, the server has fopen disabled. Can anyone recommend a good way of getting around this? Any help would be much appreciated. Thanks in advance!
function _parsepng($file)
{
    // Extract info from a PNG file
    $f = fopen($file, 'rb');
    if (!$f)
        $this->Error('Can\'t open image file: ' . $file);
    $info = $this->_parsepngstream($f, $file);
    fclose($f);
    return $info;
}
You can try cURL, but usually if your hosting company disables one, they also disable the other.
I really don't know if it will work :-) but you may try fsockopen:
http://www.php.net/manual/en/function.fsockopen.php
file:// can be used as the protocol.
Hope this helps.
I ended up using a cURL workaround. Thanks, everyone, for the input!
function _parsepng($file)
{
    // Download the remote PNG to a local temp file via cURL
    $ch = curl_init($file);
    $fp = fopen('/tmp/myfile.png', 'wb');
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_FILE, $fp);
    $image_data = curl_exec($ch);
    curl_close($ch);
    fclose($fp); // close the handle before reopening for reading

    // Extract info from a PNG file
    $f = fopen('/tmp/myfile.png', 'rb');
    if (!$f)
        $this->Error('Can\'t open image file: ' . $file);
    $info = $this->_parsepngstream($f, $file);
    fclose($f);
    return $info;
}
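One possible refinement (a sketch, not part of the original answer): use tempnam() so concurrent requests don't clobber the same /tmp/myfile.png, and delete the temp copy when done:
function _parsepng($file)
{
    // Download the remote PNG to a unique temp file via cURL
    $tmp = tempnam(sys_get_temp_dir(), 'png');
    $ch = curl_init($file);
    $fp = fopen($tmp, 'wb');
    curl_setopt($ch, CURLOPT_HEADER, false);
    curl_setopt($ch, CURLOPT_FILE, $fp);
    curl_exec($ch);
    curl_close($ch);
    fclose($fp);

    // Extract info from the local copy, then remove it
    $f = fopen($tmp, 'rb');
    if (!$f)
        $this->Error('Can\'t open image file: ' . $file);
    $info = $this->_parsepngstream($f, $file);
    fclose($f);
    unlink($tmp);
    return $info;
}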

PHP server to server file request

I have script-1 on server A, where a user asks for a file.
I have script-2 on server B (the file repository), where I check that the user can access it and return the correct file (I'm using Smart File Download, http://www.zubrag.com/scripts/download.php).
I've tried cURL and file_get_contents, and I've changed the content headers in various ways, but I still wasn't able to download the file.
This is my request:
$request = "http://mysite.com/download.php?f=test.pdf";
and it works fine when requested directly.
What should I call in script-1 to force the file to be downloaded?
Some of my attempts:
This works, but I don't know how to handle unauthorized or broken downloads:
header('Content-type: application/pdf');
$handle = fopen($request, "r");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        echo $buffer;
    }
    fclose($handle);
}
This prints the raw PDF data (not the text) straight into the browser (I think it's a header problem):
$c = curl_init();
curl_setopt($c, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($c, CURLOPT_URL, $request);
$contents = curl_exec($c);
curl_close($c);
if ($contents) return $contents;
else return FALSE;
This generates a blank page:
file_get_contents($request);
To force a download, add:
header('Content-Disposition: attachment');
But note that this header is no longer in the HTTP/1.1 spec; see the first answer to "Uses of content-disposition in an HTTP response header".
Without your code I don't know what you've tried, but you need to get the contents of the file via cURL and then save it to your server. Something like...
$url = 'http://website.com/file.pdf';
$path = '/tmp/file.pdf';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$contents = curl_exec($ch);
curl_close($ch);
file_put_contents($path, $contents);
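To tie this back to the question, script-1 could fetch the file and stream it to the browser with download headers, bailing out on errors (a sketch; the filename is illustrative):
<?php
$request = "http://mysite.com/download.php?f=test.pdf";
$ch = curl_init($request);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$contents = curl_exec($ch);
$status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
curl_close($ch);
if ($contents === false || $status !== 200) {
    // handle unauthorized or broken downloads here
    http_response_code(502);
    exit('Download failed or unauthorized');
}
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="test.pdf"');
header('Content-Length: ' . strlen($contents));
echo $contents;
?>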
If you want to download files from an FTP server, you can use PHP's File Transfer Protocol (FTP) extension. Sample code below:
<?php
$SERVER_ADDRESS = "";
$SERVER_USERNAME = "";
$SERVER_PASSWORD = "";
$conn_id = ftp_connect($SERVER_ADDRESS);
// login with username and password
$login_result = ftp_login($conn_id, $SERVER_USERNAME, $SERVER_PASSWORD);
$server_file = "test.pdf"; // FTP server file path
$local_file = "new.pdf";   // local server file path
// download $server_file and save it to $local_file
if (ftp_get($conn_id, $local_file, $server_file, FTP_BINARY)) {
    echo "Successfully written to $local_file\n";
} else {
    echo "There was a problem\n";
}
ftp_close($conn_id);
?>
Download the file with cURL, then check http://php.net/function.readfile; it shows how to force a download.
SOLVED
I ended up simply redirecting the request with:
header("Location: $request");
