How can PHP know if the image has been fully downloaded? - php

I wrote a PHP script that takes image URLs and tries to download and save the images on the server.
My problem is that sometimes an image is not fully downloaded, and the code below saves it only partially. I did some research but couldn't figure out what to add so the script can check whether it has saved the full image or not.
Image sizes and other properties vary, so I can't check against those unless there is a way to get that information before downloading the image.
Thank you all.
if ($leng >= "5") {
    define('UPLOAD_DIR', dirname(__FILE__) . '/files/');
    $length = 512000;
    $handle = fopen($url, 'rb');
    $filename = UPLOAD_DIR . substr(strrchr($url, '/'), 1);
    $write = fopen($filename, 'w');
    while (!feof($handle)) {
        $buffer = fread($handle, $length);
        fwrite($write, $buffer);
    }
    fclose($handle);
    fclose($write);
} else {
    echo "failed";
}

I think that using cURL is a better solution than using fopen on a URL. Check this out:
$file = fopen($filename, 'wb'); //Write it as a binary file - like image
$c = curl_init($url);
curl_setopt($c, CURLOPT_FILE, $file);
curl_setopt($c, CURLOPT_FOLLOWLOCATION, true); //Allow cURL to follow redirects (but take note that this does not work with safemode enabled)
curl_exec($c);
$httpCode = curl_getinfo($c, CURLINFO_HTTP_CODE); //200 means OK; you may check it afterwards, just to be sure
curl_close($c);
fclose($file);
Partially based on Downloading a large file using curl
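To actually detect a partial download, here is a sketch of my own (not part of the answer above): read the transfer statistics before curl_close() and discard the file when fewer bytes arrived than the Content-Length the server advertised (assuming the server sends that header at all). It replaces the curl_getinfo/curl_close/fclose lines of the snippet above:
$httpCode = curl_getinfo($c, CURLINFO_HTTP_CODE);
$downloaded = curl_getinfo($c, CURLINFO_SIZE_DOWNLOAD); //bytes actually received
$expected = curl_getinfo($c, CURLINFO_CONTENT_LENGTH_DOWNLOAD); //-1 when no Content-Length header was sent
curl_close($c);
fclose($file);
if ($httpCode !== 200 || ($expected > 0 && $downloaded < $expected)) {
    unlink($filename); //drop the partial image instead of keeping it
    echo "failed";
}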

Related

Have PHP Upload to NextCloud folder

This may seem like a weird question, but I am trying to get my website to upload files that users submit into a folder on my Nextcloud instance, so that I can share them with others on my team.
I have not been able to find anything about how to do this. I have created a folder and a share link for it.
The code that I am trying to use is:
$file = $_FILES['fileToUpload'];
$fileName = $file['name'];
$fileTmpName = $file['tmp_name'];
$fileSize = $file['size'];
$fileError = $file['error'];
$fileType = $file['type'];
$fileExt = explode('.', $fileName);
$fileActualExt = strtolower(end($fileExt));
$allowed = array('pdf', 'docx', 'doc');
if (in_array($fileActualExt, $allowed)) {
    if ($fileError === 0) {
        if ($fileSize < 1000000) {
            $c = curl_init();
            curl_setopt($c, CURLOPT_URL, "https://files.devplateau.com/nextcloud/index.php/s/jfbavtp4q6UWNlR");
            curl_setopt($c, CURLOPT_USERPWD, "username:password");
            curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
            curl_setopt($c, CURLOPT_PUT, true);
            //curl_setopt($c, CURLOPT_INFILESIZE, filesize($file));
            $fp = fopen($file, "r");
            curl_setopt($c, CURLOPT_INFILE, $fp);
            curl_exec($c);
            curl_close($c);
            fclose($fp);
        } else {
            echo "Your file is too big!";
        }
    } else {
        echo "There was an error uploading your file!";
    }
} else {
    echo "You cannot upload files of this type!";
}
When trying this though, I get a few errors. They are:
Warning: fopen() expects parameter 1 to be a valid path, array given...
Warning: curl_setopt(): supplied argument is not a valid File-Handle resource...
Warning: fclose() expects parameter 1 to be resource, boolean given...
I understand what the first warning means; I just do not know how to turn the array into something that fopen can use. I really do not understand what the other warnings mean.
Can someone help me get this working, or suggest a better way to do it?
You have to change the URL to the WebDAV endpoint: remote.php/dav/files/user/path/to/file
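For illustration only, a minimal sketch of how that could look with the code from the question, assuming your Nextcloud user is username and the target folder is Uploads (adjust the host, credentials and path to your instance). Note that the upload has to read from the temporary file in $fileTmpName, not from the $_FILES array itself:
$davUrl = "https://files.devplateau.com/nextcloud/remote.php/dav/files/username/Uploads/" . rawurlencode($fileName);
$fp = fopen($fileTmpName, 'rb'); //read the uploaded temp file, not the $_FILES array
$c = curl_init($davUrl);
curl_setopt($c, CURLOPT_USERPWD, "username:password");
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
curl_setopt($c, CURLOPT_PUT, true);
curl_setopt($c, CURLOPT_INFILE, $fp);
curl_setopt($c, CURLOPT_INFILESIZE, $fileSize);
curl_exec($c);
$status = curl_getinfo($c, CURLINFO_HTTP_CODE); //WebDAV typically answers 201 (created) or 204 (overwritten)
curl_close($c);
fclose($fp);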

Downloading Images from URL in PHP

I have a file from which I read URLs line by line. The URLs are links to images, and all of them work in the browser. Now I want to download the images to my current directory. For the record, I need to use a PHP script.
This is how I get my images:
for ($j; $j < $count; $j++) {
    // Get image URL as string, then fetch it
    $image = $arr[$j];
    $newname = $j;
    getImage($image, $newname);
}
function getImage($image, $newname)
{
    $ch = curl_init($image);
    curl_setopt($ch, CURLOPT_HEADER, 0);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
    curl_setopt($ch, CURLOPT_BINARYTRANSFER, 1);
    $rawdata = curl_exec($ch);
    curl_close($ch);
    $fp = fopen("$newname.jpg", 'w');
    fwrite($fp, $rawdata);
    fclose($fp);
}
The problem is that although I get all the images, only the last one is viewable. The others I can't open, and they are only 1 KB.
So what did I do wrong?
I also need to download PNG files later on, so can I then just change $newname.jpg into $newname.png?
Thanks in advance. I really need an answer fast; I've been sitting here for hours trying to figure it out.
Why not use stream_copy_to_stream?
function getImage($image, $newname, $fileType = "jpg")
{
    $in = fopen($image, "r");
    $out = fopen("$newname.$fileType", 'w');
    stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);
}
You can also play with stream_set_read_buffer
stream_set_read_buffer($in, 4096);
Just tested this with our avatar pics.
$data = [
"https://www.gravatar.com/avatar/42eec337b6404f97aedfb4f39d4991f2?s=32&d=identicon&r=PG&f=1",
"https://www.gravatar.com/avatar/2700a034dcbecff07c55d4fef09d110b?s=32&d=identicon&r=PG&f=1",
];
foreach ($data as $i => $image) getImage($image, $i, "png");
Works perfectly.
Debug: read the remote headers for more info with $http_response_header:
var_dump($http_response_header);
or stream_get_meta_data
var_dump(stream_get_meta_data($in), stream_get_meta_data($out));

How to make sure the file was saved when using cURL

<?php
$source = 'http://www.xxx.com/1.jpg';
$fileBody = date('YmdHis') . rand(1000, 9999);
$extension = pathinfo($source, PATHINFO_EXTENSION);
$fileName = $fileBody . '.' . $extension;
$ch = curl_init($source);
$fp = fopen($path . $fileName, 'wb');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_setopt($ch, CURLOPT_HEADER, 0);
curl_exec($ch);
curl_close($ch);
fclose($fp);
clearstatcache();
return $fileName;
This is how I grab an image from the internet. The image saves successfully, and I return the file name to AJAX so it can build a thumbnail immediately, but sometimes PHP returns $fileName while the download is still in progress, so JavaScript shows an empty image on the page. How can I respond only after the file has really been downloaded?
curl_exec() returns true on success and false on failure; use that information.
You can also check curl_getinfo() to make sure the transfer completed successfully and was not empty (you get http_code there, for example, as well as content_type and size_download).
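Applied to the code in the question, that could look roughly like this (a sketch, replacing everything from curl_exec($ch) onwards; $path and $fileName are the variables defined above):
$ok = curl_exec($ch);
$info = curl_getinfo($ch);
curl_close($ch);
fclose($fp); //make sure all buffered bytes reach the disk before answering the AJAX call
if (!$ok || $info['http_code'] !== 200 || $info['size_download'] == 0) {
    unlink($path . $fileName); //do not hand an empty or broken file to the front end
    return false;
}
return $fileName;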
$downComplete = false;
while (!$downComplete) {
    if (!file_exists($filePath)) {
        sleep(1);
    } else {
        $downComplete = true;
        break;
    }
}
Hi, the above is an idea for checking whether the file has been completely downloaded; I think it's useful. The main idea is to keep checking the saved file until the download has finished, and only then display the image on the front end. You can add it after your cURL code. I didn't run it myself, so it is just an idea.
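If you go this route, a bounded variant of the same idea (my own sketch, assuming $filePath points at the file cURL is writing to) avoids looping forever and also waits for the size to stop growing instead of only checking that the file exists:
$lastSize = -1;
for ($attempts = 0; $attempts < 30; $attempts++) {
    clearstatcache(); //filesize() results are cached, so reset the cache each pass
    $size = file_exists($filePath) ? filesize($filePath) : 0;
    if ($size > 0 && $size === $lastSize) {
        break; //size unchanged for one second: assume the download has finished
    }
    $lastSize = $size;
    sleep(1);
}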

Downloading big files and writing them locally

Which is the best way to download large files from PHP without consuming all of the server's memory?
I could do this (bad code):
$url='http://server/bigfile';
$cont = file_get_contents($url);
file_put_contents('./localfile',$cont);
This example loads the entire remote file into $cont, which could exceed the memory limit.
Is there a safe (maybe built-in) function to do this, perhaps one of the stream_* functions?
Thanks
You can use curl and the option CURLOPT_FILE to save the downloaded content directly to a file.
set_time_limit(0);
$fp = fopen ('file', 'w+b');
$ch = curl_init('http://remote_url/file');
curl_setopt($ch, CURLOPT_TIMEOUT, 75);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);
Here is a function I use when downloading large files. It avoids loading the entire file into memory; instead, it writes to the destination as it receives the bytes.
function download($file_source, $file_target)
{
    $rh = fopen($file_source, 'rb');
    $wh = fopen($file_target, 'wb');
    if (!$rh || !$wh) {
        return false;
    }
    while (!feof($rh)) {
        if (fwrite($wh, fread($rh, 1024)) === FALSE) {
            return false;
        }
    }
    fclose($rh);
    fclose($wh);
    return true;
}
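For example (the URL and local path here are placeholders):
if (download('http://example.com/bigfile.zip', './bigfile.zip')) {
    echo "done"; //file was streamed to disk chunk by chunk
} else {
    echo "download failed";
}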

Need a PHP script to download a file from a remote server and save it locally

I am trying to download a file from a remote server and save it to a local subdirectory.
The following code seems to work for small files (< 1 MB), but larger files just time out and don't even begin to download.
<?php
$source = "http://someurl.com/afile.zip";
$destination = "/asubfolder/afile.zip";
$data = file_get_contents($source);
$file = fopen($destination, "w+");
fputs($file, $data);
fclose($file);
?>
Any suggestions on how to download larger files without interruption?
$ch = curl_init();
$source = "http://someurl.com/afile.zip";
curl_setopt($ch, CURLOPT_URL, $source);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$data = curl_exec ($ch);
curl_close ($ch);
$destination = "/asubfolder/afile.zip";
$file = fopen($destination, "w+");
fputs($file, $data);
fclose($file);
file_get_contents shouldn't be used for big binary files because you can easily hit PHP's memory limit. I would exec() wget by telling it the URL and the desired output filename:
exec("wget $url -O $filename");
Since PHP 5.1.0, file_put_contents() supports writing piece-by-piece by passing a stream-handle as the $data parameter:
file_put_contents("Tmpfile.zip", fopen("http://someurl/file.zip", 'r'));
I always use this code; it works very well.
<?php
define('BUFSIZ', 4095);
$url = 'Type The URL Of The File';
$rfile = fopen($url, 'r');
$lfile = fopen(basename($url), 'w');
while (!feof($rfile)) {
    fwrite($lfile, fread($rfile, BUFSIZ), BUFSIZ);
}
fclose($rfile);
fclose($lfile);
?>
Use this solution if you do not know the format of the file that you are going to download.
$url = 'http://www.sth.com/some_name.format';
$parse_url = parse_url($url);
$path_info = pathinfo($parse_url['path']);
$file_extension = $path_info['extension'];
$save_path = 'any/local/path/';
$file_name = 'name' . "." . $file_extension;
file_put_contents($save_path . $file_name, fopen($url, 'r'));
Try out phpRFT: http://sourceforge.net/projects/phprft/files/latest/download?source=navbar
It has a progress bar and a simple filename detector.
A better and lighter script that streams the file:
<?php
$url = 'http://example.com/file.zip'; //Source absolute URL
$path = 'file.zip'; //Path & file name to save at the destination (currently beside the PHP script file)
$fp = fopen($path, 'w');
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch); //With CURLOPT_FILE set, curl_exec() writes straight to $fp and returns true/false
curl_close($ch);
fclose($fp);
?>
