I would like to know the best way to save an image from a URL in PHP.
At the moment I am using
file_put_contents($pk, file_get_contents($PIC_URL));
which is not ideal. I am unable to use curl. Is there a method specifically for this?
Using file_get_contents is fine, unless the file is very large. In that case, you don't really need to be holding the entire thing in memory.
For a large retrieval, you could fopen the remote file, fread it, say, 32KB at a time, and fwrite it locally in a loop until all the file has been read.
For example:
$fout = fopen('/tmp/verylarge.jpeg', 'wb'); // local destination, binary-safe mode
$fin = fopen("http://www.example.com/verylarge.jpeg", "rb"); // remote source

// Copy 32 KB at a time so the whole image never sits in memory
while (!feof($fin)) {
    $buffer = fread($fin, 32 * 1024);
    fwrite($fout, $buffer);
}

fclose($fin);
fclose($fout);
(Devoid of error checking for simplicity!)
Alternatively, you could forgo the URL wrappers and use a class like PEAR's HTTP_Request, or roll your own HTTP client code using fsockopen() etc. This would enable you to do efficient things like send If-Modified-Since headers if you are maintaining a cache of remote files.
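For illustration, here is a minimal sketch of such a conditional GET over fsockopen() (the host, path, and cache file below are assumptions, and error checking is omitted):

$host = 'www.example.com';
$path = '/verylarge.jpeg';
$cacheFile = '/tmp/verylarge.jpeg';

$fp = fsockopen($host, 80, $errno, $errstr, 30);
if ($fp) {
    // Ask the server to skip the body if our cached copy is still current
    $request  = "GET $path HTTP/1.0\r\n";
    $request .= "Host: $host\r\n";
    $request .= "If-Modified-Since: " . gmdate('D, d M Y H:i:s', filemtime($cacheFile)) . " GMT\r\n";
    $request .= "Connection: close\r\n\r\n";
    fwrite($fp, $request);

    // A "304 Not Modified" status means the cached copy can be kept as-is
    $statusLine = fgets($fp, 1024);
    if (strpos($statusLine, ' 304 ') !== false) {
        // Not modified: skip the download entirely
    }
    fclose($fp);
}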
I'd recommend using Paul Dixon's strategy, but replacing fopen with fsockopen(). The reason is that some server configurations disallow URL access for fopen() and file_get_contents(). The setting may be found in php.ini and is called allow_url_fopen.
Related
Let's say I'd like to download some information from a file on the internet within PHP, but I do not need the entire file. Therefore, loading the full file through
$my_file = file_get_contents("https://www.webpage.com/".$filename);
would use up more memory and resources than necessary.
Is there a way to download only e.g. the first 5kb of a file as plain text with PHP?
EDIT:
In the comments it was suggested to use e.g. the maxlen arg for file_get_contents or similar. But I noticed that the execution time of the call does not vary appreciably for different maxlen values, which suggests that the function loads the full file and then just returns a substring to the variable.
Is there a way to make PHP download just the required amount of bytes and no more, to speed things up?
<?php
// Open the remote file and read only the first 5 kB
$fp = fopen("https://www.webpage.com/".$filename, "r");
$content = fread($fp, 5 * 1024);
fclose($fp);
?>
Note: Make sure allow_url_fopen is enabled.
PHP Doc: fopen, fread
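One caveat: on network streams fread() may return fewer bytes than requested (it can stop at a packet boundary), so a small read loop is more robust. A minimal sketch, assuming the same URL as above:

$fp = fopen("https://www.webpage.com/".$filename, "r");
$content = '';
$limit = 5 * 1024; // stop after 5 kB
while (!feof($fp) && strlen($content) < $limit) {
    $chunk = fread($fp, $limit - strlen($content));
    if ($chunk === false) {
        break; // read error
    }
    $content .= $chunk;
}
fclose($fp);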
Some PHP functions, like fopen(), have a return value of type "resource".
However, most of these functions require some actual outside resource, such as a file or database, or they require an additional PHP extension to be installed, such as the cURL extension for curl_init().
I sometimes want to experiment with different value types on https://3v4l.org, where I cannot rely on external resources.
Another scenario where this might be relevant is unit tests, where we generally want as few side effects as possible.
So, what is the simplest way to get a value of type resource, without external side effects, 3rd party extensions, or external dependencies?
I use
fopen('php://memory', 'w'); or fopen('php://temp', 'w'); when I just need a file stream resource to play with.
php://temp is better if the buffer will exceed 2 MB, since it transparently spills over into a temporary file at that point.
You can use php://memory or php://temp as a resource. The first one doesn't even need access to the system /tmp folder.
Example:
$resource = fopen('php://temp', 'w+');
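The stream then behaves like any other file handle; a quick sketch:

$resource = fopen('php://temp', 'w+');
fwrite($resource, 'hello');
rewind($resource);                    // seek back to the start
echo stream_get_contents($resource); // prints "hello"
fclose($resource);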
The best I've come up with so far is tmpfile().
It does work on https://3v4l.org/00VlY. They have probably set up some kind of sandbox filesystem.
$resource = tmpfile();
var_dump(gettype($resource)); // string(8) "resource"
var_dump($resource);          // resource of type (stream)
var_dump(intval($resource));  // the resource's numeric ID
I would say it is still not completely free of side effects, because it does something with a file somewhere. Better ideas are welcome!
Question about this helper http://codeigniter.com/user_guide/helpers/download_helper.html
If, for example, program.exe weighs 4 GB, will it take a lot of PHP memory for reading and delivering that file?
$data = file_get_contents("/path/to/program.exe"); // Read the file's contents
$name = 'software.exe';
force_download($name, $data);
The force_download function just sets the proper HTTP headers to make the client's browser treat the response as a download; it doesn't touch the file itself, it simply outputs the $data string you pass in.
Check the helper source code, if you need: https://bitbucket.org/ellislab/codeigniter-reactor/src/31b5c1dcf2ed/system/helpers/download_helper.php
Edit: I'd suggest creating your own version of the helper, and, instead of using strlen to get the file size, use the PHP function filesize, which takes only the file name as argument and returns the size in bytes.
More info at http://www.php.net/manual/en/function.filesize.php
Yea... that could get... bad...
file_get_contents reads the entire contents of a file into a string. For large files, that can get, well, bad. I would look into readfile, which streams the file straight to the output buffer instead. Please remember, too: since CI automatically caches output when you load a view, there will be no discernible benefit to readfile if it is used in a CI view. It would almost be better to handle this with an external script, or by outputting directly from the controller and not calling the view at all.
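As a minimal sketch of that idea (the path and filename are placeholders, not the helper's actual code):

$path = '/path/to/program.exe';
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="software.exe"');
header('Content-Length: ' . filesize($path)); // size from disk, no need to read the file first
readfile($path); // streams the file to the client without loading it all into memory
exit;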
I have a function that will be passed in a link. The link is to a remote image. I thought I could just use the extension of the file in the URL to determine the type of image but some URLs won't have extensions in the URL. They probably just push headers to the browser and therefore I do not have an extension to parse from the URL.
How can I test if the URL has an extension and if not then read the headers to determine the file type?
Am I overcomplicating things here? Is there an easier way to do this? I am making use of CodeIgniter; maybe there is something already built in to do this?
All I really want to do is download an image from a URL with the correct extension.
This is what I have so far.
function get_image($image_link){
    $remoteFile = $image_link;
    $ext = ''; // some URLs might not have an extension
    $file = fopen($remoteFile, "r");
    if (!$file) {
        return false;
    } else {
        $line = '';
        while (!feof($file)) {
            $line .= fgets($file, 4096);
        }
        $file_name = time().$ext;
        file_put_contents($file_name, $line);
    }
    fclose($file);
}
Thanks all for any help
You never want to switch on the file extension. For example, not all ASCII text files will have ".txt", and then there's always the fun ".jpg" versus ".jpeg". Instead, switch on the Content-Type header that the web server responds with. (See http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.17).
Some web servers will also respect the Accept header (see http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.1). So, if you know you only want PNG image files, then send Accept: image/png. If the server respects it, then its response will be much smaller than sending the whole unwanted file over the network connection. However, because not all web servers do this well (if at all), make sure you still switch on the Content-Type response header. It's just a bandwidth saving thing.
You could also send Accept: image/* to get any type of image.
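For example, with the HTTP stream wrapper you can attach an Accept header through a context (a sketch, assuming $url holds the image link):

$context = stream_context_create(array(
    'http' => array('header' => "Accept: image/png\r\n"),
));
$data = file_get_contents($url, false, $context);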
Cheers.
You should always use the Content-Type response header to determine the type of data received; from that you can set the correct extension. URLs do not have extensions, and you should not rely on anything after a period being one.
You won't easily read those headers with fsock/fread, since you'd have to parse the raw response yourself, and file_get_contents() tucks them away in the background (though with the HTTP wrapper they are exposed via the $http_response_header variable and stream_get_meta_data()). The cURL functions are the most convenient route: set up a proper HTTP GET or HEAD session and you can retrieve full details of the transfer, including the Content-Type.
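As a sketch of the cURL approach (the extension map is an illustrative assumption):

$ch = curl_init($image_link);
curl_setopt($ch, CURLOPT_NOBODY, true);         // headers only, no body
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
$type = curl_getinfo($ch, CURLINFO_CONTENT_TYPE);
curl_close($ch);

$map = array('image/jpeg' => '.jpg', 'image/png' => '.png', 'image/gif' => '.gif');
$ext = isset($map[$type]) ? $map[$type] : '';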
Is file_get_contents() enough for downloading remote movie files located on a server?
I just think that perhaps storing large movie files in a string is harmful, according to the PHP docs.
Or do I need to use cURL? I don't know cURL.
UPDATE: these are big movie files, around 200 MB each.
file_get_contents() is a problem because it's going to load the entire file into memory in one go. If you have enough memory to support the operation (taking into account that if this is a web server, you may have multiple simultaneous hits that generate this behavior, each needing that much memory), then file_get_contents() should be fine. However, it's not the right way to do it: you should use a library specifically intended for these sorts of operations. As mentioned by others, cURL will do the trick, or wget. You might also have good luck using fopen('http://someurl', 'r') and reading blocks from the file, then dumping them straight to a local file that's been opened with write privileges.
As #mopoke suggested it could depend on the size of the file. For a small movie it may suffice. In general I think cURL would be a better fit though. You have much more flexibility with it than with file_get_contents().
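For example, cURL can write the response straight to disk so the movie never has to fit in memory (a sketch; the URL and path are placeholders):

$fp = fopen('/tmp/movie.avi', 'wb');
$ch = curl_init('http://somewhere/movie.avi');
curl_setopt($ch, CURLOPT_FILE, $fp);            // write the body directly to $fp
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects
$ok = curl_exec($ch);
curl_close($ch);
fclose($fp);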
For the best performance you may find it makes sense to just use a standard Unix utility like wget. You should be able to call it with system("wget ...") or exec().
http://www.php.net/manual/en/function.system.php
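A sketch of that approach (the paths are placeholders; escapeshellarg() guards the arguments):

$src = 'http://somewhere/test.avi';
$dst = '/tmp/test.avi';
exec('wget -q -O ' . escapeshellarg($dst) . ' ' . escapeshellarg($src), $output, $status);
if ($status !== 0) {
    // wget failed; inspect $output or $status
}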
You can read a few bytes at a time using fread().
$src="http://somewhere/test.avi";
$dst="test.avi";
$f = fopen($src, 'rb');
$o = fopen($dst, 'wb');
while (!feof($f)) {
if (fwrite($o, fread($f, 2048)) === FALSE) {
return 1;
}
}
fclose($f);
fclose($o);