Download Remote File in PHP [duplicate] - php

This question already has answers here:
Closed 11 years ago.
Possible Duplicate:
Forcing to download a file using PHP
PHP - send file to user
Is it possible to serve a file from another server and change its name?
I have all the files uploaded on another server, but the names are changed. I want to hit that server and fetch the file, but serve it under a different name.
I wrote a simple cURL script, but this routes the traffic through my PHP server as well, so I will be billed twice for each file; moreover it uses PHP memory, so the site may crash if the file size grows:
$init = curl_init();
curl_setopt_array($init, $opArray);
$myFile = curl_exec($init);
$info = curl_getinfo($init);
curl_close($init);
// ... send download headers ...
// echo $myFile;
All I am interested in is the headers part, but for that I need to fetch the file onto my PHP server first. Can this be avoided?

You can use the following headers:
Content-Type: xxx/xxx
Content-Disposition: attachment; filename=theFilenameYouWant.xxx
If you want to avoid heavy memory consumption, and you are using Apache as your web server, you can take a look at mod_xsendfile:
http://codeutopia.net/blog/2009/03/06/sending-files-better-apache-mod_xsendfile-and-php/
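A minimal sketch of that answer, assuming the upload lives at a hypothetical local path (a small demo file is created here in place of the real upload). The client only ever sees the name from the Content-Disposition header, not the stored name:

```php
<?php
// Sketch: stream a stored file to the client under a different name.
// The stored filename and the download name are both hypothetical.
$stored = 'stored_file.bin';
file_put_contents($stored, 'demo payload'); // stand-in for the real upload

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="theFilenameYouWant.xxx"');
header('Content-Length: ' . filesize($stored));
readfile($stored); // streams the file without loading it all into memory
```

With mod_xsendfile installed, the `readfile()` call would be replaced by `header('X-Sendfile: ' . $stored);` and Apache would handle the streaming itself.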

Related

server http doesn't work while updating php file via ftp [duplicate]

This question already has answers here:
How to update a website without making it go down?
(3 answers)
Closed 3 years ago.
When I update a PHP file via FTP (FileZilla), pages using that file stop working until the transfer is complete. The server is Linux with nginx/php-fpm, but I had the same problem with Apache. The only "solution" I have found is to edit the PHP file directly in a remote shell on the server and update its content, but that is very uncomfortable.
Is there anybody with a better solution?
Thanks
It is normal if you upload via FTP: the file is rewritten in place, so requests can hit a half-written copy. The best solution is a continuous-deployment service with a zero-downtime approach:
Continuous deployment without downtime
But if you are talking about just one file, you can check whether the new file exists (and was uploaded correctly) and otherwise fall back to the old copy. Something like this:
$file = 'uploaded.php';
$oldFile = 'uploaded_old.php';
if (file_exists($file)) {
require_once($file);
} else {
require_once($oldFile);
}
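A related trick that avoids the fallback file entirely: upload under a temporary name, then rename() it into place. A rename on the same filesystem is atomic, so readers always see either the old or the new complete file, never a partial one. The filenames here are hypothetical:

```php
<?php
// Sketch: deploy a file atomically. The FTP client (or deploy script)
// uploads to uploaded.php.tmp; once the transfer is complete, rename()
// swaps it in, so no request ever sees a half-written file.
$tmp   = 'uploaded.php.tmp';
$final = 'uploaded.php';

file_put_contents($tmp, 'new version'); // stand-in for the FTP upload
rename($tmp, $final);                   // atomic on the same filesystem
```

Many FTP clients (including FileZilla) can be configured to do exactly this via an "upload to temporary file, then rename" transfer option.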

download file from other server to client directly php [duplicate]

This question already has answers here:
Download file from FTP server directly to browser with PHP
(2 answers)
Closed 7 years ago.
My site is located on server 1 and some files are located on server 2. When I try to send a file from server 2 to the client using ftp_get() or cURL, those functions download the file from server 2 to server 1, not to the client.
Server 2 is a private FTP server. If this is not possible with FTP, is there any solution other than FTP?
What is the best way to store files on server 2?
One easy way is to use cross-domain-enabled Ajax to send the file (POST method) to the other server, and use the header below on server 2 to accept POST data from server 1:
header('Access-Control-Allow-Origin: http://server1.com');
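If server 2's files are reachable as a stream (PHP's fopen() supports ftp:// URLs), a sketch like the one below at least avoids storing the file on server 1: it copies the remote stream straight through to the client. The FTP URL and credentials are placeholders:

```php
<?php
// Hedged sketch: pipe a remote file to the client without saving it on
// server 1 first. The URL and download name passed in are hypothetical.
function streamToClient(string $url, string $downloadName): void
{
    $src = fopen($url, 'rb');
    header('Content-Type: application/octet-stream');
    header('Content-Disposition: attachment; filename="' . $downloadName . '"');
    fpassthru($src); // copies the stream to the client chunk by chunk
    fclose($src);
}

// usage (placeholder credentials):
// streamToClient('ftp://user:pass@server2.example/path/file.zip', 'file.zip');
```

Note the bytes still pass through server 1's bandwidth; only the disk and memory footprint is avoided.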

Best way to get file size of large files (> 2 GB) in PHP? [duplicate]

This question already has answers here:
PHP x86 How to get filesize of > 2 GB file without external program?
(13 answers)
Closed 7 years ago.
I am working with large files in PHP and need a RELIABLE way to get the size of files over 4 GB, but PHP runs into problems with files over 2 GB. So far I have only seen solutions involving command-line exec calls, but the script will be used as a standalone console application, so I am hesitant to use exec since it might behave differently on different platforms. The only way left, as I see it, is to read all the data and count the bytes, but that would be VERY slow. I need a fast and reliable way that behaves the same on many different computers (Linux, Windows, Mac).
This previously asked question seems very similar and has some ideas in it that you could explore:
PHP x86 How to get filesize of > 2 GB file without external program?
In it the author comes up with a solution that he hosts on GitHub: https://github.com/jkuchar/BigFileTools/blob/master/src/BigFileTools.php
Beyond that, if you are running a 32-bit system, files over 2 GB will be troublesome in PHP. From http://php.net/manual/en/function.filesize.php:
Note: Because PHP's integer type is signed and many platforms use
32bit integers, some filesystem functions may return unexpected
results for files which are larger than 2GB.
The code below works for any file size on any version of PHP, OS, or web server:
// HTTP HEAD request to a local URL to get the file size
$opts = array('http' => array('method' => 'HEAD'));
$context = stream_context_create($opts);
// Change the URL below to the URL of your file. Do NOT change it to a
// file path -- you MUST use an http:// URL for the HEAD request to work.
// SECURITY: add an .htaccess rule which denies all requests for this file
// except those coming from the local IP 127.0.0.1.
// $tmp will contain 0 bytes: since this is a HEAD request, no data is
// actually downloaded -- we only want the file size.
$tmp = file_get_contents('http://127.0.0.1/pages-articles.xml.bz2', false, $context);
$size = 0;
foreach ($http_response_header as $rcd) {
    if (stripos(trim($rcd), 'Content-Length:') === 0) {
        $size = floatval(trim(str_ireplace('Content-Length:', '', $rcd)));
    }
}
echo "File size = $size bytes";
// example output for a 9 GB local file:
// File size = 10082006833 bytes

How to keep a file open in PHP? [duplicate]

This question already has answers here:
Closed 10 years ago.
Possible Duplicate:
File resource persistence in PHP
If you open a file with fopen(), it is closed and its handle freed at the end of the PHP script, even without fclose():
$fp = fopen('data.txt', 'w');
fwrite($fp, 'text');
fclose($fp);
Now if this is a frequently used script, we open and close the file against the filesystem on every run (a lot of file I/O). It would be better to keep the file open; this is the technique database systems use.
Is there a function in PHP that leaves a file open and does not re-open it on the next run?
Or how can we set up a semi-server to keep a file open for frequent access from PHP?
No, you can't. You can open the file at the start of your script(s) and close it at the end, and operate on the handle as much as you like in between. For example, you can open the file in the header of your site and close it in the footer.
To solve the task you describe, you might want to look at the PHP extension called memcached. It stores pieces of information in the machine's RAM so you can reuse them later, and you can set an expiration time on each piece.
Take a look at the module here: http://www.php.net/manual/en/book.memcached.php
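A hedged sketch of the memcached idea, assuming ext/memcached is installed and a daemon is listening on 127.0.0.1:11211 (the key prefix and 60-second TTL are arbitrary choices):

```php
<?php
// Sketch: serve file contents from RAM via memcached, falling back to
// disk only on a cache miss. Assumes a memcached daemon is reachable.
function cachedFileContents(Memcached $cache, string $path): string
{
    $key  = 'file:' . $path;
    $data = $cache->get($key);
    if ($data === false) {            // cache miss: hit the filesystem once
        $data = file_get_contents($path);
        $cache->set($key, $data, 60); // keep it in RAM for 60 seconds
    }
    return $data;
}

// usage (assumes a local memcached daemon):
// $cache = new Memcached();
// $cache->addServer('127.0.0.1', 11211);
// echo cachedFileContents($cache, 'data.txt');
```

This doesn't keep the file handle open, but it achieves the underlying goal: repeated reads stop touching the filesystem.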
You could lock the file using flock(). Since PHP 5.3.2 the file remains locked until it is explicitly unlocked, so make sure the PHP version on the server running this code is at least 5.3.2.
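A short sketch of that flock() pattern (the filename is arbitrary): take an exclusive lock before writing so concurrent requests don't interleave their writes, and release it explicitly.

```php
<?php
// Sketch: exclusive-lock a file around a write.
$fp = fopen('data.txt', 'c'); // create if missing, don't truncate
if (flock($fp, LOCK_EX)) {    // blocks until the exclusive lock is granted
    fwrite($fp, 'text');
    fflush($fp);              // push the write out before unlocking
    flock($fp, LOCK_UN);      // release explicitly (required since PHP 5.3.2)
}
fclose($fp);
```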
There is no way to keep a file open across several executions of the same script (luckily!).
Opening a file is not very expensive; I suspect it is not worth setting up a "semi-server" just to keep the file open.
Why do you need that?

simultaneously download on server and on user

I am currently developing a PHP application in which my server (a dedicated server) must download a file while the user downloads the same file at the same time.
Here is an example:
The server starts downloading a file at time A.
The user wants to download this file at time A + 3 seconds (for example).
I already solved the problem of the user downloading faster than the server. But I don't know how to write a PHP script that makes the user download the full file (meaning the advertised size must be the full size of the file, not the size currently downloaded at time A + 3 seconds). I already have this:
header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="'.$data['name'].'";');
header('Content-Transfer-Encoding: binary');
header('Content-Length: '.$data['size']);
readfile($remoteFile);
But it doesn't work: the user downloads only the portion currently on the server (which corrupts the file), not the full file.
If you have any solution, thank you.
You could probably pipe the file manually by opening the connection and reading until you're past all the headers. Once you've found the Content-Length, send it to the user and then echo all remaining data as you receive it (use flush() and avoid output buffers).
Pseudocode(-ish):
open the connection to the source
# grab headers
while you haven't received all HTTP headers:
    read more
look for the Content-Length header
send the Content-Length header to the user
# grab the file
while the rest of the response isn't done:
    read more
    send it to the user
    flush the buffers
done
Expanding on Tom's answer, you can use cURL to greatly simplify the algorithm with the CURLOPT_HEADERFUNCTION and CURLOPT_WRITEFUNCTION callbacks; see curl_setopt(). (The write callback receives each downloaded chunk; the header callback receives each response header line.)
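A hedged sketch of that cURL approach (the remote URL is a placeholder): the header callback forwards the upstream Content-Length, so the client sees the full size even while server 1 is still fetching, and the write callback echoes each body chunk as it arrives instead of buffering the whole file in memory.

```php
<?php
// Sketch: pipe a remote file through to the client with cURL callbacks.
function pipeRemoteFile(string $url): void
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_HEADERFUNCTION, function ($ch, $line) {
        if (stripos($line, 'Content-Length:') === 0) {
            header(trim($line)); // forward the full remote size
        }
        return strlen($line);    // tell cURL the line was consumed
    });
    curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $chunk) {
        echo $chunk;             // send the chunk on to the client
        flush();                 // push it out immediately, don't buffer
        return strlen($chunk);   // tell cURL the chunk was consumed
    });
    curl_exec($ch);
    curl_close($ch);
}

// usage (placeholder URL):
// pipeRemoteFile('http://server2.example/file.zip');
```

Note this only helps if the source reports the true Content-Length; it does not solve the original race where the partially downloaded copy on server 1 is read directly.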
Don't send the Content-Length header. It's not required if you're using HTTP/1.1 (which your web server almost certainly is). The drawback is that the user's browser can't show the download size or the time remaining.
