Copy file from remote server using PHP over HTTP

I want to copy a file using PHP over HTTP from a link in this format:
http://myserver.com/?id=1234
If I open the link, the download of the file starts, so I assume the server redirects to a .mp3 file to start the download.
How can I copy/download the file from the remote server to my server (localhost)?

Just to give an example of what Victor is talking about with cURL:
// CURLOPT_FILE expects an open file handle, not a path string
$fp = fopen('/local/path/for/file.mp3', 'wb');

$options = array(
    CURLOPT_FILE => $fp,
    CURLOPT_FOLLOWLOCATION => true,
    CURLOPT_URL => 'http://myserver.com/?id=1234',
);

$ch = curl_init();
curl_setopt_array($ch, $options);
curl_exec($ch);
curl_close($ch);
fclose($fp);

I'm assuming here that the remote server sends the complete file over HTTP. You can use a library such as cURL to send an HTTP request and store the received data in a file (using CURLOPT_FILE, as above).
If your PHP installation has allow_url_fopen enabled, you can also use copy() to copy from a remote URL to a local path.
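For example, a minimal sketch using copy(), assuming allow_url_fopen is enabled (the URL and local path are placeholders from the question):

// Sketch: copy a remote URL straight to a local file
// (requires allow_url_fopen = On)
if (!copy('http://myserver.com/?id=1234', '/local/path/for/file.mp3')) {
    echo "Failed to copy the remote file.\n";
}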

$handle = fopen("http://www.example.com/", "rb");
$contents = '';
while (!feof($handle)) {
    $contents .= fread($handle, 8192);
}
fclose($handle);
From http://php.net/manual/en/function.fread.php
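To copy the file to disk without holding it all in memory, the same loop can write each chunk as it arrives (a sketch, assuming allow_url_fopen is enabled; the URL and path are placeholders):

// Sketch: stream the remote file to a local file in 8 KB chunks
$in = fopen("http://myserver.com/?id=1234", "rb");
$out = fopen("/local/path/for/file.mp3", "wb");
while (!feof($in)) {
    fwrite($out, fread($in, 8192));
}
fclose($in);
fclose($out);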

Try using a notification callback (read here for more information: http://www.php.net/manual/function.stream-notification-callback.php).
E.g. you could do this if you want to copy:
function stream_notification_callback($notification_code, $severity, $message, $message_code, $bytes_transferred, $bytes_max)
{
    if ($notification_code == STREAM_NOTIFY_PROGRESS) {
        // save $bytes_transferred and $bytes_max to file or database
    }
}

$ctx = stream_context_create();
stream_context_set_params($ctx, array("notification" => "stream_notification_callback"));
copy($remote_url, $local_target, $ctx);
Another PHP file could read the saved $bytes_transferred and $bytes_max and show a nice progress bar.
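As a sketch of that idea (the progress.txt name and format are just an assumption), the callback could persist the two values and a second script could read them back:

// Inside the callback: persist progress (hypothetical progress.txt format)
if ($notification_code == STREAM_NOTIFY_PROGRESS) {
    file_put_contents('progress.txt', "$bytes_transferred/$bytes_max");
}

// In another script: read it back and compute a percentage
list($done, $total) = explode('/', file_get_contents('progress.txt'));
$percent = $total > 0 ? round(100 * $done / $total) : 0;
echo "Download progress: {$percent}%";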

Related

PHP: read a remote file (ideally using fopen)

I'd like to read a remote text file (ideally using fopen) using PHP. My script works with fopen when I use the function on a local file.
I've tried:
$file = fopen("http://abc.abc.abc", "r");
if (!$file) {
    echo "<p>Unable to open remote file.\n";
    exit;
}
and I got:
Warning: fopen(http://abc.abc.abc): failed to open stream: No connection could be made because the target machine actively refused it. in C:\xampp\htdocs\NMR\nmrTest5.php on line 2 Unable to open remote file.
I've read that phpseclib could be a good option, and since I can access my files using WinSCP (SFTP) or PuTTY, I tried this (after copying all the phpseclib files to my directory), hoping that I could copy the file locally and then read it with fopen (not ideal for me, but I could live with that):
include('Net/SFTP.php');

$sftp = new Net_SFTP('abc.abc.abc');
if (!$sftp->login('username', 'pass')) {
    exit('Login Failed');
}
and I got:
Notice: No compatible server to client encryption algorithms found in C:\xampp\htdocs\NMR\Net\SSH2.php on line 1561
Login Failed
Interestingly, I got a different message while I was connected to the server (using WinSCP):
Notice: Error reading from socket in C:\xampp\htdocs\NMR\Net\SSH2.php on line 3362
Notice: Connection closed by server in C:\xampp\htdocs\NMR\Net\SSH2.php on line 1471
Login Failed
Any idea how I could get this to work? Ideally I would use fopen, but I'm open to other solutions.
I've just been working through this exact problem myself and couldn't find any good documentation in any one single place for how to accomplish this.
I have just made a logging service that uses Monolog and basically makes a custom stream handler based on the log files that are being written to/created. As such it requires a resource (such as one created by fopen) in order to write the log files to an SFTP server.
I had it working using the ssh2 library like this:
$connection = ssh2_connect($this->host, 22);
ssh2_auth_password($connection, $this->user, $this->password);
$sftp = ssh2_sftp($connection);

// some stuff to do with whether the file already exists or not

$fh = fopen("ssh2.sftp://$sftp" . ssh2_sftp_realpath($sftp, ".") . "/$this->logName/$this->fileName", 'a+');
return new StreamHandler($fh);
Everything was working beautifully until I went to integrate the service into a different project and realised this was only working on my development machine because it has the libssh2 library installed as outlined in this question.
Unfortunately, the production server is not so easy to add libraries to. I therefore found myself looking for a different solution.
I have used phpseclib in other projects but only for basic get(), put() and some nlist() calls.
In order to get this working I had to use a Stream object. Not very well documented but there is a good discussion here.
Based on the info there, plus some digging around in the SFTP class (particularly the get() function), this is how I managed to achieve the same functionality using phpseclib:
SFTP\Stream::register();

$sftpFileSystem = new SFTP($this->host);
if (!$sftpFileSystem->login($this->user, $this->password)) {
    throw new Exception("Error logging in to central logging system. Please check the local logs and email for details", 1);
}

$context = [
    'sftp' => [
        'sftp' => $sftpFileSystem
    ],
];

// some stuff to do with whether the file already exists or not
$remote_file = $sftpFileSystem->realpath('test.txt');

$sftpStream = fopen("sftp://.{$remote_file}", 'a+', null, stream_context_create($context));
if (!$sftpStream) {
    exit(1);
}

return new StreamHandler($sftpStream);
Note the dot (.) after sftp:// in the call to fopen(); it stands in for the host, since the actual connection is supplied via the stream context. It wasted me a good half an hour!
This is how I ended up fixing my problem using phpseclib, as suggested by @neubert in the comments on my question.
I first added the phpseclib folder on my server. Then I used this code in my PHP file to get access to my file on a remote server:
// needed for phpseclib
set_include_path(get_include_path() . PATH_SEPARATOR . 'phpseclib');
include_once('Net/SFTP.php');

// connection to the server
$sftp = new Net_SFTP('abc.abc.abc');
if (!$sftp->login('my_login', 'my_password')) {
    exit('Login Failed');
}

// dump the entire file into a string, convert it to an array and count the lines
$text = $sftp->get('full_path_to_my_file');
$myArray = explode("\n", $text);
$nbline = count($myArray);
I have faced similar issues with fopen.
cURL is useful for these purposes.
Please check the following basic example function (if the URL is HTTPS, you can uncomment the CURLOPT_SSL_VERIFYPEER => FALSE line, but note that this disables certificate verification and is insecure):
$url = '***THE URL***';

$result = get_web_page_by_curl($url);
if ($result['errno'] != 0) {
    echo 'error: bad url, timeout, redirect loop ...';
} elseif ($result['http_code'] != 200) {
    echo 'error: no page, no permissions, no service ...';
} else {
    $page = $result['content'];
    echo $page;
}
function get_web_page_by_curl($url) {
    $agent = "Mozilla/5.0 (Windows; U; Windows NT 5.0; en-US; rv:1.4) Gecko/20030624 Netscape/7.1 (ax)";

    $options = array(
        CURLOPT_RETURNTRANSFER => true,    // return web page
        CURLOPT_HEADER => false,           // don't return headers
        CURLOPT_FOLLOWLOCATION => true,    // follow redirects
        CURLOPT_ENCODING => "",            // handle all encodings
        CURLOPT_USERAGENT => $agent,       // who am i
        CURLOPT_AUTOREFERER => true,       // set referer on redirect
        CURLOPT_CONNECTTIMEOUT => 120,     // timeout on connect
        CURLOPT_TIMEOUT => 120,            // timeout on response
        CURLOPT_MAXREDIRS => 10,           // stop after 10 redirects
        //CURLOPT_SSL_VERIFYPEER => FALSE  // disables certificate verification for HTTPS (insecure)
    );

    $ch = curl_init($url);
    curl_setopt_array($ch, $options);
    $content = curl_exec($ch);
    $err = curl_errno($ch);
    $errmsg = curl_error($ch);
    $header = curl_getinfo($ch);
    curl_close($ch);

    $header['errno'] = $err;
    $header['errmsg'] = $errmsg;
    $header['content'] = $content;
    return $header;
}

Creating a zip file on the fly from files stored on S3 using php

I have a Laravel web app in which users can upload files. These files can be sensitive and although they are stored on S3 they are only accessed via my webservers (streamed download). Once uploaded users may wish to download a selection of these files.
Previously when users went to download a selection of files my web server would download the files from S3, zip them locally and then send the zip down to the client. However once in production due to file sizes the server response would frequently time out.
As an alternative method I want to zip the files on the fly via ZipStream but I haven't had much luck. The zip file either ends up with corrupted files or is corrupted itself and incredibly small.
Is it possible to pass a stream resource for a file on S3 to ZipStream, and what is the best way to address my timeout issues?
I have tried several methods; my most recent two are as follows:
// First method using fopen
// Results in tiny corrupt zip files
if (!($fp = fopen("s3://{$bucket}/{$key}", 'r'))) {
    die('Could not open stream for reading');
}
$zip->addFileFromPath($file->orginal_filename, "s3://{$bucket}/{$key}");
fclose($fp);

// Second method: tried downloading the file from S3 before zipping
// Results in a reasonable sized zip file that is corrupt
$contents = file_get_contents("s3://{$bucket}/{$key}");
$zip->addFile($file->orginal_filename, $contents);
Each of these sits within a loop that goes through each file. After the loop I call $zip->finish().
Note that I do not get any PHP errors, just corrupt files.
In the end the solution was to use signed S3 URLs and cURL to provide a file stream for ZipStream, as demonstrated by s3 bucket stream zip php. The resulting code, edited from the aforementioned source, is as follows:
public function downloadZip()
{
    // ...
    $s3 = Storage::disk('s3');
    $client = $s3->getDriver()->getAdapter()->getClient();
    $client->registerStreamWrapper();
    $expiry = "+10 minutes";

    // Create a new zipstream object
    $zip = new ZipStream($zipName . '.zip');

    foreach ($files as $file) {
        $filename = $file->original_filename;

        // We need to use a command to get a request for the S3 object
        // and then we can get the presigned URL.
        $command = $client->getCommand('GetObject', [
            'Bucket' => config('filesystems.disks.s3.bucket'),
            'Key' => $file->path()
        ]);
        $signedUrl = $request = $client->createPresignedRequest($command, $expiry)->getUri();

        // We want to fetch the file to a file pointer so we create it here
        // and create a curl request and store the response into the file
        // pointer.
        // After we've fetched the file we add the file to the zip file using
        // the file pointer and then we close the curl request and the file
        // pointer.
        // Closing the file pointer removes the file.
        $fp = tmpfile();
        $ch = curl_init($signedUrl);
        curl_setopt($ch, CURLOPT_TIMEOUT, 120);
        curl_setopt($ch, CURLOPT_FILE, $fp);
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
        curl_exec($ch);
        curl_close($ch);
        $zip->addFileFromStream($filename, $fp);
        fclose($fp);
    }

    $zip->finish();
}
Note this requires curl and php-curl to be installed and functioning on your server.
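Before relying on this in production, a quick defensive check (just a sketch) can verify the curl extension is actually available:

// Sketch: fail early if the curl extension is missing
if (!extension_loaded('curl')) {
    throw new RuntimeException('This download method requires php-curl.');
}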
I had the same issues as @cubiclewar and investigated a little bit. I found that the most up-to-date solution to this doesn't need cURL; it is visible on the wiki for the maennchen/ZipStream-PHP library:
https://github.com/maennchen/ZipStream-PHP/wiki/Symfony-example
use ZipStream;

//...

/**
 * @Route("/zipstream", name="zipstream")
 */
public function zipStreamAction()
{
    // sample test file on s3
    $s3keys = array(
        "ziptestfolder/file1.txt"
    );

    $s3Client = $this->get('app.amazon.s3'); // s3client service
    $s3Client->registerStreamWrapper(); // required

    // using StreamedResponse to wrap ZipStream functionality for files on AWS s3.
    $response = new StreamedResponse(function() use ($s3keys, $s3Client) {
        // Define suitable options for ZipStream Archive.
        $opt = array(
            'comment' => 'test zip file.',
            'content_type' => 'application/octet-stream'
        );

        // initialise zipstream with output zip filename and options.
        $zip = new ZipStream\ZipStream('test.zip', $opt);

        // loop keys - useful for multiple files
        foreach ($s3keys as $key) {
            // Get the file name in the S3 key so we can save it to the zip
            // file using the same name.
            $fileName = basename($key);

            // concatenate s3path.
            $bucket = 'bucketname'; // replace with your bucket name or get from parameters file.
            $s3path = "s3://" . $bucket . "/" . $key;

            // addFileFromStream
            if ($streamRead = fopen($s3path, 'r')) {
                $zip->addFileFromStream($fileName, $streamRead);
            } else {
                die('Could not open stream for reading');
            }
        }

        $zip->finish();
    });

    return $response;
}

How to speed up file_get_contents?

Here's my code:
$language = $_GET['soundtype'];
$word = $_GET['sound'];
$word = urlencode($word);

if ($language == 'english') {
    $url = "<the first url>";
} else if ($language == 'chinese') {
    $url = "<the second url>";
}

$opts = array(
    'http' => array(
        'method' => "GET",
        'header' => "User-Agent: <my user agent>"
    )
);

$context = stream_context_create($opts);
$page = file_get_contents($url, false, $context);

header('Content-Type: audio/mpeg');
echo $page;
But I've found that this runs terribly slow.
Are there any possible methods of optimization?
Note: $url is a remote url.
It's slow because file_get_contents() reads the entire file into $page; PHP waits for the whole file to be received before outputting anything. So what you're doing is downloading the entire file on the server side, then outputting it as a single huge string.
file_get_contents() does not support streaming or grabbing offsets of the remote file. One option is to create a raw socket with fsockopen(), do the HTTP request manually, and read the response in a loop, outputting each chunk to the browser as you read it. This will be faster because the file is streamed.
Example from the Manual:
$fp = fsockopen("www.example.com", 80, $errno, $errstr, 30);
if (!$fp) {
    echo "$errstr ($errno)<br />\n";
} else {
    header('Content-Type: audio/mpeg');
    $out = "GET / HTTP/1.1\r\n";
    $out .= "Host: www.example.com\r\n";
    $out .= "Connection: Close\r\n\r\n";
    fwrite($fp, $out);
    while (!feof($fp)) {
        echo fgets($fp, 128);
    }
    fclose($fp);
}
The above loops while there is still content available; on each iteration it reads up to 128 bytes and outputs them to the browser. The same principle will work for what you're doing. You'll need to make sure that you don't output the HTTP response headers, which will be the first few lines: since you are doing a raw request, you will get the raw response with headers included, and if you output them you will end up with a corrupt file.
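A minimal sketch of that header-skipping step, assuming the $fp socket from the example above (read header lines until the first blank line, then stream only the body):

// Skip the HTTP response headers: they end at the first empty line
while (($line = fgets($fp, 1024)) !== false) {
    if (rtrim($line, "\r\n") === '') {
        break; // end of headers; the body follows
    }
}
// Stream the body to the browser in chunks
while (!feof($fp)) {
    echo fread($fp, 8192);
}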
Instead of downloading the whole file before outputting it, consider streaming it out like this:
$in = fopen($url, 'rb', false, $context);
$out = fopen('php://output', 'wb');
header('Content-Type: video/mpeg');
stream_copy_to_stream($in, $out);
If you're daring, you could even try (but that's definitely experimental):
header('Content-Type: video/mpeg');
copy($url, 'php://output');
Another option is using internal redirects and making your web server proxy the request for you. That would free up PHP to do something else. See also my post regarding X-Sendfile and friends.
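For instance, with Apache's mod_xsendfile the PHP script only emits headers and the web server delivers the file itself (a sketch; the path is a placeholder, and on nginx the equivalent is X-Accel-Redirect with an internal location):

// Sketch: let the web server deliver a local file (requires mod_xsendfile)
header('Content-Type: audio/mpeg');
header('X-Sendfile: /var/files/sound.mp3');
exit;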
As explained by @MrCode, first downloading the file to your server and then passing it on to the client will of course incur a doubled transfer time. If you want to pass the file on to the client directly, use readfile().
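A minimal sketch of that, assuming allow_url_fopen is enabled so readfile() accepts a URL:

// Sketch: stream the remote file straight through to the client
header('Content-Type: audio/mpeg');
readfile($url);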
Alternatively, consider whether you can simply redirect the client to the file URL using header("Location: $url"), so the client gets the file directly from the source.
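In code that is simply (a sketch):

header("Location: $url");
exit; // stop the script so nothing else is appended to the response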

How to get a .jpg image from external website and store it (storing it separate from cURL)

I am currently attempting to make a function in my class which gets data from an external server.
I am able to get the data with CURL, but I do not want to use CURL to directly store it in a file.
This is semi difficult to explain so I will show you.
This is my function for getting the image:
function getCharacterPortrait($CharID, $size) {
    $url = "http://image.eveonline.com/character/{$CharID}_{$size}.jpg";
    $ch = curl_init($url);
    curl_setopt_array($ch, array(
        CURLOPT_RETURNTRANSFER => true,
        CURLOPT_HEADER => false,
        CURLOPT_FOLLOWLOCATION => true,
        CURLOPT_ENCODING => "",
        CURLOPT_AUTOREFERER => true,
    ));
    $data = curl_exec($ch);
    curl_close($ch);
    return $data;
}
So, what I want to do from here is take $data, which is the raw image I presume, and store it in a .jpg file at a specified location.
I have found a similar explanation for this using cURL, but there cURL was used to store it directly. I would like the script that calls the function to store the image.
Sorry if I am being a bit confusing, but I think you get the premise of what I am saying. If more explanation is needed please do say so.
How about this?
$a = implode('', @file('http://url/file.jpg'));
$h = fopen('/path/to/disk/file.jpg', 'wb');
fwrite($h, $a);
fclose($h);
Just write all the data cURL gave you to a file with file_put_contents(), for example:
[your curl code...]
file_put_contents('localcopy.jpg', $data);
Edit:
Apparently there is also a cURL option to write directly to a file:
$fp = fopen($path, 'w');
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch); // the response body is written to $fp
fclose($fp);
Found at http://www.phpriot.com/articles/download-with-curl-and-php
Use this:
$url = "http://image.eveonline.com/character/{$CharID}_{$size}.jpg";
$local_path = "myfolder/{$CharID}_{$size}.jpg";

$file = file_get_contents($url);
file_put_contents($local_path, $file); // destination path first, then the data

PHP server to server file request

I have script-1 on server A, where a user asks for a file.
I have script-2 on server B (the file repository) where I check that the user can access it and return the correct file (I'm using Smart File Download http://www.zubrag.com/scripts/download.php).
I've tried cURL and file_get_contents, and I've changed the Content headers in various ways, but I still wasn't able to download the file.
This is my request:
$request = "http://mysite.com/download.php?f=test.pdf";
and it works fine.
What should I call in script-1 to force the file to be downloaded?
Some of my tries
This works, but I don't know how to handle unauthorized or broken downloads:
header('Content-type: application/pdf');
$handle = fopen($request, "r");
if ($handle) {
    while (!feof($handle)) {
        $buffer = fgets($handle, 4096);
        echo $buffer;
    }
    fclose($handle);
}
This prints the PDF code (not the text) straight into the browser (I think it's a header problem):
$c = curl_init();
curl_setopt($c, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($c, CURLOPT_URL, $request);
$contents = curl_exec($c);
curl_close($c);
if ($contents) {
    return $contents;
} else {
    return FALSE;
}
This generates a white page:
file_get_contents($request);
To force a download, add:
header('Content-Disposition: attachment');
But note that it's no longer in the HTTP 1.1 spec; see the first answer of Uses of content-disposition in an HTTP response header.
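For example, paired with a content type (a sketch; the filename is a placeholder):

// Sketch: these headers prompt the browser to download rather than display
header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="test.pdf"');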
Without your code I don't know what you've tried, but you need to get the contents of the file via cURL and then save it to your server. Something like...
$url = 'http://website.com/file.pdf';
$path = '/tmp/file.pdf';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$contents = curl_exec($ch);
curl_close($ch);
file_put_contents($path, $contents);
If you want to download files from an FTP server, you can use PHP's File Transfer Protocol (FTP) extension. Please find example code below:
<?php
$SERVER_ADDRESS = "";
$SERVER_USERNAME = "";
$SERVER_PASSWORD = "";

$conn_id = ftp_connect($SERVER_ADDRESS);

// login with username and password
$login_result = ftp_login($conn_id, $SERVER_USERNAME, $SERVER_PASSWORD);

$server_file = "test.pdf"; // FTP server file path
$local_file = "new.pdf";   // local file path

// download $server_file and save it to $local_file
if (ftp_get($conn_id, $local_file, $server_file, FTP_BINARY)) {
    echo "Successfully written to $local_file\n";
} else {
    echo "There was a problem\n";
}

ftp_close($conn_id);
?>
Download the file with curl, then check this: http://php.net/function.readfile
It shows how to force download.
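A sketch of that combination (the temp path and filename are placeholders):

// Sketch: fetch with cURL to a temporary file, then force-download it
$tmp = '/tmp/download.pdf';
$fp = fopen($tmp, 'w');
$ch = curl_init($request);
curl_setopt($ch, CURLOPT_FILE, $fp);
curl_exec($ch);
curl_close($ch);
fclose($fp);

header('Content-Type: application/pdf');
header('Content-Disposition: attachment; filename="test.pdf"');
readfile($tmp);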
SOLVED
I ended up simply redirecting the request with:
header("Location: $request");
