I've finally gotten to testing external file storage systems in my project and I'm encountering a strange error when I try to analyze some of these files.
What I'm trying to achieve: grab a list of all the files in a certain S3 directory (done) and analyze their ID3 tags using a PHP package:
https://packagist.org/packages/james-heinrich/getid3
$files = Storage::disk('s3')->files('going/down/to/the/bargin/basement/because/the/bargin/basement/is/cool'); //Get Files
$file = Storage::disk('s3')->url($files[0]); // First things first... let's grab the first one.
$getid3 = new getID3; // NEW OBJECT!
return $getid3->analyze($file); // analyze the file!
However, when I throw that into Tinker it squawks back at me with:
"GETID3_VERSION" => "1.9.14-201703261440",
"error" => [
"Could not open "https://a.us-east-2.amazonaws.com/library/pending/admin/01%20-%20Cathedrals.mp3" (!is_readable; !is_file; !file_exists)",
],
Which seems to indicate that the file is not readable? This is my first time using AWS S3, so there may be something I haven't configured correctly.
getID3 doesn't support remote files.
You will need to pull your file from S3 to local storage, then pass the local path to getID3's analyze() method.
// $files[0] is the path to the file in the bucket.
$firstFilePath = $files[0];

// Copy the file from S3 to the local disk.
Storage::disk('local')->put(
    $firstFilePath,
    Storage::disk('s3')->get($firstFilePath)
);

// Analyze using the absolute local path.
$getid3->analyze(Storage::disk('local')->path($firstFilePath));
The problem is that you are passing a URL to the analyze method. This is mentioned in the documentation:
To analyze remote files over HTTP or FTP you need to copy the file locally first before running getID3()
Ideally, you will save the file from your URL locally and then pass to getID3->analyze()
// save your file from URL ($file)
// I assume $filePath is the local path to the file
$getID3 = new getID3;
return $getID3->analyze($filePath); // $filePath should be local file path and not a remote URL
To save an s3 file locally
$contents = Storage::disk('s3')->get('file.jpg');
$tmpfname = tempnam("/tmp", "FOO");
file_put_contents($tmpfname, $contents);
$getID3 = new getID3;
// now use $tmpfname for getID3
$getID3->analyze($tmpfname);
// delete the temporary file when done
unlink($tmpfname);
Related
I need to download a file with CodeIgniter. I tried the force_download() and download() functions.
The download() function works, but doesn't let me select the folder for the download. With force_download() the browser downloads an empty file.
$this->load->helper('download');
$path = file_get_contents(base_url()."modulos/".$filename); // read the file contents
$name = "sample_file.pdf"; // new name for your file
//
force_download($name, $path); // start download
// or
$this->ftp->download($path, '/local/path/to/'.$name);
$this->ftp->download downloads a file from an FTP server to a web server.
force_download downloads a file from the web server to the client.
To download a file from the FTP server all the way to the client, you have to chain both functions:
$temp_path = tempnam(sys_get_temp_dir(), $name);
$this->ftp->download($path, $temp_path);
$data = file_get_contents($temp_path);
force_download($name, $data);
unlink($temp_path);
I want to open a ZIP file by passing a remote URL (http://www.example.com/file.zip or http://localhost/wordpress/wp-content/uploads/file.zip) instead of a file location (C:\wamp\www\wordpress\wp-content\uploads\file.zip)
This constructor works fine for a file location but not for a remote url of a file. How does one open a file using a remote URL for this scenario?
public function __construct($file = false)
{
if ($file && is_file($file)) {
//$file="C:\wamp\www\wordpress\wp-content\uploads\file.zip" here
$this->open($file);
$this->fileName = basename($this->filePath = $file);
} else {
throw new \Exception($file . " not a regular file");
}
}
The safest way is to
download the file
This is super easy if allow_url_fopen is enabled: file_get_contents() accepts remote URLs. If that's not enabled, use cURL or a Wordpress HTTP helper to download it.
save it locally
Also super easy, with file_put_contents(). The /tmp folder is probably writable for you. On Windows, I don't know where the tmp folder lives.
open it like any other
As you would a local ZIP archive, with ZipArchive::open() or your nameless class
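A minimal sketch of those three steps, assuming allow_url_fopen is enabled and the ZipArchive extension is available (the URL is a placeholder):

```php
<?php
// 1. Download the remote archive (requires allow_url_fopen).
$zipData = file_get_contents('http://www.example.com/file.zip');

// 2. Save it to a local temporary file.
$tmpPath = tempnam(sys_get_temp_dir(), 'zip');
file_put_contents($tmpPath, $zipData);

// 3. Open it like any other local archive.
$zip = new ZipArchive();
if ($zip->open($tmpPath) === true) {
    echo $zip->numFiles . " entries in the archive\n";
    $zip->close();
}

unlink($tmpPath); // clean up the temp file
```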
Just use PHP's fopen() function:
http://php.net/manual/en/function.fopen.php
$handle = fopen("http://www.example.com/", "r");
I have used this to get the contents of a web page, but not a zip file, so I'm not sure it will work for binary data; it definitely worked for text.
// For PHP 5 and up
$handle = fopen("https://www.thesiteyouwant.com/the_target_file.ext", "r");
$contents = stream_get_contents($handle);
http://php.net/manual/en/function.stream-get-contents.php
I need to serve to my web portal enduser a file stored on a FTP server.
I create a local temp file, download the FTP file into it, and when it finishes I send it with Symfony's file HTTP response:
$tempPath = tempnam(sys_get_temp_dir(), 'ftp');
$tempFile = fopen($tempPath, 'a+');
$log = "myLog";
$pass = "myPwd";
$conn_id = ftp_connect('my.ftp.server');
$ftp_login = ftp_login($conn_id, $log, $pass);
ftp_fget($conn_id, $tempFile, $path, FTP_BINARY, 0); // binary mode, so the zip is not corrupted
ftp_close($conn_id);
fclose($tempFile);
return $app->sendFile($tempPath, 200, array('Content-Type' => 'application/force-download', 'Content-disposition' => "attachment; filename=myFile.zip"));
It works, but I would like to do better (ensure temp file deletion, improve performance, ...).
I see that Symfony provides an HTTP streamed response helper, so I imagine sending the FTP file without storing it on the web server's hard drive.
To get there, I think I need to connect the ftp_fget() function (or another FTP function) to a PHP function that can write to php://output (standard output; I don't really know this area of PHP).
I found the readfile() function, but it takes a filename as an argument, not a resource like ftp_fget() needs... Is there another function compatible with ftp_fget(), or another way to do it?
Your problem consists of two parts:
Reading FTP file as a stream (see an example with fread(): "PHP: How do I read a .txt file from FTP server into a variable?")
Streaming a Response in Symfony2
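Putting the two together, here is a sketch (server, credentials, and remote path are placeholders taken from the question) that writes the FTP file straight to php://output inside Symfony's StreamedResponse, so no temp file is needed:

```php
use Symfony\Component\HttpFoundation\StreamedResponse;

$response = new StreamedResponse(function () {
    $conn_id = ftp_connect('my.ftp.server');
    ftp_login($conn_id, 'myLog', 'myPwd');

    // ftp_fget() writes to any open file handle, including php://output.
    $out = fopen('php://output', 'w');
    ftp_fget($conn_id, $out, '/remote/path/myFile.zip', FTP_BINARY, 0);

    fclose($out);
    ftp_close($conn_id);
});

$response->headers->set('Content-Type', 'application/force-download');
$response->headers->set('Content-Disposition', 'attachment; filename=myFile.zip');

return $response;
```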
There is a free FTP space at xxx.yyy.zzz; I uploaded the following code (named getweb.php) to it with FileZilla.
<?php
$word=file_get_contents('http://www.webkaka.com/');
$filename='c:\\file.txt';
$fh=fopen($filename,"w");
echo fwrite($fh,$word);
fclose($fh);
?>
When I request xxx.yyy.zzz/getweb.php, a file named c:\file.txt appears in my FTP space, and I have to download it with FileZilla.
I can do this with Python; how can I capture web pages directly onto my local disk with PHP code?
You must install PHP on your local computer if you want PHP code to be able to create files locally. You could try a ready-to-run XAMPP package if you don't want to install native applications.
If you're using PHP 5 or later, you can use file_put_contents():
file_put_contents('/path/to/file.txt', $word);
Code to download a file using PHP's built-in FTP streams
// the file you're trying to get
$file = "ftp://ftp_user:ftp_pass@domain.com/file.txt";
// get the file
$contents = file_get_contents($file);
Code to download a file using PHP's built-in HTTP streams
// the file you're trying to get
$file ="http://domain.com/file.txt";
// get the file
$contents = file_get_contents($file);
Code to UPLOAD (put) file on remote FTP
// get the file from LOCAL HARD DRIVE
$contents = file_get_contents('C:/local/path/to/file.txt');
// the file you're trying to UPLOAD
$file = "ftp://ftp_user:ftp_pass@domain.com/file.ext";
// write USING FTP
$opts = array('ftp' => array('overwrite' => true));
$context = stream_context_create($opts);
file_put_contents($file, $contents, NULL, $context);
OR COMBINE A FEW OF ABOVE: Download using FTP, upload using FTP
// the file you're trying to get
$file = "ftp://ftp_user:ftp_pass@domain-ONE.com/file.txt";
// get the file USING FTP
$contents = file_get_contents($file);
// the file you're trying to UPLOAD
$file = "ftp://ftp_user:ftp_pass@domain-TWO.com/file.ext";
// write USING FTP
$opts = array('ftp' => array('overwrite' => true));
$context = stream_context_create($opts);
file_put_contents($file, $contents, NULL, $context);
Download using HTTP, upload using FTP
// the file you're trying to get
$file ="http://domain-ONE.com/file.txt";
// get the file USING HTTP
$contents = file_get_contents($file);
// the file you're trying to UPLOAD
$file = "ftp://ftp_user:ftp_pass@domain-TWO.com/file.ext";
// write USING FTP
$opts = array('ftp' => array('overwrite' => true));
$context = stream_context_create($opts);
file_put_contents($file, $contents, NULL, $context);
This code can be run remotely on a web server, locally on a web server, locally using php-cli, or remotely using php-cli over SSH.
Let me know your specific requirements and I'll adjust my answer.
I would like to create a zip file in memory using a ZipArchive (or a native PHP class) and read the content of the file back to the client. Is this possible? If so, how?
The files that I want to zip in this application are a maximum of 15 MB total. I think we should be in good shape memory-wise.
Try ZipStream (link to GitHub repo, also supports install via Composer).
From original author's website (now dead):
ZipStream is a library for dynamically streaming dynamic zip files
from PHP without writing to the disk at all on the server.
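A minimal usage sketch, assuming the ZipStream-PHP 1.x/2.x API (file names and paths are placeholders):

```php
<?php
require 'vendor/autoload.php';

// Passing a name to the constructor makes ZipStream send the zip
// to the browser as a download, without touching the server's disk.
$zip = new ZipStream\ZipStream('example.zip');

// Add one entry from a string and one from a local file.
$zip->addFile('hello.txt', 'Hello, world!');
$zip->addFileFromPath('photo.jpg', '/path/to/photo.jpg');

$zip->finish(); // write the central directory and end the stream
```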
Thanks to Frosty Z for the great library ZipStream-PHP. We have a use case of uploading many large zip files to S3. The official documentation does not mention how to upload to S3.
So we had the idea to stream ZipStream's output directly to S3, without creating the zip file on the server.
Here is a working sample code that we came up with:
<?php
# Autoload the dependencies
require 'vendor/autoload.php';
use Aws\S3\S3Client;
use ZipStream\Option\Archive as ArchiveOptions;
//s3client service
$s3Client = new S3Client([
'region' => 'ap-southeast-2',
'version' => 'latest',
'credentials' => [
'key' => '<AWS_KEY>',
'secret' => '<AWS_SECRET_KEY>',
]
]);
$s3Client->registerStreamWrapper(); //required
$opt = new ArchiveOptions();
$opt->setContentType('application/octet-stream');
$opt->setEnableZip64(false); // optional - lets the default macOS Archive Utility open the archive

$bucket = 'your_bucket_path';
$zipName = 'target.zip';

$s3Stream = fopen("s3://{$bucket}/{$zipName}", 'w'); // open a writable S3 stream
$opt->setOutputStream($s3Stream); // point ZipStream's output at the S3 stream

$zip = new ZipStream\ZipStream($zipName, $opt);
$filePath1 = './local_files/document1.zip';
$filePath2 = './local_files/document2.zip';
$filePath3 = './local_files/document3.zip';
$zip->addFileFromPath(basename($filePath1), $filePath1);
$zip->addFileFromPath(basename($filePath2), $filePath2);
$zip->addFileFromPath(basename($filePath3), $filePath3);
$zip->finish(); // sends the stream to S3
?>
Take a look at the following library on PHPClasses.org; it allows creating zip files and returning them as a stream.
There is another thread talking about this:
Manipulate an Archive in memory with PHP (without creating a temporary file on disk)
nettle suggested using zip.lib.php from phpMyAdmin. I think this is a rather solid solution.
FYI: zip.lib.php no longer exists; it has been replaced by ZipFile.php in the same libraries/ folder.