I need to display video / image files uploaded to my Amazon S3 account with ACL set to private on my WordPress blog.
I am a newbie to object-oriented PHP coding. Any script help, link references, free plugins, or even a logical algorithm would be a great help :)
Thanks in advance.
This issue can be solved by implementing the following steps:
Download the latest stable version of the SDK from here
Extract the .zip file & place it in the wamp/www folder
Rename the config-sample.inc.php file to config.inc.php
Add the access key & secret key (retrieved from your Amazon S3 account) into the above file, save & exit
Create a sample file to display public / private objects from Amazon S3
The content of the file should look as follows:
// Load the AWS SDK for PHP (v1.x) and its S3 service class
require('sdk.class.php');
require('services/s3.class.php');

// Credentials are picked up from config.inc.php
$s3 = new AmazonS3();
$bucket = "bucketname";

// Generate a signed URL for the private object, valid for 5 minutes
$temp_link = $s3->get_object_url($bucket, 'your/folder/path/img.jpg', '5 minutes');
echo $temp_link;
In the above code, the URL you receive as output is a signed URL for your private object, so it is valid for only 5 minutes.
You may set the expiry to a future date and thereby allow only authorized users to access your private content or media on Amazon S3.
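For example, a minimal sketch assuming the same SDK 1.x setup as above (in that version, the expiry argument accepts any string strtotime() understands, or a Unix timestamp):
// Hypothetical longer-lived link, valid for 24 hours
$day_link = $s3->get_object_url($bucket, 'your/folder/path/img.jpg', '24 hours');

// Or pass an explicit Unix timestamp as the expiry
$timed_link = $s3->get_object_url($bucket, 'your/folder/path/img.jpg', strtotime('+1 day'));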
This question is a little old, but I'm posting anyway. I had a similar issue today and found out there's a simple answer.
The AWS documentation explains it clearly and has an example as well:
http://docs.aws.amazon.com/aws-sdk-php-2/guide/latest/service-s3.html#amazon-s3-stream-wrapper
Basically, you need to register AWS' stream wrapper and use the s3:// protocol.
Here's my code sample.
use Aws\Common\Aws;
use Aws\S3\Enum\CannedAcl;
use Aws\S3\Exception\S3Exception;

// Build an S3 client from the service builder
$s3 = Aws::factory(array(
    'key'    => Config::get('aws.key'),
    'secret' => Config::get('aws.secret'),
    'region' => Config::get('aws.region')
))->get('s3');

// Register the s3:// stream wrapper
$s3->registerStreamWrapper();

// Now read the file from S3 (from the doc):
// open a stream in read-only mode
if ($stream = fopen('s3://bucket/key', 'r')) {
    // While the stream is still open
    while (!feof($stream)) {
        // Read 1024 bytes from the stream
        echo fread($stream, 1024);
    }
    // Be sure to close the stream resource when you're done with it
    fclose($stream);
}
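As a follow-up note: once the wrapper is registered, most of PHP's ordinary filesystem functions accept s3:// paths as well. A quick sketch (bucket and key names are placeholders):
// Slurp a whole object into memory (fine for small files)
$contents = file_get_contents('s3://bucket/key');

// Copy an object down to the local filesystem
copy('s3://bucket/key', '/tmp/local-copy.jpg');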
I couldn't find a PHP tutorial for storing a file in the free bucket of a Google App Engine project and retrieving its contents. The idea of this post is to do it step by step, assuming the GAE project was correctly created.
1) If you created a GAE project, it has been granted a free 5 GB Google Cloud Storage bucket. Its name is "YOUR_PROJECT_ID.appspot.com"
https://console.cloud.google.com/storage/browser
2) A service account must be created and assigned in order to use the SDK.
Steps Here
3) This is the basic PHP code to store a file with "Hello World" content. This code can be executed from a terminal window.
<?php
$filename = "tutorial.txt"; // filename in the bucket
$txt_toSave = "Hello World"; // text content of the file
// let's add code here
?>
php GCStorage_save_example.php
4) This is the basic PHP code to retrieve a file's content from the bucket.
<?php
// let's add code here
echo $txt_fileContent;
?>
php GCStorage_retrieve_example.php
a) If permissions must be granted, feel free to add the step.
b) If any other step must be done, feel free to add it.
You can find PHP code samples for Google Cloud Storage operations in the official Google Cloud Storage documentation [1].
For instance, an example to store a file in Google Cloud Storage, according to the documentation [2], would be:
use Google\Cloud\Storage\StorageClient;

/**
 * Upload a file.
 *
 * @param string $bucketName the name of your Google Cloud bucket.
 * @param string $objectName the name of the object.
 * @param string $source the path to the file to upload.
 *
 * @return Psr\Http\Message\StreamInterface
 */
function upload_object($bucketName, $objectName, $source)
{
    $storage = new StorageClient();
    $file = fopen($source, 'r');
    $bucket = $storage->bucket($bucketName);
    $object = $bucket->upload($file, [
        'name' => $objectName
    ]);
    printf('Uploaded %s to gs://%s/%s' . PHP_EOL, basename($source), $bucketName, $objectName);
}
You can follow the how-to [3] for retrieving an object from the bucket as well. Note that this is no different from storing/downloading objects in other buckets, as you can specify which bucket to pull from or push to.
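For completeness, a minimal download sketch in the same style (the helper name and its parameters are illustrative, not taken from the docs):
use Google\Cloud\Storage\StorageClient;

// Hypothetical helper: fetch $objectName from $bucketName into a local file
function download_object($bucketName, $objectName, $destination)
{
    $storage = new StorageClient();
    $bucket = $storage->bucket($bucketName);
    $bucket->object($objectName)->downloadToFile($destination);
    printf('Downloaded gs://%s/%s to %s' . PHP_EOL, $bucketName, $objectName, $destination);
}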
[1] https://cloud.google.com/storage/docs/how-to
[2] https://cloud.google.com/storage/docs/uploading-objects
[3] https://cloud.google.com/storage/docs/downloading-objects
I have a Symfony 2.8 application, and when the user clicks the "download" button, I use the keys of several large (image, video) files on S3 to stream them to the browser as a zip file using ZipStream (https://github.com/maennchen/ZipStream-PHP).
The streaming of files and the download as a zip (stored, not compressed) succeed in the browser, and the zip lands in Downloads. However, when attempting to open the zip on Mac OS X El Capitan using Archive Utility (the native archive software on OS X), it errors out. The error:
Unable to expand "test.zip" into "Downloads". (Error 2 - No such file or directory.)
I have seen older, identical issues on SO and attempted those fixes, specifically this post: https://stackoverflow.com/a/5574305/136151, and followed up on the related issues & PRs in ZipStream and the upstream fixes in Guzzle etc.
The problem is, the above fixes date from 2011 and things have moved on since then, so applying the same fixes does not give me a working result.
Specific fixes I have tried:
1. Setting "version needed to extract" to 0x000A as suggested, and also to '20' as recommended in another post. I've set the same for "version made by".
2. Forcing the compression method to 'deflate' instead of 'stored' to see if I got a working result. A stored result is all I need and is suitable for a zip used as a container file for images & video.
I am able to extract the zip using a third-party archive app called The Unarchiver. However, users won't know about, and can't be expected to install, an alternative archive app to suit my web app. That's not an effective solution.
Does anyone have knowledge or experience of solving this issue and can help me out with how to resolve it?
NB: A zip streamed to the browser is the required solution. Downloading assets from S3 to the server to create a zip and then streaming the resulting zip to the browser is not a solution, given the time and overhead of such an approach.
Added info if required:
Code & setup:
- Files are stored on S3.
- The web app is Symfony 2.8 on PHP 7.0, running on EC2 with Ubuntu.
- Using aws-sdk-php, I create an S3Client with valid credentials and register the stream wrapper on it ($s3Client->registerStreamWrapper()). This is to fetch files from S3 via fopen() and stream them to the ZipStream library:
$this->s3Client = $s3Client;
$this->s3Client->registerStreamWrapper();

// Initialize the ZipStream object and pass in the file name which
// will be what is sent in the content-disposition header.
// This is the name of the file which will be sent to the client.

// Define suitable options for the ZipStream archive.
$opt = array(
    'comment' => 'test zip file.',
    'content_type' => 'application/octet-stream'
);
$zip = new ZipStream\ZipStream($filename, $opt);

$keys = array(
    "zipsimpletestfolder/file1.txt"
);

foreach ($keys as $key) {
    // Get the file name in the S3 key so we can save it to the zip file
    // using the same name.
    $fileName = basename($key);

    $bucket = 'mg-test';
    $s3path = "s3://" . $bucket . "/" . $key;

    if ($streamRead = fopen($s3path, 'r')) {
        $zip->addFileFromStream($fileName, $streamRead);
    } else {
        die('Could not open stream for reading');
    }
}

$zip->finish();
Zip output results:
- Extraction on Mac via Archive Utility fails with error 2.
- Extraction on Mac with The Unarchiver works.
- Extraction on Windows with 7-Zip works.
- Extraction on Windows with WinRAR fails - says the zip is corrupted.
Response Headers:
Edit
I'm open to using another method of streaming files to the browser as a zip that can be opened natively on Mac, Windows, and Linux, without using ZipStream if suggested. Just not to creating a zip file server-side and then streaming the result thereafter.
The issue with the downloaded zip was that it contained HTML from the response of the Symfony controller that was calling ZipStream->addFileFromStream(). Basically, while ZipStream was streaming data to create the zip download in the client browser, a controller action response was also sent, and, best guess, the two were getting mixed up in the client's browser. Opening the zip file in a hex editor and seeing the HTML in there made the issue obvious.
To get this working in Symfony 2.8 with ZipStream, I just used Symfony's StreamedResponse in the controller action and ran ZipStream inside the StreamedResponse's callback. To stream S3 files, I just passed an array of S3 keys and the S3 client into that function. So:
use Symfony\Component\HttpFoundation\StreamedResponse;
use Aws\S3\S3Client;
use ZipStream;

//...

/**
 * @Route("/zipstream", name="zipstream")
 */
public function zipStreamAction()
{
    // Test file on S3.
    $s3keys = array(
        "ziptestfolder/file1.txt"
    );

    $s3Client = $this->get('app.amazon.s3'); // S3Client service
    $s3Client->registerStreamWrapper(); // required

    $response = new StreamedResponse(function() use ($s3keys, $s3Client) {
        // Define suitable options for the ZipStream archive.
        $opt = array(
            'comment' => 'test zip file.',
            'content_type' => 'application/octet-stream'
        );

        // Initialise ZipStream with the output zip filename and options.
        $zip = new ZipStream\ZipStream('test.zip', $opt);

        // Loop over the keys - useful for multiple files.
        foreach ($s3keys as $key) {
            // Get the file name in the S3 key so we can save it to the
            // zip file using the same name.
            $fileName = basename($key);

            // Concatenate the s3:// path.
            $bucket = 'bucketname';
            $s3path = "s3://" . $bucket . "/" . $key;

            if ($streamRead = fopen($s3path, 'r')) {
                $zip->addFileFromStream($fileName, $streamRead);
            } else {
                die('Could not open stream for reading');
            }
        }

        $zip->finish();
    });

    return $response;
}
Solved. Maybe this helps someone else using ZipStream in Symfony.
I have a Django app that contains a video-on-demand feature powered by Azure Media Services (AMS). When a user uploads a video, I first save the video in an Azure storage blob, and then I use a PHP script (which utilizes the AMS PHP SDK) to encode the video and prepare a streaming URL (hosted on AMS).
My problem is this: how do I get the dimensions of the video? I need to know the height and width so that I can encode the video to lower-res formats on AMS. I can't get the dimensions from Python since I'm not uploading the video file to a local server first (where my web server is running). What are my options? Please advise.
Since you are using the AMS SDK for PHP to create the AMS task, which requires the video asset file, you can leverage the getID3 PHP module (http://getid3.sourceforge.net/) to get the info of the video asset during PHP processing with ease.
Download the module from http://getid3.sourceforge.net/, extract it into your PHP application's folder, and use the following code snippet to get the dimensions of the video asset:
// Load the getID3 library
require_once('./getid3/getid3.php');

$filename = "<video_path>";

// Analyze the file
$getID3 = new getID3;
$ThisFileInfo = $getID3->analyze($filename);

// For ASF/WMV files, the video properties live under this key
var_dump($ThisFileInfo['asf']['video_media']);
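As a side note (an assumption based on getID3's generic output, not part of the original answer): for most container formats getID3 also normalizes the dimensions under the top-level 'video' key, which avoids format-specific paths like 'asf':
// Hypothetical generic lookup; these keys may be absent for some formats
if (isset($ThisFileInfo['video']['resolution_x'])) {
    $width  = $ThisFileInfo['video']['resolution_x'];
    $height = $ThisFileInfo['video']['resolution_y'];
    echo "Dimensions: {$width}x{$height}" . PHP_EOL;
}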
If you have any further concerns, please feel free to let me know.
Update: using a remote file on Azure Storage
Here is a code sample with which you can use the SAS URL of blobs on Azure Storage. It downloads the file to a server folder, detects the info, and then deletes the temporary file.
$remotefilename = '<SAS Url>';
if ($fp_remote = fopen($remotefilename, 'rb')) {
    $localtempfilename = tempnam('/tmp', 'getID3');
    if ($fp_local = fopen($localtempfilename, 'wb')) {
        while ($buffer = fread($fp_remote, 8192)) {
            fwrite($fp_local, $buffer);
        }
        fclose($fp_local);

        // Initialize getID3 engine
        $getID3 = new getID3;
        $ThisFileInfo = $getID3->analyze($localtempfilename);

        // Delete temporary file
        unlink($localtempfilename);

        var_dump($ThisFileInfo);
    }
    fclose($fp_remote);
}
I am in the process of creating a content management system for a start-up company. I have a Post.php model in my project; the following code snippet is taken from the Create method:
if (Request::file('display_image') != null) {
    Storage::disk('s3')->put('/app/images/blog/'.$post->slug.'.jpg', file_get_contents(Request::file('display_image')));

    $bucket = Config::get('filesystems.disks.s3.bucket');
    $s3 = Storage::disk('s3');

    $command = $s3->getDriver()->getAdapter()->getClient()->getCommand('GetObject', [
        'Bucket' => Config::get('filesystems.disks.s3.bucket'),
        'Key' => '/app/images/blog/'.$post->slug.'.jpg',
        'ResponseContentDisposition' => 'attachment;'
    ]);

    $request = $s3->getDriver()->getAdapter()->getClient()->createPresignedRequest($command, '+5 minutes');

    $image_url = (string) $request->getUri();
    $post->display_image = $image_url;
The above code checks whether there is a "display_image" file input in the request object.
If it finds a file, it uploads it directly to AWS S3 storage. I want to save the link to the file in the database, so I can use it later in my views.
Hence I use this piece of code:
$request = $s3->getDriver()->getAdapter()->getClient()->createPresignedRequest($command, '+5 minutes');
$image_url = (string) $request->getUri();
$post->display_image = $image_url;
I get a URL, but the only problem is that whenever I visit the $post->display_image URL I get a 403 Permission Denied. Obviously no authentication takes place when using the URL of the image.
How do I solve this? I need to be able to link all my images/files from Amazon S3 to the front-end interface of the website.
Note that the presigned URL you stored was created with '+5 minutes', so it expires five minutes after upload; that is why later visits return a 403.
You could open up those S3 URLs to public viewing, but you probably wouldn't want to: you have to pay for the outgoing bandwidth every time someone views one of those images.
You might want to check out Glide, a pretty simple-to-use image library that supports S3. Make sure to reduce the load on your server and your wallet by setting caching headers on the images you serve.
Alternatively, you could use a CloudFront distribution as a caching proxy in front of your S3 bucket.
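A minimal sketch of that last idea, assuming a CloudFront distribution whose origin is the S3 bucket (the domain below is a placeholder) and that you store the object key in the database rather than a presigned URL:
// Store the stable S3 key instead of a short-lived presigned URL
$post->display_image = 'app/images/blog/'.$post->slug.'.jpg';

// At render time, build a URL through the CloudFront caching proxy
// ('dxxxxxxxxxxxx.cloudfront.net' is a placeholder distribution domain)
$image_url = 'https://dxxxxxxxxxxxx.cloudfront.net/'.$post->display_image;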
I want to upload a big video file to my AWS S3 bucket. After a good many hours, I finally managed to configure my php.ini and nginx.conf files so they allowed bigger files.
But then I got a "Fatal Error: Allowed Memory Size of XXXXXXXXXX Bytes Exhausted". After some time I found out that larger files should be uploaded with streams, using fopen(), fwrite(), and fclose().
Since I'm using Laravel 5, the filesystem takes care of much of this. Except that I can't get it to work.
My current ResourceController#store looks like this:
public function store(ResourceRequest $request)
{
    /* Prepare data */
    $resource = new Resource();
    $key = 'resource-'.$resource->id;
    $bucket = env('AWS_BUCKET');
    $filePath = $request->file('resource')->getRealPath();

    /* Open & write stream */
    $stream = fopen($filePath, 'w');
    Storage::writeStream($key, $stream, ['public']);

    /* Store entry in DB */
    $resource->title = $request->title;
    $resource->save();

    /* Success message */
    session()->flash('message', $request->title . ' uploaded!');
    return redirect()->route('resource-index');
}
But now I get this long error:
CouldNotCreateChecksumException in SignatureV4.php line 148:
A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Stream\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.
So I am currently completely lost. I can't figure out if I'm even on the right track. Here are the resources I'm trying to make sense of:
AWS SDK guide for PHP: Stream Wrappers
AWS SDK introduction on stream wrappers
Flysystem original API on stream wrappers
And just to confuse me even more, there seems to be another way to upload large files other than streams: the so-called "multipart" upload. I actually thought that was what the streams were all about...
What is the difference?
I had the same problem and came up with this solution.
Instead of using
Storage::put('file.jpg', $contents);
which of course ran into an "out of memory" error, I used this method:
use Aws\S3\MultipartUploader;
use Aws\Exception\MultipartUploadException;

// ...

public function uploadToS3($fromPath, $toPath)
{
    $disk = Storage::disk('s3');

    $uploader = new MultipartUploader($disk->getDriver()->getAdapter()->getClient(), $fromPath, [
        'bucket' => Config::get('filesystems.disks.s3.bucket'),
        'key'    => $toPath,
    ]);

    try {
        $result = $uploader->upload();
        echo "Upload complete";
    } catch (MultipartUploadException $e) {
        echo $e->getMessage();
    }
}
Tested with Laravel 5.1
Here are the official AWS PHP SDK docs:
http://docs.aws.amazon.com/aws-sdk-php/v3/guide/service/s3-multipart-upload.html
The streaming part applies to downloads.
For uploads you need to know the content size, so for large files multipart uploads are the way to go.
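For reference, a minimal sketch of the SDK's higher-level helper, which picks between a single PutObject and a multipart upload based on the payload size (this assumes an existing aws-sdk-php v3 S3Client; the bucket, key, and file path are placeholders):
use Aws\S3\ObjectUploader;

// $s3Client is an existing Aws\S3\S3Client instance
$source = fopen('/path/to/large-video.mp4', 'rb');

$uploader = new ObjectUploader($s3Client, 'my-bucket', 'my-key', $source);
$result = $uploader->upload();

echo 'Uploaded to ' . $result['@metadata']['effectiveUri'];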