PHP API to upload and download files to Amazon S3

I have a website hosted on Amazon. I want to give my clients access to upload files that are already in their Amazon S3 space to my S3 space. Is there a PHP API that supports this functionality?

Amazon actually provides one, and there are lots of examples of using it on the web. Google is your friend.

Amazon has a PHP SDK; check the sample code below.
// The sample code below demonstrates how the Resource APIs work
$aws = new Aws($config);

// Get references to resource objects
$bucket = $aws->s3->bucket('my-bucket');
$object = $bucket->object('image/bird.jpg');

// Access resource attributes
echo $object['LastModified'];

// Call resource methods to take action
$object->delete();
$bucket->delete();
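Since the question is really about copying objects that already sit in another S3 space into yours, it is worth noting that the SDK's plain S3Client can do a server-side copy without routing the file through your server. A minimal sketch (bucket names, keys, and region are placeholders, and your credentials need read access to the source object):
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Credentials are resolved from the environment / instance profile here
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// Server-side copy: no download/upload round trip
$s3->copyObject([
    'Bucket'     => 'my-destination-bucket',                // your bucket
    'Key'        => 'uploads/bird.jpg',                     // key in your bucket
    'CopySource' => 'client-source-bucket/image/bird.jpg',  // source bucket/key
]);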
Or use the older standalone S3.php class for uploading files to an S3 bucket. It is a single PHP file named S3.php; just download it and include it from your code. For more, read this.
<?php
if (!class_exists('S3')) require_once('S3.php');

// AWS access info
if (!defined('awsAccessKey')) define('awsAccessKey', 'Your S3 Access Key');
if (!defined('awsSecretKey')) define('awsSecretKey', 'Your Secret Key');

// Instantiate the class
$s3 = new S3(awsAccessKey, awsSecretKey);
$s3->putBucket("bucket name", S3::ACL_PRIVATE);

// Move the file
if ($s3->putObjectFile("your file name on the server, with path", "your bucket name", "file name on the S3 server", S3::ACL_PRIVATE)) {
    // S3 upload success
}
?>
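Downloading works with the same class. A minimal sketch, assuming the standard S3.php class's getObject() method and placeholder names:
<?php
if (!class_exists('S3')) require_once('S3.php');

$s3 = new S3(awsAccessKey, awsSecretKey);

// Fetch an object and save it to a local file
if ($s3->getObject("your bucket name", "file name on the S3 server", "/local/path/to/save/to")) {
    // S3 download success
}
?>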

Related

Move images from one server to S3 - PHP Laravel 5.3

I'm trying to write a script to move images from an old server to Amazon S3. Is it possible to do this by downloading each image from a URL?
$companies = Company::all();
$companies->each(function ($company) {
    // Some method to download the file (placeholder, not a real function)
    $file = download($company->logo);

    // Store on S3
    $filename = $file->storeAs('images', uniqid('img_') . "." . $file->guessClientExtension(), 's3');

    // Get the new path
    $new_path = Storage::disk('s3')->url($filename);

    // Save the new path to logo
    $company->logo = $new_path;
    $company->save();
});
You can use the PHP League's Flysystem library.
It has an integration for Laravel and other frameworks such as Zend, CakePHP, etc.
The library has adapters for Amazon S3 v2 and Amazon S3 v3.
Full documentation here.
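As a rough sketch of that approach, assuming Flysystem 1.x with the AwsS3v3 adapter package (league/flysystem-aws-s3-v3); the bucket name and source URL are placeholders:
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use League\Flysystem\AwsS3v3\AwsS3Adapter;
use League\Flysystem\Filesystem;

$client = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

$filesystem = new Filesystem(new AwsS3Adapter($client, 'my-bucket'));

// Download the old image over HTTP and write it straight to S3
$contents = file_get_contents('https://old-server.example.com/logos/logo.png');
$filesystem->put('images/' . uniqid('img_') . '.png', $contents);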

How to get dimensions of an uploaded Video in Azure Media Services (PHP SDK/Django project)

I have a Django app that contains a Video-on-Demand feature. It's powered by Azure Media Services (AMS). When a user uploads a video, I first save the video in an Azure storage blob, and then I use a PHP script (which utilizes the AMS PHP SDK) to encode the said video and prep a streaming URL (hosted on AMS).
My problem is this: how do I get the dimensions of the video? I need to know the height and width so that I can encode the video to lower res formats on AMS. I can't get the dimensions from python since I'm not uploading the video file onto a local server first (where my web server is running). What are my options? Please advise.
Since you are using the AMS SDK for PHP to create the AMS task, which requires the video asset file, you can leverage the getID3 PHP module (http://getid3.sourceforge.net/) to get the video asset's info during PHP processing with ease.
Download the module from http://getid3.sourceforge.net/, extract it into your PHP application's folder, and use the following code snippet to get the dimensions of the video asset:
require_once('./getid3/getid3.php');

$filename = "<video_path>";
$getID3 = new getID3;
$ThisFileInfo = $getID3->analyze($filename);

// Note: the 'asf' key is specific to ASF/WMV containers
var_dump($ThisFileInfo['asf']['video_media']);
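For most container formats, getID3 also exposes the dimensions under a generic 'video' key, which may be more convenient than the ASF-specific dump above. A sketch, assuming the analyzed file is a video getID3 can parse:
// Generic dimensions, populated for most video formats getID3 supports
if (isset($ThisFileInfo['video']['resolution_x'])) {
    $width  = $ThisFileInfo['video']['resolution_x'];
    $height = $ThisFileInfo['video']['resolution_y'];
    echo "Dimensions: {$width}x{$height}";
}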
If you have any further concerns, please feel free to let me know.
Update: using a remote file on Azure Storage
Here is a code sample with which you can use the SAS URL of blobs on Azure Storage. It downloads the file to a server folder, detects the info, and then deletes the temporary file.
$remotefilename = '<SAS Url>';
if ($fp_remote = fopen($remotefilename, 'rb')) {
    $localtempfilename = tempnam('/tmp', 'getID3');
    if ($fp_local = fopen($localtempfilename, 'wb')) {
        // Copy the remote blob to a local temporary file
        while ($buffer = fread($fp_remote, 8192)) {
            fwrite($fp_local, $buffer);
        }
        fclose($fp_local);

        // Initialize getID3 engine and analyze the local copy
        $getID3 = new getID3;
        $ThisFileInfo = $getID3->analyze($localtempfilename);
        var_dump($ThisFileInfo);

        // Delete the temporary file
        unlink($localtempfilename);
    }
    fclose($fp_remote);
}

Using PHP to upload to Amazon S3 without access & secret key

Using PHP, I want to upload some files to a public Amazon S3 bucket, but I don't want to put an access key and secret key in my code. When I searched, I found claims that using the keyword "null" would solve the case (in PHP we'd use '' as null), but I had no luck. Below is my code:
<?php
if (!class_exists('S3')) require_once('s3.php');

// AWS access info
if (!defined('awsAccessKey')) define('awsAccessKey', 'null');
if (!defined('awsSecretKey')) define('awsSecretKey', 'null');

// Instantiate the class
$s3 = new S3(awsAccessKey, awsSecretKey);
$s3->putBucket("bucket name", S3::ACL_PUBLIC_READ_WRITE);

// Move the file
if ($s3->putObjectFile("your file name on the server with path", "bucket name", "file name on the S3 server", S3::ACL_PUBLIC_READ_WRITE)) {
    // S3 upload success
}
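For what it's worth, the string 'null' above is just sent as a literal (and invalid) credential. The modern AWS SDK for PHP (v3) supports a genuinely anonymous client by setting 'credentials' to false, though the bucket's policy must still allow public writes. A sketch, with placeholder bucket and paths:
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Anonymous (unsigned) client: works only if the bucket policy permits it
$s3 = new S3Client([
    'version'     => 'latest',
    'region'      => 'us-east-1',
    'credentials' => false,
]);

$s3->putObject([
    'Bucket'     => 'some-public-bucket',
    'Key'        => 'uploads/file.txt',
    'SourceFile' => '/path/to/local/file.txt',
]);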

AWS upload to S3 with SQS - PHP syntax

Would uploading to S3 using SQS make the process more fault tolerant?
If so, I am having a hard time with the syntax, trying to combine creating a queue with uploading to S3. If my logic is not correct, how would I set up a system using SQS to upload to S3?
if (!class_exists('S3')) require_once('S3.php');

// *these keys are random strings
$AWS_KEY = "6VVWTU4JDAAKHYB1C3ZN";
$AWS_SECRET_KEY = "GMSCUD8C0QA1QLV9Y3RP2IAKDIZSCHRGKEJSXZ4F";

// AWS access info
if (!defined('awsAccessKey')) define('awsAccessKey', $AWS_KEY);
if (!defined('awsSecretKey')) define('awsSecretKey', $AWS_SECRET_KEY);

// Instantiate the class
$s3 = new S3(awsAccessKey, awsSecretKey);

// Check whether a form was submitted
if (isset($_POST['Submit'])) {
    // Retrieve post variables
    $fileName = $_FILES['theFile']['name'];
    $fileTempName = $_FILES['theFile']['tmp_name'];

    // Create a new bucket
    $s3->putBucket("mybucket", S3::ACL_PUBLIC_READ);

    // Add the queue
    $sqs = new AmazonSQS(array("key" => $AWS_KEY, "secret" => $AWS_SECRET_KEY));
    $response = $sqs->create_queue('test-topic-queue');
    $queue_url = (string) $response->body->CreateQueueResult->QueueUrl;
    $queue_arn = 'arn:aws:sqs:us-east-1:ENCQ8gqrAcXv:test-topic-queue';
    //$queue_url . ?Action=SendMessage&MessageBody=Your%20Message%20Text?&AWSAccessKeyId=AKIAIOSFODNN7EXAMPLE&Version=2011-10-01?&Expires=2008-02-10T12:00:00Z?&Signature=lBP67vCvGlDMBQ1do?fZxg8E8SUEXAMPLE&SignatureVersion=2&SignatureMethod=HmacSHA256

    // HOW DO I INCORPORATE SQS AND S3

    // Move the file
    if ($s3->putObjectFile($fileTempName,
            "mybucket",
            "myFolder/" . $fileName, S3::ACL_PUBLIC_READ,
            array(),
            $_FILES['theFile']['type'])) {
        // it works
    } else {
        // error
    }
}
I'm not exactly understanding the fault tolerance you are requesting. But in terms of using S3 and SQS for scaling, there is an excellent paper on the Amazon AWS website about scaling your infrastructure up and down using SQS and EC2 together, which can of course include processes like uploading to S3 and using SQS to tell the application to process something. You don't mention whether you're using EC2 or if it is of interest.
Here is the article: http://aws.amazon.com/articles/1464
Otherwise, it sounds like your logic may be confused: SQS isn't an intermediary between your server and S3, but rather is meant for application messaging.
I think I figured out the OP's confusion on this topic.
The diagram shown makes it appear that SQS handles the uploads, but it really doesn't. When you upload 1 or 100 photos, they are added to S3 directly; then, using SQS, a "task" message is created, which one of the EC2 "processing servers" pulls, and that server then grabs from S3 storage the actual picture named in the SQS message.
Hopefully this gives some insight for future users who see this question feeling lost.
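A minimal sketch of that pattern with the current AWS SDK for PHP (v3); the bucket name, queue URL, and region here are placeholders:
require 'vendor/autoload.php';

use Aws\S3\S3Client;
use Aws\Sqs\SqsClient;

$region = 'us-east-1';
$s3  = new S3Client(['version' => 'latest', 'region' => $region]);
$sqs = new SqsClient(['version' => 'latest', 'region' => $region]);

// 1. Upload the file directly to S3
$s3->putObject([
    'Bucket'     => 'mybucket',
    'Key'        => 'myFolder/photo.jpg',
    'SourceFile' => $_FILES['theFile']['tmp_name'],
]);

// 2. Queue a "task" message pointing at the object;
//    a worker (e.g. on EC2) polls the queue and processes it
$sqs->sendMessage([
    'QueueUrl'    => 'https://sqs.us-east-1.amazonaws.com/123456789012/test-topic-queue',
    'MessageBody' => json_encode(['bucket' => 'mybucket', 'key' => 'myFolder/photo.jpg']),
]);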

Streaming private videos from Amazon S3

I need to display video/image files uploaded with ACL:PRIVATE to my Amazon S3 account on my WordPress blog.
I am a newbie to PHP OOP-based coding. Any script help, link references, free plugins, or even a logical algorithm would be a great help :)
Thanks in advance.
This issue can be solved by implementing the following steps:
Download the latest stable version of the SDK from here.
Extract the .zip file and place it in your wamp/www folder.
Rename the config-sample.inc.php file to config.inc.php.
Add the access key and secret key (retrieved from your Amazon S3 account) to the above file; save and exit.
Create a sample file to display public/private objects from Amazon S3.
The content of the file should look as follows:
require('sdk.class.php');
require('services/s3.class.php');

$s3 = new AmazonS3();
$bucket = "bucketname";

// Generate a signed URL that expires in 5 minutes
$temp_link = $s3->get_object_url($bucket, 'your/folder/path/img.jpg', '5 minutes');
echo $temp_link;
In the above code, the URL you receive as output is a signed URL for your private object, and thus it is valid for only 5 minutes.
You may grant access until a future date, allowing only authorized users to access your private content or media on Amazon S3.
This question is a little bit old, but I'm posting this anyway. I had a similar issue today and found out there's a simple answer.
The AWS documentation explains it clearly and has an example as well:
http://docs.aws.amazon.com/aws-sdk-php-2/guide/latest/service-s3.html#amazon-s3-stream-wrapper
Basically, you need to register AWS's stream wrapper and use the s3:// protocol.
Here's my code sample.
use Aws\Common\Aws;
use Aws\S3\Enum\CannedAcl;
use Aws\S3\Exception\S3Exception;

$s3 = Aws::factory(array(
    'key'    => Config::get('aws.key'),
    'secret' => Config::get('aws.secret'),
    'region' => Config::get('aws.region')
))->get('s3');

$s3->registerStreamWrapper();

// Now read the file from S3 (from the doc):
// Open a stream in read-only mode
if ($stream = fopen('s3://bucket/key', 'r')) {
    // While the stream is still open
    while (!feof($stream)) {
        // Read 1024 bytes from the stream
        echo fread($stream, 1024);
    }
    // Be sure to close the stream resource when you're done with it
    fclose($stream);
}
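Once the wrapper is registered, writes work the same way through the s3:// path; for example (a sketch, bucket and key are placeholders):
// Upload by writing to the same s3:// path
file_put_contents('s3://bucket/key', 'file contents here');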
