Move images from one server to S3 - PHP Laravel 5.3

I'm trying to write a script to move images from an old server to Amazon S3. Is it possible to do this by downloading each image from a URL?
$companies = Company::all();
$companies->each(function ($company) {
    // some method to download the file
    $file = download($company->logo);
    // store on S3
    $filename = $file->storeAs('images', uniqid('img_') . "." . $file->guessClientExtension(), 's3');
    // get the new path
    $new_path = Storage::disk('s3')->url($filename);
    // save the new path to logo
    $company->logo = $new_path;
    $company->save();
});

You can use the PHP League Flysystem library.
It has an integration for Laravel and for other frameworks such as Zend and CakePHP.
The library has adapters for both Amazon S3 v2 and Amazon S3 v3.
Full documentation here
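One way to implement the download step from the question is with plain file_get_contents, assuming the logo column holds a full URL (a sketch; the path names and the extension heuristic are assumptions, not from the original code):

```php
<?php
use Illuminate\Support\Facades\Storage;

$companies = Company::all();

$companies->each(function ($company) {
    // Download the image directly from the old server's URL.
    $contents = file_get_contents($company->logo);

    // Store on S3 under a unique name; the extension is taken
    // from the original URL as a simple heuristic.
    $ext = pathinfo(parse_url($company->logo, PHP_URL_PATH), PATHINFO_EXTENSION);
    $filename = 'images/' . uniqid('img_') . '.' . $ext;
    Storage::disk('s3')->put($filename, $contents);

    // Save the new S3 URL back on the model.
    $company->logo = Storage::disk('s3')->url($filename);
    $company->save();
});
```

This avoids the undefined download() helper entirely; guessClientExtension() is only available on uploaded files, not on raw downloads, which is why the extension is read from the URL instead.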

Related

Can't write image data to path - Intervention Image

I'm trying to resize my images using the Intervention package in my Laravel project. I'm able to store images using the Storage facade, but when I store them with the Image class it throws the error below:
Can't write image data to path
(/var/www/commerce/storage/featureds/2-1537128615.jpg)
if ($request->has("featured_img")) {
    $file_path = $request->get("featured_img");
    $exp = explode("/", $file_path);
    $filename = $exp[1];
    $featured_path = "featureds/" . $filename;
    \Storage::copy($request->get("featured_img"), $featured_path);
    \Image::make(\Storage::get($featured_path))->fit(400, 300)->save(storage_path($featured_path));
    $product->media()->where("type", "featured")->update(["path" => $featured_path]);
}
How can I fix this? Thanks for all your help.
Is there a featureds directory in storage?
If not, use mkdir to create it, because Intervention doesn't create directories automatically.
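A minimal sketch of creating the missing directory before Intervention writes to it (storage_path() is Laravel's helper; the "featureds" name comes from the question):

```php
<?php
// Ensure the target directory exists before calling ->save().
$dir = storage_path('featureds');

if (!is_dir($dir)) {
    // The third argument "true" creates nested directories recursively.
    mkdir($dir, 0755, true);
}
```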

How to store base 64 decoded image to amazon s3 bucket in php

I was trying to store a base64-decoded image to an Amazon S3 bucket in PHP. Here is my code:
$data = base64_decode($pro_img_nm);
$camp_name = "name";
$bucketName = "bucket";
$file = $camp_name . uniqid() . '.png';
$image = imagecreatefromstring($data);
header('Content-Type: image/png');
imagepng($image, 'folder/' . $file);
$s3->putObjectFile($image, $bucketName, "folder/" . $file, S3::ACL_PUBLIC_READ);
The image gets stored locally because of imagepng(), but it fails to store on the S3 bucket with the error "S3::inputFile(): Unable to open input file". Can anyone help me with this? Thanks.
Finally I found the solution. The error occurred because of the SDK version I was using; we need to use v3. This is the guide to follow:
http://docs.aws.amazon.com/aws-sdk-php/v3/guide/
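With SDK v3 the decoded binary can be passed directly as the object body, so no temporary file is needed at all. A sketch, assuming an S3Client configured with valid credentials (region, bucket, and variable names are placeholders or taken from the question):

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Placeholder region; credentials are read from the environment
// or ~/.aws/credentials by default.
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// $pro_img_nm and $camp_name come from the question above.
$camp_name  = "name";
$pro_img_nm = '...base64 data...';

$s3->putObject([
    'Bucket'      => 'bucket',
    'Key'         => 'folder/' . $camp_name . uniqid() . '.png',
    'Body'        => base64_decode($pro_img_nm),
    'ACL'         => 'public-read',
    'ContentType' => 'image/png',
]);
```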

How to get dimensions of an uploaded Video in Azure Media Services (PHP SDK/Django project)

I have a Django app that contains a Video-on-Demand feature. It's powered by Azure Media Services (AMS). When a user uploads a video, I first save the video in an Azure storage blob, and then I use a PHP script (which utilizes the AMS php sdk) to encode the said video and prep a streaming URL (hosted on AMS).
My problem is this: how do I get the dimensions of the video? I need to know the height and width so that I can encode the video to lower res formats on AMS. I can't get the dimensions from python since I'm not uploading the video file onto a local server first (where my web server is running). What are my options? Please advise.
Since you are using the AMS SDK for PHP to create the AMS task, which requires the video asset file, you can leverage the getID3 PHP module (http://getid3.sourceforge.net/) to get the info of the video asset during the PHP process with ease.
Download the module, extract it into your PHP application's folder, and use the following code snippet to get the dimensions of the video asset:
require_once('./getid3/getid3.php');
$filename = "<video_path>";
$getID3 = new getID3;
$ThisFileInfo = $getID3->analyze($filename);
var_dump($ThisFileInfo['asf']['video_media']);
If you have any further concerns, please feel free to let me know.
Update: using a remote file on Azure Storage
Here is a code sample with which you can use the SAS URL of blobs on Azure Storage. It downloads the file to a server folder, detects the info, and then deletes the temporary file.
$remotefilename = '<SAS Url>';
if ($fp_remote = fopen($remotefilename, 'rb')) {
    $localtempfilename = tempnam('/tmp', 'getID3');
    if ($fp_local = fopen($localtempfilename, 'wb')) {
        while ($buffer = fread($fp_remote, 8192)) {
            fwrite($fp_local, $buffer);
        }
        fclose($fp_local);
        // Initialize getID3 engine
        $getID3 = new getID3;
        $ThisFileInfo = $getID3->analyze($localtempfilename);
        // Delete temporary file
        unlink($localtempfilename);
    }
    fclose($fp_remote);
    var_dump($ThisFileInfo);
}

Upload file from url to AWS in laravel

I know how to upload a file from local storage to AWS using Laravel, but I want to upload a file directly from an external URL to AWS without downloading it first.
Any suggestions on how I can achieve this?
I finally solved this using the Intervention Image library.
use Image;
use Storage;

$image = Image::make('url');
$image->encode('jpg');
$s3 = Storage::disk('s3');
$filePath = '/profilePhotos/' . time() . '.jpg';
$s3->put($filePath, $image->__toString(), 'public');
Installation instructions for Image library can be found here in the "Integration in Laravel" section.
The accepted answer depends on another library. You can do it without the Intervention Image library:
$url = 'https://remote.site/photo/name.jpg';
$contents = file_get_contents($url);
$name = substr($url, strrpos($url, '/') + 1);
Storage::put($name, $contents);
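For large files, the whole body doesn't have to be loaded into memory: in recent Laravel versions Storage::put also accepts a stream resource. A sketch using the same placeholder URL:

```php
<?php
use Illuminate\Support\Facades\Storage;

$url = 'https://remote.site/photo/name.jpg';
$name = substr($url, strrpos($url, '/') + 1);

// Open the remote file as a read stream so the image is copied
// in chunks instead of being held in memory at once.
$stream = fopen($url, 'rb');

if ($stream !== false) {
    Storage::disk('s3')->put($name, $stream);
    if (is_resource($stream)) {
        fclose($stream);
    }
}
```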

php API to upload and download files to Amazon S3

I have a website hosted on Amazon. I want to give my clients access to upload files that are already in their Amazon S3 space to my S3 space. Is there any PHP API that supports this functionality?
Amazon actually provides one. And there are lots of examples on the web of using it. Google is your friend.
Amazon has a PHP SDK; check the sample code:
// The sample code below demonstrates how Resource APIs work
$aws = new Aws($config);
// Get references to resource objects
$bucket = $aws->s3->bucket('my-bucket');
$object = $bucket->object('image/bird.jpg');
// Access resource attributes
echo $object['LastModified'];
// Call resource methods to take action
$object->delete();
$bucket->delete();
Or use the old S3.php for uploading files to an S3 bucket. It's a single PHP file named S3.php; you just download it and include it in your code. For more, read this.
<?php
if (!class_exists('S3')) require_once('S3.php');

// AWS access info
if (!defined('awsAccessKey')) define('awsAccessKey', 'Your S3 Access Key');
if (!defined('awsSecretKey')) define('awsSecretKey', 'Your Secret Key');

// instantiate the class
$s3 = new S3(awsAccessKey, awsSecretKey);
$s3->putBucket("bucket name", S3::ACL_PRIVATE);

// move the file
if ($s3->putObjectFile("source file name on the server, with path", "bucket name", "file name on the S3 server", S3::ACL_PRIVATE)) {
    // S3 upload success
}
?>
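For the current AWS SDK for PHP (v3), the equivalent upload and download look roughly like this (a sketch; bucket name, key, region, and local paths are placeholders):

```php
<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Placeholder region; the SDK also reads credentials from the
// environment or ~/.aws/credentials.
$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

// Upload a local file.
$s3->putObject([
    'Bucket'     => 'my-bucket',
    'Key'        => 'image/bird.jpg',
    'SourceFile' => '/path/to/bird.jpg',
]);

// Download it back.
$result = $s3->getObject([
    'Bucket' => 'my-bucket',
    'Key'    => 'image/bird.jpg',
]);
file_put_contents('/path/to/copy.jpg', $result['Body']);
```

Unlike the old S3.php class, the v3 client throws Aws\S3\Exception\S3Exception on failure rather than returning false, so errors are handled with try/catch.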
