Save PHP file_get_contents stream to AWS S3

I'm trying to fetch a remote file (an image) with PHP and then put that image into an S3 bucket.
It mostly works, except that the file that is uploaded to S3 is empty when downloaded back again.
Here is my code so far:
$src = "http://images.neventum.com/2016/83/thumb100x100/56f3fb1b9dbd6-acluvaq_63324840.png";
$name = md5(uniqid());
$ext = pathinfo($src, PATHINFO_EXTENSION);
$file_content = file_get_contents($src);
$filename = "{$name}.{$ext}";

try {
    $file = $this->s3->putObject([
        'Bucket' => getenv('s3bucket'),
        'Key'    => $filename,
        'Body'   => $file_content,
        'ACL'    => 'private',
    ]);
} catch (S3Exception $e) {
    // Handle the exception
}
Hope you can help. Thank you so much in advance.
Update 1:
The problem is not with the ACL (I have tested with "public" as well); it is that the object saved on S3 is not uploaded correctly. I think it is something to do with the encoding, but I have not been able to figure it out.
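One way to narrow this down is to confirm the download actually succeeded before calling putObject, since file_get_contents returns false on failure and a false or empty body would produce exactly this symptom. A minimal sketch, assuming the AWS SDK for PHP v3 and that the source is known to be a PNG:

$file_content = file_get_contents($src);
if ($file_content === false) {
    // allow_url_fopen may be disabled, or the request failed
    throw new RuntimeException("Could not download {$src}");
}
// If this prints 0, the problem is the download, not S3
var_dump(strlen($file_content));

$result = $this->s3->putObject([
    'Bucket'      => getenv('s3bucket'),
    'Key'         => $filename,
    'Body'        => $file_content,
    'ContentType' => 'image/png', // assumption: matches the source image
    'ACL'         => 'private',
]);
// The SDK returns metadata about the stored object
var_dump($result['ObjectURL']);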

Once you upload an image to an S3 bucket, it is private by default, so users cannot download it directly. You need to grant public read access (or hand out a pre-signed URL) to make the object available for download.
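For example (a sketch, assuming the same $this->s3 client from the question and the AWS SDK for PHP v3):

// Option 1: make the object publicly readable at upload time
$this->s3->putObject([
    'Bucket' => getenv('s3bucket'),
    'Key'    => $filename,
    'Body'   => $file_content,
    'ACL'    => 'public-read',
]);

// Option 2: keep it private and generate a time-limited pre-signed URL
$cmd = $this->s3->getCommand('GetObject', [
    'Bucket' => getenv('s3bucket'),
    'Key'    => $filename,
]);
$request = $this->s3->createPresignedRequest($cmd, '+10 minutes');
echo (string) $request->getUri();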

Related

Google Drive API: upload a file from PHP

When I open the page, the output is:
File ID: 1qG8tteyVhAbB_rbu_VUvaE9ReqnSjEAh...
But no new files are created on Google Drive. Eventually I want to upload files from a cron job, but for now I only want to fetch test.pdf and upload it.
require_once './google-api-php-client/vendor/autoload.php';

use Google\Client;
use Google\Service\Drive;

function uploadBasic()
{
    try {
        $client = new Client();
        //$client->useApplicationDefaultCredentials();
        $client->setAuthConfig('./google-api-php-client/1710-6c50418be6b2.json');
        $client->addScope(Drive::DRIVE);
        $driveService = new Drive($client);
        $fileMetadata = new Drive\DriveFile(array(
            'parents' => ['225qhcKKyf8Ot0IhrRxRtqgHNTxLV1LiyI'],
            'name' => 'test.pdf',
            'mimeType' => 'application/pdf'));
        $mimeType = mime_content_type($fileMetadata);
        $content = file_get_contents('https://example.com/test.pdf');
        $file = $driveService->files->create($fileMetadata, array([
            'data' => $content,
            'mimeType' => 'application/pdf',
            'uploadType' => 'multipart',
            'fields' => 'id']));
        printf("File ID: %s\n", $file->id);
        return $file->id;
    } catch (Exception $e) {
        echo "Error Message: " . $e;
    }
}

uploadBasic();
How can I debug this issue?
The fastest way to debug this is to do a files.list. This will tell you whether the file was in fact uploaded.
If you are not setting parents in your metadata, the file will have been uploaded to the root directory.
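A minimal sketch of that check, reusing the $driveService from the question (the fields string is illustrative):

// List the most recently created files visible to the authenticated
// account, to confirm whether the upload actually happened
$response = $driveService->files->listFiles([
    'pageSize' => 10,
    'orderBy'  => 'createdTime desc',
    'fields'   => 'files(id, name, size, parents)',
]);
foreach ($response->getFiles() as $f) {
    printf("%s (%s) size=%s\n", $f->getName(), $f->getId(), $f->getSize());
}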
service account
Remember, if you are using a service account, the files are uploaded into the service account's Google Drive account, not your personal Drive account.
To upload to your personal Drive account, you need to create a directory in your Drive account and share that directory with the service account, using the service account's email address. The service account email address can be found in the JSON key file; it is the only address with an @ in it.
Then set parents in the metadata to the ID of that folder in your Drive account:
$fileMetadata = new Drive\DriveFile(array(
    'parents' => array('FolderId'),
    'name' => 'ASB-Background-3.png'));
File 0 size error after edit
You edited your question. It originally stated you were doing this:
$content = file_get_contents('./google-api-php-client/ASB-Background-3.png');
It is bad practice to update your question and change your code: it changes the answer to your question and, in this case, your error message.
That being said, from the documentation for file_get_contents:
file_get_contents — Reads entire file into a string
It can read from a URL, but only when the allow_url_fopen ini setting is enabled on the machine running the code. If it is disabled, your edited call
file_get_contents('https://example.com/test.pdf');
returns false rather than the file contents, which would explain the empty upload.
Your file is uploading with 0 bytes because you are not giving it a file. Download the file onto the machine running the code and send the local copy, or write your own helper that accepts a URL and returns the file contents as a string.
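A hypothetical helper along those lines, using cURL so it works even when allow_url_fopen is off (the function name is illustrative):

// Fetch a URL and return the response body as a string, or false on failure
function fetchUrlContents(string $url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    $body = curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return ($body !== false && $status === 200) ? $body : false;
}

$content = fetchUrlContents('https://example.com/test.pdf');
if ($content === false || strlen($content) === 0) {
    throw new RuntimeException('Download failed; nothing to upload.');
}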
upload image
Files are uploaded in two parts: first the file metadata, then the file itself.
The mimeType must be set to that of the file you are uploading, and file_get_contents will only work on a file that is accessible to your code.
If the file size is 0, make sure to:
check the most recently uploaded file; every create call creates a new file.
ensure that your code has access to the file you are uploading.
make sure the mimeType is correct.
Sample:
try {
    $client = new Client();
    $client->useApplicationDefaultCredentials();
    $client->addScope(Drive::DRIVE);
    $driveService = new Drive($client);
    $fileMetadata = new Drive\DriveFile(array(
        'name' => 'photo.jpg'));
    $content = file_get_contents('../files/photo.jpg');
    $file = $driveService->files->create($fileMetadata, array(
        'data' => $content,
        'mimeType' => 'image/jpeg',
        'uploadType' => 'multipart',
        'fields' => 'id'));
    printf("File ID: %s\n", $file->id);
    return $file->id;
} catch (Exception $e) {
    echo "Error Message: " . $e;
}

Uploading image to web server using FTP

I'm pretty new to Laravel and I'm currently stuck uploading images to a web server via FTP. I've followed this article/tutorial to set everything up, but I can't get it to work; I get this error:
Can't write image data to path (/storage/ssd4/849/3099849/ponijeri/public/uploads/1523970289image1.jpg)
Note: I was transferring files/images from my desktop to the uploads folder in my project while the website was still in development (localhost), and everything worked fine until I moved the site to the live web server.
Controller code:
public function update(Request $request, $id)
{
    $findObject = Accommodation::find($id);
    $findObject->update($request->all());
    Gallery::destroy('objects_id', $id);

    foreach ($request['img'] as $img) {
        $gallery = new Gallery();
        $gallery->objects_id = $id;
        $name = time() . $img->getClientOriginalName(); // prepend the time (integer) to the original file name
        $img->move('uploads', $name); // move it to the 'uploads' directory (public/uploads)
        $gallery->img = $name;
        $gallery->save();
        // create an instance of Intervention Image
        $img = Image::make('uploads/' . $name);
        $img->save(public_path() . '/uploads/' . $name);
        Storage::disk('ftp')->put($gallery, fopen($request->file('img[]'), 'r+'));
    }

    $headerImage = $request['headerImage'];
    $name = time() . $headerImage->getClientOriginalName(); // prepend the time (integer) to the original file name
    $headerImage->move('uploads', $name); // move it to the 'uploads' directory (public/uploads)
    $findObject->headerImage = $name;
    $findObject->save();
    // create an instance of Intervention Image
    $headerImage = Image::make('uploads/' . $name);
    $headerImage->save(public_path() . '/uploads/' . $name);
    Storage::disk('ftp')->put($headerImage, fopen($request->file('headerImage'), 'r+'));

    return redirect('/objects');
}
FTP configuration:
'ftp' => [
    'driver'   => 'ftp',
    'host'     => env('FTP_HOST'),
    'username' => env('FTP_USERNAME'),
    'password' => env('FTP_PASSWORD'),
    'root'     => '/public/uploads',
],
I appreciate any help!
It seems you are building the destination from public_path(), which resolves differently on the production server than on localhost; please try using the server's correct absolute path instead.
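If the path alone doesn't fix it, note that Storage::put() expects a string path as its first argument, while the controller above passes the $gallery model and an Intervention Image object. A sketch of how the FTP write could look instead, assuming $name already holds the stored file name and the local copy lives in public/uploads:

// Write the already-moved file to the FTP disk under the same name;
// the first argument is a path relative to the disk's configured root
$localPath = public_path('uploads/' . $name);
Storage::disk('ftp')->put($name, fopen($localPath, 'r'));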

How to download a file from URL without showing the full path in laravel?

Download link:
<a href='"+ downloadlink + '/attachment-download/' + $('#employee_ID').val()+'/'+ res[i].file_name +"'>
Route:
Route::get('/attachment-download/{id}/{filename}', array(
    'uses' => 'AttachmentsController@getDownloadAttachments'
));
Attachment Controller:
public function getDownloadAttachments($id, $filename)
{
    $file = "./img/user-icon.png";
    $resource = \Employee::WhereCompany()->findOrfail($id);
    $path = $this->attachmentsList($resource);
    foreach ($path as $index => $attachment) {
        if ($filename == $attachment['file_name']) {
            $filePath = $attachment['url'];
        }
    }
    //return \Response::download($file);
    return \Response::download($filePath);
}
File URL Output:
https://zetpayroll.s3.ap-south-1.amazonaws.com/employees/81/Screenshot%20from%202017-04-26%2015%3A07%3A45.png?X-Amz-Content-Sha256=UNSIGNED-PAYLOAD&X-Amz-Algorithm=AWS4-HMAC-SHA256&X-Amz-Credential=AKIAI57OFN3HPCBPQZIQ%2F20170612%2Fap-south-1%2Fs3%2Faws4_request&X-Amz-Date=20170612T144818Z&X-Amz-SignedHeaders=host&X-Amz-Expires=3600&X-Amz-Signature=59ecc4d11b7ed71bd336531bd7f4ab7c84da6b7424878d6487679c97a8c52ca7
If I try to download a file using a static path like
$file = "./img/user-icon.png";
return \Response::download($file);
it downloads fine, but it is not possible to download the file from the AWS URL. Please help me download the file automatically from its URL, or explain how to get a path from a URL in Laravel or PHP.
Thank you.
Using the above function, all the files do download; however, when opening them, text, CSV, and PDF files (.txt, .csv, .pdf, ...) open without problems, but images do not.
$fileContent = file_get_contents($filePath);
$response = response($fileContent, 200, [
    'Content-Type' => 'application/json',
    'Content-Disposition' => 'attachment; filename="' . $filename . '"',
]);
return $response;
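One likely culprit is the hard-coded application/json Content-Type, which browsers will not render as an image. A sketch that derives the type from the downloaded bytes instead (finfo is part of PHP's standard fileinfo extension):

$fileContent = file_get_contents($filePath);

// Guess the real content type from the bytes, falling back to a
// generic binary type if detection fails
$finfo = new finfo(FILEINFO_MIME_TYPE);
$mimeType = $finfo->buffer($fileContent) ?: 'application/octet-stream';

return response($fileContent, 200, [
    'Content-Type'        => $mimeType,
    'Content-Disposition' => 'attachment; filename="' . $filename . '"',
]);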

Uploading multiple files to AWS S3 with Progress Bar

I'm using Laravel to upload multiple files to an S3 bucket, but I can't get the progress data correctly on the client side.
On client side I have the following (simplified):
var xhr = new XMLHttpRequest();
xhr.addEventListener("progress", updateProgress, false);
xhr.addEventListener("load", transferComplete, false);
xhr.open("POST", "my_url_to_upload");
xhr.send(formData); // formData is defined earlier and contains multiple files

function updateProgress(e) {
    console.log('updateProgress', e);
}

function transferComplete(e) {
    console.log("Files uploaded successfully.", e);
}
On Laravel's side:
$files = \Input::file('file');
$s3 = \Storage::disk('s3');
foreach ($files as $file) {
    $file_name = "/some_folder/" . $file->getClientOriginalName();
    $s3->put($file_name, file_get_contents($file), 'public');
}
This works great as far as uploading the files to the S3 bucket.
The problem is that when uploading multiple files, the client side updateProgress function is only called once and only when all files have been uploaded (instead of after each file and during the uploads).
Ideally, the progress bar will get updated periodically during file uploads, so that when uploading large files, it will show close to real time progress (and not just when each file is completed).
How would I get Laravel (or PHP in general) to report back the progress, during the uploads?
Here's how to do it in SDK v3:
$client = new S3Client(/* config */);
$result = $client->putObject([
    'Bucket'      => 'bucket-name',
    'Key'         => 'bucket-name/file.ext',
    'SourceFile'  => 'local-file.ext',
    'ContentType' => 'application/pdf',
    '@http'       => [
        'progress' => function ($downloadTotalSize, $downloadSizeSoFar, $uploadTotalSize, $uploadSizeSoFar) {
            printf(
                "%s of %s downloaded, %s of %s uploaded.\n",
                $downloadSizeSoFar,
                $downloadTotalSize,
                $uploadSizeSoFar,
                $uploadTotalSize
            );
        },
    ],
]);
This is explained in the AWS docs, S3 Config section. It works by exposing Guzzle's progress callable, as explained in this SO answer.
I understand you're doing it a bit differently (with streams), but I'm sure you can adjust my example to your needs.
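If you want to keep using Laravel's Storage facade, you can usually reach the underlying S3Client and pass the same option. A sketch, assuming the same $files loop as the question and Laravel 5's Flysystem v1 adapter (the getDriver()/getAdapter() chain may differ in other versions):

// Grab the SDK client behind the 's3' disk and upload with progress
$client = \Storage::disk('s3')->getDriver()->getAdapter()->getClient();
foreach ($files as $file) {
    $client->putObject([
        'Bucket' => config('filesystems.disks.s3.bucket'),
        'Key'    => 'some_folder/' . $file->getClientOriginalName(),
        'Body'   => fopen($file->getRealPath(), 'r'),
        '@http'  => [
            'progress' => function ($dlTotal, $dlNow, $ulTotal, $ulNow) {
                // Server-side only: the browser will not see these numbers
                // without extra plumbing (e.g. polling or websockets)
                \Log::info("uploaded {$ulNow} of {$ulTotal} bytes");
            },
        ],
    ]);
}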

Store S3 URL in database table

I'm building a small asset management system in Laravel 5.2
A user can upload images, video, etc. to the app, and the asset metadata gets stored in the assets table. While that's happening, the asset is renamed to match the asset id (I'm storing the original filename too), the mime type is recorded, and the file is uploaded to S3.
Where I've come unstuck is storing the S3 URL in the database.
This is my method
public function store(AssetRequest $request)
{
    // Initialise a new asset and set the name from the form
    $asset = new Asset(array(
        'name' => $request->get('name')
    ));
    $asset->user_id = Auth::user()->id;
    // save the asset to the db
    $asset->save();
    // set the file var = form input
    $file = $request->file('asset_path');
    $extension = $file->getClientOriginalExtension();
    // modify the asset name
    $assetFile = $asset->id . '.' . $request->file('asset_path')->getClientOriginalExtension();
    // push the new asset to s3
    Storage::disk('s3')->put('uploads/' . $assetFile, file_get_contents($file));
    $asset->mime = $file->getClientMimeType();
    $s3Url = Storage::url($file);
    $asset->s3_url = $s3Url;
    $asset->original_filename = $file->getClientOriginalName();
    $asset->filename = $assetFile;
    $asset->file_extension = $extension;
    $asset->save();
    return \Redirect::route('asset.create')->with('message', 'Asset added!');
}
The lines relating to my attempt at storing the S3 URL,
$s3Url = Storage::url($file);
$asset->s3_url = $s3Url;
only seem to store a temporary path (/storage//tmp/php8su2r0) rather than an actual S3 URL. I'd like to avoid having to set the bucket manually; I'm hoping I can use what is already configured in config/filesystem.php.
Any ideas?
You can get everything from the config using the config(key) helper function, so to get the S3 public URL of a file, do this:
function publicUrl($filename)
{
    return "http://" . config('filesystems.disks.s3.bucket') . ".s3-website." . config('filesystems.disks.s3.region') . ".amazonaws.com/" . $filename;
}
Or you can use the underlying S3Client (taken from here):
$filesystem->getAdapter()->getClient()->getObjectUrl($bucket, $key);
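In a Laravel app that could look like this (a sketch; the getDriver()/getAdapter() chain assumes the Flysystem v1 adapter shipped with Laravel 5.2):

// Build the URL from the bucket configured in config/filesystems.php,
// so nothing is hard-coded
$client = \Storage::disk('s3')->getDriver()->getAdapter()->getClient();
$bucket = config('filesystems.disks.s3.bucket');
$asset->s3_url = $client->getObjectUrl($bucket, 'uploads/' . $assetFile);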
What you are trying to achieve, I have done in many projects. All you need to do is create an image_url column in the database and store the S3 bucket link + the name of the file + the extension.
The part of the URL that is constant for me is https://s3-eu-west-1.amazonaws.com/backnine8/fitness/events/; then I append the id and the extension. In your case it could be the name and extension.
if (Input::get('file')) {
    $extension = pathinfo(Input::get('filename'), PATHINFO_EXTENSION);
    $file = file_get_contents(Input::get('file'));
    $s3 = AWS::get('s3');
    $s3->putObject(array(
        'ACL'    => 'public-read',
        'Bucket' => 'backnine8',
        'Key'    => '/fitness/events/' . $event->id . '.' . $extension,
        'Body'   => $file,
    ));
    $event->image_url = 'https://s3-eu-west-1.amazonaws.com/backnine8/fitness/events/' . $event->id . '.' . $extension;
    $event->save();
}
