Uploading image to web server using FTP - php

I'm pretty new to Laravel and I'm currently stuck uploading images to a web server via FTP. I've followed this article/tutorial to set everything up, but I can't get it to work; I get this error:
Can't write image data to path (/storage/ssd4/849/3099849/ponijeri/public/uploads/1523970289image1.jpg)
Note: while the website was still in development (localhost), I was transferring files/images from my desktop to the uploads folder in my project, and everything worked fine until I deployed the site to the live web server.
Controller code:
public function update(Request $request, $id)
{
    $findObject = Accommodation::find($id);
    $findObject->update($request->all());
    Gallery::destroy('objects_id', $id);

    foreach ($request['img'] as $img) {
        $gallery = new Gallery();
        $gallery->objects_id = $id;
        $name = time() . $img->getClientOriginalName(); // prepend the time (integer) to the original file name
        $img->move('uploads', $name); // move it to the 'uploads' directory (public/uploads)
        $gallery->img = $name;
        $gallery->save();
        // create instance of Intervention Image
        $img = Image::make('uploads/' . $name);
        $img->save(public_path() . '/uploads/' . $name);
        Storage::disk('ftp')->put($gallery, fopen($request->file('img[]'), 'r+'));
    }

    $headerImage = $request['headerImage'];
    $name = time() . $headerImage->getClientOriginalName(); // prepend the time (integer) to the original file name
    $headerImage->move('uploads', $name); // move it to the 'uploads' directory (public/uploads)
    $findObject->headerImage = $name;
    $findObject->save();
    // create instance of Intervention Image
    $headerImage = Image::make('uploads/' . $name);
    $headerImage->save(public_path() . '/uploads/' . $name);
    Storage::disk('ftp')->put($headerImage, fopen($request->file('headerImage'), 'r+'));

    return redirect('/objects');
}
FTP configuration:
'ftp' => [
    'driver' => 'ftp',
    'host' => env('FTP_HOST'),
    'username' => env('FTP_USERNAME'),
    'password' => env('FTP_PASSWORD'),
    'root' => '/public/uploads'
],
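For completeness, the env() calls above read these keys from the project's .env file; the entries would look something like this (host and credentials are placeholders, not values from the question):

```ini
FTP_HOST=ftp.example.com
FTP_USERNAME=your-ftp-username
FTP_PASSWORD=your-ftp-password
```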
I appreciate any help!

It seems you are relying on a relative path via public_path() on the production server; please try using an absolute path instead.
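To illustrate the absolute-path advice in plain PHP: resolving the uploads directory against __DIR__ gives a path that does not depend on the process's working directory. This is only a sketch; the helper name and directory layout are assumptions, not code from the question.

```php
<?php
// Build an absolute path to the uploads directory instead of relying on
// the current working directory (helper name and layout are hypothetical).
function uploads_path(string $filename): string
{
    $dir = __DIR__ . '/uploads'; // __DIR__ is always absolute
    if (!is_dir($dir)) {
        mkdir($dir, 0755, true); // ensure the directory exists
    }
    return $dir . '/' . $filename;
}
```

An Intervention Image save() or a Storage put() can then be given uploads_path($name) instead of a relative 'uploads/' string.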

Related

Google Drive API upload 1 file from php

When I open the page, the output is:
File ID: 1qG8tteyVhAbB_rbu_VUvaE9ReqnSjEAh...
but no new file is created on Google Drive. Eventually I want to run the upload from a cron job, but for now I just want to download test.pdf and upload it.
require_once './google-api-php-client/vendor/autoload.php';

use Google\Client;
use Google\Service\Drive;

function uploadBasic()
{
    try {
        $client = new Client();
        //$client->useApplicationDefaultCredentials();
        $client->setAuthConfig('./google-api-php-client/1710-6c50418be6b2.json');
        $client->addScope(Drive::DRIVE);
        $driveService = new Drive($client);
        $fileMetadata = new Drive\DriveFile(array(
            'parents' => ['225qhcKKyf8Ot0IhrRxRtqgHNTxLV1LiyI'],
            'name' => 'test.pdf',
            'mimeType' => 'application/pdf'));
        $mimeType = mime_content_type($fileMetadata);
        $content = file_get_contents('https://example.com/test.pdf');
        $file = $driveService->files->create($fileMetadata, array([
            'data' => $content,
            'mimeType' => 'application/pdf',
            'uploadType' => 'multipart',
            'fields' => 'id']));
        printf("File ID: %s\n", $file->id);
        return $file->id;
    } catch (Exception $e) {
        echo "Error Message: " . $e;
    }
}

uploadBasic();
How to debug the issue
The fastest way to debug this is to do a files.list. This will tell you whether the file was in fact uploaded.
You are not setting parents in your metadata, so the file will have been uploaded to the root directory.
Service account
Remember that if you are using a service account, the files are uploaded into the service account's Google Drive account, not your personal Drive account.
To upload to your personal Drive account, create a directory in your account and share it with the service account using the service account's email address. That email address can be found in the JSON key file; it's the only address containing an @.
Then set parents in the metadata to that folder on your Drive account:
$fileMetadata = new Drive\DriveFile(array(
    'parents' => array('FolderId'),
    'name' => 'ASB-Background-3.png'));
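The service-account email mentioned above can also be read programmatically from the key file. A small sketch (the function name is mine; the "client_email" field is standard in Google service-account JSON keys):

```php
<?php
// Read the service-account email from a JSON key file; "client_email"
// is the field containing the "@".
function service_account_email(string $keyFile)
{
    $key = json_decode(file_get_contents($keyFile), true);
    return $key['client_email'] ?? null;
}
```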
File size 0 error after edit
You edited your question. It originally stated you were doing this:
$content = file_get_contents('./google-api-php-client/ASB-Background-3.png');
It is bad practice to update your question and change your code: it changes the answer to your question and, in this case, your error message.
That being said, from the documentation for file_get_contents:
file_get_contents — Reads entire file into a string
Reading from a URL with this function only works when the fopen URL wrappers are enabled (allow_url_fopen), so your edit changing the path to a URL
file_get_contents('https://example.com/test.pdf');
is probably not going to work on your server.
Your file is uploading with size 0 because you're not giving it any file contents. Download the file onto the machine running the code and then send it, or write your own method that accepts a URL and returns the file contents as a string.
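If fetching from a URL is unavoidable, here is a hedged sketch of such a helper (the function name is mine; it falls back to cURL when allow_url_fopen is disabled):

```php
<?php
// Return the contents of a local path or URL as a string, or false on failure.
// When the source is a URL and allow_url_fopen is off, fall back to cURL.
function fetch_contents(string $src)
{
    if (filter_var($src, FILTER_VALIDATE_URL) && !ini_get('allow_url_fopen')) {
        $ch = curl_init($src);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        $data = curl_exec($ch);
        curl_close($ch);
        return $data;
    }
    return file_get_contents($src);
}
```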
Uploading an image
Files are uploaded in two parts: first the file metadata, then the file itself.
The mimeType must be set to match the file you are uploading, and file_get_contents will only work on a file that is actually accessible to your code.
If the file size is 0, make sure to:
- check the most recently uploaded file; every create() call creates a new file,
- ensure that your code has access to the file you are uploading,
- make sure the mimeType is correct.
Sample:
try {
    $client = new Client();
    $client->useApplicationDefaultCredentials();
    $client->addScope(Drive::DRIVE);
    $driveService = new Drive($client);
    $fileMetadata = new Drive\DriveFile(array(
        'name' => 'photo.jpg'));
    $content = file_get_contents('../files/photo.jpg');
    $file = $driveService->files->create($fileMetadata, array([
        'data' => $content,
        'mimeType' => 'image/jpeg',
        'uploadType' => 'multipart',
        'fields' => 'id']));
    printf("File ID: %s\n", $file->id);
    return $file->id;
} catch (Exception $e) {
    echo "Error Message: " . $e;
}
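The checks from the list above (file accessible, non-zero size, correct mimeType) can be done programmatically before calling create(). A self-contained sketch (the helper name is mine; note that mime_content_type expects a file path, not a DriveFile object):

```php
<?php
// Inspect a local file before uploading: detect its MIME type and size
// rather than hard-coding them.
function inspect_upload(string $path): array
{
    return [
        'mime' => mime_content_type($path),
        'size' => filesize($path),
    ];
}
```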

PHP AWS S3 sdk: How to generate pre signed url to upload a file to a folder in the S3 bucket using PHP?

I am trying to create a pre-signed URL to upload a file to a folder in an S3 bucket using PHP. I can generate a URL for uploading to the bucket, but I couldn't figure out where to specify the folder name. Below is my code:
$object = 'test_103.jpg';
$bucket = $config['s3_input']['bucket'];
$expiry = new DateTime('+10 minutes');
$command = $s3_input->getCommand(
    'PutObject',
    [
        'Bucket' => $bucket,
        'Key' => $object
    ]
);
$signedRequest = $s3_input->createPresignedRequest($command, '+10 minutes');
$signedUploadUrl = $signedRequest->getUri();
echo $signedUploadUrl;
In the above code, how do I pass the folder name in which I want to create the pre-signed URL?
S3 does not have a real hierarchy/directory structure, so keep that in mind; it becomes important later when you want to "move" or "rename" a "folder" on S3:
https://serverfault.com/questions/435827/what-is-the-difference-between-buckets-and-folders-in-amazon-s3
The direct answer to your question: add slashes ("/") to your "Key" to achieve the effect you want.
Also, from the documentation:
$commands[] = $s3Client->getCommand('PutObject', array(
    'Bucket' => 'SOME_BUCKET',
    'Key' => 'photos/photo01.jpg',
    'Body' => fopen('/tmp/photo01.jpg', 'r'),
));
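Since a "folder" on S3 is just a key prefix, building the Key reduces to string handling. A small sketch (the helper name is hypothetical):

```php
<?php
// Join a "folder" prefix and a file name into an S3 object key,
// normalising stray slashes.
function s3_key(string $folder, string $filename): string
{
    return trim($folder, '/') . '/' . ltrim($filename, '/');
}
```

With this, 'Key' => s3_key('photos', 'test_103.jpg') pre-signs a URL that uploads into the photos/ prefix.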

Blueimp jQuery fileupload Plugin and Symfony : how to dynamically change file repository name based on an id?

I use the blueimp jQuery fileupload plugin to upload files in JavaScript (that part I got right), and then I have to handle the uploads on the server side using the upload handler that ships with the plugin (UploadHandler.php).
Working in Symfony, I managed to create a service based on this PHP script, and it works in my controller (the files are uploaded to the default directory, although on my page I get the error message 'upload failed'; I don't know why, but it's not a big problem I guess). The thing is:
I would like to customize the upload path based on the user id, and since I call the upload handler as a service, I don't know how to override the options through the constructor, as I could with a plain PHP instantiation.
Here's my code :
public function uploadFiles(Request $request)
{
    $uploadhandler = $this->container->get('extranetcontratbundle.uploadhandler');
    $response = $uploadhandler->response;
    $files = $response['files'];
    return new JsonResponse($files);
}
In the options of UploadHandler.php there is :
$this->options = array(
    'script_url' => $this->get_full_url() . '/' . $this->basename($this->get_server_var('SCRIPT_NAME')),
    'upload_dir' => dirname($this->get_server_var('SCRIPT_FILENAME')) . '/files/',
    'upload_url' => $this->get_full_url() . '/files/',
    'input_stream' => 'php://input',
    'user_dirs' => false,
    'mkdir_mode' => 0755,
    'param_name' => 'files',
    // ...
And I would like to override the options in a similar way as I would in 'normal' php :
$tmpImagesDir = JPATH_ROOT . '/tmp/' . $userId;
$tmpUrl = 'tmp/' . $userId . '/';
$uploadOptions = array('upload_dir' => $tmpImagesDir, 'upload_url' => $tmpUrl);
$uploadHandler = new UploadHandler($uploadOptions);
But to do that I would have to write require_once(...), and the service would have been created for nothing. If I understood correctly, that's not the way to do it in Symfony. Is there another way?
Thank you for reading, please help.
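There is no accepted answer in this thread, but one common Symfony pattern is to register a thin factory service and let the controller pass per-user overrides. A rough sketch (the class name is mine; the option keys mirror the UploadHandler options above, everything else is an assumption):

```php
<?php
// Hypothetical factory: merges per-user values (e.g. a user-specific
// upload_dir) over the handler's defaults, so the handler itself can
// still be built through the service container.
class UploadHandlerOptionsFactory
{
    private $defaults;

    public function __construct(array $defaults = [])
    {
        $this->defaults = $defaults + [
            'upload_dir' => sys_get_temp_dir() . '/files/',
            'upload_url' => '/files/',
        ];
    }

    public function forUser($userId): array
    {
        return [
            'upload_dir' => rtrim($this->defaults['upload_dir'], '/') . '/' . $userId . '/',
            'upload_url' => rtrim($this->defaults['upload_url'], '/') . '/' . $userId . '/',
        ] + $this->defaults;
    }
}
```

The controller could then do new UploadHandler($factory->forUser($userId)), or have the factory construct the handler itself so the require_once stays out of the controller.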

Save PHP file_get_contents stream to AWS S3

I'm trying to fetch a remote file (an image) using PHP and then put that image into an S3 bucket.
It mostly works, except that the file uploaded to S3 is empty when downloaded back again.
Here is my code so far:
$src = "http://images.neventum.com/2016/83/thumb100x100/56f3fb1b9dbd6-acluvaq_63324840.png";
$name = md5(uniqid());
$ext = pathinfo($src, PATHINFO_EXTENSION);
$file_content = file_get_contents($src);
$filename = "{$name}.{$ext}";
try {
    $file = $this->s3->putObject([
        'Bucket' => getEnv('s3bucket'),
        'Key' => $filename,
        'Body' => $file_content,
        'ACL' => 'private'
    ]);
} catch (S3Exception $e) {
    //Catch
}
Hope you can help. Thank you so much in advance.
Update 1:
The problem is not with the ACL (I have tested with "public"); the object saved on S3 is simply not uploaded correctly. I think it is something to do with the encoding, but I have not been able to figure it out.
Once you upload an image to an S3 bucket it is private by default, so you cannot download it directly; you need to grant public read access to make the object available for download to users.
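Independently of the ACL, it is worth failing fast when the remote fetch returns nothing, so an empty body never reaches putObject. A minimal sketch (the helper name is mine):

```php
<?php
// Fetch a source and throw instead of silently uploading an empty body.
function fetch_or_fail(string $src): string
{
    $content = @file_get_contents($src);
    if ($content === false || $content === '') {
        throw new RuntimeException("Could not fetch {$src}");
    }
    return $content;
}
```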

Store S3 URL in database table

I'm building a small asset management system in Laravel 5.2.
A user can upload images, video, etc. to the app, and the asset metadata gets stored in the assets table. While that's happening, the asset is renamed to match the asset id (I'm storing the original filename too), the mime type is stored, and the file is uploaded to S3.
Where I've come unstuck is storing the S3 URL in the database.
This is my method
public function store(AssetRequest $request)
{
    // Initialise a new asset and set the name from the form
    $asset = new Asset(array(
        'name' => $request->get('name')
    ));
    $asset->user_id = Auth::user()->id;
    // save the asset to the db
    $asset->save();
    // set the file var = form input
    $file = $request->file('asset_path');
    $extension = $file->getClientOriginalExtension();
    // modify the asset name
    $assetFile = $asset->id . '.' . $request->file('asset_path')->getClientOriginalExtension();
    // push the new asset to s3
    Storage::disk('s3')->put('uploads/' . $assetFile, file_get_contents($file));
    $asset->mime = $file->getClientMimeType();
    $s3Url = Storage::url($file);
    $asset->s3_url = $s3Url;
    $asset->original_filename = $file->getClientOriginalName();
    $asset->filename = $assetFile;
    $asset->file_extension = $extension;
    // return ok
    $asset->save();
    return \Redirect::route('asset.create')->with('message', 'Asset added!');
}
The lines relating to my attempt at storing the S3 URL,
$s3Url = Storage::url($file);
$asset->s3_url = $s3Url;
only store a temporary path (/storage//tmp/php8su2r0) rather than an actual S3 URL. I'd like to avoid setting the bucket manually, hoping instead to use what is configured in config/filesystems.php.
Any ideas?
You can get everything from the config using the config(key) helper function, so to get the S3 public URL of a file, do this:
function publicUrl($filename)
{
    return "http://" . config('filesystems.disks.s3.bucket') . ".s3-website." . config('filesystems.disks.s3.region') . ".amazonaws.com/" . $filename;
}
Or you can use the underlying S3Client (taken from here):
$filesystem->getAdapter()->getClient()->getObjectUrl($bucket, $key);
What you are trying to achieve, I have been doing in many projects.
All you need to do is create an image_url column in the database and store the S3 bucket link + the file name + the extension.
The part that is constant for me is https://s3-eu-west-1.amazonaws.com/backnine8/fitness/events/; then I append the id and the extension. In your case it could be the name and extension.
if (Input::get('file')) {
    $extension = pathinfo(Input::get('filename'), PATHINFO_EXTENSION);
    $file = file_get_contents(Input::get('file'));
    $s3 = AWS::get('s3');
    $s3->putObject(array(
        'ACL' => 'public-read',
        'Bucket' => 'backnine8',
        'Key' => '/fitness/events/' . $event->id . '.' . $extension,
        'Body' => $file,
    ));
    $event->image_url = 'https://s3-eu-west-1.amazonaws.com/backnine8/fitness/events/' . $event->id . '.' . $extension;
    $event->save();
}
