Convert Uploaded S3 Images to Original File Path - php

I attempted to migrate our local WordPress images to S3 using the WP Offload S3 plugin, and it turns out that because the Object Versioning option was turned on, the plugin added a numbered folder for each image in the path, e.g.:
[…]xxx.s3-us-west-1.amazonaws.com/wp-content/uploads/2018/04/xxxxxxxx/example.jpg
[…]xxx.s3-us-west-1.amazonaws.com/wp-content/uploads/2018/04/xxxxxxxx/example-123x456.jpg
The original path was previously:
[…]xxx.com/wp-content/uploads/2018/04/example.jpg
[…]xxx.com/wp-content/uploads/2018/04/example-123x456.jpg
If I delete the WP Offload S3 plugin and then reinstall it, the plugin's settings are still saved and the media still points to the Amazon S3 links, so they are probably stored somewhere in the SQL database.
Is there a simple PHP script to convert the object-versioned file paths back to the original path on all of the already uploaded media?
I don't know much about coding, but perhaps something like this might be useful as a reference? https://gist.github.com/TJNevis/8df059e8ba6d9cf28a01#file-s3fixexistingmedia-php-L27
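Not a polished solution, but here is a rough, untested sketch of the kind of script that could do it: it walks post content, strips the extra versioning folder, and points the URLs back at the original domain. The domains in it are placeholders taken from the examples above; back up the database before trying anything like this.

<?php
// Rough, untested sketch: rewrite hard-coded S3 URLs in post content back to
// the original local URLs, dropping the extra Object Versioning folder.
require_once __DIR__ . '/wp-load.php'; // adjust to your WordPress root

global $wpdb;

$posts = $wpdb->get_results("SELECT ID, post_content FROM {$wpdb->posts}");
foreach ($posts as $post) {
    $fixed = preg_replace(
        // e.g. https://xxx.s3-us-west-1.amazonaws.com/wp-content/uploads/2018/04/<folder>/example.jpg
        '#https?://xxx\.s3-us-west-1\.amazonaws\.com/wp-content/uploads/(\d{4}/\d{2})/[^/]+/#',
        'https://xxx.com/wp-content/uploads/$1/',
        $post->post_content
    );
    if ($fixed !== $post->post_content) {
        $wpdb->update($wpdb->posts, ['post_content' => $fixed], ['ID' => $post->ID]);
    }
}

URLs the plugin rewrites at runtime (as opposed to ones hard-coded in post content) come from the data it stored when each file was offloaded, so a full revert may also mean removing that stored data through the plugin's own settings or metadata.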

Related

Upload Large file from My computer to S3 Server without editing php.ini

I want to upload a large file from my computer to an S3 server without editing php.ini. First, I choose a file with the browse button and submit it with the upload button, and it then gets uploaded to the S3 server. But I can't post the form's file data when the file is large, and I don't want to edit php.ini. Is there any way to upload a large local file to an S3 server?
I've done this by implementing Fine Uploader's PHP implementation for S3. As of recently it is released under an MIT license. It's an easy way to upload huge files to S3 without changing your php.ini at all.
It's not the worst thing in the world to set up. You'll need to set some environment variables for the public/secret keys, set up CORS settings on the bucket, and write a PHP page based on one of the examples, which will call a PHP endpoint that handles the signing.
One thing that was not made obvious to me was that, when setting the environment variables, you're expected to create two separate AWS users with different privileges for security reasons.
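For the signing side, the core idea (the general pattern, not Fine Uploader's exact endpoint) is to hand the browser a short-lived presigned URL so the file goes straight to S3 and never passes through PHP, which is why the php.ini limits stop mattering. A minimal sketch using the AWS SDK for PHP v3; the bucket name and key are made up:

<?php
// Hypothetical signing endpoint: returns a presigned PUT URL the browser can
// upload to directly, so the large file never hits this server.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1', // your bucket's region
]);

$cmd = $s3->getCommand('PutObject', [
    'Bucket' => 'my-upload-bucket', // made-up bucket name
    'Key'    => 'uploads/' . basename($_GET['filename'] ?? 'file.bin'),
]);

// Valid for 15 minutes; the browser PUTs the file body to this URL.
$url = (string) $s3->createPresignedRequest($cmd, '+15 minutes')->getUri();

header('Content-Type: application/json');
echo json_encode(['url' => $url]);

The bucket still needs the CORS rule mentioned above so the browser is allowed to PUT to it from your origin.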
Try this:
ini_set("upload_max_filesize", "300M");

Google App Engine PHP Serving Images from Google Cloud Storage

In my app, I want users to be able to upload a picture to set for their profile. I only want them to be able to have one picture set and in storage (Google Cloud Storage) at a time.
When users upload the picture, it gets renamed the same thing every time. I do this so I don't have to search and delete the old file. The new file just replaces it.
The problem I'm running into is that once a new image is uploaded and replaces the old one, the app keeps serving the old image even though the actual file in Google Cloud Storage has changed. I have verified the file has been successfully replaced by looking at it in the storage browser; there is no trace of the old file.
To serve the image I am using this method:
https://developers.google.com/appengine/docs/php/googlestorage/images
getImageServingUrl() should be pulling the correct file as it is named exactly the same, but it's not. How is it still hanging on to the old image? Is the old file being cached or something?
The result of getImageServingUrl() is being echoed into the src attribute of an img tag.
Any insight would be appreciated! Thanks!
Update: I tried calling deleteImageServingUrl() before getImageServingUrl(), however the problem still persists.
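For reference, a minimal sketch of one thing worth trying, assuming the legacy CloudStorageTools API and a made-up bucket path: regenerate the serving URL after the replacement file is written, and append a cache-busting query string to the img src so the browser doesn't keep showing its cached copy.

<?php
use google\appengine\api\cloud_storage\CloudStorageTools;

$gcsPath = 'gs://my-bucket/profile-pics/user-123.jpg'; // hypothetical object path

// Drop any previously generated serving URL for this object, then request a
// fresh one after the new file has been written.
CloudStorageTools::deleteImageServingUrl($gcsPath);
$servingUrl = CloudStorageTools::getImageServingUrl($gcsPath, ['secure_url' => true]);

// A changing query string keeps the browser from reusing a stale cached copy.
echo '<img src="' . htmlspecialchars($servingUrl . '?v=' . time()) . '">';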

where to save images generated by php before moving to cloud

I have a Laravel PHP app where a user is going to upload an image. This image is going to be converted into a number of different sizes required around the application, and then each version is going to be uploaded to AWS S3.
When the user uploads the image, PHP places it in /tmp and only keeps it there until the request has completed, unless it has been moved or renamed. I am planning on pushing the job of converting and uploading the versions to a queue. What is the best way to ensure that the image stays around long enough to be converted and then uploaded to S3?
Secondly, where should I save the different versions so that I can access them to upload them to S3 and then remove them from the server (preferably automatically)?
I would create a new directory and work in it; the /tmp folder is flushed every now and then, depending on your system.
As for the different sizes, I would create separate buckets for each size, which you can access with whatever constant key you use to store the image (e.g. email, user ID, etc.).
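A rough sketch of that flow, assuming a recent Laravel, the intervention/image package (v2 API) for resizing, and a hypothetical ResizeAndUploadImage job: move the upload out of /tmp into storage/app during the request, then let the queued job read it from there, push the sizes to S3, and delete the working copy.

<?php
// In the controller, persist the upload before the request ends and queue it:
//   $path = $request->file('image')->store('pending-uploads'); // lands in storage/app
//   ResizeAndUploadImage::dispatch($path);

namespace App\Jobs;

use Illuminate\Bus\Queueable;
use Illuminate\Contracts\Queue\ShouldQueue;
use Illuminate\Foundation\Bus\Dispatchable;
use Illuminate\Queue\InteractsWithQueue;
use Illuminate\Queue\SerializesModels;
use Illuminate\Support\Facades\Storage;
use Intervention\Image\Facades\Image; // assumes intervention/image is installed

class ResizeAndUploadImage implements ShouldQueue
{
    use Dispatchable, InteractsWithQueue, Queueable, SerializesModels;

    public function __construct(private string $path) {}

    public function handle(): void
    {
        $local = Storage::disk('local')->path($this->path);

        // Generate each size and push it to S3.
        foreach ([150, 600, 1200] as $width) {
            $resized = Image::make($local)->widen($width)->encode('jpg');
            Storage::disk('s3')->put("images/w{$width}/" . basename($local), (string) $resized);
        }

        // Remove the working copy once everything is uploaded.
        Storage::disk('local')->delete($this->path);
    }
}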

Download JPEG files in batch mode keeping the path

I wish to download a lot of JPEGs, keeping their original names and paths.
Example: http://www.somesite.org/path1/image1.jpg should be downloaded to www.mysite.com/path1/image1.jpg (creating the "path1" directory if it does not exist). This is repeated many times, fetching the original download URL from a field in a database.
Is it possible? Keep in mind that I can't use cURL or wget since I am on a "limited" hosting service.
Well, first of all you have to find a method to get all the download links. There are several ways to fetch data from a database, depending on what it is (MySQL, Excel, a text file, ...).
Then you need to get these files onto your server, for example over FTP: strip the original link down to its path (remove the www.somesite.org part) and store the file under your site plus that path.
This is very manageable, but we are not here to do all the thinking for you: use Google and try some approaches first, then come back if you are facing a specific problem.
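That said, the core of it is short. A rough sketch, assuming a hypothetical images table with a source_url column and that allow_url_fopen is enabled (since cURL and wget are unavailable):

<?php
// Hypothetical setup: a PDO connection and an "images" table whose "source_url"
// column holds URLs like http://www.somesite.org/path1/image1.jpg
$pdo = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');

foreach ($pdo->query('SELECT source_url FROM images') as $row) {
    $url  = $row['source_url'];
    $path = parse_url($url, PHP_URL_PATH); // "/path1/image1.jpg"
    $dest = __DIR__ . $path;               // save under this script's directory, keeping the path

    if (!is_dir(dirname($dest))) {
        mkdir(dirname($dest), 0755, true); // create "path1/" and any missing parents
    }

    // Needs allow_url_fopen; stands in for cURL/wget on a limited host.
    file_put_contents($dest, file_get_contents($url));
}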

Looking for a flash uploader to upload large files

I need a flash uploader, to use it in my CMS project.
I need something like this, but with a greater maximum upload size (it doesn't allow uploading files larger than ini_get('upload_max_filesize')).
My server doesn't allow me to override ini settings, so I'm looking for an uploader which can upload large files independently of the ini settings.
If you want to get around the ini limit, one option would be to switch to an FTP uploader.
I used net2ftp once and it was easy enough to install; I haven't used it since (almost a year and a half), but I see from their page that the project is still updated and not dead, so you might give it a try.
You just download the package, place it in your webapp, customize it, and you're set.
You might want to create a dedicated FTP user with appropriate permissions, and not use the root one, of course.
You won't be able to post more data to the server than upload_max_filesize allows.
As a workaround you can upload the data to Amazon S3 and sync it back via s3sync.
We have a setup with plupload in place for one of our clients and are able to upload up to 2 GB per file (that's a client restriction; I don't know about S3's limits).
Mind you that S3 costs some money.
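Since plupload came up: if uploading to your own server is still the goal, plupload can also split a file into chunks so each request stays under the ini limits, with no ini changes needed. A minimal sketch of the receiving endpoint, assuming plupload's default parameter names (file, name, chunk):

<?php
// Minimal chunk receiver: each POST carries one chunk, so the per-request size
// stays below upload_max_filesize even for multi-gigabyte files.
$name   = basename($_REQUEST['name'] ?? 'upload.bin'); // original file name sent by plupload
$chunk  = (int) ($_REQUEST['chunk'] ?? 0);             // zero-based chunk index
$target = __DIR__ . '/uploads/' . $name;

$in  = fopen($_FILES['file']['tmp_name'], 'rb');
$out = fopen($target, $chunk === 0 ? 'wb' : 'ab');     // first chunk creates, later chunks append
stream_copy_to_stream($in, $out);
fclose($in);
fclose($out);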
