I've spent a few days trying to figure out how to set up AWS S3 as external storage for ResourceSpace, and I've been getting more and more confused by this app.
I'm using the open-source version and trying to customize it to my needs.
I've been through the web app's lengthy documentation but couldn't find anything about configuring storage (the way other web apps do). I did find a feature called syncdir, which sets an alternative external location for backups, but not as primary external storage. From the documentation, there doesn't seem to be a direct way to specify a storage backend or integrate S3 with it.
I've tried the following:
I tried the usual approach for integrating AWS S3 into any PHP website: I changed the 'storagedir' and 'syncdir' paths in the config.default file, required the S3 autoload file, and added my AWS keys to the config. But it's not working; the site is still storing files locally. It looked roughly like the sketch below.
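Roughly what my config changes looked like (a sketch from memory; paths, bucket name, and keys are placeholders, and ResourceSpace doesn't document any of these settings as S3-aware, which is presumably why it has no effect):

<?php
// config.default.php overrides (placeholders throughout)
$storagedir = "s3://my-resourcespace-bucket/filestore"; // originally a local filestore path
$syncdir    = "/var/www/resourcespace/syncdir";         // the backup/alternative location

// AWS SDK bootstrap I added by hand
require_once __DIR__ . '/lib/aws/aws-autoloader.php';
$aws_key    = "AKIA..............";   // placeholder
$aws_secret = "....................";  // placeholder
$aws_region = "eu-west-1";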
Note: I've successfully integrated AWS S3 with the Laravel 5.7 and CodeIgniter 3 frameworks before.
I tried adding the AWS autoload require to the file containing the upload functions, and looked for the code responsible for uploading, but the code is confusing and I can't tell where the upload functionality lives (it's not a plain PHP function where $_FILES receives your upload).
I also moved the AWS autoload require into include/general.php, but no luck.
Followed up with some forums on the matter like:
using external storage
Amazon S3 integration
My assumption was that with the AWS credentials in the config file and the storage path set to an S3 bucket URL, including the AWS autoload in the general/upload file would be enough for the app to work out where it should upload. But it doesn't, and no error or warning is reported to point me at the problem.
But most of what I found relates to the paid version of the DAM system, where it seems to be set up on Amazon already.
Please advise; any help is appreciated.
I'm using WAMP on a Windows 10 PC, by the way.
Check out this discussion; it might help you:
https://groups.google.com/forum/#!topic/resourcespace/JT833klfwjc
It looks like S3 support is still a work in progress, so you can look at the WIP code; you'll find links to it in the thread above.
I'm stuck on the best way to handle large file uploads and send them to a third-party API. Any pointers on a good solution would be very welcome. Thank you in advance.
The end goal is to send video files to this API - https://docs.bunny.net/reference/manage-videos#video_uploadvideo. The complication is that the files are often large - up to 5GB in size.
I have an existing website built in PHP7 that runs on a LAMP setup on Amazon Lightsail and I want to add a feature for users to upload video files.
Currently I'm uploading the files directly to Amazon S3 using a pre-signed URL. This part is working fine.
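For context, the pre-signed URL part is generated server-side with the AWS SDK for PHP v3, roughly like this sketch (bucket, key, and region are placeholders):

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'eu-west-1', 'version' => 'latest']);

// Pre-sign a PUT so the browser can upload straight to S3.
$cmd = $s3->getCommand('PutObject', [
    'Bucket' => 'my-upload-bucket',                 // placeholder
    'Key'    => 'videos/raw-' . uniqid() . '.mp4',  // placeholder naming scheme
]);
$request = $s3->createPresignedRequest($cmd, '+30 minutes');

echo (string) $request->getUri();  // hand this URL to the client-side uploader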
But I need to send the files to the API mentioned above. This is where I get stuck!
I think there are two options to explore: (1) find a way to upload directly to the API and skip the S3 upload, or (2) continue uploading to S3 first and then transfer to the API. But I'm not sure if option 1 is even possible, or how to do option 2!
With option 1, I'm wondering if there's a way to upload the files from the users directly to the API. If I do this using the regular HTML form upload, then the files are stored temporarily on my server before I can use cURL through PHP to transfer them to the API. This is really time consuming and feels very inefficient. But I don't know how else to send the files to the API without them first being on my server. Maybe there's an option here that I don't know about!
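If the file does have to pass through the server, one mitigation is at least to stream it to the API rather than reading it into memory. A sketch of that transfer step, based on my reading of the linked Bunny docs (the endpoint URL, AccessKey header, and the $libraryId/$videoId/$apiKey values are assumptions to verify against the docs; the video object must be created via their API first):

<?php
$tmpPath = $_FILES['video']['tmp_name'];
$fh = fopen($tmpPath, 'rb');

$ch = curl_init("https://video.bunnycdn.com/library/{$libraryId}/videos/{$videoId}");
curl_setopt_array($ch, [
    CURLOPT_UPLOAD         => true,               // PUT with a streamed body
    CURLOPT_INFILE         => $fh,                // cURL reads the file in chunks
    CURLOPT_INFILESIZE     => filesize($tmpPath),
    CURLOPT_HTTPHEADER     => ["AccessKey: {$apiKey}"],
    CURLOPT_RETURNTRANSFER => true,
]);
$response = curl_exec($ch);
fclose($fh);

Even so, a 5GB form upload means raising upload_max_filesize and post_max_size and tying up a PHP worker for the whole transfer, which is part of why the S3 route is attractive.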
With option 2, I can already upload large files directly to S3 with pre-signed URLs and this process seems to run fine. But I don't know how I would then send the file from S3 to the API. I can use an S3 trigger on new files. But when I looked at Lambda, they have a tiny file size limit. Because my site is hosted on Lightsail, I noticed they have a container option. But I don't know if that can be used for this purpose and if so, how.
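One way to do the S3-to-API transfer without Lambda is to run it on the Lightsail box itself (e.g. from a cron job or queue worker) and stream straight out of S3, so the full file never needs to exist on local disk. A sketch using the AWS SDK's stream wrapper (bucket/key and the Bunny endpoint are placeholders, as above):

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client(['region' => 'eu-west-1', 'version' => 'latest']);
$s3->registerStreamWrapper();            // makes s3:// paths usable with fopen()

$src  = 's3://my-upload-bucket/videos/raw.mp4';
$size = filesize($src);                  // stat via the wrapper (HEAD request)
$fh   = fopen($src, 'r');

$ch = curl_init("https://video.bunnycdn.com/library/{$libraryId}/videos/{$videoId}");
curl_setopt_array($ch, [
    CURLOPT_UPLOAD         => true,
    CURLOPT_INFILE         => $fh,       // cURL pulls from S3 as it pushes to the API
    CURLOPT_INFILESIZE     => $size,
    CURLOPT_HTTPHEADER     => ["AccessKey: {$apiKey}"],
    CURLOPT_RETURNTRANSFER => true,
]);
$response = curl_exec($ch);
fclose($fh);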
Basically, I'm not sure what solution is best, nor how to proceed with that. And maybe there's an option 3 that I'm not aware of!
I would welcome your input.
Many thanks in advance.
This strikes me as something rather mundane, but so far I cannot find any reference to how I can back up my AWS-hosted application to my AWS S3 bucket automatically on a regular basis.
Long story short, I have an AWS VPS with MySQL and WordPress installed, and I'm now looking to do weekly/monthly backups of the site, including the DB. This seems possible via a WP plugin called UpdraftPlus, which can create backups and push them to S3 buckets, but given that this is a rather simple feature, I'm quite sure Amazon must already have a solution for it; I just can't figure out how to do it.
I run both a WP site and a Laravel application, if that matters, so I could rely on either to assist with the backup.
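For scale, the DIY version I'm trying to avoid writing would be a cron-driven script along these lines (a sketch only; database name, paths, and bucket are placeholders, and the DB password should live in a MySQL option file rather than on the command line):

<?php
require 'vendor/autoload.php';

use Aws\S3\S3Client;

// Dump and compress the database.
$dumpFile = '/tmp/backup-' . date('Y-m-d') . '.sql.gz';
exec('mysqldump --defaults-extra-file=/home/backup/.my.cnf wordpress | gzip > '
    . escapeshellarg($dumpFile));

// Push the dump to S3.
$s3 = new S3Client(['region' => 'eu-west-1', 'version' => 'latest']);
$s3->putObject([
    'Bucket'     => 'my-backup-bucket',
    'Key'        => 'db/' . basename($dumpFile),
    'SourceFile' => $dumpFile,
]);
unlink($dumpFile);

Scheduled weekly with something like 0 3 * * 0 php /home/backup/backup.php in crontab, with the site files tarred and uploaded the same way.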
Thanks for any advice on this.
I use this code
move_uploaded_file($file_tmp, $us_id . ".jpg");
but after running this script there is no error, yet the file doesn't appear in the folder. How can I fix this?
Before this, I tested it on localhost and it worked.
You haven't specified what $file_tmp contains, but... in an Azure Web App, the root folder is at d:\home\site\wwwroot. And you'll find the %HOME% environment variable set to d:\home.
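So one likely fix is to build an absolute destination path from that variable instead of relying on the working directory. A minimal sketch, assuming a hypothetical uploadimg folder under wwwroot that already exists:

// %HOME% resolves to d:\home on Azure Web Apps; forward slashes work fine in PHP.
$target = getenv('HOME') . '/site/wwwroot/uploadimg/' . $us_id . '.jpg';

if (move_uploaded_file($file_tmp, $target)) {
    echo 'Saved to ' . $target;
} else {
    var_dump(error_get_last());  // surface the real failure instead of silence
}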
Edited based on @David's comments and Kudu's Azure runtime environment.
In a Cloud environment, saving files to the current filesystem of your Web App is not advised. If you simply upload your site through FTP, you might not have issues, but if you rely on Continuous Integration or automated deployment scenarios, your site folder might change or have content overwritten.
That is why, for files that need to be accessed in the future or saved permanently, you might want to use something like Azure Blob Storage. Not only is it really cheap, but you can also put a CDN in front of it to improve file delivery.
Here is how to use it from PHP, with the Azure SDK for PHP.
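A minimal sketch of the upload path with the Azure Storage Blob SDK for PHP (installed via Composer; the connection string, container name, and form field are placeholders):

<?php
require 'vendor/autoload.php';

use MicrosoftAzure\Storage\Blob\BlobRestProxy;

$connectionString = 'DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...';
$blobClient = BlobRestProxy::createBlobService($connectionString);

// Stream the uploaded temp file into a block blob instead of the local disk.
$content = fopen($_FILES['image']['tmp_name'], 'rb');
$blobClient->createBlockBlob('uploads', $us_id . '.jpg', $content);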
I used your code from "Azure window sever can't upload file from php" in my test project, and it works perfectly on my side. Even when I don't give $us_id a value, the picture is still uploaded to the uploadimg folder with the name ".jpg".
I suspect your Web App's disk space may have reached the limit of your App Service plan.
Every App Service pricing tier has a disk space limit, which is shared across all the web apps in that App Service plan. You can check the metric on the dashboard page of your web app in the portal.
You can refer to https://azure.microsoft.com/en-us/pricing/details/app-service/ for details.
I have a WordPress instance on Amazon Elastic Beanstalk. When I deploy with the EB scripts, the whole application is replaced, including uploaded images that are attached to posts. After such an automatic deploy, posts have missing pictures :)
I tried to solve this:
1) I logged into the Amazon machine with SFTP, but my user ec2-user has only read access to the files, so I was not able to overwrite just part of the application while retaining the uploaded files.
2) I read that I can use Amazon S3 as external storage for uploaded files. I haven't tested this yet :) Do you know if this is a good approach?
3) Is there any other approach to this problem? How should it be organized on Amazon: should machine backups be set up?
The Elastic Beanstalk environment is essentially stateless, meaning that all data persisted to disk will be lost when the application is updated, the server is rebuilt, or the environment scales.
The best way, in my opinion, is to use a plugin that writes all media files to AWS S3, something similar to the Amazon S3 and CloudFront plugin.
Your log files should also be shipped to a remote syslog server, which you can either build yourself or get from a third party.
Google: loggly, logstash, graylog, splunk
I am creating a PHP application in Google App Engine. Since I would have to pay for Cloud SQL, I decided to use files instead of the SQL database and read/write from them. The problem is that I am not able to create or modify the existing files using PHP's standard file operations like fopen and fwrite. When I run the same code on localhost:8080 it works properly, but not in the deployed version on the internet; I get errors while creating/writing to the files. Please help. Thank you.
You should read the App Engine PHP runtime docs: https://developers.google.com/appengine/docs/php/#PHP_The_sandbox
Specifically, the section about the sandbox. The crucial statement:
An App Engine application cannot: write to the filesystem. PHP applications can use Google Cloud Storage for storing persistent files. Reading from the filesystem is allowed, and all application files uploaded with the application are available.
This was a while ago, but if you still want to write to the file system, you can write to Google Cloud Storage:
https://cloud.google.com/appengine/docs/php/googlestorage/#writing_files_from_your_app
You need to add the bucket to your php.ini file:
google_app_engine.allow_include_gs_buckets="yourapp.appspot.com"
and make sure your service account has permission to edit the Cloud Storage bucket.
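Once that is in place, writes go through the gs:// stream wrapper with the ordinary filesystem functions. A small sketch (bucket and file names are placeholders):

<?php
$bucket = 'gs://yourapp.appspot.com';

file_put_contents($bucket . '/data/notes.txt', "hello\n");   // create/overwrite in GCS
$contents = file_get_contents($bucket . '/data/notes.txt');  // read it back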
HTH