I have a big CRM system in PHP which works actively with users' files and stores them in a folder in the project root. Now I need to change the system to save files on Amazon S3, but because of the bad architecture of the system (it's an old open-source system) I cannot just rewrite the code. So I got a slightly crazy idea: intercept all calls to one folder ("/var/www/%my_project%/uploads") and handle them in a special way. PHP should believe it works with a usual folder; file_put_contents and file_get_contents should work as usual, but in fact they should be backed by code that serves the files from S3. Is that possible (and how?), or is the idea too crazy?
Thanks :)
The Amazon S3 Stream Wrapper helped me. You just need to create a client object and call its registerStreamWrapper method. That's all; now you can work with the folder "s3://yourbucketname" as with a usual folder.
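For anyone curious how this works under the hood: PHP lets you register a class to handle a custom URL scheme, which is exactly what the SDK's registerStreamWrapper does for s3://. A minimal sketch with a hypothetical in-memory mem:// scheme (no AWS dependency) shows the mechanism:

```php
<?php
// Hypothetical "mem://" scheme: an in-memory stand-in for a remote store.
// The AWS SDK registers "s3://" through the same stream-wrapper API.
class MemoryStreamWrapper
{
    public $context;                  // set by PHP when a stream context is passed
    private static $files = [];       // the "bucket": path => contents
    private $path;
    private $position = 0;
    private $buffer = '';
    private $writing = false;

    public function stream_open($path, $mode, $options, &$opened_path)
    {
        $this->path = $path;
        $this->writing = in_array($mode[0], ['w', 'a', 'x'], true);
        // Start from existing content for read/append, empty for write/create.
        $this->buffer = ($mode[0] === 'r' || $mode[0] === 'a')
            ? (self::$files[$path] ?? '')
            : '';
        return $this->writing || isset(self::$files[$path]);
    }

    public function stream_read($count)
    {
        $chunk = substr($this->buffer, $this->position, $count);
        $this->position += strlen($chunk);
        return $chunk;
    }

    public function stream_write($data)
    {
        $this->buffer .= $data;
        return strlen($data);
    }

    public function stream_eof()
    {
        return $this->position >= strlen($this->buffer);
    }

    public function stream_close()
    {
        if ($this->writing) {
            // This is where a real wrapper would upload to S3.
            self::$files[$this->path] = $this->buffer;
        }
    }

    public function stream_stat()
    {
        return [
            'dev' => 0, 'ino' => 0, 'mode' => 0100644, 'nlink' => 1,
            'uid' => 0, 'gid' => 0, 'rdev' => 0,
            'size' => strlen($this->buffer),
            'atime' => 0, 'mtime' => 0, 'ctime' => 0,
            'blksize' => -1, 'blocks' => -1,
        ];
    }
}

stream_wrapper_register('mem', MemoryStreamWrapper::class);

file_put_contents('mem://uploads/test.txt', 'hello');
echo file_get_contents('mem://uploads/test.txt'); // prints "hello"
```

The SDK's wrapper implements this same interface (plus directory listing, seeking, and stat against real objects) on top of S3 API calls, which is why file_put_contents and file_get_contents just work after registration.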
Related
I have four servers, and on one of them I have installed WordPress for selling digital files. For security reasons I don't want to keep the files on the same server where WordPress is installed. I want to move the "Uploads" folder to the other three servers and connect them to the WordPress core, so that whenever I upload something via WordPress it is transferred to the second or third server based on the file format. How is such a thing possible?
P.S.: Unfortunately I can't use Amazon S3.
This was an interesting answer:
https://wordpress.stackexchange.com/questions/74180/upload-images-to-remote-server/78129#78129
It is a bit outdated, but reportedly the code still does the trick. However, to do exactly what you need, some modifications are required.
Hopefully this will be helpful to you. Have a nice day!
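Whichever WordPress hook ends up doing the transfer (the linked answer hooks into the upload handling), the routing decision itself is a small pure function. A sketch, with made-up server URLs and an arbitrary extension-to-server mapping:

```php
<?php
// Pick a destination server by file extension. The hostnames and the
// extension groups below are placeholders; adjust them to your setup.
function pick_remote_server(string $filename): string
{
    $ext = strtolower(pathinfo($filename, PATHINFO_EXTENSION));

    $routes = [
        'https://files2.example.com' => ['jpg', 'jpeg', 'png', 'gif'],
        'https://files3.example.com' => ['mp3', 'wav', 'flac'],
    ];
    foreach ($routes as $server => $extensions) {
        if (in_array($ext, $extensions, true)) {
            return $server;
        }
    }
    return 'https://files4.example.com'; // everything else
}

echo pick_remote_server('cover.PNG'); // prints "https://files2.example.com"
```

Inside the upload hook you would then push the file to the chosen server (e.g. over FTP or SFTP, as in the linked answer) and rewrite the attachment URL accordingly.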
I'm currently working on a project which involves:
Downloading a zip file from an API (Bing Ads, to be specific).
Unzipping the file (to get a csv), editing it and making database queries.
Re-zipping the file and uploading it back to the API using a service similar to the one used to download the zip file in the first place.
I'm restricted by the client libraries of the API to write the project in PHP. I've already written the project to run locally, and hence store the files on my hard disk.
However, I want to have the whole process running online. I've tried Google App Engine, but the zip archive class doesn't seem to work there (although it does work locally).
I don't have much experience with putting apps on-line and I was wondering if anyone can point me in the right direction.
Thanks!
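For what it's worth, the zip → edit CSV → re-zip cycle itself needs nothing beyond ZipArchive and a writable temp directory; sandboxed platforms like App Engine typically only allow writes to designated temp storage, so hard-coded local paths are a common culprit. A self-contained sketch (the CSV content is invented):

```php
<?php
// Zip -> edit CSV -> re-zip, using only ZipArchive and the system temp
// directory (the one place sandboxed hosts usually let you write to).
$tmp = sys_get_temp_dir();
$zipPath = $tmp . '/report.zip';

// Simulate the downloaded archive: one CSV inside a zip.
$zip = new ZipArchive();
$zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);
$zip->addFromString('report.csv', "id,clicks\n1,10\n");
$zip->close();

// Extract the CSV, edit it, and overwrite the entry in place.
$zip = new ZipArchive();
$zip->open($zipPath);
$csv = $zip->getFromName('report.csv');
$csv .= "2,20\n";                        // the "edit" step
$zip->addFromString('report.csv', $csv); // replaces the existing entry
$zip->close();

// Verify the round trip before uploading back to the API.
$zip = new ZipArchive();
$zip->open($zipPath);
echo $zip->getFromName('report.csv');
$zip->close();
```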
All the tutorials on optimizing website assets with Grunt and grunt-usemin are based on the src -> dist deploy strategy: they process the items in your src folder and compile them into the dist folder.
But PHP doesn't work this way. It isn't compiled and "distributed"; the source and the target page are the same file, which makes the process destructive.
How do you work around this? Any tips on using Grunt in PHP project in general?
Thx.
We do something similar. We created an index.src.html with the non-minified sources and generate an index.html from it using usemin. Then a "proxy" PHP file opens index.html if it exists (which means we are in the production environment and the assets are built), and otherwise opens index.src.html (the assets haven't been built, or we are in a development environment). The main idea is to separate the parts that matter to usemin into a different file.
The usemin plugin is for preparing static assets (concatenation, minification, ...). Static means the server gives the same content to everybody; this constraint doesn't apply to PHP...
If you would like to use these tools to optimize the assets of your PHP-generated pages, you should create input files that usemin can parse. For example, you can collect the JavaScript/CSS file references into a template or a separate PHP/HTML file, include/use that file where you need it, and after the build use the usemin-processed version of it.
The ideal approach would take advantage of the dynamic nature of PHP to make the "distribution" URL replacement. The process would be:
Your build tool creates some kind of manifest, a machine-readable file listing the changes applied. Some tools, such as gulp-rev, will do this for you.
Read the manifest from PHP and replace the resource URLs with the final ones.
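A sketch of that second step, assuming a gulp-rev style rev-manifest.json (the file names and the /assets/ prefix below are invented):

```php
<?php
// Resolve asset URLs through a rev-manifest.json produced by the build
// tool (e.g. gulp-rev), falling back to the original name in development.
function asset_url(string $path, string $manifestFile): string
{
    static $manifest = null;
    if ($manifest === null) {
        $manifest = file_exists($manifestFile)
            ? json_decode(file_get_contents($manifestFile), true)
            : [];
    }
    return '/assets/' . ($manifest[$path] ?? $path);
}

// Simulate a manifest written by the build.
$manifestFile = sys_get_temp_dir() . '/rev-manifest.json';
file_put_contents($manifestFile, json_encode([
    'app.js'  => 'app-d41d8cd9.js',
    'app.css' => 'app-a1b2c3d4.css',
]));

echo asset_url('app.js', $manifestFile), "\n";     // prints "/assets/app-d41d8cd9.js"
echo asset_url('missing.js', $manifestFile), "\n"; // prints "/assets/missing.js"
```

In your templates you then emit `<script src="<?= asset_url('app.js', $manifestFile) ?>">` instead of a hard-coded path, and the production build is picked up automatically.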
I have a PHP framework. Recently I've put some of the burden on Node.js's shoulders. Now I don't know where to put the Node.js files. For now I have one file, but it'll get bloated in the near future. Where do you suggest putting these files?
Thanks in advance.
EDIT:
I'm asking to find out whether putting Node.js files inside the PHP framework is best practice, or whether we should put all Node.js files in the www folder without caring about our PHP framework at all.
You shouldn't put your Node.js code in any web-accessible folder (like www): this is server code, and it shouldn't be reachable from the web.
Your best bet is to make a separate directory to host your Node.js application, outside www or any other web-accessible directory.
There may be a folder named js, javascript, or scripts somewhere in your root folder; you could put the file there.
I have one instance on Amazon EC2. It is a photobook application: users can sign up, upload images, and create a photobook. For that we want S3 storage. I have searched a lot for a good PHP class to manipulate these folders, images, and HTML files.
Now I have no idea how to create a folder, delete a folder, or search for a folder.
Can anyone help me find a solution for these folder/image manipulation requirements (create, search, delete)?
The official PHP SDK?
http://aws.amazon.com/sdkforphp/
http://aws.amazon.com/php/
Good question. I was trying to do this just now and didn't find the answer in any documentation.
So I discovered that you just need to send the entire path in the $keyname parameter:
folder/my_file_name.jpg
S3 recognizes the path and creates the "folders" automatically.
I think you, like me, were looking for a create_folder-like method.
Easy and tricky.
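To spell this out: S3 has no real directories; a "folder" is just a shared key prefix, and create/search/delete all reduce to prefix operations. A plain-PHP sketch of the semantics, with the corresponding aws/aws-sdk-php S3Client calls noted in comments (bucket and key names are made up):

```php
<?php
// A "bucket" modeled as a flat map of key => body, which is exactly how
// S3 stores objects: slashes in keys are just ordinary characters.
$bucket = [
    // "Creating a folder" is simply uploading a key that contains slashes.
    // Real call: $s3->putObject(['Bucket' => $name, 'Key' => 'albums/summer/img1.jpg', 'Body' => $data]);
    'albums/summer/img1.jpg' => 'jpeg-bytes',
    'albums/summer/img2.jpg' => 'jpeg-bytes',
    'albums/winter/img3.jpg' => 'jpeg-bytes',
];

// "Searching a folder" = listing keys under a prefix.
// Real call: $s3->listObjectsV2(['Bucket' => $name, 'Prefix' => 'albums/summer/']);
function list_prefix(array $bucket, string $prefix): array
{
    return array_values(array_filter(
        array_keys($bucket),
        function ($key) use ($prefix) { return strpos($key, $prefix) === 0; }
    ));
}

// "Deleting a folder" = deleting every key under the prefix.
// Real helper: $s3->deleteMatchingObjects($name, 'albums/winter/');
function delete_prefix(array &$bucket, string $prefix): void
{
    foreach (list_prefix($bucket, $prefix) as $key) {
        unset($bucket[$key]);
    }
}

print_r(list_prefix($bucket, 'albums/summer/')); // the two summer keys
delete_prefix($bucket, 'albums/winter/');
print_r(array_keys($bucket));                    // only the summer keys remain
```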