We inherited a Zend project which hosts all static files (CSS, JS, images, ...) on Amazon S3. The original programmer told me that deployment is automated, but when I update a CSS file (for example) on the server, the old version is still served from S3.
The same guy linked me this article: http://www.labnol.org/internet/lower-amazon-s3-bill-improve-website-loading-time/5193/, which has a section "Implement Caching for Amazon S3 Files". I added a ?v parameter to my CSS file, but it didn't help.
In the Amazon S3 console I set permissions for the user "info" (I'm logged in as info#domain.tld) on the CSS file in question to: Open/Download, View Permissions, and Edit Permissions.
Is it possible for Zend to update static files on Amazon S3, or is that bad info? I tried searching the Zend documentation, but without success.
Thanks for your help.
Zend Framework does offer a component for connecting to and working with Amazon S3: check out Zend Framework's Zend_Service_Amazon_S3.
It doesn't matter what kind of file (object) it is; it can be updated.
If you can find it, post some of the update code and maybe someone can troubleshoot it.
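For reference, this is roughly what an upload looks like with that component; it is only a sketch, not your project's actual deployment code, and the bucket name, object key, local path and credential variables are placeholders.

    <?php
    // Minimal sketch using Zend Framework 1's Zend_Service_Amazon_S3.
    // Bucket name, object key, local path and credentials are placeholders.
    require_once 'Zend/Service/Amazon/S3.php';

    $s3 = new Zend_Service_Amazon_S3($awsAccessKey, $awsSecretKey);

    // Upload (or overwrite) the CSS file and keep it publicly readable.
    $s3->putFile(
        '/path/to/public/css/site.css',   // local file on the server
        'my-bucket/css/site.css',         // "bucket/objectkey" on S3
        array(Zend_Service_Amazon_S3::S3_ACL_HEADER => Zend_Service_Amazon_S3::S3_ACL_PUBLIC_READ)
    );

If deployment really is automated, something like this should be running somewhere in the deploy process; if it never runs, the old object on S3 will never be replaced, no matter what ?v parameter you append.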
I have a big CRM system in PHP which actively works with user files and stores them in a folder in the root of the project. Now I need to change the system to save files on Amazon S3, but because of the bad architecture of the system (it's an old open-source system) I cannot just rewrite the code. So I got a slightly crazy idea: intercept all of the system's calls to one folder ("/var/www/%my_project%/uploads") and handle them in a special way. PHP should believe it is working with an ordinary folder, and file_put_contents and file_get_contents should work as usual, but behind the scenes the files should actually be served from S3. Is that possible (and how?), or is it too crazy an idea?
Thanks :)
The Amazon S3 Stream Wrapper helped me. You just need to create a client object and call its registerStreamWrapper method. That's all; now you can work with the folder "s3://yourbucketname" as if it were an ordinary folder.
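A minimal sketch of that setup with the AWS SDK for PHP; the bucket name, region and file paths below are placeholders.

    <?php
    // Sketch: register the S3 stream wrapper so s3:// paths behave like a local folder.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $client = new S3Client([
        'version' => 'latest',
        'region'  => 'eu-west-1',
    ]);
    $client->registerStreamWrapper();

    // Plain filesystem functions now work against the bucket.
    file_put_contents('s3://yourbucketname/uploads/avatar.png', $imageData);
    $data = file_get_contents('s3://yourbucketname/uploads/avatar.png');

The point of the wrapper is exactly the "crazy idea" from the question: legacy code that only knows file_put_contents and file_get_contents keeps working unchanged, as long as the path it receives points into the s3:// namespace.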
For a few years now I've been using a home-made PHP CMS/ERP based on Symfony components for my clients. Every site using this framework is hosted on "classic" Linux hosting and I use git to deploy updates. Every site has a folder that stores HTML/CSS templates, images, and downloadable files that the site admin can access to customize their site. This folder is git-ignored so my framework updates via git won't overwrite the site admin's changes.
I'm thinking about moving those sites to a cloud-based scalable environment like Heroku. As I understand it, this git-ignored folder would be reset every time I push to Heroku or a new dyno is created. How could I handle this without including this custom folder in the git repo?
I've thought about storing the HTML/CSS templates in the database and copying them to disk each time they are updated, after every git deployment or new instance creation. But that wouldn't solve the problem for images or downloadable files.
Instead of storing user-uploaded files locally, Heroku recommends putting them on an external service like Amazon S3.
You may want to use an existing library for this, e.g. KnpGaufretteBundle:
Easily use Gaufrette in your Symfony projects.
Gaufrette itself is "a PHP5 library that provides a filesystem abstraction layer". Amazon S3 is one of its supported backends.
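A rough sketch of what that could look like with Gaufrette's AwsS3 adapter; the bucket name, region and keys are placeholders.

    <?php
    // Sketch: a Gaufrette filesystem backed by S3 instead of the local, ephemeral disk.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;
    use Gaufrette\Adapter\AwsS3;
    use Gaufrette\Filesystem;

    $s3 = new S3Client(['version' => 'latest', 'region' => 'eu-west-1']);
    $filesystem = new Filesystem(new AwsS3($s3, 'my-sites-assets'));

    // Templates and uploads survive dyno restarts because they live in the bucket.
    $filesystem->write('templates/home.html.twig', $templateSource, true);
    $html = $filesystem->read('templates/home.html.twig');

Because the admin-customized folder then lives in S3 rather than on the dyno's filesystem, a git push or a new dyno no longer wipes it out.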
I am looking for a way to use Windows Azure blob storage from a CodeIgniter application.
Is it possible, and if so, how?
Thanks
Yes. It's not specific to CodeIgniter, but the Windows Azure SDK for PHP should have all the functionality you need.
It's also created by the Microsoft Windows Azure team, so it's officially supported.
Blobs
create, list, and delete containers, work with container metadata and permissions, list blobs in container
create block and page blobs (from a stream or a string), work with blob blocks and pages, delete blobs
work with blob properties, metadata, leases, snapshot a blob
REST API Version: 2011-08-18
Here's a CodeIgniter library that might help you:
https://github.com/thomasantony/codeigniter-azure/
Copy the WindowsAzure directory of the downloaded archive to your application directory structure and reference classes from your application.
https://github.com/Azure/azure-sdk-for-php
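For illustration, a small sketch of creating a block blob with that SDK; the connection string, container name and blob name are placeholders.

    <?php
    // Sketch: create a block blob with the Windows Azure SDK for PHP.
    require 'vendor/autoload.php';

    use WindowsAzure\Common\ServicesBuilder;

    $connectionString = 'DefaultEndpointsProtocol=https;AccountName=myaccount;AccountKey=mykey';
    $blobClient = ServicesBuilder::getInstance()->createBlobService($connectionString);

    // Upload a string (a stream works too) as a block blob in an existing container.
    $blobClient->createBlockBlob(
        'mycontainer',
        'images/photo.jpg',
        file_get_contents('/tmp/photo.jpg')
    );

Since this is plain PHP, it can be wrapped in a CodeIgniter library or called directly from a controller.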
I have one instance on Amazon EC2. It is a photobook application: users can sign up, upload images, and create a photobook. For that we want S3 storage. I searched a lot for a good PHP class to manipulate these folders, images, and HTML files.
Right now I have no idea how to create a folder, delete a folder, or search for a folder.
Can anyone help me find a solution for these folder/image manipulation requirements (create, search, delete)?
The official PHP SDK?
http://aws.amazon.com/sdkforphp/
http://aws.amazon.com/php/
Good question. I was just trying to do this and didn't find the answer in any documentation.
So I discovered that you just need to send the entire path in the $keyname parameter:
folder/my_file_name.jpg
The S3 service recognizes the path and creates the folders automatically.
I think that, like me, you were looking for something like a create_folder method.
Easy and tricky.
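As a rough illustration with the official AWS SDK for PHP (bucket name and paths are placeholders), a "folder" is really just a key prefix:

    <?php
    // Sketch: S3 has no real folders; a key prefix like "folder/" is enough.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

    // Creates the "folder" implicitly by uploading an object under that prefix.
    $s3->putObject([
        'Bucket'     => 'my-photobook-bucket',
        'Key'        => 'folder/my_file_name.jpg',
        'SourceFile' => '/tmp/my_file_name.jpg',
    ]);

    // "Deleting a folder" means deleting every object under the prefix.
    $s3->deleteMatchingObjects('my-photobook-bucket', 'folder/');

Searching works the same way: list the objects whose keys start with the prefix you care about.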
I'm building an app using Kohana. I need to be able to upload files directly to S3. Can you advise on an S3 helper for Kohana?
Thanks
Someone wrote a module for Kohana using the AWS PHP SDK.
You can find it here: https://github.com/jylinman/kohana-aws
I would highly recommend using the AWS PHP SDK.
It is not specific to Kohana and will work on any platform.
It is really easy to use, especially with S3.
There is minimal setup, and it is really well documented.
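For example, a minimal sketch of an upload with the SDK's upload() helper; the bucket name, key and form field name are placeholders, and since it is plain PHP it can be called from a Kohana controller like any other code.

    <?php
    // Sketch: upload a user-submitted file to S3 with the AWS SDK for PHP.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

    // upload() wraps putObject/multipart upload and sets the ACL in one call.
    $result = $s3->upload(
        'my-app-uploads',
        'uploads/' . basename($_FILES['file']['name']),
        fopen($_FILES['file']['tmp_name'], 'rb'),
        'public-read'
    );

    echo $result['ObjectURL']; // public URL of the uploaded file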