We have six application servers (behind LVS), and requests are distributed randomly across all of them.
Around 8 years back we used to store images in databases.
Pros : Can be accessed from all application servers.
Cons : Slow
Around 5 years back, we shifted to storing images as files on one of the six application servers, with nginx rules that make sure all image read/write requests go to that single server.
Pros : Fast
Cons : All image read/write requests go to a single server.
Question: Is there a better way to store images that solves the following issues:
1. Can be accessed from all application servers.
2. Fast access
Note : We move images to a common image server after some time.
We don't move them instantly because we don't want to rely on that server, and it would also increase user upload time.
You can leverage the power of Content Delivery Networks (CDNs) and storage buckets provided by services like AWS.
Upload all images to a single location, say an AWS S3 bucket (https://aws.amazon.com/s3/), which gives you a central store for all images that is accessible from all application servers.
You can then link your S3 bucket with a CDN service like AWS's CloudFront, or one of the free services like Cloudflare.
https://aws.amazon.com/cloudfront/
Read more about how to use S3 for PHP here:
https://devcenter.heroku.com/articles/s3
http://docs.aws.amazon.com/AmazonS3/latest/dev/RetrieveObjSingleOpPHP.html
Read more about linking an S3 bucket to cloudfront here:
http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/MigrateS3ToCloudFront.html
So AWS's S3 will give you globally accessible images, and the CloudFront CDN will give you excellent speeds.
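Roughly, the flow looks like this (a minimal sketch, assuming the AWS SDK for PHP v3 installed via Composer; the bucket name, region, and CloudFront domain below are placeholders):

    <?php
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    // Client for the region your bucket lives in (assumption: us-east-1).
    $s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

    $bucket = 'my-image-bucket';                                // hypothetical bucket
    $key    = 'uploads/' . basename($_FILES['image']['name']);

    // Write the uploaded image to S3 instead of one application server's disk.
    $s3->putObject([
        'Bucket'      => $bucket,
        'Key'         => $key,
        'SourceFile'  => $_FILES['image']['tmp_name'],
        'ContentType' => $_FILES['image']['type'],
    ]);

    // Serve reads through the CloudFront distribution in front of the bucket.
    $imageUrl = 'https://d1234example.cloudfront.net/' . $key;
    echo '<img src="' . htmlspecialchars($imageUrl) . '">';

Because every application server talks to the same bucket, the upload is immediately visible everywhere, and CloudFront handles the read traffic at the edge.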
I have image upload functionality on my website; I upload images directly to an S3 bucket. I made that bucket public, and on my website I fetch the images by using the public S3 links as the src of HTML img tags.
Users of my website will take these images, possibly as links, and use them on their websites. I can see several problems here:
the image links are direct links to AWS
the whole bucket is public
there will be a lot of unpredictable reads
It seems better to use a CDN here (probably CloudFront). How do I integrate a CDN into this process, though?
Can I upload images directly into the CDN without storing them in S3? Why would I need S3 if I have a CDN?
Any suggestions? Thanks a lot!
S3 is still required, since Cloudfront uses S3 as its origin. S3 remains the data store, but Cloudfront caches these objects at the edge.
Check out this walkthrough on adding a Cloudfront distribution to cache your static files.
This will allow you to use your own URL for the static content, as well as restrict direct access to S3 via an Origin Access Identity.
As already answered, CloudFront needs an 'origin', and oftentimes that origin is S3, so you can't use just CloudFront; the images need to exist somewhere.
One precaution you could take (it's not clear from your question whether you already do; you said the links are 'direct links to aws') is to serve all of these images under your own domain name, either directly from S3 or with CloudFront. Doing that now ensures that if you ever need to switch where you store the images, your customers won't need to change their links and the process will be seamless for them.
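As a minimal sketch of that precaution (images.example.com is a hypothetical CNAME you control, pointed at CloudFront or the bucket):

    <?php
    // Build every image URL from one configurable base, so the storage backend
    // can change later without breaking links your users have copied.
    const IMAGE_BASE_URL = 'https://images.example.com';   // placeholder domain

    function imageUrl(string $key): string
    {
        return IMAGE_BASE_URL . '/' . ltrim($key, '/');
    }

    echo '<img src="' . htmlspecialchars(imageUrl('photos/453485.jpg')) . '">';

If you later move from public S3 links to a CloudFront distribution (or anywhere else), only IMAGE_BASE_URL changes.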
I have a web page on a web host, and the images are stored on Amazon S3. Using PHP, I want to be able to download multiple images from Amazon S3 through my web page as a zip file.
What are my options and what is the best?
As far as I know, it is not possible to compress files on S3. Can I use AWS Lambda?
The best solution I've come across:
1. The user selects on my website which images they want to download.
2. I get the file names from my database on my web host and download the images from S3 to a temporary directory on my web host.
3. A zip file is created in the temporary directory and a link is sent to the user.
4. After a certain time, I clean up the temporary directory with a script on my web host.
But it would be great if there were a way to create and download the zip file without going through my hosting.
AWS S3 is "basic building blocks", so it doesn't support a feature like zipping multiple objects together.
You've come up with a good method to do it, though you could stream the objects into a zip file rather than downloading them. EC2 instances can do this very quickly because they tend to have fast connections to S3.
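A rough sketch of that streaming variant (AWS SDK for PHP v3 assumed; the bucket name and keys are placeholders), which skips the per-file download to disk and adds each object's body straight into the archive:

    <?php
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

    $bucket = 'my-image-bucket';                     // hypothetical bucket
    $keys   = ['photos/a.jpg', 'photos/b.jpg'];      // whatever the user selected

    $zipPath = tempnam(sys_get_temp_dir(), 'zip');
    $zip = new ZipArchive();
    $zip->open($zipPath, ZipArchive::CREATE | ZipArchive::OVERWRITE);

    foreach ($keys as $key) {
        // Pull the object body from S3 and add it to the archive in memory,
        // instead of saving each image to a temporary file first.
        $object = $s3->getObject(['Bucket' => $bucket, 'Key' => $key]);
        $zip->addFromString(basename($key), (string) $object['Body']);
    }

    $zip->close();

    // Send the finished archive to the browser, then remove it.
    header('Content-Type: application/zip');
    header('Content-Disposition: attachment; filename="images.zip"');
    readfile($zipPath);
    unlink($zipPath);

It still passes through your host, but only once per request, and it avoids leaving loose image files in a temp directory to clean up later.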
Lambda doesn't really fit here: its usual S3 integration is triggered when an object is placed into an S3 bucket, and you are doing the opposite.
I am currently writing an application using the Yii framework in PHP that stores a large number of files uploaded by the users of the application. Since the number of files is ever increasing, I decided it would be beneficial to use Amazon S3 to store these files; when requested, the server could retrieve a file and send it to the user. (The server is an EC2 instance in the same zone.)
Since the files are all confidential, the server has to verify the identity of the user and their credentials before allowing the user to receive the file. Is there a way to send the file to the user in this case directly from S3, or do I have to pull the data to the server first and then serve it to the user?
If so, is there any way to cache recently uploaded files on the local server so that it does not have to go to S3 to look for the file? In most cases, the most recently uploaded files will be requested repeatedly by multiple clients.
Any help would be greatly appreciated!
Authenticated clients can download files directly from S3 by signing the appropriate URLs on the server prior to displaying the page/urls to the client.
For more information, see: http://s3.amazonaws.com/doc/s3-developer-guide/RESTAuthentication.html
Note that for confidential files you may also want to consider server-side/client-side encryption. Finally, for static files (such as images) you may want to set the appropriate cache headers as well.
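For illustration, a small sketch of pre-signing a download URL on the server, after your application has checked the user's credentials (assumes the AWS SDK for PHP v3; the bucket and key are placeholders):

    <?php
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

    // Describe the object the authenticated user is allowed to fetch.
    $command = $s3->getCommand('GetObject', [
        'Bucket' => 'my-private-files',       // hypothetical bucket
        'Key'    => 'users/42/report.pdf',    // hypothetical key
    ]);

    // The URL is only valid for a short window; the file bytes never
    // have to pass through your EC2 instance.
    $request   = $s3->createPresignedRequest($command, '+10 minutes');
    $signedUrl = (string) $request->getUri();

    header('Location: ' . $signedUrl);   // or render it as a download link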
Use AWS CloudFront to serve these static files. Rather than sending the files to the user, send them links to the files. The links need to be CloudFront links, not direct links to the S3 bucket.
This has the benefit of keeping load low on your server as well as caching files close to your users for better performance.
More details here: Serving Private Content through CloudFront
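A hedged sketch of the CloudFront variant, assuming the AWS SDK for PHP v3 and a CloudFront key pair you have already set up (the distribution domain, key path, and key pair ID below are placeholders):

    <?php
    require 'vendor/autoload.php';

    use Aws\CloudFront\CloudFrontClient;

    $cloudFront = new CloudFrontClient(['version' => 'latest', 'region' => 'us-east-1']);

    // Sign a CloudFront URL so only this user can fetch the file, briefly.
    $signedUrl = $cloudFront->getSignedUrl([
        'url'         => 'https://d1234example.cloudfront.net/users/42/report.pdf',
        'expires'     => time() + 300,                  // valid for 5 minutes
        'private_key' => '/path/to/cloudfront-private-key.pem',
        'key_pair_id' => 'APKAEXAMPLEKEYID',
    ]);

    echo '<a href="' . htmlspecialchars($signedUrl) . '">Download</a>';

CloudFront then caches the object close to your users, which also helps with the case where recently uploaded files are requested repeatedly.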
I'm looking for some quick info about best practices for storing users' uploaded files on different servers or subdomains...
For example, photos on Facebook of course aren't on facebook.com/files/users/453485 etc...
but rather on photos.ak.fbcdn.net or whatever...
I'm wondering how, with PHP, I can upload to a different server while maintaining a MySQL connection to my original one... is it possible?
Facebook uses a content delivery network (CDN, hence fbcdn, i.e. Facebook content delivery network) and probably uses web services to pass binary data (photos) from server to server.
Rackspace Cloud offers a similar service. Here is an example application of their PHP library to access their webservice api: http://cloudfiles.rackspacecloud.com/index.php/Sample_PHP_Application
I'm going to make the assumption that you have multiple webservers, and want to be able to access the same set of files on each one. In that case, some sort of shared storage that each machine can access might be a good place to start.
Here are a couple options I've used:
Shared NFS Volume [ http://en.wikipedia.org/wiki/Network_File_System_(protocol) ]
MogileFS [ http://www.danga.com/mogilefs/ ]
Amazon S3 [ http://aws.amazon.com/s3/ ]
If you don't have control over the hardware or aren't able to install extra software, I'd suggest Amazon S3. There is an API that you can use to shuttle files back and forth. The only downside is that you don't get to use storage you may already have, and it will cost you some money.
If you do have access to the hardware and software, MogileFS is somewhat like S3 in that you have an API to access the files, but it differs in that you get to use your existing storage, and at no additional cost.
NFS is a typical place where people start, because it's the simplest way to get going. The downside is that you'll have to be able to configure servers and set up an NFS volume for them to mount.
But if I were starting a high-volume photo hosting service, I'd use S3, and I'd put a CDN like Akamai in front of it.
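If you go the S3 route, the PHP side can treat the bucket almost like a shared filesystem via the SDK's stream wrapper (a sketch assuming the AWS SDK for PHP v3; the bucket name is a placeholder). Your MySQL connection to the original server is untouched; only the file bytes go elsewhere:

    <?php
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;

    $s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);
    $s3->registerStreamWrapper();   // makes s3:// paths work with normal file functions

    // Handle an upload on any of your web servers...
    $key = 's3://my-shared-files/users/453485/' . basename($_FILES['photo']['name']);
    file_put_contents($key, fopen($_FILES['photo']['tmp_name'], 'rb'));

    // ...and read it back from any other server the same way.
    $data = file_get_contents($key);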
I've got a media-rich site that's growing just beyond what our server can handle in terms of storage. I estimate between 500 GB and 2 TB.
The media is uploaded through the website, usually 500 KB to 30 MB at a time, and consists of videos and photos users have uploaded.
Using the PHP FTP functions, the media is then copied from the temp directory into the media directory.
I'm looking for the best way to handle storing the file after the user has uploaded it.
EDIT
I have a cloud computing account with Mosso, and all our sites are hosted on dedicated boxes with Rackspace (traditional). My question applies to the actual process of getting media into the site the way it currently works, and then what to do next...
Try Rackspace Mosso or Amazon S3 with CloudFront.
Also think about using a content delivery network to speed up your visitors' load times.
Both have an Application Programming Interface (API) that can be used from your internal systems for uploading new content.
Twitter uses Amazon S3 for user avatars.