I've got a media-rich site that's growing just beyond what our server can handle in terms of storage; I estimate between 500 GB and 2 TB.
The media is uploaded through the website, usually 500 KB to 30 MB at a time, and consists of videos and photos that users have uploaded.
Using the PHP FTP functions, the media is then copied from the temp directory into the media directory.
I'm looking for the best way to handle storing the file after the user has uploaded it.
EDIT
I have a cloud computing account with Mosso, and all our sites are hosted on dedicated boxes with Rackspace (traditional). My question applies to the actual process of getting media into the site the way it currently works, and then what to do next...
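For reference, a minimal sketch of the upload flow described above (host, credentials, and paths are placeholders):

```php
<?php
// Sketch of the current flow: receive the upload, then push it over FTP
// from PHP's temp directory into the media directory.
// Host, credentials, and paths are placeholders.
$conn = ftp_connect('media.example.com');
ftp_login($conn, 'ftpuser', 'ftppass');
ftp_pasv($conn, true);

$tmpPath  = $_FILES['media']['tmp_name'];
$destPath = '/media/' . basename($_FILES['media']['name']);

if (!ftp_put($conn, $destPath, $tmpPath, FTP_BINARY)) {
    error_log("FTP upload failed for $destPath");
}
ftp_close($conn);
```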
Try Rackspace Mosso or Amazon S3 with CloudFront.
Also think about using a content delivery network to speed up your visitors' load times.
Both have an Application Programming Interface (API) that can be used with your internal systems for uploading new content.
Twitter uses Amazon S3 for user avatars.
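As a rough illustration of what using such an API looks like, here is a sketch of an upload with the AWS SDK for PHP; the bucket name and region are placeholders, and credentials are assumed to come from the environment or an IAM role:

```php
<?php
require 'vendor/autoload.php'; // AWS SDK for PHP via Composer

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',          // placeholder region
]);

$s3->putObject([
    'Bucket'     => 'my-media-bucket', // placeholder bucket
    'Key'        => 'uploads/' . basename($_FILES['media']['name']),
    'SourceFile' => $_FILES['media']['tmp_name'],
]);
```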
I have been trying for the past few days to integrate Amazon S3 and Amazon CloudFront with my website (hosted on Amazon EC2, developed using OpenCart). While searching, I found a lot of extensions, but not a single one fits the requirement.
According to the extensions, all data stays on local volume storage; you can create a subdomain with its document root pointed at the /image/ directory and access the images from that subdomain. But I do not see how the images ever get to Amazon S3. I might be missing something here, but below is what I want to implement.
What I want is to store all images and downloadables on Amazon S3 and retrieve them from Amazon S3 via Amazon CloudFront. When the admin uploads an image, it should be stored in Amazon S3 instead of local volume storage.
I have gone through the /image/ models and library files that come with the default OpenCart installation. After looking at the files, it seems impossible to implement what I want within the current structure. The options I see are either to create my own library for this and update every OpenCart file where images are used, or to use one of the existing extensions (which will cause issues with Amazon Elastic Load Balancing or Amazon Auto Scaling).
Any suggestions?
Searching the marketplace, I found this solution: Amazon CloudFront / S3 Integration.
The extension says:
speed up image loading on your website using Amazon CloudFront. This integration allows you to easily upload your OpenCart image cache onto S3 and serve it through CloudFront.
Finally I found the way to do this. Basically there are two parts to my question above:
1. Integrate Amazon S3
2. Integrate Amazon CloudFront
Integrate Amazon S3
The best practice is to sync all directories and files inside '{$ROOT}/image' completely with S3. Ultimately the goal is to make the application as scalable as possible. This way, when we put a load balancer in front of the application, it won't create any issues, because files are no longer saved on local storage.
To achieve this, you have to customize the application so that whenever the admin adds or updates any image, it is added/updated in S3 instead of local storage. Likewise, when images are shown on the website front end, they are pulled from S3 instead of local storage.
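A hypothetical sketch of that customization, assuming the AWS SDK for PHP; the wrapper function, bucket name, and returned URL are made up for illustration:

```php
<?php
require 'vendor/autoload.php'; // AWS SDK for PHP via Composer

use Aws\S3\S3Client;

// Hypothetical wrapper: whenever the admin saves an image, write it to S3
// (mirroring the {$ROOT}/image layout) instead of local storage.
function save_image_to_s3(S3Client $s3, string $localPath, string $relativePath): string
{
    $s3->putObject([
        'Bucket'     => 'my-opencart-images',     // placeholder bucket
        'Key'        => 'image/' . $relativePath, // mirrors {$ROOT}/image
        'SourceFile' => $localPath,
    ]);

    // Return the URL to store in the catalog (later this can be a CloudFront URL).
    return "https://my-opencart-images.s3.amazonaws.com/image/{$relativePath}";
}
```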
Integrate Amazon CloudFront
There are two options:
2a. With S3 implemented: you just need to provide the S3 bucket ARN in Amazon CloudFront and change the image URLs throughout the web application.
2b. Without S3 (local storage used): instead of the S3 bucket ARN, give Amazon CloudFront the image directory URL of your application, for example www.example.com/image/. Then change the image URLs throughout the web application, and the images will be served from the Amazon CloudFront URL.
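In either case, the URL change can be centralized in one small helper rather than edited ad hoc throughout the templates; a sketch with a placeholder CloudFront distribution domain:

```php
<?php
// Hypothetical helper: serve images from the CloudFront domain instead of
// the local /image/ path. The distribution domain is a placeholder.
function cdn_url(string $imagePath): string
{
    $cdnBase = 'https://d1234abcd.cloudfront.net';
    return $cdnBase . '/image/' . ltrim($imagePath, '/');
}

echo cdn_url('catalog/product/shirt.jpg');
// https://d1234abcd.cloudfront.net/image/catalog/product/shirt.jpg
```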
The easiest way I found is to host on an AWS Lightsail plan. Although AWS Lightsail does not support OpenCart by default, we can use the LAMP stack and add the OpenCart code to it.
Host LAMP stack in AWS Lightsail, Opencart hosting in AWS
In this post we describe how to:
- Set up an instance in Lightsail
- Select the free tier plan
- Update the system and PHP version in AWS Lightsail
- Install OpenCart in the AWS Lightsail LAMP stack
- Create a static IP
- Create a DNS zone
- Add name servers to your domain registrar
- Create a database and database user, and grant access
- Install a free Let's Encrypt certificate
- Configure HTTP-to-HTTPS redirection for OpenCart
- Activate the SEO URLs
- Set up FileZilla SFTP in AWS Lightsail to transfer files
- Access phpMyAdmin
- Upgrade to a higher Lightsail package
- Set up the CDN
Let us know if you have any questions or concerns.
We have six application servers (behind LVS), and requests are routed randomly across all of them.
Around 8 years back, we stored images in the database.
Pros: accessible from all the application servers.
Cons: slow.
Around 5 years back, we shifted to storing images as files on one of the six application servers, with nginx rules that make sure all image read/write requests go to that single server.
Pros: fast.
Cons: all image read/write requests go to a single server.
Question: is there a better way to store images that meets the following requirements:
1. Accessible from all application servers.
2. Fast access.
Note: we move images to a common image server after some time.
We do not move them instantly because we don't want to rely on that server, and it would also increase user upload time.
You can leverage the power of Content Delivery Networks (CDNs) and storage buckets, which are provided by services like AWS.
Upload all images to a single place, say an AWS S3 bucket (https://aws.amazon.com/s3/); this gives you a central store for all images that is accessible from all application servers.
You can then link your S3 bucket with a CDN service like AWS's CloudFront, or with one of the free services like Cloudflare.
https://aws.amazon.com/cloudfront/
Read more about how to use S3 for PHP here:
https://devcenter.heroku.com/articles/s3
http://docs.aws.amazon.com/AmazonS3/latest/dev/RetrieveObjSingleOpPHP.html
Read more about linking an S3 bucket to CloudFront here:
http://docs.aws.amazon.com/AmazonCloudFront/latest/DeveloperGuide/MigrateS3ToCloudFront.html
So AWS's S3 will give you globally accessible images, and the CloudFront CDN will give you fast load times.
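To complement the links above, a minimal sketch of reading an image back from the central bucket with the AWS SDK for PHP (bucket, region, and key are placeholders); in production you would usually serve the CloudFront URL directly instead:

```php
<?php
require 'vendor/autoload.php'; // AWS SDK for PHP via Composer

use Aws\S3\S3Client;

$s3 = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);

$result = $s3->getObject([
    'Bucket' => 'my-image-bucket',     // placeholder
    'Key'    => 'images/photo123.jpg', // placeholder
]);

header('Content-Type: ' . $result['ContentType']);
echo $result['Body']; // stream the image body to the client
```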
I want to create an Android app which basically deals with audio files.
The purpose of this app is to allow users to listen to audio files either by streaming them or by downloading them. Audio files are categorized into channels and subchannels. Users can navigate to their desired category, get the audio files related to that category, and play them.
I have around 10,000 audio files, each about 10 MB in size. I want to store the files on a centralized server (online hosting) and consume them on the front end using REST services.
There also needs to be a login form, and users who are logged in can subscribe to channels.
From my research, I know I can do this by renting a dedicated server and hosting the files there. For the REST API, I plan to use PHP.
I have a web page on web hosting, and the images are stored on Amazon S3. Using PHP, I want to be able to download multiple images from Amazon S3 through my web page as a single zip file.
What are my options, and which is best?
As far as I know, it is not possible to compress files on S3 itself. Can I use AWS Lambda?
The best solution I've come across:
1. The user selects on my website which images they want to download.
2. I get the file names from my database on my web host and download the images from S3 to a temporary directory on my web host.
3. A zip file is created in the temporary directory and a link is sent to the user. After a certain time, a script clears out the temporary directory on my web host.
But it would be great if there were a way to create and download the zip file without going through my hosting.
AWS S3 is "basic building blocks", so it doesn't support a feature like zipping multiple objects together.
You've come up with a good method to do it, though you could stream the objects into a zip file rather than downloading them. EC2 instances can do this very quickly because they tend to have fast connections to S3.
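A rough sketch of that approach using the AWS SDK for PHP and PHP's built-in ZipArchive (bucket and keys are placeholders). This avoids writing each image to disk, though the zip itself still lands in a temp file; a fully streamed archive would need a library such as ZipStream-PHP:

```php
<?php
require 'vendor/autoload.php'; // AWS SDK for PHP via Composer

use Aws\S3\S3Client;

$s3   = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);
$keys = ['photos/a.jpg', 'photos/b.jpg']; // whatever the user selected

$tmpZip = tempnam(sys_get_temp_dir(), 'zip');
$zip    = new ZipArchive();
$zip->open($tmpZip, ZipArchive::CREATE | ZipArchive::OVERWRITE);

foreach ($keys as $key) {
    // Read the object body straight from S3 and add it to the archive,
    // with no per-image temp file.
    $body = $s3->getObject(['Bucket' => 'my-image-bucket', 'Key' => $key])['Body'];
    $zip->addFromString(basename($key), (string) $body);
}
$zip->close();

// Send the archive to the user, then clean up.
header('Content-Type: application/zip');
header('Content-Disposition: attachment; filename="images.zip"');
readfile($tmpZip);
unlink($tmpZip);
```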
Lambda doesn't fit this use case: it is triggered by events such as an object being placed into an S3 bucket, and you are doing the opposite.
I have a dynamic website on which users can upload different files to be reviewed by different experts.
Users can upload/download their files to/from our server.
How can I use Amazon S3 and PHP to expand my storage? I already have an AWS account and have done some research on the net, but I still can't figure out how to make sure that every user can access only his own files, and that whatever he uploads ends up in a specific bucket named after his username.
You shouldn't create a separate bucket for each user; use one bucket per application/environment. To work with Amazon S3, you can use the AWS SDK for PHP. Your file "structure" will look something like this: /{bucket}/{userid}/images/*, /{bucket}/{userid}/videos/*, etc.
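A minimal sketch of that layout with the AWS SDK for PHP (bucket name and user ID are placeholders). Short-lived presigned URLs are one common way to make sure each user can only reach the keys your application signs for them:

```php
<?php
require 'vendor/autoload.php'; // AWS SDK for PHP via Composer

use Aws\S3\S3Client;

$s3     = new S3Client(['version' => 'latest', 'region' => 'us-east-1']);
$bucket = 'my-app-files'; // one bucket for the whole application
$userId = 42;             // from your session/auth layer

// Store the upload under the user's prefix: /{bucket}/{userid}/images/...
$key = "{$userId}/images/" . basename($_FILES['upload']['name']);
$s3->putObject([
    'Bucket'     => $bucket,
    'Key'        => $key,
    'SourceFile' => $_FILES['upload']['tmp_name'],
]);

// Later: hand the owner a time-limited download link for his own file.
$cmd = $s3->getCommand('GetObject', ['Bucket' => $bucket, 'Key' => $key]);
$url = (string) $s3->createPresignedRequest($cmd, '+15 minutes')->getUri();
```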