Managing images and other media for a large website - PHP

I am currently developing a large web application. Users will be able to post images as well as music files to my server. I am using PHP with the CodeIgniter framework on an Apache server from A2hosting.com. I was wondering how I will be able to manage space. I know they offer unlimited storage, but I expect to run into issues if too many people upload too much.
What is the best way to deal with this? Would you have a separate hosting plan for storing all this media? Could it be stored through a third party? Will my site eventually slow down because I am holding too much data for people?
I guess I would like to know what issues I am going to run into. My project is almost complete and I want to avoid any large-scale errors. I am the only one working on this project, so manpower is pretty precious, as well as time.
Any help and insights will be greatly appreciated.

Anyone offering you "unlimited storage" at a fixed rate is having you on.
We put our media files on Amazon S3, which is designed to handle trillions of files.
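For illustration, a minimal upload sketch using the official AWS SDK for PHP (composer require aws/aws-sdk-php). The bucket name is a placeholder, and credentials are assumed to come from the environment or an IAM role:

```php
<?php
// Minimal sketch: push an uploaded file straight to S3 instead of local disk.
// 'my-media-bucket' and the region are assumptions for illustration.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',
]);

$s3->putObject([
    'Bucket'      => 'my-media-bucket',
    // basename() strips any path components from the user-supplied name
    'Key'         => 'uploads/' . basename($_FILES['media']['name']),
    'SourceFile'  => $_FILES['media']['tmp_name'],
    'ContentType' => $_FILES['media']['type'],
]);
```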

If you do host the uploaded data locally, please don't place your uploads folder anywhere in your web root or in a place directly accessible by remote users. Expanding your storage is easy, but recovering from a total website or server compromise is not!
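To make that concrete, here is a hedged sketch of a download script for uploads kept outside the web root; the /var/app-uploads path is an assumption:

```php
<?php
// Uploads live in /var/app-uploads, which Apache never serves directly,
// so nothing a user uploads can ever be executed by the web server.
$uploadDir = '/var/app-uploads'; // hypothetical path outside the web root

$name = basename($_GET['file'] ?? ''); // basename() strips any ../ traversal
$path = $uploadDir . '/' . $name;

if ($name === '' || !is_file($path)) {
    http_response_code(404);
    exit;
}

header('Content-Type: application/octet-stream'); // never let the browser guess
header('Content-Disposition: attachment; filename="' . $name . '"');
header('Content-Length: ' . filesize($path));
readfile($path);
```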

Related

AWS S3 EC2 Dilemma.

Recently my web design firm got a big contract to build a website that will be media-rich and needs to run on WordPress. The client wants it that way because of the simplicity of, and their familiarity with, WordPress.
The hosting will undoubtedly be with AWS EC2, and we are torn between hosting the actual files on a separate instance or in an S3 bucket. I have never worked with S3, but have 2+ years of experience with EC2. Users uploading images, videos, documents, etc. will be a big component of the website.
ANTICIPATED: Based on the market study done by another firm for the client, we expect upwards of 1000 unique visitors daily, of whom 5-10% would be uploading to the server/bucket.
AIM: A fast website with that kind of media richness.
Any advice as to the choice of server and infrastructure settings?
WordPress by default stores all of its files in the local file system. You can get plugins that allow uploads to be stored in S3, although with only 1000 uniques a day it may not be necessary.
The biggest gain in speed is going to be with using caching systems (preferably caching to memory).
There are many options open to you, but with something like 1000 uniques per day, you don't have much to worry about. If you want to take advantage of S3 (and CDN delivery in front of it), then:
Create a bucket in S3 with public read access enabled
Mount this bucket using s3fs (FUSE) on Linux - guide here: http://juliensimon.blogspot.de/2013/08/howto-aws-mount-s3-buckets-from-linux.html (a sketch of wiring WordPress to the mount follows after this answer)
Ensure memory caching is enabled in WordPress (e.g. via W3 Total Cache)
Minify the CSS and JS files using W3 Total Cache (careful, as this sometimes breaks themes)
If the site is not responsive enough, consider using AWS CloudFront or Cloudflare
If the site must be online at all times, consider two instances with DNS round-robin. Keep WordPress synced using rsync, and ensure they both mount the same S3 bucket.
This should be more than enough.
Hope this helps.
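To illustrate steps 2 and 3 above, here is a hedged sketch (as a small must-use plugin, e.g. wp-content/mu-plugins/s3-uploads.php) that points WordPress uploads at the s3fs mount. The mount point and CDN hostname are assumptions:

```php
<?php
// Redirect WordPress uploads onto the s3fs mount so new media lands in the
// bucket, while URLs are rewritten to the bucket's public/CDN hostname.
add_filter('upload_dir', function (array $dirs): array {
    $mount = '/mnt/s3-media';           // hypothetical s3fs mount point
    $dirs['basedir'] = $mount;
    $dirs['path']    = $mount . $dirs['subdir'];

    $cdn = 'https://media.example.com'; // hypothetical CDN/bucket hostname
    $dirs['baseurl'] = $cdn;
    $dirs['url']     = $cdn . $dirs['subdir'];
    return $dirs;
});
```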
1000 visitors a day is not really so large a strain on a server that I'd be especially worried about it. If it were me, I'd make sure to use caching (as datasage recommended), and I'd also look into leveraging a CDN, especially since you're dealing with a lot of media. No matter what CDN you use, be it Cloudflare, MaxCDN, VideoPress, Amazon CloudFront, Akamai, or any one of many great content delivery network providers out there, I think you'll get a lot further with that than with tweaking your server. If you want to do that too, I'd suggest caching and NGINX. Obviously minify CSS and JS too, before you deploy, but that's kinda obvious.
I appreciate all the input; the consensus is that 1000 uniques/day is not much of a deal. Check.
I should mention, though, that we'll build every single piece of functionality ourselves as a main plugin, to have full control over the features. Many themes nowadays are stuffed with unnecessary junk, which doesn't help the case we're trying to make. I will look at StuartB's solution more closely, but I certainly appreciate all of your input.

Scaling a web app based on Laravel 3

I'm writing my first web application using the PHP framework Laravel and a MySQL database. The application is quite image-heavy, with users likely to have 1 GB or more of images hosted by me. Images will be stored in the server's file system, with a file reference stored in the database.
I'm just thinking ahead as I code, because the application will initially be hosted on my InMotion VPS but will (hopefully) ultimately outgrow it.
Is there anything I should be doing at this early stage to make sure that the application can scale?
I would say the easiest and best thing to do right now is move all of your users' images over to a CDN-backed store like Rackspace Cloud Files or similar. This way, if/when you upgrade the framework, change servers, add new ones, etc., most of your physical files are located somewhere else, in addition to having all the benefits of a CDN network. Besides that, I wouldn't worry about over-optimizing too much.
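As a framework-agnostic sketch of that idea (Laravel 3 predates today's Storage facade): keep only a relative reference in the database, so the storage root can later move from local disk to a CDN or object store without rewriting rows. The table name, IMAGE_ROOT, and $pdo wiring are assumptions:

```php
<?php
// Store the file under a configurable root; record only the relative path.
// Swapping IMAGE_ROOT for a mount or a sync directory later leaves DB rows intact.
const IMAGE_ROOT = '/var/app-storage/images'; // hypothetical storage root

function storeUserImage(PDO $pdo, int $userId, string $tmpPath, string $origName): string
{
    $ext      = strtolower(pathinfo($origName, PATHINFO_EXTENSION));
    $relative = sprintf('%d/%s.%s', $userId, bin2hex(random_bytes(16)), $ext);

    $dest = IMAGE_ROOT . '/' . $relative;
    if (!is_dir(dirname($dest))) {
        mkdir(dirname($dest), 0750, true);
    }
    move_uploaded_file($tmpPath, $dest);

    // Only the relative reference goes in MySQL, as the question describes.
    $stmt = $pdo->prepare('INSERT INTO images (user_id, path) VALUES (?, ?)');
    $stmt->execute([$userId, $relative]);

    return $relative;
}
```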

Splitting form submissions to speed up transfer time

I have a simple CRM system that allows sales to put in customer info and upload appropriate files to create a project.
The system is already hosted in the cloud, but the office internet upload speed is horrendous. A single file may take 15 minutes or more to finish, causing a bottleneck in the sales process.
Upgrading our office internet is not an option; what other good solutions are out there?
I propose splitting the project submission form into two parts. Project info fields are posted directly to our cloud-hosted webapp and stored in the appropriate DB table, while the file itself is submitted to a LAN server with a simple DB and API that the cloud-hosted webapp can use to retrieve the file, via a download link, if it's ever needed again. Details still need to be worked out, but this is what I want to do in general.
Is this a good approach to solving this slow-upload problem? I've never done this before, so are there any obstacles to this implementation? (Cross-domain restrictions come to mind, but I believe that can be worked around using an iframe.)
If bandwidth is the bottleneck, then you need a solution that doesn't chew up all your bandwidth. You mentioned that you can't upgrade your bandwidth - what about putting in a second connection?
If not, the files need to stay on the LAN a little longer. It sounds like your plan is to keep the files on the LAN forever, but you could instead store them locally initially and then push them to the cloud later.
When you do copy the files out to the cloud, be sure to compress them and also set up rate limiting (so they take up maybe 10% of your available bandwidth during business hours).
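A hedged sketch of that push-later job, run from cron; the queue directory, endpoint, and bandwidth cap are assumptions. PHP's cURL can enforce the rate limit itself via CURLOPT_MAX_SEND_SPEED_LARGE:

```php
<?php
// Compress each queued file and upload it at a capped transfer rate,
// leaving office bandwidth usable during business hours.
$queueDir       = '/srv/upload-queue';                 // hypothetical queue dir
$endpoint       = 'https://crm.example.com/api/files'; // hypothetical ingest endpoint
$capBytesPerSec = 128 * 1024;                          // ~10% of a 10 Mbit/s uplink

foreach (glob($queueDir . '/*') as $path) {
    $gz = $path . '.gz';
    file_put_contents($gz, gzencode(file_get_contents($path), 6));

    $ch = curl_init($endpoint);
    curl_setopt_array($ch, [
        CURLOPT_POST                 => true,
        CURLOPT_POSTFIELDS           => ['file' => new CURLFile($gz)],
        CURLOPT_RETURNTRANSFER       => true,
        CURLOPT_MAX_SEND_SPEED_LARGE => $capBytesPerSec, // cURL throttles for us
    ]);
    $ok = curl_exec($ch) !== false
        && curl_getinfo($ch, CURLINFO_HTTP_CODE) < 300;
    curl_close($ch);

    if ($ok) {
        unlink($gz);
        unlink($path); // or keep the local copy, per the retention plan above
    }
}
```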
Also put some monitoring in place to make sure the files are being sent in a timely manner.
I hope nobody needs to download those files! :(

Image sharing feature

I'm building an image-sharing feature for my web application (PHP), which means all users can upload their images to my server.
So my first assumption is that I need a server dedicated to the image-sharing feature. The problem is that if the server gets many requests, a bottleneck will be created.
I have read about caching (memcached, Varnish, Squid...).
Do you think one of these technologies is suitable for me?
What is the best/ideal architecture for me? I assume only one server will not be enough at some point, so I guess I will need a cluster of servers (master and slaves). Right?
I would be very glad if you could give me some orientation on the right technologies and architecture.
Everything depends on how much traffic you will have. Can you estimate it?
Caching solutions are good mainly for small images. I have some experience with an image sharing/voting website with quite big traffic (12 million full-size image downloads a month).
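For the small-image case, here is a minimal memcached sketch (using the php-memcached extension). Paths, keys, and TTL are assumptions, and memcached's default 1 MB item limit means this fits thumbnails rather than full-size images:

```php
<?php
// Serve hot thumbnails from memory instead of hitting the disk each time.
$mc = new Memcached();
$mc->addServer('127.0.0.1', 11211);

function serveThumb(Memcached $mc, string $id): void
{
    $key  = 'thumb:' . $id;
    $data = $mc->get($key);

    if ($data === false) { // cache miss: read from disk and populate
        $path = '/var/media/thumbs/' . basename($id) . '.jpg'; // basename() blocks traversal
        if (!is_file($path)) {
            http_response_code(404);
            return;
        }
        $data = file_get_contents($path);
        $mc->set($key, $data, 3600); // cache for an hour
    }

    header('Content-Type: image/jpeg');
    header('Content-Length: ' . strlen($data));
    echo $data;
}
```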

Images/videos/MP3s on a network file system using PHP

I did some Google searches and can't seem to find what I want. I'm designing my web site to use MySQL and PHP web servers. Multiple web servers with load balancers and a MySQL cluster are planned so far for scaling. But then I get to images/videos/MP3s. I need a file system that multiple servers can read files from and write files to, so that one web server can run MySQL, the networked file system, and the web server to start with, but as the site scales it can be switched to multiple servers. Does anyone have any examples, tutorials, or resources to help me with this? The site runs on Ubuntu servers. My original idea was to just store the images in MySQL (I know how to do that and have working examples) so all servers could read/write, but other people told me that's a bad idea and I should use a file system (though I don't want to use the local one, as I don't think it can scale for large sites).
There are three systems that come to mind: MogileFS, MongoDB GridFS, and a cloud-based storage solution.
MogileFS (OMG Files!) was developed for LiveJournal and stores metadata in MySQL. It uses that to find the actual disk holding the appropriate file and streams it out.
MongoDB GridFS is a lot newer, and probably easier to get going, certainly for a smaller system. It uses a newer 'NoSQL' database to store parts of files across its collections, assembling them as required. Searching around will turn up plenty of documentation.
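A hedged GridFS sketch using the official mongodb/mongodb library (composer require mongodb/mongodb); the database name and connection string are assumptions:

```php
<?php
// GridFS chunks large files across documents automatically, so any web
// server pointed at the same MongoDB deployment can read or write them.
require 'vendor/autoload.php';

$bucket = (new MongoDB\Client('mongodb://localhost:27017'))
    ->selectDatabase('media')   // hypothetical database name
    ->selectGridFSBucket();

// Store an upload:
$stream = fopen($_FILES['media']['tmp_name'], 'rb');
$id = $bucket->uploadFromStream($_FILES['media']['name'], $stream);

// Stream it back out later:
header('Content-Type: application/octet-stream');
$bucket->downloadToStream($id, fopen('php://output', 'wb'));
```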
Finally, you could simply avoid the whole issue and just upload images to Amazon S3 or Rackspace Cloud Files. I've done the latter before (though the site was already running inside Rackspace's system) and it's not very difficult, again with plenty of examples around.
For S3 there is also a command-line tool, s3cmd, which can be set to sync (or, better, upload and then delete) a directory full of files into an S3 'bucket'.
First, storing images/large files in MySQL is not really practical because of the maximum packet size limitation.
To quote this answer to Choosing data type for MySQL?:
MySQL is incapable of working with any data that is larger than max_allowed_packet (default: 1M) in size, unless you construct complicated and memory intense workarounds at the server side. This further restricts what can be done with TEXT/BLOB-like types, and generally makes the LARGETEXT/LARGEBLOB type useless in a default configuration.
Now, for storage and upgrade compatibility, why not just store them on a NAS or RAID system that you can keep tacking drives onto? Then in your DB just store a path to the file. Much less DB-intensive, and it allows for decent scalability.
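You can check the quoted limit directly; here is a quick sketch (connection details are placeholders):

```php
<?php
// Any single query, including a BLOB INSERT, larger than max_allowed_packet
// will fail, which is one reason to store a file path instead of the file.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$max = (int) $pdo->query('SELECT @@max_allowed_packet')->fetchColumn();
printf("max_allowed_packet = %d bytes (~%.1f MB)\n", $max, $max / 1048576);
// A 5 MB image stored as a BLOB would need max_allowed_packet >= ~5 MB
// plus query overhead.
```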
