I'm building an image-sharing feature for my PHP web application, which means all users will be able to upload their images to my server.
So my first assumption is that I need a dedicated server for this image-sharing feature.
The problem is that if that server gets many requests, it will become a bottleneck.
I've read about caching (memcached, Varnish, Squid...). Do you think one of these technologies is suitable for me?
What is the best/ideal architecture for me? I assume that at some point a single server won't be enough,
so I guess I will need a cluster of servers (master and slaves), right?
I would be very glad if you could give me some orientation on the right technologies and architecture.
Everything depends on how much traffic you will have. Can you estimate it?
Caching solutions are better suited to small images. I have some experience with an image sharing/voting website with quite big traffic (12 million full-size image downloads a month).
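For what it's worth, here is a minimal sketch of that kind of small-image caching with the PHP Memcached extension (the key scheme and file paths are my assumptions, not anything from your setup):

    <?php
    // thumb.php?id=123 -- serve a thumbnail, keeping the bytes in memcached.
    $m = new Memcached();
    $m->addServer('127.0.0.1', 11211);

    $id  = (int) $_GET['id'];
    $key = 'thumb:' . $id;

    $data = $m->get($key);
    if ($data === false) {
        // Cache miss: read from disk (this path layout is an assumption).
        $path = '/var/app/thumbs/' . $id . '.jpg';
        if (!is_file($path)) { http_response_code(404); exit; }
        $data = file_get_contents($path);
        $m->set($key, $data, 3600); // keep it for an hour
    }

    header('Content-Type: image/jpeg');
    echo $data;

Varnish or Squid would instead sit in front of the web server and cache whole HTTP responses, which works for full-size images too, but memcached is the easiest to drive from PHP itself.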
So, for a simple test game, I'm working on generating user images based on their current in-game avatar. I got this idea from Club Penguin and GTA V, which both generate images of the player's current in-game avatar.
I created a script to simply put a few images together and print out the final image to the client. It's similar to how Club Penguin does it, I believe: http://cdn.avatar.clubpenguin.com/%7B13bcb2a5-2e21-442c-b8e4-10516be6abc6%7D/cp?size=300
As you can see, the penguin is wearing multiple clothing items. Each item is a different image located at http://mobcdn.clubpenguin.com/game/items/images/paper/image/300/ (e.g. http://mobcdn.clubpenguin.com/game/items/images/paper/image/300/210.png)
Anyway, I've already made the script and all, but I have a few questions.
When going to Club Penguin's or Grand Theft Auto's avatar generator, you'll notice it finishes the request really fast. Even for a new user (so the image hasn't been generated yet and can't already be cached), it finishes in under a second.
How could I possibly speed up the image generation process? Right now I'm just using PHP, but I could definitely switch over to another language. I know a few others too and I'm willing to learn. Which language can provide the fastest web-image generator (it has to connect to a database first to grab the user avatar info)?
For server specs, how much RAM and all that fun stuff would be an okay amount? Right now I'm using an OVH cloud server (VPS Cloud 2) to test it and it's fine and all. But, if someone with experience could help: what might happen if I started getting a lot more traffic, with 100+ image requests per client when they first log in (a relationship system shows their friends' avatars)? I'll probably use Cloudflare and other caching tools so that most requests get cached for a maximum of 24 hours, but I can't completely rely on that.
tl;dr:
Two main questions:
What's the fastest way to generate avatars on the web (right now I'm using PHP)?
What are some good server specs for around 100+ daily unique clients (at minimum) using this server for generating these avatars?
Edit: Another question: which web server could handle more requests for this? Right now I'm using Apache for this server, but my other servers are using nginx for other API things (like logging users in, getting info, etc.).
IMHO, the language is not the bottleneck. PHP is fast enough for real-time processing of small images; you just need the right algorithm. Also, check out bytecode caching engines such as APC, XCache, or even HHVM. They can significantly improve PHP performance.
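To make that concrete, here is a minimal sketch of the compositing itself with GD, stacking transparent PNG layers the way those Club Penguin item URLs suggest (the file layout and hard-coded layer list are assumptions; in practice the layers would come from your database):

    <?php
    // avatar.php?size=300 -- flatten body + clothing layers into one PNG.
    $size   = isset($_GET['size']) ? (int) $_GET['size'] : 300;
    $layers = ['body.png', 'shirt.png', 'hat.png']; // assumption: fetched from DB

    $canvas = imagecreatetruecolor($size, $size);
    imagesavealpha($canvas, true);
    imagealphablending($canvas, false);
    imagefill($canvas, 0, 0, imagecolorallocatealpha($canvas, 0, 0, 0, 127));
    imagealphablending($canvas, true); // blend each layer's alpha onto the canvas

    foreach ($layers as $file) {
        $layer = imagecreatefrompng("/var/app/items/$size/$file"); // assumed path
        imagecopy($canvas, $layer, 0, 0, 0, 0, $size, $size);
        imagedestroy($layer);
    }

    header('Content-Type: image/png');
    header('Cache-Control: public, max-age=86400'); // lets Cloudflare hold it for a day
    imagepng($canvas);
    imagedestroy($canvas);

GD work like this typically takes milliseconds at these sizes, which is why the language usually isn't the bottleneck.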
I think any VPS can do the job until you have more than ~20 concurrent requests. The more clients use the service at the same time, the more RAM you need. You can easily determine your script's memory needs and other performance details with a profiler such as XHProf.
Nginx or Lighttpd in FastCGI mode use less RAM than the Apache HTTP server and can handle more concurrent connections. But that's not important until you actually have many concurrent connections.
Yes, PHP can do this job fast and flexibly (for example, generate.php?size=32).
I only know German hosting providers, but they also have an English interface: www.nitrado.net
Recently my web design firm got a big contract to build a website that will be media-rich and needs to run on WordPress. The client wants it that way because of the simplicity of WordPress and their familiarity with it.
The hosting will undoubtedly be on AWS EC2, and we are now torn between hosting the actual files on a separate instance or in an S3 bucket. I have never worked with S3, but I have 2+ years of experience with EC2. Users uploading images, videos, documents, etc. will be a big component of the website.
ANTICIPATED: Based on the market study done by another firm for the client, we expect upwards of 1,000 unique visitors daily, of whom 5-10% would be uploading to the server/bucket.
AIM: A fast website with that kind of media richness.
Any advice on the choice of server / infrastructure settings?
WordPress does, however, store all of its files in the local file system by default. You can get plugins that let uploads be stored in S3, although with only 1,000 uniques it may not be necessary.
The biggest gain in speed is going to come from using caching systems (preferably caching to memory).
There are many options open to you, but with something like 1,000 uniques per day you don't have much to worry about. If you want to take advantage of the CDN part of S3, then:
- Create a bucket in S3 with the public CDN options enabled.
- Mount this bucket in Linux using S3 FUSE (s3fs); guide here: http://juliensimon.blogspot.de/2013/08/howto-aws-mount-s3-buckets-from-linux.html (or push uploads from PHP instead; see the sketch below).
- Ensure memory caching is enabled in WordPress (W3 Total Cache).
- Minify the CSS and JS files using W3 Total Cache (careful, as this sometimes breaks themes).
- If the site is not responsive enough, consider AWS CloudFront or CloudFlare.
- If the site must be online at all times, consider 2 instances with DNS round-robin. Keep WordPress synced using rsync, and make sure both instances mount the same S3 bucket.
This should be more than enough.
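If you'd rather not run the FUSE mount, here is a minimal sketch of pushing an upload straight to S3 with the official AWS SDK for PHP instead (the bucket name, region, and $localFile path are assumptions):

    <?php
    require 'vendor/autoload.php'; // composer require aws/aws-sdk-php

    use Aws\S3\S3Client;

    $s3 = new S3Client([
        'version' => 'latest',
        'region'  => 'us-east-1',          // assumption
    ]);

    $localFile = '/tmp/photo.jpg';         // wherever WordPress saved the upload

    $result = $s3->putObject([
        'Bucket'     => 'my-media-bucket', // assumption
        'Key'        => 'uploads/' . basename($localFile),
        'SourceFile' => $localFile,
        'ACL'        => 'public-read',     // so the public URL/CDN works
    ]);

    echo $result['ObjectURL']; // store this URL against the attachment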
Hope this helps.
1,000 visitors a day is not really so large a strain on a server that I'd be especially worried about it. If it were me, I'd make sure to use caching (like datasage recommended) and also look into leveraging a CDN, especially since you're dealing with a lot of media. No matter what CDN you use, be it Cloudflare, MaxCDN, VideoPress, Amazon CloudFront, Akamai, or any one of many great content delivery network providers out there, I think you'll get a lot further with that than with tweaking your server. If you want to do that too, I'd suggest caching and NGINX. Obviously minify CSS and JS too before you deploy, but that's kinda obvious.
I appreciate all the input; the consensus is that 1,000 uniques/day is not much of a deal. Check.
I should mention, though, that we'll build every single piece of functionality ourselves as one main plugin, to keep full control over the features. Many themes nowadays are stuffed with unnecessary junk, which doesn't help the case we're trying to make. I will look at StuartB's solution more closely, but I certainly appreciate all of your input.
I am currently developing a big web application. Users will be able to post images as well as music files to my server. I am using PHP with the CodeIgniter framework on an Apache server from A2hosting.com. I was wondering how I will be able to manage space. I know they offer unlimited storage, but I know I am going to run into issues if too many people upload too much.
What is the best way to deal with this? Would you have a separate hosting plan for storing all this media? Could it be stored through a third party? Will my site eventually slow down because I am holding too much data for people?
I guess I would like to know what issues I am going to run into. My project is almost complete, and I want to avoid any large-scale errors. I am the only one working on this project, so manpower is pretty precious, as well as time.
Any help and insights will be greatly appreciated.
Anyone offering you "unlimited storage" at a fixed rate is having you on.
We put our media files on Amazon S3, which is designed to handle trillions of files.
If you do host the uploaded data locally, please don't place your uploads folder anywhere in your web root or in a place directly accessible by remote users. Expanding your storage is easy, but recovering from a total website or server compromise is not!
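To expand on that last point, here is a minimal sketch of serving uploads from outside the web root through a PHP handler (the directory, the session check, and the query parameter are all assumptions):

    <?php
    // media.php?f=song.mp3 -- stream an upload stored outside the web root.
    session_start();
    if (empty($_SESSION['user_id'])) { http_response_code(403); exit; }

    $dir  = '/var/uploads/';      // not reachable over HTTP
    $file = basename($_GET['f']); // basename() strips any ../ path tricks
    $path = $dir . $file;

    if (!is_file($path)) { http_response_code(404); exit; }

    header('Content-Type: ' . mime_content_type($path));
    header('Content-Length: ' . filesize($path));
    readfile($path);

That way a malicious upload (say, a .php file) is never executable via a URL; it can only ever be streamed back as data.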
I am currently working on configuring my CakePHP (1.3) based web app to run in an HA setup. I have 4 web boxes running the app itself and a MySQL cluster for the database backend. Users upload 12,000-24,000 images a week (35-70 GB). The app then generates 2 additional files from each original: a thumbnail and a medium-size image for preview. That means a total of 36,000-72,000 possible files added to the repositories each week.
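(For context, the two derivatives amount to a short GD pass per upload, roughly like this; the target widths, JPEG-only input, and paths are assumptions:)

    <?php
    // Resize one original into the thumbnail and the medium preview.
    function makeDerivative($srcPath, $dstPath, $maxWidth) {
        list($w, $h) = getimagesize($srcPath);
        $newW = $maxWidth;
        $newH = (int) round($h * $maxWidth / $w); // keep the aspect ratio

        $in  = imagecreatefromjpeg($srcPath);
        $out = imagecreatetruecolor($newW, $newH);
        imagecopyresampled($out, $in, 0, 0, 0, 0, $newW, $newH, $w, $h);
        imagejpeg($out, $dstPath, 85);
        imagedestroy($in);
        imagedestroy($out);
    }

    makeDerivative('/data/orig/123.jpg', '/data/thumb/123.jpg', 150);  // thumbnail
    makeDerivative('/data/orig/123.jpg', '/data/medium/123.jpg', 640); // preview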
What I am trying to wrap my head around is how to handle the large number of static file requests coming from users trying to view these images. I could have multiple web boxes serving only static files, with a load balancer dispatching the requests.
But does anyone on here have any ideas on how to keep all static file servers in sync?
If any of you have any experiences you would like to share, or any useful links for me, it would be very appreciated.
Thanks,
serialk
It's quite a thorny problem.
Technically you can get a highly available shared directory through something like NFS (or SMB if you like), using DRBD and Linux-HA for an active/passive setup. Such a setup will have good availability against single server loss; however, it is quite wasteful and not easy to scale - you'd have to have the app itself decide which server(s) to go to, configure NFS mounts, etc., and it all gets rather complicated.
So I'd probably push for avoiding keeping the images in a filesystem at all - or at least, not the conventional kind. I am assuming that you need this to be flexible enough to add more storage in the future; if you can keep the storage and IO requirements constant, DRBD plus HA NFS is probably a good system.
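The "app decides which server" part, by the way, can start as simply as hashing the filename over a host list (the hosts here are assumptions):

    <?php
    // Map a file to one of N static-file hosts, deterministically.
    $hosts = ['img1.example.com', 'img2.example.com', 'img3.example.com'];

    function hostFor($filename, array $hosts) {
        // Mask keeps the CRC non-negative on 32-bit builds.
        $i = (crc32($filename) & 0x7fffffff) % count($hosts);
        return $hosts[$i];
    }

    echo 'http://' . hostFor('abc123_thumb.jpg', $hosts) . '/abc123_thumb.jpg';

The catch, and part of why this gets complicated: plain modulo remaps most files whenever the host list changes, which is exactly what consistent hashing schemes exist to avoid.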
For storing files in a flexible "cloud", either
Tahoe LAFS
Or perhaps, at a push, Cassandra, which would require a bit more integration but may be better in some ways.
MySQL Cluster is not great for big blobs, as it (mostly) keeps the data in RAM; also, the high consistency it provides requires a lot of locking, which makes updates scale (relatively) badly under high workloads.
But you could still consider putting the images in MySQL Cluster anyway, particularly as you have already set it up - it would require no extra operational overhead.
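If you try that, the round trip is a minimal PDO sketch like this (the table and column names are assumptions):

    <?php
    $pdo = new PDO('mysql:host=127.0.0.1;dbname=app', 'user', 'pass');

    // Store: assumes images(id INT, mime VARCHAR(64), data LONGBLOB).
    $stmt = $pdo->prepare('INSERT INTO images (id, mime, data) VALUES (?, ?, ?)');
    $stmt->bindValue(1, 123, PDO::PARAM_INT);
    $stmt->bindValue(2, 'image/jpeg');
    $stmt->bindValue(3, file_get_contents('/tmp/upload.jpg'), PDO::PARAM_LOB);
    $stmt->execute();

    // Serve: image.php?id=123
    $stmt = $pdo->prepare('SELECT mime, data FROM images WHERE id = ?');
    $stmt->execute([(int) $_GET['id']]);
    $row = $stmt->fetch(PDO::FETCH_ASSOC);

    header('Content-Type: ' . $row['mime']);
    echo $row['data'];

Just keep the in-RAM point above in mind before pushing 70 GB a week into it.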
I'm trying to work out the best way to have my site dynamically transcode and stream video files to users, who are mostly on mobile devices. The site is PHP/MySQL based and runs on a Windows 2003 server which I have full access to. Any ideas how best to do this? I'd rather not have to transcode videos on upload, if possible.
For your purposes, consider something with some oomph: Inlet, Digital Rapids, or Rhozet. Some of these players offer some form of live-stream encoding, but you'll generally have limitations on hardware. They all have suitable APIs for interacting with the hardware and profiles.
You can also consider using public transcoding services but keep the assets private. It's not quite as elegant as a roll-your-own but it does solve the problem.
Transcode-/encode-on-upload will probably serve your needs better if the volume of content or traffic increases. Real-time transcoding has many hurdles, including race conditions and bandwidth.
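If you do end up going on-upload on your own box rather than through those vendors, here is a minimal sketch shelling out to ffmpeg from PHP (ffmpeg is my substitution, not something named above; the flags, paths, and mobile profile are assumptions):

    <?php
    // Transcode an upload once, at upload time, to a mobile-friendly MP4.
    function transcodeForMobile($src, $dst) {
        $cmd = sprintf(
            'ffmpeg -i %s -c:v libx264 -crf 23 -vf scale=640:-2 ' .
            '-c:a aac -b:a 128k -movflags +faststart %s',
            escapeshellarg($src),
            escapeshellarg($dst)
        );
        exec($cmd, $output, $status);
        return $status === 0; // 0 means ffmpeg exited cleanly
    }

    transcodeForMobile('D:\\uploads\\clip.avi', 'D:\\media\\clip.mp4');

The -movflags +faststart flag moves the index to the front of the file, so mobile players can start playback before the download finishes.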
I had a similar situation in the past; at that time I bookmarked the link below, which has some very interesting stuff:
php video transcoder
Sorry if this didn't help you.