Video upload size - PHP

I'm having a hard time figuring this one out, so hopefully some of you who have tried this before will take the time to reply and share your knowledge.
I'm working on a site which, after release, will be featured on television and in other commercial places. The site asks the user to upload a video with a story, and we expect a lot of people to do so.
My problem is the whole storage/space question. A normal, unencoded iPhone recording easily takes up around 100-120 MB for a minute or two.
I've tried setting up and using FFMPEG to re-encode the movies, but the problem is that a single encoding eats up 100% of the CPU, leaving the site inaccessible to anybody else.
Is there anything you could suggest which would be sufficient for such a site? The client is on a budget, so price is a consideration as well. Best of all would be a free alternative to e.g. FFMPEG, but with less CPU usage.
My specs are as follows: CentOS 6 on a 1 GB RAM DigitalOcean cloud server with nginx + php-fpm and MySQL.
I'm hoping some clever folks will answer this!
Thanks in advance.
Jonas

Ideally you would "queue" up the items that need to be worked on. As users submit videos, you might do something like:
Move the uploaded file to somewhere it can be worked on
Create an entry in a system (in a MySQL database?) that keeps track of the videos that need to be processed.
A separate process (a cron job?) periodically looks at the queue, pops an item off the list, and runs the encoding command outside of the web request
FFMPEG probably has switches that can limit how much CPU it uses. For example, check this thread: How can I limit FFMpeg CPU usage?
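Putting the queue idea and the CPU-limiting idea together, a minimal sketch of such a cron-driven worker might look like the following. It assumes a hypothetical video_queue MySQL table with id, src_path, dst_path and status columns, and the ffmpeg options shown (nice, a single thread, libx264 with a fast preset) are just one reasonable combination, not a recommendation:

<?php
// worker.php - run from cron, e.g. every minute: * * * * * php /var/www/worker.php
// Hypothetical schema: video_queue(id, src_path, dst_path, status)

$db = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');

// Grab one pending job and mark it as processing so overlapping cron runs skip it.
$db->beginTransaction();
$job = $db->query("SELECT id, src_path, dst_path FROM video_queue
                   WHERE status = 'pending' ORDER BY id LIMIT 1 FOR UPDATE")->fetch();
if (!$job) { $db->commit(); exit; }
$db->prepare("UPDATE video_queue SET status = 'processing' WHERE id = ?")
   ->execute([$job['id']]);
$db->commit();

// Re-encode at low priority with a single thread so nginx/php-fpm stay responsive.
$cmd = sprintf(
    'nice -n 19 ffmpeg -i %s -c:v libx264 -preset veryfast -crf 28 -c:a aac -threads 1 %s 2>&1',
    escapeshellarg($job['src_path']),
    escapeshellarg($job['dst_path'])
);
exec($cmd, $output, $exitCode);

$db->prepare("UPDATE video_queue SET status = ? WHERE id = ?")
   ->execute([$exitCode === 0 ? 'done' : 'failed', $job['id']]);

On a 1 GB droplet you would probably also want to cap how many jobs run at once (here it is one per cron invocation) and check free disk space before starting an encode.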

Related

PHP Image Generation

So, for a simple test game, I'm working on generating user images based on their current in-game avatar. I got this idea from Club Penguin and GTA V. They both generate images of the current in-game avatar.
I created a script to simply put a few images together and print out the final image to the client. It's similar to how Club Penguin does it, I believe: http://cdn.avatar.clubpenguin.com/%7B13bcb2a5-2e21-442c-b8e4-10516be6abc6%7D/cp?size=300
As you can see, the penguin is wearing multiple clothing items. The items are each different images located at http://mobcdn.clubpenguin.com/game/items/images/paper/image/300/ (ex: http://mobcdn.clubpenguin.com/game/items/images/paper/image/300/210.png)
Anyway, I've already made the script and all, but I have a few questions.
When going to Club Penguin's or Grand Theft Auto's avatar generator, you'll notice the request finishes really fast. Even for a new user (so before the image has had a chance to be cached, since it hasn't been generated yet), it finishes in under a second.
How could I possibly speed up the image generation process? Right now I'm just using PHP, but I could definitely switch over to another language. I know a few others too and I'm willing to learn. Which language can provide the fastest web-image generator (it has to connect to a database first to grab the user avatar info)?
For server specs, how much RAM and all that fun stuff would be an okay amount? Right now I'm using an OVH cloud server (VPS Cloud 2) to test it and it's fine and all. But, for someone with experience with this: what might happen if I started getting a lot more traffic, with clients making 100+ image requests each when they first log in (a relationship system that shows their friends' avatars)? I'll probably use Cloudflare and other caching tools so that most of them get cached for a maximum of 24 hours, but I can't completely rely on that.
tl;dr:
Two main questions:
What's the fastest way to generate avatars on the web (right now I'm using PHP)?
What are some good server specs for around 100+ daily unique clients (at minimum) using this server for generating these avatars?
Edit: Another question: which web server could process more requests for this? Right now I'm using Apache for this server, but my other servers use nginx for other API things (like logging users in, getting info, etc.).
IMHO, language is not the bottleneck. PHP is fast enough for real-time processing of small images. You just need the right algorithm. Also, check out bytecode caching engines such as APC, XCache, or even HHVM. They can significantly improve PHP performance.
I think any VPS can do the job until you have more than about 20 concurrent requests. The more clients that use the service at the same time, the more RAM you need. You can easily determine your script's memory needs and other performance details by using a profiler such as XHProf.
Nginx or Lighttpd in FastCGI mode use less RAM than the Apache HTTP server and can handle more concurrent connections. But that's not important until you have many concurrent connections.
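To give a sense of what the "right algorithm" could look like in plain PHP with GD, here is a rough sketch of a layer-compositing generate.php with a simple file cache. Every path, parameter and table name here is made up; the idea is just: check the cache, composite the layer PNGs back to front, write the result to disk, and stream it out:

<?php
// generate.php?user=123&size=300 - composite avatar layers and cache the result.
// Assumes ./cache/ and ./items/{size}/ directories exist (placeholder layout).

$userId = (int)($_GET['user'] ?? 0);
$size   = (int)($_GET['size'] ?? 300);
$cache  = __DIR__ . "/cache/{$userId}_{$size}.png";

// Serve the cached copy if this avatar was already rendered.
if (is_file($cache)) {
    header('Content-Type: image/png');
    readfile($cache);
    exit;
}

// Hypothetical lookup: item ids worn by the user, ordered back to front.
$db = new PDO('mysql:host=localhost;dbname=game', 'dbuser', 'dbpass');
$stmt = $db->prepare('SELECT item_id FROM avatar_items WHERE user_id = ? ORDER BY layer');
$stmt->execute([$userId]);
$items = $stmt->fetchAll(PDO::FETCH_COLUMN);

// Build a transparent canvas and composite each layer PNG onto it.
$canvas = imagecreatetruecolor($size, $size);
imagealphablending($canvas, false);
imagefill($canvas, 0, 0, imagecolorallocatealpha($canvas, 0, 0, 0, 127));
imagealphablending($canvas, true);   // blend the layers normally from here on
imagesavealpha($canvas, true);       // keep the alpha channel in the output

foreach ($items as $itemId) {
    $layer = imagecreatefrompng(__DIR__ . "/items/{$size}/{$itemId}.png");
    imagecopy($canvas, $layer, 0, 0, 0, 0, imagesx($layer), imagesy($layer));
    imagedestroy($layer);
}

imagepng($canvas, $cache);           // write-through cache on disk
header('Content-Type: image/png');
imagepng($canvas);
imagedestroy($canvas);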
Yes, PHP can do this job fast and flexibly (for example, generate.php?size=32).
I only know German hosting providers, but they also have an English interface: www.nitrado.net

Splitting form submissions to speed up transfer time

I have a simple CRM system that allows sales to put in customer info and upload appropriate files to create a project.
The system is already being hosted in the cloud. But the office internet upload speed is horrendous. One file may take up to 15 minutes or more to finish, causing a bottleneck in the sales process.
Upgrading our office internet is not an option; what other good solutions are out there?
I propose splitting the project submission form into two parts. The project info fields are posted directly to our cloud-hosted web app and stored in the appropriate DB table, while the file itself is submitted to a LAN server with a simple DB and an API that the cloud-hosted web app can use to retrieve the file later via a download link if it's ever needed again. Details of this setup still need to be worked out, but this is what I want to do in general.
Is this a good approach to solving this slow upload problem? I've never done this before, so are there any obstacles to this implementation? (Cross-domain restrictions come to mind, but I believe that can be worked around with an iframe.)
If bandwidth is the bottleneck, then you need a solution that doesn't chew up all your bandwidth. You mentioned that you can't upgrade your bandwidth - what about putting in a second connection?
If not, the files need to stay on the LAN a little longer. It sounds like your plan would be to keep the files on the LAN forever, but you can store them locally initially and then push them later.
When you do copy the files out to the cloud, be sure to compress them and also set up rate limiting (so they take up maybe 10% of your available bandwidth during business hours).
Also put some monitoring in place to make sure the files are being sent in a timely manner.
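As a sketch of the "store locally, push later" idea, a cron job on the LAN box could compress each pending file and upload it to the cloud app with a capped transfer rate. The endpoint, directory layout, and speed cap below are placeholders (125,000 bytes/s is roughly 10% of a 10 Mbit/s line):

<?php
// push_files.php - run from cron on the LAN server: uploads pending CRM files
// to the cloud app with a bandwidth cap so the office link isn't saturated.

foreach (glob('/srv/crm-uploads/pending/*') as $path) {
    // Compress first; many office documents shrink considerably.
    // (Fine for modest file sizes; very large files would want streaming compression.)
    $gz = $path . '.gz';
    file_put_contents($gz, gzencode(file_get_contents($path), 9));

    $ch = curl_init('https://crm.example.com/api/upload');   // placeholder endpoint
    curl_setopt_array($ch, [
        CURLOPT_POST                 => true,
        CURLOPT_POSTFIELDS           => ['file' => new CURLFile($gz)],
        CURLOPT_RETURNTRANSFER       => true,
        // Cap the upload speed, in bytes per second.
        CURLOPT_MAX_SEND_SPEED_LARGE => 125000,
    ]);
    $ok = curl_exec($ch) !== false
        && curl_getinfo($ch, CURLINFO_HTTP_CODE) === 200;
    curl_close($ch);

    if ($ok) {
        rename($path, str_replace('/pending/', '/sent/', $path));   // keep the LAN copy
        unlink($gz);
    }
}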
I hope nobody needs to download those files! :(

Input on decision: file hosting with Amazon S3 or similar and PHP

I appreciate your comments to help me decide on the following.
My requirements:
I have a site hosted on a shared server and I'm going to provide content to my users: about 60 GB of content (about 2000 files of 30 MB each; users will have access to only 20 files at a time). I calculate about 100 GB of monthly bandwidth usage.
Once a user registers for the content, links will be made accessible for the user to download, but I want the links to expire in 7 days, with the possibility of increasing the expiration time.
I think the disk space and bandwidth call for a service like Amazon S3 or Rackspace Cloud Files (or is there an alternative?).
To manage the expiration I plan to somehow obtain links that expire (I think S3 has that feature, not Rackspace), OR control the expiration date in my database and have a batch process that renames all 200 files on the cloud and in my database on a daily basis (so that if a user copied the direct link, it won't work the next day; only my webpage will have the updated links). PHP is used for programming.
So what do you think? Is cloud file hosting the way to go? Which one? Does managing the links that way make sense, or is it too difficult to do through programming (sending commands to the cloud server...)?
EDIT:
Some hosting companies have unlimited space and bandwidth on their shared plans. I asked their support staff and they said that they really honor the "unlimited" deal, so 100 GB of transfer a month is OK; the only thing to watch out for is CPU usage. So shared hosting is one more alternative to choose from.
FOLLOWUP:
So, digging more into this, I found that the TOS of the unlimited plans say that it is not permitted to use the space primarily to host multimedia files. So I decided to go with Amazon S3 and the solution provided by Tom Andersen.
Thanks for the input.
I personally don't think you necessarily need to go to a cloud-based solution for this. It may be a little costly. You could simply get a dedicated server instead. One provider that comes to mind gives 3,000 GB/month of bandwidth on some of their lowest-level plans. That is on a 10 Mbit uplink; you can upgrade to 100 Mbps for $10/mo or 1 Gbit for $20/mo. I won't mention any names, but you can search for dedicated servers and possibly find one to your liking.
As for expiring the files, just implement that in PHP backed by a database. You won't even have to move files around: store all the files in a directory not accessible from the web and use a PHP script to determine whether the link is valid; if so, read the contents of the file and pass them through to the browser, and if the link is invalid, show an error message instead. It's a pretty simple concept, and I think there are a lot of pre-written scripts that do this, but depending on your needs, it isn't too difficult to do it yourself.
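A minimal sketch of that gate script, assuming a hypothetical downloads table holding a token, a filename, and an expiry timestamp, with the files stored outside the web root:

<?php
// download.php?token=abc123 - serve a file only while its link is still valid.

$db = new PDO('mysql:host=localhost;dbname=app', 'dbuser', 'dbpass');

$stmt = $db->prepare(
    'SELECT filename FROM downloads WHERE token = ? AND expires_at > NOW()'
);
$stmt->execute([$_GET['token'] ?? '']);
$filename = $stmt->fetchColumn();

if ($filename === false) {
    http_response_code(410);            // Gone: link missing or expired
    exit('This download link has expired.');
}

// Files live outside the document root, e.g. /srv/private-files/ (placeholder path)
$path = '/srv/private-files/' . basename($filename);   // basename() blocks path tricks

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="' . basename($filename) . '"');
header('Content-Length: ' . filesize($path));
readfile($path);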
Cloud hosting has advantages, but right now I think it's costly, and unless you are trying to spread the load geographically or plan on supporting thousands of simultaneous users and need the elasticity of the cloud, you could possibly use a dedicated server instead.
Hope that helps.
I can't speak for S3 but I use Rackspace Cloud files and servers.
It's good in that you don't pay for incoming bandwidth, so uploads are super cheap.
I would do it like this:
Upload all the files you need to a 'private' container
Create a public container with CDN enabled
That'll give you a special url like http://c3214146.r65.ce3.rackcdn.com
Make your own CNAME DNS record for your domain pointing to that, like: http://cdn.yourdomain.com
When a user requests a file, use the COPY api operation with a long random filename to do a server side copy from the private container to the public container.
Store the filename in a mysql DB for your app
Once the file expires, use the DELETE api operation, then the PURGE api operation to get it out of the CDN .. finally delete the record from the mysql table.
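As a rough illustration of the COPY and DELETE steps above (not a drop-in client), the server-side copy and the later delete against the Swift-style API that Cloud Files exposes could look something like this; the storage URL and auth token would come from your authentication step, and all names below are placeholders:

<?php
// Sketch of the COPY (into the CDN-enabled container) and DELETE operations.

function cloudFilesRequest($method, $url, $token, array $headers = [])
{
    $ch = curl_init($url);
    curl_setopt_array($ch, [
        CURLOPT_CUSTOMREQUEST  => $method,
        CURLOPT_HTTPHEADER     => array_merge(["X-Auth-Token: $token"], $headers),
        CURLOPT_RETURNTRANSFER => true,
    ]);
    curl_exec($ch);
    $status = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $status;
}

$storageUrl = 'https://storage101.example.clouddrive.com/v1/MossoCloudFS_xxxx'; // placeholder
$token      = 'YOUR_AUTH_TOKEN';                                                // placeholder

// Server-side copy: private/video.mp4 -> public/<random name> (no re-upload needed).
$randomName = bin2hex(random_bytes(16)) . '.mp4';
cloudFilesRequest('PUT', "$storageUrl/public/$randomName", $token, [
    'X-Copy-From: /private/video.mp4',
    'Content-Length: 0',
]);
// ...store $randomName in the MySQL table alongside the user and expiry date...

// Later, when the record has expired, remove the public copy again.
cloudFilesRequest('DELETE', "$storageUrl/public/$randomName", $token);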
With the PURGE command .. I heard it doesn't work 100% of the time and it may leave the file around for an extra day .. also the docs say to reserve its use for emergencies only.
Edit: I just heard there's a limit of 25 purges per day.
However, personally I've just used DELETE on objects and found that took them out of the CDN straight away. In summary, the worst case would be that the file would still be accessible on some CDN nodes for 24 hours after deletion.
Edit: You can change the TTL (caching time) on the CDN nodes .. the default is 72 hours, so it might pay to set it to something lower .. but not so low that you lose the advantage of the CDN.
The advantages I find with the CDN are:
It pushes content right out to end users far away from the USA servers and gives super fast download times for them
If you have a super popular file .. it won't take out your site when 1000 people start trying to download it .. as they'd all get copies pushed out from whatever CDN node they were closest to.
You don't have to rename the files on S3 every day. Just make them private (which is the default), and hand out time-limited URLs for a day or a week to anyone who is authorized.
I would consider making the links only good for 20 mins, so that a user has to re-login in order to re-download the files. Then they can't even share the links they get from you.
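With the AWS SDK for PHP, for example, a short-lived link can be produced roughly like this (region, bucket and key are placeholders):

<?php
// Hand out a time-limited S3 download link to an authorized, logged-in user.
require 'vendor/autoload.php';

use Aws\S3\S3Client;

$s3 = new S3Client([
    'version' => 'latest',
    'region'  => 'us-east-1',            // placeholder region
]);

$cmd = $s3->getCommand('GetObject', [
    'Bucket' => 'my-content-bucket',     // placeholder bucket
    'Key'    => 'files/chapter-01.zip',  // placeholder key
]);

// The link stops working after 20 minutes, so copied links quickly become useless.
$request = $s3->createPresignedRequest($cmd, '+20 minutes');

echo (string) $request->getUri();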

Have I outgrown shared hosting, or are my scripts not optimized?

I currently get more than 1000 visits/day and host at 1and1 with a 30 MB PHP memory limit.
I have a dynamic shopping guide with more than 5000 items, and users come to browse/search for items. I started getting "Internal Error 500" every now and then. They show up more on some days, and I don't notice them on others. 1and1 support says that I have outgrown the 30 MB PHP limit.
What do you think? Is that true? or they just want to sell me a more expensive hosting? I currently can't afford more than a shared host :(
I am using PHP / MySQL / JavaScript / MyBB Forum / a PHP thumbnail script (which I am now trying to replace with static thumbnails to ease the load a bit)
Advice is appreciated
It's very unlikely that disk space constraints are causing 500s; even with 5000 items in 30 MB, that's roughly 6 KB per item, which would probably only be a problem if each item has an image.
When you reach your disk quota, most PHP frameworks will report an error themselves rather than sending a bog-standard 500 response, so the 30 MB limit is almost certainly not the problem. Your hosting company may have other limits they're imposing, but even 1000 visits a day shouldn't break the most draconian hosting thresholds. It's more likely that your hosting company is looking for an easy up-sell.
Give us a URL and maybe we can get a clearer picture, but it's more likely something is simply broken in your site's code. Of course, without any reference material, this is strictly a theory.
If you are bumping up against 30 megabytes of PHP memory (if you are in fact talking about the php memory_limit setting, which it seems you are), then most likely some optimization could solve it. I usually only hit that kind of number if I'm processing large arrays and/or converting to PDF files, etc. And then I usually figure out a way to drop the array size by using memcached or writing the PDF some other way or something like that.
Sometimes the easiest thing to do is to buy their upgrade. Throwing hardware at the problem can at least be a quick fix. If you are making money from your web site, any time you get a 500, you could be losing money so it might be wise financially for you. But, you didn't ask that question. :)
To determine how much memory your scripts are using, sprinkle calls to memory_get_usage() (http://us2.php.net/memory_get_usage) around at strategic spots to see where it climbs (assuming you don't have a profiler that can handle this better).
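For example, a tiny hypothetical helper you could drop into the suspect scripts:

<?php
// Log memory usage at strategic points to find out where the 30 MB limit gets hit.
function mem_checkpoint($label)
{
    error_log(sprintf(
        '%s: %.1f MB in use (peak %.1f MB)',
        $label,
        memory_get_usage(true) / 1048576,
        memory_get_peak_usage(true) / 1048576
    ));
}

mem_checkpoint('start');
$rows = range(1, 200000);   // stand-in for a big fetchAll() result or a PHP thumbnail render
mem_checkpoint('after building the big array');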
I moved to another host, "Blue Host", and the errors have stopped ever since. It seems they handle the resources in a different way: "CPU throttling". They told me I can use their services until I feel the website is slow, and then I should switch to a VPS or dedicated server. 1and1 handles overload on shared limits with error 500. This is not good!

ImageMagick server requirements

I'm in the process of building a simple website, or to be more precise a simple component for a website, that adds a watermark to an image, creates a few different image sizes, and overlays the result onto a few products. These edits will be made every time someone queries an image in a certain directory on the server.
I know this can all be done with imagemagick, my only concern is that the whole website will grind to a halt every time someone views their image for the first time (after the edit's been made once, the database is updated to get the edited version every time a user accesses it).
The website isn't hosted yet; for the time being I'm testing on XAMPP, but I figured for this I'm going to need a virtual or dedicated server. I just need some advice on what sort of hardware specs I ought to be looking at. I doubt more than 2 or 3 people will be viewing photos at any one time, but at a guess I need to be sure that the server can handle up to 10 or so and still be functional.
Hope someone can advise on this, cheers!
For image processing you need some CPU power. If the images are large, they will also consume memory. But I think you should work with caching in any case. I don't know your application, but there are certainly ways to cache images into the filesystem once they have been rendered.
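A small sketch of that render-once-then-cache idea with PHP's Imagick extension; the file paths, watermark position and output type are placeholders:

<?php
// image.php?file=product1.jpg - watermark on the first request, serve from cache afterwards.

$file   = basename($_GET['file'] ?? '');              // basename() blocks directory traversal
$source = "/srv/photos/originals/$file";              // placeholder paths
$cached = "/srv/photos/watermarked/$file";

if (!is_file($cached)) {
    // First request for this image: render the watermarked version and keep it on disk,
    // so the expensive ImageMagick work happens only once per image.
    $image     = new Imagick($source);
    $watermark = new Imagick('/srv/photos/watermark.png');
    $image->compositeImage($watermark, Imagick::COMPOSITE_OVER, 20, 20);
    $image->writeImage($cached);
}

header('Content-Type: image/jpeg');
readfile($cached);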
