Are there any security and/or performance implications to consider when allowing large file uploads in PHP? For example, these are the PHP ini settings I currently have set.
memory_limit = 950M
upload_max_filesize = 950M
post_max_size = 950M
max_execution_time = 0
What, if anything, could go wrong with these settings?
The security considerations do not really change by adjusting these settings. For performance, however, the following applies:
The art of serving users with acceptable performance is offering enough resources for what the sum of your users requests. Translated into examples based on your settings, that looks something like this:
10 users uploading 950 MB would require you to serve 9.5 GB of bandwidth and I/O throughput (which is affected by disk speed, among other things) with acceptable performance. As a user I could probably live with uploading 950 MB in one minute, but I would be dissatisfied if it took an hour.
100 users uploading 950 MB would require you to serve 95 GB...
1000 users uploading 950 MB would require you to serve 950 GB...
...
Of course, not all of your users go for the maximum all the time, and even concurrent uploads might be limited. However, these max settings add to your risk stack. So depending on your usage characteristics and your resource provisioning, these settings could be valid.
However, I assume you gave extreme examples and want to learn about the implications.
When I google "optimize php memory_limit" I get this:
https://softwareengineering.stackexchange.com/questions/207935/benefits-of-setting-php-memory-limit-to-lower-value-for-specific-php-script
Obviously you can do the same with the other settings.
In forums you will find plenty of warnings against setting these config values so high. However, in environments where resource utilization is carefully managed at other access layers (e.g. restricting the number of upload users via in-app permissions), this has worked out very well for me in the past.
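To illustrate that kind of in-app restriction, here is a minimal sketch; user_can_upload() and current_user_id() are placeholders for your own permission layer (not built-in PHP functions), and the 200 MB cap is just an example value:

<?php
// Gate the upload behind an application-level permission check.
if (!user_can_upload(current_user_id())) {
    http_response_code(403);
    exit('Uploads are not enabled for this account.');
}

// Reject oversized requests before doing any processing.
// (The bytes have already been received by PHP at this point,
//  but no work is spent on them.)
$appCap = 200 * 1024 * 1024; // application-level cap, example value
if (isset($_SERVER['CONTENT_LENGTH']) && (int) $_SERVER['CONTENT_LENGTH'] > $appCap) {
    http_response_code(413);
    exit('Upload too large.');
}

// ... normal upload handling (move_uploaded_file(), etc.) continues here ...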
Related
Let's say I have a web app installed on shared hosting and 1M users online.
If users were to upload a bunch of small files (say 1,000 files, each approximately 100 KB) successively and at the same time, using multiple AJAX requests,
would this have the same effect as a DDoS attack (i.e. would it overwhelm the server)?
Let’s say it depends on your headroom.
If you really have just one server, it may be possible to have 1M users logged in, but even a server like Node.js can handle at most around 1M concurrent connections.
So if all 1M of them go at once, everything is DoSed within the first second.
It all depends on the available concurrent connections.
It is not about the file size in this example.
In a PHP shared hosting environment, what would be an optimal memory consumption to load a page? My current PHP script is consuming 3,183,440 bytes of memory. What should I consider good memory usage in order to serve, say, 10,000 users in parallel?
Please be detailed, as I am a novice at optimization.
Thanks in advance
3 MB isn't that bad. Keep in mind that parts of PHP are shared, and depending on which server is used (IIS, nginx, Apache, etc.) you can also configure pools and clusters when you need to scale up.
But the old adage that testing is knowledge applies well here: run load tests against the site with 10 -> 100 -> 1000 concurrent connections and look at the performance metrics. That will give you more insight into how much memory is required.
For comparison, the site I normally work on has an average of 300+ users concurrently online and its memory usage is just under 600 MB; however, when I run certain processes locally, it will easily use up 16 MB.
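If you want a number grounded in your own application rather than a guess, one rough approach is to record the real peak memory of each request; a minimal sketch (logging to the standard PHP error log, which is just one possible destination):

<?php
// Log the real peak memory of every request once the script has finished.
register_shutdown_function(function () {
    $peakMb = memory_get_peak_usage(true) / 1048576; // real allocated memory, in MB
    $uri    = isset($_SERVER['REQUEST_URI']) ? $_SERVER['REQUEST_URI'] : 'cli';
    error_log(sprintf('%s peak=%.1f MB', $uri, $peakMb));
});

Multiplying the observed peak by the number of truly simultaneous requests gives a rough RAM budget: 3 MB per request only becomes roughly 30 GB if all 10,000 users hit the server at exactly the same moment, which is far more than "10,000 users online" usually implies.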
So while configuring a server for uploading files I noticed the default for max_file_uploads is 20 files. Is there any reason to keep this at a low value or is it safe to up it to 100 files?
This will depend on your server resources (bandwidth, memory, CPU, etc.). If you have a powerful server and you really need to receive 100 files at the same time, go ahead and change it to 100; otherwise keep it as low as possible.
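Note that PHP stops processing file uploads once max_file_uploads is reached, so whatever limit you choose, it can be worth detecting when it is hit. A rough sketch, assuming a multi-file input named files[]:

<?php
// Compare how many files actually arrived against the configured limit.
$received = isset($_FILES['files']['name']) ? count((array) $_FILES['files']['name']) : 0;
$limit    = (int) ini_get('max_file_uploads');
if ($received >= $limit) {
    // PHP may have silently dropped additional files selected by the user.
    error_log("Upload used all $limit max_file_uploads slots; some files may have been dropped.");
}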
Consider normal PHP image upload functionality (not using AJAX), with the problem that large image uploads fail occasionally, less frequently on one test server and more frequently on another. Assuming the debugging of the problem has not yet started and there are no file/folder permission issues, how should one proceed?
I am sure file_uploads is on. I do not want to blindly set some safe values or keep increasing the values until it works. Basically, I want the values to match exactly what my modules need. I am ready to override the settings in the modules concerned, if that is the best approach.
According to settings related to file upload, these are all the relevant settings -
* file_uploads
* upload_max_filesize
* max_input_time
* memory_limit
* max_execution_time
* post_max_size
Finding the parameters/values for the script concerned -
To find out which of them (and how many) are actually being violated by my script and causing the failure, I first need to find the corresponding values for my script. How can I find the following values for my script:
Total uploaded files size
Input time
Memory usage
Script execution time
Posted data size
Which tool(s) can be used for this? Using PHP code, I think I can find out a few of them:
Script execution time - Difference between microtime(true) at script start and end.
Total uploaded file size - A foreach loop over $_FILES to sum the ['size'] values.
How can I find out the rest, like memory usage, input time, etc.?
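For reference, most of these values can be collected from inside the script itself; a minimal sketch (the log format is arbitrary, and input time is only an approximation because it elapses before the script starts running):

<?php
$start = microtime(true);

// Approximate input time: request start (PHP >= 5.4) vs. the first line of the script.
$inputTime = isset($_SERVER['REQUEST_TIME_FLOAT'])
    ? $start - $_SERVER['REQUEST_TIME_FLOAT']
    : null;

// Total uploaded file size, covering single and multi-file inputs.
$uploadBytes = 0;
foreach ($_FILES as $file) {
    $uploadBytes += is_array($file['size']) ? array_sum($file['size']) : $file['size'];
}

// Posted data size as reported by the request headers.
$postBytes = isset($_SERVER['CONTENT_LENGTH']) ? (int) $_SERVER['CONTENT_LENGTH'] : 0;

// ... the actual upload/resize work happens here ...

error_log(sprintf(
    'exec=%.2fs input=%s upload=%d B post=%d B peak_mem=%d B',
    microtime(true) - $start,
    $inputTime === null ? 'n/a' : sprintf('%.2fs', $inputTime),
    $uploadBytes,
    $postBytes,
    memory_get_peak_usage(true)
));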
Where/How to override
Finally, once I have found the violating setting(s), suppose I need to increase/override the values for two of them. Where should the override be applied? I guess it is not correct to set memory_limit etc. for all modules in .htaccess or in the PHP script; rather, applying it only in the module concerned is better. Am I correct?
Settings for less demanding modules
Also, for other modules that do not need many resources, is it good/wise to override the settings downwards after carefully studying the modules' resource requirements? Would that reduce unnecessary resource consumption? If so, how about having two or three combinations of these settings (depending on project requirements, named e.g. normal-script and heavy-file-upload) and calling a single function to load one of the combinations in every module?
memory_limit precautions
Regarding memory_limit, it is mentioned here that -
Setting too high a value can be very dangerous because if several uploads are being handled concurrently all available memory will be used up and other unrelated scripts that consume a lot of memory might affect the whole server as well.
What general precautions should be taken regarding this?
Thanks,
Sandeepan
A few ideas for debugging:
For manual testing, I would prepare a series of images with different dimensions whose net size (width x height) increases in small steps: 100 x 100, 100 x 200, 100 x 300, and so on, and try them. At some point they could start failing if the problem is the memory limit. You could turn on error_reporting() for yourself only (maybe using a debugging cookie of some sort) so you can see exactly what fails.
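If it helps, test images like that can be generated with GD itself rather than collected by hand; a quick sketch (the output directory and range of sizes are arbitrary):

<?php
// Generate plain JPEGs at 100 x 100, 100 x 200, 100 x 300, ... for upload tests.
foreach (range(100, 2000, 100) as $height) {
    $img = imagecreatetruecolor(100, $height);
    imagefill($img, 0, 0, imagecolorallocate($img, 200, 200, 200));
    imagejpeg($img, sprintf('/tmp/test_100x%d.jpg', $height), 90);
    imagedestroy($img);
}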
If that's not an option, I would set up a mechanism of some sort for long-term logging that stores the image's dimensions into a log file or table before the resizing starts, and also the contents of the $_FILES array. At the successful end of the script, add an "OK" to that entry. That way, you will be able to find out more about the failed uploads, if they make it through to the script (and don't fail beforehand due to a timeout setting).
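A rough sketch of that logging idea, assuming a single-file input named image and a log path the web user can write to:

<?php
$logFile = '/var/log/app/uploads.log'; // assumed writable location

// Record the dimensions and the raw $_FILES contents before resizing starts.
$size  = getimagesize($_FILES['image']['tmp_name']);
$entry = sprintf("%s\t%dx%d\t%s\n", date('c'), $size[0], $size[1], json_encode($_FILES));
file_put_contents($logFile, $entry, FILE_APPEND);

// ... resizing happens here ...

// Mark the entry as successful only if the script made it to the end.
file_put_contents($logFile, date('c') . "\tOK\n", FILE_APPEND);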
Also, for other modules, where much resources are not needed, is it good/wise to override the settings to reduce them
I think the answer is always "no". As far as I know, the memory limit is the maximum amount of memory that may be allocated, but that amount is not reserved for every request. I have never heard of anybody fine-tuning the memory limit in this way.
However, if some parts of the system (e.g. the image resizer) require an enormously high memory limit, it may be wise to apply specific memory_limit settings only to them, e.g. through a .htaccess setting.
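For memory_limit specifically, a runtime override at the top of the heavy module also works, since that directive can be changed from PHP code; a minimal sketch (the value is only an example):

<?php
// Raise the limit for this one script only; the global php.ini value stays low.
ini_set('memory_limit', '512M');

// Note: upload_max_filesize and post_max_size are PHP_INI_PERDIR settings and
// cannot be changed with ini_set(); they belong in php.ini, a .htaccess
// php_value line, or the vhost/FPM pool configuration.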
I have a site that enables users to upload images which are then re-sized into 4 different sizes.
I'm moving to a new host and I wondered what makes a good spec for handling this task, or whether any server spec should be able to handle it. Should I look at more RAM, a better CPU, etc.?
Images are currently restricted to 2 MB, but I'd like to increase that.
Is there anything to choose between these (for this task)?
Option 1.
* Processor: Pentium 4 3GHZ Hyperthreaded
* Memory: 2GB DDR SDRAM
* Hd1: 120GB 7200RPM SATA / 8MB Cache
* Hd2: 120GB 7200RPM SATA / 8MB Cache
* OS: Linux - CentOS 5 (+32 Bit)
Option 2.
* Processors: Dual Core Intel Core 2 Duo 2.2GHz
* Memory: 1GB RAM
* Hard Disk: 1x 160GB 7,200rpm
* OS: Linux - CentOS 5.2
edit:
I'm using http://pear.php.net/package/Image_Transform with GD2.
Volume is very low, but certain JPG files fail even when they are < 2 MB.
Current hosting is a VPS with 768 MB dedicated RAM (still need to find out about the processor).
You don't say how many you are doing per time period, what you are using (GD? ImageMagick? Something else) or the spec and performance of your current server.
However unless you are doing a lot, both of those servers should be way more than fine.
Definitely stick with a VPS (vs. shared hosting) because working with images in PHP is all about tuning your php.ini file.
There are a ton of reasons why a PHP script would fail to process an upload:
Upload is too big. The upload size is controlled by several directives: post_max_size, upload_max_filesize, and memory_limit. If these directives are not configured, the defaults will cap you at around 2 MB.
Ran out of memory working with the image. The memory_limit directive affects this. Also make sure your code is releasing resources as soon as possible instead of waiting for script termination (see the sketch below).
Operations took too long. max_input_time and max_execution_time control how long the script gets to execute (max_input_time controls HTTP I/O, max_execution_time controls actual script execution). Bigger images take longer to process.
Figure out which conditions are failing, and then scale your server up to resolve those conditions. If you are switching hosts based on performance issues, you might want to do this first. You might find that the switch is unneeded.
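On the "releasing resources" point above, a rough GD sketch of freeing each image as soon as its size has been written, rather than keeping everything around until the script ends (the input field name, output paths, and target widths are assumptions):

<?php
$src  = imagecreatefromjpeg($_FILES['image']['tmp_name']);
$srcW = imagesx($src);
$srcH = imagesy($src);

foreach (array(1024, 512, 256, 128) as $w) {
    $h   = (int) round($w * $srcH / $srcW);
    $dst = imagecreatetruecolor($w, $h);
    imagecopyresampled($dst, $src, 0, 0, 0, 0, $w, $h, $srcW, $srcH);
    imagejpeg($dst, "resized_{$w}.jpg", 85);
    imagedestroy($dst); // free this size immediately instead of at script end
}
imagedestroy($src);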
If you're just doing development/testing, and maybe a soft launch, option one is fine. If you expect to go live, you're going to need to keep tabs on your server load, how many processes you are spawning, and how long your actual resize time is for images.
If you expect to handle serious volume in the near future, you'll definitely want the dual-core system. Resizing images is very intensive. Further down the road, you may need additional machines dedicated to image processing and one to serve the site.