Uploading multiple small files to a PHP server and DDoS attacks - php

Let's say I have a web app installed on shared hosting and 1M users online.
If those users were to upload a bunch of small files (say 1,000 files, each approximately 100 KB) successively and at the same time, using multiple AJAX requests,
would this have the same effect as a DDoS attack (i.e., would it overwhelm the server)?

It depends on your headroom.
If you really have just 1 server, it may be possible to have 1M users logged in.
But even a server built for concurrency (Node.js, for example) can handle a maximum of roughly 1M concurrent connections.
So if those 1M users all start uploading at once, everything is DoSed within the first second.
It all depends on the available concurrent connections.
In this example it is not about the file size.
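For a rough sense of scale, assuming each of the 1M users sends the 1,000 files of ~100 KB from the question:
1,000 files x 100 KB ≈ 100 MB per user
1,000,000 users x 100 MB ≈ 100 TB of inbound data, spread over roughly 1,000,000,000 individual upload requests
Long before that volume of data matters, the sheer number of simultaneous requests exhausts the available connections, which is the point above.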

Related

What determines a server's or LAMP stack's max concurrent users?

I'm looking to build my own server for the fun of it and in order to learn about the process.
I will be setting up a LAMP stack on the server and use it as a back-end for my mobile application that has some 1000s of daily active users.
But I am completely clueless as to what determines the limits of my server. I see on many hosting providers' websites that they offer a fixed amount of concurrent users, like 20 or 100.
What are the reasons that determine the maximum number of concurrent users for a server? Is it just dependent on the server's hardware, like available RAM? Does it have to do with code or software? What happens to users who try to access the server when the maximum limit has already been reached?
Your question is not really about any code issue, but since you asked...
I will be setting up a LAMP stack on the server and use it as a back-end for my mobile application that has some 1000s of daily active users.
If you care about performance, you should probably use LEMP (Nginx) instead of LAMP (Apache).
But I am completely clueless as to what determines the limits of my server. I see on many hosting providers' websites that they offer a fixed amount of concurrent users, like 20 or 100.
All servers you can buy can be divided into 3 simple groups:
Shared hosting (you and 1000 other folks share the same server).
VPS/VDS (the server is divided into N isolated parts and your resources are guaranteed: M CPU cores, K RAM, T GB HDD/SSD, S GB in/out traffic).
Dedicated servers (you own everything).
If you see any limitation like "X max connections" or "Y SQL queries per second", you are looking at shared hosting. You are not allowed to consume more than your limit, otherwise all clients on the same server may suffer from your website's/service's "popularity". Stick to at least a VPS/VDS if you don't want to deal with such limitations; then your only limits are cores, RAM, disk space and traffic usage.
What happens to users who try to access the server when the maximum limit has already been reached?
It depends on the client's and the server's configuration. The default behavior for most clients (such as browsers) is to wait until a specific timeout. The default behavior for most web servers (Apache/Nginx) is to keep connections in a queue until the interpreter (PHP-CGI/PHP-FPM) becomes available, or to drop them when a timeout is reached, whichever comes first. But that is configurable for each actor in the chain, from increasing the timeouts to dropping the extra load immediately.
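As an illustration, these are the knobs that control that behavior; the directive names are real, but the values are only placeholders you have to tune for your own setup, and you pick the web server block depending on whether Apache or Nginx sits in front of PHP:
# Apache: how many requests are served in parallel, and how long to wait on a connection
MaxRequestWorkers 150
Timeout 60
# PHP-FPM pool: how many PHP workers exist, how many connections may queue, and how long a request may run
pm.max_children = 20
listen.backlog = 511
request_terminate_timeout = 30s
# Nginx: how long to wait for PHP-FPM before giving up on the request
fastcgi_read_timeout 60s;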
P.S. If you really want to test your server's performance, you can always use load-testing/benchmarking software or write your own (flood your own server with connections until it dies).
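For example, ApacheBench (ab), which ships with Apache, can generate that kind of concurrent load (the URL below is just a placeholder):
ab -n 10000 -c 200 https://your-server.example/
Here -n is the total number of requests and -c is how many of them are sent concurrently.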

Max connection limit in PHP Apache server

I have a 32 GB Ubuntu server where my site is hosted. I have installed XAMPP and am running my site on it. My question is: what is the maximum number of concurrent connections Apache will handle, and how can I check that? To what extent can I increase it, and how?
My server must handle 5000 concurrent users at a time, so I have to configure it for that.
Generally the formula is:
(Total available memory - Memory needed by operating system) / memory each PHP process needs.
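As a rough sketch for the 32 GB server in the question, assuming the operating system and the rest of the stack need about 4 GB and each PHP process peaks at about 60 MB (both figures are assumptions; measure your own):
(32 GB - 4 GB) / 60 MB per process ≈ 28,000 MB / 60 MB ≈ 470 concurrent PHP processes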
Honestly it's a bit hard to predict sometimes, so it might be worth doing some experimentation. The goal is to never use more memory than is available, so your operating system never swaps.
However, you can also turn it around. 5000 concurrent requests is frankly a lot, so I'm going to work from your 5000 concurrent users instead.
Say 5000 users are actively using your application at a given time, and each of them makes on average 1 request every 30 seconds or so. And say the average PHP script takes 100 ms to execute.
That's about 166 requests per second made by your users. Given that it takes 100ms to fulfill a request, it means you need about 17 connections to serve all that up. Which is easy for any old server.
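Written out, that back-of-envelope calculation is:
5000 users x (1 request / 30 s) ≈ 166 requests per second
166 requests/s x 0.1 s per request ≈ 17 requests being handled at any given moment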
Anyway, the key to all these types of dilemmas is to:
Make an educated guess
Measure
Make a better guess
Repeat

Allowing large file uploads in PHP (security)

Are there any security and/or performance implications to consider when allowing large file uploads in PHP? For example, these are the PHP ini settings I currently have set.
memory_limit = 950M
upload_max_filesize = 950M
post_max_size = 950M
max_execution_time = 0
What, if anything, could go wrong with these settings?
The security considerations do not change when you change these settings. For performance, however, the following applies:
The art of serving users with acceptable performance is to offer enough resources for what the sum of your users requests. Translating this into examples based on your settings gives something like:
10 users uploading 950 MB each would require you to handle 9.5 GB of bandwidth and I/O throughput (which is impacted by disk speed, for example) with acceptable performance. As a user I could probably live with uploading 950 MB in 1 minute, but I would be dissatisfied if it took an hour.
100 users uploading 950 MB each would require you to handle 95 GB...
1000 users uploading 950 MB each would require you to handle 950 GB...
...
Of course, not all of your users go for the maximum all the time, and even the number of concurrent uploads might be limited. However, these max settings add to your risk stack. So depending on your usage characteristics and your resource provisioning, these settings could be valid.
However, I assume you gave extreme examples and want to learn about the implications.
When I google "optimize php memory_limit" I get this:
https://softwareengineering.stackexchange.com/questions/207935/benefits-of-setting-php-memory-limit-to-lower-value-for-specific-php-script
Obviously you can do the same with the other settings.
In forums you can find a lot of warnings against setting those config values so high. However, in environments where resource utilization is managed carefully at other access layers (e.g. restricting the number of upload users via in-app permissions), this has worked out very well for me in the past.
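If you do keep limits this high, it is still worth validating each upload on the server side. A minimal sketch, where the field name 'video', the 950 MB cap and the target directory are placeholder assumptions:
<?php
// Reject bad or oversized uploads before doing any work; names and paths are placeholders.
$maxBytes = 950 * 1024 * 1024;
if (!isset($_FILES['video']) || $_FILES['video']['error'] !== UPLOAD_ERR_OK) {
    http_response_code(400);
    exit('Upload failed.');
}
if ($_FILES['video']['size'] > $maxBytes) {
    http_response_code(413);
    exit('File exceeds the allowed size.');
}
// PHP has already streamed the upload to a temporary file on disk, so moving it
// does not load the contents into script memory; memory_limit does not need to
// match upload_max_filesize for this step.
$target = '/var/www/uploads/' . bin2hex(random_bytes(16)) . '.mp4';
move_uploaded_file($_FILES['video']['tmp_name'], $target);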

Concurrent AJAX file upload LIMIT

I'm building a multiple-file uploader using XMLHttpRequest Level 2. If I upload the files one by one (sending one file per AJAX request and waiting for it to complete before sending the next, and so on), the uploader works well. But when I upload multiple files concurrently, using concurrent AJAX requests, the browser hangs. Here is the performance comparison:
So is there any maximum limit on the UPLOAD FILE SIZE or the NUMBER OF AJAX REQUESTS that a browser can handle concurrently?
Note: the red numbers show the total upload time Firefox (measured with Firebug) took to upload all the files. For the parallel upload, since all uploads happen concurrently, I took the time consumed by the largest file, which finished last.
There's no theoretical maximum for the number of concurrent uploads (unless the browser vendors put one in explicitly).
In practice, however, upload performance drops significantly after two or three concurrent uploads because they compete for the same bandwidth, except on very low-latency connections, where the TCP window limits the maximum speed of a single upload.
I would recommend setting a concurrency limit of 2, especially if you're providing this to external users whose bandwidth may vary. Alternatively, you could do speed benchmarking as well, adapting the concurrency level based on measured upload performance.

Increasing the max concurrent file uploads on a LAMP server

I have an issue where my client wants to be able to upload more than 2 large files at a time, and I was wondering if any of you have a solution to this. I'm pretty sure it is a setting in either Apache or PHP.
Here is the scenario: I have an upload form for videos with a max file size of 1024 MB (1 GB). When a user uploads 1 file that is less than or equal to 1 GB it works fine; they can then open a new tab and upload another large file, but when they open up a 3rd tab of that form and try to upload, I get an IO error from the form. So I assume either PHP or Apache is limiting the maximum number of concurrent uploads to the server.
Edit:
When I use another computer with a totally separate internet connection (a separate/different external IP) I can upload from there as well, so it seems to be a limit on concurrent uploads per client/IP.
I think increasing the memory assigned to a PHP script can help you. I don't think this is the most appropriate solution, but when I was having problems handling lots of big files, I increased the memory for the script. If you are developing this for a very busy site I don't recommend this method, but as far as I know it is worth trying to increase the memory:
ini_set('memory_limit', '128M');
If you use -1 instead of 128M, the system will give unlimited memory to the script. You can use this to test whether the problem is caused by memory.
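For example, for a quick test only (never leave this in production):
ini_set('memory_limit', '-1'); // removes the PHP memory limit for this script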
