We have a LAMP server that is fairly busy; CPU usage hovers around 90% at peak times. We are having an intermittent problem where file uploads from web forms fail. It only seems to happen with larger files (over 1 MB), and it seems to affect some users more than others. We've gone through and checked the obvious stuff like the PHP ini max upload sizes, execution times, and folder write permissions. Also, the site worked for a year without trouble like this before it suddenly began, so we don't think any of our application PHP is causing it.
We've watched what happens in Charles Proxy, and it shows the upload proceeding (the sent file size increases steadily) until it simply stops sending data. The browser shows its spinning progress indicator as if the upload is still going, but you can wait 20 minutes and nothing will happen, or it reports a timeout.
Does anyone know why an upload might fail intermittently? My only guess is that it has to do with server traffic, i.e. Apache closing the connection prematurely.
If the load on the server is high, then your scripts may be timing out while trying to upload the file. Can you give any more specifics on your problem? PHP scripts have a 30-second timeout by default, meaning that if the script has not completed (i.e. finished uploading the file) within that time frame, the script will time out and the upload will fail.
If the site worked for over a year and traffic has since grown to the point where it is straining the server, then it is quite possible that scripts are timing out under the increased load.
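If timeouts are the cause, the relevant php.ini settings can be checked and raised. A sketch (the values are illustrative, not recommendations); note that max_input_time, rather than max_execution_time, governs how long PHP will spend receiving and parsing the request body:

```ini
; php.ini - illustrative values, adjust for your workload
max_execution_time  = 300   ; script run-time limit in seconds (default 30)
max_input_time      = 300   ; time allowed to receive/parse request data, incl. uploads
upload_max_filesize = 20M
post_max_size       = 25M   ; must be >= upload_max_filesize
```

You can verify the effective values at runtime with ini_get(), since per-directory or per-vhost overrides may apply.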
Related
I have a web server application with Apache, PHP and MySQL on Windows Server 2008. The server also serves web pages and images.
Recently I have noticed that some users (8 out of 150) who upload images see an Apache response time of, for example, 200 seconds, while the execution time of the PHP script is 2 seconds. Other users are not affected, and they're using the same script.
I know these times because I'm logging each request in a MySQL table.
To obtain the Apache response time before the execution ends, I use
microtime(true) - $_SERVER["REQUEST_TIME_FLOAT"]
And to obtain the PHP execution time I use
microtime(true) - $GLOBALS["tiempo_inicio_ejecucion"];
where $GLOBALS["tiempo_inicio_ejecucion"] is another microtime that I get at the beginning of the script execution.
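Put together, the two measurements look roughly like this (variable names as in the question; the usleep() stands in for the script's real work):

```php
<?php
// Captured at the very top of the script, as described above:
$GLOBALS["tiempo_inicio_ejecucion"] = microtime(true);

// ... the script does its work here ...
usleep(10000); // simulate 10 ms of work

// Time since Apache received the request: includes the upload transfer
// and any delay before PHP started running.
$apache_time = microtime(true) - $_SERVER["REQUEST_TIME_FLOAT"];

// Time spent inside the PHP script only:
$php_time = microtime(true) - $GLOBALS["tiempo_inicio_ejecucion"];

printf("apache: %.3fs, php: %.3fs\n", $apache_time, $php_time);
```

The gap between the two numbers is exactly the pre-script delay you are seeing: $apache_time can only ever be larger than $php_time, and whatever accounts for the difference happened before your code ran.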
The server load is low; CPU and RAM are far from their limits.
If I try to reproduce this behaviour by uploading files from my PC, I can't; the upload is fast.
I suppose it is some network issue, but I can't get it solved; or maybe it is a network issue on the clients' side.
How can I find out what is happening here?
Thanks in advance.
Possible suggestion: A virus checker.
Your server may have a virus checker installed which scans files automatically when they are created, so it will be scanning the uploaded files. It is possible that it is running low on resources or being given a low priority by the server, and thus the scans are taking a long time. The scanner won't release the file to the web server until the scan is complete, and so the server takes a long time to start running the PHP code.
I have no idea if this is actually your problem, but I have seen similar problems on other Windows Server boxes. It can be a very difficult problem to diagnose.
I currently have a website that has twice been suspended by my hosting provider for "overusing system resources". In each case, there were 300 - 400 crashed copies of one of my PHP scripts left running on the server.
The scripts themselves pull an image from a web camera at home and copy it to the server. They use file locks to ensure only one can write at a time. The scripts are called every 3 seconds by any client viewing the page.
Initially I was confused, as I had understood that a PHP script either completes (returning the result), or crashes (returning the internal server error page). I am, however, informed that "defunct scripts" are a very common occurrence.
Would anyone be able to educate me? I have Googled this to death but I cannot see how a script can end up in a crashed state. Would it not time out when it reaches the max execution time?
My hosting provider is using PHP set up as CGI on a Linux platform. I believe that I have actually identified the problem with my script in that I did not realise that flock was a blocking function (and I am not using the LOCK_NB mask). I am assuming that somehow hundreds of copies of my script end up blocked waiting for a resource to become available and this leads to a crash? Does this sound plausible? I am reluctant to re-enable the site for fear of it being suspended again.
Any insights greatly appreciated.
The approach I would probably recommend is to use tempnam() first and write the contents to it (which may take a while). Once that is done, you do the file locking, etc.
I'm not sure if this happens when a PUT request is being done; typically PHP handles file uploads fully before handing execution over to your script.
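A minimal sketch of the tempnam()-first approach. Paths and $imageData are illustrative stand-ins for the real camera frame and destination, and a non-blocking flock() is used so callers fail fast instead of piling up the way the question describes:

```php
<?php
$imageData = str_repeat('x', 1024);      // placeholder for the camera frame
$dir       = sys_get_temp_dir();
$target    = $dir . '/webcam.jpg';

// Slow part first, with no lock held:
$tmp = tempnam($dir, 'cam_');
file_put_contents($tmp, $imageData);

// Then take the lock only for the cheap, fast part:
$lock = fopen($dir . '/webcam.lock', 'c');
if (flock($lock, LOCK_EX | LOCK_NB)) {   // non-blocking: skip rather than queue
    rename($tmp, $target);               // atomic on the same filesystem
    flock($lock, LOCK_UN);
} else {
    unlink($tmp);                        // another writer holds the lock; drop this frame
}
fclose($lock);
```

Because rename() is atomic within a filesystem, readers never see a half-written image, and because the lock is non-blocking, hundreds of callers can never stack up waiting on it.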
A script working with resources can be killed by either of these two limits:
max_execution_time
memory_limit
Also check that you have no other errors in the script; watch for notices too.
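A quick way to see which limits are actually in effect on the host (a value of 0 for max_execution_time and -1 for memory_limit mean "no limit"):

```php
<?php
// Print the two limits mentioned above, as the host has configured them.
echo "max_execution_time = ", ini_get('max_execution_time'), "\n";
echo "memory_limit       = ", ini_get('memory_limit'), "\n";

// And surface notices too, as suggested:
error_reporting(E_ALL);
```

On shared hosting these are often stricter than the stock php.ini defaults, so it is worth printing them rather than assuming.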
I have an issue where my client wants to be able to upload more than two large files at a time, and I was wondering if any of you have a solution. I'm fairly sure it is a setting in either Apache or PHP.
Here is the scenario: I have an upload form for videos with a max file size of 1024 MB (1 GB). When a user uploads one file that is less than or equal to 1 GB, it works fine. They can then open a new tab and upload another large file, but when they open a third tab of that form and try to upload, I get an IO error from the form. So I assume either PHP or Apache is limiting the maximum number of concurrent uploads to the server.
Edit:
When I use another computer with a totally separate internet connection (a separate/different external IP), I can upload there as well, so it seems like a maximum number of concurrent uploads per client/IP.
Increasing the memory assigned to a PHP script may help. I don't think this is the most appropriate solution, but when I was having problems handling lots of big files, I increased the memory for the script. If you are developing this for a very busy site I don't recommend this method, but as far as I know it is worth trying.
ini_set('memory_limit','128M');
For testing, if you use -1 instead of 128M, the system will give unlimited memory to the script. You can use this to check whether the problem is caused by memory.
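A sketch of that test, with a read-back of the limit and of actual peak usage so you can see how much the script really needs:

```php
<?php
// Lift the limit for a test run only; never leave -1 in production.
ini_set('memory_limit', '-1');

echo ini_get('memory_limit'), "\n";                 // confirms the override took
echo memory_get_peak_usage(true), " bytes peak\n";  // how much was actually used
```

If the upload succeeds with the limit lifted, memory_get_peak_usage() tells you what realistic value to set memory_limit to afterwards.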
I've spent hours googling, as well as searching the Apache site, and I can't find any documentation about how Apache handles file uploads — particularly large ones. I've read anecdotal reports that PHP isn't involved until the upload is complete, which is what I'd expect. But as far as what Apache does during the upload, I can't find anything.
The reason I'm hot for documentation is that Apache is storing uploads entirely in memory instead of streaming them to disk. httpd will use every byte of available memory on the server I'm using until it crashes. Typically the amount of physical memory consumed is 3x the size of the file being uploaded, and it increases at roughly 5 MB/s (nowhere near my upload speed).
I've tested this same request on another LAMP stack I'm using, and Apache memory usage doesn't change at all throughout the course of the upload.
Can anyone explain to me how Apache could handle the same upload so differently on two different servers? Any thoughts greatly appreciated.
Technically, PHP is handling the upload on behalf of Apache, buffering the file in RAM until it completes. However, your script will not gain control until after the upload completes (or aborts). Apache by itself won't buffer out to disk unless it has to. Think of it as an invisible "handle_upload()" function call that's transparently inserted as the very first thing in your script.
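In other words, by the time your code runs the whole file has already been received and written to a temp file; $_FILES just describes it. A sketch (the field name "video" and destination path are illustrative):

```php
<?php
// PHP has already finished the transfer before this line executes.
if (isset($_FILES['video']) && $_FILES['video']['error'] === UPLOAD_ERR_OK) {
    // Move it out of the temp dir before the request ends,
    // or PHP deletes the temp file automatically.
    move_uploaded_file(
        $_FILES['video']['tmp_name'],
        '/var/uploads/' . basename($_FILES['video']['name'])  // illustrative path
    );
} else {
    // $_FILES[...]['error'] explains failures: UPLOAD_ERR_INI_SIZE,
    // UPLOAD_ERR_PARTIAL, etc.
}
```

The error codes are the closest thing the script gets to visibility into the transfer itself; everything before that happened outside your code.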
Back in the "everything is a cgi script" days when language interpreters like PHP weren't embedded in the webserver process, POST data was sent to the CGI script via standard input The file would pass through Apache directly to the CGI process and could be read byte-by-byte as it came in.
The answer is unsatisfying. I never found any documentation.
I continued poking around in the dark, finally stumbling on a mod_fcgid upgrade (from 2.2 to 2.3.6) that did the trick. Perhaps there was a bug in 2.2.
The memory usage still goes up in 2.3.6, but far less dramatically: only a few megabytes for a ~100 MB file. (However, when the upload finishes and the file is moved, memory usage instantly shoots up by ~100-200 MB, but then seems to be released immediately.)
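One related thing worth knowing for anyone else landing here: mod_fcgid also enforces a maximum request body size via its FcgidMaxRequestLen directive, and I believe the default dropped to 128 KB around 2.3.6, so large uploads may need it raised explicitly. The value below is illustrative:

```apache
# httpd.conf - mod_fcgid; value is illustrative
FcgidMaxRequestLen 1073741824    # allow request bodies up to 1 GB
```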
This might help you, because the WAMP server has Apache in it.
http://www.wampserver.com/phorum/read.php?2,39439
I have made a program wherein I am able to upload a file. Everything is working fine.
But when I tried uploading an 11 MB file, it seems to load forever, as if it is sending the file to the server endlessly.
I have already tried setting the upload_max_filesize to 20M.
Any ideas what could be the cause and how to resolve this?
The why is almost certainly related to your connection speed. Unless you're connecting via a LAN to the machine in question you'll in all probability be connecting via a consumer grade broadband connection. These are pretty much always configured so that download speed is a lot higher than upload speed. As a consequence, that 20 meg file that takes a minute or so to download will take 10 minutes or more to upload over the same connection.
What can you do about this, other than switching to an enterprise-grade broadband connection? Not a great deal, those bits will only transfer so fast over the connection you've got and no faster. What you can do, however, is at least keep the user informed as to how the upload is progressing. PHP from version 5.2 onwards provides hooks that you can use to monitor the progress of a file upload. You can use javascript to monitor these hooks and display a progress bar to the user.
http://www.phpriot.com/articles/php-ajax-file-uploads
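A sketch of polling upload progress on the server side. This uses the built-in session.upload_progress feature, which assumes PHP 5.4 or later (the 5.2-era hooks mentioned above came from the APC extension instead); "myForm" is an illustrative value of the hidden session.upload_progress.name field in the form:

```php
<?php
session_start();

// The progress data appears in $_SESSION under a configurable prefix
// plus the value of the form's session.upload_progress.name field.
$key = ini_get('session.upload_progress.prefix') . 'myForm';

if (isset($_SESSION[$key])) {
    $p = $_SESSION[$key];
    echo round(100 * $p['bytes_processed'] / $p['content_length']), "%\n";
} else {
    echo "no upload in progress\n";
}
```

A bit of JavaScript polling this endpoint every second or two is enough to drive a progress bar.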
How long is "forever"? A typical upload speed on consumer broadband is 256 kilobits per second, at which speed an 11 megabyte file will take over five minutes to upload.
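A rough sanity check of that estimate (taking a kilobit as 1000 bits):

```php
<?php
$file_bits = 11 * 1024 * 1024 * 8;   // 11 MB expressed in bits
$speed_bps = 256 * 1000;             // 256 kilobits per second
$seconds   = $file_bits / $speed_bps;

echo round($seconds / 60, 1), " minutes\n";  // roughly six minutes
```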
If you are using the Google Chrome web browser, you get an upload progress bar so you can tell if it is working or not.