PHP APC uploads are never marked as complete

I have a LAMP setup running PHP 5.2.6-1 with the Suhosin Patch (0.9.6.2) and Zend (2.2.0), with APC enabled for use with a file upload script that makes an AJAX call to get the upload status and generate a progress bar.
Everything appears to be working: the file uploads perfectly and is displayed correctly on the website (or when you download it), but APC never marks the upload as "complete", nor does the reported size ever reach the actual file size (the uploaded file itself is fine; only the APC status is wrong).
What could be the reason for APC never seeing the file as completely uploaded, and how can I solve this? At the moment I'm using a rather hack'n'slash workaround: since the reported size always reaches at least 90%, my AJAX call checks the size, and if it sits at 90% for three consecutive updates it waits 5 more seconds and then assumes the upload is complete (not ideal if it's a large file and it really isn't done yet).

Try setting apc.rfc1867_freq=0. This should make APC update the reported size all the way to the end, whereas before it may have been updating it in 10 KB increments and stopping near the end.
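For reference, a minimal sketch of a status endpoint the AJAX call could poll, assuming APC's RFC1867 support is enabled and the default "upload_" key prefix is in use (the key parameter and response fields shown here are only examples):

<?php
// progress.php — polled by the AJAX call during the upload.
// Assumes php.ini contains:
//   apc.rfc1867      = 1
//   apc.rfc1867_freq = 0   ; update the progress entry continuously
// and that the form has a hidden APC_UPLOAD_PROGRESS field, placed
// *before* the file input, containing a unique key.
$key = isset($_GET['key']) ? $_GET['key'] : '';

// APC stores the status under apc.rfc1867_prefix . $key ("upload_" by default).
$status = apc_fetch('upload_' . $key);

header('Content-Type: application/json');
if ($status === false) {
    echo json_encode(array('started' => false));
} else {
    echo json_encode(array(
        'started' => true,
        'current' => $status['current'], // bytes received so far
        'total'   => $status['total'],   // expected total size
        'done'    => $status['done'],    // set once APC considers the upload finished
    ));
}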

Check upload_max_filesize. If you are trying to upload a file that is bigger than upload_max_filesize, you will have this problem. Increase upload_max_filesize to fix it.
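If you want to confirm which limits are actually in effect for the upload script (the loaded php.ini is not always the one you expect), a quick dump like this can help; the list of directives is just a suggestion:

<?php
// Print the upload-related limits exactly as PHP sees them for this request.
foreach (array('upload_max_filesize', 'post_max_size', 'memory_limit', 'max_execution_time') as $directive) {
    echo $directive, ' = ', ini_get($directive), "\n";
}
// Remember: post_max_size must be at least as large as upload_max_filesize,
// otherwise the whole POST body is discarded before the file is recorded.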

Related

Increasing the max concurrent file uploads on a LAMP server

I have an issue where my client wants to be able to upload more than two large files at a time, and I was wondering if any of you have a solution to this. I'm pretty sure it is a setting in either Apache or PHP.
Here is the scenario: I have an upload form for videos with a max file size of 1024 MB (1 GB). When a user uploads one file that is less than or equal to 1 GB it works fine, and they can open a new tab and upload another large file. But when they open a third tab of that form and try to upload, I get an IO error from the form, so I assume either PHP or Apache is limiting the maximum number of concurrent uploads to the server.
Edit:
When I use another computer with a totally separate internet connection (a separate/different external IP) I can upload there as well, so it seems like a limit on concurrent uploads per client/IP.
I think increasing the memory assigned to the PHP script can help you. I don't think this is the most appropriate solution, but when I was having problems handling a lot of big files I increased the memory for the script. If you are developing this for a very busy site I don't recommend this method, but it is worth trying.
ini_set('memory_limit','128M');
For testing: if you use -1 instead of 128M, the system will give unlimited memory to the script. You can use this to check whether the problem is caused by memory.
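If you go that route, it may be worth measuring how much memory the script actually needs before settling on a permanent limit. A rough way to do that (where the work is logged is up to you):

<?php
// Lift the limit only while testing; -1 means no limit.
ini_set('memory_limit', '-1');

// ... run the upload / processing work here ...

// Record the peak usage so memory_limit can be sized with some headroom.
error_log('Peak memory: ' . round(memory_get_peak_usage(true) / 1048576, 1) . ' MB');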

PHP File uploading stops

For a client I have built a simple script that uploads multiple files (images), resizes them, stores them in a temporary folder and then later moves them to their destination.
Resizing is done using PHP's GD, as Imagick is not available.
These images are about 2–4 MB apiece and the client uploads about 30 images in one go.
I used HTML5's multiple="" attribute which all works fine.
In my tests everything worked fine, because I used the standard Windows wallpaper images.
I can't find the source of the problem.
When uploading more than one image, the script fails; debugging tells me it does upload the second image but won't resize it.
I checked the memory usage for the images, which is approximately 105,724,352 bytes each.
My PHP ini settings:
max_execution_time = 300
max_input_time = 600
memory_limit = 200M
So you see, at the second image the memory reached its limit, making my script stop. Is that correct?
If so, how wise is it to upgrade the memory limit?
Thanks in advance!
EDIT:
It now seems the GD function imagecreatefromjpeg can't handle files with a resolution bigger than 3500px wide; my files are more than 5000px wide.
Does anyone have a workaround for this?
At this point I am wondering if it is wise to have the client on a shared host at all if he needs so much memory for these images.
So you see, at the second image the memory reached its limit, making my script stop. Is that correct?
Check your Apache error log (on *nix systems, /var/log/apache2/error.log) to see if that is really the problem.
If so, how wise is it to upgrade the memory limit?
You should not handle multiple image operations in one script. Make an AJAX request for each image and handle them in separate instances.
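If splitting the work per image is not an option, another approach is to estimate how much memory imagecreatefromjpeg() will need before calling it, using the dimensions reported by getimagesize(). This is only a sketch, and the 1.8 multiplier is a common rule of thumb rather than an exact figure:

<?php
// Rough check of whether GD can decode an image within memory_limit.
function gdCanDecode($path, $fudge = 1.8)
{
    $info = getimagesize($path);
    if ($info === false) {
        return false;
    }
    $channels = isset($info['channels']) ? $info['channels'] : 3;
    $bits     = isset($info['bits']) ? $info['bits'] : 8;

    // Uncompressed bitmap size GD has to hold, plus a safety margin.
    $needed = $info[0] * $info[1] * $channels * ($bits / 8) * $fudge;

    $limit = ini_get('memory_limit');
    if ($limit == -1) {
        return true; // no limit configured
    }
    // Convert shorthand notation (e.g. "200M") to bytes.
    $bytes = (float) $limit;
    switch (strtoupper(substr(trim($limit), -1))) {
        case 'G': $bytes *= 1024; // fall through
        case 'M': $bytes *= 1024; // fall through
        case 'K': $bytes *= 1024;
    }

    return memory_get_usage(true) + $needed < $bytes;
}

// Example use inside the upload loop (field name is illustrative):
if (!gdCanDecode($_FILES['images']['tmp_name'][$i])) {
    // Defer this image to its own request, or reject it with a clear message.
}

With a 200M limit, two decoded copies of an image that needs roughly 105 MB each (the figure reported in the question) will not fit, which is consistent with the failure appearing at the second image.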

PHP file_get_contents() Timeout?

I am in the early stages of building a PHP application, part of which involves using file_get_contents() to fetch large files from a remote server and transfer them to a user. Let's say, for example, the file being fetched is 200 MB.
Will this process time out if downloading to the server takes too long?
If so, is there a way to extend this timeout?
Can this file that is being downloaded also be transferred to a user simultaneously, or does the file have to be saved on the server then manually fetched by the user once download has completed?
I am just trying to make sure that I know what my options and limitations are before I get too far into it.
Thank you for your time.
Yes. You can use set_time_limit(0) or the max_execution_time directive to remove the time limit imposed by PHP.
You can open a stream of the file, and transfer it to the user seamlessly.
Read about fopen()
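As a rough illustration of that, here is a sketch that relays a remote file to the user in small chunks instead of loading it all with file_get_contents(). It assumes allow_url_fopen is enabled; the URL and filename are placeholders:

<?php
set_time_limit(0); // don't let max_execution_time kill a long transfer

$remote = 'http://example.com/big-file.zip'; // placeholder URL

$in = fopen($remote, 'rb');
if ($in === false) {
    header('HTTP/1.1 502 Bad Gateway');
    exit('Could not open remote file');
}

header('Content-Type: application/octet-stream');
header('Content-Disposition: attachment; filename="big-file.zip"');

// Relay the file in 8 KB chunks so only a small buffer lives in memory.
while (!feof($in)) {
    echo fread($in, 8192);
    flush();
}
fclose($in);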
If it's not a timeout, you may well run into memory issues depending on how your PHP is configured. You can adjust a lot of these settings manually through code without much difficulty.
http://php.net/manual/en/function.ini-set.php
ini_set('memory_limit', '256M');

PHP POST requests timeout

I'm currently working on an upload script supporting larger uploads (~50 MB) and I have very rapidly run into a problem! I'm using a traditional POST request with a form that uploads the file to a temporary location, and then moving it with PHP. Naturally I've updated my php.ini file to support files slightly larger than the default, and files around 15 MB upload really well.
The main problem is my hosting company. They let scripts time out after 60 seconds, meaning that a POST request taking longer than 60 seconds to complete dies before the temp file ever reaches the PHP script, which naturally yields an internal server error.
Not being able to raise the timeout on the server (after heated debates), I'm considering my options. Is there a way to bump the request or somehow refresh it to notify the server and reset the timer? Or are there alternative upload methods that don't time out?
There are a few things you could consider. Each has a cost, and you'll need to determine which one is least costly.
Get a new hosting company. This may be your best solution.
Design a rather complex client-side system that breaks the upload into multiple chunks and submits them via AJAX (a rough server-side sketch follows below). This is ugly, especially since it is only useful for getting around a host rule.
I'd really research #1.
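For what it's worth, the server side of option 2 can stay quite small. A rough sketch, assuming the client splits the file and sends each slice as an ordinary POST with made-up field names (filename, chunk, chunks, data):

<?php
// upload_chunk.php — receives one slice of the file per request,
// so no single request comes near the 60-second limit.
$name  = basename($_POST['filename']); // hypothetical field names
$index = (int) $_POST['chunk'];
$total = (int) $_POST['chunks'];

$partial = sys_get_temp_dir() . '/' . $name . '.part';

// Append this chunk to the partial file.
file_put_contents($partial, file_get_contents($_FILES['data']['tmp_name']), FILE_APPEND);

// When the last chunk arrives, move the assembled file into place.
if ($index === $total - 1) {
    rename($partial, '/path/to/uploads/' . $name); // destination is a placeholder
}

echo 'ok';

Each request then finishes well inside the 60-second window, regardless of how large the whole file is.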
With great difficulty. By far your easiest option is to dump the hard-headed host and pick one that actually lets you be productive. I personally use TSOHost; I've been with them for over a year and a half and have so far had absolutely no reason to complain (not even a slight annoyance).
Are you really sure it's a timeout issue? My first idea:
The transfer may have failed due to a configuration limit set in the web server's php.ini file. You need to change it there, or override it locally for your script.
# find it in php.ini used by your configuration
memory_limit = 96M
post_max_size = 64M
upload_max_filesize = 64M
Or directly in your script:
ini_set('memory_limit', '96M');
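// Note: post_max_size and upload_max_filesize are PHP_INI_PERDIR settings,
// so ini_set() cannot change them at runtime; the two lines below only take
// effect if set in php.ini, .htaccess (php_value) or .user.ini instead.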
ini_set('post_max_size', '64M');
ini_set('upload_max_filesize', '64M');

Any documentation about how Apache handles file uploads?

I've spent hours googling, as well as searching the Apache site, and I can't find any documentation about how Apache handles file uploads — particularly large ones. I've read anecdotal reports that PHP isn't involved until the upload is complete, which is what I'd expect. But as far as what Apache does during the upload, I can't find anything.
The reason I'm hot for documentation is that Apache is storing uploads entirely in memory, instead of streaming them to disk. httpd will use every byte of available memory on the server I'm using until it crashes. Typically the amount of physical memory consumed is 3x the size of the file being uploaded, and increases in the vicinity of 5 MB/s (nowhere near my upload speed).
I've tested this same request on another LAMP stack I'm using, and Apache memory usage doesn't change at all throughout the course of the upload.
Can anyone explain to me how Apache could handle the same upload so differently on two different servers? Any thought greatly appreciated.
Technically, PHP is handling the upload on behalf of Apache and buffering the file in RAM until it completes. However, your script will not gain control until after the upload completes (or aborts). Apache by itself won't buffer out to disk unless it has to. Think of it as an invisible "handle_upload()" function call that's transparently inserted as the very first thing in your script.
Back in the "everything is a CGI script" days, when language interpreters like PHP weren't embedded in the webserver process, POST data was sent to the CGI script via standard input. The file would pass through Apache directly to the CGI process and could be read byte by byte as it came in.
The answer is unsatisfying. I never found any documentation.
I continued poking around in the dark, finally stumbling on a mod_fcgid upgrade (from 2.2 to 2.3.6) that did the trick. Perhaps there was a bug in 2.2.
Memory usage still goes up in 2.3.6, but far less dramatically: only a few megabytes for a ~100 MB file. (However, when the upload finishes and the file is moved, memory usage instantly shoots up by ~100-200 MB, but then seems to be immediately released.)
This might help you, because the WAMP server has Apache in it.
http://www.wampserver.com/phorum/read.php?2,39439
