Debugging techniques for a PHP script with many, many file uploads? - php

I have a PHP script which services a form that contains many, many file uploads. There are about 40 separate files being uploaded, although each one is less than 30 KB in size (so about a megabyte of actual data, total, is being transferred).
I'm using CakePHP for this, if it somehow makes a difference.
The problem I'm having is that only 19 of the files are being uploaded (once they're uploaded I send out an email using GMail as an SMTP relay). I've checked the obvious things, such as those listed here:
Can file uploads time out in PHP?
And I've got generous values for everything.
Could anyone suggest strategies to use for running down this problem and/or specific things to check?

I don't think it is a CakePHP issue. Apache and PHP have limits on the number of files and the maximum size of files that can be uploaded at a time; there is also a maximum POST size.
There are two ways to overcome this:
You can override the settings in your .htaccess file, like:
php_value upload_max_filesize 10M
php_value post_max_size 15M
php_value max_file_uploads 50
Or change the same values in your php.ini file and restart the server.
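The php.ini equivalents of the .htaccess directives above would look something like this (the values are examples; size them to your form). It's worth noting that max_file_uploads defaults to 20, which would match the ~19-file cutoff described in the question:

```ini
; php.ini -- example limits for ~40 small uploads in one request
upload_max_filesize = 10M   ; per-file cap
post_max_size = 15M         ; whole request body; must exceed the sum of all files
max_file_uploads = 50       ; PHP silently drops files beyond this count (default 20)
```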

So after much, much digging, it looks like it was a problem with my SMTP setup. I couldn't tell you how or why, but the problem was that attempting to send a zillion emails (well, ~40) caused the PHP process to halt (no error messages, nothing). Running the exact same code under XDebug worked fine, and putting a quarter-second sleep() after each email appears to work.
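A minimal sketch of that workaround, assuming a generic mailer loop ($messages and $mailer are hypothetical names, not actual CakePHP API):

```php
<?php
// Hypothetical send loop: pause briefly between messages so the
// SMTP relay (GMail here) isn't hit with ~40 rapid-fire sends.
foreach ($messages as $message) {
    $mailer->send($message);  // assumed mailer API
    usleep(250000);           // quarter-second pause, as described above
}
```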
(On a semi-related note: Is there a way to delete a StackOverflow question that you've asked? :) )

Related

Can't upload large file in Laravel (php.ini settings are correct in my opinion)

I have the following problem in Laravel.
I would like to upload a file through a form, but for some reason if the file is larger than around 2100 KB, the validation fails and says that the file is 'required' and I did not provide it.
I've read in numerous articles that this can be caused by php.ini settings. On my server they are the following:
upload_max_filesize 64M
post_max_size 64M
These values are copied from the output of phpinfo(), so they are in effect.
And despite this, the upload fails even for a 2 MB file. Do you have any ideas what I could check/set to solve this?
I am using Laravel 5.2 and PHP 7.
Check which server software you are using. Nginx, for instance, has its own limit (client_max_body_size, which defaults to 1 MB, I believe). Apache may have one too. Consult the respective manuals for those packages on how to configure them. Or, if you're using shared hosting, contact support to see if they can increase the limit.
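If it is Nginx, the directive in question is client_max_body_size; a sketch (the 64M value is just an example chosen to match the php.ini settings above):

```nginx
# nginx.conf -- http, server, or location block
client_max_body_size 64M;   # default is 1m, which rejects a ~2 MB upload with HTTP 413
```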
This isn't a really scalable solution, though. Next time you might want to upload a 100 MB file, and you probably don't want to allow 100 MB requests on your servers. A better approach would be to split the file into smaller chunks in the frontend with JavaScript, submit them as parts of the same upload, then recombine the parts on the server once the file is completely uploaded. Beware of the additional checks you'll have to do here, though.
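A rough sketch of the server-side half of that idea, assuming the frontend sends fields named "uploadId", "chunkIndex", and "totalChunks" with each chunk (these names are assumptions, not part of any particular library). This is illustrative only and omits locking and validation:

```php
<?php
// Store each incoming chunk, then concatenate once the last one arrives.
$dir = sys_get_temp_dir() . '/' . basename($_POST['uploadId']);
if (!is_dir($dir)) {
    mkdir($dir, 0700, true);
}
move_uploaded_file($_FILES['chunk']['tmp_name'],
    $dir . '/' . (int)$_POST['chunkIndex']);

// If this was the final chunk, reassemble the complete file.
if ((int)$_POST['chunkIndex'] + 1 === (int)$_POST['totalChunks']) {
    $out = fopen($dir . '/complete', 'wb');
    for ($i = 0; $i < (int)$_POST['totalChunks']; $i++) {
        fwrite($out, file_get_contents($dir . '/' . $i));
    }
    fclose($out);
}
```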
You might want to incorporate the following into your own code. Note, however, that upload_max_filesize and post_max_size are PHP_INI_PERDIR settings, so ini_set() cannot change them at runtime; they must be set in php.ini, .htaccess, or a .user.ini file (and file_uploads is PHP_INI_SYSTEM, settable only in php.ini). Only the execution time can be raised from within the script:
<?php
//--- this can be changed at runtime; the size limits below cannot.
ini_set("max_execution_time", 600); //--- 10 minutes
?>
For the size limits, use .htaccess (Apache with mod_php):
php_value upload_max_filesize 25M
php_value post_max_size 35M
In my case, it was a disk space issue: there was not enough space to store the file. Laravel should handle this with a proper message instead of indicating that the user didn't upload anything.
If you are not using any other package to upload files, then after changing the php.ini values remember to restart Apache.

Uploading files gives ERR_CONNECTION_RESET

When uploading files that are larger and take longer, I get this response: the first time I try to upload a big file I seem to get ERR_CONNECTION_ABORTED; after that it becomes ERR_CONNECTION_RESET.
I checked all the upload limits; the timeouts are all 240 seconds or higher and the size limits are 256 MB. I tried uploading a 200 MB file.
It always seems to happen at the same moment, which is at the 1-minute mark. There are no 60-second limits or anything in the phpinfo() output. I also tried upping all the limits locally, with no result.
Here is the main upload part. If additional info is required, I'll update the question A.S.A.P.
Edit: I just came across this, which is being used as well: SWFUpload.
It is returning error code 2038.
Reinstalling Flash doesn't work, and neither do other browsers!
Increase upload_max_filesize, max_input_time, post_max_size, and memory_limit in your php.ini. Even then, 200 MB is quite big; if it is text/CSV you might want to split the file into multiple files.
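As a starting point, php.ini values along these lines (examples only, not tuned for any particular server) would cover a 200 MB upload:

```ini
; php.ini -- example values sized for a ~200 MB upload
upload_max_filesize = 256M
post_max_size = 260M      ; slightly larger than upload_max_filesize
max_input_time = 600
memory_limit = 512M       ; only matters if the script buffers the whole file
```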

How to upload large files above 500MB in PHP [duplicate]

This question already has answers here:
upload large files using php, apache
(5 answers)
Closed 9 years ago.
I made an upload page in PHP, but I don't know why the page will not upload documents larger than 500 MB. This is my first time trying to upload something that large. I changed all the relevant settings in php.ini (post_max_size = 700M, upload_max_filesize = 600M, and max_execution_time = 300). The upload code is below:
if (isset($_FILES['upload']) && !empty($_FILES['upload']['name'])) {
    move_uploaded_file($_FILES['upload']['tmp_name'], $this->filePath . $this->fileName);
}
I need help; I wonder if there is something I'm not doing right.
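Before reaching for configuration changes, it's worth inspecting the error code PHP records for the upload rather than calling move_uploaded_file() unconditionally; a diagnostic sketch:

```php
<?php
// Report why an upload failed, using PHP's built-in error constants.
if (isset($_FILES['upload'])) {
    switch ($_FILES['upload']['error']) {
        case UPLOAD_ERR_OK:
            // Safe to move the file.
            break;
        case UPLOAD_ERR_INI_SIZE:
            echo "File exceeds upload_max_filesize in php.ini";
            break;
        case UPLOAD_ERR_FORM_SIZE:
            echo "File exceeds MAX_FILE_SIZE in the HTML form";
            break;
        case UPLOAD_ERR_PARTIAL:
            echo "File was only partially uploaded";
            break;
        default:
            echo "Upload failed with code " . $_FILES['upload']['error'];
    }
}
```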
Do you think increasing the upload size limit will solve the problem? What if someone uploads a 2 GB file, what happens then? Have you taken into consideration the memory usage of such a script?
Instead, what you need is chunked upload, see here : Handling plupload's chunked uploads on the server-side
and here : File uploads; How to utilize "chunking"?
By configuration, PHP only allows to upload files up to a certain size. There are lots of articles around the web that explain how to modify this limit. Below are a few of them:
PHP Increase Upload File Size Limit
How do I increase the PHP upload limits?
Uploading large(big) files in PHP using .htaccess
For instance, you can edit your php.ini file and set:
memory_limit = 32M
upload_max_filesize = 24M
post_max_size = 32M
You will then need to restart Apache.
Note:
That being said, uploading large files like that is not very reliable; errors can occur. You may want to split the files and include some additional data for error correction. One way to do that is to use PAR recovery files. You can then check the files after upload using the par command-line utility on Unix-like systems.
I assume you mean that you are transferring the files via HTTP. While not quite as bad as FTP, it's not a good idea if you can find another way of solving the problem. HTTP (and hence the component programs) is optimized around transferring relatively small files across the internet. While the protocol supports server-to-client range requests, it does not allow for the reverse operation. Even if the software at either end were unaffected by the volume, the more data you push across, the greater the interval during which you could lose the connection. And that is the biggest problem here.

If i have a serverside PHP script which uploads files to another server, what limits it?

I have a client with a site (built with the Kohana framework) which has a chunking file uploader that, after upload, posts the file to Vimeo for conversion. The problem is that although the file is uploaded successfully to the server, it then errors when sending it on to Vimeo. My suspicion is that this POST to Vimeo is hitting a limit which the first-stage chunking uploader avoids.
What settings in php.ini should I be changing to fix this?
The video files are up to 2GB in size.
Update:
In answer to your questions: this is a 500 error. I have no more detail than that, because it only happens on the live server and not on staging or testing. I have been told not to turn on display_errors for PHP, as this would show errors on the live site (which apparently do occur); also, this is not my code.
There is a 500 MB limit on Vimeo videos for basic accounts, as seen here. I'm guessing the account your server is registered with is a basic one, even though your computer might be logged in with a Plus account. You'll need to check how to connect the server using your Plus account, or restrict your videos to 500 MB.
With more information we would be able to provide better answers.
You can try the following in your .htaccess file; increase or decrease the values accordingly. By default PHP sets upload_max_filesize to 2M and post_max_size to 8M.
php_value upload_max_filesize 20M
php_value post_max_size 20M
php_value max_execution_time 200
php_value max_input_time 200
OK everyone, thanks for the answers. It turns out the issue was not with any of the limits; instead it was a problem with the file-chunking routine and an error in reporting the sizes of the chunks sent over to Vimeo. This was only an issue on the live server, which was RedHat 4.1, and not on any of the staging or testing machines. A temporary fix has been made by removing the size verification while we track down the issue. Thanks for all your help.

Upload File Size Limit: Symfony Admin Generator Module

I have form created by the admin generator in the backend of a website. It allows the upload of a video to the site.
It works fine but, strangely, the upload fails for files of 10 MB or over. However, I have not set any file limits in my form.
Are there Symfony/PHP/Apache/web browser settings regarding this type of behaviour that I can look into?
Or is there a way I can inform Symfony that I'd like to permit larger files?
Even though I have never worked with Symfony, I expect the problem is due to limits on your web server.
If you have the possibility to edit or add a .htaccess file, the following line will probably help you:
php_value upload_max_filesize 100M
The 100M in the example stands for 100 megabytes.
Also make sure that (at a minimum) you update post_max_size to match. See the PHP documentation, especially the sections on "Common Pitfalls" and "Error Messages Explained".
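To confirm which values are actually in effect for the web SAPI (the CLI often reads a different php.ini, so `php -r` output can mislead), a quick check you can drop into a page:

```php
<?php
// Print the effective upload limits for this SAPI/vhost.
echo 'upload_max_filesize = ', ini_get('upload_max_filesize'), "\n";
echo 'post_max_size       = ', ini_get('post_max_size'), "\n";
echo 'max_file_uploads    = ', ini_get('max_file_uploads'), "\n";
```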