Upload form - can't upload / POST over 120 KB - PHP

I am building a website with an upload form. It will only upload files up to 120 KB; if anything bigger is tried, the browser just times out with no error message.
What I've done so far:
Changed all php.ini settings, .htaccess settings;
Tried different browsers;
Rebuilt the form on another server (hosting company) and it works fine
(removed everything from the upload except the bare basics);
This is where it starts getting weird. It works fine on my friend's computer.
I think I've eliminated:
The coding - because it does work;
The server - because it works fine on their computers and on my friend's computer;
The browser - I've used another web-hosting company and the form works fine with them, so I'm guessing it's not a browser problem, or an internet connection problem, or even a settings problem on my computer?
The problem: on my website the form won't work for anything over 120 KB, while on the test website on the other server it does work. And it's only my computer it doesn't work on; on other people's computers it does work. Why, and why capped at 120 KB? Am I missing something? I'm taking this very personally; it's as if the internet, my computer, or technology just doesn't like me.
EDIT 1:
I've just tried something else. I got another computer and tried it on that, to eliminate the possibility that it was a problem with my computer. It still doesn't work on a different computer at my house, so now I'm suspecting my broadband provider? But that doesn't explain why the test website on the other server works okay.

Review your settings in:
php.ini
; Maximum size of POST data that PHP will accept.
; Its value may be 0 to disable the limit. It is ignored if POST data reading
; is disabled through enable_post_data_reading.
; http://php.net/post-max-size
post_max_size=10M
; Whether to allow HTTP file uploads.
; http://php.net/file-uploads
file_uploads = On
; Maximum allowed size for uploaded files.
; http://php.net/upload-max-filesize
upload_max_filesize=512M
; Maximum number of files that can be uploaded via a single request
max_file_uploads = 10
; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
; Note: This directive is hardcoded to 0 for the CLI SAPI
max_execution_time = 30
; Maximum amount of time each script may spend parsing request data. It's a good
; idea to limit this time on production servers in order to eliminate unexpectedly
; long running scripts.
; http://php.net/max-input-time
max_input_time = 60
; Maximum amount of memory a script may consume (128MB)
; http://php.net/memory-limit
memory_limit = 128M
After editing the php.ini file, it's important to restart your httpd server so it re-reads php.ini!
Make sure that you set the units correctly. There is no shorthand byte notation
like "MB"; you should use "K", "M", or "G" for that.
Check this PHP FAQ.
httpd.conf and/or .htaccess
php_value post_max_size 10M
php_value upload_max_filesize 512M
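A quick way to confirm that the web server is actually picking up these values (and not a different php.ini, or an .htaccess override) is a small check script placed next to the upload form; this is just a minimal sketch:
<?php
// Print the limits the web SAPI is actually running with.
foreach (['file_uploads', 'post_max_size', 'upload_max_filesize',
          'max_execution_time', 'max_input_time', 'memory_limit'] as $key) {
    echo $key, ' = ', ini_get($key), '<br>';
}
// phpinfo(); // uncomment for a full dump, including which php.ini was loaded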

If it really is only your computer, it probably is a browser problem. Have you tried different browsers on your computer? Have you tried disabling all browser plugins and extensions?

If you have changed all of your PHP settings in php.ini and are sure they are in effect (for example, phpinfo() reflects your settings), yet it still doesn't work, it sounds like it might be a firewall issue where the network itself is limiting what can be uploaded.
It's unclear what this server is or where you have it set up, but I bet a firewall setting is in play.

Related

net::ERR_CONNECTION_RESET when large file takes longer than a minute

I have a multipart file upload in a form with a PHP backend. I've set max_execution_time and max_input_time in php.ini to 180, confirmed during the file upload that these values are in effect, and set TimeOut 180 in Apache. I've also set:
RewriteRule .* - [E=noabort:1]
RewriteRule .* - [E=noconntimeout:1]
When I upload a 250MB file on a fast connection it works fine. When I'm on a slower connection, or use a network link conditioner to artificially slow it down, the same file times out, and Chrome gives me net::ERR_CONNECTION_RESET after 1 minute (and 5 seconds), reliably. I've also tried other browsers with the same outcome, just different error messages.
There is no indication of an error in any log, and I've tried both over http and https.
What would cause the upload connection to be reset after 1 minute?
EDIT
I've now also tried a simple upload form that bypasses any framework I'm using; it still times out at 1 minute.
I've also just made a script that sleeps for two and a half minutes, and that works; the page takes around 2.5 minutes to load, so I can't see how it's browser or header related.
I've also used a server with more RAM to ensure it's not related to that. I've tested on 3 different servers with different specs but all from the same CentOS 7 base.
I've now also upgraded to PHP 7.2 and updated the relevant settings again, with no change in the problem.
EDIT 2
The tech stack for this isolated instance is
Apache 2.4.6
PHP 5.6 / 7.2 (tried both), has OPCache
Redis 3.2.6 for session information and key / value storage (ElastiCache)
PostgreSQL 10.2 (RDS)
Everything else in my tech stack has been removed from this test area to try and isolate the problem. EFS is on the system but in my most isolated test it's just using EBS.
EDIT 3
Here are some logs from the Chrome network debugger:
{"params":{"net_error":-101,"os_error":32},"phase":0,"source": {"id":274043,"type":8},"time":"3332701830","type":69},
{"params": {"error_lib":33,"error_reason":101,"file":"../../net/socket/socket_bio_adapter.cc","line":216,"net_error":-101,"ssl_error":1},"phase":0,"source": {"id":274043,"type":8},"time":"3332701830","type":56},
{"phase":2,"source":{"id":274038,"type":1},"time":"3332701830","type":159},
{"phase":1,"source": {"id":274038,"type":1},"time":"3332701830","type":164},
{"phase":1,"source": {"id":274038,"type":1},"time":"3332701830","type":287},
{"params": {"error_lib":33,"error_reason":101,"file":"../../net/socket/socket_bio_adapter.cc","line":113,"net_error":-101,"ssl_error":1},"phase":0,"source": {"id":274043,"type":8},"time":"3332701830","type":55},
{"params":{"net_error":-101},"phase":2,"source": {"id":274038,"type":1},"time":"3332701830","type":287},
{"params":{"net_error":-101},"phase":2,"source":{"id":274038,"type":1},"time":"3332701830","type":164},
{"params":{"net_error":-101},"phase":2,"source":{"id":274038,"type":1},"time":"3332701830","type":97},
{"phase":1,"source":{"id":274038,"type":1},"time":"3332701830","type":105},
{"phase":2,"source":{"id":274038,"type":1},"time":"3332701830","type":105},
{"phase":2,"source":{"id":274043,"type":8},"time":"3332701830","type":38},
{"phase":2,"source":{"id":274043,"type":8},"time":"3332701830","type":38},
{"phase":2,"source":{"id":274043,"type":8},"time":"3332701830","type":34},
{"params":{"net_error":-101},"phase":2,"source":{"id":274038,"type":1},"time":"3332701830","type":2},
I went through a similar problem; in my case it was related to mod_reqtimeout. Adding:
RequestReadTimeout header=20-40,MinRate=500 body=20,MinRate=500
to httpd.conf did the trick!
You can check the documentation here.
Hope it helps!
Original source here
ERR_CONNECTION_RESET usually means that the connection to the server has ceased without sending any response to the client. This means that the entire PHP process has died without being able to shut down properly.
This is usually not caused by something like an exceeded memory_limit. It could be some sort of Segmentation Fault or something like that. If you have access to error logs, check them. Otherwise, you might get support from your hosting company.
I would recommend trying some of these things:
Try cleaning the browser's cache. If you have already visited the page, it is possible for the cache to contain information that doesn’t match the current version of the website and so blocks the connection setup, making the ERR_CONNECTION_RESET message appear.
Add the following to your settings:
memory_limit = 1024M
max_input_vars = 2000
upload_max_filesize = 300M
post_max_size = 300M
max_execution_time = 990
Try setting the following input in your form:
In your processing script, increase the script execution time limit:
set_time_limit(200);
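Of the settings listed above, only memory_limit can also be raised from inside the script at runtime; upload_max_filesize, post_max_size, max_input_vars and max_input_time are per-directory settings and have to go in php.ini or .htaccess. A small sketch mirroring the values above:
<?php
// Runtime-adjustable limits only; the POST/upload size limits themselves
// must be configured in php.ini or .htaccess (they are PHP_INI_PERDIR).
set_time_limit(200);               // same as the suggestion above
ini_set('memory_limit', '1024M');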
You might need to tune up the SSL buffer size in your apache config file.
SSLRenegBufferSize 10486000
The name and location of the conf file is different depending on distributions.
In Debian you find the conf file in /etc/apache2/sites-available/default-ssl.conf
Sometimes it is the mod_security module that prevents POSTs of large data, at approximately 171 KB. Try adding/modifying the following in mod_security.conf:
SecRequestBodyNoFilesLimit 10486000
SecRequestBodyInMemoryLimit 10486000
I hope something might work out!
In case anybody else runs into this: there is also a problem with this relating to PHP-FPM. If you don't set "ProxyTimeout" in your httpd.conf, the proxied connection to PHP-FPM uses a default timeout of one minute. It took me several hours to figure out the problem, as I was initially thinking of all the normal settings like everyone else.
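If your stack matches that description (Apache proxying requests to PHP-FPM), a directive along these lines in httpd.conf is the kind of change being described; the value is only an example:
# Raise the proxy timeout so long uploads are not cut off after the
# default of roughly one minute (300 seconds here is an assumption).
ProxyTimeout 300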
I had the same problem. I switched to a resumable (chunked) file upload method, where if the internet connection drops and then reconnects, the upload resumes from where it left off.
Check out the library https://packagist.org/packages/pion/laravel-chunk-upload
Installation
composer require pion/laravel-chunk-upload
Add service provider
\Pion\Laravel\ChunkUpload\Providers\ChunkUploadServiceProvider::class
Publish the config
php artisan vendor:publish --provider="Pion\Laravel\ChunkUpload\Providers\ChunkUploadServiceProvider"
In my opinion it may be related to one of these:
The Apache config (/etc/httpd/conf or /etc/apache2/conf):
Timeout 300
The PHP config (php.ini):
upload_max_filesize = 2000M
post_max_size = 2000M
max_input_time = 300
memory_limit = 3092M
max_execution_time = 300
The PostgreSQL config (run this query):
SET statement_timeout TO 0;
A proxy (or Apache mod_proxy): it may also be due to the proxy timeout configuration.
In case anyone has the same issue: the problem I encountered was that the HTTP request had to go through a proxy server and a WAF. Small file uploads were fine, but with large files the TCP connection was automatically closed. How to validate:
Simply change your hosts file to point the domain directly at the web server's IP address (or use Firefox with no proxy if there is no WAF). If your problem goes away, it was caused by the proxy or the WAF sitting between your web server and the browser.
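For example, a hosts-file entry like the following (both the IP address and the domain are placeholders for your own) sends the browser straight to the web server, bypassing whatever sits in front of it:
# /etc/hosts (or C:\Windows\System32\drivers\etc\hosts on Windows)
203.0.113.10    example.com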
A connection reset occurs when the PHP process dies without a proper error message.
Changing the Oracle client version from 19 to 12c, and then configuring it appropriately in php.ini, solved the connection reset issue for our team.

Uploading Images with PHP Script Failing

I'm fairly new to PHP, but I'm having a recurring issue via multiple different scripts and servers when uploading images via ShareX to my server with a custom script, specifically this one.
I've migrated servers (I was on a shared host, now I'm on a VPS), and have since changed to using this script, but I'm still having the issue and I don't know what exactly the problem is.
The issue (it does not occur 100% of the time, but it does most of the time; sometimes it works after retrying) is that uploading images over a certain size, about 250-500 KB, times out or fails most of the time. After 60 seconds, I get a 502 (Bad Gateway) error in ShareX.
I've looked up common solutions to similar problems ("large" files timing out in PHP) and have checked the following variables in my php.ini file.
max_execution_time = 60
max_input_time = 60
memory_limit = 128M
post_max_size = 8M
When uploads are successful, it takes a few seconds in total to upload and get the link of the uploaded image back, but when it fails, it's always 60 seconds and then the error. There is no middle ground: either it succeeds instantly or it times out after 60 seconds.
I don't know exactly how to go about finding what the error (if any) is. When it happens, ShareX reports a (502) Bad Gateway error, the 'Response:' is just the source code of the page (the script is set up to redirect you to this page if it detects you aren't uploading anything or the upload fails), and the 'Stack Trace' is the following:
StackTrace:
at System.Net.HttpWebRequest.GetResponse()
at ShareX.UploadersLib.Uploader.UploadData(Stream dataStream, String url, String fileName, String fileFormName, Dictionary`2 arguments, NameValueCollection headers, CookieCollection cookies, ResponseType responseType, HttpMethod method, String requestContentType, String metadata)
Edit: My server is behind Cloudflare, and I read that Cloudflare might cause problems. However, I've checked the settings and the maximum upload size is set to 100 MB on Cloudflare, and pausing it doesn't seem to help.
Edit: I removed the limit on post_max_size, which was 8M, and it seems to have partly fixed the issue. I can now upload things up to about 3 MB, but after that it always fails with a custom error message from the script.
When increasing file POST limits, you may need to change at least 2 settings:
upload_max_filesize = 30M
post_max_size = 32M
I don't think it has anything to do with Cloudflare. See if you can check the Apache error log if the above settings don't work.
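One way to tell whether PHP itself rejected the upload (a size limit) rather than the gateway timing out is to log the upload error code in the receiving script. A minimal sketch, assuming the form field is named 'file':
<?php
// UPLOAD_ERR_INI_SIZE / UPLOAD_ERR_FORM_SIZE mean a PHP size limit was hit,
// which is a different failure mode from a 502 at the gateway.
if (isset($_FILES['file']) && $_FILES['file']['error'] !== UPLOAD_ERR_OK) {
    error_log('Upload failed, error code ' . $_FILES['file']['error']);
    http_response_code(400);
    exit;
}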

PHP still won't allow file uploads larger than 2 MB

I have a Debian Squeeze install on an Amazon EC2 instance running Apache2, and PHP 5.3.3-7. I would like it to be able to accept uploads from a standard point-and-shoot camera (about 5 MB). Accordingly, I've edited php.ini in /etc/php5/apache2/ to allow for up to 18MB uploads, and I've upped the time PHP will allow to work on a script.
Despite restarting Apache, and even the machine itself, it absolutely refuses to upload any file larger than 2 MB. Is this an EC2 problem or is it still a PHP issue? I'm fairly sure I've ruled out every possibility of it being PHP, but I've been staring at the same 4 lines of code for the last week and searching like a mad person for what this could possibly be.
/etc/php5/apache2/php.ini:
max_execution_time = 120
...
max_input_time = 120
...
upload_max_filesize = 18M
...
post_max_size = 18M
I have double-checked just now with phpinfo(): these settings are in effect, but it still does not work.
Check settings upload_max_filesize and post_max_size in your php.ini file
The problem is likely to be Suhosin. The default PHP packages from APT on Debian have Suhosin built in.
This also affects your upload size limit. Take a look at this link for a fix and an explanation:
http://www.cyberciti.biz/faq/linux-unix-apache-increase-php-upload-limit/
I don't know if that will be of any help with your setup?
In Apache:
TimeOut
Amount of time the server will wait for certain events before failing a request
LimitRequestBody
Restricts the total size of the HTTP request body sent from the client
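If a LimitRequestBody value is set anywhere (virtual host config, .htaccess, hosting defaults), requests larger than it are rejected outright. For example, to allow bodies up to 20 MB (the value is in bytes and purely illustrative; 0 removes the limit):
LimitRequestBody 20971520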
Also, on some server setups you can't change php.ini values via scripts.
Try this:
max_execution_time = 120
max_input_time = 120
upload_max_filesize = 40M
post_max_size = 40M
Save, then run:
sudo service apache2 restart

PHP Connection Reset on Large File Upload Regardless Correct Setting

I am having a very common problem for which, it seems, none of the available solutions work.
We have a LAMP server which is receiving a high amount of traffic. Using this server, we perform a regular file submission upload. Small file uploads work perfectly. On files of around 4-5 MB, the submission upload fails intermittently (sometimes it works, but many times it fails).
We have the following configuration on our PHP:
max_input_time: 600
max_execution_time: 600
max_upload_size: 10M
post_max_size: 10M
Apache setting:
Timeout: 600
Keep-Alive Timeout: 15
Keep-Alive: On
Per Child: 1000
Max Conn: 100
So I wonder if anyone can help me with this. We have found similar issues and solutions online, but none of them work in our case.
Thank you so much. Any input / feedback is much appreciated!
The connection could be terminating at several places:
Apache
Post size limit inside of php.ini
Memory limit inside of php.ini
Input time limit inside of php.ini
Execution time limit inside of php.ini or set_time_limit()
I would increase all of these and see if the problem still persists. But you will have to bounce Apache for the changes in php.ini to take effect.
These are also affected by the end user's connection speed. If it is failing only for certain users, it's because their connection is slower than others' and their connection with the server is terminating.

PHP: Uploading large files fail

I'm confused... I can't seem to upload files in the 2 GB range. When I try using curl to send a 1.92 GB file to my site (through an API), it doesn't report anything at all; it's just blank. When I send a 1 KB file, it reports back like it should.
When I try uploading via the upload form, it ends up freezing midway, around 33%, although I'm not sure if only the progress bar has frozen or if the actual file upload itself has been suspended. I suspect it's only the progress bar, because it still says data is being sent even after the progress bar freezes.
My php.ini (yes, it's reflected by phpinfo() as well):
register_globals = Off
magic_quotes_gpc = Off
post_max_size = 2047M
upload_max_filesize = 2047M
max_execution_time = 25200 ; Maximum execution time of each script, in seconds
max_input_time = 25200 ; Maximum amount of time each script may spend parsing request data
memory_limit = 2048M ; Maximum amount of memory a script may consume
short_open_tag = On
My VPS doesn't actually have 2 GB of RAM at its disposal, but does memory_limit really need to be set this high?
How should I go about testing this? I know 400 MB files work; I haven't tested anything between 400 MB and 1.92 GB.
You will need a premium account to test up to 2 GB, so here is one you can play with:
User: testreferral
Pass: 1234
http://filefx.com
I don't understand where this problem is arising.
Check for:
Memory limit. Try uploading files above and below the actual memory limit.
Time limit. Your uploads don't take 7+ hours, do they?
The effective settings. Some settings might be overridden by server or other settings.
PHP: mysql query skipped/ignored after large file uploads?
MySQL was timing out during the file upload, so the file wasn't showing up in the DB.
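In that situation, a sketch along these lines can guard the query, reconnecting if MySQL's wait_timeout dropped the connection while the upload was being processed (the connection details and table/column names are illustrative):
<?php
$db = new mysqli('localhost', 'user', 'pass', 'uploads');

// ... move_uploaded_file() and other slow processing here ...

if (!$db->ping()) {   // wait_timeout may have killed the connection meanwhile
    $db = new mysqli('localhost', 'user', 'pass', 'uploads');
}
$name = $_FILES['file']['name'];
$size = (int) $_FILES['file']['size'];
$stmt = $db->prepare('INSERT INTO files (name, size) VALUES (?, ?)');
$stmt->bind_param('si', $name, $size);
$stmt->execute();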
