Undefined index error in large file uploads - PHP

I am trying to write an upload script, using Uniform Server on Windows 7. My upload_max_filesize is 10M. I want to check whether the user sends a file of the correct size or not, so I am checking the error code with this code:
print_r($_FILES['userfile']['error']);
This works when I upload a file below the limit and shows 0 on screen. But if I upload a file above the limit, it does not show an error code; it gives an undefined index error instead. How can I solve this and see the error code when the uploaded file exceeds the size limit?
Thanks.

There are several limiting factors for the file size (depending on the server):
The upload_max_filesize you mentioned.
An upper limit for HTTP POST data (post_max_size in PHP), or whichever HTTP method you use
A (soft) upper limit in the client's browser
Timeouts limiting the file size due to the limited response time
Proxies
Check the other ones and never rely on any file being present. Always check existence first.
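A minimal sketch of that existence check, using the userfile field name from the question: when the whole POST body exceeds post_max_size, PHP discards the form data, so $_FILES stays empty (hence the undefined index) even though $_SERVER['CONTENT_LENGTH'] still reports the request size.
// Sketch: guard the index before reading the upload error code.
if (isset($_FILES['userfile']['error'])) {
    echo 'Upload error code: ' . $_FILES['userfile']['error'];
} elseif (!empty($_SERVER['CONTENT_LENGTH']) && empty($_FILES) && empty($_POST)) {
    // The whole POST body exceeded post_max_size, so PHP discarded it.
    echo 'Request of ' . $_SERVER['CONTENT_LENGTH'] . ' bytes exceeded post_max_size ('
        . ini_get('post_max_size') . ').';
} else {
    echo 'No file field was submitted.';
}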

Open all of your php.ini-like files, search for post_max and upload_max, and change their values to 1000M.
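If several php.ini-like files exist, a small diagnostic sketch (not part of the original answer) can confirm which one PHP actually loads and what the effective limits are:
// Print the loaded ini file and the effective upload limits.
echo 'Loaded php.ini: ' . php_ini_loaded_file() . PHP_EOL;
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . PHP_EOL;
echo 'post_max_size: ' . ini_get('post_max_size') . PHP_EOL;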

Related

Writing to bigger files leads to Internal Server Error

Implementing a feature for uploading files with a (potentially) unlimited filesize using chunked-file uploading and WebWorkers, I stumbled upon a quite strange problem:
Whenever I attempt to write to a file bigger than 128-134 MB using fwrite(), an internal server error is raised and script execution stops. The problem can be simplified to this (hopefully self-explanatory) test case:
$readHandle = fopen("smallFile", "r"); // ~ 2 MB
$writeHandle = fopen("bigFile", "a"); // ~ 134 MB
// First possible way of writing data to the file:
// If the file size of bigFile is at approx. 134 MB, this
// will result in an HTTP 500 Error.
while (!feof($readHandle)) {
    fwrite($writeHandle, fread($readHandle, 1024 * 1024));
}
// Second way of reproducing the problem:
// Here, the data is just NOT written to the destination
// file, but the script itself doesn't crash.
// stream_copy_to_stream($readHandle, $writeHandle);
fclose($readHandle);
fclose($writeHandle);
When using stream_copy_to_stream, the script doesn't crash, but the data is just not written to the destination file.
Having contacted the support team of my (shared) server host, I got the answer that this limit had something to do with the PHP configuration variables post_max_size and upload_max_filesize. However, neither do the set values (96 MB for both) correspond to the measured maximum file size (134 MB) at which files are writeable, nor does the problem exist when I apply the same values to my local test server.
Also, I could not find any information about a potential correlation between PHP_MEMORY_LIMIT (the hosting plan I am using states 512 MB) and the maximum writeable file size of 128-134 MB (of which 512 MB is a multiple).
Does anybody know if
the said configuration values really correspond to the problem at all?
there is any other way of continuing to append data to such a file?
P.S.: This SO thread might be based on the same problem, but the question(s) here are different.
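One way to make failures like the silent stream_copy_to_stream case visible is to check the return values explicitly; a diagnostic sketch reusing the file names from the test case above (fwrite() returns the number of bytes written, or false on failure):
$readHandle = fopen("smallFile", "r");
$writeHandle = fopen("bigFile", "a");
while (!feof($readHandle)) {
    $chunk = fread($readHandle, 1024 * 1024);
    if ($chunk === false) {
        error_log('fread failed');
        break;
    }
    $written = fwrite($writeHandle, $chunk);
    if ($written === false || $written < strlen($chunk)) {
        // Surface the short write or failure together with the last PHP error, if any.
        $err = error_get_last();
        error_log('fwrite failed or was truncated: ' . ($err ? $err['message'] : 'no PHP error recorded'));
        break;
    }
}
fclose($readHandle);
fclose($writeHandle);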

Uploading Images with PHP Script Failing

I'm fairly new to PHP, but I've been having a recurring issue across multiple different scripts and servers when uploading images via ShareX to my server with a custom script, specifically this one.
I've migrated servers (I was on a shared host, now I'm on a VPS), and have since changed to using this script, but I'm still having the issue and I don't know what exactly the problem is.
The issue (it does not occur 100% of the time, but it does most of the time; sometimes it works after retrying) is that uploading images over a certain size, about 250-500 KB, times out or fails. After 60 seconds, I get a 502 (Bad Gateway) error in ShareX.
I've looked up common solutions to similar problems ("large" files timing out in PHP), and have checked the following variables in my PHP.ini file.
max_execution_time = 60
max_input_time = 60
memory_limit = 128M
post_max_size = 8M
When uploads are successful, it takes a few seconds in total to upload and get the link of the uploaded image returned, but when it fails, it's always 60 seconds and then the error. There is no middle ground: either it succeeds almost instantly or it times out after 60 seconds.
I don't know exactly how to go about finding what the error (if any) is. When it happens, ShareX reports a (502) Bad Gateway error, the 'Response:' is just the source code of the page (the script is set up to redirect you to this page if it detects you aren't uploading anything or the upload fails), and the 'Stack Trace' is the following:
StackTrace:
at System.Net.HttpWebRequest.GetResponse()
at ShareX.UploadersLib.Uploader.UploadData(Stream dataStream, String url, String fileName, String fileFormName, Dictionary`2 arguments, NameValueCollection headers, CookieCollection cookies, ResponseType responseType, HttpMethod method, String requestContentType, String metadata)
Edit: My server is behind Cloudflare, and I read that Cloudflare might cause problems. However, I've checked the settings, the maximum upload size is set at 100 MB on Cloudflare, and pausing it doesn't seem to help.
Edit: I removed the limit on post_max_size, which was 8M, and it seems to have partly fixed the issue. I can now upload files up to about 3 MB, but beyond that it always fails with a custom error message from the script.
When increasing file POST limits, you may need to change at least 2 settings:
upload_max_filesize = 30M
post_max_size = 32M
I don't think it has anything to do with Cloudflare. See if you can check the Apache error log if the above settings don't work.
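It can also help to log on the server what PHP itself reports about the failing uploads; a hedged sketch (the form field name "file" is an assumption here and depends on how the ShareX custom uploader is configured):
// Diagnostic sketch: record the upload error code and the effective limits.
if (isset($_FILES['file'])) {
    error_log('Upload error code: ' . $_FILES['file']['error']
        . ', size: ' . $_FILES['file']['size']
        . ', upload_max_filesize: ' . ini_get('upload_max_filesize')
        . ', post_max_size: ' . ini_get('post_max_size'));
} else {
    error_log('No file field in request; the POST body may have exceeded post_max_size.');
}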

Ajax error - 0 after a long php script

I am using jQuery AJAX to send the URL of a file (a CSV file) located on the server to my PHP script so it can process it.
The CSV file contains telephone calls. With a file of even 10,000 calls everything is OK, but if I try a big file with, for example, 20,000 calls, then I get an AJAX error 0. I check for a server response with Firebug but get none.
This behaviour occurs after roughly 40 minutes of waiting for the PHP script to end. Why do I get this error on big files only? Does it have to do with Apache, MySQL, or the server itself? Anyone able to help will be my personal hero, because this is driving me nuts.
I need a way to figure out exactly what is happening, but Firebug won't return a server response. Is there any other way I can find out what's happening?
I checked the PHP error log and it reports nothing on the matter.
Thanks in advance.
The script may have timed out:
See your php.ini file
max_execution_time
max_input_time ; for the max time an input can be processed
Where your php.ini is depends on your environment; more information: http://php.net/manual/en/ini.php
Check:
max_input_time
This sets the maximum time in seconds a script is allowed to parse input data, like POST and GET. It is measured from the moment of receiving all data on the server to the start of script execution.
max_execution_time
This sets the maximum time in seconds a script is allowed to run before it is terminated by the parser. This helps prevent poorly written scripts from tying up the server. The default setting is 30. When running PHP from the command line the default setting is 0.
Also
Your web server can have other timeout configurations that may also interrupt PHP execution. Apache has a Timeout directive and IIS has a CGI timeout function. Both default to 300 seconds. See your web server documentation for specific details.
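For a job known to run for roughly 40 minutes, both directives would need to be raised well above their defaults; an illustrative php.ini sketch (the values are examples, not recommendations):
max_execution_time = 3600 ; allow up to an hour of processing
max_input_time = 3600 ; allow up to an hour for parsing the input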
First, enable PHP error reporting by placing the code below at the top of the PHP file:
error_reporting(E_ALL);
Then, as Shamil explained in this answer, check the max_execution_time setting of your PHP configuration.
To check max_execution_time, open your php.ini file, search for it, and then change it to a larger value, say one hour (3600).
I hope this will fix your issue.
Thank you
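As a runtime variant of the same advice (a sketch; these overrides only work if the host does not lock the settings), the top of the long-running script could contain:
error_reporting(E_ALL);
ini_set('display_errors', '1'); // E_ALL alone does not make errors visible in the output
set_time_limit(3600);           // raises the execution limit for this request only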

Need workaround for AJAX uploader requiring massive PHP ini memory limit, seg faults when exceeding!

We have recently replaced our flash-based uploader with this ajax one:
http://valums.com/ajax-upload/
Unfortunately, this uses the POST mechanism to transfer data to the server, which means that if I have a 50MB upload limit, I need to add at least 50MB to my PHP ini, as the data is added into the $_POST array.
Unfortunately, this means that all pages now have this huge added limit, which is not acceptable. I cannot even set the limit for the page on the fly, since ini_set would run AFTER the $_POST processing has already occurred. Can anyone think of an alternative solution?
In addition, anything exceeding the max limit causes a PHP seg fault / fatal error! Any ideas?
You can do something like this in Apache in the .conf file for your site's configuration:
<Location /upload.php>
    php_value memory_limit 60M
</Location>
That makes the higher memory limit apply only to scripts invoked via /upload.php in the URL. You can do similar overrides with <Files>, <FilesMatch>, <Directory>, etc...
The override has to be done at this level, since as you've found out, by the time ini_set would get executed, the script's already been killed.
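If the main Apache configuration is not editable (on shared hosting, for example), the same kind of per-script override can usually go into an .htaccess file, provided PHP runs as an Apache module and AllowOverride permits it; a sketch with example values, not taken from the original answer:
<FilesMatch "^upload\.php$">
    php_value memory_limit 60M
    php_value upload_max_filesize 50M
    php_value post_max_size 55M
</FilesMatch>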
The memory limit will not preallocate the 50 MB; it's a hard cap on how much the script can use.
If no script was failing before, it's quite probable that you won't even notice the increased memory limit.

24MB PHP file upload fails silently

I'm writing an app that accepts .mp4 uploads.
So I have a 24.3 MB .mp4 that is posted to the server, but it fails silently.
The next smallest file I have is a 5.2 MB .flv. It's not the file type, of course, but the file size.
I wonder if anybody could shed some light on this?
P.S. the relevant php.ini entries are as follows:
memory_limit = 256M
upload_max_filesize = 32M
Help!
You should also set post_max_size. Files are sent using HTTP POST.
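For example (illustrative values only), post_max_size limits the entire request body, so it should be at least as large as upload_max_filesize plus some overhead for the rest of the form:
upload_max_filesize = 32M
post_max_size = 40M ; must cover the whole POST body, not just the file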
I wonder if it's encoding-related. Base64 encoding means roughly 33% greater size: 24.3 × 4/3 ≈ 32.4 MB > 32 MB. Try a 23.9 MB file and see if that succeeds.
Set the error reporting level to E_ALL. It might give you some hint about what's going wrong.
post_max_size is a good idea; you should also check for timeouts. Since uploading larger files takes longer, the web server might decide it's all taking too long and cancel the request. Check the maximum execution time in php.ini, and also check whether there are other server-side time limits (I know of web servers where all tasks are killed after 30 seconds no matter what; an upload might easily take longer than that).
Have you considered using a Flash-based uploader? It gives you more control over the upload process, and you can display a progress bar during the upload (more user-friendly).
