While implementing a feature for uploading files of (potentially) unlimited size using chunked file uploads and Web Workers, I stumbled upon a rather strange problem:
Whenever I attempt to write to a file bigger than roughly 128-134 MB using fwrite(), an Internal Server Error (HTTP 500) is raised and script execution stops. The problem can be reduced to this (hopefully self-explanatory) test case:
$readHandle = fopen("smallFile", "r"); // ~ 2 MB
$writeHandle = fopen("bigFile", "a"); // ~ 134 MB
// First possible way of writing data to the file:
// If the file size of bigFile is at approx. 134 MB, this
// will result in an HTTP 500 Error.
while (!feof($readHandle)) {
    fwrite($writeHandle, fread($readHandle, 1024 * 1024));
}
// Second way of reproducing the problem:
// Here, the data is just NOT written to the destination
// file, but the script itself doesn't crash.
// stream_copy_to_stream($readHandle, $writeHandle);
fclose($readHandle);
fclose($writeHandle);
When using stream_copy_to_stream, the script doesn't crash, but the data is just not written to the destination file.
When I contacted the support team of my (shared) hosting provider, I was told that this limit had something to do with the PHP configuration directives post_max_size and upload_max_filesize. However, the configured values (96 MB for both) do not match the measured maximum file size (~134 MB) at which files are still writable, and the problem does not occur when I apply the same values to my local test server.
Also, I could not find any information about a potential correlation between PHP_MEMORY_LIMIT (the hosting plan I am using states 512 MB) and the maximum writable file size of 128-134 MB (of which 512 MB is a multiple).
Does anybody know whether:
1. the configuration values mentioned above really correspond to the problem at all?
2. there is any other way to keep appending data to such a file?
PS: This SO thread might be based on the same problem, but here the question(s) are different.
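For reference, a defensive variant of the append loop from the test case above (a sketch only, reusing the same file names): checking fwrite()'s return value makes a failed or short write visible instead of surfacing only as an opaque HTTP 500.

$readHandle  = fopen("smallFile", "rb");
$writeHandle = fopen("bigFile", "ab"); // binary-safe append

while (!feof($readHandle)) {
    $chunk = fread($readHandle, 1024 * 1024);
    if ($chunk === false || $chunk === '') {
        break; // nothing more to read
    }
    $written = fwrite($writeHandle, $chunk);
    if ($written === false || $written < strlen($chunk)) {
        // A short write usually points at a disk quota or filesystem limit
        // rather than at PHP itself.
        trigger_error("Short write: $written of " . strlen($chunk) . " bytes", E_USER_WARNING);
        break;
    }
}

fclose($readHandle);
fclose($writeHandle);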
Related
I'm getting a random crash while uploading a file to S3 using Laravel's file storage system. The crash is not reproducible in the local/dev environment, and in production it is also very random. All the files still get uploaded to S3. The issue occurs randomly for any file type (pdf, png, jpg). The file size is usually 1 MB to 3 MB.
Aws\Exception\CouldNotCreateChecksumException
A sha256 checksum could not be calculated for the provided upload body, because it was not seekable. To prevent this error you can either 1) include the ContentMD5 or ContentSHA256 parameters with your request, 2) use a seekable stream for the body, or 3) wrap the non-seekable stream in a GuzzleHttp\Psr7\CachingStream object. You should be careful though and remember that the CachingStream utilizes PHP temp streams. This means that the stream will be temporarily stored on the local disk.
Crashed in non-app: /vendor/aws/aws-sdk-php/src/Signature/SignatureV4.php in Aws\Signature\SignatureV4::getPayload
/app/Http/Controllers/ApiController.php in App\Http\Controllers\ApiController::__invoke at line 432
$filename = $request->file('file')->getClientOriginalName();
$user_file_id = $request->input('file_id');
$path = Storage::putFileAs(
    'fileo',
    $request->file('file'),
    $user_file_id
);
return $path;
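The exception message lists three possible fixes. A minimal sketch of the second one (use a seekable stream for the body), reusing the 'fileo' directory and $user_file_id from the controller above; whether this removes the random crash depends on the SDK and Flysystem versions in use:

use Illuminate\Support\Facades\Storage;

// Open the uploaded file from its temporary path; a plain local file handle
// is seekable, so the SDK can calculate the checksum on its own.
$stream = fopen($request->file('file')->getRealPath(), 'rb');

$path = 'fileo/' . $user_file_id;
Storage::put($path, $stream); // Storage::put() accepts a stream resource

if (is_resource($stream)) {
    fclose($stream); // Flysystem may already have closed it
}

return $path;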
I had the same error message, but my files were not being saved to S3, so my case may be different.
I followed the answer StackOverflow - update php.ini to increase upload limits, and this error stopped.
I had the same issue with Laravel and MinIO object storage. The problem came from my /etc/php.ini configuration; I had messed up some values. Just make sure that you did not change these, or if you did, make sure they are correct:
upload_max_filesize = 1024M
max_file_uploads = 25    ; for example
I am trying to write an upload script, using Uniform Server on Windows 7. My upload_max_filesize is 10M. I want to check whether the user sent a file of an allowed size or not, so I'm checking the error code with this code:
print_r($_FILES['userfile']['error']);
This code works when I try a file smaller than the limit and it prints 0 on screen. But if I try a file larger than the limit, it does not show an error code; it gives an undefined index error instead. How can I solve this and see the error code when I upload a file that exceeds the size limit?
Thanks.
There are several limiting factors for the file size (depending on the server):
The upload_max_filesize you mentioned.
An upper limit for HTTP POST data (or whichever HTTP method you use)
A (soft) upper limit in the client's browser
Timeouts limiting the file size due to the limited response time
Proxies
Check the other ones, and never rely on the uploaded file entry being there. Always check its existence first, as sketched below.
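A minimal sketch of that existence check, assuming the userfile field from the question; when the request body exceeds post_max_size, $_FILES can arrive empty, which is exactly the undefined index error described above:

// Guard against a missing $_FILES entry before reading the error code.
if (!isset($_FILES['userfile']['error'])) {
    // Nothing arrived at all - typically the request body exceeded post_max_size.
    die('Upload failed: the request was larger than the server accepts.');
}

switch ($_FILES['userfile']['error']) {
    case UPLOAD_ERR_OK:
        echo 'Upload OK';
        break;
    case UPLOAD_ERR_INI_SIZE:  // larger than upload_max_filesize
    case UPLOAD_ERR_FORM_SIZE: // larger than the form's MAX_FILE_SIZE
        echo 'The file exceeds the allowed size limit';
        break;
    default:
        echo 'Upload error code: ' . $_FILES['userfile']['error'];
}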
Open all of your php.ini-like files, search for post_max and upload_max, and change their values to 1000M.
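To confirm which values the web server actually picked up after editing (there is often more than one php.ini), a quick diagnostic sketch:

// Print the limits PHP is actually running with (and which php.ini was loaded).
echo 'Loaded php.ini: '      . php_ini_loaded_file()          . PHP_EOL;
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . PHP_EOL;
echo 'post_max_size: '       . ini_get('post_max_size')       . PHP_EOL;
echo 'memory_limit: '        . ini_get('memory_limit')        . PHP_EOL;
echo 'max_execution_time: '  . ini_get('max_execution_time')  . PHP_EOL;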
I need to read a large file to find some labels and create a dynamic form. I cannot use file() or file_get_contents() because of the file size.
If I read the file line by line with the following code
set_time_limit(0);
$handle = fopen($file, 'r');
set_time_limit(0);
if ($handle) {
    while (!feof($handle)) {
        $line = fgets($handle);
        if ($line) {
            // do something.
        }
    }
}
echo 'Read complete';
I get the following error in Chrome:
Error 101 (net::ERR_CONNECTION_RESET)
This error occurs after several minutes, so I don't think the max_input_time setting is the problem (it is set to 60).
What server software do you use? Apache, nginx? You should set the maximum accepted file upload to somewhere higher than 500 MB. Furthermore, the maximum upload size in php.ini should be bigger than 500 MB too, and I think PHP must also be allowed to use more than 500 MB of memory per process (check this in your PHP config).
Set the memory limit with ini_set("memory_limit", "600M"); you also need to set the timeout limit:
set_time_limit(0);
Generally, long-running processes should not be executed while the user waits for them to complete.
I'd recommend using a background job oriented tool that can handle this type of work and can be queried about the status of the job (running/finished/error).
My first guess is that something in the middle breaks the connection because of a timeout. Whether it's a timeout in the web server (which PHP cannot know about) or some firewall, it doesn't really matter; PHP gets a signal to close the connection and the script stops running. You could circumvent this behaviour by using ignore_user_abort(true); this, along with set_time_limit(0), should do the trick.
The caveat is that whatever caused the connection abort will still do so, though the script will now finish its job. One very annoying side effect is that this script could possibly be executed multiple times in parallel, with none of the runs ever appearing to complete from the user's point of view.
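As a sketch, placed at the top of the long-running script:

// Keep the script alive even if the connection is dropped mid-request.
ignore_user_abort(true); // do not stop when the client/proxy closes the connection
set_time_limit(0);       // no PHP execution time limit for this request

// ... long-running file processing goes here ...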
Again, I recommend using a background task to do the work, plus an interface for the end user (browser) to check the status of that task. You could also implement a basic version yourself via cron jobs and a database or text files that hold the status, roughly as sketched below.
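A minimal sketch of that do-it-yourself approach; the /var/jobs directory and the job file layout are hypothetical, and a real setup would also need locking and cleanup:

// process_jobs.php - hypothetical cron worker (e.g. run once per minute).
// The web request only drops a small JSON job file into /var/jobs and returns;
// the browser then polls a status endpoint that reads the same file.
foreach (glob('/var/jobs/*.json') as $jobFile) {
    $job = json_decode(file_get_contents($jobFile), true);
    if (($job['status'] ?? '') !== 'pending') {
        continue; // already running, finished, or errored
    }

    $job['status'] = 'running';
    file_put_contents($jobFile, json_encode($job));

    try {
        // ... do the long-running work described in the question here ...
        $job['status'] = 'finished';
    } catch (Throwable $e) {
        $job['status']  = 'error';
        $job['message'] = $e->getMessage();
    }

    file_put_contents($jobFile, json_encode($job));
}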
When I try to paste a large (5,000-line) SQL file into phpMyAdmin, I get the error below. I know I can use the upload, but on my old version of phpMyAdmin this used to work without a problem.
ALERT - configured request variable value length limit exceeded - dropped variable
'sql_query' (attacker '111.171.123.123', file '/usr/share/apache2/phpmyadmin/import.php'),
referer: https://example.co.uk/phpmyadmin/db_sql.php?db=test&server=1&
token=0f355f8bbc6fc09d5c512e0409e9cac9&db_query_force=1
I have already tried setting $cfg['ExecTimeLimit'] = 0;
php.ini
;;;;;;;;;;;;;;;;;;;
; Resource Limits ;
;;;;;;;;;;;;;;;;;;;
; Maximum execution time of each script, in seconds
max_execution_time = 120
; Maximum amount of time each script may spend parsing request data
max_input_time = 60
;max_input_nesting_level = 64 ; Maximum input variable nesting level
;Maximum amount of memory a script may consume (128MB)
memory_limit = 100M
As far as I can tell, this message means that Suhosin (a security patch for PHP) is blocking your request because of its length. The simplest way to solve your problem without changing Suhosin's config is to import a file with the same SQL statements into phpMyAdmin (it allows uploading files for import).
So basically all you need to do is create a simple text file, paste the same SQL statements into it, and upload this file to phpMyAdmin; it has a dedicated page for such imports.
If you really want to paste the query into phpMyAdmin, try using version 3.4.3.2 or higher, as I am not sure whether your version has this feature:
Partial import
Allow the interruption of an import in case the script detects it is close to the PHP timeout limit. (This might be good way to import large files, however it can break transactions.)
http://www.phpmyadmin.net/home_page/index.php
I hope it helps.
I've been trying to hash the contents of some zip files from a remote source using PHP's md5_file function:
md5_file($url);
I'm having a problem with a couple of URLs; I'm getting the following error:
Warning: md5_file($url): failed to open stream: HTTP request failed!
I think it's because the zip files are quite large in those cases.
But as yet I haven't been able to find much information or case studies for md5_file hashing remote files to confirm or refute my theory. It seems most people grab the files and hash them locally (which I can do if necessary).
So I suppose it's really out of curiosity: Does md5_file have any specific limits on how large remote files can be? Does it have a timeout that will stop it from downloading larger files?
Probably the simplest solution is to set the timeout yourself via:
ini_set('default_socket_timeout', 60); // 60 secs
Of course, if these files are big, another option is to use file_get_contents(), since you can specify a length limit. You don't need to assign the contents to a variable; it's more efficient to wrap the call like so:
$limit = 64 * 1024; // 64 KB - how much of the file to retrieve
md5(file_get_contents($url, false, null, 0, $limit));
Now you can create MD5s of parts of the file, and not worry if somebody tries to send you a 2 GB file. Of course, keep in mind it's only an MD5 of part of the file; if anything after that point changes, this breaks. If you don't want to set a filesize limit at all, just try it like so:
ini_set('default_socket_timeout', 60); // 60 secs
md5(file_get_contents($url));
Some hosting environments don't allow you to access remote files in this manner. I think the md5_file() function operates a lot like the file() function does. Make sure you can access the contents of remote files with that function first. If not, you may be able to cURL your way to the file and its contents.
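A minimal sketch of that cURL approach, hashing the response incrementally with a write callback so the whole file never has to sit in memory (the URL is a placeholder):

// Stream a remote file through cURL and compute its MD5 on the fly.
$url = 'https://example.com/archive.zip'; // placeholder URL

$ctx = hash_init('md5');
$ch  = curl_init($url);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_setopt($ch, CURLOPT_TIMEOUT, 120); // overall timeout in seconds
curl_setopt($ch, CURLOPT_WRITEFUNCTION, function ($ch, $data) use ($ctx) {
    hash_update($ctx, $data); // feed each downloaded chunk into the hash
    return strlen($data);     // tell cURL the chunk was consumed
});

if (curl_exec($ch) === false) {
    echo 'Download failed: ' . curl_error($ch);
} else {
    echo 'MD5: ' . hash_final($ctx);
}
curl_close($ch);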
You could try set_time_limit(0); if the file is relatively large and you are not certain how much time it will take.