Continuous file uploads? - php

Some files have to be uploaded to the system, mostly JPG files around 5-10MB.
However, users usually have a very slow upload speed, so the upload exceeds max_execution_time most of the time.
I don't have permission to modify max_execution_time.
Is there anything I can do in this case?

Try setting it using ini_set():
$max_execution_time = 1000; // or whatever value you need
ini_set('max_execution_time', $max_execution_time);
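Note that ini_set() can fail silently when the host disables it or locks the directive, so it is worth checking whether the change actually applied. A small sketch of that check:

if (ini_set('max_execution_time', $max_execution_time) === false) {
    // ini_set() returns false if the directive cannot be changed at runtime
    error_log('Could not raise max_execution_time; still ' . ini_get('max_execution_time'));
}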

Related

Writing to bigger files leads to Internal Server Error

Implementing a feature for uploading files with a (potentially) unlimited filesize using chunked-file uploading and WebWorkers, I stumbled upon a quite strange problem:
Whenever I attempt to write to a file bigger than 128 MB - 134 MB using fwrite(), an Internal Server Error is raised and, thus, script execution is stopped. The problem can be simplified to this (hopefully self-explanatory) test case:
$readHandle  = fopen("smallFile", "r"); // ~ 2 MB
$writeHandle = fopen("bigFile", "a");   // ~ 134 MB

// First possible way of writing data to the file:
// If the file size of bigFile is at approx. 134 MB, this
// will result in an HTTP 500 error.
while (!feof($readHandle)) {
    fwrite($writeHandle, fread($readHandle, 1024 * 1024));
}

// Second way of reproducing the problem:
// Here, the data is just NOT written to the destination
// file, but the script itself doesn't crash.
// stream_copy_to_stream($readHandle, $writeHandle);

fclose($readHandle);
fclose($writeHandle);
When using stream_copy_to_stream, the script doesn't crash, but the data is just not written to the destination file.
Having contacted the support team of my (shared) server host, I got the answer that this limit had something to do with the PHP configuration variables post_max_size and upload_max_filesize. However, neither do the set values (96MB for both) correspond to the measured maximum file size (134MB) at which files are writeable, nor does the problem exist when I apply the same values to my local test server.
Also, I could not find any information about a potential correlation between PHP_MEMORY_LIMIT (the hosting plan I am using states 512MB) and the maximum writeable file size of 128 - 134MB (of which 512MB is a multiple).
Does anybody know if
the said configuration values really correspond to the problem at all?
there is any other way of continuing to append data to such a file?
PS: This SO thread might be based on the same problem, but here the question(s) are different.
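Whichever limit turns out to be responsible, checking the return value of fwrite() at least makes the failure visible instead of only producing an HTTP 500. A hedged sketch of the same append loop with error checking (file names are taken from the test case above; the quota explanation in the comment is only a guess):

$readHandle  = fopen("smallFile", "r");
$writeHandle = fopen("bigFile", "a");

while (!feof($readHandle)) {
    $chunk   = fread($readHandle, 1024 * 1024);
    $written = fwrite($writeHandle, $chunk);
    if ($written === false || $written < strlen($chunk)) {
        // fwrite() returns false or a short count when the write fails,
        // e.g. because of a disk quota or a per-file size limit on the host
        error_log("Append failed after " . ftell($writeHandle) . " bytes");
        break;
    }
}

fclose($readHandle);
fclose($writeHandle);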

Undefined index error in Large file uploads

I am trying to write an upload script, using Uniform Server on Windows 7. My upload_max_filesize is 10M. I want to check whether the user tried to send a file within the size limit or not, so I'm checking the error code with this code:
print_r($_FILES['userfile']['error']);
This code works when I try a file below the limit and shows 0 on screen. But if I try a file above the limit, it does not show the error code; it gives an undefined index error instead. How can I solve this, and see the error code when the uploaded file exceeds the size limit?
Thanks.
There are several limiting factors for the file size (depending on the server):
The upload_max_filesize you mentioned.
An upper limit for HTTP POST data (or whichever HTTP method you use)
A (soft) upper limit in the client's browser
Timeouts limiting the file size due to the limited response time
Proxies
Check the other limits, and never rely on the file entry being present in $_FILES; always check its existence first, as in the sketch below.
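When the POST body exceeds post_max_size, PHP discards the whole request body, so $_FILES ends up empty and indexing it triggers the undefined index notice. A hedged sketch of the check (the field name 'userfile' is taken from the question):

if (isset($_FILES['userfile'])) {
    // The upload reached PHP, so the error code is available
    echo $_FILES['userfile']['error']; // 0 (UPLOAD_ERR_OK) means success
} elseif (empty($_FILES) && empty($_POST)
        && isset($_SERVER['CONTENT_LENGTH']) && (int) $_SERVER['CONTENT_LENGTH'] > 0) {
    // PHP discarded the whole request body, which typically means
    // post_max_size was exceeded before the per-file limit was even checked
    echo 'Upload too large (post_max_size exceeded)';
} else {
    echo 'No file was uploaded';
}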
Open all your php.ini-like files, search for post_max and upload_max, and change their values to 1000M.

Does md5_file have a memory limit/timeout for remote files?

I've been trying to hash the contents of some zip files from a remote source using PHP's md5_file function:
md5_file($url);
I'm having a problem with a couple of URLs; I'm getting the following error:
Warning: md5_file($url): failed to open stream: HTTP request failed!
I think it's because the zip files are quite large in those cases.
But as yet I haven't been able to find much information or case studies for md5_file hashing remote files to confirm or refute my theory. It seems most people grab the files and hash them locally (which I can do if necessary).
So I suppose it's really out of curiosity: Does md5_file have any specific limits to how large remote files can be? Does it have a timeout which will stop it from downloading larger files?
Probably the simplest solution is to set the timeout yourself via:
ini_set('default_socket_timeout', 60); // 60 secs
Of course, if these files are big, another option is to use file_get_contents(), as you can specify a filesize limit. You don't want to assign the contents to a variable first, as it's more efficient to wrap it like so:
$limit = 64 * 1024; // limit retrieval to 64 KB
md5(file_get_contents($url, false, null, 0, $limit));
Now you can create MD5s of parts of the file, and not worry if somebody tries to send you a 2GB file. Of course, keep in mind it's only an MD5 of part of the file; if anything after that point changes, this won't catch it. You don't have to set a filesize limit at all; just try it like so:
ini_set('default_socket_timeout', 60); // 60 secs
md5(file_get_contents($url));
Some hosting environments don't allow you to access remote files in this manner. I think the md5_file() function would operate a lot like the file() function does. Make sure you can access the contents of remote files with that command first. If not, you may be able to cURL your way to the file and its contents.
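A minimal sketch of that cURL approach, assuming it is allow_url_fopen that is blocked rather than outbound HTTP as a whole ($url stands for the remote zip from the question):

$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects to the real file
curl_setopt($ch, CURLOPT_TIMEOUT, 60);          // give large downloads a generous timeout
$data = curl_exec($ch);
curl_close($ch);

if ($data !== false) {
    echo md5($data);
} else {
    echo 'Download failed';
}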
You could try set_time_limit(0); if the file is relatively large and you are not sure how much time it will take.

Image size reduction in PHP

In cases such as this error:
Fatal error: Out of memory (allocated 31981568) (tried to allocate 3264 bytes)
Can I use the GD lib to reduce its file size first before getting to this stage?
In short: no.
GD uses memory to reduce the size of an image; if the image is too big, the memory limit is exceeded and an error is raised.
You can calculate how big an image can be so you can stay under a certain memory limit here: http://www.dotsamazing.com/en/labs/phpmemorylimit
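As a rough, hedged sketch of that calculation: GD decompresses the image into memory, so the requirement scales with the pixel dimensions rather than the file size. A commonly used approximation ($uploadedFile is a placeholder path, and the 1.7 fudge factor is an estimate, not an exact figure):

list($width, $height) = getimagesize($uploadedFile); // $uploadedFile: path to the image (placeholder)
$channels = 4;   // RGBA, worst case
$fudge    = 1.7; // rough allowance for GD overhead
$neededBytes = $width * $height * $channels * $fudge;

$limitBytes = 31981568; // the allocated limit from the error message above
if ($neededBytes > $limitBytes) {
    // Resizing this image with GD would likely exhaust memory
    echo 'Image too large to process within the current memory limit';
}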
An option, although an unpopular one with shared hosts, is increasing the memory limit; you can do this with ini_set() or in an .htaccess file. Please check whether your host allows this. If you have your own host, configure Apache accordingly.
Another option, also mentioned here, is using ImageMagick, a program that runs on the server that you can call to do the resizing for you. The memory limit for this program can be different from the one for PHP, but there will probably be a limit as well. Contact your host for more info.
You can instead set a higher memory limit.
ini_set('memory_limit', $file_size * 2);
That is because if you want to reduce the size using GD, you still need to allocate enough memory for the full image before you can shrink it.
Also remember to set a file size limit to your image uploads.
You can use the filesize() function to check the file size before reading/opening the file.
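For example, a hedged sketch of such a guard (the 5 MB cap and the $path variable are placeholders):

$maxBytes = 5 * 1024 * 1024; // example cap: 5 MB
if (filesize($path) > $maxBytes) {
    exit('File too large to process');
}
$image = imagecreatefromjpeg($path); // only load the image once the size check passes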
Are you reading a file from disk with a PHP script?
This is obviously continuing from your previous post. If I remember rightly, it's an uploaded image you're working with? If so, what is the size of the image? If it's really large, you should consider limiting the size of image uploads.
That would happen when you ask the memory for more than it can give. You might want to look into ImageMagick; instead of resizing via PHP, just send a request to ImageMagick to do the resizing.
Or the easier way would be to increase PHP's per-script memory limit via the ini settings.

24MB PHP file upload fails silently

I'm writing an app that accepts .mp4 uploads.
I have a 24.3MB .mp4 that is posted to the server, but the upload fails silently.
The next smallest file I have is a 5.2MB .flv. It's not the file type, of course, but the file size.
I wonder if anybody could shed some light on this?
P.S. the relevant php.ini entries are as follows:
memory_limit = 256M
upload_max_filesize = 32M
Help!
You should also set post_max_size. Files are sent using HTTP POST.
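As a hedged example, assuming the 32M upload limit from the question should actually be usable, the relevant php.ini entries might look like this (post_max_size needs to be at least as large as upload_max_filesize, plus some headroom for the rest of the POST body; the 40M figure is just an illustration):

upload_max_filesize = 32M
post_max_size = 40M      ; must be >= upload_max_filesize
memory_limit = 256M      ; commonly recommended to be larger than post_max_size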
I wonder if it's encoding-related. Base64 encoding means roughly 33% greater size: 24.3 * 1.33 ≈ 32.4 MB > 32 MB. Try a 23.9 MB file and see if that succeeds.
Set error reporting level to E_ALL. Might give you some hint about what's going wrong.
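A minimal sketch of turning that on at the top of the upload script (display_errors is for debugging only; you would normally keep it off in production):

error_reporting(E_ALL);            // report all warnings and notices
ini_set('display_errors', '1');    // show them on screen while debugging
ini_set('log_errors', '1');        // and keep them in the error log as well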
post_max_size is a good idea; also, you should check for timeouts. Since uploading larger files takes longer, the webserver might decide it's all taking too long and cancel the request. Check the maximum execution time in php.ini, and also check whether there are other server-side time limits (I know of webservers where all tasks are killed after 30 seconds no matter what; an upload might easily take longer than that).
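To see which of PHP's own time limits currently apply (this does not cover webserver or proxy timeouts), a quick hedged check:

echo 'max_execution_time: ' . ini_get('max_execution_time') . "\n"; // how long the script may run
echo 'max_input_time: ' . ini_get('max_input_time') . "\n";         // how long PHP may spend reading the request, including the upload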
Have you considered using a Flash-based uploader? This gives you more control over the upload process, and you can display a progress bar during upload (more user-friendly).
