I am trying to upload a file using PHP. I can upload .zip files up to 3 MB, but can't upload files larger than 3 MB; the HTML form takes a long time to submit. I have checked the upload and memory limits using the following code.
// Note: ini_get() returns the raw shorthand string (e.g. "10M");
// the (int) cast simply drops the K/M/G suffix.
$max_upload = (int)(ini_get('upload_max_filesize'));
$max_post = (int)(ini_get('post_max_size'));
$memory_limit = (int)(ini_get('memory_limit'));
$upload_mb = min($max_upload, $max_post, $memory_limit);
And it gives the output as:
max_upload=10
memory_limit=64
upload_mb=10
Please help me find a solution.
It could also be the webserver; see LimitRequestBody for Apache or client_max_body_size for nginx.
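For reference, the relevant directives look roughly like this (the 20 MB values here are only examples):
# Apache (LimitRequestBody takes a value in bytes)
LimitRequestBody 20971520
# nginx
client_max_body_size 20M;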
Another reason could be a proxy (a transparent proxy?). You can test that by asking someone else to try uploading the file.
Have you checked the timeout for your scripts? By default it is 30 seconds... maybe that's the limit.
Since it takes a lot of time, maybe you are exceeding that 30-second timeout. You can alter it by adding something like:
set_time_limit(60);
from http://php.net/manual/en/function.set-time-limit.php
Also, run this code:
<?php
phpinfo();
?>
Run that file to get your system settings (search for upload_max_filesize, etc.).
I gave the same answer to a previous PHP large file upload question, but the answer still applies:
For large files, if you don't want to have to deal with configuring server settings (particularly if you are on shared hosting or some other hosting that doesn't give you full control over the server), one potential solution is to hand the upload off to a third party service.
For example, you could have the form do a direct POST to Amazon S3 (http://s3.amazonaws.com/doc/s3-example-code/post/post_sample.html) or use a service like Filepicker.io.
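As a rough sketch of how that browser-based POST works (the bucket name and key prefix below are hypothetical, and this uses the legacy signature scheme from that S3 example): the server base64-encodes a policy document, signs it with the AWS secret key, and the form posts those values to the bucket alongside the file.
<?php
// Sketch only; $aws_secret_key is assumed to hold your AWS secret key,
// and the bucket/key prefix are placeholders.
$bucket = 'my-bucket';
$policy = base64_encode(json_encode(array(
    'expiration' => gmdate('Y-m-d\TH:i:s\Z', time() + 3600),
    'conditions' => array(
        array('bucket' => $bucket),
        array('starts-with', '$key', 'uploads/'),
        array('acl' => 'private'),
    ),
)));
// Legacy (v2) signing: base64 of a raw HMAC-SHA1 over the encoded policy.
$signature = base64_encode(hash_hmac('sha1', $policy, $aws_secret_key, true));
// Embed $policy and $signature (plus AWSAccessKeyId, key, acl and the file
// field) as hidden inputs in a form that posts to the bucket's S3 endpoint.
?>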
Full disclosure: I work at Filepicker.io, but want to help out folks who are dealing with issues doing large file uploads
Related
I have a PHP script for uploading multiple files.
I noticed that when the upload takes more than (about) two minutes I get the following error:
500 - Internal server error. There is a problem with the resource you
are looking for, and it cannot be displayed.
Some info:
PHP Version: 5.4.23
System: Windows NT SDADMIN32263436 6.1 build 7601 (Windows Server 2008
R2 Standard Edition Service Pack 1) i586
Any tips?
Thank you
By default PHP only allows uploads of a couple of megabytes. You could try changing the following directives in the php.ini file:
memory_limit = 32M
upload_max_filesize = 24M
post_max_size = 32M
Obviously, use values that are appropriate for you; as a general rule, post_max_size should be at least as large as upload_max_filesize, and memory_limit larger still.
It might, however, not be linked to the upload size at all. As PHP is server-side, the 500 error is incredibly generic. Try looking at your PHP log files (on IIS you can do this through Server 2008).
It might also help to turn on some error reporting in your application. For development, one way to do this is to put the following at the top of your PHP script:
ini_set('display_startup_errors',1);
ini_set('display_errors',1);
error_reporting(-1);
This will make PHP show any errors it encounters in the browser. It is NOT a good idea to do this in production, though, as it can give out sensitive information about your server and hosting.
I refer to this question. That user seems to have the same problem as yours, and in this answer he was advised to make some changes in the configuration file:
"max_execution_time" integer
This sets the maximum time in seconds a script is allowed to run before it is terminated by the parser. This helps prevent poorly written scripts from tying up the server. The default setting is 30. When running PHP from the command line the default setting is 0.
The maximum execution time is not affected by system calls, stream operations etc. Please see the "set_time_limit()" function for more details.
[...]
"max_input_time" integer
This sets the maximum time in seconds a script is allowed to parse input data, like POST, GET and file uploads.
[...]
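For instance, a starting point in php.ini for long-running uploads might be (these values are only examples):
max_execution_time = 300
max_input_time = 300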
Additionally, here's some info on checking/setting the CGI timeout in IIS 5 and 6.
I also suggest you check the PHP error logs in order to retrieve more information about the upload execution.
Finally, in this question and this question they also talk about the IIS configuration needed to allow PHP to handle bigger uploads.
I'm still trying to import a large text file into phpPgAdmin and not succeeding. I have changed the following fields in the php.ini file, but it didn't help.
I changed the following fields to:
upload_max_filesize = 3G
post_max_size = 4G
memory_limit = 5G
I believe the timeout default is 30 seconds (which I haven't changed), but I get error messages right away saying "No server supplied!" or "Import error: File could not be uploaded to the server".
And these are very small files compared to the 2GB text file I am trying to import. The largest file I have been able to import is 1.6 MB.
Any ideas?
This error was happening to me, and the solution is pretty simple (something that could easily be overlooked). This is an older post, but since it came up first in a Google search I hope I can save someone some time: restart Apache and the changes should take effect. Cheers!
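Depending on your setup, that's something like:
sudo service apache2 restart    # Debian/Ubuntu
apachectl restart               # generic Apache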
I am having trouble uploading files to S3 from on one of our servers. We use S3 to store our backups and all of our servers are running Ubuntu 8.04 with PHP 5.2.4 and libcurl 7.18.0. Whenever I try to upload a file Amazon returns a RequestTimeout error. I know there is a bug in our current version of libcurl preventing uploads of over 200MB. For that reason we split our backups into smaller files.
We have servers hosted on Amazon's EC2 and servers hosted on customers' "private clouds" (a VMware ESX box behind the company firewall). The specific server that I am having trouble with is hosted on a customer's private cloud.
We use the Amazon S3 PHP Class from http://undesigned.org.za/2007/10/22/amazon-s3-php-class. I have tried 200MB, 100MB and 50MB files, all with the same results. We use the following to upload the files:
$s3 = new S3($access_key, $secret_key, false);
$success = $s3->putObjectFile($local_path, $bucket_name,
                              $remote_name, S3::ACL_PRIVATE);
I have tried setting curl_setopt($curl, CURLOPT_NOPROGRESS, false); to view the progress bar while it uploads the file. The first time I ran it with this option set it worked. However, every subsequent time it has failed. It seems to upload the file at around 3Mb/s for 5-10 seconds then drops to 0. After 20 seconds sitting at 0, Amazon returns the "RequestTimeout - Your socket connection to the server was not read from or written to within the timeout period. Idle connections will be closed." error.
I have tried updating the S3 class to the latest version from GitHub but it made no difference. I also found the Amazon S3 Stream Wrapper class and gave that a try using the following code:
include 'gs3.php';
define('S3_KEY', 'ACCESSKEYGOESHERE');
define('S3_PRIVATE', 'SECRETKEYGOESHERE');

// open the local backup part for reading and the S3 object for writing
$local = fopen('/path/to/backup_id.tar.gz.0000', 'r');
$remote = fopen('s3://bucket-name/customer/backup_id.tar.gz.0000', 'w+r');

$count = 0;
while (!feof($local))
{
    // stream one megabyte at a time to S3
    $result = fwrite($remote, fread($local, (1024 * 1024)));
    if ($result === false)
    {
        fwrite(STDOUT, $count++.': Unable to write!'."\n");
    }
    else
    {
        fwrite(STDOUT, $count++.': Wrote '.$result.' bytes'."\n");
    }
}

fclose($local);
fclose($remote);
This code reads the file one MB at a time in order to stream it to S3. For a 50MB file, I get "1: Wrote 1048576 bytes" 49 times (the first number changes each time of course) but on the last iteration of the loop I get an error that says "Notice: fputs(): send of 8192 bytes failed with errno=11 Resource temporarily unavailable in /path/to/http.php on line 230".
My first thought was that this is a networking issue. We called up the customer and explained the issue and asked them to take a look at their firewall to see if they were dropping anything. According to their network administrator the traffic is flowing just fine.
I am at a loss as to what I can do next. I have been running the backups manually and using SCP to transfer them to another machine and upload them. This is obviously not ideal and any help would be greatly appreciated.
Update - 06/23/2011
I have tried many of the options below, but they all gave the same result. I have found that even trying to scp a file from the server in question to another server stalls immediately and eventually times out. However, I can use scp to download that same file from another machine. This makes me even more convinced that this is a networking issue on the client's end; any further suggestions would be greatly appreciated.
This problem exists because you are trying to upload the same file again. Example:
$s3 = new S3('XXX','YYYY', false);
$s3->putObjectFile('file.jpg','bucket-name','file.jpg');
$s3->putObjectFile('file.jpg','bucket-name','newname-file.jpg');
To fix it, just copy the file and give it a new name, then upload it normally.
Example:
$s3 = new S3('XXX','YYYY', false);
$s3->putObjectFile('file.jpg','bucket-name','file.jpg');
// now rename file.jpg to newname-file.jpg
$s3->putObjectFile('newname-file.jpg','bucket-name','newname-file.jpg');
I solved this problem in another way. My bug was that the filesize() function returned a stale cached size value, so the fix is simply to call clearstatcache() first.
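A minimal sketch of that fix, reusing the variable names from the question:
clearstatcache(); // drop PHP's cached stat info so filesize() re-reads from disk
$s3->putObjectFile($local_path, $bucket_name, $remote_name, S3::ACL_PRIVATE);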
I have experienced this exact same issue several times.
I have many scripts right now which are uploading files to S3 constantly.
The best solution that I can offer is to use the Zend libraries (either the stream wrapper or direct S3 API).
http://framework.zend.com/manual/en/zend.service.amazon.s3.html
Since the latest release of Zend framework, I haven't seen any issues with timeouts. But, if you find that you are still having problems, a simple tweak will do the trick.
Simply open the file Zend/Http/Client.php and modify the 'timeout' value in the $config array. At the time of writing it was on line 114. Before the latest release I was running at 120 seconds, but now things are running smoothly with a 10-second timeout.
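If you'd rather not patch the library file, the same timeout can usually be passed through the client configuration; a sketch (the exact wiring may differ by framework version):
$client = new Zend_Http_Client(null, array('timeout' => 120)); // seconds
Zend_Service_Amazon_S3::setHttpClient($client);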
Hope this helps!
There are quite a few solutions available. I had this exact problem, but I didn't want to write code and debug the problem.
Initially I was searching for a way to mount an S3 bucket on the Linux machine, and found something interesting:
s3fs - http://code.google.com/p/s3fs/wiki/InstallationNotes
- this did work for me. It uses the FUSE filesystem + rsync to sync the files to S3. It keeps a copy of all filenames in the local system and makes them look like files/folders.
This saves a bunch of our time + no headache of writing code for transferring the files.
Then, while looking for other options, I found a Ruby script which works from the CLI and can help you manage your S3 account.
s3cmd - http://s3tools.org/s3cmd - this looks pretty clear.
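Typical s3cmd usage looks something like this (the bucket name is a placeholder):
s3cmd --configure
s3cmd put backup_id.tar.gz.0000 s3://bucket-name/customer/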
[UPDATE]
Found one more CLI tool - s3sync
s3sync - https://forums.aws.amazon.com/thread.jspa?threadID=11975&start=0&tstart=0 - found in the Amazon AWS community.
I don't see much difference between the two; if you are not worried about disk space then I would choose s3fs over s3cmd. A disk makes you feel more comfortable + you can see the files on the disk.
Hope it helps.
You should take a look at the AWS SDK for PHP. This is the AWS PHP library formerly known as Tarzan and CloudFusion.
http://aws.amazon.com/sdkforphp/
The S3 class included with it is rock solid. We use it to upload multi-GB files all the time.
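A sketch of the kind of call we mean (the bucket and paths are placeholders, and method names may differ between SDK versions):
require_once 'sdk.class.php';
$s3 = new AmazonS3();
$response = $s3->create_object('bucket-name', 'customer/backup.tar.gz', array(
    'fileUpload' => '/path/to/backup.tar.gz', // streamed from disk, not loaded into memory
));
var_dump($response->isOK());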
I have a backup script which backups up all files for a website to a zip file (using a script similar to the answer to this question). However, for large sites the script times out before it can complete.
Is there any way I can extend the length of time available for the script to run? The websites run on shared Windows servers, so I don't have access to the php.ini file.
If you are in a shared server environment, and you don’t have access to the php.ini file, or you want to set php parameters on a per-site basis, you can use the .htaccess file (when running on an Apache webserver).
For instance, in order to change the max_execution_time value, all you need to do is edit .htaccess (located in the root of your website, usually accessible by FTP), and add this line:
php_value max_execution_time 300
where 300 is the number of seconds you wish to set the maximum execution time for a php script.
There is also another way, using the ini_set function in the PHP file. E.g. to set the execution time to 5 minutes, you can use:
ini_set('max_execution_time', 300); // 300 seconds = 5 minutes
Please let me know if you need any more clarification.
set_time_limit() comes to mind, but it may still be limited by php.ini settings:
set_time_limit(0);
http://php.net/manual/en/function.set-time-limit.php
Simply put: don't make an HTTP request to start the PHP script. The boundaries you're experiencing are set because you're using an HTTP request, which means you can have a time-out. A better solution would be to implement this using a "cronjob", or what Microsoft calls "Scheduled Tasks". Most hosting providers will allow you to run such a task at set times. By calling the script from the command line, you don't have to worry about the time-outs any more, but you're still at risk of running into memory issues.
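For example, a crontab entry that runs the backup script nightly at 3 AM might look like this (the paths are placeholders):
0 3 * * * /usr/bin/php /path/to/backup.php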
If you have a decent hosting provider though, why doesn't it provide daily backups to start with? :)
You can use the following in the start of your script:
<?php
if (!ini_get('safe_mode')) {
    set_time_limit(0); // 0 removes the time limit entirely
}
?>
And at the end of the script, use the flush() function to tell PHP to send out what it has generated.
Hope this solves your problem.
Is the script giving the "Maximum execution time of xx seconds exceeded" error message, or is it displaying a blank page? If it's the latter, ignore_user_abort might be what you're looking for. It tells PHP not to stop the script execution if the communication with the browser is lost, which may protect you from other timeout mechanisms involved in the communication.
Basically, I would do this at the beginning of your script:
set_time_limit(0);
ignore_user_abort(true);
This said, as advised by Berry Langerak, you shouldn't be using an HTTP call to run your backup. A cronjob is what you should be using. Along with a set_time_limit(0), it can run forever.
In shared hosting environments where a change to the max_execution_time directive might be disallowed, and where you probably don't have access to any kind of command line, I'm afraid there is no simple (and clean) solution to your problem; the simplest option is very often to use the backup solution provided by the host, if any.
Try the function:
set_time_limit(300);
On Windows, there is a slight possibility that your webhost allows you to override settings by uploading a php.ini file to the root directory of your webserver. If so, upload a php.ini file containing:
max_execution_time = 300
To check if the settings work, do a phpinfo() and check the Local Value for max_execution_time.
Option 1: Ask the hosting company to place the backups somewhere accessible by PHP, so the PHP file can redirect the backup.
Option 2: Split the backup script into multiple parts; perhaps use some AJAX to call the script a few times in a row, give the user a nice progress bar, and combine the results of the script calls into a zip with PHP, offering that as a download.
Recently I ran into a problem with large file downloads in PHP. PHP is running as CGI on a Zeus server. I tried everything, but all in vain, for example:
set_time_limit(0);
ini_set('max_execution_time',0);
The problem is that after downloading about 4-5 MB, the download stops without any warning. However, when I run the code locally everything works like a charm. Help me get out of this problem.
Look in your php.ini file on the Zeus server and on your local box. Check the
upload_max_filesize = ??
or the
post_max_size = ??
values on both servers and see if they are different.
This might be a memory limitation of the CGI process or some other limitation in the response delivery chain. Two things to try (see the sketch below):
- don't load the whole file into memory, e.g. echo file_get_contents(<file>)
- disable output compression for this request (PHP and webserver)
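A minimal sketch of streaming the download in chunks instead of buffering the whole file (the path and filename are placeholders):
<?php
$path = '/path/to/large-file.zip'; // placeholder
header('Content-Type: application/octet-stream');
header('Content-Length: ' . filesize($path));
$fh = fopen($path, 'rb');
while (!feof($fh)) {
    echo fread($fh, 8192); // send 8 KB at a time
    flush();               // push each chunk to the client immediately
}
fclose($fh);
?>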
I suggest you also read this page.
Could you paste the code that sends the file?
I tend to use:
post_max_size = ?
Best of luck!