I am trying to upload a 1GB file with a PHP script. It works perfectly when the file is smaller than 20MB, but with larger files, after I press the upload button the file appears to upload (it takes a few minutes), and then, instead of executing upload.php, Firefox asks me to download upload.php. So I suspect the file is being uploaded but my PHP script fails to execute.
Also, after searching on Google, I found the following php.ini settings, which I applied; phpinfo() confirms the changes have taken effect:
/*php.ini start*/
memory_limit = 512M
post_max_size = 15000M
file_uploads = On
upload_max_filesize = 15000M
max_input_time = 20000
max_execution_time = 20000
session.gc_maxlifetime = 20000
/*php.ini end*/
The maximum upload size is constrained by a lot more than just PHP. PHP has its limits, Apache has its limits, many web hosts impose their own, and the /tmp directory may have a size quota set by user permissions on a shared host. Not to mention simply running out of disk space!
Your php.ini looks good, but as already suggested, check LimitRequestBody in Apache and make sure your web host allows this.
One common solution when you need to upload very large files is to use Flash or Java on the client side, since these have access to the client's actual file system. The Flash/Java can then break the file into small pieces which are sent one at a time and reassembled on the server side. This has multiple benefits. First, you can resume broken uploads. Second, you don't have to worry about any scripts timing out. Third, it bypasses any file size limits that PHP, Apache, the host, or other sources may impose.
post_max_size should be higher than upload_max_filesize, but you have both set to 15000M. Setting post_max_size = 15001M (just one more) is enough, since the POST body also has to hold the other form fields alongside the file.
Check that your web server also allows such large uploads. On Apache the setting is LimitRequestBody. This limit is applied long before PHP ever enters the picture and cannot be changed or overridden from within PHP.
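For reference, a minimal sketch of what raising that Apache limit might look like; the directive name is real, but the value and placement (httpd.conf, a Directory block, or a vhost) depend on your setup:

```apacheconf
# LimitRequestBody caps the HTTP request body in bytes.
# 0 means unlimited; e.g. 1073741824 allows bodies up to 1 GiB.
LimitRequestBody 1073741824
```

Remember to reload Apache after changing it.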
I'm running a website with a gallery. I want to add a feature that allows users of the website to upload a large number of images at once (up to ~300). Since only a few users need to upload images, I'm currently helping them out by uploading the images via FTP.
I've already implemented the feature in PHP and it works on localhost, but because it's local, the images "upload" very fast, and I have no realistic way to test what happens when you upload them to a real web server (slow internet connection, timeouts?).
Some settings in the php.ini are capped by my provider (max_execution_time, max_input_time, max_input_vars, memory_limit)
Is there a good way to handle this?
Webserver: Apache
PHP version: 5.5+
Other than those, you may also want to look at the post_max_size and upload_max_filesize settings to increase the file upload limit.
PHP has several configuration options to limit resources consumed by scripts. By default, PHP is set to allow uploads of files with a size of 2MB or less.
Try increasing the following values in php.ini, for example:
memory_limit = 32M
upload_max_filesize = 24M
post_max_size = 32M
I am trying to upload a file to a WordPress server. The upload breaks after about 8MB. I know of three ways to increase the upload size limit:
php.ini (changing the settings in the php.ini file)
.htaccess (changing the settings there as well, but still no use)
changing the settings in the wp-admin file.
None of these worked.
Is there any other way to increase the upload size limit?
You have to configure these two things in php.ini: upload_max_filesize and post_max_size. Then restart your web server.
Apart from the maximum file size setting, look at max_execution_time and post_max_size in php.ini and adjust them if necessary. Then restart Apache.
It depends on a few things: which site you are uploading the file to, your connection bandwidth, and how large an upload your WordPress hosting provider supports.
Your script may have a limited amount of time in which to execute.
Try setting the time limit inside the PHP script with set_time_limit() (as a test of the time limit, not as an upload fix).
Often, when the limit is on file size, you get a warning before the upload even starts. In this case the server is letting you upload, so in most cases it's not a size problem.
Also take into account that some providers impose execution-time limits at the web server level, so check that too. If I were you, I'd run a script that does nothing but wait for a few minutes, and see whether the request times out after the same interval as when uploading a file.
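As a sketch of that test, something like the following (a hypothetical probe, not part of any upload code) can show whether requests die at a fixed wall-clock limit regardless of any upload:

```php
<?php
// Hypothetical timeout probe: does nothing but wait, so if this request
// dies at the same point as an upload does, the culprit is an
// execution-time limit, not the file size.
function longRunningProbe(int $seconds, int $step = 5): int
{
    set_time_limit(0); // may be ignored on some shared hosts
    $elapsed = 0;
    while ($elapsed < $seconds) {
        sleep($step);
        $elapsed += $step;
        // Emit progress so you can see when the connection is cut.
        echo "still alive after {$elapsed}s\n";
        flush();
    }
    return $elapsed;
}
```

Call it with a few minutes' worth of seconds and watch when, or whether, the response stops.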
I have a question: I have a file uploader, and each user is allowed to upload at most 30M.
Now I'd like to know from your experiences what I should set up in php.ini.
Here are the options I've already changed, but I'm not sure if that is the best:
upload_max_filesize = 30M
post_max_size = 30M
max_execution_time = 300
max_input_time = 300
memory_limit = 32M
Here I think memory_limit is a bit low. And is there something more I have to include?
When I tried to upload more than 30M, Firefox crashed.
Thanks for your answers.
If you work heavily with files, you should split the file and upload it in separate parts; it doesn't choke the server as much, you can track progress, and you don't need nearly as much memory or execution time.
For some reason someone said this isn't true. Well, I don't think YouTube or upload sites like Megaupload (RIP) use 2GB of memory and a two-hour maximum execution time to handle one user's upload; they split the file in whatever way they can. If you need to upload 30MB files and you use standard PHP, your web server will be tied up until the upload finishes. Unless you don't care about that, it's not recommended.
The easiest solution is to upload in parts: split the file client-side using Flash/Silverlight/JS/HTML5 or whatever you prefer, then join the parts server-side.
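As a minimal server-side sketch of the joining step (the chunk naming scheme `name.part0`, `name.part1`, … is an assumption for illustration, not a standard), reassembly can be as simple as concatenating the parts in order:

```php
<?php
// Joins numbered chunk files (e.g. video.part0, video.part1, ...)
// back into a single file. Returns the total bytes written.
// The naming convention is hypothetical; match it to your client code.
function reassembleChunks(string $dir, string $base, int $count, string $dest): int
{
    $out = fopen($dest, 'wb');
    $written = 0;
    for ($i = 0; $i < $count; $i++) {
        // Append each chunk in order; chunks are assumed complete.
        $written += fwrite($out, file_get_contents("$dir/$base.part$i"));
    }
    fclose($out);
    return $written;
}
```

In a real implementation you would also verify that every chunk arrived (and ideally checksum them) before joining.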
I am trying to upload large files through my CMS and was wondering how to change the php.ini file for Heart Internet.
Is this possible in shared hosting, if not are there any other work arounds?
Thanks in Advance
To override settings all you need to do is create either a php.ini or php5.ini file (if you are running PHP5) in your root directory. Then you can change settings like this:
upload_max_filesize = 20M
post_max_size = 20M
max_execution_time = 60
This gives you a maximum file size of 20MB and a 60-second timeout.
As long as you keep this size within the limits allowed on your account, you can use it to raise the default size, which is 5MB.
It depends on whether your web hosting company allows you to override certain PHP settings or not. It might be possible to change some values but not others.
Secondly, the process for overriding settings differs depending on whether your hosting runs IIS or Apache. If it's Apache, try adding these two lines to your .htaccess file:
php_value upload_max_filesize 8M
php_value post_max_size 8M
This .htaccess file should go in the directory where your PHP upload script resides, or higher up. I'd rather put it in the root directory.
Once done, create a php page containing this code:
<?php phpinfo(); ?>
Compare the Local Value and Master Value of these settings to see if the changes are in effect.
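Alternatively, a small script using ini_get() prints just the effective values, which correspond to the "Local Value" column in phpinfo():

```php
<?php
// ini_get() returns the value the current request actually runs with,
// so this shows whether your .htaccess overrides took effect.
foreach (['upload_max_filesize', 'post_max_size', 'max_execution_time'] as $key) {
    echo $key, ' = ', ini_get($key), "\n";
}
```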
There is no way to upload files over 50MB without breaking your terms and getting your account shutdown.
However if the terms were different, you could split the file into parts and join them together on the server side.
I moved away from Heart Internet and got my own server for this exact reason; they wouldn't even let me pay a premium to get the restriction removed (it's restricted on their end, I think).
If it is shared hosting, you probably won't be able to. I have also discovered that you cannot use ini_set() for these settings, because the file upload happens before your script is executed. So if you want to accept large files via a form posted to a PHP script, you have to set them in php.ini.
There might be a workaround, though: use an open FTP account, have users upload large files there, and write a bit of script that asks the user which file they uploaded; then you can move or rename it to your heart's content.
I have an image upload for a slideshow, and the users are continuously uploading files that are 2MB plus. Files under this size work fine, but files over the size cause what looks like a browser timeout.
Here are my php ini settings:
Max memory allocation: 12M
Max file upload size: 10M
Max HTTP Post size: 10M
Max execution time: 60
Max input parsing time: 120
These settings are in the configuration file itself, and I can change them directly. Changes show up when using phpinfo().
I am running on an Apache server with PHP 4.3.9 (client's choice, not mine). The Apache server's request limit is set to the default, which I believe is somewhere around 2GB?
When I use the firebug network monitor, it does look like I am not receiving a full response from the server, though I am not too experienced at using this tool. Things seem to be timing out at around 43 seconds.
All the help I can find on the net points to the above settings as the culprits, but all of those settings are much higher than this 2MB file and the 43 second time out.
Any suggestions at where I can go from here to solve this issue?
Here are relevant php ini settings from phpinfo(). Let me know if I need to post any more.
Directive               Local Value   Master Value
file_uploads            On            On
max_execution_time      60            60
max_input_nesting_level 64            64
max_input_time          120           120
memory_limit            12M           12M
post_max_size           10M           10M
safe_mode               Off           Off
upload_max_filesize     10M           10M
upload_tmp_dir          no value      no value
Make sure you have error reporting activated in php.ini (display_errors = On); this might give you a clue about what's going on. Production servers usually (and should) have error display disabled.
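For example (development settings only; keep errors hidden in production):

```ini
; Show everything while debugging the upload
display_errors = On
error_reporting = E_ALL
```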
I recently had a similar problem, and increasing the memory_limit setting worked for me. If you read file contents into variables, each variable takes roughly as much memory as the file size, increasing the script's memory requirements.
Where are those settings? If you're using .htaccess then your Apache configuration might not be allowing you to override those settings.
I'd suggest checking with a phpinfo() call if those settings are indeed being applied or not:
<?php
phpinfo();
?>
If it's a shared host, your host might have set a limit to override yours.
Otherwise, try making the POST size limit higher than the file upload size; uploads arrive as POST data.
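For instance, with a 30M upload limit, giving the POST body some headroom for the other form fields might look like:

```ini
; upload_max_filesize caps a single file; post_max_size caps the whole
; POST body (the file plus the other form fields), so keep it larger.
upload_max_filesize = 30M
post_max_size = 32M
```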
If the problem is the host overriding your timeout, you could look for a host that still uses Apache 1: in Apache 1, the local .htaccess overrides the global setting, even for the timeout.
Otherwise, there are dozens of Java applet uploaders available for just a few dollars (Google it). They split the file, upload the parts, and put them back together transparently.
This is a guaranteed fix for the timeout, with the added advantage of letting users pause and resume their upload, see progress, and so on.
(Flash-based uploaders don't have this advantage.)