I have a Symfony application using Sonata Admin and Sonata Media that my client uses to upload MP4 videos. The problem is that these files keep getting bigger (> 1 GB) and I now run into "out of memory" Apache errors.
The website is hosted on an AWS EC2 Ubuntu instance (PHP, Apache2).
I set the php.ini parameters like this:
upload_max_filesize = 1G
post_max_size = 2G
memory_limit = -1
I did this for both cli and apache2 under /etc/php/7.0/. I tried various combinations and higher values, and I always restart Apache afterwards. I even tried rebooting the instance. I still can't upload more than 350 MB.
Are you sure of the right syntax? I think the value is counted in bytes, so you would just write the plain number without a "G" or "M" suffix.
But I'm not sure, try it and report back :)
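You can confirm what value PHP actually sees with a quick throwaway script (the file name is just an example):
<?php
// check_limits.php — temporary file in the web root; open it in the browser, then delete it.
echo 'Loaded php.ini: ' . php_ini_loaded_file() . "<br>";
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . "<br>";
echo 'post_max_size: ' . ini_get('post_max_size') . "<br>";
echo 'memory_limit: ' . ini_get('memory_limit') . "<br>";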
OK, I found a solution. Here's how I did it.
First of all, I stopped my EC2 instance, detached the volume and made a snapshot of it.
From the snapshot, I created a bigger volume (30 GB for now). I reattached it to my EC2 instance, which I rebooted.
I logged in through SSH and set up a swap space (3 GB), tweaking it a little following this.
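For reference, creating a 3 GB swap file on Ubuntu usually comes down to something like this (the exact steps in the guide you follow may differ):
sudo fallocate -l 3G /swapfile
sudo chmod 600 /swapfile
sudo mkswap /swapfile
sudo swapon /swapfile
echo '/swapfile none swap sw 0 0' | sudo tee -a /etc/fstab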
I cleared the cache, restarted my EC2 instance one last time and... well, I just uploaded 1 GB in 20 minutes.
One issue though: by stopping and starting my EC2 instance I lost my public IP and had to update my DNS zone. It's going to take a few hours for the website to be back...
Hope this will be helpful to someone.
Check your .htaccess file in the project root directory, and make sure the size is not being limited somewhere in the Symfony project with PHP functions (ini_set() calls), which would override the php.ini settings.
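When PHP runs as an Apache module, the kind of .htaccess entries that override php.ini look like this (values are only examples), so these are the lines to look for, or to add with your own limits:
php_value upload_max_filesize 1G
php_value post_max_size 2G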
Edit your php.ini file: upload_max_filesize = 100000
If you write 1G it may not be counted as 1 GB; the value can end up being read as a plain number, so you may have to spell the limit out instead of using the "G" suffix.
I'm using GoDaddy Linux shared hosting, and I'm using php.ini to configure the maximum file upload size:
file_uploads = On
post_max_size = 100M
upload_max_filesize = 100M
But this is not working; the maximum file size allowed is still 2 MB even though I have already configured it. I have tried naming the file "php5.ini" and "php56.ini" and storing it in "/public_html/" and in the root folder, and it still does not work. At first I suspected it was a problem with SSL, because I only installed SSL two weeks ago, but I found that another hosting account of mine (with no SSL installed), which I configured and tested before, has the same problem.
Does anyone know what the problem is? Am I the only one running into this, and how can I fix it?
I found the answer: you can change the max upload size directly in the system. Follow this tutorial: https://www.youtube.com/watch?v=YmS92xfmbFU
I'm trying to upload files over 150 MB with a PHP script, which works without any problem for files under 40 MB. But when I try to upload files of 150 to 200 MB, the upload process resets after about half of the file has been uploaded.
This repeats again and again, with each new attempt resetting at the same point, until the time limits in the php.ini file kill the process.
The first thing I tried was increasing those values in my php.ini file:
post_max_size=450M
memory_limit=300M
max_execution_time=1600
max_input_time=1600
upload_max_filesize=400M
and file_uploads is of course set to On.
I also put this line to top of my upload.php file:
set_time_limit(0);
I'm running that website on Windows Server 2008 R2 with Parallels Plesk 12.0.8.
I have searched for this problem on Google extensively, but none of the solutions work in my case.
I read about the same scenario in this question, where the asker replied that he solved the problem by increasing the client_max_body_size value in the nginx.conf file, but I could not find any such file on my Windows server, so I assume it only exists on Linux systems.
What could cause this problem?
I've solved it.
I changed the PHP handler from FastCGI Application to CGI Application in Plesk, and that solved it.
But be careful: it caused a couple of errors on other pages of the website, which I then had to fix.
Hope that helps.
I have the following problem in Laravel.
I would like to upload a file through a form, but for some reason, if the file is larger than around 2100 KB, the validation fails and says that the file is 'required' and that I did not provide it.
I've read numerous articles saying that this can be caused by php.ini settings. On my server they are the following:
upload_max_filesize 64M
post_max_size 64M
These values are copied from the output of phpinfo(), so they are in effect.
And despite this, the upload fails even for a 2 MB file. Do you have any ideas what I could check/set to solve this?
I am using Laravel 5.2 and PHP 7.
Check which server software you are using. Nginx, for instance, has its own limit (set to 1 MB by default, I believe). Apache might have one too. Consult the respective manuals for those packages on how to configure them. Or, if you're using shared hosting, contact support to see if they can increase the limit.
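For nginx, the directive in question is client_max_body_size; roughly something like this in nginx.conf (the value is just an example), followed by a reload:
# inside the http, server or location block
client_max_body_size 64m;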
That said, this isn't a really scalable solution. Next time you might want to upload a 100 MB file, and you probably don't want to allow 100 MB requests on your servers. A better approach is to split the file into smaller chunks in the frontend with JavaScript, submit them as parts of the same upload, and then recombine the parts on the server once the file is completely uploaded. Be aware of the additional checks you'll have to do here, though.
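A very rough sketch of the server-side "recombine" step could look like the following (the uploadId / chunkIndex / totalChunks field names are made up for the example, and all validation and error handling is left out):
<?php
// chunk_upload.php — receives one chunk per request and stitches the file together
// once the last chunk has arrived.
$uploadId    = preg_replace('/[^A-Za-z0-9_-]/', '', $_POST['uploadId']);
$chunkIndex  = (int) $_POST['chunkIndex'];
$totalChunks = (int) $_POST['totalChunks'];

$partDir = sys_get_temp_dir() . '/chunks_' . $uploadId;
if (!is_dir($partDir)) {
    mkdir($partDir, 0700, true);
}

// store this chunk under its index
move_uploaded_file($_FILES['chunk']['tmp_name'], $partDir . '/' . $chunkIndex);

// when every chunk is present, concatenate them in order into the final file
if (count(glob($partDir . '/*')) === $totalChunks) {
    $out = fopen(sys_get_temp_dir() . '/final_' . $uploadId, 'wb');
    for ($i = 0; $i < $totalChunks; $i++) {
        fwrite($out, file_get_contents($partDir . '/' . $i));
        unlink($partDir . '/' . $i);
    }
    fclose($out);
    rmdir($partDir);
}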
You might want to incorporate the following into your own code:
<?php
//--- this tries to override the default of only 2M file uploads.
//--- note: upload_max_filesize and post_max_size are PHP_INI_PERDIR settings, so
//--- ini_set() may have no effect on them; setting them in php.ini or .htaccess is more reliable.
ini_set("upload_max_filesize", "25M");
ini_set("max_execution_time", 600); //--- 10 minutes
ini_set("post_max_size", "35M");
ini_set("file_uploads", "On");
?>
In my case, it was a disk space issue: there was not enough space left to store the file.
Laravel should really handle this with a proper message instead of indicating that the user didn't upload anything.
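If you want to fail with a clearer message yourself, you could check the free space before storing the file. A rough sketch inside the Laravel controller method that handles the upload (the "video" field name is only an example):
$needed = $request->file('video')->getSize();
if (disk_free_space(storage_path('app')) < $needed * 2) { // keep some headroom
    return back()->withErrors(['video' => 'Not enough free disk space on the server.']);
}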
If you are not using any other package to handle the file uploads, then remember to restart Apache.
I'm working with acquia-drupal 7 (just localhost for now), inside Microsoft WebMatrix.
I can't get my php upload limit to increase from the default 2MB. Having googled around I have done the following:
added to every existing .htaccess:
php_value upload_max_filesize 10M
php_value post_max_size 10M
created a php.ini file in every dir that had an .htaccess (no php.ini files existed anywhere), containing the following:
upload_max_filesize = 10M
post_max_size = 10M
Restarted the site (through the WebMatrix GUI)
No apparent change whatsoever after any of this - my site still has the 2MB upload limit.
Thoughts?
The reason your changes are not taking effect is that you just created a php.ini file anew, rather than finding the one that is actually being used.
By definition, since you just created it, it won't be the one that lives in your PHP directory. :-)
But don't worry! There's an easy way to find the correct php.ini file that your site is using:
Go to /admin/reports/status on your drupal site. Here you will see information about which version of php and apache you're using, etc.
The line for 'PHP' on that page should show both the version of PHP you're using (something like 5.3.6) and a 'more information' link.
Click on that link and you should be able to see detailed information about the PHP installation on your machine.
Find the line called 'Configuration File (php.ini) Path' on that screen and navigate to that file to update it.
Just like you already knew, make sure you restart your server after any changes. :-)
Let us know if this fixes your problem!
Have you ever run into a problem where you needed to upload relatively large files and still wanted to be able to manage them from the Drupal 7 administrative interface? If so, you may run into a situation like the one below:
You will notice the text stating that we can only upload files that are 12 MB and under. In this case I needed this number to be a little bigger.
In order to do this you will need to modify your PHP settings in your php.ini file.
Note: You should make sure you know what you are doing and understand the consequences of increasing this number. In my case this is on a site where only users that I trust will be uploading files. If you allow any user to upload files, increasing this number can add load on the server and possibly eat up your disk space pretty quickly.
Now that you have been warned, here is how I was able to do this. I first found the php.ini file on my system. I am on an Ubuntu server, so I was able to edit mine using vim like so:
vim /etc/php5/apache2/php.ini
Change the upload_max_filesize setting
The first step was to change the upload_max_filesize setting from 12 MB to 30 MB.
Change:
upload_max_filesize = 12M
To:
upload_max_filesize = 30M
Change the post_max_size setting
You may also need to modify the post_max_size setting. I changed the post_max_size php.ini setting from 20MB to 30MB.
Change:
post_max_size = 20M
To:
post_max_size = 30M
Restart Apache
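On an Ubuntu server that is typically just:
sudo service apache2 restart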
You should check phpinfo() to make sure the php.ini you edited is the one actually in use.
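A temporary file with phpinfo() in it works well for that (remove it when you are done):
<?php
// info.php — temporary file in the web root; check the "Loaded Configuration File"
// line, then delete this file.
phpinfo();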
I am trying to upload a 1 GB file using a PHP script, and it works perfectly if the file is smaller than 20 MB. When I increase the file size, after pressing the upload button on the website, the file appears to upload (I guess so, as it takes a few minutes), but then, instead of executing upload.php, Firefox asks me to download upload.php. So I guess the file is being uploaded but my PHP script fails to execute.
Also, after searching on Google I found the following settings for php.ini, which I applied, and phpinfo() shows me that the settings have indeed changed.
/*php.ini start*/
memory_limit = 512M
post_max_size = 15000M
file_uploads = On
upload_max_filesize = 15000M
max_input_time = 20000
max_execution_time = 20000
session.gc_maxlifetime = 20000
/*php.ini end*/
File upload size is limited by a LOT more than just PHP. PHP has its limits, Apache has its limits, many web services have their own limits, and then you have the limit on the /tmp directory size, which could be set by user permissions on a shared host. Not to mention just running out of hard drive space!
Your php.ini looks good, but as was already suggested, check LimitRequestBody in Apache and make sure your web service allows uploads this large.
One common solution when you need to upload very large files is to use Flash or Java on the client side, since these have access to the client's actual file system. The Flash/Java applet can then break the file into small pieces which are sent one at a time and reassembled on the server side. This has multiple benefits. First, you can resume broken uploads. Second, you don't have to worry about any scripts timing out. Third, it bypasses any possible file size limits that may be in play from PHP, Apache, web services, or other sources.
post_max_size should be higher than upload_max_filesize, not equal to it.
For example, with upload_max_filesize = 15000M you can set post_max_size = 15001M; I just increased it by 1.
Check that your web server also allows such large uploads. On Apache the setting is LimitRequestBody. This limit is applied long before PHP ever enters the picture and cannot be changed or overridden from within PHP.
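For reference, raising (or effectively disabling) that limit looks like this in the vhost or .htaccess (the value is in bytes; 0 means no limit):
# allow request bodies up to 1 GB
LimitRequestBody 1073741824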