I have a site developed in PHP with a function that downloads a zip file, extracts it, parses a CSV file inside, and inserts the data into a database.
This script runs for a very long time because the CSV file is large (many MB, depending on the content).
I have already tried this:
ini_set('max_execution_time', 0);
ini_set('memory_size', '500000M');
set_time_limit(0);
ignore_user_abort(1);
but it isn't working; after a few minutes the page times out and the script stops.
The configuration of my server is:
max_input_time -1
max_execution_time 0
memory_limit 512M
safe_mode Off
How can I get around this problem?
I have seen many questions about this, but none of the answers work for me.
I am using php-ffmpeg to convert uploaded videos. It was working great with all videos before.
But for the last few days, only small videos get converted and return a proper success response.
Large videos (more than 25 MB) upload and convert fine on the server, but the success response comes back as an error.
When I refresh the page manually or check the server, I can see that the video was uploaded and converted correctly; the only issue is that the success response never reaches me once the conversion finishes.
The code is the same for small and large videos, so small ones work fine while large ones have been failing for the last few days.
Any suggestions?
Set this rule in your upload file or in the main configuration file. Adding this line removes the memory limit for any file in WordPress, so you can upload a file of any size without limitation.
ini_set('memory_limit', '-1');
Hopefully this helps you solve your problem.
Maybe it's a time-limit or memory-limit problem, so you can add these lines:
set_time_limit(3600); // For example, or 0 for no time limit
ini_set('memory_limit', '2048M'); // For example, or -1 for no memory limit
Or you can change the php.ini file directly to set the configuration globally.
Here is just an idea:
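// Note: upload_max_filesize, post_max_size and max_input_time are PHP_INI_PERDIR,
// so ini_set() has no effect on them at runtime; set those in php.ini or .htaccess instead.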
ini_set('upload_max_filesize', '2000M');
ini_set('post_max_size', '2000M');
ini_set('memory_limit', '4048M');
ini_set('display_errors', 'On');
ini_set('max_execution_time', 0);
ini_set('max_input_time', 0);
set_time_limit(0);
error_reporting(E_ALL);
libxml_use_internal_errors(true);
--
and check these settings in php.ini as well (an illustrative snippet follows the list):
file_uploads
upload_max_filesize
max_input_time
memory_limit
max_execution_time
post_max_size
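Just as an illustration of what those might look like in php.ini (the values are placeholders, not recommendations):
file_uploads = On
upload_max_filesize = 2000M
post_max_size = 2000M
max_input_time = -1
max_execution_time = 0
memory_limit = 512M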
I'm trying to download a file from a remote server with file_put_contents. The script is called via Ajax. The problem I'm having is that the script times out when the file is large, e.g. 500 MB, and I get "504 Gateway Timeout - nginx".
download.php
$destination = "/home/mywebsite/public_html/wp-content/channels/videos/test.mp4";
$link = "http://www.example.com/videos/movie.mp4"; //500mb
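// fopen() returns a stream, so file_put_contents() copies it to disk in chunks
// rather than loading the whole 500 MB file into memory.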
$result = file_put_contents($destination, fopen($link, 'r'));
I'm using dedicated hosting. I've changed my php.ini and confirmed the values with phpinfo():
max_execution_time 7200
max_input_time 7200
max_input_vars 1000
memory_limit -1
output_buffering 4096
post_max_size 1200M
upload_max_filesize 1000M
This script keeps timing out. When I check the directory, the file has been downloaded successfully, but the page still times out, so I can't return any data via Ajax. Is there another solution? How do I solve this?
You should also change the nginx FastCGI timeout values. The PHP script continues executing, but the connection between nginx and PHP times out.
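A minimal sketch of what that could look like in the nginx server block, assuming PHP-FPM behind FastCGI (the socket path and timeout value are only illustrative):
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_pass unix:/run/php/php-fpm.sock;  # adjust to your FPM socket or port
    fastcgi_read_timeout 7200;                # allow long requests; match max_execution_time
}
Reload nginx afterwards (for example with nginx -s reload) so the new timeout takes effect.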
Make the download asynchronous. For example, one process only fills a database table or a RabbitMQ queue with download requests, and another process (maybe a cron job) consumes them.
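As a minimal sketch of that split, assuming a hypothetical download_queue table and PDO (the credentials, table and columns are made up for illustration):
// enqueue.php - the Ajax endpoint only records the request and returns immediately.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$stmt = $pdo->prepare("INSERT INTO download_queue (url, destination, status) VALUES (?, ?, 'pending')");
$stmt->execute([$_POST['url'], $_POST['destination']]);
echo json_encode(['queued' => true]);

// worker.php - run from cron; does the slow download outside any web request.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
$job = $pdo->query("SELECT * FROM download_queue WHERE status = 'pending' LIMIT 1")->fetch(PDO::FETCH_ASSOC);
if ($job) {
    file_put_contents($job['destination'], fopen($job['url'], 'r'));
    $pdo->prepare("UPDATE download_queue SET status = 'done' WHERE id = ?")->execute([$job['id']]);
}
The Ajax call then only waits for the insert, and the browser can poll the status column to find out when the file is ready.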
I have an HTML DOM parser and a PHP script that stores table data into MySQL. Now I'm getting a fatal error on line 18. Below is the code from line 18, which finds the table in the HTML web page. I have applied this script to many similar web pages of different sizes and amounts of content. The script works fine on smaller pages (around 100 KB or 200 KB) but fails on larger pages with more data (around 800 KB or 900 KB), so I think there is a memory limit on my server. Please help me resolve this issue.
.......
foreach($html->find('table#GridView1') as $e){
.......
Pasting the error message would give us more information to work with...
Anyway, to raise the memory limit in PHP, all you need to do is edit your php.ini (which may be in a directory such as /php5/ or /Windows/, depending on your setup).
Find this section:
; Maximum amount of memory a script may consume (128MB)
; http://php.net/memory-limit
memory_limit = XXM
Change it to a size that suits your system, then restart your Apache server.
Open simple_html_dom.php, go to line 65, which has:
define('MAX_FILE_SIZE', 600000);
By default it is set to 600000, which is about 600 KB, so change it to whatever your pages need.
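For example, to allow pages up to roughly 6 MB (the value is only illustrative):
define('MAX_FILE_SIZE', 6000000);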
Finally, after trying many approaches and spending 10 hours on this question, I found the solution. First I changed the MAX_FILE_SIZE limit in the HTML DOM parser as suggested by @Koen Hoeijmakers. Then, the most important thing to change on a dedicated server running CentOS 5 with the Kloxo panel is to set all the limits in .htaccess, as below:
php_value upload_max_filesize 2M
php_value max_execution_time 300
php_value max_input_time 600
php_value memory_limit 320M
php_value post_max_size 80M
and the error was gone. Thanks for your suggestions.
I have a script which reads an MP3 file with readfile() (I have also tried fread() in a loop). On one particular file I am getting a 500 error; all other files seem to read just fine.
When I review the error log on my server, I see a PHP memory error whenever this particular file is read.
Can anyone give me some pointers as to what might be going on?
You already pointed out the problem - PHP is running out of memory. Maximum memory usage in PHP is limited by an ini setting.
You can set it at runtime at the top of your script with ini_set:
ini_set('memory_limit', '64M');
Or you can set it in your php.ini file so it takes effect for all scripts:
memory_limit = 64M
Finally, you can see the memory used by the currently executing script with memory_get_usage().
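For instance, a quick check of how close a script gets to the ceiling:
// Compare what the script is actually using with the configured limit.
echo 'Current usage: ' . memory_get_usage(true) . " bytes\n";
echo 'memory_limit: ' . ini_get('memory_limit') . "\n";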
The MP3 file is larger than memory_limit. You'll need to increase the amount of memory PHP can use if you want to read this MP3 file that way.
If you have access to the php.ini file, you can find the memory_limit = 16M line and replace the value with however much memory you need.
If you don't have php.ini access and you do have .htaccess access, add:
php_value memory_limit 16M
and replace the value.
If you have neither, try compressing the MP3 file or otherwise reducing the amount of memory this action takes. Also try unsetting variables that are no longer used and take up large amounts of memory.
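Another option, if the memory is actually being eaten by output buffering rather than by the file itself, is to stream the file in small chunks. A minimal sketch, where $path stands in for the MP3 in question:
$path = '/path/to/file.mp3'; // hypothetical path
header('Content-Type: audio/mpeg');
header('Content-Length: ' . filesize($path));
while (ob_get_level()) {
    ob_end_clean(); // drop output buffers so chunks go straight to the client
}
$fh = fopen($path, 'rb');
while (!feof($fh)) {
    echo fread($fh, 8192); // send 8 KB at a time; memory use stays flat
    flush();
}
fclose($fh);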
Well, PHP is probably running out of memory before completing the script. You can increase the memory PHP is allowed to use by changing the memory_limit option in your php.ini.
Try increasing the execution time:
set_time_limit(300); //300 seconds
ini_set('memory_limit','16M'); //increase to some value like 16 MB
Most likely it exceeds the maximum memory. This can be adjusted in php.ini; see: http://www.php.net/manual/en/ini.core.php#ini.memory-limit
I'm confused... I can't seem to upload files in the 2 GB range. When I try using curl to send a 1.92 GB file to my site (through an API), it doesn't report anything at all; the response is just blank. When I send a 1 KB file, it reports back like it should.
When I try uploading via the upload form, it freezes midway, around 33%. I'm not sure whether only the progress bar has frozen or whether the actual file upload has been suspended; I suspect it's only the progress bar, because it still says data is being sent after the bar freezes.
My php.ini (yes, it's reflected by phpinfo() as well):
register_globals = Off
magic_quotes_gpc = Off
post_max_size = 2047M
upload_max_filesize = 2047M
max_execution_time = 25200 ; Maximum execution time of each script, in seconds
max_input_time = 25200 ; Maximum amount of time each script may spend parsing request data
memory_limit = 2048M ; Maximum amount of memory a script may consume (16MB)
short_open_tag = On
My VPS doesn't actually have 2 GB of RAM at its disposal, but does memory_limit really need to be set this high?
How should I go about testing this? I know 400 MB files work; I haven't tested anything between 400 MB and 1.92 GB.
You will need a premium account to test up to 2 GB, so here is one you can play with:
User: testreferral
Pass: 1234
http://filefx.com
I don't understand where this problem is arising.
Check for:
Memory limit. Try uploading files above and below the actual memory limit.
Time limit. Your uploads don't take 7+ hours, do they?
The effective settings. Some settings might be overridden by server or per-directory configuration; see the quick check below.
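A quick way to confirm the values PHP actually sees for a request (control-panel or per-directory overrides sometimes win over php.ini) is a throwaway script like this sketch:
// Print the effective settings for this request.
foreach (['upload_max_filesize', 'post_max_size', 'memory_limit', 'max_execution_time', 'max_input_time'] as $key) {
    echo $key . ' = ' . ini_get($key) . PHP_EOL;
}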
PHP: MySQL query skipped/ignored after large file uploads?
MySQL was timing out during the file upload, so the file wasn't showing up in the DB.
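One way to cope with that, as a rough sketch assuming mysqli (the credentials, table and column are made up for illustration): ping the connection after the long upload step and reconnect before running the query.
$db = new mysqli('localhost', 'user', 'pass', 'app'); // hypothetical credentials
// ... long-running upload / conversion work happens here ...
if (!$db->ping()) {
    // MySQL closed the idle connection during the upload; open a fresh one.
    $db = new mysqli('localhost', 'user', 'pass', 'app');
}
$db->query("INSERT INTO uploads (filename) VALUES ('movie.mp4')");
Raising MySQL's wait_timeout is the server-side alternative if you can't change the code.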