Laravel ignores my PHP settings and times out - php

I need to run a really slow PHP/MySQL script once, on my local server.
The problem is that Laravel times out after 60 seconds with the message "Maximum execution time of 60 seconds exceeded".
I have set
max_execution_time = 360
and
max_input_time = 360
in my php.ini. The settings are there (checked phpinfo()) but Laravel still times out after 60 seconds. Is there anything in Laravel that I can set as well?

I don't think Laravel overrides PHP settings. After changing the settings in your ini file, you have to restart the server for them to take effect.
So check whether you restarted your server after changing the settings.
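If you only need the longer limit for this one slow script, another option is to raise it at runtime at the top of the route, controller method, or command that runs it. This is a minimal sketch using plain PHP functions rather than anything Laravel-specific; the 360 just mirrors your php.ini value:
// Raise the limit for this request only; it takes effect immediately, no restart needed.
set_time_limit(360);
// Equivalent alternative:
// ini_set('max_execution_time', '360');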

Related

Ajax request is getting cancelled

In a PHP application I am uploading 20-30 files at once. Each file is around 100-200 MB, which means I am uploading more than 2 GB of data to the server.
Because the upload takes around 20-30 minutes, a general AJAX polling job gets cancelled after some time.
I have following configuration:
upload_max_filesize = 4096M
post_max_size = 4096M
max_input_time = 600
max_execution_time = 600
During this process my CPU usage only goes up to 10-20%. I have a Linux machine with 32 GB of RAM and 12 cores.
The application is running on PHP 8.0, Apache 2, MySQL 8, Ubuntu 20.
Can anyone suggest what else I can check?
max_execution_time: This sets the maximum time in seconds a script is
allowed to run before it is terminated by the parser. This helps
prevent poorly written scripts from tying up the server. The default
setting is 30. When running PHP from the command line the default
setting is 0.
max_input_time: This sets the maximum time in seconds a script is
allowed to parse input data, like POST and GET. Timing begins at the
moment PHP is invoked at the server and ends when execution begins.
The default setting is -1, which means that max_execution_time is used
instead. Set to 0 to allow unlimited time.
I think you should change them to:
max_input_time = 1800 and max_execution_time = 1800 (30 minutes)
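If the limits still do not seem to apply after a restart, it is also worth confirming which php.ini the web server actually loads and what the effective values are for a web request. A small check along these lines (the output naturally depends on your setup):
// Which ini file this SAPI actually loaded (Apache and the CLI can use different ones).
echo php_ini_loaded_file(), PHP_EOL;
// Effective limits as PHP sees them for this request.
echo ini_get('max_execution_time'), ' / ', ini_get('max_input_time'), PHP_EOL;
echo ini_get('upload_max_filesize'), ' / ', ini_get('post_max_size'), PHP_EOL;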

Wordpress timeouts at 30 seconds

I have a WordPress blog and there is a plugin that does around 20,000 SQL inserts on a user request. I noticed that the process takes a long time, which is normal, but the request times out at 30 seconds.
I checked the PHP settings and noticed that PHP's max_execution_time was 30 seconds, so I increased it to 90, but the request still times out at 30 seconds (I even logged what ini_get('max_execution_time') returns and it says "30"). Then I checked whether there are any Apache directives that limit request time and found the "Timeout" directive ( http://httpd.apache.org/docs/2.2/mod/core.html#timeout ).
Its value was 60 and I increased it to 90 as well, but the problem persists: the request times out after 30 seconds, just as it did before I changed anything.
As a note: I restart the server after every modification.
By modifying your PHP Settings
That’s not easy, as you need to have access to your server, or a way to change your PHP settings. If you have access to your php.ini, you need to look for the max_execution_time variable and set it to the number of seconds you would like, 60 seconds for example.
max_execution_time = 60
If that doesn’t work, or you can’t access your php.ini, you can also try to set this variable using the .htaccess file at the root of your WordPress install. You can add this line.
php_value max_execution_time 60
If you set the value to 0 (instead of 60, here), the process will be allowed to run forever. Don’t do this; you will run into much bigger issues that are extremely difficult to resolve.
By calling a function in PHP
This is generally not recommended. But if you know what you are doing, you can tell PHP that you would like the process to run for more time. For this, you could write this call in your theme, in functions.php for example.
set_time_limit(100);
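As a rough sketch of where such a call could live in functions.php (the 'init' hook and the 300-second value here are only assumptions for illustration, not something the plugin requires):
// functions.php - raise the limit while WordPress handles the heavy request.
add_action('init', function () {
    set_time_limit(300);
});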

How to prevent timeout when running a time consuming PHP

I am using a PHP script that fetches a lot of data from several sites and writes it to the server, producing files larger than 500 MB, but the process fails partway through with a 500 INTERNAL ERROR. How can I adjust PHP's timeout so that the process runs until it is completed?
If you want to increase the maximum execution time for your scripts, then just change the value of the following setting in your php.ini file-
max_execution_time = 60
If you want more memory for your scripts, then change this-
memory_limit = 128M
One more thing: if you keep on processing input (GET or POST), then you need to increase this as well-
max_input_time = 60
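If you cannot edit php.ini (on shared hosting, for example), a rough alternative is to raise what you can from inside the long-running script itself. A minimal sketch, assuming your host does not block these calls:
// At the top of the long-running script; the values are examples, not recommendations.
set_time_limit(0);                 // remove the execution time limit for this request
ini_set('memory_limit', '512M');   // allow more memory while building the large files
// Note: max_input_time cannot be changed at runtime; it has to go in php.ini or .htaccess.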
You have to adjust some settings in php.ini to solve this problem.
There are a few options that could be causing it.
Could you please post your php.ini config?
Which kind of web server do you use? Apache?

MYSQL The connection has timed out even I have defined in php.ini

I am running some reports that take more than 300 seconds to execute and finally display in the browser.
I have already set the max execution time in my code and also in php.ini.
ini_set('max_execution_time', 500);
I am using MySQL Workbench to monitor the execution, but at 300 seconds the browser shows
The connection has timed out. The server at localhost is taking too long to
respond.
I just need to extend it to 400-500 sec and all of my reports will start working smoothly.
How can I do that?
Have you tried this:
ini_set('max_execution_time', 0);
This sets the maximum execution time to unlimited. You could set it right before calling your report and set it back to a regular value, e.g. 300, right after your report is done.
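A minimal sketch of that pattern; generate_report() is just a hypothetical placeholder for your actual report code:
$previous = ini_get('max_execution_time');
ini_set('max_execution_time', 0);           // unlimited while the report runs
generate_report();                          // hypothetical placeholder for the report logic
ini_set('max_execution_time', $previous);   // restore the regular limit afterwards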
Your web server can have other timeout configuration settings that may also interrupt PHP execution. Apache has a Timeout directive, which defaults to 300 seconds. Try changing the directive in your conf file (httpd.conf for Apache).
For more details, see the Apache Timeout directive documentation.
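For example, in httpd.conf (the exact file and the value you need depend on your setup, and Apache has to be restarted afterwards):
Timeout 600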

WAMP 2.2e - phpmyadmin Fatal error: Maximum execution time of 30 seconds exceeded

All services are running (the WAMP icon is green), but when I try to open phpMyAdmin I get this error. What could be the problem?
You can set the max execution time, like #HanhNghien said in the comment, in your php.ini.
max_execution_time = 120
max_input_time = 120
But I think the better question is why phpMyAdmin needs so much time. Perhaps you should check your Apache logs and see if there are any errors.
I fixed this in the variables section of phpMyAdmin by setting the timeout to an insanely high number. It wouldn't accept '0'.
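As another option on the phpMyAdmin side, it has its own script time limit, $cfg['ExecTimeLimit'] in config.inc.php, which you can raise or set to 0 to disable instead of using an arbitrarily large number in the variables page; a sketch:
// config.inc.php - phpMyAdmin's own execution time limit, in seconds; 0 disables it.
$cfg['ExecTimeLimit'] = 600;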
