PHP script times out after 45 seconds

I am running a huge import to my database (about 200k records) and I'm having a serious issue with my import script timing out. I used my cell phone as a stopwatch and found that it times out at exactly 45 seconds every pass (internal server error)... it only does about 200 records at a time, sometimes less. I scanned my phpinfo() and nothing is set to 45 seconds, so I am clueless as to why it would be doing this.
My max_execution_time is set to 5 minutes and my max_input_time is set to 60 seconds. I also tried setting set_time_limit(0); ignore_user_abort(1); at the top of my page but it did not work.
It may also be helpful to note that my error file reads: "Premature end of script headers" as the execution error.
Any assistance is greatly appreciated.

I tried all the solutions on this page and, of course, running from the command line:
php -f filename.php
as Brent says, is the sensible way around it.
But if you really want to run a script from your browser that keeps timing out after 45 seconds with a 500 internal server error (as I found when rebuilding my phpBB search index) then there's a good chance it's caused by mod_fcgid.
I have a Plesk VPS and I fixed it by editing the file
/etc/httpd/conf.d/fcgid.conf
Specifically, I changed
FcgidIOTimeout 45
to
FcgidIOTimeout 3600
3600 seconds = 1 hour. Should be long enough for most but adjust upwards if required. I saw one example quoting 7200 seconds in there.
Finally, restart Apache to make the new setting active.
apachectl graceful
HTH someone. It's been bugging me for 6 months now!
Cheers,
Rich

It's quite possible that you are hitting an enforced resource limit on your server, especially if the server isn't fully under your control.
Assuming it's some type of Linux server, you can see your resource limits with ulimit -a on the command line. ulimit -t will also show you just the limit on CPU time.
If your CPU time is limited, you might have to process your import in batches.
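If you do hit a CPU limit, a rough sketch of batching across multiple runs, where each invocation handles a fixed number of rows and remembers where it stopped (the CSV path, offset file, and import_row() helper are made up for illustration, not from the question):

<?php
// batch_import.php - run it repeatedly (shell loop or cron) until it prints "finished".
// Each run touches only $batchSize rows, so no single process exceeds the CPU limit.
$batchSize  = 2000;
$offsetFile = __DIR__ . '/import.offset';
$offset     = is_file($offsetFile) ? (int) file_get_contents($offsetFile) : 0;

$rows  = array_map('str_getcsv', file('import.csv'));
$chunk = array_slice($rows, $offset, $batchSize);

foreach ($chunk as $row) {
    import_row($row);   // placeholder for your actual insert logic
}

file_put_contents($offsetFile, $offset + count($chunk));
echo count($chunk) < $batchSize ? "finished\n" : "more to do\n";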

First, you should be running the script from the command line if it's going to take a while. At the very least, your browser would time out after 2 minutes if it receives no content.
php -f filename.php
But if you need to run it from the browser, try adding header("Content-Type: text/html") before the import kicks off.
If you are on a shared host, it's possible the system has restrictions whereby any long-running queries and/or scripts are automatically killed after a certain length of time. These restrictions are generally loosened for scripts that aren't run through the web server, which is another reason running it from the command line would help.
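If the browser route really is unavoidable, a minimal sketch of keeping the connection alive by emitting output as the import runs; the batch size, $records variable, and import_record() helper are assumptions for illustration:

<?php
header('Content-Type: text/html');
$n = 0;
foreach ($records as $record) {   // $records: whatever you are importing
    import_record($record);        // placeholder for your insert logic
    if (++$n % 100 === 0) {
        echo '.';                  // a byte of progress every 100 rows
        flush();                   // push it to the client so the browser doesn't give up
    }
}
echo " done ($n rows)";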

The 45 seconds could be a coincidence; it could be how long it takes you to reach the memory limit. Increasing the memory limit would look like this:
ini_set('memory_limit', '256M');
It could also be the actual DB connection that is timing out. What DB server are you using?
For me, MSSQL times out with an extremely unhelpful error, "Database context changed", after 60 seconds by default. To get around this, you do:
ini_set('mssql.timeout', 60 * 10); // 10 min
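If you are not sure whether it is the time limit, the memory limit, or the DB connection, logging both as the script dies can settle it; a small sketch using a shutdown handler (the log message format is just an example):

<?php
$start = microtime(true);
register_shutdown_function(function () use ($start) {
    // Runs even when the script is killed by a fatal error such as a timeout
    // or an exhausted memory limit, so the log shows how far it got.
    error_log(sprintf(
        'import stopped after %.1fs, peak memory %.1f MB',
        microtime(true) - $start,
        memory_get_peak_usage(true) / 1048576
    ));
});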

First of all, max_input_time and set_time_limit(0) will only help on a VPS or dedicated server. Instead, you can structure your implementation along the lines below (see the sketch after this list):
First, read the whole CSV file.
Then grab only 10 entries (rows) or fewer and make an AJAX call to import them into the DB.
Call the AJAX endpoint with 10 entries each time, and echo something out to the browser after each call; with this method your script will never time out.
Follow the same method until all the CSV rows are finished.
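A rough sketch of the server-side half of that idea; the parameter names, CSV path, and insert_rows() helper are invented for illustration, and the JavaScript that calls it with an increasing offset is left out:

<?php
// import_chunk.php - called repeatedly via AJAX with an increasing offset.
$offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
$limit  = 10;

$all   = array_map('str_getcsv', file('upload.csv'));
$chunk = array_slice($all, $offset, $limit);

if ($chunk) {
    insert_rows($chunk);   // placeholder for your DB insert logic
}

// Tell the calling JavaScript whether to request another chunk, and from where.
header('Content-Type: application/json');
echo json_encode([
    'done'       => count($chunk) < $limit,
    'nextOffset' => $offset + count($chunk),
]);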

Related

Storage::copy in Laravel doesn't copy a large file completely

I'm trying to copy a 10GB file to another directory in my local disk using this code
Storage::copy( 'file/test.txt', 'file2/dest.txt' );
But when I check it on the destination path it only copied 1.7GB out of 10GB.
It didn't show any timeout errors at all.
Is there any work around on this?
As long as a PHP script can be invoked by a request, malicious code could turn the system into an infinite loop, or an attacker (denial of service) could make a script run for a very long time.
To prevent this, PHP ships with default settings that stop a script from running for an excessive period and abort execution once the timeout passes without a response being returned.
Usually these limits are sufficient; having to copy a 10 GB file within one script run is an unusual case.
I suppose you have a regular HDD with a speed of about 60 MB/s: with an execution time of 30 seconds, it will copy about 1800 MB, which matches what you are seeing.
Temporary fix
10000 MB / 60 MB/s ≈ 167 seconds; I round that up to 180 seconds to leave a margin.
In your php.ini, in the Resource Limits section, there are parameters for the timeout. Uncomment them and set them to 180 seconds:
max_execution_time = 180
max_input_time = 180
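If you would rather not raise the limit globally, a minimal sketch of raising it only for the request that does the copy, assuming your host honours set_time_limit() (some shared hosts ignore it):

<?php
use Illuminate\Support\Facades\Storage;

// e.g. in a Laravel controller method or route closure
set_time_limit(180);   // ~3 minutes for this request only, per the estimate above
Storage::copy('file/test.txt', 'file2/dest.txt');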

mysql server is responding too slow or not responding

I use MySQL through the phpMyAdmin interface. I have no problem with the Apache server; it responds as it did before. But when I try to access the phpMyAdmin page, it takes a huge amount of time to load. After a long time it comes back with this message:
Fatal error: Maximum execution time of 30 seconds exceeded in
C:\xampp\phpMyAdmin\libraries\classes\Dbi\DbiMysqli.php on line 213
I have changed the value of the variable
$cfg['ExecTimeLimit']
from 300 to 1200. I think that is why I am now able to see the loaded page, but after it loads I can't do anything with the interface because it takes too much time to respond.
I have tried the things mentioned in the following link:
WAMP/XAMPP is responding very slow over localhost
Can anyone help me get rid of this problem? It has been wasting a huge amount of my time for several days.
I was having the same problem. You could edit your php.ini file and set max_execution_time = 120, but that doesn't always work. Another suggestion is to paste these two lines into your code; this solved my problem. They affect the execution time of the script itself, so it gets more time to run than the 30 seconds:
ini_set('max_execution_time', 300);
set_time_limit(0);
I don't know if you already use this, but it is the easiest way to make the errors visible:
error_reporting(E_ALL);
ini_set('display_errors', 1);

PHP script timeout in Log Analyzer

I have been trying to get Log Analyzer to work for longer than I care to admit. I can't seem to get syslog messages to display in the Log Analyzer web-GUI, but this morning I got the following error:
"While reading the logstream, the php script timeout forced me to abort at this point. If you want to avoid this, please increase the LogAnalyzer script timeout in your config.php. If the user system is installed, you can do that in Admin Center."
I was not getting this error on Friday; only "No syslog records found." The timeout is set to 30 seconds in the config file, but I read that setting will get overwritten back to the default anyway. The database grew to over 4GB over the weekend. Does the db size have anything to do with this?
It's pretty clear I am new to php and Log Analyzer, so any help with both would be greatly appreciated. I can post config file settings if needed.
Try increasing the time; 30 seconds may not be enough to parse a large file. For example, set 600 seconds.
Another way to increase the timeout is to edit the following line in your php.ini file and raise the value (for example to 600):
; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
max_execution_time = 30
There are too many records in the database, so the script does not have enough time to process everything. I keep only the last 14 days in the database; that's enough.
USE Syslog;
DELETE FROM Syslog.SystemEvents WHERE ReceivedAt < DATE_SUB(NOW(), INTERVAL 14 DAY);

php max execution time ignoring php.ini

I am trying to export a large database via phpMyAdmin. I keep getting an error that the script stopped because the maximum execution time of 600 seconds was reached (or something like that). I tried setting max_execution_time in php.ini to 0 and -1. The change takes effect, as I can see it in phpinfo(), but I am still getting the error. Another strange thing is that originally (before I changed it to 0) it wasn't 600 either. It was 180! Where is this 600 set?
See if it is manually set somewhere. Assuming you are on a UNIX type platform:
find /path/to/root/of/phpmyadmin -name "*.php" -print0 | xargs -0 grep "max_execution_time"
Your web server can have other timeout configurations that may also interrupt PHP execution. Apache has a Timeout directive and IIS has a CGI timeout function. See your web server documentation for specific details.
Don't use phpMyAdmin to import large files. Try using the mysql CLI to import a dump of your DB. Transfer the SQL file to the server and execute the following on the server, either directly or from a PHP script via shell_exec or system:
mysql --user=user --password=password database < database_dump.sql
Of course the database has to exist, and the user you provide should have the necessary privilege(s) to update the database.
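Since the question is about exporting, the same advice works in the other direction with mysqldump; a hedged sketch of driving it from PHP via shell_exec, where the credentials and paths are placeholders:

<?php
set_time_limit(0);   // the heavy lifting happens in mysqldump, but don't let PHP cut it short
$cmd = sprintf(
    'mysqldump --user=%s --password=%s %s > %s 2>&1',
    escapeshellarg('user'),
    escapeshellarg('password'),
    escapeshellarg('database'),
    escapeshellarg('/path/to/database_dump.sql')
);
shell_exec($cmd);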
PHP by default places resource limits on all php scripts using the following three directives:
=> max_execution_time : Maximum execution time of each script, in seconds (default 30 seconds)
=> max_input_time : Maximum amount of time each script may spend parsing request data (60 seconds)
=> memory_limit : Maximum amount of memory a script may consume (default 8MB)
Your PHP script probably timed out because of these resource limits. All you need to do is set new resource limits so that the script can run to completion.
If that doesn't work either, you can use the set_time_limit(N) function, which sets the time limit in seconds.
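Before overriding anything, it can be worth printing the limits the running script actually sees, since a per-directory or per-vhost configuration may differ from the php.ini you edited; a small sketch:

<?php
foreach (['max_execution_time', 'max_input_time', 'memory_limit'] as $directive) {
    // ini_get() reports the effective value for this request
    echo $directive, ' = ', ini_get($directive), PHP_EOL;
}
echo 'loaded php.ini: ', php_ini_loaded_file(), PHP_EOL;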

How do you get a Cronjob executing a PHP script to run longer than 30 seconds?

How can I rewrite this into a cron job that will run every day for longer than 30 seconds? Also, do I need to edit the .htaccess or php.ini file in the cron.php directory to say something? Over the browser it runs just fine for longer than 30 seconds; over the shell, it runs just fine too. But as a cron task, it dies after 30 seconds. I'm on 1and1 shared hosting.
0 12 * * * php5 /this/is/the/file/cron.php
There are several things that could be terminating your script. One could be the maximum execution time set in the php.ini file. If that's the case, you can override it in your script with set_time_limit(0); where zero means no limit and any number greater than zero is the number of seconds to allow the script to run for before being terminated. It's important to note that this time does NOT include the time it takes for the browser to make the request, so file upload time wouldn't count here.
If you're in a shared hosting environment (like Dreamhost), they have process watchers that will kill off any PHP process after a set time limit. You cannot get around these. You would need to contact the hosting provider to see what you need to do to get permission to run the script for longer (for Dreamhost, they want you to be on their PS offering).
Use this syntax to start php:
php -c /path/to/another/php.ini /this/is/the/file/cron.php
Then you can specify a different timeout (or no timeout) in a different php.ini file.
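If maintaining a second php.ini feels like overkill, the CLI's -d option can override a single directive for that one invocation; for example, using the same hypothetical path as above:
php -d max_execution_time=0 /this/is/the/file/cron.php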
ini_set('max_execution_time', 600);
Add this to the top of your php file and it will run for 600 seconds. Anything more is not recommended but you can have a go if you want.
You don't need to set a higher max_execution_time if you use PHP CLI:
CLI SAPI default value for "max_execution_time" is set to unlimited.
http://nl3.php.net/manual/en/features.commandline.differences.php
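You can confirm that from the shell; run under the CLI SAPI, this should print 0 (unlimited):
php -r 'echo ini_get("max_execution_time"), PHP_EOL;'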
I would just use wget http://path.to.myscript.php
If it's dying after 30 seconds, you may need to set max_execution_time = 60 in your php.ini to allow the script to run longer than 30 seconds.
You could also use ini_set('max_execution_time', 60)
But as the manual page says, in some cases (i.e. running in safe mode) this will have no effect at all: http://uk.php.net/manual/en/info.configuration.php#ini.max-execution-time
It's also possible that the php.ini used for the command line has different max execution values than the one used for the browser. I have seen that sometimes.
