Linux background task maximum process time - php

Is there any fixed limit on how long a background task can run?
This is how I run the script (background task) manually:
php /var/www/html/app_v2/console.php massbulkinsert app.example.com 10 > /dev/null &
This script processes a huge data set; it takes about an hour to complete.
The first time it stopped at the 10,100th record; the second time at the 9,975th record. There is no pattern to when it terminates.
According to top, the script and the MySQL process were at roughly 98%, 100% and 130% CPU most of the time, and free memory was about 200 MB. There is enough disk space.

It's a bit of a wild guess, but usually when you succeed with a smaller amount of data and then get crashes with larger amounts, it has to do with memory issues.
You should have a look at /etc/php5/cli. There is probably also a folder named cgi in there; depending on how your framework executes the background script, I would expect one of these two configurations to be used.
Files with the .ini extension are PHP configuration files, and these are among the values you're interested in (the values shown are the defaults on Debian 8):
; Maximum execution time of each script, in seconds
; http://php.net/max-execution-time
; Note: This directive is hardcoded to 0 for the CLI SAPI
max_execution_time = 30
; Maximum amount of memory a script may consume
; http://php.net/memory-limit
memory_limit = -1
Note that there is also a timeout on how long the script may spend reading data sent to it through, say, a pipe (max_input_time). But judging by your command, you're not piping values to it via stdin; you are most likely reading a file that is already on disk.
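To confirm which SAPI and which limits actually apply to your background task, you can run a quick check from the same shell that launches it (a minimal sketch; the values on your system may differ):
php -r 'echo php_sapi_name(), PHP_EOL; echo "max_execution_time=", ini_get("max_execution_time"), PHP_EOL; echo "memory_limit=", ini_get("memory_limit"), PHP_EOL;'
If this prints cli and max_execution_time=0, the execution-time limit is not what is killing the job, and memory (or an external factor such as the OOM killer) becomes the more likely suspect.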
Hope it helps

Related

Is there any way to prevent out of memory from owncloud?

I have an ownCloud installation on Ubuntu with 5.3 GB RAM.
Every day I get an out-of-memory condition which kills the MySQL process and brings the ownCloud website down, so I have to restart that server (dedicated to ownCloud) at least once a day (not a good practice...).
This is a ps aux --sort -pmem | more;
There are more than 50 processes like files:scan -all. They increase continuously until OOM.
I read something about OCC but I can't figure out how to disable it.
I tried to edit mpm_prefork.conf and set;
I also read about editing overcommit_memory and disabling it, but I don't want a kernel panic either.
Every process uses about 1.1% of memory. Within a few hours it reaches 100% and the kernel kills MySQL and other processes.
Any idea/solution?

Php and max_execution_time

I need one of my scripts (not all of them) to run for only five seconds, and if execution has not finished within five seconds, it should be aborted.
So I use
ini_set('max_execution_time', 5);
Also if I do
ini_get('max_execution_time');
it shows five seconds, but the script is not interrupted after 5 seconds.
P.S.
safe_mode = off
nginx -> php-fpm
set_time_limit(5) also has no effect
You can use the set_time_limit function at the top of your code as follows:
set_time_limit(5);
NB: If you don't place it at the top of your code: suppose the PHP script has already run for 3 seconds when you call set_time_limit(5); the total allowed execution time would then be 3 + 5 = 8 seconds, not 5 seconds as expected.
Update
From php documentation:
Any time spent on activity that happens outside the execution of the
script such as system calls using system(), stream operations,
database queries, etc. is not included when determining the maximum
time that the script has been running. This is not true on Windows
where the measured time is real.
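That last point is likely the key here: under nginx/php-fpm on Linux, time spent waiting on sleep(), database queries or streams does not count toward max_execution_time; only actual PHP execution does. A minimal sketch to illustrate the difference (timings are illustrative only):
<?php
set_time_limit(5);

// Counts toward the limit: pure CPU work in PHP.
// This loop is aborted with a fatal "Maximum execution time exceeded" error.
while (true) {
    $x = sqrt(mt_rand());
}

// Would NOT count toward the limit on Linux: time spent sleeping
// (or waiting on a database) happens outside script execution.
// sleep(60);
If the slow part of your script is I/O or query time, set_time_limit() will not interrupt it; you would have to enforce the timeout at that layer instead (e.g. a query or socket timeout).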

How does memory_limit work in PHP?

Suppose I have two scripts and my memory_limit option is set to 64M (for example). If I run script 1 and it takes 40M, and I simultaneously run script 2, how much memory does script 2 have free?
Does every script have up to 64M, or do they share this memory?
Each PHP script running independently is subject to its own memory limit (as set by the configuration option memory_limit). This limit is inherited by all included scripts (as those are dependent on the parent script).
An example - two scripts running in parallel:
you open a webpage /script1.php
you also open a webpage /script2.php
As these two are completely independent (there is no simple way to discover that "the other script" is running, or even that it exists), they will get a 64M limit each - so the amount of memory they are allowed to use is 64M for one, which comes out to 128M for two. (To clarify, it is not possible to "share" this memory between scripts: if script1.php only consumes 1MB, it can not "give" the rest of its limit to script2.php)
A different scenario - one script including another script:
you open a webpage /script1.php
that script has a line require('script2.php')
In this case, there is only a single 64MB limit: this is still considered a single script, no matter how many files it include()s or require()s. In other words, script2.php inherits the limit (as well as the other PHP settings) from script1.php and all the memory used here is counted towards that limit.
Note that it is possible to change this limit from inside the script (if your server's configuration allows this - most do). Using ini_set('memory_limit', '128M'); sets the new limit to 128 MB - but all the memory used by the script so far still counts towards this limit.
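A minimal sketch of checking and raising the limit from inside a script (the 128M value is just an example):
<?php
echo ini_get('memory_limit'), PHP_EOL;           // e.g. "64M", inherited from configuration
echo memory_get_usage(true), " bytes in use\n";  // memory already counted toward the limit

ini_set('memory_limit', '128M');                 // raise the limit, if the server allows it
echo ini_get('memory_limit'), PHP_EOL;           // now "128M"; memory used so far still counts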
Scripts including other scripts and dynamically changing the memory limit:
you open the webpage /script1.php (in script1.php, limit 64 MB set from configuration)
that runs require('script2.php') (in script1.php, limit 64 MB inherited)
that runs ini_set('memory_limit','200M') (in script2.php, changed explicitly to 200 MB)
that runs require('script3.php') (in script2.php, limit 200 MB as set above)
that runs require('script4.php') (in script3.php, limit 200 MB inherited)
that runs ini_set('memory_limit','-1') (in script4.php, changed explicitly to "no limit")
that runs require('script5.php') (in script4.php, "no limit on memory" inherited)
Note: It is possible to set a lower limit than the current one, but this risks immediately overrunning it ("we're using 80 MB" - "set limit to 64 MB" - "out of memory error"); because of this risk, it is rarely done.
Note also that memory_limit is a configuration setting and as such it can be set in various places; or its modification can be prevented by the system administrator.
From the PHP Manual:
This sets the maximum amount of memory in bytes that a script is
allowed to allocate. This helps prevent poorly written scripts for
eating up all available memory on a server. Note that to have no
memory limit, set this directive to -1.
Meaning each script gets its own 64M of memory.

php max execution time ignoring php.ini

I am trying to export a large database via phpMyAdmin. I keep getting an error that the script stopped because the maximum execution time of 600 seconds was reached (or something like that). I tried setting max_execution_time in php.ini to 0 and to -1. The change takes effect, as I can see in phpinfo(), but I am still getting the error. Another strange thing is that originally (before I changed it to 0) it wasn't 600 either; it was 180! Where is this 600 set?
See if it is manually set somewhere. Assuming you are on a UNIX-type platform:
find /path/to/root/of/phpmyadmin -name "*.php" -print0 | xargs -0 grep "max_execution_time"
Your web server can have other timeout configurations that may also interrupt PHP execution. Apache has a Timeout directive and IIS has a CGI timeout function. See your web server documentation for specific details.
Don't use phpMyAdmin to import large files. Try using the mysql CLI to import a dump of your DB. Transfer the SQL file to the server and execute the following on the server, either directly in a shell or from a PHP script via shell_exec or system:
mysql --user=user --password=password database < database_dump.sql
Of course the database has to exist, and the user you provide should have the necessary privilege(s) to update the database.
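If you do want to kick the import off from PHP rather than a shell, a minimal sketch might look like this (the credentials, database name and file path are placeholders):
<?php
// Hypothetical values -- replace with your own.
$user = 'user';
$pass = 'password';
$db   = 'database';
$dump = '/path/to/database_dump.sql';

$cmd = sprintf(
    'mysql --user=%s --password=%s %s < %s',
    escapeshellarg($user),
    escapeshellarg($pass),
    escapeshellarg($db),
    escapeshellarg($dump)
);

// shell_exec returns the command's output (or null on failure).
echo shell_exec($cmd . ' 2>&1');
Note that a password passed on the command line is visible in the process list; for anything beyond a one-off import, a MySQL option file is the safer route.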
PHP by default places resource limits on all PHP scripts using the following three directives:
=> max_execution_time : Maximum execution time of each script, in seconds (default 30 seconds)
=> max_input_time : Maximum amount of time each script may spend parsing request data (60 seconds)
=> memory_limit : Maximum amount of memory a script may consume (default 8MB)
Your PHP script may have timed out because of these resource limits. All you need to do is set new resource limits so that the script can run to completion.
If that doesn't work either, you can set it with the set_time_limit(N) function, which sets the time limit in seconds.
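Putting that together, a minimal sketch of loosening these limits at the top of a long-running script (the values are examples, and a shared host may cap or ignore them):
<?php
set_time_limit(0);                 // remove the execution time limit for this script
ini_set('memory_limit', '512M');   // raise the memory limit for this script only

// ... long-running export/import work here ...
Keep in mind that web-server timeouts (Apache's Timeout directive, FastCGI/CGI timeouts) are separate from PHP's limits and can still kill the request.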

PHP Script Times out after 45 seconds

I am running a huge import into my database (about 200k records) and I'm having a serious issue with my import script timing out. I used my cell phone as a stopwatch and found that it times out at exactly 45 seconds every pass (internal server error)... it only does about 200 records at a time, sometimes less. I scanned my phpinfo() and nothing is set to 45 seconds, so I am clueless as to why it would be doing this.
My max_execution_time is set to 5 minutes and my max_input_time is set to 60 seconds. I also tried setting set_time_limit(0); ignore_user_abort(1); at the top of my page, but it did not work.
It may also be helpful to note that my error file reads: "Premature end of script headers" as the execution error.
Any assistance is greatly appreciated.
I tried all the solutions on this page and, of course, running from the command line:
php -f filename.php
as Brent says, is the sensible way around it.
But if you really want to run a script from your browser that keeps timing out after 45 seconds with a 500 internal server error (as I found when rebuilding my phpBB search index) then there's a good chance it's caused by mod_fcgid.
I have a Plesk VPS and I fixed it by editing the file
/etc/httpd/conf.d/fcgid.conf
Specifically, I changed
FcgidIOTimeout 45
to
FcgidIOTimeout 3600
3600 seconds = 1 hour. That should be long enough for most cases, but adjust upwards if required. I saw one example quoting 7200 seconds in there.
Finally, restart Apache to make the new setting active.
apachectl graceful
HTH someone. It's been bugging me for 6 months now!
Cheers,
Rich
It's quite possible that you are hitting an enforced resource limit on your server, especially if the server isn't fully under your control.
Assuming it's some type of Linux server, you can see your resource limits with ulimit -a on the command line. ulimit -t will also show you just the limits on cpu time.
If your CPU time is limited, you might have to process your import in batches, as sketched below.
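A minimal sketch of batch processing, where load_rows() and insertRow() are hypothetical stand-ins for your own data source and insert logic:
<?php
$rows      = load_rows();   // e.g. parsed CSV rows
$batchSize = 500;

foreach (array_chunk($rows, $batchSize) as $batch) {
    foreach ($batch as $row) {
        insertRow($row);
    }
    // Collect cyclic garbage between batches to keep memory usage flat.
    gc_collect_cycles();
}
If the limit is on CPU time per process, running each batch as its own CLI invocation (tracking an offset between runs) is the more robust variant.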
First, you should be running the script from the command line if it's going to take a while. At the very least your browser would time out after 2 minutes if it receives no content.
php -f filename.php
But if you need to run it from the browser, try adding header("Content-type: text/html") before the import kicks off.
If you are on a shared host, then it's possible there are restrictions on the system whereby any long-running queries and/or scripts are automatically killed after a certain length of time. These restrictions are generally loosened for scripts not run through the web server, so running it from the command line would help.
The 45 seconds could be a coincidence; it could be how long it takes for you to reach the memory limit. Increasing the memory limit would look like this:
ini_set('memory_limit', '256M');
It could also be the actual DB connection that is timing out. What DB server are you using?
For me, mssql times out with an extremely unhelpful error, "Database context changed", after 60 seconds by default. To get around this, you do:
ini_set('mssql.timeout', 60 * 10); // 10 min
First of all, max_input_time and set_time_limit(0) will only work on a VPS or dedicated server. Instead, you can structure your implementation as follows (a sketch follows the list):
First, read the whole CSV file.
Then grab only 10 entries (rows) or fewer and make an AJAX call to import them into the DB.
Call the AJAX endpoint with 10 entries each time, and after each call echo something back to the browser. This way your script will never time out.
Follow the same method until all the CSV rows are finished.
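A minimal sketch of the server-side endpoint such an AJAX loop could call repeatedly; the file name, parameter names, CSV path and importRow() helper are all hypothetical:
<?php
// import.php -- imports a small slice of the CSV per request.
$offset    = (int) ($_GET['offset'] ?? 0);
$chunkSize = 10;

$rows  = array_map('str_getcsv', file('/path/to/data.csv'));
$slice = array_slice($rows, $offset, $chunkSize);

foreach ($slice as $row) {
    importRow($row); // your own insert logic
}

// Tell the client where to continue, or that the import is done.
echo json_encode([
    'next' => $offset + count($slice),
    'done' => count($slice) < $chunkSize,
]);
The browser-side loop keeps calling import.php with the returned next offset until done is true, so no single request runs long enough to hit the timeout.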
