I have this code that uses mysqldump to back up a MySQL database. The problem is I'm getting this fatal error:
Fatal error: Maximum execution time of 60 seconds exceeded in C:\wamp\www\pos\php\backupdb.php on line 13
Line 13 is the final line.
<?php
$backupFile = 'c:\\onstor'. date("Y-m-d-H-i-s") . '.sql';
$command = "mysqldump --opt -u root -p onstor > $backupFile";
system($command);
?>
What do I do? I think the code is okay, since I've tried the command in the command prompt and it worked.
Is it bad that I have added the path to mysql/bin to the environment variables?
The problem, as the error message states pretty plainly, is that your script is running for too long. Scripts executed through a web server are not meant to run for more than a few seconds. You can change that using set_time_limit, but what you should really do is run long-running scripts from the command line. Since the only thing you're doing is running a CLI command anyway, just ditch the PHP wrapper completely and make it a shell script if necessary. Run that script regularly as a cron job or, on Windows, as a Task Scheduler task.
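If you want to keep it in PHP, a minimal standalone CLI version might look like this (a rough sketch; the backup path and the way credentials are supplied are placeholders you would adapt):
<?php
// backup.php - run from the command line / Task Scheduler, not through the web server
set_time_limit(0); // no limit; 0 is already the default for CLI runs
$backupFile = 'C:\\backups\\onstor-' . date('Y-m-d-H-i-s') . '.sql'; // hypothetical target path
$command = 'mysqldump --opt -u root --password=secret onstor > ' . escapeshellarg($backupFile); // placeholder credentials
system($command, $exitCode);
if ($exitCode !== 0) {
    fwrite(STDERR, "mysqldump failed with exit code $exitCode\n");
}
?>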
Your code looks okay; the dump is just taking a lot of time, that's all. Read this: Fatal error: Maximum execution time of 400 seconds exceeded. Do what is written there first and then look for any problem in your code.
The message says you hit the 60-second execution time limit.
You can change it using the set_time_limit function, e.g.:
set_time_limit(120); // 2 minutes
but I don't know why you would call mysqldump from PHP; it seems dangerous to me.
Related
I have a PHP console script that is called via cron and that, among other things, creates a tar file of a directory.
When calling the PHP script via cron, the tar file is not created correctly. The following error is given when viewing the tar file:
gzip: stdin: unexpected end of file
tar: Unexpected EOF in archive
tar: Error is not recoverable: exiting now
When calling the PHP script manually via the console, the tar file is created correctly. The cron log output shows no errors.
Here is the tar call from the PHP script:
exec("cd $this->backupTempFolderName/$id; tar -czf ../../$this->backupFolderName/$tarFileName $dbDumpFileName documents");
Does anybody have an idea why the tar is created correctly when called manually and fails when called via cron?
Update: The error given while creating the tar file via cron is:
tar: ../../backup/20150819-060003.tar.gz: Wrote only 4096 of 10240 bytes
tar: Error is not recoverable: exiting now
Sometimes the error is:
tar: ../../backup/20150819-054002.tar.gz: Cannot write: Broken pipe
tar: Error is not recoverable: exiting now
As said before, when executed via cron the tar file is created, but it is always about 50% of the correct size (compared to running the script manually):
-rw-r--r-- 1 gtz gtz 1596099468 Aug 19 06:25 20150819-042330.tar.gz <- Manually called skript, working tar
-rw-r--r-- 1 gtz gtz 858570752 Aug 19 07:21 20150819-052002.tar.gz <- Script called via cron, broken tar
Update 2
After doing some further research based on the input given here, I should add that the cron-called script is running on a virtual private server. I suspect that some limitations exist for cron jobs that are not documented by the hoster (only a limit on the minimum repetition interval is given in the docs).
That error usually comes from a lack of disk space.
I would do some more research on this, for example by adding some logging before and after the tar execution (see the sketch at the end of this answer).
Also check which user the cron job running the backup executes as. There may be a quota limit on that user that does not apply when you run the script from the console outside cron.
Ask your provider about quota limits on the VPS for users and for processes; that is what rings a bell here.
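As a concrete example of that logging, something like this around the exec() call would record free disk space and the tar exit code (a minimal sketch; the log path is a placeholder):
// minimal logging sketch; '/var/www/backup-debug.log' is a placeholder path
$log = '/var/www/backup-debug.log';
file_put_contents($log, date('c') . ' free space before tar: ' . disk_free_space($this->backupFolderName) . " bytes\n", FILE_APPEND);
exec("cd $this->backupTempFolderName/$id; tar -czf ../../$this->backupFolderName/$tarFileName $dbDumpFileName documents", $output, $exitCode);
file_put_contents($log, date('c') . " tar exit code: $exitCode, free space after: " . disk_free_space($this->backupFolderName) . " bytes\n", FILE_APPEND);
If the free space collapses or the exit code is non-zero only under cron, you have your answer.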
I guess that you have a resource limitation.
As M. Ivanov has said, add this command to your PHP script:
echo shell_exec('php -i');
and check this parameter both when you execute your script from the command line and when it runs from the cron job:
memory_limit => ???
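Another way to see the effective limits is to log them from inside the running script itself, for example (a rough sketch; the log path is a placeholder):
// rough sketch: log the limits the current process actually runs with ('/tmp/backup-env.log' is a placeholder)
file_put_contents(
    '/tmp/backup-env.log',
    date('c') . ' sapi=' . PHP_SAPI
        . ' memory_limit=' . ini_get('memory_limit')
        . ' max_execution_time=' . ini_get('max_execution_time') . "\n",
    FILE_APPEND
);
Compare the line written by a manual run with the one written by the cron run.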
You can also try running your cron job with an increased memory limit, e.g. 1600M:
php -d memory_limit=1600M scriptCompressor.php
Hope that helps :)
You may want to try creating the tarball directly from PHP to avoid the exec call. See this answer: https://stackoverflow.com/a/20062628/5260945.
Also, looking at your cron entry, there is no leading slash in your example. I know this could just be a typo in the comment, but make sure you use an absolute path for the cd command. The default environment for a cron job is not the same as for your login shell.
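If you go the PHP route, a rough sketch using PharData might look like this (the paths are placeholders; adapt them to your backup folders):
// rough sketch: build the archive with PharData instead of exec()'ing tar (paths are placeholders)
$tar = new PharData('/path/to/backup/20150819.tar');
$tar->buildFromDirectory('/path/to/backupTemp/123'); // directory holding the dump file and documents/
$tar->compress(Phar::GZ);                            // writes /path/to/backup/20150819.tar.gz
unlink('/path/to/backup/20150819.tar');              // keep only the compressed archive
This also sidesteps any cd/relative-path issues, because everything is an absolute path.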
Cron jobs by themselves usually don't have limits. If you are using shared hosting, the hoster may have installed some enforcement scripts, but I suspect those would also break your console backups.
If you are running the cron job from within some container, e.g. Drupal, it has its own special limits.
Also check bash limits with:
ulimit -a
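To compare what cron sees with what your console shell sees, you could capture those limits from inside the cron-called script, for example (a small sketch; the log path is a placeholder):
// small sketch: log the limits of the shell that cron-spawned PHP uses ('/tmp/cron-limits.log' is a placeholder)
file_put_contents('/tmp/cron-limits.log', date('c') . "\n" . shell_exec('ulimit -a'), FILE_APPEND);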
Report the free disk space before the backup starts and afterwards, just in case; it's usually quite small on a VPS.
I am sure it's a memory or execution time problem.
Do one thing: run the same script on a directory that contains only a single test file and check the output. If your script works in that scenario, you can be fairly sure it's a memory problem.
Try tweaking the memory parameter and executing your script again.
I hope this helps you.
Thanks
Looking at the following error:
tar: ../../backup/20150819-054002.tar.gz: Cannot write: Broken pipe
tar: Error is not recoverable: exiting now
I read that as the exec function in the PHP script not blocking, or erroring out prematurely, so the PHP process started by the cron job exits before the command finishes. This is just a guess, but you can try sending the command to the background when you run it from cron:
exec("cd $this->backupTempFolderName/$id; tar -czf ../../$this->backupFolderName/$tarFileName $dbDumpFileName documents &");
exec() should normally block until the command finishes, so this is just a shot in the dark.
http://php.net/manual/en/function.exec.php
The error occurs when your PHP file is executed and gives an EOF error. It may mean there is a problem somewhere in your PHP file: check the code of your cron script, as it may be that you forgot to close the brackets of a condition or a class, etc.
Good luck.
To write the errors to the log when the script is executed via cron, replace >/path/to/application/app/logs/backup-output.log in the cron line with >/path/to/application/app/logs/backup-output.log 2>&1
Also check the path in the cron line; maybe the change of directory is not working as you think. Try printing getcwd() to a log or something when running the PHP script from cron.
Edit: I wonder why this was voted not useful. The questioner mentioned that no errors are printed to the log when cron executes the script. That's not hard to imagine, as > redirects only STDOUT, not STDERR (on which PHP errors would be printed), to the log. So adding 2>&1 might reveal some new information.
I have seen this question on here before, so I am sorry for the repetition, but I have still not found an answer to my problem.
I have a bash script that takes a while to run. It needs to be passed variables set by a user on a webpage (don't worry, there will be plenty of validation for security, etc.).
I can get the bash file to start, but it dies after maybe 20 seconds when run from the webpage.
When run from the terminal, it runs absolutely fine.
Ok so I have the following:
$bashFile = shell_exec('./CoinCreationBashFile.sh "'.$coinName.'" "'.$coinNameAbreviation.'" "'.$blockReward.'" "'.$blockSpacing.'" "'.$targetTimespan.'" "'.$totalCoins.'" "'.$firstBitAddy.'" "'.$seedNode.'" "'.$seedName.'" "'.$headline.'" ');
Now this executes the bash file, but I read up about shell_exec in PHP with nohup and came up with the following:
$bashFile = shell_exec('nohup ./CoinCreationBashFile.sh "'.$coinName.'" "'.$coinNameAbreviation.'" "'.$blockReward.'" "'.$blockSpacing.'" "'.$targetTimespan.'" "'.$totalCoins.'" "'.$firstBitAddy.'" "'.$seedNode.'" "'.$seedName.'" "'.$headline.'" >/dev/null 2>&1 &');
But this still died after a short time :(
So I read up about set_time_limit and max_execution_time and set these to something like 10000000 in the php.ini file... yet still no joy :(
I just want to run a bash script without it timing out and exiting. I don't really want to have to put an intermediate step in there, but someone suggested I look at ZeroMQ to "detach the worker from the process", so I may have to go that route.
Many thanks in advance.
Don't try running a script via the browser if it takes more than 60 seconds; instead, run it over SSH or as a cron job.
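One way to do that without changing the bash script is to have the web page only queue the job and let a cron-run worker execute it (a rough sketch; queue.php, worker.php and the spool directory are made-up names, and $params is assumed to hold the already-validated values):
// queue.php (web): only store the validated parameters and return immediately
file_put_contents('/var/spool/coinjobs/' . uniqid('job_', true) . '.json', json_encode($params));

// worker.php (run from cron, e.g. every minute): execute queued jobs with no web-server timeout
foreach (glob('/var/spool/coinjobs/*.json') as $jobFile) {
    $args = json_decode(file_get_contents($jobFile), true);
    shell_exec('./CoinCreationBashFile.sh ' . implode(' ', array_map('escapeshellarg', $args)));
    unlink($jobFile);
}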
I'm trying to run php script from CLI like this:
php -q /var/www/script.php
As far as I know, if you run it from the CLI there is no max_execution_time, but since I use functions from required/included files, after 5-10 minutes I get a fatal error:
PHP Fatal error: Maximum execution time of 60 seconds exceeded in /var/www/include.php on line 10
So max_execution_time does not apply to included files?
Is it possible to avoid this without adding set_time_limit(0) to every included file?
Perhaps one of the included files calls (for whatever reason) set_time_limit(60)? If that's the case, you could probably work around it by calling set_time_limit(0) after every include in your PHP CLI script, or edit the files containing the set_time_limit(60), which might of course lead to unwanted side effects...
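For example (a minimal sketch, assuming /var/www/include.php is the file that lowers the limit):
<?php
// script.php, run via: php -q /var/www/script.php
set_time_limit(0);              // unlimited; this is already the CLI default
require '/var/www/include.php'; // may call set_time_limit(60) internally
set_time_limit(0);              // reset the limit in case the include lowered it
// ... long-running work ...
?>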
Try this?
php -d max_execution_time=5 script.php
With PHP on my Windows 7 localhost, I'm trying to dump a MySQL database via system(mysqldump <parameters>). But when I run the code, my page hangs (not responding), with the loading sign never ending. The test database is quite tiny. What's wrong with it?
Here are the steps I've taken:
system ('mysqldump -u rootname -p rootpw dbname > output.sql');
Added the full path to mysqldump.exe to the Windows environment variables.
When I run the code, an output.sql file appears, but it is 0 KB in size and the page is not responding.
Try increasing the time limit:
// If system call
set_time_limit(600);
// Otherwise
ini_set('max_execution_time', 600);
set_time_limit reference
max_execution_time reference
Also, if you want to see what your database is doing, run SHOW FULL PROCESSLIST from the mysql command line.
I have a command that, when run directly on the command line, works as expected. It runs for over 30 seconds and does not throw any errors. When the same command is called through a PHP script via the exec() function (which is contained in a script called by a cron), it throws the following error:
Maximum execution time of 30 seconds exceeded
We have a number of servers, and I have run this command on a very similar server with the exact same dataset without any issues, so I'm happy there is no script-level issue. I'm becoming more inclined to think this is related to something at the server level, either in the PHP setup or the server setup, but I'm really not sure where to look. For those who are interested, both servers have a max execution time of 30 seconds.
The command itself is called like this.
From the command line as:
root#server>php -q /path/to/file.php
This works...
And via cron within a PHP file as:
exec("php -q /path/to/file.php");
This throws the max execution time error. It was always my understanding that there was no execution time limit when PHP is run from the command line.
I should point out that the script that is called calls a number of other scripts, and it is one of these scripts that is erroring. Looking at my logs, the max execution time error actually occurs before 30 seconds have even elapsed! So, less than 30 seconds after being called, a script called by a cron script that appears to be running as CLI is throwing a max execution time error.
To check that the script is running as I expected (as CLI with no max execution time), I performed the following check:
A PHP script containing this code:
// test.php
echo exec("php test2.php");
where test2.php contains:
echo ini_get('max_execution_time');
and this script is run like this:
root#server> php test.php
// returns 0
This proves that a script called in this way is running under CLI with a max execution time of 0, which confirms my thinking. I really cannot see why this script is failing on max execution time!
It seems that your script takes too much time to execute; try setting the time limit: http://php.net/manual/en/function.set-time-limit.php
Or check this post:
Asynchronous shell exec in PHP
Does the command take over 30 seconds on the command line? Have you tried increasing the execution timeout in php.ini?
You can temporarily set the timeout by including this at the top of the script. This will not work when running in safe mode, as specified in the documentation for setting max_execution_time with ini_set():
<?php
ini_set('max_execution_time', 60); // Set to be longer than
// 60 seconds if needed
// Rest of script...
?>
One thing of note in the docs is this:
When running PHP from the command line the default setting is 0.
What does php -v | grep cli show, when run both from the shell and via the exec command in the cron-loaded PHP file?
Does explicitly typing /usr/bin/php (modify as appropriate) make any difference?
I've actually found what the issue is (kind of). It seems it may be a bug where PHP reports max_execution_time as exceeded when the error is actually with max_input_time, as described here.
I tried changing the exec call to php -d max_execution_time=0 -q /path/to/file.php and got the error "Maximum execution time of 0 seconds exceeded", which makes no sense. I then changed it to php -d max_input_time=0 -q /path/to/file.php and the code ran without erroring. Unfortunately, it's still running 10 minutes later. At least this proves that the issue is with max_input_time, though.
I'm surprised that no one above has actually timed the completed exec call. The problem is that exec(x) takes much longer than running x on the command line. I have a very complex Perl script (with 8 levels of internal recursion) that takes about 40 seconds to execute from the command line. Using exec inside a PHP script to call the same Perl program takes about 300 seconds, i.e. a factor of about 7x longer. This is such an unexpected effect that people aren't increasing their max execution time sufficiently to see their programs complete; as a result, they are mystified by the timeout. (BTW, I am running WAMP on a fast machine with nominally 8 CPUs, and the rest of my PHP program is essentially trivial, so the time difference must be entirely in the exec.)
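To see this for yourself, time the call directly (a small sketch; the Perl command is a made-up placeholder):
// small sketch: time the exec() call itself ('perl C:\\scripts\\job.pl' is a placeholder command)
$start = microtime(true);
exec('perl C:\\scripts\\job.pl', $output, $exitCode);
printf("exec took %.1f seconds (exit code %d)\n", microtime(true) - $start, $exitCode);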
Create a wrapper.sh file as below:
export DISPLAY=:0
xhost + 2>>/var/www/err.log
/usr/bin/php "/var/www/read_sms1.php" 2>>/var/www/err.log
and put it in cron as below:
bash /var/www/wrapper.sh
My read_sms1.php contains:
$ping_ex = exec("/usr/local/bin/gnokii --getsms SM 1 end ", $exec_result, $pr);
The above solution worked fine for me on Ubuntu 12.04.