Write LogFile with date on running service - php

I've created a service with reactphp which runs and does some stuff. It is started as a daemon so all output should be logged in a file.
This log file should be named 'foo-log-file-$(date "+%F")'. I want to have a single log file for each day.
Problem:
As mentioned the script runs as a service, without stopping. The starting call for the script is therefore only done once.
php my_script.php >> /var/log/bar/log-file--$(date "+%F") 2>&1
So everything this script prints to the console is saved into the file, but the file is named with the date string from the moment the service was started and is never switched to a new file when the date changes.
Question:
Is it possible to solve this without writing the log logic in the PHP script? Can I handle this requirement with bash?

FYI
The answer from #fedorqui was a good approach. I solved it with a cronjob which copies the file to a dated one and then empties the original.
You cannot use mv, because the running service keeps the file open the whole time, so you get the error:
cannot move 'foo.log' to 'bar.log': Text file busy
So I cp it and clear the original with:
cp foo.log foo.log.$(date +"%F");
cp /dev/null foo.log;
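For reference, here is a minimal sketch of that cron-driven rotation; the log path and the script name rotate-foo-log.sh are assumptions, not taken from the original setup:
#!/bin/bash
# rotate-foo-log.sh -- copy the live log into a dated file, then empty it in place.
# Emptying (rather than moving) keeps the same inode, so the still-running
# service keeps writing into the file it already has open.
LOG=/var/log/bar/foo.log
cp "$LOG" "$LOG.$(date +%F)"
cp /dev/null "$LOG"
Scheduled shortly before midnight, e.g. with the crontab entry 59 23 * * * /usr/local/bin/rotate-foo-log.sh, this yields one dated file per day without touching the PHP script.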

Related

Logging script commands and errors in linux

2 part question,
I am running a script that executes a second script.
I have it set up this way because I read that if I put a script in the /etc/init.d directory it will run at start up. (True or false?)
I have tried adding >> LoopTriggerLogging.log at the end of each line but nothing comes out in the log file
So I have a first script as follows
#!/bin/bash
/var/www/Dev/LoopTrigger.sh
exit
This triggers the following script to run
#!/bin/bash
while true; do
    # do some work
    php /var/www/Dev/FindNearestDriverAndSendPush.php
    php /var/www/Dev/ProcessCustomerPayment.php
    php /var/www/Dev/ProcessDriversPayment.php
    # write to LoopTriggerLogging.log
    sleep 2 # sleep and repeat
done
What I would like is to have the commands logged along with any errors. I have tried to read a little on this but get lost in the answers and what they are trying to tell the user. I am still new at this and learning, so kindly define any commands or options you use. I am open to a best-practice scenario.
Also, by putting it in the /etc/init.d directory, will this tell the script to run at start up?
Is there a way to run this script without it taking up the command line, since it's an endless script?
My ultimate goal is to get the 3 php files to execute every 2 seconds with some sort of logging.
I did some reading on cron, but it seems it is not meant for this type of use case.
I've also seen this:
exec > logfile 2>&1 (don't know what this does)
set -x makes bash print every command before executing it
FOO=BAR (don't know what this means)
echo $FOO (don't know what this means)
if I put a script in the /etc/init.d directory it will run at start up. (true or false)
True. If you put a script in init.d then that script will run for every startup.
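As a small, hedged addition: on many distributions the script also has to be executable and registered with the init system before it runs at boot. A minimal sketch for a Debian/Ubuntu-style SysV setup, assuming the init script is named LoopTrigger:
# make the init script executable and register it for the default runlevels
sudo chmod +x /etc/init.d/LoopTrigger
sudo update-rc.d LoopTrigger defaults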
My ultimate goal is to get the 3 php files to execute every 2 seconds
You are using the correct way of running it approximately every 2 seconds (depending on how long your PHP scripts take to run). Crontab runs at a minimum interval of one minute, so that would not be an option.
I have tried adding >> LoopTriggerLogging.log at the end of each line
but nothing comes out in the log file
You can use /var/www/Dev/LoopTrigger.sh >> LoopTriggerLogging.log in your first script so that whenever it runs it will:
- create the file the first time and append the content from the next time onwards, and
- push all the logs of the second script into the file.
Note: As logs keep appending to this single file, it will become very large at some point, so make sure you handle that well.
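To capture both the commands and their errors (the actual goal of the question), here is a minimal sketch of the second script using the two constructs listed above; the log path is an assumption:
#!/bin/bash
# send everything this script prints -- STDOUT and STDERR -- into one log file
exec >> /var/log/LoopTriggerLogging.log 2>&1
# print each command before it runs, so the log shows what was executed
set -x
while true; do
    php /var/www/Dev/FindNearestDriverAndSendPush.php
    php /var/www/Dev/ProcessCustomerPayment.php
    php /var/www/Dev/ProcessDriversPayment.php
    sleep 2
done
The exec line redirects the whole script once instead of appending >> to every command, and set -x is the option mentioned in the question that makes bash print every command before executing it.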

why is my "at job" not executing my php script when created through a php webpage?

$output = shell_exec('echo "php '.$realFile.'" | at '.$targTime.' '.$targDate.' 2>&1');
print $output;
Can someone please help me figure out why the above line isn't doing what it's supposed to do? The idea is for it to create an 'at' job that will execute a PHP script. If I switch to the user apache (which will ideally control the at function when the PHP file is complete) I can run
echo "php $realFile.php" | at 00:00 05/30/17
and it'll do EXACTLY what I want. The problem is that the above snippet from my PHP file will not create the at job correctly. When I do an at -c job# on both of them, the job made from my file is about a third of the length, missing the user info and everything. It basically starts at PATH= and goes down; it doesn't include HOSTNAME=, SHELL=, SSH_CLIENT=, SSH_TTY=, USER=. I assume it needs most of this info to run correctly. The end output (below) is always the same, though; it just doesn't have any of the top part for some reason. Let me know if you need more info. I didn't want to paste all of my code here as it contains job-specific information.
${SHELL:-/bin/sh} << 'marcinDELIMITER0e4bb3e8'
php "$realFile".php
marcinDELIMITER0e4bb3e8
It doesn't seem to be a permission issue, because I can su to apache and run the exact command needed. The folder the files are located in is also owned by apache. I've also resorted to giving each file I try to run 777 or 755 permissions through chmod, so I don't think that's the issue.
I figured out a couple of ways around it a while back. The way I'm using right now is an ssh2 connection to my own server as root and creating the job that way. No compromise, as you have to enter the password manually each time. Really bad workaround. The main issue is that apache doesn't have the correct permissions to do everything needed for the at job, so someone figuring that out would be awesome. Another option I found on a random webpage would be to use sudo through the PHP script, but that is basically the same minus having to reconnect to your own server. Any other options would be appreciated.
Reading the manual and logs would be a good place to start. In particular:
The value of the SHELL environment variable at the time of at invocation will determine which shell is used to execute the at job commands. If SHELL is unset when at is invoked, the user’s login shell will be used; otherwise, if SHELL is set when at is invoked, it must contain the path of a shell interpreter executable that will be used to run the commands at the specified time.
Other things to check are that the user is included in at.allow, SELinux is disabled, and the webserver is not running in a chroot.
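A minimal diagnostic sketch for those checks; the apache username and the date come from the question, while the script path is a placeholder:
# is the web user allowed to use at? (/etc/at.allow wins over /etc/at.deny if it exists)
grep -x apache /etc/at.allow /etc/at.deny
# is SELinux enforcing?
getenforce
# what environment does the web user actually get?
sudo -u apache env | sort
# submit the job with SHELL set explicitly, in line with the man page behaviour quoted above
echo "php /path/to/script.php" | SHELL=/bin/bash at 00:00 05/30/17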

Tar error: Unexpected EOF in archive when running via Cron/PHP

I have a PHP console script that is called via cron which itself among other things creates a tar file of a directory.
When calling the PHP script via cron, the tar file is not created correctly. The following error is given when viewing the tar file:
gzip: stdin: unexpected end of file
tar: Unexpected EOF in archive
tar: Error is not recoverable: exiting now
When calling the PHP script manually via console the tar file is created correctly. The cron log output shows no errors.
Here is the tar call from the PHP script.
exec("cd $this->backupTempFolderName/$id; tar -czf ../../$this->backupFolderName/$tarFileName $dbDumpFileName documents");
Does anybody have an idea why the tar is created correctly when called manually and fails when called via cron?
Update: The error given while creating the tar file via cron is:
tar: ../../backup/20150819-060003.tar.gz: Wrote only 4096 of 10240 bytes
tar: Error is not recoverable: exiting now
Sometimes the error is:
tar: ../../backup/20150819-054002.tar.gz: Cannot write: Broken pipe
tar: Error is not recoverable: exiting now
As said before, when executed via cron the tar file is created, but it is always about 50% of the correct size (compared to executing the script manually):
-rw-r--r-- 1 gtz gtz 1596099468 Aug 19 06:25 20150819-042330.tar.gz <- Manually called script, working tar
-rw-r--r-- 1 gtz gtz 858570752 Aug 19 07:21 20150819-052002.tar.gz <- Script called via cron, broken tar
Update 2
After doing some further research based on the input given here, I should add that the cron-called script is running on a virtual private server. I suspect that some limitations may exist for cron jobs that are not documented by the hoster (only the limit on minimum repetition time is given in the docs).
That error usually comes from a lack of disk space.
I would do some more research on this subject by adding some logs before and after the tar execution.
Also check which user your configuration uses for the cron job that runs the backup. It could be a quota limit on that user as well, one that doesn't apply when you run the script on the console outside cron.
Ask your provider about quota limits on the VPS for users and for processes... That is what rings the bell here.
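A minimal sketch of that kind of before/after logging, wrapped around the cron call; the wrapper name, log path, and PHP script path are assumptions:
#!/bin/bash
# backup-wrapper.sh -- have cron call this instead of the PHP script directly
LOG=/var/log/backup-diagnostics.log
{
    echo "=== $(date) before backup ==="
    df -h /                 # free disk space
    ulimit -a               # limits imposed on the cron environment
    php /path/to/backup-script.php
    echo "=== $(date) after backup ==="
    df -h /
} >> "$LOG" 2>&1
Comparing the before/after numbers from a cron run with those from a manual run should quickly show whether disk space or a per-user limit is the culprit.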
I guess that you have a resource limitation
As M. Ivanov said, add this command to your PHP script:
shell_exec("php -i");
and check this parameter both when you execute your script from the command line and from the cron job:
memory_limit => ???
You can also try running your cron job with the memory limit raised to 1600M:
php -d memory_limit=1600M scriptCompressor.php
Hope that helps :)
You may want to try creating the tarball directly from PHP to avoid the exec call. See this answer: https://stackoverflow.com/a/20062628/5260945.
Also, looking at your cron entry, there is no leading slash on your example. I know this could just be a typo for the comment, but make sure you have an absolute path for the cd command. The default environment for a cron job is not the same as for your login shell.
Cron jobs by themselves usually don't have limits. If you are using shared hosting, they may have installed some enforcing scripts, but I suspect those would also break your console backups.
If you are running cron jobs from some container, e.g. Drupal, those have special limits.
Also check bash limits with:
ulimit -a
Report the disk space before the backup starts and afterwards, just in case. It's usually quite small on a VPS.
I am sure it's a memory or execution time problem.
Do one thing: run the same script on a directory which contains only a single test file and check the output. If your script works in this scenario, then it's 100% a memory problem.
Try tweaking the memory parameter and execute your script.
I hope this helps you.
Thanks
Looking at the following error.
tar: ../../backup/20150819-054002.tar.gz: Cannot write: Broken pipe
tar: Error is not recoverable: exiting now
It looks to me as if the exec function in the PHP script is not blocking, or is erroring out prematurely, so the PHP session that gets called during the cron job exits before the command finishes. This is just a guess, but you can try to send this to the background when you run it from cron.
exec("cd $this->backupTempFolderName/$id; tar -czf ../../$this->backupFolderName/$tarFileName $dbDumpFileName documents &");
This command should be blocking so this is just a shot in the dark.
http://php.net/manual/en/function.exec.php
The error occurs when you try to execute the PHP file and it gives an EOF error. It means you should check the code of your cron PHP file; it may happen that you forgot to close the brackets of a condition or a class, etc...
Good luck
To write the errors to the log when the script is executed via cron, replace >/path/to/application/app/logs/backup-output.log in the cron line with >/path/to/application/app/logs/backup-output.log 2>&1
Also check the path in the cron line... maybe the change-dir is not working as you think it is. Try printing getcwd() to a log or something when running the PHP script from cron.
Edit: I wonder why this was voted not useful. The questioner mentioned that no errors are printed to the log when cron executes the script. That's not hard to imagine, as > only redirects STDOUT, not STDERR (on which PHP errors get printed), to the log. So adding 2>&1 might reveal some new information.
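A quick illustration of the difference, reusing the same log path (the PHP script path is a placeholder):
# only STDOUT goes to the log; PHP errors on STDERR are lost (or mailed by cron)
php /path/to/script.php > /path/to/application/app/logs/backup-output.log
# both STDOUT and STDERR end up in the log; note that 2>&1 must come after the file redirection
php /path/to/script.php > /path/to/application/app/logs/backup-output.log 2>&1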

How to execute php file from windows batch file

I have windows batch file and also its scheduled in task scheduler.
The task schedule is working fine, but the windows batch file does not execute.
It runs a PHP file. I am learning PHP, and the windows batch file was coded by a previous employee.
The code below is what I took from inside the windows batch file.
The path is correct. What is the purpose of -f? And can you correct the code for me?
A list of command line options can be found here: http://php.net/manual/en/features.commandline.options.php
In your case, -f stands for "Parse and execute file".
Change your batch to something like this:
@ECHO ON
echo Before php %TIME% %DATE% >> C:\temp\task.log
C:
Cd \www2
C:\PHP5.3\php.exe -f "C:\www2\cron.php" >> C:\temp\task.log
echo After php %TIME% %DATE% >> C:\temp\task.log
As long as you have the echo command and the call of the php interpreter on a single line, this only outputs meaningless text to the screen.
The first echo is there to help you diagnose problems: if the scheduled tasks is being called you will see this logged in task.log.
To make sure that the working directory is set to C:\www2 (cron.php will look for files there unless they get addressed with a full path) we change to that directory before running php.
In case php prints any error messages or information we redirect that output to the same log because otherwise you will not see this. You should delete that log from time to time because it can grow considerably. It is up to you to decide which of the logging statements you want to keep or remove ;-)
You can even have the log roll over by size, I bet there are question here on SO that will show you how this can be done ;-)

Openshift cron doesn't run php file

I have 2 .php files in my application: book.php and weather.php. I created a file named "runscript" in /.openshift/cron/minutely. Its contents are:
#!/bin/bash
php -f $OPENSHIFT_REPO_DIR/weather.php
This script sends a message to my phone every minute; it's OK.
Then I replace it with:
php -f $OPENSHIFT_REPO_DIR/book.php
This script MUST send me a message too, but nothing is happening. But if I just run this script from my web browser (go to http://xxx-xxxxxxx.rhcloud.com/book.php), I get my message. How is that possible? Magic?
Did you miss the #!/bin/bash part? That's needed to run the shell script.
For why your cron job is not executing, check the cron logs on OpenShift. You can find them at ~/app-root/logs/cron_*.log when you SSH into your gear.
Make sure your cron job is executable with chmod and has the shebang line as #gnaanaa says. Also check if you have one of the .openshift/cron/minutely/jobs.{allow,deny} files, as they may cause cron to skip your job. (See the cron README for more information.)
And after your cron job is working, you can get rid of the wrapper script runscript and have cron call book.php directly. To do so, place book.php directly into .openshift/cron/minutely, make it executable, and add this shebang to it:
#!/usr/bin/env php
Hope this helps.
I use OpenShift as well and also executed a PHP file with a cron job.
#!/bin/bash
php ${OPENSHIFT_REPO_DIR}index.php
At first sight this executes the script normally; however, no output was produced. The problem was that the required PHP files couldn't be loaded, because the working directory was not the same as it would be when the script is loaded by the webserver. Setting the working directory in the PHP script itself prevents this error and makes the script perfectly executable by cron.
This should help some people get their script running.
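The same fix can also be applied in the cron wrapper itself instead of inside PHP; a minimal sketch, assuming the wrapper is the runscript file in .openshift/cron/minutely:
#!/bin/bash
# change into the repo directory first, so relative require/include paths in index.php resolve
cd "$OPENSHIFT_REPO_DIR" || exit 1
php index.php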
