I have a .php script that includes other PHP scripts; as the final result a DB is updated and a report file is created.
If I call it manually from a browser, everything works fine.
I tried to call it from the crontab with the following syntax
/usr/local/bin/php /home/PATH/myscript.php
and at the scheduled time it seems to stop at the first include() and die.
I added a mail() call after each include().
If I call the file from the browser I receive all 5 mails; when it's called from the crontab, I receive only the first email.
What am I doing wrong?
Edit:
I tried calling just the first script and I received all the mails. It seems the second script contains something that stops the code, but ONLY when it's called by cron.
You can run a php script like this:
php -f script.php
If you want to use another php.ini (which is often the case with console apps), use
php -c path/where/php.ini -f script.php
type
php --help
for more options.
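For example, a crontab entry along these lines would run the script nightly with an explicit config file (the schedule and the php.ini path are placeholders):
0 2 * * * /usr/local/bin/php -c /etc/php/cli/php.ini -f /home/PATH/myscript.php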
Use this instead
php -q /home/PATH/myscript.php
As per this answer by St. Woland:
The -q flag suppresses HTTP header output.
Related
I am running a command that creates a zip file in my php controller script.
Once the file is created, an email is sent to a user as notification. The command is as follows:
system("7za a path/zip_file -mem=AES256 -v2g -mx9 -pPassword path/zipcontent > /dev/null 2>&1 &");
The creation of the zip file takes about 20 minutes, and I wish to send the email once the file has been completely created (that is, after the 20 minutes), or to notify the user if an error occurs.
How can I know whether the command has finished and the file was created correctly, so that I can send either the success mail or the error notification?
Thanks in advance.
A simple trick is to write errors to a file, like command1 2> myerror.txt, and then use a cron job to mail this file every 20 minutes and remove it. If there was no error, there is no file to send, so basically you just need a cron job and a simple if-condition script.
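A rough sketch of that if-condition script as a PHP file run by cron (the error-file path and recipient address are placeholders):
<?php
// mail-errors.php - mail the captured error output if there is any, then remove it.
// The error file is produced by something like: command1 2> /tmp/myerror.txt
$errorFile = '/tmp/myerror.txt';
if (file_exists($errorFile) && filesize($errorFile) > 0) {
    $body = file_get_contents($errorFile);
    mail('you@example.com', 'Command reported errors', $body);
    unlink($errorFile);   // so the same errors are not sent twice
}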
Another method is using the journalctl utility to send errors to a specific output.
You can even use sed and grep to filter errors by level into a file and then email it (or send it to wall).
For notification, you can use yourProgram && notifyScript: if the first command completes successfully, the second command will run.
With "&" at the end of command, you run in background that command (background on OS, non php), then system() php function return output immediately also if command work for 20minutes. Php script goes to the next instruction until has terminated without wait end of execution command run with system().
If you want to check when command are terminated, you can add a command in your system() call for a check.
7za has many exit code (see https://sevenzip.osdn.jp/chm/cmdline/exit_codes.htm)
system("7za a path/zip_file -mem=AES256 -v2g -mx9 -pPassword path/zipcontent > /dev/null 2>&1 ; echo $? > /tmp/myfile.txt &")
With this, you write the exit code (in shell, "$?" is the exit code of the previous command) to a temp file and check it periodically. If it contains 0, the archive was created successfully and you can send the email.
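A rough sketch of that periodic check (the temp-file path matches the example above; the mail details are placeholders):
<?php
// check-archive.php - run later (e.g. from cron) to see whether 7za has finished and how it exited
$statusFile = '/tmp/myfile.txt';
if (file_exists($statusFile)) {
    $exitCode = (int) trim(file_get_contents($statusFile));
    if ($exitCode === 0) {
        mail('user@example.com', 'Archive ready', 'The zip file was created successfully.');
    } else {
        mail('user@example.com', 'Archive failed', "7za exited with code $exitCode.");
    }
    unlink($statusFile);   // start clean for the next run
}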
Alternatively, you can write a bash shell script that does the archiving, the check, and the email sending, and call that script through the system() function.
It's not the best solution in my opinion, but it shows how to work with an external process in PHP using system().
There is also a PHP extension, pcntl (https://www.php.net/manual/en/book.pcntl.php); maybe it can help you.
I have 2 .php files in my application, book.php and weather.php. I created a file named "runscript" in .openshift/cron/minutely. Its contents:
#!/bin/bash
php -f $OPENSHIFT_REPO_DIR/weather.php
This script sends me a message to my phone every minute; it's OK.
Then I replace it with:
php -f $OPENSHIFT_REPO_DIR/book.php
This script MUST send me a message too, but nothing is happening. Yet if I just run this script from my web browser (going to http://xxx-xxxxxxx.rhcloud.com/book.php), I get my message. How is that possible? Magic?
Did you miss the #!/bin/bash part? That's needed to run the shell script.
For why your cron job is not executing, check the cron logs on OpenShift. You can find them at ~/app-root/logs/cron_*.log when you SSH into your gear.
Make sure your cron job is executable with chmod and has the shebang line, as @gnaanaa says. Also check whether you have one of the .openshift/cron/minutely/jobs.{allow,deny} files, as they may cause cron to skip your job. (See the cron README for more information.)
And after your cron job is working, you can get rid of the wrapper script runscript and have cron call book.php directly. To do so, place book.php directly into .openshift/cron/minutely, make it executable, and add this shebang to it:
#!/usr/bin/env php
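As a tiny illustration (not your actual script), the top of such a self-executable book.php might look like this:
#!/usr/bin/env php
<?php
// book.php - the shebang line above lets cron execute this file directly
echo "sending message...\n";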
Hope this helps.
I use OpenShift as well and also executed a PHP file with a cron job.
#!/bin/bash
php ${OPENSHIFT_REPO_DIR}index.php
This appears to execute the script normally at first sight; however, no output was produced. The problem was that the required PHP files couldn't be loaded, because the working directory was not the same as it would be when the script is loaded by the web server. Setting the working directory in the PHP script itself prevents this error and makes the script perfectly executable by cron.
This should help some people to get their script running.
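A minimal sketch of that fix (assuming the included files live next to the entry script; the include path below is just a placeholder):
<?php
// index.php - switch to the script's own directory so relative includes
// resolve the same way under cron as they do under the web server
chdir(__DIR__);                      // __DIR__ is available since PHP 5.3
require 'includes/bootstrap.php';    // hypothetical include, now resolved relative to this file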
So I want to execute a bash command from PHP on my web server. I can do this using shell_exec. However, one of the commands I want to execute is curl. I use it to send a .wav file to another server and record its response. But when invoked from PHP, curl doesn't work.
I reduced the error to the following small example. I have a script named php_script.php which contains:
<?php
$ver=shell_exec("curl -F file=#uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver");
echo $ver
The curious thing is that when I run this PHP script from the command line using php php_script.php, the result I get is:
Status: 500 Internal Server Error
Content-type: text/html
However, if I run curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver directly, I get the response I was expecting:
verdict = authentic
(Edit:) I should probably mention that if I put some bash code inside the shell_exec argument which does not contain curl, the bash command executes fine. For example, changing the line to $ver = shell_exec("echo hello > world"); puts the word "hello" into the file "world" (provided it exists and is writable). (End edit.)
Something is blocking the execution of curl when it is invoked from PHP. I thought this might be PHP's running in safe mode, but I found no indication of this in php.ini. (Is there a way to test this to make 100% sure?) What's blocking curl and, more importantly, how can I bypass or disable this block?
(And yes, I realize PHP has a curl library. However, I prefer to use commands I can run from the command line as well, for debugging purposes.)
cheers,
Alan
The reason is administrative privileges: when you run the command directly you are running it as root, and thus the command gets executed. But when you run the command through PHP, it runs as an ordinary user, and by default that user does not have the privileges to run the shell_exec commands.
You have to change the settings for shell_exec through cPanel or the Apache config file. But it is not recommended to give shell_exec access to that user, as it helps attackers compromise the server, so proper care should be taken.
It would be more appropriate to use the curl library provided in PHP.
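For reference, a rough equivalent of the upload using PHP's curl extension could look like this (requires PHP 5.5+ for CURLFile; the file path and URL are the ones from the question):
<?php
$ch = curl_init('http://otherserver');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // return the response instead of printing it
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    // CURLFile makes this a multipart/form-data upload, like curl -F file=@...
    'file' => new CURLFile('uploads/2013-7-24-17-31-43-29097-flash.wav'),
));
$ver = curl_exec($ch);
if ($ver === false) {
    echo 'curl error: ' . curl_error($ch);
} else {
    echo $ver;   // e.g. "verdict = authentic"
}
curl_close($ch);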
I currently use nohup to run a long php script and redirect the live results to a file using this command
nohup php long_file.php >logs 2>&1 &
So I just keep visiting the logs file to see the results.
Now I want to do the exact same thing using another PHP file to execute the above command.
I tried the above command with PHP's exec() and the output redirection doesn't seem to be working.
I know I can just retrieve the output using PHP and store it with any file-write function, but the thing is that the output is too long; that's why I keep it running in the server's background.
A similar question:
Shell_exec php with nohup, but it had no answer.
Any solution?
Please try with -q
nohup php -q long_file.php >logs 2>&1 &
http://ubuntuforums.org/showthread.php?t=977332
Did you try passthru instead of exec?
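If you do, the call might look like this (same command as in the question; note that passthru streams the command's raw output instead of returning it):
<?php
passthru('nohup php long_file.php > logs 2>&1 &');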
You are redirecting STDOUT to a file by using >
This will truncate the file each time the script is run. If two scripts simultaneously redirect their output to the same file, the one started last will truncate the output from the first script.
If you really want to properly append to a log file with multiple concurrent running scripts, consider using >> to avoid having the log file truncated.
The side effect, however, is that the log file is never truncated and keeps growing, so if it gets really large you may want to include it in a logrotate scheme.
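For example, the original command with appending instead of truncating would be:
nohup php long_file.php >> logs 2>&1 &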
I've set up a cron job to run. It executes a php file which is named cronj.php
But it doesn't work, and the cron job notification I get is:
/root/website/myworld/blabla/cronj.php: line 1: ?php: No such file or directory
And line 1 in that file is simply the PHP opening tag <?php. I don't know why.
Cron is executing the file as if it was a shell script. Normally you would put in a shebang line (like #!/usr/bin/env php) at the top of the file so that the shell knows how to invoke it, but PHP doesn't like it - as it outputs everything outside its tags. Thus, instead of this:
0 3 * * * /mypath/myscript.php ...
try this:
0 3 * * * /usr/bin/env php /mypath/myscript.php ...
or use @Ravenex's trick.
EDIT: I was just rightly admonished for assuming PHP behaves in a consistent way. Apparently, the shebang does work in PHP. My apologies to @chess007.
We use cron to run nightly tasks in a php facebook game. We do it by using curl like this:
/usr/bin/curl http://www.ourdomain.com/page.php
If I remember right, we had some issues when using localhost to try to avoid external lookups. We also tried PHP command-line execution, which mostly worked but caused a few strange bugs.
Try calling the web URL (http://.....) instead.
It's apparently not being parsed as a PHP script.
Edit:
Please show us the cron job you used, to verify my hunch was right.
Use this to set up your cron job, and also provide an email address in your cron settings in cPanel so that you get an email when the cron job runs successfully:
wget -O - http://YOURSITE/cron.php
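A full crontab entry using that approach could look like this (the schedule is just an example; -q keeps wget's progress output out of the cron mail):
*/15 * * * * wget -q -O - http://YOURSITE/cron.php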