I have the cron command below, which is sending me status output from wget... I really don't want this output; I just want the code to run.
wget http://www.domain.com/cron/dailyEmail 2>&1;
How can I turn off the output?
Send the output to the null device.
wget http://www.domain.com/cron/dailyEmail >/dev/null 2>&1
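For example, a full crontab entry running this once a day might look like the following (the 6 a.m. schedule is just an illustration):
0 6 * * * wget http://www.domain.com/cron/dailyEmail >/dev/null 2>&1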
Alternatively, send it to a temporary file:
wget http://www.domain.com/cron/dailyEmail >/tmp/my_wget.out 2>&1
That way, you can see the output if you need to but it doesn't otherwise bother you.
If you want to keep older copies of the output around rather than over-writing them on each run, you can use something like:
wget http://www.domain.com/cron/dailyEmail >/tmp/my_wget_$(date +%Y_%m_%d).out 2>&1
which will give you a filename containing the date (and time, if you change the arguments to the date command), but then you'll probably want an automated process to clean up older log files, such as the sketch below.
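As a sketch of such a cleanup, a second cron entry using find could prune logs older than seven days (the /tmp path and the retention period are assumptions):
0 7 * * * find /tmp -name 'my_wget_*.out' -mtime +7 -delete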
Related
With cron jobs, the output of the file being executed is emailed to me. I recently discovered via this answer that it's possible to asynchronously execute a PHP file using the shell_exec() function. Per the above answer, I've been using the following command:
shell_exec('php /file/path.php parameter1 parameter2 > /dev/null 2>/dev/null &');
I think what is of most interest with regards to this question is the stuff at the end:
> /dev/null 2>/dev/null &
Is there any way to change that so that the output is emailed, like with a cron job?
The line
> /dev/null 2>/dev/null &
essentially sends the output to the null device. If you take that bit off, the output will be sent to standard out, which in turn should email you the results, assuming you run it with a cron job.
So,
shell_exec('php /file/path.php parameter1 parameter2');
If you're not running it with a cron job, you'll need to build in the email functionality to the script itself.
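As a minimal sketch of that last point, the script could build its own report and send it with PHP's mail() function (the address, subject, and report text here are placeholders):
<?php
// ... the actual work of /file/path.php ...
$report = "Job finished at " . date('Y-m-d H:i:s') . "\n";
// mail() returns true if the message was accepted for delivery
mail('me@example.com', 'Job report', $report);
?>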
First off, shell_exec() is not asynchronous. Execution of the PHP code will be suspended until the shell_exec() call has terminated.
This: > /dev/null tells the shell to redirect stdout from the process being executed to /dev/null, which means it disappears. This: 2> /dev/null does the same, but for stderr instead of stdout.
If you remove these parts of the shell_exec() call, the call will return whatever is written to stdout:
$result = shell_exec('php /file/path.php parameter1 parameter2');
mail('me@email.com', 'Shell output', $result);
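Note that shell_exec() only returns what goes to stdout. If you want any error output in the mail as well, merging stderr into stdout first should work:
$result = shell_exec('php /file/path.php parameter1 parameter2 2>&1');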
There are also other alternatives to shell_exec() that may suit your needs better. For example, popen() and proc_open() allow more fine-grained control over input and output.
Also, since you are executing a PHP script, you may be able to simply use include() or require(), depending on how the script is written. Another option would be to read the file and then execute the PHP code using eval().
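For illustration, a minimal popen() sketch that reads the script's stdout as it arrives (same placeholder command as above):
<?php
// Open a read-only pipe to the command
$handle = popen('php /file/path.php parameter1 parameter2', 'r');
if ($handle !== false) {
    $output = '';
    while (($line = fgets($handle)) !== false) {
        $output .= $line; // collect stdout line by line
    }
    pclose($handle);
    echo $output;
}
?>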
Use:
| mail -s "Result of cron job" myemail@company.com
at the end of your command. Also, you can merge stderr into stdout with 2>&1 | mail....
That should do the trick. Because you use shell_exec("... &"), you may have to wrap the whole php /file/... | mail -s "result" myemail@company.com pipeline in a subshell, like:
shell_exec('(php /file/path.php parameter1 parameter2 2>&1 | mail -s "Result of cron job" myemail@company.com) &')
This is a rather ugly way to get the output into a message, and I'd advise refactoring the outer shell_exec() call to read stderr and stdout, craft a message, and send that to yourself (a rough sketch follows). Meanwhile, the | mail solution should do the trick.
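A rough sketch of that refactoring with proc_open(), reading stdout and stderr separately before mailing them (the command and address are placeholders):
<?php
$spec = array(
    1 => array('pipe', 'w'), // child's stdout
    2 => array('pipe', 'w'), // child's stderr
);
$proc = proc_open('php /file/path.php parameter1 parameter2', $spec, $pipes);
if (is_resource($proc)) {
    $stdout = stream_get_contents($pipes[1]);
    $stderr = stream_get_contents($pipes[2]);
    fclose($pipes[1]);
    fclose($pipes[2]);
    proc_close($proc);
    mail('myemail@company.com', 'Result of cron job', "STDOUT:\n$stdout\nSTDERR:\n$stderr");
}
?>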
Happy mailing!
I want to run wget as follows
shell_exec('wget "'http://somedomain.com/somefile.mp4'"');
sleep(20);
continue my code...
What I want is to let PHP wait for the shell_exec wget file download to finish before continuing on with the rest of the code. I don't want to wait a set number of seconds.
How do I do this? Because as I run shell_exec with wget, the file starts downloading in the background and PHP continues.
Does your URL contain the & character? If so, your wget might be going into the background and shell_exec() might be returning right away.
For example, if $url is "http://www.example.com/?foo=1&bar=2", you would need to make sure that it is single-quoted when passed on a command line:
shell_exec("wget '$url'");
Otherwise the shell would misinterpret the &.
Escaping command line parameters is a good idea in general. The most comprehensive way to do this is with escapeshellarg():
shell_exec("wget ".escapeshellarg($url));
shell_exec does wait for the command to finish - so you don't need the sleep command at all:
<?php
shell_exec("sleep 10");
?>
# time php c.php
10.14s real 0.05s user 0.07s system
I think your problem is likely the quotes on this line:
shell_exec('wget "'http://somedomain.com/somefile.mp4'"');
it should be
shell_exec("wget 'http://somedomain.com/somefile.mp4'");
For a website, I need to be able to start and stop a daemon process. What I am currently doing is
exec("sudo /etc/init.d/daemonToStart start");
The daemon process is started, but Apache/PHP hangs. Doing a ps aux revealed that sudo itself changed into a zombie process, effectively killing all further progress. Is this normal behavior when trying to start a daemon from PHP?
And yes, Apache has the right to execute the /etc/init.d/daemonToStart command. I altered the /etc/sudoers file to allow it to do so. No, I have not allowed Apache to execute any command whatsoever, just a limited few that allow the website to work.
Anyway, going back to my question, is there a way to allow PHP to start daemons in a way that no zombie process is created? I ask this because the reverse, stopping an already started daemon, works just fine.
Try appending > /dev/null 2>&1 & to the command.
So this:
exec("sudo /etc/init.d/daemonToStart > /dev/null 2>&1 &");
Just in case you want to know what it does/why:
> /dev/null - redirect STDOUT to /dev/null (blackhole it, in other words)
2>&1 - redirect STDERR to STDOUT (blackhole it as well)
& - detach the process and run it in the background
I had the same problem.
I agree with DaveRandom: you have to suppress all output (stdout and stderr). But there is no need to launch the process in the background with a trailing '&': if you do, exec() can no longer check the return code and reports success even when there is an error...
And I prefer to store the output in a temporary file, instead of 'blackholing' it.
Working solution:
$temp = tempnam(sys_get_temp_dir(), 'php');
exec('sudo /etc/init.d/daemonToStart >'.$temp.' 2>&1');
Just read file content after, and delete temporary file:
$output = explode("\n", file_get_contents($temp));
@unlink($temp);
I have never tried starting a daemon from PHP, but I have tried running other shell commands, with much trouble. Here are a few things I have tried, in the past:
As per DaveRandom's answer, append > /dev/null 2>&1 & to the end of your command. This discards stdout and stderr and detaches the process; while debugging, you can redirect the output to a file instead of /dev/null and inspect it.
Make sure your webserver user's PATH contains all referenced binaries inside your daemon script. You can check this by calling exec('echo $PATH; whoami'). This will tell you the user PHP is running under and its current PATH variable, as in the sketch below.
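A small sketch of that check, using exec()'s second argument to capture the command's output:
<?php
// exec() fills $output with one array element per line of stdout
exec('echo $PATH; whoami', $output);
print_r($output); // e.g. the PATH on the first line, the user name on the second
?>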
How do I set up a cron job command to execute a URL?
/usr/bin/wget -q http://www.domain.com/cron_jobs/job1.php >/dev/null 2>&1
Why can't I make this work!? I have tried everything... The PHP script should send an email and create some files, but neither happens.
The command returns this:
Output from command /usr/bin/wget -q http://www.domain.com/cron_jobs/job1.php ..
No output generated
... but it still creates an empty file in /root on each execution!? Why?
Use curl like this:
/usr/bin/curl http://domain.com/page.php
Don't worry about stray files: unlike wget, curl writes the page to stdout instead of saving it to a file.
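If cron then mails you the page body, curl can be silenced explicitly; -s suppresses the progress meter and -o /dev/null discards the body (both are standard curl options):
/usr/bin/curl -s -o /dev/null http://domain.com/page.php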
I had the same problem. The solution is understanding that wget outputs two things: the results of the URL request AND activity messages about what it's doing.
By default, if you do not specify an output file, it will create one, seemingly named after the file in your URL, in the current folder where wget is run.
If you want to specify a different output file:
-O outputfile.txt
will write the URL results to outputfile.txt, overwriting what's there.
If you wish to append to that file instead, write to stdout and then append to the file from there.
Here's the trick: to write to stdout, use:
-O-
The second dash stands in for a filename and tells wget to write the URL results to stdout.
Then use the append syntax, >>, to send that to a file of your choice:
wget -O- http://www.invisibility.com >>/var/log/invisibility.log
The lowercase -o specifies the location of the activity log, so if you wish to log activity for the URL request, you can use:
wget -o /var/log/activity.log http://someurl.com
-q suppresses output of activity messages
wget -q -o /var/log/activity.log http://someurl.com
will not log any activity to the specified file, and I think that is the crux where people get confused.
Remember:
-O is shorthand for --output-document
-o is shorthand for --output-file, which is the activity log.
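Putting the two together, a sketch that discards the downloaded document but keeps an activity log (the paths are placeholders):
wget -O /dev/null -o /var/log/activity.log http://someurl.com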
Took me hours to get it working. Thank you for people writing down solutions.
One also needs to check whether single or double quotes are needed; otherwise wget will parse the URL wrong, leading to error messages:
This worked (using single quotes):
/usr/bin/wget -O -q 'http://domain.com/cron-file.php'
This gave errors (using double quotes):
/usr/bin/wget -O -q "http://domain.com/cron-file.php"
I don't know if the /usr/bin/ prefix is needed. I have read about different ways to order the -O and -q options, but it is hard to find a reliable, definitive source on the web for this subject; there are so many different examples.
An online wget manual listing the available options can be found here (but check with the Linux distro you are using for an up-to-date version):
http://unixhelp.ed.ac.uk/CGI/man-cgi?wget
To use wget to display the HTML:
wget -qO- http://www.example.com
I have a PHP script I want to run every minute to see if there are draft news posts that need to be posted. I was using "wget" for the cron command in cPanel, but I noticed (after a couple of days) that this was creating a blank file in the main directory every single time it ran. Is there something I can do to stop that from happening?
Thanks.
When wget runs, by default, it generates an output file, from what I remember.
You probably need to use a wget option to specify which file it should write its output to, and use /dev/null as the destination file (it's a "special file" that will "eat" everything written to it).
Judging from man wget, the -O or --output-document option would be a good candidate:
-O file
--output-document=file
The documents will not be written to the appropriate files, but all will be concatenated together and written to file.
so, you might need to use something like this :
wget -O /dev/null http://www.example.com/your-script.php
And, by the way, the output of scripts run from the crontab is often redirected to a logfile; it can always help.
Something like this might help, about that :
wget -O /dev/null http://www.example.com/your-script.php >>/YOUR_PATH/logfile.log
And you might also want to redirect the error output to another file (it can be useful for debugging, the day something goes wrong):
wget -O /dev/null http://www.example.com/your-script.php >>/YOUR_PATH/log-output.log 2>>/YOUR_PATH/log-errors.log