I have an email parsing script on a CodeIgniter site that I want to trigger each day with a cron job. I don't have much experience with cron jobs, or with the command line on remote servers.
I have a cronJobs controller at mysite.com/public_html/application/controllers/cronJobs. In it is a parseMail method. I'm also using mod_rewrite to remove index.php from URLs.
The parseMail method does work when I hit the controller "normally" through my browser at MYSITE.com/cronJobs/parseMail: a DB insert fires.
But to trigger it with a cron job I have tried:
wget http://MYSITE.com/cronJobs/parseMail
I do get a notification email, but I'm not sure how to interpret it. Is it finding the script? Is there no error? Regardless, parseMail doesn't fire.
--2013-01-19 12:00:02-- http://MYSITE.com/cronJobs/parseMail
Resolving MYSITE.com... xx.xx.xx.xxx
Connecting to MYSITE.com|xx.xx.xx.xxx|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: 0 [text/html]
Saving to: `parseMail.203'
0K 0.00 =0s
2013-01-19 12:00:03 (0.00 B/s) - `parseMail.203' saved [0/0]
I also tried
-q wget http://MYSITE.com/cronJobs/parseMail
And received "/bin/sh: get: command not found"
Then I also tried variations on this:
/usr/local/bin/php -q /public_html/index.php cronJobs parseMail
/usr/local/bin/php -q /public_html/ cronJobs parseMail
/usr/local/bin/php -q home/myusername/public_html/index.php cronJobs parseMail
/usr/local/bin/php -q home/myusername/public_html/ cronJobs parseMail
With these methods I can't even seem to hit the controller. My email notification just says "Could not open input file".
I'm just not really familiar with any of these errors, so I don't know how to home in on a solution.
Can anyone give me any tips how to move forward?
Solved
After much Googling I found the solution that worked for me. Here is what I used. (I was so close in one of my first attempts; I just didn't use an absolute path. And when I started using some of the suggestions from the comments below, the php path was different and I did not notice.)
Here is the correct version.
/usr/local/bin/php /home/myusername/public_html/index.php cronJobs parseMail
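To schedule it daily, a crontab entry might look something like the line below; the 00:15 time and the > /dev/null 2>&1 redirect are only examples, not part of the fix itself (drop the redirect if you want cron to keep emailing the output).
# example schedule: run parseMail every day at 00:15 and discard the output
15 0 * * * /usr/local/bin/php /home/myusername/public_html/index.php cronJobs parseMail > /dev/null 2>&1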
None of your paths is a real path. They are all either invalid (the /public_html ones) or relative (the home/myusername ones). The following should do:
/usr/bin/env php /home/myusername/public_html/index.php cronJobs parseMail
You might need to change the directory to the document root first. In that case, use this:
(cd /home/myusername/public_html/ && /usr/bin/env php index.php cronJobs parseMail)
Related
I am trying to set up a cron job for my WP All Import plugin. I have tried setting up cron jobs via the Bluehost cPanel with the following four options:
php /home2/slotenis/public_html/wp-cron.php?import_key=*****&import_id=9&action=trigger
GET http://www.slotenis.si/wp-cron.php?import_key=*****&import_id=9&action=trigger
/usr/bin/GET http://www.slotenis.si/wp-cron.php?import_key=*****&import_id=9&action=trigger
curl http://www.slotenis.si/wp-cron.php?import_key=*****&import_id=9&action=trigger
None of them is working.
I have set up an email confirmation every time a cron job is run, and I receive the following email:
cp: cannot stat `exim.pl': No such file or directory
cp: not writing through dangling symlink `/var/fake/slotenis/etc/./exim.pl.local'
Can anyone provide me the exact command line to make it working?
Try using wget.
wget -O /dev/null -o /dev/null "https://www.domain.com/wp-cron.php?import_key=*****&import_id=9&action=trigger"
It's what I use on my sites.
For troubleshooting, try visiting the URL yourself. If that doesn't work, there's a problem with the plugin, WordPress, or Bluehost.
Important to know: the error you are seeing about "cp: cannot stat `exim.pl'" is produced before the command actually runs, and it does not stop your actual command from working. (This is an issue on Bluehost's side; they recently added broken symlinks in /etc/exim.pl and /etc/exim.pl.local.)
About the actual cron command: if the URL contains special characters like "?" and "&", you need to escape them, e.g. by enclosing the whole URL in double quotes. Running the PHP script directly with the php binary also works, but then you can't pass query parameters with the "?" syntax; see PHP, pass parameters from command line to a PHP script.
With curl it should work:
curl "http://www.slotenis.si/wp-cron.php?import_key=*****&import_id=9&action=trigger"
Recently my site was moved to a different server, due to maintenance at the host. Ever since, I can't get this script to run as a cron job anymore: http://www.filmhuisalkmaar.nl/wp-content/themes/filmhuis-alkmaar/cron/load-shows.php
I tried running it using PHP with the following cron job:
php /home/provadja/domains/filmhuisalkmaar.nl/public_html/wp-content/themes/filmhuis-alkmaar/cron/load-productions.php
But I kept getting the following error:
PHP Warning: require_once(../inc/api.php): failed to open stream: No such file or directory in /home/provadja/domains/filmhuisalkmaar.nl/public_html/wp-content/themes/filmhuis-alkmaar/cron/load-productions.php on line 3
PHP Fatal error: require_once(): Failed opening required '../inc/api.php' (include_path='.:/usr/local/lib/php') in /home/provadja/domains/filmhuisalkmaar.nl/public_html/wp-content/themes/filmhuis-alkmaar/cron/load-productions.php on line 3
I checked whether the files reported as missing were still in place, and they were. I checked the file permissions and they're set to 755, which should be more than fine. Right?
Then I tried wget with the following cronjob:
/usr/bin/wget -O https://www.filmhuisalkmaar.nl/wp-content/themes/filmhuis-alkmaar/cron/load-shows.php
But then I keep getting the following error:
wget: missing URL
Usage: wget [OPTION]... [URL]...
Try ‘wget --help’ for more options.
I'm really at a loss here, especially because it used to work fine in the past. It's very frustrating, because these scripts are essential for keeping my site updated.
Any help would really be appreciated. Thank you.
Try to run it like this:
cd /home/provadja/domains/filmhuisalkmaar.nl/public_html/wp-content/themes/filmhuis-alkmaar/cron/ && php load-productions.php
Note the use of the cd command at the start. It means "change the current working directory to ../cron/ and then run the script load-productions.php".
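In a crontab the same approach could look like the line below (the 03:00 schedule is arbitrary and just an example):
# example: change into the cron directory first, then run the script
0 3 * * * cd /home/provadja/domains/filmhuisalkmaar.nl/public_html/wp-content/themes/filmhuis-alkmaar/cron/ && php load-productions.php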
For cron tasks I prefer to use the full path to included and required scripts. So, instead of:
require_once("../inc/api.php");
I generally do:
$base = dirname(dirname(__FILE__));
require_once($base . "/inc/api.php");
This way the server knows exactly where to look, and the include is not relative to whatever the current working directory happens to be.
Side note: I also like to do /path/to/php -q /path/to/script.php. : )
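In crontab form, and assuming the includes have been switched to full paths as described above, that would be something like the following (the 02:00 schedule and the /path/to/php placeholder are only illustrative):
# example: run the script nightly at 02:00 with an absolute script path
0 2 * * * /path/to/php -q /home/provadja/domains/filmhuisalkmaar.nl/public_html/wp-content/themes/filmhuis-alkmaar/cron/load-productions.php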
I will quote fvu's comment to my question, which I have tried and can confirm now as fully working:
1) does /home/provadja/domains/filmhuisalkmaar.nl/public_html/wp-content/themes/filmhuis-alkmaar/inc/api.php exist? 2) obvious error in wget usage (-O needs the name of the file in which to save the script output), try wget -O /dev/null https://www.filmhuisalkmaar.nl/wp-content/themes/filmhuis-alkmaar/cron/load-shows.php
Thanks a lot everyone, for your help!
A script has an execution time of more than a minute, so I would like to run it as a background task.
I've read a lot about this on the internet, and read that print shell_exec('/usr/bin/php -q page.php &'); isn't the solution, since the task is still a child of the calling process. I've tested it with sleep(10) and indeed, the page that is supposed to kick off the job waits for 10 seconds.
So, symcbean has written an article (http://symcbean.blogspot.nl/2010/02/php-and-long-running-processes.html?m=1) suggesting the following code:
print `echo /usr/bin/php -q longThing.php | at now`;
But, unfortunately, the script didn't do anything, and after adding 2>&1 I get the following response:
sh: at: command not found
I've searched a lot for a way to solve this issue, but can't find any solution.
You should provide the fully qualified path to the at command, for example /bin/at.
If you're not sure of the path, you can usually type which at at the command line to find it.
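For example, from a shell on the server (the /usr/bin/at and /path/to/longThing.php paths here are assumptions; substitute whatever which reports and your real script path):
# confirm where at lives, then queue the job using its full path
which at
echo /usr/bin/php -q /path/to/longThing.php | /usr/bin/at now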
I have a PHP script that I'd like to run every day on my DreamHost website using a cron job. I've tested the script manually, so I know it works properly.
I've tried setting up the cron job with this command line:
/usr/bin/wget -O /dev/null "http://www.mysite.org/cronjobs/cronjob.php"
along with a few other methods including this one as well:
/dh/cgi-system/php54.cgi /home/username/mysite.org/cronjobs/cronjob.php
None of these has worked, and even worse, I have not received any email with the results, so I have no way of knowing what went wrong.
Any idea you may have as to what isn't working would be great!
Try this command in cPanel.
/usr/local/php5/bin/php /home/username/mysite.org/cronjobs/cronjob.php
It's been a while, but since there is still no "solution" for users who might look here for a wget example, here is how it works at DreamHost:
wget -q -O /dev/null http://www.domain.com/path-to-be-called
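If you are editing the crontab directly rather than through the panel, the full entry could be something like this (the daily 04:00 schedule is just an example, and the URL is the placeholder from the line above):
# example: hit the script once a day at 04:00, discarding the output
0 4 * * * wget -q -O /dev/null http://www.domain.com/path-to-be-called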
I have several PHP files to be run by cron. I set up the cron jobs using the command:
crontab crontab.txt
Inside the crontab.txt file, I have written cron commands like this:
#(Updating tutor activities) - every minute
* * * * * /usr/bin/wget -O - -q -t 1 http://project/cron/tutor_activities.php
But none of the functionality is working (database queries, sending reminder mails, etc.). Running the URLs manually works.
Then I put my mail address in MAILTO and received the mails. In the mail, I received the entire HTML source of the page. What is expected in the mail? Why is my functionality not working?
Updates
If I change my cron commands to
#(Updating tutor activities) - every minute
* * * * * /usr/bin/wget http://project/cron/tutor_activities.php
Still no success, and this comes in my mail:
--15:03:01-- http://project/cron/tutor_activities.php
=> `tutor_activities.php'
Resolving project... IP Address
Connecting to test.project|IP Address|:80... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: http://project./ [following]
--15:03:01-- http://project./
=> `index.html.1'
Resolving project.... IP Address
Connecting to project.|IP Address|:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: http://project/home/ [following]
--15:03:01-- http://project/home/
=> `index.html.1'
Resolving project... IP Address
Connecting to project|IP Address|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
index.html.1 has sprung into existence.
Retrying.
And lots of index.html.1, index.html.2 files are accumulating in the root of my project. I do not want these files to be created; I just want the scripts to execute.
I get the same results if I use either of these two commands:
* * * * * /usr/bin/wget http://project/cron/tutor_activities.php
* * * * * wget http://project/cron/tutor_activities.php
Running the php command with MAILTO set sends me this error: /bin/sh: php: command not found.
* * * * * php /path/to/test.php
So I am not able to use the php command.
I have written a simple mailto() inside my test.php. The mail does not arrive when the script is run through cron (both the wget and php approaches fail), but running the URL manually works.
My problem
To make it clear again, my main problem is that the functionality inside the cron scripts is not running. The creation of files is a secondary issue.
Any help would be appreciated
Thanks,
Sandeepan
If you want to call a URL as a cron job, you'll have to use something like wget. If it's a PHP script on your server, it would be easier to use php /...pathtomyscript.../cron/tutor_activities.php
try
which php
The path that is returned should be used in the command that runs the cron file. If you are setting up the cron through the shell it won't give any problem, but to be safe, try giving the absolute path when you are trying to run a PHP page.
/path/to/php /path/to/cron/script/
Try giving your command like this. If the problem persists, feel free to discuss.
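For example, if which php prints /usr/bin/php (just an illustrative path; use whatever it prints on your server), the crontab line from the question becomes:
# example: use the absolute php path reported by which php
* * * * * /usr/bin/php /path/to/test.php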
When you call wget with -O -, it will send the downloaded content to stdout, which cron is sending to you via the email message. In the first case, it's doing exactly what it should.
When you call wget without the -O parameter, it will try to save the downloaded content to a file with the same name as the web page being downloaded. If that file already exists, it will append a numeric suffix to the name, as you saw. In this second case, it's doing exactly what it should.
It's not clear from your question where you want the output to go, but if you want to save the output to the same file each time, use -O myfilename.html.
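For example (saving under /tmp here is only an illustration; any writable path works), the crontab line would then overwrite the same file on every run instead of piling up index.html.1, index.html.2, and so on:
* * * * * /usr/bin/wget -q -t 1 -O /tmp/tutor_activities.html http://project/cron/tutor_activities.php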
If you're running PHP from cron or the command line, make sure you put the full path to the php executable.
It's entirely possible that PHP's not in the path within the cron environment - it's definitely not going to have the same setup as your regular shell. Try using the absolute path to BOTH the php interpreter AND the php script in the cron command:
* * * * * /path/to/php /path/to/test.php
As for the creation of files, you just have to add a redirect to your wget command:
wget -O - ... http://.... > /dev/null
-O - forces wget to write anything it downloads to standard output, which cron will then happily email to you. By adding the > /dev/null at the end of the command, this output will instead go to the Great Bitbucket in the Sky. If you don't want wget's stderr output emailed either, you can also add 2>&1 after the /dev/null, which redirects stderr to stdout, which is now going to /dev/null.
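Put together, the crontab entry from the question would become something like this (same URL and flags as above; only the redirects are added):
* * * * * /usr/bin/wget -O - -q -t 1 http://project/cron/tutor_activities.php > /dev/null 2>&1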
I found the problem myself. I did not put the same URL in my crontab file as the one I was running manually, and that was my mistake.
When running it manually I was just typing "test" in the address bar, my browser's list of saved URLs would appear, and I would select http://www.test.project.com/cron/tutor_activities.php. But in the crontab file I had put http://test.project.com/cron/tutor_activities.php, mistakenly assuming it would run http://www.test.project.com/cron/tutor_activities.php (because we have a rewrite rule present to add www).
But the rewrite rule was redirecting it to http://www.test.project.com/home. That's why the HTML content appeared in the mails.
So the most important thing to learn here is not to overlook the small details and not to assume everything was done correctly. In my case, it would have been better to copy-paste the working URL into the cron file.
An easy and secure (no tmp files) way to do this is to use bash's process substitution:
* * * * * bash -c "/path/to/php <(/usr/bin/wget -O - -q -t 1 http://project/cron/tutor_activities.php)"
Process substitution runs the command within <() and puts the output into a file object that is only visible from the current process. In order to use it from cron, invoke bash directly and pass it as a command string.
And as others have mentioned, use the full path to php, which you can find out with which php.