I have several PHP files that need to be run by cron. I set up the cron jobs using this command:
crontab crontab.txt
Inside the crontab.txt file, I have written cron entries like this:
#(Updating tutor activities) - every minute
* * * * * /usr/bin/wget -O - -q -t 1 http://project/cron/tutor_activities.php
But none of the functionality is working (database queries, sending reminder mails, etc.), although running the URLs manually in a browser works.
Then I put my email address in MAILTO and received the mails. Each mail contains the entire HTML source of the page. What is supposed to appear in the mail? And why is my functionality not working?
Updates
If I change my cron commands to
#(Updating tutor activities) - every minute
* * * * * /usr/bin/wget http://project/cron/tutor_activities.php
I still have no success, and this comes in my mail:
--15:03:01-- http://project/cron/tutor_activities.php
=> `tutor_activities.php'
Resolving project... IP Address
Connecting to test.project|IP Address|:80... connected.
HTTP request sent, awaiting response... 301 Moved Permanently
Location: http://project./ [following]
--15:03:01-- http://project./
=> `index.html.1'
Resolving project.... IP Address
Connecting to project.|IP Address|:80... connected.
HTTP request sent, awaiting response... 302 Found
Location: http://project/home/ [following]
--15:03:01-- http://project/home/
=> `index.html.1'
Resolving project... IP Address
Connecting to wproject|IP Address|:80... connected.
HTTP request sent, awaiting response... 200 OK
Length: unspecified [text/html]
index.html.1 has sprung into existence.
Retrying.
And lots of index.html.1, index.html.2, ... files are accumulating in the root of my project. I do not want these files to be created; I just want the scripts to execute.
I get the same results if I use either of these two commands:
* * * * * /usr/bin/wget http://project/cron/tutor_activities.php
* * * * * wget http://project/cron/tutor_activities.php
Running the following php command with MAILTO set sends me this error: /bin/sh: php: command not found
* * * * * php /path/to/test.php
So I am not able to use the php command.
I have written a simple mail() call inside my test.php. The mail does not arrive when the script is run through cron (both the wget and php approaches fail), but running the URL manually works.
My problem
To make it clear again, my main problem is that the functionality inside the cron files is not running. Creation of files is a secondary issue.
Any help would be appreciated
Thanks,
Sandeepan
If you want to call a URL as a cron job, you'll have to use something like wget. If it's a PHP script on your own server, it is easier to call it directly: php /...pathtomyscript.../cron/tutor_activities.php
try
which php
The path that is returned should be used in the command that runs the cron job. If you are setting up the cron through the shell it usually won't give any problem, but to be safe, use the absolute path when you are trying to run a PHP script:
/path/to/php /path/to/cron/script/
Try giving your command like this; if the problem persists, feel free to discuss.
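For example, if which php prints /usr/bin/php (the path printed on your server may well differ), the crontab entry becomes something like this sketch, reusing the test script from the question:
* * * * * /usr/bin/php /path/to/test.php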
When you call wget with -O -, it will send the downloaded content to stdout, which cron is sending to you via the email message. In the first case, it's doing exactly what it should.
When you call wget without the -O parameter, it will try to save the downloaded content as a file with the same name as the web page being downloaded. If that file already exists, it will append an incrementing suffix to the name, as you saw. In this second case, it's also doing exactly what it should.
It's not clear from your question where you want the output to go, but if you want to save the output to the same file each time, use -O myfilename.html.
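For instance, an entry along these lines would save (and overwrite) the output in one fixed file on every run; /tmp/tutor_activities.html is just a hypothetical choice of path:
* * * * * /usr/bin/wget -q -O /tmp/tutor_activities.html http://project/cron/tutor_activities.php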
If you're running PHP from cron or the command line, make sure you use the full path to the php executable.
It's entirely possible that PHP's not in the path within the cron environment - it's definitely not going to have the same setup as your regular shell. Try using the absolute path to BOTH the php interpreter AND the php script in the cron command:
* * * * * /path/to/php /path/to/test.php
As for the creation of files, you just have to add a redirect to your wget command:
wget -O - ... http://.... > /dev/null
-O - forces wget to write anything it downloads to standard output, which cron will then happily email to you. By adding the > /dev/null at the end of the command, this output will instead go to the Great Bit Bucket in the Sky. If you don't want wget's stderr output emailed either, you can also add a 2>&1 after the /dev/null, which further redirects stderr to stdout, which is now going to /dev/null.
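Put together, a fully silent version of the entry from the question would look roughly like this:
* * * * * /usr/bin/wget -O - -q -t 1 http://project/cron/tutor_activities.php > /dev/null 2>&1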
I found the problem myself. The URL in my crontab file was not the same one I was running manually, and that was my mistake.
While running manually, I was just typing test into the address bar, my browser's list of saved URLs appeared, and I was selecting http://www.test.project.com/cron/tutor_activities.php. But in the crontab file I had put http://test.project.com/cron/tutor_activities.php, mistakenly assuming it would run http://www.test.project.com/cron/tutor_activities.php (because we have a rewrite rule in place that adds www).
But the rewrite rule was redirecting it to http://www.test.project.com/home. That explains the HTML content in the mails.
So the most important lesson here is not to overlook the small details or assume everything was done correctly. In my case, it would have been better to copy-paste the working URL into the cron file.
An easy and secure (no tmp files) way to do this is to use bash's process substitution:
* * * * * bash -c "/path/to/php <(/usr/bin/wget -O - -q -t 1 http://project/cron/tutor_activities.php)"
Process substitution runs the command inside <() and exposes its output through a file descriptor that is only visible to the current process. To use it from cron, invoke bash directly and pass the whole thing as a command string.
And as others have mentioned, use the full path to php which you can find out with which php.
Related
I am trying to execute this script that will go fetch data from a site and import it into my database.
I have created the cron job and waited for 20 minutes. There is no error or result; it is just silent, as if nothing happened.
I am also not getting an email showing the result of the command. How can I execute this script and also receive the result via email?
This is the cronjob I am currently using:
20 * * * * /usr/bin/GET http://example.com/wp-content/plugins/ScriptName/scrap_data2.php?request_type=import_animes&site=2
Although HostGator does use GET in one of its cron examples, the usual way to "run" a script from cron via its HTTP link is to use wget or curl.
I believe GET is part of the libwww-perl package, so it depends on whether and where that is installed. Try using wget instead.
20 * * * * wget -O /dev/null http://example.com/wp-content/plugins/ScriptName/scrap_data2
-O /dev/null above is used to ditch wget's output, which you won't need since your script emails on success.
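Note that the original URL also carried a query string containing &, which the shell would treat as a background operator; if the script needs those parameters, quote the whole URL (a sketch reusing the URL from the question):
20 * * * * wget -O /dev/null "http://example.com/wp-content/plugins/ScriptName/scrap_data2.php?request_type=import_animes&site=2"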
As implied in the title, the cron job is supposed to execute a PHP file (update.php, to be specific). The PHP file then writes to a CSV file stored in the same directory.
I have the time set to * * * * * so that it executes every minute. The command is written as follows:
php -q /home//public_html/wallboard/update.php
I don't believe this is causing any errors, though it also doesn't seem to write to the CSV file. When I visit update.php in a browser, however, it executes and writes to the CSV file immediately. I'm not experienced with Cron Jobs and I'm sure there's an issue, but I don't know what exactly that issue is. Let me know if you have suggestions/questions. Any help is appreciated!
Current Command:
* * * * * usr/bin/php -q /home/<user>/public_html/wallboard/update.php
update.php:
<?php
include('lib/HelpDeskView.php');
include('lib/WallboardDisplay.php');
include('helpdesk.csv');
$helpdesk = new HelpDeskView();
$text="\r\ntest,test,test";
file_put_contents( "helpdesk.csv" , $text, FILE_APPEND);
Since your script resides in your public_html directory, you can use wget for your cron job:
wget -O - -q https://yoursite.com/wallboard/update.php
-O - writes the output to standard output; in this case it will go to the email address you specify in cPanel.
-q quiet mode
IMHO the best way is to contact support and ask them about command line syntax.
This is how I'm doing it on my Linux server using cPanel.
This runs script.php, which is stored in the public root. Of course, replace <username> in the command line with your username.
On another server I'm using the same command line with /usr/bin/php instead of php at the beginning of the line, but I'm aware that not all servers use the same command line. Some require php-cli instead of php, some don't "like" the -f argument, etc. So try various combinations, as in the sketch below.
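A rough sketch of the kind of combinations worth trying, assuming the script sits at /home/<username>/public_html/script.php as described above (binary names and locations vary between hosts):
* * * * * php -f /home/<username>/public_html/script.php
* * * * * /usr/bin/php /home/<username>/public_html/script.php
* * * * * php-cli /home/<username>/public_html/script.php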
To find more suggestions check out this SO topic too: Run a PHP file in a cron job using CPanel
Important thing: when trying different commands, wait at least a minute (in this case) to see whether it works, because cron doesn't fire your script immediately.
Try executing the same command with the PHP CLI and check whether it gives you any error; you might be missing some libraries or references required for CLI execution.
/usr/bin/php -d register_argc_argv=On /home/USERNAME/public_html/DOMAIN/artisan AMIR:HOME
I have 2 PHP scripts that need to run every few minutes. The PHP itself is fine, as I can trigger each manually by typing http://myfakesite.com/myphpfile.php into a browser; both work as they should.
I don't have much experience with cron, but I have found the crontab file (I am using Parallels Power Panel on a VPS), and it already had jobs that run-parts cron.hourly, cron.daily, cron.weekly and cron.monthly. I added my email address to the MAILTO field and got a message telling me that cron.hourly is not a directory, even though it is (although it is currently empty).
I also added my own 2 jobs to this file as follows:
*/1 * * * * wget http://myfakesite.com/script1.php
*/5 * * * * wget http://myfakesite.com/script2.php
Neither of these ever gets called. In the PHP script I also have a line to email me once the script is called, so that I know it is working, and I have never received an email via the cron job. If I SSH into the server and use wget, the PHP works and emails me to confirm.
I have also tried
*/1 * * * * php http://myfakesite.com/script1.php
and
*/1 * * * * wget -O /dev/null http://myfakesite.com/script1.php>/dev/null 2>&1
with no luck. The MAILTO in the crontab file never sends me a message saying there was a problem running the script, but clearly the script is not running.
Can anyone help at all? I have no idea what to try next.
Edit: I found some info saying that the 'not a directory' error can be down to a corrupt crontab file. So I downloaded it, copied the contents into a new file, and uploaded that. The 'not a directory' error disappeared... completely. Even if I tell it to look for a folder that I know doesn't exist, like cron.myfakefolder, I don't get a "Not a directory: /etc/cron.myfakefolder" email. I replaced my new crontab file with the original, and I'm still not getting any feedback from the cron job.
Edit 2: As dAm2K and prodigitalson suggested, I tried using absolute paths to both wget and php. Neither has worked, and I've double-checked the locations of /usr/bin/wget and /usr/bin/php just to make sure they are actually there.
I also checked /var/log/cron and /var/log/messages; both contained this:
2002Can't perform "download" operation: Requested file "/var/log/cron" is to big to be sent at once. Try to request file in pieces of 512KB
No idea what is happening with this. As I mentioned in edit 1, I'm also no longer receiving error emails even when I purposely add a false location. Is this related to the error in the log files?
Also, I have checked that crond is running and according to both 'System Services' and 'System Processes' it is running.
Try using absolute paths to executables with cron, e.g.
*/5 * * * * /usr/bin/wget 'http://myfakesite.com/script1.php'
instead of
*/5 * * * * wget http://myfakesite.com/script1.php
Double check /var/log/messages or /var/log/syslog log files for details on cron execution, and what happens to the system in general.
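For example, something along these lines usually shows whether cron attempted to run the job at all (the exact log file name depends on the distribution):
grep CRON /var/log/syslog
tail -n 50 /var/log/cron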
If your script is on the same server, try executing it with php -q:
*/1 * * * * php -q pathToTheFile/script1.php
I need to run a PHP script via a cron job to generate snapshots of some websites using CutyCapt. I get the websites' addresses from the database and then delete each record after generating the screenshots.
I used */5 * * * * /usr/bin/php -f /path/generate.php
It didn't work as a cron job, but if I access the same file in a browser it works fine, and if I run the script with the php command from the command line it also works fine.
Then I created another file that accesses the URL using file_get_contents and added it to the cron job. It worked fine for some days, but now this approach is also not working. I don't know what happened; I didn't change any of these files.
I also tried the wget command to access that URL, but failed to get the required output.
My crontab now looks like this:
*/5 * * * * wget "http://www.mysite.com/generate.php" -O /dev/null
The strange thing is that the cron job executes fine: it fetches data from the database and deletes the record as well, but it does not update the images.
Is there some problem with permissions or something similar that prevents it from generating images via the cron job but not when accessed through the browser?
Please help, I am stuck.
I don't know what your script is doing internally, but keep in mind that a user's cron jobs do not inherit the user's environment. So you may be running into a PATH issue if your PHP script runs any shell commands.
If you are running shell commands from the script, try setting the PATH environment variable from within your php script and see if that helps.
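A minimal sketch of what that could look like at the top of generate.php (the PATH value is an assumption and should point at wherever CutyCapt and its dependencies actually live):
<?php
// Cron provides a very sparse environment, so set PATH explicitly
// before the script shells out to external tools such as CutyCapt.
putenv('PATH=/usr/local/bin:/usr/bin:/bin');
// ... rest of generate.php follows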
Are there any user credentials on this page, such as basic authentication?
If so, you have to supply the username and password in the wget request, like this:
wget --http-user=user --http-password=password "http://url"
You could also try another solution: run your script from the PHP command line, so your crontab would look like
*/5 * * * * /usr/bin/php -f /path/to/generate.php
Try this solution; it should work, and it is better than hitting the web server to execute background operations on your data.
I hope this helps.
I've been experimenting with cronjobs recently and have it setup like so:
crontab:
SHELL=/bin/sh
MAILTO=me@me.com
26 11 * * * wget http://diruser:pass@domain.com/path/to/file.php
Within the PHP file it runs is a SQL query which at the moment just performs an INSERT. This works fine and runs the insert, but when I look at the email it produces, it says this:
email:
wget http://diruser:pass@domain.com/path/to/file.php:
Command failed with exit status 1
I was just wondering if anyone knows why it would return "Command failed"? The script obviously doesn't fail, but the email response makes me think I've done something wrong somewhere.
Also, is it good practice to put the file the cron is calling inside a password-protected directory, or is there a better way around it (such as checking that the request IP is that of the server, or something similar)?
Most likely, wget throws an error because it cannot save the result. By default wget will try to store the result on disk. When running in a cron job, the working directory is (usually) /, and you, as an ordinary user, probably cannot store a file there.
Instead, tell wget to drop the result, or pipe it to /dev/null. For example:
wget -O - -q http://diruser:pass@domain.com/path/to/file.php
I have no idea why the error is produced, but in my opinion it is not good practice to call PHP via wget. You can write a PHP script that is not accessible through Apache (or Nginx, or another HTTP server) and call it from cron:
SHELL=/bin/sh
MAILTO=me@me.com
26 11 * * * php /path/to/file.php
An even better way is to run it as a separate user, phpuser for example, who has only the permissions that the script needs.
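A sketch of what that could look like, assuming root access and a system-wide cron file (for example one under /etc/cron.d), where the sixth field names the user the command runs as:
26 11 * * * phpuser /usr/bin/php /path/to/file.php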
Try:
26 11 * * * /usr/local/bin/wget "http://diruser:pass@domain.com/path/to/file.php" > /dev/null 2>&1
Or
26 11 * * * /usr/bin/wget "http://diruser:pass@domain.com/path/to/file.php" > /dev/null 2>&1
1. crontab needs the full path to the command in order to run it.
2. wget will try to save the response of file.php; if it doesn't have the necessary permissions, it will fail. That's why you should redirect the output somewhere other than a file, which is accomplished by > /dev/null.
There are three standard streams for a program: STDIN, STDOUT and STDERR, numbered 0, 1 and 2 respectively.
When you redirect output with the greater-than sign > and you don't explicitly say which stream to redirect, the default is STDOUT (1). Thus > /dev/null redirects all STDOUT output to the trash, and 2>&1 redirects all errors (STDERR) to STDOUT, which in turn goes to the trash, as established by the previous redirection.
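As a quick illustration of the stream numbering (some_command is just a placeholder here, not part of the setup above):
some_command > out.log 2> err.log    # STDOUT goes to out.log, STDERR goes to err.log
some_command > /dev/null 2>&1        # both streams are discarded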