How can I check in my PHP script that it was executed by the wget utility from crontab, and not from a web browser by some user?
There is no reliable solution. Anything wget can do, your browser can do too.
Your best shot is something like pointing wget at http://example.com/myscript.php?accesskey=some+really+obscure+password+that+only+you+should+know and checking for that access key in your script. Of course, if anyone gets hold of the password, this kind of protection is useless, but it's a far more predictable point of failure than blindly relying on User-Agent sniffing.
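A minimal sketch of that check (the key value and constant name here are placeholders; use your own long random string):
<?php
// CRON_ACCESS_KEY is a placeholder; replace it with your own secret.
define('CRON_ACCESS_KEY', 'some-really-obscure-password-only-you-know');
// hash_equals() (PHP 5.6+) compares the strings in constant time.
if (!isset($_GET['accesskey']) || !hash_equals(CRON_ACCESS_KEY, (string) $_GET['accesskey'])) {
    header('HTTP/1.0 403 Forbidden');
    exit;
}
// ... work that only cron should trigger ...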
A possibility would be to use $argv: check whether $argv[1] has a certain value and call the script from crontab with that value as an argument, like so: php script.php argument1.
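For example (the marker string "argument1" is arbitrary):
<?php
// With register_argc_argv disabled for web requests (the php.ini default),
// $argv is only populated when the script runs from the command line,
// so a request made through wget or a browser won't pass this check.
if (!isset($argv[1]) || $argv[1] !== 'argument1') {
    exit('This script may only be run from cron.');
}
// ... cron-only work goes here ...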
Your question is a bit difficult to understand, but I guess you want to make sure a PHP script is requested by wget (which in turn is initiated by cron).
Although it might be more efficient to have cron call the PHP script directly, in this case you could check the server's logging and search for a user agent matching something like wget.
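For example (the access log path is an assumption and varies by distribution):
grep -i wget /var/log/apache2/access.log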
An insecure solution would be to check the headers for the User-Agent:
wget -d http://www.google.com/
---request begin---
GET / HTTP/1.0
User-Agent: Wget/1.12 (linux-gnu)
Accept: */*
Host: www.google.com
Connection: Keep-Alive
---request end---
So you could do:
<?php
$userAgent = $_SERVER['HTTP_USER_AGENT'];
if (strstr($userAgent, 'Wget')) {
    // wget request
}
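Keep in mind why this is insecure: any client can forge that header. For example, with the URL from above:
curl -A "Wget/1.12 (linux-gnu)" http://example.com/myscript.php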
You can pass some arguments to your script in crontab: http://php.net/manual/reserved.variables.argv.php
By checking for these args, you'll know whether your script was run from the command line or from the web.
EDIT:
Seeing the other answers, let's make this clear.
Calls made with wget, cURL, or any other HTTP GET request WON'T PASS ARGS!
Args are only passed on a local call (like: php script.php arg1 arg2).
If you aren't sure about this, try it out yourself on your server before answering.
I need to download several zip files from this web page:
http://www.geoportale.regione.lombardia.it/download-pacchetti?p_p_id=dwnpackageportlet_WAR_geoportaledownloadportlet&p_p_lifecycle=0&metadataid=%7B16C07895-B75B-466A-B980-940ECA207F64%7D
using curl or wget, i.e. not interactively.
A sample URL is the following:
http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12
If I use this link in a new browser tab or window, everything works fine, but using curl or wget it's not possible to download the zip file.
Watching what happens in the browser with Firebug (or the browser console in general), I can see that there is first a POST request and then a GET request, so I'm not able to reproduce these requests using curl or wget.
Could it also be that some cookies are set in the browser session and the links don't work without them?
Any suggestion will be appreciated.
Cesare
NOTE: when I try to use wget, this is my result:
NOTE 2: 404 Not Found
NOTE 3 (the solution): the right command is
wget "http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12"
then I have to rename the file to something like "pippo.zip" (and this is my result), or, better, use the -O option in this manner:
wget "http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12" -O pippo.zip
Looking at your command, you're missing the double quotes. Without them the shell treats the & in the URL as a command separator, so the cod=12 parameter never reaches wget. Your command should be:
wget "http://www.geoportale.regione.lombardia.it/rlregis_download/service/package?dbId=323&cod=12"
That should download it properly.
My question is: I have a wget cron job set up in cPanel and it does not work. I mean, the cron job is running, but the work inside the code is not happening.
But when I just enter the same URL in the browser, the code gets executed successfully.
So can someone tell me which wget command is exactly equivalent to performing the request from a browser?
The current cron looks like this (I am using the Zend PHP framework):
wget -b http://www.**myhost**/index/db-backup
If you have curl enabled, you could just use this as your cron entry:
curl http://www.**myhost**/index/db-backup
Then it is exactly as if you hit the address in a browser.
If you don't want any data dumped from the cron output, you can do this:
curl http://www.**myhost**/index/db-backup > /dev/null 2>&1
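If you'd rather stay with wget, an equivalent entry would be this (-q suppresses wget's own messages, -O - writes the page to stdout):
wget -q -O - http://www.**myhost**/index/db-backup > /dev/null 2>&1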
The HTTP requests made by wget and by your browser may differ, and that may cause some trouble in the application. You may want to debug your HTTP request with wget's --debug URL option.
For simple crawling for cron purposes, wget has a spider option: wget --spider URL
http://www.gnu.org/software/wget/manual/wget.html
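If the application really does treat browsers differently, you can also make wget's request look more like a browser's by overriding the User-Agent header (the string below is only an illustrative example):
wget --user-agent="Mozilla/5.0 (X11; Linux x86_64)" http://www.**myhost**/index/db-backup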
So I want to execute a bash command from PHP on my web server. I can do this using shell_exec. However, one of the commands I want to execute is curl. I use it to send a .wav file to another server and record its response. But when invoked from PHP, curl doesn't work.
I reduced the error to the following small example. I have a script named php_script.php which contains:
<?php
$ver = shell_exec("curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver");
echo $ver;
The curious thing is that when I run this php script from command line using php php_script.php, the result I get is
Status: 500 Internal Server Error
Content-type: text/html
However, if I run curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver directly from the shell, I get the response I was expecting:
verdict = authentic
(Edit:) I should probably mention that if I put some bash code inside the shell_exec argument which does not contain curl, the bash command executes fine. For example, changing the line to $ver = shell_exec("echo hello > world"); puts the word "hello" into the file "world" (provided it exists and is writable). (End edit.)
Something is blocking the execution of curl when it is invoked from PHP. I thought this might be PHP running in safe mode, but I found no indication of that in php.ini. (Is there a way to test this to be 100% sure?) What's blocking curl and, more importantly, how can I bypass or disable this block?
(And yes, I realize PHP has a curl library. However, I prefer to use commands I can run from the command line as well, for debugging purposes.)
cheers,
Alan
The reason is administrative privileges: when you run the command directly, you are running it as root, and thus the command gets executed. But when you run the command through PHP, it runs as an ordinary user which, by default, does not have the privileges to run shell_exec commands.
You have to change the shell_exec settings through cPanel or the Apache config file. But it is not recommended to give users shell_exec access, as it helps attackers compromise the server, so proper care should be taken.
It would be more appropriate to use the curl library provided in PHP.
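A minimal sketch with PHP's curl extension, mirroring the command from the question (the URL and file path are the question's own placeholders):
<?php
// Equivalent of: curl -F file=@uploads/2013-7-24-17-31-43-29097-flash.wav http://otherserver
$ch = curl_init('http://otherserver');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
// CURLFile (PHP 5.5+) marks the field as a file upload, like -F file=@...
curl_setopt($ch, CURLOPT_POSTFIELDS, array(
    'file' => new CURLFile('uploads/2013-7-24-17-31-43-29097-flash.wav'),
));
$ver = curl_exec($ch);
if ($ver === false) {
    echo 'curl error: ' . curl_error($ch);
} else {
    echo $ver;
}
curl_close($ch);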
Hi, is it possible to use sessions in a cron job?
The script I use is:
session_start();
if(empty($_SESSION['startwert'])){$startwert = 0;}
else {$startwert = $_SESSION['startwert'];}
if(empty($_SESSION['zielwert'])){$zielwert = 10000;}
else {$zielwert = $_SESSION['zielwert'];}
....
$_SESSION['startwert'] = $zielwert;
$_SESSION['zielwert'] = $zielwert + 10000;
echo "Startwert: ".$_SESSION['startwert']."<br>";
echo "Zielwert: ".$_SESSION['zielwert']."<br>";
But the cron always starts by setting "startwert" to 10000 and "zielwert" to 20000; it does not increase the values.
OK, now I have tried this:
/usr/bin/wget -O - http://mydomain.com/script.php
But the cron still always starts with 10000 and 20000. Any ideas?
If you're invoking the PHP script from cron via wget, use the --save-cookies option; if via curl, use --cookie-jar. (If you're invoking the PHP script via php -f [...] or similar, then you'll first need to invoke it via wget or curl instead.)
For example:
wget --load-cookies /tmp/cron-session-cookies --save-cookies /tmp/cron-session-cookies --keep-session-cookies [...]
or
curl -b /tmp/cron-session-cookies -c /tmp/cron-session-cookies [...]
wget by default doesn't save session cookies, which you want it to do, hence the --keep-session-cookies option; with curl, -b (--cookie) tells it which file to read cookies from and -c (--cookie-jar) tells it where to save them on exit. In either case, replace the [...] with whatever options and arguments you're already passing to the program, and adjust the location of the cookie-jar file to taste.
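Putting it together with the command from the question (the cookie-jar path is illustrative):
/usr/bin/wget -O - --load-cookies /tmp/cron-session-cookies --save-cookies /tmp/cron-session-cookies --keep-session-cookies http://mydomain.com/script.php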
Not really. PHP sessions are dependent on cookies (ignoring trans-sid mode), which really only exist in an HTTP context. Cron jobs run in CLI mode, so there's no HTTP layer to deal with.
You CAN force a CLI script to use a particular session file by setting the session ID before calling session_start(), but there's no guarantee that that particular ID will still exist when the cron job starts, as some other PHP instance's session garbage collector may have deleted it.
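A sketch of that approach, using the variables from the question (the session ID is arbitrary, and the garbage-collection caveat above still applies):
<?php
// Force a fixed session ID so consecutive runs share one session file.
session_id('cron-job-counter');
session_start();
$_SESSION['startwert'] = isset($_SESSION['zielwert']) ? $_SESSION['zielwert'] : 0;
$_SESSION['zielwert'] = $_SESSION['startwert'] + 10000;
echo "Startwert: " . $_SESSION['startwert'] . "\n";
echo "Zielwert: " . $_SESSION['zielwert'] . "\n";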
I have set up a cron job which updates a bunch of contracts in a certain system. When I run the PHP script in a browser it all works fine, but when the cron job has to do the trick, it fails. I'm kinda stuck on this one since I don't have a lot of experience with cron jobs (heck, I can only set them up in DirectAdmin).
My PHP script has some includes for some classes; these includes work properly (I've tested it by sending mails to myself line by line). When the base classes are included, I have a class which handles autoloading. When I do something like Class::GetInstance(), it fails.
My cronjob looks like:
* * * * * /usr/local/bin/php /home/username/domains/domain/public_html/path/to/script.php
What can I do to fix this? Perhaps not run it via php, but via a browser or something? I'm sorry if this is a stupid question, but I don't know this ;)
Remember that when PHP is executed on the CLI with /usr/local/bin/php, the $_SERVER variable is not set properly! I had that problem too because my script had to use $_SERVER['DOCUMENT_ROOT']. As said, try to run it in a normal shell to see if it works. Alternatively, you can change your cron job command to:
wget -q http://yourdomain.com/path/to/script.php
Usually this works well because it is essentially identical to fetching that URL from a normal browser.
wget man page here: http://linux.die.net/man/1/wget
You can't always call a PHP file directly when it expects to be called via HTTP. Judging from the path, it's part of a website that is normally executed by a browser, hence I'd set the cron job up not to call it directly via php-cli, but rather to make a curl request to the website's URL.
"It fails" is not a problem description one can call sufficient.
Add this line to your crontab file:
MAILTO=your@mail
and run your jobs.
You will get the script output and will be able either to correct your code or to ask a sensible question.
You may also redirect stdout and stderr to a log file.
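For example (the log file path is just an illustration):
* * * * * /usr/local/bin/php /home/username/domains/domain/public_html/path/to/script.php >> /home/username/cron-script.log 2>&1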