Can I run a PHP script from the command line with the following usage?
php http://phpfile.php username password config_file
When I run this it says "Could not open input file: http://phpfile.php".
If there is a way, what would be the best way to execute this from within a PHP script?
Thanks
This is not possible. You cannot execute a PHP script hosted on someone else's server from your CLI.
WHY?
Consider this case: Facebook has a PHP script which adds a comment to the database. What would be the outcome if someone could execute that script from their local command line and go about adding comments to the database? It would really mess up the system. Or consider something like a file-hosting script: you can well imagine the outcome if anyone could delete any file from their own CLI.
Solution!
The script can be executed if:
Either you have the script saved locally and run it,
Or you make a GET or POST request to the script with the required data and let it do the work.
Summing up
You can execute it only in the way the owner allows (via GET and POST requests).
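If the remote script accepts input over HTTP, you can call it with a POST request from PHP. A minimal sketch, assuming a hypothetical endpoint and field names (replace them with the real ones):

```php
<?php
// Send a POST request to a remote PHP script and return its output.
// The remote server runs the script; you only ever see what it prints.
function callRemoteScript(string $url, array $data)
{
    $context = stream_context_create([
        'http' => [
            'method'  => 'POST',
            'header'  => 'Content-Type: application/x-www-form-urlencoded',
            'content' => http_build_query($data),  // url-encode the fields
        ],
    ]);
    return file_get_contents($url, false, $context);
}

// Example call (placeholder URL and credentials):
// echo callRemoteScript('https://example.com/phpfile.php',
//     ['username' => 'user', 'password' => 'secret', 'config_file' => 'config.ini']);
```

The same request can of course also be made with cURL; the stream-context approach just avoids an extension dependency.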
Related
My webhost offers "one-click" cronjobs where you can set up a simple cronjob by pointing it to a PHP file and choosing the desired run frequency. However, the set-up only seems to accept regular URLs, and so does not allow including parameters for the target PHP file.
I was wondering whether I could simply create a new php-file with a script that will run the target php-file including the parameters, and then let the webhost's cronjob point to the new php-file instead.
I am totally new to php, and finding out how to program a script that executes a php-file with parameters is apparently beyond my capacity.
[Background: I've installed tiny tiny rss (as a replacement for Google reader) on my hosted webpage and to ensure regular feed updates in my mobile device, running cronjob on the webpage is necessary. However, the relevant update.php file needs to run with some parameters (--feeds and --quiet), which are therefore the parameters I need to include when running the cronjob.]
You can try the system() function:
system('/usr/bin/php /home/user/public_html/tt-rss/update.php --feeds --quiet');
Don't forget to update the path to php and the path to update.php.
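Since the host's cron UI only accepts a URL, the wrapper file you described could look like this. This is a sketch; the file name cron_wrapper.php and both paths are placeholders you would adjust for your server:

```php
<?php
// cron_wrapper.php -- hypothetical wrapper the host's URL-only cron points at.
// Adjust both paths to match your server.
$php    = '/usr/bin/php';
$script = '/home/user/public_html/tt-rss/update.php';

// The parameters update.php needs; escapeshellarg() keeps them shell-safe.
$args = implode(' ', array_map('escapeshellarg', ['--feeds', '--quiet']));

// The host's cron fetches this file over HTTP, so only act on web requests.
if (PHP_SAPI !== 'cli') {
    system("$php $script $args", $exitCode);
    exit($exitCode);
}
```

Pointing the one-click cronjob at the wrapper's URL then runs update.php with its parameters on the chosen schedule.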
If it doesn't work, you could try http://feedly.com to replace Google Reader
You can use the shell through PHP to run the file:
exec("php name.php");
Works for me.
I'm not sure I've understood the use/purpose of PHP entirely, but it seems to me that a .php file only executes when it is called/executed by something before it, which could be an HTML or a PHP file.
A thought: is it possible for a PHP file to be activated on its own, for example over a span of time, so that every week or so it would do something?
Thanks for looking...
You are looking for a cron job. This lets you save a scheduled command on your remote server that will execute based on the criteria you set. Cron can execute a variety of files, and PHP files are definitely among those you can run this way.
As mentioned by nathan, you are looking for a cron job. This is a server-side setting that calls a URL (or runs a script) at a set interval.
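For reference, the crontab entry behind such a setting might look like this; the paths are placeholders, and on a plain server you would add the line with crontab -e:

```shell
# Run myscript.php once a week (Sunday at 03:00).
# Fields: minute hour day-of-month month day-of-week command
0 3 * * 0 /usr/bin/php /path/to/myscript.php
```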
You seem to not really understand how PHP works: PHP scripts are executed server-side before data is sent to the client, and they run once each time the client accesses the script.
What my page does is:
download an array from a different server (first.php)
a PHP script parses the values
the parsed values are sent with an ajax call
on the next (ajax-called) page (second.php) there are some MySQL queries
if the values pass a condition, they are written to the database
So when I run my first.php it loads second.php and everything's fine, but what I want to know is whether it is possible to have cron do this.
If not, what should I do?
Thanks.
There are certain things you need to understand in this regard.
The first is that PHP can be run as either a web server module or as a standalone executable. When you run it as a web server module, you open it from the browser, all related web technologies (html/css/js) etc get parsed and work in unison.
When you run it from the command line using cron, say /usr/bin/php mywebpage.php, the PHP executable DOES NOT parse/understand the other web technologies, and so your page will fail.
There are two workarounds for this:
Rewrite the web-enabled parts so that the ajax/js work is handled by PHP. The rule of thumb is that a CLI PHP script should contain ONLY core PHP. This is the preferred way: move the ajax calls into the same file and make it a single execution flow, like any regular program.
If for some reason you cannot do the above, you can try something like: /path/to/browser http://mysite/mywebpage.php. Here you are running a browser executable and passing it the page URL, so the page executes inside the browser's environment, which can parse and run the ajax/js.
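The first workaround could be sketched as a single CLI script along these lines. The URL, database credentials, table, and the numeric-only validation rule are all assumptions standing in for whatever first.php and second.php actually do:

```php
<?php
// cron_job.php -- sketch merging first.php and second.php into one CLI flow.

// Stand-in for second.php's condition: keep only numeric values (assumed rule).
function filterValues(array $values): array
{
    return array_values(array_filter($values, 'is_numeric'));
}

// Guarded so merely parsing this file does nothing; cron would invoke it as:
//   /usr/bin/php cron_job.php --run
if (PHP_SAPI === 'cli' && in_array('--run', $argv ?? [], true)) {
    // 1) Download the array from the other server (placeholder URL).
    $raw    = file_get_contents('https://other-server.example.com/data.php');
    $values = json_decode($raw, true);

    // 2) Parse/validate locally -- this replaces the ajax hop to second.php.
    $valid = filterValues($values);

    // 3) Insert straight into the database (placeholder DSN/credentials).
    $pdo  = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
    $stmt = $pdo->prepare('INSERT INTO readings (value) VALUES (?)');
    foreach ($valid as $v) {
        $stmt->execute([$v]);
    }
}
```

The key point is that there is no ajax and no second page: fetch, validate, and insert happen in one straight-line program that cron can run.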
Yes, you can set this up as a cron job in the following way:
1) Download the array from the different server (first.php).
2) Parse the values in first.php.
3) Include the second file, second.php, via include_once, which executes the MySQL queries.
4) If everything is correct, insert the values into the database.
It sounds like you need a standalone JavaScript shell. There are a number listed at:
https://developer.mozilla.org/en-US/docs/JavaScript/Shells
Sorry if this is a duplicate question...I've searched around and found similar advice but nothing that helps my exact problem. And please excuse the noob questions, CRON is a new thing for me.
I have a codeigniter script that scrapes the html DOM of another site and stores some of that in a database. I'd like to run this script at a regular interval. This has lead me to looking into cron jobs.
The page I have is at myserver.com/index.php/update
I realize I can run a cron job with curl and run this page. If I want to be a bit more secure I can put a string at the end like:
myserver.com/index.php/update/asdfh2784fufds
And check for that in my CI controller.
This seems like it would be mostly secure, but doesn't seem like the "right" way to do things.
I've looked into running CI from the command line, and can execute basic pages like:
php index.php mycontroller
But when I try to do:
php index.php update
It doesn't work. I suspect this is because it needs to use HTTP to scrape the DOM of the outside page.
So, my question:
How do I securely run a codeigniter script with a cron job that needs HTTP access?
You have a couple of options. The easiest would be to have your script ensure that $_SERVER['REMOTE_ADDR'] is the machine's own address before executing.
Another would be to use https and have wget or curl use HTTP authentication.
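The REMOTE_ADDR check from the first option might be sketched like this; isLocalRequest() is a made-up helper, and both REMOTE_ADDR and SERVER_ADDR are filled in by the web server:

```php
<?php
// Allow the request only if it originates from this machine itself.
function isLocalRequest(array $server): bool
{
    $remote = $server['REMOTE_ADDR'] ?? '';
    // Accept loopback addresses or the server's own address.
    return $remote === '127.0.0.1'
        || $remote === '::1'
        || ($remote !== '' && $remote === ($server['SERVER_ADDR'] ?? null));
}

// Skip the guard on the command line, where no request headers exist.
if (PHP_SAPI !== 'cli' && !isLocalRequest($_SERVER)) {
    header('HTTP/1.1 403 Forbidden');
    exit('Not allowed.');
}
// ...scraping/DB work continues here...
```

With this in place, curl on the server itself (as cron would run it) gets through, while outside requests are rejected.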
What exactly went wrong?
What error did it throw?
I have used CI from the command line before without any problems.
Don't forget that if you are not in the folder where the script is located, you need to specify the full path to it.
something like
php /path/to/ci_folder/index.php update
Also, in your controller you can add:

if ($this->input->is_cli_request()) {
    // run the script
} else {
    // echo some message saying this is not allowed
}

This runs what is needed only if the PHP script is being executed from the command line.
Hope it helped.
Hey folks, the way I understand it, cron can be used to execute PHP code by launching the PHP interpreter and passing it the path to the script to be executed.
The code I would like to schedule is in a codeigniter controller/model. So basically the controller contains 3 functions that perform some db stats. Each function will have its own schedule.
How can I secure that controller so the code doesn't get executed maliciously? Do I pass some creds to the controller as part of the cron job, or do I take that code and set it up as a separate CI app?
Any thoughts on the matter would be appreciated.
thanks
You shouldn't create a controller to run a scheduled script. You should just create a normal PHP script and launch it via the command line/cron.
The script shouldn't be in your public web directory; it should be elsewhere (in a scripts folder, for example), not accessible by the public (a scheduled script shouldn't be a web page).
Because if your script is a controller, you launch it via the HTTP server, which isn't secure, and in your cron task you'd have to use something like wget "localhost/mycontroller/myaction" (less clean).
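A standalone script kept outside the web root could guard itself like this; the path in the comment is illustrative, and the bootstrap line shows where you would pull in your app's own config:

```php
<?php
// /home/user/scripts/stats_cron.php -- lives outside public_html, so it is
// never reachable over HTTP; the guard below is belt and braces.
if (PHP_SAPI !== 'cli') {
    exit('This script can only be run from the command line.');
}

// Do the scheduled work here, e.g. reuse your app's config/models:
// require '/home/user/app/bootstrap.php';
echo 'Stats job ran at ' . date('c') . PHP_EOL;
```

Cron then calls it directly, e.g. /usr/bin/php /home/user/scripts/stats_cron.php, with no HTTP server involved at all.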
You could always move the file outside the web directory, so it can only be accessed from the server side. Another way is to change the permissions on the file so your web server can't read it, and execute the cron under root (not recommended).
As for creds, you can make the script run only if the correct GET variable is passed. For example, the script only runs when you call:
http://localhost/script.php?chjfbbhjscu4iu793673vhjdvhjdbjvbdh=bugy34gruhw3d78gyfhjbryufgbcgherbciube
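If you go the query-string route, compare the token with hash_equals() rather than ==, so the comparison takes constant time. The token below is obviously a placeholder; generate your own and keep it out of version control:

```php
<?php
// Placeholder secret -- generate one with:
//   php -r "echo bin2hex(random_bytes(32));"
const CRON_TOKEN = 'bugy34gruhw3d78gyfhjbryufgbcgherbciube';

function tokenIsValid(?string $supplied): bool
{
    // hash_equals() is a constant-time comparison, unlike == / ===.
    return $supplied !== null && hash_equals(CRON_TOKEN, $supplied);
}

// CLI runs (e.g. local testing) skip the web guard entirely.
if (PHP_SAPI !== 'cli' && !tokenIsValid($_GET['token'] ?? null)) {
    header('HTTP/1.1 403 Forbidden');
    exit;
}
// ...the actual cron work follows...
```

The cron task would then request something like script.php?token=<your-secret> over HTTPS, so the token isn't visible on the wire.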
I don't think the query-string idea is that bad, actually; especially if the URL only travels on your own network behind a firewall, there's no real cause for concern.
Another security feature you could implement is making sure the "client's" request IP address is equal to the server's IP address, hence the script can only proceed if it is being called from the server that executes the controller action.