"Visiting" a URL in PHP - php

The way my site is set up, I need to manually visit two URLs to trigger the mail system: one URL compiles a list of emails, the other sends them off.
I'd like to automate this with a cronjob, but here's the problem: I am using the Kohana framework, and I don't think copy-pasting the code out of the controllers will work.
The easiest way to accomplish this would be to have the two URLs visited every 5 minutes or so. Is it possible to "visit" (for lack of a better word) pages in PHP?

Yes. If you fetch the URL with file_get_contents or cURL, it counts as "visited", since either one simply issues a GET request.
file_get_contents($url1);
file_get_contents($url2);

If you just want to 'visit' a web site, you can retrieve it via file_get_contents(), or, if you have the cURL extension installed, fire a cURL request at your URLs.
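For example, a minimal cURL GET with a timeout might look like this (the URL is a placeholder, not from the question):
$ch = curl_init('http://example.com/mail/compile'); // placeholder URL for the first mail step
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
curl_setopt($ch, CURLOPT_TIMEOUT, 30);          // don't let a slow page hang the cron run
curl_exec($ch);
curl_close($ch);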

If you are running the cron job on the same machine as the web server, you can call Kohana on the command line using this syntax:
/usr/bin/php index.php --uri=controller/action
Replace controller/action with the route you wish to call.
Note that the $_SERVER variables are not defined when you invoke Kohana in this manner.
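Combined with the every-five-minutes requirement from the question, the crontab entry might look like this (the path to index.php is an assumption):
*/5 * * * * /usr/bin/php /path/to/index.php --uri=controller/action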


Aurelia + Php Possible/Recommended?

Before this question gets closed: I know the setup above is possible; I just want clarification on a few things.
I just started learning Aurelia because I want to convert one of my projects into a web app. My project is built with HTML + CSS + JavaScript (jQuery) + PHP (MySQL).
I haven't used any sort of framework before.
In the guide, they mention a few ways to set up a web server. I used the http server with Node. Now this is where I need some help understanding a few things.
I don't want to use Node.js. I want to use PHP on the server. Will that work, and how?
When using an Apache server, I know any PHP page is sent to the interpreter, which renders the final HTML. I use XAMPP, and its Apache comes bundled with PHP. Does the http server used by Node come with PHP? Is this even a sensible question?
Now, I know Aurelia is purely front end. If it is used to make single-page applications, it uses Ajax. So I made the following assumption:
Using Aurelia, the user accesses the root page of the app, which the web server sends. After that, Aurelia makes various Ajax requests to the server, which uses my PHP files to do the database query work.
Is that right, or am I missing something? And can I just use XAMPP (Apache) to host my app instead of the server from Node?
Aurelia is a framework that, once you export it to a server, does not rely on any back-end software at all. This means that, with the help of the http-client / fetch-client API, you can just call out to your PHP script.
I have an example on my GitHub:
https://github.com/rjpvroegop/randyvroegop.nl-made-with-aurelia
Here I use the http-client to post data to my PHP script, which has very simple email functionality.
You can see the action inside my view-model in src/pages/contact/index.js.
You can see the PHP script in src/assets/components/contactengine.php.
These work the way they should. Note: you have to change your gulp build if you want your PHP served the way I serve mine, from the dist folder after gulp-watch or gulp-export.
Beyond that, you can use any back-end functionality you like, as long as it returns the proper data. This PHP script does that. If you download my distribution to test this, simply do the following:
run gulp export from your terminal in the root folder
copy everything from the export folder to your PHP webserver
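To make the division of labour concrete, here is a minimal sketch of a PHP endpoint that an Aurelia fetch call could hit; the file name, database credentials, and table are illustrative assumptions, not taken from the linked repo:
<?php
// api/get-items.php - hypothetical endpoint; Aurelia's fetch client would GET this URL
header('Content-Type: application/json');
$pdo = new PDO('mysql:host=localhost;dbname=myapp', 'user', 'secret'); // placeholder credentials
$rows = $pdo->query('SELECT id, name FROM items')->fetchAll(PDO::FETCH_ASSOC);
echo json_encode($rows); // Aurelia receives plain JSON and renders it client-side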

Detecting outside calls from PHP (curl, file_get_contents, ...)

Is there a simple way to detect all outside calls made from PHP?
In an open-source project I have a lot of third-party scripts. With the help of New Relic I was able to trace long execution times back to some of these scripts making calls back to their own servers.
I don't mind this, but I want to know what data these scripts are sending, and above all I don't want a slow site when a third-party script's server is down or unreachable.
Is there an easy way to log all curl, file_get_contents, etc. requests?
Thanks!
You are looking for a packet sniffer. Usually you would use tcpdump and/or Wireshark.
http://wiki.ubuntuusers.de/tcpdump
https://www.wireshark.org/
There are many solutions, but my preferred one is to build your own proxy: for example, a dedicated Apache server (it can run on the same IP but a different port) that handles this type of operation. You then change all of your old URLs to pass through the proxy.
Imagine you have this in your code: curl_init('www.google.com'); You would change it to: curl_init('http://localhost:8090/CALLS_OUTSIDE_PHP_CONTROLLER.php?url_to_redirect=www.google.com');
The PHP controller running on port 8090 can then do many useful things: blacklist/whitelist certain URLs, run regular URL checks in the background, and so on.
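A minimal sketch of such a proxy controller, assuming a simple host allow-list and a log file (the whitelist entries and log path are placeholders):
<?php
// CALLS_OUTSIDE_PHP_CONTROLLER.php - hypothetical proxy; names follow the answer above
$target = isset($_GET['url_to_redirect']) ? $_GET['url_to_redirect'] : '';
$whitelist = array('www.google.com', 'api.example.com'); // placeholder allow-list
$host = parse_url($target, PHP_URL_HOST);
if ($host === null) { $host = $target; } // bare hosts like 'www.google.com' carry no scheme
if (!in_array($host, $whitelist, true)) {
    header('HTTP/1.1 403 Forbidden');
    exit('Blocked outbound host: ' . htmlspecialchars($host));
}
// log every outbound call so you can audit what the third-party scripts send
file_put_contents('/var/log/php-outbound.log', date('c') . ' ' . $target . "\n", FILE_APPEND);
// forward with tight timeouts so a dead third-party server cannot stall your site
$ch = curl_init($target);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 2);
curl_setopt($ch, CURLOPT_TIMEOUT, 5);
echo curl_exec($ch);
curl_close($ch);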

file_get_contents/curl blocks other clients

I use file_get_contents/curl in my PHP script to access an API on another server. This API isn't fast and can take up to 10 seconds to respond.
When I try to open two pages on my web site at the same time, both of which use this API, they load one by one: I have to wait for the first to finish before the server starts serving the request for the second page.
I use Apache 2 and PHP under Linux.
How can I avoid this behaviour? I don't want to block other clients while one of them accesses this API. Need help!
Thanks.
Yes.
There is this PHP library: http://code.google.com/p/multirequest/ (it's a multithreaded cURL lib).
As another solution, you could write a script in a language that supports threading, like Ruby or Python, and then just call that script from PHP. Seems rather simple.
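If you'd rather stay in plain PHP, the built-in curl_multi API gives you the same parallel-fetch idea; here is a sketch with placeholder URLs:
// fetch several slow API URLs in parallel instead of one after another
$urls = array('http://api.example.com/a', 'http://api.example.com/b'); // placeholders
$mh = curl_multi_init();
$handles = array();
foreach ($urls as $url) {
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_multi_add_handle($mh, $ch);
    $handles[] = $ch;
}
do {
    curl_multi_exec($mh, $running); // drive all transfers
    if ($running) {
        curl_multi_select($mh, 1.0); // wait for activity instead of busy-looping
    }
} while ($running > 0);
$responses = array();
foreach ($handles as $ch) {
    $responses[] = curl_multi_getcontent($ch);
    curl_multi_remove_handle($mh, $ch);
    curl_close($ch);
}
curl_multi_close($mh);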

Cronjob for a Zend view?

This isn't the best method for doing the task, but how would you run a cronjob for a Zend view?
The view is used to generate a file with an output buffer and then save the file on the server; it runs once a day.
Would it just be a matter of calling the URL of the controller's action with curl?
50 23 * * * curl http://pclite.com/statistics/generate
The application requires authentication, though.
If you are the admin of the server, I would not do it this way.
I would write a PHP page that uses curl to download and save the file. Since you are writing a PHP file, you can simulate the login procedure: put the username and password in the PHP file and make sure the file is saved where you want it.
Then I would use Lynx (a text browser) in the cron; it calls this PHP file once a day, so you don't have to record any username or password in the cronjob, and the PHP file grabs whatever you want.
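A sketch of the fetch-and-save script this describes, assuming a form-based login; the login URL, field names, and paths are placeholders:
<?php
// log in first so the session cookie ends up in the cookie jar
$ch = curl_init('http://pclite.com/login'); // hypothetical login URL
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, http_build_query(array(
    'username' => 'cronuser',   // stored here, not in the crontab
    'password' => 'secret',
)));
curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/cron_cookies.txt');
curl_exec($ch);
curl_close($ch);
// now request the protected page with the saved session and store the result
$ch = curl_init('http://pclite.com/statistics/generate');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/cron_cookies.txt');
$html = curl_exec($ch);
curl_close($ch);
file_put_contents('/path/to/statistics.html', $html);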
Since you said that this is not the best method for doing such a task, I won't say it again :D
If the cronjob runs on the same server your webserver is on, you could check the client IP and skip authentication when they match. Because if an "attacker" can already send requests from your own server to the application, you have a serious security issue anyway.
So, yes: if you skip authentication when the IP is the same, you just need to call the URL.
Like any other class, Zend_View can be instantiated from anywhere, and in particular Zend_View can render to a variable. This means that you do not need to call the whole web application if all you want to do is render something.
As stated, your other option is to have an entry point into the application and call that to get the result. But if you're just saving a file to the server, it can be a better approach to have the cronjob be a script that does the work itself. This way you also take some load off your web application. That may not seem relevant now, but what if in the future you want to call this endpoint several times per day, or for a lot of users?
So, you can create a CLI script that includes Zend_View and renders within itself. As always with Zend Framework, the implementation choice is left entirely to you.
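A minimal sketch of such a CLI script, assuming Zend Framework 1 is on the include path and the view script exists (paths and variable names are illustrative):
<?php
// render a Zend_View template to a variable without booting the whole application
require_once 'Zend/View.php';
$view = new Zend_View(array('scriptPath' => '/path/to/application/views/scripts'));
$view->stats = array('visits' => 1234); // whatever data the template expects
$html = $view->render('statistics/generate.phtml'); // renders to a string
file_put_contents('/path/to/output/statistics.html', $html);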

CodeIgniter cron job with HTTP access

Sorry if this is a duplicate question... I've searched around and found similar advice, but nothing that addresses my exact problem. And please excuse the noob questions; cron is new to me.
I have a CodeIgniter script that scrapes the HTML DOM of another site and stores some of it in a database. I'd like to run this script at a regular interval, which has led me to look into cron jobs.
The page I have is at myserver.com/index.php/update
I realize I can run a cron job with curl and hit this page. If I want to be a bit more secure, I can put a random string at the end, like:
myserver.com/index.php/update/asdfh2784fufds
and check for that string in my CI controller.
This seems mostly secure, but it doesn't feel like the "right" way to do things.
I've looked into running CI from the command line, and can execute basic pages like:
php index.php mycontroller
But when I try to do:
php index.php update
It doesn't work. I suspect this is because it needs to use HTTP to scrape the DOM of the outside page.
So, my question:
How do I securely run a codeigniter script with a cron job that needs HTTP access?
You have a couple of options. The easiest would be to have your script ensure that $_SERVER['REMOTE_ADDR'] is coming from the same machine before executing.
Another would be to use HTTPS and have wget or curl use HTTP authentication.
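A sketch of that first check, assuming the cron job runs on the web server itself (note that SERVER_ADDR may not be populated under every SAPI):
// allow only requests that originate from this machine
$caller = isset($_SERVER['REMOTE_ADDR']) ? $_SERVER['REMOTE_ADDR'] : '';
$self   = isset($_SERVER['SERVER_ADDR']) ? $_SERVER['SERVER_ADDR'] : '127.0.0.1';
if ($caller !== $self && $caller !== '127.0.0.1') {
    show_404(); // CodeIgniter helper: respond as if the page does not exist
    exit;
}
// ...continue with the scraping logic...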
What exactly went wrong?
What error did it throw?
I have used CI from the command line before without any problems.
Don't forget that if you are not in the folder where the script is located, you need to specify the full path to it.
something like
php /path/to/ci_folder/index.php update
Also, in your controller you can add:
if ($this->input->is_cli_request())
{
    // run the script
}
else
{
    // echo some message saying access is not allowed
}
This will run what is needed only when the PHP script is invoked from the command line.
Hope it helped.
