I'm thinking of a cron job that executes a PHP script daily.
This script would call file_get_contents() on a URL I assign.
Can I do this to simulate a user's visit?
Does it count as a visit?
$page = file_get_contents('http://www.example.com/');
echo $page;
You can "simulate" this kind of action, but it's better to be done with curl. Also to do this I would recommend going through this stackoverflow post, it explains all the variables which are needed to be supplied by doing a server side request, rather than opening the page through a browser and loading the analytics js.
If your visit is counted by Google Analytics implemented through javascript, then no, it won't work - file_get_contents() doesn't run any javascript, just downloads the file. However, you could do it by sending page view through PHP: https://developers.google.com/analytics/devguides/collection/protocol/v1/devguide
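A minimal sketch of sending such a pageview hit with cURL, based on the Measurement Protocol guide linked above; the tracking ID, client ID and page path below are placeholders you would replace with your own values:
// Build the Measurement Protocol parameters for a pageview hit.
$params = http_build_query([
    'v'   => 1,             // protocol version
    'tid' => 'UA-XXXXX-Y',  // placeholder tracking ID
    'cid' => '555',         // placeholder anonymous client ID
    't'   => 'pageview',    // hit type
    'dp'  => '/index.php',  // document path being "visited"
]);
// POST the hit to the Measurement Protocol endpoint.
$ch = curl_init('https://www.google-analytics.com/collect');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, $params);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_exec($ch);
curl_close($ch);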
Note that JavaScript gathers more information about the user than PHP can, so use this with caution.
If your intention is just to check whether the script ran, there are better and easier ways, such as logging every request (or requests from a specific IP) to your database or a text file.
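For instance, a one-line log entry per request could be enough; this is just a sketch and the visits.log filename is arbitrary:
// Append a timestamp and the caller's IP to a plain text log.
$logLine = date('c') . ' ' . ($_SERVER['REMOTE_ADDR'] ?? 'cli') . PHP_EOL;
file_put_contents(__DIR__ . '/visits.log', $logLine, FILE_APPEND | LOCK_EX);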
Everything I suggested here presumes you can edit the opened file - your http://www.example.com/index.php. If you need to trigger Google Analytics on a site you can't edit, you need something far more robust, like a web crawler or scraper that can execute the JavaScript. For inspiration: http://www.jacobward.co.uk/using-php-to-scrape-javascript-jquery-json-websites/
I'm trying to force a (second) page to reload if a criterion is met in PHP.
If the criterion is met, I want the page to force-reload everywhere, even if, for example, 10 people have it open at once.
For simplicity, let's say the code is like this:
in /filelocation/script.php:
if {$data == "ok"}{
reload/refresh "reload.php" if it's open somewhere;
}
I came across a piece of software that basically does this, and I want to understand how it's done.
(It works across devices somehow, so I assume it's done through PHP somehow.)
Well, in your PHP code, you cannot simply reload/refresh something for all the connected users. This is because PHP code only runs when a browser requests a page from the server: it executes to build the HTML response and then stops. Once the browser has the HTML response, it renders the page and waits for the user to do something else (such as clicking a link or posting a form).
I imagine that when a specific user does something, like posting a comment or buying a product, you would like all the other visitors to be notified that a new comment has been posted or that the number of available products has gone down.
To do that, you need to implement some JavaScript which is executed in the browser of each visitor. The idea is to keep a connection with the server with the help of web sockets. This way, you can inform the browser that something has changed.
You could google to find some examples of PHP apps using web sockets. The first example I found:
https://www.twilio.com/blog/create-php-websocket-server-build-real-time-even-driven-application
Another solution could be to have some JavaScript doing polling, meaning that every N seconds it executes an Ajax request to the server to ask whether something has changed. This can be done with setInterval(yourFunction, 10000) (or a recursive setTimeout) to call a JavaScript function every 10 seconds. This function does the Ajax request and then updates the part of your page that needs to change. Just be careful: if you get a lot of users on your site, you'll produce quite a lot of load on your server. So this isn't a great solution, but it could be an alternative to web sockets.
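On the server side, the endpoint that such a poll hits could look roughly like the sketch below; the check_updates.php name and the flag-file mechanism are just assumptions for illustration:
// check_updates.php - hypothetical endpoint polled by the browser via Ajax.
// Returns a small JSON flag telling the client whether it should reload.
header('Content-Type: application/json');
// Assumption for illustration: a flag file is touched whenever the
// criterion is met; the client sends the timestamp of its last check.
$lastCheck = isset($_GET['since']) ? (int)$_GET['since'] : 0;
$changedAt = file_exists('reload.flag') ? filemtime('reload.flag') : 0;
echo json_encode([
    'reload' => $changedAt > $lastCheck,
    'now'    => time(),
]);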
I used cURL to log into a website. The natural question is how to perform clicks on buttons and then eventually log out. For example, JavaScript uses the click() function. What does PHP use? Thanks for clues.
I am following a book on web scraping. In it the author logs into its publisher's website. The book is old and out of date. Moreover, it says nothing about logging out. This is the publisher: https://www.packtpub.com/
You can't click a button using PHP alone. PHP doesn't work like that. PHP can download the HTML of a webpage, but it can't perform actions like a browser can.
If you want to do that, you will need a headless browser - essentially an invisible browser that can do most of the things a regular browser can. PhantomJS and CasperJS are two options for this.
There are also PHP libraries that use PhantomJS. For example PHP PhantomJS. Personally, I've never done this with PHP, but I do use PhantomJS and CasperJS on a regular basis.
As an alternative, what you can do with PHP is parse the DOM for links or buttons and replicate the HTTP request that is made when clicking them.
For example, if there's a link that goes to /contactus, you simply create a GET request to this page using cURL. The response will be the source code and/or headers.
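Replicating that click with cURL might look like the sketch below; the URL and the cookie-jar path are placeholders, and the cookie file is assumed to hold the session saved during your cURL login:
// Replicate the browser's GET request to the link target,
// reusing the cookies saved during the cURL login.
$ch = curl_init('https://www.example.com/contactus');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);            // return the body instead of printing it
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);            // follow redirects like a browser would
curl_setopt($ch, CURLOPT_COOKIEFILE, '/tmp/cookies.txt');  // send the stored session cookies
curl_setopt($ch, CURLOPT_COOKIEJAR, '/tmp/cookies.txt');   // keep them updated
$html = curl_exec($ch);
curl_close($ch);
// $html now holds the page source; headers can be captured with CURLOPT_HEADER if needed.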
I am currently working on a project that uses CasperJS, PHP and Redis to create a rather complex scraper/automation/analysis tool for a large social network.
As a side note, some sites rely heavily on JavaScript and using cURL may not be enough. You can get around this by parsing the JavaScript file/s, and some other advanced magic, but believe me you do not want to go down this route. Which is why I use CasperJS on occasions. It's slower, but that's all we've got at the moment.
As for the logging out ... delete your cookies file. Done.
I recently published a project that gives PHP access to a browser. Get it here: https://github.com/merlinthemagic/MTS. Under the hood is an instance of PhantomJS, as others have suggested; this project just wraps the functionality.
After downloading and setting it up, you would simply use the following code:
$myUrl = "http://www.example.com";
$windowObj = \MTS\Factories::getDevices()->getLocalHost()->getBrowser('phantomjs')->getNewWindow($myUrl);
//select the username input field, in this case it has id=username
$windowObj->mouseEventOnElement("[id=username]", 'leftclick');
//type your username
$windowObj->sendKeyPresses("yourUsername");
//select the password input field, in this case it has id=passwd
$windowObj->mouseEventOnElement("[id=passwd]", 'leftclick');
//type your password
$windowObj->sendKeyPresses("yourPassword");
//click on the login button, in this case it has id=login
$windowObj->mouseEventOnElement("[id=login]", 'leftclick');
//click on all the buttons you need with this function
$windowObj->clickElement("[id=someButtonId]");
$windowObj->clickElement("[id=someOtherButtonId]");
//if you want the DOM or a screenshot at any point, run:
$dom = $windowObj->getDom();
$imageData = $windowObj->screenshot();
Lately I have been working on a PHP browser-like program. The goal of this program is to let users browse only 'safe' web sites through the platform; it should be able to detect an adult site and not display it.
Unfortunately, there are two major problems:
Cookies - users can't log in to different sites while using this platform.
Security redirects - some sites check the URL, either in PHP or JS, and then redirect to their own page.
So I simply thought about plan B:
I was thinking about using an iframe and building the whole program in JavaScript and Ajax! But unfortunately, iframes are heavily locked down and I can't touch anything inside them.
- and there goes plan B.
My question is: is there anything you can think of, or any advice, that can help build a PHP/JavaScript+Ajax browser-like program?
For the PHP side you'll need to use cURL. You'd probably want to change the HTML on the server side. Take a look at this question: Is there a PHP HTML tag library?
For checking whether the site is adult, you should just pass the domain through a database of adult sites.
For the JavaScript side I don't know of any pre-made browsers. You'll probably have to build the blocking in yourself; it shouldn't be too hard.
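A minimal sketch of that check, assuming the "database" is just a local text file (blocked_domains.txt, hypothetical) with one domain per line:
// Extract the domain of the requested URL and compare it to the blocklist.
$requestedUrl = $_GET['url'] ?? '';
$domain  = parse_url($requestedUrl, PHP_URL_HOST);
$blocked = file('blocked_domains.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
if (in_array($domain, $blocked, true)) {
    die('This site is not allowed.');
}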
Update
Basic structure (a rough sketch of the PHP side follows below):
The JS client makes an Ajax request to the PHP server using GET or POST (e.g. "url=site.com/page/foo.html").
PHP reads the URL from GET or POST.
PHP uses cURL to get the page contents.
PHP parses the HTML and rewrites URLs, or JS prevents link presses and sends the href="" back to the server via Ajax (back to the first step): Is it possible to stop redirection to another link page?
PHP echoes out the page.
JavaScript places it in the display.
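A stripped-down sketch of the PHP side; the script name browser.php, the url parameter and the link-rewriting regex are assumptions for illustration:
// browser.php - receives the target URL from the JS client.
$url = $_GET['url'] ?? '';
// Fetch the page with cURL.
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
$html = curl_exec($ch);
curl_close($ch);
// Rewrite absolute links so they come back through this script.
$html = preg_replace_callback(
    '/href="(https?:\/\/[^"]+)"/i',
    function ($m) {
        return 'href="browser.php?url=' . urlencode($m[1]) . '"';
    },
    $html
);
// Echo the result for the JS client to place in the display.
echo $html;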
I know my answer is too late; I'm posting it so it may help someone. There is a simple solution for creating a complete PHP browser. Here is the link: http://sourceforge.net/projects/snoopy/
Do they use a PHP page to analyze the link and return all of the images as JSON?
Is there a way to do this with just JavaScript, so you don't have to go to the server to analyze the page?
I don't know how they do it. I'd implement a small service for that purpose: given a URL, return some relevant image (or generate a screenshot). This service could also cache results for better performance. But still, the page needs to be accessed in order to grab the <img src=... or to take the screenshot.
Facebook calls back to the server. If you use Firebug (or, as I did, the Web Inspector in Safari), you can inspect the Ajax calls. Facebook calls back to a script at /ajax/composer/attachment.php - in there is some JavaScript which contains HTML that gets inserted into the page. Here is what it looks like if I point the Facebook attach link dialogue to the BBC News homepage in Safari Web Inspector:
Facebook JavaScript response when you attach a link in Safari Web Inspector http://tommorris.org/files/Facebook-20100529-181745.jpg
I put up the full JavaScript response on Gist (it was all one line and minified originally, so I just ran it through TextMate to wrap it).
I'm not sure if you could do it on the client-side - because of browser protections on cross-site scripting - and even if you could, you probably ought not to because of this potential security problem: imagine if someone puts in a URL that points to a page which only they have access to. You don't necessarily want to put what's on someone else's customised or private page up on your Facebook/Digg type site. Imagine if it was something like Flickr and there were private pics - or worse, a porno site. No, better to proxy it back to your server and then grab the images. Plus, it'll probably be faster. No need to tax your end user's potentially slow connection downloading a page when your server will probably be able to do it quicker...
I have a Google calendar embedded on a webpage, with events related to activities the site is organizing. Some calendar events have links that redirect the user to a page, within the same website, which has more information and the option to enroll in the event.
The problem however, is that since the end of last month, Google imposed a redirect notice that doesn't even automatically redirect. The links I create on events are changed by Google and, once a user clicks on a link, a new tab opens leading to a page with a redirect warning that the user must click. Since I am providing the users with a link to within the same website this is very inconvenient and makes no sense at all.
I'd want the users to be able to click a link on the calendar and go through to the webpage with the relevant data.
Do you guys know how I can get around this warning?
My thought process:
Initially, I thought of using JS to rewrite the links but since the calendar's iframe is in a different domain, the browser won't allow it due to XSS exploits (AFAIK).
I could build my own AJAX calendar and sync it with Google's using the API, but that's a hell of a lot of work because of a stupid "feature" that makes no sense. I like Google's calendar and I'd like to use it.
The third thing I thought of was that, instead of having an iframe with the calendar, I could use AJAX to fetch the entire code at the frame's URL. Then I'd just rewrite the links in that code with JS. Could this work?
I would be REALLY thankful for any help. This is driving me insane!
Using Jon Cram's input, I created a PHP script that parses the code and makes the adjustments. However, I could only get that working for the HTML version. No AJAX for me. =(
The same origin policy will prevent JavaScript served from your domain from interacting with data served from a different domain.
You are therefore right in saying that option 1 won't work.
The same origin policy also applies to option 3 as you have stated it. JavaScript served from your domain won't be able to make a direct HTTP request to whichever domain serves the calendar code.
You will need to acquire and modify the calendar code, neither of which can be achieved with JavaScript using today's most commonly used browsers. When Firefox 3.1 and IE8 are in common use and Google serves the correct HTTP Access Control headers, this could be achieved with JavaScript alone.
To modify code served from another domain, you will need to utilise some form of server-side process.
A server-side script will be able to request the calendar code. The same script can then modify the code as needed and output it in whatever form you require.
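As a rough sketch of that server-side step with PHP and cURL; the calendar URL is a placeholder, and the google.com/url?q= pattern is an assumption about how the redirect notice wraps the links:
// Fetch the embedded calendar's HTML server-side.
$ch = curl_init('https://calendar.google.com/calendar/embed?src=YOUR_CALENDAR_ID');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$html = curl_exec($ch);
curl_close($ch);
// Unwrap Google's redirect-notice links back to the original URLs.
$html = preg_replace_callback(
    '/https?:\/\/www\.google\.com\/url\?q=([^"&]+)[^"]*/i',
    function ($m) {
        return urldecode($m[1]);
    },
    $html
);
// Serve the modified calendar markup from your own domain.
echo $html;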
If it is a private internal site, you could install Greasemonkey on all clients (if they use Firefox) and write a short script that fixes the URLs. That only works if the original URL is contained within Google's redirect URL, though.
If I had this problem I would change the calendar provider; that's probably the easiest solution. I did a Google search and found Kiko - it looks like they might have what you need.
Simply remove the "http://" part of the URL. I am not sure why this works but it does!