What is the proper way, using PHP, to record web page views? I believe that currently we just record a view each time a page is hit, but I am assuming that includes hits from bots and other things we don't want to be recording.
How can we record only real, legitimate views in our DB and not include things that shouldn't be counted as an actual page view?
Thanks!
Use Google Analytics.
To set up the web tracking code:
Find the tracking code snippet for your property.
Sign in to your Google Analytics account, and select the Admin tab.
Find your tracking code snippet.
Copy the snippet.
Paste your snippet (unaltered, in its entirety) into every web page you want to track.
Check your setup.
1) Ignore any known bots that visit your web page (best handled via robots.txt).
2) Use an Ajax call at the end of the page to get rid of bouncers (visitors who open the page by mistake and close it before everything has loaded).
3) You can fire that Ajax call after a short delay, so robots will already have left your page while a real visitor is still browsing it (see the sketch below).
4) Record IP addresses and (if possible) some device identifier to find unique visitors.
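A rough sketch combining these points, assuming a small count-view.php endpoint, a MySQL page_views table, and a simple user-agent bot filter (all names here are illustrative, not a complete solution):
<?php
// count-view.php -- called from the page via a delayed AJAX request, e.g.
//   setTimeout(function () { fetch('/count-view.php?page=' + encodeURIComponent(location.pathname)); }, 3000);
// Bots rarely execute JavaScript and bouncers rarely stay 3 seconds, so most never reach this script.

$userAgent = $_SERVER['HTTP_USER_AGENT'] ?? '';

// Crude bot filter on top of robots.txt: known bots usually identify themselves.
if ($userAgent === '' || preg_match('/bot|crawl|spider|slurp/i', $userAgent)) {
    http_response_code(204);
    exit;
}

$page = $_GET['page'] ?? '/';

// Assumed table: page_views(page VARCHAR, ip VARCHAR, user_agent VARCHAR, viewed_at DATETIME)
$pdo  = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO page_views (page, ip, user_agent, viewed_at) VALUES (?, ?, ?, NOW())');
$stmt->execute([$page, $_SERVER['REMOTE_ADDR'], $userAgent]);

http_response_code(204); // nothing needs to be sent back
Counting unique visitors is then a SELECT COUNT(DISTINCT ip) over that table.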
Related
I use Google Analytics on my site and now I need to track referrer URLs from some customers, i.e. I need to see the exact URL they come from. My clients use a special URL when referring to my site, so I set up a capture of HTTP_REFERER in the script on that landing page. The URL the customers use is unique, so there's no chance that a user accidentally types it.
The problem I face is that when I record HTTP_REFERER to my database I get a lot of calls without a referrer, but with a correct GUID and other parameters. That means I know they came from e.g. www.client1.com, but I need to know the exact referrer, e.g. www.client1.com/objectx.
When I compare my database with Google Analytics, the difference is huge as well. Last month Analytics said that about 1100 clicks came from my clients, but the clicks recorded from my landing page are 5400. I can't see what could be wrong with my capture. It works quite simply:
The client has a link to my page with a unique GUID, e.g. www.mypage.com/api/123d-213-12321-23434. When the request hits my site, I check the GUID to see if it's a valid customer, save the click in my DB along with HTTP_REFERER, and then redirect them to my start page.
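For reference, a landing script along those lines might look roughly like this (a sketch only; the customer lookup, the clicks table, and the redirect target are assumptions):
<?php
// api/index.php -- hit as www.mypage.com/api/123d-213-12321-23434
$guid     = basename($_SERVER['REQUEST_URI']);   // the GUID from the URL
$referrer = $_SERVER['HTTP_REFERER'] ?? '';      // empty when the browser sends no referrer at all

$pdo = new PDO('mysql:host=localhost;dbname=tracking', 'user', 'pass');

// Is this a valid customer GUID? (assumed customers table)
$check = $pdo->prepare('SELECT id FROM customers WHERE guid = ?');
$check->execute([$guid]);

if ($check->fetch()) {
    // Assumed table: clicks(guid, referrer, clicked_at)
    $log = $pdo->prepare('INSERT INTO clicks (guid, referrer, clicked_at) VALUES (?, ?, NOW())');
    $log->execute([$guid, $referrer]);
}

header('Location: https://www.mypage.com/');     // on to the start page
exit;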
Does anyone have any idea why they differ so much, and which one is correct?
I'm an affiliate marketer with essentially no programming experience, but I need a little help and I'm hoping you guys can help me out.
Currently I use a simple PHP redirect so that my advertiser can't see the referring URL of the site I have my ads on. (I assume this accomplishes that?)
<?php
header("Location: DESINATION URL");
exit;
?>
The ad server that I use allows me to target or exclude my ad campaigns to/from any user that has fired a particular pixel (iframe or script). The idea is that with this re-targeting capability, you can target your ad campaigns to people who visited a particular website in the past that you have a pixel on.
So, what I need to do is not only redirect users, but I want to figure out how to simultaneously fire either the iframe or script pixel. Here's the idea. Someone that clicks on one of my ads but doesn't end up making a purchase isn't someone that I want to serve ads to over and over again. Once they click, I want them to no longer be eligible to see one of my ads. The only way to do this is to fire a pixel for that user when they click, so that I can then use this retargeting feature of the ad server I use to exclude those users from my campaigns. Make sense?
User lands on the page
pixel fires
page redirects to destination URL
Is this possible?
No, a header-based redirect will ALWAYS show the URL to the client. It's essentially telling the browser, "hey, go over here and fetch this address." Your server isn't fetching the contents of that address; it's just telling the browser where to go to fetch it itself.
What you want is a proxy - your server fetches the URL and sends the results back to the client.
And what do you mean, "fired a pixel"? Loading a web bug?
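For the "fire a pixel, then redirect" part of the question, one common pattern is to serve a tiny HTML page instead of a Location header, so the pixel gets a chance to load before the browser moves on. A sketch, where the destination, delay, and pixel URL are placeholders/assumptions:
<?php
// redirect-with-pixel.php -- fire the retargeting pixel, then send the visitor on.
$destination = 'https://example.com/landing';         // assumed destination URL
$pixelUrl    = 'https://adserver.example.com/pixel';  // assumed iframe pixel from the ad server
?>
<!DOCTYPE html>
<html>
<head>
    <!-- fallback redirect after 1 second, in case JavaScript is disabled -->
    <meta http-equiv="refresh" content="1;url=<?php echo htmlspecialchars($destination); ?>">
</head>
<body>
    <!-- the retargeting pixel supplied by the ad server -->
    <iframe src="<?php echo htmlspecialchars($pixelUrl); ?>" width="1" height="1" style="display:none"></iframe>
    <script>
        // give the pixel a moment to load, then redirect
        setTimeout(function () {
            window.location.replace(<?php echo json_encode($destination); ?>);
        }, 500);
    </script>
</body>
</html>
Note that with this approach the browser will typically report this intermediate page, not the original site, as the referrer.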
I have user profiles on my site. Users can make a profile public by checking a checkbox (making it searchable via a search engine) and uncheck the box to block the page from being found by search engines. The site is in PHP CodeIgniter.
How is this accomplished? I am especially lost on what happens when the user unchecks the box to make the page private again: how is that done, and how can it take effect in as close to real time as possible? Good examples are the profiles on Facebook or LinkedIn.
This isn't secure, but you could check the referring URL of the visitors and allow/deny their requests by looking for a search engine's address. The results would still show up in Google, and there would be page caching (which you can kinda stop with a <meta> tag).
Basically, warn the users that when they make the page public, it's not that easy to make it private after that. I'd make it a tedious and painful process, as people will complain that "your site is brokeded".
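A quick illustration of the referrer check mentioned above (again, not secure, and the engine list is only an example):
<?php
// Deny visitors arriving from search engine results when the profile is private.
$referrer = $_SERVER['HTTP_REFERER'] ?? '';

if (preg_match('~(google|bing|yahoo|duckduckgo)\.~i', $referrer)) {
    header('HTTP/1.1 403 Forbidden');
    exit('This profile is private.');
}
?>
<!-- and to discourage indexing/caching in the first place: -->
<meta name="robots" content="noindex, noarchive">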
The fastest way to get pages automatically removed from Google:
set a new last-modified date for that specific page
return an HTTP 410 for the page URL (you can still display content on it, or make the HTTP 410 specific to the Google user agent, to requests without cookies, or to requests without a language header)
put the URL into a sitemap.xml (with the new last-modified date)
ping the sitemap.xml to Google
With this method you can make pages disappear from Google SERPs in 1 to 2 hours (if your site gets crawled regularly).
Note: if you want to kick out thousands and thousands of pages at once, this process will take longer.
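A minimal sketch of the 410 step in plain PHP (the profiles table and is_public flag are assumptions; the same idea drops straight into a CodeIgniter controller):
<?php
// profile.php?user=123 -- public profiles render normally; private ones answer 410 Gone.
$pdo  = new PDO('mysql:host=localhost;dbname=site', 'user', 'pass');
$stmt = $pdo->prepare('SELECT username, is_public FROM profiles WHERE id = ?');
$stmt->execute([(int)($_GET['user'] ?? 0)]);
$profile = $stmt->fetch(PDO::FETCH_ASSOC);

if (!$profile || !$profile['is_public']) {
    // 410 tells Google the page is intentionally gone, which gets it dropped
    // from the index much faster than a 404; you can still render content for humans here.
    http_response_code(410);
    header('X-Robots-Tag: noindex');
    exit('This profile is not public.');
}

echo htmlspecialchars($profile['username']);  // ...render the public profile as usual...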
I need a way to count how many times a link is being clicked, and I was thinking of creating a PHP script to redirect through and do the counting. Is there a better way to do this? How would I count each time a user visits the link, and would it be best to save it in the database somewhere? Any suggestions?
Yes, it must be a PHP script - JavaScript for example won't work all the time.
So - instead of a link to
http://some.site.com/page2.php
You would link to
http://some.site.com/redirect.php?page2.php
And in redirect.php you would track the values, for example in a database, and at the end send this header:
header("Location: http://some.site.com/".$_SERVER["QUERY_STRING"]);
To redirect to the path after ?...
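Filled out, redirect.php might look something like this (a sketch; the clicks table and the whitelist are assumptions, and validating the target matters because otherwise the script becomes an open redirect):
<?php
// redirect.php?page2.php -- log the click, then forward to the requested page.
$target = $_SERVER['QUERY_STRING'];

// Only allow known local pages, otherwise anyone could bounce visitors anywhere.
$allowed = ['page2.php', 'page3.php'];
if (!in_array($target, $allowed, true)) {
    $target = 'index.php';
}

// Assumed table: clicks(target VARCHAR, ip VARCHAR, clicked_at DATETIME)
$pdo  = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass');
$stmt = $pdo->prepare('INSERT INTO clicks (target, ip, clicked_at) VALUES (?, ?, NOW())');
$stmt->execute([$target, $_SERVER['REMOTE_ADDR']]);

header('Location: http://some.site.com/' . $target);
exit;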
Yeah, logs might work too... a little bit more work, though, and it is also very server-specific.
I would analyze your web log files as this will work whether it's a static page or a script.
If the page you need to count is a script, you could insert code that updates a table.
Website statistics is a big industry and there are many free and pay solutions out there to explore and get ideas from.
If you need to track clicks on a specific link, then you'll probably need to use JavaScript to capture the click and send a notification to a tracking server. If you need to track page views, then you're best off looking at your server logs. Remember that a page can have many links pointing to it; you have to differentiate between link click events and page impressions. Another possibility, depending on your application, is to use Google tracking or a similar third-party tracking app.
I have an affiliate tracking script that is currently being exploited by an affiliate. On the main site, I track the affiliate clicks using this URL: www.example.com?member=affiliatecode,
then I capture the query string with $_GET['member'];
Problem is, an affiliate is exploiting this simple system, and page loads on his site are being recorded as clicks going to mine.
What measures can I add to prevent this without changing the affiliate link to my site? An idea that I had is to check if my page has actually been loaded, but how would I do that in PHP?
Thanks
I don't quite grasp the exact problem (or more to the point, exactly how the affiliate is logging hits), but a solution may be to put an image on your page, which should ensure that a browser has actually loaded it. So at the bottom of your page you could insert:
<?php
if (isset($_GET['member'])) {
    // only emit the tracking pixel when an affiliate code is present
    echo '<img src="affimg.php?member=' . urlencode($_GET['member']) . '" width="1" height="1" alt="">';
}
?>
And then in affimg.php, you would log the affiliate and output a 1x1 image (remembering to set the headers). The downside is that there is no way to stop an affiliate from simply putting that image into his own page, and if the affiliate is using an iframe, the image would still be loaded.
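A rough affimg.php along those lines (the table is an assumption; the base64 string is a standard 1x1 transparent GIF):
<?php
// affimg.php -- log the affiliate hit, then return a 1x1 transparent GIF.
if (isset($_GET['member'])) {
    // Assumed table: affiliate_views(member VARCHAR, ip VARCHAR, viewed_at DATETIME)
    $pdo  = new PDO('mysql:host=localhost;dbname=affiliates', 'user', 'pass');
    $stmt = $pdo->prepare('INSERT INTO affiliate_views (member, ip, viewed_at) VALUES (?, ?, NOW())');
    $stmt->execute([$_GET['member'], $_SERVER['REMOTE_ADDR']]);
}

header('Content-Type: image/gif');
header('Cache-Control: no-store');
echo base64_decode('R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7');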
A better way may be to simply do some tracking. Rather than just requiring that one page gets visited, change it so that you require two or more, using a database to track the IP addresses.
There may be a better way, but then I don't know the exact details.
First, you can never be sure whether a bot/script instead of a human "clicks" on an image; that is a fact. Secondly, you can make things a bit more difficult. An idea would be:
Deliver a banner containing a unique link that is triggered via a JavaScript click event, like (where click.php and UNIQUE_TOKEN are placeholders for your landing script and the generated token):
<a href="#" onclick="location.href='http://www.targetsite.com/click.php?token=UNIQUE_TOKEN';return false;"><img src="http://www.targetsite.com/image.jpg" /></a>
Save the token in your database beforehand and give it an expiration time of a few minutes. Then only count the click if the token is still valid. That way your affiliate has to trigger the onClick event or parse the source code to extract the token.
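A sketch of the token half of that idea (the table name and expiry window are assumptions):
<?php
// When building the banner markup: create a short-lived token for this impression.
$pdo   = new PDO('mysql:host=localhost;dbname=affiliates', 'user', 'pass');
$token = bin2hex(random_bytes(16));

// Assumed table: click_tokens(token VARCHAR(32), expires_at DATETIME)
$pdo->prepare('INSERT INTO click_tokens (token, expires_at) VALUES (?, NOW() + INTERVAL 5 MINUTE)')
    ->execute([$token]);
// ...echo the banner link above with UNIQUE_TOKEN replaced by $token...

// Later, in click.php on targetsite.com: count the click only while the token is still valid.
$stmt = $pdo->prepare('DELETE FROM click_tokens WHERE token = ? AND expires_at > NOW()');
$stmt->execute([$_GET['token'] ?? '']);
if ($stmt->rowCount() === 1) {
    // fresh, unused token inside its time window: record the click as genuine
}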
As said, it only makes things more difficult. You could also parse your affiliate's site source to see whether your banner is being "clicked" automatically there (which would be very cheeky).
Another addition would be to read a cookie on the client side and attach it to the generated link, to check whether the client has already requested your target site.
Since you can't protect yourself completely from fakes, you can build several little tools like these that increase safety.
HTH