I am using a simple bit of PHP code in my header file which increments a counter in my SQL database by 1 once per PHP session. I've tested this and it works fine.
However, when I leave it for a day the counter goes up by far more than I'd expect, and it is much higher than the pageview count in my Google Analytics.
What could be happening and how could I stop this?
Google Analytics counts visits very differently from a simple session-based counter. I can't tell you exactly how it counts, because it is closed source on that aspect, but cookies, sessions and JavaScript are definitely involved.
If you want my opinion: I built my own stats system once, and it was hell dealing with robot detection, trends and false visits. I switched to GA, and it was worse, because the client then started complaining that the numbers weren't the same in both systems.
IMO? Don't use both. Make your own or use GA only, but not the two together; you'll probably NEVER get the same numbers.
Good luck
What do you mean by "once per session"?
You need to call session_start() and then set a flag so that you only count each session once:

if (!isset($_SESSION['started'])) {
    doHitCounter();
    $_SESSION['started'] = true;
}
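Putting it together, the once-per-session logic can be factored into a small helper. This is only a sketch: the function name and its arguments are illustrative, and the callable stands in for whatever your real doHitCounter() does against the table. Note also that crawlers that don't keep cookies start a brand-new session on every request, which alone can inflate a counter like this well past what GA reports.

```php
<?php
// Sketch: count a visit only once per PHP session.
// countOncePerSession() and its parameters are illustrative names;
// $doHitCounter stands in for your real UPDATE query.
function countOncePerSession(array &$session, callable $doHitCounter): void
{
    if (!isset($session['started'])) {
        $doHitCounter();            // e.g. UPDATE stats SET hits = hits + 1
        $session['started'] = true; // this session is now counted
    }
}

// In the header it would be used as:
// session_start();
// countOncePerSession($_SESSION, 'doHitCounter');
```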
Related
Do you use cookies or sessions to time out or lock someone out of your website after they have used it for 2 hours? I want it to lock and make them wait an hour before they can use it again.
I got the idea from this website that had a message pop up and say
"you have used this site for 2 hours, please wait an hour and come back or become a member and get unlimited hours"
Two approaches:
The simple one, in which you set a cookie or session (both use cookies in the end, so it hardly matters) to identify a user. The problem with this is that a user may simply discard his cookies or use a different browser, so this solution will only ever work for rather clueless users. It is probably good enough for your use case.
Go crazy with fuzzy identification of users via neural networks, which might allow you to identify users fairly uniquely in a way that would not let them "change their identity" easily. This is a really complex solution, though, and may be overkill or unrealistic for your purposes.
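For the simple approach, the time bookkeeping can be kept separate from the cookie handling. A minimal sketch, assuming a 2-hour use window and a 1-hour lockout; the constants, function name, and cookie wiring are all illustrative:

```php
<?php
// Sketch of the simple cookie-based lockout: 2 hours of use, then a
// 1-hour wait. Names and durations here are assumptions.
const USE_LIMIT = 7200; // 2 hours of allowed use, in seconds
const LOCK_TIME = 3600; // 1 hour of lockout, in seconds

// Classify a visitor by how long ago their first visit was.
function visitState(int $firstVisit, int $now): string
{
    $elapsed = $now - $firstVisit;
    if ($elapsed < USE_LIMIT) {
        return 'allowed';
    }
    if ($elapsed < USE_LIMIT + LOCK_TIME) {
        return 'locked'; // show the "come back in an hour" message
    }
    return 'reset';      // lockout is over; start a fresh window
}

// On each page load you would tie this to a cookie, e.g.:
// if (!isset($_COOKIE['first_visit'])) {
//     setcookie('first_visit', (string) time(), time() + 86400);
// }
```

As the answer above says, anyone who clears their cookies escapes this; it only deters casual users.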
This will probably be helpful to you:

if (!isset($_SESSION['last_access']) || (time() - $_SESSION['last_access']) > 60) {
    $_SESSION['last_access'] = time();
}

This updates the session at most once every 60 seconds, ensuring the session's modification date keeps being refreshed.
Please visit the link below for more details and PHP code:
http://riturajkumar12.blogspot.in/2014/04/expire-session-automatically-after.html
Basically, I am trying to make a split-test web app, and I'm a bit confused about how to do this without any race conditions.
Basically, there are 3 pages:
main_page.php
page1.php
page1_alt.php
So the process is as follows:
user visits main_page.php
main_page.php checks for a cookie
a. if there is no cookie, create one
b. check which page the last visitor was sent to
c. send the current visitor to the other page (if the last visitor went to page1.php, send this one to page1_alt.php)
I have the cookie issue sorted out; I just want to know what you think is the best method for step 2b. If I write to the DB, it would be impractical. If I use a text file, it could produce race conditions.
EDIT: If you think there's an easier way than starting from scratch, do give me some suggestions. :)
I'm not sure I completely understand why you want to do this, or why writing to the DB would be impractical. If you want to split them with exact precision, this seems to me to be the best possible solution.
If however you just need to split them approximately without DB-access, you could be creative:
if (date('s') % 2 == 1) {
    header("Location: page1.php");
} else {
    header("Location: page1_alt.php");
}
Of course there is some chance involved in this, but since seconds are equally distributed, if you have enough users, the two groups should be quite close to equal size.
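If DB writes turn out to be acceptable after all, a single atomic UPDATE gives an exact alternating split with none of the read-then-write race a text file would have. The table and column names below are illustrative, and the LAST_INSERT_ID() trick is MySQL-specific:

```php
<?php
// Exact alternating split via an atomic counter. The SQL is shown as a
// comment because it depends on your schema; names are assumptions.
//
//   UPDATE split_counter SET n = LAST_INSERT_ID(n + 1);
//   -- then read the new value with SELECT LAST_INSERT_ID()
//
// Mapping the visitor number to a page is then deterministic:
function pageForVisitor(int $visitorNumber): string
{
    return ($visitorNumber % 2 === 1) ? 'page1.php' : 'page1_alt.php';
}
```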
I know Google Analytics can do split testing. I'm not sure what your end goal is, but GA will split the users and test which page gets users to the goal most often. Sorry if this wasn't what you were looking for.
Consider using sessions instead of cookies. Sessions are a more flexible and secure way to track user activity.
You should be using sessions and header("Location: ...");. From that you can do something like this:

if (!isset($_SESSION["var"])) {
    header("Location: main_page.php");
}

Where $_SESSION["var"] is whatever you want to set it to; in your case you could set it to the page they visited last. Then you can check whether it exists and/or what it holds, and use header() to send them to the appropriate page.
I wonder if the following would be a good idea or rather counterproductive, performance-wise:

An Ajax application, for example a page browser, needs some language and config values, which are stored in the database. Each time the user uses the app, the Ajax script runs the MySQL query for those values again and again. For a page browser that might be 10 or more requests (back and forward, back, forward, and so on), i.e. 10 database SELECTs, when only one is actually needed.

My idea was to save the config vars in a session array the first time the Ajax app is requested. If the session array already exists, the MySQL query isn't run again.

If the user calls another regular page, this session array is deleted again.

Now I'm not really sure which would consume more server resources: using sessions as described above to save the vars temporarily, or just running a MySQL query to get the vars each time the user clicks in the Ajax app.

Thanks in advance, Jayden
If you're working with a massive amount of data, you could also consider using cookies instead of sessions to spare server resources, since cookies are stored in the user's browser.
I'd bet sessions would be more efficient, but the best way to find out is to test and measure the different execution times.
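The session-caching idea from the question can be sketched like this; the function names are illustrative, and $loadConfigFromDb stands in for the real SELECT:

```php
<?php
// Sketch: run the config SELECT once per session, reuse it afterwards.
// getConfig() and $loadConfigFromDb are illustrative names.
function getConfig(array &$session, callable $loadConfigFromDb): array
{
    if (!isset($session['config'])) {
        $session['config'] = $loadConfigFromDb(); // first Ajax hit only
    }
    return $session['config']; // later hits skip the database entirely
}

// In the Ajax script:
// session_start();
// $config = getConfig($_SESSION, 'loadConfigFromDb');
// A regular page can invalidate the cache with:
// unset($_SESSION['config']);
```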
On one of my pages, users queue up search terms to be queried from a 3rd-party API. As they build the queue, my site makes the queries in the background (through Ajax) so I can cache the responses, saving them time when they submit. I store a session variable $_SESSION['isloading'] as true while the background queries are running, and it's false when they're done.
When they submit, the results page waits for $_SESSION['isloading'] to be false before querying the cache for result. Meanwhile they're shown a progress wheel.
Is there a name for this technique of using a session to locally "lock" a user before proceeding to the next step in the code? I came up with this approach on my own and was wondering if it is a common (or good) solution to this problem, and is used elsewhere.
Putting it in $_SESSION will be wasted effort. Been there, done that, and it didn't work out.
You will be much better off providing your search query string as a $_GET variable to your XHR (marketing people call it "Ajax").
Off the top of my head, this sounds a little similar to the way some older forum software performs forum searches in the background, and the visible page does a repeated refresh until the background search is complete.
I don't think there's a name for it; I'm also not entirely convinced that it's a great solution. As stevecomrie pointed out, you're going to run into difficulties with concurrency (unless the session variable's name is unique per search query).
I'd instead recommend an XMLHttpRequest (as teresko points out, it's not really called "AJAX", ugh!), and you can handle the "waiting" quite simply with JavaScript.
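One way to do that waiting without blocking on $_SESSION['isloading'] is to have the page poll a small status endpoint. A sketch in PHP, where cacheCount() and queueCount() are assumed helpers reporting how many of the queued queries have been cached so far:

```php
<?php
// Sketch of a progress payload for a polled status endpoint.
// progress(), cacheCount() and queueCount() are illustrative names.
function progress(int $cached, int $queued): array
{
    return [
        'done'    => $queued > 0 ? $cached >= $queued : true,
        'percent' => $queued > 0 ? (int) floor(100 * $cached / $queued) : 100,
    ];
}

// The endpoint would emit JSON for the progress wheel to poll:
// header('Content-Type: application/json');
// echo json_encode(progress(cacheCount(), queueCount()));
```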
I asked about this on IRC (hat tip to ##php on freenode), and they suggested I just make the search form and search results one page. Then, when they're done entering their searches, the view changes rather than submitting to the next page. This removes the need to track an 'isloading' state. This seems like a better approach to me; are there any problems with it?
As soon as I added a single cookie to my website, the entire site became very slow on first load.
I guess it's because it is fetching the cookie information, but is this always the case?
There is no heavy code behind fetching the cookie, just plain simple PHP like this:
$arr = $_COOKIE['name']; // array maximum of 10 values
one for loop and nothing else!
Should I be worried?
By "slow" I mean loading takes 3-5 seconds.
Thanks
This is always the case the first time. On subsequent visits the browser gets things from its internal cache and temporary files.
Retrieving a cookie shouldn't slow your site. Something else is going on. You'll need to do some profiling.