I want to build a statistics graph that shows how many users have registered per day, and maybe some other data. I have a MySQL table in which I store the registration date, the username, etc.
How would I build such a graph? What do I need for it?
You don't always need to do things using real graphics.
<?php
// mysql connection setup
// ...
// Get the dates in a single SELECT, last 30 days only.
$result = mysql_query("SELECT regdate FROM users WHERE regdate > NOW() - INTERVAL 30 DAY");
$output = array();
while ($row = mysql_fetch_assoc($result)) {
    // count registrations per calendar day
    $day = date("Y-m-d", strtotime($row['regdate']));
    $output[$day] = isset($output[$day]) ? $output[$day] + 1 : 1;
}
// one row per day: the date, then a red bar whose width is the count
$fmt = ' <tr><td>%s</td><td><div style="width: %dpx; background: #FF0000;">&nbsp;</div></td></tr>' . "\n";
?>
<table border="0" cellspacing="0" cellpadding="0">
<?php
// 2592000 seconds = 30 days, 86400 seconds = 1 day
for ($date = time() - 2592000; $date < time(); $date += 86400) {
    $thisdate = date("Y-m-d", $date);
    printf($fmt, $thisdate, isset($output[$thisdate]) ? $output[$thisdate] : 0);
}
?>
</table>
Untested, obviously. Possibly incomplete. YMMV. Salt to taste.
There are many ways to build a graph. I can think of a few methods; use the one you think best, depending on your knowledge.
In every case you need to query your database, so you'll need the basics of MySQL.
Then you can either create the graph on the server side, by looping through the result set and building the graph out of simple HTML divs or with the GD library.
Or you can send the result set to the browser as a JSON object and create the graph on the client side, using simple HTML divs or the canvas tag; see the sketch below.
Server-side graphs are much simpler, but can't be animated or updated without a page refresh. Client-side graphs require additional knowledge (JSON, security, etc.).
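A minimal sketch of the JSON endpoint for the client-side approach might look like this (the $db connection variable is an assumption; the users/regdate schema comes from the question):
<?php
// sketch: return per-day registration counts as JSON for a client-side graph
$result = mysqli_query($db,
    "SELECT DATE(regdate) AS day, COUNT(*) AS registrations
       FROM users
      WHERE regdate > NOW() - INTERVAL 30 DAY
      GROUP BY DATE(regdate)
      ORDER BY day");
$data = array();
while ($row = mysqli_fetch_assoc($result)) {
    $data[] = $row;
}
header('Content-Type: application/json');
echo json_encode($data);
?>
The client can then fetch this with an AJAX call and draw the bars with divs or a canvas.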
If you only need bar graphs, you might be much better off using divs of a calculated height, all anchored to a bottom line. This is quite trivial to write and uses a whole lot less CPU, RAM and bandwidth.
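A rough sketch of that idea, assuming you already have a per-day $output count array like the one built in the first answer:
<?php
// sketch: vertical bars as divs of calculated height, anchored to a baseline;
// assumes $output maps "Y-m-d" dates to registration counts
$max = empty($output) ? 1 : max($output);
echo '<div style="display: flex; align-items: flex-end; height: 100px;">';
foreach ($output as $day => $count) {
    $height = (int)($count / $max * 100); // scale the tallest bar to 100px
    echo '<div title="' . $day . ': ' . $count . '"'
       . ' style="width: 10px; margin-right: 2px; height: ' . $height . 'px;'
       . ' background: #FF0000;"></div>';
}
echo '</div>';
?>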
I'm writing a chat program for a site that does live broadcasting, and as you can guess with any chat that isn't a native application, it relies on a looping AJAX call to get new information (messages), in my case once every 2 seconds. The JSON that is being created via PHP and populated by SQL is of some concern to me: while it shows no noticeable impact on my server at present, I cannot predict what adding several hundred users to the mix may do.
<?php
require_once("../../../../wp-load.php");
global $wpdb;
$table_name = $wpdb->prefix . "chat_posts";
// cast the GET parameter to an integer to avoid SQL injection
$last = intval($_GET['last']);
$posts = $wpdb->get_results("SELECT * FROM " . $table_name . " WHERE ID > " . $last . " ORDER BY ID");
echo json_encode($posts);
?>
There obviously isn't much wiggle room as far as optimizing the code itself, but I am a little worried about how well the WordPress SQL layer is written and whether it will bog my SQL server down once it gets to the point where it is receiving 200 requests every 2 seconds. Would it be better to cache the JSON-encoded results of the DB query to a file, age-check that file on each call to the PHP script, and then either rebuild it with a new query or serve its contents, based on its last modification date? At that point I am putting a bigger load on my file system, but reducing my SQL load to one query every 2 seconds regardless of the number of users.
Or am I already on the right path with just querying the server on every call?
So this is what I came up with. I went the DB-only route for a few tests, and while the response was snappy, it didn't scale well and connections quickly got eaten up. So I decided to write a quick little bit of caching logic. So far it has worked wonderfully and seems to allow me to scale my chat as big as I want.
// cast to int: this value is used both in a file name and in SQL
$last = intval($_GET['last']);
$cacheFile = 'cache/chat_' . $last . '.json';
if (file_exists($cacheFile) && filemtime($cacheFile) + QUERY_REFRESH_RATE > time())
{
    // cache is still fresh: serve it straight from disk
    readfile($cacheFile);
} else {
    require_once("../../../../wp-load.php");
    // only look at posts from the last two hours
    $timestampMin = gmdate("Y-m-d H:i:s", time() - 7200);
    $sql = "/*qc=on*/" . "SELECT * FROM " . DB_TABLE . "chat_posts"
         . " WHERE ID > " . $last
         . " AND timestamp > '" . $timestampMin . "' ORDER BY ID;";
    $posts = $wpdb->get_results($sql);
    $json = json_encode($posts);
    echo $json;
    file_put_contents($cacheFile, $json);
}
It's also great in that it allows me to run my formatting functions against messages, such as parsing URLs into actual links, with much less overhead; see the example below.
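A formatting pass of that kind (this helper is just an illustration, not the actual code) can run once per message before the JSON is written to the cache file:
// hypothetical example: turn bare URLs in a chat message into clickable links
function linkify($text) {
    return preg_replace(
        '#\bhttps?://[^\s<"]+#i',
        '<a href="$0">$0</a>',
        $text
    );
}
// linkify("see https://example.com for details")
// => 'see <a href="https://example.com">https://example.com</a> for details'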
I'm running a PHP script that pulls data from a MySQL table. I will be running this on a frequently visited server and would like to keep the data in cache for X amount of time, so that once pulled, the data stays saved on the server until that time has passed. Here's the script:
<?php
include('mysql_connection.php');
$c = mysqlConnect();
$locale = $_GET['locale'];
// cast to int to avoid SQL injection
$last_news_id = intval($_GET['news_id']);
sendQuery("set character_set_results='utf8'");
sendQuery("set collation_connection='utf8_general_ci'");
if (strcmp($locale, "ru") != 0)
    $locale = "en";
$result = sendQuery("SELECT * FROM news WHERE id > " . $last_news_id . " and locale = '" . $locale . "' ORDER BY id DESC LIMIT 10");
echo '<table width="100%">';
while ($row = mysqli_fetch_array($result, MYSQLI_NUM))
{
    echo '<tr><td width="100%"><b>Date: </b>' . $row[2] . '</td></tr>';
    echo '<tr><td width="100%">' . preg_replace('/#([^#]*)#(.*)/', ' $1', $row[3]) . '</td></tr>';
    echo '<tr><td width="100%"><hr style="height: 2px; border: none; background: #515151;"></td></tr>';
}
echo '</table>';
mysqliClose($c);
?>
What PHP functions should I use to cache the data? What are the best methods? Thank you!
You can use PHP's Memcache extension.
Just add code like the following to your script after the sendQuery() call, to store the fetched rows in the cache:
$memcache_obj = memcache_connect('memcache_host', 11211);
// fetch the rows first: a result resource cannot be cached directly
while ($row = mysqli_fetch_array($result, MYSQLI_NUM)) { $rows[] = $row; }
memcache_set($memcache_obj, 'var_key', $rows, 0, 30); // expire after 30 seconds
print_r(memcache_get($memcache_obj, 'var_key'));
The two go-to solutions are APC and Memcache. The former is also an opcache and the latter can be distributed. Pick what suits you best.
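With APC the whole pattern is only a few calls; this sketch reuses the question's own sendQuery() helper, and the key name and TTL are arbitrary:
// minimal APC sketch: serve cached rows if present, otherwise query and cache
$rows = apc_fetch('latest_news');
if ($rows === false) {
    $result = sendQuery("SELECT * FROM news ORDER BY id DESC LIMIT 10");
    $rows = array();
    while ($row = mysqli_fetch_array($result, MYSQLI_NUM)) {
        $rows[] = $row;
    }
    apc_store('latest_news', $rows, 300); // cache for 5 minutes
}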
As a matter of fact, your data is already saved on the server.
And such a query should be pretty fast.
So it seems that caching is unnecessary here, especially if you're experiencing no load problems and would be doing it just in case.
APC/Memcached can be used, and are generally used, for this type of thing. You have to be aware, though, of the problems that might arise from this approach: managing new inserts/updates and so on. As long as you don't really care about that, you can set up arbitrary intervals after which the data will expire; but if the information is really relevant to your application, this approach will not work.
Also, MySQL already caches SELECTs on tables that are not modified between two requests. So basically, if you run a SELECT now, and the exact same query in 10 minutes, and nothing changed in the table, you will get the result from MySQL's query cache. There is still the overhead of issuing the request and receiving the data, but it is extremely fast. This approach also handles the update/delete problem by default, because whenever a record in the table is modified, the associated query cache entries are erased, so you will see all modifications as they happen.
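If you want to check whether the query cache is actually enabled on your server, a quick look at the server variables (using the $c connection from the question's script) is enough:
// show the query cache settings for this MySQL server
$res = mysqli_query($c, "SHOW VARIABLES LIKE 'query_cache%'");
while ($row = mysqli_fetch_assoc($res)) {
    echo $row['Variable_name'] . ' = ' . $row['Value'] . "\n";
}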
Is there a way to know how fast someone's Internet connection is with PHP? I've often seen a site tell you whether you have a good connection or not, like live.chess.com (it's actually accurate).
How can this be done with PHP?
This is not a trivial task. You have to consider many other factors that may slow down the connection, such as the workload on your server, etc.
If you just want to know whether the connection between your server and your user is good or not, you could use something like timing page load from JavaScript. You could use ping time, but that result wouldn't always represent the truth, since there might be a proxy/NAT between you and the user. Even with the AJAX-request approach, you have no guarantee the results are trustworthy.
If you need that information to, for example, decide which content to show the user (e.g. switching between Flash/HTML versions), the better way is always to ask users which one they want to see.
If you just want to show users how good their connection is, or you need that information to be reasonably trustworthy, I'd recommend using an external service (for example, here in Italy I use test.ngi.it) that has everything needed to be (almost) sure the measured values are realistic.
Assuming that you wish to stick with PHP, put the following code at the very beginning of your PHP page:
<?php
// record the moment the page starts being generated
$page_start = microtime(true);
?>
Put the following code at the end of your page, right before the closing </body> and </html> tags:
<?php
// compute how long the page took to generate
$page_end = microtime(true);
$final_time = $page_end - $page_start;
$page_load_time = number_format($final_time, 4, '.', '');
echo "Page generated in " . $page_load_time . " seconds";
?>
The load time will be displayed.
There are plenty of tutorials on how to request and parse a list of events from Google Calendar using Zend GData.
But all the tutorials assume that events never repeat. (Well, sort of: they describe how to set up repeating events, but not how to parse or display them.)
So I wrote a script to copy events from Google Calendar to a web site, but it just doesn't work, because some of the events in the calendar are repeating, and the method described in the tutorials produces pretty random output.
Any idea?
I think I've finally found the answer you're really looking for. Per http://code.google.com/apis/calendar/data/1.0/reference.html#Parameters, you need to set the 'singleevents' parameter to 'true', which makes the server expand recurring events into individual, correctly ordered instances for you. So your code (based on http://code.google.com/apis/calendar/data/1.0/developers_guide_php.html#RetrievingDateRange) will look something like:
function outputCalendarByDateRange($client, $startDate='2007-05-01', $endDate='2007-08-01') {
    $gdataCal = new Zend_Gdata_Calendar($client);
    $query = $gdataCal->newEventQuery();
    $query->setUser('default');
    $query->setVisibility('private');
    $query->setProjection('full');
    $query->setOrderby('starttime');
    $query->setStartMin($startDate);
    $query->setStartMax($endDate);
    // expand recurring events into single instances
    $query->setSingleEvents('true');
    $eventFeed = $gdataCal->getCalendarEventFeed($query);
    echo "<ul>\n";
    foreach ($eventFeed as $event) {
        echo "\t<li>" . $event->title->text . " (" . $event->id->text . ")\n";
        echo "\t\t<ul>\n";
        // each instance of a recurring event gets its own "when"
        foreach ($event->when as $when) {
            echo "\t\t\t<li>Starts: " . $when->startTime . "</li>\n";
        }
        echo "\t\t</ul>\n";
        echo "\t</li>\n";
    }
    echo "</ul>\n";
}
The data returned from this function has a single event for each instance of your repeating events, ordered correctly among all the rest of the "normal" events. Exceptions to the recurrence rules (single-event cancellations, for instance) are correctly reflected as well.
So I think you can now use that method without any caveats or warnings...it should give you the data you want, in the way you want.
You can probably do it without the second "foreach" loop, since each event should only have one "when" now... replace the inner foreach loop with
echo "\t\t\t<li>Starts: " . $event->when->startTime . "</li>\n";
But since Google's example does include that second foreach loop, it's probably safer to leave it in.
Hope it's not too late to help you!
-----Original answer:-----
(included just for the sake of completeness and because I'm still using this basic method to combine events from multiple calendars)
I'm working on this right now myself, using PHP to parse the feed and display some customized XML based on the data. The only solution I have come up with is to retrieve the dates/times of all the events, recurring or not, using:
$eventFeed = $gdataCal->getCalendarEventFeed($query);
foreach ($eventFeed as $event) {
    foreach ($event->when as $when) {
        $start = strtotime($when->startTime);
        $end = strtotime($when->endTime);
    }
}
Which works pretty well. The issue is that all the events are returned "grouped", in order of their next occurrences. That is, say it's Monday right now. If you've got a repeating event every Tuesday and another repeating event every Thursday, and you ask for all events in the next 90 days, the list you get will first contain every instance of the Tuesday event for the next 90 days, and THEN it will go on to list every instance of the Thursday event. For my purposes (and, it sounds like, yours too), I wanted the list ordered by the individual upcoming instances.
The only way I've found to do it is to insert the data from each individual instance into a temporary SQL database table, including a column with the timestamp of the event's beginning. Once it's all in the database, I can ask it to give me back the events ordered by that timestamp.
Thus my loop became something like:
mysql_query("CREATE TEMPORARY TABLE `temp` (`title` TEXT NOT NULL,`date` TEXT NOT NULL,`timestamp` TIMESTAMP NOT NULL)");
$eventFeed = $gdataCal->getCalendarEventFeed($query);
foreach ($eventFeed as $event) {
foreach ($event->when as $when) {
$start=strtotime($when->startTime);
$end=strtotime($when->endTime);
mysql_query("INSERT INTO `temp` (`title`,`date`,`timestamp`) VALUES ('".$event->title->text."','".date("M d h:i a",$start)."-".date("h:i",$end)."','".date("Y-m-d H:i:s",$start)."')");
}
}
$result=mysql_query("SELECT * FROM `mobile_app_events` ORDER BY `timestamp` ASC");
while($row=mysql_fetch_assoc($result)) {
echo "<item>\n";
echo "<title>".$row['title']."</title>\n";
echo "<date>".$row['date']."</date>\n";
echo "</item>\n";
}
Now, I'll caution you: the reason I found this topic is that I'm looking for an answer myself... it seems that if the recurring events have any exceptions (for instance, next Thursday's event is cancelled), that doesn't get reflected in the output of this code. Though next Thursday's event is deleted from your Google Calendar view, it still shows up on this page.
Other than that, (and assuming you've got access to a database), this seems to do the trick. I did add in a few lines to start a transaction before the process, with the theory that it might speed up the rendering of the data, not having to commit every insert.
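Those lines aren't shown above, but the idea is just to wrap the insert loop, something like:
// wrap the inserts in one transaction so each INSERT
// doesn't have to be committed individually
mysql_query("START TRANSACTION");
foreach ($eventFeed as $event) {
    // ... the INSERT loop from above ...
}
mysql_query("COMMIT");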
I would like to measure how much time a user spends on my website. It's needed for a community site where you can say: "User X has been spending 1397 minutes here."
After reading some documents about this, I know that there is no perfect way to achieve this. You can't measure the exact time. But I'm looking for an approach which gives a good approximation.
How could you do this? My ideas:
1) Adding 30 seconds to the online time counter on every page view.
2) On every page view, save the current timestamp. On the next view, add the difference between the saved timestamp and the current timestamp to the online time counter.
I use PHP and MySQL, if that matters.
I hope you can help me. Thanks in advance!
This is probably pointless... What if the user has three tabs open and is "visiting" your site while actually working in the other two tabs? Do you want to count that?
Two factors are working against you:
1. You can only collect point-in-time statistics (page views), and there's no reasonable way to detect what happened between those points.
2. Even then, you'd be counting browser-window time, not user time; users can easily have multiple tabs open in multiple browser instances simultaneously.
I suspect your best approximation is attributing some average amount of attention time per click and then multiplying. But then you might just as well measure clicks.
Why not just measure what actually can be measured?: referrals, page views, click-throughs, etc.
Collecting and advertising these kinds of numbers is completely in line with the rest of the world of web metrics.
Besides, if someone were to bring up a web page and then, say, go on a two-week holiday, how best to account for it?
What you could do is check whether the user is active on the page, and then send an AJAX request to your server every X seconds (would 60 seconds be fine?) saying whether or not the user is active.
Then you can use the second method you mentioned to calculate the time difference between two 'active' timestamps that are not separated by more than one or two intervals. Adding these up gives the time the user spent on your site; see the sketch below.
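A minimal sketch of the server side of that heartbeat; the table layout, column names and session handling here are assumptions, not a finished design:
<?php
// heartbeat.php: called by the client every 60 seconds while the user is active.
// Assumed schema: online_time(user_id INT PRIMARY KEY, seconds INT, last_seen INT)
session_start();
$userId = intval($_SESSION['user_id']);
$now = time();

$db = new mysqli('localhost', 'user', 'pass', 'community');
$stmt = $db->prepare("SELECT last_seen FROM online_time WHERE user_id = ?");
$stmt->bind_param('i', $userId);
$stmt->execute();
$stmt->bind_result($lastSeen);

if ($stmt->fetch()) {
    $stmt->close();
    $delta = $now - $lastSeen;
    // only credit the gap if it spans at most two missed heartbeats
    if ($delta > 0 && $delta <= 120) {
        $upd = $db->prepare("UPDATE online_time SET seconds = seconds + ?, last_seen = ? WHERE user_id = ?");
        $upd->bind_param('iii', $delta, $now, $userId);
    } else {
        $upd = $db->prepare("UPDATE online_time SET last_seen = ? WHERE user_id = ?");
        $upd->bind_param('ii', $now, $userId);
    }
    $upd->execute();
} else {
    $stmt->close();
    $ins = $db->prepare("INSERT INTO online_time (user_id, seconds, last_seen) VALUES (?, 0, ?)");
    $ins->bind_param('ii', $userId, $now);
    $ins->execute();
}
?>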
Google Analytics includes a very powerful event logging/tracking mechanism that you can customize and tap into to get really good measurements of user behavior. I'd look into that.
A very simple solution is to use a hidden iframe that loads a PHP page periodically. The loaded page logs the start time (if it doesn't exist yet) and the stop time. When the person leaves the site, you are left with the time they first came to the site and the last time they were there. In this case, the timestamp is updated every 3 seconds.
I use files to hold the log information. Each filename consists of the day-month-year and the visitor's IP address.
Example iframe PHP code; put this in yourwebsite/yourAnalyticsiFrameCode.php:
<?php
// get the IP address of the sender
$clientIpAddress = $_SERVER['REMOTE_ADDR'];
$folder = "yourAnalyticsDataFolder";
// Combine the IP address with the current date.
$clientFileRecord = $folder . "/" . date('d-M-Y') . " " . $clientIpAddress;
$startTimeDate = "";
// check to see if the folder to store analytics in exists
if (!file_exists($folder))
{
    if (!mkdir($folder))
        return; // error - just bail
}
if (file_exists($clientFileRecord))
{
    // read the contents of the clientFileRecord
    $lines = file($clientFileRecord);
    $count = 0;
    // echo the stored times back, and remember the first line (the start time)
    foreach ($lines as $line_num => $line)
    {
        echo($line);
        if ($count == 0)
            $startTimeDate = rtrim($line);
        $count++;
    }
}
if ($startTimeDate == "")
    $startTimeDate = date('H:i:s d-M-Y');
$endTimeDate = date('H:i:s d-M-Y');
// write the start and stop times back out to the file
$file = fopen($clientFileRecord, "w");
fwrite($file, $startTimeDate . "\n" . $endTimeDate);
fclose($file);
?>
The JavaScript to periodically reload the iframe in the main web page:
<!-- Javascript to reload the analytics code -->
<script>
// reload the analytics iframe every 3 seconds
window.setInterval(reloadIFrame, 3000);
function reloadIFrame() {
    // re-assigning src forces the iframe to reload
    document.getElementById('AnalyticsID').src = document.getElementById('AnalyticsID').src;
}
</script>
The iframe in the main web page looks like this:
<iframe id="AnalyticsID" name="AnalyticsID" src="http://yourwebsite/yourAnalyticsiFrameCode.php" width="1"
height="1" frameborder="0" style="visibility:hidden;display:none">
</iframe>
A very simple way to display the time stamp files:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN"
"http://www.w3.org/TR/html4/loose.dtd">
<html>
<head>
<title></title>
</head>
<body>
Analytics results
<br>
<?php
$folder = "yourAnalyticsDataFolder";
$files1 = scandir($folder);
// Loop through the log files, printing each one's name and contents
foreach ($files1 as $fn)
{
    if ($fn == "." || $fn == "..")
        continue; // skip the directory entries scandir() returns
    echo ($fn . "<br>\n");
    $lines = file($folder . "/" . $fn);
    foreach ($lines as $line_num => $line)
    {
        echo (" " . $line . "<br>\n");
    }
    echo ("<br>\n <br>");
}
?>
</body>
</html>
You get a results page like this:
22-Mar-2015 104.37.100.30
18:09:03 22-Mar-2015
19:18:53 22-Mar-2015
22-Mar-2015 142.162.20.133
18:10:06 22-Mar-2015
18:10:21 22-Mar-2015
I think client-side JavaScript analytics is the solution for this.
You have Google Analytics, Piwik, and there are also commercial JS tools that do exactly that.