active users script, user count not working properly - php

I have written a script to output the active users on my site.
Part of this is counting unique IPs in the log; the array I use to split the lines/data drops a user's entries from the list once they are more than 5 minutes old.
However, the "3 online users now" count is not working properly.
It kind of works: when someone views a page, it says there is 1 user.
Say I view a page: 1 visitor.
Then user 2 views a page: 2 visitors.
But if I then view another page, it displays 3 users,
even though I used the same IP for both of my page requests.
Here is my code:
$data = file_get_contents('active-log.txt');
$break = "\r\n";
$lines = explode($break, $data);
foreach ($lines as $key => $value) {
    $active_ip[] = $lines[$key][1];
}
$active_ip_count = array_unique($active_ip);
$active_users = count($active_ip_count);
$active_users is the variable I use to display how many unique visitors are online at one time.
Thanks in advance to anyone who can help.
EDIT:
Here is a sample of the saved log:
1328469393|157.55.39.84|g-book
1328469398|157.55.39.84|downloads
1328469400|157.55.39.84|badger
1328469404|157.55.39.84|home
1328469408|157.55.39.84|boneyard-dogs
The first part is the timestamp (used to remove the line from the array if it is older than 5 minutes; this works fine).
The second part is the IP.
The third part is the page viewed, and each new line is created with \r\n.
$lines[$key][1] is the variable for each IP in each line.
As I'm not exactly a PHP expert, I test heavily while developing; each time I add a new line of code I echo the data to check it is what I expect, to make sure I make no mistakes.
Here is a section of code that I didn't paste, as I didn't think it was necessary:
foreach ($lines as $k => $v) {
    $lines[$k] = explode("|", $v);
}
// echo $lines[0][0]; // first field of the first line; line 2's URL would be $lines[1][2]
This sits straight after the line $lines = explode($break, $data); in my code.
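For reference, here is what one of the sample lines above looks like after that explode (just a quick sketch using the sample data):
$line = '1328469393|157.55.39.84|g-book';
$parts = explode('|', $line);
// $parts[0] => '1328469393'   (timestamp)
// $parts[1] => '157.55.39.84' (IP)
// $parts[2] => 'g-book'       (page viewed)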

Have you looked at the output of var_dump($active_ip) after the foreach loop ends? With this setup, I'm pretty sure $lines[$key][1] is simply a single character of the line you're dealing with (index 1 of the string), so that's not going to work well for a number of reasons. What does active-log.txt look like? Does it only contain IP addresses, or user names too? If it only contains IP addresses, consider using something like this:
<?php
$data = file('active-log.txt');
$no_duplicate_ips = array_unique($data);
$active_users = (count($no_duplicate_ips));
?>
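One caveat with that snippet: file() keeps the trailing line break on each element by default, so two otherwise identical lines may not collapse under array_unique(). A slightly safer call (a sketch, same file name as above):
$data = file('active-log.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
$active_users = count(array_unique($data));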
Edit:
Right, that makes sense then. Try this:
$data = file_get_contents('active-log.txt');
$break = "\r\n"; // Note that it's generally a good idea to use PHP_EOL throughout your code, for greater cross-platform compatibility
$lines = explode($break, $data);
$active_ips = array();
foreach ($lines as $v) {
    if ($v === '') {
        continue; // skip the empty element left behind by a trailing line break
    }
    $exploded_data = explode("|", $v);
    // Now check whether the timestamp is no older than 5 minutes (300 seconds)
    if ((int)$exploded_data[0] >= time() - 300) {
        // OK, this one is not too old
        $active_ips[] = $exploded_data[1];
    }
}
$active_ip_count = array_unique($active_ips);
$active_users = count($active_ip_count);
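For completeness, here is a minimal sketch of how a line in this format could be appended on each page view. This is an assumption about your logging code, which wasn't posted; $page is a hypothetical variable holding the current page name:
$page = 'home'; // hypothetical: however you identify the current page
$entry = time() . '|' . $_SERVER['REMOTE_ADDR'] . '|' . $page . "\r\n";
file_put_contents('active-log.txt', $entry, FILE_APPEND | LOCK_EX);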

Related

Filter a php array to only elements that have a matching value to $name in $data

I have a PHP script getting all folders in a posts folder and making them into a list.
I have a $postinfo_str variable assigned to a JSON file for each folder, which I use to store the post date and category/tag info.
I also have a $pagetitle variable assigned to a title.php include file for each folder. So if I am on a "June 2018" archive page, the text in that file will be "June 2018". If I am on, say, a "Tutorials" category page, that will be the text in title.php.
In the json file, I have:
{
"Arraysortdate": "YYYYMMDD",
"Month": "Month YYYY",
"Category": ["cat1", "cat2", "etc"]
}
I am ordering the array newest to oldest using krsort with Arraysortdate as the key.
How do I filter the array using $pagetitle as input, finding whether there is a match in $postinfo_str, and if there isn't, removing that folder from the array?
All I can find regarding array filtering is cases where the info in $postinfo_str is basically the array itself, so $pagetitle is the input and the output is the matching text from $postinfo_str, whereas I want the output to be only the folders whose $postinfo_str contains text matching the input ($pagetitle).
Here is the code I have. Keep in mind this is flat file; I do not want a database to achieve this. See the comments if you want an explanation.
<?php
$BASE_PATH = '/path/to/public_html';
// initial array containing the dirs
$dirs = glob($BASE_PATH.'/testblog/*/posts/*', GLOB_ONLYDIR);
// new array with date as key
$dirinfo_arr = [];
foreach ($dirs as $cdir) {
    // get current page title from file
    $pagetitle = file_get_contents("includes/title.php");
    // get date & post info from file
    $dirinfo_str = file_get_contents("$cdir/includes/post-info.json");
    $dirinfo = json_decode($dirinfo_str, TRUE);
    // add current directory to the info array
    $dirinfo['dir'] = $cdir;
    // add current dir to new array where date is the key
    $dirinfo_arr[$dirinfo['Arraysortdate']] = $dirinfo;
}
// now we sort the new array
krsort($dirinfo_arr);
foreach ($dirinfo_arr as $key => $dir) {
    $dirpath = $dir['dir'];
    $dirpath = str_replace('/path/to/public_html/', '', $dirpath);
?>
<!-- HTML HERE SUCH AS -->
TEXT <br>
<?php
}
?>
I have difficulties following your problem description. Your code example is slightly confusing. It appears to load the same global includes/title.php for each directory. Meaning, the value of $pagetitle should be the same every iteration. If this is intended, you should probably move that line right outside the loop. If the file contains actual php code, you should probably use
$pagetitle = include 'includes/title.php';
or something similar. If it doesn't, you should probably name it title.txt. If it is not one global file, you should probably add the path to the file_get_contents/include as well. (However, why wouldn't you just add the title in the json struct?)
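To illustrate that last point, a small sketch with a hypothetical "Title" key added to post-info.json (the key name is made up for this example):
// post-info.json would gain: "Title": "June 2018"
$dirinfo = json_decode(file_get_contents("$cdir/includes/post-info.json"), TRUE);
$pagetitle = $dirinfo['Title']; // no separate title.php needed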
I'm under the assumption that this happened by accident when trying to provide a minimal code example (?) ... In any case, my answer won't be the perfect answer, but it hopefully can be adapted once understood ;o)
If you only want elements in your array that fulfill certain properties, you have essentially two choices:
don't put those elements in while building the array (mostly your code):
foreach ($dirs as $cdir) {
    // get current page title from file
    $pagetitle = file_get_contents("includes/title.php");
    // get date & post info from file
    $dirinfo_str = file_get_contents("$cdir/includes/post-info.json");
    $dirinfo = json_decode($dirinfo_str, TRUE);
    // add current directory to the info array
    $dirinfo['dir'] = $cdir;
    // ------------ NEW --------------
    $filtercat = 'cat1';
    if (!in_array($filtercat, $dirinfo['Category'])) {
        continue; // skip folders that don't match
    }
    // -------------------------------
    // add current dir to new array where date is the key
    $dirinfo_arr[$dirinfo['Arraysortdate']] = $dirinfo;
}
array_filter the array afterwards by providing an anonymous function:
// ----- before cycling through $dirinfo_arr for output
$filtercat = 'cat1';
$filterfunc = function ($dirinfo) use ($filtercat) {
    return in_array($filtercat, $dirinfo['Category']);
};
$dirinfo_arr = array_filter($dirinfo_arr, $filterfunc);
You should read up on anonymous functions and how you provide local vars to them, to ease the pain. Maybe your use case is better suited for array_reduce (see the sketch after the next snippet), which is similar, except you can determine the output of your "filter".
$new = array_filter($array, $func) is just a fancy way of writing:
$new = [];
foreach ($array as $key => $value) {
    if ($func($value)) {
        $new[$key] = $value;
    }
}
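For comparison, a rough sketch of the array_reduce variant mentioned above (same $dirinfo_arr and $filtercat as in the earlier snippets; since array_reduce only passes values, the date key is rebuilt from Arraysortdate):
$filtercat = 'cat1';
$dirinfo_arr = array_reduce($dirinfo_arr, function ($carry, $dirinfo) use ($filtercat) {
    if (in_array($filtercat, $dirinfo['Category'])) {
        // keep the date as the key so krsort() still works afterwards
        $carry[$dirinfo['Arraysortdate']] = $dirinfo;
    }
    return $carry;
}, []);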
update 1
In my code samples, you could replace in_array($filtercat, $dirinfo['Category']) with in_array($pagetitle, $dirinfo) if you want to match on anything that's in the JSON struct (base level), or with ($pagetitle == $dirinfo['Month']) if you just want to match the month.
update 2
I understand that you're probably just starting with PHP or even programming, so the concept of some "huge database" may be frightening. But to be honest, the filesystem is, from a certain point of view, a database as well. However, it is usually quite slow in comparison and doesn't provide many features.
In the long run, I would strongly suggest using a database. If you don't like the idea of putting your data in "some database server", use SQLite. There is a learning curve involved if you have never dealt with databases before, but it is time well spent, because it simplifies so many things.
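Purely as an illustration of how small that step can be, a sketch using SQLite via PDO (the posts.db file and table layout are made up for this example):
$pdo = new PDO('sqlite:' . __DIR__ . '/posts.db'); // hypothetical database file
$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$pdo->exec('CREATE TABLE IF NOT EXISTS posts (dir TEXT, sortdate TEXT, month TEXT, categories TEXT)');
// matching folders, newest first, with no manual krsort()/array_filter() needed
$stmt = $pdo->prepare('SELECT dir FROM posts WHERE categories LIKE ? ORDER BY sortdate DESC');
$stmt->execute(['%cat1%']);
$dirs = $stmt->fetchAll(PDO::FETCH_COLUMN);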

Detect Languages; CakePHP updateAll Bad Performance

UPDATE: I think the CakePHP updateAll is the problem. If I comment out the updateAll and just pr() the results, I get as many language detections in 1-2 seconds as I otherwise get in 5 minutes! I only need to update one row, and I can identify that row by author and title... is there a better and faster way?
I'm using detectlanguage.com in order to detect all English texts in my SQL database. My database consists of about 500,000 rows. I have tried many things to detect the language of all my texts faster; at this rate it will take many days... :/
I only send 20% of each text (see my code).
I tried to copy my function and run it several times in parallel. The copied code below shows the function for all texts with a title starting with "A".
I can only run 6 functions at the same time (localhost)... I tried a 7th in a new tab, but got:
Waiting for available socket....
public function detectLanguageA()
{
    set_time_limit(0);
    ini_set('max_execution_time', 0);
    $mydatas = $this->datas;
    $alldatas = $mydatas->find('all')->where(['SUBSTRING(datas.title,1,1) =' => 'A'])->where(['datas.lang =' => '']);
    foreach ($alldatas as $row) {
        $text = $row->text;
        $textLength = round(strlen($text) * 0.2);
        $text = substr($text, 0, $textLength);
        $title = $row->title;
        $author = $row->author;
        $languageCode = DetectLanguage::simpleDetect($text);
        $mydatas->updateAll(
            ['lang' => $languageCode],                    // fields
            ['author' => $author, 'textTitle' => $title]  // conditions
        );
    }
}
I hope someone has an idea for my problem... Right now the language detection for all my texts will take more than a week :/
My computer has been running for over 20 hours with only short interruptions, but I have only detected the language of about 13,000 texts, and there are 500,000 texts in my database.
I have also tried sending texts in batches, but that is too slow as well... I always send 20 texts in one array, and I think that's the maximum.
Is it possible that the CakePHP 3.x updateAll function makes it so slow?
THE PROBLEM WAS THE CAKEPHP updateAll.
Now I'm using http://book.cakephp.org/3.0/en/orm/saving-data.html#updating-data with a for loop, and everything is fast and good:
use Cake\ORM\TableRegistry;

$articlesTable = TableRegistry::get('Articles');
for ($i = 1; $i < 460000; $i++) {
    $oneArticle = $articlesTable->get($i);
    $languageCode = DetectLanguage::simpleDetect($oneArticle->lyrics);
    $oneArticle->lang = $languageCode;
    $articlesTable->save($oneArticle);
}
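If that loop still feels slow, one common CakePHP 3.x approach (a sketch under that assumption; the field names follow the snippets above) is to wrap the saves in a single transaction so each row is not committed individually:
use Cake\Datasource\ConnectionManager;
use Cake\ORM\TableRegistry;

$articlesTable = TableRegistry::get('Articles');
ConnectionManager::get('default')->transactional(function () use ($articlesTable) {
    // only rows that still have no language set
    foreach ($articlesTable->find()->where(['lang' => '']) as $article) {
        $article->lang = DetectLanguage::simpleDetect($article->lyrics);
        $articlesTable->save($article, ['atomic' => false]); // the outer transaction handles atomicity
    }
});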

php foreach in foreach looping

I want to extract all usernames and passwords, each from its own file, and output them nicely.
I wrote this code on my AppServ 2.5.1 installation on my computer, but only the last loop iteration printed the username.
I tested the code on other machines and it worked perfectly.
I don't know what the problem is...
usernames.txt content :
user1
user2
user3
passwords.txt content :
pass1
pass2
pass3
script content :
$usernames = explode("\n", file_get_contents("usernames.txt"));
$passwords = explode("\n", file_get_contents("passwords.txt"));

foreach ($usernames as $username)
{
    foreach ($passwords as $password)
    {
        echo $username.":".$password."\n";
    }
}
output :
:pass1
:pass2
:pass3
:pass1
:pass2
:pass3
user3:pass1
user3:pass2
user3:pass3
for ($i = 0; $i < count($usernames) && $i < count($passwords); $i++) {
    echo $usernames[$i].':'.$passwords[$i]."\n";
}
Note that $passwords[x] must correspond to $usernames[x] for this to work.
There are always those who will say you don't need it (and you often don't), but I tend to use regular expressions whenever I'm parsing these kinds of flat files: there's always some quirky character, extra line break or difference that finds its way into a text file, be it from transferring servers, restoring backups or simple user interference. You could also make use of array_combine in this situation if you'd prefer to carry on using a foreach loop; I know some folks prefer it for readability.
preg_match_all('/\w+/m', file_get_contents('usernames.txt'), $usernames);
preg_match_all('/\w+/m', file_get_contents('passwords.txt'), $passwords);

if (count($usernames[0]) !== count($passwords[0]))
    die('Computer says: mismatch!'); // some resemblance of error handling...

$result = array_combine($usernames[0], $passwords[0]);

foreach ($result as $name => $pass)
    echo "{$name}:{$pass}\n";
After debugging with the post author, I guessed that the problem was with the line-ending characters. Splitting on "\r\n" fixed the problem:
$usernames = explode("\r\n", file_get_contents("usernames.txt"));
$passwords = explode("\r\n", file_get_contents("passwords.txt"));
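If you would rather not depend on the exact line ending at all, here is a more tolerant sketch (same file names as in the question):
// \R matches \r\n, \n or \r; PREG_SPLIT_NO_EMPTY drops blank trailing lines
$usernames = preg_split('/\R/', file_get_contents('usernames.txt'), -1, PREG_SPLIT_NO_EMPTY);
$passwords = preg_split('/\R/', file_get_contents('passwords.txt'), -1, PREG_SPLIT_NO_EMPTY);

foreach ($usernames as $i => $username) {
    echo $username . ':' . $passwords[$i] . "\n";
}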
For reference, please note that it is very important not to assume your input data is right. If you see that something is wrong and it obviously points to a mistake you made earlier (in this case it is clearly not foreach that is buggy, but the array), then you need to swallow your pride and debug your own code. I have been programming PHP for 10 years, and I still have to remind myself of that every single day.

Make a Unique PHP Array based on a section of the value?

I am working on a WordPress site where the client has uploaded thousands of photos to Flickr and now wants me to move them all back into WordPress and associate them with their proper posts.
Even though there are thousands of images, there are really only about 50 unique images; all the other versions are the same image uploaded to a different location on Flickr, or with a slightly different size or name.
To help me track down all the unique images, based on a list like the one below, I need to pull every record into a PHP array. The catch is that the part I have highlighted is what I want to make sure is unique among all records in the array.
Any help in taking an existing PHP array that has every record and making the array only show unique values based on that part of the value string?
Is this a regular expressions use case?
If it used regex or similar, I think the pattern it could look for is something like 4485116555_: 10 digits followed by an underscore.
Appreciate any help in getting me one step closer to my goal; this is just one piece of the big puzzle.
http://farm5.static.flickr.com/4042/4485116555_19cc0eaa85.jpg
http://farm3.static.flickr.com/2703/4485767454_77476dbdd0.jpg
http://farm5.static.flickr.com/4008/4485116637_ff085b0ab2.jpg
http://farm5.static.flickr.com/4002/4485766896_af83d349c4.jpg
http://farm5.static.flickr.com/4037/4485766950_50d5739344.jpg
http://farm3.static.flickr.com/2785/4485116905_1fa0e2ea6c.jpg
http://farm5.static.flickr.com/4052/4704387613_77542dac2e.jpg
http://farm3.static.flickr.com/2734/4485767622_7b04c3bd3e.jpg
http://farm5.static.flickr.com/4037/4485767292_1a37fe6c57.jpg
http://farm5.static.flickr.com/4038/4485116955_f9c47672c3.jpg
http://farm5.static.flickr.com/4051/4485115681_6d7419a00b.jpg
http://farm3.static.flickr.com/2753/4485116095_30161a56bb.jpg
http://farm5.static.flickr.com/4123/4831194968_3977dff9dc.jpg
http://farm5.static.flickr.com/4054/4538941056_cda5a8242d.jpg
http://farm3.static.flickr.com/2091/4515081466_43cd1624ce.jpg
http://farm3.static.flickr.com/2684/4485766664_3bb9dd9c80_m.jpg
http://farm5.static.flickr.com/4010/4485115557_a38aac0e1f.jpg
http://farm5.static.flickr.com/4055/4485115633_19e6e92276.jpg
http://farm5.static.flickr.com/4045/4485766710_08691e99ed_m.jpg
http://farm5.static.flickr.com/4024/4485115521_9ab2a33d53_m.jpg
http://farm5.static.flickr.com/4048/4505577820_81ce080f2a_t.jpg
http://farm6.static.flickr.com/5294/5389182894_920a54ce97_m.jpg
http://farm5.static.flickr.com/4152/5073487038_5bdb9e3cbc_t.jpg
http://farm5.static.flickr.com/4024/4485115401_67a8957509_m.jpg
http://farm5.static.flickr.com/4062/4485766842_2209843592_m.jpg
$ids = array(); // Where we will keep our unique list of IDs
$lines = array(/* your list of URLs here */);

foreach ($lines as $line) {
    preg_match(
        '|^http://[A-Za-z0-9\\.]+/[0-9]+/([0-9]+)_[a-f0-9]+.*\\.jpg$|',
        $line,
        $matches
    );
    echo $matches[1]; // e.g. 4538941056
    $ids[] = $matches[1]; // Push that into the IDs array
}

$ids = array_unique($ids);
print_r($ids);
Use this code to get the ID portion:
$url = 'Your image url';
$path = parse_url($url, PHP_URL_PATH);
$pathFragments = explode('/', $path);
$end = end($pathFragments);
$id = substr($end, 0, 10); // the photo IDs in the sample URLs are 10 digits long
And then run array_unique() to get the unique values.
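Putting the two steps together, a short sketch (assuming $urls holds the list of image URLs from the question):
$ids = array();
foreach ($urls as $url) {
    $path = parse_url(trim($url), PHP_URL_PATH);
    $pathFragments = explode('/', $path);
    $end = end($pathFragments);
    $ids[] = substr($end, 0, 10); // the 10-digit photo ID at the start of the filename
}
$uniqueIds = array_unique($ids);
print_r($uniqueIds);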

PHP random link without repetitions

My PHP is poor, but I'm trying my best to improve!
I'm attempting to code a really simple PHP script that loads a random HTML page from a text file list.
Once people have viewed the HTML page, they link back to the random.php file and it loads another page... this can continue forever.
I'm using a text file list as I'll regularly be adding more pages. My issue is that there is nothing in my code to prevent repeat visits! Right now I only have about 8 links, and on more than one occasion I've had the same link 'randomly' come up 3 times in a row :( I'm hoping there is something simple I can add to prevent repetitions, and if all links have been viewed, it resets. Many thanks :)
<body>
<?php
$urlist = file("randomlinks.txt");
$nl = count($urlist);
$np = rand(0, $nl - 1);
$url = trim($urlist[$np]);
header("Location: $url");
exit;
?>
</body>
Since the user does not know in what order the links are in the text file, if you were to read said links in sequence they would seem "random" (and you can shuffle them when first creating the file).
So you can:
save in session the index of the last link seen
link the link index to system time. This does not prevent repetitions, but it guarantees that two consecutive requests do not get the same link, unless you hit 'refresh' within the same second or after exactly the right multiple of $nl seconds.
Method 1:
$urlist=file("randomlinks.txt");
$nl=count($urlist);
session_start();
if (!isset($_SESSION['link'])) // If link is not in session
$_SESSION['link'] = 0; // Start from 0 (the first)
$np = $_SESSION['link']++; // Next time will use next
$_SESSION['link'] %= $nl; // Start over if nl exceeded
$url=trim($urlist[$np]);
Header("Location: $url");
Method 2:
...
$nl=count($urlist);
$np = time() % $nl; // Get number of seconds since the Epoch,
// extract modulo $nl obtaining a number that
// cycles between 0 and $nl-1, every $nl seconds
$url=trim($urlist[$np]);
Header("Location: $url");
Another method would be to remember the last N links seen - but for this, you need a session variable - so as not to get them again too soon.
session_start();
if (!isset($_SESSION['urlist']))                    // Do we know the user?
    $_SESSION['urlist'] = array();                  // No, start with an empty list
if (empty($_SESSION['urlist']))                     // Is the list empty?
{
    $_SESSION['urlist'] = file("randomlinks.txt");  // Fill it.
    $safe = array_pop($_SESSION['urlist']);
    shuffle($_SESSION['urlist']);                   // Shuffle the list
    array_push($_SESSION['urlist'], $safe);
}
$url = trim(array_pop($_SESSION['urlist']));
If you have five URLs 1, 2, 3, 4 and 5, you might get:
1 5 3 4 2 1 4 2 5 3 1 2 3 5 4 1 4 3 2 5 1 4 ...
...the list is N-1 random :-), all links appear with equal frequency, and the same link may reappear at most at a 2-remove, like the "4" above (...4 1 4...); if it does, you'll never see it again for at least $nl visits.
ALSO
You should not use Header() from within a <BODY> tag. Remove <BODY> altogether.
You don't need to use exit() if you are at the natural end of the script: the script will exit by itself.
The simplest way I can think of would be to use a cookie.
The Internet is full of tutorials such as the following:
http://www.w3schools.com/php/php_cookies.asp
For example:
<?php
if (isset($_COOKIE["visitList"])) {
    $visited = explode(",", $_COOKIE["visitList"]);
    foreach ($visited as $value) {
        if ($value == $newUrl) { // $newUrl: the candidate site URL
            // Already visited - find a new one
        }
    }
} else {
    $expire = time() + 60*60*24*30;
    setcookie("visitList", "List-of-visited-URLs,separated-by-commas", $expire);
}
?>
I have not had a chance to test this code, but hopefully it can give you ideas.
As noted in the comments, the same thing could be accomplished using php sessions:
<?php
session_start();
if (isset($_SESSION["visitList"])) {
    $visited = explode(",", $_SESSION["visitList"]);
    foreach ($visited as $value) {
        if ($value == $newUrl) { // $newUrl: the candidate site URL
            // Already visited - find a new one
        }
    }
} else {
    $_SESSION['visitList'] = $newUrl; // start the list with the new site URL
}
?>
I would use PHP sessions to do this. Take a look at this example.
Store an array of available pages in a session variable. Every time you get a page, you remove that page from the array. When the array is empty, you reset it again from your original source.
Here's what your code might look like:
session_start();
if (empty($_SESSION["pages"]))
    $_SESSION["pages"] = file("randomlinks.txt", FILE_IGNORE_NEW_LINES); // drop newlines so the Location header stays clean

$nl = count($_SESSION["pages"]);
$np = mt_rand(0, $nl - 1);
// get the page, remove it from the array, and shift all higher elements down:
list($url) = array_splice($_SESSION["pages"], $np, 1);
die(header("Location: $url"));
