Pagination without using a database in PHP - php

I am trying to create a webpage which shows 10 files stored in a directory at a time. I don't want to use a database for it, though. This is what I have up to this point.
<?php
$exclude = array("index.php");
$cssfiles = array_diff(glob("*.php"), $exclude);
foreach ($cssfiles as $cssfile) {
    $filename = "http://example.com/lessons/css/" . $cssfile;
    outputtags($filename, true, true);
}
?>
This prints out all the results. I can't figure out how to show just the first ten, and then, when the user clicks "next", the following ten, without using a database. I think using a database just for this purpose doesn't make sense.
EDIT: The reason I want to do it this way is that I am getting a max_user_connection error.

You could do that by storing your files in an array and sorting it as you wish, like the following:
$exclude = array("index.php");
$cssfiles = array_diff(glob("*.php"), $exclude);
$files = array();
foreach ($cssfiles as $cssfile) {
    $filename = "http://example.com/lessons/css/" . $cssfile;
    $files[] = $filename;
}
sort($files); // sort and reindex so the files can be accessed by numeric offset
// Pagination starts from here
$page = 1;   // You will get this parameter from the URL, using $_GET['page'] for instance
$limit = 10; // Number of files to display per page
$offset = ($page - 1) * $limit;
$max = $page * $limit;
$max = $max > count($files) ? count($files) : $max;
for ($i = $offset; $i < $max; $i++) {
    echo $files[$i] . PHP_EOL;
}
echo 'Total pages : ', ceil(count($files) / $limit), PHP_EOL;
echo 'Page number : ', $page;
Maybe you could use AJAX and move the pagination into its own page to avoid fetching all the files each time.
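To wire this to the "next" click from the question, here is a minimal sketch (assuming the $files array built above, plus a hypothetical script name lessons.php with the page number passed in $_GET['page']):
<?php
// Hypothetical filename (lessons.php) and query parameter; adjust to your setup
$limit = 10;
$page = isset($_GET['page']) ? max(1, (int) $_GET['page']) : 1;
$totalPages = max(1, (int) ceil(count($files) / $limit));
// "Previous" / "Next" links
if ($page > 1) {
    echo '<a href="lessons.php?page=' . ($page - 1) . '">Previous</a> ';
}
if ($page < $totalPages) {
    echo '<a href="lessons.php?page=' . ($page + 1) . '">Next</a>';
}
?>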

It depends on how you want to select those ten pages. One could use a for loop that lists files 0-9, then 10-19, and so on, depending on the page range the user requests.
However, when a file is added, the paging would get out of order. In that case, loading/saving some sorting information to sessions/cookies could solve the problem, as in the sketch below.
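A minimal sketch of that idea (my assumption of the layout, not the original poster's code): snapshot the sorted file list into the session the first time the user starts browsing, so later additions don't shift files between pages.
<?php
session_start();
// Snapshot the sorted file list once per session so newly added files
// don't shift items between pages while the user is browsing
if (!isset($_SESSION['file_list'])) {
    $files = array_diff(glob("*.php"), array("index.php"));
    sort($files);
    $_SESSION['file_list'] = array_values($files);
}
$files = $_SESSION['file_list'];
?>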
EDIT: Using a database is, however, the standard for tasks like this. Even if you don't like them, you will most likely need one for more complex tasks that require sorting, searching or joining multiple datasets, which are just too complicated to achieve with the filesystem functions.

Related

Recursively search directories and list the x newest files (based on creation date on server)

Ok, I don't fully understand what I'm doing here, so I thought I'd get some feedback on my code.
Trying to recursively search through specific folders on my server, and return the 30 newest *.jpg images that were added (with full filepath).
At the moment my current code gives me (I'm assuming) timestamps (they each look like a string of 10 numbers), and I actually seem to only be getting 22 out of the full 30 I was expecting. I saw another post using RecursiveDirectoryIterator, but I'm not able to upgrade the PHP version on my server and I can't find a lot of clear documentation on it.
Hoping someone can steer me in the right direction on this.
<?php
function get30Latest(){
    $files = array();
    foreach (glob("*/*.jpg") as $filename) { //I assume "*/*.jpg" would start from the root of the server and go through each directory looking for a match to *.jpg and add to $files array
        $files[$filename] = filemtime($filename);
    }
    arsort($files); //I may not need this since I'm looking to sort by earliest to latest (among the 30 newest images)
    $newest = array_slice($files, 0, 29); //This should be the first 30 I believe.
    foreach ($newest as $file) { //Assuming I would loop through the array and display the full paths of these 30 images
        echo $file . "</br>"; //Returns something similar to "1451186291, 1451186290, 1451186290, etc..."
    }
}
?>
You are on the right track. This should work for you:
First of all we create a RecursiveDirectoryIterator, which we pass to a RecursiveIteratorIterator, so we have an iterator that walks recursively through all files of your specified path. We filter everything except *.jpg files out with a RegexIterator.
Now we can convert the iterator into an array with iterator_to_array(), so we can sort it however we want, which we do with usort() combined with filectime(): we compare the change/creation times of the files and sort by that.
At the end we just slice off the 30 newest files with array_slice() and we are done. Loop through the files and display them.
Code:
<?php
$it = new RecursiveIteratorIterator(new RecursiveDirectoryIterator("your/path"));
$rgIt = new RegexIterator($it, "/^.+\.jpg$/i");
$files = iterator_to_array($rgIt);
usort($files, function ($a, $b) {
    if (filectime($a) == filectime($b)) {
        return 0;
    }
    return filectime($a) > filectime($b) ? -1 : 1;
});
$files = array_slice($files, 0, 30);
foreach ($files as $v) {
    echo $v . PHP_EOL;
}
?>
I think what you may want to do is keep your function more general, in case you want to reuse it or just plain change it. That way you won't have to create a get10Latest() or get25Latest(), etc. This is just a simple class that contains all the script you need to fetch and return the files. Use what you want from it; the methods are in order of use, so you could just take the guts of the methods and create one big function:
class FetchImages
{
    private $count = 30;
    private $arr = array();
    private $regex = '';

    public function __construct($filter = array('jpg'))
    {
        // This will create a simple regex from the array of file types ($filter)
        $this->regex = '.+\.' . implode('|.+\.', $filter);
    }

    public function getImgs($dir = './')
    {
        // Borrowed from contributor notes on the RecursiveDirectoryIterator page
        $regex = new RegexIterator(
            new RecursiveIteratorIterator(
                new RecursiveDirectoryIterator($dir)),
            '/^' . $this->regex . '$/i',
            RecursiveRegexIterator::GET_MATCH);
        // Loop and assign datetimes as keys.
        // You don't need date() but it's more readable for troubleshooting
        foreach ($regex as $file) {
            $this->arr[date('YmdHis', filemtime($file[0]))][] = $file[0];
        }
        // Return the object for method chaining
        return $this;
    }

    public function setMax($max = 30)
    {
        // This will allow for different returned quantities
        $this->count = $max;
        // Return for method chaining
        return $this;
    }

    public function getResults($root = false)
    {
        if (empty($this->arr)) {
            return false;
        }
        // Set default container
        $new = array();
        // Depending on your version, you may not have "SORT_NATURAL".
        // This is what will sort the files from newest to oldest.
        // I have not accounted for empty -> will draw error(s) if not an array
        krsort($this->arr, SORT_NATURAL);
        // Loop through the storage array and build a new one with single paths
        foreach ($this->arr as $timestamp => $files) {
            for ($i = 0; $i < count($files); $i++) {
                $new[] = (!empty($root)) ? str_replace($root, "", $files[$i]) : $files[$i];
            }
        }
        // Return the results
        return (!$this->count) ? $new : array_slice($new, 0, $this->count);
    }
}
// Create a new instance. I am allowing for multiple look-ups
$getImg = new FetchImages(array("jpg", "jpeg", "png"));
// Get the results from my core folder
$count = $getImg->getImgs(__DIR__ . '/core/')
    // Sets the extraction limit; "false" will return all
    ->setMax(30)
    // This will strip off the long path
    ->getResults(__DIR__);
print_r($count);
I don't really need a giant, flexible class of functions; this function will always output the 30 latest images. If I'm understanding correctly, you're assigning a timestamp as a key to each file in the array, and then sorting by the key using krsort? I'm trying to pull out just those pieces in order to get an array of files keyed by timestamp, sorted from latest to oldest, and then slice the array down to the first 30. Here's just a quick attempt as a talking point (not complete by any means). At the moment it's outputting only one file several hundred times:
<?php
function get30Latest(){
    $directory = new RecursiveDirectoryIterator('./');
    $iterator = new RecursiveIteratorIterator($directory);
    $regex = new RegexIterator($iterator, '/^.+\.jpg$/i', RecursiveRegexIterator::GET_MATCH);
    foreach ($regex as $file) {
        $tmp->arr[date('YmdHis', filemtime($file[0]))][] = $file[0];
        krsort($tmp->arr, SORT_NATURAL);
        foreach ($tmp->arr as $timestamp => $files) {
            for ($i = 0; $i < count($files); $i++)
                $new[] = (!empty($root)) ? str_replace($root, "", $files[$i]) : $files[$i];
            echo $new[0] . "</br>"; //this is just for debugging so I can see what files
                                    //are showing up. Ideally this will be the array I'll
                                    //pull the first 30 from and then send them off to a
                                    //thumbnail creation function
        }
    }
}
?>
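For reference, here is a minimal corrected sketch of that idea (my assumptions, not the poster's final code): use a plain array instead of the stray $tmp object, move the sort and the slice out of the loop, and echo each path rather than $new[0].
<?php
function get30Latest(){
    $directory = new RecursiveDirectoryIterator('./');
    $iterator = new RecursiveIteratorIterator($directory);
    $regex = new RegexIterator($iterator, '/^.+\.jpg$/i', RecursiveRegexIterator::GET_MATCH);
    // Collect paths keyed by their modification timestamp
    $byTime = array();
    foreach ($regex as $file) {
        $byTime[filemtime($file[0])][] = $file[0];
    }
    // Newest first, then flatten and keep the first 30
    krsort($byTime);
    $paths = array();
    foreach ($byTime as $files) {
        foreach ($files as $path) {
            $paths[] = $path;
        }
    }
    foreach (array_slice($paths, 0, 30) as $path) {
        echo $path . "<br />";
    }
}
?>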

Random image from directory with no repeats?

I am successfully able to get random images from my 'uploads' directory with my code, but the issue is that images repeat. I will reload the page and the same image will show 2 - 15 times without changing. I thought about setting a cookie for the previous image, but working out how to do this is frying my brain. I'll post what I have here; any help would be great.
$files = glob($dir . '/*.*');
$file = array_rand($files);
$filename = $files[$file];
$search = array_search($_COOKIE['prev'], $files);
if ($_COOKIE['prev'] == $filename) {
    unset($files[$search]);
    $filename = $files[$file];
    setcookie('prev', $filename);
}
Similar to slick's answer, but a little simpler on the session front:
Instead of using array_rand to randomise the array, you can use a custom process that reorders it based on plain rand():
$files = array_values(glob($dir . '/*.*'));
$randomFiles = array();
while (count($files) > 0) {
    $randomIndex = rand(0, count($files) - 1);
    $randomFiles[] = $files[$randomIndex];
    unset($files[$randomIndex]);
    $files = array_values($files);
}
This is useful because you can seed the rand function, meaning it will always generate the same random numbers. Just add (before you randomise the array):
if (isset($_COOKIE['key'])) {
    $microtime = $_COOKIE['key'];
} else {
    $microtime = microtime(true);
    setcookie('key', $microtime);
}
srand((int) ($microtime * 1000000)); // srand needs an integer seed
This does mean that someone can manipulate the order of the images by manipulating the cookie, but if you're okay with that then this should work.
So you want to have no repeats per request? Use sessions. The best way to avoid repetition is to have two arrays (buckets). The first one contains all available elements that you will pick from. The second array is empty for now.
Then start picking items from the first array and moving them to the second (remove and array_push to the second). Do this in a loop. On the next iteration the first array won't have the element you already picked, so you will avoid duplicates.
In general: move items from one bucket to the other and you're done. Additionally, you can store your results in the session instead of cookies; server-side storage is better for that kind of thing.
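A minimal sketch of that two-bucket idea (my assumptions: the images live in an 'uploads' directory and the buckets are kept in the session):
<?php
session_start();
// Refill the "remaining" bucket on first visit or once it runs empty
if (empty($_SESSION['remaining'])) {
    $_SESSION['remaining'] = glob('uploads/*.*');
    $_SESSION['shown'] = array();
}
// Pick a random item and move it from the first bucket to the second
$key = array_rand($_SESSION['remaining']);
$file = $_SESSION['remaining'][$key];
unset($_SESSION['remaining'][$key]);
array_push($_SESSION['shown'], $file);
echo '<img src="' . htmlspecialchars($file) . '" alt="" />';
?>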

How to show a random picture in a web page?

My website has an image in a certain place, and when a user reloads the page he should see a different image in the same place. I have 30 images and I want to pick from them randomly on every reload. How do I do that?
Make an array with the "picture information" (filename or path) you have, like
$pictures = array("pony.jpg", "cat.png", "dog.gif");
and randomly call an element of that array via
echo '<img src="'.$pictures[array_rand($pictures)].'" />';
Looks weird, but works.
The actual act of selecting a random image is going to require a random number. There are a couple of methods that can help with this:
rand() is used to generate a random number.
array_rand() is used to select a random element's index from an array.
You can think of the second function as a shortcut for using the first if you're specifically dealing with an array. So, for example, if you have an array of image paths from which to select the one you want to display, you can select a random one like this:
$randomImagePath = $imagePaths[array_rand($imagePaths)];
If you're storing/retrieving the images in some other way, which you didn't specify, then you may not be able to use array_rand() as easily. But, ultimately, you need to generate a random number. So some use of rand() would work for this.
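For example, a one-line sketch of the rand()-based variant (assuming $imagePaths is a zero-indexed array):
$randomImagePath = $imagePaths[rand(0, count($imagePaths) - 1)];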
If you store the information in your database, you can also SELECT a random image:
MySQL:
SELECT column FROM table
ORDER BY RAND()
LIMIT 1
PgSQL:
SELECT column FROM table
ORDER BY RANDOM()
LIMIT 1
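If you go that route, a minimal sketch with PDO (assuming an existing connection in $pdo and a hypothetical images table with a path column):
// Hypothetical table and column names; adjust to your schema
$stmt = $pdo->query('SELECT path FROM images ORDER BY RAND() LIMIT 1');
$randomPath = $stmt->fetchColumn();
echo '<img src="' . htmlspecialchars($randomPath) . '" alt="" />';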
An easy way to show a random image on page load is the method below.
(Note: You have to rename the images to "1.png", "2.png", etc.)
<?php
//This generates a random number between 1 & 30 (30 is the
//amount of images you have)
$random = rand(1,30);
//Generate image tag (feel free to change src path)
$image = <<<HERE
<img src="{$random}.png" alt="{$random}" />
HERE;
?>
* Content Here *
<!-- Print image tag -->
<?php print $image; ?>
This method is simple and I use this every time when I need a random image.
Hope this helps! ;)
I've recently written this class, which loads a different background on every page load. Just replace the constant with the path to your images.
What it does is loop through your image directory and randomly pick a file from it. This way you don't need to keep track of your images in an array or a DB or whatever. Just upload images to your image directory and they will get picked (randomly).
Call like:
$oImg = new Backgrounds ;
echo $oImg -> successBg() ;
<?php
class Backgrounds
{
    public function __construct()
    {
    }

    public function successBg()
    {
        $aImages = $this->_imageArrays(\constants\IMAGESTRUE, "images/true/");
        if (count($aImages) > 0)
        {
            $iImage = (int) array_rand($aImages, 1);
            return $aImages[$iImage];
        }
        else
        {
            throw new Exception("Image array is empty");
        }
    }

    private function _imageArrays($sDir = '', $sImgpath = '')
    {
        if ($handle = @opendir($sDir))
        {
            $aReturn = (array) array();
            while (false !== ($entry = readdir($handle)))
            {
                if (file_exists($sDir . $entry) && $entry != "." && $entry != "..")
                {
                    $aReturn[] = $sImgpath . $entry;
                }
            }
            return $aReturn;
        }
        else
        {
            throw new Exception("Could not open directory '" . $sDir . "'");
        }
    }
}
?>

Crunch lots of files to generate stats file

I have a bunch of files I need to crunch and I'm worried about scalability and speed.
The filename and file data (only the first line) are stored in an array in RAM to create some statistics files later in the script.
The files must remain files and can't be put into a database.
The filenames are formatted in the following fashion:
Y-M-D-title.ext (where Y is Year, M is Month and D is Day)
I'm actually using glob to list all the files and create my array:
Here is a sample of the code creating the array for "year" or "month" (it's used in a function with only one parameter -> $period):
[...]
function create_data_info($period=NULL){
    $data = array();
    $files = glob(ROOT_DIR.'/'.'*.ext');
    $size = sizeOf($files);
    $existing_title = array(); //Used so we can handle having the same titles two times at different dates.
    if (isSet($period)){
        if ( "year" === $period ){
            for ($i = 0; $i < $size; $i++) {
                $info = extract_info($files[$i], $existing_title);
                //Create the data array with all the data ordered by year/month/day
                $data[(int)$info[5]][] = $info;
                unset($info);
            }
        }elseif ( "month" === $period ){
            for ($i = 0; $i < $size; $i++) {
                $info = extract_info($files[$i], $existing_title);
                $key = $info[5].$info[6];
                //Create the data array with all the data ordered by year/month/day
                $data[(int)$key][] = $info;
                unset($info);
            }
        }
    }
    [...]
}
function extract_info($file, &$existing){
    $full_path_file = $file;
    $file = basename($file);
    $info_file = explode("-", $file, 4);
    $filetitle = explode(".", $info_file[3]);
    $info[0] = $filetitle[0];
    if (!isSet($existing[$info[0]]))
        $existing[$info[0]] = -1;
    $existing[$info[0]] += 1;
    if ($existing[$info[0]] > 0) {
        //We have already found a post with this title;
        //the creation of the cache is based on info[4] data for the filename,
        //so we need to tune it
        $info[0] = $info[0]."-".$existing[$info[0]];
    }
    $info[1] = $info_file[3];
    $info[2] = $full_path_file;
    $post_content = file(ROOT_DIR.'/'.$file, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    $info[3] = $post_content[0]; //first line of the file
    unset($post_content);
    $info[4] = filemtime(ROOT_DIR.'/'.$file);
    $info[5] = $info_file[0]; //year
    $info[6] = $info_file[1]; //month
    $info[7] = $info_file[2]; //day
    return $info;
}
So in my script I only call create_data_info(PERIOD) (PERIOD being "year", "month", etc.).
It returns an array filled with the info I need, and then I can loop through it to create my statistics files.
This process is done every time the PHP script is launched.
My question is: is this code optimal (certainly not), and what can I do to squeeze some juice out of my code?
I don't know how I can cache this (even if it's possible), as there is a lot of I/O involved.
I can change the tree structure if it would change things compared to a flat structure, but from what I found out with my tests, flat seems to be best.
I already thought about writing a little "booster" in C doing only the crunching, but since it's I/O bound, I don't think it would make a huge difference and the application would be a lot less compatible for shared hosting users.
Thank you very much for your input, I hope I was clear enough here. Let me know if you need clarification (and forgive my English mistakes).
To begin with, you should use DirectoryIterator instead of the glob function. When it comes to scandir vs opendir vs glob, glob is as slow as it gets.
Also, when you are dealing with a large number of files, you should try to do all your processing inside one loop; PHP function calls are rather slow.
I see you are using unset($info); yet on every loop iteration $info gets a new value. PHP does its own garbage collection, if that's your concern. unset is a language construct, not a function, and should be pretty fast, but when it's not needed it still makes the whole thing a bit slower.
You are passing $existing as a reference. Is there a practical reason for this? In my experience references make things slower.
And lastly, your script seems to deal with a lot of string processing. You might want to consider some kind of "serialize data and base64 encode/decode" solution, but you should benchmark that specifically; it might be faster or slower depending on the rest of your code. (My idea is that serialize/unserialize MIGHT run faster, as these are native PHP functions, while custom functions doing string processing are slower.)
My answer was not very I/O related, but I hope it was helpful.
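As a sketch of the first point (an assumption about your layout; it reuses the ROOT_DIR constant from the question and the .ext extension), replacing glob() with DirectoryIterator could look like this:
// Collect the *.ext files without glob(); DirectoryIterator entries expose
// isFile(), getExtension() and getPathname() (getExtension() needs PHP >= 5.3.6)
$files = array();
foreach (new DirectoryIterator(ROOT_DIR) as $entry) {
    if ($entry->isFile() && $entry->getExtension() === 'ext') {
        $files[] = $entry->getPathname();
    }
}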

View files in directory with pagination - php

I want to display the files in my directory in the browser. I know that this is possible using opendir and readdir. But what I want is to limit the number of files in the list to a specific number and display the next ones using pagination.
You could use scandir to read all the contents of the directory into an array. Then output the contents of the array based on the pagination value.
$offset = 10; //get this as input from the user, probably as a GET from a link
$quantity = 10; //number of items to display
$filelist = scandir('/mydir');
//get subset of file array
$selectedFiles = array_slice($filelist, $offset - 1, $quantity);
//output appropriate items
foreach ($selectedFiles as $file)
{
    echo '<div class="file">' . $file . '</div>';
}
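One caveat: scandir() also returns the . and .. entries, so you may want to filter those out before slicing, for example:
$filelist = array_values(array_diff(scandir('/mydir'), array('.', '..')));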
Cross-posting an example (also in this question) --
DirectoryIterator and LimitIterator are my new best friends, although glob seems to prefilter more easily. You could also write a custom FilterIterator. Needs PHP > 5.1, I think.
No prefilter:
$dir_iterator = new DirectoryIterator($dir);
$paginated = new LimitIterator($dir_iterator, $page * $perpage, $perpage);
Glob prefilter:
$dir_glob = $dir . '/*.{jpg,gif,png}';
$dir_iterator = new ArrayObject(glob($dir_glob, GLOB_BRACE));
$dir_iterator = $dir_iterator->getIterator();
$paginated = new LimitIterator($dir_iterator, $page * $perpage, $perpage);
Then, do your thing:
foreach ($paginated as $file) { ... }
Note that in the case of the DirectoryIterator example, $file will be an instance of SplFileInfo, whereas in the glob example it is just the disk path.
It depends on how you want to go about it. There are a ton of JavaScript/jQuery pagination libraries out there; just google "JavaScript pagination".
If JavaScript is not an option, or if you would prefer to just use PHP, then it should be relatively simple.
Use opendir/readdir to get a list of all the files. Decide how many you want to display per page. Divide the total by this number to get the number of pages. Then take a slice of the array starting at (page - 1) * (number to list), of length (number to list). These are the files you will show. Pass the page number through GET/POST. If it's too high, use the last page; if it's too low or non-numeric, use the first page.
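A minimal sketch of that approach (assuming a ./mydir directory and ten files per page):
<?php
$dir = './mydir';
$perPage = 10;
// Collect the files, skipping . and ..
$files = array();
$handle = opendir($dir);
while (false !== ($entry = readdir($handle))) {
    if ($entry !== '.' && $entry !== '..') {
        $files[] = $entry;
    }
}
closedir($handle);
sort($files);
// Clamp the requested page into the valid range
$pages = max(1, (int) ceil(count($files) / $perPage));
$page = isset($_GET['page']) ? (int) $_GET['page'] : 1;
$page = min(max($page, 1), $pages);
// Show the slice for this page
foreach (array_slice($files, ($page - 1) * $perPage, $perPage) as $file) {
    echo '<div class="file">' . $file . '</div>';
}
?>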
For pagination, you can use Zend_Paginator.
Once you get list of files in the directory, you only configure the paginator and it will take care of the rest.
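A hedged sketch with Zend Framework 1's Zend_Paginator, treating the file list as a plain array (method names as I recall the ZF1 API; double-check against your version):
$files = array_values(array_diff(scandir('/mydir'), array('.', '..')));
$paginator = Zend_Paginator::factory($files);
$paginator->setItemCountPerPage(10);
$paginator->setCurrentPageNumber(isset($_GET['page']) ? (int) $_GET['page'] : 1);
foreach ($paginator as $file) {
    echo '<div class="file">' . $file . '</div>';
}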
Maybe something like this?
$page = 1;
$resultsPerPage = 10;
$files = array();
$dir = opendir('./mydir'); // open the directory first
while (false !== ($obj = readdir($dir))) {
    $files[] = $obj;
}
closedir($dir);
$limit = $page * $resultsPerPage;
$limit = ($limit > count($files)) ? count($files) : $limit;
for ($i = $limit - $resultsPerPage; $i < $limit; $i++) {
    echo $files[$i];
}
And then have your nav buttons modify the page number.
You should try YUI 2 Pagination, and if you want to display the files in a table, use the DataTable; they are very useful components from the Yahoo User Interface library.
