How can I make my 3-level hierarchy load fast - PHP

Here's how I'm getting my data from database. It is a 3 level hierarchy. Parent, child and children of child.
page_id | parent_id
      1 |         0
      2 |         1
      3 |         0
      4 |         3
Top-level pages have 0 as the parent_id.
$id = 0;

public function getHierarchy($id)
{
    $Pages  = new Pages();
    $arr    = array();
    $result = $Pages->getParent($id);

    foreach ($result as $p) {
        $arr[] = array(
            'title'    => $p['title'],
            'children' => $this->getHierarchy($p['page_id']),
        );
    }

    return $arr;
}
So far I'm getting the data, but it's quite slow: the browser takes a long time before showing anything. How can I make my code faster so the page doesn't take so long to load?
Thanks.

You have multiple options to speed it up:
Select all rows from the table in a single query and do the recursion only on the PHP side (see the sketch after this list). This is only an option if there are not too many rows in the table.
You can self-join the table. This is a nice option if you know that no more levels of hierarchy will be added in the future. See: http://blog.richardknop.com/2009/05/adjacency-list-model/
Use nested sets: http://en.wikipedia.org/wiki/Nested_set_model
Or my favorite: a closure table.
Use some kind of caching. You could cache the DB queries, the whole tree menu HTML, or the PHP array, for example.
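For the first option, a rough sketch could look like the following. It is not a drop-in replacement: the fetchAllPages() call is a hypothetical "fetch everything" method, and the column names are taken from the table in the question.

// Fetch every page in one query, index the rows by parent_id,
// then build the tree entirely in PHP with no further queries.
function buildTree(array $byParent, $parentId = 0)
{
    $branch = array();
    if (empty($byParent[$parentId])) {
        return $branch;
    }
    foreach ($byParent[$parentId] as $row) {
        $branch[] = array(
            'title'    => $row['title'],
            'children' => buildTree($byParent, $row['page_id']),
        );
    }
    return $branch;
}

// e.g. SELECT page_id, parent_id, title FROM pages
$rows = $Pages->fetchAllPages(); // hypothetical method that returns all rows

$byParent = array();
foreach ($rows as $row) {
    $byParent[$row['parent_id']][] = $row;
}

$tree = buildTree($byParent); // same structure your getHierarchy() returns

This replaces one query per node with a single query, which is usually what makes the recursive version feel slow.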
Caching example:
You can cache to plain files or use a specialized API like Memcache, Memcached, or APC.
Usually you have a class with 3 basic methods. An interface could look like:
interface ICache
{
    public function get($key);
    public function set($key, $value, $lifetime = false);
    public function delete($key);
}
You can use it like:
public function getHierarchy($id)
{
    $cached = $this->cache->get($id);
    if (false !== $cached) {
        return $cached;
    }

    // run another method where you build the tree, e.g.:
    // $arr = $this->buildTree($id);

    $this->cache->set($id, $arr, 3600);

    return $arr;
}
In this example the tree is cached for one hour. You could also set an unlimited lifetime and delete the key whenever a new item is inserted into the database.
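For completeness, a minimal file-based implementation of the ICache interface could look like the sketch below. The cache directory and the serialize()-based storage format are assumptions, not part of the original answer; a Memcached- or APC-backed implementation would follow the same interface.

class FileCache implements ICache
{
    private $dir;

    public function __construct($dir)
    {
        $this->dir = rtrim($dir, '/');
    }

    public function get($key)
    {
        $file = $this->path($key);
        if (!is_file($file)) {
            return false;
        }
        $entry = unserialize(file_get_contents($file));
        // Expired entry? Treat it as a miss and clean up.
        if ($entry['expires'] !== false && $entry['expires'] < time()) {
            unlink($file);
            return false;
        }
        return $entry['value'];
    }

    public function set($key, $value, $lifetime = false)
    {
        $entry = array(
            'expires' => $lifetime === false ? false : time() + $lifetime,
            'value'   => $value,
        );
        file_put_contents($this->path($key), serialize($entry));
    }

    public function delete($key)
    {
        $file = $this->path($key);
        if (is_file($file)) {
            unlink($file);
        }
    }

    private function path($key)
    {
        return $this->dir . '/' . md5($key) . '.cache';
    }
}

You would then wire it up with something like $this->cache = new FileCache('/tmp/page-cache');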

Related

How to refactor php foreach using Laravel (Eloquent's) map collection function

I've just watched Adam Wathan's video on Refactoring Loops and Conditionals, and feel like I can use the map collection method in the sumLeagueStats method on my Team model (rather than the foreach).
I have a relationship on teams -> leagues, and the getLeagueStats function gets all of the stats (played, won, drew, lost, for, against, points) from the leagues table for the relevant team.
In the sumLeagueStats method I was going to use a foreach loop to go through each stat by year, take the sum of all of the played, etc., and return it, but having watched the above video, I suspect there is a cleaner, collection-based way to do it.
class Team extends Model {

    public function league()
    {
        return $this->hasMany('league');
    }

    public function getLeagueStats($year = [2018])
    {
        return $this->league()->whereIn('year', $year)->get();
    }

    public function sumLeagueStats($year = [2018])
    {
        // foreach loop would go here -- this is the part I'd like to refactor
        /*
         * Want to return a collection with the following:
         *
         * $this->getLeagueStats()->sum('played');
         * $this->getLeagueStats()->sum('won');
         * $this->getLeagueStats()->sum('drew');
         * $this->getLeagueStats()->sum('lost');
         * $this->getLeagueStats()->sum('for');
         * $this->getLeagueStats()->sum('against');
         * $this->getLeagueStats()->sum('points');
         */
    }
}
I'm new to Laravel, so firstly I want to check whether my suspicions are correct, and secondly I'm looking for any insight/resources for more information, as the docs are slightly lacking.
Calling getLeagueStats every time means a request to your database for every sum call, so create a variable to work with once and build a collection that represents the data:
$stats = $this->getLeagueStats();

return collect([
    'played'  => $stats->sum('played'),
    'won'     => $stats->sum('won'),
    'drew'    => $stats->sum('drew'),
    'lost'    => $stats->sum('lost'),
    'for'     => $stats->sum('for'),
    'against' => $stats->sum('against'),
    'points'  => $stats->sum('points')
]);
Is that what you want?
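If you also want to avoid spelling out every sum('...') call by hand, a collection pipeline along these lines should work; treat it as a sketch rather than the definitive answer. The column list is taken from the question, and mapWithKeys is available from Laravel 5.3 onward.

public function sumLeagueStats($year = [2018])
{
    // Query once, then sum every stat column in a single collection pass.
    $stats = $this->getLeagueStats($year);

    return collect(['played', 'won', 'drew', 'lost', 'for', 'against', 'points'])
        ->mapWithKeys(function ($column) use ($stats) {
            return [$column => $stats->sum($column)];
        });
}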

PHP Multi Threading - Synchronizing a cache file between threads

I created a script that, for a given game situation, tries to find the best possible solution. It does this by simulating each and every possible move and quantifying them, thus deciding which is the best move to take (the one that will result in the fastest victory). To make it faster, I've implemented PHP's pthreads in the following way: each time the main thread needs to find a possible move (let's call this a JOB), it calculates all the possible moves at the current depth, then starts a Pool and adds each possible move to it (let's call these TASKS), so the threads develop the game tree for each move separately, for all the additional depths.
This would look something like this:
(1) Got a new job with 10 possible moves
(1) Created a new pool
(1) Added all jobs as tasks to the pool
(1) The tasks work concurrently, and each returns an integer as a result, stored in a Volatile object
(1) The main thread selects a single move, and performs it
.... the same gets repeated at (1) until the fight is complete
Right now, the TASKS use their own caches: while they work, they save cache entries and reuse them, but they do not share caches between themselves, and they do not carry caches over from one JOB to the next. I tried to resolve this, and in a way I managed to, but I don't think this is the intended approach, because it makes everything WAY slower.
What I tried is as follows: create a class that stores all the cache hashes in arrays, and add it to a Volatile object before creating the pool. Before a task runs, it retrieves this cache and uses it for read/write operations, and when the task finishes, it merges its copy back into the instance stored in the Volatile object. This works, in that the caches made in JOB 1 can be seen in JOB 2, but it makes the whole process much slower than it was when each thread only used its own cache, built while building the tree and destroyed when the thread finished. Am I doing this wrong, or is what I want simply not achievable? Here's my code:
class BattlefieldWork extends Threaded {
    public $taskId;
    public $innerIterator;
    public $thinkAhead;
    public $originalBattlefield;
    public $iteratedBattlefield;
    public $hashes;

    public function __construct($taskId, $thinkAhead, $innerIterator, Battlefield $originalBattlefield, Battlefield $iteratedBattlefield) {
        $this->taskId = $taskId;
        $this->innerIterator = $innerIterator;
        $this->thinkAhead = $thinkAhead;
        $this->originalBattlefield = $originalBattlefield;
        $this->iteratedBattlefield = $iteratedBattlefield;
    }

    public function run() {
        $result = 0;
        $dataSet = $this->worker->getDataSet();

        $HashClassShared = null;
        $dataSet->synchronized(function ($dataSet) use (&$HashClassShared) {
            $HashClassShared = $dataSet['hashes'];
        }, $dataSet);
        $myHashClass = clone $HashClassShared;

        $thinkAhead = $this->thinkAhead;
        $innerIterator = $this->innerIterator;
        $originalBattlefield = $this->originalBattlefield;
        $iteratedBattlefield = $this->iteratedBattlefield;

        // the actual recursive function that builds the tree and quantifies the move, using the cache cloned above
        $result = $this->performThinkAheadMoves($thinkAhead, $innerIterator, $originalBattlefield, $iteratedBattlefield, $myHashClass);

        // retrieve the common cache and upload the result of this thread
        $HashClassShared = null;
        $dataSet->synchronized(function ($dataSet) use ($result, $myHashClass, &$HashClassShared) {
            // store the result of this thread
            $dataSet['results'][$this->taskId] = $result;
            // merge the data collected in this thread with the data stored in the `Volatile` object
            $HashClassShared = $dataSet['hashes'];
            $HashClassShared = $HashClassShared->merge($myHashClass);
        }, $dataSet);
    }
}
This is how I create my tasks, my Volatile, and my Pool:
class Battlefield {
    /* ... */

    public function step() {
        /* ... */
        /* get the possible moves for the current depth, that is 0, and store them in an array named $moves */
        // $nextInnerIterator is an int showing which hero must take an action after the current move
        // $StartingBattlefield is the zero-point Battlefield, which will be used in quantification
        foreach ($moves as $moveid => $move) {
            $moves[$moveid]['quantify'] = new BattlefieldWork($moveid, self::$thinkAhead, $nextInnerIterator, $StartingBattlefield, $this);
        }

        $Volatile = new Volatile();
        $Volatile['results'] = array();
        $Volatile['hashes'] = $this->HashClass;

        $pool = new Pool(6, 'BattlefieldWorker', [$Volatile]);
        foreach ($moves as $moveid => $move) {
            if (is_a($moves[$moveid]['quantify'], 'BattlefieldWork')) {
                $pool->submit($moves[$moveid]['quantify']);
            }
        }
        while ($pool->collect());
        $pool->shutdown();

        $HashClass = $Volatile['hashes'];
        $this->HashClass = $Volatile['hashes'];

        foreach ($Volatile['results'] as $moveid => $partialResult) {
            $moves[$moveid]['quantify'] = $partialResult;
        }

        /* The moves are ordered based on quantify, one is selected, and then if the battle is not yet finished, step() is called again */
    }
}
And here is how I am merging two hash classes:
class HashClass {
    public $id = null;
    public $cacheDir;

    public $battlefieldHashes = array();
    public $battlefieldCleanupHashes = array();
    public $battlefieldMoveHashes = array();

    public function merge(HashClass $HashClass) {
        $this->battlefieldCleanupHashes = array_merge($this->battlefieldCleanupHashes, $HashClass->battlefieldCleanupHashes);
        $this->battlefieldMoveHashes = array_merge($this->battlefieldMoveHashes, $HashClass->battlefieldMoveHashes);

        return $this;
    }
}
I've benchmarked each part of the code to see where I am losing time, but everything seems fast enough not to explain the slowdown I am experiencing. What I am thinking is that the problem lies in the threads: sometimes it seems that no work is being done at all, as if they are waiting for some other thread. Any insight into what could be the problem would be greatly appreciated.

How to achieve pagination with HTML, PHP and MySQL

I am working on a web project in php that could potentially have a lot of users. From an administrative point of view, I am attempting to display a table containing information about each user, and seeing as the users table could grow extremely large, I would like to use pagination.
On the backend, I created a service that expects a limit and an offset parameter that will be used to query the database for records within the appropriate range. The service returns the total count of records in the table, along with the records matching the query.
public static function getUsersInfo($limit = 50, $offset = 1)
{
    $users_count = Users::count(
        array(
            "column" => "user_id"
        )
    );

    $users_info = array();
    $users = Users::query()
        ->order('created_at')
        ->limit($limit, $offset)
        ->execute()
        ->toArray();

    foreach ($users as $index => $user) {
        $users_info[$index]['user_id'] = $user['user_id'];
        $users_info[$index]['name'] = $user['first_name'] . " " . $user['last_name'];
        $users_info[$index]['phone'] = $user['phone'];
        $users_info[$index]['profile_image_url'] = $user['profile_image_url'];
    }

    $results = array(
        'users_count' => $users_count,
        'users_info' => $users_info
    );

    return !empty($results) ? $results : false;
}
On the frontend, what I would like to achieve ideally is to have the navigation displayed at the bottom of the table, with the typical previous and next buttons, and additionally a few numbers that allow the user to quickly jump to a desired page when its page number is displayed. This is what I have so far for the UsersController, with no pagination.
class UsersController extends ControllerBase
{
    public function indexAction()
    {
        $usersObject = new Users();
        $data = $usersObject->getUsers();
        if ($data['status'] == Constants::SUCCESS) {
            $users = $data['data']['users_info'];
            $users_count = $data['data']['users_count'];
            $this->view->setVar('users', $users);
        }
        echo $this->view->render('admin/users');
    }

    public function getUsersAction()
    {
        echo Pagination::create_links(15, 5, 1);
    }
}
I don't have any working pagination yet, but I was thinking a good way to go would be to create a Pagination library with a create_links function that takes:
the total_count of records in the database, so I know how many pages are expected
the limit, so I know how many records to collect
the cur_page, so I know where to start retrieving from
So when that function is called with the correct parameters, it would generate the html code to achieve the pagination, and that in turn can then be passed to the view and displayed.
I have never done this before, but from the research I have done so far, it seems like this might be a good way to approach it. Any guidance, suggestions, or anything at all really, regarding this would be greatly appreciated.
It looks like you are using some bespoke MVC-ish framework. While this does not answer your question exactly, I have a few points:
If you are looking at a lot of users, pagination is the least of your problems. You need to consider how the database is indexed, how the results are returned, and much more.
Without understanding the underlying database abstraction layer / driver you are using, it is difficult to determine whether or not your ->limit($limit, $offset) line will work correctly. The offset should probably default to 0, but without knowing the code it is hard to say.
The ternary operator in your first method (return !empty($results) ? $results : false;) is currently pointless, because the preceding statement means the variable will always be a non-empty array.
Avoid echo statements in controllers. They should return to a templating engine to output a view.
Your Users class would be better named User, as in MVC the 'Model' represents a singular entity.
While it is not a general rule, most pagination systems I have used have worked on a zero-indexed basis (page 1 is page 0), so calculating the limit range is simple:
$total_records = 1000;
$max_records = 20;
$current_page = 0;
$offset = $max_records * $current_page;
$sql = "SELECT * FROM foo LIMIT $offset, $max_records";

Best way to store this data in a variable?

Some quick background info: I'm coding up a site which matches books to the classes they're required for.
I have two pieces of data that I need to represent in my code-- which books go with which classes, and the data (titles, authors, pricing, etc.) on these books.
Currently I represent this all with two arrays: $classArray, and $Books_data.
The advantage of this approach over a one-variable approach is that I don't repeat myself: if a book is required multiple times for different classes, only the ISBN needs to be stored in $classArray, and I can store the data in $Books_data. This advantage is especially relevant because I have to query the pricing data from APIs on the fly. If I only had a $classBooksArray, I'd have to loop the query responses into a big array, repeating myself (seemingly) unnecessarily.
The disadvantage of this approach is that these variables follow each other almost everywhere like Siamese twins. Nearly every function that needs one, needs the other. And my coding spidey sense tells me it might be unnecessary.
So, what would be the best way to store this data? Two arrays, one array, or some other approach I haven't mentioned (e.g. passing by reference)?
Why not use an associative array with two keys, one pointing to the array of classes and one to the books data?
$allData = array("classes" => &$classArray, "books" => &$Books_data);
That way you're only passing around 1 variable (less clutter) but retain all the benefits of separate data stores for books and classes.
Though, to be honest, if it's just TWO sets of data, IMHO your spidey sense is wrong: passing both as separate parameters is perfectly fine. Once you get into a set of Siamese-sextuplet variables, the above approach starts to actually bring benefits.
A multidimensional array.
$collection = array(
    'classes' => array( /* Contents of $classArray */ ),
    'books'   => array( /* Contents of $Books_data */ )
);

function some_function($collection) {
    // looping over books
    foreach ($collection['books'] as $book) {
        // yadda yadda
    }
}
Or better yet a class:
/* Define */
class Collection {
    private $books;
    private $classes;

    public function __construct($classes = array(), $books = array()) {
        $this->books = $books;
        $this->classes = $classes;
    }

    public function addBook($book) {
        $this->books[] = $book;
    }

    public function addClass($class) {
        $this->classes[] = $class;
    }

    public function get_classes() {
        return $this->classes;
    }

    public function get_books() {
        return $this->books;
    }
}

function some_function(Collection $col) {
    // looping over books
    foreach ($col->get_books() as $book) {
        // yadda yadda
    }
}

/* Usage */
$collection = new Collection(); // you could also pass classes and books in the constructor
$collection->addBook($book);
some_function($collection);
If your data were in a database, your current proposal would be canonical, as it is a normal form. The two variables would just become tables, with ISBN as a foreign key into the books table (plus a third table, since a class has several books). I would probably stick with the current implementation, as it will be very easy to move to a database when that becomes necessary (and it usually happens sooner than expected).
EDIT: a comment says it is already in a database... what I do not understand is why you would want to store a full database in memory instead of just keeping what is necessary for the current task.
Let's be OO and put the arrays into an object. Define a class with those properties, load it up, and call its methods. Or, if you must have other functions operating with the object's data, pass the instance around. Disallow direct access to the data, but provide methods for extracting the salient info.
class book_class_association {
    protected $books_to_classes = array();
    protected $classes_to_books = array();

    function __construct() {
        $this->books_to_classes = array(
            'mathbook1' => array('math'),
            'mathbook2' => array('math'),
        );
        $this->classes_to_books = array(
            'math' => array('mathbook1', 'mathbook2'),
        );
    }

    function classes_for_book( $book_name ) {
        return $this->books_to_classes[$book_name];
    }

    function books_for_class( $class_name ) {
        return $this->classes_to_books[$class_name];
    }
}
Is there a reason you are not storing this data in a database and then querying it? It is a many-to-many relationship, and you would need three tables: class, book, and class_book_intersection.
So, for example, your UI could have a "select class" list, where the options are derived from the rows in class.
Then, if the selected class id is 123, the query would be something like:
SELECT book.title, book.cost
FROM book
INNER JOIN class_book_intersection
    ON class_book_intersection.classid = 123
   AND class_book_intersection.bookid = book.bookid
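If it helps to see that query wired up in PHP, here is a minimal PDO sketch. The table and column names come from the answer above, while the $pdo connection and the function name are assumptions.

// Fetch all books required for one class; $pdo is an already-configured PDO instance.
function books_for_class(PDO $pdo, $class_id)
{
    $sql = 'SELECT book.title, book.cost
              FROM book
              INNER JOIN class_book_intersection
                      ON class_book_intersection.bookid = book.bookid
             WHERE class_book_intersection.classid = :classid';

    $stmt = $pdo->prepare($sql);
    $stmt->execute(array('classid' => $class_id));

    return $stmt->fetchAll(PDO::FETCH_ASSOC);
}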

Zend framework caching pagination results for different queries

Using Zend_Paginator and the paginator cache works fine, but the same cached pages are returned for everything. I.e. I first look at a list of articles, and when I go to view the categories, the articles list is returned. How can I tell the paginator which result set I am looking for?
Also, how can I clear the paginated results without re-querying the paginator? I.e. I have updated a news article, therefore the pagination cache needs to be cleared.
Thanks
Zend_Paginator uses two methods to define the cache ID: _getCacheId and _getCacheInternalId. The second function calculates the cache ID based on two parameters: the number of items per page and a special hash of the adapter object. The first function (_getCacheId) calculates the cache ID using the result of _getCacheInternalId and the current page.
So, if you are using two different paginator objects with the same three internal parameters (adapter, current page number and number of items per page), the cache ID will be the same for both objects.
So the only way I see is to define your own paginator class, inherited from Zend_Paginator, and re-define one of these two internal functions to add a salt to the cache ID.
Something like this:
class My_Paginator extends Zend_Paginator {
    protected $_cacheSalt = '';

    public static function factory($data, $adapter = self::INTERNAL_ADAPTER, array $prefixPaths = null) {
        $paginator = parent::factory($data, $adapter, $prefixPaths);
        return new self($paginator->getAdapter());
    }

    public function setCacheSalt($salt) {
        $this->_cacheSalt = $salt;
        return $this;
    }

    public function getCacheSalt() {
        return $this->_cacheSalt;
    }

    protected function _getCacheId($page = null) {
        $cacheSalt = $this->getCacheSalt();
        if ($cacheSalt != '') {
            $cacheSalt = '_' . $cacheSalt;
        }
        return parent::_getCacheId($page) . $cacheSalt;
    }
}
$articlesPaginator = My_Paginator::factory($articlesSelect, 'DbSelect');
$articlesPaginator->setCacheSalt('articles');

$categoriesPaginator = My_Paginator::factory($categoriesSelect, 'DbSelect');
$categoriesPaginator->setCacheSalt('categories');
