I have an array that gets queried each time a page is loaded, so I want to minimize the overhead and store the array into a session variable. The array file is 15kb so far. I still have to add about 300 words to each array sub key. So the file size might grow to anywhere from 100kb to 500kb. Just a guess.
The array is used to store the page data such as title, description and post content.
Here is the structure of the array:
11 main keys.
Within each main key are anywhere from 1 to 20 sub keys; most have about 3 to 7.
Each sub key has its own array with title, description and post.
Title and description do not hold much, but post will hold approximately 300 words or less.
The values in the array will remain static.
Here's a sample of what it looks like with 2 main keys and 2 sub keys under each.
$pages = array(
    'Administrator' => array(
        'network-administrator' => array('title' => 'title info here', 'description' => 'description info here', 'post' => '<p>post info here - about 300 words.</p>'),
        'database administrator' => array('title' => 'title info here', 'description' => 'description info here', 'post' => '<p>post info here - about 300 words.</p>'),
    ),
    'Analyst' => array(
        'business systems analyst' => array('title' => 'title info here', 'description' => 'description info here', 'post' => '<p>post info here - about 300 words.</p>'),
        'data-analyst' => array('title' => 'title info here', 'description' => 'description info here', 'post' => '<p>post info here - about 300 words.</p>'),
    ),
);
My question has three parts.
1) Can I put this into a session variable and still be able to access the data from the session array the same way I'm accessing it directly from the array itself?
2) Is there any benefit to putting the array into a session to lessen the overhead of looping through the array on each page load?
This is how I access a value from the array:
$content = $pages['Administrator']['network-administrator'];
$title = $content['title'];
$description = $content['description'];
$post = $content['post'];
3) Would I now access the array value the same way as above, or by writing it like this?
$pages = $_SESSION[$pages];
$content = $pages['Administrator']['network-administrator'];
$title = $content['title'];
$description = $content['description'];
$post = $content['post'];
Need some clarity, thanks for your help.
Having them in the session would increase the overhead and decrease performance, since the data would be stored once more for each user. By default sessions are stored as files, so you'd introduce extra file I/O overhead as well, increasing the load - and I don't see how storing them in the database would be much better either.
If you really want to increase the performance of handling that data, it should be in a memory cache. Memcache or APC (as already mentioned by Cheery) are good alternatives.
However, that will only help if your array handling is really a bottleneck. Based on your description I'm really not convinced. Measure first, and only after that try to optimize.
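If you do reach for a memory cache, here is a minimal sketch using APCu, the userland successor to the APC mentioned above (apcu_fetch/apcu_store are real APCu functions; build_pages_array() is a hypothetical helper that builds the static $pages array):
// A cache shared by all requests avoids keeping a copy per user session.
$pages = apcu_fetch('pages');
if ($pages === false) {
    $pages = build_pages_array();       // build the array once
    apcu_store('pages', $pages, 3600);  // cache for an hour, shared by every user
}
$content = $pages['Administrator']['network-administrator'];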
If the array values are "static" (not different for each user), there is no benefit to putting them in the session, and I think it will not improve performance at all.
Still, here are my answers to your questions:
1) You will be able to access the array like you already do; sessions can handle arrays.
2) It won't lessen the overhead. Sessions are stored in files, and the data is serialized.
3) $pages = $_SESSION['pages'], or directly $_SESSION['pages']['Administrator'].
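To make answers 1 and 3 concrete, here is a minimal sketch (the 'pages' string key is just an obvious choice, not required):
session_start();

// Store the static array in the session once...
if (!isset($_SESSION['pages'])) {
    $_SESSION['pages'] = $pages;
}

// ...then read it exactly as before. Note the string key 'pages',
// not $pages itself as the key:
$content = $_SESSION['pages']['Administrator']['network-administrator'];
$title   = $content['title'];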
I'm currently working on scanning a folder in my S3 bucket and removing files that are no longer in my database. The problem is that I have millions of files, so there's no way of scanning them in one go.
// get files
$files = $s3->getIterator('ListObjects', array(
"Bucket" => $S3Bucket,
"Prefix" => 'collections/items/',
"Delimiter" => '/'
), array(
'return_prefixes' => true,
'names_only' => true,
'limit' => 10
));
The documentation included something about limiting results, but I can't find anything about offsetting. I want to be able to start from 0, scan 500 items, remove them, stop, save the last scanned index, and then run the script again, starting from the saved index (501), scanning 500 items, and so on.
Does the SDK offer some sort of offset option? Is it called something else? Or can you recommend a different method of scanning such a large folder?
Remember the last key you processed and use it as the Marker parameter.
$files = $s3->getIterator('ListObjects', array(
"Bucket" => "mybucket",
"Marker" => "last/key"
));
By the way, don't set Limit; it slows things down. A Limit of 10 will cause a request to the API for every 10 objects, while the API can return up to 1,000 objects per request.
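A rough sketch of that marker-based batching with the SDK v2 listObjects call (load_marker()/save_marker() are hypothetical helpers for persisting the last processed key, e.g. in a file or DB row):
$lastKey = null;

$result = $s3->listObjects(array(
    'Bucket'  => $S3Bucket,
    'Prefix'  => 'collections/items/',
    'Marker'  => load_marker(),  // empty string on the first run
    'MaxKeys' => 500,            // one request returning up to 500 keys
));

foreach ($result['Contents'] as $object) {
    // ... delete the object if its key is no longer in the database ...
    $lastKey = $object['Key'];
}

if ($lastKey !== null) {
    save_marker($lastKey);  // the next run continues after this key
}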
I'm currently working on a gallery system using CakePHP. In it, we have paginated each image album so that each page contains at most a set number of images. I have attained this by using:
$this->paginate = array(
'conditions' => array(
'Item.album_id' => $id
),
'order' => array(
'Item.added' => 'desc'
),
'limit' => '50'
);
in my controller's view action. With this, I can show all the items on each page with no problems.
I'm currently, however, facing a single issue: I need to show, after all the items in the current page, a button that leads to the next page. This is not a problem, except that by design the button that says NEXT PAGE should have an item from the next page as its background.
I have looked around, trying to figure out a way to pull an item from the next page of a paginated system in cakephp without luck. Has anyone done this, or does anyone have a clue how to go about it?
You have to query the preview image manually in the current action.
To access the current page number you can use $this->params in the action. Then query the images with the 'limit' and 'page' parameters as mentioned in the documentation.
After that set the image URL like this:
$this->set('preview_image_url', $queried_url);
Then in the view use inline styling to set the background for the next button.
With Alto's help, I ended up doing this (Putting it here just in case anyone wonders exactly how I did it):
$CurrPage = $this->params['paging']['Item']['page'];
$NextParams = array(
'limit' => 1,
//'page' => $CurrPage + 1,
'order' => array(
'Item.added' => 'desc'
),
'offset' => $CurrPage*50
//'order' =>('rand()')
);
$NextImage = $this->Item->find('first', $NextParams);
$this->set('NextImage', $NextImage);
This gives me the first item from the following page, which is good enough for me. Note the commented 'order' =>('rand()') - I tried to grab a random item from the following page but, alas, it seems like Cake first scrambles all items and THEN paginates. There's probably a way around this, but for now this one did it for me.
Also, using 'page' => $CurrPage + 1 didn't seem to work; the system instead returned a seemingly randomly chosen image. That's why I defaulted to using 'offset' => $CurrPage*50, where 50 is the number of items per page I'm showing.
I want to log every action that the users do. What's the best way to do this and why? Using Custom post types to insert every action as a new post or using user_meta and save details in a multidimensional array? The data would look like this:
array(
    array(
        'type'       => 'comment',
        'time'       => 1416335275,
        'comment_id' => 210
    ),
    array(
        'type'    => 'post',
        'time'    => 1416335275,
        'post_id' => 450
    ),
    array(
        'type'       => 'visited',
        'visit_type' => 'page',
        'time'       => 1416335275,
        'page_id'    => 378
    ),
    // ... etc.
)
I'm not asking how to do this, just what you think is the best way to store that data.
Relevant question: Why in the world would you want to do this?
OK, well, to answer this in a generalized sense: WordPress is a program that interfaces a database with HTTP requests. So you'd have to capture the content of each HTTP request, and probably filter requests from logged-in users versus not-logged-in users. To be really specific, you'd also have to capture the state of the database at each intersection! Sounds like a nightmare.
Probably you have something much more specific in mind than: "I want to log every action that the users do."
The best way to store data relevant to a user is via the user meta system. For example:
http://codex.wordpress.org/Function_Reference/add_user_meta
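A minimal sketch of that approach, with one non-unique meta row per event (add_user_meta() and get_user_meta() are core WordPress functions; the 'user_action_log' meta key and the helper function are assumptions):
// 'user_action_log' is an assumed meta key, not a WordPress standard.
function log_user_action( $user_id, array $entry ) {
    $entry['time'] = time();
    add_user_meta( $user_id, 'user_action_log', $entry ); // non-unique by default
}

log_user_action( get_current_user_id(), array(
    'type'       => 'comment',
    'comment_id' => 210,
) );

// Reading back: $single = false returns every logged row as an array.
$log = get_user_meta( $user_id, 'user_action_log', false );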
I'm not sure if this is even possible, but what the heck :) I have two URLs, both trying to insert different data into the same table. Example:
We have a table "food" and two URLs whose functionality inserts some values into the FOOD table:
http://example.com/insert_food_1
http://example.com/insert_food_2
When loading both URLs at the same time, each one waits for the other to finish first and only then inserts its values into the DB.
I know this is called multithreading or something... but I'm not sure if this can be done with PHP (or Laravel).
Any help would be much appreciated. My code looks like so ...
$postToInsert = Post::create(array(
    'service_id' => 1,
    'custom_id'  => $post->body->items[0]->id,
    'url'        => $post->body->items[0]->share_url,
    'title'      => $post->body->items[0]->title,
    'title_tags' => $tags,
    'media_type' => $mediaType,
    'image_url'  => $imageURL,
    'video_url'  => $videoURL,
));
// Post::create() already persists the model, so no separate save() call is needed.
I kind of fixed it. Opening them in separate browsers, or running them via cURL from the terminal, solves the problem.
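For what it's worth, one possible explanation (an assumption; nothing in this thread confirms it) is PHP's session file locking: two requests sharing the same session cookie are serialized until the first releases the session, which would fit why separate browsers (separate sessions) run concurrently. If native PHP sessions are in play, releasing the lock early lets the requests overlap:
session_start();
// ... read whatever is needed from $_SESSION ...

// Release the session file lock so a second request from the same
// browser doesn't have to wait for this one to finish.
session_write_close();

// ... long-running insert work continues here ...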
Thanks for all the help
I have an application that generates an array of statistics based on a greyhound's racing history. This array is then used to generate a table, which is then output to the browser. I am currently working on a function that will generate an excel download based on these statistics. However, this excel download will only be available after the original processing has been completed. Let me explain.
The user clicks on a race name
The data for that race is then processed and displayed in a table.
Underneath the table is a link for an excel download.
However, this is where I get stuck. The excel download exists within another method within the same controller like so...
function view($race_id) {
//Process race data and place in $stats
//Output table & excel link
}
function view_excel($race_id) {
//Process race data <- I don't want it to have to process all over again!
//Output excel sheet
}
As you can see, the data has already been processed in the "view" method, so it seems like a massive waste of resources to process it all over again in the "view_excel" method.
Therefore, I need a way of transferring $stats over to the excel method when the link is clicked, to prevent it from having to be reproduced. The only methods I can think of are as follows.
Transferring $stats over to the excel method using a session flash
The variable may end up being too big for a session variable. Also, if for some reason the excel method is refreshed, the variable will be lost.
Transferring $stats over to the excel method using an ordinary session variable
As above, the variable may end up being too big for a session variable. This has the benefit that it won't be lost on a page refresh, but I'm not sure how I would go about destroying old session variables, especially if the user is processing a lot of races in a short period of time.
Storing $stats in a database and retrieving it in the excel method
This seems like the most viable method. However, it seems like a lot of effort just to transfer one variable across. Also, I would have to implement some sort of cron job to remove old database entries.
An example of $stats:
Array
(
[1] => Array
(
[fcalc7] =>
[avgcalc7] =>
[avgcalc3] => 86.15
[sumpos7] =>
[sumpos3] => 9
[sumfin7] =>
[sumfin3] => 8
[total_wins] => 0
[percent_wins] => 0
[total_processed] => 4
[total_races] => 5
)
[2] => Array
(
[fcalc7] => 28.58
[avgcalc7] => 16.41
[avgcalc3] => 28.70
[sumpos7] => 18
[sumpos3] => 5
[sumfin7] => 23
[sumfin3] => 7
[total_wins] => 0
[percent_wins] => 0
[total_processed] => 7
[total_races] => 46
)
[3] => Array
(
[fcalc7] => 28.47
[avgcalc7] => 16.42
[avgcalc3] => 28.78
[sumpos7] => 28
[sumpos3] => 11
[sumfin7] => 21
[sumfin3] => 10
[total_wins] => 0
[percent_wins] => 0
[total_processed] => 7
[total_races] => 63
)
)
Would be great to hear your ideas.
Dan
You could serialize the array into a file in sys_get_temp_dir() with a data-dependent file name. The only problem left is cleaning up old files.
Putting it into the database is also possible, as you said, and deleting old data is easier than on the file system if you track the creation time.
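A rough sketch of the file approach, assuming the race id from your view($race_id) action is used for the data-dependent file name (the naming scheme is an assumption):
// In view(): cache the computed stats, keyed by race id.
$cacheFile = sys_get_temp_dir() . '/race_stats_' . (int) $race_id . '.ser';
file_put_contents($cacheFile, serialize($stats));

// In view_excel(): reuse the cached stats if they exist.
if (is_readable($cacheFile)) {
    $stats = unserialize(file_get_contents($cacheFile));
} else {
    // ... fall back to processing the race data again ...
}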