OK, here is the link to the cache library he wrote: https://github.com/philsturgeon/codeigniter-cache
Anyway, his documentation is vague and not very helpful. I know it's self-explanatory... to a point.
$this->cache->model('blog_m', 'getPosts', array($category_id, 'live'), 120); // keep for 2 minutes
What is the 3rd parameter?
And is that what creates the cache, or is this what creates a cache file:
$this->cache->write($data, 'cached-name');
And if it is, what exactly is $data supposed to hold? The overall query, or...?
If anyone could give an explanation of how you create a cache file... Basically, I want to cache the query that selects a bunch of news posts, and every time a new post is created, delete that cache and recache it so the new post shows up.
The documentation seems fairly clear to me. Anyway, I'll try to explain it in better terms:
// cached model call
$this->cache->model('blog_m', 'getPosts', array($category_id, 'live'), 120); // keep for 2 minutes
This calls the method getPosts on the model blog_m and caches the result for 120 seconds; the third parameter is the array of arguments passed to getPosts (here $category_id and 'live'), and the fourth is the TTL in seconds. If you make the same call again within the next 2 minutes, it will return the cached result, otherwise it will fetch the data from the database and update the cache. It's good for model methods that you call very frequently.
If you want to manually add and get data from a cache, then you use:
// cached array or object
$this->cache->write($data, 'cached-name');
$data = $this->cache->get('cached-name');
$data will hold whatever you want to cache. If you want to cache the user's email, for example, here's how you would cache and fetch it:
$email = 'foobar@example.com';
$this->cache->write($email, 'user-email');
// to fetch
$user_email = $this->cache->get('user-email');
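To get the behaviour you asked about (refreshing the news list whenever a post is added), the usual pattern is to delete the cached entry in the same code path that inserts the post, so the next read rebuilds it. A minimal sketch, assuming the library exposes a delete() method keyed by the cache name (check the library source for the exact method; the model and key names here are just placeholders):
// wherever you render the news list
$posts = $this->cache->get('news-posts');
if (!$posts) {
    $posts = $this->blog_m->getPosts($category_id, 'live');
    $this->cache->write($posts, 'news-posts');
}

// wherever a new post is saved
$this->blog_m->insertPost($post);    // your own insert code
$this->cache->delete('news-posts');  // invalidate so the next read rebuilds the cache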
I want to cache the API responses so that the number of requests to the API server is reduced. The APIs are written in PHP using the Zend framework.
My approach: I created a Redis cluster and used phpfastcache to connect to it. With phpfastcache, we can only set an expiry time for the cached response.
Whenever the data is updated before the cache expires, the approach above still returns the older response. What I want is that whenever the response is updated, the old cache is cleared and a new entry is written under the same key.
I have attached a sample script that I used.
It would be great if anyone could provide a solution for this.
Thanks in advance.
Code:
<?php
// phpfastcache is the package used for caching
use Phpfastcache\CacheManager;
use Phpfastcache\Drivers\Redis\Config;

require '/path/to/vendor/autoload.php'; // composer autoloader

// $InstanceCache must be global
$InstanceCache = CacheManager::getInstance('redis', new Config([
    'host'     => 'IP_address',
    'port'     => 6379,
    'password' => 'redis_password', // redis auth password
    'database' => 0,                // redis database index
]));

// method of the API class; $InstanceCache is pulled in from the global scope
public function function_name($parameter)
{
    global $InstanceCache;

    $key = "unique_name";
    $CacheString = $InstanceCache->getItem($key);

    if (is_null($CacheString->get())) {
        // cache miss: query the database and store the result
        $sql = "SELECT * FROM employees"; // sql query for function_name
        $res = $this->db_query($sql);
        $row = null;
        if ($this->db_num_rows($res) > 0) {
            $row = $this->db_fetch_object($res);
        }
        $this->db_free_results($res);

        $CacheString->set($row)->expiresAfter(300); // time in seconds, adjust as needed
        $InstanceCache->save($CacheString);
        echo json_encode($CacheString->get()); // return as JSON
    } else {
        // cache hit: return the cached copy
        echo json_encode($CacheString->get());
    }
}
?>
Like I told you on GitHub, I think you misunderstood the very concept of caching.
Caching means that you store your data for a desired TTL.
If you need the freshest data, then you must re-fetch it from the source (your database here).
A cache is not meant to be dynamic; it's meant to be static and to help reduce the request load on your backend.
So in your case, just fetch from the source without caching and you'll be fine. It does not make any sense to have Phpfastcache interrogate your database on every request and then compare the result to the cached data to check whether the database copy is fresher.
In fact, the cost in time of the whole operation would be higher than simply fetching from the source.
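That said, if you control the code that updates the data, you can get the behaviour you described by deleting the cached item by its key right after the update; the next read is then a cache miss and rebuilds the entry. A rough sketch, reusing the $InstanceCache instance and key from your script (update_employee() just stands in for your real update code):
// in the code path that modifies the data
function update_employee($InstanceCache, $data)
{
    // ... run the UPDATE/INSERT against the database here ...

    // drop the stale entry so the next request rebuilds it
    $InstanceCache->deleteItem('unique_name');
}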
I've got a SugarCRM plugin which exports data to an external service. I'm using logic hooks for updated/deleted/new Contacts, but I've got a problem with synchronizing already existing data. I have to extract all the data from SugarCRM, and there are two SugarBean methods I've tried to use: get_full_list() and get_list(). The first one gives me the full Contact list, but I need to send it in batches of at most 1000 Contacts per JSON; the second method returns only the first page of Contacts (depending on config settings, 10 to 1000 entries max).
I'm using this method ATM:
// prepare contacts data from SugarBean
$bean = BeanFactory::getBean($module);
$contactResults = $bean->get_full_list();
Then I foreach over $contactResults, convert the data I want into the required format, and send it as JSON via a POST request. I've tried to find a way to split it into batches, but I'm stuck :( Neither get_full_list nor get_list seems to work for me.
Any suggestions? Maybe someone solved this issue already?
Thanks in advance!
It sounds to me like your problem is creating the batches? If not, please be more specific about what isn't working.
For splitting an array into batches, you may want to have a look at https://php.net/manual/en/function.array-chunk.php
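For instance, a rough sketch along these lines (format_contacts() and send_post_request() are placeholders for your own mapping and HTTP code):
$bean = BeanFactory::getBean('Contacts');
$contacts = $bean->get_full_list();

// split into batches of 1000 and send each one as its own JSON payload
foreach (array_chunk($contacts, 1000) as $batch) {
    $payload = json_encode(format_contacts($batch));
    send_post_request($payload);
}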
Also get_list supports retrieving later pages. It is defined like this: function get_list($order_by = "", $where = "", $row_offset = 0, $limit=-1, $max=-1, $show_deleted = 0, $singleSelect=false, $select_fields = array()).
That means for the second page you could specify $row_offset = 1000, for the third page make it 2000, and so on. So basically run a loop that calls get_list with $limit = 1000 and increases an initial $row_offset of 0 by 1000 after each iteration, until the function returns fewer than 1000 records (or none at all), roughly as sketched below.
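A rough sketch of that loop, assuming the stock SugarBean::get_list(), which returns an array whose 'list' key holds the beans for the page (verify against your Sugar version):
$bean = BeanFactory::getBean('Contacts');
$offset = 0;
$limit = 1000;

do {
    $page = $bean->get_list('', '', $offset, $limit);
    $records = !empty($page['list']) ? $page['list'] : array();

    if (!empty($records)) {
        // convert $records to your format and send them as one JSON batch here
    }

    $offset += $limit;
} while (count($records) === $limit); // a short (or empty) page means we're done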
Here are some general hints if you run into problems with processing those beans:
If the problem you're having is incomplete data, try loading each bean manually by using its ID. Some Sugar functions don't load all (special) fields by default.
If things seem to just fail for no reason, make sure to check your PHP log for errors. Loading that many beans at once could cause problems with your PHP max_execution_time or memory_limit.
I have a website where the front page contains a search form with several fields.
When the user performs a search, I make an ajax call to a function in a controller.
Basically, when the user clicks on the submit button, I send an ajax call via post to:
Route::post('/search', 'SearchController@general');
Then, in the SearchController class, in the function general, I store the values received in a session variable which is an object:
Session::get("search")->language = Input::get("language");
Session::get("search")->category = Input::get("category");
//I'm using examples, not the real variable names
After updating the session variable, in fact right after the code snippet shown above, I create (or overwrite) a cookie storing the session values:
Cookie::queue("mysite_search", json_encode(Session::get("search")));
And after that operation, I perform the search query and send the results, etc.
All that works fine, but I'm not getting the values back from the cookie. Let me explain.
As soon as the front page of my website is opened, I perform an action like this:
if (!Session::has("search")) {
    // check for a cookie
    $search = Cookie::get('mysite_search');
    if ($search) {
        Session::put("search", json_decode($search));
    } else {
        $search = new stdClass();
        $search->language = "any";
        $search->category = "any";
        Session::put("search", $search);
    }
}
That always seems to fail: if ($search) always returns false, and as a result my session variable search always ends up with its language and category properties set to the value any. (Again: I'm using examples, not the real variable names.)
So, I would like to know what is happening here and how I could achieve what I'm intending to do.
I tried to put Session::put("search", json_decode($search)); right after $search = Cookie::get('mysite_search');, removing the whole if/else block, and that throws an error (the ajax call returns an error), so the whole thing is failing at some point, either when storing the object in the cookie or when retrieving it.
Or could also be something else. I don't know. That's why I'm here. Thanks for reading such a long question.
OK. This is what was going on.
The problem was this:
Cookie::queue("mysite_search", json_encode(Session::get("search")));
Before having it that way I had this:
Cookie::forever("mysite_search", json_encode(Session::get("search")));
But for some reason, the forever approach wasn't creating any cookie, so I switched to queue (this is Laravel 4.2). But queue needs a third parameter with the expiration time, so what was really going on is that the cookie was being deleted after closing the browser (I also have session.php in the app/config folder set to 'lifetime' => 0 and 'expire_on_close' => true, which is exactly what I want).
In simple words, I set the expiration time to "forever" (the third parameter is in minutes, so 2592000 minutes is roughly 5 years) this way:
Cookie::queue("mysite_search", json_encode(Session::get("search")), 2592000);
And now it seems to be working fine after testing it.
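For what it's worth, the likely reason the forever approach created no cookie is that Cookie::forever() only builds the cookie object; it still has to be queued or attached to a response before it's sent. A hedged sketch of the two usual ways to do that in Laravel 4.x (worth checking against the 4.2 docs):
// option 1: queue the forever cookie so Laravel attaches it to the next response
Cookie::queue(Cookie::forever("mysite_search", json_encode(Session::get("search"))));

// option 2: attach it explicitly to the response you return
$cookie = Cookie::forever("mysite_search", json_encode(Session::get("search")));
return Response::make($content)->withCookie($cookie);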
I am looking for a solution in CakePHP to store and read temporary data:
I read some XML from other websites in order to display some news on my website, but on each page load it makes a call to those XML sources.
Is there a way (memcached-like) in CakePHP to store temporary data for 1 hour, read it to display it on my pages, and then update it an hour later (with cron)?
Thanks.
CakePHP Caching seems to be what you'd want.
WHICH cache engine you use (Redis, Memcached, etc.) is up to you though. Set your cache to last an hour and you're all set (read more about caching at the link above); a config along the lines of the sketch below is enough.
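For example, something like this in app/Config/bootstrap.php defines a 'long' file-based cache config that lasts an hour (the config name, prefix and engine are just placeholders; swap in Redis or Memcached if you prefer):
// app/Config/bootstrap.php
Cache::config('long', array(
    'engine'   => 'File',    // or 'Redis', 'Memcached', ...
    'duration' => '+1 hour', // entries expire after an hour
    'prefix'   => 'news_',
));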
If you're on CakePHP 2.5+, you can use the remember method described here.
public function newest() {
    $model = $this;
    return Cache::remember('newest_posts', function() use ($model) {
        // get your data from whatever source here, and return it
        return $model->getMyData();
    }, 'long');
}
Basically, this just checks to see if the cache key exists, and if not, runs some code in order to populate it again.
If you're below 2.5, you can do the same basic thing, but without the remember:
public function newest() {
    $result = Cache::read('newest_posts', 'long');
    if (!$result) {
        // get your data from whatever source here, write it to the cache, and return it
        $result = $this->getMyData();
        Cache::write('newest_posts', $result, 'long');
    }
    return $result;
}
If you don't have a cache engine installed, or don't want to mess with your own server, there are companies you can use for caching, and you can just point your cache settings at them. ObjectRocket (Redis) is the one I know offhand, but I'm sure there are plenty.
One of the many awesome things about CakePHP is that, in this case, your code doesn't change regardless of the cache type/location/configuration you choose.
How can I cache a particular part of a web page? I know how the CI caching mechanism works, and I am also aware of partial caching.
Assume that we have a page with some dynamic data associated with it. If I use caching, then I cannot get the current data on page refresh.
How can I get around this problem?
I have one idea in mind: while inserting the data, keep another field, let's say MD5_CONTENTS, which stores the MD5 hash of the contents (normally the form fields). Then, on the next update, I can compare the MD5 strings to detect changes, and if changes are found, delete the cache file.
I don't know whether this is going to work or not, but it's a bit hard to fit into my current implementation.
What is the best method to achieve Partial Caching ?
Thanks
Would the caching driver do the trick?
http://ellislab.com/codeigniter/user-guide/libraries/caching.html
$this->load->driver('cache', array('adapter' => 'apc', 'backup' => 'file'));
if ( ! $foo = $this->cache->get('foo'))
{
echo 'Saving to the cache!<br />';
$foo = 'foobarbaz!';
// Save into the cache for 5 minutes
$this->cache->save('foo', $foo, 300);
}
echo $foo;
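And to solve the stale-data part without the MD5 bookkeeping, you can simply delete the cached item in the code that saves new data, so the next request regenerates it. A small sketch (the key 'foo' just matches the example above, and the update line is a placeholder for your own code):
// in the code that updates the underlying data
$this->db->update('news', $data, array('id' => $id)); // your own update code

// drop the cached copy so the next request rebuilds it
$this->cache->delete('foo');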