Laravel: Cache just for the script duration - similar to Session::flash

Does Laravel have any Caching mechanism to cache something just for the duration of the current script/request? Whether that's using a Cache driver like FileCache or DatabaseCache, or just in-memory cache.
For example, I have some data that is quite volatile and changes often, and my script fetches it in multiple places. So I would like to cache it the first time I fetch it, but forget it after the current script has finished executing (so on the next request it is fetched again).
It would be the equivalent of having some global variable $cache or similar, where I could store for example $cache['options']. Is there something like that in Laravel already?
Thank you.

One way is to follow the singleton pattern for the class that provides the repeated functionality.
You can also just bind an instance of a class to the Service Container and pull in that dependency where you need to use it.
https://laravel.com/docs/5.4/container
Singleton or instance binding would allow your application to share the same instance of a class anywhere during a single execution.
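For illustration, here is a minimal sketch of that approach. OptionsRepository is a hypothetical class that memoizes the volatile data for the duration of one request; the Laravel container calls are shown in comments since they only run inside a Laravel app:

```php
<?php
// Hypothetical request-scoped data holder: fetches once, then reuses the
// in-memory copy for the rest of the request.

class OptionsRepository
{
    private $options = null;

    public function getOptions()
    {
        // Fetch only once per request; later calls reuse the cached copy.
        if ($this->options === null) {
            $this->options = $this->fetchFromSource();
        }
        return $this->options;
    }

    private function fetchFromSource()
    {
        // Stand-in for the real DB/API lookup of the volatile data.
        return ['site_name' => 'Example'];
    }
}

// In a service provider's register() method:
//     $this->app->singleton(OptionsRepository::class);
//
// Anywhere else during the same request, the container returns the same
// instance, so the fetch happens at most once:
//     $options = app(OptionsRepository::class)->getOptions();
```

Because PHP tears everything down at the end of the request, the instance (and its cached data) is discarded automatically, which is exactly the "forget after the script finishes" behavior asked about.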


doctrine: call symfony method based on Entity DateTime age

I'm new to Symfony and Doctrine.
I got a project where I need a method inside a Symfony service to be called with data from the DB whenever a DateTime object saved in that DB table "expires" (reaches a certain, dynamic age).
As I'm just starting out, I do not have any code yet. What I need is a starting point to get me looking in the right direction, as neither the lifecycle callbacks nor the Doctrine event listener/dispatcher structure seems able to solve this task.
Am I missing something important here, or is this just a totally wrong approach to my problem, which actually can't be solved by Doctrine itself?
What came to my mind is a cron-job-ish structure, but that kind of implementation is not as dynamic as required; it is bound to specific time frames, which may not be reactive enough and may even immensely decrease performance in some situations.
If I'm getting your problem right: you want something that executes when a record's datetime expires.
The main problem is that you would have to call PHP based on a DB event, which is not straightforward...
One possible solution is a Symfony command that is executed periodically (using cron); it selects the expired entities and performs the required actions.
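A minimal sketch of that cron-driven approach in plain PHP, assuming a hypothetical findExpired() helper; in a real project this logic would live in a Symfony console command backed by a Doctrine repository query:

```php
<?php
// Sketch of the periodic check a cron-run command would perform.
// findExpired() is a hypothetical stand-in for a Doctrine query such as
// "SELECT e FROM Entity e WHERE e.expiresAt <= :now".

function findExpired(array $records, DateTimeImmutable $now): array
{
    // Keep only records whose stored datetime has already passed.
    return array_values(array_filter($records, function ($record) use ($now) {
        return $record['expires_at'] <= $now;
    }));
}

$now = new DateTimeImmutable('2024-01-10 12:00:00');
$records = [
    ['id' => 1, 'expires_at' => new DateTimeImmutable('2024-01-09 08:00:00')],
    ['id' => 2, 'expires_at' => new DateTimeImmutable('2024-01-11 08:00:00')],
];

$expired = findExpired($records, $now);
// $expired now holds only record 1; the command would run the required
// action for each expired record and mark it as handled so the next cron
// run does not process it again.
```

The trade-off, as noted above, is granularity: the reaction delay is bounded by the cron interval.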
So as far as I found out, Doctrine is really not able to do this task in the described way. Of course the DB can't react to part of a record it saved without an external action triggering the lookup.
So what I will probably go with is the shell program at.
It is actually something like I (and katon.abel) mentioned: it can register one-time jobs which are then executed at the provided time (which I then do not need to save in the DB, but just pass to at).
This way I can easily create the jobs via Symfony, save the needed data via Doctrine, and call the callback method via a script triggered by at.

What are the benefits of caching instances of a class?

I have an application in which I repeatedly use the same (big) class. Since I use AJAX for that app, I always have to create a new object of this class. Someone advised me to cache an instance of this class and use it whenever it is required (using APC in a PHP environment).
What are the benefits of this? Does it really save time?
$this->tickets_persist = unserialize(@apc_fetch("Tickets")); // @ suppresses the warning on a cache miss
if (!$this->tickets_persist) {
    $this->tickets_persist = new Tickets_Persistance(); // Takes a long time
    apc_store("Tickets", serialize($this->tickets_persist));
}
The benefits are only really realized if you are dealing with a class that has an expensive instantiation cost. If there are things that take a lot of time, memory, or other resources in the constructor of the class (for example, reading an XML sitemap and building a complex data structure for your navigation), you can dodge this cost by leveraging caching.
It's also worth noting that resources (like database links and such) cannot be cached; they have to be re-established after the object is unserialized (this is where the __sleep and __wakeup magic methods come in).
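For illustration, a minimal sketch of the __sleep/__wakeup pattern; ReportService and its connect() stand-in are hypothetical, with an array standing in for a real DB link:

```php
<?php
// Sketch of re-establishing a non-serializable resource after
// unserialization. In real code connect() would open a PDO/mysqli link.

class ReportService
{
    private $connection;
    private $dsn;

    public function __construct($dsn)
    {
        $this->dsn = $dsn;
        $this->connection = $this->connect();
    }

    private function connect()
    {
        // Stand-in for opening a real database connection.
        return ['dsn' => $this->dsn, 'connected' => true];
    }

    public function __sleep()
    {
        // Serialize only the DSN; the live connection is dropped.
        return ['dsn'];
    }

    public function __wakeup()
    {
        // Re-establish the connection when the cached object is restored.
        $this->connection = $this->connect();
    }

    public function isConnected()
    {
        return $this->connection !== null && $this->connection['connected'];
    }
}

$service  = new ReportService('mysql:host=localhost');
$restored = unserialize(serialize($service));
// $restored->isConnected() is true again, thanks to __wakeup()
```

The same round trip happens when the object comes back out of APC, so any resource-holding class you cache should implement this pair of methods.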
It is only worth it if your object requires a lot of processing during instantiation. Caching will not help you with "big" objects as such; it helps when you want to avoid processing that would otherwise be repeated. In your case, it is only worth it if your constructor requires a lot of processing. Let's take an example of how caching would work in the context of a webpage:
On first page load, instantiate and cache the object for x hours
On any subsequent page load for the next x hours, it will directly return the object, without processing the instantiation
After x hours, the cached object will have expired; the next page load will re-instantiate the object and re-cache it
Your application will behave in the same way, the only difference is that you will "re-use" the instantiation process that has already been done.

how to process stored requests in CI

I have been using a controller method, post, directly to perform some DB and social network operations, but I'm finding a few points of failure between it and the hardware. So I came up with the idea of storing all the requests in a DB table to be used as a queuing system instead, so I can process them in my own time rather than in real time.
The thing I'm struggling with now is handling my requests. I know this isn't very MVC, but it's a quick fix.
How do I call another controller's method from within my process-queue method? I have tried including the file and instantiating it, then passing it the variables I would have passed from the web.
function process() {
    $result = $this->mque->get_all();

    include('post.php');
    $get = new post();

    foreach ($result->result_array() as $item) {
        $get->index($item['rfid'], $item['station_id'], $item['item']);
    }
}
But I get an error. When I call the normal index method it runs fine, but I get an undefined-property error when I call it through the instantiated class (this is my problem):
Message: Undefined property: post::$db
The why
I am setting the process-queue method to run from a cron job at a set interval.
Originally everything ran through the index method of post, but since post::index() can take 10-15 seconds to run and the reader is not multi-threaded, someone could use the reader within 7 seconds and the script wouldn't have run completely.
Is there a better way of doing this than my current process method?
update
There are two ways to do this: either use PHP to fopen/GET the URL over the web, or do it programmatically using $class->method(). I would prefer the first method, but I don't really see any option given the error I mentioned before.
That's easy: you don't have one controller call another. As a rule, if you need something to exist in two different places, you have two options:
Have them both subclass the same object
pro: That way the method is already there
con: You can only subclass one thing, and you have to build your own class loading system (NOT GOOD)
Have a library (or model) which they both share
pro: The method can then be tested better (it is, or at least was at one point, easier to unit-test models than controllers), and the code can be shared without a custom class-loading syntax.
con: This may involve a little refactoring (but it should be as easy as moving the code from the controller's method to a library's method and then simply calling the library in the public controller method).
Either one of those would solve your particular problem. Personally, because of how CI loads controllers, my preference is to create libraries.
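For illustration, a minimal sketch of the library approach; the names (PostProcessor, process()) are hypothetical stand-ins for the logic currently inside post::index():

```php
<?php
// Sketch: the shared DB/social-network work moves out of the controller
// into a library class that both the web controller and the cron-driven
// queue processor can call.

class PostProcessor
{
    public function process($rfid, $station_id, $item)
    {
        // The work formerly done in post::index() goes here.
        return sprintf('processed %s at %s: %s', $rfid, $station_id, $item);
    }
}

// In the web-facing controller (CI loads libraries for you, so $db etc.
// stay available through the CI super-object):
//     $this->load->library('PostProcessor');
//     $this->postprocessor->process($rfid, $station_id, $item);
//
// In the cron-run queue controller:
//     foreach ($result->result_array() as $row) {
//         $this->postprocessor->process($row['rfid'], $row['station_id'], $row['item']);
//     }

$lib = new PostProcessor();
$out = $lib->process('A1', 'S9', 'box');
```

This also sidesteps the Undefined property: post::$db error: the controller was never initialized by CodeIgniter when instantiated by hand, so it never received the loaded $db object, whereas a library is called from a properly bootstrapped controller.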
CodeIgniter: Load controller within controller
Is this something that could help you out quickly? Check the bottom reply.

Doctrine 2 EntityManager causes time loss in first request

Recently I integrated Doctrine 2 ORM into CodeIgniter 2. I configured Doctrine 2 as a library and autoloaded it in CodeIgniter. Within a page I instantiate doctrine entity manager in the following way:
private $em = null;

public function __construct() {
    parent::__construct();
    $this->em = $this->doctrine->em;
}
And then I start to use the Entity Manager when needed. The issue I have is that in each page request Entity Manager takes some time to initialize (appr. 1 second). This causes the user to wait until the page is loaded. Below you can see some performance results I measured:
BENCHMARKS
Loading Time: Base Classes    0.0166
Doctrine                      0.0486
GetArticle                    1.0441
Functions                     0.0068
Controller Execution Time     1.1770
Total Execution Time          1.1938
The GetArticle function basically makes an EntityManager->find() call:
$currentart = $this->em->find('Entities\Article', $artid);
I have to wait that 1 second even if I use the EntityManager->createQuery() method.
In every page, I have a time loss of approximately 1 second because of EntityManager's first request.
Is this common?
Does this 1 second come from the fact that EntityManager needs to establish a connection to the DB? The functions/requests after the first request are quite fast though.
The most time consuming thing that Doctrine does is load metadata for your entities, whether it's annotations, XML, or YAML. Doctrine lazy loads the metadata when possible, so you will not see the performance hit until you start using entities. Since the metadata doesn't change unless you make changes in your code, Doctrine allows you to cache the metadata across requests. DQL queries also need to be parsed into SQL, so Doctrine provides another caching configuration for this.
In a production environment you should set these caches up (it sounds like you have already, but for others reading this):
$cache = new \Doctrine\Common\Cache\ApcCache(); // or MemcacheCache
$configuration->setMetadataCacheImpl($cache); // caches metadata for entities
$configuration->setQueryCacheImpl($cache); // caches SQL from DQL queries
In order to prevent the first page load from taking the full metadata load, you can set up a cache warmer that loads all of the class metadata and save it to the cache.
$em->getMetadataFactory()->getAllMetadata();
Another potential bottleneck is the generation of proxy classes. If this is not configured correctly in a production environment, Doctrine will generate the classes and save them to the file system on every page load. These proxy classes do not change unless the entity's code changes, so it is again unnecessary for this to happen. To speed things up, you should generate the proxies using the command line tool (orm:generate-proxies) and disable auto-generation:
$configuration->setAutoGenerateProxyClasses(false);
Hopefully this helps you out. Some more information can be found at http://www.doctrine-project.org/docs/orm/2.0/en/reference/improving-performance.html#bytecode-cache

What is the best way to create a Singleton Webservice in PHP?

We have a need to access a DB that only allows one connection at a time. This screams "singleton" to me. The catch, of course, is that the singleton connection will be exposed (either directly or indirectly) via a web service (most probably a SOAP-based web service located on a separate server from the calling app(s)), which means that more than one app or instance may attempt to connect to the singleton class.
In PHP, what is the best way to create a global singleton or a web-service singleton?
TIA
This screams "use a DB SERVER" to me. ;-), but...
You could create a SoapServer and use a semaphore to allow only one connection at a time:
$s1 = sem_get(123, 1); // key 123, at most 1 process may hold it
sem_acquire($s1);      // blocks until the semaphore is free
// soapserver code here
sem_release($s1);
In PHP, there is no such thing as a "global" object that resides across all requests. In a Java webserver this would be called an "application-level data store". In PHP, the extent of the "global" scope (using the global keyword) is a single request. There is also a cross-session data store accessible via $_SESSION, but the point is that no variable in PHP is truly "global". Individual values can emulate being global by being stored to a local file or a database, but for something like a resource, you are stuck creating it on each request.
Now, at the request level, you can create a singleton that will return an initialized resource no matter which scope within the request you call it from, but again, that resource will not persist across or between requests. I know it is a shortcoming of PHP, but on the other hand, the speed and stability of individual requests help make up for it.
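For completeness, a minimal sketch of such a request-scoped singleton in plain PHP; DbConnection is illustrative, and the "handle" is a stand-in array rather than a real DB link:

```php
<?php
// Sketch of a request-scoped singleton: the constructor runs at most once
// per request, and PHP discards the instance when the request ends.

class DbConnection
{
    private static $instance = null;
    public $handle;

    private function __construct()
    {
        // Open the single allowed connection here (stand-in value below).
        $this->handle = ['connected' => true];
    }

    public static function getInstance()
    {
        if (self::$instance === null) {
            self::$instance = new self();
        }
        return self::$instance;
    }
}

$a = DbConnection::getInstance();
$b = DbConnection::getInstance();
// $a and $b are the same object within this request, but a second
// concurrent request gets its own instance - hence the need for the
// semaphore or DB-side locking discussed above.
```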
Edit:
After reading your question again, I realized you may not be asking for a singleton database-access class, but rather something that can lock your database resource. Based on what you said, it sounds like the database may do the locking for you anyway; in other words, it won't let you connect if there is already another connection. If that is the case, you have two options:
1) Just let all your pages contend for the resource, and fail if they don't get it.
2) Create a queue service that accepts queries, runs them, then caches the results for you for later retrieval.
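A minimal sketch of option 2, with an in-memory array standing in for real persistence; QueryQueue and its method names are illustrative:

```php
<?php
// Sketch of a queue service that serializes access to the single-connection
// DB: queries are submitted, executed one at a time, and their results
// cached for later retrieval. A real service would persist both stores.

class QueryQueue
{
    private $pending = [];
    private $results = [];

    public function submit($id, callable $query)
    {
        $this->pending[$id] = $query;
    }

    public function runAll()
    {
        // Only one query executes at a time, matching the one-connection DB.
        foreach ($this->pending as $id => $query) {
            $this->results[$id] = $query();
        }
        $this->pending = [];
    }

    public function fetch($id)
    {
        return isset($this->results[$id]) ? $this->results[$id] : null;
    }
}

$queue = new QueryQueue();
$queue->submit('count_users', function () { return 42; });
$queue->runAll();
// $queue->fetch('count_users') returns the cached result
```

Callers then poll fetch() instead of contending for the connection directly, so no page ever fails outright for lack of the resource.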
