I would like to create a set of persistent objects that load their state from the database and then remain in memory, so that WordPress/PHP page loads can use them as cached in-memory objects. I imagine the interface for these objects would include:
initialise() - load state from database and perform any other initialisation functions needed prior to servicing requests
getter_foo() - a series of getter methods for PHP code to call for memory cached responses
getter_bar() - a series of getter methods for PHP code to call for memory cached responses
update() - called by time or event driven processes that ask the object to go back to the database and refresh its state
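The interface I have in mind might be sketched like this (all class and method names here are illustrative, not an existing WordPress API; the backing "database" is faked with an array):

```php
<?php
// Hypothetical interface for a memory-cached, database-backed object.
interface CachedStateObject
{
    // Load state from the database and prepare to service requests.
    public function initialise(): void;

    // Refresh state from the database (time- or event-driven).
    public function update(): void;
}

// A toy implementation; a real one would run queries in initialise().
class SiteStats implements CachedStateObject
{
    private array $state = [];

    public function initialise(): void
    {
        // Stand-in for loading state from the database.
        $this->state = ['foo' => 42, 'bar' => 'hello'];
    }

    public function update(): void
    {
        $this->initialise(); // re-read from the backing store
    }

    public function getFoo(): int    { return $this->state['foo']; }
    public function getBar(): string { return $this->state['bar']; }
}
```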
The two tricks I suspect are:
Having the main PHP process allocate and hold the memory reference for these objects, so that they remain pinned in memory across web transactions/requests without needing to be reinitialised from the database each time
Having a mechanism that allows the transactional processes to obtain a reference to these objects.
Are there any examples of solutions that do this? I've been programming for years but am very new to both WordPress and PHP, so maybe this is quite straightforward; I'm not sure. In any event, I do recognise that technologies like Redis and Memcached might achieve similar goals, but in a less elegant and non-contextual way. That said, if there's no easy way to do this I'm happy to use the 80/20 rule. :^)
It's not possible to store data in memory during one request and then read it back from memory during another request using nothing but plain PHP. Sure, the PHP process uses memory, but as soon as your request is finished, that part of memory is freed. That means a second request cannot access that part of memory again.
What you are hinting at is called caching. Simply put, caching means that you save the output of an expensive transaction for later re-use, to avoid paying the cost of that transaction again. What you use as a backend to store that output is up to you, or whatever you have available. If you want to keep it in RAM, you need something like Memcached. You could also store it in a regular file, but that is slower because the hard drive has to be accessed.
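A minimal file-based cache along those lines might look like this (the function names and the five-minute TTL are made up for illustration; a real setup would likely use Memcached or WordPress transients instead):

```php
<?php
// Minimal file-backed cache: store the output of an expensive call,
// reuse it until the TTL expires. Names here are illustrative.
function cache_get(string $key, int $ttl): mixed
{
    $file = sys_get_temp_dir() . '/cache_' . md5($key);
    if (is_file($file) && (time() - filemtime($file)) < $ttl) {
        return unserialize(file_get_contents($file));
    }
    return null; // miss: absent or expired
}

function cache_set(string $key, mixed $value): void
{
    $file = sys_get_temp_dir() . '/cache_' . md5($key);
    file_put_contents($file, serialize($value), LOCK_EX);
}

// Usage: only pay for the expensive transaction on a cache miss.
$result = cache_get('expensive_report', 300);
if ($result === null) {
    $result = ['rows' => 123]; // stand-in for an expensive DB query
    cache_set('expensive_report', $result);
}
```

Swapping the file functions for Memcached calls keeps the same shape while moving the storage to RAM.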
Related
I've been looking into the problems of having persistent data available between pages in PHP. This particularly applies to objects that have been set up in one page that need to be accessed later. It seems this is more difficult than I assumed it would be, but there are several ways this could be done, although they all seem a bit awkward to use especially when the data gets quite complex:
Passing the data via $_GET or $_POST to the next page
Copying the data to a database and retrieving it in the next page
Putting the data in a session or cookie
Serializing the object and recreating it with the same parameters and values
These all seem quite laborious, as they mostly rely on having to deconstruct your existing data structure and then rebuild it again on the next page. I assume this is to reduce the memory requirements of the PHP server by purging data from one page as soon as it's closed and starting with a 'clean slate'.
Is there a more direct way of passing larger data structures between pages in PHP?
Many thanks,
Kw
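Of the options listed, the fourth (serializing) is what sessions do under the hood anyway; the "deconstruct and rebuild" step is a two-function roundtrip (class and field names here are just for illustration):

```php
<?php
// Serializing an object to a string and rebuilding it later.
class Cart
{
    public function __construct(public array $items = []) {}

    public function add(string $sku, int $qty): void
    {
        $this->items[$sku] = $qty;
    }
}

$cart = new Cart();
$cart->add('widget-1', 2);

// "Deconstruct" to a string; store it anywhere (session, DB, file)...
$frozen = serialize($cart);

// ...and rebuild the same structure on the next page.
$restored = unserialize($frozen);
```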
I assume this is to reduce memory requirements of the PHP server by purging data from one page as soon as it's closed
Nope, this is not a memory-efficiency concern. It is because the HTTP protocol is stateless: each request must carry all the information necessary to fulfil it.
Counter-example to your proposed scenario:
1. Let's suppose Alice visits page A; some objects are created, and you want them to be available on page B.
2. You see a request for page B.
2.1. But it's not Alice, it's Bob. How do you determine which objects to show, and where do you get them from?
2.2. It is Alice again, but the request arrived at another machine in your 1000-server farm. Naturally, that machine doesn't have the original PHP objects. What do you do now?
If you use $_GET or $_POST, you are limited to non-sensitive data, and you expose your objects to any user. You don't want that.
Cookies are limited in size: a cookie is usually limited to 4096 bytes, and you can't store more than 20 cookies per site.
The best way to persist objects between requests (for the same user) is to use sessions. There are existing session save handlers for Memcached, Redis, MySQL, etc. You can also write your own if you need something custom.
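Writing your own handler means implementing SessionHandlerInterface and registering it. A minimal sketch, using a static array purely so the example is self-contained (a real handler would talk to Redis, Memcached, or MySQL in read() and write()):

```php
<?php
// Minimal custom session save handler (PHP 8 signatures).
// The static array stands in for an external store like Redis.
class ArraySessionHandler implements SessionHandlerInterface
{
    private static array $store = [];

    public function open(string $path, string $name): bool { return true; }
    public function close(): bool { return true; }

    public function read(string $id): string|false
    {
        return self::$store[$id] ?? '';
    }

    public function write(string $id, string $data): bool
    {
        self::$store[$id] = $data;
        return true;
    }

    public function destroy(string $id): bool
    {
        unset(self::$store[$id]);
        return true;
    }

    public function gc(int $max_lifetime): int|false { return 0; }
}

session_set_save_handler(new ArraySessionHandler(), true);
session_start();
$_SESSION['user'] = 'alice';
session_write_close();   // the handler's write() is called here

session_start();         // next "request": read() restores the data
```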
What is a good method to retain a small piece of data across multiple calls of a PHP script, in such a way that any call of the script can access and modify its value - no matter who calls the script?
I mean something similar to the $_SESSION variable, except that a session is strictly per-client; one client can't access another client's session. This one would be the same no matter who accesses it.
Losing the value or having it corrupted (e.g. through race conditions between two scripts launched at once) is not a big problem; if it reads correctly 90% of the time, that's satisfactory. On the other hand, while I know I could just use a simple file on disk, I'd still prefer a RAM-based solution, and not just for speed: since this runs from flash storage that isn't very wear-proof, endless writes would be bad.
Take a look at shared memory functions. There are two libraries that can be used to access shared memory:
Semaphores
Shared Memory
For storing binary data or one huge string, the Shared Memory library is better, whereas the Semaphores library provides convenient functions to store multiple variables of different types (at the cost of some overhead, which can be quite significant, especially for a lot of small variables, booleans for example).
If that is too complex, and/or you don't care about performance, you could just store the data in files (after all, PHP's internal session management uses files, too...).
A good alternative to using a database would be memcache!
I've done a fair bit of PHP over the years but I'm currently learning ColdFusion and have come across the Application.cfc file.
Basically this is a class that's created once (and has an expiry date). The class handles incoming users and can set session variables and static memory objects, such as queries. For example, I can load site-wide statistical data for one user in another thread from the Application.cfc. Something that would usually take a few seconds for each page, so this makes the whole site quick and responsive.
Another example (just for clarification).
If I initialise a counter variable to 0 in OnApplicationStart, it can be incremented on each user request (across multiple users), or in OnSessionStart, without any need to contact the SQL database, since it is constantly in the server's memory under this application.
I was wondering if PHP has a similar file or object? Something that can be created once and used to store temporary variables?
The PHP runtime itself initializes the environment from scratch on every HTTP request, so it has no built-in mechanism to do this. Of course you can serialize anything into common storage and then read it back and deserialize on each request, but this is not the same as keeping it in-memory.
This type of functionality in PHP is achieved by outsourcing it to other programs; memcached and APC are two of the most commonly used programs that offer such services, and both come with PHP extensions that simplify working with them.
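The user-cache half of APC lives on today as the APCu extension, and the usage really is one call each way (assumptions: ext-apcu installed, and apc.enable_cli=1 if run from the command line; the key and TTL are illustrative):

```php
<?php
// APCu keeps values in shared memory owned by the PHP process pool,
// so they survive across requests on the same server. Requires ext-apcu.
if (function_exists('apcu_store') && apcu_enabled()) {
    apcu_store('site_stats', ['visits' => 1], 300); // key, value, TTL seconds
    $stats = apcu_fetch('site_stats', $ok);         // $ok flags hit/miss
}
```

This is the closest PHP equivalent to ColdFusion's application scope: shared by all users, but still scoped to one server.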
I am writing a stateful web application in PHP in which the state potentially contains lots of objects. At this moment, I am keeping all these objects in $_SESSION and have them serialised at the end of the request. This is a bit of a pain, because serialising the entire session takes seconds, and unserialising it takes even more time.
I wanted to try APC, because I hoped that the objects are then just being memcopied, instead of serialised. Alas, it seems that if I feed apc_store($object) an object, it seems to serialise it anyway before passing it to another process. (The story goes that primitive values and arrays are being memcopied without serialisation, but that is not a relevant solution for me, since my session objects have a lot of (circular) references.)
My question: Is there a known way of keeping objects persistent in PHP without having to serialise them after every request? I've heard rumours that the HipHop interpreter can help with this, but I haven't found any working examples on the net. Can somebody tell me if it is possible in PHP at all?
(I know, I should be writing this project in Java or another language that supports persistent instances without a TCP connection)
Whenever you need to store (freeze) an object, it needs to be serialized. That's independent of the storage (APC, session files, database, etc.); it's because the script process will terminate, and the next time it starts, the objects need to come back to life again.
So things cannot be kept in a "running" state; objects will always be serialized to be stored.
It's known that PHP's serialization is not the fastest. There are alternative implementations you can use, for example the igbinary PHP extension. It offers serialize/unserialize functions as well as transparent session handling. Maybe this is actually helpful for your scenario.
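A quick sketch of both uses, assuming ext-igbinary is installed (the sample data is arbitrary):

```php
<?php
// igbinary: a compact binary alternative to PHP's native serializer.
// It can also replace the session serializer transparently via
// session.serialize_handler = igbinary in php.ini. Requires ext-igbinary.
if (extension_loaded('igbinary')) {
    $data = ['user' => 'alice', 'cart' => [1, 2, 3]];
    $binary    = igbinary_serialize($data);
    $roundtrip = igbinary_unserialize($binary);
}
```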
In any case: The more you store inside the session, the more you need to un-freeze or wake-up at the beginning of the request, the more time it will take.
Related: What is the php_binary serialization handler?
I've inherited a php/js project that creates audio-playing widgets. The requirement is really that they be able to stand up to some pretty heavy load at "peak times": when a new track is first announced there may be a lot of people rushing to play it at once.
Unfortunately, the widgets tend to do pretty badly under such stressful conditions. We had considered that saving and looking up an access key in a SQLite database might have been causing fatal errors due to locking. Experimentally I changed the access keys to be stored in session variables, but I'm now worried this may just be creating a new kind of bottleneck: does every request have to wait for the session to free up before it can go ahead?
I downloaded Pylot and did some basic load tests: it doesn't take many agents trying to access the same widget to make it glitchy or completely unusable, maybe 10 or 20. Ideally we'd like to be able to handle a considerably greater volume of traffic than this. What strategies can I sensibly adopt to be able to field many times more requests?
A PHP file-based session will lock the session file until the script exits or you call session_write_close(). You can do a quick session_start(); session_write_close(); pair. The $_SESSION array will still be available, but any subsequent changes will NOT be written to disk, as PHP has been told the session is closed.
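The read-then-release pattern looks like this (the access-key value is a stand-in; normally it would have been set on an earlier request):

```php
<?php
// Release the session lock as early as possible so concurrent
// requests from the same user are not serialized behind it.
session_start();
$_SESSION['access_key'] = 'demo-key'; // normally set on a previous request
session_write_close();                // flush to disk, release the lock

// Later (or in another request): read what you need, then close
// immediately so long-running work doesn't hold the lock.
session_start();
$accessKey = $_SESSION['access_key'];
session_write_close();

// ...slow work (streaming audio, API calls) proceeds lock-free.
// $_SESSION is still readable here, but writes are no longer persisted.
```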
Alternatively, store your session in a database that locks only while writing to the specific session id (the primary key, that is), and that supports MVCC, like MySQL with the InnoDB engine. You can further optimize this by tuning the file system beneath it.
Done this way, you might run into race conditions, but not into lock contention. Have fun!