Memory management in PHP

My question is: how can I store data in secondary memory by fetching it from the database, and then display it in small units, using Zend Framework?

It's hard to work out exactly what you mean. If you are finding database access too slow, then you should use caching to store the results received from the database into files or memory. If you are storing to memory, then the memcached server is very popular.
Zend Framework provides the Zend_Cache component to help you manage storing, retrieving and expiring cached data. It also allows you to easily change the "back end" storage system from file to memcached so that you can compare performance.
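As a rough illustration, a Zend_Cache setup might look like this (a minimal sketch; the cache directory, cache ID, query, and the $db connection are placeholders):

    // File backend shown; swap 'File' for 'Memcached' (plus server
    // options) to compare the two back ends.
    $cache = Zend_Cache::factory(
        'Core',
        'File',
        array('lifetime' => 3600, 'automatic_serialization' => true),
        array('cache_dir' => '/tmp/cache/')
    );

    // Only hit the database on a cache miss.
    if (($rows = $cache->load('product_list')) === false) {
        $rows = $db->fetchAll('SELECT * FROM products'); // assumed Zend_Db connection
        $cache->save($rows, 'product_list');
    }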
Some useful links:
http://framework.zend.com/manual/en/zend.cache.html
http://www.talkincode.com/a-simple-introduction-to-zend_cache-1048.html
http://www.slideshare.net/akrabat/caching-for-performance

You are looking for Zend_Cache, and for Zend_Paginator with its array adapter.
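A minimal sketch of that combination (assuming a Zend_Db connection $db and a controller context for _getParam; table and parameter names are invented):

    // Fetch once (ideally behind Zend_Cache), then page through the array.
    $rows = $db->fetchAll('SELECT * FROM products');

    $paginator = Zend_Paginator::factory($rows); // array input uses the array adapter
    $paginator->setItemCountPerPage(10)
              ->setCurrentPageNumber($this->_getParam('page', 1));

    foreach ($paginator as $row) {
        // display one small unit of the result set
    }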

Related

PHP performance (Opcode caching / function volatility)

BACKSTORY
I maintain a spectrum of (web-)applications that all use a large homegrown PHP library. Some of these applications are traditional desktop applications that are used by employees, but others (which are also more relevant to this question), are PHP websites for which performance is becoming a more important issue as the popularity continues to grow.
CURRENT PHP CACHING METHODS
To speed up one of our websites (it's a shop; think of it as thinkgeek.com), I employ memcached to cache certain segments of the website that don't need to be constantly built dynamically (such as the product listing for a certain category).
We also use a pretty much factory-default installation of APC as an OPCode cache.
Both of these methods bring significant improvements to the website's performance, but I'm very much looking to go further down the road of optimisation.
Function Volatility in PHP
Coming from a database background myself, I'm very fond of how PostgreSQL, for example, uses function volatility to get massive performance gains while maintaining reliable and accurate results.
My question is: is there any extension to PHP that allows the developer to mark certain functions (or class methods) as IMMUTABLE (meaning the result of that function is always the same when given the same input arguments)? Such a caching extension could then cache the result of that function, which should result in massive performance gains when using big libraries of code.
A simple example would be a method such as SomeClass::getWebsiteFooter(); which returns some HTML code that's always the same, unless the website has been altered (in which case the cache would be flushed).
Does something like this exist? I haven't been able to find anything remotely similar on the market. Are there any other methods of performance improvement that might benefit my situation?
I would say you have to look at a PHP application as a web application and implement several levels of caching.
IMMUTABLE methods - I don't think that is a good approach. Caching at the DB level and at the application level (memcached) is a good start.
Then I would suggest caching at the view level (Smarty caching) and a caching proxy like Squid or Varnish.
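That said, if you want to experiment with per-function result caching, the effect can be approximated in userland; a rough sketch using APC's user cache (the memoize() helper, key prefix, and TTL are invented for illustration; PHP itself has no IMMUTABLE marker):

    // Cache a callable's result in APC, keyed by callable + arguments.
    function memoize($callable, array $args = array(), $ttl = 3600) {
        $key    = 'memo_' . md5(serialize(array($callable, $args)));
        $cached = apc_fetch($key, $hit);
        if ($hit) {
            return $cached;
        }
        $result = call_user_func_array($callable, $args);
        apc_store($key, $result, $ttl);
        return $result;
    }

    // Treat SomeClass::getWebsiteFooter() as immutable for an hour;
    // flush the entry with apc_delete() when the website is altered.
    $footer = memoize(array('SomeClass', 'getWebsiteFooter'));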

Yii: Using MongoDB and MySQL at the same time

I'm starting to build a system for working with native languages, tags and similar data in the Yii Framework.
I have already chosen MongoDB for storing my data, as I think it fits nicely and will get better performance at lower cost (the database will hold huge amounts of data).
My question regards user authentication, payments, etc. These are sensitive bits of information and areas where I think the data is relational.
So:
1. Would you use two different DB systems? Do I really need both, or am I overcomplicating this?
2. If you recommend the two-DB approach, how would I achieve that in Yii?
Thanks for your time!
PS: I do not intend this question to be another endless discussion between the relational and non-relational folks. Having said that, I think my data fits Mongo, but if you have something to say about that, go ahead ;)
You might be interested in this presentation on OpenSky's infrastructure, where MongoDB is used alongside MySQL. Mongo was utilized mainly for CMS-type data where a flexible schema was useful, and they relied upon MySQL for transactions (e.g. customer orders, payments). If you end up using the Doctrine library, you'll find that the ORM (for SQL databases) and MongoDB ODM share a similar API, which should make the experimentation process easier.
I wouldn't shy away from using MongoDB to store user data, though, as that's often a record that can benefit from embedded document storage (e.g. storing multiple billing/shipping addresses within a single user document). If anything, Mongo should be flexible enough to enable you to develop your application without worrying about schema changes due to evolving product requirements. As those requirements become more clear, you'll be able to make a decision based on the app's performance needs and types of database queries you end up needing.
There is no harm in using multiple databases (if you really need them); many big websites use multiple databases, so go ahead and start your project.
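As for the "how" in Yii 1.x, both connections can simply be declared as separate application components; a sketch (the 'mongodb' component assumes a third-party extension such as YiiMongoDbSuite, so the class name and option keys may differ in your setup):

    // protected/config/main.php
    return array(
        'components' => array(
            // relational side: users, authentication, payments
            'db' => array(
                'connectionString' => 'mysql:host=localhost;dbname=app',
                'username'         => 'app',
                'password'         => 'secret',
                'charset'          => 'utf8',
            ),
            // document side: languages, tags, content
            'mongodb' => array(
                'class'            => 'EMongoDB', // assumption: YiiMongoDbSuite
                'connectionString' => 'mongodb://localhost',
                'dbName'           => 'app_content',
            ),
        ),
    );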

Persistence with Zend Framework

I have some data which I rarely have to update, and I want that data to be very fast to access. What kind of solution do you recommend in Zend Framework? The options I have thought of are a MySQL database, some XML files, or writing the data directly into PHP arrays... Is there any ORM library I should use?
Since you're already using Zend Framework, why not use Zend_Config and store the data as ini/xml/json/yaml.
That's how Zend already stores your application settings. And if it's really not that much data, just store it in application.ini.
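For instance (a minimal sketch; the [catalog] section and its keys are invented):

    // Assumes application.ini contains a custom section such as:
    //   [catalog]
    //   genres.fantasy = "Fantasy"
    $config = new Zend_Config_Ini(
        APPLICATION_PATH . '/configs/application.ini',
        'catalog'
    );
    echo $config->genres->fantasy; // "Fantasy"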
I'd say you can use whatever you want in your back end, but then wrap it in Zend_Cache. This way you have some control over the refresh cycle while still getting convenient and fast access to the data.
Don't use an ORM if you're aiming at fast access. Use an ORM for ease of development.
The fastest solution is storing this data in a plain PHP array, I guess, but it's not the best solution if you ask me.
What kind of data are we talking about? How often and when does it change?
Store your data in a MySQL database but also index it using Zend_Search_Lucene.
Retrieving data from a Lucene index is pretty fast, in my experience.
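A minimal sketch of keeping the index alongside the database write (the index path and field names are illustrative, and $rowId/$title are assumed to come from the MySQL insert):

    // Index a freshly inserted row so lookups can skip MySQL entirely.
    $index = Zend_Search_Lucene::create('/path/to/index');

    $doc = new Zend_Search_Lucene_Document();
    $doc->addField(Zend_Search_Lucene_Field::Keyword('id', $rowId));
    $doc->addField(Zend_Search_Lucene_Field::Text('title', $title));
    $index->addDocument($doc);
    $index->commit();

    // Later: fast reads straight from the index.
    $hits = Zend_Search_Lucene::open('/path/to/index')->find('title:foo');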
My favorite option in these cases is to use Zend_Cache. If you want to optimize the response time even more, you can use memcached (http://memcached.org/), which can be used with Zend_Cache with little effort.

Performance increase using Smarty + Caching?

I am going to start using CodeIgniter, but since it only offers to cache everything or nothing (which would not work, because I have logins and other areas that cannot be cached), I was wondering whether it is a good idea to use Smarty.
The only concern I have in this question is speed. (This is not a general yes/no Smarty question.)
My Question:
CodeIgniter with some db queries (blog, loading data for pages from the database, etc.)
vs.
CodeIgniter + same DB + Smarty + partial caching (and of course, if $smarty->is_cached(.tpl) succeeds, making no DB requests; see the sketch below)
Which is faster, and what should I use? Are there any Smarty benchmarks I haven't seen? Starting at how many DB requests, would you say, does Smarty improve performance noticeably, considering you also have to load the Smarty library?
Thanks in advance.
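For reference, the partial-caching pattern described in the question looks roughly like this with the Smarty 2-style API (a sketch; the template name, lifetime, and model call are invented, and the Smarty instance would be loaded however your CI integration provides it):

    $smarty = new Smarty();
    $smarty->caching = true;       // enable Smarty's output caching
    $smarty->cache_lifetime = 300; // seconds

    // Only hit the database when no valid cached copy exists.
    if (!$smarty->is_cached('blog.tpl')) {
        $smarty->assign('posts', $blog_model->get_recent_posts());
    }
    $smarty->display('blog.tpl');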
Premature optimization is the root of all evil. I'd suggest not worrying about caching until your application is done. Then see how it performs by profiling it with Xdebug or Zend_Debugger, and do some load tests with ab. Use an opcode cache if you can.
If you then think the app is too slow, consider page/partial caching. You don't want caching for caching's sake, but to locate and remove bottlenecks. If you feel comfortable with Smarty and want to use it as a template engine, well, use it. If you don't need a template engine, you could also use Zend_Cache with APC or memcached for caching.
Smarty, or any template system, is another layer of complexity. It adds overhead, not a performance increase, even when cached. Its advantages are different ones, like ease of development.
Why not implement your own caching method? It's not that hard.
I'm using both Smarty and CodeIgniter in different projects. They are both very fine libraries, but I've never felt the need to combine them.
A caching method could use CI's hooks: pre_system to see if there is a whole cached page, and post_controller to intercept the calls to the views. And... just scanning the CI user guide: there is also a 'cache_override' hook; I suppose you could use that too.
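A sketch of how that wiring might look in application/config/hooks.php (enable_hooks must be turned on in the main config; the MyPageCache class and its method names are hypothetical):

    $hook['pre_system'] = array(
        'class'    => 'MyPageCache',
        'function' => 'serve_if_cached',  // echo a cached page and exit early
        'filename' => 'MyPageCache.php',
        'filepath' => 'hooks',
    );
    $hook['post_controller'] = array(
        'class'    => 'MyPageCache',
        'function' => 'capture_output',   // store rendered output for next time
        'filename' => 'MyPageCache.php',
        'filepath' => 'hooks',
    );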

XML as a Data Layer for a PHP application

I was wondering how I should go about writing an XML data layer for a fairly simple PHP web site. The reasons for this are:
A DB server is not available.
The data schema is simple and can be expressed in XML.
I like the idea of having a self-contained app, without server dependencies.
I would possibly want to abstract it into a small framework for reuse in other projects.
The schema resembles a simple book catalog with a few lookup tables plus i18n. So, it is quite simple to express.
The size of the main XML file is in the range of 100 KB to 15 MB, but it could grow at some point to ~100 MB.
I am actually considering extending my model classes to handle XML data.
Currently I fetch data with a combination of XMLReader and SimpleXML, like this:
    public function find($xpath)
    {
        while ($this->xml_reader->read()) {
            // Only expand <book> elements, so the whole file is never
            // held in memory at once.
            if ($this->xml_reader->nodeType === XMLReader::ELEMENT
                && $this->xml_reader->localName == 'book') {
                $node = $this->xml_reader->expand();
                $dom  = new DOMDocument();
                $n    = $dom->importNode($node, true);
                $dom->appendChild($n);
                $sx = simplexml_import_dom($n);
                // xpath() returns an array of matching elements
                $res = $sx->xpath($xpath);
                if (isset($res[0]) && $res[0]) {
                    $this->results[] = $res;
                }
            }
        }
        return $this->results;
    }
So, instead of loading the whole XML file into memory, I create a SimpleXML object for each section and run an XPath query on that object. The function returns an array of SimpleXML objects. For a conservative search I would probably break on the first found item.
The questions I have to ask are:
1. Would you consider this a viable solution, even for a medium to large data store?
2. Are there any considerations/patterns to keep in mind when handling XML in PHP?
3. Does the above code scale for large files (100 MB)?
4. Can inserts and updates in large XML files be handled in a low-overhead manner?
5. Would you suggest an alternative data format as a better option?
If you have a saw and you need to pound in a nail, don't use the saw. Get a hammer. (folk saying)
In other words, if you want a data store, use a data-base, not a markup language.
PHP has good support for various database systems via PDO; for small data sets, you can use SQLite, which doesn't need a server (it is stored in a normal file). Later, should you need to switch to a full-featured database, it is quite simple.
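A minimal sketch of the SQLite route via PDO (the file name, table, and columns are invented to match the book-catalog schema):

    // The whole database lives in one file next to the app; no server needed.
    $pdo = new PDO('sqlite:' . __DIR__ . '/catalog.sqlite');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

    $pdo->exec('CREATE TABLE IF NOT EXISTS books
                (id INTEGER PRIMARY KEY, title TEXT, lang TEXT)');

    $stmt = $pdo->prepare('INSERT INTO books (title, lang) VALUES (?, ?)');
    $stmt->execute(array('Some Title', 'en'));

    foreach ($pdo->query('SELECT id, title FROM books') as $row) {
        echo $row['id'], ': ', $row['title'], "\n";
    }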
To answer your questions:
1. Viable solution - no, definitely not. XML has its purposes, but simulating a database is not one of them, not even for a small data set.
2. With XML, you're shuffling strings around all the time. That might be just bearable on read, but it is a real nightmare on write (slow to parse, large memory footprint, etc.). While you could subvert XML to work as a data store, it is simply the wrong tool for the job.
3. No (everything will take forever, if you don't run out of memory before that).
4. No, for many reasons (locking, rewriting the whole XML string/file, not to mention memory again).
5a. SQLite was designed with very small and simple databases in mind - simple, with no server dependencies (the DB is contained in one file). As #Robert Gould points out in a comment, it doesn't scale for larger applications, but then
5b. for a medium to large data store, consider a relational database (and it is usually easier to switch between databases than to switch from XML to a database).
No, it won't scale. It's not feasible.
You'd be better off using e.g. SQLite. You don't need a server; it's bundled with PHP by default and stores data in regular files.
I would go with SQLite instead, which is perfect for small websites and x-copy style deployments.
XML-based data storage won't scale well.
"SQLite is an ACID-compliant embedded relational database management system contained in a relatively small (~225 kB) C programming library. The source code for SQLite is in the public domain.
Unlike client-server database management systems, the SQLite engine is not a standalone process with which the program communicates. Instead, the SQLite library is linked in and thus becomes an integral part of the program. It can also be called dynamically. The program uses SQLite's functionality through simple function calls, which reduces latency in database access as function calls within a single process are more efficient than inter-process communication. The entire database (definitions, tables, indices, and the data itself) is stored as a single cross-platform file on a host machine. This simple design is achieved by locking the entire database file at the beginning of a transaction."
Everyone loves to throw dirt on XML files, but in reality they work: I've seen large applications use them, and I know of an MMO that uses simple flat files for storage and works fine (by the way, the MMO is among the top 5 worldwide, so it's not just a toy). However, my job right now is creating a better and more savvy persistence layer based on SQL, and if your site will be big, SQL is the best solution; but XML is capable of massive (MMO) scalability if done well.
One caveat, though: migration from XML to SQL is rough if the mapping isn't easy.
