Storing database connection in a session variable [duplicate] - php

Possible Duplicate:
Can't pass mysqli connection in session in php
Many of us have written PHP applications that require databases; mostly MySQL, but I have often used very small MS Access databases for less technically capable people so they can download and tweak them/save backups/etc. on their own (whether this is correct or not, I have no idea).
What I notice is that a lot of time is spent connecting and running some of the same queries. Because of this I had an interesting thought: store the connection, and possibly the result sets that are mostly static, in a $_SESSION variable to reduce the burden as the user navigates the site.
Obviously doing so requires a lot of consideration. Things like closing the connection when the session is destroyed are just the start.
My question boils down to: Is this really possible? And if so, what things should I be aware of (besides session fixation, which is its own problem and applies to all sessions)?

You can't store database connections or result sets in the session, since those are resources, and:
Some types of data can not be serialized and thus stored in sessions. That includes resource variables and objects with circular references (i.e. objects which pass a reference to themselves to another object).
http://php.net/manual/en/intro.session.php
You can extract a result set into a normal array and store that in the session like any other variable. That would be a fairly typical use case for sessions anyway. Just be careful not to store too much data in the session, as that can become more taxing than fetching it from the database.
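For illustration, a minimal sketch of that approach with mysqli; the table, columns and credentials are made up:

    <?php
    session_start();

    // Only hit the database if we haven't cached the rows for this visitor yet.
    if (!isset($_SESSION['nav_items'])) {
        $db = new mysqli('localhost', 'user', 'pass', 'mydb'); // hypothetical credentials

        $result = $db->query('SELECT id, title FROM nav_items ORDER BY position');

        // fetch_all() returns a plain array, which CAN be serialized into the session,
        // unlike the mysqli connection or result objects themselves.
        $_SESSION['nav_items'] = $result->fetch_all(MYSQLI_ASSOC);

        $result->free();
        $db->close();
    }

    foreach ($_SESSION['nav_items'] as $item) {
        echo htmlspecialchars($item['title']), "\n";
    }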

Even if you could do this (resource vs. data), it's a bad idea. You'll wind up with lots of concurrent open connections, which will blow your max connections very quickly... especially if each connection's lifecycle is expanded from sub-100 ms (depending on your queries) to 20 minutes or more. With open connections, something like MySQL also won't be able to reset its memory allocations properly, and the whole system sort of goes to hell. In short, this is not what DBs are for unless the only consumer of your code will be a single user.
As an alternative, I'd highly recommend caching technologies which are designed specifically to reduce database load and obviate connection times. Using something like, at its simplest, memcached will dramatically improve performance all the way around, and you'll be able to specify exactly how many system resources go into the cache -- while letting the database do its job of getting data when it needs to.
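As a rough sketch of that read-through pattern with the Memcached extension (the cache key, TTL and query are placeholders, not anything prescribed):

    <?php
    $cache = new Memcached();
    $cache->addServer('127.0.0.1', 11211);

    $key  = 'homepage:articles';          // hypothetical cache key
    $rows = $cache->get($key);

    if ($rows === false) {                // cache miss: fall back to the database
        $db   = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
        $rows = $db->query('SELECT id, title FROM articles LIMIT 10')
                   ->fetchAll(PDO::FETCH_ASSOC);

        // Keep it for 5 minutes; the cache, not each user's session, holds the data.
        $cache->set($key, $rows, 300);
    }

The cache is shared by all users, so the database is only hit once per expiry window instead of once per page view per visitor.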

You can look into persistent connections for the connection part.
http://php.net/manual/en/function.mysql-pconnect.php
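That manual page covers the old mysql_* API; with mysqli the same idea is available by prefixing the host with "p:". A minimal, hedged sketch (credentials are placeholders):

    <?php
    // Persistent connection with mysqli: note the "p:" prefix on the host.
    // PHP reuses an existing connection from its pool instead of opening a new one,
    // so the connection is pooled per PHP worker rather than stored per user session.
    $db = new mysqli('p:localhost', 'user', 'pass', 'mydb');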

You should store those in a config file instead. Sessions are for user-specific data, not for global data.

Related

Thoughts on doing MySQL queries vs using SESSION variables? [closed]

Just curious how other people feel about this. Will appreciate opinions or facts, whatever you got :)
I am working on an application where a lot of info is pulled from MySQL and needed on multiple pages.
Would it make more sense to...
Pull all data ONCE and store it in SESSION variables to use on other pages
Pull the data from the database on each new page that needs it
I assume the preferred method is #1, but maybe there is some downside to using SESSION variables "too much"?
Side question that's kind of related: As far as URLs go, is it preferable to have data stored in them (e.g. domain.com/somepage.php?somedata=something&otherdata=thisdata) or to use SESSION variables to store that data so the URLs can stay general/clean (e.g. domain.com/somepage.php)?
Both are probably loaded questions but any possible insight would be appreciated.
Thanks!
Your question can't be answered in a way that applies everywhere.
Here's why: many web server setups have the HTTP server (Apache, Nginx), the server-side language (PHP, Ruby, Python) and the RDBMS (MySQL, PostgreSQL) on one and the same machine.
That's one of the most common setups you can find.
Now, this is what happens in your scenario:
You connect to MySQL - you establish a connection from PHP > MySQL and that "costs" a little
You request the data, so MySQL reads it from the hard drive (unless cached in RAM)
PHP gets the data and allocates some memory to hold the information
Now you save that to a session. But by default, sessions are disk based, so you just issued a write operation and spent at least one I/O operation on your hard drive.
Let's look at what happened: you moved some data from disk (MySQL) to RAM (a PHP variable), which then gets saved back to disk again.
You really didn't help yourself or your system in that case; what happened is that you made things slower.
On the other hand, PHP (and other languages) are capable of maintaining connections to MySQL (and other databases) so they minimize the cost of opening a new connection (which is really inexpensive in the grand scheme of things).
As you can see, this is one scenario. There's a scenario where you have your HTTP server on a dedicated machine, PHP on dedicated machine and MySQL on dedicated machine. The question is, again, is it cheaper to move data from MySQL to a PHP session. Is that session disk based, redis based, memcache based, database based? What's the cost of establishing the connection to MySQL?
What you need to ask, in any scenario you can imagine, is: what are you trading off, and for what?
So, if you are running the most common setup (PHP and your database on the same machine) - the answer is NO, it's not better to store some MySQL data in a session.
If you use InnoDB (and you probably are) and if it's optimized properly, saving some data to a session to avoid apparent overhead of querying the db for reads won't yield benefits. It's most likely going to be quite the opposite.
Putting it into the session is almost always a terrible idea. It's not even worth considering unless you've exhausted all other options.
Here's how you tackle these problems:
Evaluate whether there's anything you can do to simplify the query you're running, like trimming down the columns you fetch. Instead of SELECT * try SELECT x, y where those are the only columns you need.
Use EXPLAIN to find out why the query is taking so long. Look for any easy wins like adding indexes.
Check that your MySQL server is properly tuned. The default configuration is terrible and some simple one-line fixes can boost performance dramatically.
If, and only if, you've tried all these things and you can't squeeze out any more performance, you want to try and cache the results.
You only pull the pin on caching last because caching is one of the hardest things to get right.
You can use something like Memcached or Redis to act as a faster store for pre-fetched results. They're designed to automatically expire cached data that's no longer used.
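A minimal sketch of that expiry idea with the phpredis extension; the key name, 5-minute TTL and run_expensive_query() helper are all invented for illustration:

    <?php
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);

    $key    = 'report:standings';                 // hypothetical key
    $cached = $redis->get($key);

    if ($cached !== false) {
        $rows = json_decode($cached, true);       // cache hit
    } else {
        $rows = run_expensive_query();            // placeholder for your real query
        // setEx stores the value with a time-to-live, so stale data expires on its own.
        $redis->setEx($key, 300, json_encode($rows));
    }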
The reason using $_SESSION is a bad idea is because once data is put in there very few take the time to properly expunge it later, leading to an ever growing session. If you're concerned about performance, keep your sessions as small as possible.
Just think about your users (the client PC). The session relies on a cookie on the user's machine, and the session can get lost, e.g. after closing the page, or when copying the link and pasting it into another browser. Good practice here, I think, is to just use queries, but note one thing: try as much as possible to reduce the number of queries per page, because too many will slow down your site.

Storing data in global/session variables vs. MySQL queries

A general PHP question about organizing a website: for efficiency purposes, is it better to store data from MySQL queries into global arrays, or to make a new query every time data is needed? I am thinking specifically of a sports stats-oriented website, with a lot of data that does not necessarily change very often.
I have heard that storing the data into arrays is much more efficient, but I don't see how since global variables are only global in the scope of the current PHP page. Ideally, I'd like to populate all my arrays once I start my server. Should I use session variables then? I haven't heard of anybody doing that.
A session variable won't solve the issue, as the session is not global either (unless you hack it by setting the same session_id for all visitors).
If you have a lot of traffic and you need to save queries, then use a cache server like Memcached or Redis.
If you can't install Memcached or Redis, you can create a PHP file that contains the arrays and include it in your scripts, i.e. use file caching. The downside of this approach is that it uses a lot of memory: the whole data set has to be read into memory by every PHP script, for every visitor. So if the database is not the bottleneck, it's better to keep the queries.
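One hedged way to build such an includable cache file is with var_export(); the file path and query are placeholders for whatever your site actually uses:

    <?php
    $cacheFile = __DIR__ . '/cache/teams.php';     // hypothetical cache location

    if (!file_exists($cacheFile)) {
        $db    = new PDO('mysql:host=localhost;dbname=stats', 'user', 'pass');
        $teams = $db->query('SELECT id, name FROM teams')->fetchAll(PDO::FETCH_ASSOC);

        // var_export() writes valid PHP, so the next request can simply include the file.
        file_put_contents(
            $cacheFile,
            '<?php return ' . var_export($teams, true) . ';',
            LOCK_EX
        );
    }

    $teams = require $cacheFile;                   // plain array, no query needed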

PHP/HTTP non-stateless

PHP uses cookies, sessions or databases (and ORMs) in order to remember data (so it is not lost after a single HTTP request). However, in Java (I mean servlets etc.) there is another solution: in brief, you may choose different scopes for an object (how long it exists). Besides session scope or a simple single-HTTP-request "life" (scope), an object can "live" during the whole HTTP-server runtime and can be initialized at the startup of the HTTP server.
Data can therefore be shared between different users/sessions, and no database requests are required (which would otherwise decrease the efficiency of the whole web application). (I mean they're not required while the HTTP server is already running: the object and its state are "remembered".)
(And I do as much as I can to reduce SQL requests, even using PHP arrays for DB data that is frequently read but never modified.)
What I need in PHP is a way to:
Remember (store somewhere) data that can be changed and shared between many users, but not in the DB
Without using sessions (or cookies), I want to keep data across many requests (e.g. with AJAX: not a single request, but many requests to the same URL), which of course must be stored somewhere else for some time. For instance, I want to read all the data (rows) with a single SQL request, remember it for a short period in PHP, and only then send it back row by row, with, say, each row in a separate response to the appropriate AJAX call
Can anyone give me some hints on how to achieve this in PHP, preferably in the easiest possible way?
As a preface to this answer (which I'm sure you've already grasped), PHP's execution model essentially 'restarts' the process between requests and as such storage of anything cross-request in PHP alone is unachievable.
That leaves you with a few options, and they're all really 'strengths' of database:
Use a simple key-value in-memory persistence layer, like memcached or Redis
Use a NoSQL solution with a bit more structure (and consistency, should that be required), but that's still working in-memory and is comparatively quicker than an RDBMS
Use an RDBMS, because it'll work great, and the quantity of traffic you'd need to topple a well-designed schema on moderate hardware is probably much higher than you think
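On a single server, the APCu extension (the same idea as the apc_store mentioned in another answer below) gives you exactly this kind of cross-request, cross-user shared memory. A rough sketch, where the table, key and 60-second TTL are assumptions:

    <?php
    // Shared-memory cache visible to all users served by this one machine.
    $rows = apcu_fetch('all_rows', $found);

    if (!$found) {
        $db   = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
        $rows = $db->query('SELECT * FROM items')->fetchAll(PDO::FETCH_ASSOC);

        apcu_store('all_rows', $rows, 60);   // keep for 60 seconds, then re-query
    }

    // Serve one row per AJAX request, e.g. ?offset=3 returns the fourth row.
    $offset = isset($_GET['offset']) ? (int) $_GET['offset'] : 0;
    header('Content-Type: application/json');
    echo json_encode(isset($rows[$offset]) ? $rows[$offset] : null);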
HTH

mysql_data_seek versus storing data in array

I have searched for a few hours already but have found nothing on the subject.
I am developing a website that depends on a query to define the elements that must be loaded on the page. But to organize the data, I must pass over the result of this query 4 times.
At first I used mysql_data_seek so I could re-traverse the result, but I started losing performance. Because of this, I tried replacing mysql_data_seek with putting the data in an array and running a foreach loop.
The performance didn't improve in any way I could measure, so I started wondering which is, in fact, the better option: building a rather big data array or executing mysql_fetch_array multiple times.
My application is currently running with PHP 5.2.17, MySQL, and everything is on localhost. Unfortunately, I have a busy database, but I have never had any problems with the number of connections to it.
Is there a preferable way to perform this task? Is there any other option besides mysql_data_seek or the big data array? Does anyone have any information regarding benchmark tests of these options?
Thank you very much for your time.
The answer to your problem may lie in indexing the appropriate fields in your database. Most databases also cache frequently served queries, but they tend to discard them once the table they cover is altered (which makes sense).
So you could trust your database to do what it does well: query for and retrieve data, and help it by making sure there's little contention on the table and/or by placing appropriate indexes. This in turn can, however, affect the performance of writes, which may not be unimportant in your case; only you can really judge that (indexes have to be calculated and maintained).
The PHP extension you use will play a part as well. If speed is of the essence, 'upgrade' to mysqli or PDO and do a ->fetch_all(), since it will cut down on communication between the PHP process and the database server. The only reason against this would be if the amount of data you query is so enormous that it halts or bogs down your PHP/webserver processes, or even your whole server, by forcing it into swap.
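A hedged sketch of that fetch_all() approach (it needs the mysqlnd driver; the table and column names are invented). It also addresses the original "pass over the result 4 times" problem, since a plain array can be iterated as often as needed:

    <?php
    $db     = new mysqli('localhost', 'user', 'pass', 'mydb');
    $result = $db->query('SELECT id, section, title FROM page_elements');

    // One round-trip: pull the whole result set into a PHP array.
    $rows = $result->fetch_all(MYSQLI_ASSOC);
    $result->free();

    // The array can now be traversed as many times as the layout requires,
    // with no mysql_data_seek() and no extra queries.
    foreach ($rows as $row) { /* pass 1: build the menu */ }
    foreach ($rows as $row) { /* pass 2: build the sidebar */ }
    foreach ($rows as $row) { /* pass 3: build the main content */ }
    foreach ($rows as $row) { /* pass 4: build the footer */ }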
The table type you use can be of importance: certain types of queries seem to run faster on MyISAM as opposed to InnoDB. If you want to retool a bit, you could store this data (or a copy of it) in MySQL's MEMORY (HEAP) engine, i.e. just in memory. You'd need to be careful to synchronize it with a disk table on writes, though, if you want to be sure to keep altered data (in case of a server failure or shutdown).
Alternatively, you could cache your data in something like Memcached or by using apc_store, which should be very fast since it lives in PHP process memory. The big caveat here is that APC generally has less memory available for storage (the default being 32 MB). Memcached's big advantage is that, while still fast, it's distributed, so if you have multiple servers running they can share this data.
You could try a NoSQL database, preferably one that's just a key-value store, not even a document store, such as Redis.
And finally, you could hardcode your values in your PHP script; make sure to still use something like eAccelerator or APC, and verify whether you really need to use the data 4 times or whether you can't just cache the output of whatever it is you actually create with it.
So I'm sorry I can't give you a ready-made answer but performance questions, when applicable, usually require a multi-pronged approach. :-|

Cache data in PHP SESSION, or query from db each time?

Is it "better" (more efficient, faster, more secure, etc) to (A) cache data that is used on every page load in the $_SESSION array (but still querying a table for a flag to reload the data fresh), or (B) to load it from the database each time?
I'm using the cache method (A), but I'm worried that with hundreds of users, memory could become an issue? It's just simple data, like firstname, lastname, birthday, etc.
With either method, there's still a query being run. Thoughts?
If your data is used on every page and is the same for all users, I wouldn't cache it in $_SESSION (which means having a different copy of that data for each user), but with another mechanism, like:
A file
In memory, with APC for instance (if you have only one server)
In memory, with memcached, for instance (if you have several servers)
If your data requires long calculations or several DB queries to obtain, caching it in the database could be another possibility (that would mean only one query to fetch it back, and fewer calculations).
If your data is not the same for each user (which seems to be the case in your situation, as you are caching names, birthdates, ...) :
I would make sure I only cache what is necessary
Once you only have a few data to cache, putting it in session should be quite OK
If you really have that many users, you'll probably have some other scalability problems and will most likely come to use something like memcached anyway; which means you'll have some other way of caching ;-)
As a sidenote: if you are doing the same query over and over again, your DB server should cache it by itself (for MySQL, it would go into the "query cache"); so it would not be as bad as you think, I suppose -- even if not that much optimized ^^
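If you do stick with method (A) from the question, a hedged sketch of the "flag to reload" check might look like this; the users table, its updated_at column and the session keys are all assumptions:

    <?php
    session_start();

    $db = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');

    // Cheap per-page query: just the "profile was updated" timestamp, not the whole profile.
    $stmt = $db->prepare('SELECT updated_at FROM users WHERE id = ?');
    $stmt->execute(array($_SESSION['user_id']));
    $updatedAt = $stmt->fetchColumn();

    // Reload the cached profile only when the flag says it is stale.
    if (!isset($_SESSION['profile']) || $_SESSION['profile_version'] !== $updatedAt) {
        $stmt = $db->prepare('SELECT firstname, lastname, birthday FROM users WHERE id = ?');
        $stmt->execute(array($_SESSION['user_id']));

        $_SESSION['profile']         = $stmt->fetch(PDO::FETCH_ASSOC);
        $_SESSION['profile_version'] = $updatedAt;
    }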
It depends on what your session handler is. Your session handler could be MySQL, and thus the question would not be which is better, but how to optimize your session handling.
The default PHP session handler is files, but it can be changed to mysql quite easily.
If you're talking about non-user-specific data, then just save it to the DB. Worry about optimizing if you run into problems later. It is usually much more beneficial to use a better design pattern than to think about optimizing beforehand. Design your code so you can easily use a different handler for storage, and you won't have optimization problems later.
If it is user specific, use the session, but use an appropriate session handler if necessary.
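As one hedged example of swapping the handler without touching application code, the phpredis extension ships a "redis" session handler that can be enabled purely from configuration (the host and port are assumptions); a MySQL-backed handler works the same way in principle via session_set_save_handler():

    <?php
    // Store sessions in Redis instead of local files; all code using $_SESSION
    // keeps working unchanged, only the storage backend differs.
    ini_set('session.save_handler', 'redis');
    ini_set('session.save_path', 'tcp://127.0.0.1:6379');

    session_start();
    $_SESSION['firstname'] = 'Ada';   // persisted by the Redis handler behind the scenes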
