I'm quite new to PHP and am trying to learn it.
My question is rather simple, but I've been a bit lost while googling for it.
I just want to create an object to manage the database connection. I've already written the object, and the problem I'm facing now is:
How do I keep it instantiated across a session (so that I don't need to open/close the database connection on every page load)? And how do I call it afterwards?
Is there any way to declare a destructor, so that when the instance dies, the connection to the database gets closed?
If you define a __destruct() method on your object, it'll get called when the object is about to be destroyed. You can use this to close your database connection.
You can also use __sleep() and __wakeup() for serializing your objects. They'll get called automatically upon serialization and unserialization. You could use these methods to connect and disconnect as required.
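As a minimal sketch of both ideas (the Database class name, DSN and credentials below are placeholders, not anything from the question):

<?php
// Hypothetical wrapper class -- names and credentials are placeholders.
class Database
{
    private $pdo;

    public function __construct()
    {
        $this->connect();
    }

    private function connect()
    {
        $this->pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
    }

    // Called automatically when the object is about to be destroyed.
    public function __destruct()
    {
        $this->pdo = null; // releases the connection
    }

    // Called on serialize(): drop the connection, since it cannot be serialized.
    public function __sleep()
    {
        $this->pdo = null;
        return [];
    }

    // Called on unserialize(): re-establish the connection.
    public function __wakeup()
    {
        $this->connect();
    }
}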
I hate to answer by refuting the question, but I think you are going about this the wrong way.
Rather than worrying about the connection being opened and closed, I would suggest using caching, indexing, etc. to solve performance problems when they actually arise, instead of being concerned up front about the resources involved in establishing a connection.
If you are really concerned about performance why not cache the affected pages and avoid using the database connection at all?
I think you could get the desired effect with this function (I don't use it myself, and it assumes MySQL), but be sure to read the comments:
http://www.php.net/mysql-pconnect
I don't think you want to start using the sleep/wakeup technique to get this working; as I understand it, that would involve creating a whole bunch of separate threads, each with its own database connection, which will just sap your resources and produce the opposite of the intended effect.
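For reference, a minimal sketch of the persistent-connection call from the legacy mysql extension (removed in PHP 7, so only relevant on old installations; host and credentials are placeholders):

<?php
// Legacy mysql extension (PHP 5.x and earlier) -- placeholders for host/credentials.
$link = mysql_pconnect('localhost', 'user', 'secret');
if (!$link) {
    die('Could not connect: ' . mysql_error());
}
mysql_select_db('app', $link);
// The connection is not closed at script end; it is kept in the pool for reuse.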
I'm trying to store a mysqli connection in the session so it can be reused for multiple queries across the site, but when I try to run a query it outputs the warning "Couldn't fetch mysqli".
$_SESSION['db'] = new mysqli($host, $username, $password, $db);
Is it impossible to pass a mysqli connection reference through the session? Is there a different method to use?
Yes, it is explicitly impossible.
See the PHP documentation, which mentions in a highlighted warning: "Some types of data can not be serialized thus stored in sessions. It includes resource variables or objects with circular references (i.e. objects which passes a reference to itself to another object)."
MySQL connections are one such kind of resource.
You have to reconnect on each page run.
This is not as bad as it sounds if you can rely on connection pooling via mysql_pconnect(), but first see more background info on mysql_pconnect() in this article.
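A minimal sketch of reconnecting on every request instead (the "p:" host prefix, available since PHP 5.3, asks mysqli for a persistent, pooled connection; host and variable names are placeholders):

<?php
// Reconnect on every page load -- cheap, and pooled when the host is prefixed with "p:".
$db = new mysqli('p:localhost', $username, $password, $dbname);
if ($db->connect_error) {
    die('Connect failed: ' . $db->connect_error);
}
// Use $db for this request only; never put it in $_SESSION.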
Database connections are resources and can't be stored in a session. You'll have to rebuild the connection in every page, or use a persistent connection (PHP 5.3+).
Yeah - you can't pass it through a session. The handle is specific to the request. You might be able to put it in a shared resource like memcache, and fetch the handle when you need it.
However, if you're using connection pooling, grabbing a new handle when you need one (and closing it when you're done) isn't a large performance hit.
Always depends on your needs, but yeah, I'd either:
Create a new handle when you need it, per request (turn on connection pooling)
Stick the db handle in memcache
I would like to know if it's correct to re-use a PDO database connection multiple times.
For example, I set it up in my controller and then pass it as a parameter in the constructor of a class. Is it correct to call the same connection (using a function like get_Database) throughout all my functions in the class, and even pass it as a parameter to another class's constructor to continue working with the same connection?
Or should I reopen a connection at some point ?
I was able to get it working by simply passing it around; however, I'm not quite sure whether this will perform well when going live.
Yes, you should reuse the connection.
Or should I reopen a connection at some point ?
The only reason to open a new connection is if you are connecting to another database. Otherwise, throughout a single script, only one connection should be used.
To achieve this, it is important to avoid using a static singleton throughout your application; instead, learn about dependency injection and design your code so that the same PDO instance is shared with every function or class that needs it.
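A minimal sketch of constructor injection with a shared PDO instance (the UserRepository class, table name, and credentials are placeholders):

<?php
// One PDO instance is created at the entry point and handed to whatever needs it.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret', [
    PDO::ATTR_ERRMODE => PDO::ERRMODE_EXCEPTION,
]);

class UserRepository
{
    private $pdo;

    public function __construct(PDO $pdo)
    {
        $this->pdo = $pdo;   // the same connection, injected, never re-opened
    }

    public function find($id)
    {
        $stmt = $this->pdo->prepare('SELECT * FROM users WHERE id = ?');
        $stmt->execute([$id]);
        return $stmt->fetch(PDO::FETCH_ASSOC);
    }
}

$users = new UserRepository($pdo);   // pass the same $pdo into any other class as well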
however I am not quite sure if this would be performing well when going live.
As noted in the comments, if you reopen a connection often, it will be a lot slower.
There are many dependency injectors out there, and choosing one is almost certainly a matter of opinion, but I like Auryn. Learning about it should help you design code where it is easier to share a single PDO instance, among other things.
Yes, this is OK, and it is better than connecting to the database many times.
The documentation even suggests reusing the open connection between calls to your PHP script / application:
Many web applications will benefit from making persistent connections to database servers. Persistent connections are not closed at the end of the script, but are cached and re-used when another script requests a connection using the same credentials. The persistent connection cache allows you to avoid the overhead of establishing a new connection every time a script needs to talk to a database, resulting in a faster web application.
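A minimal sketch of requesting such a persistent connection from PDO (DSN and credentials are placeholders):

<?php
// PDO::ATTR_PERSISTENT keeps the underlying connection cached and reused between requests.
$dbh = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret', [
    PDO::ATTR_PERSISTENT => true,
]);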
I'm refactoring PHP code to move from mysql_* and mysqli_* functions to PDO, and would like to accomplish this as efficiently as possible.
I use a file I call 'core.common.php'. It initializes the PDO database access object, has $_SESSION management features, and provides other features such as generic message and error notification functions. I use the MsgBox() and ErrBox() functions extensively while developing and debugging. This file gets included at the top of every PHP-generated web page.
I have managed, at least through some testing, to successfully pass the PDO object (by injecting the PDO object in the __construct method) to classes that require database access.
It seems to me that this approach, while it works so far, would only apply to each individual visitor of the site... each visitor can use the same PDO connection throughout all the pages they visit.
My real questions are...
What happens when there are many visitors ?? Does each get their own PDO instance ? Would this mean that there would be many instances of database connections ?
The reason I'm asking is that the host I currently use has "limited my resource usage" due to... according to them... "excessive resource usage". It is a "shared server". They suggest upgrading to a VPS (Virtual Private Server), and, of course, at additional cost.
Is the host scamming me for more $$ ??
What, in the eyes of the Pros here, would be my best approach to this issue ?
And, above all, be critical and specific.
Any ideas are greatly welcomed !
What happens when there are many visitors ??
Exactly the same thing that happened with mysql and mysqli functions.
Does each get their own PDO instance ?
Yes. In a way.
Would this mean that there would be many instances of database connections ?
Yes. Just like it was with mysql functions.
the host I currently use has "limited my resource usage" due to... according to them... "excessive resource usage".
Make sure there is only one connection per script instance. According to your description this is already the case, but just to be sure. That could be the only issue with PDO. Also turn off persistent connections if they are being used.
Regarding the other aspects of this excessive usage, you had better start a separate question asking for recommendations on how to profile your code and find the bottleneck.
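As a minimal sketch of the one-connection-per-script advice above (DSN and credentials are placeholders):

<?php
// Bootstrap: one PDO instance for the whole request, persistence explicitly off.
$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret', [
    PDO::ATTR_PERSISTENT => false,
    PDO::ATTR_ERRMODE    => PDO::ERRMODE_EXCEPTION,
]);
// Every class or function that needs the database receives this same $pdo;
// nothing else in the request calls new PDO(...).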
When I was writing PHP code with PDO and MySQL, I would always create connections when I needed them, inside functions, like so:
function pseudo_function() {
    // create and open connection
    $db = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
    // do stuff
    $result = $db->query('SELECT 1')->fetchAll();
    // close connection
    $db = null;
    // return result
    return $result;
}
Now that I have started programming in C, I have seen that pointers are an interesting way to pass the entire connection as a parameter to a function. I was wondering if it would be better to pass the connection between functions until the entire user request is served.
To clarify: For one user request, there could be 1-5 calls to a function that then opens a database, fetches data, does something, closes and returns.
Also, does it make a difference performance-wise if you keep a connection open?
The "standard idiom" for most PHP code I've seen seems to be "open the connection, and leave it open".
php.net seems to be down at the moment, but these two links might be of interest:
http://php.net/manual/en/function.mysql-pconnect.php
http://php.net/manual/en/mysqlnd-ms.pooling.php
If you're running Apache, perhaps mod_dbd might be a good solution:
http://httpd.apache.org/docs/2.2/mod/mod_dbd.html
Here's a good discussion on the implications of not closing your connections:
Is it totally reckless to leave mysql connection open through a page?
It's better to keep one connection open and perform five operations on it than to open a new connection every time.
You can also lazy-load your database connection with a singleton repository pattern; the repository opens a connection only once, and on subsequent invocations it returns the cached instance.
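A minimal sketch of that lazy-loading pattern (the Db class name and credentials are placeholders):

<?php
class Db
{
    private static $pdo;

    // Opens the connection on first use, returns the cached instance afterwards.
    public static function get()
    {
        if (self::$pdo === null) {
            self::$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
        }
        return self::$pdo;
    }
}

// Anywhere in the request:
$rows = Db::get()->query('SELECT 1')->fetchAll();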
If you develop web applications with PHP, it's common (and, I suppose, the most efficient way) to open the database connection once and close it when the script terminates. Keeping a connection open while other work is done does not really produce any overhead or require any action at all, but reconnecting every time does.
I'm planning on writing my own custom session handling functions. I want to save the session data in the database, but I have some doubts.
Is it viable, or will it just slow down my app?
Do I have to save the session after each set, or can I save it all at once? I had the idea of putting this function in the class destructor, so that when the program ends it saves the data to the database.
But how can I use my other class (the database class) for this and be sure it won't be destructed before the session class?
If the user aborts the connection and the app stops running, will the destructor still be called?
So, what do you think? What do you suggest I do?
I use DB sessions all the time with Zend and Symfony, so it's definitely viable; there will be a cost, of course, but most likely nothing significant.
Normally the way these handlers work is to use session_set_save_handler(); everything then works as normal except for the actual functions called to read and write the data. However, pay attention to the warnings about object destruction.
Yes, it will normally be slightly slower than PHP's native session handler, but it shouldn't be noticeable, unless you are running into file-locking problems (as Windows does).
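A minimal sketch of a database-backed handler registered with session_set_save_handler() (the sessions(id, data, ts) table and the injected $pdo are placeholders; error handling omitted):

<?php
class DbSessionHandler implements SessionHandlerInterface
{
    private $pdo;

    public function __construct(PDO $pdo) { $this->pdo = $pdo; }

    public function open($savePath, $sessionName) { return true; }
    public function close() { return true; }

    public function read($id)
    {
        $stmt = $this->pdo->prepare('SELECT data FROM sessions WHERE id = ?');
        $stmt->execute([$id]);
        return (string) $stmt->fetchColumn();
    }

    public function write($id, $data)
    {
        $stmt = $this->pdo->prepare('REPLACE INTO sessions (id, data, ts) VALUES (?, ?, ?)');
        return $stmt->execute([$id, $data, time()]);
    }

    public function destroy($id)
    {
        return $this->pdo->prepare('DELETE FROM sessions WHERE id = ?')->execute([$id]);
    }

    public function gc($maxlifetime)
    {
        return $this->pdo->prepare('DELETE FROM sessions WHERE ts < ?')
                         ->execute([time() - $maxlifetime]);
    }
}

$handler = new DbSessionHandler($pdo);
session_set_save_handler($handler, true);   // true registers session_write_close() on shutdown
session_start();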
Why would you want to permanently store session data?
Usually people use different session handlers to make the application faster (we use memcache for sessions, because our application is quite complex and distributed and we want it to run fast). I consider this requirement a sign of bad application design: if you want to track your visitors in some way, there are a lot of better ways of doing it, or you may be using sessions for things they are not really intended or suitable for. Of course, I can imagine that there might be such a requirement; I just don't think this is the case here.