Is it possible to request all the data in my database, build objects from it, and save them into an array or something similar, so that I only need to call the database once and afterwards just use my local array? If yes, how is it done?
public function getAllProjects(){
    $query = "SELECT * FROM projects";
    $result = mysql_query($query);
    $projects = array();
    while ($row = mysql_fetch_object($result)) {
        // save each object into the array
        $projects[] = $this->fetchRow($row);
    }
    return $projects;
}
public function fetchRow($row){
    // include() here would re-evaluate Project.php on every call and trigger a
    // "Cannot redeclare class Project" fatal error, which is why only one
    // Project could ever be created; require_once avoids that.
    require_once("Project.php");
    $project = new Project();
    $project->setId($row->id);
    $project->setTitle($row->title);
    $project->setInfo($row->info);
    $project->setText($row->text);
    $project->setCategory($row->category);
    return $project;
}
If I have this code, for example: how do I store the objects correctly into the array where I grab the data from? And why can't I make more than one object of type "Project"?
Let's ignore the fact that you will run out of memory.
If you have everything in an array, you will no longer have the functionality of a relational database.
Try a search over a multi-megabyte, multi-dimensional array in PHP and be prepared for an extended coffee break.
If you are thinking of doing something like that, it is because you feel that the database is slow... You should learn about data normalization and the correct use of indexes instead.
And no, NoSQL is not the answer.
Sorry to pop your balloon.
Edited to add: what you CAN do is use memcache to store the final product of some expensive processes. Don't bother storing the results of trivial queries; the internal cache of MySQL is very well optimized for those.
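For illustration, a minimal sketch of that idea with the Memcache extension, assuming a Memcached server on localhost and a hypothetical getExpensiveReport() that does the costly work:

$memcache = new Memcache();
$memcache->connect('localhost', 11211);

$report = $memcache->get('expensive_report');
if ($report === false) {
    $report = getExpensiveReport();                      // the expensive part
    $memcache->set('expensive_report', $report, 0, 600); // cache for 10 minutes
}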
You could use the $_SESSION vars in PHP. To use them, add a session_start() at the beginning of your code. Then you can set vars with $_SESSION['selectaname'] = $yourvar.
Nothing prevents you from making a SQL query like "SELECT username FROM users WHERE id = 1" and then setting $_SESSION['user'] = $queryresult.
Then you'll have this:
echo $_SESSION['user']; // prints "mylogin"
I have finished writing my PHP scripts for a project I am doing. My next step is to see whether I can improve my code from a memory standpoint, as some of my scripts eat a lot of memory. I have been doing research on this, and one suggestion is to NULL and unset variables, but I never see an example of doing this. So I wanted to give an example of a common action performed in my scripts and ask whether this is the proper way of doing it:
$query = $dbconn->get_results("SELECT id,name FROM account WHERE active = 1");
if(isset($query))
{
    foreach($query AS $currq)
    {
        $account_id = intval($currq->id);
        $account_name = trim($currq->name);
        //Code to do stuff with this data
        //NULL the variables before looping again
        $account_id = NULL;
        $account_name = NULL;
        //Unset the variables before looping again
        unset($account_id);
        unset($account_name);
    }
    $query = NULL;
    unset($query);
    $currq = NULL;
    unset($currq);
}
Would that be the correct way to free up memory? I read that garbage collection in PHP can be lazy, which is why they recommend NULLing the value, as it shrinks it right away.
I know this might be too vague for this site, but can anyone let me know if this is the proper way of freeing up memory? Or if there is a different way, can you please provide an example so I can visually see how it works? Thanks in advance!
Please read up on PHP generators; that is exactly what they are for.
You don't want to fetch all records at once; this would blow holes into your memory like a shotgun.
Instead, you want to fetch your records one at a time, process each one, then fetch the next.
Here is an example:
function getAccountData(\PDO $pdo)
{
    $stmt = $pdo->prepare("SELECT id,name FROM account WHERE active = 1");
    $stmt->execute();
    while ($row = $stmt->fetch()) {
        yield $row; // hand one row to the caller, keeping the cursor open
    }
}
foreach (getAccountData($pdo) as $account) {
    //process the record for each iteration
    //no need to unset anything
}
Well, if the function $dbconn->get_results returns an array with all the data, then there is no point in using generators, since the memory has already been allocated for the data.
You can also use the mysqli_fetch_assoc function to get one row at a time. It should be more memory efficient than fetching all rows at once. http://php.net/manual/en/mysqli-result.fetch-assoc.php
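A short sketch of that, assuming $mysqli is an open mysqli connection:

$result = $mysqli->query("SELECT id, name FROM account WHERE active = 1");
while ($row = $result->fetch_assoc()) {
    // handle one row at a time instead of building one big array of all rows
    // do stuff with $row['id'] and $row['name']
}
$result->free(); // release the result set when done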
Because a provider I use has quite unreliable MySQL servers, which are down at least once per week, impacting one of the sites I made, I want to work around the outages in the following way:
dump the MySQL table to a file; in case the connection with the SQL server fails,
read the file instead of the server, till the server is back.
This will avoid outages from the user-experience point of view.
In fact, things are not as easy as they seem, so I ask for your help, please.
What I did was save the data in JSON format.
But this had issues, because a lot of the data in the DB is stored "in the clear", including escaped, complex URLs with long argument strings, which cause problems during the decoding of the JSON.
CSV and TSV are not working correctly either:
CSV is delimited by commas or semicolons, and those are present in the original content taken from the DB.
The TSV format leaves double quotes that cannot be deleted without going into each record's fields to eliminate them.
Then I tried to serialize each record read from the DB, store it, and retrieve it by unserializing it.
But the result is a bit catastrophic: all the records are stored in the file, but when I retrieve them, only one is returned; then something blocks the functioning of the program (code below, please):
require_once('variables.php');
require_once("database.php");
$file = "database.dmp";
$myfile = fopen($file, "w") or die("Unable to open file!");
$sql = mysql_query("SELECT * FROM song ORDER BY ID ASC");
// output data of each row
while ($row = mysql_fetch_assoc($sql)) {
// store the record into the file
fwrite($myfile, serialize($row));
}
fclose($myfile);
mysql_close();
// Retrieving section
$myfile = fopen($file, "r") or die("Unable to open file!");
// Keep reading until the end of the file
while ( !feof($myfile) ) {
$record = fgets($myfile); // get the record
$row = unserialize($record); // unserialize it
print_r($row); // show if the variable has something on it
}
fclose($myfile);
I also tried uuencode and base64_encode, but they were worse choices.
Is there any way to achieve my goal?
Thank you very much in advance for your help.
If you have your data layer well decoupled, you can consider using SQLite as a fallback storage.
It's just a matter of adding one more abstraction layer, with the same code accessing the storage and switching the storage target in case of unavailability of the primary one.
-----EDIT-----
You could also think about some caching strategy (a JSON/HTML file?!) that returns stale data in case of a MySQL outage.
-----EDIT 2-----
If it's not too much effort, please consider playing with PDO; I'm quite sure you'll never look back, and believe me, this will help you structure your db calls with little pain when switching between storages.
Please take the following only as an example; there are much better ways to design this architectural part of the code.
Just a small and basic piece of code to demonstrate what I mean:
class StoragePersister
{
    private $driver = 'mysql';

    public function setDriver($driver)
    {
        $this->driver = $driver;
    }

    public function persist($data)
    {
        switch ($this->driver)
        {
            case 'mysql':
                $this->persistToMysql($data);
                break; // without break, execution would fall through to the sqlite case
            case 'sqlite':
                $this->persistToSqlite($data);
                break;
        }
    }

    public function persistToMysql($data)
    {
        //query to mysql
    }

    public function persistToSqlite($data)
    {
        //query to Sqlite
    }
}
$storage = new StoragePersister;
$storage->setDriver('sqlite'); // switch to sqlite if needed
$storage->persist($somedata); // this will use the strategy to call the function based on the storage driver you've selected.
-----EDIT 3-----
Please have a look at the "strategy" design pattern; I guess it can help you better understand what I mean.
After the SELECT you need to build a correct structure for re-inserting the data; then you can serialize it or do whatever you want.
For example:
For each row you could do this: $sqls[] = "INSERT INTO `song` (field1, field2, ... fieldN) VALUES (field1_value, field2_value, ... fieldN_value);";
Then you could serialize this $sqls array, write it to a file, and when you need it, read it back, unserialize it, and run the queries.
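A minimal sketch of that idea, staying with the mysql_* style used in the question (the file name is just an example):

// While MySQL is reachable: store the prepared INSERT statements
file_put_contents('queries.dmp', serialize($sqls));

// Later: read them back and replay them
$sqls = unserialize(file_get_contents('queries.dmp'));
foreach ($sqls as $sql) {
    mysql_query($sql);
}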
Have you thought about caching your queries in a cache like APC? Also, you may want to use mysqli or PDO instead of mysql (the mysql extension is deprecated in the latest versions of PHP).
To answer your question, this is one way of doing it:
var_export will export the variable as valid PHP code;
require will put the content of the array into the $rows variable (because of the return statement).
Here is the code:
$sql = mysql_query("SELECT * FROM song ORDER BY ID ASC");
$content = array();
// collect the data of each row
while ($row = mysql_fetch_assoc($sql)) {
    // store the record in the array, keyed by ID
    $content[$row['ID']] = $row;
}
mysql_close();
$data = '<?php return ' . var_export($content, true) . ';';
file_put_contents($file, $data);
// Retrieving section
$rows = require $file;
Can anyone suggest the best solution for creating recently-viewed-items/pages logic using CodeIgniter? I'd prefer to use CodeIgniter sessions rather than the standard $_SESSION if possible.
Also: I want to add items to the session, but once I hit 10 items in the array, remove the oldest item from the array.
$recentlyViewed = $this->session->userdata('recentlyViewed');
if (!is_array($recentlyViewed)) {
    $recentlyViewed = array();
}
//here set your id or page or whatever
if (!in_array($data['id'], $recentlyViewed)) {
    array_push($recentlyViewed, $data['id']);
    //once there are more than 10 items, drop the oldest one
    if (count($recentlyViewed) > 10) {
        array_shift($recentlyViewed);
    }
}
$this->session->set_userdata('recentlyViewed', $recentlyViewed);
$recentlyViewed = array_reverse($recentlyViewed); //newest first
//var_dump($recentlyViewed);
Now use a foreach on $recentlyViewed with an or_where query, as sketched below.
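Something along these lines, assuming this runs in a CodeIgniter controller or model with the query builder loaded, and a hypothetical projects table (table and column names are illustrative):

// Fetch the recently viewed rows
if (!empty($recentlyViewed)) {
    foreach ($recentlyViewed as $id) {
        $this->db->or_where('id', $id);
    }
    $items = $this->db->get('projects')->result();
}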
I have no idea if this approach will work for you, but you could dump that kind of data into a SQL database with a timestamp and then use ascending/descending ordering in a SQL query in conjunction with a LIMIT 10 clause... it might be too much effort for what you are trying to accomplish, but you could also sync the query data with your CI session object? Dunno, just a thought :D
Say I have this code in PHP:
$query = mysql_query("SELECT ...");
the statement returns a resource. Normally it would get passed to mysql_fetch_array() or one of the other mysql_fetch_* functions to populate the data set.
I'm wondering if the resource (the $query variable in this case) can be cached in Memcache and then, a while later, fetched and used just like at the moment it was created.
// cache it
$memcache->set('query', $query);
// restore it later
$query = $memcache->get('query');
// reuse it
while(mysql_fetch_array($query)) { ... }
I have googled the question, but didn't have much luck.
I'm asking because it looks much more lightweight than the "populate the result array first, then cache it" approach.
So is it possible?
I doubt it. From the serialize manual entry:
serialize() handles all types, except the resource-type.
Edit: resources are generally tied to the service that created them. I don't know if Memcached uses serialize; however, I'd guess it would be subject to the same limitations.
The Memcache extension serializes objects before sending them to the Memcached server. As the other poster mentioned, resources can't be serialized. A resource is basically a reference to a network connection to the database server. You can't cache a network connection and reuse it later. You can only cache the data that gets transmitted over the connection.
With queries like that, you should fetch all rows before caching them.
$all_results = array();
while ($row = mysql_fetch_array($query)) {
$all_results[] = $row;
}
$memcache->set('query', $all_results);
Modern database drivers such as MySQLi and PDO have a fetch_all()/fetchAll() method that will fetch all rows. This simplifies things a bit.
When you retrieve that array later, you can just use foreach() to iterate over it. This doesn't work with very large query results (Memcached has a 1 MB item-size limit by default), but for most purposes you shouldn't have any problem with it.
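A sketch of the same caching using mysqli's fetch_all (assumes $mysqli is an open connection and $memcache a connected Memcache instance):

$result = $mysqli->query("SELECT id, name FROM account WHERE active = 1");
$all_results = $result->fetch_all(MYSQLI_ASSOC); // every row as an associative array
$memcache->set('query', $all_results, 0, 300);   // cache for 5 minutes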
How can I store an object in the $_GET array in PHP? I want to pass an object containing database information from one page to another via the $_GET array, so there would be no need to access the database again.
To pass any sort of object, you'd have to serialize it on one end and unserialize it on the other end.
But note that this will not work for database connections themselves: the connection to the database is automatically closed when a PHP script ends.
You could pass some connection information, like login, host, or stuff like that (but that would not be a good idea; it's quite unsafe to expose such critical information!), but you cannot pass a connection resource.
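As an illustration only, a sketch of that serialize/unserialize round trip over the query string ($project and next.php are hypothetical; never unserialize untrusted input in real code, since anything in $_GET can be tampered with):

// Sending side: serialize the object and make it URL-safe
$payload = urlencode(base64_encode(serialize($project)));
header("Location: next.php?data=" . $payload);

// Receiving side (next.php): PHP has already url-decoded $_GET for us
$project = unserialize(base64_decode($_GET['data']));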
Really, you should be passing data from one page to another via the $_SESSION variable instead, if possible. That is what sessions are for. Ideally just store an id in the session and look up the data on each page. If you do use $_SESSION then it is as simple as ...
$_SESSION['myarray'] = $myarrayobject;
$_SESSION['someotherthing'] = 42;
If you have to use $_GET, then I would recommend just passing an id of some kind, then re-looking up the data on each page refresh.
Keep in mind, it would be easy for a malicious user to change the values that are sent via $_GET between pages, so make sure there is nothing that can be abused in this information.
You would need to serialize it to text (possibly using json_encode), then generate a URL that includes it in the query string (making sure that it is urlencoded).
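For example, a minimal sketch of that approach (page2.php and the array contents are made up for illustration):

// Sending side: JSON-encode the data and URL-encode it into the query string
$data = array('id' => 1, 'title' => 'My project');
$url = 'page2.php?data=' . urlencode(json_encode($data));

// Receiving side (page2.php): decode back into an associative array
$data = json_decode($_GET['data'], true);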
That's a very bad idea.
Databases were invented to serve each request, while query strings were designed to pass only commands and identifiers, not tons of data between browser and server.
Instead of using GET, another possibility for passing something from one page to another is to use the $_SESSION array and then unset that variable on the other side whenever you're done with it. I've found that to be a pretty good way to do this.
Like everyone else has said, though, passing database information around can be a bad idea, unless the information you're passing is something like a username, first name, etc. If that's what you mean by "database information", then I would store all that stuff in the $_SESSION variable and destroy the session when they log out. Don't store the entire connection or anything like that.
As has been said before, you should avoid (mis-)using GET parameters as a "cache" for overly complex and loooong data.
On the other hand, your question is vague enough to assume that you want to transmit only a few values, and your "second" script needs nothing else from the database; in fact, it might not even have to check whether those values came from a database at all.
In this case, extract the values from your database result and append them as parameters to the URL. Try to make the parameter list as simple as possible yet unambiguous; http_build_query() can help you with that.
But keep in mind that you want to keep GET operations idempotent, as described in http://www.w3.org/Protocols/rfc2616/rfc2616-sec9.html.
if ( isset($_GET['a'], $_GET['b'], $_GET['c']) ) {
    // this doesn't care about the values in the database at all.
    echo 'product: ', $_GET['a'] * $_GET['b'] * $_GET['c'], "\n";
}
else {
    $pdo = new PDO("mysql:host=localhost;dbname=test", 'localonly', 'localonly');
    $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
    // let's make it a self-contained example (and extra costly) by creating a temporary table ...
    $pdo->exec('CREATE TEMPORARY TABLE foo (id int auto_increment, a int, b int, c int, primary key(id))');
    // ... with some garbage data
    $stmt = $pdo->prepare('INSERT INTO foo (a,b,c) VALUES (?,?,?)');
    for($i=0; $i<10; $i++) {
        $stmt->execute(array(rand(1,10), rand(1,10), rand(1,10)));
    }
    foreach( $pdo->query('SELECT a,b,c FROM foo', PDO::FETCH_ASSOC) as $row) {
        // build a link back to this script carrying a,b,c as GET parameters
        printf('<a href="?%s">%s</a><br />',
            http_build_query($row),
            join(' * ', $row)
        );
    }
}